KR20150118377A - Information inputting system and method by movements of finger
- Publication number
- KR20150118377A (Application No. KR1020140044139A)
- Authority
- KR
- South Korea
- Prior art keywords
- information
- motion
- finger
- mode
- motion information
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/014—Hand-worn input/output arrangements, e.g. data gloves
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/038—Indexing scheme relating to G06F3/038
- G06F2203/0384—Wireless input, i.e. hardware and software details of wireless interface arrangements for pointing devices
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Abstract
The present invention relates to an information input system using finger movement and an input method thereof. The system includes a motion input unit 100 that can be worn on a finger of a user, senses finger motion information of the user, and transmits the motion information to an integrated management unit 200; and the integrated management unit 200, which analyzes the finger motion information received from the motion input unit 100, determines a mouse mode, an operation mode, or a handwriting mode, and remotely controls a smart device 300 accordingly. Before performing remote control of the smart device 300, the integrated management unit 200 analyzes the finger motion information input from the motion input unit 100, calculates a finger motion maximum movement distance and a finger motion average movement distance to set an initial sensitivity reference value, and determines valid finger motion information based on the initial sensitivity reference value.
Description
The present invention relates to an information input system using finger movement and an input method thereof, and more particularly, to an information input system and input method in which a motion input unit integrating a mouse function and a keyboard function senses a user's finger movement, and an integrated management unit uses the sensed finger motion information to remotely control various connected smart devices in a mouse mode, an operation mode, and a handwriting mode.
In recent years, the types and penetration rates of smart devices have increased rapidly throughout the world, and smart appliances such as refrigerators and washing machines have also joined the ranks of smart devices.
In such a smart and wearable computing environment, conventional input devices such as a keyboard, a mouse, and a touch pad can be inconvenient to use.
Because the keyboard and mouse are optimized for input in a PC environment, it is difficult to carry them along with a portable smart device. Conventional keyboards and mice are therefore not well suited to such devices, and a touch interface using a position sensor is used instead.
Such a touch interface has the advantage that the user can intuitively command and control an interlocked device, and pen-shaped input devices capable of recognizing touch have also recently been developed.
However, while the touch interface is well suited to small portable smart devices, it is inefficient for medium and large smart devices such as smart TVs and smart appliances.
To address these problems, image recognition-based control technologies using a camera, speech recognition-based control technologies, and sensor-based control technologies using an acceleration sensor, a geomagnetic sensor, a gyro sensor, and the like have been developed for smart devices.
However, in the case of image recognition-based control technology, the user's face, hands, and gestures must be captured through a camera and a specific motion must be detected, so it can be used only where there is sufficient light and only within the range covered by the camera.
In addition, in the case of speech recognition-based control technology, the user transmits commands to the smart device in natural language, so its use is limited in circumstances such as noisy environments and public places.
In addition, in the case of sensor-based control technology, products that replace the mouse function by sensing the movement of the user's finger have been introduced, but their sensing accuracy is low and handwriting input is impossible.
In addition, since smart devices have recently come to use different input methods, the incompatibility of interfaces between smart devices increases the burden of UI (User Interface) learning on the user.
To solve these problems, the information input system and input method using finger movement of the present invention provide a motion input unit that integrates a mouse function and a keyboard function, and an integrated management unit that uses the finger motion information input through the motion input unit to remotely control the various connected smart devices in a mouse mode, an operation mode, and a handwriting mode.
Japanese Laid-Open Patent Application No. 2005-0047329 ("Information Input Device and Method Using Finger Movement", hereinafter referred to as Prior Art 1) discloses an information input apparatus and method comprising a signal sensing unit that is attached to a user's fingertip and senses axis information signals generated as the fingertip moves on an arbitrary contact surface, as well as pressure changes caused by pressing, and a signal processing unit, installed separately from the signal sensing unit, that receives and analyzes the signals provided by the signal sensing unit and transmits corresponding input device events to a computing device, so that the control and input functions of the computing device can be used more conveniently in a virtual space or wearable computing environment.
SUMMARY OF THE INVENTION: The present invention has been made to solve the problems of the related art described above, and an object of the present invention is to provide an information input system and input method using finger movement in which a motion input unit recognizes the motion of a finger and an integrated management unit remotely controls the various connected smart devices in a mouse mode, an operation mode, and a handwriting mode.
The information input system using finger movement according to an exemplary embodiment of the present invention includes a motion input unit 100 that can be worn on a finger of a user, senses finger motion information of the user, and transmits the finger motion information to an integrated management unit 200; and the integrated management unit 200, which analyzes the finger motion information received from the motion input unit 100, determines a corresponding mode among a mouse mode, an operation mode, and a handwriting mode, and remotely controls a smart device 300 according to the determined mode.
In this case, the integrated management unit 200, before performing remote control of the smart device 300, analyzes the finger motion information input from the motion input unit 100 to calculate a finger motion maximum movement distance and a finger motion average movement distance, sets an initial sensitivity reference value, and determines valid finger motion information based on the initial sensitivity reference value.
The motion input unit 100 may include a network management unit 110 for managing a wireless network connection with the integrated management unit 200; a motion sensor unit 120 including a three-axis acceleration sensor and a three-axis gyro sensor, which senses the user's finger movement through the three-axis acceleration sensor and the three-axis gyro sensor; and a power supply unit 130.
Here, the integrated management unit 200 may include a signal preprocessing unit 210 that receives the first finger motion information from the motion input unit 100 and recognizes the motion input unit 100 worn by the user; a sensitivity setting unit 220 that sets the initial sensitivity reference value when the signal preprocessing unit 210 recognizes the motion input unit 100; a valid motion determiner 230 that calculates a finger motion velocity and a finger motion distance from the finger motion information received from the motion input unit 100 and determines valid motion information by comparing them with the initial sensitivity reference value of the sensitivity setting unit 220; a central controller 240 that determines whether the valid motion information corresponds to the mouse mode, the operation mode, or the handwriting mode and generates a control signal for each mode; and a mouse mode control unit 250, an operation mode control unit 260, and a handwriting mode control unit 270 that transmit control signals corresponding to the valid motion information to the smart device 300 in the respective modes under the control of the central controller 240.
The mouse mode control unit 250 receives the valid motion information of the mouse mode and remotely controls mouse cursor information of the smart device 300 by analyzing the finger movement direction, or remotely controls mouse click information of the smart device 300 by analyzing the finger operation.
The operation mode control unit 260 receives the valid motion information of the operation mode, compares previously stored motion information with the valid motion information, and remotely controls the smart device 300 with control information corresponding to the matched motion information.
The handwriting mode control unit 270 receives the valid motion information of the handwriting mode, analyzes the valid motion information and performs a predetermined decision tree algorithm to calculate character input information, and remotely controls the smart device 300 with control information corresponding to the character information matched by comparing previously stored character information with the character input information.
In addition, the central controller 240 compares the finger motion information received from the motion input unit 100 with previously stored mode switching motion information and, when a mode change is determined, controls the mouse mode control unit 250, the operation mode control unit 260, and the handwriting mode control unit 270 to perform the mode switching.
A method of inputting information using finger movement according to an embodiment of the present invention, which inputs information using finger motion information of a user received through a motion input unit wearable on the user's finger, includes: a signal preprocessing step (S100) in which an integrated management unit receives first finger motion information from the motion input unit and recognizes the motion input unit worn by the user; an initial sensitivity setting step (S200) in which the integrated management unit analyzes the finger motion information received from the motion input unit and calculates a finger motion maximum movement distance and a finger motion average movement distance to set an initial sensitivity reference value; a valid motion information determination step (S300) in which the integrated management unit analyzes the finger motion information received from the motion input unit to calculate a finger motion velocity and a finger motion distance, and determines valid motion information among the finger motion information by comparing them with the initial sensitivity reference value; a mode determination step (S400) of determining whether the valid motion information determined in the valid motion information determination step (S300) corresponds to a mouse mode, an operation mode, or a handwriting mode; and a remote control step (S500) in which the integrated management unit remotely controls a smart device by transmitting a control signal corresponding to the valid motion information to the smart device according to the mode determined in the mode determination step (S400).
In this case, when the valid motion information is determined to be the mouse mode in the mode determination step (S400), the remote control step (S500) remotely controls mouse cursor information of the smart device by analyzing the finger movement direction of the valid motion information, or remotely controls mouse click information of the smart device by analyzing the finger operation.
When the valid motion information is determined to be the operation mode in the mode determination step (S400), previously stored motion information is compared with the valid motion information, and the smart device is remotely controlled with control information corresponding to the matched motion information.
When the valid motion information is determined to be the handwriting mode in the mode determination step (S400), the valid motion information is analyzed and a predetermined decision tree algorithm is performed to calculate character input information; previously stored character information is compared with the character input information, and the smart device is remotely controlled with control information corresponding to the matched character information.
In addition, in the mode determination step (S400), the finger motion information received from the motion input unit is compared with mode switching motion information stored in advance in the integrated management unit, and mode switching is performed by determining whether the mode should be changed.
According to an embodiment of the present invention, the information input system using finger movement and its input method provide a motion input unit that integrates a mouse function and a keyboard function, so that the various smart devices connected to the integrated management unit can easily be remotely controlled and supplied with input information in the mouse mode, the operation mode, and the handwriting mode according to the user's finger motion information.
In addition, the integrated management unit requests initial inputs on each of the xy, yz, and zx planes through the motion input unit prior to remote control of the smart device, thereby acquiring the user's range of device movement in three-dimensional space and enabling the sensitivity of the motion input unit to be customized for each user.
In addition, when the motion input unit is operated in the handwriting mode, the input motion information is applied to a decision tree algorithm so that characters can be recognized quickly.
FIG. 1 is a block diagram illustrating an information input system using finger movement according to an embodiment of the present invention.
FIG. 2 is a flowchart illustrating an information input method using finger movement according to an exemplary embodiment of the present invention.
FIG. 3 is a diagram illustrating the relationship between the planes used for setting the sensitivity of the motion input unit in the information input system using finger movement and the input method thereof according to an embodiment of the present invention.
FIG. 4 is a diagram illustrating an example of a motion signal in the handwriting mode through the motion input unit in the information input system using finger movement and the input method thereof according to an embodiment of the present invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS: Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings. The following drawings are provided by way of example so that those skilled in the art can fully understand the spirit of the present invention; therefore, the present invention is not limited to the drawings presented below and may be embodied in other forms. Like reference numerals designate like elements throughout the specification.
Unless otherwise defined, technical and scientific terms used herein have the same meanings as commonly understood by those of ordinary skill in the art to which this invention belongs. In the following description and the accompanying drawings, descriptions of known functions and configurations that may unnecessarily obscure the gist of the present invention are omitted.
In addition, a system refers to a collection of components, including devices, mechanisms, and means that are organized and regularly interact to perform the required function.
FIG. 1 is a diagram illustrating an information input system using finger movement according to an embodiment of the present invention. The information input system using finger movement of the present invention will be described in detail with reference to FIG. 1.
The information input system using finger movement according to an embodiment of the present invention uses a motion input unit that integrates a mouse function and a keyboard function, controls various smart devices according to the user's finger motion information, and can easily be switched between a mouse mode, an operation mode, and a handwriting mode using the finger motion information input through the motion input unit. As shown in FIG. 1, the system includes a motion input unit 100, an integrated management unit 200, and a smart device 300.
Each configuration will now be described in more detail.
As shown in FIG. 1, the motion input unit 100 can be worn on a finger of the user, senses the finger motion information of the user, and transmits the finger motion information to the integrated management unit 200.
The motion input unit 100 may include a network management unit 110, a motion sensor unit 120, and a power supply unit 130.
The network management unit 110 manages the wireless network connection with the integrated management unit 200, and the power supply unit 130 supplies power to the motion input unit 100.
The motion sensor unit 120 includes a three-axis acceleration sensor and a three-axis gyro sensor and senses the movement of the user's finger through them.
Specifically, the three-axis acceleration sensor can measure the acceleration along the X, Y, and Z axes at regular intervals according to the movement of the finger of the user wearing the motion input unit 100.
The three-axis gyro sensor can measure the angular velocities about the X, Y, and Z axes at regular intervals according to the movement of the finger of the user wearing the motion input unit 100.
In other words, the motion sensor unit 120 senses the user's movement in three-dimensional space using the three-axis acceleration sensor and the three-axis gyro sensor, and thereby generates the finger motion information.
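For illustration only (this is a sketch, not part of the claimed configuration), the six-axis samples produced by the motion sensor unit 120 could be collected roughly as follows; read_accel and read_gyro are hypothetical driver functions standing in for the actual sensor hardware.

```python
# Illustrative sketch only. read_accel() and read_gyro() are hypothetical
# driver functions returning (x, y, z) tuples; they are not defined by the patent.
from dataclasses import dataclass
import time


@dataclass
class MotionSample:
    t: float    # timestamp in seconds
    ax: float   # acceleration along X (m/s^2)
    ay: float   # acceleration along Y
    az: float   # acceleration along Z
    gx: float   # angular velocity about X (deg/s)
    gy: float   # angular velocity about Y
    gz: float   # angular velocity about Z


def sample_motion(read_accel, read_gyro, period_s=0.01, n=100):
    """Collect n six-axis samples at a regular interval (period_s)."""
    samples = []
    for _ in range(n):
        ax, ay, az = read_accel()
        gx, gy, gz = read_gyro()
        samples.append(MotionSample(time.time(), ax, ay, az, gx, gy, gz))
        time.sleep(period_s)
    return samples
```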
The integrated management unit 200 analyzes the finger motion information received from the motion input unit 100, determines the corresponding mode among the mouse mode, the operation mode, and the handwriting mode, and remotely controls the smart device 300 accordingly.
The integrated management unit 200 also sets an initial sensitivity reference value before performing remote control of the smart device 300, so that the finger motion information can be interpreted in a way that is customized to the user.
As shown in FIG. 1, the integrated management unit 200 may include a signal preprocessing unit 210, a sensitivity setting unit 220, a valid motion determiner 230, a central controller 240, a mouse mode control unit 250, an operation mode control unit 260, and a handwriting mode control unit 270.
The signal preprocessing unit 210 receives the first finger motion information from the motion input unit 100 and recognizes the motion input unit 100 worn by the user.
The sensitivity setting unit 220 sets the initial sensitivity reference value when the signal preprocessing unit 210 recognizes the motion input unit 100.
In detail, the sensitivity setting unit 220 analyzes the finger motion information input from the motion input unit 100, calculates the finger motion maximum movement distance and the finger motion average movement distance, and sets the initial sensitivity reference value.
As shown in FIG. 3, the sensitivity setting unit 220 may request initial inputs on each of the xy, yz, and zx planes through the motion input unit 100, thereby acquiring the user's range of movement in three-dimensional space and customizing the sensitivity of the motion input unit 100 for each user.
The valid motion determiner 230 calculates the finger motion velocity and the finger motion distance using the finger motion information received from the motion input unit 100, and determines valid motion information among the finger motion information by comparing the calculated values with the initial sensitivity reference value of the sensitivity setting unit 220.
That is, the valid motion determiner 230 determines whether the received finger motion information corresponds to a movement intended by the user by comparing the calculated finger motion velocity and finger motion distance with the initial sensitivity reference value.
If the finger motion information is within the initial sensitivity reference value, it can be determined that the finger motion information is a valid motion.
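As a rough illustration of the calibration and validity check described above (and not the patent's exact formulas), the sketch below combines the maximum and average movement distances into a single reference value and accepts a movement as valid when its distance lies within that value. The weighting k, the noise floor min_ratio, and the use of |a|·dt² as a displacement proxy are assumptions.

```python
# Illustrative sketch only; the weighting, noise floor and displacement proxy
# are assumptions. `samples` is any sequence of objects with t/ax/ay/az fields
# (e.g. the MotionSample sketch above).
import math


def movement_distances(samples):
    """Per-step displacement magnitudes, using |a| * dt^2 as a simple proxy."""
    dists = []
    for prev, cur in zip(samples, samples[1:]):
        dt = cur.t - prev.t
        a = math.sqrt(cur.ax ** 2 + cur.ay ** 2 + cur.az ** 2)
        dists.append(a * dt * dt)
    return dists


def initial_sensitivity_reference(calibration_samples, k=0.5):
    """Combine the maximum and the average movement distance into one reference
    value; the weighting k is assumed, the text only states that both are used."""
    d = movement_distances(calibration_samples)
    return k * max(d) + (1 - k) * (sum(d) / len(d))


def is_valid_motion(samples, reference, min_ratio=0.1):
    """A movement counts as valid when its total distance falls within the
    calibrated reference value but above a small assumed noise floor."""
    total = sum(movement_distances(samples))
    return min_ratio * reference < total <= reference
```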
The central controller 240 determines whether the valid motion information determined by the valid motion determiner 230 corresponds to the mouse mode, the operation mode, or the handwriting mode, and generates a control signal for each mode.
In this case, the determination of the mouse mode, the operation mode, or the handwriting mode may be performed by using the mode determination motion information stored in advance in the central controller 240.
Accordingly, the central controller 240 compares the finger motion information received from the motion input unit 100 with the previously stored mode switching motion information and, when a mode change is determined, controls the mouse mode control unit 250, the operation mode control unit 260, and the handwriting mode control unit 270 to perform the mode switching.
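A minimal sketch of this mode-switching behaviour is shown below; the template dictionary, the match() similarity test, and the default mode are assumptions, since the text only states that incoming finger motion is compared with pre-stored mode-switching motion information.

```python
# Illustrative sketch only; match() is an assumed similarity test supplied by
# the caller, and "mouse" as the default mode is an assumption.
class CentralControllerSketch:
    def __init__(self, mode_switch_templates, match):
        self.templates = mode_switch_templates   # {"mouse"|"operation"|"handwriting": template}
        self.match = match                       # match(motion, template) -> bool
        self.mode = "mouse"                      # assumed default mode

    def update(self, valid_motion):
        """Switch mode when the valid motion matches a stored mode-switch gesture."""
        for mode, template in self.templates.items():
            if self.match(valid_motion, template):
                self.mode = mode
                return self.mode
        return None                              # no switch: stay in the current mode
```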
Each mode control unit will now be described in more detail.
The mouse mode control unit 250 receives the valid motion information of the mouse mode under the control of the central controller 240 and transmits a control signal corresponding to the valid motion information to the smart device 300.
That is, the mouse mode control unit 250 remotely controls the mouse cursor information of the smart device 300 by analyzing the finger movement direction of the valid motion information, or remotely controls the mouse click information of the smart device 300 by analyzing the finger operation.
The user can thereby remotely control the left click and right click of the mouse on the smart device 300 through predetermined finger operations.
Here, according to the initial sensitivity reference value set by the sensitivity setting unit 220, the movement of the mouse cursor can be adjusted to suit the sensitivity of each user.
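For illustration, the finger displacement along the x and y axes might be mapped to a cursor delta scaled by the calibrated reference value, with a sharp z-axis spike treated as a click; the screen_gain and tap_threshold constants and the displacement proxy are assumptions, not values given in this description.

```python
# Illustrative sketch only; gain and threshold values are assumptions.
def mouse_delta(samples, reference, screen_gain=800.0):
    """Map the net finger displacement on the x/y axes to a cursor delta,
    scaled by the calibrated reference value so that sensitivity is per-user."""
    dx = sum(s.ax * (s.t - p.t) ** 2 for p, s in zip(samples, samples[1:]))
    dy = sum(s.ay * (s.t - p.t) ** 2 for p, s in zip(samples, samples[1:]))
    scale = screen_gain / reference
    return dx * scale, dy * scale


def detect_click(samples, tap_threshold=15.0):
    """Treat a short, sharp z-axis acceleration spike as a click gesture."""
    return any(abs(s.az) > tap_threshold for s in samples)
```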
The operation mode control unit 260 receives the valid motion information of the operation mode under the control of the central controller 240 and transmits a control signal corresponding to the valid motion information to the smart device 300.
That is, the operation mode control unit 260 compares previously stored motion information with the valid motion information and remotely controls the smart device 300 with control information corresponding to the matched motion information.
When the user performs a gesture corresponding to the stored motion information through the motion input unit 100, the matched control information is transmitted to the smart device 300 and the function mapped to that gesture is executed.
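The comparison of stored motion information with the valid motion information could, for example, be a nearest-template match over feature vectors, as sketched below; the feature representation and the distance threshold are assumptions not specified in the text.

```python
# Illustrative sketch only; feature extraction and max_distance are assumptions.
def nearest_gesture(motion_features, stored_gestures, max_distance=1.0):
    """Return the control command of the stored gesture whose feature vector is
    closest to the input motion, or None if nothing is close enough.
    stored_gestures is a list of (feature_vector, command) pairs."""
    best_cmd, best_dist = None, float("inf")
    for template, command in stored_gestures:
        dist = sum((a - b) ** 2 for a, b in zip(motion_features, template)) ** 0.5
        if dist < best_dist:
            best_cmd, best_dist = command, dist
    return best_cmd if best_dist <= max_distance else None
```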
The handwriting mode control unit 270 receives the valid motion information of the handwriting mode under the control of the central controller 240 and transmits a control signal corresponding to the valid motion information to the smart device 300.
In other words, the handwriting mode control unit 270 analyzes the valid motion information and performs a predetermined decision tree algorithm to calculate character input information, and transmits the calculated character input information to the smart device 300 so that it is output.
The handwriting mode control unit 270 then compares previously stored character information with the character input information and remotely controls the smart device 300 with the control information corresponding to the matched character information.
Specifically, the handwriting mode control unit 270 converts the valid motion information into stroke information according to the progression direction of the finger movement.
Also, when consecutive strokes progress continuously in the clockwise or counterclockwise direction, and the distance calculated between the starting point and the end point of the finger movement is not greater than a predetermined constant, the strokes can be regarded as forming a loop.
At this time, the handwriting mode control unit 270 can classify characters into character groups according to these stroke and loop characteristics, as shown in Table 1 below.
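The stroke and loop analysis above might look roughly like the following sketch; the eight-direction quantisation and the closure_constant value are assumptions (the text speaks only of the direction of progression and a "predetermined constant").

```python
# Illustrative sketch only; n_directions and closure_constant are assumptions.
import math


def direction_codes(points, n_directions=8):
    """Quantise successive stroke segments of (x, y) points into direction codes."""
    codes = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        angle = math.atan2(y1 - y0, x1 - x0) % (2 * math.pi)
        codes.append(int(round(angle / (2 * math.pi / n_directions))) % n_directions)
    return codes


def forms_loop(points, closure_constant=0.15):
    """A stroke that turns consistently clockwise (or counter-clockwise) and whose
    start and end points are closer than the predetermined constant is a loop."""
    turns = []
    for (x0, y0), (x1, y1), (x2, y2) in zip(points, points[1:], points[2:]):
        turns.append((x1 - x0) * (y2 - y1) - (y1 - y0) * (x2 - x1))
    consistent = all(t >= 0 for t in turns) or all(t <= 0 for t in turns)
    closure = math.hypot(points[-1][0] - points[0][0], points[-1][1] - points[0][1])
    return consistent and closure <= closure_constant
```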
In addition, the characters belonging to each character group in the following Table 1 are as shown in FIG. 4.
The handwriting mode control unit 270 applies the decision tree algorithm to the stroke information to recognize the character group to which the input character belongs.
The handwriting mode control unit 270 stores the character information belonging to each character group in advance and compares it with the calculated character input information.
In this case, the handwriting mode control unit 270 may further use histogram characteristics of the valid motion information to distinguish characters within a character group.
In addition, the handwriting mode control unit 270 obtains the motion coordinates of the valid motion information on a virtual plane and calculates histogram characteristics from them.
The virtual plane is composed of two-dimensional X and Y axes. Vertical and horizontal histograms of the motion coordinates are obtained with respect to the X and Y axes, and a radial histogram is obtained while rotating from the midpoint of the motion coordinates toward the edges of the coordinates. Since the sensor data of the motion input unit are discrete, the histogram values are obtained by supplementing the coordinates between adjacent points.
Using these histogram characteristics, one final piece of character input information is calculated within a single character group.
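A sketch of such histogram characteristics is given below; the bin count and the linear interpolation used to supplement coordinates between the discrete sensor points are assumptions.

```python
# Illustrative sketch only; the number of bins and interpolation steps are assumptions.
import math


def interpolate(points, steps=4):
    """Supplement coordinates between consecutive discrete sensor points."""
    dense = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        for i in range(steps):
            t = i / steps
            dense.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    dense.append(points[-1])
    return dense


def histogram_features(points, bins=8):
    """Vertical and horizontal histograms over the X and Y axes plus a radial
    histogram taken around the midpoint of the motion coordinates."""
    pts = interpolate(points)
    xs, ys = [p[0] for p in pts], [p[1] for p in pts]

    def hist(values, lo, hi):
        h = [0] * bins
        span = (hi - lo) or 1.0
        for v in values:
            h[min(bins - 1, int((v - lo) / span * bins))] += 1
        return h

    cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)
    angles = [math.atan2(y - cy, x - cx) % (2 * math.pi) for x, y in pts]
    return {
        "horizontal": hist(xs, min(xs), max(xs)),
        "vertical": hist(ys, min(ys), max(ys)),
        "radial": hist(angles, 0.0, 2 * math.pi),
    }
```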
However, when the character input information is recognized as belonging to two or more character groups, candidate character input information is calculated in each of those character groups.
In this case, the decision tree algorithm is performed again.
In other words, when the character input information calculated by analyzing the valid motion information is not confined to one character group but falls into two or more character groups, a first decision tree algorithm is performed for each character group, and the final character input information is then calculated by a second decision tree algorithm.
In detail, the first decision tree algorithm first determines the direction of the progression of successive strokes of the input valid motion information, and the second decision tree algorithm can use the histogram characteristic first.
That is, the second decision tree algorithm calculates the final character input information by using the character group recognition result, which is the result of the first decision tree algorithm, and the histogram characteristic of the inputted valid motion information.
For example, when the user inputs the character 'e' through finger motion information, the character 'e' may be recognized in two character groups: 'a character whose handwriting starts with a loop' and 'a character consisting of a single loop as a whole'.
Accordingly, the decision tree algorithm is performed independently in each character group (the first decision tree algorithm), and the second decision tree algorithm is then performed on the results of each character group using the histogram characteristics, so that the final character input information, that is, the character 'e', is recognized.
In addition, when the user inputs the character 's' through finger motion information, the character 's' is generally a 'loop-free character' as shown in Table 1, but depending on the user's handwriting it may be recognized as a 'looped character'.
In order to correct such recognition errors, the information input system using finger movement of the present invention performs the decision tree algorithm in two stages, recognizing the character group first and then determining the final character within the candidate groups using the histogram characteristics.
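The two-stage recognition described above could be organised as in the sketch below; the predicate-based character groups, the stored prototype features, and the use of the radial histogram for the final comparison are assumptions about details left open here.

```python
# Illustrative sketch only; character_groups, group_prototypes and the choice of
# the radial histogram as comparison feature are assumptions.
def first_decision_tree(stroke_codes, has_loop, character_groups):
    """Stage 1: select candidate character groups from the stroke progression and
    loop characteristics. character_groups maps a group name to a predicate over
    (stroke_codes, has_loop)."""
    return [name for name, predicate in character_groups.items()
            if predicate(stroke_codes, has_loop)]


def second_decision_tree(candidate_groups, features, group_prototypes):
    """Stage 2: compare the histogram features against the stored per-group
    character prototypes and return the closest character."""
    best_char, best_dist = None, float("inf")
    for group in candidate_groups:
        for char, proto in group_prototypes.get(group, {}).items():
            dist = sum((a - b) ** 2
                       for a, b in zip(features["radial"], proto)) ** 0.5
            if dist < best_dist:
                best_char, best_dist = char, dist
    return best_char


def recognize_character(stroke_codes, has_loop, features,
                        character_groups, group_prototypes):
    """Full pipeline: group recognition first, then final character selection using
    the histogram features (this also covers ambiguous cases such as 'e' or 's'
    that fall into more than one character group)."""
    groups = first_decision_tree(stroke_codes, has_loop, character_groups)
    return second_decision_tree(groups, features, group_prototypes)
```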
Since the information input system using finger movement of the present invention can remotely control the various smart devices 300 connected to the integrated management unit 200 in the mouse mode, the operation mode, and the handwriting mode, the user can control several devices with a single motion input unit 100.
For this, the smart device 300 is connected to the integrated management unit 200 and receives the control signals transmitted by the integrated management unit 200.
The smart device 300 then performs the operation corresponding to the received control signal.
FIG. 2 is a flowchart illustrating an information input method using finger movement according to an embodiment of the present invention. The information input method using finger movement of the present invention will be described in detail with reference to FIG. 2.
The information input method using finger movement of the present invention inputs information using the finger motion information of a user received from a motion input unit that can be worn on the user's finger.
The information input method using finger movement of the present invention includes a signal preprocessing step (S100), an initial sensitivity setting step (S200), a valid motion information determination step (S300), a mode determination step (S400), and a remote control step (S500).
Each step will now be described in more detail.
In the signal preprocessing step (S100), the signal preprocessing unit 210 of the integrated management unit 200 receives the first finger motion information from the motion input unit 100 and recognizes the motion input unit 100 worn by the user.
In this case, the finger motion information refers to the sensor data of all six axes sensed by the three-axis acceleration sensor and the three-axis gyro sensor included in the motion sensor unit 120 of the motion input unit 100.
In the initial sensitivity setting step (S200), the sensitivity setting unit 220 of the integrated management unit 200 analyzes the finger motion information received from the motion input unit 100 and calculates the finger motion maximum movement distance and the finger motion average movement distance to set the initial sensitivity reference value.
In detail, initial inputs may be requested on each of the xy, yz, and zx planes through the motion input unit 100, as shown in FIG. 3, so that the user's range of movement in three-dimensional space is acquired and the sensitivity is customized for each user.
In the valid motion information determination step (S300), the valid motion determiner 230 of the integrated management unit 200 analyzes the finger motion information received from the motion input unit 100 to calculate the finger motion velocity and the finger motion distance, and determines valid motion information among the finger motion information by comparing the calculated values with the initial sensitivity reference value set in the initial sensitivity setting step (S200).
That is, whether the finger motion information corresponds to a movement intended by the user is determined by comparing the calculated finger motion velocity and finger motion distance with the initial sensitivity reference value.
If the finger motion information is within the initial sensitivity reference value, it can be determined that the finger motion information is a valid motion.
The mode determination step (S400) is a step in which the central controller 240 of the integrated management unit 200 determines whether the valid motion information determined in the valid motion information determination step (S300) corresponds to the mouse mode, the operation mode, or the handwriting mode.
In this case, the determination of the mouse mode, the operation mode, or the handwriting mode may be performed by using the mode determination motion information stored in advance in the central controller 240.
Accordingly, the user can easily switch between the mouse mode, the operation mode, and the handwriting mode by performing the corresponding mode switching motion through the motion input unit 100.
In addition, in the mode determination step (S400), the finger motion information received from the motion input unit 100 is compared with the mode switching motion information stored in advance in the integrated management unit 200, and mode switching is performed by determining whether the mode should be changed.
In the remote control step (S500), the integrated management unit 200 remotely controls the smart device 300 by transmitting a control signal corresponding to the valid motion information to the smart device 300 according to the mode determined in the mode determination step (S400).
The control signal is transmitted to the smart device 300 by the mouse mode control unit 250, the operation mode control unit 260, or the handwriting mode control unit 270 under the control of the central controller 240.
More specifically, when the valid motion information is determined to be the mouse mode in the mode determination step (S400), the remote control step (S500) controls the mouse mode control unit 250 to analyze the finger movement direction of the valid motion information and remotely control the mouse cursor information of the smart device 300.
The mouse click information of the smart device 300 can also be remotely controlled by analyzing the finger operation.
If it is determined in the mode determination step (S400) that the valid motion information is the operation mode, the remote control step (S500) controls the operation mode control unit 260 to compare previously stored motion information with the valid motion information and to remotely control the smart device 300 with control information corresponding to the matched motion information.
If it is determined in the mode determination step (S400) that the valid motion information is the handwriting mode, the remote control step (S500) controls the handwriting mode control unit 270 to analyze the valid motion information, perform the predetermined decision tree algorithm to calculate character input information, and transmit the calculated character input information to the smart device 300 for output.
The character information stored in advance in the handwriting mode control unit 270 is then compared with the calculated character input information, and the smart device 300 is remotely controlled with the control information corresponding to the matched character information.
In other words, the information input system and input method using finger movement according to an embodiment of the present invention include a motion input unit 100 that integrates a mouse function and a keyboard function, so that the various smart devices 300 connected to the integrated management unit 200 can easily be remotely controlled and supplied with input information in the mouse mode, the operation mode, and the handwriting mode according to the user's finger motion information.
As described above, the present invention has been described with reference to specific embodiments including particular components. However, the present invention is not limited to the above-described embodiments, and various modifications and changes may be made by those skilled in the art to which the present invention pertains.
Accordingly, the spirit of the present invention should not be construed as being limited to the described embodiments, and not only the following claims but also all equivalents and modifications thereof fall within the scope of the present invention.
100: motion input unit
110: network management unit 120: motion sensor unit
130: power supply unit
200: integrated management unit
210: signal preprocessing unit 220: sensitivity setting unit
230: valid motion determiner 240: central controller
250: mouse mode control unit 260: operation mode control unit
270: handwriting mode control unit
300: Smart device
Claims (12)
1. An information input system using finger movement, comprising:
a motion input unit 100 that can be worn on a finger of a user, senses finger motion information of the user, and transmits the finger motion information to an integrated management unit 200; and
an integrated management unit 200 that analyzes the finger motion information received from the motion input unit 100, determines a corresponding mode among a mouse mode, an operation mode, and a handwriting mode, and remotely controls a smart device 300 according to the determined mode,
wherein the integrated management unit 200, before performing remote control of the smart device 300, analyzes the finger motion information input from the motion input unit 100 to calculate a finger motion maximum movement distance and a finger motion average movement distance, sets an initial sensitivity reference value, and determines the finger motion information based on the initial sensitivity reference value.
2. The information input system of claim 1, wherein the motion input unit 100 comprises:
a network management unit 110 for managing a wireless network connection with the integrated management unit 200;
a motion sensor unit 120 including a three-axis acceleration sensor and a three-axis gyro sensor, and sensing the user's finger movement through the three-axis acceleration sensor and the three-axis gyro sensor; and
a power supply unit 130.
3. The information input system of claim 1, wherein the integrated management unit 200 comprises:
a signal preprocessing unit 210 receiving the first finger motion information from the motion input unit 100 and recognizing the motion input unit 100 worn by the user;
a sensitivity setting unit 220 for setting the initial sensitivity reference value when the signal preprocessing unit 210 recognizes the motion input unit 100;
a valid motion determiner 230 for calculating a finger motion velocity and a finger motion distance using the finger motion information received from the motion input unit 100 and determining valid motion information among the finger motion information by comparing the calculated finger motion velocity and finger motion distance with the initial sensitivity reference value of the sensitivity setting unit 220;
a central controller 240 for determining whether the finger motion information corresponds to the mouse mode, the operation mode, or the handwriting mode using the valid motion information determined by the valid motion determiner 230 and generating a control signal for each mode;
a mouse mode control unit 250 for transmitting a control signal corresponding to the valid motion information to the smart device 300 in the mouse mode under the control of the central controller 240;
an operation mode control unit 260 for transmitting a control signal corresponding to the valid motion information to the smart device 300 in the operation mode under the control of the central controller 240; and
a handwriting mode control unit 270 for transmitting a control signal corresponding to the valid motion information to the smart device 300 in the handwriting mode under the control of the central controller 240.
4. The information input system of claim 3, wherein the mouse mode control unit 250 receives the valid motion information of the mouse mode, and remotely controls mouse cursor information of the smart device 300 by analyzing the finger movement direction or remotely controls mouse click information of the smart device 300 by analyzing the finger operation.
5. The information input system of claim 3, wherein the operation mode control unit 260 receives the valid motion information of the operation mode, compares previously stored motion information with the valid motion information, and remotely controls the smart device 300 with control information corresponding to the matched motion information.
6. The information input system of claim 3, wherein the handwriting mode control unit 270 receives the valid motion information of the handwriting mode, analyzes the valid motion information and performs a predetermined decision tree algorithm to calculate character input information, transmits the calculated character input information to the smart device 300 for output, and remotely controls the smart device 300 with control information corresponding to the character information matched by comparing previously stored character information with the character input information.
7. The information input system of claim 3, wherein the central controller 240 compares the finger motion information received from the motion input unit 100 with previously stored mode switching motion information and, when a mode change is determined as a result of the comparison, controls the mouse mode control unit 250, the operation mode control unit 260, and the handwriting mode control unit 270 to perform the mode switching.
8. A method of inputting information using finger movement, which inputs information using finger motion information of a user received through a motion input unit wearable on the user's finger, the method comprising:
a signal preprocessing step (S100) in which an integrated management unit receives first finger motion information from the motion input unit and recognizes the motion input unit worn by the user;
an initial sensitivity setting step (S200) in which the integrated management unit analyzes the finger motion information received from the motion input unit and calculates a finger motion maximum movement distance and a finger motion average movement distance to set an initial sensitivity reference value;
a valid motion information determination step (S300) in which the integrated management unit analyzes the finger motion information received from the motion input unit to calculate a finger motion velocity and a finger motion distance, and determines valid motion information among the finger motion information by comparing the calculated finger motion velocity and finger motion distance with the initial sensitivity reference value set in the initial sensitivity setting step (S200);
a mode determination step (S400) of determining whether the valid motion information determined in the valid motion information determination step (S300) corresponds to a mouse mode, an operation mode, or a handwriting mode; and
a remote control step (S500) in which the integrated management unit remotely controls a smart device by transmitting a control signal corresponding to the valid motion information to the smart device according to the mode determined in the mode determination step (S400).
9. The method of claim 8, wherein, when the valid motion information is determined to be the mouse mode in the mode determination step (S400), the remote control step (S500) remotely controls mouse cursor information of the smart device by analyzing the finger movement direction of the valid motion information, or remotely controls mouse click information of the smart device by analyzing the finger operation.
10. The method of claim 8, wherein, when the valid motion information is determined to be the operation mode in the mode determination step (S400), the remote control step (S500) compares previously stored motion information with the valid motion information and remotely controls the smart device with control information corresponding to the matched motion information.
11. The method of claim 8, wherein, when the valid motion information is determined to be the handwriting mode in the mode determination step (S400), the remote control step (S500) analyzes the valid motion information and performs a predetermined decision tree algorithm to calculate character input information, transmits the calculated character input information to the smart device for output, compares previously stored character information with the character input information, and remotely controls the smart device with control information corresponding to the matched character information.
12. The method of claim 8, wherein, in the mode determination step (S400), the finger motion information received from the motion input unit is compared with mode switching motion information stored in advance in the integrated management unit, and mode switching is performed by determining whether the mode should be changed.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020140044139A KR20150118377A (en) | 2014-04-14 | 2014-04-14 | Information inputting system and method by movements of finger |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20150118377A true KR20150118377A (en) | 2015-10-22 |
Family
ID=54426841
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020140044139A KR20150118377A (en) | 2014-04-14 | 2014-04-14 | Information inputting system and method by movements of finger |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20150118377A (en) |
- 2014-04-14: Application KR1020140044139A (KR) published as KR20150118377A (en); status: not active, Application Discontinuation
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102207510B1 (en) * | 2020-04-30 | 2021-01-27 | (주)콕스스페이스 | Electronic device for controlling host device using motion signals and mouse signals |
WO2022035027A1 (en) * | 2020-04-30 | 2022-02-17 | 주식회사 콕스스페이스 | Electronic device for controlling host device by using motion signal and mouse signal |
US11797112B1 (en) | 2020-04-30 | 2023-10-24 | Cox Space Co., Ltd. | Electronic device for controlling host device by using motion signal and mouse signal |
KR102286018B1 (en) * | 2020-09-09 | 2021-08-05 | 주식회사 피앤씨솔루션 | Wearable augmented reality device that inputs mouse events using hand gesture and method of mouse event input for wearable augmented reality device using hand gesture |
KR20230007109A (en) * | 2021-07-05 | 2023-01-12 | 주식회사 피앤씨솔루션 | Wearable augmented reality device that inputs operation signal using a two-handed gesture and method of operating a wearable augmented reality device using a two-handed gesture |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
E601 | Decision to refuse application |