KR20060069985A - System for wearable general-purpose 3-dimensional input - Google Patents

System for wearable general-purpose 3-dimensional input Download PDF

Info

Publication number
KR20060069985A
Authority
KR
South Korea
Prior art keywords
wearable
means
acceleration
host
user
Prior art date
Application number
KR1020040108592A
Other languages
Korean (ko)
Other versions
KR100674090B1 (en)
Inventor
심재철
조기성
Original Assignee
한국전자통신연구원
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 한국전자통신연구원
Priority to KR1020040108592A
Publication of KR20060069985A
Application granted
Publication of KR100674090B1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06K RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 9/00 Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K 9/00402 Recognising digital ink, i.e. recognising temporal sequences of handwritten position coordinates
    • G06K 9/00409 Preprocessing; Feature extraction
    • G06K 9/00416 Sampling; contour coding; stroke extraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06K RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 9/00 Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K 9/00335 Recognising movements or behaviour, e.g. recognition of gestures, dynamic facial expressions; Lip-reading
    • G06K 9/00355 Recognition of hand or arm movements, e.g. recognition of deaf sign language

Abstract

1. TECHNICAL FIELD OF THE INVENTION
The present invention relates to a wearable general-purpose three-dimensional input system.
2. The technical problem to be solved by the invention
The present invention provides a wearable general-purpose three-dimensional input system in which a single wearable device performs all of the input functions of a remote controller, a mouse, and a pen.
3. Summary of the solution of the invention
The present invention provides a wearable general-purpose three-dimensional input system comprising: wearable input means structured to be worn on a user's body and to transmit the user's movement information; and host means for interpreting the movement information received from the wearable input means and feeding the result back to the user.
4. Important uses of the invention
The present invention is used as a general-purpose input system.
Wearable, input device, accelerometer, mouse, stylus, remote controller

Description

System for Wearable General-Purpose 3-Dimensional Input

FIG. 1 is a block diagram of an embodiment of a wearable general-purpose three-dimensional input system according to the present invention;

FIG. 2 is a detailed configuration diagram of an embodiment of a host connection device according to the present invention;

FIG. 3 is a conceptual diagram showing an example of use of the wearable device according to an embodiment of the present invention;

FIG. 4 is a detailed configuration diagram of an embodiment of the wearable device according to the present invention;

FIG. 5 is an exemplary view of the screen of the display means according to an embodiment of the present invention;

FIG. 6 is a detailed configuration diagram of an embodiment of a host according to the present invention;

FIG. 7 is a detailed configuration diagram of an embodiment of the motion analysis unit of the host according to the present invention;

FIGS. 8A, 8B and 8C are views showing changes in acceleration according to the movement of the wearable device according to an embodiment of the present invention;

FIG. 9 is a view showing the movement of a pen and the movement detected by the wearable device during writing according to an embodiment of the present invention;

FIG. 10 is a view showing the relationship between the wearable device and the writing plane during writing according to a preferred embodiment of the present invention; and

FIG. 11 is a view showing an example of writing with the wearable device worn on a finger according to an exemplary embodiment of the present invention.

* Description of main parts of drawing *

100: wearable device 200: host

300: host connection device 110: network

120: display means 101: acceleration sensor

The present invention relates to a wearable general-purpose three-dimensional input system, and more particularly, to a wearable general-purpose three-dimensional input system in which a single wearable device controls all of the devices that would otherwise be controlled with a remote controller, a mouse, and a pen, and to a control method thereof.

As home appliances and portable electronic devices become more common in homes and offices, various types of input devices are used to control them.

Remote controllers are common for TVs and audio equipment, keyboards and mice for PCs, and touch screens and styluses for PDAs.

Increasingly, a single home has two or three TVs and PCs, and the number of remote controllers, mice, and pens used to control them grows in proportion to the number of devices.

As a result, in order to control a device, the control device corresponding to that device must first be found and operated.

As home networks and the middleware supporting them have become commonplace, information appliances connected to the network can control other devices by sending commands over the network; even so, this does not remove the constraint that a suitable control device must still be located and operated.

Conventional technologies that mitigate this constraint include, for example, a mouse that detects motion with an acceleration sensor; an input device that detects movement by placing an acceleration sensor at the tip of a pen and detects whether the pen is touching a surface with a pressure sensor; and a pen that transmits the motion information detected by an acceleration sensor over Bluetooth.

The pen that transmits motion information over Bluetooth is operated by pressing a button attached to the pen.

Another conventional technique is a pen using a three-axis acceleration sensor and an optical sensor (see US Patent Publication No. US 2002/0148658 A1).

This pen measures the angle between the pen and the plane it contacts using the optical sensor, so that the pen position can be determined more accurately even though an acceleration sensor is used. An additional pressure sensor determines whether the pen is in contact with the plane, and thereby the start and end of each stroke.

Meanwhile, a universal remote controller can control a plurality of devices from a single handset.

However, when a TV, a VTR, and an audio system are used together, the user must press a mode select button on the remote controller each time to choose which device to control, and then control that device individually.

For example, with a TV, VTR, or CD player connected to the audio system, simply watching TV requires a series of actions: select TV mode on the universal remote controller, select a channel, select audio mode on the remote controller, change the audio input to TV, and adjust the volume.

One conventional technology for solving this problem stores, on a storage device connected to the Internet, the user's activities, the information needed to perform the corresponding series of steps (the signals the remote controller must output), and the mapping between them.

When the user selects a desired activity with the remote controller, the stored information is retrieved and the series of steps is performed, and the buttons of the remote controller are restricted to controlling the specific device appropriate to the selected activity.

As another conventional technology, there is an input device that uses an acceleration sensor to sense the vibration caused by the user tapping the device or a nearby body part and converts the detected vibration into a command for a predefined device (see Japanese Patent Laid-Open No. 2003-143683).

However, even with a universal remote controller, these conventional technologies can be used only with systems, such as TVs and audio equipment, that have a predetermined number of functions, each corresponding to a button.

In the case of PCs, which require more complex control, a mouse or a digitizing tablet is used, and a pen is used with portable devices such as PDAs.

It is also known that an inertial sensor such as an acceleration sensor or an angular velocity sensor can be used to monitor a user's motion and thereby implement a three-dimensional mouse, pen, or remote controller. However, each such device performs only one of these functions, so its versatility is poor.

Moreover, a mouse using such an inertial sensor still uses separate buttons like an ordinary mouse, and a pen with an inertial sensor uses a separate button or a pressure sensor to identify the start and end of each stroke.

In addition to acquiring pen movement information with an acceleration sensor, an optical sensor or an angular velocity sensor is often added to recognize the contact angle between the pen and the plane so that the pen movement can be identified more accurately.

Because of these buttons and additional sensors, it is difficult to miniaturize the device, which limits how wearable it can be made.

In addition, the related arts have the problem that, even when an inertial sensor is used, the device can operate only as a single kind of input device: a remote controller, a pen, or a mouse.

The present invention has been proposed to solve the above problems, and an object of the present invention is to provide a wearable general-purpose three-dimensional input system in which a single wearable device performs all of the input functions of a remote controller, a mouse, and a pen.

Other objects and advantages of the present invention can be understood by the following description, and will be more clearly understood by the embodiments of the present invention. Also, it will be readily appreciated that the objects and advantages of the present invention may be realized by the means and combinations thereof indicated in the claims.

According to an aspect of the present invention, there is provided a wearable general-purpose three-dimensional input system comprising: wearable input means structured to be worn on a user's body and to transmit the user's movement information; and host means for interpreting the movement information received from the wearable input means and feeding the result back to the user.

The system may further comprise host connection means for wirelessly connecting the wearable input means and the host means.

The above objects, features and advantages will become more apparent from the following detailed description taken in conjunction with the accompanying drawings, so that those skilled in the art may easily implement the technical idea of the present invention. In describing the present invention, detailed descriptions of known technologies are omitted where they would unnecessarily obscure the gist of the invention. Hereinafter, exemplary embodiments of the present invention are described in detail with reference to the accompanying drawings.

FIG. 1 is a block diagram of an embodiment of a wearable general-purpose three-dimensional input system according to the present invention.

As shown in FIG. 1, the wearable general-purpose three-dimensional input system according to the present invention includes a wearable device 100, a host 200, a host connection device 300 connecting the wearable device 100 and the host 200, and display means 120 connected to the host 200, and may further include a network 110.

The host 200 may be an information device such as a PC, a PDA, or a set-top box. When a device with its own wireless processing capability, such as a PC or PDA, is used as the host, it may communicate directly with the wearable device without a separate host connection device.

The wearable device 100 periodically measures the user's movement and transmits the movement information to the host.

The host 200 interprets the received motion information and converts it into mouse movement, remote-controller button presses, or pen movement.

In addition, the host 200 feeds the movement and state of the wearable device 100 back to the user through the display means 120.

The host may also be connected to the network 110 so that the control information obtained by analyzing the movement of the wearable device can be used to control other devices connected to the network 110.

FIG. 2 is a detailed block diagram of an embodiment of a host connection device according to the present invention.

As shown in FIG. 2, the host connection device 300 includes a wireless processing unit 301 that receives and processes wireless information from the wearable device 100, a host I/F unit 303 for interfacing with the host 200, and a control unit 302 that controls the wireless processing unit 301 and the host I/F unit 303 and relays information between them.

FIG. 3 is a conceptual view showing an example of use of the wearable device according to an embodiment of the present invention.

As shown in FIG. 3, the wearable device 100 according to the present invention is worn on a user's finger in the form of a ring, and can wirelessly perform all of the functions of a remote controller 1, a mouse 2, and a pen 3.

FIG. 4 is a detailed configuration diagram of an embodiment of the wearable device according to the present invention.

As shown in FIG. 4, the wearable device 100 according to the present invention includes an acceleration sensor 101 for detecting the user's motion, a wireless processing unit 103 for transmitting this information to the host device, a control unit 102 for passing the information from the acceleration sensor to the wireless processing unit 103, and an antenna 104 for transmitting and receiving radio waves.

FIG. 5 is an exemplary view of the screen of the display means according to an embodiment of the present invention.

As shown in FIG. 5, the host displays controls such as a dial 804, a button 802, a slide bar 803, and a text entry window 809 on the screen 801 of the display means.

FIG. 6 is a detailed configuration diagram of an embodiment of a host according to the present invention, and FIG. 7 is a detailed configuration diagram of an embodiment of the motion analysis unit of the host according to the present invention.

As shown in FIG. 6, the host includes a motion analyzer 205, a motion recognizer 204, an application / GUI unit 207, and a handwriting recognizer 208.

In addition, as shown in FIG. 7, the motion analysis unit 205 of the host includes a feature pattern detection unit 901, a tilt estimation unit 902, a tilt correction unit 903, a state information management unit 906, and integrators 904, 905, 907 and 908 that integrate the data.

Hereinafter, the functions of these components and the way they interact are described with reference to FIGS. 4 to 7.

In the normal standby state, the control unit 102 of the wearable device 100 continuously monitors the user's motion detected by the acceleration sensor 101. When motion is detected, the device becomes active and the control unit 102 transmits the motion information detected on each axis by the acceleration sensor 101 to the host 200 through the wireless processor 103 and the antenna 104.

The monitoring that switches the wearable device 100 between the standby state and the active state can easily be implemented by comparing the magnitude of the movement output by the acceleration sensor 101 with a predefined threshold value. The magnitude of the motion is obtained as the square root of the sum of the squares of the acceleration on each axis, that is, |a| = sqrt(ax^2 + ay^2 + az^2).

In [Table 1], a preferred embodiment of the present invention described later, the motion that switches the state is defined as a motion whose magnitude exceeds the threshold value several times within a predetermined time.
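As a hedged illustration only (not taken from the patent text), the standby-to-active switching described above might be sketched as follows; the threshold, window length, and required number of crossings are hypothetical values standing in for the user's sensitivity setting.

```python
import math

# Hypothetical parameters; the patent leaves the threshold, the window,
# and the repetition count to the implementation / user sensitivity setting.
THRESHOLD = 1.5          # minimum acceleration magnitude (g) counted as "motion"
WINDOW_SAMPLES = 50      # samples making up the "predetermined time"
REQUIRED_CROSSINGS = 3   # "exceeds the threshold several times"

def magnitude(ax, ay, az):
    """Magnitude of a 3-axis acceleration sample: sqrt(ax^2 + ay^2 + az^2)."""
    return math.sqrt(ax * ax + ay * ay + az * az)

def should_wake(samples):
    """Return True if the recent samples contain enough threshold crossings
    to switch the wearable device from standby to active."""
    recent = samples[-WINDOW_SAMPLES:]
    crossings = sum(1 for (ax, ay, az) in recent if magnitude(ax, ay, az) > THRESHOLD)
    return crossings >= REQUIRED_CROSSINGS

# Example: a burst of side-to-side shakes following a period of rest.
stream = [(0.0, 0.0, 1.0)] * 40 + [(2.0, 0.1, 1.0), (-2.1, 0.0, 1.0)] * 5
print(should_wake(stream))   # True once the shake pattern appears
```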

The three-dimensional motion information received from the wearable device, that is, the motion information on each axis of the wearable device, is input to the motion analyzer 205 and the motion recognizer 204.

The motion analyzer 205 maintains the state information of the wearable device and analyzes its three-dimensional motion information using that state information and the coordinates of the controls on the screen.

If the motion analyzer 205 holds the wearable device in the mouse state, the three-dimensional motion information of the wearable device is interpreted as two-dimensional movement or mouse-button state information and transmitted to the application/GUI unit 207, which moves the position of the mouse pointer 806.

On the other hand, if the character input window 809 is selected and the wearable device changes to the pen state, the application/GUI unit 207 displays the appropriate pointers 807 and 808 on the screen 801, and the motion analyzer transmits the motion information interpreted in two dimensions, together with stroke start and end information, to the handwriting recognition unit 208 so that the text written by the user can be recognized.

The character recognized by the handwriting recognition unit 208 is transferred to the application / GUI unit 207 and displayed in the character input window 809.

The motion recognition unit 204 compares the movement of the wearable device against predefined movements, using predefined motion-command association information and the state information of the wearable device provided by the motion analyzer 205.

When the movement of the wearable device coincides with a predefined movement, the command information of the movement defined in the movement-command association information is transmitted to the application / GUI unit 207.

The handwriting recognition unit 208 and the motion recognition unit 204 operate by various known handwriting recognition algorithms and motion recognition algorithms, and the operation of these algorithms will not be described in detail.

Since different users may want different sensitivities for the various movements, user-set information such as the sensitivity or the mode-change motion set by the user may also be provided to the motion analyzer 205 and used in analyzing the three-dimensional movement of the wearable device.

The acceleration sensor 101 measures the acceleration of the object it is attached to as the object moves and outputs it as an electrical signal. By collecting this continuously output acceleration information, the velocity over time can be obtained by integrating it once (907), and the distance the object moves can be obtained by integrating it twice (908).

In addition, since the acceleration sensor 101 monitors three axes, X, Y and Z, the velocity and moving distance of the object in three-dimensional space can be obtained from the measured acceleration.
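A minimal sketch of this twice-integration step, assuming a fixed sampling period and simple rectangular integration; a real implementation would also have to deal with sensor bias and drift, which the text does not elaborate.

```python
def integrate_motion(accels, dt=0.01):
    """Twice-integrate per-axis acceleration samples (m/s^2) into velocity
    (m/s) and displacement (m) using simple rectangular integration.
    accels: list of (ax, ay, az) tuples sampled every dt seconds."""
    vel = [0.0, 0.0, 0.0]
    pos = [0.0, 0.0, 0.0]
    for sample in accels:
        for axis in range(3):
            vel[axis] += sample[axis] * dt   # first integration: acceleration -> velocity
            pos[axis] += vel[axis] * dt      # second integration: velocity -> displacement
    return vel, pos

# Example: constant 1 m/s^2 along X for 1 second.
samples = [(1.0, 0.0, 0.0)] * 100
v, p = integrate_motion(samples, dt=0.01)
print(v[0], p[0])   # ~1.0 m/s and ~0.5 m (up to discretization error)
```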

When the wearable device 100 operates as a mouse, the speed and the moving distance are transmitted to the application/GUI unit 207 together with the button state information interpreted by the state information management unit 906, so that operations such as moving or clicking the pointer on the screen are performed.

When a wearable device including an acceleration sensor is worn on a finger and moved, the acceleration pattern differs between a finger that moves and stops in free space, a finger that bumps into an object and stops, and a finger that reverses its direction of movement.

In the present invention, the feature pattern detection unit 901 uses the input acceleration and the user's sensitivity setting to determine whether the motion matches a predetermined pattern, and outputs the type of the matched pattern.

Whether a pattern matches can easily be determined with a conventional signal-processing pattern-matching algorithm, and the type of the matched pattern is used to change the state information of the wearable device.

FIGS. 8A, 8B, and 8C are diagrams illustrating the changes in acceleration, detected by the feature pattern detector 901, according to the movement of the wearable device in an exemplary embodiment of the present invention.

The wearable device periodically transmits acceleration information measured by the acceleration sensor to the host.

As shown in FIG. 8B, acceleration that changes rapidly from + to - or from - to + within a certain time, according to the sensitivity set by the user, is interpreted by the host as a reversal of the direction of movement.

As shown in FIG. 8A, if a sudden + or - acceleration appears and then decays without an acceleration in the opposite direction, it is interpreted as the device having been stopped by hitting an object.

As shown in FIG. 8C, when the + or - acceleration appears gradually, it is interpreted as the device starting to move or coming to a stop while moving.
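Purely as an illustration of the three signatures described for FIGS. 8A to 8C, a crude single-axis classifier could look like the sketch below; the peak threshold and time window are hypothetical and would in practice come from the user's sensitivity setting.

```python
def classify_pattern(samples, dt=0.01, peak=1.0, reversal_window=0.1):
    """Crude single-axis classifier for the three signatures of FIGS. 8A-8C.
    samples: acceleration values on one axis, sampled every dt seconds.
    Returns 'reversal', 'hit_object', 'gradual', or 'none'."""
    peaks = [(i, a) for i, a in enumerate(samples) if abs(a) > peak]
    if not peaks:
        # No sharp peak: either a gradual start/stop (FIG. 8C) or no motion at all.
        return "gradual" if max(abs(a) for a in samples) > 0.2 * peak else "none"
    first_i, first_a = peaks[0]
    # FIG. 8B: a strong peak followed, within a short time, by a peak of opposite sign.
    for i, a in peaks:
        if a * first_a < 0 and (i - first_i) * dt <= reversal_window:
            return "reversal"
    # FIG. 8A: a strong peak that simply decays, with no opposite-sign peak.
    return "hit_object"

print(classify_pattern([0, 1.5, -1.4, 0]))             # reversal
print(classify_pattern([0, 1.8, 0.6, 0.1, 0]))         # hit_object
print(classify_pattern([0, 0.2, 0.4, 0.5, 0.4, 0.2]))  # gradual
```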

FIG. 9 is a diagram illustrating the movement of a pen during writing and the movement detected by the wearable device according to an exemplary embodiment of the present invention, and FIG. 10 is a diagram illustrating the relationship between the wearable device and the writing plane during writing.

In conventional devices, a sensor is embedded in the mouse or pen itself, so the movement of the device is observed directly. The wearable device according to the present invention, however, cannot directly observe the movement of the pen during writing.

FIG. 11 is a diagram showing an example of writing with the wearable device worn on a finger according to an exemplary embodiment of the present invention.

As shown in FIG. 11, when the user grips a pen and writes, the information sensed by the acceleration sensor of the wearable device is a movement 504 that is a reduced, shifted version, within the writing plane, of the movement of the pen tip 505.

As shown in FIG. 9, when a wearable device including an acceleration sensor is used as a pen, the way each person holds the pen 603 differs, and the inclination of the plane 601 in which the pen moves differs according to the posture in which the device is worn on the finger.

That is, the X, Y and Z axes 604 and 704 of the planes 605 and 701 of the pen movement are shifted from the X, Y and Z axes 703 of the actual wearable device 100.

In order to track the trajectory of the pen on the plane more accurately, the angle 705 between the wearable device 100 and the plane in which the pen is moving must be known.

To this end, the acceleration on each axis at the start of the stroke, i.e., when the pen or finger touches the plane, is measured, and the angle between the wearable device and the plane in which the pen moves is calculated.

At the moment the pen or finger contacts the plane, an acceleration pattern as shown in FIG. 8A is observed. This acceleration pattern is detected by the feature pattern detector 901 and transmitted to the tilt estimator 902.

In addition, when the acceleration maximum value of each axis observed in the pattern is x, y, z, the vector D = (x, y, z) becomes a vector perpendicular to the plane in which the pen moves.

When the pattern of FIG. 8A is detected, the tilt estimator 902 obtains this vector normal to the plane the finger contacts, and the angle θ formed between this vector and each axis of the device is obtained according to [Equation 1].

[Equation 1]   θ = cos⁻¹( (u · v) / (|u| |v|) ), where u and v are the two vectors whose angle is sought.
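The following sketch assumes [Equation 1] is the standard inner-product relation for the angle between two vectors; the vectors D and z_axis used in the example are hypothetical values, not taken from the patent.

```python
import math

def angle_between(u, v):
    """Angle (radians) between two 3-D vectors via cos(theta) = (u.v)/(|u||v|)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return math.acos(max(-1.0, min(1.0, dot / (norm_u * norm_v))))

# Hypothetical impact vector D (normal to the writing plane) and the device Z axis.
D = (0.2, 0.1, 0.97)
z_axis = (0.0, 0.0, 1.0)
print(math.degrees(angle_between(D, z_axis)))  # tilt of the device Z axis from the plane normal
```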

For example, if the X and Y axes of the wearable device are parallel to the plane in which the pen moves, only the movement of the Z axis will be detected when the pen touches the plane.

In this case, only the movement of the X and Y axes detected by the sensor worn on the finger may be regarded as the pen moving on the plane.

On the other hand, if only motion on the X axis is detected, only motion detected on the Y and Z axes can be regarded as forming a stroke.

If only the movement of one axis is detected in this way, the other two axes are parallel to the plane in which the pen moves, and the movement information from those two axes reflects the pen movement to the full precision of the sensor.

However, if motion is detected on two or more axes at the moment of contact, the acceleration of the device detected by the sensor is distorted by the tilt.

For example, if the angle between the device's X axis and the plane is θ, the acceleration x' measured on the X axis and the acceleration x actually occurring in the plane satisfy x' = x·cos θ.

The tilt correction unit 903 applies this relationship to the acceleration measured by the sensor, using the tilt obtained by the tilt estimation unit 902, to obtain the corrected acceleration.
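A minimal sketch of this correction, assuming the relation x' = x·cos θ stated above (so that the in-plane value is recovered by dividing the measured value by cos θ); the 30-degree tilt in the example is hypothetical.

```python
import math

def correct_axis(measured, tilt_deg):
    """Recover the in-plane acceleration from a measured axis value, assuming
    measured = actual * cos(tilt)  ->  actual = measured / cos(tilt)."""
    return measured / math.cos(math.radians(tilt_deg))

# Hypothetical example: the device X axis is tilted 30 degrees from the writing plane.
print(correct_axis(0.87, 30.0))   # ~1.0, the acceleration actually occurring in the plane
```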

The corrected acceleration is integrated by the integrator 904 and converted into a speed, which is in turn integrated by the integrator 905 and converted into a travel distance.

The corrected speed and movement distance for each axis thus obtained are transmitted to the handwriting recognition unit 208 together with the stroke start / end information determined by the state information management unit 906.

According to the functions of the above-described components, the input method as shown in the following example may be implemented. Each input method defined in the examples below is provided for illustrative purposes, and may be modified according to the user.

<Start of device use>

The user initiates interaction with the host through a predetermined action, such as shaking the wearable device from side to side several times.

When the control unit 102 of the wearable device detects the special condition corresponding to this predefined motion, it regards this as the start of operation of the device, initiates communication with the host, and transmits the three-dimensional motion information output by the sensor to the host.

The host interprets the 3D motion information transmitted from the wearable device as described with reference to FIG. 6.

At this time, the host treats the wearable device as performing the function of a mouse and displays a mouse pointer or similar indicator on the screen of the display means so that the user can see that operation has started.

<Operation as a Mouse>

As the user moves the wearable device, the wearable device transmits motion information of the device in three dimensions sensed by the acceleration sensor to the host.

The host interprets this as corresponding to mouse movement. In particular, moving a finger up and down at high speed can be interpreted as a mouse click.

Repeating this action twice in succession can be interpreted as a double click.

Also, moving the finger up or down and then holding it still can be interpreted as pressing the right mouse button.
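As a hedged illustration of how a host might distinguish click, double click, and right click from these finger motions (not the patent's own implementation), the time windows below are assumptions:

```python
import time

DOUBLE_CLICK_WINDOW = 0.4   # hypothetical: two quick down-up motions within 0.4 s
HOLD_FOR_RIGHT_CLICK = 0.8  # hypothetical: finger held this long -> right button

class ClickInterpreter:
    """Maps detected finger down-up motions to left click, double click,
    or right click, following the behaviour described above."""
    def __init__(self):
        self.last_click_time = None

    def on_finger_down_up(self, held_seconds, now=None):
        now = time.monotonic() if now is None else now
        if held_seconds >= HOLD_FOR_RIGHT_CLICK:
            self.last_click_time = None
            return "right_click"
        if self.last_click_time is not None and now - self.last_click_time <= DOUBLE_CLICK_WINDOW:
            self.last_click_time = None
            return "double_click"
        self.last_click_time = now
        return "click"

interp = ClickInterpreter()
print(interp.on_finger_down_up(0.1, now=0.0))   # click
print(interp.on_finger_down_up(0.1, now=0.2))   # double_click
print(interp.on_finger_down_up(1.0, now=2.0))   # right_click
```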

<Select for control>

When the pointer is over a control in the mouse state, moving a finger up or down can be interpreted as a selection for the control.

Moving the finger left, right, or back and forth after this selection is interpreted as operating a control such as a dial or slide bar, and moving the finger up is interpreted as deselecting the control.

On the other hand, moving the finger up and down over the text entry window (a click) indicates that subsequent actions of the wearable device should be interpreted as pen actions.

<Operation in Pen State>

In the pen state, the user can write by holding a writing instrument in the hand wearing the device, or by using the finger wearing the device itself as the pen.

At this time, the sudden-stop motion that occurs in the wearable device at the moment the pen moves down and touches the plane, or the finger touches the plane, is detected from the acceleration and recognized as the start of a valid stroke (501).

In addition, the three-dimensional acceleration at this moment is used to set the reference plane that the pen contacts (704).

Then, the detected movement of the wearable device is interpreted as a two-dimensional movement 505 on this reference plane to indirectly calculate the movement information of the stroke during writing.

Upward motion, or motion nearly perpendicular to this reference plane, is interpreted as lifting the pen off the plane, and therefore as the end of a valid stroke (502).

The movement (504) from the start of the stroke to the end of the stroke is transferred to the handwriting recognition unit to be recognized as a character or interpreted as a control operation such as space or backspace.

Even in the pen state, movement (503) of the wearable device before the start of a valid stroke (501) or after its end (502) is treated as pointer movement, as in the mouse state, and is shown on the screen of the display means as movement of the pointer rather than as a stroke (505).

<Return from pen state to mouse state>

When text input is complete, a gesture of shaking the hand while the pen is lifted can be interpreted as finishing the pen-state operation and returning to the mouse state.

<Operation as a Remote Control>

The movement of the wearable device in the mouse state is compared against the movements predefined for remote-control operation.

If the user clicks to select an on-screen device that has predefined actions, such as an audio or video device, or if the currently active application has predefined actions, subsequent user movements are first compared against the actions defined for that device or application.

If the action matches a predefined action, it is interpreted as a control command for the selected device, which is then passed to the device and the application.

If the action does not match a predefined action, this action is only considered to have changed the mouse position.

For example, if a TV is selected, moving the finger quickly to the right (left) and stopping can be defined as raising (lowering) the volume, and moving the finger quickly up or down and stopping can be defined as changing the channel.

Table 1 below is an embodiment of the meanings assigned to the user's finger motions according to the operating state and condition of the wearable device under the input methods described above. The finger motions indicated symbolically in Table 1 are identified by the feature pattern detection unit 901, and based on them the state information management unit 906 manages state information such as the mouse, pen, and remote-control modes.

[Table 1] (reproduced only as an image in the original publication: finger motions and their meanings for each operating state and condition of the wearable device)
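Because Table 1 is reproduced only as an image in the source, the sketch below merely illustrates the kind of mode bookkeeping the state information management unit 906 could perform; the specific gestures and transitions are assumptions drawn from the embodiments described above, not the actual contents of the table.

```python
class StateInfoManager:
    """Minimal sketch of mode bookkeeping for the wearable device.
    Modes and transitions are assumptions based on the described embodiments:
    shake to start, click a text window to enter pen mode, shake to return."""
    def __init__(self):
        self.mode = "standby"

    def on_event(self, event):
        transitions = {
            ("standby", "shake"): "mouse",           # start of device use
            ("mouse", "click_text_window"): "pen",   # text entry window selected
            ("pen", "shake"): "mouse",               # return from pen to mouse state
            ("mouse", "select_av_device"): "remote", # device with predefined actions selected
            ("remote", "deselect"): "mouse",
            ("mouse", "shake"): "standby",           # end of use (assumption, not in the text)
        }
        self.mode = transitions.get((self.mode, event), self.mode)
        return self.mode

mgr = StateInfoManager()
for e in ["shake", "click_text_window", "shake", "select_av_device", "deselect"]:
    print(e, "->", mgr.on_event(e))
```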

The embodiment described in [Table 1] should not be regarded as the only possible method; as long as no contradiction or confusion arises under each state and condition, operations convenient for the user may be chosen according to the user's preference.

As described above, the method of the present invention may be implemented as a program and stored in a computer-readable recording medium (CD-ROM, RAM, ROM, floppy disk, hard disk, magneto-optical disk, etc.). Since this process can easily be implemented by those skilled in the art, it is not described in further detail.

The present invention described above may be variously substituted, modified, and changed by those skilled in the art without departing from its technical spirit, and is therefore not limited to the embodiments described above or to the accompanying drawings.

As described above, the present invention can be applied as an input device that integrates a remote controller, a pen, and a mouse according to the user's actions and context of use.

In addition, the present invention makes it possible to control the various information devices in a home network or ubiquitous environment with a single wearable device.

In addition, because the present invention analyzes the user's motions, it can be used naturally without separate buttons, which allows the wearable device to be made more compact.

In addition, because switching between the standby state and the active state is triggered by the user's motion, no separate switch is required, and the wearable device can be made smaller.

In addition, the present invention can detect all the movement required to operate the device with only an acceleration sensor of three or more axes, without any additional sensor.

In addition, even when the present invention is implemented as a device that is worn on the hand at all times, such as a ring, the user's writing motion can be inferred from the movement of the worn device.

Claims (14)

  1. A wearable general-purpose three-dimensional input system comprising:
    wearable input means structured to be worn on a user's body and to transmit movement information of the user; and
    host means for interpreting the movement information received from the wearable input means and feeding the result back to the user.
  2. The system of claim 1, further comprising:
    host connection means for wirelessly connecting the wearable input means and the host means,
    wherein the host connection means comprises:
    wireless communication means for receiving the movement information from the wearable input means via an antenna; and
    host interface means for transferring the received movement information to the host means.
  3. The system of claim 1, wherein the wearable input means comprises:
    acceleration detection means for detecting the motion of the user and generating the movement information;
    wireless communication means for wirelessly transmitting the movement information to the host means; and
    control means for controlling the acceleration detection means and the wireless communication means and for monitoring and transmitting the movement information.
  4. The system of claim 3, wherein the acceleration detection means comprises:
    an X-axis sensor, a Y-axis sensor, and a Z-axis sensor for detecting the user's movement in three-dimensional space.
  5. The system of claim 1, wherein the host means operates in one of a mouse mode, a pen mode, and a remote-control mode corresponding to the movement information, and transitions between these modes.
  6. The system of claim 1, wherein the host means comprises:
    motion analysis means for receiving and analyzing the movement information and feeding visual information back to the user according to the current mode;
    motion recognition means for outputting command information corresponding to the movement information; and
    user interface means for transmitting control and input signals to an external user device in accordance with the command information.
  7. The system of claim 6, wherein the host means further comprises:
    handwriting recognition means for recognizing the movement information as characters.
  8. The system of claim 6, wherein the motion analysis means:
    interprets an acceleration pattern in which the acceleration changes suddenly from + to - or from - to +, according to the sensitivity setting, as movement in the opposite direction;
    interprets an acceleration pattern in which a sudden + or - acceleration decays without an acceleration in the opposite direction as the device having been stopped by hitting an object; and
    interprets an acceleration pattern in which the + or - acceleration appears gradually as the device starting to move or stopping while moving.
  9. The system of claim 6, wherein the motion analysis means uses the acceleration patterns of claim 8 in place of input by buttons.
  10. The system of claim 6, wherein, when the mode is the pen mode, tilt information between the wearable input means and the writing plane is obtained from the acceleration detected by the acceleration detection means at the moment the body part wearing the wearable input means touches the plane, and the detected acceleration is corrected accordingly.
  11. The system of claim 6, wherein, when the mode is the pen mode, the start and end of a stroke are identified from the acceleration patterns occurring at the moments when the body part wearing the wearable input means, or a pen held by that body part, touches and leaves the plane.
  12. The system of claim 6, wherein, when the mode is the pen mode, the movement of the wearable input means, worn on the user's finger in the form of a ring or the like, is used in place of the movement of the fingertip or of the tip of a hand-held pen.
  13. The system of claim 3, wherein the control means of the wearable input means implements switching between a standby state and an active state of the wearable input means by monitoring the movement information generated by the acceleration detection means.
  14. The system of claim 3, wherein the host means monitors the movement generated by the acceleration detection means so that the movement takes the place of buttons and switches, without the wearable input means having any button or switch.
KR1020040108592A 2004-12-20 2004-12-20 System for Wearable General-Purpose 3-Dimensional Input KR100674090B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020040108592A KR100674090B1 (en) 2004-12-20 2004-12-20 System for Wearable General-Purpose 3-Dimensional Input

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020040108592A KR100674090B1 (en) 2004-12-20 2004-12-20 System for Wearable General-Purpose 3-Dimensional Input
PCT/KR2005/002264 WO2006068357A1 (en) 2004-12-20 2005-07-14 System for wearable general-purpose 3-dimensional input

Publications (2)

Publication Number Publication Date
KR20060069985A (en) 2006-06-23
KR100674090B1 (en) 2007-01-24

Family

ID=36601922

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020040108592A KR100674090B1 (en) 2004-12-20 2004-12-20 System for Wearable General-Purpose 3-Dimensional Input

Country Status (2)

Country Link
KR (1) KR100674090B1 (en)
WO (1) WO2006068357A1 (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100790818B1 (en) * 2006-12-06 2008-01-03 한국과학기술연구원 Apparatus and method for controlling electronic appliances based on hand gesture recognition
KR100884553B1 (en) * 2006-12-18 2009-02-19 한국과학기술연구원 System and Method of Transmitting Data in a Multiplex Computing Network
WO2009028921A2 (en) * 2007-08-30 2009-03-05 Lg Electronics Inc. Apparatus and method for providing feedback for three-dimensional touchscreen
KR100901482B1 (en) * 2007-09-27 2009-06-08 한국전자통신연구원 Remote control system and method by using virtual menu map
US8219936B2 (en) 2007-08-30 2012-07-10 Lg Electronics Inc. User interface for a mobile device using a user's gesture in the proximity of an electronic device
US8502769B2 (en) 2006-10-16 2013-08-06 Samsung Electronics Co., Ltd. Universal input device
KR101393803B1 (en) * 2011-12-15 2014-05-12 엘지이노텍 주식회사 Remote controller, Display device and controlling method thereof
KR101517001B1 (en) * 2008-12-09 2015-04-30 삼성전자주식회사 Input device and input method
CN104793747A (en) * 2015-04-24 2015-07-22 百度在线网络技术(北京)有限公司 Method, device and system for inputting through wearable device
KR101632014B1 (en) * 2015-04-06 2016-06-21 엘지전자 주식회사 Mobile terminal
US9529434B2 (en) 2013-06-17 2016-12-27 Samsung Electronics Co., Ltd. Presentation device and method for operating the device
US10241574B2 (en) 2013-08-20 2019-03-26 Samsung Electronics Co., Ltd. Wearable biosignal interface and method thereof
US10285039B2 (en) 2013-06-17 2019-05-07 Samsung Electronics Co., Ltd. Wearable device and communication method using the wearable device

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8125448B2 (en) 2006-10-06 2012-02-28 Microsoft Corporation Wearable computer pointing device
KR101284797B1 (en) 2008-10-29 2013-07-10 한국전자통신연구원 Apparatus for user interface based on wearable computing environment and method thereof
KR20100048090A (en) * 2008-10-30 2010-05-11 삼성전자주식회사 Interface apparatus for generating control command by touch and motion, interface system including the interface apparatus, and interface method using the same
US8503932B2 (en) * 2008-11-14 2013-08-06 Sony Mobile Communications AB Portable communication device and remote motion input device
KR101095012B1 (en) * 2008-12-10 2011-12-20 한국전자통신연구원 Ring type input device and method thereof
KR101061072B1 (en) * 2008-12-15 2011-09-01 시사게임 주식회사 Realistic game system and method
KR101027313B1 (en) 2009-04-13 2011-04-06 주식회사 뉴티씨 (Newtc) Mouse using 3D acceleration sensor
US20130027307A1 (en) * 2009-08-20 2013-01-31 Shanda Computer (Shanghai) Co., Ltd Human-machine interface apparatus and operating method thereof
KR100970585B1 (en) * 2009-10-07 2010-07-16 서창수 Apparatus for charactreristic recognizing based on movement of finger
CN102478956B (en) * 2010-11-25 2014-11-19 安凯(广州)微电子技术有限公司 Virtual laser keyboard input device and input method
CN102184011B (en) * 2011-05-06 2013-03-27 中国科学院计算技术研究所 Human-computer interaction equipment
CN103513901A (en) * 2012-06-26 2014-01-15 联想(北京)有限公司 Information processing method and electronic device
KR101485679B1 (en) * 2013-06-26 2015-01-28 길재수 Character input method using motion sensor and apparatus performing the same
KR20150082079A (en) * 2014-01-06 2015-07-15 삼성전자주식회사 Apparatus and method for controlling home device using wearable device
CN105467852A (en) * 2015-12-29 2016-04-06 惠州Tcl移动通信有限公司 Intelligent household appliance control method based on intelligent watch and intelligent watch

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000132305A (en) 1998-10-23 2000-05-12 Olympus Optical Co Ltd Operation input device
JP2002358149A (en) * 2001-06-01 2002-12-13 Sony Corp User inputting device
KR100446613B1 (en) * 2001-07-16 2004-09-04 삼성전자주식회사 Information input method using wearable information input device
KR100634494B1 (en) * 2002-08-19 2006-10-16 삼성전기주식회사 Wearable information input device, information processing device and information input method
KR100537503B1 (en) * 2002-12-31 2005-12-19 삼성전자주식회사 Method for configuring 3D information input device, method for reconfiguring 3D information input device, method for recognizing wearable information input device, and the apparatus therefor
KR20040081223A (en) * 2003-03-14 2004-09-21 이성일 Chording-Glove Typed Input Device for Korean language
KR20050047329A (en) * 2003-11-17 2005-05-20 한국전자통신연구원 Input information device and method using finger motion

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8502769B2 (en) 2006-10-16 2013-08-06 Samsung Electronics Co., Ltd. Universal input device
KR101299682B1 (en) * 2006-10-16 2013-08-22 삼성전자주식회사 Universal input device
KR100790818B1 (en) * 2006-12-06 2008-01-03 한국과학기술연구원 Apparatus and method for controlling electronic appliances based on hand gesture recognition
KR100884553B1 (en) * 2006-12-18 2009-02-19 한국과학기술연구원 System and Method of Transmitting Data in a Multiplex Computing Network
WO2009028921A3 (en) * 2007-08-30 2009-05-07 Lg Electronics Inc Apparatus and method for providing feedback for three-dimensional touchscreen
US8219936B2 (en) 2007-08-30 2012-07-10 Lg Electronics Inc. User interface for a mobile device using a user's gesture in the proximity of an electronic device
US8432365B2 (en) 2007-08-30 2013-04-30 Lg Electronics Inc. Apparatus and method for providing feedback for three-dimensional touchscreen
WO2009028921A2 (en) * 2007-08-30 2009-03-05 Lg Electronics Inc. Apparatus and method for providing feedback for three-dimensional touchscreen
KR100901482B1 (en) * 2007-09-27 2009-06-08 한국전자통신연구원 Remote control system and method by using virtual menu map
US9058091B2 (en) 2008-12-09 2015-06-16 Samsung Electronics Co., Ltd. Input device and input method
KR101517001B1 (en) * 2008-12-09 2015-04-30 삼성전자주식회사 Input device and input method
KR101393803B1 (en) * 2011-12-15 2014-05-12 엘지이노텍 주식회사 Remote controller, Display device and controlling method thereof
US9529434B2 (en) 2013-06-17 2016-12-27 Samsung Electronics Co., Ltd. Presentation device and method for operating the device
US10285039B2 (en) 2013-06-17 2019-05-07 Samsung Electronics Co., Ltd. Wearable device and communication method using the wearable device
US10241574B2 (en) 2013-08-20 2019-03-26 Samsung Electronics Co., Ltd. Wearable biosignal interface and method thereof
KR101632014B1 (en) * 2015-04-06 2016-06-21 엘지전자 주식회사 Mobile terminal
CN104793747A (en) * 2015-04-24 2015-07-22 百度在线网络技术(北京)有限公司 Method, device and system for inputting through wearable device

Also Published As

Publication number Publication date
KR100674090B1 (en) 2007-01-24
WO2006068357A1 (en) 2006-06-29

Similar Documents

Publication Publication Date Title
US9746934B2 (en) Navigation approaches for multi-dimensional input
US10139918B2 (en) Dynamic, free-space user interactions for machine control
US20190346940A1 (en) Computing interface system
US9519350B2 (en) Interface controlling apparatus and method using force
US9575570B2 (en) 3D pointing devices and methods
US20180181208A1 (en) Gesture Recognition Devices And Methods
US9389779B2 (en) Depth-based user interface gesture control
US9116571B2 (en) Method and system of data input for an electronic device equipped with a touch screen
US9971422B2 (en) Object orientation detection with a digitizer
TWI546724B (en) Apparatus and method for transferring information items between communications devices
US9110505B2 (en) Wearable motion sensing computing interface
JP6370893B2 (en) System and method for performing device actions based on detected gestures
KR101793566B1 (en) Remote controller, information processing method and system
US8292833B2 (en) Finger motion detecting apparatus and method
US20150323998A1 (en) Enhanced user interface for a wearable electronic device
US10007351B2 (en) Three-dimensional user interface device and three-dimensional operation processing method
US8922530B2 (en) Communicating stylus
US8421634B2 (en) Sensing mechanical energy to appropriate the body for data input
US10019078B2 (en) Device, method, and system to recognize motion using gripped object
US9841827B2 (en) Command of a device by gesture emulation of touch gestures
Kim et al. The gesture watch: A wireless contact-free gesture based wrist interface
KR100518824B1 (en) Motion recognition system capable of distinguishment a stroke for writing motion and method thereof
US8316324B2 (en) Method and apparatus for touchless control of a device
Kratz et al. HoverFlow: expanding the design space of around-device interaction
US20140267024A1 (en) Computing interface system

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
LAPS Lapse due to unpaid annual fee