KR20160031940A - User input apparatus using location or change thereof - Google Patents

User input apparatus using location or change thereof

Info

Publication number
KR20160031940A
Authority
KR
South Korea
Prior art keywords
user input
input device
control unit
angle
unit
Prior art date
Application number
KR1020150018907A
Other languages
Korean (ko)
Inventor
고재용
Original Assignee
주식회사 와이드벤티지
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 와이드벤티지
Publication of KR20160031940A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

Abstract

The present invention determines the position or the position change (movement path) of an electric device and reflects the determined position or position change in the user's input, so that operation control of the electric device can be performed. The user input device includes a position detection unit, provided in the user input device, for detecting the position or angle of the user input device and changes in position or angle due to movement of the user input device, and a controller for determining the user input according to the position, angle, or change in position or angle of the user input device.

Description

USER INPUT APPARATUS USING LOCATION OR CHANGE THEREOF

The present invention relates to a user input device, and more particularly to a user input device that determines the position or position change (movement path) of an electric device without a complicated apparatus such as a touch screen or a stylus equipped with various sensors and a communication module, and that reflects the determined position or position change in the user's input so that operation control of the electric device (software) can be performed.

With the advent of tablet PCs and smartphones, input devices that recognize the position of a stylus pen tip electrostatically, by static pressure, or optically when the pen touches the screen on which an image is displayed have become commonplace. Most touch screens accept input by pressing with a finger or by a pointing device such as a pen or stylus. In particular, an electromagnetic resonance (EMR) technology is disclosed in U.S. Patent No. 6,556,190 and is currently used in various products including the Samsung Galaxy Note product line. This technology can distinguish a stylus from a touch of the skin, such as a finger or the ball of the hand, on the input pad, can measure the pressing pressure, and provides advanced functions such as controlling stroke thickness according to the pressure.

In the United States, Apple Inc. has filed a patent application entitled Optical Stylus (US 2012/0127110) in which a camera, a power supply, a circuit, a processor, and a wireless communication module are added to a stylus so that input is possible without a separate touch screen: the camera of the stylus recognizes minute marks formed on the screen to determine the position of the stylus pen tip on the screen, and a technique for detecting the tilt angle between the stylus and the screen is also disclosed. There are a variety of stylus types that allow input without a touch screen. Examples include OID (optical identification) pens, which carry a camera on the stylus, recognize minute patterns printed on special paper, find the touch position on that paper, and transfer the strokes to a computer, and ultrasonic pen products, in which the stylus generates ultrasound and infrared signals and two or more sensor modules that detect them calculate the position of the stylus pen tip from the time difference between the two signals and transmit it to a computer. In addition, a technology in which infrared lamps and several dedicated cameras are mounted on the side of a screen and the position of a finger or pen touching the screen is transferred to the computer in x- and y-coordinates by triangulation from the camera images has also been commercialized.

On the other hand, game machines such as Sony's PS Vita use cameras and markers (visual signs) for various augmented reality systems. These game devices recognize the position and angle of a visual marker placed on a table through camera shooting, output the captured image on the screen of the game device, recognize the visual marker in the image, and add information not present in reality, such as a game character, to the image output on the screen. The Wii Remote, the controller of the Wii game console, also has an infrared camera: infrared LEDs are set up as visual indicators, and the position of the remote controller is detected from the images of the infrared LEDs captured by the camera.

On the other hand, Microsoft's Kinect product obtains three-dimensional information, including the depth of surrounding objects, using an infrared laser and infrared cameras. In the US, technology has also been developed in earnest, together with various partners, to recognize three-dimensional shapes around a phone by installing multiple wide-angle cameras or depth sensors. New products such as Leap Motion in the US use two cameras to sense the individual positions of the ten fingers of hands held in the air.

In other imaging-related fields, mirrors, lenses, and prisms have traditionally been used in various optical instruments. Various wide-angle lenses have been used in cameras, and wide-angle mirrors are used in vehicles and at road curves. To create a sufficiently wide angle of view, several precision lenses must be used, and the volume becomes large.

A computing device equipped with a camera could capture a user's finger or other pointing device with the built-in camera, so that software executed on the computing device analyzes the captured images and uses the area in front of the screen like a touch screen, with the potential to make the user interface more effective. However, since, for the reasons mentioned above, the camera faces the same direction as the screen and is attached next to it, as in most notebook computers, a pointing device moving in the plane of the computing device's screen cannot be photographed.

However, the touch screen of a smart phone is so small that it is difficult to input even a very simple memo in a handwriting application, while a device with a large touch screen is difficult to carry and costs more. On the other hand, ultrasonic styluses, OID pens, external keyboards, and the like, which do not use a conventional touch screen for input, have the advantage that a sufficiently wide space can be used for input, but in most cases they require a pointing device that generates signals with complicated sensors and a communication module, a separate sensor module to pick up the signals, or special paper. They also require frequent charging and are expensive.

There are already studies that track a fingertip, a point light source, or other visual markers using a camera, but these methods have the following problems: 1) it is easy to track two-dimensional motion perpendicular to the direction in which the camera looks, but it is difficult to track motion along the viewing direction; 2) to operate when the lighting environment is dark or too bright, a separate light source for the visual marker, together with a circuit and a power supply for it, is required; and 3) tracking becomes difficult if a color or pattern similar to the visual marker, or another light source, is nearby.

In order to track the x, y two-dimensional position of a pointing device such as a finger moving on a table, the camera should be oriented to look down at the table and should be placed at a sufficient height above the table.

However, most smartphones, tablets, and wearable computing devices have cameras that either face the user or face away from the screen, to support their main applications such as recording or video calling. Therefore, installing the computing device high enough above a table and facing down at the table surface to track a pointing device requires a large cradle and also prevents the user from viewing the screen of the computing device. If the user cannot see the screen, the cursor position in the software cannot be known merely by moving a pointing device, such as a finger or pen, on a table where nothing is displayed, and no software feedback on the pointing operation can be received; most user interface operations, including selection, dragging, and drawing, become impossible. Thus, either the camera must be placed separately from the screen, or the user is forced to move the pointing device in the air. A separate camera makes portability much worse, and gestures that move fingers in the air to click on objects are often unnatural and inconvenient.

For the computing device to photograph the side of the pointing device while lying on the same plane as the pointing device, the camera built into the smartphone must obtain a wide-angle image in the horizontal direction parallel to that plane, at high resolution. Optical instruments such as conventional fisheye lenses are not only expensive, but they also squeeze more of the scene into a given area of the image sensor, which inevitably lowers the resolution in proportion to the wider field of view.

The present invention aims to provide a user input device that determines the position or position change (movement path) of an electric device without a touch screen, a sensor module, a stylus equipped with sensors and a power supply, special paper, or the like, and that reflects the determined position or position change in the user's input so that operation control of the electric device (software) can be performed.

The present invention also aims to provide a user input device using position or position change that allows the start time and end time of a user input to be determined, and that excludes inputs not intended by the user (palm rejection).

The present invention further aims to provide a user input device using position or position change in which a simple mechanism including a mirror is installed in front of a camera to change the photographing angle of the camera so that an object at a position contacted by the information communication device can be photographed, and information corresponding to the photographed object (audio, animation, link, annotation, etc.) is displayed.

The present invention also aims to provide a user input device using position or position change that tracks the movement of an information communication device such as a smart phone in space and outputs various sounds and voices such as music, as well as flashes of the entire screen, so as to implement various games, including music games, that are not limited to the screen.

A user input device using position or position change according to the present invention includes a position sensing unit, provided in the user input device, for sensing the position and angle of the user input device and changes in position or angle due to movement of the user input device, and a controller for determining the user input based on the position, angle, or change in position or angle of the user input device derived from the sensed values.

Preferably, the controller determines the position and angle of the reference portion of the user input device based on the sensed value from the position sensing portion, and controls the currently executed program according to the determined position or angle.

Also, the user input device may include a display unit for displaying the user input or the currently executed program, or a storage unit for storing judgment reference data for judging the user input according to the movement path of the user input device, and the control unit preferably compares the movement path according to the position change with the previously stored judgment reference data, judges the user input to be the judgment reference data corresponding to the movement path, and displays the judged user input on the display unit or stores it in the storage unit.

Further, the position sensing unit preferably includes a camera for photographing images, the judgment reference data includes an original image of an object or of an image displayed on the object, and the control unit compares the image of the object photographed by the camera with the original image to determine the position and angle of the user input device with respect to the object and their changes, or the position and angle of the object with respect to the user input device and their changes.

The position sensing unit may include a camera that photographs an object, or a depth sensor, and the control unit preferably compares images of the object photographed by the camera at a plurality of points in time, or compares depth sensing values, to determine the position and angle of the user input device with respect to the object and their changes, or the position and angle of the object with respect to the user input device and their changes.

In addition, the control unit preferably irradiates the object with light through the display unit in order to identify the object.

In addition, it is preferable that the control unit displays an original image on a display unit, photographs an image reflected from a mirror, which is an object, using a camera, and compares the taken reflected image with the original image.

In addition, the control unit preferably determines a start time and an end time of the user input.

In addition, it is preferable that the control unit processes the user input between the start point and the end point differently from the user input before the start point and after the end point.

Preferably, the user input device includes a microphone, the storage unit stores the reference input sound, and the control unit compares the sound obtained from the microphone with the reference input sound to determine a start time and an end time of the user input.

The position sensing unit may include a magnetic field sensor and a motion sensor composed of at least one of an accelerometer and a gyroscope, and the control unit preferably determines the start time of the user input when the acceleration from the motion sensor is greater than a reference acceleration.

The user input device may further include a case, a pressing detection adapter protruding from the case to generate a pressing signal, and a connection unit connected to the pressing detection adapter and applying a pressing signal to the control unit.

Preferably, the controller determines the start and end points of the user input according to the electrical characteristics of the push signal.

Also, the user input device preferably includes a communication unit that performs wired or wireless communication, and the control unit transmits a signal including the start time or end time of the user input to an external electric device through the communication unit, so that the external electric device can determine the user input based on the start time or end time included in the transmitted signal.

In addition, the user input system of the present invention includes a push detection device that is mounted on tissue of the human body, detects sound transmitted through the tissue, generates a push signal including the sensed sound, and transmits the push signal to a user input device, and a user input device that determines the start and end points of the user input based on the push signal received from the push detection device.

The push detection device includes a microphone that acquires sound while in contact with or in proximity to the tissue of the human body, and a communication unit that transmits a push signal including the sound from the microphone to the user input device.

The present invention determines the position or position change (movement path) of an electric device without a touch screen, a sensor module, a stylus equipped with sensors and a power supply, special paper, or the like, and reflects the determined position or position change in the user's input, so that operation control of the electric device (software) can be performed conveniently.

In addition, since the present invention can identify the start and end points of an input by the user, it is possible to distinguish a simple position change of the user input device, which is a pointing device, from a position change that constitutes handwriting or a drag, and to exclude inputs not intended by the user (palm rejection).

Further, in the present invention, a simple mechanism including a mirror is installed in front of the camera to change the photographing angle of the camera so that an object at a position contacted by the information communication device can be photographed, and information corresponding to the photographed object (audio, animation, link, annotation, etc.) is displayed, thereby improving the user's convenience.

In addition, the present invention has the effect of accurately determining the start time and end time for recognizing the position change when the user uses the information communication device itself as a writing tool such as a pen.

In addition, the present invention has the effect of obtaining the user's true (intended) input by acquiring and using a signal including the start or end point of the user input from an electric device independent of the user input device.

FIG. 1 is a block diagram of a first embodiment of a user input device using position or position change according to the present invention.
FIG. 2 shows an example of use of the user input device of FIG. 1.
FIGS. 3A and 3B are a perspective view and a plan view of a second embodiment of the user input device.
FIGS. 4A to 4E show embodiments of the push detection adapter of the user input device according to the second embodiment.
FIG. 5 shows a third embodiment of the user input device.
FIGS. 6A and 6B show fourth and fifth embodiments of the user input device.
FIG. 7 shows an embodiment of a user input system to which the user input device according to the second embodiment is applied.
FIG. 8 shows a sixth embodiment of the user input device.
FIGS. 9A and 9B show an embodiment and a configuration diagram of a push detection device corresponding to the push detection adapter.

Hereinafter, the present invention will be described in detail with reference to embodiments and drawings.

FIG. 1 is a block diagram of the first embodiment of the user input device using position or position change according to the present invention.

The user input device 1 includes an input unit 10 for obtaining input from the user, a connection unit 11 to which a push detection adapter is connected, a display unit 12 for displaying various information and contents, a camera 13 for photographing external images, a speaker 14 for emitting sound and voice to the outside, a microphone 15 for acquiring sound/voice signals, a storage unit 16 for storing various judgment reference data and programs, a communication unit 17 for communicating according to various communication methods, a depth sensor 18 for measuring the distance to an external object, a motion sensor 19 (a magnetic field sensor 19a, an accelerometer 19b, and a gyroscope 19c) for sensing the position and position change of the user input device 1, and a control unit 20 that controls the above components to perform the inherent functions of the user input device 1 (communication, content playback, etc.) and that determines the user input (handwriting and the like) by calculating the position and position change of the user input device 1 using the position sensing values from the motion sensor 19, the images (still/moving) obtained from the camera 13, and the distance values from the depth sensor 18. The input unit 10, the display unit 12, the camera 13, the speaker 14, the microphone 15, the storage unit 16, the communication unit 17, the depth sensor 18, the magnetic field sensor 19a, the accelerometer 19b, and the gyroscope 19c are well known to those skilled in the art, and detailed descriptions thereof are omitted. The motion sensor 19, the camera 13, and the depth sensor 18 are examples of a position sensing unit that senses the position and angle of the user input device 1 and supplies sensed values (position sensing values, images, distance values, etc.) to the control unit 20.

The input unit 10 may be implemented as a general button-type input unit or as a touch screen located on the display unit 12.

The connection unit 11 corresponds to a component that is electrically connected to an external device (the push detection adapter in the present invention) through a headset jack, a USB port, or the like to perform communication.

The control unit 20 determines the movement (position and position change) made when the user directly moves the user input device 1, using input values (images, position sensing values) from the components described above, obtains the user input (handwriting) from this movement, and stores the handwriting in the storage unit 16, displays it on the display unit 12, or applies it as a control input to a program.

FIG. 2 shows an example of use of the user input device of FIG. 1. The user can grasp the user input device 1 with the hand H like a writing instrument such as a pen or a stylus. The user can press the front face of the case, on which the display unit 12 is provided, with the thumb, support the back face of the case with the side of the index finger, and move the device so that its corner 1a (or a convex portion of the edge) draws strokes on a reference plane E such as a flat surface like a desk. The user input device 1 is shown as plate-shaped, but it may also be held like a stick-shaped pen, with a convex portion used as the pen tip for writing strokes.

In FIG. 2, it is necessary to determine from the position (and angle) of the user input device 1 at least whether the corner 1a of the user input device 1 touches the reference plane E or not. Accordingly, the start and end points of drawing a stroke, which is the movement path of the user input device 1, must be determined.

When the user input device 1 is moved by the user's movement (handwriting) operation, the control unit 20 determines the position and tilt angle of the user input device 1 using the camera 13, the motion sensor 19, the depth sensor 18, and so on.

The control unit 20 determines the position and angle of the user input device 1 (or of the reference corner 1a) by applying an image processing function stored in the storage unit 16 to an image of an object photographed by the camera 13. The image processing function compares the original image information of the object stored in the storage unit 16 with the photographed image to determine the position and angle of the user input device 1. The control unit 20 determines the position change and angle change from the determined position and angle, compares the movement path according to the position change with the pre-stored judgment reference data (letters, numbers, symbols, figures, etc.), selects the judgment reference data corresponding to the movement path, and displays the selected data on the display unit 12 as the user input or processes it as a user input of the currently executed program.
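
As an illustration of this image-comparison step, the following is a minimal sketch, assuming OpenCV is available, that the camera intrinsics are known, and that the object is roughly planar; the function and variable names (estimate_pose, camera_matrix) are illustrative and are not taken from the patent.

```python
# Sketch: estimate the relative pose of the device from one camera frame by
# matching the stored original image of the object against the photographed image.
import cv2
import numpy as np

def estimate_pose(original_img, photo_img, camera_matrix):
    orb = cv2.ORB_create(1000)
    kp1, des1 = orb.detectAndCompute(original_img, None)
    kp2, des2 = orb.detectAndCompute(photo_img, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:100]

    src = np.float32([kp1[m.queryIdx].pt for m in matches])
    dst = np.float32([kp2[m.trainIdx].pt for m in matches])
    H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

    # Decompose the planar homography into candidate rotation/translation pairs;
    # translation is recovered up to scale, which a known object size would fix.
    n, rotations, translations, normals = cv2.decomposeHomographyMat(H, camera_matrix)
    return rotations, translations  # the caller keeps the physically plausible solution
```

Tracking the change of this pose over successive frames yields the movement path that is compared with the judgment reference data.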

The control unit 20 can also, without using original image information of the object, obtain images of the object (reference position) photographed by the camera 13 at multiple points in time, or depth sensing values of the object (reference position) from the depth sensor 18, and compare the obtained images or depth sensing values with one another to calculate the position, angle, and changes in position and angle of the user input device 1 relative to the object.

In addition, the control unit 20 may treat an angle change as a new input, such as the thickness or color of the stroke that constitutes the user input, and process the angle change as a change in thickness or color. At this time, the control unit 20 may use the depth sensor 18 to take into account the distance between the object and the camera 13 (or the depth sensor 18), so that the position, angle, and movement path of the user input device 1 can be determined more accurately.

The control unit 20 may integrate the values of the accelerometer 19b to estimate velocity and position, and may use the gyroscope 19c and the magnetic field sensor 19a together to improve the accuracy of the position. The control unit 20 recognizes the position and angle of the corner 1a (or of the user input device 1), which is the reference portion, using the motion sensor 19, determines the position change (movement path), and stores the determined position, angle, and position change in the storage unit 16 or displays them on the display unit 12 as the user's input. In addition, the control unit 20 compares the determined movement path with the pre-stored judgment reference data, selects the corresponding judgment reference data, and displays the selected data on the display unit 12 as the user input or processes it as a user input of the currently executed program. For example, the control unit 20 can output various sounds such as music and voice, vibration, and flashing of the entire screen according to the determined position change, thereby implementing various games, including music games, that are not limited to the screen.
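
A minimal sketch of the dead-reckoning idea follows, assuming that an orientation estimate (from fusing the gyroscope and magnetic field sensor) is already available per sample; the names and the fixed time step are illustrative assumptions, not the patent's implementation.

```python
# Integrate gravity-compensated accelerometer samples to estimate velocity and position.
import numpy as np

GRAVITY = np.array([0.0, 0.0, 9.81])  # m/s^2 in the world frame

def integrate_track(samples, dt):
    """samples: iterable of (R_world_from_device, accel_device) pairs, where the
    rotation R comes from fusing gyroscope and magnetometer readings."""
    velocity = np.zeros(3)
    position = np.zeros(3)
    path = []
    for R, accel_device in samples:
        accel_world = R @ np.asarray(accel_device) - GRAVITY  # remove gravity
        velocity += accel_world * dt      # first integration -> velocity
        position += velocity * dt         # second integration -> position
        path.append(position.copy())
    return np.array(path)                 # movement path used as the stroke
```

Double integration of acceleration drifts quickly, which is why the gyroscope and magnetic field sensor, and in other embodiments the camera and depth sensor, are used together to correct the estimate.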

In FIG. 2, the control unit 20 must also determine whether the entire movement path of the device is to be treated as the user input, or at which points the user intends the input to start and end.

First, the control unit 20 can use the microphone 15 to determine whether the user input device 1 is moving while touching the reference plane E. Since a collision sound or a scratching vibration or sound is generated when the device touches or slides on the reference plane, the control unit 20 can compare the sound obtained from the microphone 15 with a previously stored reference input sound (collision sound, scratching vibration sound) to determine the input start time and end time. That is, the control unit 20 can determine that input movement of the user input device 1 is in progress while the input sound is being acquired from the microphone 15.
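
A minimal sketch of this acoustic test is given below, assuming short blocks of microphone samples and a stored magnitude spectrum of the reference input sound; the thresholds and frame size are illustrative assumptions.

```python
# Treat the device as "in contact" (input in progress) while the mic frame is loud
# enough and spectrally similar to the stored reference contact/scratching sound.
import numpy as np

ENERGY_RATIO = 3.0   # required ratio of frame energy to the background noise floor

def is_input_sound(frame, noise_floor, reference_spectrum):
    """frame: one short block of mic samples (e.g. 512 samples);
    reference_spectrum: magnitude spectrum of the stored reference input sound."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    energy = float(np.sum(spectrum ** 2))
    if energy < ENERGY_RATIO * noise_floor:
        return False                      # too quiet: device is off the surface
    # crude spectral similarity to the stored reference input sound
    sim = np.dot(spectrum, reference_spectrum) / (
        np.linalg.norm(spectrum) * np.linalg.norm(reference_spectrum) + 1e-9)
    return sim > 0.6
```

The input start time is the first frame for which this test becomes true, and the end time is the first frame for which it becomes false again.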

Further, the control unit 20 uses the position sensing values from the motion sensor 19. For example, when the control unit 20 confirms from the sensing values that the change in the three-dimensional velocity, that is, the acceleration, is relatively large for a short time (greater than a previously stored reference acceleration), it determines that the user input device 1 has touched the reference plane E. At this moment, since the user input device 1, moving in space, touches or strikes the reference plane E, the direction of the acceleration will be close to the direction opposite to the earth's gravity. After this point, the control unit 20 uses the position sensing values from the motion sensor 19 to determine whether the reference portion 1a moves generally in a plane, that is, in a direction perpendicular to the earth's gravity, and judges that the input movement continues while it does so.
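
The following is a minimal sketch of this start-of-input test under the assumption that the acceleration has already been rotated into the world frame; the numeric thresholds are illustrative assumptions only.

```python
# A short, large acceleration whose direction is roughly opposite to gravity is
# taken as the moment the device touches the reference plane.
import numpy as np

REFERENCE_ACCEL = 15.0   # m/s^2, stored reference acceleration (assumed value)
MAX_ANGLE_DEG = 30.0     # allowed deviation from the "opposite to gravity" axis

def touched_surface(accel_world, gravity_world=np.array([0.0, 0.0, -9.81])):
    magnitude = np.linalg.norm(accel_world)
    if magnitude < REFERENCE_ACCEL:
        return False
    up = -gravity_world / np.linalg.norm(gravity_world)
    cos_angle = np.dot(accel_world, up) / magnitude
    return np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0))) < MAX_ANGLE_DEG
```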

FIGS. 3A and 3B are a perspective view and a plan view of a second embodiment of the user input device. In FIG. 3A, a jack 32 of the push detection adapter 30 is inserted into the receptacle 11a of the connection unit 11 of the user input device 1 so that the start and end of the user's input can be confirmed. FIG. 3B is a plan view in which the push detection adapter 30 protrudes outside the case of the user input device 1 and is fixedly mounted so as to be able to communicate with the connection unit 11.

The push detection adapter 30 generates a push signal when the user presses it against the reference plane E or the like, and includes a case forming its exterior and a jack 32 for transmitting the push signal to the connection unit 11. A circuit for generating the push signal is provided inside the case, and this circuit is described in detail with reference to FIGS. 4A to 4E.

In FIGS. 3A and 3B, instead of performing the movement (handwriting) operation with the corner 1a of the user input device 1 of FIG. 2 touching the reference plane E, the movement operation is performed with the push detection adapter 30 pressed against the reference plane E. The control unit 20 determines the position and position change of the user input device 1 while the push signal is being input (while the pressed state is maintained), displays it on the display unit 12 as the input intended by the user, stores it in the storage unit 16, or selects the judgment reference data corresponding to the determined position and position change and controls the program accordingly. Alternatively, the user input device 1 may treat the position and position change while the push signal is being input and the position and position change while no push signal is being input as different user inputs, and use them as control inputs in the currently executed program.

FIGS. 4A to 4E show embodiments of the push detection adapter of the user input device according to the second embodiment.

FIG. 4A is a cross-sectional view of the push detection adapter according to the first embodiment, and FIG. 4B shows its circuit.

The push detection adapter 300 according to the first embodiment of the push detection adapter 30 includes: a jack 32 that connects a ground line GND and a microphone signal line MIC to the connection unit 11 of the user input device 1; a hollow conductor ring 310b connected to the ground line GND; a conductive end portion (flange) 310c of elastic material that covers the hollow front face of the conductor ring 310b, has its rim inserted into the conductor ring 310b, and at least partially protrudes out of the conductor ring 310b; a protruding conductor 310d located inside the end portion 310c and connected to the microphone signal line MIC; and an insulator 310e located in the hollow of the conductor ring 310b to insulate the protruding conductor 310d from the conductor ring 310b. The push detection adapter 300 also includes a case 310a that forms its exterior, protects the internal circuitry, and supports the conductor ring 310b.

When the jack 32 is connected to the connection unit 11 through the receptacle 11a of the user input device 1, the user input device recognizes that a microphone has been plugged in, so that the control unit 20 can receive the push signal through the microphone input.

As shown in FIG. 4A, in the state in which the push detection adapter 300 is not pressed (with the conductive end portion 310c and the protruding conductor 310d spaced apart and not electrically connected), the control unit 20 acquires a microphone input corresponding to the impedance (first impedance) determined by the resistor R1. In the pressed state (when the conductive end portion 310c is pressed down and comes into contact with the protruding conductor 310d), the conductive end portion 310c is electrically connected to the protruding conductor 310d, so that the resistance of the resistor R1 and the parasitic capacitance formed around the conductive end portion 310c, connected in parallel, are applied to the control unit 20 as a push signal in the form of an AC voltage change. Therefore, the control unit 20 can distinguish the microphone input corresponding to the first impedance from the push signal, and can distinguish the position and position change of the user input device 1 while the push signal is applied from the position and position change while it is not, and process them differently. As shown in FIG. 4A, since the push detection adapter 300 does not include an actual microphone but consists only of a resistor and conductors, signals or noise other than the push signal are not transmitted to the control unit 20 through the microphone input even without a separate soundproofing structure, so the control unit 20 can accurately determine whether the push detection adapter 300 is pressed.

A circuit portion including only passive elements such as a coil or a capacitor, in addition to the resistor R1 of FIG. 4A, may be used so that the push detection adapter 300 generates, as the microphone input, a signal matching the characteristics of the connection unit 11, converting the pressing of the end portion 310c into an electrical signal that the control unit 20 can recognize. The electrical characteristics that the control unit 20 recognizes through the connection unit 11 as the microphone input terminal differ depending on the types of microphones supported by the user input device 1, and the electrical characteristics of the conductor ring 310b and the protruding conductor 310d also differ, so the circuit portion must convert the electrical characteristic change caused by the pressing into an electrical characteristic change of the microphone input terminal that matches those characteristics.

FIG. 4C is a cross-sectional view of the push detection adapter 301, which is the second embodiment of the push detection adapter 30. The push detection adapter 301 includes: a case 311a forming its exterior, with a groove formed therein; a sensor unit 311b connected to the speaker signal line SPK-L and the ground line GND; an end portion 311c of elastic material that covers the sensor unit 311b, with its rim fixed around the groove of the case 311a; a speaker signal line SPK-L that supplies the speaker output of the user input device 1 to the sensor unit 311b as power or as a signal; a resistor R2 connected at one end to the microphone signal line MIC and at the other end to the ground line GND; and a jack 32 connecting the ground line GND and the microphone signal line MIC to the connection unit 11 of the user input device 1.

The user input device 1 has at least two speaker outputs, left and right (SPK-L and SPK-R), and since the left and right speakers are rarely used separately, either may serve as the first or the second speaker output.

The control unit 20 transmits a specific electrical signal waveform to the first or second speaker output, for example by writing data into the audio output buffer, so that the waveform is applied to the sensor unit 311b through the speaker output (SPK-L, for example) and the end portion 311c, and receives, via the ground line GND and the microphone signal line MIC, the speaker signal deformed through the resistor R2 connected to the sensor unit 311b. Since the received electrical signal reflects both the speaker signal sent to the push detection adapter 301 and the electrical state of the sensor unit 311b, the control unit 20 can compare the transmitted speaker signal waveform with the received electrical signal to check the electrical state of the sensor unit 311b (whether it is pressed or not).

That is, the push detection adapter 301 uses the output of the first speaker SPK-L, driven by the control unit 20, as a signal source, and uses a force-sensitive resistor (FSR), whose resistance changes with the pressure applied to the sensor unit 311b through the end portion 311c by the user's pressing, as the resistor R3. For example, the control unit 20 transmits a sine wave of a predetermined amplitude to the SPK-L output and measures the amplitude of the sine wave input between the microphone signal line and the ground line (MIC-GND). According to the voltage division between the resistor R2 and the resistor R3 in the sensor unit 311b, when the resistance of R3 increases, the voltage applied across the fixed resistor R2 becomes smaller, and when the resistance of R3 decreases, that voltage becomes larger. The control unit 20 can therefore determine the resistance value of R3 by measuring the amplitude of the sine wave read from the microphone input, and from this can calculate the pressure applied to the end portion 311c and the sensor unit 311b. The control unit 20 can change the thickness or size of the stroke drawn on the display unit 12, or change the control state of other programs, according to the magnitude of this pressure. The resistor R2 also allows the control unit 20 to check whether the push detection adapter 301 is connected to the connection unit 11 of the user input device 1.
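
A minimal worked example of this voltage-divider reading is shown below, assuming the series path SPK-L → R3 (FSR in the sensor unit) → R2 (fixed) → GND with the microphone input measuring the voltage across R2; the component values and the pressure mapping are illustrative assumptions.

```python
# Invert the divider to recover the FSR resistance from the measured MIC amplitude.
V_SPK = 1.0      # amplitude of the sine wave sent to the SPK-L output (V, assumed)
R2 = 10_000.0    # fixed resistor on the MIC-GND side (ohms, assumed)

def fsr_resistance(v_mic):
    """Invert V_mic = V_spk * R2 / (R2 + R3) to recover R3."""
    return R2 * (V_SPK - v_mic) / v_mic

def pressure_from_resistance(r3, r_unpressed=100_000.0):
    """Crude monotone mapping: lower FSR resistance means higher pressure."""
    return max(0.0, 1.0 - r3 / r_unpressed)   # 0 = no press, 1 = strong press

# Example: a measured MIC amplitude of 0.5 V gives R3 = 10 kOhm.
r3 = fsr_resistance(0.5)
print(r3, pressure_from_resistance(r3))
```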

FIG. 4D is a cross-sectional view of the push detection adapter 302, which is the third embodiment of the push detection adapter 30. The push detection adapter 302 includes: a case 312a identical to the case 311a; a sensor unit 312b for sensing pressure; an end portion 312c identical to the end portion 311c; a sub control unit 312d that generates a sine wave whose frequency is proportional to the pressure sensed by the sensor unit 312b and applies it to the control unit 20 through the microphone signal line and ground line (MIC-GND); and a power harvesting unit 312e that converts the speaker output line SPK-R into DC power and supplies it to the sub control unit 312d.

The control unit 20 receives the sine wave from the push detection adapter 302 and analyzes its frequency to determine the pressure on the sensor unit 312b.
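
A minimal sketch of this frequency-based reading follows: the dominant frequency of the sampled microphone signal is recovered and mapped back to a pressure value. The sample rate and the frequency-to-pressure range are illustrative assumptions, not values from the patent.

```python
# Recover the pressure encoded as the frequency of the sine wave on the MIC line.
import numpy as np

SAMPLE_RATE = 44_100  # Hz, typical audio input rate (assumed)

def dominant_frequency(mic_samples):
    window = mic_samples * np.hanning(len(mic_samples))
    spectrum = np.abs(np.fft.rfft(window))
    freqs = np.fft.rfftfreq(len(mic_samples), d=1.0 / SAMPLE_RATE)
    return freqs[int(np.argmax(spectrum[1:])) + 1]   # skip the DC bin

def pressure_from_frequency(freq, f_min=1_000.0, f_max=5_000.0):
    # Assumed linear mapping from the sub control unit's frequency range to [0, 1].
    return min(1.0, max(0.0, (freq - f_min) / (f_max - f_min)))
```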

The power harvesting unit 312e may be a simple circuit that transforms, rectifies, and stabilizes the AC output of the AC-coupled speaker output stage into a DC output using only a few analog devices. The sub control unit 312d operated by this DC power may be composed of various low-power devices, including a microprocessor. When the sub control unit 312d includes a microprocessor, information on the electrical state of the sensor unit 312b may be sent to the control unit 20 in the form of digital data through the microphone input (MIC).

The sensor unit 312b may be a single pressure sensor, or a plurality of pressure sensors provided at various positions on the curved surface of the pen tip so as to determine which part of the curved surface is under pressure and to allow the control unit 20 to estimate the tilt angle. To measure pressure, the sensor unit 312b may use an FSR whose electrical resistance changes inside the pen tip, or a piezo element or the like that generates a voltage in response to pressure. In addition, the degree of tilt of the push detection adapter 302 may be measured using a tilt sensor together with the sensor unit 312b. In the sub control unit 312d, the speaker output SPK-R may be used as the power source and the speaker output SPK-L as the signal source driven by the control unit 20.

FIG. 4E is a cross-sectional view of the push detection adapter 303, which is the fourth embodiment of the push detection adapter 30. The push detection adapter 303 includes: a case 313a identical to the case 311a; a sensor unit 313b identical to the sensor unit 312b; an end portion 313c identical to the end portion 311c; a sub control unit 313d that generates data including information on the pressure sensed by the sensor unit 313b and transmits it to the control unit 20 through a data line DATA; and a USB plug 32a instead of the jack 32.

The USB plug 32a of the push detection adapter 303 is connected by wire to a USB OTG receptacle (not shown) provided at the connection unit 11 of the user input device 1. The sub control unit 313d receives power from the user input device 1 through the power line VCC.

The control unit 20 analyzes the data on the pressure state of the sensor unit 313b received from the sub control unit 313d via the USB-type connection unit 11, and can change the thickness, color, transparency, and so on of the stroke drawn with the push detection adapter 303. The USB plug 32a and receptacle may also be terminals that supply power and transmit data and control signals, such as a USB host terminal or Apple's dock connector, in addition to USB OTG.

In addition to the contact structure between the end portion 310c and the protruding conductor 310d, and between the end portions 311c, 312c, 313c and the sensor units 311b, 312b, 313b, described for the push detection adapter 30 in FIGS. 4A to 4E, a structure may be used in which an elastic member such as a spring is disposed between the end portion and the protruding conductor or sensor unit, so that when the end portion is not pressed from the outside, the end portion and the protruding conductor or sensor unit are separated by the elasticity of the elastic member, and when the end portion is pushed toward the protruding conductor or sensor unit, they come into contact with each other. The contact position may also be realized inside the case, not only outside the case of the user input device 1.

FIG. 5 shows a third embodiment of the user input device. The electric device 2 is a device capable of wired or wireless communication and has a display unit 2a. The portable user input device 1 and the electric device 2 can exchange information through wired or wireless communication, and together they constitute a kind of user input system.

The electric device 2 transmits to the user input device 1 information corresponding to the originals of the images displayed on the display unit 2a, including the first and second images I1 and I2. The user input device 1 photographs the screen including the first and second images I1 and I2 displayed on the display unit 2a of the electric device 2 using the camera 13. Referring to the original image information transmitted by the electric device 2 through wired/wireless communication, the control unit 20 uses a computer vision algorithm to detect the presence, position, size, and angle of the first and second images I1 and I2 in the image photographed by the camera 13, and can thus calculate the position and angle of the user input device 1 with respect to the electric device 2, or the position and angle of the electric device 2 with respect to the user input device 1. The appearance of the display unit 2a and of the first and second images I1 and I2 in the photographed image differs according to the relative position and angle between the user input device 1 and the electric device 2; with the reference position O2 of the electric device 2 as a reference point, the control unit 20 can calculate the position and angle O1 of the user input device 1 with respect to the reference point O2. The control unit 20 calculates the movement path from the position change using the determined position and angle, displays it on the display unit 12 or stores it in the storage unit 16, and can use the position, movement path, and angle as a user input of the currently executed program.

Further, the control unit 20 may transmit information on the position, angle, and movement path to the electric device 2, and the electric device 2 can use the received position, angle, and movement path as a user input of a program. The electric device 2 may also use the received information to calculate the position and angle of the electric device 2 with respect to the reference point O1 of the user input device 1.

The user input device 1 in FIG. 5 may be glasses or goggles provided with a camera, such as Google Glass, Microsoft's HoloLens, Magic Leap, or Oculus VR devices, and the electric device 2 may be a smart phone. In that case, the movement path (trajectory) of the electric device 2, the smartphone, can be reflected in the output contents of the user input device 1, the glasses/goggles. For example, when a user writes on a desk with a smartphone as the electric device 2, the user input device 1 such as Google Glass tracks the position of the electric device 2 in the manner described above and displays the resulting strokes on its display unit, superimposed on the image of the desk.

FIGS. 6A and 6B show fourth and fifth embodiments of the user input device.

FIG. 6A shows the fourth embodiment of the user input device. The user input device 1 photographs a fixed object P1 using the camera 13 and determines its position and angle using the captured image. The object P1 is an object on which rectangles r1, r2, r3, and r4 and a visual marker are displayed over its surface. The storage unit 16 stores image information of the object P1. The control unit 20 photographs the object P1 using the camera 13 and compares the photographed image with the image information of the object P1 stored in the storage unit 16, so that the relative position and angle with respect to, for example, r1 can be calculated.

The control unit 20 can determine a movement path, which is a position change, using the estimated position and angle, and can use the movement path and the angle as a user input.

The object P1 may be a mirror, and the storage unit 16 stores the original image displayed on the display unit 12. The control unit 20 displays the original image on the display unit 12, and the original image is reflected by the object P1, which acts as a reflector. The control unit 20 photographs the reflected image using the camera 13 and compares the captured reflected image with the original image information to calculate the position and angle of the user input device 1 with respect to the object P1. The captured reflection differs according to the relative position between the user input device 1 and the object P1, and with the position of the edge r1 of the object P1 as a reference point, the position and angle of the user input device 1 can be determined.

The control unit 20 can determine a movement path, which is a position change, using the estimated position and angle, and can use the movement path and the angle as a user input.

FIG. 6B shows the fifth embodiment of the user input device, in which the user input device 1 photographs information (for example, a dot code) displayed in an area S1 of a reference plane P2 with the camera 13, compares it with the judgment reference data (for example, images corresponding to dot codes, text, etc.) stored in the storage unit 16, and determines the position and angle of the user input device 1 with respect to the reference plane P2.

The reflecting device 40 includes a support part 41 for fixing and mounting the user input device 1 on the outer surface around the camera 13, and a reflection part 42 connected to the support part 41 for changing the photographing direction of the camera 13. The reflection part 42 is located in front of the camera 13 and allows the camera 13 to photograph the image of the area S1 reflected by the reflection part 42. In the trapezoidal shape, the first side 41a closer to the area S1 to be photographed is longer than the second side 41b opposite it, and the third side 41c is longer than the second side 41b (not shown in detail).

The user input device 1 of FIG. 6B is provided with the reflecting device 40 and the push detection adapter 30 in addition to the components of FIG. 1.

While acquiring the push signal from the push detection adapter 30, the control unit 20 compares and analyzes the image information of the area S1 obtained by the camera 13 with the previously stored judgment reference data, and determines the position (and angle) of the user input device 1 on the reference plane P2. The control unit 20 then displays on the display unit 12 an image, animation, annotation, text, or Internet link included in the corresponding judgment reference data. The control unit 20 may also store in the storage unit 16 the user input (for example, a handwriting operation) corresponding to the position change of the user input device 1 with respect to the displayed data, and may display the data and the user input together on the display unit 12.

In addition, position indicators VM (for example, visual markers) are displayed between the outer edge P2-E of the reference plane P2 and an inner border line P2-I spaced inward by a predetermined distance from the outer edge P2-E. Each position indicator VM has a different shape depending on its position on the reference plane P2. While the user moves or holds the user input device 1, the control unit 20 of the user input device 1 analyzes the image of a position indicator VM obtained through the camera 13, confirms the position of that position indicator VM on the reference plane P2, and stores it in the storage unit 16. The control unit 20 compares the image of the currently obtained position indicator VM with the position information of the position indicators VM stored in the storage unit 16, and can thereby determine the position and angle of the camera 13 or of the user input device 1. Through this process, the control unit 20 confirms the relative position and angle (degree of tilt) of the user input device 1 with respect to the reference plane P2.

For example, when the reference plane P2 is a specific page of a book and the storage unit 16 stores information on the contents of the book, the control unit 20 can read out, from the stored book contents, the contents corresponding to the area S1 of the page corresponding to the position and angle of the user input device 1, and provide them to the user through the display unit 12.

In FIGS. 5, 6A, and 6B, the user input device 1 may also control the display unit 12 to display a bright screen so that the first and second images I1 and I2 displayed on the display unit 2a of the electric device 2, the object P1, and the area on the reference plane P2 can be photographed more clearly. When the colors of the first and second images I1 and I2 on the display unit 2a, the object P1, and the reference plane P2 are the same as or similar to their surroundings, it is difficult to obtain clear images. In this case, the user input device 1 may control the display unit 12 to irradiate the first and second images I1 and I2, the object P1, and the reference plane P2 with light of a color different from the surroundings, for example red light, so that they can be more clearly distinguished from the surroundings. Alternatively, the user input device 1 may include a separate lamp unit to irradiate the first and second images I1 and I2, the object P1, and the reference plane P2 with light of a different color.

FIG. 7 shows an embodiment of a user input system to which the user input device according to the second embodiment is applied. The user input device 1 of FIG. 7 corresponds to the user input device 1 of FIG. 1 with the push detection adapter 30 mounted, and the electric device 3 is a device capable of wired or wireless communication and includes a display unit 3a formed as a touch screen. The user input device 1 and the electric device 3 constitute a kind of user input system.

When the user holds the user input device 1 by hand and performs a writing operation while touching the touch screen of the electric device 3, the commonly used touch screen 3a of the electric device 3 receives not only the contact input of the push detection adapter 30 but also contact inputs from the palm or other fingers. If the user's intention is to input only the touch of the push detection adapter 30, as if writing on paper with a writing tool, the palm or finger touch inputs should be ignored (palm rejection). When the push signal is obtained from the push detection adapter 30, the control unit 20 transmits it to the electric device 3 via the communication unit 17, and the electric device 3 can select only the contact input of the push detection adapter 30 using the reception timing of the push signal.

As shown in the lower part of FIG. 7, a control unit (not shown) of the electric device 3 acquires a plurality of contact inputs from the touch screen 3a: a contact input corresponding to stroke TI2 during the time period t1 to t3, a contact input corresponding to stroke TI1 during the time period t2 to t5, and a contact input corresponding to stroke TI3 during the time period t4 to t6. The control unit of the electric device 3 also acquires the push signal from the user input device 1 during the time period t2' to t5'. The control unit of the electric device 3 identifies the contact input made during the time period t2 to t5, which is equal or close to the period t2' to t5' during which the push signal was acquired and held, as the input of the push detection adapter 30. Therefore, the control unit of the electric device 3 judges that the contact input intended by the user is the contact input of stroke TI1 obtained during t2 to t5, and ignores the contact inputs of the other strokes TI2 and TI3. That is, the electric device 3 performs a palm rejection function using the push signal.
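
A minimal sketch of this timing-based selection is given below: the stroke whose time interval best overlaps the interval during which the push signal was held is kept, and the others are ignored. The data structures and the example timing values are illustrative assumptions.

```python
# Keep only the touch stroke that overlaps the push-signal interval the most.
def overlap(a_start, a_end, b_start, b_end):
    return max(0.0, min(a_end, b_end) - max(a_start, b_start))

def select_intended_stroke(strokes, push_start, push_end):
    """strokes: list of (stroke_id, t_start, t_end) reported by the touch screen."""
    best_id, best_overlap = None, 0.0
    for stroke_id, t_start, t_end in strokes:
        ov = overlap(t_start, t_end, push_start, push_end)
        if ov > best_overlap:
            best_id, best_overlap = stroke_id, ov
    return best_id   # the intended stroke; all others are palm-rejected

# Example with timings shaped like FIG. 7 (values in arbitrary seconds):
strokes = [("TI2", 1.0, 3.0), ("TI1", 2.0, 5.0), ("TI3", 4.0, 6.0)]
print(select_intended_stroke(strokes, push_start=2.1, push_end=5.1))  # -> TI1
```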

The user input device 1 and the electric device 3 can each calculate their absolute position and orientation (angle) using the built-in motion sensor 19, camera 13, depth sensor 18, and so on. By transmitting and receiving their position and orientation information to and from each other through communication, the user input device 1 and the electric device 3 can also calculate their relative position and rotation angle.
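
The relative-pose computation mentioned here is a simple change of reference frame; the following is a minimal sketch, assuming each device reports a rotation matrix and a position vector in a shared world frame (the names are illustrative).

```python
# Pose of the electric device expressed in the user input device's frame.
import numpy as np

def relative_pose(R_input, p_input, R_electric, p_electric):
    R_rel = R_input.T @ R_electric              # relative rotation
    p_rel = R_input.T @ (p_electric - p_input)  # relative position
    return R_rel, p_rel
```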

FIG. 8 shows the sixth embodiment of the user input device. The user input device 100 additionally includes a sound generating unit 30a, one side of which generates sound while in contact with the reference surface, and a sound guide pipe SP for guiding the sound generated by the sound generating unit 30a to the microphone 15. The sound guide pipe SP is formed inside the case of the user input device 100 and also serves to block sounds other than those from the sound generating unit 30a.

The sound generating unit 30a is mounted on the outside of the user input device 100 in contact with the sound guide pipe SP. The sound generating unit 30a may be an element that generates sound by friction between the reference surface and its outer surface, or an element that generates sound when its outer surface touches or is pressed against the reference surface.

The user input device 100 may include the same elements as the user input device 1, and the control unit of the user input device 100 may process the position, angle, and position changes of the user input device 100 as user input.

FIGS. 9A and 9B show an embodiment and a configuration diagram of a push-detection device corresponding to the push-detection adapter.

FIG. 9A shows an embodiment in which the push-detection device 50 is used. The push-detection device 50 is, for example, in the form of a ring that is fitted on the finger f and moves with the finger f. The push-detection device 50 contacts at least a part of the surface of the finger f, detects or acquires sound transmitted through the finger f (or the skin of the finger f), generates a push signal including the acquired sound, and transmits it to the user input device 1. The push-detection device 50 also encompasses components that are mounted in contact with tissue of the human body other than the finger and that move with the movement of that tissue.

When the finger f is positioned at the position L1, the finger f is floating in the air above the reference plane P3, so that the push-detection device 50 hardly receives any sound through the finger f.

In the case of the position L2, the end of the finger f touches the contact point T1 of the reference plane P3, and the sound of the contact (an impact sound, vibration, or the like) is transmitted to the push-detection device 50 through the finger f.

When the end of the finger f, while touching the reference plane P3, is dragged from the position L2 to the position L3, the sound (vibration) due to the friction between the tip of the finger f and the reference plane P3 is transmitted to the push-detection device 50 through the finger f.
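The three situations above (the finger floating at L1, a brief impact at L2, and a sustained friction sound while dragging from L2 to L3) could, for illustration, be told apart from a short window of microphone samples as sketched below; the thresholds and the window handling are assumptions, not part of the embodiment.

    # Hypothetical classifier for the three situations: floating, tap, drag.
    def classify_contact(samples, contact_threshold=0.05, sustain_ratio=0.5):
        """Return 'floating', 'tap', or 'drag' for a window of normalized samples."""
        if not samples:
            return "floating"
        peak = max(abs(s) for s in samples)
        if peak < contact_threshold:
            return "floating"                    # position L1: no sound through the finger
        active = sum(1 for s in samples if abs(s) >= contact_threshold)
        if active / len(samples) >= sustain_ratio:
            return "drag"                        # sustained friction sound, L2 -> L3
        return "tap"                             # brief impact at the contact point T1

    print(classify_contact([0.0, 0.4, 0.1, 0.02, 0.0, 0.0, 0.0, 0.0]))  # 'tap'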

The push-detection device 50 generates a push signal including the transmitted sound and transmits the push signal to the user input device 1. The user input device 1 processes this push signal in the same manner as the push signal from the push-detection adapter 30 described above.

FIG. 9B is a block diagram of the push-detection device 50. The push-detection device 50 includes a power supply unit 51 for supplying the power to be used, a communication unit 53 for transmitting the push signal to the outside via wireless communication, a microphone 55 for acquiring sound, and a control unit 57 for generating a push signal including the sound obtained from the microphone 55 and transmitting it to the user input device 1 through the communication unit 53. The power supply unit 51 supplies power to the communication unit 53, the microphone 55, and the control unit 57 from, for example, a battery. The communication unit 53 corresponds to a device that performs wireless communication such as Bluetooth or RF communication. As shown in FIG. 9A, the microphone 55 is exposed at the inner surface of the ring-shaped case of the push-detection device 50 so as to be brought into contact with the surface of the finger f; sound other than that transmitted through the finger f is blocked, so that only the sound through the finger f can be precisely acquired.

The control unit 57 generates a push signal when sound is obtained through the microphone 55 and controls the communication unit 53 to transmit the push signal.
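A minimal sketch of this control-unit behaviour is given below; read_microphone_level and send_push_signal are hypothetical stand-ins for the microphone 55 and the communication unit 53, and the threshold and polling interval are illustrative assumptions.

    import itertools
    import random
    import time

    def run_push_detection(read_microphone_level, send_push_signal,
                           threshold=0.05, poll_interval_s=0.01, max_polls=None):
        """Poll the microphone level and emit push signals on sound onset and offset."""
        pressed = False
        polls = itertools.count() if max_polls is None else range(max_polls)
        for _ in polls:
            level = read_microphone_level()                          # normalized level, 0..1
            if level >= threshold and not pressed:
                pressed = True
                send_push_signal({"state": "down", "level": level})  # contact begins
            elif level < threshold and pressed:
                pressed = False
                send_push_signal({"state": "up", "level": level})    # contact ends
            time.sleep(poll_interval_s)

    # Tiny simulation: random sound levels, push signals printed instead of transmitted.
    run_push_detection(lambda: random.random() * 0.1, print, max_polls=20)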

The user input device 1 receives the push signal from the push-detection device 50, determines the start point of the user input (the point at which the push signal starts to be received) and the end point (the point at which the push signal stops being received), and then either accepts only the user input between the start point and the end point, or processes the input between the start point and the end point differently from the input outside it.
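For illustration only, gating the user input with the push-signal window could be sketched as follows; the sample format and the function name are assumptions.

    # Keep samples recorded between the start point and the end point of the push signal;
    # samples outside the window can be ignored or processed differently.
    def split_by_push_window(samples, start_time, end_time):
        """samples: list of (timestamp, position) tuples; returns (inside, outside)."""
        inside, outside = [], []
        for timestamp, position in samples:
            if start_time <= timestamp <= end_time:
                inside.append((timestamp, position))
            else:
                outside.append((timestamp, position))
        return inside, outside

    samples = [(0.0, (0, 0)), (0.5, (1, 1)), (1.0, (2, 1)), (1.6, (3, 0))]
    stroke, rest = split_by_push_window(samples, start_time=0.4, end_time=1.2)
    print(stroke)  # the two samples captured while the push signal was held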

The processing of an electrical signal such as the push signal, the processing of the position and angle of the user input device and of their changes, and the like, may be stored in a storage medium as a program file. This program file may be transferred between electric devices by transmission over a network and installed in various electric devices, where the same operations are performed. That is, the functions performed by the user input processing device may be provided as a program stored in a computer-readable storage medium.

In the foregoing, the present invention has been described in detail by way of example on the basis of the embodiments and the accompanying drawings. However, the scope of the present invention is not limited by the above embodiments and drawings; it is limited only by the content of the following claims.

10: input unit 11: connection unit
12: display section 13: camera
14: Speaker 15: Microphone
16: storage unit 17: communication unit
18: Depth sensor 19: Motion sensor
20: control unit 30: push detection adapter

Claims (27)

A position sensing unit provided in the user input device for sensing a position or an angle of the user input device and a position or an angle change caused by movement of the user input device;
And a controller for determining a user input according to the position, the angle, or the change of position or angle of the user input device based on the sensed value from the position sensing unit.
The method according to claim 1,
Wherein the control unit judges the position and angle of the reference unit of the user input device based on the sensing value from the position sensing unit and controls the currently executed program according to the determined position or angle.
The method according to claim 1,
The user input device includes a display unit for displaying a user input or a program currently being executed, and a storage unit for storing judgment reference data for determining a user input according to a moving path of the user input device,
Wherein the control unit compares the movement path according to the position change from the position sensing unit with the previously stored judgment reference data, judges the user input of the judgment reference data corresponding to the movement path, and displays the judged user input on the display unit or controls the program that is currently being executed.
The method according to claim 1,
The position sensing unit includes a camera that captures an image, and the determination reference data includes an original image of an object or an image displayed on the object,
Wherein the control unit compares the image of the object photographed by the camera with the original image of the object or the image displayed on the object to determine the position and angle of the user input device with respect to the object and the change of the position and angle.
The method according to claim 1,
The position sensing unit includes a camera for photographing an image of an object and a depth sensor for sensing the depth of the object,
Wherein the control unit compares the images of the object photographed by the camera at a plurality of viewpoints with each other, or compares the depth detection values of the object obtained from the depth sensor at a plurality of viewpoints with each other, to determine the position and angle of the user input device with respect to the object, or the position and angle of the object with respect to the user input device, and the changes of the position and angle.
The method according to claim 4 or 5,
Wherein the control unit irradiates light to the object through the display unit for identification of the object.
The method according to claim 4 or 5,
Wherein the control unit processes the change of the angle by changing the thickness or color of the stroke, which is a user input displayed on the display unit.
The method according to claim 4 or 5,
Wherein the object is a mirror or an electric device having a display unit for displaying an image.
9. The method of claim 8,
Wherein the control unit displays an original image on a display unit, photographs an image reflected from a mirror, which is an object, using a camera, and compares the taken reflected image with the original image.
9. The method of claim 8,
Wherein the electric device provides the original image including the image displayed on the electric device to the user input device.
The method according to claim 4 or 5,
Wherein the user input device is provided with a reflecting device for changing the photographing angle in front of the camera.
6. The method according to any one of claims 1 to 5,
Wherein the control unit determines a start time and an end time of a user input.
13. The method of claim 12,
Wherein the control unit processes the user input between the start point and the end point and the user input before and after the start point differently from each other.
13. The method of claim 12,
The user input device includes a microphone, the storage unit stores a reference input sound,
Wherein the control unit compares the sound obtained from the microphone with the reference input sound to determine the start and end points of the user input.
15. The method of claim 14,
The user input device comprises a case, a sound generating part for generating sound on one side of the case, and a sound guide pipe, formed inside the case, for guiding the sound generated by the sound generating part to the microphone.
13. The method of claim 12,
The position sensing unit includes a magnetic field sensor and a motion sensor composed of at least one of an accelerometer and a gyroscope, and the control unit determines, as the start time of the user input, the point in time at which the acceleration from the motion sensor becomes greater than a reference acceleration, after which the position change of the user input device is determined to move on a plane.
13. The method of claim 12,
Wherein the user input device includes a case, a push-detection adapter protruding outside the case to generate a push signal, and a connection part formed inside the case and connected to the push-detection adapter to apply the push signal to the control unit.
18. The method of claim 17,
Wherein the control unit determines start and end points of the user input according to the electrical characteristics of the push signal.
18. The method of claim 17,
The push-detection adapter includes a ground line (GND) and a microphone signal line (MIC) for input, a circuit part composed of a resistor connected between the ground line (GND) and the microphone signal line (MIC), a first conductor part connected to the ground line (GND), and a second conductor part connected to the microphone signal line (MIC), wherein the first and second conductor parts are electrically disconnected or connected by an external force.
18. The method of claim 17,
The push-detection adapter includes a case having a groove in its interior and constituting an outer shape, and a sensor part connected to a speaker signal line and a ground line (GND), a part of which protrudes to the front to sense pressure, wherein the speaker signal line is connected at one end to the microphone signal line (MIC) and at the other end to the ground line (GND), so that the electrical signal corresponding to the pressure applied to the sensor part is applied through the ground line (GND) and the microphone signal line (MIC).
18. The method of claim 17,
The push-detection adapter includes a case having a groove in its interior and constituting an outer shape, a sensor part inserted in the groove to sense pressure, an end part covering the sensor part and contacting the sensor part under an external force, a sub-control unit for generating a control signal for controlling the speaker output through the microphone signal line and ground line (MIC-GND), and a power harvesting unit for transforming and rectifying the speaker output supplied through the speaker output line to supply DC power to the sub-control unit.
18. The method of claim 17,
The push-detection adapter includes a case having a groove formed therein and constituting an outer shape, a sensor inserted in the groove to sense pressure, an end part contacting the sensor under an external force, and a sub-control unit for generating data including information on the pressure of the sensor and applying the data to the control unit through the data line (DATA).
The user input device according to claim 12 has a communication unit for performing wired communication or wireless communication,
The control unit transmits a signal including a start point or an end point of a user input to an external electric device through a communication unit,
And an external electric device for determining a user input based on a start time or an end time included in the transmitted signal.
A push-detection device mounted on tissue of a human body for sensing a sound transmitted through the tissue of the human body, generating a push signal including the sensed sound, and transmitting the push signal to a user input device;
And a user input device for determining start and end points of a user input based on a push signal received from the push detection device.
25. The method of claim 24,
The push-detection device includes a microphone for acquiring sound while in contact with or close to the tissue of the human body, a communication unit for transmitting the push signal to the user input device, and a control unit for generating a push signal including the sound from the microphone and transmitting it to the user input device through the communication unit.
25. The method of claim 24,
Wherein the user input device processes a user input between the start point and the end point of a user input and a user input before or after them differently from each other, and controls a program currently being executed.
25. The method of claim 24,
Wherein the push-detection device has a ring-shaped structure that fits on the finger.
KR1020150018907A 2014-09-15 2015-02-06 User input apparatus using location or change thereof KR20160031940A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR20140121796 2014-09-15
KR1020140121796 2014-09-15

Publications (1)

Publication Number Publication Date
KR20160031940A true KR20160031940A (en) 2016-03-23

Family

ID=55645271

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150018907A KR20160031940A (en) 2014-09-15 2015-02-06 User input apparatus using location or change thereof

Country Status (1)

Country Link
KR (1) KR20160031940A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113777621A (en) * 2020-06-09 2021-12-10 北京小米移动软件有限公司 Electronic device, relative position relation detection method and device, and storage medium

Similar Documents

Publication Publication Date Title
US10444908B2 (en) Virtual touchpads for wearable and portable devices
JP6539816B2 (en) Multi-modal gesture based interactive system and method using one single sensing system
KR102496531B1 (en) Method for providing fingerprint recognition, electronic apparatus and storage medium
US10509489B2 (en) Systems and related methods for facilitating pen input in a virtual reality environment
US10268277B2 (en) Gesture based manipulation of three-dimensional images
KR20150104808A (en) Electronic device and method for outputing feedback
JP2012515966A (en) Device and method for monitoring the behavior of an object
EP2338154A1 (en) Finger-worn device and interaction methods and communication methods
TW201405414A (en) Virtual hand based on combined data
US10540023B2 (en) User interface devices for virtual reality system
US10114512B2 (en) Projection system manager
JP7086116B2 (en) Electronic devices for analog stroke generation and digital recording of analog strokes, as well as input systems and methods for digitizing analog recordings.
US11360550B2 (en) IMU for touch detection
KR102591686B1 (en) Electronic device for generating augmented reality emoji and method thereof
US10168838B2 (en) Displaying an object indicator
US11886643B2 (en) Information processing apparatus and information processing method
KR102176575B1 (en) Electronic device and method for sensing inputs
TWI592862B (en) Tracking a handheld device on surfaces with optical patterns
KR20210017081A (en) Apparatus and method for displaying graphic elements according to object
KR100749033B1 (en) A method for manipulating a terminal using user's glint, and an apparatus
KR20160031940A (en) User input apparatus using location or change thereof
KR101397812B1 (en) Input system of touch and drag type in remote
JP7275885B2 (en) DISPLAY DEVICE, DIRECTION SPECIFICATION METHOD, PROGRAM
KR20170004102A (en) Electronic device
US11836320B2 (en) Spurious hand signal rejection during stylus use

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E601 Decision to refuse application