KR20110094737A - Keyboard with mouse using touchpad - Google Patents

Keyboard with mouse using touchpad Download PDF

Info

Publication number
KR20110094737A
Authority
KR
South Korea
Prior art keywords
keyboard
mouse
finger
touch pad
touch
Prior art date
Application number
KR1020100014318A
Other languages
Korean (ko)
Inventor
홍상훈
Original Assignee
홍상훈
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 홍상훈 filed Critical 홍상훈
Priority to KR1020100014318A priority Critical patent/KR20110094737A/en
Publication of KR20110094737A publication Critical patent/KR20110094737A/en

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02Input arrangements using manually operated switches, e.g. using keyboards or dials
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03543Mice or pucks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)

Abstract

The present invention relates to a touch pad type keyboard, and more particularly, to a keyboard and a key input method including a touch pad type mouse function.
The present invention provides a touch-pad-type mouse combined keyboard comprising a display unit that displays an image of the item being executed, a touch pad unit that covers the upper surface of the display unit and recognizes pressure applied to it, and a controller that recognizes a signal input to the touch pad unit as at least one of a keyboard mode and a mouse mode and controls the display unit and the touch pad unit to perform the corresponding function.

Description

Keyboard with touchpad type mouse {Keyboard with mouse using touchpad}

The present invention relates to a touch pad type keyboard, and more particularly, to a keyboard and a key input method including a touch pad type mouse function.

FIG. 1 is a view showing a computer system using a conventional keyboard.

The conventional computer system 100 includes a main body 110, a monitor 120, a keyboard 130, and a mouse 140.

The main body 110 is the part that controls the overall operation of the computer system 100; it recognizes and processes inputs from the keyboard 130 and the mouse 140, and then displays the result on the monitor 120.

The keyboard 130 is an input device of the computer system 100 and includes letter, number, special-character, and function keys. The number of keys on the keyboard 130 tends to increase as users request additional functions, and a commonly used keyboard has about 100 keys.

The function keys include Control, Alt, Shift, Enter, and Tab; they can be used to change the original meaning of a key or to control program operation, and they are also used to move text or the cursor on the monitor 120.

Meanwhile, the mouse 140 is an input device of the computer system 100 together with the keyboard 130; it includes a case, a ball, and a cable, and a button and a wheel are attached to the upper surface of the case. The ball sends a signal to the main body that indicates the cursor position on the screen, and the cable is the line connecting the mouse to the main body 110. The buttons, usually one to three, are used to select or execute commands, and a wheel between the buttons is used to scroll quickly up and down the screen. In order to use the mouse 140, a mouse driver that receives and processes the movement of the mouse 140 and a mouse pad on which the ball can move freely are required.

As described above, in the conventional computer system 100 the main body 110, the monitor 120, the keyboard 130, the mouse 140, and the wires connecting them must all be arranged within a given space, so space utilization decreases dramatically.

In addition, since both the keyboard 130 and the mouse 140 must be provided in order to use the computer system 100, the cost burden on the user increases.

In addition, since the keys of a conventional keyboard must be physically pressed, each key must have a certain rectangular size and the keys must be spaced apart so that the neighboring letter keys are not pressed together. As a result, the keyboard cannot be made smaller than a certain limit; if it were made smaller, the keys would no longer fit an adult's fingers and typing would become difficult.

It is an object of the present invention to provide a keyboard in which any area of a touch-pad-type keyboard can function as both a keyboard and a mouse, so that key button signals and mouse signals can easily be input without a separate mouse, and a key input method using the same.

Another object of the present invention is to provide a keyboard, and a key input method using the same, that performs key button input and mouse signal input with a single device by operating a region of the touch screen, selected by the user, as a mouse.

To this end, the present invention provides a touch-pad-type mouse combined keyboard comprising: a display unit 10 that displays an image of the item being executed; a touch pad unit 20 that covers the upper surface of the display unit 10 and recognizes pressure applied to it; and a control unit 30 that recognizes a signal input to the touch pad unit 20 as one or more of a keyboard mode and a mouse mode and controls the display unit 10 and the touch pad unit 20 to perform the corresponding function.

The present invention also provides a signal input method comprising: a mode selection step (S1) of selecting a keyboard mode or a mouse mode by touching the touch pad unit 20 with the fingers; a keyboard forming step (S2) in which, when the keyboard mode is selected, either a generated keyboard layout 11 that is automatically created at the touched portion or a pre-generated keyboard layout 12 is displayed on the display unit 10; a mouse function application step (S3) in which, when the mouse mode is selected, the mouse can be operated from the portion where the finger is touched; a signal input step (S4) of inputting a signal by touching the keys of the keyboard layout 11 or 12 or by operating the mouse in the mouse mode; and a transmission step (S5) of transmitting the input signal to an external device.

By utilizing the present invention, the keyboard layout can be formed automatically to reflect the user's key input habits, so the keyboard is very convenient to use and can serve as a user-oriented keyboard.

In addition, since a separate mouse is not necessary, it is convenient to carry and use, and since both the keyboard and the mouse are provided in the form of a touch pad, the noise is greatly reduced, which is very convenient for use in public places.

In addition, since the keyboard and mouse are used simultaneously as one device, space utilization is high.

FIG. 1 illustrates a computer system using a conventional keyboard,
FIG. 2 (a) shows the present invention applied to a notebook computer,
FIG. 2 (b) shows the present invention applied to a desktop computer,
FIG. 3 is an exploded perspective view of the present invention,
FIG. 4 (a) and (b) show cases in which an automatically generated keyboard layout is displayed on the display unit in the present invention,
FIG. 4 (c) shows a case in which a stored keyboard layout is displayed on the display unit in the present invention,
FIG. 5 is a block diagram schematically showing the configuration of the present invention; and
FIG. 6 is a flowchart illustrating a key input method of a keyboard according to an embodiment of the present invention.

Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings. Prior to this, the terms and words used in this specification and the claims should not be construed in their ordinary or dictionary sense; based on the principle that an inventor may properly define the concepts of terms in order to best describe the invention, they should be interpreted with meanings and concepts corresponding to the technical idea of the present invention.

Therefore, the embodiments described in this specification and the configurations shown in the drawings are only the most preferred embodiments of the present invention and do not represent all of its technical idea; it should be understood that various equivalents and modifications capable of replacing them may exist at the time of filing.

FIG. 2 (a) shows the present invention applied to a notebook computer, FIG. 2 (b) shows the present invention applied to a desktop computer, FIG. 3 is an exploded perspective view of the present invention, FIG. 4 (a) and (b) show cases in which an automatically generated keyboard layout is displayed on the display unit, FIG. 4 (c) shows a case in which a stored keyboard layout is displayed on the display unit, and FIG. 5 is a block diagram schematically showing the configuration of the present invention.

As shown in (a) and (b) of FIG. 2, the present invention relates to a touch-pad-type mouse combined keyboard in which key button signals and mouse signals can both be input through a touch-pad-type keyboard.

As shown in FIG. 2, a computer system using the present invention includes a monitor 1, a main body 3, and a keyboard 2. Although the name computer is used here, it refers to a variety of known terminals.

The monitor 1 is a device on which a program running on the main body 3 is displayed. In the present invention, key button signals input through the keyboard 2 are displayed, and when the keyboard 2 operates as a mouse, the cursor is displayed at a corresponding position according to the pointing data of the mouse.

The main body 3 is the part that controls the overall operation of the computer system. In particular, in the present invention, it receives the key button signals and mouse signals input through the keyboard 2 and controls the operation of each device according to the received signals.

The keyboard 2 is a touch screen type and is configured such that any area of the keyboard 2 can operate as a mouse.

The present invention provides a touch-pad-type mouse combined keyboard that performs both a keyboard function and a mouse function, and the position at which the keyboard or mouse function operates can be chosen arbitrarily. That is, the keyboard mode or the mouse mode is executed according to the pattern in which the user's fingers touch the pad, and the keyboard layout can be displayed at the touched portion or the mouse function can be performed there.

More specifically, referring to FIG. 5, the keyboard 2 preferably includes a control unit 30, a driving unit 15, a display unit 10, a touch pad unit 20, a touch pad device 25, a keyboard array generation unit 31, a keyboard array storage unit 32, a random access memory (RAM) 33, an interface unit 34, and a connector 35.

When the input mode is the keyboard mode, an arbitrary area of the display unit 10 displays a key arrangement made up of the key images of a plurality of key buttons. For example, an array of text keys, numeric keys, and function keys corresponding to the current input mode is displayed.

In addition, when the input mode is the mouse mode, a touch panel and touch pad buttons for the mouse, that is, a mouse image, may be set in an arbitrary area of the display unit 10. Of course, in the mouse mode it is also possible to display no image at all.

The display unit 10 may display the keyboard layout at an arbitrary position according to the user's selection. The displayed layout may be either a generated keyboard layout 11, whose key arrangement is created in various ways according to the size of the user's hand and the way the user presses keys, or a keyboard layout 12 that has already been created and stored; which one is used is a matter of choice. The same applies in the mouse mode, where the mouse can be operated at any position. The keyboard layout and the mouse operation are described later.

The display unit 10 is typically implemented as an LCD (Liquid Crystal Display), but it is not limited thereto and may be implemented with other display elements.

The touch pad unit 20, which performs key button input and mouse signal input, is positioned above the display unit 10. The touch pad unit 20 is the part on which the user actually inputs data, and it may be manufactured using at least one of a contact capacitance method, a resistive film method, a surface ultrasonic conduction method, a tension measurement method, and an infrared light method.

Specifically, in the contact capacitance method, a transparent conductive material (TAO: Tin Antimony Oxide) is coated on both sides of the glass that constitutes the touch screen sensor, and the capacitance of the human body is used. That is, when the user touches the coated touch screen, a certain amount of the current flowing on the surface of the touch screen is absorbed into the user's body, and the touch screen recognizes the coordinates of the portion where the amount of current has changed.

In the resistive film method, a resistive component is coated on the surface of the touch screen and covered with a special film. A resistive coating is also applied to the inside of the film, and insulating spacers are placed at regular intervals so that the two resistive layers do not touch each other; when the screen is pressed, the layers come into contact and the coordinates of the contacted portion are recognized.

The surface ultrasonic conduction method is a method in which a transducer attached to the touch screen emits surface ultrasonic waves and a transducer installed on the opposite side receives them. Since the transmitted surface ultrasonic waves are otherwise constant, when pressure is applied to a specific portion of the touch screen, the waves passing through that portion are weakened, and the coordinates of the touched portion are recognized.

In the tension measurement method, sensors capable of measuring tension are attached to the four corners of the touch screen, and the coordinates of the contact point are detected from the force applied to the screen.

In the infrared light method, the coordinates of the touched portion are recognized by detecting where the infrared scanning light travelling across the touch screen surface is blocked.

Meanwhile, the controller 30 reads the key arrangement information from the keyboard array generation unit 31 or the keyboard array storage unit 32 according to the input mode selected by the user, and displays the corresponding key image or mouse image on the display unit 10.

In order to select the keyboard mode during the input mode selection process, at least four fingers must touch the touch pad unit 20; the keyboard array generation unit 31 then recognizes the layout of the fingers, automatically generates the generated keyboard layout 11 from it, and displays it on the display unit 10. In addition, the keyboard array storage unit 32 stores a previously formed keyboard layout, the pre-generated keyboard layout 12, which is displayed on the display unit 10 under the control of the controller 30.

In addition, the controller 30 receives the various data and key input values entered through the touch pad unit 20 according to the key image or mouse image displayed on the display unit 10, and transmits the received data to the main body 3 through the connector 35.

The driver 15 drives the display unit 10 according to a control signal of the controller 30.

The touch pad device 25 recognizes the input signal transmitted from the touch pad unit 20, calculates the position coordinates of the point in contact with the touch pad unit 20, and transmits the corresponding data signal to the controller 30.

The key arrangement storage unit 36 stores an image corresponding to each key button displayed on the display unit 10.

The coordinate table unit 37 stores coordinate values on the display unit 10 of the image stored in the key array storage unit 36. Therefore, the controller 30 outputs a key image and a mouse image to a specific position on the display unit 10 using the coordinate values stored in the coordinate table unit 37. In addition, the controller 30 recognizes a key button signal or a mouse signal corresponding to the data transmitted from the touch pad device 25 using the coordinate values stored in the coordinate table 37.
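The lookup performed with the coordinate table unit 37 can be illustrated with a short sketch. This is not the patent's implementation; it is a minimal Python sketch in which the table format, the function name `resolve_touch`, and the example key regions are all assumptions made for illustration.

```python
# Minimal sketch of the coordinate-table lookup performed by the controller (30).
# The table maps a rectangular region of the display to a key code; the regions
# and key names below are hypothetical, for illustration only.

KEYBOARD_MODE, MOUSE_MODE = "keyboard", "mouse"

# (x_min, y_min, x_max, y_max) -> key code, as stored by the coordinate table unit (37)
coordinate_table = {
    (0, 0, 40, 40): "Q",
    (40, 0, 80, 40): "W",
    (80, 0, 120, 40): "E",
}

def resolve_touch(x, y, mode):
    """Translate a touch coordinate reported by the touch pad device (25)
    into either a key button signal or a mouse pointing signal."""
    if mode == KEYBOARD_MODE:
        for (x0, y0, x1, y1), key in coordinate_table.items():
            if x0 <= x < x1 and y0 <= y < y1:
                return ("key", key)
        return None  # the touch fell outside every key image
    # In mouse mode the raw coordinate itself is the pointing data.
    return ("pointer", (x, y))

print(resolve_touch(50, 10, KEYBOARD_MODE))  # ('key', 'W')
print(resolve_touch(50, 10, MOUSE_MODE))     # ('pointer', (50, 10))
```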

The RAM 33 temporarily stores data input from the touch pad unit 20 and data generated during the operation of the controller 30.

The interface unit 34 converts the data input from the touch pad unit 20 into a data format for transmission to the main body 3.

The connector 35 is a connection terminal for connecting the keyboard 2 with the main body 3, and an RS-232C connector, a parallel connector, a USB connector, and the like may be used.

Hereinafter, the keyboard mode and the mouse mode using the keyboard 2 according to the present invention will be described in detail.

As described above, in the present invention, the keyboard 2 performs not only a keyboard function but also a mouse function. For convenience, it is referred to as keyboard mode and mouse mode.

The touch-pad-type mouse combined keyboard of the present invention comprises the display unit 10, which displays an image of the item being executed; the touch pad unit 20, which covers the upper surface of the display unit 10 and performs input by recognizing pressure applied to it; and the control unit 30, which recognizes a signal input to the touch pad unit 20 as one or more of the keyboard mode and the mouse mode and controls the display unit 10 and the touch pad unit 20 to perform the corresponding function.

That is, the touch pad unit 20 is provided on the upper surface of the display unit 10 so as to select what is displayed on the display unit 10.

In addition, a feature of the present invention is that the keyboard mode or the mouse mode can be used depending on how the finger touches the touch pad unit 20 without a separate button. Of course, it is also possible to execute the keyboard mode and the mouse mode simultaneously in one keyboard.

First, an embodiment of a keyboard mode execution method will be described.

The keyboard mode is selected by four touches of the fingers of the left hand, four touches of the fingers of the right hand, or both, on the touch pad unit 20. That is, when four fingers of either hand touch the touch pad unit 20 simultaneously, the input is recognized as the keyboard mode. Of course, both hands may be touched. Once recognized, the keyboard mode is maintained for a certain time, and this time can be set and changed by the user.
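A minimal sketch of this mode-selection rule is given below, assuming the touch pad device reports the set of simultaneous contacts; the function name `select_mode` and the hold time are illustrative assumptions, not part of the patent.

```python
# Illustrative sketch of the mode-selection rule: four simultaneous finger
# touches select the keyboard mode, one or two touches select the mouse mode.
# The names and the hold time are assumptions made for illustration.

KEYBOARD_MODE, MOUSE_MODE = "keyboard", "mouse"
KEYBOARD_MODE_HOLD_SECONDS = 30  # user-configurable, per the description above

def select_mode(touch_points):
    """touch_points: list of (x, y) contacts reported at the same instant."""
    n = len(touch_points)
    if n >= 4:           # four fingers of one hand, or both hands at once
        return KEYBOARD_MODE
    if n in (1, 2):      # one- or two-finger touch
        return MOUSE_MODE
    return None          # e.g. three touches: no mode defined in this sketch

print(select_mode([(10, 5), (30, 8), (52, 6), (75, 9)]))  # keyboard
print(select_mode([(120, 40)]))                            # mouse
```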

Since only the four fingers of one hand are required, the keyboard layout can be generated by four touches of either the left or the right hand, and depending on the setting, only the half of the keyboard corresponding to that hand may be shown on the display unit 10. That is, the keyboards corresponding to the left hand and the right hand may be provided separately. In addition, the keyboard layout may be generated at the portions where the fingers of both the left and right hands are touched.

As shown in (a) and (b) of FIG. 4, in the present invention the keyboards corresponding to the left hand and the right hand are provided as the generated keyboard layout 11. In addition, as shown in (c) of FIG. 4, the present invention may display the pre-generated keyboard layout 12, which has already been created and stored, on the display unit 10 when the keyboard mode is invoked.

This is performed by the controller 30. When the keyboard mode is selected in the input mode selection process, the controller 30 displays the generated keyboard layout 11 on the area of the display unit 10 where the fingers touched the touch pad unit 20, so that key input is performed by touching the touch pad unit 20 above it. In addition, in the process of selecting the keyboard mode, the controller 30 recognizes the spacing between the fingers touching the touch pad unit 20 and their relative positions, automatically forms the generated keyboard layout 11 to correspond to that shape, displays it on the display unit 10, and lays out the remaining keys in the same pattern (see FIG. 4A).

In addition, as shown in (b) of FIG. 4, the keys of the generated keyboard layout may be rounded, or they may have other shapes. As can be seen in the arrangement of the left and right keyboards, the central part is convex upward. This is because the middle finger is generally the longest, so it is ergonomically more comfortable for its keys to be located higher on the keyboard. That is, the keyboard layout is formed in consideration of the vertical as well as the horizontal arrangement of the fingers.

In other words, each user uses the keyboard differently and has a different hand size, so the arrangement and layout of the keyboard is adapted to the user. When the user touches the touch pad unit 20 with the fingers, the distance between each finger and the relative positions of the touched fingers are determined, and the arrangement of the remaining keys is formed automatically based on the four touched fingers.
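One possible way to derive a layout from the four touched fingers is sketched below, assuming each touch becomes the home-row key of that finger's column and the other rows are offset by a fixed pitch; the column-to-letter assignment, the pitch, and the helper name `generate_layout` are hypothetical.

```python
# Sketch of generating a keyboard layout (11) from four finger touches of one
# hand. Each touch becomes the home-row key of that finger's column; the rows
# above and below are offset by a fixed pitch. Keys and pitch are hypothetical.

ROW_PITCH = 45  # vertical spacing between rows, in touch-pad units (assumed)

# Hypothetical column assignment for the four fingers of the left hand,
# listed top row, home row, bottom row.
LEFT_HAND_COLUMNS = [
    ["Q", "A", "Z"],  # little finger
    ["W", "S", "X"],  # ring finger
    ["E", "D", "C"],  # middle finger
    ["R", "F", "V"],  # index finger
]

def generate_layout(finger_points):
    """finger_points: four (x, y) touches of the left hand.
    Returns {key: (x, y)} centre positions that follow the hand's own shape."""
    layout = {}
    # Sort by x so the leftmost touch is treated as the little finger.
    for (x, y), column in zip(sorted(finger_points), LEFT_HAND_COLUMNS):
        for row, key in enumerate(column):       # row 1 is the home row
            layout[key] = (x, y + (row - 1) * ROW_PITCH)
    return layout

# Fingers resting slightly higher under the longer middle finger:
print(generate_layout([(20, 100), (60, 92), (100, 85), (140, 95)]))
```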

Alternatively, when the keyboard mode is selected, the controller 30 may control the display unit 10 to display the pre-generated keyboard layout 12, which has been created and stored in advance, so that keys are input by touching the touch pad unit 20 above it (see FIG. 4C). Here, it is preferable that the spacing and arrangement of the keys of the pre-generated keyboard layout 12 can be changed.
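One way the spacing of such a stored layout could be made adjustable is sketched below; the stored-layout format (a key-to-position mapping) and the helper `rescale_layout` are assumptions, not taken from the patent.

```python
# Sketch of adjusting the key spacing of a stored, pre-generated layout (12).
# The stored-layout format and the scale factors are assumptions for illustration.

def rescale_layout(stored_layout, x_scale=1.0, y_scale=1.0, origin=(0, 0)):
    """stored_layout: {key: (x, y)} centre positions as saved in the keyboard
    array storage unit (32). Returns a copy stretched about `origin`."""
    ox, oy = origin
    return {
        key: (ox + (x - ox) * x_scale, oy + (y - oy) * y_scale)
        for key, (x, y) in stored_layout.items()
    }

stored = {"Q": (20, 55), "W": (60, 55), "E": (100, 55)}
print(rescale_layout(stored, x_scale=1.2))  # widen the key spacing for a larger hand
```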

In addition, the controller 30 may control the generated keyboard layout 11 or the pre-generated keyboard layout 12 displayed on the display unit 10 to be transparent. This is because the user is already familiar with the arrangement and placement of the keys, and because a keyboard layout optimized for the user will have been created through the user's own settings.

Next, an embodiment of a mouse mode execution method will be described.

The selection of the mouse mode may be performed by one touch of the finger or two touches of the finger on the touch pad unit 20.

That is, since the touch-pad-type keyboard of the present invention can also be used as a mouse, the mouse mode must be selected in a way that differs from the keyboard mode, so it is executed by one or two finger touches.

When the mouse mode is selected, the mouse can be operated from the point where the finger touches the touch pad unit 20, under the control of the controller 30.

To use the mouse mode, first, when the mouse mode has been selected with a one-finger touch, the user adds a touch of a second finger. Touching with the left finger while keeping the touch of the finger located on the right activates the left button function of the mouse; touching with the right finger while keeping the touch of the finger located on the left activates the right button function of the mouse; and dragging one finger activates the cursor movement function. Of course, as with a conventional mouse, dragging two fingers while the left button function is active (both fingers touching) drags the selected item on the monitor. Here, left and right may be swapped according to the user's selection.

Next, when the mouse mode is selected with a two-finger touch, touching with the left finger while keeping the touch of the finger located on the right activates the left button function of the mouse, and touching with the right finger while keeping the touch of the finger located on the left activates the right button function; dragging one finger activates the cursor movement function. Dragging works in the same way as described above, and here again left and right may be swapped according to the user's selection.
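The button and drag behaviour described for the mouse mode can be summarised in a small sketch. Only the mapping itself (a tap on the opposite side of a held finger selects the left or right button, a single-finger drag moves the cursor) comes from the description; the event format and the helper name `interpret_mouse_gesture` are assumptions.

```python
# Sketch of mouse-mode gesture handling. A tap with the left finger while the
# right finger stays down acts as the left button, and vice versa; dragging a
# single finger moves the cursor. Event and field names are assumptions.

def interpret_mouse_gesture(held_fingers, event):
    """held_fingers: list of (x, y) contacts currently held down.
    event: ("tap", (x, y)) or ("drag", (dx, dy)) describing the new action."""
    kind, data = event
    if kind == "drag" and len(held_fingers) == 1:
        dx, dy = data
        return ("move_cursor", dx, dy)
    if kind == "tap" and len(held_fingers) == 1:
        held_x, _ = held_fingers[0]
        tap_x, _ = data
        # Tap to the left of the held finger -> left button; to the right ->
        # right button. The sides may be swapped by a user setting.
        return ("left_button",) if tap_x < held_x else ("right_button",)
    return ("ignore",)

print(interpret_mouse_gesture([(200, 120)], ("tap", (150, 118))))  # left button
print(interpret_mouse_gesture([(200, 120)], ("tap", (260, 121))))  # right button
print(interpret_mouse_gesture([(200, 120)], ("drag", (5, -3))))    # cursor move
```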

Hereinafter, a key input method according to the present invention will be described in detail with reference to FIG. 6.

As shown in FIG. 6, the signal input method of the touch-pad-type mouse combined keyboard according to the present invention comprises: a mode selection step (S1) of selecting one or more of the keyboard mode and the mouse mode by touching the touch pad unit 20 with the fingers; a keyboard forming step (S2) in which, when the keyboard mode is selected, either the generated keyboard layout 11, created automatically at the touched portion, or the pre-generated keyboard layout 12 is displayed on the display unit 10; a mouse function application step (S3) in which, when the mouse mode is selected, the mouse can be operated from the point where the finger is touched; a signal input step (S4) of inputting a signal by touching the keys of the keyboard layout 11 or 12 or by operating the mouse in the mouse mode; and a transmission step (S5) of transmitting the input signal to an external device.
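The five steps S1 to S5 can be tied together in a short end-to-end sketch that reuses the hypothetical helpers from the earlier sketches; the stub `send_to_host`, standing in for transmission through the connector 35, is likewise an assumption.

```python
# End-to-end sketch of the signal input method (S1-S5). It reuses the
# hypothetical helpers sketched earlier (select_mode, generate_layout,
# resolve_touch, interpret_mouse_gesture); send_to_host stands in for
# transmission through the connector (35).

def send_to_host(signal):
    """S5: transmission step - stub that would forward the signal to the main body (3)."""
    print("-> main body:", signal)

def handle_session(initial_touches, later_events):
    """initial_touches: the contacts that selected the mode (S1).
    later_events: subsequent reports from the touch pad device (25)."""
    mode = select_mode(initial_touches)                    # S1: mode selection
    if mode == KEYBOARD_MODE:
        layout = generate_layout(initial_touches[:4])      # S2: keyboard forming
        # The driving unit (15) would draw a key image at each position in `layout`.
    for event in later_events:
        if mode == KEYBOARD_MODE:
            signal = resolve_touch(*event, KEYBOARD_MODE)  # S4: event is an (x, y) touch
        else:
            # S3/S4: mouse operation from the touched point; this sketch only
            # tracks the first held finger.
            signal = interpret_mouse_gesture(initial_touches[:1], event)
        if signal:
            send_to_host(signal)                           # S5: transmission
```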

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. It is to be understood that various changes and modifications may be made without departing from the scope of the appended claims.

1: monitor
2: keyboard
3: body
10: display unit
11: generated keyboard layout
12: pre-generated keyboard layout
20: touch pad unit
30: control unit

Claims (11)

A touch-pad-type mouse combined keyboard comprising:
a display unit 10 for displaying an image of the item being executed;
a touch pad unit 20 which covers the upper surface of the display unit 10 and performs input by recognizing pressure applied to it; and
a control unit 30 which recognizes a signal input to the touch pad unit 20 as at least one of a keyboard mode and a mouse mode, and controls the display unit 10 and the touch pad unit 20 to perform the function corresponding to the recognized mode.
The keyboard of claim 1, wherein
the keyboard mode is selected by four simultaneous touches of the fingers of the left hand, four simultaneous touches of the fingers of the right hand, or both, on the touch pad unit 20, and
the mouse mode is selected by either a one-finger touch or a two-finger touch on the touch pad unit 20.
The keyboard of claim 2, wherein
in the process of selecting the keyboard mode, the control unit 30 controls the generated keyboard layout 11 to be displayed on the area of the display unit 10 where the fingers touched the touch pad unit 20, so that key input is performed by touching the touch pad unit 20.
The keyboard of claim 3, wherein
in the process of selecting the keyboard mode, the controller 30 recognizes the spacing between the fingers touching the touch pad unit 20 and their relative positions, automatically forms the generated keyboard layout 11 to correspond to that shape, displays it on the display unit 10, and lays out the remaining keys in the same pattern.
The keyboard of claim 2, wherein
when the keyboard mode is selected, the controller 30 controls the display unit 10 to display the pre-generated keyboard layout 12, which has been created and stored in advance, so that key input is performed by touching the touch pad unit 20 above it.
The keyboard of claim 5, wherein
the spacing and arrangement of the keys of the pre-generated keyboard layout 12 can be changed.
The keyboard of claim 2, wherein
in the process of selecting the mouse mode, the control unit 30 enables the mouse to be operated from the point where the finger touches the touch pad unit 20.
The keyboard of claim 7, wherein,
when the mouse mode is selected with a one-finger touch, a touch of a second finger is added;
touching with the left finger while keeping the touch of the finger located on the right activates the left button function of the mouse, and touching with the right finger while keeping the touch of the finger located on the left activates the right button function of the mouse; and
dragging one finger activates the cursor movement function.
The keyboard of claim 7, wherein,
when the mouse mode is selected with a two-finger touch,
touching with the left finger while keeping the touch of the finger located on the right activates the left button function of the mouse, and touching with the right finger while keeping the touch of the finger located on the left activates the right button function of the mouse; and
dragging one finger activates the cursor movement function.
The keyboard of any one of claims 3 to 6, wherein
the control unit 30 controls the generated keyboard layout 11 or the pre-generated keyboard layout 12 displayed on the display unit 10 to be transparent.
A signal input method of a touch-pad-type mouse combined keyboard, comprising:
a mode selection step (S1) of selecting a keyboard mode or a mouse mode by touching the touch pad unit 20 with the fingers;
a keyboard forming step (S2) in which, when the keyboard mode is selected, either the generated keyboard layout 11, created automatically at the portion where the fingers are touched, or the pre-generated keyboard layout 12 is displayed on the display unit 10;
a mouse function application step (S3) in which, when the mouse mode is selected, the mouse can be operated from the point where the finger is touched;
a signal input step (S4) of inputting a signal by touching the keys of the keyboard layout 11 or 12 or by operating the mouse in the mouse mode; and
a transmission step (S5) of transmitting the input signal to an external device.
KR1020100014318A 2010-02-17 2010-02-17 Keyboard with mouse using touchpad KR20110094737A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020100014318A KR20110094737A (en) 2010-02-17 2010-02-17 Keyboard with mouse using touchpad

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020100014318A KR20110094737A (en) 2010-02-17 2010-02-17 Keyboard with mouse using touchpad

Publications (1)

Publication Number Publication Date
KR20110094737A true KR20110094737A (en) 2011-08-24

Family

ID=44930694

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020100014318A KR20110094737A (en) 2010-02-17 2010-02-17 Keyboard with mouse using touchpad

Country Status (1)

Country Link
KR (1) KR20110094737A (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20220103907A (en) * 2011-11-15 2022-07-25 조은형 Multi human interface device having text input unit and pointer location information input unit
KR20230145290A (en) * 2011-11-15 2023-10-17 조은형 Multi human interface device having text input unit and pointer location information input unit
WO2015047358A1 (en) * 2013-09-28 2015-04-02 Intel Corporation Multi-function key in a keyboard for an electronic device
US9529393B2 (en) 2013-09-28 2016-12-27 Intel Corporation Multi-function key in a keyboard for an electronic device
KR101682214B1 (en) * 2016-04-27 2016-12-02 김경신 an electric ink keyboard
WO2017188643A3 (en) * 2016-04-27 2018-08-02 콜럼버스테크 주식회사 Electronic ink keyboard
CN111552391A (en) * 2020-04-29 2020-08-18 重庆工程职业技术学院 Clean type input device of computer
KR20220132700A (en) 2021-03-23 2022-10-04 중앙대학교 산학협력단 Multi-functional input device with virtual mouse and keyboard

Similar Documents

Publication Publication Date Title
TWI588734B (en) Electronic apparatus and method for operating electronic apparatus
JP5295328B2 (en) User interface device capable of input by screen pad, input processing method and program
US8294047B2 (en) Selective input signal rejection and modification
US9886108B2 (en) Multi-region touchpad
US10061510B2 (en) Gesture multi-function on a physical keyboard
US20110221684A1 (en) Touch-sensitive input device, mobile device and method for operating a touch-sensitive input device
US20140062875A1 (en) Mobile device with an inertial measurement unit to adjust state of graphical user interface or a natural language processing unit, and including a hover sensing function
US20110227947A1 (en) Multi-Touch User Interface Interaction
KR20130052749A (en) Touch based user interface device and methdo
KR101019254B1 (en) apparatus having function of space projection and space touch and the controlling method thereof
EP2474890A1 (en) Virtual keyboard configuration putting fingers in rest positions on a multitouch screen, calibrating key positions thereof
KR20130054759A (en) Remote controller, system and method for controlling by using the remote controller
EP2418573A2 (en) Display apparatus and method for moving displayed object
US20120038586A1 (en) Display apparatus and method for moving object thereof
KR20170108001A (en) Information processing apparatus, input apparatus, control method of information processing apparatus, control method and program of input apparatus
KR20110094737A (en) Keyboard with mouse using touchpad
JP2011090422A (en) Input processor
KR20160097410A (en) Method of providing touchless input interface based on gesture recognition and the apparatus applied thereto
JP5243379B2 (en) Input device
JP5845585B2 (en) Information processing device
US11216121B2 (en) Smart touch pad device
KR20080024381A (en) Keyboard including mouse function and key input method using the same
US20160004384A1 (en) Method of universal multi-touch input
TWI439922B (en) Handheld electronic apparatus and control method thereof
JP2011204092A (en) Input device

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
AMND Amendment
E601 Decision to refuse application
AMND Amendment