KR20160006932A - Computing apparatus and method for controlling the same - Google Patents

Computing apparatus and method for controlling the same

Info

Publication number
KR20160006932A
Authority
KR
South Korea
Prior art keywords
finger
key
display unit
keyboard
dragging
Prior art date
Application number
KR1020140086640A
Other languages
Korean (ko)
Inventor
홍석현
윤재선
박영준
Original Assignee
엘지전자 주식회사
Priority date
Filing date
Publication date
Application filed by 엘지전자 주식회사 filed Critical 엘지전자 주식회사
Priority to KR1020140086640A priority Critical patent/KR20160006932A/en
Priority to PCT/KR2014/010748 priority patent/WO2015194712A1/en
Publication of KR20160006932A publication Critical patent/KR20160006932A/en


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02Input arrangements using manually operated switches, e.g. using keyboards or dials
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A computing device and a control method thereof are disclosed.
A computing device according to an embodiment of the present invention includes a keyboard including a plurality of keys, at least some of which are provided with a touch sensor; a display unit; and a control unit. The control unit recognizes at least a first finger touching a first key among the plurality of keys, a second finger touching a second key, and a third finger touching a third key, wherein, on the keyboard, the first key is located to the left of the second key, the third key is located to the right of the second key, and the second key is located on the same line as at least one of the first key and the third key or above the first key and the third key. Based on the recognition result, the control unit controls the display unit to display an indicator on a first area on the screen.

Description

COMPUTING APPARATUS AND METHOD FOR CONTROLLING THE SAME

The present invention relates to a computing device and a control method thereof.

Portable computing devices, such as notebooks or netbooks, are evolving in a direction that emphasizes user convenience. In particular, portable computing devices equipped with a touch pad have become common in order to relieve the inconvenience of separately carrying an additional input device such as a mouse. However, providing a touch pad increases the size of the computing device accordingly, and user input through touch is limited to the area of the touch pad.

The present invention is directed to solving the above-mentioned problems and other problems.

SUMMARY OF THE INVENTION

It is an object of the present invention to provide a computing device and a control method thereof that emphasize user convenience by providing a keyboard having a touch sensor.

It is another object of the present invention to provide a computing apparatus and a control method thereof that can provide an intuitive user environment using a keyboard having a touch sensor.

The objects of the present invention are not limited to the objects mentioned above, and other objects not mentioned will be clearly understood by those skilled in the art from the description below without departing from the spirit and scope of the invention as defined by the appended claims.

A computing device according to an embodiment of the present invention includes a keyboard including a plurality of keys, at least some of which are provided with a touch sensor; a display unit; and a control unit. The control unit recognizes at least a first finger touching a first key among the plurality of keys, a second finger touching a second key, and a third finger touching a third key, wherein, on the keyboard, the first key is located to the left of the second key, the third key is located to the right of the second key, and the second key is located on the same line as at least one of the first key and the third key or above the first key and the third key. Based on the recognition result, the control unit controls the display unit to display an indicator on a first area on the screen.

According to another aspect of the present invention, there is provided a computing apparatus including a keyboard including a plurality of keys, at least some of which are provided with a touch sensor; a display unit; and a control unit. The control unit recognizes at least a first finger touching a first key among the plurality of keys and a second finger touching a second key, recognizes dragging of at least one of the first finger and the second finger on the keyboard, and executes a specific function among the functions that can be implemented in the computing device based on the recognition result of the dragging.

According to an embodiment of the present invention, a keyboard having a touch sensor can be provided, and a computing device emphasizing user convenience and a control method thereof can be provided.

According to another aspect of the present invention, there is provided a computing device capable of providing an intuitive user environment using a keyboard having a touch sensor and a control method thereof.

Further scope of applicability of the present invention will become apparent from the following detailed description. It should be understood, however, that the detailed description and specific examples, such as the preferred embodiments of the invention, are given by way of illustration only, since various changes and modifications within the spirit and scope of the invention will become apparent to those skilled in the art.

FIG. 1 is a block diagram of a computing device according to the present invention.
FIG. 2 is a diagram showing the appearance of a computing device related to the present invention.
FIG. 3 is a flowchart illustrating an example of a control method of a computing device according to an embodiment of the present invention.
FIG. 4 is a diagram illustrating an example of a screen displayed on a display unit of a computing device according to an embodiment of the present invention.
FIG. 5 is a diagram illustrating another example of a screen displayed on a display unit of a computing device according to an embodiment of the present invention.
FIG. 6 is a view for explaining an example of a screen displayed on a display unit in a computing device according to an embodiment of the present invention.
FIG. 7 is a view illustrating an example of a screen displayed on a display unit of a computing device according to an embodiment of the present invention.
FIG. 8 is a diagram illustrating another example of a screen displayed on a display unit of a computing device according to an embodiment of the present invention.
FIG. 9 is a diagram illustrating another example of a screen displayed on a display unit of a computing device according to an embodiment of the present invention.
FIG. 10 is a diagram illustrating another example of a screen displayed on a display unit of a computing device according to an embodiment of the present invention.
FIG. 11 is a flowchart for explaining another example of a method of controlling a computing device according to an embodiment of the present invention.
FIG. 12 is a diagram illustrating an example of a screen displayed on a display unit of a computing device according to an embodiment of the present invention.
FIG. 13 is a view for explaining an example of a screen displayed on a display unit in a computing device according to an embodiment of the present invention.
FIG. 14 is a flowchart for explaining another example of a control method of a computing apparatus according to an embodiment of the present invention.
FIG. 15 is a diagram illustrating an example of a screen displayed on a display unit of a computing device according to an embodiment of the present invention.
FIG. 16 is a diagram illustrating another example of a screen displayed on a display unit of a computing device according to an embodiment of the present invention.
FIG. 17 is a diagram illustrating another example of a screen displayed on a display unit of a computing device according to an embodiment of the present invention.
FIG. 18 is a diagram illustrating an example of a screen displayed on a display unit of a computing device according to an embodiment of the present invention.
FIG. 19 is a diagram illustrating an example of a screen displayed on a display unit of a computing device according to an embodiment of the present invention.
FIG. 20 is a flowchart for explaining another example of a control method of a computing apparatus according to an embodiment of the present invention.
FIG. 21 is a diagram illustrating an example of a screen displayed on the display unit of the computing device according to an embodiment of the present invention.
FIG. 22 is a diagram illustrating another example of a screen displayed on a display unit of a computing device according to an embodiment of the present invention.
FIG. 23 is a diagram illustrating an example of a screen displayed on a display unit of a computing device according to an embodiment of the present invention.
FIG. 24 is a diagram illustrating an example of a screen displayed on a display unit in a computing device according to an embodiment of the present invention.
FIG. 25 is a diagram illustrating an example of a screen displayed on a display unit in a computing device according to an embodiment of the present invention.
FIG. 26 is a diagram illustrating an example of a screen displayed on a display unit in a computing device according to an embodiment of the present invention.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like reference numerals are used to designate identical or similar elements, and redundant description thereof will be omitted. The suffixes "module" and "unit" for components used in the following description are given or used interchangeably only in consideration of ease of drafting the specification, and do not themselves have distinct meanings or roles. In the following description, a detailed description of related known art will be omitted when it is determined that it may obscure the gist of the embodiments disclosed herein. The accompanying drawings are intended only to facilitate understanding of the embodiments disclosed herein and do not limit the technical idea disclosed herein; the present invention should be understood to cover all modifications, equivalents, and alternatives falling within its spirit and scope.

Terms including ordinals, such as first, second, etc., may be used to describe various elements, but the elements are not limited by these terms. The terms are used only for the purpose of distinguishing one component from another.

It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no intervening elements.

The singular expressions include plural expressions unless the context clearly dictates otherwise.

In the present application, the terms "comprises", "having", and the like are used to specify the presence of a stated feature, number, step, operation, element, component, or combination thereof, but do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.

The computing device described herein may include a desktop computer, a laptop computer, an ultrabook, a netbook, a tablet PC, a digital signage device, and the like.

FIG. 1 is a block diagram of a computing device according to the present invention.

The computing device 100 includes a communication unit 110, a user input unit 120, an external device interface unit 130, a sensing unit 140, an output unit 150, a memory 160, a power supply unit 170, a control unit 180, and the like. However, the components shown in FIG. 1 are not essential to implementing a computing device, so a computing device 100 having more or fewer components may be implemented.

More specifically, the communication unit 110 among the above-described components may include one or more modules enabling communication between the computing device 100 and a communication system, between the computing device 100 and another communication device, or between the computing device 100 and an external server. In addition, the communication unit 110 may include one or more modules that connect the computing device 100 to one or more networks.

The communication unit 110 may include at least one of a broadcast receiving unit 111, a network interface unit 112, and a mobile communication unit 113.

The broadcast receiver 111 receives broadcast signals and / or broadcast-related information from an external broadcast management server through a broadcast channel. The broadcast channel may include a satellite channel and a terrestrial channel. More than one such broadcast receiving module may be provided to the computing device 100 for simultaneous broadcast reception or broadcast channel switching for at least two broadcast channels.

The broadcast management server may refer to a server for generating and transmitting broadcast signals and / or broadcast related information, or a server for receiving broadcast signals and / or broadcast related information generated by the broadcast management server and transmitting the generated broadcast signals and / or broadcast related information. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and a broadcast signal in which a data broadcast signal is combined with a TV broadcast signal or a radio broadcast signal.

The broadcast signal may be encoded according to at least one of the technical standards (or broadcast systems, for example, ISO, IEC, DVB, ATSC, etc.) for transmitting and receiving digital broadcast signals, and the broadcast receiver 111 can receive the digital broadcast signal using a method conforming to the technical standard defined by these standards.

The broadcast-related information may be information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast-related information may also be provided through a mobile communication network. In this case, the broadcast-related information may be received by the mobile communication unit 113.

The network interface unit 112 provides an interface for connecting the computing device 100 to a wired/wireless network including the Internet. The network interface unit 112 may be understood to include a local communication unit capable of communicating with another computing device 100, an external server, or an external device using a wireless local area network.

The network interface unit 112 may include an Ethernet terminal or the like for connection to a wired network, and may be connected to a wireless network using communication standards such as WLAN (Wireless LAN), Wi-Fi (Wireless Fidelity), Wi-Fi Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), LTE-A (Long Term Evolution-Advanced), Bluetooth™, RFID (Radio Frequency Identification), IrDA (Infrared Data Association), UWB (Ultra Wideband), ZigBee, NFC (Near Field Communication), and Wireless USB (Wireless Universal Serial Bus). The network interface unit 112 can transmit or receive data to or from another user or another electronic device via the network to which it is connected or via another network linked to the connected network.

The mobile communication unit 113 transmits and receives wireless signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network established according to technical standards or communication schemes for mobile communication (for example, GSM (Global System for Mobile communication), CDMA (Code Division Multi Access), CDMA2000 (Code Division Multi Access 2000), EV-DO (Enhanced Voice-Data Optimized or Enhanced Voice-Data Only), WCDMA (Wideband CDMA), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), LTE-A (Long Term Evolution-Advanced), and the like). The wireless signal may include various types of data related to transmission and reception of a voice call signal, a video call signal, or a text/multimedia message.

From the viewpoint that wireless Internet access by WiBro, HSDPA, HSUPA, GSM, CDMA, WCDMA, LTE, LTE-A, and the like is achieved through a mobile communication network, the network interface unit 112 performing wireless Internet access through the mobile communication network may be understood as a kind of mobile communication unit 113.

The user input unit 120 receives information from the user. When information is input through the user input unit 120, the control unit 180 can control the operation of the computing device 100 to correspond to the input information.

The user input unit 120 may include mechanical input means (or mechanical keys, e.g., buttons located on the front, rear, or side of the computing device 100, a dome switch, a jog wheel, a jog switch, etc.) and touch-type input means.

The user input unit 120 may include a keyboard 121, and may further include at least one of a touch pad 122 and a mouse 123. The user input unit 120 may also include buttons, a dome switch, a jog wheel, a jog switch, a track ball, and the like located on the front, rear, or side surface of the housing 124 of the computing device 100. When a touch sensor is provided in the housing 124 surrounding the keyboard 121, the housing 124 may also be regarded as a kind of user input unit 120.

The keyboard 121 includes a plurality of keys, and at least some of the plurality of keys are provided with a touch sensor. The keyboard 121 may be configured such that the plurality of keys protrude upward, in the same manner as a keyboard of a general computing device. In this case, the keyboard 121 may receive a user input signal when the user presses each key with a predetermined pressure or more, or may receive a user input signal when the user touches each key. Alternatively, the keyboard 121 may be formed in a flat shape, like the touch pad 122, without the plurality of keys protruding upward. In this case, the keyboard 121 is provided with a touch sensor on all of the plurality of keys, and can receive a user input signal when the user touches each key.
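
The two input paths this paragraph describes (pressing a key past its actuation pressure versus merely resting a finger on its touch sensor) can be pictured with a small event structure. The following is a minimal sketch for illustration only; KeyEvent and classify are hypothetical names, not part of the disclosed device.

    from dataclasses import dataclass

    @dataclass
    class KeyEvent:
        key_id: str    # e.g. "T", "Y", "J"
        pressed: bool  # key was depressed with the predetermined pressure or more
        touched: bool  # the key's touch sensor detected a finger

    def classify(event: KeyEvent) -> str:
        """Route a key event to the typing path or the touch-gesture path."""
        if event.pressed:
            return "typing"          # ordinary keyboard input
        if event.touched:
            return "touch_gesture"   # input for the touch-sensor features below
        return "idle"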

The touch pad 122 may include a touch sensor, and may be disposed on one side of the housing 124 facing the display unit 151. The mouse 123 may be connected to the computing device 100 through an external device interface unit 130, which will be described later. As the user input unit 120, the touch pad 122 and the mouse 123 are components well known to those skilled in the art, and thus a detailed description thereof will be omitted.

The external device interface unit 130 is an interface that enables data communication between an external device and the computing device 100. The external device interface unit 130 may include at least one of a wired/wireless headset port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio input/output (I/O) port, a video I/O port, and an earphone port. The external device interface unit 130 may be connected to an external device such as an external memory, an external computing device, a separate user input unit, an earphone, a camera, or a game device. The external device interface unit 130 transmits video, audio, or data signals input from a connected external device to the control unit 180. Also, the external device interface unit 130 may output the video, audio, or data signal processed by the control unit 180 to an external device.

The sensing unit 140 senses at least one of opening/closing information of a cover of the computing device, information in the computing device, information on the surrounding environment of the computing device, and user information, and generates a corresponding sensing signal. The control unit 180 may control the operation of the computing device 100 based on the sensing signal, or may perform data processing, functions, or operations related to an application program installed in the computing device 100.

The output unit 150 is for generating output related to visual, auditory, or tactile senses, and may include at least one of a display unit 151, an audio output unit 152, and a light output unit 153. According to an embodiment, the display unit 151 may form a mutual layer structure with a touch sensor or may be formed integrally with it to realize a touch screen. Such a touch screen may function as a user input unit 120 providing an input interface between the computing device 100 and the user, and at the same time may provide an output interface between the computing device 100 and the user.

The display unit 151 converts a video signal, a data signal, or a GUI signal processed by the control unit 180, or a video signal and a data signal received from the communication unit 110 or the external device interface unit 130, into R, G, and B signals to generate a driving signal. The display unit 151 may be a PDP, an LCD, an OLED, a flexible display, a 3D display, or the like.

The sound output unit 152 can output a sound signal processed by the control unit 180, a sound signal received from the communication unit 110, a sound signal stored in the memory 160, or a sound signal received through the external device interface unit 130. The sound output unit 152 may include a receiver, a speaker, a buzzer, and the like.

The memory 160 stores data that supports various functions of the computing device 100. The memory 160 may store a plurality of application programs (or applications) running on the computing device 100, and data and commands for operation of the computing device 100.

The memory 160 may include at least one type of storage medium among a flash memory type, a hard disk type, a solid state disk (SSD) type, a silicon disk drive (SDD) type, a multimedia card micro type, a card type memory (e.g., SD or XD memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, and an optical disk.

The power supply unit 170 receives external power and internal power under the control of the controller 180 and supplies power necessary for operation of the respective components. The power supply unit 170 may include a rechargeable battery.

The control unit 180 generally controls operations related to application programs and the overall operation of the computing device 100. The control unit 180 may include a central processing unit (CPU) 181 and a main chipset 182.

The central processing unit 181 controls the overall operation of the computing device 100. The main chipset 182 performs a function similar to a chipset provided on a main board of a general computing device. More specifically, the main chipset 182 may refer to a core controller chip that connects and controls each component. The main chipset 182 may include a north bridge, which controls the memory and a graphics interface, and a south bridge, which controls peripheral devices; each bridge may be implemented as an individual device or as a single module. The north bridge can also control a separately provided graphics processing unit (GPU) or a graphics card including the GPU; in the present specification, however, the description applies regardless of whether or not a graphics card is separately provided.

FIG. 2 is a diagram showing the appearance of a computing device related to the present invention.

Referring to FIG. 2, the computing device 100 includes a display unit 151, a keyboard 121, and a housing 124. The housing 124 may be referred to as a case; the portion surrounding the display unit 151 may be referred to as a front case 124a, and the portion surrounding the keyboard 121 and facing the display unit 151 may be referred to as a rear case 124b. In this specification, for convenience, the housing 124 will be referred to as the rear case 124b surrounding the keyboard 121, without distinguishing between the front case 124a and the rear case 124b. According to an embodiment, the touch pad 122 may be provided on the surface of the housing 124 facing the display unit 151, but the present invention is not limited thereto.

According to an embodiment, the housing 124 may comprise a touch sensor. In this case, the touch sensor may be provided at one side of the housing 124 facing the display unit 151 and at a peripheral portion of the keyboard 121.

FIG. 2 illustrates the computing device 100 according to the present invention in the form of a notebook computer having a touch pad 122. However, this is merely an example, and a computing device 100 without the touch pad 122 is also within the scope of the present invention.

According to an embodiment of the present invention, the user can utilize the keyboard 121 provided with the touch sensor as a mouse. This will be described with reference to FIGS. 3 to 10.

FIG. 3 is a flowchart illustrating an example of a control method of a computing device according to an embodiment of the present invention.

The control unit 180 recognizes at least a first finger touching a first key among a plurality of keys included in the keyboard 121, a second finger touching a second key, and a third finger touching a third key (S201). The user may take a posture in which his or her hand is placed on the keyboard 121, similarly to using the mouse 123.

On the keyboard 121, the first key is located to the left of the second key, and the third key is located to the right of the second key. Also, on the keyboard 121, the second key may be located on the same line as at least one of the first key and the third key, or may be located above the first key and the third key. Also, among the keys on the keyboard 121 at which touch input is detected, the second key may be located at the top. Also, on the keyboard 121, the first key and the third key may each be located adjacent to the second key. Here, two keys being adjacent on the keyboard 121 may mean that the two keys are located next to each other in a straight line or a diagonal line. The first finger to the third finger correspond to fingers of the user using the keyboard 121. According to an embodiment, when one finger touches two adjacent keys at the same time, at least one of the first key, the second key, and the third key may mean those two adjacent keys.

When a touch input touching the first key, the second key, and the third key is detected, the control unit 180 recognizes the first finger touching the first key, the second finger touching the second key, and the third finger touching the third key. The control unit 180 can detect touch input on keys other than the first key through the third key as well, but the first finger through the third finger are recognized only through the first key through the third key.
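
As an illustration of the geometric test in step S201, the sketch below checks the stated constraints: the first key left of the second, the third key right of the second, and the second key on the same row as, or above, the other two. KEY_POS and its grid coordinates are assumptions made for the example; rows are numbered downward from the top of the keyboard.

    # Hypothetical key-layout table: key -> (column, row) on the keyboard grid.
    KEY_POS = {"T": (5, 1), "Y": (6, 1), "U": (7, 1), "J": (7, 2), "F": (4, 2)}

    def is_mouse_posture(k1: str, k2: str, k3: str) -> bool:
        """Check the S201 geometry: k1 left of k2, k3 right of k2,
        and k2 on the same row as (or above) k1 and k3."""
        c1, r1 = KEY_POS[k1]
        c2, r2 = KEY_POS[k2]
        c3, r3 = KEY_POS[k3]
        return c1 < c2 < c3 and r2 <= min(r1, r3)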

The control unit 180 controls the display unit 151 to display the first indicator on the first area on the screen based on the recognition results of the first finger, the second finger, and the third finger (S202).

The memory 160 stores first coordinate value data related to at least a partial area on the keyboard 121, second coordinate value data related to the entire area of the screen of the display unit 151, and mapping data of the first coordinate value data and the second coordinate value data.
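
Since the mapping data relates keyboard coordinates (first coordinate value data) to screen coordinates (second coordinate value data), a minimal reading of it is a linear scale between the two coordinate systems. The dimensions below are assumed for illustration only and are not taken from the patent.

    # Assumed dimensions: keyboard touch area in sensor units, screen in pixels.
    KB_W, KB_H = 280.0, 100.0
    SCR_W, SCR_H = 1920, 1080

    def keyboard_to_screen(x_kb: float, y_kb: float) -> tuple[int, int]:
        """Map a point in the first (keyboard) coordinate system to the
        second (screen) coordinate system."""
        return round(x_kb / KB_W * SCR_W), round(y_kb / KB_H * SCR_H)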

The first indicator may be displayed on a predetermined first area on the screen of the display unit 151 when the first finger, the second finger, and the third finger are recognized, or may be displayed on the first area on the screen of the display unit 151 based on the first coordinate value data associated with at least one of the first key, the second key, and the third key on the keyboard 121 and the mapping data.

The control unit 180 recognizes dragging of the first finger, the second finger, and the third finger on the keyboard 121 (S203). For example, when a touch input is sensed on the first key, then sensed simultaneously on the first key and a key adjacent to the first key, and then sensed only on the key adjacent to the first key, the control unit 180 can recognize dragging of the first finger. Similarly, the control unit 180 can recognize dragging of the second finger and the third finger. The memory 160 may store in advance a program or application necessary for recognizing the dragging of a finger.
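
The drag sequence described in this paragraph (touch on one key, then on that key and an adjacent key together, then on the adjacent key alone) can be tracked over consecutive sensor scans. The sketch below follows that reading; samples and adjacent are hypothetical inputs, not part of the disclosed device.

    def detect_drag(samples: list[set[str]], adjacent: dict[str, set[str]]) -> str | None:
        """samples: the set of touched keys at consecutive sensor scans (one finger).
        Implements the sequence from the text: key A alone -> A plus an adjacent
        key B together -> B alone. Returns the destination key, or None."""
        for prev, mid, cur in zip(samples, samples[1:], samples[2:]):
            if len(prev) == 1:
                (a,) = prev
                neighbours = mid & adjacent.get(a, set())
                if a in mid and neighbours and len(cur) == 1 and cur <= neighbours:
                    (b,) = cur
                    return b   # finger dragged from a to b
        return None

    # Example: a finger sliding from "F" to the adjacent "G".
    print(detect_drag([{"F"}, {"F", "G"}, {"G"}], {"F": {"G", "D", "R", "V"}}))  # -> G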

The control unit 180 calculates the amount of change of the first coordinate value data associated with at least one of the first finger, the second finger, and the third finger caused by the dragging (S204). For example, the control unit 180 can calculate the amount of change of the first coordinate value data associated with the first finger using the first coordinate value data of the first finger on the keyboard 121 before the dragging and the first coordinate value data of the first finger on the keyboard 121 after the dragging.

The control unit 180 can control the display unit 151 to move the first indicator on the screen based on the calculated amount of change (S205). For example, based on the mapping data of the first coordinate value data and the second coordinate value data stored in the memory 160 and the calculated amount of change, the control unit 180 can control the display unit 151 to move the first indicator from the first area to a second area on the screen. Alternatively, the control unit 180 may control the display unit 151 to move the first indicator from the first area to the second area on the screen based on the direction of the dragging of the first finger, the second finger, and the third finger and the calculated amount of change.
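
Steps S204 and S205 together amount to moving the indicator by a scaled version of the finger's coordinate change. A minimal sketch under that reading; the gain and the screen resolution are assumed for illustration.

    SCR_W, SCR_H = 1920, 1080  # assumed screen resolution, as in the earlier sketch
    GAIN = 6.0                 # assumed sensitivity: screen pixels per keyboard unit

    def move_indicator(pos: tuple[int, int], kb_before: tuple[float, float],
                       kb_after: tuple[float, float]) -> tuple[int, int]:
        """Move the indicator by the change in the finger's keyboard
        coordinates (S204), clamped to the screen (S205)."""
        dx = (kb_after[0] - kb_before[0]) * GAIN
        dy = (kb_after[1] - kb_before[1]) * GAIN
        x = min(max(round(pos[0] + dx), 0), SCR_W - 1)
        y = min(max(round(pos[1] + dy), 0), SCR_H - 1)
        return x, y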

FIG. 4 is a diagram illustrating an example of a screen displayed on a display unit of a computing device according to an embodiment of the present invention.

Referring to FIG. 4(a), the control unit 180 recognizes at least a first finger touching a first key among a plurality of keys included in the keyboard 121, a second finger touching a second key, and a third finger touching a third key.

Referring to FIG. 4(b), which shows an enlarged view of the keyboard 121 on which the touch input is received, the control unit 180 recognizes the first finger touching the first key ("T key"), the second finger touching the "Y key" and the "U key", and the third finger touching the third key ("J key"). Since the second finger is touching two adjacent keys, the second key may correspond to the two adjacent keys "Y key" and "U key". The control unit 180 may also detect touch input on keys other than the first key, the second key, and the third key (for example, keys other than the "T key", "Y key", "U key", and "J key") and/or on the housing 124. However, touch input sensed at portions other than the first key, the second key, and the third key satisfying the conditions described above with respect to FIG. 3 is or may be ignored.

If the first finger, the second finger, and the third finger are recognized, the control unit 180 can control the display unit 151 to output the first indicator I1, similarly to the case where there is user input by the mouse 123. The first indicator I1 may be displayed in a predetermined first area on the screen. For example, the first area may be the central area of the screen of the display unit 151.

FIG. 5 is a diagram illustrating another example of a screen displayed on a display unit of a computing device according to an embodiment of the present invention.

The control unit 180 recognizes at least a first finger touching the first key among the plurality of keys included in the keyboard 121, a second finger touching the second key, and a third finger touching the third key, and controls the display unit 151 to display the first indicator I1 on the first area on the screen based on the recognition result.

The control unit 180 detects the first coordinate value data related to at least one of the first key, the second key, and the third key on the keyboard 121, and can control the display unit 151 to display the first indicator I1 on the first area on the screen based on the detected first coordinate value data and the mapping data of the first coordinate value data and the second coordinate value data stored in the memory 160. The first area may correspond to the second coordinate value data on the screen of the display unit 151 mapped to the detected first coordinate value data on the keyboard 121. That is, when the user places a hand on the upper left area of the keyboard 121, the first indicator I1 may be displayed in the upper left area of the screen of the display unit 151.

According to an embodiment of the present invention, when the user moves his or her hand on the keyboard 121, the first indicator I1 on the screen of the display unit 151 can be moved accordingly. This will be described with reference to FIG. 6.

FIG. 6 is a view for explaining an example of a screen displayed on a display unit in a computing device according to an embodiment of the present invention.

Referring to FIG. 6(a), the control unit 180 recognizes a first finger touching a first key among a plurality of keys included in the keyboard 121, a second finger touching a second key, and a third finger touching a third key, and controls the display unit 151 to display the first indicator I1 on the first area on the screen based on the recognition result.

The control unit 180 recognizes dragging of the first finger, the second finger, and the third finger on the keyboard 121, and based on the recognition result, may control the display unit 151 to move the first indicator I1 from the first area to the second area on the screen.

The control unit 180 may calculate the amount of change of the first coordinate value data associated with at least one of the first finger, the second finger, and the third finger caused by the dragging. The amounts of change in the first coordinate value data of at least two of the first finger, the second finger, and the third finger may be the same as or different from each other. For example, the control unit 180 can calculate the amount of change of the first coordinate value data associated with the first finger using the first coordinate value data of the first finger on the keyboard 121 before the dragging and the first coordinate value data of the first finger on the keyboard 121 after the dragging.

The control unit 180 can control the display unit 151 to move the first indicator I1 from the first area to the second area on the screen based on the mapping data of the first coordinate value data and the second coordinate value data stored in the memory 160 and the calculated amount of change. Alternatively, the control unit 180 can control the display unit 151 to move the first indicator I1 from the first area to the second area on the screen based on the direction of the dragging of the first finger, the second finger, and the third finger and the calculated amount of change.

For example, the distance by which the finger that is the object of the calculation of the amount of change moves on the keyboard 121 may be shorter than the distance by which the first indicator I1 moves from the first area to the second area on the screen of the display unit 151.

Meanwhile, according to an embodiment of the present invention, the user can input a predetermined user command by tapping the keyboard 121 with the index finger while placing his or her hand on the keyboard 121. This will be described with reference to FIGS. 7 to 10. FIGS. 7 to 10 may illustrate examples after the finger recognition and the display of the first indicator I1 described above with respect to FIGS. 4 and 5, and after the drag recognition and indicator movement described above with respect to FIG. 6, have been performed. FIGS. 7 and 8 illustrate examples similar to clicking the left button of the mouse 123, and FIGS. 9 and 10 illustrate examples similar to clicking the right button of the mouse 123.

FIG. 7 is a view illustrating an example of a screen displayed on a display unit of a computing device according to an embodiment of the present invention.

The control unit 180 detects a first touch input in which the first finger taps once the first area on the keyboard 121 where the first finger is positioned by the dragging of the first finger, the second finger, and the third finger. In response to the first touch input, the control unit 180 can perform a specific function corresponding to the second area on the screen of the display unit 151 where the first indicator I1 is positioned by the dragging of the first finger, the second finger, and the third finger. For example, when an execution screen of a specific program is displayed in an area other than the task bar on the screen of the display unit 151 and the first indicator I1 is located on the execution screen of that program, the specific function may be a function related to that program; when the first indicator I1 is located on a menu option included in the task bar area on the screen of the display unit 151, the specific function may be the function corresponding to that menu option; and when the first indicator I1 is located on a menu icon corresponding to a specific file displayed on the desktop of the display unit 151, the specific function may be highlighting the menu icon.

For example, referring to FIG. 7(a), the user may move his or her hand on the keyboard 121 to place the first indicator I1 on the "Start" menu option 211 on the desktop 210, and then tap the keyboard 121 once with the index finger (the first finger). The control unit 180 detects the first touch input in which the first finger taps once the first area on the keyboard 121 where the first finger is positioned by the dragging.

As shown in FIG. 7(b), the control unit 180 executes the function corresponding to the "Start" menu option 211 on the desktop 210 of the display unit 151 where the first indicator I1 is positioned by the dragging of the first finger, the second finger, and the third finger. The control unit 180 can control the display unit 151 to display the GUI 212, which is the execution result of the function corresponding to the "Start" menu option 211.

FIG. 8 is a diagram illustrating another example of a screen displayed on a display unit of a computing device according to an embodiment of the present invention.

The control unit 180 detects a second touch input in which the first finger taps twice within a predetermined time the first area on the keyboard 121 where the first finger is positioned by the dragging of the first finger, the second finger, and the third finger. In response to the second touch input, the control unit 180 can perform a specific function corresponding to the second area on the screen of the display unit 151 where the first indicator I1 is positioned by the dragging of the first finger, the second finger, and the third finger. For example, when the first indicator I1 is located on a specific file displayed on the desktop of the display unit 151, the specific function may be the execution of that file; and when an execution screen of a word processing program is displayed in an area other than the task bar on the screen and the first indicator I1 is located on text included in that execution screen, the specific function may be highlighting the portion of the text on which the first indicator I1 is located.

For example, referring to FIG. 8(a), the user may move his or her hand on the keyboard 121 to place the first indicator I1 on a menu icon 221 corresponding to a specific file on the desktop 210, and then quickly tap the keyboard 121 twice with the index finger (the first finger). The control unit 180 detects the second touch input in which the first finger taps twice within a predetermined time the first area on the keyboard 121 where the first finger is positioned by the dragging.

As shown in FIG. 8(b), the control unit 180 can execute the specific file corresponding to the menu icon 221 on the desktop 210 of the display unit 151 where the first indicator I1 is positioned by the dragging of the first finger, the second finger, and the third finger. The control unit 180 may control the display unit 151 to display the screen 222 corresponding to the execution result of the specific file.

FIG. 9 is a diagram illustrating another example of a screen displayed on a display unit of a computing device according to an embodiment of the present invention.

The control unit 180 detects a third touch input in which the second finger taps once the second area on the keyboard 121 where the second finger is positioned by the dragging of the first finger, the second finger, and the third finger. When the third touch input is sensed, the control unit 180 may sense the screen displayed on the display unit 151 and perform a specific function corresponding to the third touch input based on the sensed screen. For example, when the first indicator I1 is located in an empty space of the desktop of the display unit 151, the specific function may be the output of a GUI including a list of functions related to the desktop; when the first indicator I1 is located on a specific file displayed on the desktop of the display unit 151, the specific function may be the output of a GUI including a list of functions related to that file; and when an execution screen of a specific program is displayed in an area other than the task bar and the first indicator I1 is located on that execution screen, the specific function may be the output of a GUI including a list of functions related to that program.
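
Taken together, FIGS. 7 to 10 map the three touch inputs onto conventional mouse actions: an index-finger tap to a left click, a quick double tap to a double click, and a middle-finger tap to a context menu. A sketch of that dispatch follows; the time threshold and the function names are assumptions for illustration.

    DOUBLE_TAP_WINDOW = 0.4  # seconds; assumed value for the "predetermined time"

    def classify_tap(finger: str, tap_times: list[float]) -> str:
        """Map the touch inputs of FIGS. 7-10 to mouse-like actions.
        finger: "index" (first finger) or "middle" (second finger).
        tap_times: timestamps of recent taps by that finger."""
        if finger == "middle":
            return "context_menu"       # third touch input: like a right click
        if len(tap_times) >= 2 and tap_times[-1] - tap_times[-2] <= DOUBLE_TAP_WINDOW:
            return "double_click"       # second touch input: two quick taps
        return "left_click"             # first touch input: a single tap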

For example, referring to FIG. 9(a), the user may move his or her hand on the keyboard 121 to position the first indicator I1 in an empty space of the desktop 210, and then tap the keyboard 121 once with the middle finger (the second finger). The control unit 180 detects the third touch input in which the second finger taps once the second area on the keyboard 121 where the second finger is positioned by the dragging.

When the third touch input is sensed, the control unit 180 senses that the screen output on the display unit 151 is the desktop 210, and, as shown in FIG. 9(b), can control the display unit 151 to output the GUI 231 including a list of functions related to the desktop 210 in the empty space of the desktop 210 where the first indicator I1 is positioned by the dragging of the first finger, the second finger, and the third finger.

FIG. 10 is a diagram illustrating another example of a screen displayed on a display unit of a computing device according to an embodiment of the present invention. Descriptions overlapping those given above with reference to FIG. 9 will not be repeated.

Referring to FIG. 10(a), the user may move his or her hand on the keyboard 121 to position the first indicator I1 on a predetermined area on the screen of the display unit 151, and then tap the keyboard 121 once with the middle finger (the second finger). The control unit 180 detects the third touch input in which the second finger taps once the second area on the keyboard 121 where the second finger is positioned by the dragging.

When the third touch input is detected, the control unit 180 detects that the screen output on the display unit 151 is the execution screen 240 of a specific program, and, as shown in FIG. 10(b), can control the display unit 151 to output the GUI 241 including a list of functions related to the specific program in the predetermined area of the display unit 151 where the first indicator I1 is positioned by the dragging of the first finger, the second finger, and the third finger.

Meanwhile, according to an embodiment of the present invention, the user can utilize the keyboard 121 having the touch sensor as the touch pad 122. This will be described with reference to FIGS. 11 to 13.

FIG. 11 is a flowchart for explaining another example of a method of controlling a computing device according to an embodiment of the present invention.

The control unit 180 recognizes a first finger touching a first key among a plurality of keys included in the keyboard 121 (S301). The user may take a posture in which one finger is placed on the keyboard 121, similarly to using the touch pad 122. The first key may be any one of the plurality of keys included in the keyboard 121. According to an embodiment, when one finger touches two adjacent keys at the same time, the first key may mean those two adjacent keys. The control unit 180 may recognize the first finger touching the first key when a touch input touching the first key is sensed.
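
Comparing this flowchart with FIG. 3 (three fingers) and FIG. 14 (two fingers) suggests that the number of fingers resting on touch-sensing keys selects the input mode. The dispatch below is an inference from the three flowcharts, not an explicit step of the patent.

    def select_mode(touching_fingers: int) -> str:
        """Choose an input mode from the number of fingers resting on
        touch-sensing keys (an inference from FIGS. 3, 11, and 14)."""
        if touching_fingers >= 3:
            return "mouse_mode"        # FIG. 3: first indicator, mouse-style control
        if touching_fingers == 2:
            return "split_or_scroll"   # FIG. 14: screen splitting / scrolling
        if touching_fingers == 1:
            return "touchpad_mode"     # FIG. 11: second indicator, touch-pad-style
        return "typing"                # no gesture in progress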

The control unit 180 controls the display unit 151 to display the second indicator on the first area on the screen based on the recognition result of the first finger (S302).

The memory 160 stores first coordinate value data related to at least a partial area on the keyboard 121, second coordinate value data related to the entire area of the screen of the display unit 151, and mapping data of the first coordinate value data and the second coordinate value data.

The second indicator may be displayed on a predetermined first area on the screen of the display unit 151 when the first finger is recognized, or may be displayed on the first area on the screen of the display unit 151 based on the first coordinate value data associated with the first key on the keyboard 121 and the mapping data.

The control unit 180 recognizes dragging of the first finger on the keyboard 121 (S303). For example, when a touch input is sensed on the first key, then sensed simultaneously on the first key and a key adjacent to the first key, and then sensed only on the key adjacent to the first key, the control unit 180 can recognize dragging of the first finger. The memory 160 may store in advance a program or application necessary for recognizing the dragging of a finger.

The control unit 180 calculates the amount of change of the first coordinate value data associated with the first finger caused by the dragging (S304). For example, the control unit 180 can calculate the amount of change of the first coordinate value data associated with the first finger using the first coordinate value data of the first finger on the keyboard 121 before the dragging and the first coordinate value data of the first finger on the keyboard 121 after the dragging.

The control unit 180 may control the display unit 151 to move the second indicator on the screen based on the calculated amount of change (S305). For example, based on the mapping data of the first coordinate value data and the second coordinate value data stored in the memory 160 and the calculated amount of change, the control unit 180 can control the display unit 151 to move the second indicator from the first area to a second area on the screen. Alternatively, the control unit 180 can control the display unit 151 to move the second indicator from the first area to the second area on the screen based on the direction of the dragging of the first finger and the calculated amount of change.

FIG. 12 is a diagram illustrating an example of a screen displayed on a display unit of a computing device according to an embodiment of the present invention.

The control unit 180 recognizes the first finger touching the first key among the plurality of keys included in the keyboard 121. The control unit 180 may control the display unit 151 to output the second indicator I2, similarly to the case where there is user input by the touch pad 122.

The second indicator I2 may be displayed in a predetermined first area on the screen. For example, the first area may be the central area of the screen of the display unit 151.

Alternatively, although not shown, the control unit 180 may detect the first coordinate value data related to the first key on the keyboard 121, and may control the display unit 151 to display the second indicator I2 on the first area on the screen based on the detected first coordinate value data and the mapping data of the first coordinate value data and the second coordinate value data stored in the memory 160. The first area may correspond to the second coordinate value data on the screen of the display unit 151 mapped to the detected first coordinate value data on the keyboard 121. That is, when the user places a finger on the upper left area of the keyboard 121, the second indicator I2 can be displayed in the upper left area of the screen of the display unit 151.

The second indicator I2 may differ from the first indicator I1 in at least one of shape, size, color, and transparency.

According to an embodiment of the present invention, when the user moves a finger on the keyboard 121, the second indicator I2 on the screen of the display unit 151 can be moved along with it. This will be described with reference to FIG. 13.

FIG. 13 is a view for explaining an example of a screen displayed on a display unit in a computing device according to an embodiment of the present invention.

Referring to FIG. 13(a), the control unit 180 recognizes the first finger touching the first key among the plurality of keys included in the keyboard 121, and displays the second indicator I2 on the first area on the screen based on the recognition result.

The control unit 180 recognizes the dragging of the first finger on the keyboard 121 and, as shown in FIG. 13(b), can control the display unit 151 to move the second indicator I2 from the first area to the second area on the screen.

The control unit 180 may calculate the amount of change of the first coordinate value data associated with the first finger caused by the dragging. For example, the control unit 180 can calculate the amount of change of the first coordinate value data associated with the first finger using the first coordinate value data of the first finger on the keyboard 121 before the dragging and the first coordinate value data of the first finger on the keyboard 121 after the dragging.

The control unit 180 can control the display unit 151 to move the second indicator I2 from the first area to the second area on the screen based on the mapping data of the first coordinate value data and the second coordinate value data stored in the memory 160 and the calculated amount of change. Alternatively, the control unit 180 may control the display unit 151 to move the second indicator I2 from the first area to the second area on the screen based on the direction of the dragging of the first finger and the calculated amount of change.

For example, the distance traveled by the first finger on the keyboard 121 may be shorter than the distance traveled by the second indicator I2 from the first area to the second area on the screen of the display unit 151.

According to the present embodiment, the user can input a predetermined user command by tapping the keyboard 121 with the first finger while placing his or her finger on the keyboard 121. This is similar to what was described above with reference to FIGS. 7 and 8, so a detailed description thereof will be omitted.

Meanwhile, according to an embodiment of the present invention, the user may virtually split the screen of the display unit 151 or scroll the screen using the keyboard 121 provided with the touch sensor. This will be described with reference to FIGS. 14 to 19.

FIG. 14 is a flowchart for explaining another example of a control method of a computing apparatus according to an embodiment of the present invention.

The control unit 180 recognizes the first finger touching the first key and the second finger touching the second key among a plurality of keys included in the keyboard 121 (S401).

The first key and the second key may be located adjacent to each other on the keyboard 121, and the first key may be located on the same line as the second key or below the second key. Here, two keys being adjacent on the keyboard 121 may mean that the two keys are located next to each other in a straight line or a diagonal line. According to an embodiment, when one finger touches two adjacent keys at the same time, at least one of the first key and the second key may mean those two adjacent keys.

When a touch input for touching the first key and the second key is sensed, the controller 180 recognizes the first finger that touches the first key and recognizes the second finger that touches the second key can do.

The memory 160 stores first coordinate value data related to at least a partial area on the keyboard 121, second coordinate value data related to the entire area of the screen of the display unit 151, and mapping data of the first coordinate value data and the second coordinate value data.

The control unit 180 recognizes dragging of the first finger and the second finger on the keyboard 121 (S402). For example, when a touch input is sensed on the first key, then sensed simultaneously on the first key and a key adjacent to the first key, and then sensed only on the key adjacent to the first key, the control unit 180 can recognize dragging of the first finger. Similarly, the control unit 180 can recognize dragging of the second finger. The memory 160 may store in advance a program or application necessary for recognizing the dragging of a finger.

In the first embodiment, the control unit 180 can recognize dragging of the first finger and the second finger on the keyboard 121 in the downward direction from the first key and the second key.

In the second embodiment, the control unit 180 can recognize dragging of the first finger and the second finger on the keyboard 121 in the rightward or leftward direction from the first key and the second key.

In the third embodiment, the control unit 180 can recognize dragging of the first finger and the second finger on the keyboard 121 in the up/down or left/right direction from the first key and the second key.

The control unit 180 can execute a specific function among the functions that can be implemented in the computing device 100 based on the recognition result of the dragging (S403).

In the first embodiment, when the keys last touched by the first finger and the second finger as a result of the dragging are keys located at the outermost (e.g., lower) edge of the keyboard 121, the control unit 180 can divide the screen of the display unit 151 into a first area and a second area located on the right side of the first area, based on the recognition result of the dragging. Here, the control unit 180 detects the first coordinate value data associated with at least one of the first key and the second key, detects the second coordinate value data corresponding to the detected first coordinate value data using the mapping data of the first coordinate value data and the second coordinate value data stored in the memory 160, and may divide the screen of the display unit 151 so that the detected second coordinate value data corresponds to a part of the boundary line between the first area and the second area.

In the second embodiment, when the keys last touched by the first finger and the second finger as a result of the dragging are located at the outermost (e.g., right or left) edge of the keyboard 121, the control unit 180 can divide the screen of the display unit 151 into a first area and a second area located below the first area, based on the recognition result of the dragging. Here, the control unit 180 detects the first coordinate value data associated with at least one of the first key and the second key, detects the second coordinate value data corresponding to the detected first coordinate value data using the mapping data stored in the memory 160, and divides the screen of the display unit 151 so that the detected second coordinate value data corresponds to a part of the boundary line between the first area and the second area.

In the third embodiment, when the keys last touched by the first finger and the second finger as a result of the dragging are not the outermost keys on the keyboard 121, the control unit 180 can scroll the screen of the display unit 151 up/down or left/right based on the recognition result of the dragging.
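For illustration, the three embodiments above can be summarized as a dispatch on the drag direction and on whether the drag ended at the keyboard edge. The following Python sketch assumes a linear keyboard-to-screen mapping and hypothetical function and action names; the disclosure specifies the behavior, not an API.

    # Illustrative dispatch of the three embodiments above; the normalized
    # keyboard-to-screen mapping and the returned action names are assumptions.

    def map_key_to_screen_x(key_col, key_cols=15, screen_width=1920):
        """Map a key column (first coordinate data) to a screen x position
        (second coordinate data); a linear stand-in for the stored mapping."""
        return int(key_col / (key_cols - 1) * (screen_width - 1))

    def handle_two_finger_drag(direction, ends_at_edge, end_col=None):
        if ends_at_edge and direction == "down":
            # First embodiment: left/right split; the boundary follows the keys.
            return ("split_left_right", map_key_to_screen_x(end_col))
        if ends_at_edge and direction in ("left", "right"):
            # Second embodiment: upper/lower split.
            return ("split_top_bottom", None)
        # Third embodiment: the drag stops short of the edge, so scroll.
        return ("scroll", direction)

    print(handle_two_finger_drag("down", True, end_col=3))   # boundary near the left
    print(handle_two_finger_drag("right", False))            # ('scroll', 'right')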

According to the present embodiment, the user has the advantage of being able to input a desired user command by intuitively moving his or her hand on the keyboard 121.

FIG. 15 is a diagram illustrating an example of a screen displayed on a display unit of a computing device according to an embodiment of the present invention.

Referring to FIG. 15A, the control unit 180 recognizes a first finger touching a first key and a second finger touching a second key among the plurality of keys included in the keyboard 121. The first key and the second key may be positioned adjacent to each other on the keyboard 121. Also, the first key and the second key may be located toward the upper portion of the keyboard 121.

The control unit 180 can recognize the dragging of the first finger and the second finger in the downward direction of the first key and the second key on the keyboard 121. Here, it is assumed that the keys last touched by the first finger and the second finger as a result of the dragging are located at the outermost (e.g., lower) edge of the keyboard 121.

Referring to FIG. 15B, the control unit 180 can divide the screen of the display unit 151 into a first area 411 and a second area 412 located to the right of the first area 411, based on the recognition result of the dragging. For example, when the dragging of the first and second fingers is recognized, the control unit 180 can divide the screen of the display unit 151 so that the first area 411 and the second area 412 have the same horizontal width. As another example, when the dragging of the first finger and the second finger is recognized, the control unit 180 detects the first coordinate value data associated with at least one of the first key and the second key, detects the second coordinate value data corresponding to the detected first coordinate value data using the mapping data stored in the memory 160, and can divide the screen of the display unit 151 so that the detected second coordinate value data corresponds to a part of the boundary line between the first area 411 and the second area 412.

FIG. 16 is a diagram illustrating another example of a screen displayed on a display unit of a computing device according to an embodiment of the present invention. The contents overlapping with those described above with reference to FIG. 15 will not be described again.

The control unit 180 can divide the screen of the display unit 151 into a first area 411 and a second area 412 located to the right of the first area 411, based on the recognition result of the dragging. Here, when the dragging of the first finger and the second finger is recognized, the control unit 180 detects the first coordinate value data associated with at least one of the first key and the second key, detects the second coordinate value data corresponding to the detected first coordinate value data using the mapping data of the first and second coordinate value data stored in the memory 160, and can divide the screen of the display unit 151 so that the detected second coordinate value data corresponds to a part of the boundary line between the first area 411 and the second area 412. That is, when the user places the fingers on the left side of the keyboard 121 and then drags them downward, the boundary line between the first area 411 and the second area 412 can be located on the left side of the display unit 151.

FIG. 17 is a diagram illustrating another example of a screen displayed on a display unit of a computing device according to an embodiment of the present invention.

Referring to FIG. 17A, the control unit 180 recognizes a first finger touching a first key and a second finger touching a second key among the plurality of keys included in the keyboard 121. The first key and the second key may be positioned adjacent to each other on the keyboard 121. Also, the first key and the second key may be located toward the left or right side of the keyboard 121.

The control unit 180 can recognize the dragging of the first finger and the second finger in the right or left direction of the first key and the second key on the keyboard 121. Here, it is assumed that the keys last touched by the first finger and the second finger as a result of the dragging are located at the outermost (e.g., right or left) edge of the keyboard 121.

Referring to FIG. 17B, the control unit 180 can divide the screen of the display unit 151 into a first area 411 and a second area 412 located below the first area 411, based on the recognition result of the dragging. For example, when the dragging of the first finger and the second finger is recognized, the control unit 180 can divide the screen of the display unit 151 so that the first area 411 and the second area 412 have the same vertical width. As another example, when the dragging of the first finger and the second finger is recognized, the control unit 180 detects the first coordinate value data associated with at least one of the first key and the second key, detects the second coordinate value data corresponding to the detected first coordinate value data using the mapping data stored in the memory 160, and can divide the screen of the display unit 151 so that the detected second coordinate value data corresponds to a part of the boundary line between the first area 411 and the second area 412.

FIG. 18 is a diagram illustrating an example of a screen displayed on a display unit of a computing device according to an embodiment of the present invention.

Referring to FIG. 18 (a), an execution screen 420 of a specific program is output on the display unit 151. A scroll bar 421 and an indicator 422 located on the scroll bar 421 are displayed on the right side of the execution screen 420. The indicator 422 may indicate the portion of the page displayed on the current execution screen 420 among the entire pages that can be displayed through the execution screen 420.

The control unit 180 recognizes the first finger touching the first key and the second finger touching the second key among the plurality of keys included in the keyboard 121. The first key and the second key may be positioned adjacent to each other on the keyboard 121.

The control unit 180 can recognize the dragging of the first finger and the second finger in the downward or upward direction of the first key and the second key on the keyboard 121. Here, it is assumed that the keys last touched by the first finger and the second finger as a result of the dragging are not located at the outermost (e.g., lower or upper) edge of the keyboard 121.

Referring to FIG. 18B, the control unit 180 can scroll the screen of the display unit 151 in the downward or upward direction based on the recognition result of the dragging. Referring to the execution screen 420 output to the display unit 151, it can be confirmed that the position of the indicator 422 on the scroll bar 421 moves from the upper portion to the lower portion based on the recognition result of the dragging.

FIG. 19 is a diagram illustrating an example of a screen displayed on a display unit of a computing device according to an embodiment of the present invention.

Referring to FIG. 19A, an execution screen 420 of a specific program is output to the display unit 151. A scroll bar 421 and an indicator 422 located on the scroll bar 421 are displayed below the execution screen 420. The indicator 422 may indicate the portion of the page displayed on the current execution screen 420 among the entire pages that can be displayed through the execution screen 420.

The control unit 180 recognizes the first finger touching the first key and the second finger touching the second key among the plurality of keys included in the keyboard 121. The first key and the second key may be positioned adjacent to each other on the keyboard 121.

The control unit 180 can recognize the dragging of the first finger and the second finger in the right or left direction of the first key and the second key on the keyboard 121. Here, it is assumed that the keys last touched by the first finger and the second finger as a result of the dragging are not located at the outermost (e.g., right or left) edge of the keyboard 121.

Referring to FIG. 19B, the control unit 180 can scroll the screen of the display unit 151 in the right or left direction based on the recognition result of the dragging. Referring to the execution screen 420 output to the display unit 151, it can be confirmed that the position of the indicator 422 on the scroll bar 421 has shifted from the left to the right based on the recognition result of the dragging.

FIG. 20 is a flowchart illustrating another example of a control method of a computing device according to an embodiment of the present invention.

The control unit 180 recognizes the first finger touching the first key and the second finger touching the second key among a plurality of keys on the keyboard 121 (S501). The first key and the second key may be spaced apart from each other on the keyboard 121 and the first key may be a special key (e.g., "Shift key", "Ctrl key", "Alt key", etc.).

When a touch input touching the first key and the second key is sensed, the control unit 180 can recognize the first finger touching the first key and the second finger touching the second key.

The memory 160 stores first coordinate value data related to at least a partial area of the keyboard 121, second coordinate value data related to the entire area of the screen of the display unit 151, and mapping data that associates the first coordinate value data with the second coordinate value data.

The control unit 180 recognizes the dragging of the second finger on the keyboard 121 (S502). The control unit 180 can recognize the dragging of the second finger while the first finger touches the first key.

The control unit 180 can execute a specific function among the functions that can be implemented in the computing device 100, based on the recognition result of the dragging (S503).

In the first embodiment, the control unit 180 can control the display unit 151 to display a locus corresponding to the dragging of the second finger on the screen. For example, the control unit 180 detects the first coordinate value data associated with the second finger on the keyboard 121 before the dragging, calculates the amount of change in the first coordinate value data caused by the dragging, and outputs the locus corresponding to the dragging of the second finger to the display unit 151 based on the mapping data of the first and second coordinate value data stored in the memory 160 and the calculated amount of change.
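The following Python sketch illustrates, under assumptions, how drag deltas on the keyboard surface could be mapped to an on-screen locus; the linear per-axis scale factors stand in for the stored mapping data, and the sizes and names are illustrative.

    # Sketch of mapping keyboard-surface drag deltas to an on-screen locus.
    # The per-axis scale factors stand in for the stored mapping data;
    # sizes, names, and the anchor point are illustrative assumptions.

    def drag_to_locus(start_xy, keyboard_points, kb_size=(300, 100),
                      screen_size=(1920, 1080)):
        """keyboard_points: (x, y) samples of the second finger on the keyboard.
        Returns screen points tracing the locus, anchored at start_xy."""
        sx = screen_size[0] / kb_size[0]      # mapping data, x axis
        sy = screen_size[1] / kb_size[1]      # mapping data, y axis
        x0, y0 = keyboard_points[0]
        return [(start_xy[0] + (x - x0) * sx,  # change amount since drag start
                 start_xy[1] + (y - y0) * sy)
                for x, y in keyboard_points]

    print(drag_to_locus((100, 100), [(10, 10), (20, 10), (30, 20)]))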

In the second embodiment, the control unit 180 outputs a locus corresponding to the dragging of the second finger to the display unit 151, performs handwriting recognition based on the locus displayed on the display unit 151, and can convert the locus into letters, numbers, or symbols based on the recognition result and output them to the display unit 151. A program or application necessary for handwriting recognition may be stored in the memory 160 in advance.

In the third embodiment, the control unit 180 can control the display unit 151 to highlight a part of the text displayed on the screen of the display unit 151, based on the dragging of the second finger. For example, the control unit 180 detects the first coordinate value data associated with the second finger on the keyboard 121 before the dragging, calculates the amount of change in the first coordinate value data caused by the dragging, and can control the display unit 151 to highlight a part of the text based on the mapping data of the first and second coordinate value data stored in the memory 160 and the calculated amount of change. Depending on the embodiment, the control unit 180 may detect the number of keys touched by the second finger during the dragging and control the display unit 151 to highlight, among the characters (letters, numbers, or symbols) included in the text, the same number of characters as the number of detected keys.
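As a rough illustration of the key-count rule just described, the following Python sketch highlights one character per key crossed during the drag; the function name and the cursor handling are assumptions, not part of this disclosure.

    # Sketch of the key-count rule: highlight one character per key the
    # second finger crossed while dragging. Names are illustrative.

    def highlight_span(text, cursor, keys_crossed):
        """Return the slice of text to highlight, starting at cursor."""
        return text[cursor:min(cursor + keys_crossed, len(text))]

    doc = "computing apparatus"
    # The second finger crossed 9 keys while the first finger held a special key:
    print(highlight_span(doc, 0, 9))  # 'computing'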

FIG. 21 is a diagram illustrating an example of a screen displayed on the display unit of the computing device according to an embodiment of the present invention.

Referring to FIG. 21A, the control unit 180 recognizes a first finger touching the first key and a second finger touching the second key among the plurality of keys on the keyboard 121. The first key and the second key may be spaced apart from each other on the keyboard 121, and the first key may be a special key (e.g., "Shift key", "Ctrl key", "Alt key", etc.). For example, an execution screen 510 of the Paint program may be output to the display unit 151.

Then, the control unit 180 recognizes the dragging of the second finger while the first finger touches the first key.

Referring to FIG. 21B, the control unit 180 may control the display unit 151 to display the locus 511 corresponding to the dragging of the second finger on the execution screen 510. For example, the control unit 180 detects the first coordinate value data associated with the second finger on the keyboard 121 before the dragging, calculates the amount of change in the first coordinate value data caused by the dragging, and displays the locus 511 on the display unit 151 based on the mapping data of the first and second coordinate value data stored in the memory 160 and the calculated amount of change.

FIG. 22 is a diagram illustrating another example of a screen displayed on a display unit of a computing device according to an embodiment of the present invention.

Referring to FIG. 22 (a), the control unit 180 recognizes the first finger touching the first key and the second finger touching the second key among the plurality of keys on the keyboard 121. The first key and the second key may be spaced apart from each other on the keyboard 121, and the first key may be a special key (e.g., "Shift key", "Ctrl key", "Alt key", etc.). For example, an execution screen 520 of a document work program may be output to the display unit 151.

Then, the control unit 180 recognizes the dragging of the second finger while the first finger touches the first key.

Referring to FIG. 22B, the control unit 180 may control the display unit 151 to display a locus 521 corresponding to the dragging of the second finger on the execution screen 520.

Referring to FIG. 22 (c), the control unit 180 performs handwriting recognition based on the locus 521 displayed on the execution screen 520 and can control the display unit 151 to convert the locus 521 into letters, numbers, or symbols based on the recognition result.

FIG. 23 is a diagram illustrating an example of a screen displayed on a display unit of a computing device according to an embodiment of the present invention.

Referring to FIG. 23 (a), the control unit 180 recognizes a first finger touching the first key and a second finger touching the second key among a plurality of keys on the keyboard 121. The first key and the second key may be spaced apart from each other on the keyboard 121 and the first key may be a special key (e.g., "Shift key", "Ctrl key", "Alt key", etc.). As an example, it is assumed that an execution screen 530 of the document work program is output to the display unit 151, and that the text 531 is included in the execution screen 530.

Then, the control unit 180 recognizes the dragging of the second finger while the first finger touches the first key.

Referring to FIG. 23 (b), the control unit 180 can control the display unit 151 to highlight a portion 532 of the text displayed on the execution screen of the display unit 151, based on the dragging of the second finger.

For example, when the dragging of the second finger is recognized while the first finger touches the first key, the control unit 180 detects the first coordinate value data associated with the second finger on the keyboard 121 before the dragging, calculates the amount of change in the first coordinate value data caused by the dragging, and can control the display unit 151 to highlight the portion 532 of the text 531 based on the mapping data of the first and second coordinate value data stored in the memory 160 and the calculated amount of change.

Depending on the embodiment, the control unit 180 may detect the number of keys touched by the second finger during the dragging and control the display unit 151 to highlight, among the characters (letters, numbers, or symbols) included in the text 531, the same number of characters as the number of detected keys.

According to an embodiment of the present invention, a user can turn the pages of a screen output to the display unit 151 or zoom the screen in or out using the keyboard 121 provided with a touch sensor. This will be described with reference to FIGS. 24 and 25.

FIG. 24 is a diagram illustrating an example of a screen displayed on a display unit in a computing device according to an embodiment of the present invention.

Referring to FIG. 24 (a), the control unit 180 recognizes the first finger touching the first key among the plurality of keys on the keyboard 121. In this case, the control unit 180 can recognize the first finger when it touches the first key for a predetermined time or longer, or when the first key is pressed with a predetermined pressure or more.

Then, the control unit 180 recognizes dragging of the first finger on the keyboard 121.

Referring to FIG. 24 (b), the control unit 180 can turn the pages of the screen output to the display unit 151 based on the recognition result of the dragging of the first finger. For example, when the dragging of the first finger is recognized in the right direction of the first key, the control unit 180 can turn the screen output to the display unit 151 from the current page to the next page, and when the dragging of the first finger is recognized in the left direction of the first key, the control unit 180 can turn the screen from the current page to the previous page.
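A minimal sketch of the direction-to-page rule just described, assuming hypothetical page indices and direction labels:

    # Sketch of the direction-to-page rule: a right drag turns to the next
    # page, a left drag to the previous page. Names are assumptions.

    def turn_page(current_page, drag_direction, last_page):
        if drag_direction == "right":
            return min(current_page + 1, last_page)  # next page
        if drag_direction == "left":
            return max(current_page - 1, 0)          # previous page
        return current_page                          # other drags: no turn

    print(turn_page(3, "right", 10))  # 4
    print(turn_page(3, "left", 10))   # 2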

FIG. 25 is a diagram illustrating an example of a screen displayed on a display unit in a computing device according to an embodiment of the present invention.

Referring to FIG. 25 (a), the control unit 180 recognizes five fingers touching at least five keys on the keyboard 121, and can recognize a first dragging of the five fingers in the direction in which the five fingers move away from one another.

Referring to FIG. 25 (b), the control unit 180 can zoom in on a specific screen output to the display unit 151 based on the recognized first dragging.

Referring again to FIG. 25 (b), the control unit 180 can recognize a second dragging of the five fingers in the direction in which the five fingers approach one another.

Referring to (c) of FIG. 25, the control unit 180 can zoom out (reduce) a specific screen output to the display unit 151 based on the recognized second dragging.

The degree of zoom-in of the screen output to the display unit 151 can be proportional to how far the five fingers move away from one another, and the degree of zoom-out can be proportional to how close the five fingers move toward one another.
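As an illustration of this proportionality, the following Python sketch derives a zoom factor from the change in finger spread; the centroid-distance metric and the sample finger positions are assumptions, not the disclosed implementation.

    # Sketch of a zoom factor proportional to the change in finger spread.
    # The centroid-distance metric and the sample points are assumptions.

    import math

    def spread(points):
        """Mean distance of the finger positions from their centroid."""
        cx = sum(x for x, _ in points) / len(points)
        cy = sum(y for _, y in points) / len(points)
        return sum(math.hypot(x - cx, y - cy) for x, y in points) / len(points)

    def zoom_factor(fingers_before, fingers_after):
        # > 1 when fingers move apart (zoom in), < 1 when they approach (zoom out)
        return spread(fingers_after) / spread(fingers_before)

    before = [(0, 0), (2, 0), (0, 2), (2, 2), (1, 3)]
    after = [(-1, -1), (3, -1), (-1, 3), (3, 3), (1, 4)]
    print(zoom_factor(before, after))  # > 1.0, so zoom in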

Meanwhile, according to an embodiment of the present invention, a user can set or input a password using the keyboard 121 provided with a touch sensor. This will be described with reference to FIG. 26.

FIG. 26 is a diagram illustrating an example of a screen displayed on a display unit in a computing device according to an embodiment of the present invention.

For example, the keyboard 121 has a function of receiving a fingerprint, and the control unit 180 can recognize the fingerprint received through the keyboard 121. The memory 160 may previously store a program or application necessary for fingerprint recognition. The user can register his or her fingerprint; depending on the embodiment, the user may register the fingerprints of at least two of the ten fingers. The control unit 180 may output the first screen 610, which guides the input of the user password, to the display unit 151 after the booting of the computing device 100 is completed. The user can place the finger corresponding to the registered fingerprint on the keyboard 121, and the control unit 180 recognizes the fingerprint sensed through the keyboard 121 and can unlock the computing device 100 if the recognized fingerprint matches the registered fingerprint.

In another example, the user may set, as the user password, a touch input touching some of the plurality of keys included in the keyboard 121. The control unit 180 may output the first screen 610, which guides the input of the user password, to the display unit 151 after the booting of the computing device 100 is completed. The user can place his or her fingers on specific keys on the keyboard 121 according to the set user password, and the control unit 180 can determine whether to unlock based on the touch input through the specific keys on the keyboard 121.

As another example, the user may set, as the user password, a combination of a touch input touching some of the plurality of keys included in the keyboard 121 and an input of numbers, letters, and symbols. The control unit 180 may output the first screen 610, which guides the input of the user password, to the display unit 151 after the booting of the computing device 100 is completed. The user can input the password while his or her fingers are positioned on specific keys on the keyboard 121 according to the set user password, and the control unit 180 can determine whether to unlock based on the touch input through the specific keys on the keyboard 121 and the input password.
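A minimal sketch of the combined unlock check described in this last example, assuming a hypothetical stored-secret format:

    # Sketch of the combined unlock check: a registered secret pairs the
    # keys that must be touched with a typed password. The structure and
    # values are illustrative assumptions, not the disclosed implementation.

    REGISTERED = {"touch_keys": frozenset({"F", "J"}), "password": "1q2w#"}

    def try_unlock(touched_keys, typed_password):
        """Unlock only if both the touched keys and the password match."""
        return (frozenset(touched_keys) == REGISTERED["touch_keys"]
                and typed_password == REGISTERED["password"])

    print(try_unlock({"F", "J"}, "1q2w#"))  # True
    print(try_unlock({"F"}, "1q2w#"))       # False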

According to the embodiments described above, it is possible to provide a computing device and a control method thereof in which user convenience is enhanced by providing a keyboard having a touch sensor.

The present invention described above can be implemented as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of the computer-readable medium include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and the medium may also be implemented in the form of a carrier wave (e.g., transmission over the Internet). In addition, the computer may include the control unit 180 of the terminal. Accordingly, the above description should not be construed as limiting in all respects and should be considered illustrative. The scope of the present invention should be determined by rational interpretation of the appended claims, and all changes within the equivalent scope of the present invention are included in the scope of the present invention.

100: computing device
121: keyboard
151: display unit
180: control unit

Claims (20)

1. A computing device comprising:
a keyboard including a plurality of keys and having a touch sensor on at least a part of the plurality of keys;
a display unit; and
a control unit,
wherein the control unit recognizes a first finger touching a first key of the plurality of keys, a second finger touching a second key, and a third finger touching a third key, the first key being located to the left of the second key, the third key being located to the right of the second key, and the second key being located on the keyboard on the same line as at least one of the first key and the third key or above the first key and the third key, and
controls the display unit to display an indicator in a first area on the screen based on the recognition result.

2. The computing device according to claim 1, wherein, on the keyboard, the first key and the third key are located adjacent to the second key.

3. The computing device according to claim 1, further comprising a memory for storing first coordinate value data related to at least a partial area of the keyboard, second coordinate value data related to the entire area of the screen of the display unit, and mapping data of the first coordinate value data and the second coordinate value data.

4. The computing device according to claim 3, wherein the control unit recognizes dragging of the first finger, the second finger, and the third finger on the keyboard, calculates an amount of change in the first coordinate value data associated with at least one of the first finger, the second finger, and the third finger caused by the dragging, and controls the display unit to move the indicator from the first area to a second area on the screen based on the mapping data of the first coordinate value data and the second coordinate value data and the calculated amount of change.

5. The computing device according to claim 4, wherein the control unit detects a first touch input in which the first finger taps twice a first area on the keyboard where the first finger is positioned by the dragging of the first finger, the second finger, and the third finger, and executes, in accordance with the detected first touch input, a specific function corresponding to the second area on the screen of the display unit where the indicator is located.

6. The computing device according to claim 4, wherein the control unit detects a second touch input in which the second finger touches a second area on the keyboard where the second finger is positioned by the dragging of the first finger, the second finger, and the third finger, senses the screen output to the display unit when the second touch input is sensed, and executes a specific function corresponding to the second touch input based on the sensed screen.
7. A computing device comprising:
a keyboard including a plurality of keys and having a touch sensor on at least a part of the plurality of keys;
a display unit; and
a control unit,
wherein the control unit recognizes a first finger touching a first key of the plurality of keys and a second finger touching a second key, recognizes dragging of at least one of the first finger and the second finger on the keyboard, and executes a specific one of the functions that can be implemented in the computing device based on the recognition result of the dragging.

8. The computing device according to claim 7, wherein the first key and the second key are located adjacent to each other on the keyboard.

9. The computing device according to claim 8, wherein, on the keyboard, the first key is located on the same line as the second key or below the second key.

10. The computing device according to claim 8, wherein the control unit recognizes dragging of the first finger and the second finger in the downward direction of the first key and the second key on the keyboard, and, when the keys last touched by the first finger and the second finger as a result of the dragging are the outermost keys on the keyboard, controls the display unit to divide and display the screen of the display unit into a first area and a second area located to the right of the first area.

11. The computing device according to claim 10, further comprising a memory for storing first coordinate value data related to the entire area of the keyboard, second coordinate value data related to the entire area of the screen of the display unit, and mapping data of the first coordinate value data and the second coordinate value data.

12. The computing device according to claim 11, wherein the control unit detects first coordinate value data associated with at least one of the first key and the second key, detects second coordinate value data corresponding to the detected first coordinate value data using the mapping data of the first coordinate value data and the second coordinate value data, and controls the display unit to divide and display the screen of the display unit into the first area and the second area such that the detected second coordinate value data corresponds to a part of the boundary line between the first area and the second area.

13. The computing device according to claim 8, wherein the control unit recognizes dragging of the first finger and the second finger in the right or left direction of the first key and the second key, and, when the keys last touched by the first finger and the second finger as a result of the dragging are the outermost keys on the keyboard, controls the display unit to divide and display the screen of the display unit into a first area and a second area located below the first area.

14. The computing device according to claim 8, wherein the control unit recognizes dragging of the first finger and the second finger in the upward or downward direction of the first key and the second key on the keyboard, and, when the keys last touched by the first finger and the second finger as a result of the dragging are not the outermost keys on the keyboard, controls the display unit to scroll the screen of the display unit upward or downward.

15. The computing device according to claim 8, wherein the control unit recognizes dragging of the first finger and the second finger in the right or left direction of the first key and the second key on the keyboard, and, when the keys last touched by the first finger and the second finger as a result of the dragging are not the outermost keys on the keyboard, controls the display unit to scroll the screen of the display unit to the right or left.
16. The computing device according to claim 7, wherein the first key and the second key are located apart from each other on the keyboard, and the first key is at least one of a Shift key, a Ctrl key, and an Alt key.

17. The computing device according to claim 16, wherein the control unit recognizes dragging of the second finger while the first finger touches the first key, and controls the display unit to display a locus corresponding to the dragging of the second finger.

18. The computing device according to claim 17, wherein the control unit performs handwriting recognition based on the locus displayed on the display unit, and controls the display unit to convert the locus into letters, numbers, or symbols and display them based on the result of the handwriting recognition.

19. The computing device according to claim 16, wherein the display unit displays a screen including text, and the control unit recognizes dragging of the second finger while the first finger touches the first key and controls the display unit to highlight and display a portion of the text based on the dragging of the second finger.

20. The computing device according to claim 16, wherein the control unit detects the number of keys touched by the second finger by the dragging of the second finger, and controls the display unit to highlight and display, among the characters included in the text, the same number of characters as the number of the detected keys.
KR1020140086640A 2014-06-19 2014-07-10 Computing apparatus and method for controlling the same KR20160006932A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020140086640A KR20160006932A (en) 2014-07-10 2014-07-10 Computing apparatus and method for controlling the same
PCT/KR2014/010748 WO2015194712A1 (en) 2014-06-19 2014-11-10 Computing apparatus and method for controlling same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020140086640A KR20160006932A (en) 2014-07-10 2014-07-10 Computing apparatus and method for controlling the same

Publications (1)

Publication Number Publication Date
KR20160006932A true KR20160006932A (en) 2016-01-20

Family

ID=55307686

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020140086640A KR20160006932A (en) 2014-06-19 2014-07-10 Computing apparatus and method for controlling the same

Country Status (1)

Country Link
KR (1) KR20160006932A (en)


Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination