JPH07104914A - Touch input device - Google Patents

Touch input device

Info

Publication number
JPH07104914A
JPH07104914A (application JP24511293A)
Authority
JP
Japan
Prior art keywords
finger
pointer
movement data
touch
sub
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP24511293A
Other languages
Japanese (ja)
Inventor
Hideyuki Hara
秀之 原
Original Assignee
Toshiba Corp
株式会社東芝
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toshiba Corp, 株式会社東芝 filed Critical Toshiba Corp
Priority to JP24511293A priority Critical patent/JPH07104914A/en
Publication of JPH07104914A publication Critical patent/JPH07104914A/en
Pending legal-status Critical Current

Abstract

(57) [Abstract] [Object] To eliminate erroneous operation at the time of function selection and improve operability by changing the movement data of the touched finger and the movement data of the pointer detected by the device at a predetermined ratio, so that the pointer can be visually recognized. [Configuration] When a certain main function name is touched with a finger, coordinate detection means (1, 2) displays a pointer at the touch position of the finger and obtains the coordinates of the touch position; finger movement data detection means (2) obtains the movement data of the finger in the touched state from the coordinates obtained by the coordinate detection means; reduced movement data calculation means (2) reduces the movement data obtained by the finger movement data detection means at a predetermined ratio to obtain reduced movement data; pointer moving means (2, 3, 5) moves the pointer on the screen in accordance with the reduced movement data obtained by the reduced movement data calculation means; and, when the finger is released, sub-function selection means (2, 3, 4) selects the sub-function name designated by the pointer moved by the pointer moving means.

Description

Detailed Description of the Invention

[0001]

BACKGROUND OF THE INVENTION 1. Field of the Invention The present invention relates to a touch input device for selecting and inputting a function name on a display screen by touching it with a finger, and in particular to a touch input device capable of eliminating erroneous operation at the time of function selection by changing the movement data of the touched finger and the movement data of the pointer detected by the device at a predetermined ratio, so that the pointer can be visually recognized.

[0002]

2. Description of the Related Art A mouse or a trackball is generally used as the pointing device in the graphic display computer systems of office processing systems, whereas in the graphic display computer systems of plant monitoring and control systems, a touch input device is widely used as the pointing device because of the installation environment and other factors.

In this type of touch input device, when any one of the main function names displayed in a first menu is touched, a detailed menu corresponding to the touched main function name is displayed as a pop-up menu. The detailed menu consists of a plurality of sub-function names, and the operator selects the desired sub-function name by a touch operation.

The pop-up menu method is a technique widely used in pointing processing with a mouse or a touch screen. A system using this method includes a device control section that processes input from the mouse and the touch screen, and an address calculation unit that executes the address calculations for display control of the pointer based on signals from the device control section.

As shown in FIG. 11, when the device controller receives a signal generated by an operator's operation while in the event waiting state (ST1) with the first menu displayed, it determines whether the source of this signal is a mouse or a touch screen (ST2).

When the source is a mouse, it is further determined whether or not the content of the signal is a movement amount (ST3). If it is a movement amount, the movement amount is sent to the address calculation section (ST4); if it is not, a Press/Release signal is sent to the address calculation section according to whether the button was pressed (Press) or released (Release) (ST5).

When the address calculator receives the Press signal, it sends the Press signal to the host computer in order to display the detailed menu. When it receives a movement amount, it generates a signal for moving the mouse cursor in accordance with the movement amount and sends it to the window control unit. When the Release signal is received, the Release signal is sent to the host computer in order to specify the detailed menu item corresponding to the position of the mouse cursor at that moment and to erase the detailed menu display.

Such menu selection can also be executed by a touch operation; in the touch operation, the state in which the finger touches the screen corresponds to the state in which the mouse button is pressed.

For example, when the first menu is touched with a finger in step ST1, the device controller determines in step ST2 that the signal is input from the touch screen.

Next, the device control unit determines whether the content of the signal is a touch (ST6). If it is a touch, the Press signal including the touch position is output to the address calculation unit to display the pop-up menu (ST7); if it is not a touch, it is determined whether or not the finger has been released (ST8).

When it is determined in step ST8 that the finger has not been released, the finger movement position is sent to the address calculation section to move the pointer displayed under the finger (ST9). When it is determined that the finger has been released, a Release signal designating the detailed menu item is sent to the address calculation section (ST10). As a result, the desired detailed menu item is selected, and the function corresponding to it is executed.
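The branching of steps ST1 to ST10 above can be sketched as a small dispatch routine. The event representation and the `AddressCalc` stub below are illustrative assumptions for exposition, not part of the patent:

```python
class AddressCalc:
    """Minimal stand-in for the address calculation unit; records what it receives."""
    def __init__(self):
        self.log = []

    def send(self, kind, payload):
        self.log.append((kind, payload))


def handle_event(event, calc):
    """Dispatch one device event, mirroring steps ST2-ST10 of FIG. 11."""
    if event["source"] == "mouse":                    # ST2: which device sent it?
        if "movement" in event:                       # ST3: is it a movement amount?
            calc.send("movement", event["movement"])  # ST4
        else:
            calc.send(event["button"], None)          # ST5: Press/Release
    else:                                             # touch screen
        if event["type"] == "touch":                  # ST6
            calc.send("Press", event["pos"])          # ST7: show pop-up menu
        elif event["type"] == "release":              # ST8
            calc.send("Release", event["pos"])        # ST10: select detail menu item
        else:
            calc.send("movement", event["pos"])       # ST9: move pointer under finger
```

A touch followed by a mouse movement, for example, would reach the address calculation unit as a Press and then a movement amount.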

[0012]

However, in the above touch input device, when the display size of each detailed menu item in the pop-up menu is smaller than the touching finger, the operator cannot recognize which item is specified by the pointer, which makes it difficult to select the desired item correctly.

Further, if the detailed menu item is selected incorrectly, the selection must be made again, so the operation is delayed. Even when the item is selected correctly, the finger must be moved with great care, which makes the operation difficult.

The present invention has been made in consideration of the above circumstances, and an object of the present invention is to provide a touch input device capable of eliminating erroneous operation at the time of function selection and improving operability by changing the movement data of the touched finger and the movement data of the pointer detected by the device at a predetermined ratio, so that the pointer can be visually recognized.

[0015]

The invention according to claim 1 relates to a touch input device that has a display screen for displaying a plurality of main function names, displays on the display screen a plurality of sub-function names corresponding to a main function name when that main function name is touched with a finger, and inputs an execution command for the function corresponding to a certain sub-function name by moving the finger, while it remains touching the display screen, to designate that sub-function name. The device comprises: coordinate detection means for displaying a pointer at the touch position of the finger and obtaining the coordinates of the touch position when the certain main function name is touched with the finger; finger movement data detection means for obtaining movement data of the finger, from the coordinates obtained by the coordinate detection means, when the finger moves in the touched state; reduced movement data calculation means for reducing the movement data obtained by the finger movement data detection means at a predetermined ratio to obtain reduced movement data; pointer moving means for moving the pointer on the display screen in correspondence with the reduced movement data obtained by the reduced movement data calculation means; and sub-function selection means for selecting the sub-function name designated by the pointer moved by the pointer moving means when the finger is separated from the display screen.

The invention according to claim 2 is the touch input device according to claim 1, in which the reduced movement data calculation means comprises: a screen size storage unit that stores screen size data indicating the vertical and horizontal lengths of the display screen; an area size storage unit that stores area size data indicating the vertical and horizontal lengths of the display area of each sub-function name; and a reduction ratio calculation unit that, when finger movement data is received from the finger movement data detection means, divides the area size data stored in the area size storage unit by the screen size data stored in the screen size storage unit to obtain the predetermined ratio.

[0017]

Therefore, according to the invention of claim 1, by the means described above, when a certain main function name is touched with the finger, the coordinate detection means displays the pointer at the touch position of the finger and obtains the coordinates of the touch position; when the finger moves in the touched state, the finger movement data detection means obtains the movement data of the finger from those coordinates; the reduced movement data calculation means reduces the movement data obtained by the finger movement data detection means at a predetermined ratio to obtain reduced movement data; the pointer moving means moves the pointer on the display screen in accordance with the reduced movement data; and, when the finger is moved away from the display screen, the sub-function selection means selects the sub-function name designated by the pointer moved by the pointer moving means. By thus changing the movement data of the touched finger and the movement data of the pointer detected by the device at a predetermined ratio so that the pointer can be visually recognized, erroneous operation at the time of function selection can be eliminated and operability can be improved.

Further, in the invention according to claim 2, the reduced movement data calculation means comprises a screen size storage section storing screen size data indicating the vertical and horizontal lengths of the display screen, an area size storage unit storing area size data indicating the vertical and horizontal lengths of the display area of each sub-function name, and a reduction ratio calculation unit. When the reduction ratio calculation unit receives the finger movement data from the finger movement data detection unit, it divides the area size data stored in the area size storage unit by the screen size data stored in the screen size storage unit to obtain the predetermined ratio. Therefore, in addition to the effect of claim 1, the operability can be further improved by effectively utilizing the entire display screen for the touch operation.
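The division described for claim 2 amounts to a per-axis ratio of the sub-function display area to the full screen. A minimal sketch, assuming sizes are given as (width, height) tuples (an illustrative convention, not stated in the patent):

```python
def reduction_ratio(area_size, screen_size):
    """Per-axis reduction ratio in the manner of claim 2: divide the area size
    data of the sub-function display area by the screen size data."""
    (aw, ah), (sw, sh) = area_size, screen_size
    return (aw / sw, ah / sh)  # (fx, fy) applied to the finger's movement
```

For instance, a 320x120 pop-up area on a 1280x960 screen would yield ratios of (0.25, 0.125), so the pointer moves a quarter (horizontally) or an eighth (vertically) as far as the finger.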

[0019]

Embodiments of the present invention will be described below with reference to the drawings. FIG. 1 is a block diagram showing the configuration of a touch input device according to a first embodiment of the present invention. In this device, the touch screen 1 has a function of detecting whether or not it is touched by a finger, together with the touch position, and sending the corresponding X and Y signals to the device control unit 2.

The device control unit 2 has a function of detecting the presence or absence of a touch from the X and Y signals received from the touch screen 1, sending the corresponding Press/Release signal to the network control unit 4 via the address calculator 3, and, upon detecting the touch position, sending the corresponding touch position data to the address calculation unit 3. This Press/Release signal includes coordinate information of the touch position in addition to the presence/absence information of the touch.

Further, the device control unit 2 has a function of converting the movement amount of the finger in the touched state into the movement amount of the pointer in accordance with the change of the touch position data, and sending the corresponding pointer movement amount data to the address calculation unit 3.

The address calculation unit 3 has a function of calculating a planned movement position of the pointer based on the touch position data and the pointer movement amount data received from the device control unit 2 and sending a pointer movement signal to the window control unit 5.

The window control unit 5 has a function of displaying a pointer on the display screen of the CRT 6 based on the pointer movement signal received from the address calculation unit 3. The network control unit 4 has a function of sending the Press/Release signal received from the address calculator 3 to the host computer 8 via the network 7, and sending display requests received from the host computer 8 to the window control unit 5.

The host computer 8 has a function of reading out the program corresponding to the selected function name based on the Press/Release signal received from the network control unit 4, and sending a screen display command corresponding to the program to the window control unit 5 via the network 7 and the network control unit 4.

The window controller 5 has a function of displaying a corresponding display screen and a pointer on the CRT 6 based on a screen display command received from the network controller 4.

Next, the operation of the touch input device configured as described above will be described with reference to the flowcharts of FIGS. 2 and 3. First, the operation of the screen display program will be described with reference to FIG. 2, and then the operations of the device control unit and the address calculation unit will be described in detail with reference to FIG.

Now, as shown in FIG. 4, three main function names M1 to M3 are displayed on the display screen as the first menu (ST11). The device control unit 2, in the standby state waiting for a finger touch as an event (ST12), determines whether or not a finger has touched the touch screen 1 (ST13). When there is no touch, it continues to wait for an event; when there is a touch, the touch position is detected (ST14), and the fact that there was a touch, together with the touch position, is sent to the host computer 8 through the address calculation unit 3 and the network control unit 4.

The host computer 8 determines whether or not the touched position is within the first menu (ST15). When the touched position is outside the first menu, the host computer determines that the touch is invalid and continues to wait for an event. When the touch position is within the first menu, the touch is determined to be valid, and a pop-up menu including six sub-function names S1 to S6, as shown in FIG. 5, is displayed on the CRT 6 through the network control unit 4 and the window control unit 5 (ST16).

When the pop-up menu is displayed, the device control unit 2 waits to detect the release of the finger (ST17). While the finger touches the display screen, it continues to wait for the event; when the finger is released from the display screen (ST18), the fact that the finger has been released, together with the release position, is sent to the host computer 8 through the address calculator 3 and the network controller 4.

When the host computer 8 is notified that the finger has been removed, it erases the pop-up menu from the display screen through the network controller 4 and the window controller 5 (ST19), detects the release position (ST20), and determines whether or not this release position is within the pop-up menu (ST21).

If the release position is outside the menu, the host computer 8 returns to step ST12; if the release position is within the menu, the host computer 8 executes the operation of the sub-function name at that position (ST22) and then returns to step ST12.

Next, the operations of the device controller 2 and the address calculator 3 corresponding to the above operation of the touch input device will be described in detail. Suppose the three main function names M1 to M3 are displayed on the display screen by the operator's operation. The device control unit 2 determines whether or not there is a signal input from the touch screen 1 (ST31); when no signal is input, step ST31 is repeated.

In this state, it is assumed that the operator touches, for example, the first main function name M1 with a finger. The touch screen 1 detects the touch of a finger and sends X and Y signals to the device control unit 2.

The device control section 2 detects, in step ST32 for determining the presence or absence of a touch, that a touch has been made, and sends a Press signal including the touch position corresponding to the X and Y signals to the address calculation unit 3 (ST33: coordinate position detection means).

On the basis of this Press signal, the address calculation unit 3 determines, in step ST34 for determining the presence or absence of a movement amount, that no movement amount is included, detects the touch position (ST35), converts the touch position into a coordinate signal for moving the pointer (ST36: coordinate position detection means), and sends the coordinate signal to the window control section 5 (ST37: coordinate position detection means).

As a result, the window control unit 5 displays the pointer under the finger according to the coordinate signal. Further, the address calculation unit 3 detects the destination host computer 8 based on the Press signal (ST38), and sends the Press signal and the address data of the destination to the network control unit 4 (ST39).

The network control unit 4 sends the Press signal to the host computer 8 and in return receives from the host computer 8 a display command for the plurality of sub-function names S1 to S6 corresponding to the first main function name M1. As a result, the six sub-function names S1 to S6 corresponding to the first main function name M1 are displayed.

Next, it is assumed that the operator's finger moves to the lower part of the screen in a touched state. The touch screen 1 detects the touch position of the finger and sends X and Y signals to the device control unit 2.

The device control section 2 detects, based on the changes in the X and Y signals, that the finger is still moving without being released (ST40: finger movement data detection means), converts the movement amount of the finger into the movement amount of the pointer as shown in the following equations (1) and (2), and sends the pointer movement amount data to the address calculation unit 3 (ST41: finger movement data detection means, reduced movement data calculation means).

Pointer movement amount data = finger movement amount × reduction ratio

Pointer x-direction movement amount data = (Lx − Cx) × fx … (1)
Pointer y-direction movement amount data = (Ly − Cy) × fy … (2)

where Lx, Ly: previous finger detection position; Cx, Cy: current finger detection position; fx, fy: reduction ratios (for example, 0.5).
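Equations (1) and (2) can be checked with a short sketch. The function name and tuple layout are illustrative assumptions; the sign convention (previous minus current) follows the patent's notation as written:

```python
def reduced_pointer_delta(prev, cur, fx=0.5, fy=0.5):
    """Convert a finger movement into a reduced pointer movement, per
    equations (1) and (2): delta = (previous - current) * reduction ratio.

    prev = (Lx, Ly): previous finger detection position
    cur  = (Cx, Cy): current finger detection position
    fx, fy: reduction ratios (0.5 in the patent's example)
    """
    lx, ly = prev
    cx, cy = cur
    return ((lx - cx) * fx, (ly - cy) * fy)
```

With the example ratio of 0.5, a finger movement of 10 units along x yields a pointer movement of only 5 units, which is what lets the pointer slide out from under the finger.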

Based on the pointer movement amount data, the address calculation unit 3 determines in step ST34 that a movement amount has been received, calculates the planned movement position of the pointer (ST42: pointer moving means), and sends a pointer movement signal to the window control unit 5 (ST43: pointer moving means).

The window control section 5 moves and displays the pointer on the display screen of the CRT 6 based on the pointer movement signal received from the address calculation section 3. Here, since the movement amount of the pointer is smaller than the movement amount of the finger, the pointer can be visually recognized as it emerges from under the finger as the finger moves, as shown in FIG. 6.

Therefore, the operator can move the finger while visually recognizing the pointer P and guide the pointer P to the desired sub-function name, so the operation can be performed accurately. When the finger is removed from the display screen after the pointer P has been guided to the desired sub-function name, the device control unit 2 detects in step ST40 (sub-function selection means) that the finger has been removed, and sends the Release signal to the address calculation section 3 (ST44: sub-function selection means).

Based on the Release signal, the address calculation unit 3 detects in step ST35 the touch position at the moment the finger was released, converts the touch position into coordinates in step ST36, and sends the coordinate signal to the window control unit 5 in step ST37.

As a result, the window control section 5 executes the color-changing display of the sub-function name according to this coordinate signal. The color change display indicates that the sub-function name has been selected.

Further, the address calculation section 3 detects the destination host computer 8 in step ST38 (sub-function selection means) based on the Release signal, and sends the Release signal and the destination address data to the network control unit 4 in step ST39 (sub-function selection means).

The network controller 4 sends a Release signal to the host computer 8, and the host computer 8 receives this Release signal as an execution command for the sub-function name.

As a result, the host computer 8 executes the sub-function corresponding to the desired sub-function name. As described above, according to the first embodiment, when a certain main function name M1 is touched with a finger, the pointer P is displayed at the touch position of the finger; when the finger moves from this touch position in the touched state, the movement amount of the finger is multiplied by the reduction ratio and converted into the movement amount of the pointer P, and the pointer P is moved on the display screen according to that movement amount; and when the finger is released from the display screen, the sub-function name designated by the pointer P is selected. Thus, by changing the movement amount of the touched finger and the movement amount of the pointer detected by the device at a predetermined ratio so that the pointer can be visually recognized, erroneous operation at the time of function selection can be eliminated and operability can be improved.

Next, a touch input device according to a second embodiment of the present invention will be described with reference to FIGS. 7 to 9. FIG. 7 is a block diagram showing the configuration of this touch input device. The same parts as those in FIG. 1 are designated by the same reference numerals and detailed description thereof is omitted; only the different parts are described here.

That is, in the apparatus of this embodiment, only the display area in which the sub-function names S1 to S6 are displayed in the display screen is set as the movement area of the pointer P, and the pointer P is moved in correspondence with a finger whose movement area is the entire display screen; an instruction to this effect (hereinafter referred to as a pointer movement window instruction) is received and used for control, further improving operability. Specifically, in comparison with the device shown in FIG. 1, in place of the device control unit 2 there is provided a device control unit 24 that additionally includes: a screen size storage unit 21 in which screen size data indicating the vertical and horizontal lengths of the display screen is stored; an area size storage unit 22 storing area size data indicating the vertical and horizontal lengths of the display area of each sub-function name; and a reduction ratio calculation unit 23 that refers to the screen size storage unit 21 and the area size storage unit 22 to calculate the reduction ratio of the movement amount of the pointer with respect to the movement amount of the finger. The device control unit 24 also has the functions of the device control unit 2 described above.

The operation of the apparatus of this embodiment is shown in the flowcharts of FIGS. 8 and 9. In the flowchart of FIG. 8, the operations up to step ST16 and of steps ST17 to ST22 are the same as those of FIG. 2, and the flowchart of FIG. 9 is the same as that of FIG. 3 except for step ST41; therefore, only the different parts will be described here.

That is, after the plurality of sub-function names S1 to S6 are displayed as a pop-up menu (ST16), the device control section 24 receives the pointer movement window instruction from the host computer 8 (ST16-2). Having received the pointer movement window instruction, when the touch position data changes in the event waiting state (ST17) along with the change of the X and Y signals of the touch screen 1 due to the movement of the finger, the device control unit 24 calculates, as step ST50 in place of step ST41, the x coordinate and the y coordinate in the movement area of the pointer as shown in the following equations (1) and (2), and sends pointer movement coordinate data indicating the calculation result to the address calculator 3 (ST50).

x coordinate = wx + tx × ww / sw … (1)
y coordinate = wy + ty × wh / sh … (2)

where sw: width of the touch screen; sh: height of the touch screen; tx: touched X coordinate; ty: touched Y coordinate; wx: display-start X coordinate of the pop-up menu window; wy: display-start Y coordinate of the pop-up menu window; ww: width of the pop-up menu window; wh: height of the pop-up menu window.

The address calculation unit 3 creates a coordinate signal in step ST36 based on this pointer movement coordinate data, and sends the coordinate signal to the window control unit 5 in step ST37. According to the coordinate signal, the window control unit 5 moves and displays the pointer so as to correspond to a finger whose movement area is the entire display screen, as shown in FIG. 10.
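The step-ST50 mapping above can be written as a minimal sketch, using the same variable names as equations (1) and (2); the function itself is illustrative, not taken from the patent:

```python
def map_to_menu_window(tx, ty, sw, sh, wx, wy, ww, wh):
    """Map a touch position on the full screen into the pop-up menu window (ST50).

    tx, ty: touched coordinates on the screen
    sw, sh: width and height of the touch screen
    wx, wy: display-start coordinates of the pop-up menu window
    ww, wh: width and height of the pop-up menu window
    """
    x = wx + tx * ww / sw  # equation (1)
    y = wy + ty * wh / sh  # equation (2)
    return (x, y)
```

For example, on a 1280x960 screen with a 320x240 menu window starting at (100, 200), touching the screen center (640, 480) places the pointer at the center of the menu window, (260.0, 320.0).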

Thereafter, when the pointer has been guided and the finger is released from the display screen, the host computer 8 executes the operation of the sub-function name corresponding to the release position.

As described above, according to the second embodiment, only the display area in which the sub-function names S1 to S6 are displayed in the display screen is the movement area of the pointer P, and in order to move the pointer P in correspondence with a finger whose movement area is the entire screen, the device control unit 24 divides the area size data stored in the area size storage unit 22 by the screen size data stored in the screen size storage unit 21 when the finger moves, obtaining the predetermined ratios (ww/sw, wh/sh). Therefore, in addition to the effect of the first embodiment, the operability can be further improved by effectively utilizing the entire display screen for the touch operation. The present invention can also be modified in various ways without departing from the scope of the invention.

[0056]

As described above, according to the invention of claim 1, when a certain main function name is touched with a finger, the coordinate detection means displays a pointer at the touch position of the finger and obtains the coordinates of the touch position; when the finger moves in the touched state, the finger movement data detection means obtains the movement data of the finger from the coordinates obtained by the coordinate detection means; the reduced movement data calculation means reduces the movement data obtained by the finger movement data detection means at a predetermined ratio to obtain reduced movement data; the pointer moving means moves the pointer on the display screen in correspondence with the reduced movement data obtained by the reduced movement data calculation means; and when the finger is separated from the display screen, the sub-function selection means selects the sub-function name designated by the pointer moved by the pointer moving means. Since the movement data of the touched finger and the movement data of the pointer detected by the device are changed at a predetermined ratio so that the pointer can be visually recognized, it is possible to provide a touch input device capable of improving operability by eliminating erroneous operation at the time of function selection.

Further, according to the invention of claim 2, the reduced movement data calculation means comprises a screen size storage section storing screen size data indicating the vertical and horizontal lengths of the display screen, an area size storage unit storing area size data indicating the vertical and horizontal lengths of the display area of each sub-function name, and a reduction ratio calculation unit; when the reduction ratio calculation unit receives the finger movement data from the finger movement data detection unit, it divides the area size data stored in the area size storage unit by the screen size data stored in the screen size storage unit to obtain the predetermined ratio. Therefore, in addition to the effect of claim 1, it is possible to provide a touch input device capable of further improving operability by effectively utilizing the entire display screen for the touch operation.

[Brief description of drawings]

FIG. 1 is a block diagram showing a configuration of a touch input device according to a first embodiment of the present invention.

FIG. 2 is a flowchart for explaining the operation of the embodiment.

FIG. 3 is a flowchart for explaining the operation in the same embodiment.

FIG. 4 is a view for explaining a touch operation in the same embodiment.

FIG. 5 is a view for explaining a touch operation in the same embodiment.

FIG. 6 is a view for explaining a touch operation in the same embodiment.

FIG. 7 is a block diagram showing a configuration of a touch input device according to a second embodiment of the present invention.

FIG. 8 is a flowchart for explaining the operation in the same embodiment.

FIG. 9 is a flowchart for explaining the operation of the embodiment.

FIG. 10 is a view for explaining a touch operation in the same embodiment.

FIG. 11 is a flowchart for explaining the operation of a conventional device control unit.

[Explanation of symbols]

1 ... touch screen; 2, 24 ... device control unit; 3 ... address calculation unit; 4 ... network control unit; 5 ... window control unit; 6 ... CRT; 7 ... network; 8 ... host computer; 21 ... screen size storage unit; 22 ... area size storage unit; 23 ... reduction ratio calculation unit; M1 to M3 ... main function names; S1 to S6 ... sub-function names.

Claims (2)

[Claims]
1. A touch input device having a display screen that displays a plurality of main function names, wherein touching a main function name on the display screen with a finger causes a plurality of sub-function names corresponding to that main function name to be displayed on the display screen, and moving the finger while it remains in contact with the display screen designates one of the sub-function names, thereby inputting an execution command for the function corresponding to that sub-function name, the touch input device comprising: coordinate detection means for displaying a pointer at the touch position when a main function name is touched with a finger and for obtaining the coordinates of the touch position; finger movement data detection means for obtaining, from the coordinates obtained by the coordinate detection means, movement data of the finger as it moves while remaining in contact with the screen; reduced movement data calculation means for reducing the movement data obtained by the finger movement data detection means at a predetermined ratio to obtain reduced movement data; pointer moving means for moving the pointer on the display screen in accordance with the reduced movement data calculated by the reduced movement data calculation means; and sub-function selection means for selecting, when the finger is released, the sub-function name designated by the pointer moved by the pointer moving means.
2. The touch input device according to claim 1, wherein the reduced movement data calculation means comprises: a screen size storage unit that stores screen size data indicating the vertical and horizontal lengths of the display screen; an area size storage unit that stores area size data indicating the vertical and horizontal lengths of the display area of each sub-function name; and a reduction ratio calculation unit that, when finger movement data is received from the finger movement data detection means, divides the area size data stored in the area size storage unit by the screen size data stored in the screen size storage unit to obtain the predetermined ratio.
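The touch-move-release sequence of claim 1 can be sketched as a small state machine. This is a hypothetical illustration, not the patent's implementation: class and method names are invented, the per-axis ratio of claim 2 is simplified to a single scalar, and sub-function regions are given as plain rectangles.

```python
# Hypothetical sketch of the claim-1 flow: touching shows a pointer at the
# touch position, finger movement is scaled down by a fixed ratio before
# moving the pointer, and releasing the finger selects the sub-function
# whose display region contains the pointer.

class TouchInputSketch:
    def __init__(self, ratio, sub_function_regions):
        # sub_function_regions: {name: (x0, y0, x1, y1)} bounding boxes
        self.ratio = ratio
        self.regions = sub_function_regions
        self.pointer = None
        self.last_touch = None

    def touch_down(self, x, y):
        self.pointer = (x, y)      # pointer appears at the touch position
        self.last_touch = (x, y)

    def touch_move(self, x, y):
        # Reduce the raw finger delta before applying it to the pointer.
        dx = (x - self.last_touch[0]) * self.ratio
        dy = (y - self.last_touch[1]) * self.ratio
        px, py = self.pointer
        self.pointer = (px + dx, py + dy)
        self.last_touch = (x, y)

    def touch_up(self):
        # Releasing the finger selects the sub-function under the pointer.
        px, py = self.pointer
        for name, (x0, y0, x1, y1) in self.regions.items():
            if x0 <= px <= x1 and y0 <= py <= y1:
                return name
        return None

dev = TouchInputSketch(0.25, {"S1": (0, 0, 50, 20), "S2": (50, 0, 100, 20)})
dev.touch_down(40, 10)
dev.touch_move(120, 10)  # finger moved 80 px; pointer moves only 20 px
print(dev.touch_up())    # S2
```

The point of the reduction is visible here: because the pointer lags behind the finger by the ratio, the finger never covers the sub-function name it is pointing at, which is the erroneous-operation problem the abstract describes.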
JP24511293A 1993-09-30 1993-09-30 Touch input device Pending JPH07104914A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP24511293A JPH07104914A (en) 1993-09-30 1993-09-30 Touch input device

Publications (1)

Publication Number Publication Date
JPH07104914A true JPH07104914A (en) 1995-04-21

Family

ID=17128806

Family Applications (1)

Application Number Title Priority Date Filing Date
JP24511293A Pending JPH07104914A (en) 1993-09-30 1993-09-30 Touch input device

Country Status (1)

Country Link
JP (1) JPH07104914A (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6535787B1 (en) 1996-05-10 2003-03-18 Amada Company, Ltd. Control method and apparatus for plate material processing machine
WO1997046926A3 (en) * 1996-06-07 1998-04-02 Amada Co Ltd Control method and apparatus for plate material processing machine
JP2008181523A (en) * 2007-01-25 2008-08-07 Samsung Electronics Co Ltd Apparatus and method for improvement of usability of touch screen
JP2014056590A (en) * 2007-01-25 2014-03-27 Samsung Electronics Co Ltd Apparatus and method for improvement of usability of touch screen
US8760410B2 (en) 2007-01-25 2014-06-24 Samsung Electronics Co., Ltd. Apparatus and method for improvement of usability of touch screen
US8411918B2 (en) 2008-05-13 2013-04-02 Canon Kabushiki Kaisha Image processing apparatus and image processing method
JP2014170346A (en) * 2013-03-04 2014-09-18 Mitsubishi Electric Corp Information display control device, information display device, and information display control method
JP2014170344A (en) * 2013-03-04 2014-09-18 Mitsubishi Electric Corp Information display control device, information display device, and information display control method

Similar Documents

Publication Publication Date Title
US20150035752A1 (en) Image processing apparatus and method, and program therefor
US8325134B2 (en) Gesture recognition method and touch system incorporating the same
JP4093823B2 (en) View movement operation method
US6346935B1 (en) Touch-sensitive tablet
JP4666808B2 (en) Image display system, image display method, storage medium, and program
EP0677803B1 (en) A method and system for facilitating the selection of icons
JP2682364B2 (en) Electronic musical instrument data setting device
US5559944A (en) User specification of pull down menu alignment
CA2012795C (en) Image editor zoom function
AU2006243730B2 (en) Interactive large scale touch surface system
US5237653A (en) Multiwindow control method and apparatus for work station having multiwindow function
US5502803A (en) Information processing apparatus having a gesture editing function
US7154473B2 (en) Method for controlling position of indicator
US5179655A (en) Multiwindow control method and apparatus for work station having multiwindow function
KR100636184B1 (en) Location control method and apparatus therefor of display window displayed in display screen of information processing device
US6323839B1 (en) Pointed-position detecting apparatus and method
US6069626A (en) Method and apparatus for improved scrolling functionality in a graphical user interface utilizing a transparent scroll bar icon
US6072485A (en) Navigating with direction keys in an environment that permits navigating with tab keys
US5073771A (en) Control method of zooming screen
EP0243925B1 (en) Instruction input system for electronic processor
JP3996852B2 (en) Remote control with touchpad for highlighting preselected parts of displayed slides
JP3909230B2 (en) Coordinate input device
JP4684745B2 (en) User interface device and user interface method
KR920001696B1 (en) Data processing apparatus w/multi-window controller
JP5654340B2 (en) Aspect ratio suggestions for resizable video windows