JPH07230352A - Touch position detecting device and touch instruction processor - Google Patents

Touch position detecting device and touch instruction processor

Info

Publication number
JPH07230352A
Authority
JP
Japan
Prior art keywords
touch position
touch
positions
instruction
plurality
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP22020594A
Other languages
Japanese (ja)
Inventor
Shunichi Ito
Toshio Kamimura
Yoshihiko Kunimori
Michihiro Mese
Shigeto Osuji
Original Assignee
Hitachi Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to JP23042093
Priority to JP32007593
Priority to JP5-320075
Priority to JP5-230420
Application filed by Hitachi Ltd
Priority to JP22020594A
Publication of JPH07230352A
Legal status: Pending

Abstract

(57) [Abstract] [Purpose] To provide a touch position detecting device that can detect individual touch positions when a plurality of touches are made at the same time, and that has improved durability. [Structure] Waves are transmitted from the wave transmitters 11x, 11y via the demultiplexer groups 12x, 12y, and the wave receivers 14x, 14y receive the pulse-shaped signal waves W2x, W2y via the concentrator groups 13x, 13y. The received signals are sent to the touch position/press detection units 23x, 23y, which detect whether the received signals are attenuated, together with the attenuation positions and attenuation levels of the signal waves W2x, W2y. Receiving this, the simultaneous plural touch position detection unit 24 detects, when for example two places are touched, the touch position Ta (X2, Y2) and the touch position Tb (X5, Y5) from the X detection positions X2, X5, the Y detection positions Y2, Y5, and their pressings Pa, Pb.

Description

Detailed Description of the Invention

[0001]

BACKGROUND OF THE INVENTION 1. Field of the Invention The present invention relates to a touch position detecting device for detecting a position touched with a finger, a palm, a pen, or the like, and more particularly to a touch position detecting device that can detect the individual touch positions when a plurality of touches are made at the same time, and to a touch instruction processing device that uses such a touch position detecting device.

[0002]

2. Description of the Related Art In recent years, information processing apparatuses have been reduced in size and weight, and input/output integrated information processing apparatuses, in which a tablet and a display are integrated, have been actively developed.

Regarding the tablet, which is a kind of touch position detecting device, Japanese Patent Laid-Open No. 3-77119 describes a technique for detecting a plurality of touch positions when a plurality of positions are touched with fingers. It uses an analog touch panel that, when a plurality of positions are touched, can output only the barycentric position of those positions, under the condition that a second touch is started with a time delay after the start of a first touch while the first touch is maintained. First, while only the first touch is made, the first touch position is detected. Next, when the second touch is started with the first touch still maintained, the barycentric position of the first and second touch positions is detected. The second touch position is then derived from the first touch position and the barycentric position.

[0004]

In this technique, since the touch positions become unknown when a plurality of touches are started simultaneously, the positions must be touched sequentially, one at a time. For example, when two points a and b must be indicated at once, as when using a shift key, it is impossible to touch point a and point b at the same time; point b must be touched after point a (each touched in turn). There is therefore a problem that operability is poor compared with the case where the touches can be performed simultaneously.

On the other hand, as a tablet that can detect a plurality of positions even when they are touched at the same time, there is the digital touch panel. This touch panel is formed by stacking two transparent sheets, each coated on one side with thin transparent electrodes in a striped pattern, so that the electrode directions intersect and the electrodes face each other. Since the electrodes would otherwise always be in contact, the sheets are overlapped via, for example, a lattice-shaped spacer that separates the electrodes, so that the electrodes come into contact only where the user presses the sheet. Such a touch panel has 100 or more X/Y-axis sensing lines (electrodes).

[0006] This touch panel is liable to wear due to repeated contact of the electrodes, resulting in poor contact. Therefore, there is a problem of poor durability.

Incidentally, as a technique for designating an editing range by a plurality of positions touched at the same time, there is the technique described in Japanese Patent Laid-Open No. 1-142969. It uses the digital touch panel described above and interprets two points designated on the transparent touch panel as the two ends of a diagonal of a quadrangle; this quadrangle is then used as the edit block. Designating an edit block through the transparent touch panel thus becomes easy.

Further, as a technique for inputting a command by simultaneously pressing a plurality of positions, there is the technique described in Japanese Patent Laid-Open No. 4-322322. It also uses the digital touch panel described above: when a plurality of positions on the switch panel are pressed at the same time, this is recognized and a command determined by the number of pressed positions is executed, so the panel can be operated reliably. The publication also discloses a technique in which a single pressing position on the switch panel is accepted, the movement of the pressing position is detected, and a command determined by the direction of movement is executed. This technique likewise enables reliable operation without viewing the switch panel with the naked eye during operation.

However, since these techniques use a digital touch panel, durability remains a problem. Moreover, there is no description of giving an instruction by moving a plurality of simultaneously touched positions and performing processing according to that instruction.

A first object of the present invention is to provide a durable touch position detecting device that can detect the positions of a plurality of touches performed not only sequentially but also simultaneously with a finger, a palm, a pen, or the like.

A second object of the present invention is to provide a durable touch instruction processing device that detects the positions of a plurality of touches performed not only sequentially but also simultaneously with a finger, a palm, a pen, or the like, and performs processing according to the touched positions.

A third object of the present invention is to provide a durable touch instruction processing device that detects the movements of a plurality of touch positions, performed not only sequentially but also simultaneously with a finger, a palm, a pen, or the like, and that can move and display a display object shown on a display device according to the movement instruction.

[0013]

In order to achieve the first object, a touch position detecting device that two-dimensionally detects the contact position of a contact object as a touch position is provided with a plurality of touch position detecting means, each of which detects the touch position in one one-dimensional direction and outputs a detection signal, and with a simultaneous plural touch position determination means that determines the individual touch positions from among candidate positions, obtained when a plurality of positions are touched at the same time, whose number is larger than the number of touches. Each touch position detecting means detects the touch position on the basis of its detection signal, which changes with a touch. The simultaneous plural touch position determination means determines each touch position by selecting, from among the candidate positions, those for which at least one position-independent feature value of a touch extracted from the detection signal of one touch position detecting means matches the corresponding position-independent feature value extracted from the detection signal of the other touch position detecting means.

In order to achieve the second object, the device comprises the above-described touch position detecting device and an information processing device that performs information processing according to the touch positions.

Further, in order to achieve the third object, there are provided a display device that displays a display object; a touch position detecting device that, taking as touch positions the contact positions of contact objects touching a plurality of places simultaneously on the display surface on which the display object is shown, detects those touch positions two-dimensionally in time series; and a control unit that interprets the plurality of touch positions detected in time series as a touch position instruction for moving the display object and causes the display device to display the moved display object. The touch position detecting device has a plurality of touch position detecting means, each of which detects the touch position in one one-dimensional direction and outputs a detection signal, and a simultaneous plural touch position determination means that determines the individual touch positions from among candidate positions, obtained when a plurality of places are touched at the same time, whose number is larger than the number of touches. Each touch position detecting means detects the touch position on the basis of its detection signal, which changes with a touch, and the simultaneous plural touch position determination means determines the touch positions by selecting, from among the candidate positions, those for which at least one position-independent touch feature value extracted from the detection signal of one touch position detecting means matches the corresponding feature value extracted from the detection signal of the other touch position detecting means.

[0016]

With the touch position detecting device described above, a plurality of touch positions of fingers, palms, pens, and the like can be detected simultaneously. When two points a and b must be indicated at once, they can therefore be touched at the same time, and operability is improved. Moreover, whereas a conventional touch plate is made by stacking two transparent sheets each coated over its surface with thin transparent electrodes, in the present invention no such electrodes exist, so contact failure due to wear does not occur and durability is improved.

The touch instruction processing device detects the positions of a plurality of simultaneously performed touches, can perform processing according to the touch positions, and is durable.

Further, it is possible to provide a durable touch instruction processing device that detects a plurality of touch positions performed not only sequentially but also simultaneously with a finger, a palm, a pen, or the like, and that moves and displays the display object shown on the display device in accordance with a movement instruction given by the plurality of touch position indications.

[0019]

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS First, three embodiments of a touch position detecting device according to the present invention capable of simultaneously detecting a plurality of touch positions will be described. In these embodiments, the case where two places are touched at the same time and the two touch positions are detected is described.

A simultaneous plural touch position detecting device using a surface acoustic wave type touch plate (touch panel) according to the first embodiment of the present invention will be described with reference to FIGS. 1 to 3.

The configuration of the simultaneous plural touch position detecting device will be described with reference to FIG. 1.

FIG. 1 shows a schematic system configuration of a simultaneous plural touch position detecting device.

This apparatus has a surface acoustic wave type touch plate 1, which is touched with a finger or the like, and a touch position detecting section 2. Ta and Tb indicate two positions touched simultaneously with fingers. Further, 11x and 11y are wave transmitters for X and Y position detection, 12x and 12y are demultiplexer groups for X and Y position detection, 13x and 13y are concentrator groups for X and Y position detection, and 14x and 14y are wave receivers for X and Y position detection.

Here, as shown in FIG. 1, the wave transmitters 11x, 11y, the demultiplexer groups 12x, 12y, the concentrator groups 13x, 13y, and the wave receivers 14x, 14y are installed at the corners and sides of the touch surface of the touch plate 1. Further, 21x and 21y are wave transmitter control units that control the wave transmission of the X and Y position detecting wave transmitters, 22x and 22y are wave receiver control units that control the wave reception of the X and Y position detecting wave receivers, 23x and 23y are touch position/press detection units for X and Y position detection, 24 is a simultaneous plural touch position determination unit, and 25 is an external interface unit for exchanging touch positions and control commands with the outside.

The outer dimensions of the touch plate 1 are about 230 mm in width and about 156 mm in height; the spacing between the demultiplexers 12 is about 1.5 mm, and there are about 120 demultiplexers 12 horizontally and about 70 vertically.

Next, the touch panel 1 will be described. The touch panel 1 uses surface acoustic waves (SAW); that is, in detecting a touch position on the touch panel 1, it utilizes the phenomenon that SAW energy is absorbed by an object (finger, stylus) touching the panel 1.

The touch panel 1 consists of cylindrical, spherical, or flat glass 15 (general-purpose plate glass, about 3 mm thick, transmittance 92%) shaped to fit the surface of a display panel (CRT, LCD, EL, gas plasma, etc.), on which the reflection arrays (the demultiplexer groups 12 and the concentrator groups 13, provided at intervals of about 1.5 mm) are printed in low-melting-point glass powder (frit), and to which the wave transmitters 11 and wave receivers 14 are bonded, one at each end of each array, four in total.

The structures of the wave transmitter 11 and the wave receiver 14 are shown in FIG. 19, which shows the state in which the wave transmitter 11 and the like are attached to the glass 15. The wave transmitter 11 and the like each comprise a piezoelectric element 111 and a prism 112.

Surface acoustic waves are represented mathematically as a combination of inhomogeneous longitudinal waves (L waves) and transverse waves (S waves); those with frequencies in the several-MHz region are used here. The most efficient and convenient way to generate surface waves is by mode conversion of longitudinal waves. In this phenomenon, when the longitudinal wave generated by the piezoelectric element 111 in the first solid (the prism 112) strikes the joint surface 113 with the second solid (the glass 15), and the incident angle α 114 is made appropriate, the longitudinal wave energy is totally reflected at the interface, no energy enters the second solid by refraction, and only a surface acoustic wave is generated in the second solid.

The prism angle at which the surface wave is generated most efficiently on the surface of the glass 15 is given by the following equation: V_L = V_S sin α, where V_L is the velocity of the longitudinal wave in the prism 112 and V_S is the velocity of the surface wave. To obtain the surface wave it is therefore necessary to select, for the prism 112, a material satisfying V_L < V_S. Acrylic resin is usually chosen as a material with the velocity required to generate surface waves on the glass 15.
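
As a worked illustration of this relation, the sketch below solves V_L = V_S sin α for the prism angle. The material velocities are assumed, typical values for acrylic and glass, not figures from the patent.

```python
import math

def prism_angle_deg(v_longitudinal: float, v_surface: float) -> float:
    """Incident angle alpha (degrees) from V_L = V_S * sin(alpha)."""
    if v_longitudinal >= v_surface:
        raise ValueError("mode conversion requires V_L < V_S")
    return math.degrees(math.asin(v_longitudinal / v_surface))

# Assumed velocities: longitudinal wave in acrylic ~2730 m/s,
# surface wave on glass ~3100 m/s.
print(f"prism angle = {prism_angle_deg(2730.0, 3100.0):.1f} deg")  # ~61.7 deg
```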

The surface wave can also propagate along a curved surface; if the radius of curvature is large compared with the wavelength, there is almost no change in attenuation or velocity.

Next, the configuration of the touch position detecting section 2 will be described.

The wave transmitter control units 21x and 21y each comprise a pulse wave amplifier that generates and amplifies a 5.53 MHz RF signal, and a timing circuit that receives the clock signal 26 from the external interface unit 25, adjusts the timing so that the output timings of the wave transmitter 11x and the wave transmitter 11y do not overlap, and outputs a signal several μs wide (called a pulse wave). Besides preventing the output timings of the wave transmitters 11x and 11y from overlapping, it is also necessary to prevent the pulse waves output from the wave transmitters 11x and 11y from overlapping at the wave receivers 14x and 14y. The pulse waves are therefore output from the transmitter control units 21x and 21y every few ms, after each preceding pulse wave has been received at the receivers 14x and 14y. For example, the wave transmitter control unit 21x outputs a pulse wave at time 0, the wave transmitter control unit 21y outputs one at time 0.5 ms, the wave transmitter control unit 21x outputs its next pulse wave at time 1.5 ms, and the wave transmitter control unit 21y its next at time 2.0 ms.
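
A minimal sketch of this alternating schedule, using the 0.5 ms X/Y offset and 1.5 ms cycle of the example above (the generator itself is illustrative, not part of the patent):

```python
def pulse_schedule(cycle_ms=1.5, xy_offset_ms=0.5, n_cycles=3):
    """Yield (time_ms, axis) pairs so X and Y pulse waves never overlap."""
    for i in range(n_cycles):
        yield i * cycle_ms, "x"                 # 21x fires at 0, 1.5, 3.0 ms ...
        yield i * cycle_ms + xy_offset_ms, "y"  # 21y fires at 0.5, 2.0, 3.5 ms ...

for t, axis in pulse_schedule():
    print(f"{t:4.1f} ms: transmitter 11{axis} outputs a pulse wave")
```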

The wave receiver control units 22x and 22y each comprise an RF amplifier that strengthens the RF signal received from the wave receivers 14x and 14y, an AM detector that obtains the envelope of the amplified signal, and an AGC (Automatic Gain Control) amplifier that amplifies the output signal of the AM detector.

The AM detector removes the high-frequency components contained in the received signal. The AGC amplifier performs AGC amplification using the signal contained in the first part of the received signal as a reference signal. The reference signal corresponds to the paths 16 and 17 shown in FIG. 1. If the places along the paths 16 and 17 were touched, the reference signal would be attenuated and could no longer serve as a reference; the paths 16 and 17 are therefore located at the peripheral portion of the touch panel so as not to be exposed.

The touch position/press detection units 23x and 23y each comprise an analog/digital converter that samples, every 30 ms (an interval determined by the time the touch position/press detection units 23x, 23y need for processing), the analog waveform sent from the AGC amplifier and converts it into a digital waveform; a comparator that compares the obtained digital waveform with a stored digital waveform taken when there is no touch (the reference waveform), confirms the presence or absence of a touch, and determines the touch position; and a memory (ROM) that stores the reference waveform. The converted waveforms and the reference waveforms are compared over about 90 points for the X coordinate and about 140 points for the Y coordinate, from the reference times t1, t2, t3, t4 in FIG. 2 (the times at which the amplitude rises to half the reference amplitude level L) to the times at which the amplitude falls back to half the reference amplitude level L. Note that the times measured from t1, t2, t3, t4 in FIG. 2 are proportional to the X coordinate value or the Y coordinate value of the touch position.
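
The sketch below illustrates, under assumed sampling parameters, how the arrival time of an attenuation dip in the digitized waveform maps to a coordinate: the dip's delay past the reference time is taken as proportional to the coordinate, as stated above. All names and constants are illustrative.

```python
def time_to_coordinate(dip_time_us, ref_time_us, sweep_time_us, panel_length_mm):
    """Map a dip's arrival time to a coordinate along one axis.

    The delay past the reference time (t1..t4 in FIG. 2) is assumed
    proportional to the touch coordinate on that axis.
    """
    fraction = (dip_time_us - ref_time_us) / sweep_time_us
    return fraction * panel_length_mm

# Hypothetical numbers: a dip 40% of the way through a 100 us sweep
# on a 230 mm wide panel lies at about x = 92 mm.
print(time_to_coordinate(dip_time_us=60.0, ref_time_us=20.0,
                         sweep_time_us=100.0, panel_length_mm=230.0))
```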

Next, the operation of the touch panel 1 will be described. The X and Y coordinates of a touch position are detected by utilizing the phenomenon that ultrasonic energy is absorbed by an object (finger, stylus) touching the panel 1. With one transmitting wave transmitter 11 and one receiving wave receiver 14 per axis (four in total) and the reflection arrays 12 and 13 (melted frit) printed around the glass 15, ultrasonic energy is transmitted across the surface of the glass 15 and received.

When the touch panel 1 is touched with a finger, ultrasonic waves are absorbed at the contact point; the attenuated signal is captured and the X and Y coordinates are calculated. FIG. 2 shows the signal waveforms: FIG. 2A shows the signal for detecting the X coordinate, and FIG. 2B the signal for detecting the Y coordinate.

The piezoelectric element 111 is mounted on a small prism 112 made of acrylic resin, and the prism 112 is in turn bonded to the surface of the glass 15 with epoxy resin. The crystal oscillator in the transmitter control unit 21 of the touch position detection unit 2 runs at 5.53 MHz; when its electric oscillation is applied to the piezoelectric element 111 for several μs, a longitudinal wave (bulk wave) is generated in the acrylic prism 112. With the shape and mounting angle of the prism 112 kept appropriate, the bulk wave is converted into a surface wave at the boundary surface 113 between the prism 112 and the glass 15 and propagates along the reflection arrays 12 and 13.

The reflection arrays 12 and 13 are formed by printing frit about 0.05 mm high on the surface of the glass 15 and melting it. The reflection arrays 12 and 13, the wave transmitters 11, and the wave receivers 14 occupy a width of about 12 mm and are arranged behind the bezel of a CRT or the like so as not to be seen from the outside.

As the surface wave W1x generated by the X-axis transmitter 11x travels along the reflection array (demultiplexer group) 12x, each element of the array 12x, arranged in parallel at 45°, reflects a small amount of its energy. W1x is shown in FIG. 2A; it is the signal waveform immediately after output from the transmitter 11x. The reflected surface wave W2x traverses the glass 15 and is reflected again by another reflection array (concentrator group) 13x at the other end of the panel. The re-reflected surface wave W3x is sent to the wave receiver 14x at the lower right end of the glass 15; the waveform W3x immediately before input to the wave receiver 14x is shown in FIG. 2A. The wave receiver 14x reconverts the surface wave energy into an electric signal. The signal W1x, several μs long as shown in FIG. 2A, arrives at the wave receiver 14x as a signal W3x stretched out in time. When the panel 1 is touched, energy is absorbed at that location, and a depression appears in the received waveform W3x at a time corresponding one-to-one to the touch position. The depressions in FIG. 2A correspond to Ta and Tb in FIG. 1.

As the surface wave W1y generated by the Y-axis transmitter 11y travels along the reflection array (demultiplexer group) 12y, each element of the array 12y, arranged in parallel at 45°, reflects a small amount of its energy. W1y is shown in FIG. 2B; it is the signal waveform immediately after output from the transmitter 11y. The reflected surface wave W2y traverses the glass 15 and is reflected again by another reflection array (concentrator group) 13y at the other end of the panel. The re-reflected surface wave W3y is sent to the wave receiver 14y at the lower right end of the glass 15; the waveform W3y immediately before input to the wave receiver 14y is shown in FIG. 2B. The wave receiver 14y reconverts the surface wave energy into an electric signal. The signal W1y, several μs long as shown in FIG. 2B, arrives at the receiver 14y as a signal W3y stretched out in time. When the panel 1 is touched, energy is absorbed at that location, and a depression appears in the received waveform W3y at a time corresponding one-to-one to the touch position. The depressions in FIG. 2B correspond to Ta and Tb in FIG. 1.

As described above, a wave passes through the touch positions Ta and Tb, which lie at specific distances from the wave transmitters 11x and 11y, at times corresponding to their X or Y coordinates. These times are obtained in the touch position/press detection unit 23 of the touch position detection unit 2 by comparison with the signal waveform obtained when there is no touch, and the touch positions are then calculated. The above operation is performed alternately for X and Y to avoid errors due to surface wave interference. Taking the depths of the depressions in the signal waveform (Pa, Pb in FIG. 2) as the Z coordinate, the X, Y, and Z coordinate values are assembled and output from the touch position/press detection unit 23 to the simultaneous plural touch position determination unit 24 every 36 ms.

For the Z coordinate value, the pressure of the touch, i.e. the Z-axis value, is measured by taking advantage of the fact that the degree of sound-wave absorption is proportional to the contact area (which, for a finger or other flexible object, is a function of pressure).

Next, the operation of the touch position detector 2 will be described. The pulse waves output from the wave transmitter control units 21x, 21y arrive at the wave receiver control units 22x, 22y as signals attenuated according to the touch positions on the touch panel 1. In the receiver control units 22x and 22y, the envelopes of the signals shown in FIG. 2 are obtained.

The comparators of the touch position/press detection units 23x and 23y compare the signals from the touch panel 1 with the reference waveforms in the ROM.

To check whether there is a touch, the comparator examines the signals from the receiver control units 22x, 22y. The presence or absence of a touch is determined by whether the received amplitude has been attenuated beyond a threshold level (usually 10% of the amplitude of the no-touch reference waveform). The positions where attenuation occurs are the touch positions on the X and Y axes.

When the X- and Y-axis coordinates are determined, the Z coordinate is also determined. The Z coordinate is determined from the attenuation amounts (Pa, Pb) of the signal at the positions where attenuation is confirmed. The amount of attenuation depends on the contact area at the touch location; for a soft object such as a finger, the contact area between the glass 15 and the finger is proportional to the pressure with which the finger presses the glass. Since the signal attenuation amounts (Pa, Pb) can therefore be regarded as representing pressing, Pa and Pb are also referred to below as pressings. The Z coordinate is expressed in 16 steps.
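
A minimal sketch of the comparator logic described above: dips are flagged where the received amplitude falls more than 10% below the no-touch reference waveform, and each dip's depth (the pressing, i.e. the Z coordinate) is quantized to 16 steps. The array contents and helper names are illustrative.

```python
def detect_touches(waveform, reference, threshold_ratio=0.10, z_steps=16):
    """Return (sample_index, z_level) for each attenuation dip."""
    touches, in_dip, start = [], False, 0
    for i, (w, r) in enumerate(zip(waveform, reference)):
        attenuated = r > 0 and (r - w) / r > threshold_ratio
        if attenuated and not in_dip:
            in_dip, start = True, i
        elif not attenuated and in_dip:
            in_dip = False
            center = start + (i - start) // 2                # dip position -> X or Y
            depth = max((reference[j] - waveform[j]) / reference[j]
                        for j in range(start, i))            # relative attenuation
            touches.append((center, min(int(depth * z_steps), z_steps - 1)))
    return touches

ref = [10.0] * 12
sig = [10.0, 10.0, 7.0, 6.0, 7.0, 10.0, 10.0, 9.0, 5.0, 9.0, 10.0, 10.0]
print(detect_touches(sig, ref))  # two dips -> [(3, 6), (8, 8)]
```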

To allow for changes in the surface of the touch panel 1 over time, the reference waveform is periodically updated to the signal waveform obtained from the touch panel 1 when there is no touch. The reference waveform is also updated when dirt on the surface of the panel 1 is detected. Whether the panel is dirty is judged by whether there is a touch that has remained stationary for a fixed time (for example, 2.5 s), or whether two or more simultaneous touches persist for a fixed time or longer.

Next, the operation of the simultaneous plural touch position determination section 24 will be described. The signal waves W2x and W2y passing through the touch positions Ta and Tb shown in FIG. 1 are attenuated according to the touch pressings Pa and Pb. The wave receivers 14x and 14y therefore receive signal waves attenuated according to the touch positions Ta, Tb and the pressings Pa, Pb. FIG. 3 shows the processing result of the simultaneous plural touch position determination unit 24.

As shown in the figure, it is detected from the attenuation positions and attenuation levels of the signal waves that X position X1 is pressed with Pa, X position X2 with Pb, Y position Y1 with Pa, and Y position Y2 with Pb. Receiving this, the simultaneous plural touch position determination unit 24 detects the touch position Ta (X1, Y1) and the touch position Tb (X2, Y2) by determining the combinations of the X detection positions X1, X2, the Y detection positions Y1, Y2, and the pressings Pa, Pb. Specifically, from the four possible candidate positions it finds the combinations in which the pressing level (Z coordinate) at the X position equals the pressing level at the Y position. The touch position at the X position (the center position of the touch) and that at the Y position can thereby be associated with each other, and the touch positions (X, Y) detected.
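
A minimal sketch of this pairing step, assuming each detected position carries a quantized pressing level (Z): X-side and Y-side candidates whose levels match are paired. Function and variable names are illustrative, and real inputs would call for a tolerance rather than exact equality.

```python
def pair_touches(x_detections, y_detections):
    """Pair X and Y detections whose pressing levels (Z) match.

    x_detections, y_detections: lists of (coordinate, press_level).
    Returns a list of (x, y, press_level) touch points.
    """
    touches, used_y = [], set()
    for x, zx in x_detections:
        for j, (y, zy) in enumerate(y_detections):
            if j not in used_y and zx == zy:  # match by position-independent feature
                touches.append((x, y, zx))
                used_y.add(j)
                break
    return touches

# Two simultaneous touches: Ta pressed with level 5, Tb with level 9.
print(pair_touches([(10, 5), (42, 9)], [(70, 9), (33, 5)]))
# -> [(10, 33, 5), (42, 70, 9)]
```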

The detected coordinate values of the touch positions, together with the number of simultaneous touch positions, are sent to the outside over the signal line 27 through the external interface section 25. Communication with the outside is performed over RS-232C or the like.

In this embodiment, simultaneous multiple touch detection by determining the combinations of the X and Y touch positions X1, X2, Y1, Y2 and the pressings Pa, Pb at the respective positions has been described, but the method is not limited to combining the touch position with the pressing. The touch position may be combined with one or more other elements: for example, a combination of the touch position and the touched area, or of the touch position, the pressing, and the area, is conceivable. Here, the area means the area of a circle whose diameter is Ha or Hb in FIG. 2.

According to this embodiment, it is possible to realize the position detection when a plurality of touches are simultaneously made with a finger, a palm, a pen or the like, so that the function of the touch position detecting device can be improved.

Further, according to this embodiment, since the touch plate 1 can be realized with transparent glass, the image quality deterioration that occurs in the above-mentioned conventional device when the touch plate and the display means are integrated can be prevented. The conventional pressure-sensitive touch plate is a stack of two transparent sheets each coated over its entire surface with a thin transparent electrode. The transparent electrodes have a certain degree of transparency but are inferior in transparency to the glass of the surface acoustic wave type touch plate; hence, when the touch plate and the display means are integrated, a pressure-sensitive display is harder to see than a surface acoustic wave one.

The conventional digital system also has the drawback that the number of control lines becomes huge. Specifically, in the pressure-sensitive sensor plate the X/Y-axis sensing lines on the plate serve directly as its control lines, so even a sensing-line matrix of only about one tenth of a 640 × 480 medium-definition display matrix requires as many as 112 control lines.

In the surface acoustic wave type, by contrast, eight control lines in total suffice: two each for the X- and Y-axis transmitters and receivers. In the surface acoustic wave type sensor plate, the X/Y-axis sensing lines do not directly serve as the control lines of the plate; the sensing lines are created by splitting the signal wave emitted by the wave transmitter with the demultiplexers and are merged back into one by the concentrators before being received by the wave receiver. The number of control lines is therefore independent of the sensing-line matrix: two each for the X- and Y-axis transmitters and receivers, eight in total.

This device is also resistant to scratches and the like, because scratches and foreign matter on the touch panel are treated in the same way as stains on the panel: a new reference waveform is simply captured by the touch position/press detection unit.

The advantages of this device can be summarized as follows. (1) Excellent transparency. (2) The glass material is durable. (3) No brittle conductive film or film is used. (4) Since the structure of the touch panel is simple, it can be easily attached to the display. (5) Since the touch panel can be curved, it can be attached directly to the display. (6) Non-glare processing is also possible. (7) Panel manufacturing cost is low.

A simultaneous plural touch position detecting device using the two-dimensional photosensor array type touch plate of the second embodiment of the present invention will be described with reference to FIGS. 4 to 6.

The structure of the simultaneous plural touch position detecting device will be described with reference to FIG. 4.

FIG. 4 shows a system configuration of the simultaneous multiple touch position detecting device.

In the figure, 3 is a two-dimensional photosensor array type touch plate touched with a finger or the like, and 4 is a touch position detecting section. Tc and Td indicate two positions touched simultaneously with fingers or the like. Further, 35 is glass; 31x and 31y are light emitter groups of light-emitting diodes for X and Y position detection; and 32x and 32y are light receiver groups for X and Y position detection, each consisting of photodiodes, phototransistors, or photoconductive cells for detecting light. As shown in the figure, the light emitter groups 31x, 31y and the light receiver groups 32x, 32y are installed along the sides of the touch surface of the touch plate 3 so that the optical axes of the corresponding light emitters 31x, 31y and light receivers 32x, 32y coincide. Also, 41x and 41y are light emitter control units that control the light emission of the X and Y position detecting light emitters 31x, 31y; 42x and 42y are light receiver control units that receive and amplify the electric signals from the X and Y position detecting light receivers 32x, 32y; 43x and 43y are touch position/width detection units for X and Y position detection; 44 is a simultaneous plural touch position determination unit; and 45 is an external interface unit for exchanging touch positions and control commands with the outside.

The light emitter control unit 41x is individually connected to each emitter of the group 31x and can make each emit light individually; the light emitter control unit 41y is likewise individually connected to each emitter of the group 31y. The light receiver control unit 42x is individually connected to each receiver of the group 32x and can individually take in each signal received by the group 32x and converted into an electric signal; the light receiver control unit 42y is likewise individually connected to each receiver of the group 32y.

Based on the clock signal 46 from the external interface unit 45, the light emitter control units 41x and 41y control the light emitter groups 31x and 31y so that the two groups emit light alternately; the light receiver groups 32x and 32y therefore do not interfere with each other. Further, the light emitter control unit 41x causes the emitters of the group 31x to emit light sequentially so that no interference occurs among the receivers of the group 32x. The light emitter control unit 41y performs similar control.

Based on the clock signal 46 from the external interface unit 45, the light receiver control units 42x and 42y take in the signals from the light receiver groups 32x and 32y only while the corresponding emitters of the light emitter groups 31x and 31y are emitting, in step with their emission timings. Interference between the light receiver groups 32x and 32y, and among the receivers within each group, is thus prevented.

The state of the light received by the light receiver groups 32x and 32y will be described with reference to FIG. 5.

FIG. 5 shows the state of the light received by the light receiver groups 32x and 32y. In FIG. 5, for example, when the light entering the light receiver 32x at position X1 is shielded, light is blocked over a region of width h corresponding to the interval h between the light receivers 32.

As shown in the figure, the signal lights Lx1 to Lx5 and Ly1 to Ly5 from the light emitters 31x1 to 31x5 and 31y1 to 31y5 are received by the opposing light receivers 32x1 to 32x5 and 32y1 to 32y5, respectively.

Here, the signal lights Lx1 to Lx3 and Ly1 to Ly3 passing through touch position Tc, and Lx5 and Ly5 passing through touch position Td, are shielded. The light receivers 32x1 to 32x5 and 32y1 to 32y5 therefore receive signal light shielded according to the touch positions Tc, Td and the touch widths Wc, Wd.

The processing flow of the touch position detector 4 will be described with reference to FIG. 6.

FIG. 6 shows the processing result of each part of the touch position detecting section 4.

As shown in the figure, the touch position/width detection units 43x and 43y receive the signals of the signal lights Lx1 to Lx5 and Ly1 to Ly5 from the light receivers 32x1 to 32x5 and 32y1 to 32y5, respectively, and determine which of the receivers 32x1 to 32x5 and 32y1 to 32y5 are shielded from light. As a result, an object with width Wc at position X2 and an object with width Wd at position X5 are detected, and likewise an object with width Wc at position Y2 and an object with width Wd at position Y5. The width is calculated as (the number of continuously shielded light receivers 32) × h. The positions X2 and Y2 are determined from the width Wc, and the positions X5 and Y5 from the width Wd, each as the position of the light receiver 32 at the center of the respective width; when an even number of receivers 32 are shielded, the position is taken as the midpoint between the two receivers 32 at the center.
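
A small sketch of this width and center calculation under the stated rules (width = number of contiguous shielded receivers × h; center = middle receiver, or the midpoint of the middle pair when the count is even). Representing the receivers as a boolean array is an assumption for illustration.

```python
def detect_objects(shielded, pitch_h):
    """Find (center_position, width) for each run of shielded receivers.

    shielded: list of bools, one per receiver, True if its light is blocked.
    pitch_h:  spacing h between adjacent receivers.
    """
    objects, i = [], 0
    while i < len(shielded):
        if shielded[i]:
            start = i
            while i < len(shielded) and shielded[i]:
                i += 1
            # Mean of first and last shielded index: middle receiver for an
            # odd count, midpoint of the middle pair for an even count.
            center = (start + i - 1) / 2 * pitch_h
            objects.append((center, (i - start) * pitch_h))
        else:
            i += 1
    return objects

# Receivers 1-3 shielded (touch Tc) and receiver 5 shielded (touch Td), h = 3 mm.
print(detect_objects([True, True, True, False, True], pitch_h=3.0))
# -> [(3.0, 9.0), (12.0, 3.0)]
```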

Having received the width and position information, the simultaneous plural touch position determination unit 44 detects touch position Tc (X2, Y2) and touch position Td (X5, Y5) by determining the combinations of the X detection positions X2, X5, the Y detection positions Y2, Y5, and the widths Wc, Wd. Specifically, from the four possible candidate positions ((X2, Y2), (X2, Y5), (X5, Y2), (X5, Y5)) it finds the combinations in which the touch width at the Y position equals the touch width at the X position; the two candidates (X2, Y2) and (X5, Y5) satisfy this condition and are selected. In this way, the touch position at the X position (the center position of the touch) and that at the Y position can be associated with each other, and the touch positions (X, Y) detected.

In this embodiment, the emitters of the group 31 and the receivers of the group 32 are operated sequentially in a time-division manner, but the invention is not limited to this. If a lens is attached to each emitter of the group 31 so that the emitted light is a narrow parallel beam, interference among the receivers of the group 32 can be prevented; it is then unnecessary to operate the emitters and receivers in a time-division manner, and they can be operated simultaneously. The time-division control circuits in the light emitter control unit 41 and the light receiver control unit 42 are then not required, so the configurations of both units can be simplified.

Further, in the present embodiment, simultaneous multiple touch position detection with a 5 × 5 matrix of five light-emitting and five light-receiving elements per axis has been described, but the matrix configuration is not limited to this. As a concrete product, one with external dimensions of about 230 mm in width and about 156 mm in height, a spacing between light emitters of about 3 mm, and about 60 emitters and receivers horizontally and about 35 vertically is also conceivable.

According to this embodiment, it is possible to detect the touch position when a plurality of touches are made at the same time with a finger, a palm, a pen or the like, so that the function of the touch position detecting device can be improved.

Further, according to this embodiment, since the touch plate 3 can be realized as a frame with an empty interior, the image quality deterioration that would occur in the conventional device when the touch plate and the display means are integrated can be prevented. The conventional pressure-sensitive touch plate is a stack of two transparent sheets coated with thin transparent electrodes; although the transparent electrodes have a certain degree of transparency, they are inferior to a two-dimensional photosensor array type touch plate, in which nothing inside the frame blocks the light. Hence, when the touch plate and the display unit are integrated, a pressure-sensitive display is harder to see than a two-dimensional photosensor array one.

A simultaneous plural touch position detecting device using a two-dimensional video camera type touch plate according to the third embodiment of the present invention will be described with reference to FIGS. 7 and 8.

The configuration of the simultaneous plural touch position detecting device will be described with reference to FIG. 7.

FIG. 7 shows a schematic system configuration of a simultaneous multiple touch position detecting device.

In the figure, reference numeral 5 is a two-dimensional video camera type touch plate touched with a finger or the like, and 6 is a touch position detecting section. Te and Tf indicate two positions touched simultaneously with fingers or the like. Reference numeral 52 is glass; 51x and 51y are video cameras for X and Y position detection, the camera 51x photographing the range in the X direction and the camera 51y the range in the Y direction. As shown in the figure, the cameras 51x and 51y are installed at the sides of the touch surface of the touch plate 5. 53x and 53y are partitions of height H provided facing the cameras 51x and 51y. The partitions 53x, 53y are provided so that the cameras 51x, 51y do not capture images behind them, which makes it easy for the touch position/width detection units 62x, 62y to detect the position of a finger or the like. The surfaces of the partitions 53x, 53y facing the cameras 51x, 51y are white, to increase the contrast with a finger so that its position is easily recognized.

Reference numerals 61x and 61y denote camera control units that control image pickup by the X and Y position detecting cameras; 62x and 62y are touch position/width detection units for X and Y position detection; 63 is a simultaneous plural touch position determination unit; and 64 is an external interface unit for exchanging touch positions and control commands with the outside.

Every three frames, the camera control unit 61 converts one frame of the video signal from the camera 51 into a digital signal pixel by pixel, stores it in RAM, and outputs the stored data to the touch position/width detection units 62x and 62y.

A method of processing the video imaged by the cameras 51x and 51y will be described with reference to FIG. 8.

FIG. 8 shows an image taken by the cameras 51x and 51y.

As shown in the figure, there is a finger at each of the touch positions Te and Tf. The cameras 51x and 51y therefore each image two fingers, as the videos Vx and Vy (images of height H). Although the full video captured by each camera extends higher than the videos Vx and Vy, the touch position/width detection units 62x, 62y receive from the camera control units 61x, 61y the pixel data of only the Vx and Vy portions shown in the figure.

Having received the videos Vx and Vy taken by the cameras 51x and 51y, the touch position/width detection units 62x and 62y detect, from the position and width of each finger, the width We at position X1 and the width Wf at position X2, and the width We at position Y1 and the width Wf at position Y2, respectively. To do so, the touch position/width detection units 62x and 62y recognize fingers from the pixel data on the scanning line at height h (a constant) above the surface of the glass 52 in the pixel data from the camera control units 61x and 61y. For recognition, the brightness of the pixel data on this scanning line is examined, and pixels whose brightness is lower than the white background by more than a threshold are taken to represent a finger. The width of the finger is obtained from the number of such consecutive pixels, and the X or Y coordinate of the finger is determined from the center of that width.
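
A minimal sketch of this scan-line recognition, assuming 8-bit grayscale pixels and an illustrative darkness threshold; runs of dark pixels become (center, width) pairs as described above.

```python
def fingers_on_scanline(pixels, background=255, threshold=60, mm_per_pixel=0.5):
    """Find (center_mm, width_mm) for each dark run on one scan line.

    pixels: brightness values (0-255) along the scan line at height h.
    A pixel darker than the white background by more than `threshold`
    is taken to belong to a finger.
    """
    fingers, i = [], 0
    while i < len(pixels):
        if background - pixels[i] > threshold:
            start = i
            while i < len(pixels) and background - pixels[i] > threshold:
                i += 1
            center = (start + i - 1) / 2 * mm_per_pixel   # center of the run
            fingers.append((center, (i - start) * mm_per_pixel))
        else:
            i += 1
    return fingers

line = [255] * 4 + [90] * 6 + [255] * 5 + [80] * 4 + [255] * 3
print(fingers_on_scanline(line))  # two fingers -> [(3.25, 3.0), (8.25, 2.0)]
```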

Having received the coordinate values and width information of the finger positions, the simultaneous plural touch position determination section 63 detects touch position Te (X1, Y1) and touch position Tf (X2, Y2) by determining the combinations of the X detection positions X1, X2, the Y detection positions Y1, Y2, and the widths We, Wf, as in the second embodiment.

In this embodiment, simultaneous multiple touch detection by determining the combinations of the X and Y touch positions X1, X2, Y1, Y2 and the widths We, Wf at the respective positions has been described, but the invention is not limited to combining the touch position with the width; the determination may also take the shape and color of the finger into consideration.

Further, although the cameras 51x and 51y have been described as installed at the sides of the touch surface of the touch plate 5, the installation positions are not limited to this; they may be separated from the sides of the touch plate 5.

Further, the camera control unit 61 may instruct the camera 51 to change its focal length sequentially and photograph at each focal length. In that case, by taking the focal length information into account when determining the width and position of the finger, the position of a pixel on the scan line can be converted into the position of the finger.

Although an ordinary video camera can be used, a thin camera that is long in the horizontal direction and short in the height direction is preferable, since it allows the whole touch panel to be made smaller.

According to the present embodiment, it is possible to simultaneously detect a plurality of touched positions of fingers, palms, pens, etc., so that the function of the touched position detecting device can be improved.

Further, according to the present embodiment, the touch plate 5 can be realized as a frame with an empty interior, so deterioration of image quality in an integrated display can be prevented.

In the first to third embodiments of simultaneous plural touch position detecting devices described above, the detection of two touch positions when two places are touched at the same time has been described, but the same applies when three or more places are touched. As for the implement used to touch a plurality of positions simultaneously, fingers were described, but a finger, a palm, a pen, or the like may be used, and several kinds of implements such as fingers, palms, and pens may be mixed.

A simultaneous multiple touch instruction processing device according to a fourth embodiment of the present invention will be described below.

First, a simultaneous plural touch instruction processing device in which a simultaneous plural touch position detecting device using a surface acoustic wave type touch plate is integrated with a display device will be described with reference to FIG.

FIG. 9 shows the configuration of the simultaneous multiple touch instruction processing device.

In the figure, 7 is a simultaneous plural touch position detecting device using a surface acoustic wave type touch plate, 8 is a display device using a liquid crystal display, 9 is an information processing device, and 10 is a speaker. The simultaneous plural touch position detecting device 7 outputs the number of detected touch positions and their X and Y coordinates to the information processing device 9. Further, 91 is an MPU that performs information processing; 92 is a ROM memory storing information processing programs for the word processor, piano, soccer, and end processing described later; 94 is a RAM memory; 93 is a voice output unit; 96 is an interface unit; and 95 is a bus. Here, as shown in the figure, the simultaneous plural touch position detecting device 7 and the display device 8 are integrated. The speaker 10 outputs a warning sound or music according to the signal output from the voice output unit 93 on instructions from the MPU 91.

In the fourth embodiment of the present invention described below, when the power is turned on, the menu screen shown in FIG. 20 is displayed, and one of word processor, piano (virtual piano), soccer, and end can be selected. The screen of FIG. 20 is displayed on the display device 8 integrated with the simultaneous plural touch position detecting device 7 according to the present invention, and a selection is made by touching the screen.

Then, in each selected process, the MPU 91, receiving the information on a plurality of simultaneously touched positions from the simultaneous plural touch position detecting device 7, performs information processing corresponding to simultaneous multiple touch operations according to the information processing programs stored in the memories 92 and 94.

Information processing of the word processor using a virtual keyboard in the fourth embodiment of the present invention will be described with reference to FIGS. 10 to 12. In the following, information processing for inputting or editing text is described; the simultaneous plural touch instruction processing device of FIG. 9 is used as a word processor. In this word processor, a virtual keyboard is displayed on the display device, input and editing operations are performed on this virtual keyboard, and the information processing device 9 inputs and edits the text according to those operations.

The information processing display screen on the display device 8 integrated with the simultaneous plural touch position detecting device 7 will be described with reference to FIG. 10.

FIG. 10 shows the display screen of the word processor using the virtual keyboard. To display this screen, the user selects the word processor shown as one of the items on the menu screen of FIG. 20, displayed after the power of the information processing device is turned on; the virtual keyboard and the text are then displayed on the screen.

In the figure, 811 is the virtual keyboard and 812 is the text. The area displaying the virtual keyboard 811 and the area displaying the text 812 are laid out as shown in the figure. Since it is later necessary to know whether a touch position output from the touch position detecting device 7 falls within these areas, the memory 92 holds a table 923 representing these areas in the coordinate system of the touch position detecting device 7.
FIG. 10 shows a state in which the text 812 has already been input.

When a touch is made on the virtual keyboard, the touch position is converted into a key code identifying the key of the virtual keyboard 811 within the information processing device. For this conversion, the memory 92 of FIG. 9 holds a correspondence table 921 of touch positions and key codes. Since the keys of the virtual keyboard 811 are displayed at a certain size, a touch position has a width according to that size, and the table 921 also holds, for each key code, information on the allowable width of the corresponding touch position. The memory 92 further holds a correspondence table 922 of key codes and character key codes, for converting key codes into the character key codes that identify characters or symbols within the information processing apparatus, and a table 924 for converting a character key code into a shift character key code when the shift key and a character key of the virtual keyboard 811 are pressed simultaneously.

The memory 92 also has a correspondence table 925 of touch positions and character pointing positions, for converting a touch position into a character position in the text 812 when the text 812 is touched. The character position is a line number and a column number in the displayed text 812, indicating the character at that line and column. Since each character of the text 812 is displayed at a certain size, the touch position has a width corresponding to that size, and the table 925 also holds information on the allowable width.
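
A sketch of how the tables 921, 922, 923, 924, and 925 might be organized; all concrete rectangles, codes, and names here are illustrative assumptions, not values from the patent.

```python
# Table 923: screen areas in touch-panel coordinates (x, y, width, height), in mm.
AREAS_923 = {"keyboard_811": (0, 120, 230, 36), "text_812": (0, 0, 230, 118)}

# Table 921: key code -> key rectangle (touch position plus allowable width).
KEYS_921 = {"KEY_A": (10, 130, 8, 8), "KEY_SHIFT": (2, 122, 12, 8)}

# Table 922: key code -> character key code; table 924: shifted variants.
CHARS_922 = {"KEY_A": "a"}
SHIFT_924 = {"a": "A"}

def in_rect(pos, rect):
    x, y, w, h = rect
    return x <= pos[0] < x + w and y <= pos[1] < y + h

def key_at(pos):
    """Convert a touch position to a key code via table 921 (None between keys)."""
    return next((k for k, r in KEYS_921.items() if in_rect(pos, r)), None)

touch = (13, 134)
if in_rect(touch, AREAS_923["keyboard_811"]):
    print(key_at(touch))  # -> KEY_A
```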

The flow of information processing for text input using the virtual keyboard 811 will be described with reference to FIG. 11.

FIG. 11 shows the part of the information processing for text input that detects whether the position of the shift key of the virtual keyboard 811 has been touched and performs the corresponding processing. In the following it is assumed that the information processing apparatus has been powered on, the menu has been displayed, and the word processor has been selected from the menu.

Step 1101: Since the word processor in the menu is selected, the MPU 9 displays the virtual keyboard 811 on the display device 8.

Step 1102: The MPU 9 acquires one or more touch positions from the simultaneous plural touch position detection device 7.

Step 1103: The MPU 9 judges from the table 923 whether these touch positions are on the virtual keyboard 811. If they are, the process proceeds to step 1104; if not, the process ends.

Step 1104: The MPU 9 uses the table 921 to convert each acquired touch position into a key code of the virtual keyboard 811. If a touch position does not correspond to the position of any key of the virtual keyboard 811, the touch was made outside the keys, so the process returns to step 1102 and the touch positions are acquired again.

Step 1105: The MPU 9 obtains the character key code from the key code using the table 922.

Step 1106: When the MPU 9 has acquired a plurality of touch positions, it determines whether one of the key codes is the shift key code corresponding to the shift key. If there is a shift key code, the process proceeds to step 1107; if not, to step 1108.

Step 1107: The MPU 9 converts the character key code into a shift character key code according to the table 924.

Step 1108: The MPU 9 outputs the character key code or the shift character key code to the display device 8. The display device displays the character designated by that code at the last position 8121 of the sentence 812.
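
The flow of steps 1102 to 1108 can be summarized in a short sketch. The following Python fragment is purely illustrative: the table dictionaries, the SHIFT_KEY_CODE constant, and the helper functions are hypothetical stand-ins for the tables 921, 922, and 924 and for the touch acquisition described above, not the actual implementation.

    # Illustrative sketch of the FIG. 11 flow (steps 1102-1108); all names are
    # hypothetical stand-ins for the tables and devices described in the text.
    TABLE_921 = {}  # (x, y, width, height) of a key -> key code (table 921)
    TABLE_922 = {}  # key code -> character key code (table 922)
    TABLE_924 = {}  # character key code -> shift character key code (table 924)
    SHIFT_KEY_CODE = "SHIFT"

    def key_code_at(pos):
        """Map a touch position to a key code, honoring each key's allowable width."""
        for (x, y, w, h), code in TABLE_921.items():
            if x <= pos[0] < x + w and y <= pos[1] < y + h:
                return code
        return None  # the touch fell between keys (step 1104: re-acquire)

    def process_keyboard_touches(positions):
        """positions: list of (x, y) touches reported at the same time."""
        codes = [key_code_at(p) for p in positions]
        if any(c is None for c in codes):
            return None                       # back to step 1102
        shifted = SHIFT_KEY_CODE in codes     # step 1106: shift touched too?
        chars = [TABLE_922[c] for c in codes if c != SHIFT_KEY_CODE]  # step 1105
        if shifted:
            chars = [TABLE_924[c] for c in chars]   # step 1107
        return chars                          # step 1108: characters to display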

Following the processing of FIG. 11, the process of determining whether a touch has been made within the area of the sentence 812, and the flow of processing when it is judged that such a touch has been made, will be described with reference to FIG. 12.

FIG. 12 shows the flow of processing relating to a touch within the area of the sentence 812. This processing is used when the target range of an editing operation such as copy, move, or delete is specified on the sentence 812. For example, when a plurality of characters are to be deleted, it is judged whether a range of characters to be deleted has been specified; if so, the range is determined and sent to the next stage of processing (the delete processing in this example). Note that the touch positions have already been acquired in step 1102 of FIG. 11, so FIG. 12 does not include a step for acquiring them.

Step 1201: It is judged using the table 923 whether all of the acquired touch positions are on the sentence 812. If they are, the process proceeds to step 1202; otherwise, the process ends.

Step 1202: The touch position is converted into the character position of the sentence 812 by the table 925.

Step 1203: If a plurality of touch positions were sent from the touch position detection device 7 and all of them are within the area of the sentence 812, there are two or more character positions (that is, a plurality of positions on the sentence 812 have been touched), and the process proceeds to step 1204; if not, the process ends.

Step 1204: The characters within the range enclosed by the two or more character positions are determined, and the process ends.
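
Steps 1201 to 1204 amount to mapping each touch to a (line, column) position and taking the enclosed span. A minimal Python sketch, with table_925 assumed to be a dictionary from coarse touch cells to character positions (a hypothetical stand-in for the table 925):

    # Hedged sketch of the FIG. 12 range designation (steps 1201-1204).
    def selected_range(touch_positions, table_925):
        """Return the (start, end) character positions enclosed by the touches,
        or None when the touches do not designate a range on the sentence."""
        chars = [table_925.get((int(x), int(y))) for x, y in touch_positions]
        if len(chars) < 2 or any(c is None for c in chars):
            return None                       # steps 1201/1203 fail
        return min(chars), max(chars)         # step 1204: enclosed span

    # (line, column) tuples sort in reading order, so min/max give the range
    # that the next stage (delete, copy, move, ...) then operates on.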

In this processing, combined processing with the shift key of the virtual keyboard 811 has been described, but the invention is not limited to the shift key; it can also be applied to other keys, for example combined processing with a control key. Likewise, range designation has been described for the sentence 812, but the target is not limited to text; a figure may also be targeted.

Further, the details of the information processing of the word processor itself, for example the deletion processing, have not been described here, but they can be realized by known techniques.

According to this processing, instructions given by simultaneous multiple touch operations performed with fingers, palms, or pens can be processed, so the operability of the touch instruction processing device can be improved. Here, an instruction by a simultaneous multiple touch operation means, for example, designating whether an input character is uppercase or lowercase by combining a character key with the shift key in FIG. 10. The device can, of course, also process instructions given by a single key.

In addition, according to the present processing, an operation such as range designation on the sentence 812, which would otherwise require a plurality of operations, can be realized with a single operation, so operability is improved in a word processor using the virtual keyboard 811.

Next, playing the piano using a virtual piano will be described with reference to FIGS. 13 and 14. In this processing, the simultaneous multiple touch instruction processing device of FIG. 9 is used as a piano: the virtual piano 821 and a musical score 822 are displayed on the display device, and the virtual piano 821 can be played by touch operations following the musical score 822. In accordance with these operations, the information processing device 9 sends a sound signal to the speaker via the audio output unit 93.

An information processing display screen in the display device 8 integrated with the simultaneous plural touch position detection device 7 will be described with reference to FIG.

FIG. 13 shows the display screen for playing the piano using the virtual piano. To display this screen, the user selects the piano, shown as one of the items in the menu screen displayed after the information processing apparatus is powered on; the virtual piano, musical score, and so on of FIG. 13 are then displayed.

In the figure, 821 is an area for displaying the virtual piano, 822 is an area for displaying the musical score, and 823 is an area for receiving the input that ends the performance. Since it is necessary, as described later, to know whether a touch position output by the touch position detection device 7 falls within these areas, the memory 92 has a table 923 that represents these areas in the coordinate system of the touch position detection device 7.

When a touch is made on the virtual piano 821, the touch position is converted into a key code that identifies the key of the virtual piano 821 within the information processing apparatus. The memory 92 of FIG. 9 has a correspondence table 926 of touch positions and key codes used for this conversion; some key codes correspond to the position 823 indicating the end of the performance. Since each key of the virtual piano 821 is displayed at a certain size, a touch position has a width corresponding to that size, and the table 926 therefore also holds, for each key code, the allowable width of the corresponding touch position. Further, the memory 92 has a correspondence table 927 of key codes and sound signal generation information, which is used by the audio output unit 93 to convert a key code into a sound signal.

The flow of information processing regarding the virtual piano 821 will be described with reference to FIG. 14.

FIG. 14 shows a flow of a multi-tone performance process of the virtual piano 821. In the following, it is assumed that the information processing apparatus is powered on, the menu is displayed, and the piano in the menu is selected.

Step 1401: Since the piano in the menu has been selected, the MPU 9 displays the virtual piano 821 on the display device 8.

Step 1402: The MPU 9 acquires one or a plurality of touch positions from the simultaneous plural touch position detecting device 7.

Step 1403: The MPU 9 judges from the table 923 whether or not at least one touch position is on the virtual piano 821. If so, the procedure proceeds to step 1404; if not, to step 1406.

Step 1404: The MPU 9 converts one or a plurality of touch positions into a key code of the virtual piano 821 corresponding to each touch position by using the table 926.

Step 1405: The MPU 9 outputs the one or more key codes to the audio output unit 93. To generate the sound corresponding to each key code, the audio output unit 93 obtains the sound signal generation information corresponding to that key code from the table 927 and generates a sound signal in which the one or more tones are synthesized. The generated sound signal is output to the speaker 10, and the speaker 10 outputs one or more tones according to the touch positions. The process then returns to step 1402. MIDI (Musical Instrument Digital Interface) data may be used as the sound signal generation information.

Step 1406: It is judged whether or not a touch position is on the position 823 indicating the end of the performance. If it is, the process is terminated; if not, the process returns to step 1402.
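
The polyphonic loop of steps 1402 to 1406 can be sketched as follows; table_926, table_927, the play callback, and end_area are hypothetical stand-ins for the tables 926 and 927, the audio output unit 93, and the end position 823:

    # Hedged sketch of one pass of the FIG. 14 flow (steps 1402-1406).
    def piano_step(touch_positions, table_926, table_927, play, end_area):
        if any(p in end_area for p in touch_positions):
            return False                      # step 1406: end the performance
        key_codes = [table_926[p] for p in touch_positions if p in table_926]
        if key_codes:                         # steps 1403-1404
            tones = [table_927[k] for k in key_codes]  # sound generation info
            play(tones)                       # step 1405: synthesize together
        return True                           # continue acquiring touches

Because every simultaneous touch contributes one key code, chords sound together rather than one note at a time.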

When the above processing ends, the display returns to the menu screen shown in FIG. 20.

In this processing, a single virtual piano 821 has been described, but the number of virtual pianos is not limited to one; a plurality of virtual pianos may be used, in which case several players can perform together. The processing can also be applied to musical instruments other than the piano.

According to this processing, combined or individual instructions given by simultaneous multiple touch operations with fingers, palms, pens, and the like can be realized, so the operability of the touch instruction processing device can be improved. Further, since this processing realizes multi-tone performance on the virtual piano 821, the operability of playing the piano with the virtual piano 821 can be improved.

Next, a competitive soccer game using virtual buttons will be described with reference to FIGS. 15 and 16. Here, a competitive soccer game using virtual buttons is a game in which play on the soccer field is controlled by touching the virtual buttons.

In this process, the simultaneous multiple touch instruction processing device of FIG. 9 is used as a video game. The display device 8 displays a soccer field 832, a plurality of players 839 and 840, and virtual buttons 8311, 8312, 8331, 8332, 8341, 8342, 8351, and 8352 for making these players kick, dribble, and so on, and the user operates these virtual buttons. In accordance with these operations, the information processing apparatus 9 causes the players 839 and 840 to kick, dribble, and so on, advancing the game.

The display screen of the display device 8 integrated with the simultaneous plural touch position detection device 7 will be described with reference to FIG. 15. To display this screen, the user selects soccer, shown as one of the items in the menu screen displayed after the information processing device is powered on, and the screen of FIG. 15 is displayed.

FIG. 15 shows a screen of a competitive soccer game using virtual buttons. In the figure, 832 is a soccer field, 836 is a goal area, 838 is a penalty area line, 837 is a halfway line, and 841 is a ball.

Reference numerals 8311 and 8312 denote start buttons for starting the game. When the start buttons 8311 and 8312 are touched during the game, the game ends.

Among the buttons described below, a button whose reference number ends in 1 operates the players 839 of the near-side team, and a button whose reference number ends in 2 operates the players 840 of the opposing team.

Reference numerals 8331 and 8332 are buttons for kicking or throwing. When the button 8331 or 8332 is touched, the MPU 9 determines from the state of play whether a kick or a throw is performed.

Reference numerals 8341 and 8342 are buttons for heading or dribbling. When the button 8341 or 8342 is touched, whether a header or a dribble is performed is determined by the height at which the incoming ball meets the player 839 or 840 (calculated by the MPU 9).

Reference numerals 8351 and 8352 are buttons for instructing the direction (forward, backward, left, or right) in which the ball is sent when kicking, throwing, heading, or dribbling; the four selectable directions are indicated by arrows. That is, 8351a and 8352a are forward, 8351b and 8352b are right, 8351c and 8352c are backward, and 8351d and 8352d are left. When two directions are touched at the same time, the MPU 9 interprets this as designating the intermediate direction; for example, when 8351a and 8351b are touched at the same time, the kick or the like is interpreted as being made diagonally forward and to the right. This direction is also the direction in which the player moves after making the kick or the like.

As for which player 839 or 840 performs the kick or the like when a button is operated, the MPU 9 judges which player of the near-side team and of the opposing team is at the nearest position, and that player performs the kick or the like according to the button.

The areas for displaying the virtual buttons such as the button 8311 are laid out as shown in the figure. Since it is necessary, as described later, to know whether a touch position output by the touch position detection device 7 falls within these areas, the memory 92 has a table 928 representing these areas in the coordinate system of the touch position detection device 7. The memory 92 also has a correspondence table 929 of touch positions and button codes, used to convert a touch position into a button code that identifies the button within the information processing device.

The processing when a virtual button is touched will be described with reference to FIG. 16.

FIG. 16 shows a flow of multiple input processing of the virtual button group 831. In the following, it is assumed that the information processing apparatus is powered on, a menu is displayed, and soccer in the menu is selected.

Step 1601: The MPU 9 causes the display device 8 to show the display of FIG. 15.

Step 1602: The MPU 9 acquires one or more touch positions from the simultaneous plural touch position detecting device 7.

Step 1603: The MPU 9 judges from the table 928 whether or not at least one touch position is on any of the virtual buttons. If it is on any of the virtual buttons, the process proceeds to step 1604. Otherwise, it proceeds to step 1602.

Step 1604: The MPU 9 converts the touch position on the virtual button into the button code of the virtual button using the table 929.

Step 1605: It is judged from the button codes whether the start button 8311 or 8312 is among the touched buttons. If it is, the process proceeds to step 1606; if not, to step 1607.

Step 1606: It is judged whether or not the game has already started. If it has, touching a start button means the game is to be ended, so the processing ends. If it has not, the game is started, and the process advances to step 1602.

Step 1607: The MPU 9 performs the processing described above for each button code corresponding to the buttons 8331, 8332, 8341, 8342, 8351, and 8352, determines the movements of the players 839 and 840 and the ball 841, and causes the display device 8 to display those movements.
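
Steps 1602 to 1607 reduce to converting each touched button into a button code and dispatching on it. A hedged Python sketch, in which table_929, the handler table, and the start-button codes are hypothetical stand-ins for the table 929 and the game logic:

    # Hedged sketch of one pass of the FIG. 16 flow (steps 1602-1607).
    def soccer_step(touch_positions, table_929, handlers, started):
        codes = [table_929[p] for p in touch_positions if p in table_929]
        if not codes:
            return started                    # step 1603: no button touched
        if any(c in ("START_1", "START_2") for c in codes):
            return not started                # steps 1605-1606: start or end
        for code in codes:                    # step 1607: kick, head, direction
            handlers[code]()
        return started

Because all simultaneous touches are dispatched, both players' button presses in the same sampling interval are honored.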

When the above processing ends, the screen returns to the menu screen of FIG. 20.

In this processing, the case where two users play the game has been described, but the present invention is not limited to this. To allow three or more people to play, the number of virtual buttons is simply increased according to the number of players.

According to this device, since multiple inputs of virtual buttons can be realized, the operability of a competitive soccer game using virtual buttons is improved.

Next, the power-off processing using the virtual power switch will be described with reference to FIGS. 17 and 18.

The display screen of the display device 8 integrated with the simultaneous plural touch position detection device 7 will be described with reference to FIG. 17.

FIG. 17 shows the information processing screen for turning the power on and off using the virtual power switch. To display this screen, the user selects the end item, shown as one of the items in the menu screen of FIG. 20 displayed after the information processing device is powered on, and the virtual power switch 841 of FIG. 17 is displayed.

Here, the power-off processing using the virtual power switch 841 means power-off performed by simultaneously touching two virtual power switches 841a and 841b. The reason why the power supply is cut off by using the two switches 841a and 841b is to prevent the power supply from being cut off by mistake.

Since it is necessary, as described later, to know whether a touch position output by the touch position detection device 7 falls within the area of the virtual power switch 841, the memory 92 has a table 930 representing this area in the coordinate system of the touch position detection device 7.

The flow of the power-off processing by the virtual power switch 841 will be described with reference to FIG. 18.

FIG. 18 shows the flow of power-off processing by the virtual power switch 841.

Step 1801: The virtual power switch 841 is displayed on the display device 8.

Step 1802: One or more touch positions are acquired from the simultaneous plural touch position detecting device 7.

Step 1803: It is determined whether all the acquired touch positions are on the virtual power switch 841 and at least one touch position exists on each of the virtual power switch 841a and the virtual power switch 841b. If this condition is met, the process proceeds to step 1804; otherwise, to step 1805.

Step 1804: The power is turned off, and the process ends.

Step 1805: The menu screen shown in FIG. 20 is displayed.
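
The condition of step 1803 is a simple conjunction over the touch positions. A hedged sketch, with area_841a and area_841b assumed to be disjoint sets of touch cells covering the two halves of the virtual power switch:

    # Hedged sketch of the FIG. 18 power-off condition (step 1803).
    def power_off_requested(touch_positions, area_841a, area_841b):
        on_a = [p for p in touch_positions if p in area_841a]
        on_b = [p for p in touch_positions if p in area_841b]
        all_on_switch = len(on_a) + len(on_b) >= len(touch_positions)
        return all_on_switch and bool(on_a) and bool(on_b)

    # Touching only 841a or only 841b fails the condition, which is exactly
    # the safeguard against accidental power-off described above.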

In this processing, the details of the information processing performed when the power is turned off, that is, the switching of the power itself, the battery check, and the saving of data that accompany turning the power on and off, have not been described, but they can be realized by known techniques.

Further, according to this processing, the power cannot be turned off unless the two virtual power switches 841 are pressed at the same time, so that careless operation can be prevented.

In the fourth embodiment described above, a configuration in which a simultaneous plural touch position detection device using a surface acoustic wave type touch plate is integrated with the display device has been described, but the touch plate method of the simultaneous plural touch position detection device is not limited to this; a touch plate using light emitting elements or a camera is, of course, also possible.

Further, the simultaneous plural touch position detecting device may be constructed independently from the display device.

The system configuration is not limited to this either; it may also be combined with an input device such as a keyboard, an auxiliary storage device such as a hard disk drive, and an output device such as a printer.

A simultaneous plural gesture instruction processing device according to the fifth embodiment of the present invention will be described below. This device makes it easy, when a displayed two-dimensional or three-dimensional figure in CAD is to be moved in parallel, rotated, or expanded or contracted, to give the instructions for the parallel movement, rotational movement, and expansion/contraction.

First, referring to Table 1, the basic concept of the simultaneous plural gesture instruction processing will be described.

[Table 1]

    Designation method         Parallel movement       Rotational movement    Expansion/contraction
    Within the outer contour   translate all objects   rotate all objects     move each object individually
    On the outer contour       translate the object    rotate the object      expand/contract the object
    Within the range           translate all objects   rotate all objects     expand/contract all objects

Table 1 shows the basic algorithm for simultaneous plural gesture instruction processing.

In the table, the vertical items indicate the methods of designating the simultaneous plural gesture objects, and the horizontal items indicate the instruction contents of the simultaneous plural gesture operations. The objects can be designated in three ways: designation within the outer contour, in which each of several fingers specifies one object by touching inside its contour; designation on the outer contour, in which several fingers specify one object by touching its contour (its sides); and designation within a range, in which several objects are specified by enclosing them within the area bounded by the fingers.

Here, when one or more touch positions are within the contour of one or more objects, the designation is judged to be within the outer contour. When all touch positions are on the outer contour of a single object, the designation is judged to be on the outer contour. When all touch positions are outside the contours of the target objects and the target objects lie inside all the touch positions, the designation is judged to be within the range. When the range is designated by two fingers, for example, the range can be a rectangle having the points designated by these two fingers as diagonal points; when designated by three or more fingers, it can be, for example, a polygon having the points designated by these fingers as vertices. In short, in designation within the outer contour, the object is specified by touching the inside of the figure with a finger, and in designation on the outer contour, it is specified by touching a side of the figure with a finger.

In designation on the outer contour with a finger, if the side is thin, the position of the finger may deviate slightly from the side. Therefore, even if a touch position deviates from the side, it is judged to be on the side as long as the deviation is small. For this purpose the present apparatus holds, as data, an allowable value for how far from a side a position may be while still being judged to be on that side. This allowable value is also used when judging whether a position is inside the outer contour in designation within the outer contour: a touch position that is more than the allowable value away from the sides and inside the outer contour is judged to be inside the outer contour. Furthermore, the same allowable value is used when judging whether positions are within a range in designation within a range: when the touch positions are at least the allowable value away from the sides, all of them are outside the contours of the target objects, and the target objects lie inside all the touch positions, the designation is judged to be within the range.
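
The allowable-value test described here is a point-to-segment distance comparison. A minimal Python sketch, with the contour assumed to be a list of polygon vertices (names are illustrative):

    # Hedged sketch of the on-the-side test with an allowable value.
    import math

    def distance_to_segment(p, a, b):
        """Euclidean distance from point p to the segment a-b."""
        ax, ay = a; bx, by = b; px, py = p
        dx, dy = bx - ax, by - ay
        if dx == 0 and dy == 0:
            return math.hypot(px - ax, py - ay)
        t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
        t = max(0.0, min(1.0, t))             # clamp to the segment
        return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

    def on_outer_contour(p, contour, tolerance):
        """True if p lies within tolerance of any side of the polygon."""
        sides = zip(contour, contour[1:] + contour[:1])
        return any(distance_to_segment(p, a, b) <= tolerance for a, b in sides)

A touch farther than the tolerance from every side is then tested for designation within the contour or within a range, as described above.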

The instruction contents of a simultaneous plural gesture operation are: a parallel movement instruction, in which the fingers move while maintaining their positional relationship; a rotational movement instruction, in which the fingers rotate while maintaining their positional relationship; and an expansion/contraction instruction, in which the positional relationship of the fingers expands or contracts.

As shown in the table, in designation within the outer contour, a parallel movement instruction translates all designated objects, a rotational movement instruction rotates all designated objects, and an expansion/contraction instruction moves each object individually. In designation on the outer contour, a parallel movement instruction translates the object, a rotational movement instruction rotates it, and an expansion/contraction instruction expands, contracts, or deforms it. In designation within a range, a parallel movement instruction translates all objects, a rotational movement instruction rotates all objects, and an expansion/contraction instruction expands or contracts all objects.

The flow of information processing in the basic algorithm of the simultaneous plural gesture instruction processing of the present invention will be described with reference to FIGS. 21 and 22. In FIGS. 21 and 22, the present device is waiting for input, a touch position is detected by the simultaneous multiple touch position detection device 7 described later, and the touch position is input to the information processing device 9 described later; the steps from the point at which the information processing device 9 processes the touch position onward are described. The determination of the method for designating the gesture objects in FIG. 21 is performed, for example, as follows, from the positional relationship between the start points of the loci drawn by the fingers and the displayed figures.

FIG. 21 shows a determination processing flow of a method of specifying a plurality of simultaneous gesture objects.

Step 2101: It is judged whether there are a plurality of simultaneous touch positions. If yes, the process proceeds to step 2102; if no, the process ends. In the latter case there is only one touch position, and the processing for a single touch position is performed.

Step 2102: It is judged whether one or more touch positions are within the contour of one or more objects. If yes, the process proceeds to step 2103; if no, to step 2104.

Step 2103: The designation method is determined to be designation within the outer contour, and the process is terminated.

Step 2104: It is determined whether all touch positions are on the outer contour of a single object. If yes, the process proceeds to step 2105; if no, to step 2106.

Step 2105: The designation method is judged to be outer contour designation, and the processing is terminated.

Step 2106: It is determined whether all touch positions are outside the contours of the objects and there is an object inside all the touch positions. If yes, the process proceeds to step 2107; if no, the process ends.

Step 2107: The designation method is judged to be designation within the range, and the processing is ended.
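
The decision cascade of FIG. 21 can be expressed compactly. A hedged Python sketch, assuming predicate helpers inside_contour(touch, obj) and on_contour(touch, obj) (for example, the tolerance test sketched earlier) and objects given as vertex lists:

    # Hedged sketch of the FIG. 21 classification (steps 2101-2107).
    def classify_designation(touches, objects, inside_contour, on_contour):
        if len(touches) < 2:
            return None                               # step 2101
        if any(inside_contour(t, o) for t in touches for o in objects):
            return "within_outer_contour"             # steps 2102-2103
        if any(all(on_contour(t, o) for t in touches) for o in objects):
            return "on_outer_contour"                 # steps 2104-2105
        xs = [t[0] for t in touches]; ys = [t[1] for t in touches]
        x0, y0, x1, y1 = min(xs), min(ys), max(xs), max(ys)
        def enclosed(o):                              # object inside all touches
            return all(x0 <= vx <= x1 and y0 <= vy <= y1 for vx, vy in o)
        if any(enclosed(o) for o in objects):
            return "within_range"                     # steps 2106-2107
        return None

The rectangle spanned by the touches realizes the two-finger diagonal-range case described above; a polygon test would replace enclosed() for three or more fingers.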

FIG. 22 shows the flow for determining the instruction content of the simultaneous plural gesture operation. The content of the gesture operation instruction in FIG. 22 is determined, for example, as follows from the positional relationship between the start point and the end point of the locus drawn by each finger. Points other than the start point and the end point may also be used: the determination may be made from such a point and the start point, from a combination of such points, or from a combination of such a point and the end point. When a plurality of determination results are obtained, the certainty of the determination can be increased by accepting the gesture operation instruction only when all of the results agree.

Step 2201: It is judged whether there are a plurality of simultaneous touch positions. If yes, the process proceeds to step 2202; if no, the process ends.

Step 2202: It is determined whether all touch positions move in parallel while maintaining their positional relationship. If yes, the process proceeds to step 2203; if no, to step 2204. Details of this determination are described later.

Step 2203: It is determined that the instruction content is the parallel movement instruction, and the processing is ended.

Step 2204: It is determined whether all touch positions rotate while maintaining their positional relationship. If yes, the process proceeds to step 2205; if no, to step 2206. Alternatively, rotational movement may be determined when the touch positions maintain their positional relationship while their midpoint (or center of gravity) remains constant. Details of this determination are described later.

Step 2205: It is determined that the instruction content is the rotational movement instruction, and the processing is ended.

Step 2206: It is determined whether or not the touched positions expand or contract relative to each other. If yes, the process proceeds to step 2207, and if no, the process ends.
The determination is made based on whether the distance between the touch positions changes. Details of the determination will be described later.

Step 2207: It is determined that the instruction content is the expansion / contraction deformation instruction, and the processing is ended.

In the embodiments described above, the concepts of designation within the outer contour, designation on the outer contour, and designation within a range, and of parallel movement, rotational movement, and expansion/contraction deformation of objects, may also be combined, or switched between as modes.

Here, a combination of concepts means, for example, a combination of gesture operations, such as issuing a parallel movement instruction and an expansion/contraction deformation instruction at the same time, or a combination of object designations, such as designating one object inside its contour while designating another object on its contour.

Switching of modes means changing the interpretation of a finger operation for each mode. For example, when the operation corresponding to the expansion/contraction instruction is performed with designation within the outer contour, it is interpreted as moving each object individually in mode A and as expanding or contracting the objects in mode B. In this way, a plurality of modes are defined and switched between.

According to the embodiment described above, movement, rotation, and expansion/contraction instructions for one or more figures, which would require a plurality of operations with a mouse or the like, can be realized with a single, more natural operation of the fingers, so the operability of graphic editing processing can be improved.

Four embodiments of the simultaneous plural gesture instruction processing device of the present invention will be described below. First, referring to FIG. 23, the simultaneous plural touch instruction processing device used in the fifth to eighth embodiments, in which the simultaneous plural touch position detection device using the surface acoustic wave type touch plate is integrated with the display device, will be described.

FIG. 23 shows a schematic system configuration of a simultaneous plural gesture instruction processing device.

In the figure, 7 is a simultaneous plural touch position detection device using a surface acoustic wave type touch plate, 8 is a display device using a liquid crystal display, and 9 is an information processing device. The simultaneous plural touch position detection device 7 outputs to the information processing device 9 the number of detected touch positions and the X and Y coordinates of each touch position. Further, 91 is an MPU for information processing, 92 is a memory (a ROM) that stores the information processing program for the processing described below, 94 is a memory (a RAM), 96 is an interface unit that receives the number and coordinates of the touch positions from the simultaneous plural touch position detection device 7, receives display data from the MPU 91, generates a video signal, and outputs it to the display device 8, and 95 is a bus. Here, as shown in the figure, the simultaneous plural touch position detection device 7 and the display device 8 are integrated.

Then, in each selected process, the MPU 91, which receives from the simultaneous plural touch position detection device 7 the information on the plurality of positions touched at the same time, performs the information processing corresponding to the simultaneous plural gesture operations according to the information processing program stored in the memories 92 and 94.

In the fifth to eighth embodiments of the present invention described below, the processing performed by the MPU 91 for instructions given by simultaneous plural gesture operations will be described together with the figure editing display screens presented on the display device 8.

The MPU 91 receives touch position information from the simultaneous plural touch position detection device 7 at regular intervals of 0.1 second or less. From this information, it determines what kind of locus each touch position draws. When a plurality of fingers touch, a plurality of loci are obtained; the method of determining which locus belongs to which finger is described below for the case of two loci drawn by two fingers, with reference to FIG. 24.

FIG. 24 shows an instruction method for translating each object, and a state in which the object is translated by the MPU 91 that has received the instruction and is displayed at the instructed position.

In the figure, A and B are the figures of the gesture objects, and A′ and B′ are the figures after the gesture operation. The solid lines indicate the movement of the fingers (the movement of the touch positions), and the dotted lines indicate the movement of the figures A and B. Reference characters a and b indicate the touch positions touched first, which are the start points of the loci of the touch positions; a′ and b′ represent the touch positions touched last, which are the end points of the loci. In the following figures as well, a prime mark represents a figure or a locus end point after movement. T indicates the thumb touching the touch position a, and T′ the thumb at the touch position a′ after the movement; V indicates the forefinger touching the touch position b, and V′ the forefinger at the touch position b′ after the movement.

As shown in the figure, after pointing inside the contours of the figures A and B with the thumb and forefinger, the hand is moved in parallel while maintaining the positional relationship of the touch positions, whereby the figures A and B move in parallel to the positions of the figures A′ and B′.

Two touch positions are sent from the simultaneous plural touch position detection device 7 at least every 0.1 second; assume here that the interval is 0.1 second. The MPU 91 compares the two touch positions sent most recently (a and b) with the two touch positions sent 0.1 second earlier (a″ and b″), and when at least one of the two touch positions has changed, it judges that a locus is being drawn. It then obtains the distance between point a and point a″ and the distance between point a and point b″, compares their absolute values (|a − a″| and |a − b″|), and takes the smaller: if, for example, |a − a″| is smaller, it determines that a and a″ are points on one locus, and accordingly that b and b″ are points on the other locus.

This works because the touch positions are detected at intervals of 0.1 second or less, so the distance between two points 0.1 second apart on one locus is smaller than the distance between the two touch positions at any one time (which, being the distance between two fingers, is at least the thickness of one finger).

The start point of a locus is the touch position input after no touch position has been input for 0.5 second or more. The end point is the last touch position input before no further touch position is input for 0.5 second or more.
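
The locus matching and the start/end rules above can be sketched as follows (names illustrative):

    # Hedged sketch of the nearest-point locus matching described above.
    import math

    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    def continue_loci(prev_a, prev_b, new_pair):
        """prev_a, prev_b: points a'' and b'' sent 0.1 s ago; new_pair: the two
        newest samples. Returns them ordered as (new a, new b)."""
        p, q = new_pair
        if dist(prev_a, p) <= dist(prev_a, q):
            return p, q        # p continues the locus of a'', q that of b''
        return q, p

    # This works because a finger moves less in 0.1 s than the spacing between
    # two fingers, which is at least one finger's thickness, as noted above.

The 0.5-second silence rule for start and end points then brackets each locus in time.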

Next, the method of determining in step 2202 of FIG. 22 whether the movement is parallel will be described with reference to FIGS. 24 and 39. When the previous touch positions are compared with the current touch positions, the movement is determined to be parallel if the signed differences of the X coordinates and of the Y coordinates between corresponding touch positions are the same for all touch positions. That is, in FIG. 39, the movement is regarded as parallel when x1 = x2 and y1 = y2. In practice, an allowable value is set for the changes in the X and Y coordinates, and the movement is determined to be parallel if both the difference between x1 and x2 and the difference between y1 and y2 are at or below the allowable value.

A method of determining whether or not the movement is rotational movement in step 2204 of FIG. 22 will be described with reference to FIGS. 25 and 40. FIG. 25 shows an instruction method for rotationally moving each object and a state in which the object is rotationally moved by the MPU 91 that has received the instruction and is displayed at the instructed position.

In the figure, C and D are the figures of the gesture objects, and C′ and D′ are the figures after the gesture operation. As shown in the figure, after pointing inside the contours of the figures C and D with the thumb and forefinger, the hand is rotated while maintaining the positional relationship of the touch positions, whereby the figures C and D move to the positions of the figures C′ and D′.

The movement is determined to be rotational when the distance between c and d and the distance between c′ and d′ are constant and, comparing the previous touch positions with the current ones, at least one of the signed differences of the X and Y coordinates between corresponding touch positions is changing. That is, in FIG. 40, the movement is regarded as rotational when at least one of x1 ≠ x2 and y1 ≠ y2 holds. In practice, an allowable value is set for the difference between the distance between c and d and the distance between c′ and d′, and the distances are considered equal when the difference is at or below the allowable value; an allowable value is likewise set for the changes in the X and Y coordinates, and a difference is recognized when at least one of the difference between x1 and x2 and the difference between y1 and y2 is at or above the allowable value. In these cases, the movement is regarded as rotational.

A method of determining whether the deformation is expansion / contraction in step 2206 of FIG. 22 will be described with reference to FIGS. 26 and 41.

FIG. 26 shows an instruction method for elastically deforming each object, and a state in which the object is elastically deformed by the MPU 91 which has received the instruction and is displayed at the instructed position.

In the figure, E and F are the figures of the gesture objects, and E′ and F′ are the figures after the gesture operation. As shown in the figure, after pointing inside the contours of the figures E and F with the thumb and forefinger, the touch positions are moved apart or together, whereby the figures E and F move individually to the positions of the figures E′ and F′.

When the previous touch positions are compared with the current ones, the deformation is determined to be expansion/contraction when the ratio of the signed differences of the X and Y coordinates is the same for every touch position and the signs of x1 and x2 shown in FIG. 41 are opposite or the signs of y1 and y2 are opposite. That is, in FIG. 41, the movement is regarded as expansion/contraction deformation when y1 / x1 = y2 / x2 (this condition is not considered when x1 = x2 = 0) and x1 × x2 is negative or y1 × y2 is negative. In practice, an allowable value is set for the difference between y1 / x1 and y2 / x2, and the ratios are considered equal when the difference is at or below the allowable value; similarly, x1 × x2 and y1 × y2 are considered negative when they are negative and their magnitudes exceed the allowable value.
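
The three determinations can be summarized as predicates over the per-touch coordinate deltas; eps stands for the allowable values described above (a hedged sketch for the two-finger case):

    # Hedged sketch of the parallel / rotation / stretch tests (FIGS. 39-41).
    def is_parallel(deltas, eps):
        """deltas: [(dx, dy)] per touch; parallel when all deltas agree."""
        dx0, dy0 = deltas[0]
        return all(abs(dx - dx0) <= eps and abs(dy - dy0) <= eps
                   for dx, dy in deltas)

    def is_rotation(dist_prev, dist_now, deltas, eps):
        """Rotation when the inter-touch distance stays constant while at
        least one coordinate delta differs between the touches."""
        return abs(dist_prev - dist_now) <= eps and not is_parallel(deltas, eps)

    def is_stretch(deltas, eps):
        """Stretch when the dy/dx ratios agree but the touches move in
        opposite directions (x1*x2 or y1*y2 negative beyond eps)."""
        (x1, y1), (x2, y2) = deltas
        ratios_equal = (x1 == x2 == 0) or (
            x1 != 0 and x2 != 0 and abs(y1 / x1 - y2 / x2) <= eps)
        opposed = (x1 * x2 < -eps) or (y1 * y2 < -eps)
        return ratios_equal and opposed

Trying the predicates in the order of FIG. 22 (parallel, then rotation, then stretch) reproduces steps 2202 to 2207.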

The fifth embodiment of the present invention will be described with reference to FIGS. 24 to 26. In this embodiment, the processing performed when a parallel movement instruction, a rotational movement instruction, or an expansion/contraction deformation instruction is received with the objects designated within the outer contour is described.

In FIG. 24, the MPU 91, to which the touch positions are input at regular intervals from the simultaneous plural touch position detection device 7, determines the loci of the touch positions. In this case, as described above, it determines that the designation method of the objects is within the outer contour and that the instruction content of the gesture operation is parallel movement. The vector representing the movement amount is then obtained as (x1, y1) in FIG. 39. Data for displaying the figures A′ and B′ at the positions after the movement is sent to the interface unit 96, which generates a video signal and sends it to the display device 8; the display device 8 displays accordingly.

In the present embodiment, the parallel movement of two figures has been described, but the present invention is not limited to this.

According to the present embodiment, a parallel movement instruction for a plurality of figures, which would require a plurality of operations with a mouse or the like, can be realized with a single, more natural operation, so the operability of figure editing can be improved.

FIG. 25 shows an instruction method for rotationally moving each object, and a state in which the object is rotationally moved by the MPU 91 that has received the instruction and is displayed at the instructed position.

The MPU 91, to which the touch positions are input at regular intervals from the simultaneous plural touch position detection device 7, determines the loci of the touch positions. In this case, as described above, it determines that the designation method of the objects is within the outer contour and that the instruction content of the gesture operation is rotational movement. The MPU 91 then calculates the rotation angle as follows. FIG. 37 shows a conceptual diagram of the calculation of the rotation angle Δθ.

In the figure, points a and b are the touch positions before rotation, points a′ and b′ are the touch positions after rotation, the angle θ is the angle of the line segment ab to the horizontal before rotation, the angle θ′ is the angle of the line segment a′b′ to the horizontal after rotation, and the angle Δθ is the angle between the line segment ab before rotation and the line segment a′b′ after rotation.

As shown, the rotation angle Δθ is Δθ = θ − θ′.

Therefore, in the following, the angle θ and the angle θ ′ will be calculated.

As shown in the figure, with (Xa, Ya) and (Xb, Yb) the coordinates of the points a and b, and (Xa′, Ya′) and (Xb′, Yb′) those of the points a′ and b′, the angle θ and the angle θ′ are

    θ  = arctan((Yb − Ya) / (Xb − Xa))
    θ′ = arctan((Yb′ − Ya′) / (Xb′ − Xa′))     [Equation 1]
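
In an implementation, the two angles are conveniently computed with atan2, which handles vertical segments that a bare arctangent cannot. A hedged sketch:

    # Hedged sketch of the Equation 1 / FIG. 37 rotation-angle computation.
    import math

    def rotation_angle(a, b, a_after, b_after):
        """Angle between segment ab (before) and a'b' (after), in radians."""
        theta = math.atan2(b[1] - a[1], b[0] - a[0])
        theta_after = math.atan2(b_after[1] - a_after[1],
                                 b_after[0] - a_after[0])
        return theta - theta_after     # delta-theta = theta - theta' as above

    # e.g. rotation_angle((0, 0), (1, 0), (0, 0), (0, 1)) returns -pi/2:
    # the segment turned a quarter turn counter-clockwise under this sign
    # convention.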

After obtaining the rotation angle, the MPU 91 sends data for displaying the figures C′ and D′ at the positions after the movement to the interface unit 96, which generates a video signal and sends it to the display device 8; the display device 8 displays accordingly.

In this embodiment, the rotational movement of the two figures has been described, but the present invention is not limited to this.

According to the present embodiment, a rotational movement instruction for a plurality of figures, which would require a plurality of operations with a mouse or the like, can be realized with a single, more natural operation, so the operability of figure editing can be improved.

FIG. 26 shows an instruction method for elastically deforming each object, and a state in which the object is elastically deformed by the MPU 91 that has received the instruction and is displayed at the instructed position.

The MPU 91, to which the touch positions are input at regular intervals from the simultaneous plural touch position detection device 7, determines the loci of the touch positions. In this case, as described above, it determines that the designation method of the objects is within the outer contour and that the instruction content of the gesture operation is expansion/contraction deformation. If the touch positions on the figures E and F before the movement are e(Xe, Ye) and f(Xf, Yf), and these points reach the positions e′(Xe′, Ye′) and f′(Xf′, Yf′) after the movement, the deformation amount (expansion/contraction ratio) is (Xf′ − Xe′) / (Xf − Xe). The position is determined by the fact that the touch position e on the figure E before the movement comes to the touch position e′ after the movement. The MPU 91 sends data for displaying the figures E′ and F′ at the positions after the movement to the interface unit 96, which generates a video signal and sends it to the display device 8; the display device 8 displays accordingly.
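
The text gives the ratio only along the X axis; applying the same ratio to both axes, anchored at the touch point e, gives the following hedged sketch (the uniform two-axis scaling is an assumption for illustration):

    # Hedged sketch of the expansion/contraction of one figure.
    def stretch_points(points, e, e_after, f, f_after):
        """points: vertices of the figure touched at e; f is the touch on the
        other figure; e_after/f_after are the positions after the gesture."""
        ratio = (f_after[0] - e_after[0]) / (f[0] - e[0])  # ratio from the text
        return [(e_after[0] + (x - e[0]) * ratio,          # assumed uniform
                 e_after[1] + (y - e[1]) * ratio) for x, y in points]

    # Doubling the finger spacing (ratio 2) doubles each vertex's offset from
    # the anchor e and places the figure so that e lands on e'.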

In this embodiment, the individual movement of the two figures has been described, but the present invention is not limited to this.

According to the present embodiment, an individual movement instruction for a plurality of figures, which would require a plurality of operations with a mouse or the like, can be realized with a single, more natural operation, so the operability of figure editing can be improved.

The sixth embodiment of the present invention will be described with reference to FIGS. 27 to 30. In this embodiment, the processing performed when a parallel movement instruction, a rotational movement instruction, or an expansion/contraction deformation instruction is received with the object designated on the outer contour is described.

FIG. 27 shows an instruction method for translating each object and a state in which the object is translated by the MPU 91 which has received the instruction and is displayed at the instructed position.

In the figure, G is the figure of the gesture object, and G′ is the figure after the gesture operation.

As shown in the figure, after pointing with the thumb and forefinger on the outer contour of the figure G near diagonally opposite corners, the hand is moved in parallel while maintaining the positional relationship of the touch positions, whereby the figure G moves in parallel to the position of the figure G′.

The MPU 91, to which the touch positions are input at regular intervals from the simultaneous plural touch position detection device 7, determines the loci of the touch positions. In this case, as described above, it determines that the designation method of the object is on the outer contour and that the instruction content of the gesture operation is parallel movement. The vector representing the movement amount is then obtained as (x1, y1). Data for displaying the figure G′ at the position after the movement is sent to the interface unit 96, which generates a video signal and sends it to the display device 8; the display device 8 displays accordingly.

In the present embodiment, the parallel movement of the figure by the outer contour instruction near the diagonal of the figure has been described, but the present invention is not limited to this.

According to this embodiment, the instruction to move one figure in parallel can be realized by a more natural operation than that of a mouse, so that the operability of the figure editing process can be improved.

FIG. 28 shows an instruction method for rotationally moving an object and a state in which the object is rotationally moved by the MPU 91 that has received the instruction and is displayed at the instructed position.

In the figure, H is the figure of the gesture object, and H′ is the figure after the gesture operation.

As shown in the figure, after pointing with the thumb and forefinger on the outer contour of the figure H near diagonally opposite corners, the hand is rotated while maintaining the positional relationship of the touch positions, whereby the figure H rotates to the position of the figure H′.

The MPU 91, to which the touch positions are input at regular intervals from the simultaneous plural touch position detection device 7, determines the loci of the touch positions. In this case, as described above, it determines that the designation method of the object is on the outer contour and that the instruction content of the gesture operation is rotational movement. The MPU 91 then calculates the rotation angle as in FIG. 37 and sends data for displaying the figure H′ at the position after the movement to the interface unit 96, which generates a video signal and sends it to the display device 8; the display device 8 displays accordingly.

In the present embodiment, the rotational movement of the figure by the outer contour instruction near the diagonal of the figure has been described, but the present invention is not limited to this.

According to this embodiment, a rotational movement instruction for one figure, which would require a plurality of operations with a mouse or the like, can be realized with a single, more natural operation, so the operability of figure editing can be improved.

FIG. 29 shows an instruction method for elastically deforming an object, and a state in which the object is elastically deformed by the MPU 91 that has received the instruction and is displayed at the instructed position.

In the figure, I is the figure of the gesture object, and I′ is the figure after the gesture operation.

As shown in the figure, after pointing with the thumb and forefinger on the outer contour of the figure I near diagonally opposite corners, the touch positions are moved apart or together, whereby the figure I expands or contracts to the position of the figure I′.

The MPU 91, to which the touch positions are input at regular intervals from the simultaneous plural touch position detection device 7, determines the loci of the touch positions. In this case, as described above, it determines that the designation method of the object is on the outer contour and that the instruction content of the gesture operation is expansion/contraction deformation. The MPU 91 then obtains the deformation amount (expansion/contraction ratio) and the position after the deformation in the same manner as in FIG. 26, and sends data for displaying the figure I′ at the position after the movement to the interface unit 96, which generates a video signal and sends it to the display device 8; the display device 8 displays accordingly.

In the present embodiment, the expansion / contraction of the graphic by the outer contour instruction near the diagonal of the graphic has been described, but the invention is not limited to this.

According to the present embodiment, an expansion/contraction instruction for one figure, which would require a plurality of operations with a mouse or the like, can be realized with a single, more natural operation, so the operability of figure editing can be improved.

Next, expansion/contraction deformation that bends the figure, as shown in FIG. 30, which can be instructed only with designation on the outer contour or designation within a range, will be described.

FIG. 30 shows an instruction method for elastically deforming an object, and a state in which the object is elastically deformed by the MPU 91 that has received the instruction and is displayed at the instructed position.

In the figure, J is the figure of the gesture object, and J′ is the figure after the gesture operation. T is the right thumb, V is the right forefinger, T1 is the left thumb, and V1 is the left forefinger.

As shown in the figure, the thumbs and forefingers of both hands point on the contour near the four vertices of the figure J, and the touch positions of both hands are then moved apart or together as shown in the figure, whereby the figure J is transformed into the position of the figure J′.

The MPU 91, to which the touch positions are input at regular intervals from the simultaneous plural touch position detection device 7, determines the loci of the touch positions. As described above, the designation method of the object is judged to be on the outer contour. Whether the instruction content of the gesture operation is the expansion/contraction deformation shown in FIG. 30 is determined as follows. With the touch positions labeled a, b, c, and d, the loci of a and b are tested by the rotational movement determination method described for step 2204 of FIG. 22, and the loci of c and d are tested in the same way. When both pairs are determined to be rotational movements, the deformation is determined to be the expansion/contraction shown in FIG. 30.

Next, the MPU 91 obtains the deformed position as follows. FIG. 38 shows a conceptual diagram of calculation of an arc-shaped center line of a deformed arc figure.

In FIG. 38, the figure S is the rectangle before the transformation, and the figure S′ is the arc-shaped figure with width after the transformation. w and w′ are the widths of the figures S and S′, and l and l′ are the center lines of the figures S and S′ in the width direction.

As shown in the figure, the radius r of the arc-shaped center line l′ is determined by the Pythagorean theorem as the distance from the center point O to the midpoint of the segment a′b′:

    r² = (Xo − (Xa′ + Xb′)/2)² + (Yo − (Ya′ + Yb′)/2)²     [Equation 2]

Further, the width w of the figure S before the transformation and the width w′ of the transformed figure S′ are related by w = w′. Therefore, in the following, the center point O of the arc-shaped center line l′ is calculated.

As shown, the straight line y1 passing through the two points a′ and b′ and the straight line y2 passing through the two points c′ and d′ are, by the formula of a straight line through two points,

    y1: y = ((Yb′ − Ya′) / (Xb′ − Xa′)) (x − Xa′) + Ya′
    y2: y = ((Yd′ − Yc′) / (Xd′ − Xc′)) (x − Xc′) + Yc′     [Equation 3]

Writing these lines as y = m1·x + k1 and y = m2·x + k2, the coordinates (Xo, Yo) of the point O, which is the intersection of the straight lines y1 and y2, are

    Xo = (k2 − k1) / (m1 − m2),  Yo = m1·Xo + k1     [Equation 4]
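
Reading Equations 2 to 4 together, the center O is the intersection of the lines through a′b′ and c′d′, and the radius r follows as the distance from O to the midpoint of a′b′. A hedged Python sketch under that reading (non-vertical edges assumed; all names are illustrative):

    # Hedged sketch of the FIG. 38 arc geometry (Equations 2-4 as read here).
    def line(p, q):
        """Slope m and intercept k of the non-vertical line through p and q."""
        m = (q[1] - p[1]) / (q[0] - p[0])
        return m, p[1] - m * p[0]

    def arc_centre_and_radius(a_p, b_p, c_p, d_p):
        """a_p..d_p: the touch positions a', b', c', d' after the gesture."""
        m1, k1 = line(a_p, b_p)
        m2, k2 = line(c_p, d_p)
        xo = (k2 - k1) / (m1 - m2)            # Equation 4: intersection O
        yo = m1 * xo + k1
        mx, my = (a_p[0] + b_p[0]) / 2, (a_p[1] + b_p[1]) / 2
        r = ((xo - mx) ** 2 + (yo - my) ** 2) ** 0.5   # Equation 2
        return (xo, yo), r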

When the shape and position have been obtained, the MPU 91 sends data for displaying the figure J′ at the position after the deformation to the interface unit 96, which generates a video signal and sends it to the display device 8; the display device 8 displays accordingly.

In this embodiment, deformation of a figure by pointing on the outer contour near its four vertices has been described, but the present invention is not limited to this; the deformation is possible whenever the four touch positions are in the positional relationship of the four vertices of a rectangle. Since the deformation amount can then be calculated, the shape and position after the deformation can be obtained for any figure. Designation within a range is also possible.

According to the present embodiment, a deformation instruction for one figure, which would require a plurality of operations with a mouse or the like, can be realized with a single, more natural operation, so the operability of figure editing can be improved.

The seventh embodiment of the present invention will be described with reference to FIGS. 31 to 33. In this embodiment, the processing performed when a parallel movement instruction, a rotational movement instruction, or an expansion/contraction deformation instruction is received with the objects designated within a range is described.

FIG. 31 shows an instruction method for translating each object, and a state in which the object is translated by the MPU 91 that has received the instruction and is displayed at the instructed position.

In the figure, K and L are figures of the gesture object, and K ′ and L ′ are figures after the gesture operation.

As shown in the figure, after pointing outside the contours of the figures K and L with the thumb and forefinger, the hand is moved in parallel while maintaining the positional relationship of the touch positions, whereby the figures K and L move in parallel to the positions of the figures K′ and L′.

The MPU 91, to which the touch positions detected by the simultaneous plural touch position detecting device 7 are input at regular time intervals, determines the trajectory of the touch positions. In this case, it determines, as described above, that the method of designating the target objects is designation within a range and that the instruction content of the gesture operation is parallel movement. The vector representing the movement amount is then obtained in the same manner as (x1, y1). Data for displaying the figures K′ and L′ at the positions after movement is sent to the interface unit 96. The interface unit 96 generates a video signal and sends it to the display device 8. The display device 8 displays according to the video signal.
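As a concrete illustration of deriving the movement amount from the touch trajectories, the following is a minimal sketch, assuming each trajectory is a list of (x, y) samples delivered at the device's regular reporting interval; the function and parameter names are illustrative, not from the patent:

```python
# Hedged sketch: deriving a parallel-movement vector from two touch
# trajectories, each a list of (x, y) samples taken at regular intervals.

def translation_vector(trajectory_a, trajectory_b, tolerance=5.0):
    """Return the common movement vector (dx, dy) if both trajectories
    describe approximately the same translation, otherwise None."""
    ax0, ay0 = trajectory_a[0]
    ax1, ay1 = trajectory_a[-1]
    bx0, by0 = trajectory_b[0]
    bx1, by1 = trajectory_b[-1]
    va = (ax1 - ax0, ay1 - ay0)  # movement of the first touch
    vb = (bx1 - bx0, by1 - by0)  # movement of the second touch
    # Parallel movement keeps the mutual positional relationship, so the
    # two movement vectors must agree within a tolerance.
    if abs(va[0] - vb[0]) <= tolerance and abs(va[1] - vb[1]) <= tolerance:
        return ((va[0] + vb[0]) / 2.0, (va[1] + vb[1]) / 2.0)
    return None
```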

In the present embodiment, the parallel movement of two figures has been described, but the present invention is not limited to this.

According to the present embodiment, a parallel movement instruction for a plurality of figures, which requires a plurality of operations with a mouse or the like, can be realized by a single, more natural operation, so that the operability of the figure editing process can be improved.

FIG. 32 shows an instruction method for rotationally moving each object and a state in which the object is rotationally moved by the MPU 91 which received the instruction and is displayed at the instructed position.

In the figure, M and N are figures of the gesture objects, and M′ and N′ are the figures after the gesture operation.

As shown in the figure, after the outside of the contours of the figures M and N is indicated with the thumb and forefinger, the hand is rotationally moved while maintaining the positional relationship of each touch position, whereby the figures M and N move to the positions of the figures M′ and N′.

The MPU 91, to which the touch positions detected by the simultaneous plural touch position detecting device 7 are input at regular time intervals, determines the trajectory of the touch positions. In this case, it determines, as described above, that the method of designating the target objects is designation within a range and that the instruction content of the gesture operation is rotational movement. The rotation angle is then obtained in the same manner as in FIG. 37. Data for displaying the figures M′ and N′ at the positions after movement is sent to the interface unit 96. The interface unit 96 generates a video signal and sends it to the display device 8. The display device 8 displays according to the video signal.
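FIG. 37 is not reproduced here; the following is a minimal sketch of one way to obtain such a rotation angle from the start and end positions of the two touches. Measuring the turn of the segment joining the touches is an assumption, not necessarily the patent's exact construction:

```python
import math

# Hedged sketch: rotation angle from the start and end positions of two
# touches, measured as the turn of the segment joining them.

def rotation_angle(a_start, a_end, b_start, b_end):
    """Angle in radians by which the segment joining the two touches has
    turned between the start and the end of the gesture."""
    def segment_angle(p, q):
        return math.atan2(q[1] - p[1], q[0] - p[0])

    delta = segment_angle(a_end, b_end) - segment_angle(a_start, b_start)
    # Normalize to (-pi, pi] so a small clockwise turn stays small.
    while delta <= -math.pi:
        delta += 2 * math.pi
    while delta > math.pi:
        delta -= 2 * math.pi
    return delta
```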

In this embodiment, the rotational movement of the two figures has been described, but the present invention is not limited to this.

According to the present embodiment, a rotational movement instruction for a plurality of figures, which requires a plurality of operations with a mouse or the like, can be realized by a single, more natural operation, so that the operability of the figure editing process can be improved.

FIG. 33 shows an instruction method for expanding/contracting each object, and a state in which the objects are expanded/contracted by the MPU 91 that has received the instruction and displayed at the instructed positions.

In the figure, O and P are figures of the gesture objects, and O′ and P′ are the figures after the gesture operation.

As shown in the figure, after the outside of the contours of the figures O and P is indicated with the thumb and forefinger, the respective touch positions are expanded or contracted, whereby the figures O and P expand or contract to the positions of the figures O′ and P′. The deformation amount and the position after deformation in this figure can be obtained in the same manner as in FIG. 34.

The MPU 91, to which the touch positions detected by the simultaneous plural touch position detecting device 7 are input at regular time intervals, determines the trajectory of the touch positions. In this case, it determines, as described above, that the method of designating the target objects is designation within a range and that the instruction content of the gesture operation is expansion/contraction deformation. The deformation amount and the position after deformation are then obtained in the same manner as in FIG. 34. Data for displaying the figures O′ and P′ at the positions after movement is sent to the interface unit 96. The interface unit 96 generates a video signal and sends it to the display device 8. The display device 8 displays according to the video signal.
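For the deformation amount, a minimal sketch of the natural scaling computation follows, assuming the amount is taken as the ratio of the distance between the two touches after and before the gesture; names are illustrative:

```python
import math

# Hedged sketch: expansion/contraction amount as the ratio of the final
# to the initial distance between the two touches.

def scale_factor(a_start, a_end, b_start, b_end):
    """Ratio of the distance between the touches after the gesture to
    the distance before it (>1 expands, <1 contracts)."""
    d0 = math.dist(a_start, b_start)  # distance before the gesture
    d1 = math.dist(a_end, b_end)      # distance after the gesture
    if d0 == 0:
        raise ValueError("coincident start positions")
    return d1 / d0
```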

In this embodiment, the expansion / contraction deformation of two figures has been described, but the present invention is not limited to this.

According to the present embodiment, an expansion/contraction deformation instruction for a plurality of figures, which requires a plurality of operations with a mouse or the like, can be realized by a single, more natural operation, so that the operability of the figure editing process can be improved.

An eighth embodiment of the present invention will be described with reference to FIGS. 34 to 36. In the present embodiment, the objects are three-dimensional figures, and the case where the simultaneous plural gesture instruction processing device according to the present invention is used for three-dimensional figure editing processing will be described. The processing when a rotational movement instruction is received will be described for designation on the outer contour (FIGS. 34 and 36) and for designation within a range (FIG. 35).

FIG. 34 shows an instruction method for rotationally moving an object and a state in which the object is rotationally moved by the MPU 91 that has received the instruction and is displayed at the instructed position.

FIG. 34 shows an example in the case where the designation method of the object is the outer contour designation and the instruction content of the gesture operation is the rotational movement instruction.

In the figure, Q is a three-dimensional figure which is the gesture object, and Q′ is the three-dimensional figure after the gesture operation. Q′ is obtained by rotating Q by 90 degrees about the rotation axis 346 in the X-axis direction.

As shown in the figure, after the outer contour of the three-dimensional figure Q is indicated near the diagonal with the thumb and forefinger, the hand is rotationally moved while maintaining the positional relationship of each touch position, whereby the three-dimensional figure Q is rotationally moved to the position of the three-dimensional figure Q′.

The MPU 91, to which the touch positions detected by the simultaneous plural touch position detecting device 7 are input at regular time intervals, determines the trajectory of the touch positions. In this case, it determines, as described above, that the method of designating the target object is designation on the outer contour.

The determination that the instruction content of the gesture operation is rotational movement is also performed as described above. Furthermore, in the case of a three-dimensional figure, it is necessary to determine whether the rotation axis is the X, Y, or Z axis. The rotation axis is determined based on the normal direction of the surface including the touched outline (sides). That is, in the case of FIG. 34, the two sides 341 are touched, so the normal direction of the surface including these two sides is considered; since it is the X-axis direction, the rotation axis is determined to be the X axis. When the two sides 342 are touched, the rotation axis is likewise the X axis. When the two sides 343 are touched, the rotation axis is the Y axis.
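A minimal sketch of this axis selection follows, assuming the face normal of the touched outline is available from the figure's model data; the helper simply picks the coordinate axis the normal is most nearly parallel to (an illustrative formulation, not the patent's exact procedure):

```python
# Hedged sketch: choosing the rotation axis from the normal direction of
# the face that contains the touched sides.

def rotation_axis_from_normal(normal):
    """Return 'X', 'Y', or 'Z' for the coordinate axis the face normal
    is most nearly parallel to."""
    axes = {"X": (1.0, 0.0, 0.0), "Y": (0.0, 1.0, 0.0), "Z": (0.0, 0.0, 1.0)}
    nx, ny, nz = normal
    norm = (nx * nx + ny * ny + nz * nz) ** 0.5
    best, best_dot = None, -1.0
    for name, (ax, ay, az) in axes.items():
        dot = abs(nx * ax + ny * ay + nz * az) / norm
        if dot > best_dot:
            best, best_dot = name, dot
    return best

# FIG. 34 case: the face containing the touched sides 341 (or 342) has a
# normal in the X-axis direction, so the rotation axis is the X axis.
assert rotation_axis_from_normal((1.0, 0.0, 0.0)) == "X"
```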

The rotation amount is then obtained in the same manner as in FIG. 37. Data for displaying the figure Q′ at the position after movement is sent to the interface unit 96. The interface unit 96 generates a video signal and sends it to the display device 8. The display device 8 displays according to the video signal.

In this embodiment, the rotational movement of the three-dimensional figure by the outer contour instruction near the diagonal of the three-dimensional figure has been described, but the present invention is not limited to this.

According to the present embodiment, a rotational movement instruction for one three-dimensional figure, which requires a complicated operation with a mouse or a three-dimensional dial, can be realized by a single, more natural operation, so that the operability of the three-dimensional figure editing process can be improved.

In the above-described embodiment and in the embodiments of FIGS. 25, 28 and 32, the following method may be used for determining rotational movement in addition to the method described in step 4 of FIG. 22. Since the touch positions are sent from the simultaneous plural touch position detecting device 7 at intervals of 0.1 second or less, a large number of touch position coordinates are obtained as the finger moves. If the X and Y coordinates of six points are obtained, whether or not the locus is a circle can be determined by substituting these coordinates into x and y of Equation 5 below, and in the case of a circle, the equation of the circle is obtained.

[0322]

[Equation 5] ax² + 2hxy + by² + 2gx + 2fy + c = 0, where a, h, b, g, f, and c are constants.

Since a large number of touch positions are sent from the simultaneous plural touch position detecting device 7, a plurality of sets of six touch positions are obtained from the touch positions between the start point and the end point of the locus. From each of these sets the coefficients of Equation 5 are obtained, and whether or not the locus is a circle can be determined from the coefficients. In the case of a circle, the equation of the circle is obtained for each set. Next, the center position and the radius of each circle are calculated from the equation obtained for each set. The center positions and radii of the obtained circles are compared, and if the positions and radii match within an allowable range, the trajectory of the touch position is determined to be a circle.
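The following is a minimal sketch of such a coefficient fit for one set of six points, assuming a null-space solve of Equation 5 and a circle test on the coefficients (a ≈ b and h ≈ 0); the tolerances and the use of an SVD are assumptions, not the patent's exact procedure:

```python
import numpy as np

# Hedged sketch: fit the general conic of Equation 5,
#   a*x^2 + 2*h*x*y + b*y^2 + 2*g*x + 2*f*y + c = 0,
# to six touch positions and test whether it is a circle.

def fit_conic(points):
    """points: six (x, y) pairs. Returns the coefficients (a, h, b, g, f, c)."""
    rows = [[x * x, 2 * x * y, y * y, 2 * x, 2 * y, 1.0] for x, y in points]
    # The coefficient vector is the (approximate) null space of the matrix.
    _, _, vt = np.linalg.svd(np.asarray(rows))
    return vt[-1]

def circle_from_conic(coeffs, tol=1e-3):
    """Return (center, radius) if the conic is a circle, else None.
    A circle requires a == b and h == 0 within tolerance."""
    a, h, b, g, f, c = coeffs
    scale = max(abs(a), abs(b), 1e-12)
    if abs(a - b) / scale > tol or abs(h) / scale > tol:
        return None
    cx, cy = -g / a, -f / a
    r2 = cx * cx + cy * cy - c / a
    if r2 <= 0:
        return None  # degenerate or imaginary circle
    return (cx, cy), r2 ** 0.5
```

Applied to each set of six points between the start and end of the locus, matching centers and radii across the sets then confirm the circle, as described above.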

In the case of FIG. 34, for rotational movement as the instruction content of the gesture operation, the locus of the touch position may be an ellipse rather than the circle described above. An ellipse is used when the rotational movement is instructed on a surface that appears oblique on the screen, such as the surfaces 344 and 347 in FIG. 34, that is, when instructing rotational movement of the figure Q about the X axis or the Y axis.

At this time, as shown in FIG. 42 (the surfaces 344 and 347 in FIG. 34), the MPU 91, which receives the touch positions on the ellipses 3441 and 3471 from the simultaneous plural touch position detecting device 7, determines from the coefficients of the above-mentioned Equation 5, for a plurality of sets of six points, whether the locus is an ellipse, in the same manner as in the case of the circle. In the case of an ellipse, the equation of the ellipse is obtained for each set. Next, the orientation of each ellipse and the lengths of its major and minor axes are obtained from the equation obtained for each set. The orientations, major axes, and minor axis lengths of the obtained ellipses are compared, and when they match within an allowable range, the trajectory of the touch position is determined to be an ellipse.
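A minimal sketch of recovering the orientation and axis lengths from the Equation 5 coefficients follows; this is standard conic algebra, offered as an assumption since the patent's own derivation is in drawings not reproduced here:

```python
import numpy as np

# Hedged sketch: orientation, axis lengths, and center of an ellipse
# from the conic coefficients (a, h, b, g, f, c) of Equation 5.

def ellipse_parameters(coeffs):
    """Return (orientation_deg, major, minor, center), or None if the
    coefficients do not describe a real ellipse."""
    a, h, b, g, f, c = coeffs
    m = np.array([[a, h], [h, b]])
    if np.linalg.det(m) <= 0:
        return None  # parabola, hyperbola, or degenerate conic
    center = np.linalg.solve(m, [-g, -f])  # gradient vanishes at the center
    x0, y0 = center
    c0 = (a * x0 * x0 + 2 * h * x0 * y0 + b * y0 * y0
          + 2 * g * x0 + 2 * f * y0 + c)
    eigvals, eigvecs = np.linalg.eigh(m)
    axes2 = -c0 / eigvals          # squared semi-axis lengths
    if axes2[0] <= 0 or axes2[1] <= 0:
        return None  # imaginary ellipse
    semi = np.sqrt(axes2)
    i_major = int(np.argmax(semi))  # eigenvector of the larger semi-axis
    vx, vy = eigvecs[:, i_major]
    orientation = np.degrees(np.arctan2(vy, vx)) % 180.0
    return orientation, 2 * semi[i_major], 2 * semi.min(), (x0, y0)
```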

Regarding the determination of the rotation axis, when the orientation of the long axis 421 of the obtained ellipse is 67.5 degrees from the horizontal direction as shown in FIG. 42, the rotation axis is determined to be the X axis, and when the orientation of the long axis 422 is 22.5 degrees from the horizontal direction, it is determined to be the Y axis. Note that the values of 67.5 degrees and 22.5 degrees result from the stereoscopic view being displayed at this angle in the case of FIG. 34; this angle changes depending on the display angle of the stereoscopic view. Alternatively, when the ellipse 3441 is within the surface 344, the rotation axis may be determined to be the X axis, and when the ellipse 3471 is within the surface 347, it may be determined to be the Y axis.

How to determine the rotation amount will be described with reference to FIG. 43. In FIG. 43, a indicates the touch start point and b indicates the touch end point. O is the intersection of the long axis and the short axis, α is the angle between the line segment Oa and the long axis, and β is the angle between the line segment Ob and the long axis. These can be obtained once the equation of the ellipse is determined. The rotation amount is obtained as the absolute value of α − β.
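A minimal sketch of this computation, assuming the ellipse center and orientation have already been recovered as above (parameter names are illustrative):

```python
import math

# Hedged sketch of |alpha - beta| from FIG. 43: angles of the touch
# start point a and end point b about the ellipse center O, measured
# from the major axis.

def rotation_amount(a, b, center, orientation_deg):
    """Return |alpha - beta| in degrees."""
    theta = math.radians(orientation_deg)

    def angle_from_major_axis(p):
        dx, dy = p[0] - center[0], p[1] - center[1]
        # Rotate into the ellipse's frame so the major axis is the x-axis.
        u = dx * math.cos(theta) + dy * math.sin(theta)
        v = -dx * math.sin(theta) + dy * math.cos(theta)
        return math.degrees(math.atan2(v, u))

    return abs(angle_from_major_axis(a) - angle_from_major_axis(b))
```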

[0327] Next, FIG. 35 shows an example in which the method of designating the object is designation within a range and the instruction content of the gesture operation is a rotational movement instruction.

In the figure, R is a three-dimensional figure which is the gesture object, and R′ is the three-dimensional figure after the gesture operation. R′ is obtained by rotating R by 90 degrees about the rotation axis 346 in the X-axis direction.

As shown in the figure, after the outside of the three-dimensional figure R is indicated with the thumb and forefinger, the hand is rotationally moved while maintaining the positional relationship of each touch position, whereby the three-dimensional figure R is rotationally moved to the position of the three-dimensional figure R′.

The MPU 91, to which the touch positions detected by the simultaneous plural touch position detecting device 7 are input at regular time intervals, determines the trajectory of the touch positions. In this case, it determines, as described above, that the method of designating the target object is designation within a range.

The determination that the instruction content of the gesture operation is rotational movement is performed as described above with reference to FIG. 42. That is, for the figure R, rotation about the X or Y axis is instructed by drawing an ellipse with a finger to indicate rotational movement, and whether the rotation axis is the X axis or the Y axis is designated according to whether the orientation of the major axis is 67.5 degrees or 22.5 degrees. Rotation about the Z axis is instructed by drawing a circle with a finger. Note that the values of 67.5 degrees and 22.5 degrees result from the stereoscopic view being displayed at this angle in the case of FIG. 35; this angle changes depending on the display angle of the stereoscopic view.
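Putting the circle and ellipse determinations together, a minimal sketch of the axis selection for this embodiment follows; the angular tolerance is an assumption:

```python
# Hedged sketch: rotation-axis selection for the FIG. 35 case.
# A circle locus means the Z axis; an ellipse whose major axis is at
# 67.5 degrees means the X axis, and at 22.5 degrees the Y axis (these
# values hold only for this display angle of the stereoscopic view).

def axis_from_locus(is_circle, orientation_deg=None, tolerance=10.0):
    if is_circle:
        return "Z"
    if orientation_deg is None:
        return None
    if abs(orientation_deg - 67.5) <= tolerance:
        return "X"
    if abs(orientation_deg - 22.5) <= tolerance:
        return "Y"
    return None  # locus not recognized as a rotation instruction
```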

The rotation amount is then obtained in the same manner as in FIG. 37 or FIG. 43. Data for displaying the figure R′ at the position after movement is sent to the interface unit 96. The interface unit 96 generates a video signal and sends it to the display device 8. The display device 8 displays according to the video signal.

In this embodiment, the rotational movement of the three-dimensional figure by designating the range of the three-dimensional figure has been described, but the present invention is not limited to this.

According to this embodiment, a rotational movement instruction for one three-dimensional figure, which requires a complicated operation with a mouse or a three-dimensional dial, can be realized by a single, more natural operation, so that the operability of the three-dimensional figure editing process can be improved.

FIG. 36 shows an instruction method for twist movement of an object, which is a kind of rotational movement, and a state in which the object is twist-moved by the MPU 91 that has received the instruction and displayed at the instructed position.

FIG. 36 shows an example in which the designation method of the object is the outer contour designation and the instruction content of the gesture operation is the twist movement instruction.

In the figure, S is a three-dimensional figure which is the gesture object, and S′ is the three-dimensional figure after the gesture operation. S′ is obtained by rotating S about the rotation axis 346 in the X-axis direction by 90 degrees.

As shown in the figure, after the outer contour of the three-dimensional figure S is indicated near the diagonal with the thumbs and forefingers of both hands, the touch positions of the left thumb T1 and left forefinger V1 are held fixed, and the right hand is rotationally moved while maintaining the positional relationship between the touch positions of the right thumb T and right forefinger V, whereby the three-dimensional figure S twists and rotates to the position of the three-dimensional figure S′.

The MPU 91, to which the touch positions detected by the simultaneous plural touch position detecting device 7 are input at regular time intervals, determines the trajectory of the touch positions. In this case, it determines, as described above, that the method of designating the target object is designation on the outer contour.

The determination that the instruction content of the gesture operation is rotational movement is performed as described above.

The rotation amount is then obtained in the same manner as in FIG. 37 or FIG. 43. Data for displaying the figure S′ at the position after movement is sent to the interface unit 96. The interface unit 96 generates a video signal and sends it to the display device 8. The display device 8 displays according to the video signal. FIG. 44 shows the figure S′ after twisting. The left side surface 441 has not rotated, and the right side surface 442 has rotated 90 degrees. The straight lines 443, 444, 445, and 446, each connecting a vertex of the left side surface 441 to the corresponding vertex of the right side surface 442, are in a twisted positional relationship.

In FIG. 44, the lines connecting the left and right side surfaces are straight, but they may be drawn as curves in consideration of the rigidity and hardness of the object represented by the figure S. Depending on the hardness of the object, twisting it by 90 degrees would break it, so a display indicating that the rotation is impossible may be made instead.

In this embodiment, the twisting rotational movement of the three-dimensional figure by designating the outer contour near the diagonal of the three-dimensional figure has been described, but the present invention is not limited to this.

According to this embodiment, a twisting rotational movement instruction for one three-dimensional figure, which requires a complicated operation with a mouse or a three-dimensional dial, can be realized by a single, more natural operation, so that the operability of the three-dimensional figure editing process can be improved.

In the fifth to eighth embodiments of the simultaneous plural gesture instruction processing device described above, the simultaneous plural touch position detecting device using the surface acoustic wave type touch plate and the display device are integrated. However, the touch plate method of the simultaneous plural touch position detecting device is not limited to this, and the simultaneous plural touch position detecting device may be configured independently. The system configuration is also not limited to this; an input device such as a keyboard, an auxiliary storage device such as a hard disk device, and an output device such as a printer may be combined.

Further, although details of the graphic editing process have not been described in the fifth to eighth embodiments of the simultaneous plural gesture instruction processing device, they can be realized by known techniques. The application is not limited to graphic editing; the invention can also be applied to graphic input processing, character input editing processing such as character area designation, character string movement, and page turning, three-dimensional processing, multimedia processing, and processing combining these. Further, although details of the items used for the simultaneous plural gesture operation have not been described, various items such as fingers, palms, and pens may be mixed.

[0347]

According to the touch position detecting device of the present invention, the positions of a plurality of touches performed simultaneously with fingers, palms, pens, or the like can be detected, so that the function of the touch position detecting device can be improved and its durability is also improved.

Further, according to the touch instruction processing device of the present invention, it is possible to process an instruction by a plurality of simultaneous touch operations with fingers, palms, pens, etc., and thus the operability of the touch instruction processing device is improved.

According to the simultaneous plural gesture instruction processing device of the present invention, a plurality of touch positions made simultaneously, rather than sequentially, with fingers, palms, pens, or the like are detected, and the display target displayed on the display device can be moved and displayed in accordance with an instruction given by moving the plurality of touch positions, so that the operability of the touch instruction processing device can be improved and the durability can be improved.

[Brief description of drawings]

FIG. 1 is a block diagram of a simultaneous multiple touch position detection device according to a first embodiment of the present invention.

FIG. 2 is an explanatory diagram showing a waveform of a signal for detecting the X coordinate according to the first embodiment of the present invention.

FIG. 3 is an explanatory diagram showing a processing result of the touch position detection unit 2 according to the first embodiment of this invention.

FIG. 4 is a block diagram of a simultaneous multiple touch position detection device according to a second embodiment of the present invention.

FIG. 5 is an explanatory diagram showing a state of light received by a photodetector group for detecting X and Y coordinates according to a second embodiment of the present invention.

FIG. 6 is an explanatory diagram showing a processing result of the touch position detection unit 4 according to the second embodiment of the present invention.

FIG. 7 is a block diagram of a simultaneous multiple touch position detection device according to a third embodiment of the present invention.

FIG. 8 is an explanatory diagram showing images captured by the cameras 51x and 51y according to the third embodiment of the present invention.

FIG. 9 is a block diagram of a simultaneous multiple touch instruction processing device according to a fourth embodiment of the present invention.

FIG. 10 is an explanatory diagram showing a display screen in a word processor using a virtual keyboard.

FIG. 11 is a processing flow when a shift key is also used in the virtual keyboard 811.

FIG. 12 is a processing flow when a range is specified in a sentence 812.

FIG. 13 is an explanatory diagram showing a display screen in playing a piano using a virtual piano.

FIG. 14 is a processing flow when playing multiple tones on a virtual piano 821.

FIG. 15 is an explanatory diagram showing a display screen in a competitive soccer game using virtual buttons.

FIG. 16 is a processing flow when multiple input is performed using a virtual button 831.

FIG. 17 is an explanatory diagram showing a display screen when power is cut off using a virtual power switch.

FIG. 18 is a processing flow of power off using a virtual power switch 841.

FIG. 19 is an explanatory diagram of structures of a wave transmitter and a wave receiver in the first embodiment.

FIG. 20 is an explanatory diagram of a menu screen according to the fourth embodiment.

FIG. 21 is a determination processing flowchart of a method for specifying multiple simultaneous gesture objects according to the fourth embodiment of the present invention.

FIG. 22 is a flowchart of a determination process of instruction content of simultaneous multiple gesture operations according to the fourth embodiment of this invention.

FIG. 23 is a schematic configuration diagram of a system of a simultaneous plural gesture instruction processing device according to fifth to seventh embodiments of the present invention.

FIG. 24 is an explanatory diagram showing the instruction content when the method of designating an object is the inside-contour designation and the instruction content of the gesture operation is the parallel movement instruction according to the fifth embodiment of the present invention.

FIG. 25 is an explanatory diagram showing the instruction content when the method of designating an object is the inside-contour designation and the instruction content of the gesture operation is the rotational movement instruction according to the fifth embodiment of the present invention.

FIG. 26 is an explanatory diagram showing the instruction content when the method of designating an object is the inside-contour designation and the instruction content of the gesture operation is the expansion/contraction deformation instruction according to the fifth embodiment of the present invention.

FIG. 27 is an explanatory diagram showing the instruction content in the case where the method of designating the target object is the outer contour designation and the instruction content of the gesture operation is the parallel movement instruction according to the sixth embodiment of the present invention.

FIG. 28 is an explanatory diagram showing the instruction content in the case where the method of designating the target object is the outer contour designation and the instruction content of the gesture operation is the rotational movement instruction according to the sixth embodiment of the present invention.

FIG. 29 is an explanatory diagram showing the instruction content in the case where the method of designating an object is outer contour designation and the instruction content of a gesture operation is a deformation / expansion / contraction instruction according to the sixth embodiment of the present invention.

FIG. 30 is an explanatory diagram showing the instruction content in the case where the method of designating an object is outer contour designation and the instruction content of a gesture operation is a deformation / expansion / contraction instruction according to the sixth embodiment of the present invention.

FIG. 31 is an explanatory diagram showing the instruction content when the method of designating an object is the range designation and the instruction content of the gesture operation is the parallel movement instruction according to the seventh embodiment of the present invention.

FIG. 32 is an explanatory diagram showing the instruction content when the method of designating an object is the in-range designation and the instruction content of the gesture operation is the rotational movement instruction according to the seventh embodiment of the present invention.

FIG. 33 is an explanatory diagram showing the instruction content in the case where the method of designating an object is the range designation and the instruction content of the gesture operation is the deformation / expansion / contraction instruction according to the seventh embodiment of the present invention.

FIG. 34 is an explanatory diagram showing the instruction content in the case where the method of designating the target object is the outer contour designation and the instruction content of the gesture operation is the rotational movement instruction according to the eighth embodiment of the present invention.

FIG. 35 is an explanatory diagram showing the instruction content when the method of designating an object is the in-range designation and the instruction content of the gesture operation is the rotational movement instruction according to the eighth example of the present invention.

FIG. 36 is an explanatory diagram showing the instruction content when the method of designating an object is the outer contour designation and the instruction content of the gesture operation is the twist movement instruction according to the eighth embodiment of the present invention.

FIG. 37 is an explanatory diagram of a method for calculating a rotation angle in a rotational movement.

FIG. 38 is an explanatory diagram of a method of calculating an arc-shaped center line of a deformed arc figure.

FIG. 39 is an explanatory diagram of a method of determining that the instruction content of the gesture operation is a parallel movement instruction.

FIG. 40 is an explanatory diagram of a method for determining that the instruction content of the gesture operation is a rotational movement instruction.

FIG. 41 is an explanatory diagram of a method for determining that the instruction content of the gesture operation is an expansion / contraction deformation instruction.

FIG. 42 is an explanatory diagram of a method of designating a trajectory of a touch position that is an ellipse.

FIG. 43 is an explanatory diagram of how to determine the rotation amount when the trajectory of the touch position is an ellipse.

FIG. 44 is an explanatory diagram of a display example of a three-dimensional figure when a twist movement is instructed.

[Explanation of symbols]

DESCRIPTION OF SYMBOLS 1... Surface acoustic wave type touch plate, 2... Touch position detection unit, 3... Two-dimensional optical sensor array type touch plate, 4... Touch position detection unit, 5... Two-dimensional video camera type touch plate, 6... Touch position detection unit, 7... Simultaneous plural touch position detecting device, 8... Display device, 9... Information processing device, 11... Wave transmitter, 12... Demultiplexer group, 13... Wave collector group, 14... Wave receiver, 21... Wave transmitter control unit, 22... Wave receiver control unit, 23... Touch position/press detection unit, 24... Simultaneous plural touch position determination unit, 25... External interface unit, 31... Light emitter group, 32... Light receiver group, 41... Light emitter control unit, 42... Light receiver control unit, 43... Touch position/width detection unit, 44... Simultaneous plural touch position determination unit, 45... External interface unit, 51... Camera, 61... Camera control unit, 62... Touch position/width detection unit, 63... Simultaneous plural touch position determination unit, 64... External interface unit, 91... MPU for performing information processing, 92... Memory, 811... Virtual keyboard, 812... Text, 821... Virtual piano, 822... Music score, 831... Virtual button group, 832... Soccer field, 841... Virtual power switch group.

(Continuation of the front page) (72) Inventor: Yoshihiko Kunimori, 292 Yoshida-cho, Totsuka-ku, Yokohama-shi, Kanagawa; Microelectronics Equipment Development Laboratory, Hitachi, Ltd. (72) Inventor: Shigeto Osuji, 292 Yoshida-cho, Totsuka-ku, Yokohama-shi, Kanagawa; Microelectronics Equipment Development Laboratory, Hitachi, Ltd. (72) Inventor: Shunichi Ito, 1410 Inada, Katsuta-shi, Ibaraki; Information & Video Media Division, Hitachi, Ltd.

Claims (10)

[Claims]
1. A touch position detecting device which takes contact positions of contact objects as touch positions and detects the touch positions two-dimensionally, comprising: a plurality of touch position detecting means each of which detects a touch position in a one-dimensional direction and outputs a detection signal; and simultaneous plural touch position determining means for determining, when a plurality of positions are touched at the same time, the individual touch positions from among a plurality of candidate positions greater in number than the number of touches, wherein each of the touch position detecting means detects the touch position based on the detection signal, which changes depending on the touch, and the simultaneous plural touch position determining means determines the individual touch positions by selecting, from among the plurality of candidate positions greater in number than the number of touches obtained when the plurality of positions are touched at the same time, those for which a position-independent feature amount of at least one touch extracted from the detection signal of one of the touch position detecting means matches a position-independent feature amount of at least one touch extracted from the detection signal of the other touch position detecting means.
2. The touch position detecting device according to claim 1, wherein each of the touch position detecting means includes surface acoustic wave transmitting/receiving means and a panel which is a surface acoustic wave propagation medium, the transmitting/receiving means comprising: transmitting means for transmitting a surface acoustic wave to the panel; a plurality of demultiplexing means arranged one-dimensionally on the panel for sequentially demultiplexing the transmitted surface acoustic wave; a plurality of wave collecting means, provided to face the respective demultiplexing means, for sequentially collecting the demultiplexed surface acoustic waves; and receiving means for receiving the collected surface acoustic waves, wherein the receiving means outputs, as the detection signal, a received signal indicating that a touch has been made on the panel between each of the wave collecting means and the demultiplexing means facing it.
3. The touch position detecting device according to claim 2, wherein the simultaneous plural touch position determining means detects each touch position by obtaining, as the feature amount of the touch, at least one of the attenuation amount and the attenuation time of the detection signal at each touch position for each transmitting/receiving means.
4. The touch position detecting device according to claim 1, wherein the touch position detecting means are light beam transmitting/receiving means, each comprising a plurality of transmitting means arranged one-dimensionally for transmitting light beams and a plurality of receiving means provided to face the respective transmitting means for receiving the light beams, and each of the receiving means outputs, as the detection signal, a reception signal indicating that a touch has been made between it and the transmitting means facing it.
5. The touch position detecting device according to claim 1, wherein the touch position detecting means are cameras, the cameras are arranged so as not to face each other, and each of the cameras outputs, as the detection signal, an imaging signal indicating that a touch has been made within its field of view.
6. The touch position detecting device according to claim 4, wherein the simultaneous plural touch position determining means detects each touch position by obtaining, as the feature amount of the touch, the width of each touch for each touch position detecting means based on the detection signal.
7. A touch instruction processing device comprising: the touch position detecting device according to claim 1, 2, 3, 4, 5, or 6; and an information processing device for performing information processing according to the touch positions when a plurality of positions are touched at the same time.
8. A touch position instruction processing device comprising: a display device for displaying a display object; a touch position detecting device which takes, as touch positions, contact positions of a plurality of contact objects touching the display surface of the display device on which the display object is displayed, and detects the touch positions two-dimensionally in time series; and control means for receiving the plurality of touch positions detected in time series as a touch position instruction for instructing movement of the display object, moving the display object based on the instruction, and displaying the moved display object on the display device, wherein the touch position detecting device has a plurality of touch position detecting means each of which detects a touch position in a one-dimensional direction and outputs a detection signal, and simultaneous plural touch position determining means for determining individual touch positions from among a plurality of candidate positions greater in number than the number of touches, each of the touch position detecting means detects the touch position based on the detection signal, which changes depending on the touch, and the simultaneous plural touch position determining means determines the individual touch positions by selecting, from among the plurality of candidate positions greater in number than the number of touches obtained when a plurality of positions are touched at the same time, those for which a position-independent feature amount of at least one touch extracted from the detection signal of one of the touch position detecting means matches a position-independent feature amount of at least one touch extracted from the detection signal of the other touch position detecting means.
9. The touch position instruction processing device according to claim 8, wherein the control means receives, as types of movement instruction by the touch position instruction, at least one of a parallel movement instruction for moving in parallel while maintaining the mutual positional relationship of the touch positions, a rotational movement instruction for moving rotationally while maintaining the mutual positional relationship of the touch positions, and an expansion/contraction deformation movement instruction for expanding, contracting, and deforming the positional relationship of the touch positions, moves the display object based on these instructions, and displays the moved display object on the display device.
10. The touch position instruction processing device according to claim 8, wherein there are a plurality of display objects, and the control means receives, as types of instruction indicating the display object to be moved among the display objects, at least one of an inside-contour instruction in which the touch positions indicate the insides of a plurality of mutually different display objects, an on-outer-contour instruction in which the touch positions indicate the outer contour of one object among the plurality of display objects, and an in-range instruction for indicating a plurality of objects within a range surrounded by the touch positions.
JP22020594A 1993-09-16 1994-09-14 Touch position detecting device and touch instruction processor Pending JPH07230352A (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
JP23042093 1993-09-16
JP32007593 1993-12-20
JP5-320075 1993-12-20
JP5-230420 1993-12-20
JP22020594A JPH07230352A (en) 1993-09-16 1994-09-14 Touch position detecting device and touch instruction processor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP22020594A JPH07230352A (en) 1993-09-16 1994-09-14 Touch position detecting device and touch instruction processor

Publications (1)

Publication Number Publication Date
JPH07230352A true JPH07230352A (en) 1995-08-29

Family

ID=27330411

Family Applications (1)

Application Number Title Priority Date Filing Date
JP22020594A Pending JPH07230352A (en) 1993-09-16 1994-09-14 Touch position detecting device and touch instruction processor

Country Status (1)

Country Link
JP (1) JPH07230352A (en)

Cited By (91)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001027867A1 (en) * 1999-10-13 2001-04-19 Elo Touchsystems, Inc. Contaminant processing system for an acoustic touchscreen
JP2001228971A (en) 2000-02-15 2001-08-24 Newcom:Kk Touch panel system to be operated at plural indicating positions
JP2001265475A (en) * 2000-03-15 2001-09-28 Ricoh Co Ltd Menu display controller and information processor and electronic blackboard system and method for controlling menu display system and method for controlling information processing system and computer readable recording medium with program for allowing the same method to be performed by computer recorded
JP2001290585A (en) * 2000-01-31 2001-10-19 Canon Inc Position information processor, position information processing method and program, and operation device and its method and program
JP2002079162A (en) * 2000-09-05 2002-03-19 Toppan Printing Co Ltd Ultrasonic coating head and coater using the same
JP2004502261A (en) * 2000-07-05 2004-01-22 Smart Technologies Inc. Camera-based touch system
JP2006034754A (en) * 2004-07-29 2006-02-09 Nintendo Co Ltd Game apparatus using touch panel and game program
US7202860B2 (en) 2001-10-09 2007-04-10 Eit Co., Ltd. Coordinate input device working with at least display screen and desk-top surface as the pointing areas thereof
JP2008507026A (en) * 2004-07-15 2008-03-06 エヌ−トリグ リミテッド Automatic switching of dual mode digitizer
US7342574B1 (en) 1999-10-29 2008-03-11 Ricoh Company, Ltd. Method and apparatus for inputting information including coordinate data
JP2008158842A (en) * 2006-12-25 2008-07-10 Xanavi Informatics Corp Map display device
JP2008216991A (en) * 2008-01-29 2008-09-18 Fujitsu Ten Ltd Display device
US7477243B2 (en) 2002-05-31 2009-01-13 Eit Co., Ltd. Apparatus for controlling the shift of virtual space and method and program for controlling same
KR100896711B1 (en) * 2007-02-08 2009-05-11 삼성전자주식회사 Method for executing function according to tap in mobile terminal with touch screen
JP2009522669A (en) * 2005-12-30 2009-06-11 アップル インコーポレイテッド Portable electronic device with multi-touch input
JP2009217543A (en) * 2008-03-11 2009-09-24 Brother Ind Ltd Contact-input type information processing apparatus, contact-input type information processing method, and information processing program
JP2009543246A (en) * 2006-07-12 2009-12-03 エヌ−トリグ リミテッド Hovering and touch detection for digitizers
JP2009282634A (en) * 2008-05-20 2009-12-03 Canon Inc Information processor, its control method, program and storage medium
JP2010003325A (en) * 2005-10-05 2010-01-07 Sony Corp Display apparatus and display method
JP2010029711A (en) * 2009-11-10 2010-02-12 Nintendo Co Ltd Game machine and game program using touch panel
JP2010055598A (en) * 2008-07-31 2010-03-11 Sony Corp Information processing apparatus and method, and program
KR100958491B1 (en) * 2004-07-30 2010-05-17 애플 인크. Mode-based graphical user interfaces for touch sensitive input devices
KR100958490B1 (en) * 2004-07-30 2010-05-17 애플 인크. Mode-based graphical user interfaces for touch sensitive input devices
JP2010122767A (en) * 2008-11-17 2010-06-03 Zenrin Datacom Co Ltd Map display apparatus, map display method, and computer program
JP2010191942A (en) * 2009-01-20 2010-09-02 Nitto Denko Corp Display equipped with optical coordinate input device
JP2010233958A (en) * 2009-03-31 2010-10-21 Namco Bandai Games Inc Program, information storage medium, and game device
JP2010233957A (en) * 2009-03-31 2010-10-21 Namco Bandai Games Inc Program, information storage medium, and game device
JP2010250493A (en) * 2009-04-14 2010-11-04 Hitachi Displays Ltd Touch panel device
JP2011014169A (en) * 2000-01-31 2011-01-20 Canon Inc Operation apparatus, method therefor, and program therefor
JP2011023004A (en) * 2007-01-07 2011-02-03 Apple Inc Scrolling for list in touch screen display, parallel movement of document and scaling and rotation
JP2011048663A (en) * 2009-08-27 2011-03-10 Hitachi Displays Ltd Touch panel device
JP2011065519A (en) * 2009-09-18 2011-03-31 Digital Electronics Corp Touch detecting device for touch panel, and touch detecting method therefor
WO2011040483A1 (en) * 2009-09-29 2011-04-07 日本電気株式会社 Display device, control method and recording medium
JP2011076563A (en) * 2009-10-02 2011-04-14 Mitsubishi Electric Corp Terminal device of monitoring control device
JP2011141680A (en) * 2010-01-06 2011-07-21 Kyocera Corp Input device, input method and input program
JP4771951B2 (en) * 2003-05-15 2011-09-14 エフ・ポスザツト・ヒユー・エル・エル・シー Non-contact human computer interface
JP2011209822A (en) * 2010-03-29 2011-10-20 Nec Corp Information processing apparatus and program
JP2011530123A (en) * 2008-08-07 2011-12-15 ドラム,オウエン Method and apparatus for detecting multi-touch events in optical touch sensitive devices
JP2011253550A (en) * 2011-07-26 2011-12-15 Kyocera Corp Portable electronic device
JP2012048570A (en) * 2010-08-27 2012-03-08 Ricoh Co Ltd Display device, input control program, and recording medium storing the same
JP2012080950A (en) * 2010-10-07 2012-04-26 Taito Corp Game device and game system
JP2012091290A (en) * 2010-10-27 2012-05-17 Makino Milling Mach Co Ltd Method and device for measuring tool dimension
JP2012099161A (en) * 1998-01-26 2012-05-24 John G Elias Method of merging manual operation inputs
US8248382B2 (en) 2008-03-11 2012-08-21 Alps Electric Co., Ltd. Input device
JP2012168977A (en) * 2012-05-07 2012-09-06 Fujitsu Component Ltd Resistance type touch panel
JP2012167943A (en) * 2011-02-10 2012-09-06 Touch Panel Systems Kk Acoustic wave type position detector
JP2012203497A (en) * 2011-03-24 2012-10-22 Hitachi Solutions Ltd Display integrated coordinate input device and activation method of virtual keyboard function
JP2013008326A (en) * 2011-06-27 2013-01-10 Canon Inc Image processing device and control method therefor
JP2013504116A (en) * 2009-09-02 2013-02-04 FlatFrog Laboratories AB Contact surface with compensation signal profile
JP2013506222A (en) * 2009-09-29 2013-02-21 Elo Touch Solutions, Inc. Method and apparatus for simultaneous touch detection of bending wave type touch screen
JP2013041609A (en) * 2012-10-22 2013-02-28 Seiko Epson Corp Portable information apparatus, server, electronic book, program and information storage medium
JP2013041608A (en) * 2012-10-22 2013-02-28 Seiko Epson Corp Portable information apparatus, server, electronic book, program and information storage medium
JP2013041607A (en) * 2012-10-22 2013-02-28 Seiko Epson Corp Portable information apparatus, server, electronic book, program and information storage medium
JP2013050971A (en) * 2012-10-22 2013-03-14 Seiko Epson Corp Portable information apparatus, server, electronic book, program and information storage medium
JP2013061950A (en) * 2012-10-22 2013-04-04 Seiko Epson Corp Portable information apparatus, server, electronic book, program and information storage medium
JP2013073484A (en) * 2011-09-28 2013-04-22 Jvc Kenwood Corp Electronic apparatus, method for controlling electronic apparatus, and program
JP2013080513A (en) * 2012-12-28 2013-05-02 Zenrin Datacom Co Ltd Map display device
USRE44258E1 (en) 1999-11-04 2013-06-04 Sony Corporation Apparatus and method for manipulating a touch-sensitive display panel
US8461512B2 (en) 2008-08-07 2013-06-11 Rapt Ip Limited Optical control system with modulated emitters
JP2013522801A (en) * 2010-03-24 2013-06-13 ネオノード インコーポレイテッド Lens array for light-based touch screen
US8482534B2 (en) 1995-06-29 2013-07-09 Timothy R. Pryor Programmable tactile touch screen displays and man-machine interfaces for improved vehicle instrumentation and telematics
JP2013175216A (en) * 2013-04-17 2013-09-05 Casio Comput Co Ltd Electronic apparatus and program
JP2013539126A (en) * 2010-09-29 2013-10-17 シェンゼェン ビーワイディー オート アールアンドディー カンパニー リミテッド Object detection method and apparatus using the same
JP2013218204A (en) * 2012-04-11 2013-10-24 Nikon Corp Focus detection device and imaging device
JP2014032689A (en) * 2013-09-24 2014-02-20 Seiko Epson Corp Portable information apparatus, electronic book, and information storage medium
US8686964B2 (en) 2006-07-13 2014-04-01 N-Trig Ltd. User specific recognition of intended user interaction with a digitizer
US8692768B2 (en) 2009-07-10 2014-04-08 Smart Technologies Ulc Interactive input system
US8704804B2 (en) 2005-10-05 2014-04-22 Japan Display West Inc. Display apparatus and display method
JP2014142792A (en) * 2013-01-23 2014-08-07 Kddi Corp Terminal device and display program
US8902193B2 (en) 2008-05-09 2014-12-02 Smart Technologies Ulc Interactive input system and bezel therefor
JP2014225292A (en) * 2012-12-20 2014-12-04 キヤノンマーケティングジャパン株式会社 Information processing device, control method therefor, and program
JP2014237039A (en) * 2014-08-19 2014-12-18 株式会社タイトー Game apparatus and game system
JP2014241093A (en) * 2013-06-12 2014-12-25 株式会社リコー Information processing system, operation apparatus, information display method, and program
JP2015022625A (en) * 2013-07-22 2015-02-02 アルプス電気株式会社 Input device
US9024886B2 (en) 2009-04-14 2015-05-05 Japan Display Inc. Touch-panel device
US9024906B2 (en) 2007-01-03 2015-05-05 Apple Inc. Multi-touch input discrimination
USRE45559E1 (en) 1997-10-28 2015-06-09 Apple Inc. Portable computers
JP2015115038A (en) * 2013-12-16 2015-06-22 セイコーエプソン株式会社 Information processor and control method of the same
US9239673B2 (en) 1998-01-26 2016-01-19 Apple Inc. Gesturing with a multipoint sensing device
US9292111B2 (en) 1998-01-26 2016-03-22 Apple Inc. Gesturing with a multipoint sensing device
JP2016076017A (en) * 2014-10-03 2016-05-12 株式会社東芝 Graphic processing device and graphic processing program
US9348458B2 (en) 2004-07-30 2016-05-24 Apple Inc. Gestures for touch sensitive input devices
JP2016515742A (en) * 2013-04-15 2016-05-30 クアルコム,インコーポレイテッド Gesture touch geometry ID tracking
US9354811B2 (en) 2009-03-16 2016-05-31 Apple Inc. Multifunction device with integrated search and application selection
US9360993B2 (en) 2002-03-19 2016-06-07 Facebook, Inc. Display navigation
US9442607B2 (en) 2006-12-04 2016-09-13 Smart Technologies Inc. Interactive input system and method
US9448712B2 (en) 2007-01-07 2016-09-20 Apple Inc. Application programming interfaces for scrolling operations
US9513744B2 (en) 1994-08-15 2016-12-06 Apple Inc. Control systems employing novel physical controls and touch screens
JP2018063738A (en) * 2018-01-29 2018-04-19 株式会社東芝 Graphic processing device and graphic processing program
US10042418B2 (en) 2004-07-30 2018-08-07 Apple Inc. Proximity detector in handheld device
US10379728B2 (en) 2008-03-04 2019-08-13 Apple Inc. Methods and graphical user interfaces for conducting searches on a portable multifunction device

Cited By (149)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9513744B2 (en) 1994-08-15 2016-12-06 Apple Inc. Control systems employing novel physical controls and touch screens
US9758042B2 (en) 1995-06-29 2017-09-12 Apple Inc. Programmable tactile touch screen displays and man-machine interfaces for improved vehicle instrumentation and telematics
US8482534B2 (en) 1995-06-29 2013-07-09 Timothy R. Pryor Programmable tactile touch screen displays and man-machine interfaces for improved vehicle instrumentation and telematics
USRE46548E1 (en) 1997-10-28 2017-09-12 Apple Inc. Portable computers
USRE45559E1 (en) 1997-10-28 2015-06-09 Apple Inc. Portable computers
US9098142B2 (en) 1998-01-26 2015-08-04 Apple Inc. Sensor arrangement for use with a touch sensor that identifies hand parts
US9001068B2 (en) 1998-01-26 2015-04-07 Apple Inc. Touch sensor contact information
US8902175B2 (en) 1998-01-26 2014-12-02 Apple Inc. Contact tracking and identification module for touch sensing
US9626032B2 (en) 1998-01-26 2017-04-18 Apple Inc. Sensor arrangement for use with a touch sensor
US9329717B2 (en) 1998-01-26 2016-05-03 Apple Inc. Touch sensing with mobile sensors
US9552100B2 (en) 1998-01-26 2017-01-24 Apple Inc. Touch sensing with mobile sensors
US8866752B2 (en) 1998-01-26 2014-10-21 Apple Inc. Contact tracking and identification module for touch sensing
JP2015167028A (en) * 1998-01-26 2015-09-24 ウェスターマン,ウェイン Method and apparatus for integrating manual input
US9239673B2 (en) 1998-01-26 2016-01-19 Apple Inc. Gesturing with a multipoint sensing device
JP2012146345A (en) * 1998-01-26 2012-08-02 John G Elias Multi-touch surface device
JP2012099161A (en) * 1998-01-26 2012-05-24 John G Elias Method of merging manual operation inputs
US9292111B2 (en) 1998-01-26 2016-03-22 Apple Inc. Gesturing with a multipoint sensing device
US9298310B2 (en) 1998-01-26 2016-03-29 Apple Inc. Touch sensor contact information
US9448658B2 (en) 1998-01-26 2016-09-20 Apple Inc. Resting contacts
US9342180B2 (en) 1998-01-26 2016-05-17 Apple Inc. Contact tracking and identification module for touch sensing
US9804701B2 (en) 1998-01-26 2017-10-31 Apple Inc. Contact tracking and identification module for touch sensing
US9383855B2 (en) 1998-01-26 2016-07-05 Apple Inc. Identifying contacts on a touch surface
US9348452B2 (en) 1998-01-26 2016-05-24 Apple Inc. Writing using a touch sensor
WO2001027867A1 (en) * 1999-10-13 2001-04-19 Elo Touchsystems, Inc. Contaminant processing system for an acoustic touchscreen
US6366277B1 (en) 1999-10-13 2002-04-02 Elo Touchsystems, Inc. Contaminant processing system for an acoustic touchscreen
US7342574B1 (en) 1999-10-29 2008-03-11 Ricoh Company, Ltd. Method and apparatus for inputting information including coordinate data
USRE44258E1 (en) 1999-11-04 2013-06-04 Sony Corporation Apparatus and method for manipulating a touch-sensitive display panel
US7986308B2 (en) 2000-01-31 2011-07-26 Canon Kabushiki Kaisha Method and apparatus for detecting and interpreting path of designated position
JP2001290585A (en) * 2000-01-31 2001-10-19 Canon Inc Position information processor, position information processing method and program, and operation device and its method and program
JP2011014170A (en) * 2000-01-31 2011-01-20 Canon Inc Operation apparatus, method therefor, and program therefor
JP2011014169A (en) * 2000-01-31 2011-01-20 Canon Inc Operation apparatus, method therefor, and program therefor
JP2013050997A (en) * 2000-01-31 2013-03-14 Canon Inc Information processing device, control method therefor, and program therefor
JP2001228971A (en) 2000-02-15 2001-08-24 Newcom:Kk Touch panel system to be operated at plural indicating positions
JP2001265475A (en) * 2000-03-15 2001-09-28 Ricoh Co Ltd Menu display controller and information processor and electronic blackboard system and method for controlling menu display system and method for controlling information processing system and computer readable recording medium with program for allowing the same method to be performed by computer recorded
JP2004502261A (en) * 2000-07-05 2004-01-22 Smart Technologies Inc. Camera-based touch system
JP4617549B2 (en) * 2000-09-05 2011-01-26 凸版印刷株式会社 Ultrasonic coating head and coating apparatus using the same
JP2002079162A (en) * 2000-09-05 2002-03-19 Toppan Printing Co Ltd Ultrasonic coating head and coater using the same
US7414617B2 (en) 2001-10-09 2008-08-19 Eit Co., Ltd. Coordinate input device working with at least display screen and desk-top surface as the pointing areas thereof
US7202860B2 (en) 2001-10-09 2007-04-10 Eit Co., Ltd. Coordinate input device working with at least display screen and desk-top surface as the pointing areas thereof
US9606668B2 (en) 2002-02-07 2017-03-28 Apple Inc. Mode-based graphical user interfaces for touch sensitive input devices
US9851864B2 (en) 2002-03-19 2017-12-26 Facebook, Inc. Constraining display in display navigation
US9678621B2 (en) 2002-03-19 2017-06-13 Facebook, Inc. Constraining display motion in display navigation
US10365785B2 (en) 2002-03-19 2019-07-30 Facebook, Inc. Constraining display motion in display navigation
US9753606B2 (en) 2002-03-19 2017-09-05 Facebook, Inc. Animated display navigation
US9360993B2 (en) 2002-03-19 2016-06-07 Facebook, Inc. Display navigation
US10055090B2 (en) 2002-03-19 2018-08-21 Facebook, Inc. Constraining display motion in display navigation
US9886163B2 (en) 2002-03-19 2018-02-06 Facebook, Inc. Constrained display navigation
US9626073B2 (en) 2002-03-19 2017-04-18 Facebook, Inc. Display navigation
US7477243B2 (en) 2002-05-31 2009-01-13 Eit Co., Ltd. Apparatus for controlling the shift of virtual space and method and program for controlling same
JP4771951B2 (en) * 2003-05-15 2011-09-14 F. Poszat Hu, L.L.C. Non-contact human-computer interface
JP2008507026A (en) * 2004-07-15 2008-03-06 N-Trig Ltd. Automatic switching of dual mode digitizer
JP2006034754A (en) * 2004-07-29 2006-02-09 Nintendo Co Ltd Game apparatus using touch panel and game program
US7658675B2 (en) 2004-07-29 2010-02-09 Nintendo Co., Ltd. Game apparatus utilizing touch panel and storage medium storing game program
KR100958491B1 (en) * 2004-07-30 2010-05-17 Apple Inc. Mode-based graphical user interfaces for touch sensitive input devices
US9348458B2 (en) 2004-07-30 2016-05-24 Apple Inc. Gestures for touch sensitive input devices
KR100958490B1 (en) * 2004-07-30 2010-05-17 Apple Inc. Mode-based graphical user interfaces for touch sensitive input devices
KR100984596B1 (en) * 2004-07-30 2010-09-30 Apple Inc. Gestures for touch sensitive input devices
US10042418B2 (en) 2004-07-30 2018-08-07 Apple Inc. Proximity detector in handheld device
JP2010003325A (en) * 2005-10-05 2010-01-07 Sony Corp Display apparatus and display method
US8704804B2 (en) 2005-10-05 2014-04-22 Japan Display West Inc. Display apparatus and display method
JP2009522669A (en) * 2005-12-30 2009-06-11 Apple Inc. Portable electronic device with multi-touch input
US9069417B2 (en) 2006-07-12 2015-06-30 N-Trig Ltd. Hover and touch detection for digitizer
US9535598B2 (en) 2006-07-12 2017-01-03 Microsoft Technology Licensing, Llc Hover and touch detection for a digitizer
US10031621B2 (en) 2006-07-12 2018-07-24 Microsoft Technology Licensing, Llc Hover and touch detection for a digitizer
JP2009543246A (en) * 2006-07-12 2009-12-03 N-Trig Ltd. Hovering and touch detection for digitizers
US8686964B2 (en) 2006-07-13 2014-04-01 N-Trig Ltd. User specific recognition of intended user interaction with a digitizer
US9442607B2 (en) 2006-12-04 2016-09-13 Smart Technologies Inc. Interactive input system and method
JP2008158842A (en) * 2006-12-25 2008-07-10 Xanavi Informatics Corp Map display device
US9778807B2 (en) 2007-01-03 2017-10-03 Apple Inc. Multi-touch input discrimination
US9256322B2 (en) 2007-01-03 2016-02-09 Apple Inc. Multi-touch input discrimination
US9024906B2 (en) 2007-01-03 2015-05-05 Apple Inc. Multi-touch input discrimination
US9619132B2 (en) 2007-01-07 2017-04-11 Apple Inc. Device, method and graphical user interface for zooming in on a touch-screen display
US9448712B2 (en) 2007-01-07 2016-09-20 Apple Inc. Application programming interfaces for scrolling operations
US9760272B2 (en) 2007-01-07 2017-09-12 Apple Inc. Application programming interfaces for scrolling operations
US10606470B2 (en) 2007-01-07 2020-03-31 Apple, Inc. List scrolling and document translation, scaling, and rotation on a touch-screen display
US9052814B2 (en) 2007-01-07 2015-06-09 Apple Inc. Device, method, and graphical user interface for zooming in on a touch-screen display
JP2011023004A (en) * 2007-01-07 2011-02-03 Apple Inc List scrolling and document translation, scaling, and rotation on a touch-screen display
US9641749B2 (en) 2007-02-08 2017-05-02 Samsung Electronics Co., Ltd. Onscreen function execution method for mobile terminal having a touchscreen
US9395913B2 (en) 2007-02-08 2016-07-19 Samsung Electronics Co., Ltd. Onscreen function execution method for mobile terminal having a touchscreen
KR100896711B1 (en) * 2007-02-08 2009-05-11 Samsung Electronics Co., Ltd. Method for executing function according to tap in mobile terminal with touch screen
US9041681B2 (en) 2007-02-08 2015-05-26 Samsung Electronics Co., Ltd. Onscreen function execution method for mobile terminal having a touchscreen
JP2008216991A (en) * 2008-01-29 2008-09-18 Fujitsu Ten Ltd Display device
US10379728B2 (en) 2008-03-04 2019-08-13 Apple Inc. Methods and graphical user interfaces for conducting searches on a portable multifunction device
JP2009217543A (en) * 2008-03-11 2009-09-24 Brother Ind Ltd Contact-input type information processing apparatus, contact-input type information processing method, and information processing program
JP4670879B2 (en) * 2008-03-11 2011-04-13 Brother Industries, Ltd. Contact input type information processing apparatus, contact input type information processing method, and information processing program
US8248382B2 (en) 2008-03-11 2012-08-21 Alps Electric Co., Ltd. Input device
US8902193B2 (en) 2008-05-09 2014-12-02 Smart Technologies Ulc Interactive input system and bezel therefor
US8917245B2 (en) 2008-05-20 2014-12-23 Canon Kabushiki Kaisha Information processing apparatus and control method thereof
JP2009282634A (en) * 2008-05-20 2009-12-03 Canon Inc Information processor, its control method, program and storage medium
JP2010055598A (en) * 2008-07-31 2010-03-11 Sony Corp Information processing apparatus and method, and program
US9335864B2 (en) 2008-08-07 2016-05-10 Rapt Ip Limited Method and apparatus for detecting a multitouch event in an optical touch-sensitive device
US8723840B2 (en) 2008-08-07 2014-05-13 Rapt Ip Limited Method and apparatus for detecting a multitouch event in an optical touch-sensitive device
US8723839B2 (en) 2008-08-07 2014-05-13 Rapt Ip Limited Method and apparatus for detecting a multitouch event in an optical touch-sensitive device
US8461512B2 (en) 2008-08-07 2013-06-11 Rapt Ip Limited Optical control system with modulated emitters
JP2011530123A (en) * 2008-08-07 2011-12-15 Drumm, Owen Method and apparatus for detecting multi-touch events in optical touch sensitive devices
US9086762B2 (en) 2008-08-07 2015-07-21 Rapt Ip Limited Optical control system with modulated emitters
JP2010122767A (en) * 2008-11-17 2010-06-03 Zenrin Datacom Co Ltd Map display apparatus, map display method, and computer program
JP2010191942A (en) * 2009-01-20 2010-09-02 Nitto Denko Corp Display equipped with optical coordinate input device
US10067991B2 (en) 2009-03-16 2018-09-04 Apple Inc. Multifunction device with integrated search and application selection
US10042513B2 (en) 2009-03-16 2018-08-07 Apple Inc. Multifunction device with integrated search and application selection
US9354811B2 (en) 2009-03-16 2016-05-31 Apple Inc. Multifunction device with integrated search and application selection
JP2010233957A (en) * 2009-03-31 2010-10-21 Namco Bandai Games Inc Program, information storage medium, and game device
JP2010233958A (en) * 2009-03-31 2010-10-21 Namco Bandai Games Inc Program, information storage medium, and game device
JP2010250493A (en) * 2009-04-14 2010-11-04 Hitachi Displays Ltd Touch panel device
US9024886B2 (en) 2009-04-14 2015-05-05 Japan Display Inc. Touch-panel device
US8692768B2 (en) 2009-07-10 2014-04-08 Smart Technologies Ulc Interactive input system
JP2011048663A (en) * 2009-08-27 2011-03-10 Hitachi Displays Ltd Touch panel device
JP2013504116A (en) * 2009-09-02 2013-02-04 FlatFrog Laboratories AB Contact surface with compensation signal profile
JP2011065519A (en) * 2009-09-18 2011-03-31 Digital Electronics Corp Touch detecting device for touch panel, and touch detecting method therefor
US9696856B2 (en) 2009-09-29 2017-07-04 Elo Touch Solutions, Inc. Method and apparatus for detecting simultaneous touch events on a bending-wave touchscreen
WO2011040483A1 (en) * 2009-09-29 2011-04-07 NEC Corporation Display device, control method and recording medium
JP5686735B2 (en) * 2009-09-29 2015-03-18 Lenovo Innovations Limited (Hong Kong) Display device, control method, and program
JP2013506222A (en) * 2009-09-29 2013-02-21 Elo Touch Solutions, Inc. Method and apparatus for detecting simultaneous touch events on a bending-wave touch screen
US9063651B2 (en) 2009-09-29 2015-06-23 Lenovo Innovations Limited Display device, control method and recording medium
JP2011076563A (en) * 2009-10-02 2011-04-14 Mitsubishi Electric Corp Terminal device of monitoring control device
JP2010029711A (en) * 2009-11-10 2010-02-12 Nintendo Co Ltd Game machine and game program using touch panel
JP2011141680A (en) * 2010-01-06 2011-07-21 Kyocera Corp Input device, input method and input program
JP2013522801A (en) * 2010-03-24 2013-06-13 Neonode Inc. Lens array for light-based touch screen
JP2011209822A (en) * 2010-03-29 2011-10-20 Nec Corp Information processing apparatus and program
JP2012048570A (en) * 2010-08-27 2012-03-08 Ricoh Co Ltd Display device, input control program, and recording medium storing the same
JP2013539126A (en) * 2010-09-29 2013-10-17 Shenzhen BYD Auto R&D Company Limited Object detection method and apparatus using the same
JP2012080950A (en) * 2010-10-07 2012-04-26 Taito Corp Game device and game system
JP2012091290A (en) * 2010-10-27 2012-05-17 Makino Milling Mach Co Ltd Method and device for measuring tool dimension
US9188437B2 (en) 2010-10-27 2015-11-17 Makino Milling Machine Co., Ltd. Method of measurement and apparatus for measurement of tool dimensions
JP2012167943A (en) * 2011-02-10 2012-09-06 Touch Panel Systems Kk Acoustic wave type position detector
JP2012203497A (en) * 2011-03-24 2012-10-22 Hitachi Solutions Ltd Display integrated coordinate input device and activation method of virtual keyboard function
JP2013008326A (en) * 2011-06-27 2013-01-10 Canon Inc Image processing device and control method therefor
JP2011253550A (en) * 2011-07-26 2011-12-15 Kyocera Corp Portable electronic device
US9223487B2 (en) 2011-09-28 2015-12-29 JVC Kenwood Corporation Electronic apparatus, method of controlling the same, and related computer program
JP2013073484A (en) * 2011-09-28 2013-04-22 Jvc Kenwood Corp Electronic apparatus, method for controlling electronic apparatus, and program
JP2013218204A (en) * 2012-04-11 2013-10-24 Nikon Corp Focus detection device and imaging device
JP2012168977A (en) * 2012-05-07 2012-09-06 Fujitsu Component Ltd Resistance type touch panel
JP2013041609A (en) * 2012-10-22 2013-02-28 Seiko Epson Corp Portable information apparatus, server, electronic book, program and information storage medium
JP2013061950A (en) * 2012-10-22 2013-04-04 Seiko Epson Corp Portable information apparatus, server, electronic book, program and information storage medium
JP2013050971A (en) * 2012-10-22 2013-03-14 Seiko Epson Corp Portable information apparatus, server, electronic book, program and information storage medium
JP2013041608A (en) * 2012-10-22 2013-02-28 Seiko Epson Corp Portable information apparatus, server, electronic book, program and information storage medium
JP2013041607A (en) * 2012-10-22 2013-02-28 Seiko Epson Corp Portable information apparatus, server, electronic book, program and information storage medium
JP2014225292A (en) * 2012-12-20 2014-12-04 キヤノンマーケティングジャパン株式会社 Information processing device, control method therefor, and program
JP2013080513A (en) * 2012-12-28 2013-05-02 Zenrin Datacom Co Ltd Map display device
JP2014142792A (en) * 2013-01-23 2014-08-07 Kddi Corp Terminal device and display program
JP2016515742A (en) * 2013-04-15 2016-05-30 クアルコム,インコーポレイテッド Gesture touch geometry ID tracking
JP2013175216A (en) * 2013-04-17 2013-09-05 Casio Comput Co Ltd Electronic apparatus and program
JP2014241093A (en) * 2013-06-12 2014-12-25 株式会社リコー Information processing system, operation apparatus, information display method, and program
JP2015022625A (en) * 2013-07-22 2015-02-02 アルプス電気株式会社 Input device
JP2014032689A (en) * 2013-09-24 2014-02-20 Seiko Epson Corp Portable information apparatus, electronic book, and information storage medium
JP2015115038A (en) * 2013-12-16 2015-06-22 セイコーエプソン株式会社 Information processor and control method of the same
JP2014237039A (en) * 2014-08-19 2014-12-18 株式会社タイトー Game apparatus and game system
JP2016076017A (en) * 2014-10-03 2016-05-12 株式会社東芝 Graphic processing device and graphic processing program
JP2018063738A (en) * 2018-01-29 2018-04-19 株式会社東芝 Graphic processing device and graphic processing program

Similar Documents

Publication Title
US9335890B2 (en) Method and apparatus for user interface of input devices
US9939911B2 (en) Computer interface for remotely controlled objects and wearable articles with absolute pose detection component
US20160179188A1 (en) Hand tracker for device with display
KR101953165B1 (en) Gesture recognition devices and methods
JP5330473B2 (en) Method and system for enabling interaction between a virtual environment and a physical object
US20180067615A1 (en) Method and apparatus for data entry input
US7737959B2 (en) Position detection system using laser speckle
JP3847641B2 (en) Information processing apparatus, information processing program, computer-readable recording medium storing information processing program, and information processing method
EP1936478B1 (en) Position detecting device
US6525717B1 (en) Input device that analyzes acoustical signatures
US8022942B2 (en) Dynamic projected user interface
US7834850B2 (en) Method and system for object control
US6424334B1 (en) Computer data entry and manipulation apparatus and method
US7239302B2 (en) Pointing device and scanner, robot, mobile communication device and electronic dictionary using the same
CN101548547B (en) Object detection using video input combined with tilt angle information
JP3950837B2 (en) Projector, electronic blackboard system using projector, and indication position acquisition method
US8854433B1 (en) Method and system enabling natural user interface gestures with an electronic system
KR100910024B1 (en) Camera type touch-screen utilizing linear infrared emitter
US5852434A (en) Absolute optical position determination
KR100734894B1 (en) Quasi-three-dimensional method and apparatus to detect and localize interaction of user-object and virtual transfer device
CN101937289B (en) Optical touch device
US7038659B2 (en) Symbol encoding apparatus and method
US8063882B2 (en) Generating audio signals based on input device position
US7978184B2 (en) Interactive window display
DE69913371T2 (en) Input device with sensors