WO2003104967A1 - Information processing method for specifying an arbitrary point in three-dimensional space - Google Patents
Information processing method for specifying an arbitrary point in three-dimensional space
- Publication number
- WO2003104967A1 (PCT/JP2003/007334)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information processing
- display screen
- user
- computer
- value
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03543—Mice or pucks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03547—Touch pads, in which fingers can move on a surface
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0414—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using force sensing means to determine a position
- G06F3/04144—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using force sensing means to determine a position using an array of force sensing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/014—Force feedback applied to GUI
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04801—Cursor retrieval aid, i.e. visual aspect modification, blinking, colour changes, enlargement or other visual cues, for helping the user to find the cursor in graphical user interfaces
Definitions
- The present invention relates to an information processing method suitable for, for example, a process of designating an arbitrary point in a three-dimensional space displayed on a two-dimensional display screen.
- The present invention also relates to a computer-readable recording medium on which such an information processing program is recorded, to the information processing program itself, and to an information processing apparatus. Background art
- Conventionally, when specifying an arbitrary point in an image displayed on a two-dimensional display screen such as a liquid crystal display or a CRT (Cathode Ray Tube), the user has pointed to the desired point with an input device such as a mouse, a tablet, or a touch panel, or with a finger.
- However, conventional systems can specify only the position of a point in the in-plane direction of the two-dimensional display screen; they cannot specify an arbitrary point in a three-dimensional space displayed on that screen.
- For this reason, the position of an arbitrary point in three-dimensional space has conventionally been specified either by combining an input device that specifies the in-plane position (x, y) of a point on the display screen with another input device that specifies the position of the point in the direction perpendicular to the display screen (the depth direction, z), or by directly entering the three-dimensional coordinates (x, y, z) of the point.
- The present invention has been made to solve the above problems, and its object is to provide an information processing method, a computer-readable recording medium on which an information processing program is recorded, an information processing program, and an information processing apparatus that allow the user to specify an arbitrary point in a three-dimensional space displayed on a two-dimensional display screen easily, with a natural operation feeling, and with high accuracy. Disclosure of the invention
- A first feature of the present invention is that a three-dimensional space is displayed on a two-dimensional display screen, the coordinate value and the depth value of a point on the two-dimensional display screen specified by a user are detected, and a position in the three-dimensional space is specified according to the coordinate value and the magnitude of the depth value.
- That is, in the present invention, the position in the three-dimensional space intended by the user is identified from the position of the point that the user specifies on the two-dimensional display screen and the depth value at that point. With such a configuration, the user can specify a point in the three-dimensional space easily and accurately, with a natural operation feeling close to motion in the real world.
- A second feature of the present invention is that at least one object is displayed on a two-dimensional display screen, the coordinate value and the depth value of a point on the two-dimensional display screen specified by a user are detected, and processing corresponding to the magnitude of the depth value is performed on the object specified by the coordinate value. That is, in the present invention, a predetermined operation is performed on the object at the point designated by the user, according to the magnitude of the depth value. With such a configuration, even a user unfamiliar with operating the device can easily operate an object displayed on the two-dimensional display screen with a natural operation feeling. Brief description of drawings
- FIG. 1 is a block diagram illustrating a configuration of an information processing apparatus according to a first embodiment of the present invention.
- FIG. 2 is a schematic diagram illustrating a configuration of the operation input unit according to the first embodiment of the present invention.
- FIG. 3 is a schematic diagram showing an application example of the operation input unit shown in FIG. 2.
- FIG. 4 is a schematic diagram showing a connection relationship between the pressure-sensitive element shown in FIG. 2 and electric wiring.
- FIG. 5 is a flowchart showing a method of specifying a three-dimensional coordinate value according to the embodiment of the present invention.
- FIG. 6 is a schematic diagram for explaining the method of specifying three-dimensional coordinate values shown in FIG. 5.
- FIG. 7 is a schematic diagram for explaining an application example of the method of specifying three-dimensional coordinate values shown in FIG. 5.
- FIG. 8 is a schematic diagram for explaining a usage example of the method of specifying three-dimensional coordinate values shown in FIG. 5.
- FIG. 9 is a schematic diagram for explaining a usage example of the method of specifying three-dimensional coordinate values shown in FIG. 5.
- FIG. 10 is a schematic diagram for explaining a usage example of the method of specifying three-dimensional coordinate values shown in FIG. 5.
- FIG. 11 is a schematic diagram for explaining a usage example of the method of specifying three-dimensional coordinate values shown in FIG. 5.
- FIG. 12 is a schematic diagram for explaining a usage example of the method of specifying three-dimensional coordinate values shown in FIG. 5.
- FIG. 13 is a flowchart showing a method of operating an icon according to an embodiment of the present invention.
- FIG. 14 is a schematic diagram illustrating a configuration of an operation input unit according to the second embodiment of the present invention.
- FIG. 15 is a schematic diagram showing an application example of the operation input unit shown in FIG. 14.
- FIG. 16 is a schematic diagram showing an application example of the operation input unit shown in FIG. 14.
- FIG. 17 is a schematic diagram showing an application example of the operation input unit shown in FIG. 14.
- FIG. 18 is a flowchart illustrating the operation of the information processing apparatus according to the second embodiment of the present invention.
- FIG. 19 is a schematic diagram showing an application example of the operation input unit according to the embodiment of the present invention.
- FIG. 20 is a schematic diagram showing an application example of the operation input unit according to the embodiment of the present invention.
- FIG. 21 is a schematic diagram illustrating an application example of the operation input unit according to the embodiment of the present invention.
- The information processing apparatus according to the present invention allows a user to specify and operate an arbitrary point in a three-dimensional space displayed on a two-dimensional display screen, and can be applied to processing that causes the apparatus to execute a predetermined process in response to such an operation.
- Hereinafter, the configuration and operation of the information processing apparatus according to the first and second embodiments of the present invention will be described in detail.
- As shown in FIG. 1, the information processing apparatus 1 includes a CPU 2, a RAM 3, a ROM 4, a display unit 5, and an operation input unit 6 as its main components, and these components are electrically connected to one another via a bus 7.
- The CPU 2 is constituted by a general-purpose processor and controls the operation of the information processing apparatus in accordance with computer programs stored in the ROM 4.
- The RAM 3 is composed of volatile semiconductor memory and provides a work area that temporarily stores the computer programs that realize the processing executed by the CPU 2 and the associated processing data.
- The ROM 4 is composed of non-volatile semiconductor memory and has a program section 9 that stores computer programs such as a startup program (not shown) for the information processing apparatus and an interface program 8 (described in detail later), and a processing data section 10 that stores the processing data necessary for executing those computer programs. Note that some or all of the computer programs and processing data may be received via an electronic network.
- The display unit 5 is constituted by a display output device such as a liquid crystal display or a CRT, and displays various information, such as a three-dimensional object, on a two-dimensional screen in accordance with instructions from the CPU 2. A flexible display device formed on a flexible substrate such as a plastic film may also be used as the display unit 5.
- The operation input unit 6 is composed of devices capable of detecting the coordinate value (x, y) and the pressing value P of an arbitrary point on the two-dimensional screen that the user specifies by pressing with a finger or a predetermined input device.
- As shown in FIG. 2, the operation input unit 6 has a touch panel section 11 built into or bonded to the display unit 5, pressure-sensitive elements 12 installed on the back of the touch panel section 11, and a back plate 14 that supports the pressure-sensitive elements 12 from behind.
- The touch panel section 11 detects the coordinate value (x, y) of the point on the two-dimensional screen pressed by the user, using an existing detection method such as an infrared, pressure-sensitive, or electromagnetic method.
- The pressure-sensitive elements 12 detect the pressing value P at the point on the two-dimensional screen pressed by the user and output a pressure detection signal indicating the magnitude of the pressing value P to the CPU 2.
- The back plate 14 is fixed to the apparatus main body 13 in the form shown in the drawings.
- In the operation input unit 6, the coordinate value detection mechanism (the touch panel section 11) is arranged on the front of the display unit 5 and the pressing value detection mechanism (the pressure-sensitive elements 12) on its back. The thickness of the layers in front of the display unit 5 is therefore smaller than when both mechanisms are placed together on the front, so that when a user designates a point while viewing the display unit 5 from an oblique direction, the shift that arises between the displayed point and the pressed point can be reduced.
- Furthermore, when pressure-sensitive elements are arranged in front of the display unit 5, thin elements must usually be used to keep the thickness of the display unit 5 down. In the above operation input unit, however, the pressure-sensitive elements 12 are arranged behind the display unit 5, so the degree of design freedom is increased: for example, the pressure-sensitive elements 12 can be given a certain thickness to widen the range of detectable pressing values P, or the operation input unit 6 can be given a degree of elasticity. In addition, inexpensive pressure-sensitive elements can be used, which reduces the manufacturing cost of the operation input unit.
- Moreover, when an elastic pressure-sensitive element is placed in front of the display unit 5, the surface of the display unit 5 usually becomes soft and the user may find the operation uncomfortable. With the above configuration, the surface of the display unit 5 remains only moderately soft, so the user feels no discomfort in operation.
- Further, since the back plate 14 is fixed to the main body 13 while the touch panel section 11 itself is not, the pressing value P at the point pressed by the user can be detected accurately.
- The pressure-sensitive elements 12 may be connected in series via electric wiring 15, as shown in FIG. 4(a), so that a single pressing value P is detected for the entire touch panel section 11. Alternatively, as shown in FIG. 4(b), the pressure-sensitive elements 12 may be wired into desired blocks via the electric wiring 15 so that a pressing value P is detected for each block. Further, as shown in FIG. 4(c), electric wiring 15 may be connected to each pressure-sensitive element 12 individually so that the pressing value P of each element is detected.
- When the same pressing force produces different detected values at different points, it is desirable to correct the two values so that they coincide, either by hardware processing with an electronic circuit or by software processing. Software-based correction can easily accommodate changes in the correction values caused by aging and differences in average pressing force due to individual and age differences between users, so the correction is preferably performed in software.
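The patent leaves the correction algorithm unspecified; the following Python sketch illustrates one software-based approach under assumed conditions, with per-position gain factors and a per-user scale (all names and values hypothetical) applied to the raw sensor reading.

```python
class PressureCorrector:
    """Software correction of raw pressing values (a sketch, not the
    patented method): per-position gains make equal forces read equally,
    and a user scale absorbs individual and age differences."""

    def __init__(self):
        self.gain = {}         # (block_x, block_y) -> gain factor
        self.user_scale = 1.0  # scales an individual's average pressing force

    def calibrate_position(self, block, reference_p, measured_p):
        # Apply a known reference force at each block, e.g. at the factory
        # or during periodic re-calibration, and store the correction gain.
        self.gain[block] = reference_p / measured_p

    def calibrate_user(self, average_p, target_average_p):
        # Re-estimated over time, so drift from sensor aging or a new
        # user's touch strength is absorbed automatically.
        self.user_scale = target_average_p / average_p

    def correct(self, block, raw_p):
        # Corrected pressing value P used by the rest of the system.
        return raw_p * self.gain.get(block, 1.0) * self.user_scale
```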
- With the configuration described above, the information processing apparatus 1 according to the first embodiment allows the user to specify and select an arbitrary three-dimensional position in the three-dimensional space displayed on the display unit 5.
- Hereinafter, the operation of the information processing apparatus 1 when the user specifies and selects an arbitrary three-dimensional position in the three-dimensional space will be described in detail with reference to the flowchart shown in FIG. 5.
- The flowchart shown in FIG. 5 starts when the user touches the two-dimensional display screen with a finger or a predetermined input device via the touch panel section 11, and the CPU 2 performs the following processing in accordance with the interface program 8.
- In the processing of step S1, the CPU 2 detects the coordinate value (x, y) of the point 16 (hereinafter referred to as the designated point 16) on the two-dimensional display screen designated by the user via the touch panel section 11. This completes the processing of step S1, and the designation process proceeds to step S2.
- In the processing of step S2, the CPU 2 controls the display unit 5 to display a cursor 17 at the detected coordinate position (x, y). This completes the processing of step S2, and the designation process proceeds to step S3.
- In the processing of step S3, the CPU 2 detects the pressing value P at the designated point 16 with reference to the pressure detection signal output from the pressure-sensitive elements 12. This completes the processing of step S3, and the designation process proceeds to step S4.
- In the processing of step S4, the CPU 2 identifies, as shown in FIG. 6, a straight line 19 extending from the designated point 16 in the depth (z) direction of the drawing area 20 that forms the three-dimensional space, parallel to the user's line of sight 18. The CPU 2 then moves the cursor 17 along the straight line 19 in the depth (z) direction of the drawing area 20 by a distance corresponding to the magnitude of the pressing value P, and treats the position at which the cursor 17 stops as the three-dimensional position in the drawing area 20 specified by the user, for example by selecting the object at that position. This completes the processing of step S4 and the series of designation processes.
- Although the CPU 2 here defines the straight line 19 parallel to the user's line of sight 18, it may instead, as shown in FIG. 7, define a straight line 21 perpendicular to the two-dimensional display screen and move the cursor 17 along the straight line 21. In this case, it becomes easier for the user to understand how the cursor 17 moves in the depth direction of the drawing area 20 than when the straight line 19 parallel to the line of sight 18 is used.
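The patent states only that the cursor moves along the straight line by a distance corresponding to the pressing value P; the Python sketch below shows one plausible linear reading of step S4. The function names, the clamped linear mapping, and the numeric values in the example are assumptions for illustration.

```python
def cursor_position_3d(touch_xy, pressure, direction, p_max, z_max):
    """Map the designated point and pressing value P to a 3D cursor position.

    touch_xy  : (x, y) detected by the touch panel section (step S1)
    pressure  : pressing value P from the pressure-sensitive elements (step S3)
    direction : unit vector along which the cursor moves -- parallel to the
                user's line of sight (straight line 19) or perpendicular to
                the screen (straight line 21)
    p_max     : pressing value mapped to the deepest point of the drawing area
    z_max     : depth of the drawing area 20
    """
    x, y = touch_xy
    depth = min(pressure / p_max, 1.0) * z_max  # press harder to go deeper
    dx, dy, dz = direction
    return (x + depth * dx, y + depth * dy, depth * dz)

# A light press moves the cursor a short way along the screen normal.
print(cursor_position_3d((120.0, 80.0), pressure=1.5,
                         direction=(0.0, 0.0, 1.0), p_max=5.0, z_max=10.0))
```

With direction = (0, 0, 1) this reduces to (x, y, f(P)), the perpendicular variant of FIG. 7; an oblique direction vector reproduces the line-of-sight variant of FIG. 6.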
- It is desirable for the CPU 2 to control the display unit 5 to display the straight line 19 together with the cursor 17 so that the user can recognize the direction in which the cursor 17 moves.
- It is also desirable for the CPU 2 to perform drawing processes that make it easy for the user to grasp how the cursor 17 moves in the depth direction of the drawing area 20, for example changing the size, color, brightness, and the like of the cursor 17 according to its position in the depth (z) direction, displaying the cursor 17 interfering with an object in the three-dimensional space of the drawing area 20, displaying grid lines, or rendering the drawing area 20 stereoscopically.
- Similarly, the display screen may be made to vibrate or a sound may be generated when the cursor 17 interferes with an object in the drawing area 20, so that the user can easily grasp how the cursor 17 moves in the depth direction of the drawing area 20.
- As described above, the information processing apparatus 1 detects the pressing value P at the point on the two-dimensional display screen specified by the user with the pressure-sensitive elements 12 and recognizes its magnitude as the coordinate value (z) in the depth direction of the drawing area 20.
- Therefore, simply by designating a point on the two-dimensional display screen via the touch panel section 11, the user can easily specify the three-dimensional position of an arbitrary point in the drawing area 20 that constitutes the three-dimensional space.
- Since this processing operation resembles an actual pointing operation in the real world, even a user unfamiliar with operating the device can easily specify an arbitrary three-dimensional position in the drawing area 20 without education or training.
- For example, suppose that an object 22 composed of three-dimensionally arranged object elements, as shown in FIG. 8(a), is displayed on the display unit 5 and that the user can specify a designated point within the object. If the object elements selected at the designated point can then be moved in accordance with the movement 23 of the designated point, as shown in FIG. 9, the user can intuitively change the number of object elements to be moved by adjusting the magnitude of the pressing value P, as shown in FIG. 10, and can easily move the desired object elements.
- Similarly, when, as shown in FIGS. 11(a) to 11(c), two points 23a and 23b on the touch panel section 11 are designated and moved so as to pick up an object 25 placed on a texture 24, deforming the texture 24 according to the magnitude of the pressing value P, as shown in FIG. 11(d) and FIG. 12, can give the user the feeling of actually picking up the object 25 in the real world.
- Further, according to the information processing apparatus 1, the user need not perform the single-click and double-click operations employed in general computer systems; the user can operate the icons representing folders, files, and application programs displayed on the display unit 5 with a natural feeling similar to real-world operations and instruct the apparatus to execute the processing corresponding to each operation.
- Hereinafter, the processing operation of the information processing apparatus 1 when the user operates an icon will be described in detail with reference to the flowchart shown in FIG. 13.
- The flowchart shown in FIG. 13 starts when the CPU 2 detects a change in the coordinate value (x, y) or the pressing value P of the designated point on the touch panel section 11 pressed by the user (event detection), and the CPU 2 executes the following processing in accordance with the interface program 8.
- Note that, in accordance with the interface program 8, the CPU 2 stores in the RAM 3 information on the coordinate value (x, y) and the pressing value P of the designated point from before the event detection.
- The user inputs to the apparatus in advance the first and second set values P1 and P2 (P1 < P2) that the CPU 2 uses to determine whether a single-click operation or a double-click operation has been instructed, and the input set values P1 and P2 are stored in the ROM 4. The CPU 2 then compares the detected pressing value P with the stored first and second set values P1 and P2 and executes processing according to their magnitude relation, as described below.
- So that it can recognize the content of a detected event, for example whether the user is pressing the designated point with a finger or moving the finger away from it, the CPU 2 stores the status of the immediately preceding event in the RAM 3 and determines the content of the detected event by comparing the event against that status.
- Specifically, the CPU 2 stores in the RAM 3, as the status, one of three states: a state in which a pressing value corresponding to a single-click operation (the first set value P1) is applied at the designated point (the PRESS1 state), a state in which a pressing value corresponding to a double-click operation (the second set value P2) is applied at the designated point (the PRESS2 state), and a state in which the finger is released from the designated point (hereinafter the RELEASE state).
- When the detected pressing value P is equal to or greater than the second set value P2, the CPU 2 advances the operation processing from steps S11 and S12 to step S13, and in step S13 refers to the data in the RAM 3 to determine whether the status is the PRESS2 state.
- If, as a result of the determination in step S13, the status is the PRESS2 state, the CPU 2 waits until the next event is detected. If the status is not the PRESS2 state, the CPU 2 advances the operation processing to step S14.
- In the processing of step S14, the CPU 2 sets the status to the PRESS2 state and stores it in the RAM 3. This completes the processing of step S14, and the operation processing proceeds to step S15.
- In the processing of step S15, the CPU 2 executes the processing corresponding to a double-click operation, such as starting the application program represented by the icon. This completes the operation processing for the detected event, and the CPU 2 waits until the next event is detected.
- When the detected pressing value P is equal to or greater than the first set value P1 but less than the second set value P2, the CPU 2 advances the operation processing from steps S11 and S12 to step S16, and in step S16 refers to the data in the RAM 3 to determine whether the status is the PRESS2 state.
- If the status is the PRESS2 state, the CPU 2 waits until the next event is detected. If the status is not the PRESS2 state, the CPU 2 advances the operation processing to step S17.
- In the processing of step S17, the CPU 2 refers to the data in the RAM 3 to determine whether the status is the PRESS1 state. If the status is not the PRESS1 state, the CPU 2 sets the status to the PRESS1 state in step S18 and then, in step S19, executes the processing corresponding to a single-click operation, such as setting the application program represented by the icon to a selected state. Upon completion of step S19, the CPU 2 advances the operation processing to step S22.
- If, in step S17, the status is the PRESS1 state, the CPU 2 determines in step S20 whether the designated point (x, y) has moved a predetermined distance or more from the reference point (x0, y0) stored in the RAM 3. If the designated point has not moved the predetermined distance or more from the reference point, the CPU 2 waits until the next event is detected.
- If the designated point has moved the predetermined distance or more, the CPU 2 executes, in step S21, the processing operation corresponding to a drag operation. This completes the processing of step S21, and the operation processing proceeds to step S22.
- In the processing of step S22, the CPU 2 stores the coordinate value (x, y) of the current designated point in the RAM 3 as the coordinate value (x0, y0) of the reference point used in subsequent processing. This completes the operation processing for the detected event, and the CPU 2 waits until the next event is detected.
- When the detected pressing value P is less than the first set value P1, the CPU 2 advances the operation processing from step S11 to step S23 and refers to the data in the RAM 3 to determine whether the status is the PRESS1 state. If the status is the PRESS1 state, the CPU 2 determines that the detected event is the stage at which the user releases the finger pressure after single-clicking the icon (hereinafter referred to as a "release operation after a single-click operation") and proceeds to step S24 to return the status to the RELEASE state.
- The CPU 2 then executes, as the processing of step S25, the processing corresponding to the "release operation after a single-click operation", such as opening the folder if the icon represents one. When the processing of step S25 is completed, the CPU 2 returns the operation processing to step S11.
- If, in step S23, the status is not the PRESS1 state, the CPU 2 determines in step S26 whether the status is the PRESS2 state. If the status is the PRESS2 state, the CPU 2 determines that the detected event is the stage at which the user releases the finger pressure after double-clicking the icon.
- In this case, the CPU 2 sets the status to the RELEASE state in step S27 and then executes, in step S28, the processing corresponding to the "release operation after a double-click operation".
- Thereafter, the CPU 2 returns the operation processing to the processing of step S11 described above.
- If, in step S26, the status is not the PRESS2 state, the CPU 2 simply returns the operation processing from step S26 to step S11.
- As described above, the information processing apparatus 1 refers to the magnitude of the pressing value at the point on the two-dimensional display screen designated by the user, determines whether a single-click operation or a double-click operation has been instructed, and executes the processing corresponding to each operation according to the determination result. With such processing, the user can operate the icons displayed on the two-dimensional display screen without the troublesome operation of once releasing the finger from the touch panel section 11 and pressing the same point again, so even a user unfamiliar with operating the device can easily operate the icons with a natural operation feeling. Moreover, since the user never needs to release the finger from the touch panel section 11, icons can be operated faster than with a double-click operation.
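Taken together, steps S11 to S28 describe a small state machine driven by the pressing value. The Python sketch below is one possible rendering of that flow; the threshold values, the drag distance, and the handler bodies are hypothetical placeholders rather than values from the patent.

```python
RELEASE, PRESS1, PRESS2 = "RELEASE", "PRESS1", "PRESS2"

class ClickStateMachine:
    def __init__(self, p1, p2, drag_threshold=10.0):
        assert p1 < p2                        # first and second set values
        self.p1, self.p2 = p1, p2
        self.drag_threshold = drag_threshold  # hypothetical drag distance
        self.status = RELEASE
        self.ref = (0.0, 0.0)                 # reference point (x0, y0)

    def on_event(self, x, y, pressure):
        if pressure >= self.p2:                    # steps S12-S15
            if self.status != PRESS2:
                self.status = PRESS2
                self.double_click()
        elif pressure >= self.p1:                  # steps S16-S22
            if self.status == PRESS2:
                return                             # step S16: wait
            if self.status != PRESS1:              # steps S17-S19
                self.status = PRESS1
                self.single_click()
            elif not self._moved_far(x, y):        # step S20: wait
                return
            else:
                self.drag()                        # step S21
            self.ref = (x, y)                      # step S22
        else:                                      # steps S23-S28
            if self.status == PRESS1:
                self.status = RELEASE              # step S24
                self.release_after_single_click()  # step S25
            elif self.status == PRESS2:
                self.status = RELEASE              # step S27
                self.release_after_double_click()  # step S28

    def _moved_far(self, x, y):
        x0, y0 = self.ref
        return (abs(x - x0) >= self.drag_threshold or
                abs(y - y0) >= self.drag_threshold)

    # Placeholder handlers for the operations named in the flowchart.
    def single_click(self): print("single click: select icon")
    def double_click(self): print("double click: launch program")
    def drag(self): print("drag icon")
    def release_after_single_click(self): print("release: open folder")
    def release_after_double_click(self): print("release after double click")
```

Pressing lightly (past P1) selects the icon, and pressing on harder (past P2) launches it, with no need to lift the finger between the two operations.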
- In the above description, the target of the operation processing is an icon, but the same processing may also be applied to, for example, the operation of a slide-type volume control displayed on the display unit 5.
- The information processing apparatus according to the second embodiment of the present invention differs from that of the first embodiment in the configuration and operation of the operation input unit 6. Therefore, only the configuration and operation of the operation input unit 6 of the information processing apparatus according to the second embodiment are described in detail below; the description of the other components, which are the same as above, is omitted.
- As shown in FIG. 14, the operation input unit 6 according to the second embodiment of the present invention differs from that of the first embodiment in that a plurality of vibrating elements 26 are provided on the surface of the touch panel section 11, on the side to which the pressure-sensitive elements 12 are not connected.
- Each vibrating element 26 is constituted by a piezoelectric element or a solenoid and, when the user presses the touch panel section 11 to perform an operation, generates vibration corresponding to the operation content under the control of the CPU 2.
- Although the vibrating elements 26 shown in FIG. 14 are attached to the surface of the touch panel section 11, they may instead be attached to the back of the back plate 14, as shown in FIGS. 15 to 17, for example. Further, the CPU 2 may control the plurality of vibrating elements 26 individually to generate a plurality of different vibration patterns.
- For example, the vibration pattern of the click vibration generated when a mechanical button is pressed may be stored in the ROM 4, and this vibration pattern may be reproduced when the user performs a predetermined operation, giving the user a click sensation as if a mechanical button had been pressed.
- The magnitude of the generated vibration may also be varied according to changes in the pressing value P.
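The patent does not say how the vibration magnitude should track the pressing value; a minimal sketch of one assumed mapping, linear between two hypothetical pressing values, might look like this:

```python
def vibration_amplitude(pressure, p_min=0.2, p_max=5.0, a_max=1.0):
    """Scale vibration amplitude with the pressing value P: silent below
    p_min, full amplitude a_max at p_max and above. All values are
    illustrative, in arbitrary units."""
    if pressure <= p_min:
        return 0.0
    return min((pressure - p_min) / (p_max - p_min), 1.0) * a_max
```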
- In the above description, a plurality of vibrating elements 26 are provided; however, if the user touches only one point on the surface of the touch panel section 11 at a time, the vibration may be generated by a single vibrating element 26.
- As described above, the operation input unit 6 according to the second embodiment is formed by adding the vibrating elements 26 to the operation input unit 6 of the first embodiment, and is configured to generate vibration corresponding to the operation content, under the control of the CPU 2, when the user performs an operation by pressing the touch panel section 11. The user can therefore operate an object displayed on the two-dimensional display screen with a natural operation feeling.
- Hereinafter, the operation of the information processing apparatus will be described for the case where the display unit 5 displays, as objects on the two-dimensional display screen, buttons for instructing the information processing apparatus 1 to execute predetermined processes, and the user operates these buttons to instruct the apparatus to perform the processing assigned to each button.
- The flowchart shown in FIG. 18 starts when the CPU 2 detects a change in the coordinate value (x, y) or the pressing value P of a point on the touch panel section 11 (event detection), and the CPU 2 performs the following processing in accordance with the interface program 8.
- In the processing of step S31, the CPU 2 determines whether the pressing value P is equal to or greater than the first set value P1. This determination checks whether the user is touching the touch panel section 11; the first set value P1 is set in advance to the pressing value produced when the user touches the touch panel section 11 lightly with a finger. If the pressing value P is not equal to or greater than the first set value P1, the CPU 2 determines, as the processing of step S32, whether the button displayed on the two-dimensional display screen is in the ON state.
- If, as a result of the determination in step S32, the button is in the ON state, the CPU 2 determines that the detected event is the stage at which the user releases the finger pressure after having pressed the touch panel section 11 at the position corresponding to the button, and as the processing of step S33 controls the vibrating elements 26 to generate a click vibration for the button release. Thereafter, as the processing of step S34, the CPU 2 sets the button pressed by the user to the OFF state and then waits until the next event is detected. On the other hand, if the result of the determination in step S32 is that the button is not in the ON state, the processing for the detected event is complete and the CPU 2 waits until the next event is detected.
- If, in step S31, the pressing value P is equal to or greater than the first set value P1, the CPU 2 advances the operation processing to step S35, where it determines whether the pressing value P is equal to or greater than the second set value P2 (here, P1 < P2); the second set value P2 is set in advance to the pressing value produced when the user presses the touch panel section 11 firmly with a finger. If, as a result of the determination in step S35, the pressing value P is not equal to or greater than the second set value P2, the CPU 2 determines, as the processing of step S36, whether the moved point has passed a position corresponding to the boundary of the button displayed on the two-dimensional display screen.
- If, as a result of the determination in step S36, the moved point has passed the position corresponding to the boundary line, the CPU 2 controls the vibrating elements 26 in step S37 to generate a vibration corresponding to the step between the area where the button is displayed and the area where it is not, letting the user recognize the shape of the button displayed on the two-dimensional display screen. This completes the processing for the detected event, and the CPU 2 waits until the next event is detected. On the other hand, if the moved point has not passed the position corresponding to the boundary, the processing for the detected event is complete and the CPU 2 waits until the next event is detected.
- If, in step S35, the pressing value P is equal to or greater than the second set value P2, the CPU 2 advances the operation processing to step S38, where it determines whether the moved point is located at a position corresponding to the button displayed on the two-dimensional display screen.
- If, as a result of the determination in step S38, the moved point is within the display area of the button, the CPU 2 determines that the detected event is the user pressing the touch panel section 11 at the position corresponding to the button, and as the processing of step S39 controls the vibrating elements 26 to generate the corresponding click vibration, making the user aware that the button has been pressed.
- Thereafter, as the processing of step S40, the CPU 2 sets the button pressed by the user to the ON state and then waits until the next event is detected.
- On the other hand, if, as a result of the determination in step S38, the moved point is not within the display area of the button, the processing for the detected event is complete and the CPU 2 waits until the next event is detected.
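Steps S31 to S40 amount to a three-band dispatch on the pressing value. The following Python sketch is one plausible reading of that flow; the ui helper object and its members are hypothetical stand-ins for state that the patent leaves unspecified.

```python
def handle_touch_event(x, y, pressure, ui):
    """Dispatch one detected event along the lines of the FIG. 18 flow."""
    if pressure < ui.P1:                   # step S31: finger lifted
        if ui.button_on:                   # step S32
            ui.vibrate("release-click")    # step S33
            ui.button_on = False           # step S34
    elif pressure < ui.P2:                 # step S35: light touch, tracing
        if ui.crossed_boundary(x, y):      # step S36
            ui.vibrate("edge-step")        # step S37: feel the button's edge
    else:                                  # firm press
        if ui.inside_button(x, y):         # step S38
            ui.vibrate("press-click")      # step S39
            ui.button_on = True            # step S40

class DemoUI:
    """Minimal stand-in so the sketch can run; all values hypothetical."""
    P1, P2 = 0.3, 2.0        # light-touch and firm-press thresholds
    button_on = False
    def inside_button(self, x, y):
        return 10 <= x <= 90 and 10 <= y <= 40
    def crossed_boundary(self, x, y):
        return abs(x - 10) < 1 or abs(x - 90) < 1   # near a vertical edge
    def vibrate(self, pattern):
        print("vibrate:", pattern)

ui = DemoUI()
handle_touch_event(50, 25, 2.5, ui)  # firm press inside the button -> click
```

A light touch thus lets the user feel out the button's outline before committing, while only a firm press toggles its state, which is the error-reducing behavior described below.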
- As described above, when the user operates an object displayed on the two-dimensional display screen via the touch panel section 11, the information processing apparatus feeds back a tactile sensation corresponding to the pressed position on the touch panel section 11 and the pressure, such as the feel of the position and shape of the object and a click sensation according to the pressing strength. The user can therefore operate the object with a natural operation feeling, and at the same time the frequency of operation errors can be reduced.
- In the above embodiments, the touch panel section 11 is built into or bonded to the display unit 5. Instead, however, as shown in FIG. 19, a flexible display 27 formed on a flexible substrate such as a plastic film may be used as the display unit 5, with a plurality of pressure-sensitive elements 12 provided on the back of the flexible display 27.
- With such a configuration, the shape of the display unit 5 flexes as the user presses it, so the pressing value at an arbitrary point pressed by the user can be detected more accurately than when a display device formed on a rigid substrate, such as a liquid crystal display device or a CRT device, is used.
- Further, the touch panel section 11 may be bonded to the surface of the flexible display 27 and the coordinate value of the point specified by the user detected with the touch panel section 11, which allows the number of pressure-sensitive elements 12 arranged on the back of the flexible display 27 to be reduced.
- A vibrating element may also be provided so that, when the user touches the flexible display 27, a tactile sensation corresponding to the operation content is fed back to the user.
- With the above configuration, it is possible to detect the shape of the user's finger or hand touching the two-dimensional display screen, as well as the manner and motion of the touch. By associating such information with, for example, the invocation of predetermined functions, an unprecedented new operation method based on the shape or movement of the hand or fingers can be realized.
- The operation input unit 6 may also be constituted by a mouse 30, as shown in FIG. 21(a), for example.
- The mouse 30 shown in FIG. 21(a) is based on a general mouse and comprises a button section 31 that switches between the on and off states in accordance with the user's operation, a detection section 32 that detects the position on the screen designated by the user, and a pressure-sensitive element 33 arranged below the button section 31.
- The pressure-sensitive element 33 detects the pressing value when the user operates the button section 31 and outputs a pressure detection signal indicating the magnitude of that pressing value to the CPU 2.
- While a general mouse can detect only the on/off state of the button section 31, the mouse 30 configured as described above can execute the processing described in the above embodiments according to the magnitude of the pressing value when the button section 31 is operated.
- Further, by detecting the pressing value as an analog quantity when the user operates the button section 31, analog values can easily be input in a variety of situations, such as scrolling, moving or enlarging/reducing an object, moving the cursor, and adjusting the volume.
- The mouse 30 may also be provided with a vibrating element 26; with such a configuration, a tactile sensation corresponding to the operation content can be fed back to the user.
- The pressing values Pmax and Pmin corresponding to the maximum and minimum depth values in the three-dimensional space may be defined in advance, and the depth value in the three-dimensional space specified by the user calculated by comparing the pressing value P at the designated point with Pmax and Pmin.
- Alternatively, the relationship between the pressing value P and the depth value in the three-dimensional space may be stored in a table in advance, and the depth value in the three-dimensional space specified by the user calculated by looking up the pressing value P at the designated point in the table.
- If a certain width is allowed for the pressing value, so that the corresponding depth value is recognized whenever the pressing value P at the designated point falls within that width, the operation of specifying an object at a given depth value becomes easier.
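The two depth calculations just described, linear interpolation between Pmin and Pmax and a pre-stored lookup table, might be sketched as follows; the table entries and all numeric values are hypothetical.

```python
import bisect

def depth_linear(p, p_min, p_max, z_min, z_max):
    """Depth by comparing P against the pre-defined Pmin and Pmax:
    linear interpolation, clamped to the [Pmin, Pmax] range."""
    p = max(p_min, min(p, p_max))
    t = (p - p_min) / (p_max - p_min)
    return z_min + t * (z_max - z_min)

# Pre-stored (pressing value, depth value) pairs, sorted by pressing value.
PRESSURE_TO_DEPTH = [(0.5, 0.0), (1.0, 2.0), (2.0, 5.0), (4.0, 10.0)]

def depth_from_table(p):
    """Depth via table search: return the depth of the entry whose pressing
    value is nearest to P. The gap between neighboring entries acts as the
    'width' within which a single depth value is recognized."""
    keys = [row[0] for row in PRESSURE_TO_DEPTH]
    i = bisect.bisect_left(keys, p)
    if i == 0:
        return PRESSURE_TO_DEPTH[0][1]
    if i == len(PRESSURE_TO_DEPTH):
        return PRESSURE_TO_DEPTH[-1][1]
    lo, hi = PRESSURE_TO_DEPTH[i - 1], PRESSURE_TO_DEPTH[i]
    return lo[1] if p - lo[0] <= hi[0] - p else hi[1]
```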
- In the above embodiments, the information processing apparatus 1 detects the magnitude of the pressing value P at the designated point and uses it as the depth value of the designated point. Instead, a non-contact input device that detects the distance between the display screen and the user's finger in the depth direction, or a camera device (a so-called stereo camera) that measures the user's movement in the perpendicular (depth) direction with respect to the display screen of the display device using techniques such as pattern matching, may be used, and the value detected in this way used as the depth value of the designated point. In this case, the information processing apparatus 1 may change the depth value of the designated point in accordance with changes in the detected value.
- As described above, the present invention can be applied to, for example, a process of designating an arbitrary point in a three-dimensional space displayed on a two-dimensional display screen.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Position Input By Displaying (AREA)
- User Interface Of Digital Computer (AREA)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP03733344A EP1513050A1 (en) | 2002-06-11 | 2003-06-10 | Information processing method for specifying an arbitrary point in 3-dimensional space |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2002-170184 | 2002-06-11 | ||
JP2002170184 | 2002-06-11 | ||
JP2003084103A JP2004070920A (ja) | 2003-03-26 | Information processing program, computer-readable recording medium on which the information processing program is recorded, information processing method, and information processing apparatus
JP2003-84103 | 2003-03-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2003104967A1 true WO2003104967A1 (ja) | 2003-12-18 |
Family
ID=29738359
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2003/007334 WO2003104967A1 (ja) | 2003-06-10 | Information processing method for specifying an arbitrary point in three-dimensional space
Country Status (4)
Country | Link |
---|---|
US (1) | US20040021663A1 (ja) |
EP (1) | EP1513050A1 (ja) |
JP (1) | JP2004070920A (ja) |
WO (1) | WO2003104967A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11068063B2 (en) | 2015-01-23 | 2021-07-20 | Sony Corporation | Information processing apparatus and method for adjusting detection information based on movement imparted by a vibrator |
Families Citing this family (70)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7808479B1 (en) | 2003-09-02 | 2010-10-05 | Apple Inc. | Ambidextrous mouse |
US7663607B2 (en) | 2004-05-06 | 2010-02-16 | Apple Inc. | Multipoint touchscreen |
US11275405B2 (en) | 2005-03-04 | 2022-03-15 | Apple Inc. | Multi-functional hand-held device |
US7656393B2 (en) | 2005-03-04 | 2010-02-02 | Apple Inc. | Electronic device having display and surrounding touch sensitive bezel for user interface and control |
CN101308442B (zh) * | 2004-10-12 | 2012-04-04 | Nippon Telegraph And Telephone Corp. | Three-dimensional pointing method and three-dimensional pointing device |
JP4388878B2 (ja) | 2004-10-19 | 2009-12-24 | Nintendo Co., Ltd. | Input processing program and input processing device |
KR101354316B1 (ko) * | 2005-03-04 | 2014-01-22 | Apple Inc. | Multi-functional hand-held device |
JP4260770B2 (ja) | 2005-05-09 | 2009-04-30 | Nintendo Co., Ltd. | Game program and game device |
JP4832826B2 (ja) * | 2005-07-26 | 2011-12-07 | Nintendo Co., Ltd. | Object control program and information processing apparatus |
US7538760B2 (en) | 2006-03-30 | 2009-05-26 | Apple Inc. | Force imaging input device and system |
JP2007272067A (ja) * | 2006-03-31 | 2007-10-18 | Brother Ind Ltd | Image display device |
CN102981678B (zh) | 2006-06-09 | 2015-07-22 | Apple Inc. | Touch screen liquid crystal display |
US9710095B2 (en) | 2007-01-05 | 2017-07-18 | Apple Inc. | Touch screen stack-ups |
KR101424259B1 (ko) * | 2007-08-22 | 2014-07-31 | Samsung Electronics Co., Ltd. | Method and apparatus for providing input feedback in a portable terminal |
DE102007052008A1 (de) * | 2007-10-26 | 2009-04-30 | Andreas Steinhauser | Single- or multi-touch-capable touchscreen or touchpad consisting of an array of pressure sensors, and production of such sensors |
JP4557058B2 (ja) * | 2007-12-07 | 2010-10-06 | Sony Corp. | Information display terminal, information display method, and program |
US9513765B2 (en) | 2007-12-07 | 2016-12-06 | Sony Corporation | Three-dimensional sliding object arrangement method and system |
US8395587B2 (en) * | 2007-12-21 | 2013-03-12 | Motorola Mobility Llc | Haptic response apparatus for an electronic device |
JP4650543B2 (ja) * | 2008-09-12 | 2011-03-16 | Konica Minolta Business Technologies Inc. | Billing system, billing method, billing program, and recording medium |
US20100103115A1 (en) * | 2008-10-24 | 2010-04-29 | Sony Ericsson Mobile Communications Ab | Display arrangement and electronic device |
JP4633166B2 (ja) | 2008-12-22 | 2011-02-16 | Kyocera Corp. | Input device and input device control method |
US10019081B2 (en) * | 2009-01-15 | 2018-07-10 | International Business Machines Corporation | Functionality switching in pointer input devices |
JP4633184B1 (ja) | 2009-07-29 | 2011-02-23 | Kyocera Corp. | Input device and input device control method |
US8654524B2 (en) | 2009-08-17 | 2014-02-18 | Apple Inc. | Housing as an I/O device |
CN102025814A (zh) * | 2009-09-11 | 2011-04-20 | Shenzhen Futaihong Precision Industry Co., Ltd. | Portable electronic device |
US8487759B2 (en) | 2009-09-30 | 2013-07-16 | Apple Inc. | Self adapting haptic device |
JP2011177203A (ja) * | 2010-02-26 | 2011-09-15 | Nintendo Co Ltd | Object control program and object control device |
EP2390772A1 (en) * | 2010-05-31 | 2011-11-30 | Sony Ericsson Mobile Communications AB | User interface with three dimensional user input |
EP2395414B1 (en) * | 2010-06-11 | 2014-08-13 | BlackBerry Limited | Portable electronic device including touch-sensitive display and method of changing tactile feedback |
JP2012058896A (ja) | 2010-09-07 | 2012-03-22 | Sony Corp | Information processing apparatus, program, and information processing method |
US10013058B2 (en) | 2010-09-21 | 2018-07-03 | Apple Inc. | Touch-based user interface with haptic feedback |
US9569003B2 (en) * | 2010-09-30 | 2017-02-14 | Broadcom Corporation | Portable computing device including a three-dimensional touch screen |
US10120446B2 (en) | 2010-11-19 | 2018-11-06 | Apple Inc. | Haptic input device |
US8804056B2 (en) | 2010-12-22 | 2014-08-12 | Apple Inc. | Integrated touch screens |
JP5369087B2 (ja) * | 2010-12-24 | 2013-12-18 | Kyocera Corp. | Input device and input device control method |
JP2011060333A (ja) * | 2010-12-24 | 2011-03-24 | Kyocera Corp | Input device and input device control method |
KR101763263B1 (ko) | 2010-12-24 | 2017-07-31 | Samsung Electronics Co., Ltd. | 3D display terminal device and operation method thereof |
US20120306849A1 (en) * | 2011-05-31 | 2012-12-06 | General Electric Company | Method and system for indicating the depth of a 3d cursor in a volume-rendered image |
JP5613126B2 (ja) * | 2011-09-09 | 2014-10-22 | KDDI Corp. | User interface device in which an on-screen object can be operated by pressing, object operation method, and program |
JP5634462B2 (ja) * | 2012-08-31 | 2014-12-03 | Kyocera Document Solutions Inc. | Display input device and image forming apparatus |
US9178509B2 (en) | 2012-09-28 | 2015-11-03 | Apple Inc. | Ultra low travel keyboard |
WO2015020663A1 (en) | 2013-08-08 | 2015-02-12 | Honessa Development Laboratories Llc | Sculpted waveforms with no or reduced unforced response |
US9779592B1 (en) | 2013-09-26 | 2017-10-03 | Apple Inc. | Geared haptic feedback element |
CN105579928A (zh) | 2013-09-27 | 2016-05-11 | Apple Inc. | Band with haptic actuators |
WO2015047343A1 (en) | 2013-09-27 | 2015-04-02 | Honessa Development Laboratories Llc | Polarized magnetic actuators for haptic response |
US10126817B2 (en) | 2013-09-29 | 2018-11-13 | Apple Inc. | Devices and methods for creating haptic effects |
US10236760B2 (en) | 2013-09-30 | 2019-03-19 | Apple Inc. | Magnetic actuators for haptic response |
US9317118B2 (en) | 2013-10-22 | 2016-04-19 | Apple Inc. | Touch surface for simulating materials |
US10019075B2 (en) * | 2013-10-24 | 2018-07-10 | Chunsheng ZHU | Control input apparatus |
WO2015088491A1 (en) | 2013-12-10 | 2015-06-18 | Bodhi Technology Ventures Llc | Band attachment mechanism with haptic response |
US9501912B1 (en) | 2014-01-27 | 2016-11-22 | Apple Inc. | Haptic feedback device with a rotating mass of variable eccentricity |
US20150253918A1 (en) * | 2014-03-08 | 2015-09-10 | Cherif Algreatly | 3D Multi-Touch |
WO2015163842A1 (en) | 2014-04-21 | 2015-10-29 | Yknots Industries Llc | Apportionment of forces for multi-touch input devices of electronic devices |
DE102015209639A1 (de) | 2014-06-03 | 2015-12-03 | Apple Inc. | Linear actuator |
WO2016036671A2 (en) | 2014-09-02 | 2016-03-10 | Apple Inc. | Haptic notifications |
US10353467B2 (en) | 2015-03-06 | 2019-07-16 | Apple Inc. | Calibration of haptic devices |
AU2016100399B4 (en) | 2015-04-17 | 2017-02-02 | Apple Inc. | Contracting and elongating materials for providing input and output for an electronic device |
WO2016199309A1 (ja) * | 2015-06-12 | 2016-12-15 | Pioneer Corp. | Electronic device |
WO2017044618A1 (en) | 2015-09-08 | 2017-03-16 | Apple Inc. | Linear actuators for use in electronic devices |
US20170068374A1 (en) * | 2015-09-09 | 2017-03-09 | Microsoft Technology Licensing, Llc | Changing an interaction layer on a graphical user interface |
CN109918004B (zh) * | 2015-12-17 | 2021-04-23 | NetEase (Hangzhou) Network Co., Ltd. | Virtual character control method and device |
US10039080B2 (en) | 2016-03-04 | 2018-07-31 | Apple Inc. | Situationally-aware alerts |
US10268272B2 (en) | 2016-03-31 | 2019-04-23 | Apple Inc. | Dampening mechanical modes of a haptic actuator using a delay |
US10622538B2 (en) | 2017-07-18 | 2020-04-14 | Apple Inc. | Techniques for providing a haptic output and sensing a haptic input using a piezoelectric body |
US10691211B2 (en) | 2018-09-28 | 2020-06-23 | Apple Inc. | Button providing force sensing and/or haptic output |
US10599223B1 (en) | 2018-09-28 | 2020-03-24 | Apple Inc. | Button providing force sensing and/or haptic output |
CN112445406A (zh) * | 2019-08-29 | 2021-03-05 | ZTE Corp. | Terminal screen operation method, terminal, and storage medium |
US11380470B2 (en) | 2019-09-24 | 2022-07-05 | Apple Inc. | Methods to control force in reluctance actuators based on flux related parameters |
US11977683B2 (en) | 2021-03-12 | 2024-05-07 | Apple Inc. | Modular systems configured to provide localized haptic feedback using inertial actuators |
US11809631B2 (en) | 2021-09-21 | 2023-11-07 | Apple Inc. | Reluctance haptic engine for an electronic device |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0595746A1 (en) * | 1992-10-29 | 1994-05-04 | International Business Machines Corporation | Method and system for input device pressure indication in a data processing system |
JPH096526A (ja) * | 1995-06-21 | 1997-01-10 | Toshiba Corp | Three-dimensional pointing device |
JPH0922330A (ja) * | 1995-07-06 | 1997-01-21 | Meidensha Corp | Touch panel input method |
JPH09297650A (ja) * | 1996-05-01 | 1997-11-18 | Smk Corp | Pressure-sensitive three-dimensional tablet and method of detecting its operation data |
JPH10275052A (ja) * | 1997-03-31 | 1998-10-13 | Mitsumi Electric Co Ltd | Coordinate information input device |
JP2001195187A (ja) * | 2000-01-11 | 2001-07-19 | Sharp Corp | Information processing apparatus |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6147684A (en) * | 1998-02-06 | 2000-11-14 | Sun Microysytems, Inc. | Techniques for navigating layers of a user interface |
US6229542B1 (en) * | 1998-07-10 | 2001-05-08 | Intel Corporation | Method and apparatus for managing windows in three dimensions in a two dimensional windowing system |
US6452617B1 (en) * | 2000-01-10 | 2002-09-17 | International Business Machines Corporation | Adjusting a click time threshold for a graphical user interface |
US7034803B1 (en) * | 2000-08-18 | 2006-04-25 | Leonard Reiffel | Cursor display privacy product |
US20040109031A1 (en) * | 2001-05-11 | 2004-06-10 | Kenneth Deaton | Method and system for automatically creating and displaying a customizable three-dimensional graphical user interface (3D GUI) for a computer system |
-
2003
- 2003-03-26 JP JP2003084103A patent/JP2004070920A/ja active Pending
- 2003-06-10 WO PCT/JP2003/007334 patent/WO2003104967A1/ja not_active Application Discontinuation
- 2003-06-10 EP EP03733344A patent/EP1513050A1/en not_active Withdrawn
- 2003-06-11 US US10/460,745 patent/US20040021663A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
EP1513050A1 (en) | 2005-03-09 |
US20040021663A1 (en) | 2004-02-05 |
JP2004070920A (ja) | 2004-03-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2003104967A1 (ja) | 2003-12-18 | Information processing method for specifying an arbitrary point in three-dimensional space |
JP5295328B2 (ja) | 2013-09-18 | User interface device allowing input via a screen pad, input processing method, and program |
JP4734435B2 (ja) | 2011-07-27 | Portable game device with touch-panel display |
US8963882B2 (en) | Multi-touch device having dynamic haptic effects | |
US9524097B2 (en) | Touchscreen gestures for selecting a graphical object | |
JP3847641B2 (ja) | 2006-11-22 | Information processing apparatus, information processing program, computer-readable recording medium on which the information processing program is recorded, and information processing method |
JP4990753B2 (ja) | 2012-08-01 | Input device for electronic equipment, input operation processing method, and input control program |
JPH11203044A (ja) | 1999-07-30 | Information processing system |
JP5640486B2 (ja) | 2014-12-17 | Information display device |
JPH10149254A6 (ja) | Coordinate input device |
TW201109994A (en) | Method for controlling the display of a touch screen, user interface of the touch screen, and electronics using the same | |
JP2011048686A (ja) | 2011-03-10 | Input device |
JP2011048832A (ja) | 2011-03-10 | Input device |
EP2075671A1 (en) | User interface of portable device and operating method thereof | |
JPWO2010047339A1 (ja) | 2012-03-22 | Touch panel device that operates as if the detection area were equivalent to the display area even when the detection area is smaller than the display area |
JP2016015181A (ja) | 2016-01-28 | User interface device capable of invoking different functions according to the degree of pressing, program, and function invocation method |
JP5767148B2 (ja) | 2015-08-19 | User interface device capable of applying tactile vibration according to the depth and height of a tactile object image, tactile vibration application method, and program |
JP2013196465A (ja) | 2013-09-30 | User interface device that gives a haptic response when an object is selected, haptic response method, and program |
JP5246974B2 (ja) | 2013-07-24 | Input device for electronic equipment, input operation processing method, and input control program |
WO2021075143A1 (ja) | 2021-04-22 | Control device, program, and system |
JP2001195170A (ja) | 2001-07-19 | Portable electronic device, input control device, and storage medium |
JP2016012376A (ja) | 2016-01-21 | User interface device capable of giving different haptic responses according to the degree of pressing, haptic response method, and program |
TW200941307A (en) | Extended cursor generating method and device | |
JP6025920B2 (ja) | 2016-11-16 | User interface device capable of applying tactile vibration according to the depth and height of a tactile object image, tactile vibration application method, and program |
JP2019101876A (ja) | 2019-06-24 | Input device, input control device, operated device, and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): CN KR |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR |
|
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2003733344 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 2003733344 Country of ref document: EP |
|
WWW | Wipo information: withdrawn in national office |
Ref document number: 2003733344 Country of ref document: EP |