WO2012165325A1 - Electronic device, control method thereof, and program - Google Patents
Electronic device, control method thereof, and program
- Publication number
- WO2012165325A1 (PCT/JP2012/063463)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- auxiliary input
- unit
- electronic device
- stylus
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03545—Pens or stylus
- G06F3/03546—Pens or stylus using a rotatable ball at the tip as position detecting member
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
- G06F3/0383—Signal control means within the pointing device
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04807—Pen manipulated menu
- G06F2203/04809—Textured surface identifying touch areas, e.g. overlay structure for a virtual keyboard
Definitions
- the present invention relates to an electronic device, a control method thereof, and a program.
- the present invention relates to an electronic apparatus including auxiliary input means, a control method thereof, and a program.
- Electronic devices such as mobile phones, PHSs (Personal Handyphone Systems), PDAs (Personal Digital Assistants), game consoles, and notebook PCs (Personal Computers) often have input means such as touch panels in addition to hardware keys and keyboards. In particular, many recent electronic devices have high-definition display screens, and intuitive operation is possible by combining icon display with a touch panel.
- auxiliary input means includes a stylus that performs handwritten input of characters and symbols by pressing the tip against a touch panel or the like.
- Patent Documents 1 and 2 disclose techniques for feeding back a user's operation to the stylus and improving the operational feeling when using the stylus.
- auxiliary input means such as a stylus
- operations using a stylus are not limited to handwriting input.
- a stylus is used instead of a finger to operate an electronic device.
- a touch panel may be operated using a stylus instead of a finger.
- Such operations include a drag operation, in which the stylus is moved while kept in contact with the touch panel over the selected file or the like. Consider, for example, an operation for selecting a file with a stylus on the operation screen shown in FIG. 2.
- Because the resistance (friction) between the touch panel and the stylus is small, the amount of movement of the stylus may be larger than the user expects; that is, a drag operation may overrun the target point. Alternatively, because of the small resistance, it may not be possible to move from the file to the target point (for example, the trash can in FIG. 2) along the shortest path.
- the user may not be able to operate the stylus as intended due to a small resistance between the touch panel and the stylus.
- an electronic device that improves operability when operating using auxiliary input means, a control method thereof, and a program are desired.
- According to one aspect, an electronic device is provided comprising: an operation unit that receives a user's operation; an auxiliary input unit that the user uses when inputting information to the operation unit and that can change a sense felt by the user; and a control unit that guides the user's operation to a predetermined operation by changing the sense given to the user by the auxiliary input unit in accordance with an operation performed by the user using the auxiliary input unit.
- According to another aspect, a method is provided for controlling an electronic device comprising an operation unit that receives a user's operation and an auxiliary input unit used when the user inputs information to the operation unit, the method including a step of detecting an operation performed by the user using the auxiliary input unit and a step of guiding the user's operation to a predetermined operation by changing the sense given to the user from the auxiliary input unit in accordance with the detected operation.
- This method is tied to a particular machine, namely an electronic device comprising an operation unit that receives a user's operation and auxiliary input means used when the user inputs information to the operation unit.
- According to another aspect, a program is provided to be executed by a computer that controls an electronic device comprising an operation unit that receives a user's operation and an auxiliary input unit used when the user inputs information to the operation unit.
- The program causes the computer to execute a process of detecting an operation performed by the user using the auxiliary input unit and a process of guiding the user's operation to a predetermined operation by changing the sense given to the user from the auxiliary input unit in accordance with the operation performed by the user.
- This program can be recorded on a computer-readable storage medium.
- the storage medium may be non-transitory, such as a semiconductor memory, a hard disk, a magnetic recording medium, or an optical recording medium.
- the present invention can also be embodied as a computer program product.
- an electronic device that improves the operability when operating using auxiliary input means, a control method thereof, and a program are provided.
- the user may not be able to operate the stylus as intended because of the small resistance between the touch panel and the stylus. Therefore, an electronic device that improves operability when operating using a stylus is desired.
- An electronic apparatus 300 shown in FIG. 1 is provided as an example.
- The electronic device 300 illustrated in FIG. 1 includes an operation unit 301 that receives a user's operation, and an auxiliary input unit 302 that the user uses when inputting information to the operation unit 301 and that can change a sense felt by the user.
- It further includes a control unit 303 that guides the user's operation to a predetermined operation by changing the sense given to the user by the auxiliary input unit 302 in accordance with an operation performed by the user using the auxiliary input unit 302.
- an operation that the user will perform next may be estimated from the input information.
- a position (hereinafter referred to as an operation point)
- the user's operation is guided by changing the sense received by the user from the auxiliary input unit 302. That is, when the user's operation is not directed toward the operation point, the feeling given to the user from the auxiliary input unit 302 is changed in a direction that hinders the user's operation.
- the feeling given to the user is changed in a direction that facilitates the user's operation.
- the user can feel the frictional resistance generated between the touch panel and the stylus and the vibration of the stylus itself. In this way, the operation performed by the user is fed back to the auxiliary input unit 302, and the operability of the electronic device is improved by changing the feeling felt by the user.
- Preferably, the control unit calculates a movement assist line that guides the user's operation from a first operation point at which the operation unit accepted the user's operation and a second operation point at which the user is assumed to operate next based on the first operation point, and strengthens the sense that the auxiliary input means gives to the user when the user's operation deviates from the movement assist line.
- control unit and the auxiliary input unit can transmit and receive data to each other by wire or wireless.
- auxiliary input unit changes a feeling felt by the user by changing a frictional resistance of a portion in contact with the operation unit.
- auxiliary input means changes a sense perceived by the user by vibrating a housing of the auxiliary input means.
- the operation unit accepts a user operation through a touch panel, and the auxiliary input unit is a stylus.
- Mode 8: Preferably, the step of guiding to the predetermined operation calculates a movement assist line that guides the user's operation from a first operation point at which the operation unit accepted the user's operation and a second operation point at which the user is assumed to operate next based on the first operation point, and strengthens the sense that the auxiliary input means gives to the user when the user's operation deviates from the movement assist line.
- FIG. 3 is a diagram illustrating an example of an appearance of the electronic apparatus 1 according to the present embodiment.
- the electronic device 1 includes an electronic device main body 10 and a stylus 20.
- the electronic device main body 10 and the stylus 20 can transmit and receive data by wire or wirelessly. In the present embodiment, description will be made assuming that both communicate wirelessly.
- the electronic device main body 10 includes a display unit 101 and a touch panel 102. Information necessary for user operation is displayed on the display unit 101, and user operations are received through the touch panel 102.
- FIG. 4 is a diagram illustrating an example of the internal configuration of the electronic device main body 10.
- the electronic device main body 10 includes a display unit 101, a touch panel 102, a main body communication unit 103, and a main body control unit 104.
- the display unit 101 and the touch panel 102 are as described above, and a description thereof is omitted.
- the electronic device main body 10 performs wireless communication with the stylus 20 using the main body communication unit 103.
- the main body control unit 104 controls each unit of the display unit 101, the touch panel 102, and the main body communication unit 103.
- FIG. 5 is a diagram illustrating an example of the internal configuration of the stylus 20.
- the stylus 20 includes a stylus communication unit 201, a sphere control unit 202, a sphere unit 203, and a stylus control unit 204.
- the stylus communication unit 201 realizes wireless communication between the stylus 20 and the electronic device body 10.
- the sphere control unit 202 controls the rotation of the sphere unit 203 provided at the tip of the stylus 20 (by varying the strength of a brake applied to it).
- the sphere control unit 202 controls the rotation of the sphere unit 203 to change the frictional resistance when the touch panel 102 and the stylus 20 are brought into contact with each other.
- the change in the frictional resistance when the touch panel 102 and the stylus 20 are brought into contact changes the feeling that the user feels from the stylus 20.
- the stylus control unit 204 receives instructions from the electronic device main body 10 via the stylus communication unit 201 and directs the sphere control unit 202 accordingly. More specifically, when the stylus control unit 204 receives, from the main body control unit 104 via the stylus communication unit 201, an instruction to increase the frictional resistance felt by the user, it instructs the sphere control unit 202 to strengthen the brake on the sphere unit 203. Conversely, when it receives an instruction to reduce the frictional resistance felt by the user, it instructs the sphere control unit 202 to weaken the brake on the sphere unit 203.
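The instruction dispatch described above can be sketched as follows. This is a minimal illustrative model, not code from the patent: the class and method names (`SphereControlUnit`, `on_instruction`, and the instruction strings) and the brake levels are assumptions.

```python
class SphereControlUnit:
    """Models the brake on the sphere unit 203 at the stylus tip.

    brake_level 0.0 = free rotation (low friction felt by the user),
    1.0 = fully braked (high friction felt by the user).
    """

    def __init__(self, brake_level=0.5):
        self.brake_level = brake_level

    def strengthen_brake(self, step=0.1):
        # Clamp at the maximum braking capability.
        self.brake_level = min(1.0, self.brake_level + step)

    def weaken_brake(self, step=0.1):
        # Clamp at free rotation.
        self.brake_level = max(0.0, self.brake_level - step)


class StylusControlUnit:
    """Dispatches instructions received from the device body (via the
    stylus communication unit) to the sphere control unit."""

    def __init__(self, sphere_control):
        self.sphere_control = sphere_control

    def on_instruction(self, instruction):
        if instruction == "increase_friction":
            self.sphere_control.strengthen_brake()
        elif instruction == "decrease_friction":
            self.sphere_control.weaken_brake()
```

In the patent's arrangement the instructions would arrive over the wireless link from the main body control unit 104; here they are plain function calls.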
- FIG. 6 is a flowchart showing an example of the operation of the electronic device 1 when the user operates using the stylus 20.
- In step S01, the main body control unit 104 detects that an operation with the stylus 20 has been performed on the touch panel 102.
- In step S02, the main body communication unit 103 notifies the stylus communication unit 201 that an operation using the stylus 20 has been performed.
- In step S03, the operation points assumed for the user's next operation are calculated. For example, when a file is selected as shown in FIG. 2, the user's next operation is estimated to be moving the file to the trash can (deleting the file), moving the file to a folder, or opening the file. Because moving a file to a trash can or folder requires dragging it, the location of the trash can or folder is calculated as an operation point.
- In step S04, the main body control unit 104 determines whether an operation point exists. If one exists, the process proceeds to step S05; if none exists, the process ends. A case where no operation point exists is, for example, when a file is selected in the lowermost folder.
- In step S05, the main body control unit 104 calculates a movement assist line.
- FIG. 7 is a diagram illustrating an example of a movement assist line.
- point A (X1, Y1) is a point at which the user presses the touch panel 102 with the stylus 20
- point B (X2, Y2) is an operation point.
- a straight line connecting points A and B is a movement assist line.
- The movement assist line is calculated by the following formulas (1) and (2), where θ is the movement angle and L is the movement distance.
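Formulas (1) and (2) are rendered as images in the original publication and do not survive in this text extract. Given the definitions above (point A (X1, Y1), point B (X2, Y2), movement angle θ, movement distance L), they presumably take the standard form:

```latex
\theta = \tan^{-1}\!\left(\frac{Y_2 - Y_1}{X_2 - X_1}\right) \tag{1}
```

```latex
L = \sqrt{(X_2 - X_1)^2 + (Y_2 - Y_1)^2} \tag{2}
```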
- In step S06, the main body control unit 104 determines whether the user's drag operation overlaps the movement assist line described above. If they do not overlap, the process proceeds to step S07; if they overlap, the process proceeds to step S08.
- In step S07, the main body control unit 104 instructs the stylus 20 to increase the frictional resistance between the touch panel 102 and the stylus 20.
- In response, the stylus control unit 204 instructs the sphere control unit 202 to increase the frictional resistance between the touch panel 102 and the sphere unit 203.
- The sphere control unit 202 controls the rotation of the sphere unit 203 according to the instruction.
- In step S08, the frictional resistance between the touch panel 102 and the stylus 20 is reduced by the same procedure as in step S07.
- In step S09, it is determined whether the stylus 20 has reached the operation point. If it has, the process ends; if not, the process returns to step S06 and continues.
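Steps S06 through S09 form a feedback loop. A minimal sketch of that loop follows; the function names, tolerance values, and instruction strings are illustrative assumptions, not part of the patent:

```python
import math


def distance_to_assist_line(a, b, p):
    """Perpendicular distance from stylus position p to the movement
    assist line through operation points a and b (all 2D tuples)."""
    dx, dy = b[0] - a[0], b[1] - a[1]
    length = math.hypot(dx, dy)
    if length == 0.0:
        # Degenerate line: fall back to distance from point a.
        return math.hypot(p[0] - a[0], p[1] - a[1])
    # |cross product| / segment length gives the perpendicular distance.
    return abs(dy * (p[0] - a[0]) - dx * (p[1] - a[1])) / length


def guide_drag(positions, a, b, send_instruction, line_tol=1.0, arrive_tol=2.0):
    """Replay steps S06-S09 over a sequence of sampled stylus positions.

    send_instruction is called with "increase_friction" (step S07) when the
    drag strays from the assist line, or "decrease_friction" (step S08)
    when it stays on the line; the loop ends at the operation point (S09).
    """
    for p in positions:
        if math.hypot(p[0] - b[0], p[1] - b[1]) <= arrive_tol:
            return "arrived"  # S09: operation point reached
        if distance_to_assist_line(a, b, p) > line_tol:
            send_instruction("increase_friction")  # S06 -> S07
        else:
            send_instruction("decrease_friction")  # S06 -> S08
    return "incomplete"
```

In a real device the positions would arrive as touch-panel events and the instructions would travel over the wireless link to the stylus; here both are plain Python values for clarity.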
- The degree to which the frictional resistance is increased may be determined from the deviation between the user's drag operation and the movement assist line: the farther the stylus strays from the movement assist line, the more the frictional resistance is increased, guiding the user's drag operation back onto the line.
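The deviation-proportional braking described here can be sketched as a simple linear mapping; the base level, gain, and clamp values are illustrative assumptions, not figures from the patent:

```python
def brake_for_deviation(deviation, base=0.2, gain=0.05, maximum=1.0):
    """Brake level for the sphere unit: the farther the drag strays from
    the movement assist line (deviation, in screen units), the stronger
    the brake, clamped to the stylus's maximum braking capability."""
    return min(maximum, base + gain * deviation)
```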
- As described above, the operation points to which the stylus is likely to move are calculated from the operation the user is assumed to perform next. A movement assist line is then calculated from the operation point; when the user's drag operation deviates from the movement assist line, the frictional resistance between the touch panel and the stylus is increased, and when the drag operation overlaps the movement assist line, the frictional resistance is reduced. As a result, the user's drag operation follows the movement assist line, so even a rough operation naturally reaches the operation point. In other words, when the electronic device is operated using the stylus 20, the operability of the electronic device 1 is improved by feeding the user's operation back to the stylus.
- In the techniques disclosed in Patent Documents 1 and 2, the stylus is vibrated during operation, giving the user a sensation close to actual writing (improving the operational feeling when using the stylus).
- However, these techniques improve the user's operational feeling without improving operability.
- the electronic device main body 10a of the electronic device 1a has the same configuration as the electronic device main body 10 described in the first embodiment, and a description thereof will be omitted.
- FIG. 8 is a diagram illustrating an example of the internal configuration of the stylus 20a of the electronic apparatus 1a according to the present embodiment.
- the same components as those of FIG. 5 are denoted by the same reference numerals, and the description thereof is omitted.
- the stylus 20a shown in FIG. 8 differs from the stylus 20 shown in FIG. 5 in that the sphere control unit 202 is omitted and a vibration unit 205 is newly added.
- the vibration unit 205 can receive a command from the stylus control unit 204 and can apply vibration to the user's hand holding the stylus 20a.
- a piezoelectric vibrator can be used for the vibration unit 205.
- The stylus 20a differs in operation from the stylus 20 in that, when the user's drag operation deviates from the movement assist line, vibration is applied instead of increasing the frictional resistance between the touch panel 102 and the stylus.
- the drag operation can be guided to the movement assist line by applying vibration to the user's hand.
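In this second embodiment the corrective signal is vibration rather than friction. A minimal sketch of one plausible mapping (the function name, tolerance, gain, and amplitude values are assumptions):

```python
def vibration_amplitude(deviation, line_tol=1.0, gain=0.2, max_amplitude=1.0):
    """Amplitude for the vibration unit 205: silent while the drag stays
    on the movement assist line, growing with the deviation once the
    drag strays beyond the tolerance."""
    if deviation <= line_tol:
        return 0.0
    return min(max_amplitude, gain * (deviation - line_tol))
```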
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Position Input By Displaying (AREA)
- User Interface Of Digital Computer (AREA)
- Telephone Function (AREA)
Abstract
Description
The present invention is based on and claims the benefit of the priority of Japanese patent application No. 2011-118733 (filed on May 27, 2011), the entire disclosure of which is incorporated herein by reference.
Next, the first embodiment will be described in more detail with reference to the drawings. FIG. 3 shows an example of the appearance of the electronic device 1 according to this embodiment. The electronic device 1 includes an electronic device main body 10 and a stylus 20. The electronic device main body 10 and the stylus 20 can transmit and receive data by wire or wirelessly; in this embodiment, the two are assumed to communicate wirelessly. The electronic device main body 10 includes a display unit 101 and a touch panel 102; information necessary for user operation is displayed on the display unit 101, and user operations are received through the touch panel 102.
Here, θ is the movement angle and L is the movement distance.
Next, the second embodiment will be described in detail with reference to the drawings. The electronic device main body 10a of the electronic device 1a according to this embodiment has the same configuration as the electronic device main body 10 described in the first embodiment, and its description is therefore omitted.
10, 10a Electronic device main body
20, 20a Stylus
101 Display unit
102 Touch panel
103 Main body communication unit
104 Main body control unit
201 Stylus communication unit
202 Sphere control unit
203 Sphere unit
204 Stylus control unit
205 Vibration unit
301 Operation unit
302 Auxiliary input means
303 Control unit
Claims (10)
- An electronic device comprising: an operation unit that receives a user's operation; auxiliary input means that the user uses when inputting information to the operation unit and that can change a sense felt by the user; and a control unit that guides the user's operation to a predetermined operation by changing the sense given to the user by the auxiliary input means in accordance with an operation performed by the user using the auxiliary input means.
- The electronic device according to claim 1, wherein the control unit calculates a movement assist line that guides the user's operation from a first operation point at which the operation unit accepted the user's operation and a second operation point at which the user is assumed to operate next based on the first operation point, and strengthens the sense that the auxiliary input means gives to the user when the user's operation deviates from the movement assist line.
- The electronic device according to claim 1 or 2, wherein the control unit and the auxiliary input means can transmit and receive data to and from each other by wire or wirelessly.
- The electronic device according to any one of claims 1 to 3, wherein the auxiliary input means changes the sense felt by the user by changing the frictional resistance of a portion in contact with the operation unit.
- The electronic device according to any one of claims 1 to 3, wherein the auxiliary input means changes the sense felt by the user by vibrating a housing of the auxiliary input means.
- The electronic device according to any one of claims 1 to 5, wherein the operation unit receives the user's operation through a touch panel, and the auxiliary input means is a stylus.
- A method for controlling an electronic device comprising an operation unit that receives a user's operation and auxiliary input means that the user uses when inputting information to the operation unit, the method comprising: a step of detecting an operation performed by the user using the auxiliary input means; and a step of guiding the user's operation to a predetermined operation by changing the sense given to the user from the auxiliary input means in accordance with the operation performed by the user.
- The method for controlling an electronic device according to claim 7, wherein the step of guiding to the predetermined operation calculates a movement assist line that guides the user's operation from a first operation point at which the operation unit accepted the user's operation and a second operation point at which the user is assumed to operate next based on the first operation point, and strengthens the sense that the auxiliary input means gives to the user when the user's operation deviates from the movement assist line.
- A program to be executed by a computer that controls an electronic device comprising an operation unit that receives a user's operation and auxiliary input means that the user uses when inputting information to the operation unit, the program causing the computer to execute: a process of detecting an operation performed by the user using the auxiliary input means; and a process of guiding the user's operation to a predetermined operation by changing the sense given to the user from the auxiliary input means in accordance with the operation performed by the user.
- The program according to claim 9, wherein the process of guiding to the predetermined operation calculates a movement assist line that guides the user's operation from a first operation point at which the operation unit accepted the user's operation and a second operation point at which the user is assumed to operate next based on the first operation point, and strengthens the sense that the auxiliary input means gives to the user when the user's operation deviates from the movement assist line.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013518051A JP6007905B2 (ja) | 2011-05-27 | 2012-05-25 | Electronic device, control method thereof, and program |
EP20120792819 EP2717116A4 (en) | 2011-05-27 | 2012-05-25 | ELECTRONIC DEVICE, CONTROL PROCEDURE THEREFOR AND PROGRAM |
US14/119,742 US20140089793A1 (en) | 2011-05-27 | 2012-05-25 | Electronic apparatus, control method thereof, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011118733 | 2011-05-27 | ||
JP2011-118733 | 2011-05-27 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012165325A1 (ja) | 2012-12-06 |
Family
ID=47259182
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/063463 WO2012165325A1 (ja) | Electronic device, control method thereof, and program | 2011-05-27 | 2012-05-25 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20140089793A1 (ja) |
EP (1) | EP2717116A4 (ja) |
JP (1) | JP6007905B2 (ja) |
WO (1) | WO2012165325A1 (ja) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102446653B1 (ko) * | 2021-11-10 | 2022-09-23 | (주)웨이투메이크 | Photocatalyst module using photocatalytic material and method for measuring photocatalytic characteristics of the photocatalyst module |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0535398A (ja) * | 1991-07-30 | 1993-02-12 | Toshiba Corp | Mouse device |
JPH0683540A (ja) * | 1992-09-03 | 1994-03-25 | Pfu Ltd | Cursor position designation method |
JP2003097964A (ja) * | 2001-09-26 | 2003-04-03 | Nissan Motor Co Ltd | Input control device |
JP2004171157A (ja) * | 2002-11-18 | 2004-06-17 | Fuji Xerox Co Ltd | Tactile interface device |
JP2004246451A (ja) * | 2003-02-12 | 2004-09-02 | Matsushita Electric Ind Co Ltd | Trackball device and electronic apparatus using the same |
JP2005317006A (ja) | 2004-04-28 | 2005-11-10 | Fuji Xerox Co Ltd | Force feedback device, force feedback generation method, force feedback generation system, and program |
JP2007302213A (ja) * | 2006-05-15 | 2007-11-22 | Toyota Motor Corp | Vehicle input device |
JP2008257748A (ja) | 2000-05-24 | 2008-10-23 | Immersion Corp | Haptic devices using electroactive polymers |
JP2009075995A (ja) * | 2007-09-25 | 2009-04-09 | Hitachi Ltd | Coordinate pointing device |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7578742B2 (en) * | 2004-03-26 | 2009-08-25 | Nintendo Co., Ltd. | Recording medium storing video game program and video game device |
WO2007049253A2 (en) * | 2005-10-28 | 2007-05-03 | Koninklijke Philips Electronics N.V. | Display system with a haptic feedback via interaction with physical objects |
US9468846B2 (en) * | 2009-01-30 | 2016-10-18 | Performance Designed Products Llc | Tactile feedback apparatus and method |
US9417695B2 (en) * | 2010-04-08 | 2016-08-16 | Blackberry Limited | Tactile feedback method and apparatus |
-
2012
- 2012-05-25 US US14/119,742 patent/US20140089793A1/en not_active Abandoned
- 2012-05-25 EP EP20120792819 patent/EP2717116A4/en not_active Withdrawn
- 2012-05-25 JP JP2013518051A patent/JP6007905B2/ja not_active Expired - Fee Related
- 2012-05-25 WO PCT/JP2012/063463 patent/WO2012165325A1/ja active Application Filing
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0535398A (ja) * | 1991-07-30 | 1993-02-12 | Toshiba Corp | Mouse device |
JPH0683540A (ja) * | 1992-09-03 | 1994-03-25 | Pfu Ltd | Cursor position designation method |
JP2008257748A (ja) | 2000-05-24 | 2008-10-23 | Immersion Corp | Haptic devices using electroactive polymers |
JP2003097964A (ja) * | 2001-09-26 | 2003-04-03 | Nissan Motor Co Ltd | Input control device |
JP2004171157A (ja) * | 2002-11-18 | 2004-06-17 | Fuji Xerox Co Ltd | Tactile interface device |
JP2004246451A (ja) * | 2003-02-12 | 2004-09-02 | Matsushita Electric Ind Co Ltd | Trackball device and electronic apparatus using the same |
JP2005317006A (ja) | 2004-04-28 | 2005-11-10 | Fuji Xerox Co Ltd | Force feedback device, force feedback generation method, force feedback generation system, and program |
JP2007302213A (ja) * | 2006-05-15 | 2007-11-22 | Toyota Motor Corp | Vehicle input device |
JP2009075995A (ja) * | 2007-09-25 | 2009-04-09 | Hitachi Ltd | Coordinate pointing device |
Non-Patent Citations (1)
Title |
---|
See also references of EP2717116A4 |
Also Published As
Publication number | Publication date |
---|---|
EP2717116A1 (en) | 2014-04-09 |
JPWO2012165325A1 (ja) | 2015-02-23 |
JP6007905B2 (ja) | 2016-10-19 |
EP2717116A4 (en) | 2015-05-06 |
US20140089793A1 (en) | 2014-03-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5295328B2 (ja) | User interface device allowing input via a screen pad, input processing method, and program | |
US20080122797A1 (en) | Apparatus, method, and medium for outputting tactile feedback on display device | |
US20120154293A1 (en) | Detecting gestures involving intentional movement of a computing device | |
JP5871965B2 (ja) | Scroll device for electronic apparatus and method thereof | |
US20120297336A1 (en) | Computer system with touch screen and associated window resizing method | |
US20210208768A1 (en) | Visual Manipulation of a Digital Object | |
EP3211510B1 (en) | Portable electronic device and method of providing haptic feedback | |
JP6248678B2 (ja) | Information processing device, handwriting input program, and handwriting input method | |
US20120204127A1 (en) | Portable electronic device and method of controlling same | |
WO2013157663A1 (ja) | Input control method, computer, and program | |
KR20130091140A (ko) | Apparatus and method for haptic feedback between a touchscreen device and an auxiliary device | |
CN108885556B (zh) | Controlling digital input | |
JP6007905B2 (ja) | Electronic device, control method thereof, and program | |
US10365757B2 (en) | Selecting first digital input behavior based on a second input | |
JP7037344B2 (ja) | Input control device, input device, operation target apparatus, and program | |
JP2016133978A (ja) | Information processing device, information processing method, and program | |
JP6283280B2 (ja) | Electronic book browsing device and electronic book browsing method | |
US20130293483A1 (en) | Selectable object display method and apparatus | |
WO2015098560A1 (ja) | Control device, control method, and program | |
EP2660698A1 (en) | Selectable object display method and apparatus | |
JP2006039819A (ja) | Coordinate input device | |
EP2487569B1 (en) | Portable electronic device and method of controlling same | |
WO2017159796A1 (ja) | Information processing method and information processing device | |
WO2016158125A1 (ja) | Electronic device | |
JP2019074999A (ja) | Electronic device, control method thereof, program, and storage medium | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12792819 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2013518051 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14119742 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
REEP | Request for entry into the european phase |
Ref document number: 2012792819 Country of ref document: EP |