WO2013128989A1 - Display device and operating method thereof - Google Patents
Display device and operating method thereof
- Publication number
- WO2013128989A1 (PCT/JP2013/051412)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- display
- contact
- unit
- proximity
- tactile sense
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03547—Touch pads, in which fingers can move on a surface
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0487—Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04108—Touchless 2D-digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface without distance measurement in the Z direction
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04806—Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
Definitions
- The present invention relates to a display device and an operation method thereof, and more particularly to a display device operated via a touch panel and an operation method thereof.
- A touch panel is arranged on a display, and an operation corresponding to a button or the like is performed by touching the region of the display where the button is shown.
- JP 2010-250750 A; JP 2011-501298 A (published Japanese translation of a PCT application)
- In conventional devices, tactile presentation and display control are fixed in advance as a single pair according to the trajectory of the finger touching the touch panel, so an operation different from the one the user actually intends may sometimes be performed.
- The present invention has been made in view of the above problem. It is an object of the present invention to provide a display device that performs operations according to contact operations and presents tactile sensations on contact, and that can perform an operation corresponding to each of the various operations a user may make, together with an operation method thereof.
- The present invention provides: display means for displaying information; contact/proximity detection means, provided on the display means, for detecting a contact and/or proximity operation; tactile presentation means for presenting a tactile sensation via the contact/proximity detection means; input determination means for determining, based on conditions set for the contact and/or proximity operation, the display of the display means and the operation of the tactile presentation means according to the contact and/or proximity operation detected by the contact/proximity detection means; and control means for controlling the display of the display means and the operation of the tactile presentation means based on the determination by the input determination means.
- In the operation method, based on conditions set for the contact and/or proximity operation, the display of the display means and the operation of the tactile presentation means are controlled according to the contact and/or proximity operation detected by the contact/proximity detection means.
- The program causes the display device to execute: a procedure for determining, based on conditions set for the contact and/or proximity operation, the display of the display means and the operation of the tactile presentation means according to the detected contact and/or proximity operation; and a procedure for controlling, based on that determination, the display of the display means and the operation of the tactile presentation means.
- According to the present invention, since the display of the display means and the operation of the tactile presentation means are controlled according to the detected contact and/or proximity operation based on conditions set for such operations, an operation corresponding to each of the various operations performed by the user can be carried out.
- FIG. 1 is an external perspective view showing an embodiment of the display device of the present invention. FIG. 2 is a block diagram showing an example of the internal configuration of the mobile terminal shown in FIG. 1. FIG. 3 is a diagram showing an example of the table held by the input determination unit shown in FIG. 2. FIG. 4 is a flowchart for explaining an operation method, based on the table shown in FIG. 3, in the mobile terminal shown in FIGS. 1 and 2. FIGS. 5a to 5c are diagrams for explaining an example of the operation.
- FIG. 1 is an external perspective view showing an embodiment of a display device of the present invention.
- FIG. 2 is a block diagram showing an example of the internal configuration of the mobile terminal 1 shown in FIG. 1.
- In the mobile terminal 1, a touch panel unit 10 serving as contact/proximity detection means, a display unit 20 serving as display means, and a speaker unit 30 are provided on one surface of a flat housing 2.
- The mobile terminal 1 is also provided with a microphone unit 40 and a key button unit 50 on the side surface of the housing 2.
- The display unit 20 is composed of a liquid crystal display or the like, and displays information such as images under the control of the CPU 90, which serves as control means.
- The touch panel unit 10 is provided on the display unit 20 and detects a contact operation when a contact body such as a finger touches it. Examples include resistive-film and capacitive touch panels.
- The speaker unit 30 outputs, under the control of the CPU 90, sound received via the communication unit 80, sound stored in a memory (not shown) of the CPU 90, and the like.
- The microphone unit 40 is used to input sound.
- The key button unit 50 is used to input information for operating the mobile terminal 1.
- The communication unit 80 transmits and receives information via a mobile communication network.
- The tactile presentation unit 60 includes, for example, a small vibrator, and presents, by vibrating the touch panel unit 10, a tactile sensation corresponding to the image displayed on the display unit 20 to a contact body such as a finger touching the touch panel unit 10.
- The input determination unit 70 has a table for determining the display of the display unit 20 and the operation of the tactile presentation unit 60 according to the touch operation on the touch panel unit 10.
- The input determination unit 70 refers to this table and determines the display of the display unit 20 and the operation of the tactile presentation unit 60 according to the touch operation on the touch panel unit 10.
- FIG. 3 is a diagram illustrating an example of the table included in the input determination unit 70 illustrated in FIG. 2.
- In the table included in the input determination unit 70 shown in FIG. 2, the display of the display unit 20 and the operation of the tactile presentation unit 60 are associated with each contact operation performed on the touch panel unit 10. That is, the table sets conditions on touch operations on the touch panel unit 10 for the display of the display unit 20 and the operation of the tactile presentation unit 60. For example, when an operation of tracing the touch panel unit 10 is detected within 0.5 seconds after a touch is first detected, an operation such as scroll display or page turning by flicking is performed on the display unit 20, while the tactile presentation unit 60 does not operate and no tactile sensation is presented.
- Whether to perform scroll display on the display unit 20 or to turn the page by flicking is set according to the speed of the tracing movement on the touch panel unit 10.
- Conversely, when the tracing operation is detected more than 0.5 seconds after the touch is first detected, an operation is set in which the tactile presentation unit 60 operates and presents a tactile sensation without changing the display on the display unit 20. The table also sets that, when a touch operation is detected, a moving image being displayed on the display unit 20 is paused.
- The input determination unit 70 thus determines the display of the display unit 20 and the operation of the tactile presentation unit 60 according to the contact operation detected by the touch panel unit 10, based on the conditions set in the table as described above.
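The table-driven determination described above can be sketched as a lookup from gesture conditions to paired display and haptic actions. This is a hypothetical illustration: the condition keys, action names, and the `determine` helper are assumptions for the sake of example, not taken from the patent's FIG. 3.

```python
# Hypothetical sketch of the input-determination table (cf. FIG. 3):
# each (gesture, qualifier) condition maps to a pair of actions,
# one for the display and one for the tactile presentation unit.
GESTURE_TABLE = {
    # tracing begun within 0.5 s of first contact: scroll/flick, no haptics
    ("trace", "within_0.5s"): ("scroll_or_flick", None),
    # tracing begun after 0.5 s: display unchanged, haptics presented
    ("trace", "after_0.5s"): ("no_change", "present_tactile"),
    ("pinch_in", None): ("reduce_region", None),
    ("pinch_out", None): ("enlarge_region", None),
    # press held for 1 s or more: enlarge touched region, stronger haptics
    ("hold", "1s_or_more"): ("enlarge_region", "strengthen_tactile"),
}

def determine(gesture, qualifier=None):
    """Return the (display_action, haptic_action) pair for a gesture."""
    return GESTURE_TABLE.get((gesture, qualifier), ("no_change", None))
```

A control layer would then apply the display action and haptic action independently, which is what allows the same finger trajectory to trigger different behavior under different conditions.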
- The CPU 90 incorporates a memory (not shown), controls the entire mobile terminal 1, and controls the display of the display unit 20 and the operation of the tactile presentation unit 60 based on the determination by the input determination unit 70.
- FIG. 4 is a flowchart for explaining an operation method based on the table shown in FIG. 3 in the mobile terminal 1 shown in FIGS. 1 and 2.
- When an image is displayed on the display unit 20 of the mobile terminal 1 and the user touches the touch panel unit 10 with a finger, the touch operation is detected by the touch panel unit 10 (step 1), and the CPU 90 starts controlling the operation of the mobile terminal 1 corresponding to the contact operation.
- The input determination unit 70, based on the conditions set in the table shown in FIG. 3, determines an operation in which the moving image displayed on the display unit 20 is paused and the tactile presentation unit 60 presents no tactile sensation. This determination is sent to the CPU 90, and when a moving image is displayed on the display unit 20, it is paused under the control of the CPU 90 (step 4). At this time, if the tactile presentation unit 60 is not operating, no control to stop it is needed; if it is operating, its operation is temporarily stopped under the control of the CPU 90.
- While the touch operation is being detected by the touch panel unit 10, the user moves the finger touching the touch panel unit 10, and the region where the touch operation is detected moves accordingly (step 5).
- The input determination unit 70 then determines whether the contact operation is a pinch-in or a pinch-out (step 6). This can be determined by checking that the number of touch points detected by the touch panel unit 10 is two and whether the positions of the two points move toward or away from each other; techniques commonly used in display devices having a touch panel unit 10 can be employed.
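The pinch decision in step 6 can be sketched as follows. The distance threshold, coordinate format, and function name are assumptions for illustration, not details given in the patent.

```python
import math

def classify_pinch(p1_start, p2_start, p1_end, p2_end, threshold=10.0):
    """Classify a two-point gesture as pinch-in (points approaching each
    other) or pinch-out (points separating), from start/end coordinates."""
    d_start = math.dist(p1_start, p2_start)
    d_end = math.dist(p1_end, p2_end)
    if d_end < d_start - threshold:
        return "pinch_in"    # the touched region would be reduced
    if d_end > d_start + threshold:
        return "pinch_out"   # the touched region would be enlarged
    return None              # neither: fall through to the trace check
```

Returning `None` for small distance changes leaves ambiguous two-finger movements to the later single-trace timing check, mirroring how the flowchart falls through to step 8.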
- When the input determination unit 70 determines that the touch operation on the touch panel unit 10 is a pinch-in, it determines, based on the conditions set in the table shown in FIG. 3, an operation in which the image in the region where the pinch-in was performed, among the images displayed on the display unit 20, is reduced and displayed, and the tactile presentation unit 60 presents no tactile sensation. This determination is sent to the CPU 90, and under its control the image in the pinched-in region is reduced and displayed without the tactile presentation unit 60 operating.
- When the input determination unit 70 determines that the touch operation on the touch panel unit 10 is a pinch-out, it determines an operation in which the image in the region where the pinch-out was performed, among the images displayed on the display unit 20, is enlarged and displayed, and the tactile presentation unit 60 presents no tactile sensation. This determination is sent to the CPU 90, and under its control the image in the pinched-out region is enlarged and displayed without the tactile presentation unit 60 operating (step 7).
- Otherwise, the input determination unit 70 determines whether the time from when the touch operation was detected by the touch panel unit 10 until the region where the touch operation is detected moved is within 0.5 seconds (step 8).
- When the time is not within 0.5 seconds, the input determination unit 70, based on the conditions set in the table shown in FIG. 3, determines an operation in which the tactile presentation unit 60 presents a tactile sensation without the image displayed on the display unit 20 being changed by scroll display or the like. This determination is sent to the CPU 90; under its control the display of the display unit 20 does not change, while the tactile presentation unit 60 operates and a tactile sensation is presented to the finger in contact with the touch panel unit 10 (step 9). As described above, the tactile presentation unit 60 is configured by a small vibrator or the like.
- The operation of the tactile presentation unit 60 corresponding to the region where the touch operation is detected by the touch panel unit 10 is set in advance.
- For example, when the tactile presentation unit 60 is configured by a small vibrator, the magnitude of the vibrator's vibration is set according to the region where the touch operation is detected by the touch panel unit 10. By having the CPU 90 control the vibrator to vibrate at that magnitude, the user can be made to feel as if he or she is touching the object shown in the image displayed in the contact region.
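The per-region vibration setting could be sketched as a simple mapping. The region names and amplitude values here are invented for illustration and are not taken from the patent.

```python
# Hypothetical map from the content of the touched region to a vibrator
# amplitude (0.0-1.0); a stronger vibration for "solid" objects helps
# the user feel as if the displayed object itself is being touched.
REGION_AMPLITUDE = {"button": 0.8, "photo": 0.5, "background": 0.1}

def vibration_amplitude(region):
    """Return the preset vibrator amplitude for a touched region."""
    return REGION_AMPLITUDE.get(region, 0.0)
```

The CPU would look up the amplitude for the region under the finger and drive the vibrator at that level while contact continues.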
- When the time from when the touch operation was detected by the touch panel unit 10 until the region where the touch operation is detected moved is within 0.5 seconds, the input determination unit 70, based on the conditions set in the table shown in FIG. 3, determines an operation in which the image displayed on the display unit 20 is scrolled in the direction in which the detected region moved and the tactile presentation unit 60 presents no tactile sensation. This determination is sent to the CPU 90, and under its control the image displayed on the display unit 20 is scrolled in that direction without the tactile presentation unit 60 operating (step 10).
- When the input determination unit 70 determines from the detected movement speed of the contact operation that a flick operation has been performed, a transition to the next item or page turning may be performed instead. A tactile sensation may also be presented during scroll display; in that case, this is set in the input determination unit 70.
- FIGS. 5a to 5c are diagrams for explaining an example of the operation of the mobile terminal 1 shown in FIGS. 1 and 2.
- When the user touches the image 4 with the finger 3, the touch operation on the image 4 is detected by the touch panel unit 10. Then, when the surface of the touch panel unit 10 is traced by the finger 3 in contact with it, the input determination unit 70 determines whether the time from when the contact operation was detected until the region where the touch operation is detected moved is within 0.5 seconds.
- In this way, based on conditions concerning the time during which the contact operation is performed and the locus of the contact operation on the touch panel unit 10, even when two contact operations detected by the touch panel unit 10 draw identical trajectories, the display of the display unit 20 and the operation of the tactile presentation unit 60 are determined to differ when the contact times on the touch panel unit 10 before the trajectories are drawn differ.
- The following applies when the contact operation is no longer detected in step 2, or when 0.3 seconds have passed since the contact operation was detected in step 1.
- The input determination unit 70 determines whether the state in which the touch operation was detected in step 1 has continued for 1 second or more (step 11).
- When it has, the input determination unit 70, based on the conditions set in the table shown in FIG. 3, determines an operation in which the image displayed in the region where the touch operation is detected on the touch panel unit 10 is enlarged and displayed, and the tactile sensation presented by the tactile presentation unit 60 is strengthened. This determination is sent to the CPU 90.
- Under the control of the CPU 90, the image displayed in the region where the touch operation is detected, among the images displayed on the display unit 20, is enlarged and displayed, and the presented tactile sensation is strengthened (step 12). When the tactile presentation unit 60 is composed of a small vibrator as described above, the tactile sensation can be strengthened, for example, by increasing the vibration of the vibrator so as to intensify the sensation given to the finger 3.
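Strengthening the sensation on a long press could be sketched as scaling the vibrator amplitude. The scaling factor and ceiling are assumptions, not values from the patent.

```python
def strengthen_tactile(amplitude, factor=1.5, ceiling=1.0):
    """Strengthen the presented tactile sensation by increasing the
    vibrator amplitude, clamped to the device's maximum level."""
    return min(amplitude * factor, ceiling)
```

Clamping to a ceiling matters in practice: a vibrator has a maximum safe drive level, so repeatedly strengthening the sensation must saturate rather than grow without bound.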
- When the state has not continued for 1 second or more, the input determination unit 70 determines an operation in which the tactile presentation unit 60 presents a tactile sensation. This determination is sent to the CPU 90, and under its control a tactile sensation corresponding to the image displayed in the region where the contact operation is detected is presented by the tactile presentation unit 60 (step 13). At this time, the display of the display unit 20 may be switched according to the image displayed in the region where the contact operation was detected in step 1, or may be left as it is.
- As described above, conditions on touch operations on the touch panel unit 10 are set for the display of the display unit 20 and the operation of the tactile presentation unit 60, and based on these conditions the display of the display unit 20 and the operation of the tactile presentation unit 60 are controlled according to the touch operation detected by the touch panel unit 10. Therefore, by setting conditions for the touch operations a user is expected to perform on the touch panel unit 10, an operation corresponding to each of the various operations performed by the user can be carried out. For example, when the user traces the touch panel unit 10 with a finger, it can be determined whether the operation is a scroll operation or an operation for confirming a tactile sensation, and the corresponding operation can be performed.
- Further, by setting in the table of the input determination unit 70 operations according to any combination of the number, position, time, trajectory, and movement speed of touch operations on the touch panel unit 10, operations according to the number, position, time, trajectory, and movement speed of contact operations, and combinations thereof, become possible.
- In the present embodiment, an example using only contact operations on the touch panel unit 10 has been described, but a device that also detects a proximity operation of a contact body such as a user's finger may be used.
- In that case, the touch panel unit 10 may detect not only a contact operation but also a proximity operation based on the distance from a contact body such as a user's finger.
- The information displayed on the display unit 20 may be, in addition to the images and moving images described above, information displayed on a browser screen, an e-mail screen, or the like received via the communication unit 80.
- The display device of the present invention is not limited to the mobile terminal 1 described above, and can be applied to any device that displays information and can detect a contact and/or proximity operation.
- The processing in the mobile terminal 1 may, instead of being performed by the dedicated hardware described above, be recorded on a recording medium readable by the mobile terminal 1, and the program recorded on the recording medium may be read and executed by the mobile terminal 1.
- The recording medium readable by the mobile terminal 1 refers to a transferable recording medium such as an IC card, a memory card, a floppy disk (registered trademark), a magneto-optical disk, a DVD, or a CD, as well as an HDD or the like built into the mobile terminal 1.
- The program recorded on this recording medium is read by, for example, a control block, and the same processing as described above is performed under the control of the control block.
Abstract
Description
Display means for displaying information;
contact/proximity detection means, provided on the display means, for detecting a contact and/or proximity operation;
tactile presentation means for presenting a tactile sensation via the contact/proximity detection means;
input determination means for determining, based on conditions set for the contact and/or proximity operation, the display of the display means and the operation of the tactile presentation means according to the contact and/or proximity operation detected by the contact/proximity detection means; and
control means for controlling the display of the display means and the operation of the tactile presentation means based on the determination by the input determination means.
Based on conditions set for the contact and/or proximity operation, the display of the display means and the operation of the tactile presentation means are controlled according to the contact and/or proximity operation detected by the contact/proximity detection means.
The display device is caused to execute:
a procedure for determining, based on conditions set for the contact and/or proximity operation, the display of the display means and the operation of the tactile presentation means according to the contact and/or proximity operation detected by the contact/proximity detection means; and
a procedure for controlling the display of the display means and the operation of the tactile presentation means based on the determination.
Claims (6)
- A display device comprising: display means for displaying information; contact/proximity detection means, provided on the display means, for detecting a contact and/or proximity operation; tactile presentation means for presenting a tactile sensation via the contact/proximity detection means; input determination means for determining, based on conditions set for the contact and/or proximity operation, the display of the display means and the operation of the tactile presentation means according to the contact and/or proximity operation detected by the contact/proximity detection means; and control means for controlling the display of the display means and the operation of the tactile presentation means based on the determination by the input determination means.
- The display device according to claim 1, wherein the input determination means determines the display of the display means and the operation of the tactile presentation means according to the detected contact and/or proximity operation, based on conditions concerning the number, position, time, trajectory, or movement speed of the contact and/or proximity operation, or a combination thereof.
- The display device according to claim 2, wherein, even when two operations detected by the contact/proximity detection means draw identical trajectories, the input determination means determines the display of the display means and the operation of the tactile presentation means to differ when the contact and/or proximity times before the trajectories are drawn differ.
- The display device according to claim 3, wherein, depending on the contact and/or proximity operation detected by the contact/proximity detection means, the input determination means determines to perform only one of an operation in which the display by the display means changes and an operation in which a tactile sensation is presented by the tactile presentation means.
- A method of operating a display device having display means for displaying information, contact/proximity detection means provided on the display means for detecting a contact and/or proximity operation, and tactile presentation means for presenting a tactile sensation via the contact/proximity detection means, the method comprising controlling, based on conditions set for the contact and/or proximity operation, the display of the display means and the operation of the tactile presentation means according to the contact and/or proximity operation detected by the contact/proximity detection means.
- A program for operating a display device having display means for displaying information, contact/proximity detection means provided on the display means for detecting a contact and/or proximity operation, and tactile presentation means for presenting a tactile sensation via the contact/proximity detection means, the program causing the display device to execute: a procedure for determining, based on conditions set for the contact and/or proximity operation, the display of the display means and the operation of the tactile presentation means according to the detected contact and/or proximity operation; and a procedure for controlling the display of the display means and the operation of the tactile presentation means based on the determination.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/382,405 US9563297B2 (en) | 2012-03-02 | 2013-01-24 | Display device and operating method thereof |
EP13754946.5A EP2821892A4 (en) | 2012-03-02 | 2013-01-24 | DISPLAY DEVICE AND METHOD OF OPERATION |
JP2014502071A JP6048489B2 (ja) | 2012-03-02 | 2013-01-24 | 表示装置 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012-047144 | 2012-03-02 | ||
JP2012047144 | 2012-03-02 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013128989A1 true WO2013128989A1 (ja) | 2013-09-06 |
Family
ID=49082194
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/051412 WO2013128989A1 (ja) | 2012-03-02 | 2013-01-24 | 表示装置及びその動作方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US9563297B2 (ja) |
EP (1) | EP2821892A4 (ja) |
JP (1) | JP6048489B2 (ja) |
WO (1) | WO2013128989A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015111416A (ja) * | 2013-11-26 | 2015-06-18 | イマージョン コーポレーションImmersion Corporation | 摩擦効果及び振動触覚効果を生成するシステム及び方法 |
JP2015521328A (ja) * | 2012-05-31 | 2015-07-27 | ノキア コーポレイション | ディスプレイ装置 |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9715279B2 (en) | 2014-06-09 | 2017-07-25 | Immersion Corporation | Haptic devices and methods for providing haptic effects via audio tracks |
US9588586B2 (en) * | 2014-06-09 | 2017-03-07 | Immersion Corporation | Programmable haptic devices and methods for modifying haptic strength based on perspective and/or proximity |
JP6341294B2 (ja) * | 2014-12-05 | 2018-06-13 | 富士通株式会社 | 触感提供システム、及び、触感提供装置 |
JP2018005274A (ja) * | 2016-06-27 | 2018-01-11 | ソニー株式会社 | 情報処理装置、情報処理方法およびプログラム |
US11232530B2 (en) * | 2017-02-28 | 2022-01-25 | Nec Corporation | Inspection assistance device, inspection assistance method, and recording medium |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009217816A (ja) * | 2008-03-10 | 2009-09-24 | Lg Electronics Inc | 端末機及びその制御方法 |
JP2010147973A (ja) * | 2008-12-22 | 2010-07-01 | Nec Corp | 携帯端末装置、操作通知方法および操作通知プログラム |
JP2010250750A (ja) | 2009-04-20 | 2010-11-04 | Sharp Corp | 入力器具、その制御方法、その制御プログラムおよびコンピュータ読み取り可能な記録媒体、ならびに、タッチパネル入力システム |
JP2011501298A (ja) | 2007-10-18 | 2011-01-06 | マイクロソフト コーポレーション | 音声、視覚、および触覚のフィードバックを使用した三次元オブジェクトシュミレーション |
JP2011054025A (ja) * | 2009-09-03 | 2011-03-17 | Denso Corp | 触感付与装置及びプログラム |
US20110210834A1 (en) * | 2010-03-01 | 2011-09-01 | Research In Motion Limited | Method of providing tactile feedback and apparatus |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8174503B2 (en) * | 2008-05-17 | 2012-05-08 | David H. Cain | Touch-based authentication of a mobile device through user generated pattern creation |
US9880621B2 (en) * | 2010-04-08 | 2018-01-30 | Disney Enterprises, Inc. | Generating virtual stimulation devices and illusory sensations using tactile display technology |
US9678569B2 (en) * | 2010-04-23 | 2017-06-13 | Immersion Corporation | Systems and methods for providing haptic effects |
US20110273380A1 (en) * | 2010-05-07 | 2011-11-10 | Research In Motion Limited | Portable electronic device and method of controlling same |
2013
- 2013-01-24 EP EP13754946.5A patent/EP2821892A4/en not_active Withdrawn
- 2013-01-24 JP JP2014502071A patent/JP6048489B2/ja not_active Expired - Fee Related
- 2013-01-24 US US14/382,405 patent/US9563297B2/en not_active Expired - Fee Related
- 2013-01-24 WO PCT/JP2013/051412 patent/WO2013128989A1/ja active Application Filing
Non-Patent Citations (1)
Title |
---|
See also references of EP2821892A4 |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015521328A (ja) * | 2012-05-31 | 2015-07-27 | Nokia Corporation | Display apparatus |
JP2015111416A (ja) * | 2013-11-26 | 2015-06-18 | Immersion Corporation | Systems and methods for generating friction and vibrotactile effects |
Also Published As
Publication number | Publication date |
---|---|
EP2821892A1 (en) | 2015-01-07 |
US20150103017A1 (en) | 2015-04-16 |
US9563297B2 (en) | 2017-02-07 |
JP6048489B2 (ja) | 2016-12-21 |
JPWO2013128989A1 (ja) | 2015-07-30 |
EP2821892A4 (en) | 2015-10-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6553136B2 (ja) | Systems and methods for multi-pressure interaction on touch-sensitive surfaces | |
JP6048489B2 (ja) | Display device | |
US10013063B2 (en) | Systems and methods for determining haptic effects for multi-touch input | |
JP6177669B2 (ja) | Image display device and program | |
JP5295328B2 (ja) | User interface device allowing input via a screen pad, input processing method, and program | |
US9836150B2 (en) | System and method for feedforward and feedback with haptic effects | |
JP6157885B2 (ja) | Display control method for a mobile terminal device | |
US10514796B2 (en) | Electronic apparatus | |
JP2013546110A (ja) | Enhanced interpretation of input events that occur when interacting with a computing device that utilizes the motion of the computing device | |
EP2860610A2 (en) | Devices and methods for generating tactile feedback | |
KR20140055880A (ko) | Method and apparatus for controlling a virtual screen | |
JP2016126363A (ja) | Method of inputting on a touch screen, portable electronic device, and computer program | |
TWI564780B (zh) | Touch screen gesture technology | |
JP2014059722A (ja) | Portable information device, soft keyboard display method, soft keyboard display program, and program recording medium | |
AU2022291627A1 (en) | Touch input device and method | |
JP6283280B2 (ja) | Electronic book browsing device and electronic book browsing method | |
JP7077024B2 (ja) | Electronic device, information processing method, program, and storage medium | |
WO2017159796A1 (ja) | Information processing method and information processing device | |
JP2017167792A (ja) | Information processing method and information processing device | |
JP2014160301A (ja) | Information processing device, information processing method, and program | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | EP: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 13754946; Country of ref document: EP; Kind code of ref document: A1 |
| REEP | Request for entry into the European phase | Ref document number: 2013754946; Country of ref document: EP |
| WWE | WIPO information: entry into national phase | Ref document number: 2013754946; Country of ref document: EP |
| ENP | Entry into the national phase | Ref document number: 2014502071; Country of ref document: JP; Kind code of ref document: A |
| NENP | Non-entry into the national phase | Ref country code: DE |
| WWE | WIPO information: entry into national phase | Ref document number: 14382405; Country of ref document: US |