WO2016038677A1 - Tactile Sensation Control System and Tactile Sensation Control Method - Google Patents
Tactile sensation control system and tactile sensation control method
- Publication number
- WO2016038677A1 (PCT/JP2014/073770)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- tactile sensation
- tactile
- user
- icon
- control unit
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03547—Touch pads, in which fingers can move on a surface
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0485—Scrolling or panning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
Definitions
- the present invention relates to a tactile sensation control system and a tactile sensation control method for controlling a tactile sensation of a user when operating a touch panel or a touch pad.
- a technique for imparting tactile sensation to a finger by irradiating the finger with ultrasonic waves is disclosed (for example, see Patent Documents 1 and 2).
- a technique for imparting a tactile sensation to a user by vibrating an arbitrary area of the touch panel by ultrasonic vibration is disclosed (for example, see Non-Patent Document 1).
- a technique for imparting a tactile sensation by dynamically (physically) raising and lowering an arbitrary area of a touch panel is disclosed (for example, see Patent Document 3).
- When using the techniques of Patent Documents 1 to 3 and Non-Patent Document 1, it is considered that the user can operate by tactile sensation without concentrating the line of sight on the display screen.
- However, Patent Documents 1 to 3 and Non-Patent Document 1 do not disclose any specific use, and therefore cannot provide an easy-to-use user interface.
- The present invention has been made to solve this problem, and its purpose is to provide a tactile sensation control system and a tactile sensation control method that allow the user to perform a user-friendly operation without concentrating the line of sight on the display screen during operation.
- A tactile sensation control system according to the present invention controls a user's tactile sensation when the user operates the operation surface of a touch panel or touch pad. It includes an operation detection unit that detects the user's operation on the operation surface, and a tactile sensation control unit that controls the tactile sensation of the operation surface so that the tactile sensation of the region that received the detected operation, or the tactile sensation of a region that follows the movement of the operation, changes over time.
- A tactile sensation control method according to the present invention controls the user's tactile sensation when the user operates the operation surface of a touch panel or touch pad. It detects the user's operation on the operation surface and controls the tactile sensation of the operation surface so that the tactile sensation of the region that received the detected operation, or the tactile sensation of a region following the movement of the operation, changes over time.
- Since the tactile sensation control system includes the operation detection unit that detects the user's operation on the operation surface of the touch panel or touch pad, and the tactile sensation control unit that controls the tactile sensation of the operation surface so that the tactile sensation of the region that received the detected operation, or the tactile sensation of a region that follows the movement of the operation, changes over time, the user can perform a user-friendly operation without concentrating the line of sight on the display screen during operation.
- Likewise, because the tactile sensation control method detects the user's operation on the operation surface and controls the tactile sensation of the operation surface so that the tactile sensation of the region that received the detected operation, or the tactile sensation of a region following the movement of the operation, changes over time, the user can perform a user-friendly operation without concentrating the line of sight on the display screen during operation.
- FIG. 1 is a block diagram showing an example of the configuration of the tactile sensation control device 1 according to the first embodiment.
- FIG. 1 shows the minimum necessary components constituting the tactile sensation control device 1.
- the tactile sensation control device 1 includes at least an operation detection unit 2 and a tactile sensation control unit 3.
- the operation detection unit 2 detects a user operation on the operation surface of the touch panel or the touch pad.
- The tactile sensation control unit 3 controls the tactile sensation on the operation surface so that the tactile sensation in the region where the user's operation detected by the operation detection unit 2 was received, or the tactile sensation in a region that follows the movement of the user's operation, changes over time.
- FIG. 2 is a diagram illustrating an example of three tactile sensations: “smooth”, “half-rough”, and “gritty”. The horizontal axis indicates the strength of the tactile sensation: the leftmost column shows “smooth”, the middle two columns show “half-rough”, and the rightmost column shows “gritty”.
- The tactile sensation of each entire square is expressed by vibrating the black dot or line pattern shown in the square, for example by ultrasonic vibration. That is, when the intensity of vibration is the same in each square, the “rough” tactile sensation is stronger on the right side than on the left side of FIG. 2.
- The first row in FIG. 2 indicates that the larger the dots, the stronger the rough feel.
- The second row indicates that the narrower the grid interval, the stronger the rough feel.
- The third row indicates that the rough feel increases as the line changes from a broken line to a solid line and becomes thicker. Note that the rough texture patterns are not limited to those in FIG. 2; there are infinitely many possible combinations.
- FIG. 2 shows how different rough feels are obtained by changing the pattern while keeping the vibration intensity the same; a different rough tactile sensation can also be obtained by changing the vibration intensity while keeping the pattern the same.
- the texture of“ smooth ” can be expressed by not performing ultrasonic vibration, for example.
- the texture of “gritty” can be expressed by, for example, performing ultrasonic vibration with a strength equal to or greater than a predetermined threshold.
- The “half-rough” tactile sensation can be expressed by, for example, applying ultrasonic vibration weaker than the above threshold; by varying this vibration intensity, different strengths of the rough tactile sensation can be expressed.
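The threshold rule described above can be pictured with a small sketch; this is not part of the patent text, and the threshold value and function name are purely illustrative assumptions:

```python
# Illustrative only: THRESHOLD and tactile_level() are assumptions,
# not names from the patent.
THRESHOLD = 0.6  # assumed normalized intensity at/above which the feel is "gritty"

def tactile_level(intensity: float) -> str:
    """Map a normalized ultrasonic vibration intensity in [0, 1] to a tactile label."""
    if intensity <= 0.0:
        return "smooth"      # no ultrasonic vibration
    if intensity >= THRESHOLD:
        return "gritty"      # vibration at or above the threshold
    return "half-rough"      # weaker vibration below the threshold
```

Varying `intensity` within the "half-rough" band then expresses the different strengths of the rough tactile sensation mentioned above.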
- So far, rough texture patterns and static rough tactile sensations whose vibration intensity does not change have been described.
- In the following, the vibration intensity or the rough texture pattern is changed over time.
- Changing the tactile sensation pattern over time within the same region is referred to as a fixed tactile sensation change.
- FIGS. 3 to 5 are diagrams illustrating examples of generating a “moving rough” tactile sensation by changing the intensity of vibration over time. In FIGS. 3 to 5, the horizontal axis indicates time and the vertical axis indicates the strength of the tactile sensation.
- FIG. 3 shows a case where the strength of the tactile sensation is constant and the tactile sensation occurs at a constant cycle.
- FIG. 4 shows a case where the strength of the tactile sensation changes while the tactile sensation occurs at a constant cycle.
- FIG. 5 shows a case where the strength of the tactile sensation is constant and the cycle of the tactile sensation changes.
- With such control, the user can obtain a tactile sensation as if, for example, the “gritty” area were moving (that is, a “moving rough” tactile sensation).
- In FIGS. 3 to 5, the “gritty” and “smooth” tactile sensations are switched alternately, but the “gritty” and “half-rough” tactile sensations may be switched alternately instead.
- The “gritty” tactile sensation may also be switched continuously rather than discretely, and continuous and discrete changes may be freely combined.
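As one hedged way to picture the FIG. 3 style change (constant strength, constant cycle), the vibration strength can be modeled as a square wave of time; the function and its parameter values are assumptions for illustration, not from the patent:

```python
def pulsed_intensity(t: float, period: float = 0.5, duty: float = 0.5,
                     strength: float = 1.0) -> float:
    """Square-wave vibration strength: 'gritty' during the on-phase of each
    cycle, 'smooth' (zero vibration) during the off-phase."""
    phase = (t % period) / period   # position within the current cycle, in [0, 1)
    return strength if phase < duty else 0.0
```

The FIG. 4 variant would modulate `strength` over time, and the FIG. 5 variant would modulate `period`.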
- FIG. 6 is a diagram illustrating another example of generating a “moving rough” tactile sensation by changing the rough tactile sensation pattern over time.
- the vertical axis indicates time.
- In FIG. 6, for example, region a has no tactile sensation while region b has a tactile sensation.
- The positions of region a and region b are moved over time.
- By moving region a and region b in this way, the user can obtain a tactile sensation as if the “rough” region were moving (that is, a “moving rough” tactile sensation).
- Each of region a and region b may have a tactile sensation as shown in any of FIGS.
- the change in the position where the tactile sensation occurs with the passage of time is called a moving tactile sensation change.
- In FIG. 6, the “gritty” tactile region and the “smooth” tactile region are moved over time, but a “gritty” region and a “half-rough” region may instead be configured and moved over time, or a region in which the “gritty” tactile sensation changes discretely or continuously may be moved over time. In FIGS. 3 to 6, when a continuously changing “gritty” tactile sensation is adopted, a smoothly “moving gritty” tactile sensation can be obtained.
- FIG. 7 is a block diagram showing an example of the configuration of the tactile sensation control device 4.
- the tactile sensation control device 4 includes an operation detection unit 2, a tactile sensation control unit 3, a control unit 5, and a display information generation / output unit 6. Further, the display information generation / output unit 6 is connected to a display 7, and the operation detection unit 2 and the tactile sensation control unit 3 are connected to a tactile touch panel 8.
- the control unit 5 controls the tactile sensation control device 4 as a whole. In the example shown in FIG. 7, the control unit 5 controls the tactile sensation control unit 3 and the display information generation / output unit 6.
- the display information generation / output unit 6 generates display information in accordance with instructions from the control unit 5.
- the display information generation / output unit 6 converts the generated display information into a video signal and outputs it to the display unit 7.
- the tactile sensation control unit 3 changes the tactile sensation on the operation surface so that the tactile sensation in the region where the user's operation detected by the operation detection unit 2 is received or the tactile sensation in the region following the movement of the user's operation changes with time.
- the tactile sensation control information for control is output to the tactile sensation touch panel 8.
- the operation detection unit 2 acquires information related to a user operation on the tactile touch panel 8 as operation information from the tactile touch panel 8 (that is, detects a user operation on the tactile touch panel 8).
- the display device 7 displays the display information input from the display information generation / output unit 6 on the display screen.
- the tactile touch panel 8 outputs information related to the touch operation by the user (information such as the presence / absence of touch, the position of the touch, and the operation content) to the operation detection unit 2 as operation information.
- the tactile touch panel 8 changes the tactile sensation (“smooth”, “half-rough”, “rough”, “moving”) on the touch panel based on tactile control information input from the tactile control unit 3.
- The tactile touch panel 8 is provided on the display screen of the display device 7, and the user can operate it with a sense of directly operating the display screen. That is, the display screen area of the display device 7 and the area where the tactile touch panel 8 generates tactile sensation may completely coincide, or one of the two areas may be wider than the other.
- The tactile touch panel 8 may also be installed so that the area where it generates tactile sensation protrudes beyond the display screen area of the display device 7, and the protruding area, although it displays nothing, may be used as an area where touch operations can be input.
- FIG. 8 is a flowchart showing an example of the operation of the tactile sensation control device 4.
- FIG. 9 is a diagram illustrating an example of the operation of the tactile sensation control device 4, and an operation icon 9 is displayed on the display 7.
- the operation icon 9 is an icon for the user to perform an operation (icon operation) by arbitrary touch input with respect to information displayed on the display screen of the display device 7.
- In step S11, the display information generation/output unit 6 generates display information in accordance with an instruction from the control unit 5, converts the generated display information into a video signal, and outputs the video signal to the display device 7.
- As a result, a screen as shown in FIG. 9 is displayed on the display device 7.
- In step S12, the tactile sensation control unit 3 sets the tactile sensation control information for the region of the operation icon 9 to “gritty” in accordance with the instruction from the control unit 5.
- In step S13, the tactile sensation control unit 3 outputs the tactile sensation control information set in step S12 to the tactile touch panel 8.
- The tactile touch panel 8 controls the tactile sensation in the region of the operation icon 9 to be “gritty” based on the tactile sensation control information input from the tactile sensation control unit 3.
- In step S14, the control unit 5 determines whether the user has touched the tactile touch panel 8 (whether there is an operation input) based on the detection result from the operation detection unit 2. The process waits until the user touches the tactile touch panel 8, and when the user does, the process proceeds to step S15.
- In step S15, the tactile sensation control unit 3 performs control so as to change the tactile sensation in the region of the operation icon 9.
- the tactile sensation in the area of the operation icon 9 changes while the user touches the operation icon 9.
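The S11 to S15 flow above can be sketched as a small loop; the panel class and event stream below are hypothetical stand-ins for the tactile touch panel 8 and the operation detection unit 2, not interfaces from the patent:

```python
class FakeTactilePanel:
    """Hypothetical stand-in that records the tactile control commands it receives."""
    def __init__(self):
        self.log = []
    def set_tactile(self, region, level):
        self.log.append(("set", region, level))
    def animate_tactile(self, region):
        self.log.append(("animate", region))

def icon_touch_flow(panel, touch_events, icon_region="icon9"):
    """S12/S13: make the icon region 'gritty'; S14: wait for a touch in the
    region; S15: change the region's tactile sensation once it is touched."""
    panel.set_tactile(icon_region, "gritty")
    for touched_region in touch_events:     # simulated detections (step S14)
        if touched_region == icon_region:
            panel.animate_tactile(icon_region)
            return True
    return False
```

Running `icon_touch_flow` with a touch stream that eventually hits the icon region produces the set-then-animate command sequence described in steps S12 to S15.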
- The icon operation by touch input in the first embodiment also includes an operation method in which the user feels a tactile sensation as a preliminary cue when lightly touching the operation surface of the tactile touch panel 8, and the icon operation is accepted only when the user presses the operation surface strongly.
- a specific example of a change in tactile sensation in the region of the operation icon 9 will be described.
- The tactile sensation in the region of the operation icon 9 may be changed in two stages, “gritty” and “smooth”, with the passage of time, or in stages such as “gritty”, “moving gritty”, and “gritty”. The tactile sensations may also be combined and changed in multiple stages.
- The strength of the tactile sensation in the region of the operation icon 9 may be increased stepwise (discontinuously) over time. The strength may instead be increased continuously, or the tactile sensation pattern may be changed discontinuously or continuously. As shown in FIG. 11, the strength of the tactile sensation in the region of the operation icon 9 may be kept constant while the generation cycle of the tactile sensation is shortened over time; that is, the tactile sensation in the region of the operation icon 9 may change through a change in its cycle (and may remain periodic). As shown in FIG. 12, the strength of the tactile sensation may be changed only in the upper region of the operation icon 9 (or in the lower, right, or left region).
- the change in tactile sensation may be different for each of a plurality of areas in the operation icon 9.
- the tactile sensation of the area of the operation icon 9 may be “moving rough”. By performing such an operation, the user can easily grasp that the operation icon 9 has been operated.
- In this case, the tactile sensation control unit 3 controls the tactile sensation so as to generate a moving tactile sensation change that changes over time, based on a predetermined tactile sensation change rule.
- The strength of the tactile sensation in the region of the operation icon 9 may be kept constant while the shape of the region to which the tactile sensation is given is changed over time.
- the tactile sensation in the region of the operation icon 9 may be changed, and the shape of the region to which the tactile sensation is provided in the region of the operation icon 9 may be changed over time.
- the tactile sensation of the region of the operation icon 9 may be changed, and the shape of the region to which the tactile sensation is provided in the region of the operation icon 9 may be rotated over time.
- the shape of the operation icon 9 displayed on the display unit 7 may be changed in accordance with the change in the shape of the region to which the tactile sensation is given in the region of the operation icon 9. Further, the tactile sensation in the region of the operation icon 9 may be changed by any of the above methods.
- The tactile sensation in the region of the operation icon 9 may be changed according to the mode. For example, when switching between mode A and mode B, the change in the tactile sensation of the region of the operation icon 9 when switching from mode A to mode B may differ from the change when switching from mode B to mode A. In this way, the user can easily grasp which mode is about to be selected.
- An example of an icon for receiving a mode switching operation is an icon for switching between a north-up map display and a heading-up map display when the display device 7 displays a map.
- the tactile sensation in the region of the operation icon 9 may be changed by any of the above methods.
- the mode switching is not limited to switching display contents or display methods, but includes changing the type of operation to be accepted, or changing the operation mode of a device connected to the system including the tactile sensation control device 4.
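One minimal way to realize the direction-dependent mode feedback described above is a transition table keyed by (current mode, target mode); the table contents and pattern names below are purely illustrative assumptions:

```python
# Illustrative transition table: a distinct tactile-change pattern per
# direction of the mode switch, so the user can tell which mode is entered.
MODE_CHANGE_PATTERN = {
    ("north-up", "heading-up"): "rising",   # e.g. intensity ramps up
    ("heading-up", "north-up"): "falling",  # e.g. intensity ramps down
}

def tactile_change_for_switch(current: str, target: str) -> str:
    """Pick the tactile-change pattern for a mode switch; fall back to a
    generic pulse for transitions not in the table."""
    return MODE_CHANGE_PATTERN.get((current, target), "pulse")
```

The map-display icon mentioned above would look up its transition here before the tactile sensation control unit applies the chosen pattern.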
- the tactile sensation of the region of the operation icon 9 may be changed, and the display form of the operation icon 9 displayed on the display unit 7 may be changed.
- the display form of the operation icon 9 includes the shape and color of the operation icon 9.
- For example, the shape of the operation icon 9 may be changed so that a toggle-switch operation can be recognized, and the tactile sensation in the region of the operation icon 9 may be changed accordingly.
- the tactile sensation of the region of the operation icon 9 may be changed by any of the above methods. By doing so, the user can easily grasp that the operation icon 9 has been operated.
- It may be determined that a light touch on the operation icon 9 is a predetermined first operation when, for example, the tactile sensation or the touch period exceeds a predetermined threshold value, and that a touch that pushes the operation icon 9 in (with a predetermined pressure or more) is an icon operation, which is a predetermined second operation. At this time, the tactile sensation in the region of the operation icon 9 may be changed when the icon is pushed in (a second tactile sensation change), may be changed from when it is lightly touched until it is pushed in (a first tactile sensation change), or both changes may be combined.
- the tactile sensation in the region of the operation icon 9 may be changed by any of the above methods. By doing so, it is possible to prevent an erroneous operation and reliably accept an operation intended by the user.
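The two-stage acceptance described above (light touch as a tactile preview, firm press as the actual icon operation) might be classified from a pressure reading as sketched below; the threshold value and names are assumptions, not from the patent:

```python
PRESS_THRESHOLD = 0.7  # assumed normalized pressure counting as "pushed in"

def classify_touch(pressure: float) -> str:
    """Classify a normalized pressure reading into the two-stage scheme."""
    if pressure <= 0.0:
        return "none"      # finger not on the operation surface
    if pressure < PRESS_THRESHOLD:
        return "preview"   # light touch: first tactile change only
    return "operate"       # firm press: accept the icon operation
```

Only the "operate" classification would trigger the icon operation, which is how the scheme prevents the erroneous operations mentioned above.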
- As described above, since the tactile sensation is changed according to the acceptance of the user's operation, the user does not need to concentrate the line of sight on the display screen during operation; that is, a user-friendly operation can be performed.
- the functions described in the first embodiment may be realized in a smartphone.
- With a smartphone, the user may operate the device while walking; in such a case, the effect of preventing a reduction in attention to the surroundings is also obtained.
- FIG. 19 is a block diagram showing an example of the configuration of the tactile sensation control apparatus 10 according to the second embodiment.
- the tactile sensation control device 10 includes an external device information acquisition control unit 11, a vehicle information acquisition unit 12, and a map information acquisition unit 13.
- the external device information acquisition control unit 11 is connected to each of the audio 14 and the air conditioner 15, and the map information acquisition unit 13 is connected to a map DB (database) 16.
- the external device information acquisition control unit 11 acquires information regarding the external device (audio 14 and air conditioner 15) that is a user's operation target as external device information (operation target device information). That is, the external device information acquisition control unit 11 has a function as an operation target device information acquisition unit.
- the external device information acquisition control unit 11 controls external devices (audio 14 and air conditioner 15).
- the vehicle information acquisition unit 12 acquires, as vehicle information, sensor information (such as vehicle speed pulse information) detected by various sensors provided in the vehicle, vehicle control information, GPS (Global Positioning System) information, and the like via an in-vehicle LAN (Local Area Network).
- the map information acquisition unit 13 acquires map information from the map DB 16.
- the map DB 16 stores map information.
- the map DB 16 may be provided in the vehicle or provided outside.
- FIG. 20 is a flowchart showing an example of the operation of the tactile sensation control apparatus 10. Note that steps S22 and S24 to S26 in FIG. 20 correspond to steps S11 and S13 to S15 in FIG.
- in step S21, the external device information acquisition control unit 11 acquires external device information from the external devices (audio 14 and air conditioner 15). The acquired external device information is output to the control unit 5.
- in step S23, the tactile sensation control unit 3 sets tactile sensation control information for the region of the operation icon 9 in accordance with an instruction from the control unit 5. Specifically, the tactile sensation control unit 3 sets the tactile sensation control information based on the external device information acquired by the external device information acquisition control unit 11 so that the tactile sensation differs depending on the external device or the function of the external device.
- a navigation operation icon 17 and an air conditioner operation icon 18 are displayed on the display screen of the display unit 7.
- the tactile sensation in the area of the navigation operation icon 17 is “gritty”
- the tactile sensation in the area of the air conditioner operation icon 18 is “half rough”
- the tactile sensation in areas other than the navigation operation icon 17 and the air conditioner operation icon 18 is “Smooth”.
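The per-icon tactile assignment of step S23 can be illustrated with a simple hit-test that maps a touch coordinate to the tactile pattern of the region containing it; the region geometry and names below are assumptions for illustration only:

```python
# Sketch: resolve a touch coordinate to the tactile pattern of the icon
# region that contains it, mirroring the screen where the navigation icon
# is "gritty", the air-conditioner icon is "half rough", and everything
# else is "smooth". All coordinates are illustrative assumptions.

ICON_REGIONS = [
    # (name, x0, y0, x1, y1, tactile pattern)
    ("navigation_icon",      20,  40, 120, 100, "gritty"),
    ("air_conditioner_icon", 140, 40, 240, 100, "half rough"),
]

def tactile_at(x: int, y: int) -> str:
    for name, x0, y0, x1, y1, pattern in ICON_REGIONS:
        if x0 <= x < x1 and y0 <= y < y1:
            return pattern
    return "smooth"  # area outside every operation icon

assert tactile_at(50, 60) == "gritty"       # inside navigation icon
assert tactile_at(200, 60) == "half rough"  # inside air conditioner icon
assert tactile_at(5, 5) == "smooth"         # background
```

Keeping the tactile pattern in the same table as the icon geometry makes it straightforward to regenerate the tactile control information whenever the external device information changes.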
- when the operation is performed by touching the navigation operation icon 17, the control unit 5 performs processing related to navigation, such as searching for a route based on the vehicle information acquired by the vehicle information acquisition unit 12 and the map information acquired by the map information acquisition unit 13. Note that the control unit 5 has a navigation function.
- when the operation is performed by touching the air conditioner operation icon 18, the control unit 5 instructs the external device information acquisition control unit 11 to control the air conditioner 15.
- the external device information acquisition control unit 11 controls the air conditioner 15 according to an instruction from the control unit 5.
- the navigation operation icon 17 and the air conditioner operation icon 18 have different tactile sensations.
- the tactile sensations in the areas of the navigation operation icon 17 and the air conditioner operation icon 18 change so as to be different from each other.
- for example, when the navigation operation icon 17 is touched, the tactile sensation changes in the order of “gritty”, “moving rough”, “gritty”; when the air conditioner operation icon 18 is touched, it changes in the order of “half rough”, “gritty”, “half rough”.
- the user can easily grasp which device is being operated.
- although the case where the change in the tactile sensation of the icon differs for each type of external device has been described above, the present invention is not limited to this, and the change in the tactile sensation may differ for each function of the external device.
- for example, when an AV (Audio Visual) screen as shown in FIG. 21 displays a volume adjustment icon and a channel selection icon, the change in the tactile sensation may differ between the volume adjustment icon area and the channel selection icon area. In this way, the user can easily grasp which function is being operated.
- the tactile sensation may be changed for each icon.
- the channel selection icon 19 has a shape resembling a push button
- the volume adjustment icon 20 has a shape resembling a dial. In this way, the user can easily grasp what kind of icon is being operated.
- the change in the tactile sensation of the icon caused by the touch operation may be a change in size as shown in FIG. 13, or it may be a rotational change.
- a physical push switch, jog switch, rotation switch, or the like can be expressed by tactile sensation, so that a user-friendly operation can be performed.
- in the third embodiment of the present invention, a case will be described in which the tactile touch panel 8 is provided so as to extend to a region (non-display region) other than the display screen (display region) of the display 7, that is, a case in which the operation surface of the tactile touch panel 8 is placed in a plane including a display area for displaying information and a non-display area other than the display area. Since the configuration of the tactile sensation control apparatus according to the third embodiment is the same as that of the tactile sensation control apparatus 10 according to the second embodiment (see FIG. 19), the description thereof is omitted here.
- FIG. 23 is a diagram illustrating an example of the operation of the tactile sensation control apparatus according to the third embodiment.
- the display screen of the display unit 7 corresponds to a display area
- the area of the tactile touch panel 8 corresponds to a combined area of the display area and the non-display area.
- the display screen of the display 7 displays the position of the vehicle on the map and operation icons 21 for performing various operations (“CD playback”, “CD stop”, “peripheral search”, “route change”).
- the tactile sensation of the operation icon 21 displayed in the display area is “half rough”, and the tactile sensation of the operation icon 22 in the non-display area is “rough”.
- the change in tactile sensation differs between the region of the operation icon 21 and the region of the operation icon 22.
- the tactile sensations in the operation icon 21 region and the operation icon 22 region may be changed by any of the methods described in the first embodiment.
- the operation icon 22 includes, for example, a button for operating a function of the air conditioner 15, a button for operating an AV function, a button for operating a navigation function, and the like. Further, the operation icon 22 may have a physically raised shape.
- by doing so, the icon that the user intends to select can be selected and operated. That is, an operation that is convenient for the user can be performed.
- Embodiment 4 of the present invention is characterized in that the tactile sensation control unit 3 gives a predetermined tactile sensation when the user performs an incorrect operation. Since the configuration of the tactile sensation control apparatus according to the fourth embodiment is the same as that of the tactile sensation control apparatus 10 according to the second embodiment (see FIG. 19), description thereof is omitted here.
- FIG. 24 is a diagram illustrating an example of the operation of the tactile sensation control apparatus according to the fourth embodiment.
- FIG. 24 shows the display and tactile sensation when the vehicle is traveling.
- a travel operation restriction icon 23 and a travel operation unrestricted icon 24 are displayed on the display screen of the display unit 7. Further, on the tactile touch panel 8, the tactile sensation in the area of the travel operation restriction icon 23 is “smooth”, the tactile sensation in the area of the travel operation unrestricted icon 24 is “gritty”, and the tactile sensation of the areas other than the travel operation restriction icon 23 and the travel operation unrestricted icon 24 is “smooth”.
- since the operation of the travel operation restriction icon 23 is in an invalid state when the vehicle is traveling, the tactile sensation is “smooth”.
- the travel operation unrestricted icon 24 is always in an effective state regardless of the travel state of the vehicle, and thus the tactile sensation is “gritty”.
- the travel operation restriction icon 23 is disabled when the vehicle is traveling, but may be enabled when the vehicle is stopped.
- examples of the travel operation restriction icon 23 include various operation icons on the navigation screen. In this case, it is assumed that the control unit 5 has a navigation function.
- Examples of the travel operation unrestricted icon 24 include a volume adjustment icon of the audio 14 and a temperature adjustment icon of the air conditioner 15.
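The travel-restriction behaviour above (icons disabled during travel feel “smooth” like the background, always-valid icons feel “gritty”) can be sketched as follows; the icon names are hypothetical:

```python
# Sketch: while the vehicle is traveling, a restricted icon is disabled
# and rendered "smooth" so the user can feel that touching it will do
# nothing; unrestricted icons stay "gritty" regardless of travel state.

RESTRICTED_WHILE_TRAVELING = {"destination_entry", "route_edit"}  # assumed names

def icon_state(icon: str, vehicle_traveling: bool):
    """Return (enabled, tactile pattern) for an icon given travel state."""
    if vehicle_traveling and icon in RESTRICTED_WHILE_TRAVELING:
        return (False, "smooth")   # invalid: feels like the background
    return (True, "gritty")        # valid: clearly distinguishable

assert icon_state("destination_entry", True) == (False, "smooth")
assert icon_state("destination_entry", False) == (True, "gritty")
assert icon_state("volume", True) == (True, "gritty")
```

Recomputing this state whenever the vehicle information changes keeps the felt texture consistent with what the control unit will actually accept.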
- FIG. 25 is a diagram illustrating another example of the operation of the tactile sensation control apparatus according to the fourth embodiment.
- FIG. 25 shows the display and tactile sensation when the vehicle is traveling.
- the operation icon 25 is displayed on the display screen of the display unit 7.
- the tactile sensation in the area of the operation icon 25 is “gritty”, and the tactile sensation in the area other than the operation icon 25 is “half rough”.
- an area surrounded by a broken line in the left diagram of FIG. 25 indicates a gesture input area 26.
- the gesture input area refers to an area in which a user can input by a gesture operation in the display screen of the display device 7.
- the operation icon 25 may be the same as the travel operation unrestricted icon 24 in FIG.
- the operation of the gesture input area 26 is in an invalid state, so that the tactile sensation is “half rough”.
- the operation of the operation icon 25 is always in an effective state regardless of the traveling state of the vehicle, the tactile sensation is “gritty”.
- the operation of the gesture input area 26 is disabled when the vehicle is traveling, but may be enabled when the vehicle is stopped.
- the touch feeling of the touched area may be changed.
- the shape of the region to which the tactile sensation is imparted may be changed (for example, see FIGS. 13 to 15).
- the user can easily grasp that the erroneous operation has been performed. That is, an operation that is convenient for the user can be performed.
- the fifth embodiment of the present invention is characterized in that the tactile sensation control unit 3 performs control so that the tactile sensation position changes following the movement of the gesture operation by the user.
- the configuration of the tactile sensation control device according to the fifth embodiment is the same as that of the tactile sensation control device 4 according to the first embodiment (see FIG. 7) or the tactile sensation control device 10 according to the second embodiment (see FIG. 19). Description is omitted.
- FIG. 26 is a diagram illustrating an example of the operation of the tactile sensation control apparatus according to the fifth embodiment.
- An arbitrary item 27 is displayed on the display screen of the display device 7, and a scroll operation icon 28 is displayed at the right end of the screen.
- the tactile sensation in the area of the scroll operation icon 28 is “gritty”.
- the user can select one of the items 27 by sliding the scroll operation icon 28 in the vertical direction of the screen.
- the slide operation refers to an operation of sliding while the user touches the operation surface of the tactile touch panel 8, and is a concept included in the gesture operation.
- the upper diagram of FIG. 26 shows a state in which the user touches the scroll operation icon 28 to slide.
- the item 27 at the top of the screen is selected, and the feel of the area of the scroll operation icon 28 is “gritty”.
- the third item 27 from the top of the screen is selected following the slide operation. Further, following the slide operation, the position of the tactile sensation in the area of the scroll operation icon 28 changes (moves).
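The way both the selected item and the “gritty” region follow the slide on the scroll operation icon 28 can be sketched as follows, under assumed screen geometry (track extent, item count, and thumb height are illustrative):

```python
# Sketch: as the finger slides along the scroll track, the selected item
# index and the position of the "gritty" tactile region are recomputed
# from the finger's y position, so the sensation follows the slide.

ITEM_COUNT = 5
TRACK_TOP, TRACK_BOTTOM = 0, 400   # y extent of the scroll track (pixels)
THUMB_HEIGHT = 80                  # height of the "gritty" region

def follow_slide(touch_y: int):
    """Return (selected item index, (top, bottom) of the gritty region)."""
    y = min(max(touch_y, TRACK_TOP), TRACK_BOTTOM - 1)
    frac = (y - TRACK_TOP) / (TRACK_BOTTOM - TRACK_TOP)
    index = min(int(frac * ITEM_COUNT), ITEM_COUNT - 1)
    # Center the tactile region on the finger, clamped to the track.
    top = min(max(y - THUMB_HEIGHT // 2, TRACK_TOP),
              TRACK_BOTTOM - THUMB_HEIGHT)
    return index, (top, top + THUMB_HEIGHT)

assert follow_slide(0) == (0, (0, 80))               # top item selected
index, region = follow_slide(200)
assert index == 2 and region[0] <= 200 <= region[1]  # region tracks finger
```

Deriving both outputs from the same touch coordinate guarantees the tactile position and the on-screen selection never drift apart during the slide.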
- FIGS. 27 to 31 are diagrams showing another example of the operation of the tactile sensation control apparatus according to the fifth embodiment. The broken-line circles shown in FIGS. 29 to 31 indicate the position where the user first touched.
- the tactile sensation is “smooth” on the operation surface of the tactile touch panel 8.
- the tactile sensation in the area of the touch position becomes “gritty”.
- the position of the tactile sensation changes (moves) following the movement of the gesture operation.
- the tactile sensation at this time is “moving rough”.
- the tactile sensation becomes “gritty”.
- when the user releases the finger, the tactile sensation returns to “smooth” (although, since the finger is off the operation surface, the user does not actually obtain the “smooth” tactile sensation).
- the tactile sensation control unit 3 may perform control so that the tactile sensation changes according to the speed of the gesture operation. As described above, the tactile sensation control unit 3 performs control so that the tactile sensation at the start or end of the gesture operation is different from the tactile sensation during the gesture operation. Therefore, the user can easily recognize that the gesture operation is correctly performed.
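The rule that the start/end sensation differs from the in-gesture sensation, optionally varying with the speed of the gesture, can be sketched as follows; the phase names and speed cutoff are assumptions:

```python
# Sketch: the sensation at the start or end of a gesture ("gritty")
# differs from the sensation during it ("moving rough"), and the
# in-gesture sensation may additionally vary with the gesture speed.

FAST_SPEED = 300.0  # px/s; an assumed cutoff for a "fast" gesture

def gesture_tactile(phase: str, speed_px_s: float = 0.0) -> str:
    if phase in ("start", "end"):
        return "gritty"            # distinct sensation at the endpoints
    if phase == "during":
        # Optionally vary the in-gesture sensation with speed.
        return "moving gritty" if speed_px_s >= FAST_SPEED else "moving rough"
    return "smooth"                # no gesture in progress

assert gesture_tactile("start") == "gritty"
assert gesture_tactile("during", 100.0) == "moving rough"
assert gesture_tactile("during", 500.0) == "moving gritty"
assert gesture_tactile("end") == "gritty"
```

Because the endpoint and in-gesture sensations never coincide, the user can feel the transitions that confirm the gesture was recognized.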
- FIGS. 27 to 31 can be applied when, for example, the arrangement of operation icons (display objects) displayed on the display screen of the display device 7 is changed.
- FIGS. 32 to 34 are diagrams showing another example of the operation of the tactile sensation control device according to the fifth embodiment.
- the gesture operation performed in the order of FIGS. 32, 33, and 34 is particularly referred to as a pinch-out operation, and the gesture operation performed in the order of FIGS. 34, 33, and 32 is particularly referred to as a pinch-in operation.
- the pinch-out operation is described below as an example.
- the tactile sensation of a predetermined region (here, a circular region) including the two touched points at its ends becomes “gritty”.
- as shown in FIGS. 33 and 34, when the user slides the two touch points in directions that increase the distance between them, the region shown in FIG. 32 expands following the slide operation. The tactile sensation at this time is “gritty”.
- the tactile sensation control unit 3 performs control so that the tactile sensation during the gesture operation differs according to the type of gesture operation. By doing so, the user can easily recognize what kind of gesture operation is accepted.
- the tactile sensations (“smooth”, “half-gritty”, “gritty”, “moving gritty”) may be used in appropriate combination. Further, in the pinch-out operation or the pinch-in operation shown in FIGS. 32 to 34, the tactile sensation of the pinched boundary line area may be changed.
- since the position of the tactile sensation changes (moves) following the movement of the gesture operation, the user can easily recognize that the gesture operation is being performed correctly.
- the tactile sensation during the gesture operation varies depending on the type of gesture operation, the user can easily recognize what gesture operation is accepted.
- Embodiment 6 of the present invention is characterized in that the tactile sensation control unit 3 performs control so that the tactile sensation of at least a region where the gesture operation is performed on the operation surface changes according to the movement of the gesture operation.
- the configuration of the tactile sensation control apparatus according to the sixth embodiment is the same as that of the tactile sensation control apparatus 4 according to the first embodiment (see FIG. 7) or the tactile sensation control apparatus 10 according to the second embodiment (see FIG. 19). Description is omitted.
- FIGS. 35 to 38 are diagrams showing an example of the operation of the tactile sensation control apparatus according to the sixth embodiment. Note that broken-line circles shown in FIGS. 35 to 38 indicate positions touched by the user.
- a “gritty” tactile feel and a “smooth” tactile feel are applied to the entire operation surface of the tactile touch panel 8.
- when the user performs a gesture operation (for example, a scroll operation), the “gritty” tactile sensation region and the “smooth” tactile sensation region imparted to the entire operation surface move in a direction opposite to the direction of the gesture operation.
- in the above example, the position of the “gritty” tactile sensation and the position of the “smooth” tactile sensation are moved, but they do not have to be moved; since the finger itself moves, the “gritty” and “smooth” tactile sensations are obtained alternately even if the regions are fixed.
- the “gritty” tactile sensation and the “smooth” tactile sensation are imparted to a part of the operation surface of the tactile touch panel 8 so that they alternate along the direction in which the user performs a gesture operation.
- when the user performs a gesture operation (for example, a scroll operation), the “gritty” tactile sensation region and the “smooth” tactile sensation region imparted to that part of the operation surface move in a direction opposite to the direction of the gesture operation.
- at the end of the gesture operation, each tactile sensation region given to the operation surface may continue to move for a while. By doing so, the user can feel the inertia of the gesture operation at the end of the gesture operation.
- each tactile sensation region given to the operation surface does not have to be moved during the gesture operation.
- the direction in which each tactile sensation region given to the operation surface moves may be the same direction as the gesture operation direction. In this case, it is preferable that the speed of movement of each tactile sensation area given to the operation surface is faster than the speed of movement of the gesture operation.
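The alternating “gritty”/“smooth” regions that stream against the gesture direction can be modelled as a stripe pattern with a phase offset; the stripe width and update rule below are assumptions:

```python
# Sketch: the operation surface carries alternating stripes of "gritty"
# and "smooth" sensation; a phase offset is scrolled opposite to the
# finger's motion so the user feels the stripes streaming past.

STRIPE_PX = 40  # assumed width of one tactile stripe

def stripe_at(x: float, offset: float) -> str:
    """Tactile pattern at position x given the current stripe offset."""
    return "gritty" if int((x + offset) // STRIPE_PX) % 2 == 0 else "smooth"

def update_offset(offset: float, finger_dx: float) -> float:
    # Move the stripes opposite to the gesture direction.
    return (offset - finger_dx) % (2 * STRIPE_PX)

off = 0.0
first = stripe_at(100.0, off)
off = update_offset(off, 40.0)          # finger moved +40 px
assert stripe_at(100.0, off) != first   # sensation under the finger flipped
```

Negating `finger_dx` (or scaling it above the finger speed) implements the same-direction variant mentioned above, where the stripes must move faster than the finger to be felt.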
- the tactile sensations (“smooth”, “half-gritty”, “gritty”, “moving gritty”) may be used in appropriate combination.
- since the tactile sensation of at least the area where the gesture operation is performed changes according to the movement of the gesture operation, the user can easily recognize that the gesture operation is being performed correctly.
- Embodiment 7 of the present invention is characterized in that the tactile sensation control unit 3 performs control so that the tactile sensation region changes following the movement of the gesture operation by the user.
- the structure of the tactile sensation control device according to the seventh embodiment is the same as that of the tactile sensation control device 4 according to the first embodiment (see FIG. 7) or the tactile sensation control device 10 according to the second embodiment (see FIG. 19). Description is omitted.
- FIGS. 39 to 43 are diagrams showing an example of the operation of the tactile sensation control apparatus according to the seventh embodiment.
- the display area 29 and the display area 30 are displayed on the display screen of the display device 7.
- the tactile sensation in the display area 29 is “half rough”
- the tactile sensation in the display area 30 is “smooth”.
- FIGS. 44 to 48 are diagrams showing another example of the operation of the tactile sensation control apparatus according to the seventh embodiment.
- the display area 31 is displayed on the display surface of the display. In the tactile touch panel 8, the tactile sensation in the display area 31 is “gritty”.
- the tactile sensations (“smooth”, “half-gritty”, “gritty”, “moving gritty”) may be used in appropriate combination.
- since the tactile sensation area changes following the movement of the gesture operation, the user can easily recognize that the gesture operation is being performed correctly.
- the tactile sensation control device described above can be applied not only to a vehicle navigation device, that is, a car navigation device, but also to a PND (Portable Navigation Device) that can be mounted on a vehicle, a mobile communication terminal (for example, a mobile phone, a smartphone, or a tablet terminal), a navigation device constructed as a system by appropriately combining these with servers and the like, and devices other than navigation devices. In this case, each function or each component of the tactile sensation control device is distributed among the functions that construct the system.
- the function of the tactile sensation control device can be arranged in the server.
- the tactile sensation control system can be constructed by providing the display device 33 and the tactile touch panel 34 on the user side and providing the server 32 with at least the operation detection unit 2 and the tactile sensation control unit 3.
- the functions of the operation detection unit 2 and the tactile sensation control unit 3 are the same as the functions of the operation detection unit 2 and the tactile sensation control unit 3 in FIG.
- the server 32 may be provided with each component shown in FIGS. 7 and 19 as needed. At this time, the components provided in the server 32 may be appropriately distributed between the server 32 and the display device 33.
- a tactile sensation control system may also be constructed by providing the display device 33 and the tactile touch panel 34 on the user side, at least the operation detection unit 2 in the server 35, and at least the tactile sensation control unit 3 in the mobile communication terminal 36.
- the functions of the operation detection unit 2 and the tactile sensation control unit 3 are the same as the functions of the operation detection unit 2 and the tactile sensation control unit 3 in FIG.
- the server 35 and the portable communication terminal 36 may include each component shown in FIGS. 7 and 19 as needed. At this time, the components provided in the server 35 and the portable communication terminal 36 may be appropriately distributed among the display device 33, the server 35, and the portable communication terminal 36.
- software for executing the operation in the above embodiment may be incorporated in, for example, a server or a mobile communication terminal.
- the tactile sensation control method described above controls the tactile sensation of the user at the time of an operation on the operation surface of a touch panel or a touch pad: it detects the user's operation on the operation surface, and controls the tactile sensation of the operation surface so that the tactile sensation of the area that has received the detected user operation, or the tactile sensation of an area that follows the movement of the user operation, changes with time.
- each of the operation detection unit 2, the tactile sensation control unit 3, the control unit 5, the display information generation output unit 6, the external device information acquisition control unit 11, the vehicle information acquisition unit 12, and the map information acquisition unit 13 is realized by a CPU (Central Processing Unit) executing program processing based on software; however, where possible, each of them may instead be configured as hardware (for example, an arithmetic/processing circuit configured to perform a specific calculation or process on an electric signal). Both forms may also be mixed.
- in the above description, the case where the display 7 and the tactile touch panel 8 are integrated (the case where the tactile touch panel 8 is provided on the display screen of the display 7) has been described, but the present invention is not limited to this.
- for example, the display device 7 may be provided on the meter panel of the instrument panel portion of the vehicle, and the tactile touch panel 8 may be provided as a tactile touch pad placed separately at a location different from the meter panel. Even with this configuration, the user can operate icons and the like displayed on the display 7 provided on the meter panel without looking at the tactile touch pad. That is, an operation that is convenient for the user can be performed.
- the display device 7 provided in the meter panel may use a partial region or the entire region of the meter panel as a display region of the display device 7.
- the size of the region where the tactile sensation occurs on the tactile touch pad and the size of the display region on the display device 7 may be the same, may be similar, or may not be similar.
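Mapping a touch on the separate tactile touch pad to a position in the display region on the meter panel, where the two regions need not be similar, reduces to independent x/y scaling; the dimensions below are illustrative assumptions:

```python
# Sketch: map pad coordinates to display coordinates with independent
# x/y scale factors, so the pad and display regions may differ in size
# and even in aspect ratio (the "not similar" case mentioned above).

PAD_W, PAD_H = 100, 60       # assumed tactile touch pad size
DISP_W, DISP_H = 800, 300    # assumed display region size on the meter panel

def pad_to_display(px: float, py: float):
    """Map pad coordinates to display coordinates."""
    return (px * DISP_W / PAD_W, py * DISP_H / PAD_H)

assert pad_to_display(0, 0) == (0.0, 0.0)
assert pad_to_display(100, 60) == (800.0, 300.0)   # far corner maps to far corner
assert pad_to_display(50, 30) == (400.0, 150.0)    # center maps to center
```

With this mapping, the tactile regions on the pad can be generated from the same icon geometry used for display, regardless of how the two surfaces are sized.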
- 1 tactile sensation device, 2 operation detection unit, 3 tactile sensation control unit, 4 tactile sensation control device, 5 control unit, 6 display information generation output unit, 7 display, 8 tactile touch panel, 9 operation icon, 10 tactile sensation control device, 11 external device information acquisition control unit, 12 vehicle information acquisition unit, 13 map information acquisition unit, 14 audio, 15 air conditioner, 16 map DB, 17 navigation operation icon, 18 air conditioner operation icon, 19 channel selection icon, 20 volume adjustment icon, 21, 22 operation icons, 23 travel operation restriction icon, 24 travel operation unrestricted icon, 25 operation icon, 26 gesture input area, 27 items, 28 scroll operation icon, 29, 30, 31 display areas, 32 server, 33 display device, 34 tactile touch panel, 35 server, 36 mobile communication terminal.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
First, the configuration of the tactile sensation control system according to the first embodiment of the present invention will be described. In this embodiment and each of the following embodiments, a case where the tactile sensation control system is realized by a tactile sensation control device alone will be described.
First, the configuration of the tactile sensation control device according to the second embodiment of the present invention will be described.
In the third embodiment of the present invention, a case will be described in which the tactile touch panel 8 is provided so as to extend to a region (non-display region) other than the display screen (display region) of the display 7 (that is, a case in which the operation surface of the tactile touch panel 8 is placed in a plane including a display area for displaying information and a non-display area other than the display area). Since the configuration of the tactile sensation control device according to the third embodiment is the same as that of the tactile sensation control device 10 according to the second embodiment (see FIG. 19), the description thereof is omitted here.
The fourth embodiment of the present invention is characterized in that the tactile sensation control unit 3 imparts a predetermined tactile sensation when the user performs an erroneous operation. Since the configuration of the tactile sensation control device according to the fourth embodiment is the same as that of the tactile sensation control device 10 according to the second embodiment (see FIG. 19), the description thereof is omitted here.
The fifth embodiment of the present invention is characterized in that the tactile sensation control unit 3 performs control so that the position of the tactile sensation changes following the movement of a gesture operation by the user. Since the configuration of the tactile sensation control device according to the fifth embodiment is the same as that of the tactile sensation control device 4 according to the first embodiment (see FIG. 7) or the tactile sensation control device 10 according to the second embodiment (see FIG. 19), the description thereof is omitted here.
The sixth embodiment of the present invention is characterized in that the tactile sensation control unit 3 performs control so that the tactile sensation of at least the region of the operation surface where a gesture operation is performed changes according to the movement of the gesture operation. Since the configuration of the tactile sensation control device according to the sixth embodiment is the same as that of the tactile sensation control device 4 according to the first embodiment (see FIG. 7) or the tactile sensation control device 10 according to the second embodiment (see FIG. 19), the description thereof is omitted here.
The seventh embodiment of the present invention is characterized in that the tactile sensation control unit 3 performs control so that the region of the tactile sensation changes following the movement of a gesture operation by the user. Since the configuration of the tactile sensation control device according to the seventh embodiment is the same as that of the tactile sensation control device 4 according to the first embodiment (see FIG. 7) or the tactile sensation control device 10 according to the second embodiment (see FIG. 19), the description thereof is omitted here.
Claims (20)
- A tactile sensation control system for controlling a user's tactile sensation at the time of an operation on an operation surface of a touch panel or a touch pad, the system comprising: an operation detection unit that detects an operation by the user on the operation surface; and a tactile sensation control unit that controls the tactile sensation of the operation surface so that the tactile sensation of a region that has received the user's operation detected by the operation detection unit, or the tactile sensation of a region that follows the movement of the user's operation, changes with the passage of time.
- The tactile sensation control system according to claim 1, wherein the tactile sensation control unit controls the tactile sensation, based on a predetermined tactile sensation change rule, so as to generate a fixed tactile sensation change in which the strength of the tactile sensation or the pattern of the tactile sensation changes within the same region, or a moving tactile sensation change in which the position where the tactile sensation is generated changes with the passage of time.
- The tactile sensation control system according to claim 2, wherein the tactile sensation control unit performs control so that the strength of the tactile sensation or the pattern of the tactile sensation changes discontinuously or continuously.
- The tactile sensation control system according to claim 1, wherein, in a case where an operation icon that receives the operation is displayed in correspondence with the operation surface, the tactile sensation control unit performs control so that the change in the tactile sensation differs according to the type of function corresponding to the operation of the operation icon or the type of device to be operated.
- The tactile sensation control system according to claim 4, wherein the tactile sensation control unit performs control so that the change in the tactile sensation differs for each of a plurality of regions within the operation icon.
- The tactile sensation control system according to claim 4, wherein the tactile sensation control unit performs control so that the shape of the region to which the tactile sensation is imparted changes with the passage of time.
- The tactile sensation control system according to claim 4, wherein the display form of the operation icon changes in accordance with the change in the tactile sensation of the region corresponding to the operation icon made by the tactile sensation control unit.
- The tactile sensation control system according to claim 1, wherein, in a case where an operation icon that receives the operation is displayed in correspondence with the operation surface, the tactile sensation control unit controls the tactile sensation so as to generate a first tactile sensation when the operation detection unit detects an icon preliminary operation that is a predetermined first operation, and to generate a second tactile sensation when the operation detection unit detects an icon operation that is a predetermined second operation, and wherein the operation is determined to be valid when the operation detection unit detects the second operation.
- The tactile sensation control system according to claim 8, wherein the first operation is an operation in which the user touches the operation surface with less than a predetermined press, and the second operation is an operation in which the user touches the operation surface with a predetermined press or more, or an operation at the time when a predetermined time has elapsed since the start of the first operation.
- The tactile sensation control system according to claim 9, wherein the tactile sensation control unit performs control so that the first tactile sensation changes with the passage of time while the first operation is being performed.
- The tactile sensation control system according to claim 1, wherein the operation is a gesture operation, and the tactile sensation control unit performs control so that the tactile sensation at the start or end of the gesture operation differs from the tactile sensation during the gesture operation.
- The tactile sensation control system according to claim 1, wherein the operation is a gesture operation, and the tactile sensation control unit performs control so that the tactile sensation during the gesture operation differs according to the type of the gesture operation or the speed of the movement of the gesture operation.
- The tactile sensation control system according to claim 1, wherein the operation is a gesture operation, and the tactile sensation control unit performs control so that the tactile sensation of at least the region of the operation surface where the gesture operation is performed changes according to the movement of the gesture operation.
- The tactile sensation control system according to claim 13, wherein the tactile sensation control unit performs control so that the position of the tactile sensation in the region where the gesture operation is performed moves at the end of the gesture operation.
- The tactile sensation control system according to claim 1, wherein the operation is a gesture operation, and the tactile sensation control unit performs control so that the position of the tactile sensation changes following the movement of the gesture operation.
- The tactile sensation control system according to claim 1, wherein the operation is a gesture operation, and the tactile sensation control unit performs control so that the region of the tactile sensation changes following the movement of the gesture operation.
- The tactile sensation control system according to claim 1, wherein the tactile sensation control unit performs control so that the tactile sensation becomes a predetermined tactile sensation when the user performs the operation erroneously.
- The tactile sensation control system according to claim 17, wherein the erroneous operation is performing the operation on a region that does not accept an operation, or performing an operation restricted by a predetermined condition.
- The tactile sensation control system according to claim 1, wherein the operation surface of the touch panel is placed in a plane including a display area for displaying information and a non-display area other than the display area, and the tactile sensation control unit performs control so that the change in the tactile sensation differs between a region of the operation surface corresponding to the display area and a region corresponding to the non-display area.
- A tactile sensation control method for controlling a user's tactile sensation at the time of an operation on an operation surface of a touch panel or a touch pad, the method comprising: detecting an operation by the user on the operation surface; and controlling the tactile sensation of the operation surface so that the tactile sensation of a region that has received the detected user's operation, or the tactile sensation of a region that follows the movement of the user's operation, changes with the passage of time.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/319,051 US20170115734A1 (en) | 2014-09-09 | 2014-09-09 | Tactile sensation control system and tactile sensation control method |
DE112014006935.3T DE112014006935T5 (de) | 2014-09-09 | 2014-09-09 | Tastempfindung-Steuersystem und Tastempfindung-Steuerverfahren |
CN201480081806.6A CN107077281A (zh) | 2014-09-09 | 2014-09-09 | 触感控制系统及触感控制方法 |
- PCT/JP2014/073770 WO2016038677A1 (ja) | 2014-09-09 | 2014-09-09 | Tactile sensation control system and tactile sensation control method |
- JP2016547285A JP6258513B2 (ja) | 2014-09-09 | 2014-09-09 | Tactile sensation control system and tactile sensation control method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
- PCT/JP2014/073770 WO2016038677A1 (ja) | 2014-09-09 | 2014-09-09 | Tactile sensation control system and tactile sensation control method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016038677A1 true WO2016038677A1 (ja) | 2016-03-17 |
Family
ID=55458470
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
- PCT/JP2014/073770 WO2016038677A1 (ja) | 2014-09-09 | 2014-09-09 | Tactile sensation control system and tactile sensation control method |
Country Status (5)
Country | Link |
---|---|
US (1) | US20170115734A1 (ja) |
JP (1) | JP6258513B2 (ja) |
CN (1) | CN107077281A (ja) |
DE (1) | DE112014006935T5 (ja) |
WO (1) | WO2016038677A1 (ja) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- WO2022054382A1 (ja) * | 2020-09-11 | 2022-03-17 | 菱洋エレクトロ株式会社 | Electronic device and method for controlling electronic device |
- JP2023518892A (ja) * | 2020-03-26 | 2023-05-08 | 維沃移動通信有限公司 | Verification method, electronic device, and computer-readable storage medium |
- US12048818B2 (en) | 2020-07-05 | 2024-07-30 | New Wave Endo-Surgical Corp. | Handheld elongate medical device advancer and related systems, devices and methods |
- JP7542095B2 (ja) | 2016-06-12 | 2024-08-29 | アップル インコーポレイテッド | Devices, methods, and graphical user interfaces for providing haptic feedback |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DK201670728A1 (en) | 2016-09-06 | 2018-03-19 | Apple Inc | Devices, Methods, and Graphical User Interfaces for Providing Feedback During Interaction with an Intensity-Sensitive Button |
- KR102591807B1 (ko) * | 2016-11-21 | 2023-10-23 | 한국전자통신연구원 | Method and apparatus for generating tactile stimulation |
- CN110244845B (zh) * | 2019-06-11 | 2022-08-05 | Oppo广东移动通信有限公司 | Haptic feedback method and apparatus, electronic device, and storage medium |
USD1002009S1 (en) | 2021-08-31 | 2023-10-17 | New Wave Endo-Surgical Corp. | Medical device |
USD1029259S1 (en) | 2021-08-31 | 2024-05-28 | New Wave Endo-Surgical Corp. | Portion of a medical device |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10113969A (ja) * | 1996-10-09 | 1998-05-06 | Nissei Plastics Ind Co | Input device for injection molding machine |
JP2005258666A (ja) * | 2004-03-10 | 2005-09-22 | Sony Corp | Input device, electronic apparatus, and tactile feedback input method for electronic apparatus |
JP2010009321A (ja) * | 2008-06-26 | 2010-01-14 | Kyocera Corp | Input device |
JP5427331B1 (ja) * | 2013-09-05 | 2014-02-26 | 株式会社ポケモン | Typing training system, typing training method, and typing training program |
JP2014512619A (ja) * | 2011-04-22 | 2014-05-22 | イマージョン コーポレーション | Electrovibration tactile display |
JP2014112357A (ja) * | 2012-10-31 | 2014-06-19 | Immersion Corp | Method and apparatus for simulating surface features on a user interface using haptic effects |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
SE515663C2 (sv) * | 1996-08-23 | 2001-09-17 | Ericsson Telefon Ab L M | Touch screen and use of a touch screen |
JP3987182B2 (ja) * | 1998-01-26 | 2007-10-03 | Idec株式会社 | Information display device and operation input device |
JP3949912B2 (ja) * | 2000-08-08 | 2007-07-25 | 株式会社エヌ・ティ・ティ・ドコモ | Portable electronic device, electronic device, vibration generator, vibration notification method, and notification control method |
WO2003050754A1 (en) * | 2001-12-12 | 2003-06-19 | Koninklijke Philips Electronics N.V. | Display system with tactile guidance |
TWI405101B (zh) * | 2009-10-05 | 2013-08-11 | Wistron Corp | Electronic device with touch panel and operating method thereof |
JP5635274B2 (ja) * | 2010-01-27 | 2014-12-03 | 京セラ株式会社 | Tactile sensation providing apparatus and tactile sensation providing method |
WO2011115060A1 (ja) * | 2010-03-15 | 2011-09-22 | 日本電気株式会社 | Input device, input method, and program |
JP5580155B2 (ja) * | 2010-09-27 | 2014-08-27 | スタンレー電気株式会社 | Method of manufacturing a touch panel input device |
US20120113008A1 (en) * | 2010-11-08 | 2012-05-10 | Ville Makinen | On-screen keyboard with haptic effects |
EP2456175B1 (en) * | 2010-11-19 | 2014-01-15 | BlackBerry Limited | Portable electronic device including flexible display |
EP2715499B1 (en) * | 2011-05-23 | 2020-09-02 | Microsoft Technology Licensing, LLC | Invisible control |
CN104604207A (zh) * | 2012-06-05 | 2015-05-06 | Nec卡西欧移动通信株式会社 | Portable terminal device |
US9176586B2 (en) * | 2012-06-29 | 2015-11-03 | Panasonic Intellectual Property Management Co., Ltd. | Touch panel device with tactile sense presenting function |
JP6061528B2 (ja) * | 2012-07-23 | 2017-01-18 | キヤノン株式会社 | Operating device, control method therefor, program, and recording medium |
CN103777743B (zh) * | 2012-10-23 | 2016-12-28 | 联想(北京)有限公司 | Information processing method and electronic device |
US9330544B2 (en) * | 2012-11-20 | 2016-05-03 | Immersion Corporation | System and method for simulated physical interactions with haptic effects |
CN103902200A (zh) * | 2012-12-24 | 2014-07-02 | 联想(北京)有限公司 | Information processing method and electronic device |
US9632584B2 (en) * | 2013-01-08 | 2017-04-25 | 2236008 Ontario Inc. | On-demand user control |
US10359857B2 (en) * | 2013-07-18 | 2019-07-23 | Immersion Corporation | Usable hidden controls with haptic feedback |
US9639158B2 (en) * | 2013-11-26 | 2017-05-02 | Immersion Corporation | Systems and methods for generating friction and vibrotactile effects |
2014
- 2014-09-09 WO PCT/JP2014/073770 patent/WO2016038677A1/ja active Application Filing
- 2014-09-09 US US15/319,051 patent/US20170115734A1/en not_active Abandoned
- 2014-09-09 JP JP2016547285A patent/JP6258513B2/ja not_active Expired - Fee Related
- 2014-09-09 CN CN201480081806.6A patent/CN107077281A/zh active Pending
- 2014-09-09 DE DE112014006935.3T patent/DE112014006935T5/de not_active Withdrawn
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7542095B2 (ja) | 2016-06-12 | 2024-08-29 | アップル インコーポレイテッド | Devices, methods, and graphical user interfaces for providing haptic feedback |
JP2023518892A (ja) * | 2020-03-26 | 2023-05-08 | 維沃移動通信有限公司 | Verification method, electronic device, and computer-readable storage medium |
JP7465989B2 (ja) | 2020-03-26 | 2024-04-11 | 維沃移動通信有限公司 | Verification method, electronic device, and computer-readable storage medium |
US12048818B2 (en) | 2020-07-05 | 2024-07-30 | New Wave Endo-Surgical Corp. | Handheld elongate medical device advancer and related systems, devices and methods |
WO2022054382A1 (ja) * | 2020-09-11 | 2022-03-17 | 菱洋エレクトロ株式会社 | Electronic device and control method for electronic device |
Also Published As
Publication number | Publication date |
---|---|
DE112014006935T5 (de) | 2017-06-22 |
CN107077281A (zh) | 2017-08-18 |
JPWO2016038677A1 (ja) | 2017-04-27 |
JP6258513B2 (ja) | 2018-01-10 |
US20170115734A1 (en) | 2017-04-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6258513B2 (ja) | Tactile sensation control system and tactile sensation control method | |
JP6113281B2 (ja) | Information processing device | |
US9836150B2 (en) | System and method for feedforward and feedback with haptic effects | |
JP6429886B2 (ja) | Tactile sensation control system and tactile sensation control method | |
US20160162092A1 (en) | Operation device | |
JP5852514B2 (ja) | Touch sensor | |
JP5901663B2 (ja) | Display device and display control program | |
JP6463457B2 (ja) | Map display control device and operation feel control method for map scrolling | |
JP2015133080A5 (ja) | ||
JP6548852B2 (ja) | Touch input determination device, touch panel input device, touch input determination method, and touch input determination program | |
JP2014182808A (ja) | Navigation control for a touch screen user interface | |
JP6433347B2 (ja) | Map display control device and automatic map scrolling method | |
JP6393604B2 (ja) | Operation device | |
JP6284648B2 (ja) | Tactile sensation control system and tactile sensation control method | |
WO2018135183A1 (ja) | Coordinate input device | |
JP6483379B2 (ja) | Tactile sensation control system and tactile sensation control method | |
JP6434259B2 (ja) | Tactile sensation control system and tactile sensation control method | |
US20230004285A1 (en) | Control Value Setting Device and Control Value Setting Program | |
WO2015151154A1 (ja) | Display device, display method, and display program | |
JP2015176471A (ja) | Display control device, display control method, and program for display control device | |
TWI483155B (zh) | Electronic device and touch feedback method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14901687 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2016547285 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15319051 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 112014006935 Country of ref document: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 14901687 Country of ref document: EP Kind code of ref document: A1 |