WO2013186844A1 - Electronic device and drive control program - Google Patents

Electronic device and drive control program

Info

Publication number
WO2013186844A1
WO2013186844A1 PCT/JP2012/064945 JP2012064945W WO2013186844A1 WO 2013186844 A1 WO2013186844 A1 WO 2013186844A1 JP 2012064945 W JP2012064945 W JP 2012064945W WO 2013186844 A1 WO2013186844 A1 WO 2013186844A1
Authority
WO
WIPO (PCT)
Prior art keywords
touch panel
vibration
actuator
drive command
LRA
Prior art date
Application number
PCT/JP2012/064945
Other languages
English (en)
Japanese (ja)
Inventor
遠藤 康浩
谷中 聖志
裕一 鎌田
矢吹 彰彦
Original Assignee
富士通株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 富士通株式会社 filed Critical 富士通株式会社
Priority to JP2014520830A priority Critical patent/JP5822022B2/ja
Priority to PCT/JP2012/064945 priority patent/WO2013186844A1/fr
Publication of WO2013186844A1 publication Critical patent/WO2013186844A1/fr

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016 - Input arrangements with force or tactile feedback as computer generated output to the user
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • The present invention relates to an electronic device and a drive control program.
  • A touch panel detects contact with its operation surface as an operation input, but it does not provide the tactile sensation of pressing a physical button or the like. For this reason, devices that provide a tactile sensation in response to an operation input on a conventional touch panel have been desired.
  • A conventional mechanical key switch using a metal dome or the like has a protrusion on the surface of a specific key, so that a user can identify that key by touch alone.
  • A touch panel, by contrast, has a flat surface on which the user performs operation input, and GUI (Graphical User Interface) buttons are displayed on it.
  • Accordingly, an object of the present invention is to provide an electronic device and a drive control program that allow a specific GUI button to be identified by tactile sensation.
  • The electronic device includes a touch panel, an actuator that vibrates the touch panel, a load detection unit that detects the load of a user's operation input on the touch panel, and a drive control unit. When the position of the operation input on the touch panel is within a predetermined area and the load detected by the load detection unit is equal to or greater than a first predetermined value and less than a second predetermined value that is greater than the first predetermined value, the drive control unit drives the actuator with a first drive command that causes the touch panel to vibrate according to a first vibration pattern.
  • The drawings include: a diagram illustrating an electronic device of an embodiment; a flowchart showing the processing when the drive device of the embodiment drives the LRA; a diagram showing two types of threshold values for determining the magnitude of the operation input load; a flowchart showing the processing when the electronic device 300 of the embodiment vibrates the touch panel 120; diagrams showing operation examples of the smartphone 300A of the embodiment; a diagram explaining the operating principle of the LRA; a diagram showing an example of the input waveform applied to the LRA; a diagram explaining the displacement of the LRA; and a diagram showing examples of the displacement, velocity, and acceleration of the vibration of the LRA.
  • FIG. 1A shows a waveform 11 of the vibration acceleration generated when an accelerometer 1 is attached to a human finger and a button 2 is pressed.
  • FIG. 1B shows a waveform 12 of the vibration acceleration generated when the accelerometer 1 is attached to a human finger and a touch panel 3 to which an LRA (Linear Resonant Actuator) is attached is touched.
  • The button 2 is, for example, a metal-dome type button.
  • The button 2 and the touch panel 3 are provided in an electronic device.
  • The vibration indicated by the waveform 11 is rapidly damped within one to several cycles.
  • The vibration indicated by the waveform 12 continues, even after the supply of the drive command is stopped, until the free vibration at the natural frequency of the LRA is attenuated.
  • In general, a human finger cannot sense vibration whose acceleration is 0.02 G or less at a vibration frequency of 200 Hz.
  • The vibration frequency is the number of vibrations per second.
  • The acceleration of vibration indicates the amount of change in the vibration velocity per unit time.
  • FIG. 2 is a diagram showing the sensitivity of the receptors in human tissue that detect mechanical stimuli.
  • Human mechanoreceptors include Merkel cells, which sense displacement; Meissner corpuscles, which sense velocity; and Pacinian corpuscles, which sense acceleration.
  • For the waveform 11, the finger does not sense the vibration because the vibration acceleration falls to 0.02 G or less within 0.01 sec.
  • For the waveform 12, about 0.1 sec is required until the vibration acceleration falls to 0.02 G or less, and the finger continues to sense the vibration until that 0.1 sec has elapsed. Therefore, the vibration indicated by the waveform 11 and the vibration indicated by the waveform 12 feel completely different to a human.
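  • As an illustration of the detection-limit figures above (0.02 G, 0.01 sec vs. 0.1 sec), the following Python sketch computes how long a sampled acceleration trace remains perceptible; the helper name, the sample rate, and the synthetic decaying 200 Hz trace are all hypothetical and only for illustration.

```python
import numpy as np

def time_until_imperceptible(accel_g, dt, threshold_g=0.02):
    """Return the time (s) after which |acceleration| stays below threshold_g.

    accel_g:     1-D array of vibration acceleration samples, in units of G.
    dt:          sampling interval in seconds.
    threshold_g: assumed human detection limit (0.02 G at ~200 Hz per the text).
    """
    above = np.flatnonzero(np.abs(accel_g) > threshold_g)
    if above.size == 0:
        return 0.0
    # the last sample still above the limit marks the end of perceptible vibration
    return (above[-1] + 1) * dt

# Synthetic example: a 200 Hz tone decaying with a 20 ms time constant
dt = 1e-4
t = np.arange(0.0, 0.2, dt)
accel = 0.5 * np.exp(-t / 0.02) * np.sin(2 * np.pi * 200 * t)
print(time_until_imperceptible(accel, dt))  # roughly 0.06 s for this trace
```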
  • FIG. 3 is a diagram illustrating a cross-sectional structure of the electronic apparatus according to the embodiment.
  • the electronic apparatus 300 includes a housing 110, a touch panel 120, a double-sided tape 130, an LRA 140, a substrate 150, and a load sensor 160.
  • the electronic device 300 is a portable terminal device such as a smartphone, for example.
  • the electronic device 300 is not limited to a mobile terminal device such as a smartphone, and may be any device that uses the touch panel 120 as an operation input unit.
  • For example, the electronic device 300 may also be a device that is installed and used in a specific place, such as an ATM (Automatic Teller Machine).
  • the touch panel 120 is fixed to the housing 110 by the double-sided tape 130.
  • the LRA 140 is attached to the surface of the touch panel 120 on the housing side.
  • the LRA 140 is a combination of a vibration system having a resonance frequency designed in advance and an actuator.
  • The LRA 140 is a vibration device that generates vibration by being driven mainly at the resonance frequency; the amount of vibration changes depending on the amplitude of the drive waveform.
  • the substrate 150 is disposed inside the housing 110.
  • A driver IC (Integrated Circuit) that outputs drive commands to the LRA 140 in order to control the driving of the LRA 140 is mounted on the board 150.
  • In FIG. 3, the driver IC and the like are omitted.
  • In the electronic device 300, when a user's finger contacts the touch panel 120, the contact is detected and the LRA 140 is driven by the driving device mounted on the substrate 150, so that the vibration of the LRA 140 propagates to the touch panel 120.
  • the load sensor 160 is an example of a load detection unit that detects a load input to the surface of the touch panel 120.
  • the load sensor 160 is disposed on both sides of the substrate 150 inside the housing 110. Note that the number and position of the load sensor 160 in a plan view are arbitrary. For example, one load sensor 160 may be provided at each of the four corners of the touch panel 120 or one at the center.
  • the load sensor 160 may be any type of load sensor as long as it can detect a load input to the touch panel 120. Therefore, for example, the sensor may be a sensor that is built in the touch panel 120 and detects a load based on a change in capacitance that occurs when a user touches the surface of the touch panel 120. Alternatively, a sheet-like sensor that is disposed on the back surface (the lower surface in FIG. 3) side of the touch panel 120 and detects the capacitance with the touch panel 120 may be used.
  • FIG. 4 is a diagram showing cross-sectional structures of two types of LRAs.
  • FIG. 4A is a diagram showing an LRA using a voice coil
  • FIG. 4B is a diagram showing an LRA using a piezoelectric element.
  • The LRA 30 illustrated in FIG. 4A includes a spring 31, a magnet 32, and a coil 33.
  • An LRA 40 illustrated in FIG. 4B includes a weight 41, a beam 42, and a piezoelectric element 43.
  • When the mass of the weight 41 is m, the Young's modulus of the beam 42 is E, the second moment of area of the beam 42 is I, and L is the length of the beam 42 in the longitudinal direction, the natural frequency f0 is expressed by Formula 2.
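  • The formula images themselves are not reproduced in this text. As a sketch, assuming Formula 1 is the standard spring-mass expression and Formula 2 the standard expression for a massless cantilever carrying a tip mass (an assumption, not a quotation of the patent), the natural frequencies can be computed as follows:

```python
import math

def f0_spring_mass(k, m):
    """Assumed form of Formula 1 for the voice-coil LRA 30:
    f0 = (1 / (2*pi)) * sqrt(k / m), with spring constant k and moving mass m."""
    return math.sqrt(k / m) / (2 * math.pi)

def f0_cantilever_tip_mass(E, I, L, m):
    """Assumed form of Formula 2 for the piezoelectric LRA 40: a massless cantilever
    of Young's modulus E, second moment of area I, and length L carrying a tip mass m,
    f0 = (1 / (2*pi)) * sqrt(3*E*I / (m * L**3))."""
    return math.sqrt(3.0 * E * I / (m * L**3)) / (2 * math.pi)

# Illustrative numbers only, not taken from the patent:
print(f0_spring_mass(k=2.0e3, m=1.0e-3))                          # ~225 Hz
print(f0_cantilever_tip_mass(E=70e9, I=2e-14, L=0.01, m=1.0e-3))  # ~330 Hz
```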
  • As the LRA 140, either an LRA 30 using a voice coil or an LRA 40 using a piezoelectric element 43 may be used.
  • FIG. 5 is a diagram illustrating the electronic apparatus according to the embodiment.
  • The electronic device 300 includes, as its main components, a driving device 200, a driver IC (Integrated Circuit) 260, an LRA 140, a load sensor 160, a display 301, a touch sensor 302, an input unit 303, a signal processing unit 304, a communication unit 305, and a recording medium I/F (Interface) unit 308.
  • the driving device 200 includes a control unit 210 and a memory 220.
  • the control unit 210 includes a drive control unit 211, a position determination unit 212, and a load determination unit 213.
  • the control unit 210 is realized by a CPU (Central Processing Unit).
  • the drive control unit 211 reads and executes the drive control program 230 stored in the memory 220 and generates a drive command based on the waveform data, thereby driving the LRA 140 described later using the drive command.
  • the drive control unit 211 generates a drive command representing a voltage value, a current value, and the like to be supplied to the LRA 140 based on the waveform data representing the drive waveform of the LRA 140.
  • the position determination unit 212 determines whether the position (operation position) where the user's finger touches the touch panel 120 is inside or outside the predetermined area.
  • The load determination unit 213 compares the load detected by the load sensor 160 with a first threshold value and a second threshold value, and determines whether or not the detected load is greater than or equal to each of them. Threshold data representing the first threshold and the second threshold is stored in the threshold database 350.
  • The memory 220 has a storage area for storing a drive control program 230 that controls the driving of the LRA 140, a storage area for storing waveform data (the waveform database 240), and a storage area for storing an API (Application Programming Interface) 250. The memory 220 is further provided with a storage area in which a threshold database 350 representing the threshold values of the load sensor 160 is stored.
  • In this embodiment, the waveform database 240 and the threshold database 350 are storage areas in the memory 220, but they may instead be stored in a separate memory that is physically distinct from the memory 220.
  • the drive control program 230 is a computer program executed by the control unit 210 when causing the control unit 210 to execute drive control of the LRA 140.
  • the waveform database 240 is a database that stores waveform data used when generating a drive command for driving the LRA 140.
  • four types of waveform data for driving LRA 140 are stored in waveform database 240.
  • the API 250 is activated by the drive control program 230 and performs various processes for presenting a tactile sensation.
  • the API 250 is stored in the memory 220, but may be stored in another memory mounted on the substrate 150.
  • the threshold database 350 is a database that stores two types of threshold data for determining the magnitude of the load detected by the load sensor 160.
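  • As a purely illustrative sketch of how these storage areas could be laid out (the keys, waveform parameters, and threshold values below are hypothetical, not values from the patent):

```python
# Hypothetical layout of the storage areas in memory 220 described above.
memory_220 = {
    "drive_control_program_230": "drive_control.bin",   # program executed by the control unit 210
    "waveform_database_240": {                           # four types of waveform data for the LRA 140
        "first_drive_command":  {"shape": "sine",   "f1_hz": 337.5, "cycles": 3},
        "second_drive_command": {"shape": "sine",   "f1_hz": 337.5, "cycles": 5},
        "third_drive_command":  {"shape": "none"},       # e.g. a pattern that does not vibrate
        "fourth_drive_command": {"shape": "cosine", "f1_hz": 337.5, "cycles": 3},
    },
    "api_250": "haptics_api",
    "threshold_database_350": {                          # two thresholds compared with the load sensor 160
        "first_threshold_newton":  0.5,
        "second_threshold_newton": 1.5,
    },
}
```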
  • the driver IC 260 drives the LRA 140 based on the drive command input from the control unit 210.
  • the driver IC 260 amplifies the drive command input from the control unit 210 and inputs it to the LRA 140.
  • The display 301 is, for example, an LCD (Liquid Crystal Display).
  • GUI (Graphical User Interface) components are displayed on the display 301; for example, when the electronic device 300 is a smartphone, a keypad for entering a telephone number is a typical example of such a component.
  • the display content of the display 301 is controlled by the control unit 210.
  • The touch sensor 302 is disposed on the surface side of the display 301 and detects the coordinates of the position touched by the user's finger. The coordinates detected by the touch sensor 302 are input to the control unit 210. The electronic component comprising the display 301 and the touch sensor 302 constitutes the touch panel 120.
  • The input unit 303 is, for example, a switch other than the touch panel when the electronic device 300 is a smartphone; examples of such a switch include a home key and a volume adjustment button.
  • The signal processing unit 304 handles the data that the electronic device 300 uploads to the server 400 and the data that it downloads from the server 400, and is a processing unit that exchanges this data with the control unit 210.
  • The signal processing unit 304 functions as a communication interface of the electronic device 300.
  • the communication unit 305 performs data communication when the electronic device 300 communicates with the server 400 via the Internet 401.
  • the server 400 is, for example, a mail server or a cloud server.
  • the recording medium I / F unit 308 is an interface between the electronic device 300 and a recording medium 309 (for example, a flash memory) connected via a data transmission path such as USB (Universal Serial Bus).
  • a predetermined program is stored in the recording medium 309, and the program stored in the recording medium 309 is installed in the electronic device 300 via the recording medium I / F unit 308.
  • the installed predetermined program can be executed by the electronic device 300.
  • FIG. 6 is a flowchart illustrating processing when the electronic apparatus 300 according to the embodiment drives the LRA 140.
  • When the electronic device 300 according to the embodiment detects that the user has touched the touch panel 120 (step S601), it activates the API 250 (step S602). Specifically, the electronic device 300 activates the API 250 when, for example, the user touches a GUI button displayed on the touch panel 120.
  • the API 250 reads the waveform data stored in the memory 220 and outputs a drive command generated based on the waveform data to the driver IC 260 (step S603).
  • the driver IC 260 converts the drive command to D / A (Digital-to-Analog) (step S604), and amplifies it with an amplifier or the like (step S605).
  • the driver IC 260 outputs the amplified signal to the LRA 140 (step S606).
  • The processing shown in FIG. 6 is performed whenever the user's finger touches the touch panel 120, regardless of the type of GUI button operated on the touch panel 120 and the load detected by the load sensor 160.
  • FIG. 6 therefore illustrates the basic processing that the electronic device 300 performs to drive the LRA 140.
  • By driving the LRA 140 when the user touches a specific GUI button on the touch panel 120, the electronic device 300 provides the user with a tactile sensation as if a protrusion provided on a specific key of a mechanical key switch had been touched.
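  • A minimal Python sketch of this flow (all class and function names are hypothetical; the driver IC and LRA are modelled only as far as steps S601 to S606 require):

```python
import numpy as np

class DriverIC:
    """Stand-in for the driver IC 260: D/A conversion and amplification (S604, S605)."""
    def __init__(self, gain=2.0, sample_rate=48_000):
        self.gain = gain
        self.sample_rate = sample_rate

    def output(self, drive_command):
        analog = np.asarray(drive_command, dtype=float)  # S604: treat as converted to analog
        return self.gain * analog                        # S605: amplify; S606: hand to the LRA

def on_touch(waveform_database, driver_ic, waveform_id):
    # S601/S602: a touch on the touch panel activates the API,
    # S603: the API reads waveform data and builds a drive command from it,
    # S604-S606: the driver IC converts, amplifies, and outputs it to the LRA.
    drive_command = waveform_database[waveform_id]
    return driver_ic.output(drive_command)

# Example with a dummy waveform database holding one short sine burst
t = np.arange(0, 0.01, 1 / 48_000)
waveform_db = {"click": np.sin(2 * np.pi * 225 * t)}
signal_to_lra = on_touch(waveform_db, DriverIC(), "click")
```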
  • FIG. 7 is a diagram illustrating two types of threshold values for determining the magnitude of the operation input load to the touch panel 120 in the electronic device 300 according to the embodiment. These two types of threshold values are represented by two types of threshold data stored in the threshold value database 350.
  • Of the two threshold values, the first threshold value is set lower than the second threshold value, and the second threshold value is set higher than the first threshold value. The first and second threshold values may be set, for example, based on the average load that a fingertip applies to the touch panel 120 when a user operates it.
  • The first threshold value corresponds, for example, to the user pressing the touch panel 120 lightly, and the second threshold value corresponds to the user pressing the touch panel 120 relatively strongly.
  • The operation input to the touch panel 120 is accepted when the user lifts his or her finger or hand from the touch panel 120.
  • When the value detected by the load sensor 160 is equal to or greater than the first threshold value and less than the second threshold value, the LRA 140 is driven with the first vibration pattern and the touch panel 120 vibrates according to the first vibration pattern. At this stage, the electronic device 300 does not yet accept an operation input to the predetermined GUI button.
  • When the value detected by the load sensor 160 reaches the second threshold value, the electronic device 300 drives the LRA 140 with a vibration pattern different from the first vibration pattern, the touch panel 120 vibrates according to this second vibration pattern, and the electronic device 300 accepts the operation input to the predetermined GUI button.
  • As described above, the electronic device 300 vibrates the touch panel 120 with two types of vibration patterns according to the magnitude of the force with which the user presses the touch panel 120.
  • Before accepting an operation input to the predetermined GUI button, the electronic device 300 first vibrates the touch panel 120 with the first vibration pattern.
  • When the button is pressed more strongly, the electronic device 300 vibrates the touch panel 120 with the second vibration pattern.
  • Thus, when the user lightly presses the touch panel 120, the user can recognize by touch alone that the predetermined GUI button is being pressed; at this stage the electronic device 300 does not accept the operation of the GUI button.
  • When the predetermined GUI button is pressed more strongly, the electronic device 300 accepts the operation input, vibrates the touch panel 120 with the second vibration pattern, and thereby lets the user recognize by touch alone that the operation input has been accepted.
  • In this way, the position of the predetermined GUI button is conveyed to the user by touch alone through the first vibration pattern, and the completion of the operation input is conveyed by touch alone through the second vibration pattern.
  • FIG. 8 is a flowchart illustrating processing when the electronic device 300 according to the embodiment vibrates the touch panel 120. The processing shown in FIG. 8 is executed by the control unit 210.
  • In step S801, the control unit 210 determines whether or not there is an operation input.
  • The process of step S801 is performed by determining whether or not the control unit 210 has received, from the touch panel 120, coordinate data representing the coordinates at which an operation input was made.
  • When the control unit 210 determines in step S801 that there is no operation input (S801: NO), it repeats the process of step S801. This is because the control unit 210 performs the series of processes shown in FIG. 8 only when there is a user operation input.
  • When there is an operation input, the control unit 210 determines whether or not the position of the operation input is within a predetermined area (step S802).
  • The process of step S802 is performed by the control unit 210 determining whether or not the coordinates represented by the coordinate data received from the touch panel 120 in step S801 are within the predetermined area. The process of step S802 is performed by the position determination unit 212 of the control unit 210.
  • The predetermined area may be specified by expressing, in coordinates, the area in which the predetermined GUI button is displayed. If the coordinates of the operation input are included in the coordinates representing that area, the control unit 210 determines that the position of the operation input is within the predetermined area.
  • When the position of the operation input is within the predetermined area (S802: YES), the control unit 210 determines whether or not the load is greater than or equal to the first threshold (step S803).
  • The process of step S803 is performed by determining whether or not the load represented by the load data that the control unit 210 receives from the load sensor 160 (see FIG. 5) is greater than or equal to the first threshold value.
  • The process of step S803 is performed by the load determination unit 213 of the control unit 210.
  • When the load is greater than or equal to the first threshold (S803: YES), the control unit 210 determines whether or not the load is less than the second threshold (step S804).
  • The process of step S804 is performed by determining whether or not the load represented by the load data received from the load sensor 160 (see FIG. 5) is less than the second threshold value. The process of step S804 is performed by the load determination unit 213 of the control unit 210.
  • When the control unit 210 determines that the load is less than the second threshold (S804: YES), it drives the LRA 140 with the first drive command (step S805).
  • The first drive command is a drive command for causing the touch panel 120 to generate vibration according to the first vibration pattern.
  • The process of step S805 is performed by the control unit 210 inputting the first drive command to the driver IC 260, and is carried out by the drive control unit 211 of the control unit 210.
  • This case corresponds to the touch panel 120 being vibrated with the first vibration pattern because a predetermined GUI button on the touch panel 120 has been pressed lightly. In this case, the user can recognize by touch alone that he or she is touching the predetermined GUI button.
  • Following the process of step S805, the control unit 210 determines whether or not the power of the electronic device 300 has been turned off (step S806). This is because the process need not continue when the power is turned off.
  • When the control unit 210 determines that the power has not been turned off (S806: NO), the flow returns to step S801.
  • When the control unit 210 determines in step S803 that the load is not greater than or equal to the first threshold (S803: NO), the flow returns to step S801.
  • This case corresponds, for example, to the user touching the predetermined GUI button very lightly. In such a case, the user is not considered to be searching for the predetermined GUI button by touch, so the flow returns to S801 to determine again whether or not there is an operation input.
  • When the control unit 210 determines in step S804 that the load is not less than the second threshold (S804: NO), the flow proceeds to step S807.
  • In that case, the control unit 210 drives the LRA 140 with the second drive command (step S807).
  • The second drive command is a drive command for causing the touch panel 120 to generate vibration according to the second vibration pattern.
  • The process of step S807 is performed by the control unit 210 inputting the second drive command to the driver IC 260, and is carried out by the drive control unit 211 of the control unit 210.
  • The case in which step S807 is reached is, for example, the case in which the flow has proceeded through steps S801, S802, S803, S804, S805, and S806, returned to S801, and then proceeded through S802, S803, and S804 to S807.
  • This is because, while the load is less than the second threshold in step S804, the process proceeds to S805.
  • In that case, vibration of the first vibration pattern is first generated on the touch panel 120 by the process of step S805, and then vibration of the second vibration pattern is generated on the touch panel 120 by the process of step S807.
  • Thus, the user can first recognize, by the vibration according to the first vibration pattern, that the predetermined GUI button is being touched, and can then recognize, by the vibration according to the second vibration pattern, that the operation of the predetermined GUI button has been accepted by the electronic device 300.
  • When the control unit 210 determines in step S802 that the position of the operation input is outside the predetermined region (S802: NO), the flow proceeds to step S808.
  • The control unit 210 then determines whether or not the load is equal to or greater than the first threshold value (step S808).
  • The process of step S808 is performed, similarly to step S803, by determining whether or not the load represented by the load data received from the load sensor 160 (see FIG. 5) is equal to or greater than the first threshold. The process of step S808 is performed by the load determination unit 213 of the control unit 210.
  • When the control unit 210 determines that the load is equal to or greater than the first threshold (S808: YES), it determines whether or not the load is less than the second threshold (step S809).
  • The process of step S809 is performed, similarly to step S804, by determining whether or not the load represented by the load data received from the load sensor 160 (see FIG. 5) is less than the second threshold. The process of step S809 is performed by the load determination unit 213 of the control unit 210.
  • When the load is less than the second threshold (S809: YES), the control unit 210 drives the LRA 140 with the third drive command (step S810).
  • The third drive command is a drive command for causing the touch panel 120 to generate vibration according to the third vibration pattern.
  • The process of step S810 is performed by the control unit 210 inputting the third drive command to the driver IC 260, and is carried out by the drive control unit 211 of the control unit 210.
  • The third vibration pattern may be a pattern in which the LRA 140 is not driven and the touch panel 120 is not vibrated.
  • In that case, the touch panel 120 is vibrated (with the first vibration pattern) only when the user lightly touches the predetermined GUI button on the touch panel 120.
  • When the control unit 210 determines in step S809 that the load is not less than the second threshold (S809: NO), the flow proceeds to step S811.
  • In that case, the control unit 210 drives the LRA 140 with the fourth drive command (step S811).
  • The fourth drive command is a drive command for causing the touch panel 120 to generate vibration according to the fourth vibration pattern.
  • The process of step S811 is performed by the control unit 210 inputting the fourth drive command to the driver IC 260, and is carried out by the drive control unit 211 of the control unit 210.
  • The case in which step S811 is reached is, for example, the case in which the flow has proceeded through steps S801, S802, S808, S809, S810, and S806, returned to S801, and then proceeded through S802, S808, and S809 to S811.
  • This is because, while the load is less than the second threshold in step S809, the process proceeds to S810.
  • In that case, vibration of the third vibration pattern is first generated on the touch panel 120 by the process of step S810, and then vibration of the fourth vibration pattern is generated on the touch panel 120 by the process of step S811.
  • Thus, the user can first recognize, by the vibration according to the third vibration pattern, that a GUI button other than the predetermined GUI button is being touched, and can then recognize, by the vibration according to the fourth vibration pattern, that the operation of that GUI button has been accepted by the electronic device 300.
  • The fourth drive command may be the same drive command as the second drive command.
  • In that case, the vibration provided to the user when an operation input is accepted via the predetermined GUI button and the vibration provided when an operation input is accepted via a GUI button other than the predetermined GUI button can be unified.
  • As described above, when the predetermined GUI button is operated, vibration of the first vibration pattern is first generated on the touch panel 120, and then vibration of the second vibration pattern is generated on the touch panel 120.
  • The user can therefore first recognize by touch alone, from the vibration according to the first vibration pattern, that the predetermined GUI button is being touched, and can then recognize by touch alone, from the vibration according to the second vibration pattern, that the operation input to the predetermined GUI button has been accepted.
  • In other words, when a user lightly presses the predetermined GUI button on the touch panel 120, the touch panel 120 generates vibration of the first vibration pattern, and the user can recognize the position of the specific GUI button by touch alone, just as when touching the protrusion provided on a specific key of a conventional mechanical key switch.
  • When the predetermined GUI button is pressed more strongly, the electronic device 300 accepts the operation input and vibrates the touch panel 120 with the second vibration pattern, so that the user can recognize by touch alone that the operation input has been accepted.
  • In this way, the position of the predetermined GUI button is conveyed to the user by touch alone through the first vibration pattern, and the completion of the operation input is conveyed by touch alone through the second vibration pattern.
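  • The branching of FIG. 8 can be summarized in a short Python sketch; the function name, the rectangular button region, and the numeric thresholds are placeholders, not values from the patent:

```python
def select_drive_command(position, load, button_region,
                         first_threshold, second_threshold):
    """Return which drive command (1-4) to issue, or None, following the FIG. 8 flow.

    position:      (x, y) coordinates of the operation input (S801/S802).
    load:          value reported by the load sensor 160 (S803/S804, S808/S809).
    button_region: (x0, y0, x1, y1) bounding box of the predetermined GUI button.
    """
    x0, y0, x1, y1 = button_region
    inside = x0 <= position[0] <= x1 and y0 <= position[1] <= y1   # S802

    if load < first_threshold:        # S803 / S808: NO -> wait for the next input (S801)
        return None
    if inside:
        # S804: first pattern while the press is light, second once it is firm
        return 1 if load < second_threshold else 2                 # S805 / S807
    # outside the predetermined button: third / fourth pattern
    return 3 if load < second_threshold else 4                     # S810 / S811

# Example: a light press inside the button region selects the first drive command
print(select_drive_command((12, 40), load=0.6,
                           button_region=(0, 0, 50, 80),
                           first_threshold=0.5, second_threshold=1.5))   # -> 1
```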
  • In the operation examples described below, the third vibration pattern is a pattern that does not vibrate the touch panel 120.
  • FIGS. 9 to 12 are diagrams illustrating operation examples of the smartphone 300A according to the embodiment.
  • GUI buttons 121A, 121B, 121C, 121D, 121E, 121F are displayed on the touch panel 120 of the smartphone 300A.
  • the GUI button 121A is a GUI button that is pressed when using a telephone function
  • the GUI button 121B is a GUI button that is pressed when using a mail function
  • the GUI button 121C is a GUI button that is pressed when using the telephone directory.
  • The GUI buttons 121D to 121F are GUI buttons to which a specific operation can be assigned; for example, they can be used as speed-dial buttons for calling a specific contact.
  • the GUI button 121A is set as a predetermined GUI button in the smartphone 300A.
  • When the GUI button 121A is touched, the color of the GUI button 121A may be changed.
  • When the user lightly touches the GUI button 121E, the touch panel 120 does not vibrate, because the GUI button 121E is a GUI button other than the predetermined GUI button and, here, the third vibration pattern is a pattern that does not vibrate the touch panel 120.
  • When the GUI button 121E is pressed more strongly, the touch panel 120 is vibrated with the fourth vibration pattern, and the user can recognize that the operation input has been accepted.
  • When the user's finger touches the GUI button 121A, the smartphone 300A first vibrates the touch panel 120 with the first vibration pattern, and when the GUI button 121A is pressed more strongly, the touch panel 120 is vibrated with the second vibration pattern.
  • Thus, the user can recognize the position of the GUI button 121A by touch alone and can recognize that the operation input has been accepted by the smartphone 300A.
  • In another example, GUI buttons representing a numeric keypad for a calculator are displayed on the touch panel of the smartphone 300A.
  • When the user lightly touches the '9' GUI button, the touch panel 120 does not vibrate, because the '9' GUI button is a GUI button other than the predetermined GUI button and, here, the third vibration pattern is a pattern that does not vibrate the touch panel 120.
  • When the '9' GUI button is pressed more strongly, the touch panel 120 is vibrated with the fourth vibration pattern, and the user can recognize that the operation input has been accepted.
  • When the user's finger touches the '5' GUI button, the smartphone 300A first vibrates the touch panel 120 with the first vibration pattern, and when the '5' GUI button is pressed more strongly, the touch panel 120 is vibrated with the second vibration pattern.
  • Thus, the user can recognize the position of the '5' GUI button by touch alone and can recognize that the operation input has been accepted by the smartphone 300A.
  • As described above, when the predetermined GUI button is operated, vibration of the first vibration pattern is first generated on the touch panel 120, and then vibration of the second vibration pattern is generated on the touch panel 120.
  • The user can thus recognize by touch alone, from the vibration according to the first vibration pattern, that the predetermined GUI button is being touched, and can then recognize by touch alone, from the vibration according to the second vibration pattern, that the operation of the predetermined GUI button has been accepted by the smartphone 300A.
  • When the GUI button 121A shown in FIG. 9B or the '5' GUI button shown in FIG. 11B is touched, vibration of the first vibration pattern is generated on the touch panel 120. Accordingly, it is possible to provide a smartphone 300A that allows the user to recognize the position of a specific GUI button by touch alone, in the same way as when touching the protrusion provided on a specific key of a conventional mechanical key switch.
  • the embodiments of the electronic device 300 and the smartphone 300A that can identify a specific GUI button by touch have been described above.
  • The rising and falling shape of the waveform represented by the waveform data described above may be any shape.
  • A vibration waveform corresponding to each operation input can be generated freely by combining vibration waveforms.
  • If the smartphone 300A can provide the user with a tactile sensation as if a mechanical button other than the touch panel 120 had been pressed (a feeling as if a button had been clicked), it becomes easier for the user to distinguish the various vibration waveforms.
  • The LRA driving methods according to the first, second, and third methods described below are driving methods that provide such a tactile sensation and make it easier for the user to recognize the vibration waveform of the touch panel 120.
  • The waveform data generated by the first, second, and third methods described below can be used as the waveform data for generating the first, second, third, and fourth drive commands described above.
  • If the first vibration pattern generated by the first drive command is given a tactile sensation as if a mechanical button had been clicked, the position of the predetermined GUI button can be quickly conveyed to the user when the user's finger lightly touches it.
  • If the second vibration pattern generated by the second drive command is given a tactile sensation as if a mechanical button had been clicked, it becomes easier for the user to recognize that the operation input to the predetermined GUI button has been accepted when the user's finger presses it strongly in order to operate it.
  • If the third vibration pattern generated by the third drive command is given a tactile sensation as if a mechanical button had been clicked, the position of a GUI button other than the predetermined GUI button can be quickly conveyed to the user when the user's finger lightly touches it.
  • If the fourth vibration pattern generated by the fourth drive command is given a tactile sensation as if a mechanical button had been clicked, it becomes easier for the user to recognize that the operation input to a GUI button other than the predetermined GUI button has been accepted when the user's finger presses it strongly in order to operate it.
  • The second vibration pattern and the fourth vibration pattern may be the same vibration pattern, but the first vibration pattern and the third vibration pattern are set to different vibration patterns. This allows the user to recognize, by touch alone, the position of the predetermined GUI button when the user starts touching the touch panel 120.
  • the LRA vibration pattern is changed using three methods to express a tactile sensation as if a mechanical button was clicked.
  • the first method is a method of suppressing free vibration due to the natural frequency of the LRA that continues even after the supply of the drive command is stopped.
  • the free vibration due to the natural frequency of the LRA that continues even after the supply of the drive command is stopped is referred to as residual vibration.
  • the vibration of the LRA 140 stops in one to several cycles when a drive command that satisfies a specific condition described later is supplied to the LRA 140.
  • In the first method, a drive command that satisfies a specific condition is applied to the LRA 140 so that the residual vibration is suppressed; a vibration that rapidly attenuates within one to several cycles is thereby generated, expressing a tactile sensation as if a mechanical button had been clicked.
  • FIG. 13 is a diagram for explaining the operating principle of the LRA
  • FIG. 14 is a diagram illustrating an example of an input waveform applied to the LRA.
  • When a sine wave F having a frequency f1 is applied to the LRA 140, the LRA 140 also generates vibration at its natural frequency (resonance frequency) f0. That is, the LRA 140 generates a composite wave in which the sine wave of frequency f1 and a sine wave of the natural frequency f0 of the LRA 140 are combined, and the LRA 140 is displaced according to this composite wave.
  • FIG. 15 is a diagram showing the displacement of the LRA when the input of FIG. 14 is applied as a drive command.
  • FIG. 15A is a diagram illustrating a forced displacement component and a free vibration component of the displacement generated in the LRA 140
  • FIG. 15B is a diagram illustrating a combined wave.
  • a waveform y1 indicated by a dotted line indicates a forced vibration displacement of the LRA 140 generated when the sine wave F is applied to the LRA 140
  • a waveform y2 indicated by a solid line indicates a free vibration displacement.
  • the displacement y3 generated in the LRA 140 is a composite wave of the waveform y1 and the waveform y2.
  • FIG. 15B is a diagram illustrating an example of a combined wave y3 of the waveform y1 and the waveform y2. It can be seen that the synthesized wave y3 becomes 0 at the timing T when the sine wave F becomes 0.
  • FIG. 16 is a diagram illustrating an example of the vibration speed and vibration acceleration of the LRA.
  • FIG. 16A shows the waveform of the composite wave y3
  • FIG. 16B shows the velocity waveform y3 ′ obtained by differentiating the displacement of the composite wave y3
  • FIG. 16C shows the acceleration waveform y3'' obtained by differentiating the displacement of the composite wave y3 twice.
  • the velocity waveform y3 ′ and the acceleration waveform y3 ′′ become 0 when the synthesized wave y3 becomes 0. That is, the vibration of the LRA 140 stops at the timing T.
  • The vibration of the LRA 140 stops within about two cycles of the acceleration waveform y3'', in less than 0.01 sec. Therefore, in the example of FIG. 16, the vibration acceleration falls below the human detection limit of 0.02 G within 0.01 sec, and a click feeling as if the button 2 had been pressed can be expressed.
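  • The behaviour described above can be reproduced numerically with an undamped spring-mass model of the LRA. The sketch below assumes that the specific condition amounts to choosing f1 as a rational multiple (m/n) of f0 and driving for a whole number of cycles of both frequencies; this reproduces waves analogous to the forced wave y1, the free wave y2, and the composite wave y3 of FIGS. 15 and 16, but it is an illustrative model, not the patent's exact derivation, and the parameter values are placeholders.

```python
import numpy as np

f0 = 225.0                   # assumed natural frequency of the LRA [Hz]
n, m = 2, 3                  # drive for m cycles at f1 = (m / n) * f0
f1 = (m / n) * f0            # 337.5 Hz
w0, w1 = 2 * np.pi * f0, 2 * np.pi * f1
T = m / f1                   # end of the drive command (= n / f0, whole cycles of both)

t = np.linspace(0.0, T, 2001)
A = 1.0 / (w0**2 - w1**2)    # response scale of the undamped spring-mass model

y1 = A * np.sin(w1 * t)                # forced-vibration displacement (waveform y1)
y2 = -A * (w1 / w0) * np.sin(w0 * t)   # free-vibration displacement (waveform y2)
y3 = y1 + y2                           # composite wave y3

y3_vel = np.gradient(y3, t)            # y3'  (velocity)
y3_acc = np.gradient(y3_vel, t)        # y3'' (acceleration)

# At t = T the drive and the composite displacement, velocity, and acceleration all
# pass through zero together, so the model LRA stops without residual vibration
# (up to numerical differentiation error).
print(y3[-1], y3_vel[-1], y3_acc[-1])
```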
  • FIG. 17 is a diagram illustrating vibration acceleration of the LRA 140 when a drive command is a sine wave having the natural frequency of the LRA.
  • FIG. 17B shows the acceleration of vibration of the LRA 140 when a simulation is performed using the sine wave of FIG. 17A as a drive command.
  • the vibration acceleration of the touch panel 120 is measured by installing an accelerometer in the center of the touch panel 120.
  • FIG. 18 is a diagram showing the acceleration measured in an electronic device when a conventional method is used in which, after the drive signal is stopped, a voltage in antiphase (shifted by 180 degrees) with the vibration generated in the LRA 140 is applied.
  • FIG. 18B shows the vibration acceleration of the LRA 140 when the sine wave of FIG. 18A is used as a drive command in an actual machine equipped with the LRA 140 and a voltage in antiphase with the residual vibration generated in the LRA 140 is applied after the drive command is stopped.
  • In this case, the residual vibration is smaller than that of FIG. 17, but it still takes 0.05 sec or more until the vibration acceleration falls to 0.02 G or less, the human detection lower limit.
  • FIG. 19 is a diagram illustrating an LRA acceleration response simulation when a signal that does not satisfy the conditions of the embodiment is used as an input drive signal, and an acceleration measurement result in an actual electronic device.
  • FIG. 19A shows a sine wave having a frequency of 300 Hz that does not satisfy the specific condition according to the embodiment.
  • FIG. 19B shows the acceleration of vibration when a simulation is performed using the sine wave of FIG. 19A as a drive command.
  • FIG. 20 is a diagram illustrating an LRA acceleration response simulation when a signal satisfying the conditions of the embodiment is an input drive signal, and an acceleration measurement result in an actual electronic device.
  • FIG. 20 (A) shows a sine wave having a frequency of 350 Hz that satisfies a specific condition.
  • FIG. 20B shows the acceleration of vibration of the LRA 140 when a simulation is performed using the sine wave of FIG. 20A as a drive command.
  • In FIG. 20, the acceleration of the residual vibration is at or below the detection lower limit of 0.02 G, and the vibration waveform is short in duration.
  • the natural frequency f0 may be the natural frequency of the LRA 140 after the LRA 140 is incorporated into the electronic device 300.
  • The frequency f1 is preferably set so that the error with respect to m/n × f0 is 1% or less. If the frequency f1 is set in this way, then even if residual vibration occurs after the application of the drive command is stopped, the acceleration of that vibration is 0.02 G or less, the lower limit of human detection, and is not perceived; the tactile sensation of clicking a button is therefore not impaired.
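  • A small helper (hypothetical, not from the patent) can check this 1% criterion for a candidate drive frequency, given the integers m and n prescribed by the specific condition (their allowed values are defined by the patent and are not reproduced here):

```python
def within_tolerance(f1, f0, m, n, tol=0.01):
    """Check the 1% criterion: |f1 - (m/n) * f0| <= tol * (m/n) * f0."""
    target = (m / n) * f0
    return abs(f1 - target) <= tol * target

# Illustrative only: a 225 Hz LRA and two candidate drive frequencies for m/n = 3/2
print(within_tolerance(337.0, 225.0, m=3, n=2))   # True  (within 1% of 337.5 Hz)
print(within_tolerance(345.0, 225.0, m=3, n=2))   # False (about 2.2% away)
```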
  • The touch panel 120 itself, fixed to the housing 110, is also a vibrating body and vibrates at a high frequency.
  • In the second method, the drive command for the LRA 140 is a signal that stops the excitation of the LRA 140 when the amplitude reaches its peak; this excites the high-frequency vibration of the touch panel 120 itself and generates a vibration that rapidly attenuates within one to several cycles, expressing the tactile sensation of clicking a mechanical button.
  • FIG. 21 is a diagram for explaining excitation of vibration by the resonance frequency of the touch panel.
  • FIG. 21A shows a sine waveform of a drive command applied to the LRA 140
  • FIG. 21B shows a waveform of vibration acceleration of the touch panel 120.
  • the drive command is a voltage.
  • the resonance frequency of the LRA 140 is 225 Hz
  • the resonance frequency of the touch panel 120 is 1 kHz. That is, it can be said that the vibration of the LRA 140 is a low-frequency vibration, and the vibration of the touch panel 120 is a high-frequency vibration.
  • the resonance frequency of the touch panel 120 is a resonance frequency in a state where the four sides of the touch panel 120 are fixed to the housing 110.
  • The drive command is a signal that stops the excitation of the LRA 140 at the point P1 where the amplitude reaches a peak.
  • the amplitude of the drive command shown in FIG. 21A becomes 0 immediately after the excitation to the LRA 140 is stopped.
  • By dropping the amplitude of the drive command to 0 from the peak, the vibration of the LRA 140 is taken out of its steady harmonic vibration.
  • In this example, the drive time of the LRA 140 under the drive command is set to 7/4 of a period, and the point P1 at which the amplitude reaches its peak is the end of the drive command. The end of the drive command is the point at which the excitation of the LRA 140 is stopped.
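  • A sketch of generating such a second-method drive waveform, i.e. a sampled sine at the LRA resonance frequency truncated at 7/4 of a period so that the last sample sits at an amplitude peak (the sample rate and amplitude are illustrative assumptions):

```python
import numpy as np

def second_method_waveform(f_lra=225.0, cycles=7 / 4, sample_rate=48_000, amplitude=1.0):
    """Sine drive at the LRA resonance frequency, cut off after `cycles` periods.

    With cycles = 7/4 the waveform ends where |amplitude| is at a peak (point P1
    in the text); after the last sample the command drops to 0.
    """
    duration = cycles / f_lra
    t = np.arange(0.0, duration, 1.0 / sample_rate)
    return amplitude * np.sin(2 * np.pi * f_lra * t)

wave = second_method_waveform()
print(len(wave), wave[-1])   # the last sample is close to -1.0, the negative peak
```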
  • FIG. 22 is a diagram showing acceleration of vibration of the touch panel when the voltage of the resonance frequency of the LRA is used as a drive command.
  • FIG. 22 shows the vibration acceleration of the touch panel 120 when an attempt is made to express a tactile sensation as if a mechanical button had been clicked by simply shortening the driving time of the LRA 140.
  • Even if the driving time of the LRA 140 is shortened, the vibration of the touch panel 120 still requires a rise time for the vibration amount to build up and a further time for the acceleration of the amplified vibration to decay to 0.02 G or less.
  • The vibration therefore continues for several cycles. In the example of FIG. 22, it takes about 25 msec from the rise to the decay, and the vibration continues for about 4 cycles. It is therefore difficult to present a sharp tactile sensation as if a mechanical button had been clicked.
  • In FIG. 21B, by contrast, it can be seen that a high-frequency vibration of 1 kHz is excited and that this vibration is attenuated in about two cycles.
  • The high-frequency vibration is excited at the point P1, which is the end of the drive command, and the acceleration of the high-frequency vibration then reaches its peak; the timing at which the acceleration of the high-frequency vibration peaks therefore deviates slightly from the timing of the point P1 of the drive command.
  • FIG. 23 is a diagram illustrating an example in which the location for exciting the high-frequency vibration is shifted from the point P1.
  • FIG. 23A shows a sine waveform of a drive command applied to the LRA 140
  • FIG. 23B shows a waveform of acceleration of vibration of the LRA 140.
  • the drive command ends at a point P2 slightly deviated from the amplitude peak.
  • In this case, the acceleration of the superimposed low-frequency vibration is smaller than its maximum value, and the acceleration peak of the high-frequency vibration is smaller than the value shown in FIG. 21, but the same effect as in the example of FIG. 21 can be obtained.
  • the waveform representing the driving command of the LRA 140 generated using the first method and the second method is held in the waveform database 240 in the memory 220 as waveform data.
  • In the third method, the drive command is a signal that satisfies the specific condition described in the first method and that ends at a point where the amplitude reaches a peak, as described in the second method.
  • FIG. 24 is a diagram showing an example of the LRA drive command of the third method.
  • FIG. 24A shows the waveform of the drive command G of the third method
  • FIG. 24B shows the acceleration of vibration of the touch panel 120 when the drive command G of the third method is applied to the LRA 140.
  • the drive command G of the third method terminates at a point P3 where the amplitude becomes the maximum value.
  • The drive command G is a cosine wave whose phase is shifted by π/2 from a sine wave, so that the drive command G is an m-cycle signal that ends at an amplitude peak.
  • Because the drive command G is a cosine wave, it can be a signal that both satisfies the specific condition and has a peak at its end.
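  • A sketch of the third-method drive command G, an m-cycle cosine at frequency f1 that therefore starts and ends at an amplitude peak (point P3); the concrete frequency, cycle count, and sample rate are placeholders:

```python
import numpy as np

def drive_command_g(f1, m, sample_rate=48_000, amplitude=1.0):
    """Third-method drive command G: an m-cycle cosine at frequency f1.

    Because it is a cosine, the command begins and ends at an amplitude peak,
    combining the whole-cycle structure of the first method with the
    'ends at a peak' requirement of the second method.
    """
    n_samples = int(round(m / f1 * sample_rate)) + 1   # include the end point P3
    t = np.arange(n_samples) / sample_rate
    return amplitude * np.cos(2 * np.pi * f1 * t)

g = drive_command_g(f1=337.5, m=3)
print(g[0], g[-1])   # both close to 1.0: the command begins and ends at a peak
```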
  • the resonance frequency of the touch panel 120 is the resonance frequency in a state where the four sides of the touch panel 120 are fixed to the housing 110.
  • the resonance frequency of the touch panel 120 is the resonance frequency of the touch panel 120 in a state where the touch panel 120 is incorporated in the housing 110, for example, when the LRA 140 is disposed inside the housing 110.
  • the waveform data of the driving device 200 of the third method includes the frequency f1, the amplitude, the phase, the period (value of m), etc. of the driving command G.
  • the waveform data of the third method may include an expression representing the waveform of the drive command G.
  • In step S603 of FIG. 6, the driving device 200 of the third method reads the waveform data representing the drive command G via the API 250 and outputs a drive command corresponding to that waveform data to the driver IC 260.
  • the driver IC 260 D / A converts and amplifies the waveform data and outputs it to the LRA 140.
  • FIG. 25 is a diagram showing an input waveform for the LRA in the third method.
  • the waveform shown in FIG. 25 shows the force applied to the LRA 140 by applying the drive command G to the LRA 140.
  • the waveform shown in FIG. 25 is a cosine wave G1 in which the phase of the sine wave F is shifted by ⁇ / 2 when the frequency of the drive command G is f1.
  • When the cosine wave G1 is applied to the LRA 140, the LRA 140 also vibrates at its natural frequency f0 (i.e., the resonance frequency). That is, the LRA 140 generates a composite wave obtained by combining the cosine wave G1 of frequency f1 and a cosine wave of the natural frequency f0 of the LRA 140, and the LRA 140 is displaced according to this composite wave.
  • FIG. 26 is a diagram showing the displacement of the LRA by the third method.
  • FIG. 26A is a first diagram for explaining the displacement
  • FIG. 26B is a second diagram for explaining the displacement.
  • a waveform y11 indicated by a dotted line indicates a forced vibration component of vibration displacement generated when a cosine wave G1 is applied to the LRA 140
  • a waveform y12 indicated by a solid line indicates a free vibration component.
  • the response displacement y13 when the cosine wave G1 is applied to the LRA 140 is a composite wave of the waveform y11 and the waveform y12.
  • FIG. 26B is a diagram illustrating an example of the displacement of the composite wave y13 of the waveform y11 and the waveform y12. It can be seen that the synthesized wave y13 becomes 0 at the timing T1 when the cosine wave G1 becomes 0.
  • FIG. 27 is a diagram showing an example of the vibration speed and vibration acceleration of the LRA in the third method.
  • FIG. 27A shows the waveform of the composite wave y13
  • FIG. 27B shows the velocity waveform y13 ′ obtained by differentiating the displacement of the composite wave y13
  • FIG. 27C shows the acceleration waveform y13'' obtained by differentiating the displacement of the composite wave y13 twice.
  • The velocity waveform y13' and the acceleration waveform y13'' become 0 at the timing T1 when the composite wave y13 becomes 0. That is, the vibration of the LRA 140 stops at the timing T1.
  • The acceleration waveform y13'' stops within about three cycles, in less than 0.01 sec. Therefore, in the third method as well, the vibration acceleration falls to 0.02 G or less within 0.01 sec, and a tactile sensation as if the metal-dome button 2 had been clicked can be expressed.
  • the excitation is stopped at the point where the amplitude of the cosine wave G1 reaches a peak, but the present invention is not limited to this.
  • the end of the drive command may be any point that can generate a steep peak representing a click feeling in a waveform indicating acceleration of vibration of the touch panel 120, for example.
  • The end of the drive command may be at any point other than 0, the center of the amplitude, and the closer the end point is to the amplitude peak, the better.
  • the LRA 140 is attached to the surface of the touch panel 120 on the housing side, but is not limited thereto.
  • the LRA 140 may be disposed in the vicinity of the substrate 150 disposed in the housing 110.
  • FIG. 28 is a diagram illustrating an example of an electronic device in which the LRA 140 is provided in the housing.
  • the LRA 140 is disposed in the vicinity of the substrate 150 provided in the housing 110.
  • The third method can also be applied to the electronic device 100A. When the third method is applied to the electronic device 100A, it is likewise possible to express a tactile sensation as if the metal-dome button 2 had been clicked, as with the electronic device 300.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The invention concerns an electronic device with which a specific graphical user interface (GUI) button can be identified by touch, and a drive control program. The electronic device comprises: a touch panel; an actuator that vibrates the touch panel; a load detection unit that detects the load of a user's operation input on the touch panel; and a drive control unit that drives the actuator with a first drive command that causes the touch panel to vibrate according to a first vibration pattern, if the operation input on the touch panel is made within a prescribed area of the touch panel and if the load detected by the load detection unit is greater than or equal to a first prescribed value and less than a second prescribed value that is greater than the first prescribed value.
PCT/JP2012/064945 2012-06-11 2012-06-11 Electronic device and drive control program WO2013186844A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2014520830A JP5822022B2 (ja) 2012-06-11 2012-06-11 Electronic device and drive control program
PCT/JP2012/064945 WO2013186844A1 (fr) 2012-06-11 2012-06-11 Electronic device and drive control program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2012/064945 WO2013186844A1 (fr) 2012-06-11 2012-06-11 Electronic device and drive control program

Publications (1)

Publication Number Publication Date
WO2013186844A1 true WO2013186844A1 (fr) 2013-12-19

Family

ID=49757708

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/064945 WO2013186844A1 (fr) 2012-06-11 2012-06-11 Electronic device and drive control program

Country Status (2)

Country Link
JP (1) JP5822022B2 (fr)
WO (1) WO2013186844A1 (fr)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017513165A (ja) * 2014-03-21 2017-05-25 Immersion Corporation Systems and methods for force-based object manipulation and haptic sensations
JP2021515941A (ja) * 2018-03-08 2021-06-24 Sensel, Inc. Human-computer interface system
US11360563B2 (en) 2016-03-31 2022-06-14 Sensel, Inc. System and method for detecting and responding to touch inputs with haptic feedback
US11409388B2 (en) 2016-03-31 2022-08-09 Sensel, Inc. System and method for a touch sensor interfacing a computer system and a user
US11422631B2 (en) 2016-03-31 2022-08-23 Sensel, Inc. Human-computer interface system
US11460926B2 (en) 2016-03-31 2022-10-04 Sensel, Inc. Human-computer interface system
US11460924B2 (en) 2016-03-31 2022-10-04 Sensel, Inc. System and method for detecting and characterizing inputs on a touch sensor surface
WO2023037968A1 (fr) * 2021-09-09 2023-03-16 京セラ株式会社 Electric device and vibration control method
US11880506B2 (en) 2020-10-06 2024-01-23 Sensel, Inc. Haptic keyboard system

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11212725A (ja) * 1998-01-26 1999-08-06 Idec Izumi Corp Information display device and operation input device
JP2012020284A (ja) * 2004-11-30 2012-02-02 Immersion Corp System and method for controlling a resonant device for generating vibrotactile haptic effects
WO2010103693A1 (fr) * 2009-03-09 2010-09-16 シコー株式会社 Vibration motor and electronic component
JP2010287232A (ja) * 2009-06-09 2010-12-24 Immersion Corp Method and apparatus for generating haptic effects using actuators

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017513165A (ja) * 2014-03-21 2017-05-25 Immersion Corporation Systems and methods for force-based object manipulation and haptic sensations
US11360563B2 (en) 2016-03-31 2022-06-14 Sensel, Inc. System and method for detecting and responding to touch inputs with haptic feedback
US11409388B2 (en) 2016-03-31 2022-08-09 Sensel, Inc. System and method for a touch sensor interfacing a computer system and a user
US11422631B2 (en) 2016-03-31 2022-08-23 Sensel, Inc. Human-computer interface system
US11460926B2 (en) 2016-03-31 2022-10-04 Sensel, Inc. Human-computer interface system
US11460924B2 (en) 2016-03-31 2022-10-04 Sensel, Inc. System and method for detecting and characterizing inputs on a touch sensor surface
US11592903B2 (en) 2016-03-31 2023-02-28 Sensel, Inc. System and method for detecting and responding to touch inputs with haptic feedback
JP2021515941A (ja) * 2018-03-08 2021-06-24 Sensel, Inc. Human-computer interface system
JP7210603B2 (ja) 2018-03-08 2023-01-23 Sensel, Inc. System and method for detecting and responding to touch inputs using haptic feedback
US11880506B2 (en) 2020-10-06 2024-01-23 Sensel, Inc. Haptic keyboard system
WO2023037968A1 (fr) * 2021-09-09 2023-03-16 京セラ株式会社 Electric device and vibration control method

Also Published As

Publication number Publication date
JPWO2013186844A1 (ja) 2016-02-01
JP5822022B2 (ja) 2015-11-24

Similar Documents

Publication Publication Date Title
JP5822022B2 (ja) Electronic device and drive control program
JP5831635B2 (ja) Drive device, electronic device, and drive control program
JP5962757B2 (ja) Program and electronic device
JP5822023B2 (ja) Electronic device, vibration generation program, and vibration pattern utilization system
US20170108931A1 Multiple mode haptic feedback system
JP6142928B2 (ja) Drive device, electronic device, drive control program, and drive signal generation method
JP6032364B2 (ja) Drive device, electronic device, and drive control program
JP2019114270A (ja) System and method for performing haptic conversion
JP6032362B2 (ja) Drive device, electronic device, and drive control program
JP5935885B2 (ja) Drive device, electronic device, and drive control program
KR20140109292A (ko) Haptic device with a linear resonant actuator
WO2012105273A1 (fr) Electronic apparatus
KR100980855B1 (ko) Portable device providing haptic feedback with heterogeneous actuators and method for providing the same
KR100957005B1 (ko) Haptic feedback providing module using heterogeneous actuators, portable device having the same, and method for providing haptic feedback
KR20120115159A (ko) Tactile feedback method and apparatus
JP5907260B2 (ja) Drive device, electronic device, and drive control program
JP5910741B2 (ja) Program and electronic device
WO2015136835A1 (fr) Electronic device
JP5962756B2 (ja) Electronic device and vibration providing method
WO2012102026A1 (fr) Input device
JP5907261B2 (ja) Drive device, electronic device, and drive control program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12878915

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2014520830

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12878915

Country of ref document: EP

Kind code of ref document: A1