JP2017049699A - Input device, integrated input system, method for controlling input device, and program - Google Patents

Info

Publication number
JP2017049699A
Authority
JP
Japan
Prior art keywords
vibration
unit
operation
user
input device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2015171161A
Other languages
Japanese (ja)
Other versions
JP6585431B2 (en)
Inventor
Osamu Kukimoto (修 久木元)
Takeyoshi Iino (将嘉 伊井野)
Hiroshi Matsunami (寛 松涛)
Hitoshi Tsuda (斉 津田)
Teru Sawada (輝 沢田)
Original Assignee
Fujitsu Ten Ltd (富士通テン株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujitsu Ten Ltd (富士通テン株式会社)
Priority to JP2015171161A
Priority claimed from US15/240,238 (US20170060245A1)
Publication of JP2017049699A
Application granted
Publication of JP6585431B2
Application status: Active
Anticipated expiration

Abstract

PROBLEM TO BE SOLVED: To provide an input device, an integrated input system, a method for controlling the input device, and a program capable of operating various objects to be controlled with high operability. SOLUTION: The input device includes one operation part 11, a selection part 12c, a detection part 12e, at least one vibration part 13, a setting part 12d, and a vibration control part 12f. The selection part 12c selects an object to be controlled via the operation part 11 on the basis of the user's behavior. The detection part 12e detects a predetermined contact operation of the user on the operation part 11. The vibration part 13 vibrates the operation part 11. The setting part 12d sets, according to the object to be controlled selected by the selection part 12c, the vibration state of the vibration part 13 corresponding to the contact operation when that operation is detected by the detection part 12e. When the contact operation is detected by the detection part 12e, the vibration control part 12f controls the vibration part 13 so as to obtain the vibration state set by the setting part 12d. SELECTED DRAWING: Figure 2

Description

  Embodiments disclosed herein relate to an input device, an integrated input system, an input device control method, and a program.

  Conventionally, there is known an input device that notifies a user that an input has been accepted by providing a tactile sensation. In such an input device, for example, vibration is generated according to the pressing force applied by the user, thereby informing the user that the input has been accepted (see, for example, Patent Document 1).

JP 2013-235614 A

  However, the conventional input device merely generates vibration according to the user's pressing force at the contact position; tactile feedback for the case where the user moves the contact position across the operation surface, for example, has not been considered. Thus, in the conventional input device, there is room for further improvement in user operability.

  In addition, in an in-vehicle system in which various devices are mounted, for example, where the user needs to control these devices and their respective modes, there is a need to operate these various control targets with higher operability.

  One aspect of the embodiments has been made in view of the above, and an object thereof is to provide an input device, an integrated input system, an input device control method, and a program capable of operating various control targets with high operability.

  An input device according to an aspect of an embodiment includes one operation surface, a selection unit, a detection unit, at least one vibration element, a setting unit, and a vibration control unit. The selection unit selects a control target that uses the operation surface based on a user's behavior. The detection unit detects a predetermined contact operation of the user on the operation surface. The vibration element vibrates the operation surface. The setting unit sets, according to the control target selected by the selection unit, the vibration state of the vibration element corresponding to the contact operation when the contact operation is detected by the detection unit. The vibration control unit controls the vibration element so as to be in the vibration state set by the setting unit when the contact operation is detected by the detection unit.

  According to one aspect of the embodiment, various control objects can be operated with high operability.

FIG. 1A is a diagram (part 1) illustrating an overview of the control method of the input device according to the embodiment. FIG. 1B is a diagram (part 2) illustrating an overview of the control method of the input device according to the embodiment. FIG. 1C is a diagram (part 3) illustrating an overview of the control method of the input device according to the embodiment. FIG. 2 is a block diagram of the integrated input system according to the embodiment. FIG. 3A is a diagram (part 1) illustrating a specific example of tactile feedback. FIG. 3B is a diagram (part 2) illustrating a specific example of tactile feedback. FIG. 3C is a diagram (part 3) illustrating a specific example of tactile feedback. FIG. 3D is a diagram (part 4) illustrating a specific example of tactile feedback. FIG. 4A is a diagram (part 1) illustrating a specific example of the gesture operation. FIG. 4B is a diagram (part 2) illustrating a specific example of the gesture operation. FIG. 4C is a diagram (part 3) illustrating a specific example of the gesture operation. FIG. 5A is a diagram illustrating a specific example of combination information. FIG. 5B is a diagram illustrating a specific example of vibration state information. FIG. 6 is a flowchart illustrating a processing procedure executed by the input device according to the embodiment. FIG. 7 is a hardware configuration diagram illustrating an example of a computer that realizes the functions of the integrated input system according to the embodiment.

  Hereinafter, embodiments of the input device, the integrated input system, the input device control method, and the program disclosed in the present application will be described in detail with reference to the accompanying drawings. The present invention is not limited by the embodiment described below.

  In the following, an overview of the control method of the input device 10 according to the present embodiment is first described with reference to FIGS. 1A to 1C, and then the input device 10 and the integrated input system 1 including it are described with reference to FIGS. 2 to 7. In the present embodiment, an example in which the integrated input system 1 is configured as an in-vehicle system is given.

  First, an overview of the control method of the input device 10 according to the present embodiment is described with reference to FIGS. 1A to 1C. FIGS. 1A to 1C are diagrams (part 1) to (part 3) illustrating an overview of the control method of the input device 10 according to the embodiment.

  As shown in FIG. 1A, the integrated input system 1 includes an input device 10. The input device 10 has an operation surface P. The operation surface P is configured by using, for example, a touch pad having a capacitive information input function, and accepts from the user D contact operations for controlling various devices 60 (described later with reference to FIG. 2) that are to be controlled by the user D.

  The input device 10 is provided such that the operation surface P is disposed at a position that the user D can reach while driving, for example, in the vicinity of the shift lever at the driver's seat.

  More specifically, the input device 10 includes at least one vibration element 13a that vibrates the operation surface P, as shown in FIG. 1B. Here, an example in which two vibration elements 13a are provided is illustrated.

  The vibration element 13a is, for example, a piezoelectric element, and can vibrate the operation surface P at high frequency. For example, when the vibration element 13a is vibrated in a state where the finger U1 of the user D presses the operation surface P, the frictional force between the finger U1 and the operation surface P can be changed.

  When the finger U1 is moved in such a state, a tactile sensation according to the changed frictional force can be fed back to the finger U1. Also, by changing the vibration state of the vibration element 13a, the magnitude of the frictional force between the finger U1 and the operation surface P can be changed, and the tactile sensation fed back to the finger U1 can be changed.
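
  As a rough illustration of this friction-modulation principle, the following is a minimal sketch in Python; the linear friction model and all numeric values are assumptions for illustration, not values from the embodiment. The idea it captures is simply that stronger ultrasonic vibration makes the surface feel more slippery, so a target friction level can be mapped to a drive amplitude.

```python
def drive_amplitude_for_friction(target_friction: float,
                                 max_amplitude: float = 1.0) -> float:
    """Map a target friction level (0.0 = very slippery, 1.0 = full friction)
    to a drive amplitude for an ultrasonic vibration element.

    Assumes a simple linear model: stronger ultrasonic vibration lowers the
    effective friction between finger and operation surface.
    """
    target_friction = min(max(target_friction, 0.0), 1.0)
    return (1.0 - target_friction) * max_amplitude


if __name__ == "__main__":
    # Slippery section: high-amplitude ultrasonic drive.
    print(drive_amplitude_for_friction(0.2))   # -> 0.8
    # "Button"-like section D1 with high friction: little or no drive.
    print(drive_amplitude_for_friction(0.9))   # -> roughly 0.1
```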

  For example, as shown in FIG. 1B, when the finger U1 is slid left and right along the X-axis direction, the frictional force is changed so as to be larger in the section D1 than in the other sections. Thus, a tactile sensation as if the button B1 were present on the operation surface P can be fed back to the user D. Note that such a tactile aspect is merely an example, and other specific examples will be described later with reference to FIGS. 3A to 3D.

  As shown in FIG. 1A, the integrated input system 1 includes a microphone 20 and an imaging unit 30. The microphone 20 and the imaging unit 30 are disposed, for example, at the top of the steering column. The microphone 20 collects and inputs the voice uttered by the user D. The imaging unit 30 captures, for example, a face image of the user D seated in the driver's seat.

  Further, the integrated input system 1 includes, for example, a center display 41 and a HUD (head-up display) 42 as a display unit 40 (described later in FIG. 2).

  The center display 41 is used as a display unit of an AV-integrated navigation device mounted as one of the various devices 60, for example, and outputs various information for each selected mode such as a navigation mode and an audio mode. The HUD 42 outputs various information related to the driving situation such as the vehicle speed and the engine speed within the field of view of the user D who is driving.

  The integrated input system 1 includes an air conditioner 61 as one of the other various devices 60. The integrated input system 1 includes a speaker 70.

  In such a system in which various devices are mounted, the various devices 60 to be controlled by the user D, their modes, and the operations for those modes tend to be diverse. For this reason, from the viewpoint of improving convenience for the user D and ensuring safety, it has been desired that these various control targets can be operated with high operability.

  Therefore, in the integrated input system 1 according to the present embodiment, the various devices 60, their modes, and the like are basically operated in a centralized manner by contact operations on a single operation surface P. In addition, at this time, various tactile sensations according to the various devices 60 to be controlled by the user D and their respective modes are fed back from the operation surface P.

  In addition, for selecting a different control target, methods other than contact operation on the operation surface P, such as voice input operation, can be used in combination.

  Thereby, for example, it becomes possible to control the control target only by a blind touch operation, without the user D visually recognizing it. Specifically, as shown in FIG. 1C, the user D utters, for example, the name of one of the various devices 60 to be controlled or one of their modes. In the example shown in FIG. 1C, it is assumed that the user D utters “audio!”.

  In this case, in the present embodiment, the input device 10 receives the utterance of the user D via the microphone 20 and selects, as the control target to be operated via the operation surface P, the audio mode of the car navigation device (see step S1 in the figure).

  Then, the input device 10 sets the vibration state of the vibration element 13a according to the audio mode that is the selected control target (see step S2 in the figure). Thereby, the vibration state of the vibration element 13a unique to the audio mode, i.e., the state used when a contact operation on the operation surface P is performed with the finger U1 of the user D in the audio mode, is set in the input device 10.

  In this set state, when the input device 10 detects a contact operation on the operation surface P by the user D, the input device 10 controls the vibration element 13a so as to be in the vibration state set in step S2, whereby a tactile sensation corresponding to the audio mode being the control target is fed back (see step S3 in the figure).

  Thereby, it becomes possible to operate various control targets through a single operation surface P. Here, the contact operation consists of a small number of simple gesture operations, and the combination of vibration states corresponding to these gesture operations is set so as to differ depending on the control target.

  That is, in this embodiment, each control target can be operated by a set of simple gesture operations that can be used in common among different control targets; however, when the control targets differ, a different tactile sensation is fed back for each gesture operation.

  Thereby, the user D can operate various control targets with the same blind touch operations merely by learning several simple gesture operations. That is, various control targets can be operated with high operability.

  A specific example of the combination of gesture operations for each control target will be described later with reference to FIGS. 5A and 5B. The method of selecting the control target need only be based on the behavior of the user D and is not limited to the above-described selection by voice input. This point will be supplemented in the description given later.

  As described above, in this embodiment, the control target is selected based on the behavior of the user D, the vibration state corresponding to the selected control target is set, and, when a contact operation on the operation surface P by the user D is detected, a tactile sensation according to the control target is fed back by controlling the vibration element 13a so as to be in the set vibration state. Therefore, according to this embodiment, various control targets can be operated with high operability.

  In addition, although the case where the operation surface P of the input device 10 is, for example, a touch pad has been described here, the present invention is not limited thereto. For example, a touch panel integrated with the center display 41 may be used. Hereinafter, the integrated input system 1 including the input device 10 controlled by the above-described control method will be described more specifically.

  FIG. 2 is a block diagram of the integrated input system 1 according to the embodiment. In FIG. 2, only components necessary for explaining the features of the present embodiment are represented by functional blocks, and descriptions of general components are omitted.

  In other words, each component illustrated in FIG. 2 is functionally conceptual and does not necessarily need to be physically configured as illustrated. For example, the specific form of distribution and integration of the functional blocks is not limited to that shown in the figure, and all or part of them can be functionally or physically distributed or integrated in arbitrary units according to various loads, usage conditions, and the like.

  As shown in FIG. 2, the integrated input system 1 includes an input device 10, a microphone 20, an imaging unit 30, a display unit 40, a display control unit 50, various devices 60, a speaker 70, and a storage unit 80.

  The microphone 20 collects the voice spoken by the user D and inputs it to the input device 10. The imaging unit 30 includes, for example, an infrared LED and an infrared camera. The imaging unit 30 captures, for example, a face image of the user D with the infrared camera while illuminating the user D with the infrared LED, and inputs the image to the input device 10.

  The display unit 40 is, for example, the center display 41 or the HUD 42 described above, and presents an image as visual information output from the display control unit 50 to the user D.

  For example, the display control unit 50 generates an image to be displayed on the display unit 40 based on the operation content received by the input device 10 from the user D, and outputs the image to the display unit 40. In addition, the display control unit 50 controls the display unit 40 so as to present the image to the user D.

  The various devices 60 are, for example, the above-described navigation device, the air conditioner 61, and the like, and are objects to be controlled by the user D via the input device 10. The speaker 70 presents audio as auditory information to the user D based on, for example, the operation content received by the input device 10 from the user D.

  The storage unit 80 is a storage device such as a hard disk drive, a nonvolatile memory, or a register, and stores combination information 80a and vibration state information 80b.

  As described above, the input device 10 is an information input device including, for example, a touch pad or a touch panel. The input device 10 receives an input operation from the user D and outputs a signal corresponding to the operation content to the display control unit 50, the various devices 60, and the speaker 70.

  The input device 10 includes an operation unit 11, a control unit 12, and a vibration unit 13. First, the operation unit 11 and the vibration unit 13 will be described. The operation unit 11 is, for example, a flat plate sensor such as the touch pad or the touch panel described above, and includes an operation surface P (see, for example, FIG. 1A) that receives an input operation by the user D. When the user D performs a contact operation on the operation surface P, the operation unit 11 outputs a sensor value corresponding to the contact operation of the user D to the control unit 12.

  The vibration unit 13 includes at least one vibration element 13a (see, for example, FIG. 1B (a)). The vibration element 13a is a piezoelectric actuator such as a piezoelectric element (piezo element), for example, and vibrates the operation unit 11 by expanding and contracting according to a voltage signal given from the control unit 12. The vibration element 13a is disposed so as to be in contact with the operation unit 11 at a position that the user D cannot visually recognize, such as an end of the operation unit 11.

  In the example shown in (a) of FIG. 1B, the vibration elements 13a are disposed in regions on the left and right outer sides of the operation surface P, on the surface on the opposite side of the operation surface P. However, such an arrangement is merely an example, and the present invention is not limited to this.

  For example, the operation surface P may be vibrated with only one vibration element 13a. As described above, the number and arrangement of the vibration elements 13a are arbitrary, but a number and arrangement that vibrate the entire operation surface P uniformly are preferable. The vibration element 13a is not limited to a piezoelectric element, and may be any element that can vibrate the operation surface P in, for example, an ultrasonic frequency band.
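
  The drive signal for such a piezoelectric element is typically a periodic voltage waveform. The following is a minimal sketch, under assumed values for frequency, amplitude, and sample rate, of how such a waveform could be synthesized; an actual vibration unit would output it through a DAC and amplifier stage, which are not modeled here.

```python
import math

def sine_drive_signal(freq_hz: float, amplitude: float,
                      duration_s: float, sample_rate_hz: float = 192_000):
    """Generate sample values of a sinusoidal drive signal for a piezo element.

    freq_hz, amplitude, duration_s and sample_rate_hz are illustrative; a real
    driver would feed these samples to a DAC/amplifier, which is not shown.
    """
    n = int(duration_s * sample_rate_hz)
    return [amplitude * math.sin(2.0 * math.pi * freq_hz * i / sample_rate_hz)
            for i in range(n)]

# Example: a 5 ms burst in an assumed ultrasonic band (30 kHz) at half amplitude.
burst = sine_drive_signal(freq_hz=30_000, amplitude=0.5, duration_s=0.005)
print(len(burst))  # number of generated samples
```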

  Next, the control unit 12 will be described. As shown in FIG. 2, the control unit 12 includes a voice reception unit 12a, a line-of-sight detection unit 12b, a selection unit 12c, a setting unit 12d, a detection unit 12e, a vibration control unit 12f, and an operation processing unit 12g.

  The control unit 12 controls each unit of the input device 10. The voice reception unit 12a receives the voice input from the microphone 20, analyzes the voice content, and passes the analysis result to the selection unit 12c.

  The line-of-sight detection unit 12b detects the direction of the line of sight of the user D based on, for example, the positional relationship of the reflection image of the infrared illumination (corneal reflection) formed on the eyeball in the face image captured by the imaging unit 30, and passes the detection result to the selection unit 12c.
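
  As a highly simplified illustration of the idea of estimating where the user is looking from the corneal reflection, the following toy sketch classifies the gaze target from the horizontal offset between the pupil center and the glint. The pixel threshold, the left/right mapping to the center display 41 and the HUD 42, and the absence of calibration are all assumptions, not the actual detection method.

```python
def gaze_target(pupil_xy, glint_xy, threshold_px: float = 12.0) -> str:
    """Very rough gaze classification from the offset between the pupil
    center and the corneal reflection (glint) in the camera image.

    Threshold and display mapping are illustrative assumptions; a real
    line-of-sight detector would be calibrated per user and camera geometry.
    """
    dx = pupil_xy[0] - glint_xy[0]
    if dx > threshold_px:
        return "center_display"   # e.g. gazing toward the center display 41
    if dx < -threshold_px:
        return "hud"              # e.g. gazing toward the HUD 42
    return "road"

print(gaze_target((330.0, 240.0), (310.0, 241.0)))  # -> "center_display"
```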

  When receiving the analysis result of the voice reception unit 12a, the selection unit 12c selects a control target desired by the user D based on the analysis result. Further, when receiving the detection result of the line-of-sight detection unit 12b, the selection unit 12c selects a control target desired by the user D based on the detection result.

  That is, the selection unit 12c can select a control target from the direction in which the user D is gazing. The selection unit 12c notifies the selected control target to the setting unit 12d.

  The setting unit 12d sets the vibration state of the vibration element 13a corresponding to the contact operation when the contact operation of the user D is detected according to the control target selected by the selection unit 12c.

  Specifically, based on the combination information 80a, which is defined so that the combination of vibration states of the vibration element 13a corresponding to the gesture operations described above differs depending on the control target, the setting unit 12d sets the vibration state of the vibration element 13a that is unique to the control target selected by the selection unit 12c for the case where each gesture operation is performed toward that control target. An example of the combination information 80a will be described later with reference to FIG. 5A.

  The setting unit 12d stores the set content in the storage unit 80 as vibration state information 80b. For example, the vibration state information 80b is information including a control value of the vibration element 13a. An example of the vibration state information 80b will be described later with reference to FIG. 5B.
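
  A minimal sketch of this setting step follows, using a hypothetical in-memory table in place of the combination information 80a and a plain dictionary in place of the vibration state information 80b; the device names, mode names, and state numbers are illustrative only and are not the actual data of FIG. 5A or FIG. 5B.

```python
# Hypothetical combination information: (device, mode) -> {gesture: vibration state id}
COMBINATION_INFO = {
    ("navigation", "audio"): {"up_down": 6, "left_right": 7, "circle": 8},
    ("navigation", "navi"):  {"up_down": 1, "left_right": 2, "circle": 3},
    ("air_conditioner", "temperature"): {"up_down": 11},
}

def set_vibration_states(selected_target, vibration_state_info: dict) -> None:
    """Setting-unit sketch: copy the per-gesture vibration states defined for
    the selected control target into the currently set vibration state info."""
    states = COMBINATION_INFO.get(selected_target, {})
    vibration_state_info.clear()
    vibration_state_info.update(states)

info = {}
set_vibration_states(("navigation", "audio"), info)
print(info)  # {'up_down': 6, 'left_right': 7, 'circle': 8}
```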

  The detection unit 12e detects a predetermined gesture operation of the user D with respect to the operation surface P based on the sensor value output from the operation unit 11, and passes the detection result to the vibration control unit 12f and the operation processing unit 12g.

  The vibration control unit 12f controls the vibration element 13a of the vibration unit 13 based on the vibration state information 80b so that the vibration state set by the setting unit 12d is obtained when a gesture operation is detected by the detection unit 12e. A specific example of tactile feedback produced by the vibration control unit 12f controlling the vibration element 13a will be described later with reference to FIGS. 3A to 3D.
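
  For illustration, a toy gesture classifier is sketched below; the thresholds and the rules distinguishing up/down, left/right, and circle traces are assumptions, and an actual detection unit working from touch-pad sensor values would be considerably more robust.

```python
import math

def classify_gesture(points):
    """Classify a finger trace (a list of (x, y) samples) into one of a few
    simple gestures. All thresholds are illustrative assumptions."""
    if len(points) < 2:
        return "none"
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    dx = max(xs) - min(xs)
    dy = max(ys) - min(ys)
    start, end = points[0], points[-1]
    # A trace that returns near its start and spans both axes looks like a circle.
    closed = math.hypot(end[0] - start[0], end[1] - start[1]) < 0.2 * max(dx, dy, 1e-6)
    if closed and dx > 20 and dy > 20:
        return "circle"
    if dx > 2 * dy:
        return "left_right"
    if dy > 2 * dx:
        return "up_down"
    return "unknown"

# A mostly vertical swipe is classified as the "up/down" gesture.
print(classify_gesture([(100, 50), (101, 90), (99, 140), (100, 190)]))  # up_down
```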

  The operation processing unit 12g causes the display control unit 50 to visually feed back the operation content corresponding to the gesture operation detected by the detection unit 12e to the display unit 40. In addition, the operation processing unit 12g performs a process of reflecting the operation content corresponding to the gesture operation on the various devices 60.

  In addition, the operation processing unit 12g causes the speaker 70 to output, for example, a guidance voice corresponding to the gesture operation. That is, when a tactile sensation is fed back to the user D from the operation surface P, using the guidance voice from the speaker 70 in this manner can assist the user D, for example, in a blind touch operation and enhance operability.

  Next, a specific example of tactile feedback for each control target will be described with reference to FIGS. 3A to 3D. FIGS. 3A to 3D are diagrams (part 1) to (part 4) illustrating specific examples of tactile feedback.

  First, with reference to FIGS. 3A and 3B, a case where the control target is the volume UP/DOWN function in the audio mode will be described. Here, it is assumed that the user D has made an utterance such as “volume” and that this function has been selected as the control target.

  In this case, as shown in FIG. 3A, for example, the vibration state of the vibration element 13a is set on the operation surface P of the input device 10 so as to enable tactile feedback as if it were a volume adjustment dial.

  Specifically, in such a case, for example, a region R1 that draws a circular locus on the operation surface P is set. The other area is set as the area R2. The region R1 is set as a region where the frictional force is small, and the region R2 is set as a region where the frictional force is relatively large.

  Such differences in frictional force are realized by the vibration control unit 12f controlling the vibration state of the vibration element 13a. That is, when the contact position of the finger U1 is within the region R1, the vibration control unit 12f generates a voltage signal that causes the vibration element 13a to vibrate at a high frequency (for example, in an ultrasonic frequency band), and vibrates the vibration element 13a with that voltage signal.

  On the other hand, when the contact position of the finger U1 is in the region R2, the vibration control unit 12f generates a voltage signal for vibration in a lower frequency band than in the region R1, and vibrates the vibration element 13a with that voltage signal.

  Thereby, in the region R1, a tactile sensation in which the finger U1 slips easily can be fed back to the user D (see arrow 301 in the figure). Further, in the region R2 outside the region R1, a tactile sensation in which the finger U1 slips with difficulty can be fed back to the user D (see arrow 302 in the figure).
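
  A minimal sketch of this region-dependent control follows: the drive frequency is chosen according to whether the contact position falls inside an assumed annular region corresponding to R1. The center coordinates, radii, and frequencies are illustrative values, not those of the embodiment.

```python
import math

# Illustrative geometry and frequencies (not values from the patent).
DIAL_CENTER = (160.0, 120.0)
R1_INNER, R1_OUTER = 60.0, 80.0     # annular region R1 on the operation surface
SLIPPERY_HZ = 30_000                 # ultrasonic drive: low friction in R1
STICKY_HZ = 200                      # low-frequency drive: higher friction in R2

def drive_frequency(contact_xy) -> int:
    """Return the vibration-element drive frequency for a contact position:
    high (ultrasonic) inside the circular track R1, low elsewhere (R2)."""
    r = math.dist(contact_xy, DIAL_CENTER)
    return SLIPPERY_HZ if R1_INNER <= r <= R1_OUTER else STICKY_HZ

print(drive_frequency((230.0, 120.0)))  # on the track -> 30000
print(drive_frequency((165.0, 125.0)))  # near the center -> 200
```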

  Accordingly, the user D is guided along the region R1 by the slippery tactile feedback and draws with the finger U1 a circular locus on the operation surface P close to the image of actually adjusting a dial, whereby an operation input for the volume UP/DOWN function can be performed. At this time, for example, an image of a volume adjustment dial may be visually fed back on the display unit 40 as shown in FIG. 3A.

  This is effective, for example, when the line-of-sight detection unit 12b makes it clear that the user D is continuously gazing at the center display 41. That is, in this case, since it can be assumed that the user D is not engaged in a driving operation, the operation of the control target can be performed reliably by using visual feedback together with the tactile sensation.

  On the other hand, for example, when the line-of-sight detection unit 12b makes it clear that the user D is gazing at the HUD 42, it is assumed that the user D is engaged in a driving operation. In this case, it is preferable that visual feedback is limited and only a blind touch operation on the operation surface P is accepted.

  As an example of tactile feedback that supports such a blind touch operation, for example, as shown in FIG. 3B, when a circular locus is drawn with the finger U1 (see arrow 303 in the figure), control may be performed to change the frictional force so as to give a “click”-like tactile sensation of crossing a boundary at each position corresponding to a scale mark of the volume adjustment dial.

  At this time, for example, a “click!” sound may be output via the speaker 70, or such a sound may be produced by the vibration control unit 12f vibrating the vibration element 13a of the vibration unit 13 in the audible range.
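
  A sketch of how such a “click” detent could be detected is shown below, assuming scale marks every 30 degrees around the dial center; the geometry and spacing are illustrative assumptions, and the actual rendering of the click (a brief friction change or an audible pulse) is only hinted at in a comment.

```python
import math

DETENT_DEG = 30.0  # assumed spacing of dial scale marks

def detent_crossed(prev_xy, cur_xy, center=(160.0, 120.0)) -> bool:
    """Return True when the finger's angle around the dial center crosses
    a scale position, i.e. when a 'click' detent should be rendered."""
    def angle(p):
        return math.degrees(math.atan2(p[1] - center[1], p[0] - center[0])) % 360.0
    prev_idx = int(angle(prev_xy) // DETENT_DEG)
    cur_idx = int(angle(cur_xy) // DETENT_DEG)
    return prev_idx != cur_idx

if detent_crossed((230.0, 120.0), (225.0, 160.0)):
    # A real system would briefly raise friction and/or play a click sound here.
    print("click")
```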

  Next, with reference to FIGS. 3C and 3D, a case where the control target is the function of adjusting the set temperature of the air conditioner 61 will be described. Here, it is assumed that the user D has made an utterance such as “air conditioner temperature” and that this function has been selected as the control target.

  In this case, as shown in FIG. 3C, for example, the vibration state of the vibration element 13a is set for the operation surface P of the input device 10 so as to enable tactile feedback as if UP/DOWN buttons for temperature adjustment were present.

  Specifically, in this case, for example, an area R11 corresponding to the UP button and an area R12 corresponding to the DOWN button are set on the operation surface P. These regions R11 and R12 are set as regions having a large frictional force. In addition, regions other than the regions R11 and R12 are set as regions having relatively small frictional forces.

  Thereby, in the region R11, a tactile sensation as if the UP button were present can be fed back to the user D, and in the region R12, a tactile sensation as if the DOWN button were present can be fed back to the user D.

  Then, as shown in FIG. 3D, the user D can perform an operation of raising the set temperature of the air conditioner 61 by, for example, pressing the region R11. At this time, for example, at the timing when the user D releases the finger U1 from the region R11 (see arrow 304 in the figure), a guidance voice such as “the set temperature is XX °C” is output from the speaker 70, which can contribute to performing the above-described blind touch operation with high operability and reliability.
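
  The following sketch illustrates this button-style interaction under assumed region coordinates and an assumed 0.5 °C step: a release inside the UP or DOWN region adjusts the set temperature, and a printed message stands in for the guidance voice output from the speaker 70.

```python
# Illustrative button rectangles on the operation surface (x0, y0, x1, y1).
REGION_UP = (40, 20, 140, 100)     # like R11: temperature UP
REGION_DOWN = (40, 140, 140, 220)  # like R12: temperature DOWN

def in_rect(p, rect) -> bool:
    x0, y0, x1, y1 = rect
    return x0 <= p[0] <= x1 and y0 <= p[1] <= y1

def on_finger_release(release_xy, set_temp_c: float) -> float:
    """Handle a finger release: adjust the set temperature if a button region
    was released, then announce the result (a print stands in for speech)."""
    if in_rect(release_xy, REGION_UP):
        set_temp_c += 0.5
    elif in_rect(release_xy, REGION_DOWN):
        set_temp_c -= 0.5
    print(f"The set temperature is {set_temp_c:.1f} degrees C")
    return set_temp_c

temp = on_finger_release((90, 60), 24.0)   # released inside the UP region -> 24.5
```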

  FIGS. 3C and 3D show an example corresponding to UP/DOWN buttons for temperature adjustment, but this is merely an example; for example, a region resembling a linear slide bar extending vertically may be set, and the operation may be performed by moving the finger U1 up and down in such a region.

  Next, gesture operations will be described with reference to FIGS. 4A to 4C. So far, a case where a circular locus is drawn with the finger U1 and a case where the finger U1 is moved up and down have been described; other cases will be described here.

  FIGS. 4A to 4C are diagrams (part 1) to (part 3) illustrating specific examples of the gesture operation. As described above, the gesture operation is preferably of a simple form that is easy for the user D to remember in order to improve operability.

  As one of them, for example, as shown in FIG. 4A, a gesture operation of sliding the finger U1 on the operation surface P in the left-right direction can be exemplified.

  As another example, as shown in FIG. 4B, for example, a gesture operation in which the finger U1 is slid on the operation surface P so as to draw a triangular trajectory can be given.

  As another example, as shown in FIG. 4C, for example, a gesture operation in which the finger U1 is slid on the operation surface P so as to draw a trajectory of an X mark can be given.

  In the present embodiment, the combination of vibration states corresponding to each of these simple and easy-to-remember gesture operations is set to be different depending on the control target.

  In other words, in the input device 10 according to the present embodiment, each control target is operated by a set of simple gesture operations that can be used in common among different control targets, while a different tactile sensation is fed back for each gesture operation when the control targets differ.

  Thereby, the user D can operate various control targets with the same gesture operations merely by learning several simple gesture operations, while receiving feedback of a different tactile sensation for each different control target. That is, various control targets can be operated with high operability.

  Specific examples of the combination information 80a and the vibration state information 80b for realizing this will be described with reference to FIGS. 5A and 5B. FIG. 5A is a diagram illustrating a specific example of the combination information 80a. FIG. 5B is a diagram illustrating a specific example of the vibration state information 80b.

  First, the combination information 80a is information that is defined so that the combination of vibration states of the vibration element 13a corresponding to each gesture operation varies depending on the control target, as described above.

  Specifically, as illustrated in FIG. 5A, the combination information 80a includes, for example, a control target item, a gesture operation item, a function item, and a vibration state item. Control target items are further broken down into, for example, device items and mode items possessed by such devices.

  For example, the navigation apparatus has a plurality of modes, such as a navigation mode and an audio mode, as control targets. Each mode is assigned a common set of gesture operations. For example, here, the above-described set of five gesture operations “up/down”, “left/right”, “circle”, “triangle”, and “X” is assumed.

  In the navigation mode of the navigation device, for example, a map scroll (up/down) function is assigned to the “up/down” gesture operation, and a first vibration state corresponding to this function is associated with the vibration state item.

  On the other hand, in the audio mode of the navigation device, a track switching function is assigned to the same “up/down” gesture operation, and a sixth vibration state corresponding to this function is associated with the vibration state item.

  Similarly, separate functions are assigned to the “left/right”, “circle”, “triangle”, and “X” gesture operations in the navigation mode and the audio mode, and the second to fifth and seventh to tenth vibration states, which correspond to these individually, are associated with the vibration state items.

  For the air conditioner 61, for example, the function of the set temperature UP / DOWN is assigned to the “up / down” gesture operation, and, for example, the eleventh vibration state corresponding to the function is uniquely associated.

  Incidentally, the various devices 60 to be controlled and their modes may be invalid, for example because they are not connected or have failed. For such a case, the combination information 80a can include, for example as a “common” entry, a twelfth vibration state in which the vibration element 13a is not vibrated.

  The setting unit 12d described above refers to, for example, the combination information 80a defined as illustrated in FIG. 5A, and sets the vibration state of the vibration element 13a corresponding to each gesture operation for when that gesture operation is detected by the detection unit 12e, according to the control target selected by the selection unit 12c. Such setting is performed, for example, by the setting unit 12d writing information indicating which vibration state is selected into the vibration state information 80b.

  The vibration state information 80b is information including a control value of the vibration element 13a in each vibration state. Specifically, as shown in FIG. 5B, the vibration state information 80b includes, for example, a setting item, a vibration state item, a contact position coordinate item, and a vibration frequency item.

  The vibration state item is information for identifying each vibration state, and the contact position coordinates of the finger U1 on the operation surface P are defined for each vibration state. Each contact position coordinate is associated with a vibration frequency for vibrating the vibration element 13a, for example.

  In the setting item, the setting unit 12d writes information indicating which vibration state has been selected. For example, FIG. 5B shows an example in which a check mark is added to the first vibration state; the setting unit 12d adds this check mark to indicate that the first vibration state is currently set.

  FIG. 5B also includes an example corresponding to the twelfth vibration state of FIG. 5A, in which the vibration element 13a is not vibrated. For the twelfth vibration state, control values such as contact position coordinates and vibration frequency are not defined, for example.

  The vibration control unit 12f refers to such vibration state information 80b and controls the vibration element 13a using control values such as the contact position coordinates and the vibration frequency associated with the vibration state currently set.
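
  A minimal sketch of such a lookup is given below, with a hypothetical table standing in for the vibration state information 80b: each state id maps to regions of the operation surface and their drive frequencies, and an empty entry corresponds to a state in which the element is not driven. All values are illustrative.

```python
# Hypothetical vibration state information: state id -> list of
# (region rectangle (x0, y0, x1, y1), vibration frequency in Hz).
VIBRATION_STATE_INFO = {
    1: [((0, 0, 320, 120), 30_000), ((0, 120, 320, 240), 200)],
    12: [],   # like the twelfth state: no vibration defined
}

def frequency_for_contact(state_id: int, contact_xy):
    """Vibration-control sketch: return the drive frequency associated with
    the contact position in the currently set vibration state, or None."""
    for (x0, y0, x1, y1), freq in VIBRATION_STATE_INFO.get(state_id, []):
        if x0 <= contact_xy[0] <= x1 and y0 <= contact_xy[1] <= y1:
            return freq
    return None  # no vibration for this state/position

print(frequency_for_contact(1, (100, 60)))   # -> 30000
print(frequency_for_contact(12, (100, 60)))  # -> None (element not driven)
```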

  Note that the combination information 80a and the vibration state information 80b illustrated in FIGS. 5A and 5B are merely examples, and the configurations thereof are not limited.

  Next, a processing procedure executed by the input device 10 according to the embodiment will be described with reference to FIG. FIG. 6 is a flowchart illustrating a processing procedure executed by the input device 10 according to the embodiment.

  As illustrated in FIG. 6, in the input device 10, the selection unit 12c first selects a control target based on the behavior of the user D (step S101). Subsequently, the setting unit 12d sets the vibration state of the vibration element 13a corresponding to the control target selected by the selection unit 12c (step S102).

  Subsequently, the detection unit 12e detects the contact operation of the user D with respect to the operation surface P (step S103).

  Then, the vibration control unit 12f controls the vibration element 13a of the vibration unit 13 based on the set vibration state information 80b.

  At this time, if the operation is valid (Yes in step S104), that is, if the control target has been validly selected by the selection unit 12c, the vibration control unit 12f vibrates the vibration element 13a based on the vibration state information 80b, whereby a tactile sensation according to the control target is fed back (step S105), and the process is terminated.

  Note that in order to obtain the vibration state corresponding to each gesture operation described above in step S105, it is necessary to select the vibration state corresponding to the gesture operation desired by the user D. This selection may be made, for example, by voice recognition using the microphone 20 described above.

  In the case of voice recognition, specifically, the following procedure is taken. First, if the user D wants to perform a “circular” gesture operation, he or she utters, for example, “circle”, and the voice reception unit 12a receives the voice at this time and passes the analysis result to the selection unit 12c.

  The selection unit 12c selects the “circular” gesture operation based on the analysis result and notifies the setting unit 12d of the gesture operation. Upon receiving the notification, the setting unit 12d selects from the combination information 80a the vibration state corresponding to the “circular” gesture operation for the control target currently selected, and sets that vibration state as the one being set in the vibration state information 80b.

  Then, based on the setting result, the vibration control unit 12f controls the vibration element 13a of the vibration unit 13 so as to be in a vibration state corresponding to the desired “circular” gesture operation of the user D. Thereafter, for example, the user D may perform a “circular” gesture operation by moving the finger U1 while being guided along the region R1 shown in FIG. 3A.

  Further, as shown in FIG. 6, when the operation is invalid (No in step S104), that is, when the selection of the control target by the selection unit 12c is invalid, the vibration control unit 12f does not vibrate the vibration element 13a based on the vibration state information 80b, and the process is terminated without feeding back a tactile sensation according to the control target (step S106).
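
  The overall flow of FIG. 6 can be summarized in the following sketch; the helper functions are illustrative stand-ins (they are not part of the embodiment) and exist only so the control flow of steps S101 to S106 can be read end to end.

```python
def handle_input(user_behavior, gesture_detected: bool, selection_valid: bool):
    """Sketch of the flow in FIG. 6: select target (S101), set vibration
    state (S102), detect contact (S103), then feed back or suppress haptics
    (S104-S106). All helper behaviors are illustrative stand-ins."""
    target = select_target(user_behavior)             # S101
    state = set_vibration_state(target)               # S102
    if not gesture_detected:                          # S103
        return "waiting"
    if selection_valid:                               # S104: Yes
        vibrate(state)                                # S105: tactile feedback
        return "feedback"
    return "no_feedback"                              # S106: element not driven

# Minimal stand-ins so the sketch runs as-is.
def select_target(behavior): return behavior
def set_vibration_state(target): return f"state_for_{target}"
def vibrate(state): print(f"vibrating with {state}")

print(handle_input("audio!", gesture_detected=True, selection_valid=True))
```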

  The integrated input system 1 according to the present embodiment can be realized by a computer 600 having a configuration shown as an example in FIG. FIG. 7 is a hardware configuration diagram illustrating an example of a computer that realizes the functions of the integrated input system 1 according to the embodiment.

  The computer 600 includes a CPU (Central Processing Unit) 610, a ROM (Read Only Memory) 620, a RAM (Random Access Memory) 630, and an HDD (Hard Disk Drive) 640. The computer 600 also includes a media interface (I / F) 650, a communication interface (I / F) 660, and an input / output interface (I / F) 670.

  The computer 600 may include an SSD (Solid State Drive), and the SSD may execute a part or all of the functions of the HDD 640. Further, an SSD may be provided instead of the HDD 640.

  The CPU 610 operates based on a program stored in at least one of the ROM 620 and the HDD 640 and controls each unit. The ROM 620 stores a boot program executed by the CPU 610 when the computer 600 starts up, a program depending on the hardware of the computer 600, and the like. The HDD 640 stores a program executed by the CPU 610, data used by the program, and the like.

  The media I / F 650 reads programs and data stored in the storage medium 680 and provides them to the CPU 610 via the RAM 630. The CPU 610 loads such a program from the storage medium 680 onto the RAM 630 via the media I / F 650, and executes the loaded program. Alternatively, the CPU 610 executes a program using such data. The storage medium 680 is, for example, a magneto-optical recording medium such as a DVD (Digital Versatile Disc), an SD card, or a USB memory.

  The communication I / F 660 receives data from other devices via the network 690 and sends the data to the CPU 610, and transmits the data generated by the CPU 610 to other devices via the network 690. Alternatively, the communication I / F 660 receives a program from another device via the network 690, sends the program to the CPU 610, and the CPU 610 executes the program.

  The CPU 610 controls a display unit 40 such as a display, an output unit such as a speaker 70, and an input unit such as a keyboard, a mouse, a button, and an operation unit 11 via the input / output I / F 670. The CPU 610 acquires data from the input unit via the input / output I / F 670. In addition, the CPU 610 outputs the generated data to the display unit 40 and the output unit via the input / output I / F 670.

  For example, when the computer 600 functions as the integrated input system 1, the CPU 610 of the computer 600 executes a program loaded onto the RAM 630, thereby realizing each function of the control unit 12 of the input device 10, including the voice reception unit 12a, the line-of-sight detection unit 12b, the selection unit 12c, the setting unit 12d, the detection unit 12e, the vibration control unit 12f, and the operation processing unit 12g, as well as the display control unit 50.

  For example, the CPU 610 of the computer 600 reads these programs from the storage medium 680 and executes them, but as another example, these programs may be acquired from other devices via the network 690. Further, the HDD 640 can store information stored in the storage unit 80.

  As described above, the input device according to the embodiment includes one operation surface, a selection unit, a detection unit, at least one vibration element, a setting unit, and a vibration control unit. The selection unit selects a control target using the operation surface based on a user's behavior.

  The detection unit detects a predetermined contact operation of the user with respect to the operation surface. The vibration element vibrates the operation surface. The setting unit sets the vibration state of the vibration element corresponding to the contact operation when the contact operation is detected by the detection unit according to the control target selected by the selection unit.

  The vibration control unit controls the vibration element so as to be in a vibration state set by the setting unit when a contact operation is detected by the detection unit.

  Therefore, according to the input device according to the embodiment, various control objects can be operated with high operability.

  In the above-described embodiment, the case where the selection unit 12c selects the control target based on the voice input from the microphone 20 or the detection of the line of sight of the user D is taken as an example, but the present invention is not limited to this. For example, the input device 10 may include a switch for switching the control target, and the control target may be selected based on the behavior of the user D who simply presses the switch to switch.

  Further, in the above-described embodiment, the case where the operation surface P is a single undivided surface is taken as an example; however, the operation surface P may have a plurality of divided regions, a different control target may be assigned to each of the divided regions, and the vibration control unit 12f may control the vibration element 13a so as to be in a different vibration state in each of the divided regions.

  In this case, for example, each mode of the navigation device may be assigned to one of the regions, and when the user D selects a region according to the tactile feedback, the selection unit 12c may select the mode to be controlled via the operation surface P. Thereby, the user D can easily perform the operation of selecting the mode to be controlled without using voice recognition or line-of-sight detection.

  Specifically, the operation surface P has a plurality of divided areas, a different mode is assigned to each of the divided areas, and the vibration control unit 12f controls the vibration element 13a so as to be in a different vibration state in each of the divided areas. The selection unit 12c then selects the mode corresponding to the divided area selected by the user D according to the tactile feedback from each of the divided areas.
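
  A minimal sketch of such divided-region selection follows, assuming the operation surface is split into two halves with one navigation-device mode assigned to each; the layout and mode names are illustrative only.

```python
# Hypothetical division of the operation surface into left/right halves,
# each assigned a different mode of the navigation device.
DIVIDED_REGIONS = [
    ((0, 0, 160, 240), "navigation_mode"),
    ((160, 0, 320, 240), "audio_mode"),
]

def select_mode_from_region(contact_xy):
    """Selection-unit sketch: return the mode assigned to the divided region
    that the user touched (each region also gets its own vibration state)."""
    for (x0, y0, x1, y1), mode in DIVIDED_REGIONS:
        if x0 <= contact_xy[0] < x1 and y0 <= contact_xy[1] < y1:
            return mode
    return None

print(select_mode_from_region((200, 100)))  # -> "audio_mode"
```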

  Further effects and modifications can be easily derived by those skilled in the art. Thus, the broader aspects of the present invention are not limited to the specific details and representative embodiments shown and described above. Accordingly, various modifications can be made without departing from the spirit or scope of the general inventive concept as defined by the appended claims and their equivalents.

DESCRIPTION OF SYMBOLS 1 Integrated input system 10 Input device 11 Operation part 12 Control part 12a Voice reception part 12b Line-of-sight detection part 12c Selection part 12d Setting part 12e Detection part 12f Vibration control part 12g Operation processing part 13 Vibration part 13a Vibration element 20 Microphone 30 Imaging part 40 Display unit 50 Display control unit 60 Various devices 70 Speaker 80 Storage unit D User P Operation surface U1 Finger

Claims (13)

  1. One operating surface,
    A selection unit that selects a control target using the operation surface based on a user's behavior;
    A detection unit for detecting a predetermined contact operation of the user on the operation surface;
    At least one vibration element for vibrating the operation surface;
    A setting unit for setting a vibration state of the vibration element corresponding to the contact operation when the contact operation is detected by the detection unit according to the control target selected by the selection unit;
    An input device comprising: a vibration control unit that controls the vibration element so as to be in the vibration state set by the setting unit when the contact operation is detected by the detection unit.
  2. The contact operation is a set of several simple gesture operations including contact with the operation surface,
    The setting unit
    sets, based on combination information that is defined so that the combination of the vibration states corresponding to the gesture operations differs depending on the control target, the vibration state for the case where each of the gesture operations is performed toward the control target selected by the selection unit,
    The input device according to claim 1.
  3. The input device according to claim 1, wherein the control target is one of a plurality of devices.
  4. The input device according to claim 1, wherein the control target is one of a plurality of modes of one device.
  5. A voice reception unit for receiving voice input;
    The selection unit includes:
    The input device according to any one of claims 1 to 4, wherein the control target is selected based on content of the voice input received by the voice receiving unit.
  6. A line-of-sight detection unit that detects a line-of-sight direction of the user based on a captured image from an imaging unit that captures the user;
    The selection unit includes:
    The input device according to claim 1, wherein the control target is selected based on a detection result of the line-of-sight detection unit.
  7. The vibration control unit
    controls the vibration element so as to be in a vibration state in which a tactile sensation is fed back from the operation surface to the user when the contact operation toward the control target is detected by the detection unit while the control target is validly selected by the selection unit,
    The input device according to claim 1.
  8. The vibration control unit
    controls the vibration element so as to be in a vibration state in which a tactile sensation is not fed back from the operation surface to the user when the contact operation toward the control target is detected by the detection unit while the selection of the control target by the selection unit is invalid,
    The input device according to claim 1.
  9. The operation surface has a plurality of divided areas, and each of the divided areas is assigned a different control target,
    The vibration control unit
    controls the vibration element so as to be in a different vibration state in each of the divided areas, and
    the selection unit selects the control target corresponding to the divided area selected by the user according to the tactile feedback from each of the divided areas,
    The input device according to claim 1.
  10. The input device according to any one of claims 1 to 9, wherein when a tactile sensation is fed back to the user from the operation surface, a guidance voice output by a voice output unit is used together.
  11. An input device according to any one of claims 1 to 10,
    An integrated input system comprising: a display unit that displays an image corresponding to a predetermined contact operation of the user on the operation surface.
  12. A method for controlling an input device having one operation surface,
    A selection step of selecting a control target using the operation surface based on a user's behavior;
    A detection step of detecting a predetermined contact operation of the user with respect to the operation surface;
    A vibration step of vibrating the operation surface by at least one vibration element;
    A setting step of setting a vibration state of the vibration element corresponding to the contact operation when the contact operation is detected in the detection step according to the control object selected in the selection step;
    And a vibration control step of controlling the vibration element so as to be in the vibration state set in the setting step when the contact operation is detected in the detection step.
  13. A selection step of selecting a control target using one operation surface based on a user's behavior;
    A detection step of detecting a predetermined contact operation of the user on the operation surface;
    A vibration step of vibrating the operation surface by at least one vibration element;
    A setting step for setting a vibration state of the vibration element corresponding to the contact operation when the contact operation is detected in the detection step, according to the control object selected in the selection step;
    When the contact operation is detected in the detection step, the program causes the computer to execute a vibration control step of controlling the vibration element so as to be in the vibration state set in the setting step.
JP2015171161A 2015-08-31 2015-08-31 Input device, integrated input system, input device control method, and program Active JP6585431B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2015171161A JP6585431B2 (en) 2015-08-31 2015-08-31 Input device, integrated input system, input device control method, and program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015171161A JP6585431B2 (en) 2015-08-31 2015-08-31 Input device, integrated input system, input device control method, and program
US15/240,238 US20170060245A1 (en) 2015-08-31 2016-08-18 Input device, integrated input system, input device control method, and program

Publications (2)

Publication Number Publication Date
JP2017049699A true JP2017049699A (en) 2017-03-09
JP6585431B2 JP6585431B2 (en) 2019-10-02

Family

ID=58279433

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2015171161A Active JP6585431B2 (en) 2015-08-31 2015-08-31 Input device, integrated input system, input device control method, and program

Country Status (1)

Country Link
JP (1) JP6585431B2 (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09160720A (en) * 1995-12-05 1997-06-20 Mitsubishi Electric Corp Touch operation device
JPH11334493A (en) * 1998-05-25 1999-12-07 Fujikura Ltd Input system for automobile
US20100238115A1 (en) * 2009-03-19 2010-09-23 Smk Corporation Operation input device, control method, and program
JP2010224684A (en) * 2009-03-19 2010-10-07 Smk Corp Operation input device, control method and program
US20130307771A1 (en) * 2012-05-18 2013-11-21 Microsoft Corporation Interaction and management of devices using gaze detection
JP2014102829A (en) * 2012-11-20 2014-06-05 Immersion Corp System and method for feedforward and feedback with haptic effects
JP2015133109A (en) * 2013-12-20 2015-07-23 イマージョン コーポレーションImmersion Corporation Gesture based input system in vehicle with haptic feedback

Also Published As

Publication number Publication date
JP6585431B2 (en) 2019-10-02

Legal Events

Date Code Title Description
A621 Written request for application examination (JAPANESE INTERMEDIATE CODE: A621); effective date: 20180802
A977 Report on retrieval (JAPANESE INTERMEDIATE CODE: A971007); effective date: 20190313
A131 Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131); effective date: 20190402
A521 Written amendment (JAPANESE INTERMEDIATE CODE: A523); effective date: 20190517
TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model) (JAPANESE INTERMEDIATE CODE: A01); effective date: 20190813
A61 First payment of annual fees (during grant procedure) (JAPANESE INTERMEDIATE CODE: A61); effective date: 20190905
R150 Certificate of patent or registration of utility model; ref document number: 6585431; country of ref document: JP (JAPANESE INTERMEDIATE CODE: R150)