WO2014171096A1 - Control device for vehicle devices and vehicle device - Google Patents

Control device for vehicle devices and vehicle device Download PDF

Info

Publication number
WO2014171096A1
WO2014171096A1 (PCT/JP2014/001953)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
unit
contact
vibration
input
Prior art date
Application number
PCT/JP2014/001953
Other languages
French (fr)
Japanese (ja)
Inventor
磯部 憲寛
Original Assignee
DENSO Corporation (株式会社デンソー)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by DENSO Corporation
Priority to CN201480021014.XA (publication CN105142983A)
Publication of WO2014171096A1

Links

Images

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00: Navigation; navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C 21/26: Navigation; navigational instruments not provided for in groups G01C1/00 - G01C19/00, specially adapted for navigation in a road network
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/016: Input arrangements with force or tactile feedback as computer generated output to the user
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00: Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/01: Indexing scheme relating to G06F3/01
    • G06F 2203/014: Force feedback applied to GUI

Definitions

  • The present disclosure relates to a control device for an in-vehicle device that accepts touch input to the in-vehicle device, and to an in-vehicle device controlled by the control device.
  • In recent years, touch input type input devices such as touch panels have become widespread in mobile terminal devices such as mobile phones and PDAs (Personal Digital Assistants).
  • With such a touch input type input device, it is difficult to perform an input operation without looking at the buttons, because there is no feeling of operation even when a button on the panel is pressed. Therefore, input devices that give a tactile sensation in accordance with button operations have been proposed.
  • For example, Patent Document 1 detects that the user has touched a button on the touch panel and applies vibration to the touch panel so that the user can perform an input operation without looking closely at the button area. Specifically, Patent Document 1 applies vibrations of different patterns to the operation panel for tap input, slide input, and hold input. Furthermore, for slide input and hold input, a different vibration pattern is applied to the touch panel for each button so that the user can identify the buttons.
  • Patent Document 1 is an input device intended for input to a mobile terminal device. In general, the ease of input operation of a mobile terminal device does not change according to the surrounding situation. Therefore, Patent Document 1 notifies the user of the operation state by vibration in the same manner regardless of surrounding conditions. In contrast, for an input device for equipment mounted on a vehicle, the ease of input operation changes with the state of the vehicle, so notifying the user in the same manner regardless of the vehicle's state may make the user feel uncomfortable.
  • The main purpose of the present disclosure is to provide a control device for an in-vehicle device that can prevent the user from feeling uncomfortable by notifying the user of the state of an input operation on the in-vehicle device according to the state of the vehicle, and an in-vehicle device controlled by that control device.
  • According to a first example of the present disclosure, a control device for an in-vehicle device mounted on a vehicle includes an input unit including an operation unit corresponding to an execution function, a detection unit that detects contact with the input unit, and a vibration applying unit that applies vibration to the input unit.
  • When the contact is detected by the detection unit, the vibration applied by the vibration applying unit is changed according to whether or not the vehicle is running.
  • According to the first example, when the user touches the input unit to operate the in-vehicle device, contact with the input unit is detected.
  • When contact with the input unit is detected, a vibration that varies depending on whether the vehicle is running is applied to the input unit. Since the user is thus notified of the state of contact with the input unit by a vibration that matches the state of the vehicle, the user can be prevented from feeling uncomfortable.
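As a minimal illustration of the first example's rule, the decision can be sketched as a small function. This is a hypothetical sketch, not the patent's implementation; the names are invented, and the on/off behavior follows the first embodiment described later:

```python
def vibration_on_contact(contact_detected: bool, vehicle_running: bool) -> str:
    """Hypothetical sketch: choose the vibration for a detected contact
    according to whether the vehicle is running. In the first embodiment,
    vibration is applied only while running, since the user cannot
    watch the panel then."""
    if not contact_detected:
        return "none"
    return "contact_vibration" if vehicle_running else "none"
```

The same structure can be extended to per-button and per-event patterns, as the embodiments below do.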
  • According to a second example of the present disclosure, a control device that controls an in-vehicle device mounted on a vehicle includes an input unit including an operation unit corresponding to an execution function, and a detection unit that detects contact with the input unit.
  • The in-vehicle device includes a display device that displays information in the traveling direction of the vehicle. When contact is detected by the detection unit, the display device displays the contact in a mode that differs according to whether the vehicle is running.
  • According to the second example, when the user touches the input unit to operate the in-vehicle device, contact with the input unit is detected.
  • The contact with the input unit is then displayed, in the field of view of the user facing the traveling direction of the vehicle, in a mode that changes depending on whether the vehicle is traveling. Since the user is notified of the state of contact with the input unit by a display mode that matches the state of the vehicle, the user can be prevented from feeling uncomfortable.
  • The drawings include: a block diagram showing the configuration of the in-vehicle equipment; a figure showing the state of an input operation; and, for each of the first, second, and third embodiments, two flowcharts showing the processing sequence that starts the execution function selected by the user.
  • This in-vehicle device is mounted on a vehicle and includes touch input devices (a touch panel 10a and a touch switch 10b), a control device 20 (the control device for the in-vehicle device), a HUD (Head-Up Display) 30, and a gaze detection sensor 40.
  • A vehicle equipped with the in-vehicle device is also referred to as the host vehicle or the target vehicle.
  • The touch panel 10a is an input device for equipment such as a car navigation system, and includes an input unit 12a, a detection unit 13a, and a vibration applying unit 15a.
  • The input unit 12a includes a display screen made up of a liquid crystal panel or an organic EL (Electro-Luminescence) panel and a transparent touch panel arranged on the display screen, and accepts touch input by the user's finger or the like. Various kinds of information are displayed on the display screen, and the transparent touch panel transmits the information displayed on the display screen.
  • The input unit 12a includes a plurality of operation units 11a (input buttons) that are displayed on the display screen and assigned execution functions such as car navigation functions. In addition to the operation units 11a, the input unit 12a includes portions to which execution functions such as screen enlargement are assigned. The number and positions of the operation units 11a change according to the information displayed on the display screen.
  • The detection unit 13a is a capacitive sensor mounted between the liquid crystal panel of the input unit 12a and the touch panel, and detects contact with the input unit 12a, the contact position, and the area of the contact portion. When the user touches the surface of the input unit 12a, the capacitance at the contact position changes, and a detection signal corresponding to the contact position and the area of the contact portion is transmitted to the control device 20.
  • The detection unit 13a may be a sensor such as a pressure-sensitive sensor or an infrared sensor instead of a capacitive sensor.
  • The vibration applying unit 15a is an actuator such as a piezo element, an eccentric motor, or a voice coil that is driven according to a drive signal from the control device 20 and applies vibration to the input unit 12a.
  • The touch switch 10b is an input device for equipment such as an air conditioner, and includes an input unit 12b, a detection unit 13b, and a vibration applying unit 15b.
  • The input unit 12b is formed as a rectangular surface including a plurality of operation units 11b (input buttons), each assigned an execution function such as an air-conditioner function. No execution functions are assigned to the portions of the input unit 12b other than the operation units 11b.
  • The detection unit 13b is a capacitive sensor mounted on the rear surface of the input unit 12b, and detects contact with the operation units 11b included in the input unit 12b, the contact position, and the area of the contact portion.
  • The vibration applying unit 15b is an actuator that is driven according to a drive signal from the control device 20 and applies vibration to the input unit 12b.
  • The touch panel 10a and the touch switch 10b are installed in the center of the instrument panel, between the driver's seat and the passenger seat, as shown in FIG.
  • The HUD 30 (display device) is a display device that displays information in the traveling direction of the vehicle, as shown in FIG.
  • The HUD 30 is embedded in the dashboard; the image displayed on its liquid crystal panel is reflected by a mirror, then reflected and enlarged by a concave mirror, and the enlarged image is projected onto the windshield. The user therefore sees the image as if it were floating in front of the windshield.
  • The displayed information includes the vehicle speed, road guidance, the contact position on the input unit 12a or the input unit 12b, and the like.
  • The line-of-sight detection sensor 40 (line-of-sight detection device) is a camera installed at a position facing the eyes of the user (driver) looking in the traveling direction, for example on the upper steering column or the meter panel, and photographs the user's eyes while they are illuminated with invisible near-infrared light. The line-of-sight detection sensor 40 transmits the captured image of the user's eyes to the control device 20.
  • The control device 20 includes a control circuit 21, an operation input unit 22, a vibration output unit 23, a recording unit 24, a HUD display output unit 25, a line-of-sight recognition sensor input unit 26, a vehicle sensor input unit 27, and a power supply unit 28.
  • The control circuit 21 is configured as an ordinary computer and includes a CPU, RAM, ROM, and the like.
  • The recording unit 24 may be included in the ROM or RAM.
  • The control circuit 21 detects the user's line of sight based on the eye image received from the line-of-sight detection sensor 40 via the line-of-sight recognition sensor input unit 26. Specifically, the control circuit 21 detects, from the received eye image, the pupil, whose position changes with the line-of-sight direction, and the corneal reflection, which is not affected by the line-of-sight direction, and detects the line of sight from the positional relationship between the pupil and the corneal reflection.
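The pupil/corneal-reflection technique above can be sketched roughly as follows. The pixel-to-degree scale factor is an assumption for illustration; a real system would calibrate it per user:

```python
def gaze_direction(pupil_xy, glint_xy, deg_per_px=0.3):
    """Estimate gaze angles from the pupil center and the corneal
    reflection (glint): the glint stays roughly fixed while the pupil
    moves with gaze, so their offset indicates the gaze direction."""
    dx = (pupil_xy[0] - glint_xy[0]) * deg_per_px
    dy = (pupil_xy[1] - glint_xy[1]) * deg_per_px
    return dx, dy  # (horizontal, vertical) in degrees
```

The control circuit would then compare the estimated direction with the known locations of the input units 12a and 12b.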
  • The control circuit 21 determines whether the execution function corresponding to the operation units 11a and 11b has been selected, based on the detection signals transmitted from the detection units 13a and 13b via the operation input unit 22 and on the detected line of sight of the user. If it determines that the execution function corresponding to an operation unit 11a or 11b has been selected, the control circuit 21 transmits a command signal for executing that execution function.
  • When contact is detected by the detection units 13a and 13b, the control circuit 21 applies vibration to the input units 12a and 12b by means of the vibration applying units 15a and 15b. Further, the control circuit 21 changes the vibration applied by the vibration applying units 15a and 15b depending on whether the vehicle is traveling. Specifically, the control circuit 21 applies vibration by the vibration applying units 15a and 15b when the vehicle is traveling, and does not apply vibration when the vehicle is not traveling. In this sense, the control circuit 21 also functions as a vehicle running state detection unit.
  • The control circuit 21 executes various programs stored in the recording unit 24 and causes the recording unit 24 to store various received data. Further, the control circuit 21 displays information in the display area of the HUD 30 via the HUD display output unit 25. Further, the control circuit 21 receives detection values from vehicle sensors such as a vehicle speed sensor and a brake sensor (not shown) via the vehicle sensor input unit 27, and determines whether or not the vehicle is traveling. The control circuit 21 is supplied with power from the power supply unit 28.
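The running-state decision from the vehicle sensors can be sketched as follows. The speed threshold is an assumption; the text names only the vehicle speed and brake sensors as inputs without specifying how they are combined:

```python
def is_vehicle_running(speed_kmh: float, stop_threshold_kmh: float = 1.0) -> bool:
    """Hypothetical running-state check: treat the vehicle as running when
    the measured speed exceeds a small threshold. A brake-sensor signal
    could additionally confirm a stop, but its use is not specified."""
    return speed_kmh > stop_threshold_kmh
```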
  • This processing procedure is repeatedly executed by the control circuit 21 at predetermined intervals.
  • In S11, it is determined whether or not there is contact with the input units 12a and 12b. If no detection signal is received from the detection units 13a and 13b via the operation input unit 22, it is determined that there is no contact (NO), and the process proceeds to S12. In S12, if vibration is being applied to the input units 12a and 12b by the vibration applying units 15a and 15b, the vibration is stopped, and the process returns to S11.
  • The vibration patterns to be applied by the vibration applying unit 15a while traveling are set in advance: a different vibration pattern for each input button and for the areas other than the input buttons, and different vibration patterns for contact and for selection of the execution function.
  • The vibration patterns to be applied by the vibration applying unit 15b while traveling are likewise set in advance: a different vibration pattern for each input button, and different vibration patterns for contact and for selection of the execution function. When the vehicle is stopped, no vibration is applied by the vibration applying units 15a and 15b.
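The pattern assignment described above can be represented as a lookup table. The pattern identifiers are invented for illustration; the patent does not specify concrete patterns:

```python
# Travel-time vibration patterns: one per input button (plus, on the touch
# panel 10a, the non-button area), with distinct patterns for contact and
# for selection. While stopped, the first embodiment applies no vibration.
TRAVEL_PATTERNS = {
    ("button1", "contact"): "pattern_1c", ("button1", "select"): "pattern_1s",
    ("button2", "contact"): "pattern_2c", ("button2", "select"): "pattern_2s",
    ("button3", "contact"): "pattern_3c", ("button3", "select"): "pattern_3s",
    ("other_area", "contact"): "pattern_oc",
}

def pick_pattern(target: str, event: str, vehicle_running: bool):
    if not vehicle_running:
        return None  # stopped: no vibration in the first embodiment
    return TRAVEL_PATTERNS.get((target, event))
```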
  • In S17, it is determined whether or not the contact position is the position of the input button 1.
  • In this example, the number of input buttons is three. If the contact position is the position of the input button 1 in S17 (YES), the processes of S18 to S22 are performed.
  • In S18, it is determined whether or not the detected line of sight of the user is in a direction other than toward the input units 12a and 12b.
  • If the user's line of sight is in a direction other than toward the input units 12a and 12b (YES), the vibration for contact for the input button 1 is applied by the vibration applying units 15a and 15b in S19.
  • As a result, a user who is not looking at the input units 12a and 12b is notified that the input button 1 is being touched.
  • Otherwise (NO), no vibration is applied by the vibration applying units 15a and 15b, and the process proceeds to S20.
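Steps S18 to S19 amount to gating the vibration on the detected gaze, which can be sketched as (names are assumptions; the running-state condition follows the first embodiment's pattern table):

```python
def should_vibrate(gaze_on_input_units: bool, vehicle_running: bool) -> bool:
    """S18/S19 sketch: apply the contact vibration only while the vehicle
    is running and the user is not looking at the input units 12a/12b."""
    return vehicle_running and not gaze_on_input_units
```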
  • Next, it is determined whether the detection units 13a and 13b have detected a push stronger than the contact with the input units 12a and 12b at the contact position. Specifically, based on the detection signals received from the detection units 13a and 13b, it is determined that a push has been detected when the contact area on the operation units 11a and 11b is larger than when the contact was first detected.
  • When the detection units 13a and 13b are pressure-sensitive sensors, it is determined that a push has been detected when the detected pressure is greater than a reference value.
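The push ("selection") detection can be sketched for both sensor types. The area-growth ratio and the reference pressure are assumptions; the text says only "larger than at contact" and "greater than a reference value":

```python
def push_detected_capacitive(area_now: float, area_at_contact: float,
                             growth_ratio: float = 1.3) -> bool:
    """A push is inferred when the finger flattens and the contact area
    grows beyond the area measured when contact was first detected."""
    return area_now > area_at_contact * growth_ratio

def push_detected_pressure(pressure: float, reference: float) -> bool:
    """With a pressure-sensitive sensor, a push is detected when the
    pressure exceeds the reference value."""
    return pressure > reference
```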
  • If the contact position is the position of the input button 3 in S29 (YES), the processes of S30 to S34 are performed in the same manner as the processes of S18 to S22. However, in S31, the vibration for contact for the input button 3 is applied by the vibration applying units 15a and 15b, and in S33, the vibration for selection for the input button 3 is applied by the vibration applying units 15a and 15b. If the contact position is not the position of the input button 3 in S29 (NO), the process returns to S11.
  • If an area other than the input button positions is being touched, the vibration applying unit 15a applies the vibration for contact for areas other than the input buttons. As a result, a user who is not looking at the input unit 12a is notified that an area other than the input button positions is being touched.
  • Otherwise, the vibration applying unit 15a does not apply vibration, and the process proceeds to S37.
  • When the user pushes an operation unit 11a or 11b more strongly than a mere contact, it can be determined that the user has selected the execution function corresponding to that operation unit. In addition, when it is determined that the execution function has been selected, a vibration different from the one applied when contact is detected is applied, so the user can recognize whether the state of the input operation is the contact state or the selection state.
  • The first embodiment may be implemented with the following modifications.
  • When an operation different from the contact is detected at the contact position, it may be determined that the execution function corresponding to the contact position has been selected by the user.
  • In that case as well, vibration is applied by the vibration applying units 15a and 15b according to the input button, so the user can recognize whether the state of the input operation is the contact state or the selection state.
  • The operation different from the contact includes, for example, an operation in which the contact position is kept touched for longer than a predetermined time.
  • The vibration pattern when the vehicle is stopped may be a vibration with a weaker amplitude or a lower frequency than when traveling.
  • That is, when the vehicle is traveling, the amplitude of the vibration applied by the vibration applying units 15a and 15b may be made larger than when the vehicle is stopped.
  • In addition, a notification sound having a frequency and amplitude corresponding to the vibration pattern during traveling or during stopping may be output.
  • In S41, as in S11, it is determined whether or not there is contact. If it is determined in S41 that there is contact (YES), the contact position on the input units 12a and 12b is displayed in the display area of the HUD 30 in S43. On the other hand, if it is determined in S41 that there is no contact (NO), the vibration is stopped in S42 in the same manner as in S12. Subsequently, in S44, the display of the contact position in the display area of the HUD 30 is stopped, and the process returns to S41.
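Steps S41 to S44 can be sketched as a small update routine. The HUD is modeled here as a dictionary; the names are assumptions for illustration:

```python
def update_hud_contact(contact: bool, position, hud: dict) -> dict:
    """Second embodiment, S41-S44 sketch: when contact is detected, show
    the contact position on the HUD (S43); otherwise clear it (S44)."""
    if contact:
        hud["contact_position"] = position
    else:
        hud.pop("contact_position", None)
    return hud
```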
  • In this processing procedure, the determination of the user's line-of-sight direction is not performed; processes similar to S25 to S28, S31 to S34, and S36 to S40 are performed in S55 to S58, S60 to S63, and S64 to S68, respectively.
  • The user's contact position on the input units 12a and 12b is displayed in the field of view of the user facing the traveling direction of the vehicle. Therefore, the user can reliably recognize which operation unit 11a or 11b of the input units 12a and 12b is being touched.
  • The second embodiment may be implemented with the following modifications.
  • S49 and S50, S54 and S55, S59 and S60, and S48 and S64 may be processed in the same manner as S18, S24, S30, and S35.
  • In S71, as in S11, it is determined whether or not there is contact. If it is determined in S71 that there is contact (YES), it is determined in S72 whether the vehicle is traveling. If it is determined in S72 that the vehicle is traveling (YES), the contact state and the contact position on the input units 12a and 12b are displayed in the display area of the HUD 30 in S73, and the process proceeds to S75. On the other hand, if it is determined in S71 that there is no contact (NO), or if it is determined in S72 that the vehicle is not traveling (NO), the display of the contact state and the contact position in the display area of the HUD 30 is stopped in S74, and the process returns to S71.
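The third embodiment's S71 to S74 differ from the second embodiment by gating the HUD display on the running state, roughly as follows (names are assumptions):

```python
def hud_display_third_embodiment(contact: bool, vehicle_running: bool, position):
    """S71-S74 sketch: display the contact state and position only while
    the vehicle is running; otherwise display nothing (S74)."""
    if contact and vehicle_running:
        return {"state": "contact", "position": position}
    return None  # no contact, or vehicle stopped: stop displaying
```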
  • In S75, it is determined whether or not the contact position is the position of an input button. If so (YES), it is determined in S76 whether the contact position is the position of the input button 1, as in S17. If the contact position is the position of the input button 1 in S76 (YES), the execution function corresponding to the input button 1 is displayed in the display area of the HUD 30 in S77.
  • In S78, it is determined whether or not the detection units 13a and 13b have detected a push stronger than the contact with the input units 12a and 12b at the contact position. If it is determined in S78 that no push has been detected (NO), the process returns to S71. If it is determined in S78 that a push has been detected (YES), it is determined that the execution function corresponding to the input button 1 has been selected by the user, and the selection of the input button 1 is displayed in the display area of the HUD 30 in S79. Specifically, for example, the execution function displayed in S77 is highlighted.
  • If it is determined in S76 that the contact position is not the position of the input button 1 (NO), it is determined in S81 whether the contact position is the position of the input button 2. If the contact position is the position of the input button 2 in S81 (YES), the processes of S82 to S85 are performed in the same manner as the processes of S77 to S80. However, in S82, the execution function corresponding to the input button 2 is displayed in the display area of the HUD 30, and in S85, the selection of the input button 2 is displayed in the display area of the HUD 30.
  • In S86, it is determined whether the contact position is the position of the input button 3. If so (YES), the processes of S87 to S90 are performed in the same manner as the processes of S77 to S80. However, in S87, the execution function corresponding to the input button 3 is displayed in the display area of the HUD 30, and in S89, the selection of the input button 3 is displayed in the display area of the HUD 30. If the contact position is not the position of the input button 3 in S86 (NO), the process returns to S71.
  • In S91, it is determined whether or not an execution function is assigned to an area of the input unit 12a other than the input buttons, that is, other than the operation units 11a. If no execution function is assigned (NO), the process returns to S71. If an execution function is assigned (YES), the processes of S92 to S95 are performed as in S77 to S80. However, in S92, the assigned execution function is displayed in the display area of the HUD 30, and in S94, the selection of the assigned execution function is displayed in the display area of the HUD 30. The process is then completed.
  • The display modes include a display mode and a non-display mode.
  • A display mode may include various specifications such as the color, shape, size, and contents of the display or its design, as well as specifications relating to temporal changes in those specifications.
  • A display mode is also referred to as a display pattern, a display format, a display manner, or a display style.
  • When the vehicle is running, the user cannot operate while looking at the operation units 11a and 11b, so it is necessary to notify the user of the state of the input operation.
  • When the vehicle is not running, the user can operate while looking at the operation units 11a and 11b, so it is not necessary to notify the user of the state of the input operation.
  • Accordingly, the contact state is displayed in the field of view of the user facing the traveling direction of the vehicle when the vehicle is traveling, and is not displayed when the vehicle is not traveling. Since the state of the input operation is displayed only when necessary, the user can be prevented from feeling uncomfortable.
  • The user's contact position on the input units 12a and 12b is displayed in the field of view of the user facing the traveling direction of the vehicle. Therefore, the user can reliably recognize which operation unit 11a or 11b of the input units 12a and 12b is being touched.
  • The execution function corresponding to the touched operation unit 11a or 11b is displayed in the field of view of the user facing the traveling direction of the vehicle. Therefore, the user can recognize the execution function corresponding to the operation unit being touched.
  • Even when the vehicle is stopped, the contact state, the contact position, and the selection state may be displayed in the display area of the HUD 30.
  • In that case, when the vehicle is stopped, the character size may be reduced or the character color made less conspicuous compared to when traveling.
  • S76 and S77, S81 and S82, S86 and S87, and S75 and S91 may be processed in the same manner as S18, S24, S30, and S35.
  • In addition, the operation state may be displayed in the display area of the HUD 30 while vibration is applied by the vibration applying units 15a and 15b, as in the first embodiment.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Human Computer Interaction (AREA)
  • Automation & Control Theory (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

A control device (20) that controls vehicle devices mounted to a vehicle, comprising: input units (12a, 12b) including operation units corresponding to execution functions; detection units (13a, 13b) that detect contact with the input units; and vibration application units (15a, 15b) that apply vibration to the input units. When contact is detected by the detection units, the control device (20) changes the vibration applied by the vibration application units in accordance with whether or not the vehicle is traveling.

Description

車載機器の制御装置、車載機器In-vehicle device control device, in-vehicle device 関連出願の相互参照Cross-reference of related applications
 本開示は、2013年4月18日に出願された日本出願番号2013-87298号に基づくもので、ここにその記載内容を援用する。 This disclosure is based on Japanese Patent Application No. 2013-87298 filed on April 18, 2013, the contents of which are incorporated herein.
 本開示は、車載機器に対してタッチ入力させる車載機器の制御装置、及びその制御装置により制御される車載機器に関する。 The present disclosure relates to a control device for an in-vehicle device that causes touch input to the in-vehicle device, and an in-vehicle device controlled by the control device.
 近年、携帯電話やPDA(Personal Digital Assistant)等の携帯端末装置では、タッチパネル等のタッチ入力式の入力装置が普及している。このようなタッチ入力式の入力装置では、パネル上のボタンを押しても操作感がないため、ボタンを見ないまま入力操作をすることは困難である。そのため、ボタン操作に伴い触感を与える入力装置が提案されている。 In recent years, touch input type input devices such as touch panels have become widespread in mobile terminal devices such as mobile phones and PDAs (Personal Digital Assistants). In such a touch input type input device, it is difficult to perform an input operation without looking at the button because there is no feeling of operation even when the button on the panel is pressed. Therefore, an input device that gives a tactile sensation in accordance with button operations has been proposed.
 例えば、特許文献1は、ユーザがボタン領域をよく見なくても入力操作を行うことができるように、ユーザがタッチパネル上のボタンに触れたことを検出して、タッチパネルに振動を付与している。具体的には、特許文献1は、タップ入力時、スライド入力時、ホールド入力時で、それぞれに異なるパターンの振動を操作パネルに付与している。さらに、スライド入力時及びホールド入力時では、ユーザがボタンを識別できるように、ボタンごとに異なるパターンの振動をタッチパネルに付与している。 For example, Patent Document 1 detects that the user has touched a button on the touch panel and applies vibration to the touch panel so that the user can perform an input operation without looking closely at the button area. . Specifically, in Patent Document 1, vibrations of different patterns are applied to the operation panel at the time of tap input, slide input, and hold input, respectively. Furthermore, at the time of slide input and hold input, different patterns of vibration are applied to the touch panel for each button so that the user can identify the buttons.
JP 2011-210283 AJP 2011-210283 A
 特許文献1は、携帯端末装置に対する入力を対象とした入力装置である。一般に、携帯端末装置は、周囲の状況に応じて入力操作のしやすさが変わることはない。それゆえ、特許文献1は、周囲の状況に関わらず同じように、ユーザに操作状態を振動で通知している。 Patent Document 1 is an input device intended for input to a mobile terminal device. In general, the ease of input operation of a mobile terminal device does not change according to the surrounding situation. Therefore, Patent Document 1 notifies the user of the operation state by vibration in the same manner regardless of surrounding conditions.
 これに対して、車両に搭載された機器に対する入力装置の場合、車両の状態に応じて入力操作のしやすさが変わる。それゆえ、車両に搭載された機器に対する入力装置の場合、車両の状態に関わらず同じように操作状態をユーザに通知すると、ユーザが違和感を覚えるおそれがある。 In contrast, in the case of an input device for a device mounted on a vehicle, the ease of input operation varies depending on the state of the vehicle. Therefore, in the case of an input device for a device mounted on a vehicle, if the user is notified of the operation state in the same manner regardless of the state of the vehicle, the user may feel uncomfortable.
 本開示は、車両の状態に応じて車載機器に対する入力操作の状態をユーザに通知することにより、ユーザが違和感を覚えることを抑制することが可能な車載機器の制御装置、及びその制御装置により制御される車載機器を提供することを主たる目的とする。 The present disclosure provides a control device for an in-vehicle device capable of suppressing the user from feeling uncomfortable by notifying the user of the state of an input operation on the in-vehicle device according to the state of the vehicle, and the control by the control device The main purpose is to provide a vehicle-mounted device.
 上記課題を解決するため、本開示の第一の例によれば、車両に搭載された車載機器の制御装置は、実行機能に対応する操作部を含む入力部と、前記入力部に対する接触を検出する検出部と、前記入力部に振動を付与する振動付与部と、を備えるように構成される。前記検出部により前記接触が検出された場合に、前記車両が走行中か否かに応じて前記振動付与部により付与させる前記振動を変化させる。 In order to solve the above problem, according to a first example of the present disclosure, a control device for an in-vehicle device mounted on a vehicle detects an input unit including an operation unit corresponding to an execution function, and contact with the input unit And a vibration applying unit that applies vibration to the input unit. When the contact is detected by the detection unit, the vibration applied by the vibration applying unit is changed according to whether or not the vehicle is running.
 According to the first example, when the user touches the input unit to operate the in-vehicle device, contact with the input unit is detected. When contact is detected, a vibration that differs according to whether the vehicle is traveling is applied to the input unit. Since the user is thus notified of the contact state with the input unit by a vibration that matches the state of the vehicle, the user can be kept from feeling a sense of incongruity.
 According to a second example of the present disclosure, a control device that controls an in-vehicle device mounted on a vehicle includes an input unit including an operation unit corresponding to an execution function and a detection unit that detects contact with the input unit. The in-vehicle device includes a display device that displays information in the traveling direction of the vehicle, and when the detection unit detects contact, the display device is made to display the contact in a mode that differs according to whether the vehicle is traveling.
 According to the second example, when the user touches the input unit to operate the in-vehicle device, contact with the input unit is detected. When contact is detected, the contact with the input unit is displayed in the field of view of the user facing the traveling direction of the vehicle, in a mode that changes according to whether the vehicle is traveling. Since the user is thus notified of the contact state with the input unit by a display mode that matches the state of the vehicle, the user can be kept from feeling a sense of incongruity.
The above and other objects, features and advantages of the present disclosure will become more apparent from the following detailed description with reference to the accompanying drawings.
FIG. 1 is a block diagram showing the configuration of the in-vehicle device; FIG. 2 is a diagram showing an input operation; FIGS. 3A and 3B are flowcharts showing a procedure for starting an execution function selected by the user in the first embodiment; FIGS. 4A and 4B are flowcharts showing a procedure for starting an execution function selected by the user in the second embodiment; and FIGS. 5A and 5B are flowcharts showing a procedure for starting an execution function selected by the user in the third embodiment.
 Hereinafter, embodiments embodying the in-vehicle device will be described with reference to the drawings. In the following embodiments, parts that are identical or equivalent to one another are denoted by the same reference numerals in the drawings, and the description of parts bearing the same reference numerals is incorporated by reference.
(First embodiment)
First, the configuration of the in-vehicle device according to each embodiment will be described with reference to FIG. 1. The in-vehicle device is mounted on a vehicle and includes touch input devices, namely a touch panel 10a and a touch switch 10b, a control device 20 (the control device for the in-vehicle device), a HUD (Head-Up Display) 30, and a line-of-sight detection sensor 40. The vehicle equipped with the in-vehicle device is also referred to as the host vehicle or the subject vehicle.
 The touch panel 10a is an input device for car navigation and the like, and includes an input unit 12a, a detection unit 13a, and a vibration applying unit 15a. The input unit 12a includes a display screen formed of a liquid crystal panel or an organic EL (Electro-Luminescence) panel and a transparent touch panel arranged on the display screen, and accepts touch input from the user's finger or the like. Various kinds of information are displayed on the display screen, and the transparent touch panel transmits the displayed information. The input unit 12a includes a plurality of operation units 11a (input buttons) that are displayed on the display screen and to which execution functions, such as car navigation functions, are individually assigned. Besides the operation units 11a, the input unit 12a also has regions to which execution functions such as screen enlargement are assigned. The number and positions of the operation units 11a change according to the information displayed on the display screen.
 The detection unit 13a is a capacitive sensor mounted between the liquid crystal panel and the touch panel of the input unit 12a, and detects contact with the input unit 12a, the contact position, and the area of the contacting part. When the user touches the surface of the input unit 12a, the capacitance at the contact position changes, and a detection signal corresponding to the contact position and the contact area is transmitted to the control device 20. Instead of a capacitive sensor, the detection unit 13a may be a pressure-sensitive sensor, an infrared sensor, or another type of sensor.
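 As an illustrative, non-limiting sketch of how such a detection signal might be represented and interpreted, the following Python fragment models contact detection from a capacitance change; the names `DetectionSignal` and `contact_detected` and the threshold value are assumptions for illustration and are not part of the disclosure:

```python
from dataclasses import dataclass

@dataclass
class DetectionSignal:
    """Hypothetical detection signal sent to the control device 20:
    contact position and area of the contacting part."""
    x: float     # contact position (horizontal)
    y: float     # contact position (vertical)
    area: float  # area of the contacting finger pad

def contact_detected(capacitance_delta: float, threshold: float = 0.5) -> bool:
    # A capacitive sensor registers contact when the capacitance change
    # at a position exceeds a noise threshold (threshold is illustrative).
    return capacitance_delta > threshold
```

In a real device the threshold would be tuned to the sensor's noise floor; here it merely shows the decision the detection unit makes before signaling the control device.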
 The vibration applying unit 15a is an actuator, such as a piezoelectric element, an eccentric motor, or a voice coil, that is driven according to a drive signal from the control device 20 and applies vibration to the input unit 12a.
 The touch switch 10b is an input device for an air conditioner and the like, and includes an input unit 12b, a detection unit 13b, and a vibration applying unit 15b. The input unit 12b is formed as a rectangular surface including a plurality of operation units 11b (input buttons) to which execution functions, such as air-conditioner functions, are individually assigned. No execution function is assigned to the portions of the input unit 12b other than the operation units 11b. The detection unit 13b is a capacitive sensor mounted on the rear surface of the input unit 12b, and detects contact with an operation unit 11b included in the input unit 12b, the contact position, and the area of the contacting part. When the user touches the surface of an operation unit 11b, the capacitance at the contact position changes, and a detection signal corresponding to the contact position and the contact area is transmitted to the control device 20. Like the vibration applying unit 15a of the touch panel 10a, the vibration applying unit 15b is an actuator that is driven according to a drive signal from the control device 20 and applies vibration to the input unit 12b.
 As shown in FIG. 2, the touch panel 10a and the touch switch 10b are installed at the center of the instrument panel, between the driver's seat and the passenger seat.
 As shown in FIG. 2, the HUD 30 (display device) is a display device that displays information in the traveling direction of the vehicle. The HUD 30 is embedded deep in the dashboard; an image displayed on a liquid crystal panel is reflected by a mirror, magnified by reflection off a concave mirror, and then reflected onto the windshield. The user can thereby see the image appearing to float in front of the windshield. The displayed information includes, specifically, the vehicle speed, route guidance, and the contact position on the input unit 12a or 12b.
 As shown in FIG. 2, the line-of-sight detection sensor 40 (line-of-sight detection device) is a camera that is installed at a position directly facing the eyes of the user (driver) looking in the traveling direction, for example on the steering upper column or the meter panel, and that photographs the user's eyes while they are illuminated with invisible near-infrared light. The line-of-sight detection sensor 40 transmits the captured images of the user's eyes to the control device 20.
 The control device 20 includes a control circuit 21, an operation input unit 22, a vibration output unit 23, a recording unit 24, a HUD display output unit 25, a line-of-sight recognition sensor input unit 26, a vehicle sensor input unit 27, and a power supply unit 28.
 The control circuit 21 is configured as an ordinary computer and includes a CPU, RAM, ROM, and the like. The recording unit 24 may be included in the ROM or RAM. The control circuit 21 detects the user's line of sight based on the eye images received from the line-of-sight detection sensor 40 via the line-of-sight recognition sensor input unit 26. Specifically, from a received eye image, the control circuit 21 detects the pupil, whose position changes with the gaze direction, and the corneal reflection, which is not affected by the gaze direction, and determines the line of sight from the positional relationship between the pupil and the corneal reflection.
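 A minimal sketch of the pupil/corneal-reflection principle just described, assuming 2D image coordinates for the pupil center and the corneal reflection (glint); the function names and the region representation are illustrative assumptions, not taken from the disclosure:

```python
def gaze_offset(pupil_xy, glint_xy):
    # The corneal reflection stays roughly fixed as the eye rotates,
    # while the pupil moves with the gaze, so the pupil-minus-glint
    # vector tracks the gaze direction.
    return (pupil_xy[0] - glint_xy[0], pupil_xy[1] - glint_xy[1])

def gaze_is_toward(offset_xy, region):
    # region = (xmin, xmax, ymin, ymax) in calibrated offset coordinates,
    # e.g. the range of offsets corresponding to the input unit 12a or 12b.
    xmin, xmax, ymin, ymax = region
    return xmin <= offset_xy[0] <= xmax and ymin <= offset_xy[1] <= ymax
```

A deployed system would add per-user calibration to map the offset vector to screen regions; the sketch shows only the geometric core of the determination.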
 Based on the detection signals transmitted from the detection units 13a and 13b via the operation input unit 22 and on the detected line of sight of the user, the control circuit 21 determines whether an execution function corresponding to an operation unit 11a or 11b has been selected. When the control circuit 21 determines that an execution function corresponding to an operation unit 11a or 11b has been selected, it transmits a command signal for realizing the selected execution function to the target in-vehicle device.
 When the control circuit 21 receives a detection signal from the detection unit 13a or 13b, that is, when contact is detected by the detection unit 13a or 13b, it causes the vibration applying unit 15a or 15b to apply vibration to the input unit 12a or 12b. Furthermore, the control circuit 21 changes the vibration applied by the vibration applying units 15a and 15b according to whether the vehicle is traveling. Specifically, the control circuit 21 causes the vibration applying units 15a and 15b to apply vibration when the vehicle is traveling, and does not cause them to apply vibration when the vehicle is not traveling. The control circuit 21 can thus also be said to function as a vehicle traveling state detection unit.
 The control circuit 21 executes various programs stored in the recording unit 24 and stores various received data in the recording unit 24. The control circuit 21 also displays information in the display area of the HUD 30 via the HUD display output unit 25. Furthermore, the control circuit 21 receives detection values from vehicle sensors (not shown), such as a vehicle speed sensor and a brake sensor, via the vehicle sensor input unit 27, and determines whether the vehicle is traveling. The control circuit 21 is supplied with power from the power supply unit 28.
 Next, a procedure for starting an execution function selected by the user will be described with reference to the flowcharts of FIGS. 3A and 3B. The control circuit 21 repeatedly executes this procedure at predetermined intervals.
 First, in S11, it is determined whether there is contact with the input unit 12a or 12b. When no detection signal has been received from the detection units 13a and 13b via the operation input unit 22, it is determined that there is no contact (NO), and the process proceeds to S12. In S12, if the vibration applying units 15a and 15b are applying vibration to the input units 12a and 12b, the vibration is stopped, and the process returns to S11.
 On the other hand, when a detection signal has been received from the detection unit 13a or 13b via the operation input unit 22 in S11, it is determined that there is contact (YES), and the process proceeds to S13. In S13, it is determined whether the vehicle is traveling. When the vehicle is traveling (YES), the vibration pattern to be applied by the vibration applying units 15a and 15b is set to the traveling-time pattern in S14. When the vehicle is not traveling (NO), the vibration pattern to be applied by the vibration applying units 15a and 15b is set to the stopped-time pattern in S15. The traveling-time patterns applied by the vibration applying unit 15a are set in advance so that each input button and the region other than the input buttons have different vibration patterns, and so that the pattern at the time of contact differs from the pattern at the time of execution-function selection. The traveling-time patterns applied by the vibration applying unit 15b are set in advance so that each input button has a different vibration pattern, and so that the pattern at the time of contact differs from the pattern at the time of execution-function selection. In the stopped-time pattern, no vibration is applied by the vibration applying units 15a and 15b.
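 The pattern assignment described above can be sketched as a simple lookup table; the pattern identifiers and function name below are illustrative placeholders, not part of the disclosure:

```python
# Traveling-time patterns: one per input button (plus the non-button
# region of the touch panel 10a), each with distinct "contact" and
# "selection" variants. Identifiers are placeholders.
TRAVELING_PATTERNS = {
    ("button1", "contact"): "b1_contact", ("button1", "selection"): "b1_select",
    ("button2", "contact"): "b2_contact", ("button2", "selection"): "b2_select",
    ("button3", "contact"): "b3_contact", ("button3", "selection"): "b3_select",
    ("other",   "contact"): "oth_contact", ("other",  "selection"): "oth_select",
}

def select_vibration_pattern(vehicle_traveling, target, event):
    # Implements the S13-S15 branch: traveling-time pattern table when
    # the vehicle is moving, no vibration (stopped-time pattern) otherwise.
    if not vehicle_traveling:
        return None
    return TRAVELING_PATTERNS[(target, event)]
```

Keeping the patterns in a table keeps the per-button and contact-versus-selection distinctions in one place, mirroring how the flowchart assigns a distinct vibration to each case.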
 Subsequently, in S16, based on the detection signal received from the detection unit 13a, it is determined whether the contact position on the input unit 12a is the position of an input button. When the contact position is the position of an input button (YES), the process proceeds to S17. When the contact position is not the position of an input button (NO), the process proceeds to S35. Note that on the input unit 12b of the touch switch 10b, no contact is detected at positions other than the input buttons; therefore, when a detection signal is received from the detection unit 13b, the process proceeds to S17 without performing S16.
 In S17, it is determined whether the contact position is the position of input button 1. Here, the number of input buttons is assumed to be three. When the contact position is the position of input button 1 in S17 (YES), the processing of S18 to S22 is performed.
 First, in S18, it is determined whether the detected line of sight of the user is directed somewhere other than the input units 12a and 12b. When the user's line of sight is directed away from the input units 12a and 12b (YES), that is, when the user is performing the input operation without looking at the input units 12a and 12b, the vibration applying unit 15a or 15b is caused to apply the contact-time vibration for input button 1 in S19. This notifies the user, who is not looking at the input units 12a and 12b, that input button 1 is being touched. On the other hand, when the user's line of sight is directed at the input unit 12a or 12b in S18 (NO), no vibration is applied by the vibration applying units 15a and 15b, and the process proceeds to S20.
 Subsequently, in S20, it is determined whether the detection unit 13a or 13b has detected, at the contact position, a push stronger than mere contact with the input unit 12a or 12b. Specifically, based on the detection signals received from the detection units 13a and 13b, it is determined that a push has been detected when the contact area on the operation unit 11a or 11b has grown larger than it was when the contact was first detected. When the detection units 13a and 13b are pressure-sensitive sensors, it is determined that a push has been detected when the detected pressure is greater than a reference value.
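 The push determination of S20 can be sketched as follows; the 20% area-growth factor and the pressure reference value are illustrative assumptions, since the disclosure specifies only "larger area" and "greater than a reference value":

```python
def push_detected(area_at_contact, area_now, pressure=None, pressure_ref=1.0):
    """Returns True when a push stronger than mere contact is detected.

    Capacitive case: the finger pad flattens when pressed harder, so a
    contact area noticeably larger than at first contact indicates a push.
    Pressure-sensitive case: compare the detected pressure to a reference.
    """
    if pressure is not None:
        return pressure > pressure_ref
    return area_now > area_at_contact * 1.2  # 20% growth: assumed threshold
```

The growth factor guards against treating normal finger-pad jitter as a push; its actual value would be tuned experimentally.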
 When it is determined in S20 that no push has been detected (NO), the process returns to S11. When it is determined in S20 that a push has been detected (YES), it is determined that the execution function corresponding to input button 1 has been selected by the user, and the vibration applying unit 15a or 15b is caused to apply the selection-time vibration for input button 1 in S21. Subsequently, in S22, a command signal for realizing the execution function corresponding to input button 1 is transmitted to the target in-vehicle device, and the execution function corresponding to input button 1 is started.
 When the contact position is not the position of input button 1 in S17 (NO), it is determined in S23 whether the contact position is the position of input button 2. When the contact position is the position of input button 2 in S23 (YES), the processing of S24 to S28 is performed in the same manner as S18 to S22, except that the contact-time vibration for input button 2 is applied in S25 and the selection-time vibration for input button 2 is applied in S28.
 When the contact position is not the position of input button 2 in S23 (NO), it is determined in S29 whether the contact position is the position of input button 3. When the contact position is the position of input button 3 in S29 (YES), the processing of S30 to S34 is performed in the same manner as S18 to S22, except that the contact-time vibration for input button 3 is applied in S31 and the selection-time vibration for input button 3 is applied in S33. When the contact position is not the position of input button 3 in S29 (NO), the process returns to S11.
 When the contact position is not the position of an input button in S16 (NO), it is determined in S35 whether the detected line of sight of the user is directed somewhere other than the input unit 12a. When the user's line of sight is directed away from the input unit 12a (YES), that is, when the user is performing the input operation without looking at the input unit 12a, the vibration applying unit 15a is caused to apply the contact-time vibration for the region other than the input buttons in S36. This notifies the user, who is not looking at the input unit 12a, that a region other than an input button is being touched. On the other hand, when the user's line of sight is directed at the input unit 12a in S35 (NO), no vibration is applied by the vibration applying unit 15a, and the process proceeds to S37.
 In S37, it is determined whether an execution function, such as enlargement or reduction, is assigned to the region of the input unit 12a other than the input buttons, that is, other than the operation units 11a. When no execution function is assigned (NO), the process returns to S11. When an execution function is assigned (YES), the processing of S38 to S40 is performed in the same manner as S20 to S22, except that the selection-time vibration for the region other than the button positions is applied in S39. This concludes the present processing.
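 Condensed into code, the branch structure of S16 to S40 might look like the following sketch. All inputs are plain flags here (a real implementation would read them from the detection units and the gaze detection), and the function and action names are illustrative assumptions:

```python
def handle_contact(on_button, button_id, gaze_away, pushed,
                   other_region_has_function=False):
    # Returns the list of actions taken for one pass of the flowchart:
    # a contact-time vibration only when the user is not looking at the
    # input unit, and a selection-time vibration plus function start
    # only when a push is detected.
    actions = []
    target = button_id if on_button else "other"
    if gaze_away:
        actions.append(("contact_vibration", target))
    if not on_button and not other_region_has_function:
        return actions  # no function assigned outside the buttons (S37: NO)
    if pushed:
        actions.append(("selection_vibration", target))
        actions.append(("start_function", target))
    return actions
```

The sketch makes explicit that the gaze check gates only the contact-time vibration, while the push check gates function selection.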
 The first embodiment described above provides the following effects.
 ・When contact with the input unit 12a or 12b is detected, a vibration that differs according to whether the vehicle is traveling is applied to the input unit 12a or 12b. Since the user is thus notified of the contact state with the input units 12a and 12b by a vibration that matches the state of the vehicle, the user can be kept from feeling a sense of incongruity.
 ・When the vehicle is traveling, the user cannot look at the operation units 11a and 11b while operating them, so the state of the input operation needs to be communicated to the user. When the vehicle is not traveling, the user can look at the operation units 11a and 11b while operating them, so no such notification is needed. By applying vibration when the vehicle is traveling and not applying it when the vehicle is stopped, the state of the input operation is communicated to the user only when necessary, which keeps the user from feeling a sense of incongruity.
 ・When the user is touching the operation unit 11a or 11b while looking at the input unit 12a or 12b, the user is touching the operation unit intentionally rather than inadvertently, so the contact state need not be communicated to the user.
 ・By applying a different vibration pattern for each contact position, the user can recognize which operation unit 11a or 11b is being touched without looking at the input units 12a and 12b.
 ・When the user pushes the operation unit 11a or 11b harder than mere contact, it can be determined that the user is selecting the execution function corresponding to that operation unit. Furthermore, when it is determined that an execution function is being selected, a vibration different from the one applied when contact is detected is applied, so the user can recognize whether the input operation is in the contact state or the selection state.
(Modification of the first embodiment)
The first embodiment may be implemented with the following modifications.
 ・When it is determined in S18, S24, S30, or S35 that the user's line of sight is directed at the input unit 12a or 12b, the processing of S20, S26, S32, and S38 may be omitted. That is, without determining whether a push has been detected, it may be determined that the execution function corresponding to the input button at the contact position has been selected by the user. When the user is touching the operation unit 11a or 11b while looking at the input unit 12a or 12b, it can be determined that the execution function corresponding to the operation unit at the contact position has been selected by the user without requiring an additional operation.
 ・When it is determined in S18, S24, S30, or S35 that the user's line of sight is directed at the input unit 12a or 12b, the processing of S21, S27, S33, and S39 may be omitted. That is, the selection-time vibration need not be applied by the vibration applying units 15a and 15b.
 ・In S20, S26, S32, and S38, it may be determined that the execution function corresponding to the contact position has been selected by the user not only when a push is detected but also when an operation different from mere contact is detected at the contact position on the input unit 12a or 12b. Then, in S21, S27, S33, and S39, the vibration applying units 15a and 15b apply the selection-time vibration corresponding to the input button. In this way as well, the user can recognize whether the input operation is in the contact state or the selection state. An example of an operation different from mere contact is remaining in contact at the contact position for longer than a predetermined time.
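 One way to realize the long-contact example above is a dwell timer. In this sketch, the class name, the dwell time, and the injectable clock are assumptions for illustration; continuous contact longer than the threshold is treated as a selection:

```python
import time

class DwellSelector:
    def __init__(self, dwell_s=0.8, clock=time.monotonic):
        self.dwell_s = dwell_s   # assumed "predetermined time"
        self.clock = clock       # injectable for testing
        self._t0 = None          # time when contact began

    def update(self, in_contact):
        # Call periodically with the current contact state; returns True
        # once contact has persisted for at least dwell_s seconds.
        if not in_contact:
            self._t0 = None
            return False
        if self._t0 is None:
            self._t0 = self.clock()
        return self.clock() - self._t0 >= self.dwell_s
```

Making the clock injectable lets the dwell logic be exercised deterministically, without waiting in real time.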
 ・The stopped-time vibration pattern may be a vibration with a smaller amplitude or a lower frequency than the traveling-time pattern. For example, when contact is detected by the detection unit 13a or 13b, the amplitude of the vibration applied by the vibration applying units 15a and 15b may be made larger while the vehicle is traveling than while the vehicle is stopped.
 ・In addition to applying vibration with the vibration applying units 15a and 15b, a notification sound with a frequency and amplitude corresponding to the traveling-time and stopped-time vibration patterns may be output.
(Second Embodiment)
Next, the second embodiment will be described with respect to the points that differ from the first embodiment. In the second embodiment, part of the procedure for realizing the execution function selected by the user differs from the first embodiment. A procedure for executing the execution function selected by the user according to the second embodiment will be described with reference to the flowcharts of FIGS. 4A and 4B. In the second embodiment, the contact position is displayed in the display area of the HUD 30.
 First, in S41, it is determined whether there is contact, as in S11. When it is determined that there is contact in S41 (YES), the contact position on the input unit 12a or 12b is displayed in the display area of the HUD 30 in S43. When it is determined that there is no contact in S41 (NO), the vibration is stopped in S42 in the same manner as in S12; then, in S44, the display of the contact position in the display area of the HUD 30 is stopped, and the process returns to S41.
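 Displaying the contact position in the HUD (S43) implies mapping coordinates on the input unit to the HUD display area. A minimal proportional-scaling sketch, with all geometry and the function name assumed for illustration:

```python
def to_hud_coords(touch_xy, touch_size, hud_region):
    # touch_xy:   (x, y) contact position on the input unit
    # touch_size: (width, height) of the input unit surface
    # hud_region: (x, y, width, height) of the HUD display area
    (tx, ty), (tw, th) = touch_xy, touch_size
    hx, hy, hw, hh = hud_region
    return (hx + tx / tw * hw, hy + ty / th * hh)
```

Proportional scaling preserves the relative position of the finger, so a marker in the HUD moves as the finger moves across the input unit.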
 Subsequently, in S45 to S49, the same processing as in S13 to S17 is performed. If the contact position is the position of input button 1 in S49 (YES), the same processing as in S19 to S22 is performed in S50 to S53. That is, in the second embodiment, the determination of the user's line-of-sight direction (S18) is not performed.
 Similarly, when the contact position is input button 2, input button 3, or an area other than the input button positions, the user's line-of-sight direction is likewise not determined; the same processing as in S25 to S28, S31 to S34, and S36 to S40 is performed in S55 to S58, S60 to S63, and S64 to S68, respectively.
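The second-embodiment routing described above (test the buttons in turn, otherwise fall through to the non-button area, with no line-of-sight check) can be sketched as follows. This is an illustrative sketch only; the handler keys are hypothetical.

```python
# Illustrative sketch of the second-embodiment dispatch: each handler
# runs directly, without the gaze-direction test of the first embodiment.

def dispatch(contact_position, handlers):
    """Route a contact position to its handler (S49/S54/S59, else S64)."""
    for button in ("button1", "button2", "button3"):
        if contact_position == button:
            return handlers[button]()
    return handlers["area"]()          # non-button area processing
```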
 The second embodiment described above provides the following effects.
 ・ The user's contact position on the input units 12a and 12b is displayed in the field of view of the user facing the traveling direction of the vehicle. The user can therefore reliably recognize which operation unit 11a, 11b of the input units 12a, 12b is being touched.
 (Modification of the Second Embodiment)
 The second embodiment may be modified as follows.
 ・ Processing similar to S18, S24, S30, and S35 may be performed between S49 and S50, between S54 and S55, between S59 and S60, and between S48 and S64, respectively.
 (Third Embodiment)
 Next, the third embodiment will be described, focusing on the points that differ from the first embodiment. In the third embodiment, part of the processing procedure that executes the execution function selected by the user differs from the first embodiment. This procedure will be described with reference to the flowcharts of FIGS. 5A and 5B. In the third embodiment, the state of the input operation is displayed in the display area of the HUD 30 depending on whether or not the vehicle is traveling.
 First, in S71, as in S11, it is determined whether or not there is a contact. If it is determined in S71 that there is a contact (YES), it is determined in S72 whether the vehicle is traveling. If it is determined in S72 that the vehicle is traveling (YES), the contact state and the contact position on the input units 12a and 12b are displayed in the display area of the HUD 30 in S73, and the process proceeds to S75. If it is determined in S71 that there is no contact (NO), or in S72 that the vehicle is not traveling (NO), the display of the contact state and contact position in the display area of the HUD 30 is stopped in S74, and the process returns to S71.
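The display condition in S71 to S74 reduces to a single predicate: the HUD shows the contact state and position only when there is a contact and the vehicle is traveling. An illustrative sketch, with a hypothetical function name:

```python
# Illustrative sketch of S71-S74: display on the HUD only when both
# conditions hold; otherwise the display is cleared (S74).

def hud_should_display(contact, traveling):
    """True -> show contact state/position (S73); False -> clear it (S74)."""
    return bool(contact and traveling)
```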
 Next, in S75, as in S16, it is determined whether or not the contact position is the position of an input button. If the contact position is the position of an input button in S75 (YES), it is determined in S76, as in S17, whether the contact position is the position of input button 1. If the contact position is the position of input button 1 in S76 (YES), the execution function corresponding to input button 1 is displayed in the display area of the HUD 30 in S77.
 Subsequently, in S78, as in S20, it is determined whether the detection units 13a and 13b have detected, at the contact position, a push stronger than the contact with the input units 12a and 12b. If it is determined in S78 that no push has been detected (NO), the process returns to S71. If it is determined in S78 that a push has been detected (YES), it is determined that the execution function corresponding to input button 1 has been selected by the user, and the selection of input button 1 is displayed in the display area of the HUD 30 in S79; specifically, for example, the execution function displayed in S77 is highlighted.
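The S78 distinction between a mere contact and the stronger push-in that selects the function can be sketched as a threshold test. This is an illustrative sketch only; the pressure scale and threshold values are hypothetical.

```python
# Illustrative sketch of the S78 test: a push stronger than a mere
# contact selects the function. Thresholds are hypothetical.

def classify(pressure, contact_level=0.1, push_level=0.6):
    """'push' selects the function (S79/S80); 'contact' only previews it (S77)."""
    if pressure >= push_level:
        return "push"
    if pressure >= contact_level:
        return "contact"
    return "none"
```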
 Subsequently, in S80, as in S22, a command signal for realizing the execution function corresponding to input button 1 is transmitted to the target in-vehicle device, and the execution function corresponding to input button 1 is started.
 If the contact position is not the position of input button 1 in S76 (NO), it is determined in S81 whether the contact position is the position of input button 2. If the contact position is the position of input button 2 in S81 (YES), the processing of S82 to S85 is performed in the same manner as S77 to S80, except that the execution function corresponding to input button 2 is displayed in the display area of the HUD 30 in S82, and the selection of input button 2 is displayed in the display area of the HUD 30 in S85.
 If the contact position is not the position of input button 2 in S81 (NO), it is determined in S86 whether the contact position is the position of input button 3. If the contact position is the position of input button 3 in S86 (YES), the processing of S87 to S90 is performed in the same manner as S77 to S80, except that the execution function corresponding to input button 3 is displayed in the display area of the HUD 30 in S87, and the selection of input button 3 is displayed in the display area of the HUD 30 in S89. If the contact position is not the position of input button 3 in S86 (NO), the process returns to S71.
 If the contact position is not the position of an input button in S75 (NO), it is determined in S91 whether an execution function is assigned to an area of the input unit 12a other than the input buttons, that is, other than the operation unit 11a. If no execution function is assigned (NO), the process returns to S71. If an execution function is assigned (YES), the processing of S92 to S95 is performed in the same manner as S77 to S80, except that the assigned execution function is displayed in the display area of the HUD 30 in S92, and the selection of the assigned execution function is displayed in the display area of the HUD 30 in S94. This completes the present processing.
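The overall mapping from contact position to execution function in S75 and S91 can be sketched as a lookup with a fallback. This is an illustrative sketch only; the mapping keys and function labels are hypothetical.

```python
# Illustrative sketch of S75/S91: map the contact position to an input
# button's function, falling back to a function assigned to the
# non-button area, if any.

def resolve_function(position, button_functions, area_function=None):
    """Return the function to preview and execute, or None (return to S71)."""
    if position in button_functions:   # S75: contact is on an input button
        return button_functions[position]
    return area_function               # S91: non-button assignment, may be None
```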
 The third embodiment described above provides the following effects.
 ・ When contact with the input units 12a and 12b is detected, that contact is displayed in the field of view of the user facing the traveling direction of the vehicle, depending on whether or not the vehicle is traveling. Since the contact state with respect to the input units 12a and 12b is notified to the user in a display mode corresponding to the state of the vehicle, the user can be prevented from feeling uncomfortable. Here, the display modes include a mode in which the contact is displayed and a mode in which it is not. A display mode may further include various specifications of the display or its design, such as color, shape, size, and content, as well as specifications concerning temporal changes of these. A display mode may also be referred to as a display pattern, display format, display manner, or display aspect.
 ・ While the vehicle is traveling, the user cannot look at the operation units 11a and 11b while operating them, so the state of the input operation needs to be notified to the user. While the vehicle is not traveling, the user can look at the operation units 11a and 11b while operating them, so no such notification is needed. By displaying the contact state in the field of view of the user facing the traveling direction of the vehicle while the vehicle is traveling, and not displaying it while the vehicle is not traveling, the state of the input operation is displayed only when necessary, which prevents the user from feeling uncomfortable.
 ・ The user's contact position on the input units 12a and 12b is displayed in the field of view of the user facing the traveling direction of the vehicle. The user can therefore reliably recognize which operation unit 11a, 11b of the input units 12a, 12b is being touched.
 ・ The execution function corresponding to the operation unit 11a, 11b being touched is displayed in the field of view of the user facing the traveling direction of the vehicle. The user can therefore recognize the execution function corresponding to the touched operation unit 11a, 11b.
 ・ The fact that the execution function corresponding to the operation unit 11a, 11b at the contact position has been selected is displayed in the field of view of the user facing the traveling direction of the vehicle. The user can therefore recognize whether the input operation is in the contact state or in the function-selection state.
 (Modification of the Third Embodiment)
 ・ The contact state, the contact position, and the selection state may be displayed in the display area of the HUD 30 even when the vehicle is not traveling. In this case, the characters are made smaller, or their color less conspicuous, than when the vehicle is traveling.
 ・ Processing similar to S18, S24, S30, and S35 may be performed between S76 and S77, between S81 and S82, between S86 and S87, and between S75 and S91, respectively.
 ・ In addition to displaying the operation state in the display area of the HUD 30, vibration may be applied by the vibration applying units 15a and 15b as in the first embodiment.
 While the present disclosure has been described with reference to embodiments, it is to be understood that the disclosure is not limited to those embodiments or structures. The present disclosure encompasses various modifications and variations within an equivalent range. In addition, various combinations and forms, as well as other combinations and forms including only one element, more elements, or fewer elements, also fall within the scope and spirit of the present disclosure.

Claims (16)

  1.  A control device (20) for controlling an in-vehicle device mounted on a vehicle, the in-vehicle device comprising an input unit (12a, 12b) including an operation unit (11a, 11b) corresponding to an execution function, a detection unit (13a, 13b) that detects contact with the input unit, and a vibration applying unit (15a, 15b) that applies vibration to the input unit, wherein
     when the contact is detected by the detection unit, the control device changes the vibration applied by the vibration applying unit depending on whether or not the vehicle is traveling.
  2.  The control device for an in-vehicle device according to claim 1, wherein the vibration is applied by the vibration applying unit if the vehicle is traveling, and is not applied by the vibration applying unit if the vehicle is not traveling.
  3.  The control device for an in-vehicle device according to claim 1 or 2, wherein
     the in-vehicle device includes a display device (30) that displays information in the traveling direction of the vehicle,
     the detection unit detects a contact position on the input unit, and
     the contact position detected by the detection unit is displayed on the display device.
  4.  A control device for controlling an in-vehicle device mounted on a vehicle, the in-vehicle device comprising an input unit including an operation unit corresponding to an execution function, and a detection unit that detects contact with the input unit, wherein
     the in-vehicle device includes a display device that displays information in the traveling direction of the vehicle, and
     when the contact is detected by the detection unit, the control device causes the display device to display the contact in a mode that differs depending on whether or not the vehicle is traveling.
  5.  The control device for an in-vehicle device according to claim 4, wherein the display device is caused to display that there is a contact if the vehicle is traveling, and is not caused to display that there is a contact if the vehicle is not traveling.
  6.  The control device for an in-vehicle device according to claim 4 or 5, wherein
     the detection unit detects a contact position on the input unit, and
     the contact position detected by the detection unit is displayed on the display device.
  7.  The control device for an in-vehicle device according to any one of claims 3 to 6, wherein
     the detection unit detects a contact position on the input unit, and
     the execution function corresponding to the contact position detected by the detection unit is displayed on the display device.
  8.  The control device for an in-vehicle device according to any one of claims 3 to 7, wherein
     the detection unit detects a contact position on the input unit, and
     it is determined, based on the contact position being detected by the detection unit, that the execution function corresponding to the operation unit has been selected, and the display device is caused to display that the execution function has been selected.
  9.  The control device for an in-vehicle device according to any one of claims 1 to 8, wherein
     the in-vehicle device includes a line-of-sight detection device (40) that detects a user's line of sight,
     the detection unit detects a contact position on the input unit, and
     when the user's line of sight detected by the line-of-sight detection device is in the direction of the input unit, it is determined that the execution function corresponding to the contact position has been selected.
  10.  The control device for an in-vehicle device according to any one of claims 1 to 3, wherein
     the in-vehicle device includes a line-of-sight detection device that detects a user's line of sight,
     the detection unit detects a contact position on the operation unit, and
     when the user's line of sight detected by the line-of-sight detection device is in the direction of the input unit, the vibration is not applied by the vibration applying unit.
  11.  The control device for an in-vehicle device according to any one of claims 1 to 10, wherein
     the in-vehicle device includes a vibration applying unit that applies vibration to the input unit,
     the detection unit detects a contact position on the operation unit, and
     the vibration applying unit is caused to apply vibration in a pattern that differs depending on the contact position.
  12.  The control device for an in-vehicle device according to any one of claims 1 to 11, wherein
     the in-vehicle device includes a vibration applying unit that applies vibration to the input unit,
     the detection unit detects a contact position on the operation unit and detects, at the contact position, an operation different from the contact, and
     when the operation different from the contact is detected by the detection unit, the vibration applying unit is caused to apply a vibration different from that applied when contact with the operation unit is detected by the detection unit.
  13.  The control device for an in-vehicle device according to claim 12, wherein
     the detection unit detects a push against the input unit stronger than the contact, and
     when the push against the operation unit is detected by the detection unit at the contact position, it is determined that the execution function corresponding to the operation unit has been selected.
  14.  The control device for an in-vehicle device according to any one of claims 1 to 13, wherein
     the input unit includes a plurality of the operation units, and
     the detection unit detects a plurality of contact positions corresponding respectively to the plurality of operation units.
  15.  The control device for an in-vehicle device according to any one of claims 1 to 14, wherein, when the contact is detected by the detection unit, the amplitude of the vibration applied by the vibration applying unit is made larger while the vehicle is traveling than while the vehicle is stopped.
  16.  An in-vehicle device comprising: the control device for an in-vehicle device according to any one of claims 1 to 15; the input unit; the detection unit; and a vibration applying unit that applies vibration to the input unit.
PCT/JP2014/001953 2013-04-18 2014-04-03 Control device for vehicle devices and vehicle device WO2014171096A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201480021014.XA CN105142983A (en) 2013-04-18 2014-04-03 Control device for vehicle devices and vehicle device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013087298A JP2014211738A (en) 2013-04-18 2013-04-18 On-vehicle device controller and on-vehicle device
JP2013-087298 2013-04-18

Publications (1)

Publication Number Publication Date
WO2014171096A1

Family

ID: 51731059

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/001953 WO2014171096A1 (en) 2013-04-18 2014-04-03 Control device for vehicle devices and vehicle device

Country Status (3)

Country Link
JP (1) JP2014211738A (en)
CN (1) CN105142983A (en)
WO (1) WO2014171096A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105739680A (en) * 2014-12-29 2016-07-06 意美森公司 System and method for generating haptic effects based on eye tracking

Families Citing this family (3)

Publication number Priority date Publication date Assignee Title
JP2016150623A (en) * 2015-02-16 2016-08-22 株式会社東海理化電機製作所 Appliance control device for vehicle
JP6570799B2 (en) * 2017-07-05 2019-09-04 三菱電機株式会社 Operation unit control device and operation unit control method
US11150732B2 (en) * 2018-06-25 2021-10-19 Canon Kabushiki Kaisha Image pickup apparatus having vibration device

Citations (6)

Publication number Priority date Publication date Assignee Title
JP2005149197A (en) * 2003-11-17 2005-06-09 Sony Corp Input device, information processor, remote controller and method of controlling input device
JP2008191086A (en) * 2007-02-07 2008-08-21 Matsushita Electric Ind Co Ltd Navigation system
JP2009075118A (en) * 2008-10-20 2009-04-09 Kenwood Corp Navigation system, function setting method therefor, and program for navigation
JP2010201947A (en) * 2009-02-27 2010-09-16 Toyota Motor Corp Vehicular operating device
JP2012103852A (en) * 2010-11-09 2012-05-31 Tokai Rika Co Ltd Touch type input device
JP2012168196A (en) * 2012-05-16 2012-09-06 Denso It Laboratory Inc Information display device

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
DE112007001143T5 (en) * 2006-06-05 2009-04-23 Mitsubishi Electric Corp. Display system and method for limiting its operation
JP5052677B2 (en) * 2008-12-04 2012-10-17 三菱電機株式会社 Display input device
CN102855076B (en) * 2011-07-01 2016-06-15 上海博泰悦臻电子设备制造有限公司 The control method of touch-screen and device, mobile terminal device



Also Published As

Publication number Publication date
JP2014211738A (en) 2014-11-13
CN105142983A (en) 2015-12-09

Similar Documents

Publication Publication Date Title
US10394375B2 (en) Systems and methods for controlling multiple displays of a motor vehicle
JP6613170B2 (en) Vehicle control unit and control method thereof
EP3165994B1 (en) Information processing device
US10120567B2 (en) System, apparatus and method for vehicle command and control
US9489500B2 (en) Manipulation apparatus
CN107054089B (en) Method for displaying information in a motor vehicle and display device for a motor vehicle
EP3623930B1 (en) Operation apparatus
EP2230582A2 (en) Operation input device, control method, and program
US20110279391A1 (en) Image display device
US20100238129A1 (en) Operation input device
JP2014102660A (en) Manipulation assistance system, manipulation assistance method, and computer program
JP2007310496A (en) Touch operation input device
US10712822B2 (en) Input system for determining position on screen of display device, detection device, control device, storage medium, and method
WO2014171096A1 (en) Control device for vehicle devices and vehicle device
CN105683869A (en) Operating device that can be operated without keys
JP2018049432A (en) Display control device, display control system and display control method
US11221735B2 (en) Vehicular control unit
JP2017197015A (en) On-board information processing system
JP2014102658A (en) Operation support system, operation support method, and computer program
JP2013134717A (en) Operation input system
JP2011107900A (en) Input display device
JP5870689B2 (en) Operation input system
US11347344B2 (en) Electronic device
JP2018162023A (en) Operation device
JP2022030157A (en) Display control unit and display control method

Legal Events

WWE: WIPO information, entry into national phase. Ref document number: 201480021014.X; country of ref document: CN.
121: EP, the EPO has been informed by WIPO that EP was designated in this application. Ref document number: 14784787; country of ref document: EP; kind code of ref document: A1.
NENP: Non-entry into the national phase. Ref country code: DE.
122: EP, PCT application non-entry in European phase. Ref document number: 14784787; country of ref document: EP; kind code of ref document: A1.