WO2014171096A1 - Control device for in-vehicle devices, and in-vehicle device - Google Patents

Control device for in-vehicle devices, and in-vehicle device

Info

Publication number
WO2014171096A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
unit
contact
vibration
input
Prior art date
Application number
PCT/JP2014/001953
Other languages
English (en)
Japanese (ja)
Inventor
磯部 憲寛
Original Assignee
株式会社デンソー (Denso Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社デンソー (Denso Corporation)
Priority to CN201480021014.XA (published as CN105142983A)
Publication of WO2014171096A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/014Force feedback applied to GUI

Definitions

  • The present disclosure relates to a control device for an in-vehicle device that accepts touch input, and to an in-vehicle device controlled by the control device.
  • Touch-input devices such as touch panels have become widespread in mobile terminal devices such as mobile phones and PDAs (Personal Digital Assistants).
  • With a touch-input device, it is difficult to perform an input operation without looking at the buttons, because there is no feeling of operation even when a button on the panel is pressed. Input devices that give a tactile sensation in response to button operations have therefore been proposed.
  • Patent Document 1 detects that the user has touched a button on the touch panel and applies vibration to the touch panel so that the user can perform an input operation without looking closely at the button area. Specifically, in Patent Document 1, vibrations of different patterns are applied to the operation panel for tap input, slide input, and hold input. Furthermore, during slide input and hold input, a different vibration pattern is applied to the touch panel for each button so that the user can identify the buttons.
  • However, Patent Document 1 is an input device intended for input to a mobile terminal device.
  • The ease of operating a mobile terminal device does not change with the surrounding situation, so Patent Document 1 notifies the user of the operation state by vibration in the same manner regardless of the surroundings.
  • The main purpose of the present disclosure is to provide a control device for an in-vehicle device that can keep the user from feeling uncomfortable by notifying the user of the state of an input operation on the in-vehicle device in a manner that matches the state of the vehicle, and to provide an in-vehicle device controlled by that control device.
  • According to one aspect, a control device for an in-vehicle device mounted on a vehicle includes an input unit including an operation unit corresponding to an execution function, a detection unit that detects contact with the input unit, and a vibration applying unit that applies vibration to the input unit.
  • When contact is detected by the detection unit, the vibration applied by the vibration applying unit is changed according to whether or not the vehicle is running.
  • When the user touches the input unit to operate the in-vehicle device, the contact with the input unit is detected.
  • A vibration that differs depending on whether the vehicle is running is then applied to the input unit. Since the user is notified of the contact state by a vibration that matches the state of the vehicle, the user can be kept from feeling uncomfortable.
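  • For illustration only, the following is a minimal sketch in Python of this aspect: on contact, the vibration pattern is chosen according to the vehicle's running state. All names and values (VibrationPattern, on_contact_detected, the speed test) are assumptions for the sketch, not taken from the disclosure.

```python
# Minimal sketch: on contact, choose a vibration that depends on whether
# the vehicle is running. Names and values are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class VibrationPattern:
    frequency_hz: float
    amplitude: float  # 0.0 means no vibration is applied

RUNNING_PATTERN = VibrationPattern(frequency_hz=200.0, amplitude=1.0)
STOPPED_PATTERN = VibrationPattern(frequency_hz=0.0, amplitude=0.0)

def on_contact_detected(vehicle_speed_kmh: float) -> VibrationPattern:
    """Select the vibration applied to the input unit when contact is detected."""
    is_running = vehicle_speed_kmh > 0.0  # e.g. from a vehicle speed sensor
    return RUNNING_PATTERN if is_running else STOPPED_PATTERN

if __name__ == "__main__":
    print(on_contact_detected(40.0))  # running: vibration applied
    print(on_contact_detected(0.0))   # stopped: no vibration
```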
  • According to another aspect, the control device that controls the in-vehicle device mounted on the vehicle includes an input unit including an operation unit corresponding to an execution function, and a detection unit that detects contact with the input unit.
  • The in-vehicle device includes a display device that displays information in the traveling direction of the vehicle, and when contact is detected by the detection unit, the display device displays the contact in a mode that differs according to whether or not the vehicle is running.
  • When the user touches the input unit to operate the in-vehicle device, the contact with the input unit is detected.
  • The contact with the input unit is then displayed, in the field of view of the user facing the traveling direction of the vehicle, in a mode that changes depending on whether or not the vehicle is traveling. Since the user is notified of the contact state by a display mode corresponding to the state of the vehicle, the user can be kept from feeling uncomfortable.
  • The drawings comprise: a block diagram showing the configuration of the in-vehicle device; a figure showing the state of an input operation; two flowcharts showing the processing sequence that starts the execution function selected by the user in the first embodiment; two flowcharts showing the corresponding processing sequence in the second embodiment; and two flowcharts showing the corresponding processing sequence in the third embodiment.
  • This in-vehicle device is mounted on a vehicle and includes touch-input devices, namely a touch panel 10a and a touch switch 10b, a control device 20 (the control device for the in-vehicle device), a HUD (Head-Up Display) 30, and a gaze detection sensor 40.
  • A vehicle equipped with the in-vehicle device is also referred to as a host vehicle or a target vehicle.
  • The touch panel 10a is an input device of, for example, a car navigation system, and includes an input unit 12a, a detection unit 13a, and a vibration applying unit 15a.
  • The input unit 12a includes a display screen made up of a liquid crystal panel or an organic EL (Electro-Luminescence) panel and a transparent touch panel arranged on the display screen, and accepts touch input by a user's finger or the like. Various kinds of information are displayed on the display screen, and the transparent touch panel transmits the information displayed on the display screen.
  • The input unit 12a includes a plurality of operation units 11a (input buttons) that are displayed on the display screen and assigned execution functions such as car navigation functions. In addition to the operation units 11a, the input unit 12a includes a portion to which an execution function such as screen enlargement is assigned. The number and positions of the operation units 11a change according to the information displayed on the display screen.
  • The detection unit 13a is a capacitance sensor mounted between the liquid crystal panel of the input unit 12a and the touch panel, and detects contact with the input unit 12a, the contact position, and the area of the contact portion. When the user touches the surface of the input unit 12a, the capacitance at the contact position changes, and a detection signal corresponding to the contact position and the contact area is transmitted to the control device 20.
  • The detection unit 13a may instead be, for example, a pressure-sensitive sensor or an infrared sensor rather than a capacitance sensor.
  • The vibration applying unit 15a is an actuator, such as a piezo element, an eccentric motor, or a voice coil, that is driven according to a drive signal from the control device 20 and applies vibration to the input unit 12a.
  • The touch switch 10b is an input device of, for example, an air conditioner, and includes an input unit 12b, a detection unit 13b, and a vibration applying unit 15b.
  • The input unit 12b is formed as a rectangular surface including a plurality of operation units 11b (input buttons), each assigned an execution function such as an air conditioner function. No execution functions are assigned to the portions of the input unit 12b other than the operation units 11b.
  • The detection unit 13b is a capacitance sensor mounted on the rear surface of the input unit 12b, and detects contact with the operation units 11b included in the input unit 12b, the contact position, and the area of the contact portion.
  • The vibration applying unit 15b is an actuator that is driven according to a drive signal from the control device 20 and applies vibration to the input unit 12b.
  • The touch panel 10a and the touch switch 10b are installed in the center of the instrument panel, between the driver seat and the passenger seat, as shown in FIG.
  • The HUD 30 (display device) is a display device that displays information in the traveling direction of the vehicle, as shown in FIG.
  • The HUD 30 is embedded in the back of the dashboard; the image displayed on its liquid crystal panel is reflected by a mirror, enlarged by a concave mirror, and projected onto the windshield. The user can thus see the image as if it floated in front of the windshield.
  • Specific display information includes the vehicle speed, route guidance, the contact position on the input unit 12a or 12b, and the like.
  • The line-of-sight detection sensor 40 (line-of-sight detection device) is a camera installed at a position facing the eyes of the user (driver) facing the traveling direction, for example on the upper steering column or the meter panel, and photographs the user's eyes while they are irradiated with invisible near-infrared light. The line-of-sight detection sensor 40 transmits the captured image of the user's eyes to the control device 20.
  • The control device 20 includes a control circuit 21, an operation input unit 22, a vibration output unit 23, a recording unit 24, a HUD display output unit 25, a line-of-sight recognition sensor input unit 26, a vehicle sensor input unit 27, and a power supply unit 28.
  • The control circuit 21 is configured as an ordinary computer and includes a CPU, RAM, ROM, and the like.
  • The recording unit 24 may be included in the ROM or RAM.
  • The control circuit 21 detects the user's line of sight based on the eye image received from the line-of-sight detection sensor 40 via the line-of-sight recognition sensor input unit 26. Specifically, the control circuit 21 detects, from the received eye image, the pupil, whose position changes with the gaze direction, and the corneal reflection, which is not affected by the gaze direction, and detects the line of sight from the positional relationship between the pupil and the corneal reflection.
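  • As an illustration of the pupil/corneal-reflection idea above, here is a minimal sketch; the linear mapping, its gain, and the "looking away" threshold are assumptions, not values from the disclosure.

```python
# Sketch: the corneal reflection (glint) stays roughly fixed while the
# pupil moves with gaze, so the pupil-glint offset encodes gaze direction.
# The constant gain stands in for a per-user calibration (assumption).
from typing import Tuple

def gaze_direction(pupil_xy: Tuple[float, float],
                   glint_xy: Tuple[float, float],
                   gain: float = 30.0) -> Tuple[float, float]:
    """Return an approximate (yaw_deg, pitch_deg) from the pupil-glint vector."""
    dx = pupil_xy[0] - glint_xy[0]
    dy = pupil_xy[1] - glint_xy[1]
    return (gain * dx, gain * dy)

def looking_away_from_input(yaw_deg: float, pitch_deg: float) -> bool:
    # Hypothetical cone for "a direction other than the input units 12a, 12b".
    return abs(yaw_deg) > 10.0 or pitch_deg > -5.0

if __name__ == "__main__":
    yaw, pitch = gaze_direction((0.52, 0.40), (0.50, 0.42))
    print(yaw, pitch, looking_away_from_input(yaw, pitch))
```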
  • The control circuit 21 determines whether the execution function corresponding to an operation unit 11a, 11b has been selected, based on the detection signals transmitted from the detection units 13a and 13b via the operation input unit 22 and on the detected line of sight of the user. When it determines that the execution function corresponding to an operation unit 11a, 11b has been selected, the control circuit 21 transmits a command signal for executing that function to the in-vehicle device.
  • When contact is detected by the detection units 13a and 13b, the control circuit 21 applies vibration to the input units 12a and 12b via the vibration applying units 15a and 15b. Further, the control circuit 21 changes the vibration applied by the vibration applying units 15a and 15b depending on whether or not the vehicle is traveling. Specifically, the control circuit 21 applies vibration via the vibration applying units 15a and 15b while the vehicle is traveling, and does not apply vibration while the vehicle is not traveling. In this sense, the control circuit 21 also functions as a vehicle running-state detection unit.
  • The control circuit 21 executes various programs stored in the recording unit 24 and causes the recording unit 24 to store various received data. Further, the control circuit 21 displays information in the display area of the HUD 30 via the HUD display output unit 25. Further, the control circuit 21 receives detection values from vehicle sensors such as a vehicle speed sensor and a brake sensor (not shown) via the vehicle sensor input unit 27, and determines whether or not the vehicle is traveling. The control circuit 21 is supplied with power from the power supply unit 28.
  • This processing procedure is repeatedly executed by the control circuit 21 at predetermined intervals.
  • In S11, it is determined whether or not there is contact with the input units 12a and 12b. If no detection signal is received from the detection units 13a and 13b via the operation input unit 22, it is determined that there is no contact (NO), and the process proceeds to S12. In S12, if vibration is being applied to the input units 12a and 12b by the vibration applying units 15a and 15b, the vibration is stopped, and the process returns to S11.
  • The traveling-time vibration pattern to be applied by the vibration applying unit 15a is set in advance to a different pattern for each input button and for the area other than the input buttons, and to different patterns for the time of contact and for the time of selecting the execution function.
  • Likewise, the traveling-time vibration pattern to be applied by the vibration applying unit 15b is set in advance to a different pattern for each input button, and to different patterns for the time of contact and for the time of selecting the execution function. The stop-time pattern applies no vibration via the vibration applying units 15a and 15b.
  • In S17, it is determined whether or not the contact position is the position of the input button 1.
  • In this example, the number of input buttons is three. If the contact position is the position of the input button 1 in S17 (YES), the processes of S18 to S22 are performed.
  • In S18, it is determined whether or not the detected line of sight of the user is in a direction other than the input units 12a and 12b.
  • If the user's line of sight is in a direction other than the input units 12a and 12b (YES), the contact-time vibration for the input button 1 is applied by the vibration applying units 15a and 15b in S19. In this way, a user who is not looking at the input units 12a and 12b is notified that the input button 1 is being touched.
  • If the user's line of sight is directed at the input units 12a and 12b (NO), the vibration applying units 15a and 15b apply no vibration, and the process proceeds to S20.
  • Next, it is determined whether or not the detection units 13a and 13b have detected a push stronger than the contact on the input units 12a and 12b at the contact position. Specifically, based on the detection signals received from the detection units 13a and 13b, a push is determined to have been detected when the contact area on the operation units 11a and 11b is larger than when the contact was detected.
  • When the detection units 13a and 13b are pressure-sensitive sensors, a push is determined to have been detected when the detected pressure is greater than a reference value.
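  • A minimal sketch of this push detection follows, assuming hypothetical threshold values: a push is a contact whose area (capacitance sensor) or pressure (pressure-sensitive sensor) exceeds the contact-time measurement.

```python
# Sketch of push detection: a fingertip flattens when pressed harder, so
# compare the contact area with the area measured at first contact.
# The ratio and reference values are illustrative assumptions.

def is_push_by_area(current_area: float, area_at_contact: float,
                    ratio_threshold: float = 1.3) -> bool:
    """Capacitance-sensor case: push when the area clearly exceeds contact."""
    return current_area > ratio_threshold * area_at_contact

def is_push_by_pressure(pressure: float, reference: float = 2.0) -> bool:
    """Pressure-sensitive-sensor case: push when pressure exceeds a reference."""
    return pressure > reference

if __name__ == "__main__":
    print(is_push_by_area(current_area=90.0, area_at_contact=60.0))  # True
    print(is_push_by_pressure(1.2))                                  # False
```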
  • If the contact position is the position of the input button 3, the processes of S30 to S34 are performed in the same manner as the processes of S18 to S22. However, in S31, the contact-time vibration for the input button 3 is applied by the vibration applying units 15a and 15b, and in S33, the selection-time vibration for the input button 3 is applied by the vibration applying units 15a and 15b. If the contact position is not the position of the input button 3 in S29 (NO), the process returns to S11.
  • If the contact position is in an area other than the input button positions and the user is not looking at the input unit 12a, the vibration applying unit 15a applies the contact-time vibration for the area other than the input buttons. A user who is not looking at the input unit 12a is thus notified that an area other than an input button is being touched.
  • Otherwise, the vibration applying unit 15a does not apply vibration, and the process proceeds to S37.
  • When the user pushes an operation unit 11a, 11b harder than a mere contact, it can be determined that the user has selected the execution function corresponding to that operation unit. In addition, since a vibration different from the contact-time vibration is applied when the execution function is determined to have been selected, the user can recognize whether the state of the input operation is the contact state or the selection state.
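  • Putting these pieces together, the following is a minimal sketch of the first embodiment's loop under the assumptions above: per-button contact and selection vibrations, vibration applied only when the driver is looking away, and the execution function started on a push. The Button and process_touch names are illustrative, not from the disclosure.

```python
# Sketch of the first embodiment's flow (S11 onward): identify the touched
# button, vibrate with its contact pattern if the driver looks away, and on
# a stronger push vibrate with its selection pattern and run the function.
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Button:
    name: str
    contact_pattern: str    # per-button contact vibration
    selection_pattern: str  # per-button selection vibration
    execute: Callable[[], None]

def process_touch(button: Optional[Button], is_push: bool,
                  looking_away: bool, vibrate: Callable[[str], None]) -> None:
    if button is None:
        return  # no contact: any ongoing vibration would be stopped here
    if looking_away:
        vibrate(button.contact_pattern)    # notify contact by vibration
    if is_push:
        vibrate(button.selection_pattern)  # notify selection by vibration
        button.execute()                   # start the selected function

if __name__ == "__main__":
    btn1 = Button("button1", "contact-1", "select-1",
                  lambda: print("execution function started"))
    process_touch(btn1, is_push=True, looking_away=True, vibrate=print)
```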
  • the first embodiment may be implemented with the following modifications.
  • When an operation different from a mere contact is detected at the contact position, it may be determined that the execution function corresponding to the contact position has been selected by the user.
  • In that case as well, vibration corresponding to the input button is applied by the vibration applying units 15a and 15b, so the user can recognize whether the state of the input operation is the contact state or the selection state.
  • The operation different from a mere contact includes, for example, keeping the contact position touched for longer than a predetermined time.
  • The stop-time vibration pattern may instead be a vibration with a weaker amplitude or a lower frequency than the traveling-time pattern.
  • In other words, when the vehicle is running, the amplitude of the vibration applied by the vibration applying units 15a and 15b may be made larger than when the vehicle is stopped.
  • In addition, a notification sound with a frequency and amplitude matched to the traveling-time and stop-time vibration patterns may be output.
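  • A minimal sketch of this modification, with illustrative scale factors: rather than switching vibration off at a stop, a weaker, lower-frequency stop-time pattern is derived from the traveling-time one.

```python
# Sketch: derive a weaker stop-time vibration from the traveling-time one.
# The scale factors are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class VibrationPattern:
    frequency_hz: float
    amplitude: float

TRAVELING = VibrationPattern(frequency_hz=200.0, amplitude=1.0)

def stop_time_pattern(traveling: VibrationPattern,
                      amplitude_scale: float = 0.4,
                      frequency_scale: float = 0.5) -> VibrationPattern:
    """Weaker amplitude and lower frequency than the traveling-time pattern."""
    return VibrationPattern(traveling.frequency_hz * frequency_scale,
                            traveling.amplitude * amplitude_scale)

if __name__ == "__main__":
    print(stop_time_pattern(TRAVELING))  # VibrationPattern(100.0, 0.4)
```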
  • In S41, as in S11, it is determined whether or not there is contact. If it is determined in S41 that there is contact (YES), the contact position on the input units 12a and 12b is displayed in the display area of the HUD 30 in S43. On the other hand, if it is determined in S41 that there is no contact (NO), the vibration is stopped in S42 in the same manner as in S12, the display of the contact position in the display area of the HUD 30 is then stopped in S44, and the process returns to S41.
  • In the second embodiment, the determination of the user's line-of-sight direction is not performed; the same processes as S25 to S28, S31 to S34, and S36 to S40 are performed in S55 to S58, S60 to S63, and S64 to S68, respectively.
  • In this way, the user's contact position on the input units 12a and 12b is displayed in the field of view of the user facing the traveling direction of the vehicle, so the user can reliably recognize which operation unit 11a, 11b of the input units 12a, 12b is being touched.
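  • As an illustration of the second embodiment's display path, here is a minimal sketch that mirrors the touched position onto a HUD stand-in; the Hud class and update_hud function are assumptions, not the disclosed interface.

```python
# Sketch: show the contact position on the HUD while touched (S43),
# and clear the display when contact ends (S44). Interface is assumed.
from typing import Optional, Tuple

class Hud:
    """Stand-in for the HUD 30 driven via the HUD display output unit 25."""
    def show_contact(self, pos: Tuple[float, float], label: str) -> None:
        print(f"HUD: contact at {pos} on '{label}'")

    def clear_contact(self) -> None:
        print("HUD: contact display stopped")

def update_hud(hud: Hud, contact_pos: Optional[Tuple[float, float]],
               label: str = "input unit") -> None:
    if contact_pos is not None:
        hud.show_contact(contact_pos, label)  # contact detected
    else:
        hud.clear_contact()                   # no contact

if __name__ == "__main__":
    hud = Hud()
    update_hud(hud, (0.3, 0.7), "input button 2")
    update_hud(hud, None)
```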
  • the second embodiment may be implemented with the following modifications.
  • S49 and S50, S54 and S55, S59 and S60, and S48 and S64 may be processed in the same manner as S18, S24, S30, and S35.
  • In S71, as in S11, it is determined whether or not there is contact. If it is determined in S71 that there is contact (YES), it is determined in S72 whether the vehicle is traveling. If it is determined in S72 that the vehicle is traveling (YES), the contact state and the contact position on the input units 12a and 12b are displayed in the display area of the HUD 30 in S73, and the process proceeds to S75. On the other hand, if it is determined in S71 that there is no contact (NO), or in S72 that the vehicle is not traveling (NO), the display of the contact state and contact position in the display area of the HUD 30 is stopped in S74, and the process returns to S71.
  • In S75, it is determined whether or not the contact position is the position of an input button. If it is (YES), it is determined in S76, as in S17, whether the contact position is the position of the input button 1. If it is (YES), the execution function corresponding to the input button 1 is displayed in the display area of the HUD 30 in S77.
  • In S78, it is determined whether or not the detection units 13a and 13b have detected a push stronger than the contact at the contact position. If no push has been detected (NO), the process returns to S71. If a push has been detected (YES), it is determined that the execution function corresponding to the input button 1 has been selected by the user, and the selection of the input button 1 is displayed in the display area of the HUD 30 in S79; specifically, for example, the execution function displayed in S77 is highlighted.
  • If it is determined in S76 that the contact position is not the position of the input button 1 (NO), it is determined in S81 whether the contact position is the position of the input button 2. If it is (YES), the processes of S82 to S85 are performed in the same manner as S77 to S80; however, in S82, the execution function corresponding to the input button 2 is displayed in the display area of the HUD 30, and in S85, the selection of the input button 2 is displayed there.
  • Otherwise, it is determined in S86 whether the contact position is the position of the input button 3. If it is (YES), the processes of S87 to S90 are performed in the same manner as S77 to S80; however, in S87, the execution function corresponding to the input button 3 is displayed in the display area of the HUD 30, and in S89, the selection of the input button 3 is displayed there. If the contact position is not the position of the input button 3 (NO), the process returns to S71.
  • If the contact position is not an input button position, it is determined in S91 whether or not an execution function is assigned to the area of the input unit 12a other than the input buttons, that is, other than the operation units 11a. If no execution function is assigned (NO), the process returns to S71. If one is assigned (YES), the processes of S92 to S95 are performed in the same manner as S77 to S80; however, in S92, the assigned execution function is displayed in the display area of the HUD 30, and in S94, the selection of the assigned execution function is displayed there. The process then ends.
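  • A minimal sketch of the third embodiment's gating (S71 to S74): the HUD shows the contact state and position only while the vehicle is traveling. The function name and return convention are illustrative assumptions.

```python
# Sketch: the contact display is produced only while the vehicle travels;
# otherwise the HUD contact display is cleared (returns None here).
from typing import Optional, Tuple

def hud_contact_text(contact_pos: Optional[Tuple[float, float]],
                     vehicle_traveling: bool) -> Optional[str]:
    """Return HUD text for this cycle, or None to stop the display (S74)."""
    if contact_pos is None or not vehicle_traveling:
        return None
    return f"touching at {contact_pos}"  # contact state and position (S73)

if __name__ == "__main__":
    print(hud_contact_text((0.4, 0.6), vehicle_traveling=True))   # shown
    print(hud_contact_text((0.4, 0.6), vehicle_traveling=False))  # None
```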
  • Here, the display modes include displaying and not displaying the information.
  • The display mode may also include specifications such as the color, shape, size, and content of the display or its design, and specifications relating to temporal changes in these.
  • The display mode is also referred to as a display pattern, a display format, or a display manner.
  • While the vehicle is traveling, the user cannot operate while looking at the operation units 11a and 11b, so it is necessary to notify the user of the state of the input operation.
  • While the vehicle is not traveling, the user can operate while looking at the operation units 11a and 11b, so such notification is unnecessary.
  • Since the contact state is displayed in the field of view of the user facing the traveling direction of the vehicle only while the vehicle is traveling, and is not displayed while the vehicle is not traveling, the state of the input operation is displayed only when necessary, which keeps the user from feeling uncomfortable.
  • Moreover, the user's contact position on the input units 12a and 12b is displayed in the field of view of the user facing the traveling direction of the vehicle, so the user can reliably recognize which operation unit 11a, 11b of the input units 12a, 12b is being touched.
  • Further, the execution function corresponding to the touched operation unit 11a, 11b is displayed in the field of view of the user facing the traveling direction, so the user can recognize the execution function corresponding to the operation unit being touched.
  • As a modification, the contact state, the contact position, and the selection state may also be displayed in the display area of the HUD 30 while the vehicle is not traveling. In that case, the character size is reduced or the character color is made less conspicuous than when traveling.
  • S76 and S77, S81 and S82, S86 and S87, and S75 and S91 may be processed in the same manner as S18, S24, S30, and S35.
  • Further, as in the first embodiment, the operation state may be displayed in the display area of the HUD 30 while vibration is applied by the vibration applying units 15a and 15b.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Human Computer Interaction (AREA)
  • Automation & Control Theory (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

A control device (20) that controls in-vehicle devices mounted on a vehicle comprises: input units (12a, 12b) including operation units corresponding to execution functions; detection units (13a, 13b) that detect contact with the input units; and vibration applying units (15a, 15b) that apply vibration to the input units. When contact is detected by the detection units, the control device (20) changes the vibration applied by the vibration applying units according to whether or not the vehicle is moving.
PCT/JP2014/001953 2013-04-18 2014-04-03 Control device for in-vehicle devices, and in-vehicle device WO2014171096A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201480021014.XA CN105142983A (zh) 2013-04-18 2014-04-03 车载设备的控制装置、车载设备

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013-087298 2013-04-18
JP2013087298A JP2014211738A (ja) 2013-04-18 2013-04-18 車載機器の制御装置、車載機器

Publications (1)

Publication Number Publication Date
WO2014171096A1 true WO2014171096A1 (fr) 2014-10-23

Family

ID=51731059

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/001953 WO2014171096A1 (fr) 2013-04-18 2014-04-03 Dispositif de commande destiné à des dispositifs de véhicule et dispositif de véhicule

Country Status (3)

Country Link
JP (1) JP2014211738A (fr)
CN (1) CN105142983A (fr)
WO (1) WO2014171096A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105739680A (zh) * 2014-12-29 2016-07-06 意美森公司 Systems and methods for generating haptic effects based on eye tracking

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016150623A (ja) * 2015-02-16 2016-08-22 株式会社東海理化電機製作所 Vehicle device control apparatus
CN110832437A (zh) * 2017-07-05 2020-02-21 三菱电机株式会社 Operation unit control device and operation unit control method
US11150732B2 (en) * 2018-06-25 2021-10-19 Canon Kabushiki Kaisha Image pickup apparatus having vibration device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005149197A (ja) * 2003-11-17 2005-06-09 Sony Corp Input device, information processing device, remote control device, and control method of input device
JP2008191086A (ja) * 2007-02-07 2008-08-21 Matsushita Electric Ind Co Ltd Navigation device
JP2009075118A (ja) * 2008-10-20 2009-04-09 Kenwood Corp Navigation device, function setting method thereof, and navigation program
JP2010201947A (ja) * 2009-02-27 2010-09-16 Toyota Motor Corp Vehicle operation device
JP2012103852A (ja) * 2010-11-09 2012-05-31 Tokai Rika Co Ltd Touch-type input device
JP2012168196A (ja) * 2012-05-16 2012-09-06 Denso It Laboratory Inc Information display device

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101460919B (zh) * 2006-06-05 2012-04-18 三菱电机株式会社 Display device and restricted operation method for the device
WO2010064389A1 (fr) * 2008-12-04 2010-06-10 三菱電機株式会社 Display input device
CN102855076B (zh) * 2011-07-01 2016-06-15 上海博泰悦臻电子设备制造有限公司 Touch screen control method and device, and mobile terminal equipment



Also Published As

Publication number Publication date
CN105142983A (zh) 2015-12-09
JP2014211738A (ja) 2014-11-13

Similar Documents

Publication Publication Date Title
US10394375B2 (en) Systems and methods for controlling multiple displays of a motor vehicle
JP6613170B2 (ja) 車両用コントロールユニット及びその制御方法
EP3165994B1 (fr) Dispositif de traitement d'informations
US10120567B2 (en) System, apparatus and method for vehicle command and control
US9489500B2 (en) Manipulation apparatus
CN107054089B (zh) 在机动车中显示信息的方法和用于机动车的显示装置
EP3623930B1 (fr) Appareil de commande
EP2230582A2 (fr) Dispositif d'entrée d'opérations, procédé de commande et programme
EP2330487A1 (fr) Dispositif d'affichage d'images
US20100238129A1 (en) Operation input device
JP2007310496A (ja) タッチ操作入力装置
US10712822B2 (en) Input system for determining position on screen of display device, detection device, control device, storage medium, and method
WO2014171096A1 (fr) Dispositif de commande destiné à des dispositifs de véhicule et dispositif de véhicule
CN105683869A (zh) 能够无按键操作的操作装置
JP2018049432A (ja) 表示制御装置、表示制御システム及び表示制御方法
JP2018136616A (ja) 表示操作システム
US11221735B2 (en) Vehicular control unit
JP2017197015A (ja) 車載用情報処理システム
JP2014102658A (ja) 操作支援システム、操作支援方法及びコンピュータプログラム
JP2013134717A (ja) 操作入力システム
JP2014102657A (ja) 操作支援システム、操作支援方法及びコンピュータプログラム
JP2011107900A (ja) 入力表示装置
JP5870689B2 (ja) 操作入力システム
US11347344B2 (en) Electronic device
JP2018162023A (ja) 操作装置

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 201480021014.X

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14784787

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14784787

Country of ref document: EP

Kind code of ref document: A1