WO2016031152A1 - Input interface for vehicle - Google Patents

Input interface for vehicle

Info

Publication number
WO2016031152A1
WO2016031152A1 (PCT/JP2015/003975)
Authority
WO
WIPO (PCT)
Prior art keywords
finger
hand
control device
display control
vehicle
Prior art date
Application number
PCT/JP2015/003975
Other languages
English (en)
Japanese (ja)
Inventor
重明 西橋
豪之 藤本
Original Assignee
株式会社デンソー
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社デンソー (Denso Corporation)
Publication of WO2016031152A1

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00 - Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02 - Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • This disclosure relates to a vehicle input interface having an aerial detectable area in which an operator's hand or finger can be detected in the air.
  • Patent Document 1 describes an input interface for a vehicle having an aerial detectable area in which an operator's hand or finger can be detected in the air.
  • The present disclosure has been made in view of the above, and its object is to provide a vehicle input interface that can suppress erroneous operation without significantly impairing the convenience of aerial operation.
  • An input interface for a vehicle is an interface for operating an image displayed on a display device arranged in the vehicle interior. It comprises a touch pad having an aerial detectable area in which an operator's hand or finger can be detected in the air, and a display control device that controls the image displayed on the display device based on input to the touch pad. The touch pad includes a calculation device that estimates the degree of proximity of the operator's hand or finger, the position of the hand or finger, and the presence or absence of contact with the hand or finger. The display control device determines whether the hand or finger has performed a specific operation on the touch pad; when it determines that the specific operation has been performed, it operates the image displayed on the display device according to the movement of the hand or finger even while the hand or finger is not in contact with the touch pad.
  • Because screen control according to the movement of the operator's hand or finger detected in the air is not performed until the display control device determines that the hand or finger has performed the specific operation, erroneous operations unintended by the operator can be suppressed.
  • FIG. 1 is a diagram illustrating the arrangement of a vehicle input interface in a vehicle interior according to an embodiment of the present disclosure.
  • FIG. 2 is a perspective view explaining the touch pad in detail.
  • FIG. 3 is a cross-sectional view showing the range of the aerial detectable region of the touch pad.
  • FIG. 4 is a flowchart showing the control executed by the arithmetic device of the touch pad.
  • FIG. 5 is a flowchart illustrating the control performed by the display control device based on signals output from the arithmetic device of the touch pad.
  • FIG. 6 is a diagram showing display examples of the display device in each mode and during detection of the specific operation.
  • FIG. 1 is a diagram illustrating an arrangement of a vehicle input interface 10 in a vehicle cabin according to an embodiment of the present disclosure.
  • The vehicle input interface 10 in this embodiment is for operating an image displayed on the display device 20, and includes a vehicle touch pad 30 and a display control device 40.
  • The display device 20 is a multi-function display disposed substantially at the center of the instrument panel 50 of the vehicle.
  • The display device 20 is a general-purpose display that shows operation screens for the vehicle's air conditioner, radio, navigation device, audio system, and so on (not shown), as well as a rear camera image and an around view based on a composite image from a plurality of cameras.
  • The vehicle touch pad 30 includes an arithmetic device 80. The arithmetic device 80 reads a capacitance signal from the capacitance sensor panel 70 arranged below the design panel 60 shown in FIG. 2, estimates the degree of proximity of the operator's hand or finger, the position of the hand or finger, and the presence or absence of contact, generates a two-dimensional or three-dimensional coordinate signal, and outputs the signal to the display control device 40.
  • The display control device 40 switches among various operation screens based on the two-dimensional and three-dimensional coordinate signals from the arithmetic device 80 of the vehicle touch pad 30, and causes the display device 20 to display an image for inputting each function.
  • The display control device 40 is connected, directly or indirectly, to a sound source 90 such as a speaker arranged in the vehicle interior, and is configured so that a notification sound can be generated from the sound source by a signal from the display control device 40.
  • The vehicle touch pad 30 of this example is disposed near an armrest (not shown) between the driver seat and the passenger seat, and the design panel 60 is arranged so that the driver's palm rests on it naturally when the driver places an arm on the armrest.
  • The arithmetic device 80 and the display control device 40 are disposed inside the instrument panel 50, where they are not visible from the passenger compartment.
  • The display device 20, the vehicle touch pad 30, the display control device 40, and the speaker 90 may be connected in any manner: via an in-vehicle network communication cable, via individual cables, or by wireless communication.
  • FIG. 2 is a perspective view of the touch pad 30 in the present embodiment.
  • The design panel 60 is disposed on the surface of the housing where the capacitance sensor panel 70 is disposed.
  • On the design panel 60, an operation detectable area 100, capable of detecting the approach, proximity, contact, and position coordinates of an operating body such as the operator's hand or finger, and a decision input area 110, which receives input by contact or pressing, are color-coded so as to be easily distinguished by the driver.
  • The input to the decision input area 110 is detected using the capacitance sensor panel 70.
  • Any method for detecting the presence or absence of the decision input may be used: a switch may be provided separately, or the decision input may be recognized when the operator double-taps the surface of the design panel 60 with a finger within the operation detectable area 100.
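As a rough illustration of the double-tap option above, the decision input could be recognized by timing successive contacts. The following is a hypothetical Python sketch; the 0.4 s window and all names are assumptions, not taken from the patent:

```python
DOUBLE_TAP_WINDOW_S = 0.4  # assumed maximum interval between the two taps


class DoubleTapDetector:
    """Recognises a decision input from two taps in quick succession
    inside the operation detectable area 100 (one option in the text)."""

    def __init__(self, window_s=DOUBLE_TAP_WINDOW_S):
        self.window_s = window_s
        self.last_tap_t = None

    def on_tap(self, t):
        """Feed the timestamp of each detected contact; returns True
        when this tap completes a double tap (a decision input)."""
        is_double = (self.last_tap_t is not None
                     and t - self.last_tap_t <= self.window_s)
        # A completed double tap resets the state; otherwise remember this tap.
        self.last_tap_t = None if is_double else t
        return is_double
```

A tap slower than the window simply becomes the first tap of a potential new pair.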
  • FIG. 3 is a cross-sectional view showing a range of an aerial detectable region (RG) in which the approach, proximity, and contact of the operating body in the touch pad 30 shown in FIG. 2 can be detected.
  • The arithmetic device 80 of the touch pad 30 estimates the movement and shape of the operator's hand or finger within the aerial detectable region based on the electric charge stored in the capacitance sensor panel 70.
  • The detection means capable of detecting contact and proximity in the present disclosure is not limited to a capacitance sensor; it may be realized by combining, for example, a pressure-sensitive sensor with an imaging device such as an infrared camera.
  • From these measurements, whether the operating body is close to the design panel 60, the degree of proximity, the position of the operating body relative to the design panel 60, and the presence or absence of contact between the operating body and the design panel 60 are estimated.
  • In step S10, the electric charge stored in the capacitance sensor panel 70 is measured, and the process proceeds to step S11.
  • In step S11, the position estimated to be closest to the design panel 60 and the distance from the design panel 60 are estimated, and the process proceeds to step S12.
  • In step S12, it is determined from the distance between the operating body and the design panel 60 whether the operating body (that is, the operator's hand or finger) is in contact with the design panel 60.
  • If the determination in step S12 is affirmative, the process proceeds to step S13.
  • In step S13, a two-dimensional coordinate signal indicating the coordinate point on the design panel where the operating body contacts the design panel 60 is output to the display control device 40.
  • If the determination in step S12 is negative, the process proceeds to step S14, and a three-dimensional coordinate signal indicating the position of the operating body above the design panel 60 is output to the display control device 40.
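The S10 to S14 flow can be sketched as a single polling pass. This is a hypothetical Python illustration; the contact threshold, the sensor-reading and position-estimation callbacks, and the signal classes are assumptions, not part of the disclosure:

```python
from dataclasses import dataclass

CONTACT_THRESHOLD_MM = 1.0  # assumed distance below which contact is inferred


@dataclass
class Coord2D:       # contact: point on the design panel (step S13)
    x: float
    y: float


@dataclass
class Coord3D:       # no contact: position above the panel (step S14)
    x: float
    y: float
    z: float         # estimated distance from the design panel


def process_frame(read_charges, estimate_position, send):
    """One pass of the S10-S14 loop run by the arithmetic device 80."""
    charges = read_charges()                  # S10: measure stored charge
    x, y, dist = estimate_position(charges)   # S11: closest position + distance
    if dist <= CONTACT_THRESHOLD_MM:          # S12: contact with the panel?
        send(Coord2D(x, y))                   # S13: two-dimensional signal
    else:
        send(Coord3D(x, y, dist))             # S14: three-dimensional signal
```

In the patent, the `send` side corresponds to the output to the display control device 40, which then switches modes based on which signal type arrives.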
  • In step S20, it is determined whether a two-dimensional coordinate signal has been received from the arithmetic device 80 of the touch pad 30.
  • If the determination in step S20 is affirmative, the process proceeds to step S21.
  • In step S21, it is assumed that a touch input operation has been performed on the touch pad 30; the display content of the display device 20 is set to the individual function setting mode shown on the right side of FIG. 6, and, based on the two-dimensional coordinate signal, the cursor 130 displayed on the individual function setting screen 120 is operated to select one of the displayed icons 140.
  • If the determination in step S20 is negative, the process proceeds to step S22, where it is determined whether a three-dimensional coordinate signal has been received from the arithmetic device 80. If the determination in step S22 is affirmative, the process proceeds to step S23.
  • In step S23, it is determined whether the three-dimensional coordinate signal has been received continuously from the arithmetic device 80 for a predetermined time (for example, 1 second). If the determination in step S23 is affirmative, the process proceeds to step S24.
  • In step S24, the display content of the display device 20 is set to the function selection mode shown in the center of FIG. 6.
  • In this function selection mode, names 150 representing the respective functions are arranged left and right at the top of the screen, and an operation screen 160 corresponding to the currently selected function is displayed at the approximate center of the screen.
  • In this embodiment, the currently selected operation screen 160 is a reduced display of the individual function setting screen 120.
  • When the operating body moves within the aerial detectable region, the currently selected operation screen 160 scrolls left and right to match the movement of the operating body.
  • Among the names 150 at the top of the screen, the one corresponding to the selected function is also indicated with a highlight 170.
  • In step S25, it is determined whether at least one of the shape and the movement of the operating body corresponds to a "specific operation" stored in advance in a storage device (not shown) included in the display control device 40.
  • If the determination in step S25 is affirmative, the process proceeds to step S24 described above.
  • If the determination in step S25 is negative, the process returns to step S20.
  • The "specific operation" will now be described in more detail. For example, (1) the operator may place a hand or finger over the touch pad 30 in a specific shape, (2) the operator may move a hand or finger over the touch pad 30 along a predetermined path, or (3) the operator may hold a hand or finger over the touch pad 30 for a predetermined time or longer.
  • The display control device 40 makes an affirmative determination in step S25 when at least one, or any combination, of these specific operations is performed.
  • The "specific operation" may be anything as long as it is predetermined; the operator may register an arbitrary operation as the "specific operation" in advance.
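The mode decisions of steps S20 through S25 can be summarized as a small decision function. This is a hypothetical Python sketch written under the assumption that both the dwell check (S23) and the specific-operation check (S25) unlock the function selection mode; the names, the 1 s constant's use, and the simplified branch order are illustrative only:

```python
from enum import Enum


class Mode(Enum):
    IDLE = "idle"                                # waiting, back to S20
    INDIVIDUAL_SETTING = "individual setting"    # right of FIG. 6 (step S21)
    FUNCTION_SELECTION = "function selection"    # center of FIG. 6 (step S24)


DWELL_TIME_S = 1.0  # step S23: example value given in the text


def decide_mode(received_2d, received_3d, dwell_s, is_specific_operation):
    """One decision pass of the display control device 40 (steps S20-S25).

    `is_specific_operation` stands in for matching the operating body's
    shape/movement against the pre-registered "specific operation".
    """
    if received_2d:                          # S20: touch input received?
        return Mode.INDIVIDUAL_SETTING       # S21: touch operation assumed
    if received_3d:                          # S22: aerial signal received?
        if dwell_s >= DWELL_TIME_S:          # S23: continuous for 1 s?
            return Mode.FUNCTION_SELECTION   # S24
        if is_specific_operation:            # S25: specific operation done?
            return Mode.FUNCTION_SELECTION   # proceeds to S24
    return Mode.IDLE                         # otherwise return to S20
```

Until one of the S23/S25 conditions holds, aerial movement produces no screen control, which is the erroneous-operation safeguard the disclosure emphasizes.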
  • In this way, screen control according to the movement of the operator's hand or finger detected in the air is not performed until the display control device determines that the hand or finger has performed the specific operation, so erroneous operation unintended by the operator can be suppressed.
  • The display screen of the display device 20 shown on the left in FIG. 6 is an image that the display control device 40 displays on the display device 20 while the specific operation is being detected in step S25.
  • A white arc 180 and a hatched arc 190 shown on the left display screen in FIG. 6 indicate, on the display device, the degree of completion of the specific operation while the hand or finger is performing it. For example, when the operation "make a peace sign with the fingers and move it as if drawing a circle" is predetermined as the "specific operation", the peace sign mark 200 displayed on the left display screen in FIG. 6 and the arc 180 remind the operator what the specific operation is, and the hatched arc 190 lets the operator visually grasp the degree of completion of the specific operation. This assists the operator in becoming accustomed to performing the specific operation.
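For a circular "specific operation" like the example above, the degree of completion driving the hatched arc 190 could be estimated from the angle swept by the finger path. The following is a hypothetical sketch; the centroid-as-centre simplification and all names are assumptions, not from the patent:

```python
import math


def circle_completion(points):
    """Fraction (0.0-1.0) of a circle gesture completed, e.g. to drive the
    hatched arc 190. `points` are (x, y) samples of the finger path; the
    circle's centre is approximated by the path centroid."""
    if len(points) < 3:
        return 0.0
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    angles = [math.atan2(y - cy, x - cx) for x, y in points]
    swept = 0.0
    for a0, a1 in zip(angles, angles[1:]):
        d = a1 - a0
        # Unwrap jumps across the -pi/+pi boundary of atan2.
        if d > math.pi:
            d -= 2 * math.pi
        elif d < -math.pi:
            d += 2 * math.pi
        swept += d
    return min(abs(swept) / (2 * math.pi), 1.0)
```

A full loop around the centroid sweeps 2π and reports 1.0; partial arcs report intermediate fractions, which is exactly what a progress arc needs.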
  • While the specific operation is being detected, a sound may also be generated from the sound source 90 arranged in the passenger compartment.
  • The "specific operation" determined by the display control device 40 may be that the hand or finger stays in the aerial detectable region for a predetermined time or longer, or that the arithmetic device 80 estimates the shape of the hand or finger to be a predetermined shape; the "specific operation" may also be regarded as performed when any one of these is detected.
  • The display control device 40 may cause the display device to display the degree of completion of the specific operation. The operator can then confirm on the display device how far the specific operation for unlocking aerial operation has progressed, which assists the operator in becoming accustomed to performing it.
  • The display control device 40 may also generate a sound from a sound source arranged in the vehicle interior.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

 The present invention comprises a touch pad (30) equipped with an aerial detectable area capable of detecting an operator's hand or finger in the air, and a display control device (40) for controlling, based on input to the touch pad, the image displayed on a display device, the touch pad being able to estimate the degree of proximity of the operator's hand or finger, the position of the hand or finger, and whether the hand or finger is in contact. The display control device determines whether the hand or finger has performed a specific operation on the touch pad and, when it determines that the specific operation has been performed, causes the image displayed on the display device to be operated according to the movement of the hand or finger, even when the hand or finger is not in contact with the touch pad.
PCT/JP2015/003975 2014-08-29 2015-08-07 Input interface for vehicle WO2016031152A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014175637A JP2016051288A (ja) 2014-08-29 2014-08-29 Vehicle input interface
JP2014-175637 2014-08-29

Publications (1)

Publication Number Publication Date
WO2016031152A1 (fr) 2016-03-03

Family

ID=55399071

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/003975 WO2016031152A1 (fr) 2014-08-29 2015-08-07 Input interface for vehicle

Country Status (2)

Country Link
JP (1) JP2016051288A (fr)
WO (1) WO2016031152A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10137781B2 (en) 2013-08-02 2018-11-27 Denso Corporation Input device

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6896416B2 (ja) * 2016-12-27 2021-06-30 アルパイン株式会社 In-vehicle system
JP2020166641A (ja) * 2019-03-29 2020-10-08 ソニー株式会社 Information processing device, information processing method, and program

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001216069A (ja) * 2000-02-01 2001-08-10 Toshiba Corp Operation input device and direction detection method
JP2011118857A (ja) * 2009-12-02 2011-06-16 Hyundai Motor Co Ltd User interface device for operating a vehicle multimedia system
WO2012053033A1 (fr) * 2010-10-20 2012-04-26 三菱電機株式会社 Three-dimensional display device
US20130271360A1 (en) * 2012-04-16 2013-10-17 Qualcomm Incorporated Interacting with a device using gestures

Also Published As

Publication number Publication date
JP2016051288A (ja) 2016-04-11

Similar Documents

Publication Publication Date Title
JP5928397B2 (ja) Input device
JP5572761B2 (ja) Vehicle operation device
JP5452566B2 (ja) Vehicle input device
JP5124397B2 (ja) Input/output device for automobile
US20150324006A1 Display control device
EP3144850A1 (fr) Determination device, determination method, and non-transitory recording medium
JP2007106392A (ja) Input system for in-vehicle electronic equipment
JP5751233B2 (ja) Operation device
KR101491169B1 (ko) Apparatus for operating devices mounted on a vehicle and control method thereof
JPWO2017002708A1 (ja) Image display control device
JP2006264615A (ja) Vehicle display device
WO2016031152A1 (fr) Input interface for vehicle
JP4848997B2 (ja) Erroneous operation prevention device and erroneous operation prevention method for in-vehicle equipment
JP4847029B2 (ja) Input device
JP6018775B2 (ja) Display control device for in-vehicle equipment
JP6610452B2 (ja) Vehicle display device
JP4849193B2 (ja) Erroneous operation prevention device and erroneous operation prevention method for in-vehicle equipment
WO2019077908A1 (fr) Gesture input device
JP4840332B2 (ja) Remote control device
WO2014162698A1 (fr) Input device
WO2016031148A1 (fr) Touch pad for vehicle and input interface for vehicle
JP2011131686A (ja) Navigation system
JP6274003B2 (ja) Display operation system
JP2020093591A (ja) Vehicle operation device
JP2012030645A (ja) In-vehicle electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15834917

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15834917

Country of ref document: EP

Kind code of ref document: A1