WO2016031152A1 - Input interface for vehicle - Google Patents

Input interface for vehicle

Info

Publication number
WO2016031152A1
Authority
WO
WIPO (PCT)
Prior art keywords
finger
hand
control device
display control
vehicle
Application number
PCT/JP2015/003975
Other languages
French (fr)
Japanese (ja)
Inventor
重明 西橋
豪之 藤本
Original Assignee
株式会社デンソー (DENSO Corporation)
Application filed by 株式会社デンソー (DENSO Corporation)
Publication of WO2016031152A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00: Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02: Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • The present disclosure relates to a vehicle input interface provided with an aerial detectable region in which an operator's hand or finger can be detected in the air.
  • A vehicle input interface provided with such an aerial detectable region is known (for example, Patent Document 1).
  • The present disclosure has been made in view of the above points, and its object is to provide a vehicle input interface that can suppress the inducement of erroneous operations without significantly impairing the convenience of mid-air operation.
  • An input interface for a vehicle according to one aspect of the present disclosure is an interface for operating an image displayed on a display device arranged in the vehicle interior. It includes a touch pad provided with an aerial detectable region in which an operator's hand or finger can be detected in the air, and a display control device that controls the image displayed on the display device based on input to the touch pad. The touch pad includes an arithmetic device that estimates the degree of proximity of the operator's hand or finger, the position of the hand or finger, and whether the hand or finger is in contact with the pad. The display control device determines whether the hand or finger has performed a specific operation on the touch pad and, when it determines that the specific operation has been performed, operates the image displayed on the display device according to the movement of the hand or finger even while the hand or finger is not in contact with the touch pad.
  • With this configuration, screen operation according to the movement of the operator's hand or finger detected in the air is not performed until the display control device determines that the hand or finger has performed the specific operation, so erroneous operations unintended by the operator can be suppressed.
  • FIG. 1 is a diagram illustrating the arrangement of a vehicle input interface in a vehicle interior according to an embodiment of the present disclosure;
  • FIG. 2 is a perspective view explaining the touch pad in detail;
  • FIG. 3 is a cross-sectional view showing the range of the aerial detectable region of the touch pad;
  • FIG. 4 is a flowchart showing the control executed by the arithmetic device of the touch pad;
  • FIG. 5 is a flowchart showing the control performed by the display control device based on the signals output from the arithmetic device of the touch pad;
  • FIG. 6 is a diagram showing display examples of the display device in each mode and while the specific operation is being detected.
  • FIG. 1 is a diagram illustrating an arrangement of a vehicle input interface 10 in a vehicle cabin according to an embodiment of the present disclosure.
  • the vehicle input interface 10 in this embodiment is for operating an image displayed on the display device 20, and includes a vehicle touch pad 30 and a display control device 40.
  • The display device 20 is a multi-function display disposed substantially at the center of the instrument panel 50 of the vehicle. It is a general-purpose display that shows operation screens for the vehicle air conditioner, radio, navigation system, audio system, and the like (not shown), as well as a rear-camera image and an around-view image composited from multiple cameras.
  • The vehicle touch pad 30 includes an arithmetic device 80. The arithmetic device 80 reads capacitance signals from the capacitance sensor panel 70 arranged below the design panel 60 shown in FIG. 2 and, according to predetermined rules described later, estimates the degree of proximity of the operator's hand or finger, the position of the hand or finger, and whether the hand or finger is in contact with the pad. It then generates a two-dimensional or three-dimensional coordinate signal and outputs it to the display control device 40.
  • The display control device 40 switches among various operation screens based on the two-dimensional and three-dimensional signals from the arithmetic device 80 of the vehicle touch pad 30, and causes the display device 20 to display an image for input to each function.
  • The display control device 40 is also directly or indirectly connected to a sound source 90, such as a speaker arranged in the vehicle interior, and is configured so that a notification sound can be generated from the sound source by a signal from the display control device 40.
  • The vehicle touch pad 30 of this embodiment is disposed near an armrest (not shown) between the driver seat and the passenger seat, and the design panel 60 is positioned so that the driver's palm rests on it naturally when the driver places an arm on the armrest.
  • The arithmetic device 80 and the display control device 40 are disposed inside the instrument panel 50, where they are not visible from the passenger compartment.
  • The display device 20, the vehicle touch pad 30, the display control device 40, and the speaker 90 may be connected in any manner: via an in-vehicle network communication cable, via individual cables, or by wireless communication.
  • FIG. 2 is a perspective view of the touch pad 30 in the present embodiment.
  • The design panel 60 is disposed on the surface of the housing in which the capacitance sensor panel 70 is mounted.
  • On the design panel 60, an operation detectable area 100, in which the approach, proximity, contact, and position coordinates of an operating body such as the operator's hand or finger can be detected, and a decision input area 110, which accepts input by contact or pressing, are color-coded so that the driver can easily distinguish them.
  • In this embodiment, input to the decision input area 110 is detected using the capacitance sensor panel 70, but any method of detecting the decision input may be used: a mechanical switch may be provided separately, or the decision input may be triggered when the surface of the design panel 60 is double-tapped by the operator's finger within the operation detectable area 100.
  • FIG. 3 is a cross-sectional view showing the range of the aerial detectable region (RG) in which the approach, proximity, and contact of the operating body can be detected by the touch pad 30 shown in FIG. 2.
  • The arithmetic device 80 of the touch pad 30 estimates the movement and shape of the operator's hand or finger within the aerial detectable region based on the electric charge stored in the capacitance sensor panel 70.
  • The detection means capable of detecting contact and proximity in the present disclosure is not limited to a capacitance sensor; it may be realized, for example, by combining a pressure-sensitive sensor with an imaging device such as an infrared camera.
  • Specifically, following the flowchart of FIG. 4, the arithmetic device estimates the approach and degree of proximity of the operating body to the design panel 60, the position of the operating body relative to the design panel 60, and whether the operating body is in contact with the design panel 60.
  • In step S10, the electric charge stored in the capacitance sensor panel 70 is measured, and the process proceeds to step S11.
  • In step S11, the position at which the operating body is estimated to be closest to the design panel 60 and its distance from the design panel 60 are estimated, and the process proceeds to step S12.
  • In step S12, it is determined from that distance whether the operating body (that is, the operator's hand or finger) is in contact with the design panel 60. If the determination in step S12 is affirmative, the process proceeds to step S13.
  • In step S13, a two-dimensional coordinate signal indicating the coordinate point on the design panel where the operating body and the design panel 60 are in contact is output to the display control device 40.
  • If the determination in step S12 is negative, the process proceeds to step S14, and a three-dimensional coordinate signal indicating the position of the operating body above the design panel 60 is output to the display control device 40.
  • In step S20, it is determined whether a two-dimensional coordinate signal has been received from the arithmetic device 80 of the touch pad 30. If the determination in step S20 is affirmative, the process proceeds to step S21.
  • In step S21, it is regarded that a touch input operation has been performed on the touch pad 30; the display content of the display device 20 is set to the individual function setting mode shown on the right of FIG. 6, and the cursor 130 displayed in the individual function setting screen 120 is operated based on the two-dimensional coordinate signal to select one of the displayed icons 140.
  • If the determination in step S20 is negative, the process proceeds to step S22, where it is determined whether a three-dimensional signal is being received from the arithmetic device 80. If the determination in step S22 is affirmative, the process proceeds to step S23.
  • In step S23, it is determined whether the three-dimensional signal has been received continuously from the arithmetic device 80 for a predetermined time (for example, 1 second) or longer. If the determination in step S23 is affirmative, the process proceeds to step S24.
  • In step S24, the display content of the display device 20 is set to the function selection mode shown in the center of FIG. 6.
  • In this function selection mode, names 150 representing the respective functions are arranged left and right at the top of the screen, and the operation screen 160 of the currently selected function is displayed at the approximate center of the screen.
  • In this embodiment, the currently selected operation screen 160 is a reduced display of the individual function setting screen 120.
  • In this mode, when the operating body (MP) approaches the touch pad 30 and is swung left or right while separated from the pad, the currently selected operation screen 160 scrolls left or right following the movement.
  • The name 150 at the top of the screen corresponding to the selected function is also given a highlight 170.
  • In step S25, it is determined whether at least one of the shape and the movement of the operating body corresponds to a "specific operation" stored in advance in a storage device (not shown) of the display control device 40.
  • If the determination in step S25 is affirmative, the process proceeds to step S24 described above.
  • If the determination in step S25 is negative, the process returns to step S20.
  • The "specific operation" is now described in more detail. For example, it may be that (1) the operator holds a hand or finger over the touch pad 30 in a specific shape, (2) the operator moves a hand or finger along a predetermined path above the touch pad 30, or (3) the operator holds a hand or finger still above the touch pad 30 for a predetermined time or longer.
  • The display control device 40 makes an affirmative determination in step S25 when at least one of these specific operations, or any predetermined combination of them, is performed.
  • The "specific operation" may be anything as long as it is determined in advance; the operator may also register an arbitrary operation as the "specific operation" beforehand.
  • With this, screen operation according to the movement of the operator's hand or finger detected in the air is not performed until the display control device determines that the hand or finger has performed the specific operation, so erroneous operations unintended by the operator can be suppressed.
  • The display screen of the display device 20 shown on the left of FIG. 6 is the image that the display control device 40 displays on the display device 20 while the specific operation is being detected in step S25.
  • The white arc 180 and the hatched arc 190 shown on the left display screen of FIG. 6 indicate the degree of completion of the specific operation while the hand or finger is performing it. For example, when the operation "make a peace sign with the fingers and move it as if drawing a circle" is predetermined as the specific operation, the peace sign mark 200 and the arc 180 displayed on the left screen of FIG. 6 remind the operator what the specific operation is, and the hatched arc 190 lets the operator visually grasp how close the operation is to completion. This can help the operator become accustomed to performing the specific operation.
  • In addition, a sound may be generated from the sound source 90 arranged in the passenger compartment.
  • The "specific operation" determined by the display control device 40 may be that the hand or finger remains within the aerial detectable region for a predetermined time or longer, or that the arithmetic device 80 estimates the hand or finger to have a predetermined shape; the specific operation may also be regarded as performed when any one of these is detected.
  • The display control device 40 may cause the display device to show the degree of completion of the specific operation. The operator can thereby check, on the display device, the progress of the specific operation that enables mid-air operation, which helps the operator become accustomed to performing it.
  • The display control device 40 may also generate a sound from a sound source arranged in the vehicle interior.
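The "specific operation" gate described in the bullets above (a specific hand shape, a predetermined path, or a dwell of a predetermined duration, with a completion degree shown to the operator) can be sketched roughly as follows. This is an illustration, not part of the patent text: the class name, the shape and path encodings, and the one-second dwell threshold are all hypothetical.

```python
DWELL_SECONDS = 1.0  # assumed threshold; the text only says "a predetermined time"

class SpecificOperationGate:
    """Enables mid-air screen operation only after a registered
    'specific operation' (shape, path, or dwell) is completed."""

    def __init__(self, registered_shapes, registered_paths):
        self.registered_shapes = registered_shapes  # e.g. {"peace_sign"}
        self.registered_paths = registered_paths    # e.g. {("circle",)}
        self.dwell_start = None                     # when the hand entered the region

    def update(self, shape, path, in_region, now):
        """Return True once any registered specific operation is detected."""
        if not in_region:
            self.dwell_start = None
            return False
        # (1) hand or finger held in a specific shape
        if shape in self.registered_shapes:
            return True
        # (2) hand or finger moved along a predetermined path
        if tuple(path) in self.registered_paths:
            return True
        # (3) hand or finger kept in the detection region long enough
        if self.dwell_start is None:
            self.dwell_start = now
        return (now - self.dwell_start) >= DWELL_SECONDS

    def dwell_completion(self, now):
        """Fraction of the dwell requirement met, usable for a progress
        display like the hatched arc 190 in FIG. 6."""
        if self.dwell_start is None:
            return 0.0
        return min(1.0, (now - self.dwell_start) / DWELL_SECONDS)
```

Resetting the dwell timer whenever the hand leaves the region mirrors the intent that only a deliberate, sustained gesture unlocks mid-air operation.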

Abstract

 The present invention is equipped with a touch pad (30) provided with an aerial detectable region capable of detecting the hand or finger of an operator in the air, and a display control device (40) for controlling, in accordance with input to the touch pad, the image displayed on a display device. The touch pad estimates the degree of proximity of the operator's hand or finger, the position of the hand or finger, and whether the hand or finger is in contact with the pad. The display control device determines whether the hand or finger has performed a specific operation on the touch pad and, when it determines that the specific operation was performed, causes the image displayed on the display device to be operated in accordance with the motion of the hand or finger, even when the hand or finger is not in contact with the touch pad.

Description

Vehicle input interface

Cross-reference to related applications

 This application is based on Japanese Patent Application No. 2014-175637 filed on August 29, 2014, the contents of which are incorporated herein by reference.

 The present disclosure relates to a vehicle input interface provided with an aerial detectable region in which an operator's hand or finger can be detected in the air.

 A vehicle input interface provided with such an aerial detectable region is known (for example, Patent Document 1).

 Patent Document 1: JP 2011-118857 A
 Input to in-vehicle devices by hand or finger movements detected in the air is convenient, but there is concern that it may induce erroneous operations. Limiting the region in which mid-air operation is possible to a narrow area has been considered as a way to suppress erroneous operations, but then the convenience of mid-air operation cannot be fully enjoyed.

 The present disclosure has been made in view of the above points, and its object is to provide a vehicle input interface that can suppress the inducement of erroneous operations without significantly impairing the convenience of mid-air operation.

 An input interface for a vehicle according to one aspect of the present disclosure is an interface for operating an image displayed on a display device arranged in the vehicle interior. It includes a touch pad provided with an aerial detectable region in which an operator's hand or finger can be detected in the air, and a display control device that controls the image displayed on the display device based on input to the touch pad. The touch pad includes an arithmetic device that estimates the degree of proximity of the operator's hand or finger, the position of the hand or finger, and whether the hand or finger is in contact with the pad. The display control device determines whether the hand or finger has performed a specific operation on the touch pad and, when it determines that the specific operation has been performed, operates the image displayed on the display device according to the movement of the hand or finger even while the hand or finger is not in contact with the touch pad.

 As a result, screen operation according to the movement of the operator's hand or finger detected in the air is not performed until the display control device determines that the hand or finger has performed the specific operation, so erroneous operations unintended by the operator can be suppressed.
 The above and other objects, features, and advantages of the present disclosure will become more apparent from the following detailed description with reference to the accompanying drawings, in which:

 FIG. 1 is a diagram illustrating the arrangement of the vehicle input interface in a vehicle interior according to an embodiment of the present disclosure;
 FIG. 2 is a perspective view explaining the touch pad in detail;
 FIG. 3 is a cross-sectional view showing the range of the aerial detectable region of the touch pad;
 FIG. 4 is a flowchart showing the control executed by the arithmetic device of the touch pad;
 FIG. 5 is a flowchart showing the control performed by the display control device based on the signals output from the arithmetic device of the touch pad; and
 FIG. 6 is a diagram showing display examples of the display device in each mode and while the specific operation is being detected.
 Embodiments of the present disclosure are described below with reference to the drawings. In the embodiments, corresponding components are given the same reference numerals, and duplicate description may be omitted. When only part of a configuration is described in an embodiment, the configurations of previously described embodiments can be applied to the other parts. Configurations of multiple embodiments may also be partially combined, even where not explicitly stated, as long as the combination poses no problem.
 FIG. 1 is a diagram illustrating the arrangement of the vehicle input interface 10 in a vehicle interior according to an embodiment of the present disclosure. The vehicle input interface 10 in this embodiment is for operating images displayed on the display device 20, and includes a vehicle touch pad 30 and a display control device 40.

 The display device 20 is a multi-function display disposed substantially at the center of the instrument panel 50 of the vehicle. It is a general-purpose display that shows operation screens for the vehicle air conditioner, radio, navigation system, audio system, and the like (not shown), as well as a rear-camera image and an around-view image composited from multiple cameras.

 The vehicle touch pad 30 includes an arithmetic device 80. The arithmetic device 80 reads capacitance signals from the capacitance sensor panel 70 arranged below the design panel 60 shown in FIG. 2 and, according to predetermined rules described later, estimates the degree of proximity of the operator's hand or finger, the position of the hand or finger, and whether the hand or finger is in contact with the pad. It then generates a two-dimensional or three-dimensional coordinate signal and outputs it to the display control device 40.

 The display control device 40 switches among various operation screens based on the two-dimensional and three-dimensional signals from the arithmetic device 80 of the vehicle touch pad 30, and causes the display device 20 to display an image for input to each function.

 The display control device 40 is also directly or indirectly connected to a sound source 90, such as a speaker arranged in the vehicle interior, and is configured so that a notification sound can be generated from the sound source by a signal from the display control device 40.

 The vehicle touch pad 30 of this embodiment is disposed near an armrest (not shown) between the driver seat and the passenger seat, and the design panel 60 is positioned so that the driver's palm rests on it naturally when the driver places an arm on the armrest. The arithmetic device 80 and the display control device 40 are disposed inside the instrument panel 50, where they are not visible from the passenger compartment.

 The display device 20, the vehicle touch pad 30, the display control device 40, and the speaker 90 may be connected in any manner: via an in-vehicle network communication cable, via individual cables, or by wireless communication.
 Next, the touch pad 30 in this embodiment is described in detail with reference to FIG. 2. FIG. 2 is a perspective view of the touch pad 30 in this embodiment. The design panel 60 is disposed on the surface of the housing in which the capacitance sensor panel 70 is mounted. On the design panel 60, an operation detectable area 100, in which the approach, proximity, contact, and position coordinates of an operating body such as the operator's hand or finger can be detected, and a decision input area 110, which accepts input by contact or pressing, are color-coded so that the driver can easily distinguish them.

 In this embodiment, input to the decision input area 110 is detected using the capacitance sensor panel 70, but any method of detecting the decision input may be used: a mechanical switch may be provided separately, or the decision input may be triggered when the surface of the design panel 60 is double-tapped by the operator's finger within the operation detectable area 100.

 FIG. 3 is a cross-sectional view showing the range of the aerial detectable region (RG) in which the approach, proximity, and contact of the operating body can be detected by the touch pad 30 shown in FIG. 2. The arithmetic device 80 of the touch pad 30 in this embodiment estimates the movement and shape of the operator's hand or finger within the aerial detectable region based on the electric charge stored in the capacitance sensor panel 70. The detection means capable of detecting contact and proximity in the present disclosure is not limited to a capacitance sensor; it may be realized, for example, by combining a pressure-sensitive sensor with an imaging device such as an infrared camera.

 Specifically, following the flowchart shown in FIG. 4, the arithmetic device estimates the approach and degree of proximity of the operating body to the design panel 60, the position of the operating body relative to the design panel 60, and whether the operating body is in contact with the design panel 60.
 First, in step S10, the electric charge stored in the capacitance sensor panel 70 is measured, and the process proceeds to step S11.

 In step S11, the position at which the operating body is estimated to be closest to the design panel 60 and its distance from the design panel 60 are estimated, and the process proceeds to step S12.

 In step S12, it is determined from that distance whether the operating body (that is, the operator's hand or finger) is in contact with the design panel 60. If the determination in step S12 is affirmative, the process proceeds to step S13.

 In step S13, a two-dimensional coordinate signal indicating the coordinate point on the design panel where the operating body and the design panel 60 are in contact is output to the display control device 40.

 On the other hand, if the determination in step S12 is negative, the process proceeds to step S14, and a three-dimensional coordinate signal indicating the position of the operating body above the design panel 60 is output to the display control device 40.
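The FIG. 4 flow amounts to: measure the stored charge (S10), estimate the nearest position and its distance (S11), decide contact from that distance (S12), and output a two-dimensional coordinate signal on contact (S13) or a three-dimensional one otherwise (S14). A minimal illustrative sketch follows; the grid representation, the charge-to-distance model, and the contact threshold are all hypothetical, since the patent does not specify them.

```python
CONTACT_DISTANCE_MM = 0.5  # assumed: below this estimated distance, treat as contact

def estimate_distance(charge):
    # Hypothetical monotone model: more stored charge means a closer operating body.
    return max(0.0, 10.0 - charge)

def process_sensor_frame(charge_map):
    """One pass of the FIG. 4 loop over a grid of capacitance readings.

    charge_map: dict mapping (x, y) sensor cell -> measured charge (step S10).
    Returns ("2d", (x, y)) on contact (step S13),
    or ("3d", (x, y, z)) for a position above the panel (step S14).
    """
    # Step S11: take the cell with the largest charge as the point where the
    # operating body is closest, and estimate its distance from the panel.
    (x, y), peak = max(charge_map.items(), key=lambda item: item[1])
    z = estimate_distance(peak)
    # Step S12: contact decision from the estimated distance.
    if z <= CONTACT_DISTANCE_MM:
        return ("2d", (x, y))    # Step S13: contact point on the design panel
    return ("3d", (x, y, z))     # Step S14: position in the aerial region
```

The two signal kinds returned here correspond to the two branches the display control device distinguishes in FIG. 5.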
 次に、タッチパッド30の演算装置80から出力される信号に基づいて表示制御装置40が行う制御について図5を用いて説明する。 Next, control performed by the display control device 40 based on a signal output from the arithmetic device 80 of the touch pad 30 will be described with reference to FIG.
 まず、ステップS20において、タッチパッド30の演算装置80から2次元座標信号を受信したか否かを判定する。ステップS20が肯定判定であった場合はステップS21へと進む。 First, in step S20, it is determined whether or not a two-dimensional coordinate signal is received from the arithmetic device 80 of the touch pad 30. When step S20 is affirmation determination, it progresses to step S21.
 ステップS21ではタッチパッド30への接触入力操作があったとみなして、表示装置20の表示コンテンツを図6右側に示した個別機能設定モードに設定し、2次元座標信号に基づいて個別機能の設定画面120の中に表示されたカーソル130を操作して、されたアイコン140のうちの一つを選択する。 In step S21, it is assumed that a touch input operation has been performed on the touch pad 30, and the display content of the display device 20 is set to the individual function setting mode shown on the right side of FIG. 6, and the individual function setting screen is set based on the two-dimensional coordinate signal. The cursor 130 displayed in 120 is operated to select one of the icons 140 displayed.
 On the other hand, if step S20 yields a negative determination, the process proceeds to step S22, in which it is determined whether a three-dimensional coordinate signal is being received from the arithmetic unit 80. If step S22 yields an affirmative determination, the process proceeds to step S23.
 In step S23, it is determined whether the three-dimensional coordinate signal has been received continuously from the arithmetic unit 80 for a predetermined time (for example, one second) or longer. If step S23 yields an affirmative determination, the process proceeds to step S24.
 In step S24, the operator is deemed to be attempting an in-air input on the touch pad 30, and the display content of the display device 20 is set to the function selection mode shown in the center of FIG. 6. In this function selection mode, names 150 indicating the respective functions are arranged side by side at the top of the screen, and an operation screen 160 corresponding to the currently selected function is displayed at the approximate center of the screen. In this embodiment, the currently selected operation screen 160 is a reduced display of the individual function setting screen 120 described above. In this function selection mode, when the operating body (MP) is close to the touch pad 30 and, while separated from the touch pad 30, is swung in the left-right direction of the vehicle, the currently selected operation screen 160 scrolls left or right in accordance with the movement of the operating body. In addition, among the names 150 at the top of the screen, the name corresponding to the function being selected is given a highlight 170.
 On the other hand, if step S23 yields a negative determination, the process proceeds to step S25. In step S25, it is determined whether at least one of the shape and the movement of the operating body corresponds to a "specific action" stored in advance in a storage device (not shown) of the display control device 40. If step S25 yields an affirmative determination, the process proceeds to step S24 described above; if step S25 yields a negative determination, the process returns to step S20.
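The flow of FIG. 5 (steps S20 to S25) can be condensed into a single decision function. This is a sketch under stated assumptions: the mode names, the helper arguments, and the constant below are hypothetical and are not taken from the specification.

```python
# Sketch of the display control device's flow chart (steps S20-S25).
# "individual_function_setting", "function_selection", and "unchanged"
# are hypothetical labels for the display modes of FIG. 6.

HOLD_TIME_S = 1.0  # assumed predetermined time from step S23 ("for example 1 second")


def decide_mode(signal, hover_duration_s, is_specific_action):
    """Return the display mode chosen for one pass through the flow chart."""
    kind = signal[0]
    if kind == "2d":                          # S20: 2-D (contact) signal received
        return "individual_function_setting"  # S21: contact input operation
    if kind == "3d":                          # S22: 3-D (hover) signal received
        if hover_duration_s >= HOLD_TIME_S:   # S23: received continuously long enough?
            return "function_selection"       # S24: in-air operation mode
        if is_specific_action:                # S25: shape/movement matched?
            return "function_selection"       # S24
    return "unchanged"                        # otherwise: back to S20
```

A hover held for the full second, or a shorter hover accompanied by a recognized specific action, both lead to the function selection mode; anything else leaves the display unchanged.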
 Here, the above "specific action" will be described in more detail. Specific actions may include, for example, (1) the operator holding a hand or finger over the touch pad 30 in a specific shape, (2) the operator moving a hand or finger along a predetermined trajectory over the touch pad 30, and (3) the operator keeping a hand or finger over the touch pad 30 for a predetermined time or longer. The display control device 40 makes an affirmative determination in step S25 when at least one of these specific actions, or any combination of them, is performed. The "specific action" may be anything as long as it is determined in advance, and the operator may be allowed to register an arbitrary action as the "specific action" in advance.
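The any-of matching in step S25 against a table of registered actions can be sketched as follows. The action records and field names are illustrative assumptions; the specification does not prescribe a data format.

```python
# Sketch of the step-S25 check: any registered "specific action" (or a
# combination of shape and trajectory) unlocks in-air operation.
# The registry entries below are hypothetical examples.

REGISTERED_ACTIONS = [
    {"shape": "peace_sign", "trajectory": "circle"},  # cf. the FIG. 6 example
    {"shape": None, "trajectory": "hold"},            # stay in the detectable region
]


def matches_specific_action(shape, trajectory):
    """True when the observed shape/trajectory matches a registered action.

    A None field in a registered action means "don't care" for that field.
    """
    for action in REGISTERED_ACTIONS:
        shape_ok = action["shape"] is None or action["shape"] == shape
        traj_ok = action["trajectory"] is None or action["trajectory"] == trajectory
        if shape_ok and traj_ok:
            return True
    return False
```

Letting the operator register actions in advance, as the text suggests, would amount to appending entries to this table.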
 With the configuration described above, the display control device does not perform screen operations in response to movements of the operator's hand or finger detected in the air until it determines that the hand or finger has performed the specific action, so that erroneous operations not intended by the operator can be suppressed.
 Note that the display screen of the display device 20 shown on the left of FIG. 6 is an image that the display control device 40 causes the display device 20 to display while the specific action is being detected in step S25. The white arc 180 and the hatched arc 190 shown on that screen indicate, while the hand or finger is performing the specific action, the degree of completion of that action. For example, when the action "make a peace sign with the fingers and move it so as to draw a circle" has been predetermined as the "specific action", the peace-sign mark 200 and the white arc 180 displayed on the left screen of FIG. 6 allow the operator to recall what the specific action was, and the hatched arc 190 allows the operator to grasp visually how close the action is to completion. This assists the operator in becoming accustomed to performing the specific action. In addition, when the specific action has been completed, or while it is being completed, sound may be generated from the sound source 90 arranged in the vehicle interior.
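For the circle-drawing example, the completion-degree feedback of arcs 180 and 190 reduces to tracking the swept angle of the gesture. A minimal sketch, assuming the angle bookkeeping shown here (which the specification does not detail):

```python
# Sketch of the completion-degree feedback (arcs 180/190 in FIG. 6): the
# hatched arc grows with the fraction of the circle gesture completed,
# and the white arc shows the remainder. Angle tracking is an assumption.


def completion_fraction(swept_angle_deg):
    """Fraction of the circle-drawing gesture completed, clamped to [0, 1]."""
    return max(0.0, min(1.0, swept_angle_deg / 360.0))


def arc_angles(swept_angle_deg):
    """Degrees of hatched (completed) and white (remaining) arc to draw."""
    done = 360.0 * completion_fraction(swept_angle_deg)
    return done, 360.0 - done
```

Halfway around the circle, the hatched and white arcs are equal; once the sweep exceeds a full turn, the hatched arc is simply capped at the full circle.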
 As described above, the "specific action" determined by the display control device 40 may be "the hand or finger staying in the aerial detectable region for a predetermined time or longer", or, with the arithmetic unit 80 estimating the shape of the hand or finger, "the hand or finger assuming a predetermined shape", and the "specific action" may be deemed to have been performed when either one of these operations is detected.
 Further, while the hand or finger is performing the specific action, the display control device 40 may cause the display device to display an indication of the degree of completion of the specific action. This allows the operator to check on the display device the progress of the specific action that releases the restriction on in-air operation, and assists the operator in becoming accustomed to performing the specific action.
 Further, when the display control device 40 determines that the specific action has been completed, it may generate sound from a sound source arranged in the vehicle interior.
 Although the present disclosure has been described with reference to the embodiments, it is understood that the present disclosure is not limited to those embodiments or structures. The present disclosure encompasses various modifications and variations within an equivalent range. In addition, various combinations and forms, as well as other combinations and forms including only one element, more elements, or fewer elements, also fall within the scope and spirit of the present disclosure.

Claims (5)

  1.  A vehicle input interface for operating an image displayed on a display device arranged in a vehicle interior, comprising:
     a touch pad (30) having an aerial detectable region in which an operator's hand or finger can be detected in the air; and
     a display control device (40) that controls the image displayed on the display device on the basis of an input to the touch pad, wherein
     the touch pad includes an arithmetic unit (80) that estimates a degree of proximity of the operator's hand or finger, a position of the hand or finger, and presence or absence of contact with the hand or finger, and
     the display control device determines whether the hand or finger has performed a specific action on the touch pad and, upon determining that the specific action has been performed, operates the image displayed on the display device in accordance with movement of the hand or finger even while the hand or finger is not in contact with the touch pad.
  2.  The vehicle input interface according to claim 1, wherein the specific action performed by the hand or finger is staying in the aerial detectable region for a predetermined time or longer.
  3.  The vehicle input interface according to claim 1 or 2, wherein the arithmetic unit estimates a shape of the hand or finger present in the aerial detectable region, and
     the display control device determines that the specific action has been performed when the shape of the hand or finger is a predetermined shape.
  4.  The vehicle input interface according to any one of claims 1 to 3, wherein, while the hand or finger is performing the specific action, the display control device causes the display device to display an indication of a degree of completion of the specific action.
  5.  The vehicle input interface according to any one of claims 1 to 4, wherein, when determining that the specific action has been completed, the display control device generates sound from a sound source arranged in the vehicle interior.

PCT/JP2015/003975 2014-08-29 2015-08-07 Input interface for vehicle WO2016031152A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014175637A JP2016051288A (en) 2014-08-29 2014-08-29 Vehicle input interface
JP2014-175637 2014-08-29

Publications (1)

Publication Number Publication Date
WO2016031152A1 true WO2016031152A1 (en) 2016-03-03

Family

ID=55399071

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/003975 WO2016031152A1 (en) 2014-08-29 2015-08-07 Input interface for vehicle

Country Status (2)

Country Link
JP (1) JP2016051288A (en)
WO (1) WO2016031152A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10137781B2 (en) 2013-08-02 2018-11-27 Denso Corporation Input device

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6896416B2 (en) * 2016-12-27 2021-06-30 アルパイン株式会社 In-vehicle system
JP2020166641A (en) * 2019-03-29 2020-10-08 ソニー株式会社 Information processing equipment, information processing method, and program

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001216069A (en) * 2000-02-01 2001-08-10 Toshiba Corp Operation inputting device and direction detecting method
JP2011118857A (en) * 2009-12-02 2011-06-16 Hyundai Motor Co Ltd User interface device for operations of multimedia system for vehicle
WO2012053033A1 (en) * 2010-10-20 2012-04-26 三菱電機株式会社 Three-dimensional display device
US20130271360A1 (en) * 2012-04-16 2013-10-17 Qualcomm Incorporated Interacting with a device using gestures


Also Published As

Publication number Publication date
JP2016051288A (en) 2016-04-11


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15834917

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15834917

Country of ref document: EP

Kind code of ref document: A1