WO2014196208A1 - カーナビ用ジェスチャ入力装置 - Google Patents
- Publication number
- WO2014196208A1 (PCT/JP2014/003024)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- gesture
- car navigation
- switch
- display unit
- navigation device
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3664—Details of the user input interface, e.g. buttons, knobs or sliders, including those provided on a touch screen; remote controllers; input using gestures
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/10—Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/11—Instrument graphical user interfaces or menu aspects
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/146—Instrument input by gesture
Definitions
- The present invention relates to a gesture input device for operating a car navigation system by gestures.
- As methods of operating a car navigation system (hereinafter referred to as "car navigation"), operation methods using a remote control or a touch panel are known.
- As an operation method using a remote control, there is a type in which remote-control buttons are provided on the steering wheel of the car (for example, Patent Document 1). With this type, the driver can operate the system while driving without taking his or her hands off the steering wheel.
- There is also a car navigation system that allows the user to operate it by speaking alone, designating a destination or the like by voice input.
- The number of buttons on a remote control is limited, so it is difficult to arrange enough buttons to cover the many available functions.
- Operation by voice input has the problem that speech may not be recognized correctly due to ambient noise and the like.
- Patent Documents 2, 3, and 4 disclose methods in which a user operates a car navigation system using gestures. Operation by gesture has several merits: there is no need to gaze at or touch the screen, no difficulty in arranging buttons for each function, and no drop in recognition rate due to ambient noise.
- Accordingly, an object of the present invention is to make it possible to operate a car navigation system having a hierarchical menu with simple gestures.
- One embodiment of the present invention is a method executed by a car navigation device that comprises a sensor, a storage unit that stores operations on the car navigation device (whose operation menu has a hierarchical structure) in association with user gestures, a switch, and a display unit. The method comprises: a step (a) of detecting that the switch has been pressed; a step (b) of detecting the gesture via the sensor after the switch has been pressed; a step (c) of acquiring from the storage unit the operation associated with the detected gesture; a step (d) of outputting to the display unit the result of executing the acquired operation on the screen that was displayed on the display unit when the gesture was detected; and a step (e) of detecting that the switch has been released. When it is detected that the switch is pressed again within a predetermined time, steps (b) through (e) are repeated.
- Another aspect of the present invention is a car navigation device that executes the above method.
- Another aspect of the present invention is a computer program for causing a computer to execute the above method.
- Another aspect of the present invention is a computer-readable recording medium storing a computer program for causing a computer to execute the above method.
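The steps (a) through (e) above amount to a small session loop: a press opens the session, gestures resolve to operations while the switch gates input, and a release followed by a timely re-press repeats the cycle. A minimal Python sketch of that loop follows; the callable names, the example gestures, and the 3-second value of the predetermined time are illustrative assumptions, not taken from the patent.

```python
PREDETERMINED_TIME = 3.0  # seconds; illustrative - the claims only say "predetermined"

def run_session(wait_press, detect_gesture, lookup, apply_op, wait_release):
    """Steps (a)-(e) of the claimed method, with hardware abstracted as callables.

    wait_press(timeout=...) returns True if the switch was pressed in time.
    Returns the list of results produced by step (d) during the session.
    """
    results = []
    pressed = wait_press()                                # (a) detect switch press
    while pressed:
        gesture = detect_gesture()                        # (b) detect gesture via sensor
        operation = lookup(gesture)                       # (c) operation tied to gesture
        results.append(apply_op(operation))               # (d) render result on display
        wait_release()                                    # (e) detect switch release
        # A re-press within the predetermined time repeats (b)-(e); a timeout ends it.
        pressed = wait_press(timeout=PREDETERMINED_TIME)
    return results
```

Scripting the switch to report a press, one timely re-press, and then a timeout would drive two gesture/operation cycles before the session ends.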
- FIG. 1 is a diagram illustrating an appearance of a gesture input device according to the present embodiment.
- In the gesture input device 1 shown in FIG. 1, as an example, a switch 12 is provided on the steering wheel 5 of the car, the car navigation 6 includes a display unit 20, and an integrated sensor 14 and light 16 are provided below the display unit 20.
- the switch 12 is a switch for switching whether to accept a gesture operation.
- If the user moves a hand near the sensor for some other purpose, the gesture input device 1 may detect the motion and misrecognize it as a gesture for a car navigation operation. To avoid this problem, the user starts a gesture only after pressing the switch 12, and only gestures detected after the switch 12 is pressed are recognized as gestures for operating the car navigation system.
- the sensor 14 is an imaging device that detects a user's gesture.
- the light 16 is a light for securing a peripheral light amount when the sensor 14 images a user's gesture.
- the light 16 may be visible light such as a white LED.
- the sensor 14 is a so-called depth camera that can measure the depth (distance from the light 16 to the subject)
- the light 16 may be a light for detecting a three-dimensional position (in this case, the light 16 may employ invisible light).
- Inside a vehicle, luminance changes greatly, especially during daytime hours (due to weather, direct light, shadows, and so on), which affects the gesture recognition processing by the sensor 14.
- By providing the gesture input device 1 with the light 16, an improved gesture recognition rate and a wider gesture recognition range can be expected for the sensor 14. An illuminometer may also be provided so that the light 16 is lit only when the ambient illuminance is below a predetermined value.
- the sensor 14 and the light 16 have an integral structure, and are disposed below the display unit 20 of the car navigation 6 installed between the driver seat and the passenger seat.
- the configuration as shown in FIG. 1 is suitable for performing a gesture with the left hand on the sensor 14 below the display unit 20 (a right-hand drive car is assumed in FIG. 1).
- The switch 12 is installed on the window side of the steering wheel 5, that is, on the side opposite to the side where the sensor 14 for detecting gestures is installed. With this configuration, the user can press the switch with the right hand and then smoothly perform a gesture with the left hand.
- Although the light 16 is integrated with the sensor 14 in FIG. 1, other configurations are possible. For example, the sensor and light may be provided on the ceiling of the vehicle body where they do not interfere with the driver.
- Since the switch 12 marks the start of the gesture operation as described above, it is sufficient for the light 16 to be lit only while the gesture is being imaged. The light 16 may therefore be turned on and off in conjunction with the switch 12: it is lit only while the switch 12 is on for a gesture operation and imaging is in progress. This saves power while still ensuring that, even at night, the light 16 is on whenever the sensor 14 images a gesture.
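The switch-linked lighting, combined with the optional illuminometer mentioned earlier, reduces to a simple predicate. The sketch below is a hypothetical Python rendering; the 500 lux threshold is an invented example, as the text only speaks of "a predetermined value".

```python
ILLUMINANCE_THRESHOLD_LUX = 500  # illustrative; the text only says "a predetermined value"

def light_should_be_on(switch_pressed, ambient_lux=None):
    """Decide whether light 16 should be lit.

    The light is gated by the switch (it only matters while a gesture is being
    imaged).  If an illuminometer is fitted (ambient_lux given), the light is
    additionally lit only when ambient illuminance is below the threshold.
    """
    if not switch_pressed:
        return False                 # no gesture session: light stays off, saving power
    if ambient_lux is None:
        return True                  # no illuminometer: always light while imaging
    return ambient_lux < ILLUMINANCE_THRESHOLD_LUX
```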
- FIG. 2 is a functional block diagram showing an example of the configuration of the gesture input device according to this embodiment.
- The gesture input device 1 includes the switch 12, the sensor 14, the light 16, a speaker 18, the display unit 20, a storage unit 102, a control unit 104, and an information processing unit 106.
- The control unit 104 receives a signal indicating the ON/OFF state of the switch 12, image data of the user's gesture captured by the sensor 14, and so on. It also receives output data from the other components and controls them. Data received by the control unit 104 is stored in the storage unit 102 as necessary, and data stored in the storage unit 102 is read out by the control unit 104 and processed by the information processing unit 106 as appropriate.
- the information processing unit 106 reads out and processes a program stored in the storage unit 102 and data necessary for the processing, thereby realizing each process in the gesture input device 1.
- the storage unit 102 stores a gesture and a user operation that means the gesture in association with each other.
- When a gesture is recognized, the information processing unit 106 queries the data in the storage unit 102 and reads out the operation associated with the recognized gesture. Based on the read operation, the information processing unit 106 generates the result screen obtained by applying this operation to the screen displayed on the display unit 20 of the car navigation 6, and the control unit 104 outputs it so that it is displayed on the display unit 20.
- The storage unit 102 may store each gesture, each screen displayed on the display unit 20, and the user operation that the gesture means on each screen, in association with one another. This makes it possible for the same gesture to perform different operations depending on the screen displayed on the display unit 20 when the gesture is performed, which reduces the number of gestures the user must learn.
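The per-screen association described above can be pictured as a table keyed on the pair (screen, gesture), so that one gesture resolves to different operations on different screens. The following Python sketch is purely illustrative; every screen name, gesture name, and operation name is an assumption made for the example, not taken from the patent.

```python
# Hypothetical contents of the storage unit 102: keying on (screen, gesture)
# lets the same gesture mean different things on different screens.
OPERATION_TABLE = {
    ("function_select", "swipe_right"): "select_next_function",
    ("tv",              "swipe_right"): "next_channel",
    ("music",           "swipe_right"): "next_album",
    ("tv",              "point_up"):    "volume_up",
}

def lookup_operation(screen, gesture):
    """Return the operation associated with the gesture on the current screen,
    or None if the gesture has no meaning on that screen."""
    return OPERATION_TABLE.get((screen, gesture))
```

With this layout, the user only learns a handful of gestures: "swipe_right" advances the function list, the channel, or the album depending on which screen is showing.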
- FIG. 3 is a diagram illustrating an example of a hardware configuration of the gesture input device according to the present embodiment.
- The gesture input device according to the present embodiment is realized as a computer device 200 that includes the sensor 14, the switch 12, the light 16, the display unit 20, and the speaker 18.
- Like a general computer device, the gesture input device according to the present embodiment includes a CPU (Central Processing Unit) 202, a ROM (Read Only Memory) 204, a RAM (Random Access Memory) 206, and other non-volatile storage devices.
- The gesture input device according to the present embodiment may also include a hard disk or a connection interface for removable recording media, not shown in FIG. 3.
- The storage unit 102 in FIG. 2 corresponds to, for example, the ROM 204 and the RAM 206 in FIG. 3.
- The control unit 104 in FIG. 2 mainly corresponds to the CPU 202 in FIG. 3. The information processing unit 106 in FIG. 2 is realized by the CPU 202 in FIG. 3 reading a program stored in the ROM 204 or the like and processing it in temporary memory.
- Next, a specific example of the gesture operation in the gesture input device according to the present embodiment is described.
- It is assumed that the menu of the car navigation system operated by the user (or the operations performed on the car navigation system; the same applies hereinafter) has a hierarchical structure.
- For simplicity, the following description assumes that the car navigation menu has two levels.
- The car navigation menu only needs to have a hierarchical structure, and various menu configurations are conceivable. For example, in the first step a function provided by the car navigation (navigation, TV, music playback, video playback, etc.) is selected, and in the second step an operation within the selected function (channel selection, music folder (album) selection, video category selection, etc.) is performed.
- FIG. 4 shows a specific example of the gesture operation in the first step.
- To perform a gesture operation, the user (the driver in this example) first presses the switch 12 and keeps holding it until the gesture operation is completed (FIG. 4(a)).
- When the user presses the switch 12, the gesture input device recognizes that a gesture for operating the car navigation 6 is starting.
- At this point, lighting of the light 16 and imaging by the sensor 14 may be started.
- Then, a screen for function selection is displayed on the display unit 20 of the car navigation 6 (FIG. 4(b)).
- functions such as “car navigation”, “television”, “music playback”, and “video playback” are displayed as icons.
- In addition, audio feedback confirming the display may be output from the car navigation 6. Since audio feedback lets the user confirm the result of a gesture operation without looking at the display unit 20, the advantage of gesture operation, namely that there is no need to gaze at or touch the screen as in conventional car navigation operation methods, is used to greater effect.
- When the function selection screen is displayed on the display unit 20 of the car navigation 6, the user performs a hand gesture within the detectable range of the sensor 14 (FIG. 4(c)), keeping the switch 12 pressed throughout. The function linked in advance to the detected gesture is then selected, and the result of the selection (for example, the screen of the selected function, such as a TV playback screen or a music album folder selection screen) is displayed on the display unit of the car navigation 6 (FIG. 4(d)).
- Alternatively, the first step may be skipped by momentarily pressing and releasing the switch 12, and the operation may start directly from the second step described below.
- FIG. 5 is an example of the gesture operation in the second step.
- When the user presses the switch 12 again within a predetermined time after the first step ends, the gesture input device recognizes that the second step has started.
- At this point, the gesture input device recognizes that a gesture for operating the car navigation 6 is starting again (for example, the light 16 may be turned on again and imaging by the sensor 14 resumed).
- The screen of the function selected in the first step may then be displayed on the display unit 20 of the car navigation 6: a TV playback screen if "TV" was selected, a music album folder selection screen if "music playback" was selected, and so on (FIG. 5(b)).
- feedback of display confirmation by voice may be output from the car navigation system 6.
- When the screen of the selected function is displayed on the display unit 20 of the car navigation 6, the user performs a hand gesture within the detectable range of the sensor 14 (FIG. 5(c)), keeping the switch 12 pressed. The operation within the selected function that is linked in advance to the gesture detected by the sensor 14 is then executed, and the execution result is displayed on the display unit 20 of the car navigation 6 (FIG. 5(d)). Audio feedback confirming the display may be provided here as well. When the user then releases the switch 12, the gesture input device determines that the gesture for the operation has ended, and the second step ends (at this point, the light 16 may be turned off and the sensor 14 stopped).
- When the menu has more levels, the gesture operation can be realized by repeating the second step in the same manner.
- As described above, in the gesture input device according to the present embodiment, pressing the switch 12 clearly divides the gesture timing for a hierarchical menu into the first step and the second step, so misrecognition during transitions between gestures can be avoided.
- FIG. 6 is a diagram illustrating a specific example of the gesture sign using the shape of the user's hand.
- The gesture operation may be any of the gesture signs shown in FIG. 6, a hand-waving motion, or a combination of these.
- [Examples of gesture operation in the first step] FIGS. 7A, 7B, and 8 are diagrams illustrating examples of the gesture operation in the first step.
- When a function is selected in the first step described above, one of the functions listed in the left-right direction on the screen of the display unit 20 may be selected by pointing left or right as shown in FIGS. 7A and 7B.
- A specific function may also be indicated directly by a gesture sign. More specifically, as shown in FIG. 8, the gesture input device 1 stores in the storage unit 102 data that associates each gesture sign (or gesture motion, or combination of sign and motion) with each function of the car navigation 6 in advance. When the gesture input device 1 receives the user's gesture via the sensor 14, it accesses the data in the storage unit 102 and acquires the function associated with the received gesture, thereby determining the function selected by the user (the same applies hereinafter to gesture operations in the second step).
- [Examples of gesture operation in the second step] FIGS. 9A to 9D, 10A, and 10B are diagrams illustrating examples of the gesture operation in the second step.
- For example, when the function selected in the first step is "TV", the channel may be switched forward or backward by pointing left or right as shown in FIGS. 9A and 9B, and the volume raised or lowered by pointing up or down as shown in FIGS. 9C and 9D.
- When the function selected in the first step is "music playback", albums may be switched forward or backward by pointing left or right as in FIG. 9A, tracks switched in ascending or descending order by turning the thumb left or right as in FIG. 9B, the volume raised or lowered by extending the fingers and directing the whole hand up or down as in FIG. 9C, and the sound quality adjusted by turning the thumb up or down as in FIG. 9D.
- In addition, "response" and "ON" operations may be performed by pointing upward as shown in FIGS. 10A(a) and (b), and "rejection" and "OFF" operations by pointing downward as shown in FIGS. 10B(a) and (b).
- The gesture operation examples in the first and second steps described above are merely examples; the invention is not limited to them.
- Various functions selectable in the first step can be set in the car navigation 6, and various operations are possible within each function.
- Examples of operations in the second step include changing the map scale (wide/detail), track selection (next/previous song), music control (play/stop), volume control (+/-), mute (ON/OFF), switching map orientation (heading up/north up), one-way display (ON/OFF), intersection guidance display (ON/OFF), back camera video display (ON/OFF), TV tuning (next station/previous station), radio (AM/FM, next/previous station), and so on.
- FIG. 11 is a flowchart showing the process ST1 in the first step.
- First, it is detected that the switch 12 has been pressed by the user (step S101).
- a gesture operation by the user is accepted via the sensor 14 (step S103).
- a function is selected (determined) according to the accepted gesture operation (step S105).
- In accordance with the selection, the screen of the display unit 20 is switched and audio feedback is output (step S107).
- When it is detected that the switch 12 has been released (step S109), the gesture input device 1 determines that the gesture operation of the first step has ended, and ends the process. The second step is executed with the same processing flow as in FIG. 11.
- FIG. 12 is a flowchart showing an example of determining continuation of the second step process after the first step process.
- When it is detected that the switch 12 is pressed again within a predetermined time after the first step process ST1 ends, the second step process ST2 is executed.
- When the gesture input device 1 detects that the switch 12 is pressed only after the predetermined time has elapsed, it determines that the gesture operation of the first step, not the second step, has been started again, and executes the process accordingly. The processing in the second step is the same as that shown in FIG. 11.
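The FIG. 12 decision, whether a new press of the switch continues to the second step or restarts the first step, depends only on the elapsed time since the first step ended. A hypothetical Python sketch, with an invented 3-second value standing in for the predetermined time:

```python
PREDETERMINED_TIME = 3.0  # seconds; illustrative stand-in for the "predetermined time"

def next_step_on_press(elapsed_since_first_step):
    """Decide which step a new switch press starts (the FIG. 12 branch).

    A press within the predetermined time of the first step ending continues
    to the second step; a later press restarts the first step.
    """
    if elapsed_since_first_step <= PREDETERMINED_TIME:
        return "second_step"
    return "first_step"
```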
- FIG. 13 is a flowchart showing another example of the process ST1 in the first step.
- a gesture operation by the user is accepted via the sensor 14 (step S303).
- When it is detected that the switch 12 has been released (step S305), a function is selected (determined) according to the accepted gesture operation (step S307). For example, when the gesture operation selects one of a plurality of functions listed on the display unit 20, the function or the like being selected at the moment the switch 12 is released may be determined as the selection result.
- In accordance with the determination, the screen of the display unit 20 is switched and audio feedback is output (step S309), and the process ends.
- The difference from the example of FIG. 11 is that the release of the switch is used as the trigger for determining that the gesture operation has been completed and for finalizing the user's gesture operation. With this processing flow, the end timing of the gesture operation becomes clearer, and the processing of the gesture operation can be simplified.
- In the above examples, the user holds down the switch 12 while performing the gesture operation, but the present invention is not limited to this. For example, the user may press and release the switch 12 within a predetermined time at the start and end of the gesture operation, or the start and end of the gesture operation may be determined by pressing the switch 12 in a predetermined pattern.
Description
A method executed by a car navigation device comprising a sensor, a storage unit that stores operations on the car navigation device, whose operation menu has a hierarchical structure, in association with user gestures, a switch, and a display unit, the method comprising: a step (a) of detecting that the switch has been pressed; a step (b) of detecting the gesture via the sensor after the switch has been pressed; a step (c) of acquiring from the storage unit the operation associated with the detected gesture; a step (d) of outputting to the display unit the result of executing the acquired operation on the screen that was displayed on the display unit when the gesture was detected; and a step (e) of detecting that the switch has been released, wherein, when it is detected that the switch is pressed again within a predetermined time, steps (b) through (e) are repeated.
FIG. 1 is a diagram showing the appearance of the gesture input device according to the present embodiment. In the gesture input device 1 shown in FIG. 1, as an example, a switch 12 is provided on the steering wheel 5 of the car, the car navigation 6 includes a display unit 20, and an integrated sensor 14 and light 16 are provided below it.
FIG. 2 is a functional block diagram showing an example of the configuration of the gesture input device according to the present embodiment. The gesture input device 1 includes the switch 12, the sensor 14, the light 16, a speaker 18, the display unit 20, a storage unit 102, a control unit 104, and an information processing unit 106.
A specific example of the gesture operation in the gesture input device according to the present embodiment is described below. It is assumed that the menu of the car navigation operated by the user (or the operations performed on the car navigation; the same applies hereinafter) has a hierarchical structure. For simplicity, the case where the car navigation menu has two levels is described as an example.
FIG. 4 shows a specific example of the gesture operation in the first step. To perform a gesture operation, the user (the driver in this example) first presses the switch 12 and keeps holding it until the gesture operation is completed (FIG. 4(a)). When the user presses the switch 12, the gesture input device recognizes that a gesture for operating the car navigation 6 is starting. At this point, lighting of the light 16 and imaging by the sensor 14 may be started. A screen for function selection is also displayed on the display unit 20 of the car navigation 6 (FIG. 4(b)). Specifically, functions such as "car navigation", "TV", "music playback", and "video playback" are displayed as icons or the like. In addition, audio feedback confirming the display may be output from the car navigation 6. Since audio feedback lets the user confirm the result of a gesture operation without looking at the display unit 20, the advantage of gesture operation, namely that there is no need to gaze at or touch the screen as in conventional car navigation operation methods, is used to greater effect.
FIG. 5 shows an example of the gesture operation in the second step. When the user presses the switch 12 again within a predetermined time after the first step ends, the gesture input device recognizes that the second step has started.
Variations of the gesture operation are described below.
[Specific examples of gesture signs]
FIG. 6 is a diagram showing specific examples of gesture signs using the shape of the user's hand. The gesture operation may be any of the gesture signs shown in FIG. 6, a hand-waving motion, or a combination of these.
FIGS. 7A, 7B, and 8 are diagrams showing examples of the gesture operation in the first step. When a function is selected in the first step described above, one of the functions listed in the left-right direction on the screen of the display unit 20 may be selected by pointing left or right as shown in FIGS. 7A and 7B.
FIGS. 9A to 9D, 10A, and 10B are diagrams showing examples of the gesture operation in the second step. For example, when the function selected in the first step is "TV" and a gesture operation is performed on the TV screen in the second step, the channel may be switched forward or backward by pointing left or right as in FIGS. 9A and 9B, and the volume raised or lowered by pointing up or down as in FIGS. 9C and 9D.
An example of the processing in the gesture input device according to the present embodiment is described with reference to FIGS. 11 to 13.
While one embodiment of the present invention has been described above, the present invention is not limited to the above embodiment and may of course be practiced in various different forms within the scope of its technical idea.
Claims (7)
- A method executed by a car navigation device comprising a sensor, a storage unit that stores operations on the car navigation device, whose operation menu has a hierarchical structure, in association with user gestures, a switch, and a display unit, the method comprising:
a step (a) of detecting that the switch has been pressed;
a step (b) of detecting the gesture via the sensor after the switch has been pressed;
a step (c) of acquiring, from the storage unit, the operation associated with the detected gesture;
a step (d) of outputting to the display unit the result of executing the acquired operation on the screen that was displayed on the display unit when the gesture was detected; and
a step (e) of detecting that the switch has been released,
wherein, when it is detected that the switch is pressed again within a predetermined time, steps (b) through (e) are repeated.
- The method according to claim 1, wherein the car navigation device executes steps (c) and (d) triggered by the execution of step (e), and, when it is detected that the switch is pressed again within the predetermined time, the car navigation device further executes steps (c) and (d) triggered by the execution of step (e).
- The method according to claim 1 or 2, wherein the car navigation device further comprises a light, the light is turned ON triggered by the execution of step (a), and the light is turned OFF triggered by the execution of step (e).
- The method according to any one of claims 1 to 3, wherein step (d) outputs the result of the execution by audio output in addition to the output to the display unit.
- A car navigation device that executes the method according to any one of claims 1 to 4.
- A computer program for causing a computer to execute the method according to any one of claims 1 to 4.
- A computer-readable recording medium storing a computer program for causing a computer to execute the method according to any one of claims 1 to 4.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP14807044.4A EP3007036A4 (en) | 2013-06-07 | 2014-06-06 | Gesture input device for car navigation device |
CN201480032583.4A CN105264463B (zh) | 2013-06-07 | 2014-06-06 | 用于汽车导航仪的手势输入装置 |
CA2914712A CA2914712C (en) | 2013-06-07 | 2014-06-06 | Gesture input apparatus for car navigation system |
KR1020157034801A KR20160009037A (ko) | 2013-06-07 | 2014-06-06 | 카네비게이션용 제스처 입력장치 |
US14/961,591 US9662980B2 (en) | 2013-06-07 | 2014-06-06 | Gesture input apparatus for car navigation system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013120899A JP5750687B2 (ja) | 2013-06-07 | 2013-06-07 | カーナビ用ジェスチャ入力装置 |
JP2013-120899 | 2013-06-07 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014196208A1 true WO2014196208A1 (ja) | 2014-12-11 |
Family
ID=52007865
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/003024 WO2014196208A1 (ja) | 2013-06-07 | 2014-06-06 | カーナビ用ジェスチャ入力装置 |
Country Status (7)
Country | Link |
---|---|
US (1) | US9662980B2 (ja) |
EP (1) | EP3007036A4 (ja) |
JP (1) | JP5750687B2 (ja) |
KR (1) | KR20160009037A (ja) |
CN (1) | CN105264463B (ja) |
CA (1) | CA2914712C (ja) |
WO (1) | WO2014196208A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017114191A (ja) * | 2015-12-22 | 2017-06-29 | クラリオン株式会社 | 車載装置 |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015043653A1 (de) * | 2013-09-27 | 2015-04-02 | Volkswagen Aktiengesellschaft | Anwenderschnittstelle und verfahren zur unterstützung eines anwenders bei der bedienung einer bedieneinheit |
KR101640053B1 (ko) * | 2015-01-02 | 2016-07-18 | 현대자동차주식회사 | 차량용 디스플레이 장치 및 이를 포함하는 차량 |
JP2017027456A (ja) * | 2015-07-24 | 2017-02-02 | 島根県 | ジェスチャ操作システム、方法およびプログラム |
JP2018022318A (ja) * | 2016-08-03 | 2018-02-08 | 株式会社東海理化電機製作所 | 操作入力装置 |
US10832031B2 (en) | 2016-08-15 | 2020-11-10 | Apple Inc. | Command processing using multimodal signal analysis |
US10913463B2 (en) * | 2016-09-21 | 2021-02-09 | Apple Inc. | Gesture based control of autonomous vehicles |
US10372132B2 (en) | 2016-12-12 | 2019-08-06 | Apple Inc. | Guidance of autonomous vehicles in destination vicinities using intent signals |
CN108944451B (zh) * | 2018-06-28 | 2020-09-01 | 北京车和家信息技术有限公司 | 一种显示控制方法及车辆 |
JP7484756B2 (ja) * | 2021-02-05 | 2024-05-16 | 株式会社デンソー | 表示システム |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001216069A (ja) | 2000-02-01 | 2001-08-10 | Toshiba Corp | Operation input device and direction detection method |
JP2006135703A (ja) * | 2004-11-08 | 2006-05-25 | Hitachi Ltd | Imaging device |
JP2006285370A (ja) | 2005-03-31 | 2006-10-19 | Mitsubishi Fuso Truck & Bus Corp | Hand pattern switch device and hand pattern operation method |
JP2009104297A (ja) | 2007-10-22 | 2009-05-14 | Mitsubishi Electric Corp | Operation input device |
JP2010097332A (ja) * | 2008-10-15 | 2010-04-30 | Toyota Motor Corp | Input support device |
JP2011131833A (ja) * | 2009-12-25 | 2011-07-07 | Honda Access Corp | Operating device for in-vehicle equipment in automobile |
JP2012147440A (ja) | 2011-01-07 | 2012-08-02 | Apple Inc | Wireless remote control device for portable media device |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6703999B1 (en) * | 2000-11-13 | 2004-03-09 | Toyota Jidosha Kabushiki Kaisha | System for computer user interface |
JP4585748B2 (ja) * | 2003-08-05 | 2010-11-24 | Nippon Unicar Co., Ltd. | Flame-retardant olefin resin-coated metal wire and method for producing the same |
JP4311190B2 (ja) * | 2003-12-17 | 2009-08-12 | Denso Corporation | Interface for in-vehicle equipment |
US7454717B2 (en) | 2004-10-20 | 2008-11-18 | Microsoft Corporation | Delimiters for selection-action pen gesture phrases |
JP2006312346A (ja) * | 2005-05-06 | 2006-11-16 | Nissan Motor Co Ltd | Command input device |
JP2009248629A (ja) | 2008-04-02 | 2009-10-29 | Hitachi Ltd | Input device and input method for in-vehicle equipment |
KR101417090B1 (ko) | 2008-07-29 | 2014-07-09 | Hyundai Motor Company | Method for operating integrated switch for vehicle |
JP2010184600A (ja) * | 2009-02-12 | 2010-08-26 | Autonetworks Technologies Ltd | In-vehicle gesture switch device |
US20110221666A1 (en) | 2009-11-24 | 2011-09-15 | Not Yet Assigned | Methods and Apparatus For Gesture Recognition Mode Control |
US9019201B2 (en) | 2010-01-08 | 2015-04-28 | Microsoft Technology Licensing, Llc | Evolving universal gesture sets |
US8817087B2 (en) | 2010-11-01 | 2014-08-26 | Robert Bosch Gmbh | Robust video-based handwriting and gesture recognition for in-car applications |
WO2012063247A1 (en) | 2010-11-12 | 2012-05-18 | Hewlett-Packard Development Company, L.P. | Input processing |
US9292093B2 (en) * | 2010-11-18 | 2016-03-22 | Alpine Electronics, Inc. | Interface method and apparatus for inputting information with air finger gesture |
GB2488784A (en) * | 2011-03-07 | 2012-09-12 | Sharp Kk | A method for user interaction of the device in which a template is generated from an object |
JP2012212237A (ja) * | 2011-03-30 | 2012-11-01 | Namco Bandai Games Inc | Image generation system, server system, program, and information storage medium |
US8873841B2 (en) | 2011-04-21 | 2014-10-28 | Nokia Corporation | Methods and apparatuses for facilitating gesture recognition |
DE102012012697A1 (de) * | 2012-06-26 | 2014-01-02 | Leopold Kostal Gmbh & Co. Kg | Operating system for a motor vehicle |
JP6202810B2 (ja) * | 2012-12-04 | 2017-09-27 | Alpine Electronics, Inc. | Gesture recognition device, method, and program |
2013
- 2013-06-07 JP JP2013120899A patent/JP5750687B2/ja not_active Expired - Fee Related
2014
- 2014-06-06 CA CA2914712A patent/CA2914712C/en not_active Expired - Fee Related
- 2014-06-06 WO PCT/JP2014/003024 patent/WO2014196208A1/ja active Application Filing
- 2014-06-06 US US14/961,591 patent/US9662980B2/en not_active Expired - Fee Related
- 2014-06-06 KR KR1020157034801A patent/KR20160009037A/ko not_active Application Discontinuation
- 2014-06-06 CN CN201480032583.4A patent/CN105264463B/zh not_active Expired - Fee Related
- 2014-06-06 EP EP14807044.4A patent/EP3007036A4/en not_active Withdrawn
Non-Patent Citations (1)
Title |
---|
See also references of EP3007036A4 |
Also Published As
Publication number | Publication date |
---|---|
KR20160009037A (ko) | 2016-01-25 |
EP3007036A1 (en) | 2016-04-13 |
JP2014238711A (ja) | 2014-12-18 |
JP5750687B2 (ja) | 2015-07-22 |
CA2914712C (en) | 2017-09-26 |
CN105264463B (zh) | 2017-05-10 |
US20160137061A1 (en) | 2016-05-19 |
CA2914712A1 (en) | 2014-12-11 |
US9662980B2 (en) | 2017-05-30 |
EP3007036A4 (en) | 2017-04-19 |
CN105264463A (zh) | 2016-01-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5750687B2 (ja) | Gesture input device for car navigation | |
US10592198B2 (en) | Audio recording/playback device | |
US10029723B2 (en) | Input system disposed in steering wheel and vehicle including the same | |
US9103691B2 (en) | Multimode user interface of a driver assistance system for inputting and presentation of information | |
KR102029842B1 (ko) | Vehicle gesture recognition system and control method thereof | |
JP5172485B2 (ja) | Input device and control method of input device | |
JP2007164814A (ja) | Interface device | |
EP2330486A1 (en) | Image display device | |
US10967737B2 (en) | Input device for vehicle and input method | |
JP2014046867A (ja) | Input device | |
JP6269343B2 (ja) | Operating device for vehicle | |
US9904467B2 (en) | Display device | |
US20140281964A1 (en) | Method and system for presenting guidance of gesture input on a touch pad | |
JP2018036902A (ja) | Device operation system, device operation method, and device operation program | |
JP2016038621A (ja) | Spatial input system | |
KR20150000076A (ko) | Blind control system for vehicle | |
JP2016097928A (ja) | Display control device for vehicle | |
KR20180036556A (ko) | Operating device for vehicle | |
JP5420081B2 (ja) | Navigation device | |
JP2019133395A (ja) | Input device | |
US9582150B2 (en) | User terminal, electronic device, and control method thereof | |
JP6798608B2 (ja) | Navigation system and navigation program | |
JP4908986B2 (ja) | Electronic device having touch panel display function and touch panel control method | |
JP6558380B2 (ja) | Vehicle input device, input device, and control method of vehicle input device | |
KR20220026074A (ко) | Apparatus and method for providing description of vehicle audio/video functions using augmented reality | |
Legal Events
Code | Title | Description |
---|---|---|
WWE | Wipo information: entry into national phase | Ref document number: 201480032583.4; Country of ref document: CN |
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 14807044; Country of ref document: EP; Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase | Ref document number: 2014807044; Country of ref document: EP |
ENP | Entry into the national phase | Ref document number: 2914712; Country of ref document: CA; Ref document number: 20157034801; Country of ref document: KR; Kind code of ref document: A |
NENP | Non-entry into the national phase | Ref country code: DE |
WWE | Wipo information: entry into national phase | Ref document number: 14961591; Country of ref document: US |