WO2007066488A1 - Display device and touch panel operation/control method - Google Patents

Display device and touch panel operation/control method

Info

Publication number
WO2007066488A1
WO2007066488A1 PCT/JP2006/322925 JP2006322925W
Authority
WO
WIPO (PCT)
Prior art keywords
images
image
menu
display device
display
Prior art date
Application number
PCT/JP2006/322925
Other languages
English (en)
Japanese (ja)
Inventor
Hiroshi Kobayashi
Hiroki Suzuki
Yuji Takatori
Original Assignee
Pioneer Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pioneer Corporation
Publication of WO2007066488A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/26Power supply means, e.g. regulation thereof
    • G06F1/32Means for saving power
    • G06F1/3203Power management, i.e. event-based initiation of a power-saving mode
    • G06F1/3206Monitoring of events, devices or parameters that trigger a change in power modality
    • G06F1/3231Monitoring the presence, absence or movement of users
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00Energy efficient computing, e.g. low power processors, power management or thermal management

Definitions

  • The display unit has a function to display an operation menu, such as an OSD (On Screen Display) menu, used for setting the specifications of each of a plurality of images at the time of their reproduction, by superimposing it on the corresponding image.
  • OSD (On Screen Display)
  • FIG. 1 is a block diagram showing the configuration of the navigation device according to an embodiment of the present invention.
  • FIG. 2 is a front view showing the configuration of the navigation device shown in FIG. 1. FIGS. 3 and 4 are flowcharts for explaining the processing executed by the control unit shown in FIG. 1.
  • As shown in FIG. 1, this navigation device comprises a control unit 10, an operation unit 11, an image display unit 12, an audio output unit 13, a presence detection unit 14, a touch panel unit 15, an optical disc drive 16, and a hard disk 17. Like a general navigation device, it is also equipped with a voice input unit, a travel sensor, a GPS receiver, a media transmitting/receiving unit, and so on, but since these are not directly relevant to the present invention, their description is omitted.
  • The control unit 10 includes an operation menu display control section 101 that constitutes a display control means, an approach direction detection section 102 that constitutes a detection means, and an operation mode implementation section 103 that constitutes an implementation means.
  • In order to detect from which of the left and right directions an object used for a touch operation (for example, a person's finger) is approaching, the presence detection unit 14 comprises a left sensor 41 and a right sensor 42.
  • The operation unit 11, as shown in FIG. 2, includes a number of keys provided on the body of the navigation device, the touch panel 15 provided on the screen of the image display unit 12, and a plurality of soft keys displayed on that screen.
  • Through the operation unit 11, the navigation parameters are set: for example, the destination is set, the information to be displayed is selected, and the driving situation of the vehicle is checked.
  • The image display unit 12 comprises a display device capable of displaying two images toward two different directions on a single screen, an image display controller such as a graphic renderer that processes the display data sent from the control unit 10, and a display memory for storing that data. This image display unit 12 displays the map information, route information, and operation guidance from the control unit 10, as well as the video reproduced by the optical disc drive 16.
  • a display device capable of displaying two images toward two different directions on a single screen
  • an image display controller such as a graphic renderer, and a display memory for storing the display data
  • this image display unit 12 displays the map information, route information, and operation guidance from the control unit 10, as well as the video reproduced by the optical disc drive 16
  • The audio output unit 13 is composed of a unit that synthesizes voice signals under the control of the control unit 10, an amplifier, and a speaker that converts the amplified voice signal into sound and emits it to the outside.
  • This audio output unit 13 provides voice guidance on the vehicle's direction of travel, driving conditions, and traffic conditions under the control of the control unit 10, and also outputs the audio reproduced by the optical disc drive 16 and the like.
  • The presence detection unit 14 comprises the above-mentioned left sensor 41 and right sensor 42 and a circuit for driving them.
  • The left sensor 41 and the right sensor 42 are symmetrically arranged along the left and right sides of the touch panel 15.
  • As the left sensor 41 and the right sensor 42, for example, sensors that use infrared rays to detect the presence of an object within a range of roughly 10 cm to 20 cm can be used.
  • The touch panel unit 15 comprises the above-mentioned touch panel 15 and a circuit for driving it; it obtains data on the position of a touch operation performed on the touch panel 15 and gives this data to the control unit 10.
  • The optical disc drive 16 reads the image data and audio data recorded on a medium such as a CD (Compact Disc) or a DVD-ROM (Digital Versatile Disk Read Only Memory) and gives them to the control unit 10, which processes the given image data and audio data and outputs them to the image display unit 12 and the audio output unit 13, respectively.
  • The hard disk 17 stores the various programs executed by the control unit 10, and comprises a menu memory 71 that stores the image data related to the operation menus and a map data memory 72 that stores the navigation map data.
  • In the map data memory 72, for example, map data recorded on a medium such as a DVD-ROM is loaded through the optical disc drive 16 and retained.
  • This touch panel control method is composed of a process started periodically by an interrupt at fixed time intervals and of the processing executed as tasks within it. Each of these processes is described below.
  • FIG. 3 shows the periodic processing executed in the navigation device.
  • FIG. 4 shows the mode setting process executed within that processing.
  • In step S1, the control unit executes the mode setting process. As shown in FIG. 4, this process starts with the approach direction detection section 102 judging whether the detection signal of the left sensor 41 has turned ON (step S11).
  • If the judgment on this value is affirmative (step S11: YES), an object is approaching the left sensor 41, so the approach direction detection section 102 next judges whether the detection signal of the right sensor 42 has turned ON (step S12). If that judgment is negative (step S12: NO), no object is present at the right sensor 42 and the object is approaching from the left: of the left sensor 41 and the right sensor 42, only the former detects it.
  • Therefore, the operation mode implementation section 103 selectively enables the operation mode set for the operation menu of the video image viewable from the left (step S13).
  • The control unit sets the value of the OSD flag to 1 to indicate that the menu displayed at this time is the left-direction menu (step S14).
  • The control unit also sets the value of the timer that determines the duration of the operation mode (step S15). The duration is the timer value multiplied by the interval of the periodic process of FIG. 3; if, for example, that product is 5 seconds, the operation mode remains enabled for 5 seconds.
  • If the judgment in step S11 is negative, the approach direction detection section 102 judges whether the detection signal of the right sensor 42 has turned ON (step S16). If this judgment is affirmative (step S16: YES), no object is present at the left sensor 41 and the object is approaching from the right: of the left sensor 41 and the right sensor 42, only the latter detects it.
  • Therefore, the operation mode implementation section 103 selectively enables the operation mode set for the operation menu of the video image viewable from the right (step S17).
  • The control unit sets the value of the OSD flag to 2 to indicate that the menu displayed at this time is the right-direction menu (step S18).
  • The control unit then sets the value of the timer for the duration of the operation mode, as in step S15 described above.
  • If the judgments regarding both sensors are affirmative (steps S11 and S12: YES), an object is present at both the left sensor 41 and the right sensor 42; that is, a touch operation on the touch panel 15 is currently being performed. Therefore, the operation mode implementation section 103 continues the current left or right operation mode (step S19), and the control unit resets the timer for the duration of the operation mode as in step S15 described above. Once the timer value is set, control returns to the process of FIG. 3.
  • If the judgments regarding both sensors are negative, the control unit decrements the timer value (step S20) and judges whether the timer value has reached 0 (step S21).
  • If the judgment on the timer value is affirmative (step S21: YES), the duration of the currently enabled operation mode has already elapsed. Therefore, the operation mode implementation section 103 terminates the currently enabled left or right operation mode (step S22), and the control unit sets the value of the OSD flag to 0 to indicate that no menu is displayed at this time (step S23).
  • Control then returns to the processing shown in FIG. 3. If the judgment on the timer value is negative (step S21: NO), the duration of the current operation mode has not yet elapsed; in this case too, control returns to the process shown in FIG. 3.
  • Returning to FIG. 3, the control unit judges whether the value of the OSD flag is set to 1 (step S2).
  • If it is (step S2: YES), the control unit reads out the image data representing the left-direction menu from the menu memory 71 on the hard disk 17 and displays the corresponding menu on the screen of the image display unit 12 (step S3). The control unit then ends the processing for this cycle.
  • If the value of the OSD flag is not 1, the control unit judges whether it is set to 2 (step S4).
  • If it is (step S4: YES), the control unit reads out the image data representing the right-direction menu from the menu memory 71 on the hard disk 17 and displays the corresponding menu on the screen of the image display unit 12 (step S5). The control unit then ends the processing for this cycle.
  • If the value of the OSD flag is not set to 2 either (step S4: NO), the control unit displays neither the left-direction nor the right-direction menu (step S6). The control unit then ends the processing for this cycle.
  • As described above, the approach direction detection section 102 and the presence detection unit 14 detect from which of the left and right directions an object used for a touch operation on the touch panel 15 is approaching. When the approach direction of the object is detected, the operation mode implementation section 103 selectively enables the operation mode set for touch operations on the touch panel 15, so that only the specification setting operation for the operation menu displayed for the image viewable from that direction is allowed. In the present embodiment, the single action of bringing an object such as a finger close to the touch panel 15 selectively displays the operation menu related to the image in the corresponding direction, and the setting operation for the displayed menu can then be performed in succession to this action. As a result, on a display device using a display panel having a function of displaying a plurality of images in a divided manner, menu operation becomes possible with a very simple operation procedure. (A code sketch of this control flow is given after this list.)
  • The present invention is not limited to the above embodiment and can be modified in various ways.
  • In the above embodiment, an example in which the present invention is applied to a navigation device was given, but the present invention can of course also be applied to image display devices in other kinds of equipment.
  • In the above embodiment, the images displayed on the image display unit 12 are distributed in the left and right directions; however, if a display device that distributes and displays images in the four directions of up, down, left, and right is adopted, the present invention can be applied in the same way. In this case, the number of presence sensors composing the presence detection unit 14 should be four, and these should be symmetrically arranged around the touch panel 15 along the four directions of up, down, left, and right.
  • Although the above embodiment assumes that the touch panel control method is realized by dedicated hardware in the control unit, the control unit may instead be formed by a microcomputer equipped with a CPU (Central Processing Unit), a ROM (Read Only Memory), and a RAM (Random Access Memory).
  • In that case, the corresponding touch panel control program is recorded on a computer-readable recording medium such as a disk device or a CD-ROM, and the computer reads and executes it.
  • The program may also be distributed via a network such as the Internet.
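
A minimal sketch of the mode setting process just described (FIG. 4, steps S11 to S23) is given below. Every name in it (OsdFlag, ModeController, the sensor-reading callables, the tick-based timer) is an illustrative assumption rather than terminology from the patent, which targets dedicated hardware or a microcomputer, not Python; the sketch only restates the control flow described above.

```python
from enum import IntEnum

class OsdFlag(IntEnum):
    NONE = 0   # no menu displayed (step S23)
    LEFT = 1   # left-direction menu selected (step S14)
    RIGHT = 2  # right-direction menu selected (step S18)

class ModeController:
    """Sketch of the mode setting process of FIG. 4 (steps S11 to S23)."""

    def __init__(self, read_left_sensor, read_right_sensor, duration_ticks=50):
        # read_left_sensor / read_right_sensor: callables returning True while
        # the corresponding infrared sensor (41 / 42) detects an object.
        self.read_left = read_left_sensor
        self.read_right = read_right_sensor
        self.duration_ticks = duration_ticks  # timer value loaded in step S15
        self.timer = 0
        self.osd_flag = OsdFlag.NONE

    def tick(self):
        """Run once per periodic interrupt of the FIG. 3 process."""
        left, right = self.read_left(), self.read_right()  # steps S11/S12/S16
        if left and not right:
            # Object approaching from the left: enable the left-side mode and
            # menu, and (re)load the duration timer (steps S13-S15).
            self.osd_flag = OsdFlag.LEFT
            self.timer = self.duration_ticks
        elif right and not left:
            # Object approaching from the right (steps S17, S18, then S15).
            self.osd_flag = OsdFlag.RIGHT
            self.timer = self.duration_ticks
        elif left and right:
            # Both sensors ON: a touch operation is in progress, so keep the
            # current mode and restart the timer (step S19).
            self.timer = self.duration_ticks
        else:
            # Neither sensor ON: count down (steps S20/S21); at zero the mode
            # ends and the menu is withdrawn (steps S22/S23).
            if self.timer > 0:
                self.timer -= 1
                if self.timer == 0:
                    self.osd_flag = OsdFlag.NONE
```

Restarting the timer whenever either sensor fires reproduces the behavior of steps S15 and S19: the menu stays up as long as a hand remains near the panel.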
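The display process of FIG. 3 (steps S1 to S6) then reduces to a dispatch on the OSD flag. Continuing the sketch above, menu_memory and screen are hypothetical stand-ins for the menu memory 71 and the image display unit 12:

```python
def display_process(controller, menu_memory, screen):
    """One cycle of the FIG. 3 process: run the mode setting process
    (step S1), then show whichever menu the OSD flag selects."""
    controller.tick()  # step S1: the mode setting process of FIG. 4

    if controller.osd_flag == OsdFlag.LEFT:     # step S2
        screen.show(menu_memory.load("left"))   # step S3: left-direction menu
    elif controller.osd_flag == OsdFlag.RIGHT:  # step S4
        screen.show(menu_memory.load("right"))  # step S5: right-direction menu
    else:
        screen.clear_menu()                     # step S6: no menu displayed
```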
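As a quick check of the single-gesture behavior described above — one approach both selects and displays the corresponding menu, which later times out on its own — the two sketches can be wired together with fake sensors and a fake display (again, purely illustrative):

```python
class FakeMenuMemory:
    def load(self, side):
        return f"<{side}-direction menu image>"

class FakeScreen:
    def show(self, image):
        print("display:", image)
    def clear_menu(self):
        print("display: no menu")

left_near = {"value": False}
ctrl = ModeController(lambda: left_near["value"], lambda: False, duration_ticks=3)

left_near["value"] = True   # a finger approaches from the left...
display_process(ctrl, FakeMenuMemory(), FakeScreen())  # left menu appears
left_near["value"] = False  # ...and withdraws again
for _ in range(3):          # the timer counts down over the next cycles...
    display_process(ctrl, FakeMenuMemory(), FakeScreen())
# ...and the final cycle prints "display: no menu" (steps S22/S23, then S6)
```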

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Navigation (AREA)
  • Position Input By Displaying (AREA)

Abstract

The present invention relates to a display device comprising: an approach direction detection section (102) (presence detection unit (140)) for detecting from which of the two directions, left and right, an object used to perform a touch operation has approached; an operation menu display control section (101) for displaying an operation menu used for specification setting when each of the left-direction and right-direction images is reproduced, the operation menu being displayed superimposed on the corresponding video image; and an operation mode implementation section (103) for selectively implementing an operation mode established for touch operations on a touch panel so as to correspond to the operation menu, the operation mode being implemented such that, when the approach direction of the object is detected by the approach direction detection section (102), only the specification setting operation relating to the operation menu displayed by the operation menu display control section (101) for the video image viewable from the corresponding approach direction is permitted. Consequently, in a display device using a display panel having a function of displaying several video images in a divided manner, menu operation can be performed by a simple operation procedure.
PCT/JP2006/322925 2005-12-09 2006-11-17 Display device and touch panel operation/control method WO2007066488A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2005-356083 2005-12-09
JP2005356083 2005-12-09

Publications (1)

Publication Number Publication Date
WO2007066488A1 (fr) 2007-06-14

Family

ID=38122638

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2006/322925 WO2007066488A1 (fr) 2005-12-09 2006-11-17 Display device and touch panel operation/control method

Country Status (1)

Country Link
WO (1) WO2007066488A1 (fr)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004067031A (ja) * 2002-08-08 2004-03-04 Nissan Motor Co Ltd Operator discrimination device and in-vehicle device using the same
JP2004233816A (ja) * 2003-01-31 2004-08-19 Olympus Corp Video display device and video display method
JP2005071286A (ja) * 2003-08-28 2005-03-17 Sharp Corp Display device

Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7903094B2 (en) 2006-06-23 2011-03-08 Wacom Co., Ltd Information processing apparatus, operation input method, and sensing device
EP2028585A1 (fr) * 2007-08-21 2009-02-25 Wacom Co., Ltd. Information processing apparatus, operation input method and computer program product
US11983371B2 (en) 2007-10-04 2024-05-14 Apple Inc. Single-layer touch-sensitive display
US11269467B2 (en) 2007-10-04 2022-03-08 Apple Inc. Single-layer touch-sensitive display
US11294503B2 (en) 2008-01-04 2022-04-05 Apple Inc. Sensor baseline offset adjustment for a subset of sensor output values
US8619034B2 (en) 2008-06-10 2013-12-31 Sony Europe (Belgium) Nv Sensor-based display of virtual keyboard image and associated methodology
EP2133778A2 (fr) 2008-06-10 2009-12-16 Sony Service Centre (Europe) N.V. Écran tactil avec un clavier virtuel et au moins un capteur de proximité
TWI552124B (zh) * 2009-02-02 2016-10-01 蘋果公司 用於顯示器資料線之雙組態
US9996175B2 (en) 2009-02-02 2018-06-12 Apple Inc. Switching circuitry for touch sensitive display
US10001888B2 (en) 2009-04-10 2018-06-19 Apple Inc. Touch sensor panel design
US9582131B2 (en) 2009-06-29 2017-02-28 Apple Inc. Touch sensor panel design
US9874975B2 (en) 2012-04-16 2018-01-23 Apple Inc. Reconstruction of original touch image from differential touch image
US9886141B2 (en) 2013-08-16 2018-02-06 Apple Inc. Mutual and self capacitance touch measurements in touch panel
US10936120B2 (en) 2014-05-22 2021-03-02 Apple Inc. Panel bootstraping architectures for in-cell self-capacitance
US10289251B2 (en) 2014-06-27 2019-05-14 Apple Inc. Reducing floating ground effects in pixelated self-capacitance touch screens
US9880655B2 (en) 2014-09-02 2018-01-30 Apple Inc. Method of disambiguating water from a finger touch on a touch sensor panel
US10705658B2 (en) 2014-09-22 2020-07-07 Apple Inc. Ungrounded user signal compensation for pixelated self-capacitance touch sensor panel
US11625124B2 (en) 2014-09-22 2023-04-11 Apple Inc. Ungrounded user signal compensation for pixelated self-capacitance touch sensor panel
US11561647B2 (en) 2014-10-27 2023-01-24 Apple Inc. Pixelated self-capacitance water rejection
US10712867B2 (en) 2014-10-27 2020-07-14 Apple Inc. Pixelated self-capacitance water rejection
US10795488B2 (en) 2015-02-02 2020-10-06 Apple Inc. Flexible self-capacitance and mutual capacitance touch sensing system architecture
US11353985B2 (en) 2015-02-02 2022-06-07 Apple Inc. Flexible self-capacitance and mutual capacitance touch sensing system architecture
US12014003B2 (en) 2015-02-02 2024-06-18 Apple Inc. Flexible self-capacitance and mutual capacitance touch sensing system architecture
US10488992B2 (en) 2015-03-10 2019-11-26 Apple Inc. Multi-chip touch architecture for scalability
TWI602086B (zh) * 2015-06-30 2017-10-11 ASUSTeK Computer Inc. Touch device and operation method thereof
US10365773B2 (en) 2015-09-30 2019-07-30 Apple Inc. Flexible scan plan using coarse mutual capacitance and fully-guarded measurements
US10444918B2 (en) 2016-09-06 2019-10-15 Apple Inc. Back of cover touch sensors
US10642418B2 (en) 2017-04-20 2020-05-05 Apple Inc. Finger tracking in wet environment
US10386965B2 (en) 2017-04-20 2019-08-20 Apple Inc. Finger tracking in wet environment
US11662867B1 (en) 2020-05-30 2023-05-30 Apple Inc. Hover detection on a touch sensor panel

Similar Documents

Publication Publication Date Title
WO2007066488A1 (fr) Display device and touch panel operation/control method
US10466800B2 (en) Vehicle information processing device
EP2650164B1 (fr) On-board information system, on-board apparatus and information terminal
JP6282188B2 (ja) Information processing device
CN105283356B (zh) Application program control method and information terminal
JP5028038B2 (ja) In-vehicle display device and display method for in-vehicle display device
JP5040901B2 (ja) In-vehicle information device and in-vehicle information system
JP4628199B2 (ja) Display device
JP5305039B2 (ja) Display device, display method, and display program
JP2007241410A (ja) Display device and display control method
JP2009193135A (ja) In-vehicle monitor device
JP5795177B2 (ja) Information processing device, information processing method, and program
JP2016038621A (ja) Spatial input system
JP2005196530A (ja) Spatial input device and spatial input method
JP2006001498A (ja) In-vehicle unit device and operation method using a touch panel
JP7038560B2 (ja) Information processing device and information processing method
JP5261878B2 (ja) In-vehicle image display control device and in-vehicle image display control program
JP6033465B2 (ja) Display control device
JP2012083831A (ja) Touch panel device, touch panel display method, touch panel display processing program, and recording medium
US20160253088A1 (en) Display control apparatus and display control method
JP6655997B2 (ja) Video control device
JP6180306B2 (ja) Display control device and display control method
JP4351921B2 (ja) Navigation device, information presentation method, and navigation program
JP2007112283A (ja) Navigation device
JP2010185686A (ja) Information providing device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 06832804; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: JP)