WO2010089980A1 - Image display device - Google Patents
Image display device
- Publication number
- WO2010089980A1 (PCT/JP2010/000513)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- touch panel
- finger
- image display
- operator
- menu
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3664—Details of the user input interface, e.g. buttons, knobs or sliders, including those provided on a touch screen; remote controllers; input using gestures
-
- B60K35/10—Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
-
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/21—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
- B60K35/22—Display screens
-
- B60K35/80—Arrangements for controlling instruments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/044—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
- G06F3/0446—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means using a grid-like structure of electrodes in at least two directions, e.g. using row and column electrodes
-
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/11—Instrument graphical user interfaces or menu aspects
-
- B60K2360/115—Selection of menu items
-
- B60K2360/141—Activation of instrument input devices by approaching fingers or pens
-
- B60K2360/143—Touch sensitive instrument input devices
- B60K2360/1438—Touch screens
- B60K2360/1442—Emulation of input devices
-
- B60K2360/55—Remote control arrangements
- B60K2360/56—Remote control arrangements using mobile devices
- B60K2360/573—Mobile devices controlling vehicle functions
-
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04101—2.5D-digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface and also measures the distance of the input means within a short range in the Z direction, possibly with a separate measurement setup
-
- G06F2203/04108—Touchless 2D- digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface without distance measurement in the Z direction
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2380/00—Specific applications
- G09G2380/10—Automotive applications
Definitions
- the present invention relates to an image display device provided with a touch panel, and more particularly to an image display device used for an in-vehicle device such as a navigation device.
- A conventional touch panel has a transmissive touch panel on the surface of a liquid crystal display panel and a finger approach signal processing unit that detects the approach of a user's finger using infrared rays or microwaves.
- In a navigation device using this conventional touch panel, when a menu is to be displayed on the display unit and a user operation is accepted via the touch panel, the approach of the finger to the touch panel is detected by the finger approach signal processing unit and the menu is displayed on the display unit (Patent Document 1).
- Such a conventional image display device is applied, for example, to a car navigation apparatus, where a display screen of limited size must be used effectively to provide various kinds of information reliably; a menu is displayed when a finger approaches the touch panel.
- However, because the conventional apparatus senses the approach of a finger using infrared rays or microwaves, a space for installing a signal processing unit that detects the approach of the finger is required around a display device such as a liquid crystal display. The apparatus therefore becomes complicated and large, and it is difficult to install in the limited installation space of a vehicle.
- Since the approach of the user's finger is sensed using infrared rays or microwaves, space for the signal processing unit must be secured around the display unit of the image display device. When the image display device is used as an on-vehicle device, its installation space is limited, so this space must be secured at the expense of the display area.
- As a result, a liquid crystal panel of the currently mainstream 7-to-9-inch class cannot be used as the display unit, and a smaller one must be used. The contents displayed on the image display device are then difficult for the user to see, and the menu buttons and the like displayed on the display unit must be made small, which makes the device difficult for the user (operator) to operate.
- The present invention has been made to solve these conventional problems. Its object is to provide an image display device that requires no space around the display unit for a signal processing unit that detects the approach of a user's finger, that displays menu buttons on the display unit when the operator's finger approaches the touch panel, and that changes the number of menu icons displayed on the display unit when the operator slides a finger on the touch panel while pressing a displayed menu button, so that the device is easy to operate and the content displayed on the display unit is easy to see.
- The image display device of the present invention is an image display device in which the options for executing its functions are hierarchized. It comprises an image display unit for displaying the options, a capacitance-type touch panel arranged on the surface of the image display unit, a signal processing unit that detects the capacitance changed in response to an input operation on the touch panel, and a control unit that executes the function corresponding to the input operation.
- The sensitivity of the touch panel is switched between a high setting, used while waiting for the operator's finger to approach, and a lower setting, used once an input operation is detected.
- When lower-hierarchy options are to be displayed on the image display unit, the control unit sets the display area for the lower-hierarchy options to a size corresponding to the distance the operator's finger moves while pressing on the touch panel, and displays the lower-hierarchy options in that display area.
- With this configuration, buttons are displayed when a finger is brought close to the touch panel, and the number of menus displayed can be changed by sliding the button, so the display screen is easy to operate and easy to see.
- The signal processing unit converts a change in capacitance into a voltage. The control unit determines whether the voltage converted by the signal processing unit is larger than a first threshold value, and changes the sensitivity of the touch panel by comparing the converted voltage with a second threshold value that is larger than the first threshold value.
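For illustration only (this sketch is not part of the patent disclosure), the two-threshold comparison described above can be expressed in Python. All function names and numeric threshold values here are hypothetical:

```python
# Hypothetical sketch of the two-threshold scheme: a first (smaller)
# threshold detects the approach of the finger, and a second (larger)
# threshold detects contact; the panel sensitivity is switched accordingly.

APPROACH_THRESHOLD_V = 0.5  # first threshold (hypothetical value)
TOUCH_THRESHOLD_V = 2.0     # second, larger threshold (hypothetical value)

def classify_input(voltage: float) -> str:
    """Classify the voltage converted from the capacitance change."""
    if voltage >= TOUCH_THRESHOLD_V:
        return "touch"      # finger is in contact with the panel
    if voltage >= APPROACH_THRESHOLD_V:
        return "approach"   # finger is near but not touching
    return "none"           # no input operation detected

def select_sensitivity(state: str) -> str:
    """High sensitivity while waiting for an approach, low once detected."""
    return "low" if state in ("approach", "touch") else "high"
```

A voltage between the two thresholds is treated as an approach, which is what allows the menu buttons to be displayed before the finger actually touches the panel.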
- The control unit divides the image display unit into a plurality of regions and displays the upper-hierarchy selection menu button corresponding to the region in which the operator's finger is detected.
- The control unit displays the upper-hierarchy options at at least one corner of the image display unit. With this configuration, the information at the center of the image displayed before the menu button appears is not hidden.
- In the image display device of the present invention, when it is detected that the operator's finger has approached the touch panel and the capacitance has become a predetermined amount or more, the upper-hierarchy option is displayed at the corner of the image display unit closest to the finger position at which the input operation was detected.
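For illustration only, the nearest-corner selection described above can be sketched as follows; the corner names, function name, and screen dimensions are hypothetical:

```python
# Hypothetical sketch: pick the corner of the image display unit closest
# to the detected finger position, so the upper-hierarchy option is shown
# near the finger without covering the center of the screen.

def nearest_corner(x: float, y: float, width: float, height: float) -> str:
    """Return the name of the display corner closest to the finger at (x, y)."""
    corners = {
        "top_left": (0.0, 0.0),
        "top_right": (width, 0.0),
        "bottom_left": (0.0, height),
        "bottom_right": (width, height),
    }
    # Squared Euclidean distance is sufficient for comparison.
    return min(corners, key=lambda c: (corners[c][0] - x) ** 2
                                      + (corners[c][1] - y) ** 2)
```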
- The control unit sets the display area to an area that includes the position on the touch panel where the operator's finger starts pressing and the position where it ends.
- The control unit sets the display area to a quadrangular area whose diagonal is the line segment connecting the positions where the operator's finger starts and ends on the touch panel. With this configuration, only the number of menu icons intended by the operator is displayed, which makes selection easy.
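For illustration only, computing the quadrangular display area from the press start and end positions can be sketched as follows; the icon size used to count how many menu icons fit is a hypothetical value:

```python
# Hypothetical sketch: the menu display area is the axis-aligned rectangle
# whose diagonal is the segment from the press start to the press end.

def drag_rectangle(x0, y0, x1, y1):
    """Return (x, y, width, height) of the rectangle with diagonal (x0,y0)-(x1,y1)."""
    left, right = min(x0, x1), max(x0, x1)
    top, bottom = min(y0, y1), max(y0, y1)
    return left, top, right - left, bottom - top

def icons_that_fit(width, height, icon_w=80, icon_h=80):
    """Number of menu icons that fit in the dragged area (simple grid layout)."""
    return max(0, width // icon_w) * max(0, height // icon_h)
```

Dragging a larger diagonal therefore yields a larger area and more menu icons, which matches the behavior described for the menu display area 58.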
- As described above, the present invention is an image display device using a capacitive touch panel in which the sensitivity with which the signal processing unit judges input to the touch panel is changed according to the detected capacitance, which varies with the distance between the finger and the touch panel. Since no signal processing unit for detecting the approach of the user's finger needs to be installed around the display unit, the image display unit can be made larger than in conventional devices. Furthermore, menu buttons are displayed when a finger is brought close to the touch panel, and the number of menu icons displayed can be changed by sliding a menu button.
- Block diagram of the in-vehicle device in an embodiment of the present invention.
- Flowchart for explaining the operation of the in-vehicle device.
- Schematic diagrams for explaining the operation of the in-vehicle device in the first embodiment of the present invention, in which (a), (c), (e), (g), (i), and (j) are front views of the image display unit and (b), (d), (f), and (h) are views seen from the side of the image display unit.
- (a)-(d): schematic diagrams of changing the menu display area when the finger moves in the Y direction in the in-vehicle device according to the embodiment of the present invention.
- (a)-(e): schematic diagrams when the direction of moving the finger is the X-Y direction in the in-vehicle device according to the embodiment of the present invention.
- (a)-(e): schematic diagrams concerning the area
- Hereinafter, an in-vehicle image display device will be described as the in-vehicle device; this in-vehicle image display device will be simply referred to as the "in-vehicle device".
- FIG. 1 shows a block diagram of an in-vehicle device according to an embodiment of the present invention.
- The in-vehicle device according to the present embodiment is a so-called car navigation device having a navigation function for performing route guidance and the like, and an audio reproduction function for reproducing audio and video recorded on a recording medium such as a DVD (Digital Versatile Disc). It will be described here as an example of the in-vehicle device.
- An in-vehicle device 10 includes a storage device 11, a DVD/CD drive 12, a GPS receiver 13, a vehicle speed sensor 14, a gyro 15, a speaker 16, an image display unit 17, a touch panel 18, a signal processing unit 19, and a control unit 20.
- The DVD/CD drive 12, the GPS receiver 13, the vehicle speed sensor 14, the gyro 15, and the speaker 16 need not be provided integrally in the in-vehicle device 10; each may instead be configured so that it can be electrically connected to and removed from the in-vehicle device 10.
- The storage device 11 stores a basic program necessary for controlling the in-vehicle device 10 when it is activated (or in an emergency), as well as various programs and databases, such as the application programs used to execute the navigation function and the audio playback function and the map databases for navigation.
- the storage device 11 is provided with an area where various programs and various data are developed and an area where an image is developed, as in a general storage device.
- the DVD / CD drive 12 is provided for reproducing a disc in which an audio source (or audio data) or a video source (or video data) is stored.
- the GPS receiver 13 is provided for detecting signals from GPS satellites.
- the vehicle speed sensor 14 is provided for receiving a vehicle speed signal from the vehicle and determining the moving speed of the vehicle.
- the gyro 15 is provided to detect the amount of rotation of the vehicle and the amount of change in the vertical direction.
- the speaker 16 is provided for outputting voice during operation of voice guidance, which is one of navigation functions, and voice and music reproduced by the DVD / CD drive 12.
- The image display unit 17 is provided to present various information, such as a road map around the vehicle position, to the occupant as a screen display based on map data and other information, and to display image data such as decoded video recorded on a DVD or the like.
- The touch panel 18 is a transparent panel provided on the surface of the image display unit 17. A capacitive touch panel is used, and the display screen of the image display unit 17 is viewed through the touch panel 18.
- When the touch panel 18 is operated, the signal processing unit 19 outputs to the control unit 20 a position signal corresponding to the pressed position and a signal corresponding to the distance between the touch panel 18 and the finger 53, calculated from the capacitance between the touch panel 18 and the finger 53.
- The control unit 20 includes a microprocessor and peripheral circuits, which are the electric circuits for operating the microprocessor, and executes various control processes by running a control program stored in the storage device 11. The control unit 20 also performs control processing so that the image data obtained as a result of its processing is displayed on the image display unit 17. The control unit 20 further acquires from the signal processing unit 19 the position information determined from the pressed position on the touch panel 18, collates this information with the touch area information of the touch panel 18 stored in the storage device 11, and executes the function defined for the button, menu, switch, or the like linked to that area in advance.
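For illustration only, the collation of a pressed position with stored touch-area information can be sketched as follows; the area names, coordinates, and function name are hypothetical examples, not values from the patent:

```python
# Hypothetical sketch of hit-testing: the control unit looks up which
# stored touch area (button, menu, switch, ...) contains the pressed
# position, then executes the function linked to that area.

def find_touched_item(x, y, touch_areas):
    """Return the name of the touch area containing (x, y), or None."""
    for name, (ax, ay, w, h) in touch_areas.items():
        if ax <= x < ax + w and ay <= y < ay + h:
            return name
    return None

# Example touch-area table (x, y, width, height), purely illustrative.
touch_areas = {
    "menu_button_54": (0, 0, 100, 60),
    "menu_button_55": (700, 0, 100, 60),
}
```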
- The in-vehicle device 10 has a navigation mode, executed using the GPS receiver 13 and map data, in which a destination setting function, a route editing function, a point editing function, a navigation setting function, and the like are executed; a disc mode executed using the DVD/CD drive 12 and a DVD disc; and a radio mode executed using a radio tuner (not shown). Each mode is selected according to an operation by the operator (or user).
- A hierarchical operation refers to an operation in which, when a certain option such as a menu button is selected, a plurality of items such as menu icons are displayed so that an option (menu) in the next hierarchy can be selected.
- An upper-hierarchy option is an option that comes earlier in the operation order of a hierarchical operation, and a lower-hierarchy option is one that comes later.
- In this description, the menu button is used as the upper layer and the menu icon as the lower layer, but further alternatives may be set as a layer below the menu icons.
- What is displayed on the display screen in order to select each of the modes described above is called a menu button, and the items displayed on the display screen so that the operator can select the individual functions assigned to each mode, for example the destination setting function or the route editing function in the navigation mode, are called menu icons.
- The menu buttons need not necessarily be allocated to selecting modes and the menu icons to selecting functions; for example, menu buttons and menu icons may be allocated only within the navigation mode.
- In that case, the menu buttons may be allocated to "route guidance", "map display method", "basic settings", and so on, with the menu icons assigned to "destination setting", "display destination area information", and so on; or a menu button may be assigned to "map display" with menu icons assigned to "planar display", "overhead view display", and so on. Functions may be assigned to the menu buttons and menu icons as appropriate for the situation in which the in-vehicle device is used.
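For illustration only, the two-layer relationship between menu buttons and menu icons described above can be sketched as a small mapping; the entries reuse the example assignments from the text, and the structure itself is hypothetical:

```python
# Hypothetical sketch of the hierarchy: each upper-layer menu button maps
# to the lower-layer menu icons displayed after it is selected.

menu_hierarchy = {
    "route guidance": ["destination setting",
                       "display destination area information"],
    "map display": ["planar display", "overhead view display"],
}

def lower_layer_options(menu_button: str):
    """Menu icons (lower layer) shown for a given menu button (upper layer)."""
    return menu_hierarchy.get(menu_button, [])
```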
- FIG. 2 is a flowchart showing an input process performed by the in-vehicle device 10 when the operator performs an input operation by pressing the touch panel 18 in the in-vehicle device 10 according to the embodiment of the present invention.
- The input operation is an expression that includes the operator selecting one of a plurality of displayed options such as menu buttons and menu icons, the operator's finger approaching the touch panel 18, and the operator's finger pressing the touch panel 18.
- FIG. 3A is a view of the image display unit 17 and the touch panel 18 as viewed from the front
- FIG. 3B is a view of the state shown in FIG. 3A as seen from the side.
- the image display unit 17 displays a road 51 and a vehicle mark 52 around the vehicle position.
- The in-vehicle device 10 is provided with the touch panel 18 and the image display unit 17, in this order from the side that the operator's finger 53 contacts.
- The initial state is one of waiting for input to the touch panel 18, with the sensitivity of the touch panel 18 raised (the START state in FIG. 2).
- The state in which the sensitivity of the touch panel 18 is raised is a sensitivity at which it is possible to determine whether the operator has performed an input operation on the touch panel even when the operator's finger 53 is not in contact with the touch panel 18.
- FIG. 3C is a diagram showing the operator bringing the finger 53 close to the touch panel 18, and FIG. 3D is a side view of the state shown in FIG. 3C.
- The detection sensitivity of the touch panel 18 is set so that the signal processing unit 19 detects the approach of the finger 53. The distance L59 at which the approach is detected is, for example, about 1 centimeter.
- This value of about 1 centimeter is a distance set so that, under general usage conditions, the in-vehicle device 10 does not perform an unnecessary function execution operation even if a vehicle occupant performs some action near the in-vehicle device 10 without intending to operate it; the value is set appropriately in consideration of the installation location of the in-vehicle device 10 and the interior layout of the vehicle.
- In step S10, the operator's finger 53 approaches the touch panel 18, and the capacitance, which changes according to the distance between the operator's finger 53 and the touch panel 18, reaches the level at which the approach is detected.
- The signal processing unit 19 detects that the operator's finger 53 has approached and calculates the position information of the operator's finger 53 (step S11).
- This position information is input to the control unit 20. Based on it, the control unit 20 sends a control signal to the signal processing unit 19 to lower the sensitivity of the touch panel 18 (step S12), and then performs control processing so that menu buttons 54 to 57 are displayed on the image display unit 17 (step S13).
- FIG. 3E shows the state in which the menu buttons 54 to 57 are displayed on the image display unit 17 when the distance L59 between the finger 53 and the touch panel becomes smaller than the distance (for example, 1 centimeter) at which the touch panel 18 detects the approach of the finger 53. Here, as an example, one menu button 54 to 57 is displayed at each of the four corners of the image display unit 17.
- Here, the relationship between the distance L59 between the touch panel 18 and the operator's finger 53 and the sensitivity of the touch panel 18 will be described.
- If the sensitivity is too high, an operation that is not directed at the in-vehicle device 10, such as a steering or air-conditioner operation, or the influence of vehicle shake on an occupant, may cause the in-vehicle device 10 to malfunction.
- By choosing the distance L59 at which the approach of the finger 53 is detected to be 1 to 2 centimeters or less and setting the sensitivity of the touch panel 18 based on that value, malfunctions of the touch panel caused by behavior other than the operator's touch panel operation, for example vehicle shaking, can be reduced, and the touch panel 18 remains easy for the operator to operate.
- In step S14, the control unit 20 determines whether any of the menu buttons 54 to 57 displayed on the image display unit 17 has been touched.
- FIG. 3F is a diagram illustrating a state where the state illustrated in FIG. 3E is viewed from the side.
- The menu buttons 54 to 57 are displayed on the image display unit 17 in response to the signal sent from the signal processing unit 19 to the control unit 20.
- FIG. 3G shows a state in which the menu button 54 is pressed by the operator.
- FIG. 3H is a diagram illustrating a state where the state illustrated in FIG. 3G is viewed from the side, and illustrates a state where the finger 53 is in contact with the touch panel 18.
- FIG. 3 (i) shows a state where the operator moves the finger 53 while pressing the touch panel 18, and shows a state where the menu display area 58 is displayed on the image display unit 17.
- FIG. 3J is a diagram showing a state where the menu icon 101 is displayed in the menu display area 58. If any of the menu buttons 54 to 57 is pressed in step S14, the process proceeds to step S15.
- When the operator moves the finger 53 on the touch panel 18 while pressing one of the menu buttons 54 to 57 in step S14, the menu display area 58 is displayed with a size according to the distance moved, and the menu icons 101 are displayed in the menu display area 58 (step S15).
- The operator selects one of the menu icons 101 displayed in the menu display area 58 on the image display unit 17 and touches it with a finger, whereupon the input operation is accepted and the corresponding function is executed by the control unit 20; that is, the process assigned to the selected menu icon 101 is performed (step S16).
- For example, if the selected menu icon is displayed as "guidance start", the control unit 20 controls the device so that the route guidance operation of the navigation function is started, and if the name of a music track recorded on the DVD being played is displayed on the selected menu icon, the control unit 20 controls the device so that playback of that track is started.
- In step S17, when the control unit 20 determines that the content displayed on the image display unit 17 is a predetermined display, the sensitivity of the touch panel 18 is raised again (step S18), and the process ends.
- After the process of step S16 is performed, when it is determined in step S17 that the content displayed on the image display unit 17 is not the predetermined display and no operation is performed by the operator within a preset time, processing may be performed so that the predetermined display content, such as the current location of the host vehicle displayed on the image display unit 17 before the operation, is displayed, after which the process proceeds to step S18.
- With this configuration, by changing the sensitivity with which the signal processing unit that detects inputs to the touch panel judges an input, the menu buttons are displayed when the finger is brought close to the touch panel, the number of menu icons displayed can be changed by sliding a menu button, and the display screen remains easy to view.
- In other words, the sensitivity of the touch panel 18 for detecting operations by the operator is switched between the case where the finger 53 approaches the touch panel 18 and the case where the finger 53 touches the touch panel 18.
- Since the menu buttons are displayed only when the operator's finger 53 approaches the touch panel 18, the entire image area of the image display unit 17 can be viewed while the touch panel 18 is not being operated, and a wide area outside the menu display area, that is, the area for displaying images other than menu icons, can be secured. An in-vehicle device with these properties can thus be realized.
- Furthermore, since a capacitive touch panel is used as the touch panel 18, no space is required around the display unit of the image display device for installing a signal processing unit that detects the approach of the user's finger.
- FIGS. 4A and 4B are diagrams showing examples of the display screen when an area other than the menu buttons 54 to 57 is pressed in step S14.
- For example, as shown in FIG. 4A, when the portion of the touch panel 18 corresponding to the intersection 80 where the displayed roads 51 cross is pressed, the control unit 20 controls the image display unit 17 so that the crossing point 80 is placed at the center of the display screen, and causes the image display unit 17 to display the screen shown in FIG. 4B.
- With this configuration, the portion that the operator (or the user) wants to see even during the operation (for example, the host vehicle position and surrounding information in the navigation mode) is not hidden by the menu display area.
- If the operator presses one of the menu buttons 54 to 57 in step S14 but does not move the finger 53, the function assigned to the pressed position can be executed. For example, a menu for CD operation is displayed when the menu button 54 is pressed, a radio broadcast menu is displayed when the menu button 55 is pressed, and a DVD video menu is displayed when the menu button 56 is pressed. In this way, when the finger 53 is not moved after pressing any of the menu buttons 54 to 57, the operator can call up the intended menu.
- The functions assigned to the menu buttons are not limited to those above; other functions (for example, a navigation function, a telephone function, or a TV function) may be assigned.
- Alternatively, the same function may be assigned to a plurality of the menu buttons 54 to 57. This allows the menu display area to be set starting from the position preferred by the operator, and the same operability can be ensured even in vehicles whose steering wheel is on the other side.
- Further, after the menu buttons 54 to 57 are displayed in step S13, if there is no input operation for a certain period, for example the operator's finger 53 not touching the touch panel 18, the control unit 20 may stop displaying the menu buttons 54 to 57 and control the touch panel so that its sensitivity is raised.
- FIG. 5A shows the state when the operator, starting from the state shown in FIG. 3E, moves the finger 53 to the left while pressing the menu button 54 and ends the pressing of the touch panel 18 within area A; the menu display area 58 is displayed only in area A. In this display example, three menu icons 101 are displayed in the menu display area 58.
- FIG. 5B shows the state when the operator, starting from the state shown in FIG. 3E, moves the finger 53 to the left while pressing the menu button 54 and ends the pressing of the touch panel 18 within area B; areas A and B are displayed as the menu display area 58. In this case, a total of six menu icons 101 are displayed in the menu display area 58.
- FIG. 5C shows the state when the operator, starting from the state shown in FIG. 3E, moves the finger 53 to the left while pressing the menu button 54 and ends the pressing of the touch panel 18 within area D; areas A to D are displayed as the menu display area 58. In this case, a total of 12 menu icons 101 are displayed in the menu display area 58.
- Likewise, if the pressing ends within area C, the display range of the menu display area 58 is from area A to area C, and a total of nine menu icons 101 are displayed in the menu display area 58.
- In this way, an area of the size desired by the operator is determined as the menu display area, depending on the area where the operator starts pressing the touch panel and the area where the pressing ends, or based on the distance moved while pressing, so that the operator can display a menu display area of any size on the display screen.
- Note that the number of areas used to classify the point at which the finger 53 stops moving and the pressing of the touch panel 18 ends is not limited to four. That is, the display screen need not be divided into only the four areas A to D; the display area may be divided into an arbitrary number of areas as appropriate, and the number of icons displayed in each area may also be changed as appropriate.
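As an illustration of the column-based sizing just described, the number of icons to show can be derived from the column in which the drag ends. The sketch below assumes a screen divided into four equal columns (areas A to D) with three icons each; all names and dimensions are illustrative assumptions, not part of the patent:

```python
# Sketch of the column-based menu sizing described above (hypothetical
# helper names; the patent specifies behavior, not code).

SCREEN_WIDTH = 800          # assumed display width in pixels
NUM_COLUMNS = 4             # areas A..D
ICONS_PER_COLUMN = 3

def column_of(x: float) -> int:
    """Return the 0-based column (0 = area A) containing x."""
    col = int(x // (SCREEN_WIDTH / NUM_COLUMNS))
    return min(max(col, 0), NUM_COLUMNS - 1)

def icons_to_display(press_end_x: float) -> int:
    """The menu display area spans area A up to the column where the
    drag ended, with a fixed number of icons per column."""
    return (column_of(press_end_x) + 1) * ICONS_PER_COLUMN
```

Under these assumptions, ending the drag in area B yields six icons and ending it in area D yields twelve, matching FIGS. 5B and 5C.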
- In the above example, the size of the menu display area changes with the distance the operator's finger moves in the X direction. The direction in which the operator moves the finger 53 may instead be the Y direction, where X is the horizontal direction of the image display unit 17 and Y is the vertical direction.
- FIG. 6 shows an embodiment where the operator moves the finger 53 in the Y direction.
- FIG. 6A shows an example of dividing the screen in the Y direction, in which it is divided into three areas A to C, each spanning the screen in the horizontal direction.
- FIG. 6C shows a state when the finger 53 that starts moving from the area A while pressing the menu button 54 in the state shown in FIG. 3E finishes pressing the touch panel 18 in the area B.
- a total of eight menu icons 101 are displayed in the menu display area 58.
- FIG. 6D shows a state when the finger 53 that starts moving from the area A while pressing the menu button 54 in the state shown in FIG. 3E finishes pressing the touch panel 18 in the area C.
- a state in which the menu display area 58 is displayed in the areas A to C is shown.
- a total of 12 menu icons 101 are displayed in the menu display area 58.
- However, the number of divisions of the menu display area 58 and the number of menu icons 101 are not limited to the numbers shown in the above example.
- In this way, the size of the menu display area can be changed according to the distance the operator's finger moves in the Y direction, where the horizontal direction of the image display unit is the X direction and the vertical direction is the Y direction.
- The direction in which the operator moves the finger 53 may also be both the X and Y directions, where X is the horizontal direction of the image display unit 17 and Y is the vertical direction.
- FIG. 7A shows an example of how to divide the region in the XY direction.
- the screen of the image display unit 17 is divided into 3 rows and 4 columns.
- FIG. 7B shows the state when the operator's finger 53, which starts moving from area A while pressing the menu button 54 in the state shown in FIG. 3E, finishes pressing the touch panel 18 in area A.
- the menu display area 58 is displayed only in the area A, and one menu icon 101 is displayed.
- FIG. 7C shows the state when the operator's finger 53, which starts moving from area A while pressing the menu button 54 in the state shown in FIG. 3E, finishes pressing the touch panel 18 in area B.
- the menu display area 58 is displayed only in the areas A and B, and two menu icons 101 are displayed.
- FIG. 7D shows a state when the finger 53 that starts moving from the area A while pressing the menu button 54 in the state shown in FIG. 3E finishes pressing the touch panel 18 in the area E. It is shown.
- the menu display area 58 is displayed only in the areas A, B, D, and E, and four menu icons 101 are displayed.
- FIG. 7E shows the state when the finger 53, which starts moving from area A while pressing the menu button 54 in the state shown in FIG. 3E, finishes pressing the touch panel 18 in area L.
- the menu display area 58 is displayed in all the areas A to L, and twelve menu icons 101 are displayed.
- the display range of the menu display area 58 can be similarly changed for an arbitrary area.
- In this way, the size of the menu display area can be changed according to the distance the operator's finger moves in both the X and Y directions, where the horizontal direction of the image display unit is the X direction and the vertical direction is the Y direction.
- the operator can display the menu display area 58 while avoiding the area that is not desired to be hidden by the menu display area 58, and can operate the menu icon 101 while displaying the current location.
- That is, even if the menu is displayed by an operation of the operator while content that the user wants to view is shown in an area other than the menu display area, the menu can be displayed with that content still visible.
- Note that the number of divisions of the menu display area 58 and the number of menu icons 101 displayed are not limited to these numbers.
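The two-dimensional case can be sketched the same way. The labeling below, with areas A to L running down each column of a 3-row by 4-column grid, is inferred from FIGS. 7B to 7E (a drag from area A to area E displays areas A, B, D, and E); the helper names are illustrative assumptions:

```python
# Sketch of the two-dimensional (XY) menu display area described above.
# Grid labeling (A..L down each column) is inferred from the figures.

ROWS, COLS = 3, 4  # 12 areas labeled A..L, column by column

def cell(label: str) -> tuple[int, int]:
    """Map an area letter to its (row, column) in the grid."""
    i = ord(label) - ord("A")
    return (i % ROWS, i // ROWS)

def display_areas(start: str, end: str) -> set[str]:
    """Areas forming the rectangle whose opposite corners are the
    cells where pressing started and ended (cf. claim 7's quadrangle
    with the press path as its diagonal)."""
    (r0, c0), (r1, c1) = cell(start), cell(end)
    return {
        chr(ord("A") + c * ROWS + r)
        for r in range(min(r0, r1), max(r0, r1) + 1)
        for c in range(min(c0, c1), max(c0, c1) + 1)
    }
```

With this labeling, a drag from A to E selects {A, B, D, E} as in FIG. 7D, and a drag from A to L selects all twelve areas as in FIG. 7E.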
- FIG. 8 is a diagram for explaining the case where the area for detecting the approach of the operator's finger 53 to the touch panel 18 is divided.
- FIG. 8A is a diagram for explaining an example in which an area where it is detected that the operator's finger 53 has approached the touch panel 18 is divided into four recognition areas 60 to 63.
- FIG. 8B is a diagram showing a state in which the menu button 54 is displayed on the display screen when the finger 53 approaches the recognition area 60 and the approach of the finger 53 to the touch panel 18 is detected.
- FIG. 8C is a schematic diagram when the state shown in FIG. 8B is viewed from the side.
- FIG. 8D is a diagram showing a state in which the menu button 55 is displayed on the display screen when it is detected that the finger 53 approaches the recognition area 61 and the finger 53 approaches the touch panel 18.
- FIG. 8E is a schematic diagram showing a case where the state shown in FIG. 8D is viewed from the side.
- In this way, when the recognition area on the display screen is divided into four and the operator brings the finger 53 close to the touch panel 18 so that the distance between the finger 53 and the touch panel 18 becomes the distance L59 or less, the signal processing unit 19 detects the position of the operator's finger 53 and outputs the position information to the control unit 20, and the control unit 20 controls the in-vehicle device 10 so that the menu button (here, the menu button 54) is displayed only in the recognition area containing that position (here, the recognition area 60), that is, so that the content shown in FIG. 8B is displayed.
- Similarly, if the operator's finger 53 is over the recognition area 61 when the distance between the finger 53 and the touch panel 18 becomes the distance L59 or less, the signal processing unit 19 detects the position of the operator's finger 53 at that time and outputs the position information to the control unit 20, and the control unit 20 controls the in-vehicle device 10 so that the menu button (here, the menu button 55) is displayed only in the recognition area containing that position (here, the recognition area 61), that is, so that the content shown in FIG. 8D is displayed.
- With this configuration, only the desired menu button among the menu buttons displayed on the image display unit 17 can be shown at a corner of the display screen, so the area of the display region of the image display unit 17 hidden by menu buttons can be kept small and loss of visibility can be reduced.
- Alternatively, when the finger 53 approaches at least one of the recognition area 60 and the recognition area 61, the menu button 54 and the menu button 55 may be displayed simultaneously at the corners of the display screen of the image display unit 17.
- FIG. 9 is a schematic diagram illustrating that when the finger 53 approaches the corner of the display screen of the image display unit 17, a menu button is displayed at a corner on the display screen near the finger 53.
- FIG. 9A is a schematic diagram showing an example in which the display screen of the image display unit 17 is divided into the recognition areas 65 to 68 and the remaining area.
- FIG. 9B is a diagram showing a state in which the menu button 54 is displayed when it is detected that the finger 53 approaches the recognition area 65 and the finger 53 approaches the touch panel 18.
- FIG. 9C is a schematic diagram illustrating the state illustrated in FIG. 9B viewed from the side.
- FIG. 9D is a diagram illustrating a state in which the menu button 55 is displayed when it is detected that the finger 53 approaches the recognition area 66 and the finger 53 approaches the touch panel 18.
- FIG. 9E is a schematic diagram of the state shown in FIG. 9D viewed from the side.
- FIG. 9F shows the state when the finger 53 approaches an area other than the recognition areas 65 to 68; since the operator's finger 53 is not in any of the recognition areas 65 to 68, the menu buttons 54 to 57 are not displayed.
- In this example, the entire touch panel 18 is not divided as in FIG. 8A; instead, as shown in FIG. 9A, recognition areas are provided only partially, near the four corners of the display screen. When the operator brings the finger 53 close to the touch panel 18 and the distance between the finger 53 and the touch panel 18 becomes the distance L59 or less, if the operator's finger 53 is over the recognition area 65, the signal processing unit 19 detects the position of the operator's finger 53 at that time and outputs the position information to the control unit 20, and the control unit 20 controls the in-vehicle device 10 so that the menu button (here, the menu button 54) is displayed only in the recognition area containing that position (here, the recognition area 65), that is, so that the content shown in FIG. 9B is displayed. Similarly, if the operator's finger 53 is over the recognition area 66 when the distance between the finger 53 and the touch panel 18 becomes the distance L59 or less, the signal processing unit 19 detects the position of the operator's finger 53 at that time and outputs the position information to the control unit 20, and the control unit 20 controls the in-vehicle device 10 so that the menu button (here, the menu button 55) is displayed only in the recognition area containing that position (here, the recognition area 66), that is, so that the content shown in FIG. 9D is displayed.
- Then, when the distance between the finger 53 and the touch panel 18 becomes the distance L59 or less, if the operator's finger 53 is not over any of the recognition areas 65 to 68, the signal processing unit 19 detects the position of the operator's finger 53 at that time and outputs the position information to the control unit 20, and the control unit 20 controls the in-vehicle device 10 so that none of the menu buttons 54 to 57 is displayed (here, so that the state shown in FIG. 9F results).
- the shape and size of the recognition area can be set arbitrarily.
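A minimal sketch of the corner recognition areas of FIG. 9 follows; the screen dimensions, the corner-area sizes, the approach distance, and the mapping of recognition areas 65 to 68 onto menu buttons 54 to 57 are all assumptions made for illustration:

```python
# Sketch of the corner recognition areas of FIG. 9 (dimensions and the
# area-to-button mapping are illustrative assumptions). A menu button
# is shown only when the finger is both within the approach distance
# L59 and over one of the four corner areas.

SCREEN_W, SCREEN_H = 800, 480
CORNER_W, CORNER_H = 120, 100   # assumed size of each recognition area
L59 = 10.0                      # approach threshold, about 1 cm (mm)

CORNERS = {                     # recognition areas 65..68 (top-left origin)
    65: (0, 0),
    66: (SCREEN_W - CORNER_W, 0),
    67: (0, SCREEN_H - CORNER_H),
    68: (SCREEN_W - CORNER_W, SCREEN_H - CORNER_H),
}
BUTTON_FOR_AREA = {65: 54, 66: 55, 67: 56, 68: 57}

def button_to_show(x, y, distance):
    """Return the menu button to display, or None (FIG. 9F) when the
    finger is too far away or not over any corner recognition area."""
    if distance > L59:
        return None
    for area, (ax, ay) in CORNERS.items():
        if ax <= x < ax + CORNER_W and ay <= y < ay + CORNER_H:
            return BUTTON_FOR_AREA[area]
    return None
```

A finger hovering over the center of the screen returns `None`, which corresponds to the state of FIG. 9F in which no menu button is displayed.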
- FIG. 10 is a schematic diagram for explaining the sensitivity.
- the horizontal axis represents the distance between the touch panel 18 and the finger 53
- the vertical axis represents the capacitance from the touch panel 18 to ground via the finger 53.
- In a capacitive touch panel, when a capacitive load such as the finger 53 comes into contact with the panel, a current flows through the touch panel 18; this current is converted into a voltage, which the signal processing unit 19 detects, and the signal processing unit 19 performs arithmetic processing on the detection result to identify the contact position.
- The detected voltage varies with the capacitance of the capacitive load, that is, with the change in the capacitance between the touch panel 18 and the finger 53 shown on the vertical axis of FIG. 10.
- The capacitance between the touch panel 18 and the finger 53 increases as the distance between them decreases, and decreases as the distance increases. Therefore, when the finger 53 is brought close to the touch panel 18 from a state in which the distance between them is sufficiently large, it is judged that the finger 53 has touched the touch panel 18 when the capacitance between the touch panel 18 and the finger becomes larger than a certain capacitance (for example, C in the case of FIG. 10), and the touched position is calculated.
- Changing the sensitivity means changing the threshold against which the capacitance value is compared for the control unit 20 to judge that the finger 53 has touched. In terms of the curve shown in FIG. 10, the sensitivity can be changed by changing the threshold, such as A or B on the vertical axis, used to judge the magnitude of the detected capacitance.
- Raising the sensitivity means setting the threshold so that even an input with a small capacitance is recognized; when the sensitivity is raised, the signal processing unit 19 notifies the control unit 20 of the position information when the operator's finger 53 approaches the touch panel 18, even while the finger is away from the panel.
- Lowering the sensitivity means setting a threshold that requires an input operation to the touch panel 18 with a large capacitance (for example, pressing or contact by the operator's finger 53, or an extreme approach to the touch panel); when the sensitivity is lowered, the signal processing unit 19 notifies the control unit 20 of the position information when the operator's finger 53 touches the touch panel 18.
- the capacitance value A shown on the vertical axis in FIG. 10 indicates the capacitance value corresponding to the threshold when the sensitivity is extremely lowered
- the capacitance value B indicates a capacitance value corresponding to a first threshold value that is a threshold value when sensitivity is increased
- The capacitance value C indicates an arbitrary value between A and B. In the embodiment described above, it corresponds to the sensitivity used after the distance between the finger 53 and the touch panel 18 has become less than the distance L59, and is the value corresponding to the second threshold, that is, the threshold when the sensitivity is lowered.
- When the finger 53 approaches the touch panel 18 and the capacitance between the touch panel and the finger 53 exceeds the capacitance value B shown in FIG. 10, the threshold is changed to the lower threshold corresponding to the capacitance value C.
- If the capacitance at the time of normal contact with the touch panel is A, then setting a threshold corresponding to a capacitance value C lower than A is sufficient to detect contact of the finger 53 with the touch panel 18. This change from the threshold corresponding to the capacitance value B on the vertical axis to the threshold corresponding to the capacitance value C is what changes the sensitivity.
- In other words, raising the sensitivity means setting the threshold so that the control unit 20 judges that there has been an input operation to the touch panel 18 even if the operator's finger 53 is not in contact with the touch panel 18.
- The thresholds corresponding to the capacitance values B and C can be set arbitrarily.
- A state in which the sensitivity is lowered is synonymous with the sensitivity being low (low sensitivity), and a state in which the sensitivity is raised is synonymous with the sensitivity being high (high sensitivity).
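The two-threshold switching of FIG. 10 can be sketched as a small state machine. The numeric threshold values below are illustrative; the source fixes only their ordering (the normal-touch capacitance A is larger than C, which is larger than the approach threshold B):

```python
# Sketch of the sensitivity switching of FIG. 10. Capacitance grows as
# the finger nears the panel: B is the high-sensitivity (approach)
# threshold, C the low-sensitivity (touch) threshold, with A > C > B.
# Values are illustrative, not from the patent.

CAP_B = 2.0   # first threshold: finger approaching (high sensitivity)
CAP_C = 6.0   # second threshold: finger touching (low sensitivity)

class TouchPanel:
    def __init__(self):
        self.threshold = CAP_B          # start in the high-sensitivity state
        self.menu_buttons_shown = False

    def process(self, capacitance: float) -> str:
        """Classify one capacitance sample, switching the threshold as
        in steps S11-S12 (lower sensitivity on approach)."""
        if capacitance < self.threshold:
            return "idle"
        if self.threshold == CAP_B:
            # approach detected: lower the sensitivity, show menu buttons
            self.threshold = CAP_C
            self.menu_buttons_shown = True
            return "approach"
        return "touch"                  # low-sensitivity state: a real press

    def reset(self):
        """Step S18: operation finished, raise the sensitivity again."""
        self.threshold = CAP_B
        self.menu_buttons_shown = False
```

After an approach is detected, a capacitance that would have triggered the high-sensitivity threshold is classified as `idle`, so only an actual press registers as `touch`; this mirrors the malfunction-avoidance rationale given for lowering the sensitivity.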
- Methods of changing the sensitivity include changing the gain of the detected waveform in the signal processing unit 19, which enables sensitivity adjustment in hardware in addition to sensitivity adjustment in software.
- The present invention has been described above taking as an example a navigation apparatus equipped with the storage device 11.
- As described above, in the image display device according to the present invention, the sensitivity with which the signal processing unit that detects inputs to the touch panel judges an input is changed according to the distance between the finger 53 and the touch panel; the menu buttons are displayed when the finger 53 is brought close to the touch panel, and the number of menu icons displayed can be changed by adjusting the menu icon display area by sliding a menu button. The device is therefore easy to operate, its display screen is easy to view, and it is useful as an image display device applicable to in-vehicle devices such as navigation devices.
- 10 In-vehicle device 11 Storage device 12 DVD/CD drive 13 GPS 14 Vehicle speed sensor 15 Gyro 16 Speaker 17 Image display unit 18 Touch panel 19 Signal processing unit 20 Control unit 51 Road 52 Own-vehicle mark 53 Finger 54 Menu button 55 Menu button 56 Menu button 57 Menu button 58 Menu display area 59 Distance L between finger and touch panel 80 Intersection of roads 101 Menu icon
Abstract
Description
With this configuration, by changing the sensitivity for judging inputs to the touch panel, buttons are displayed when the finger is brought close to the touch panel, and the number of menus displayed can further be changed by sliding a button, making the device easy to operate and the display screen easy to view.
With this configuration, when the operator wants to display the menu buttons, the upper-layer choices can be displayed simply by bringing the finger close to the touch panel, without pressing it.
With this configuration, only the desired menu buttons among the displayed upper-layer choices can be shown, the area of the image display unit hidden by the upper-layer choices can be kept small, and loss of visibility can be reduced.
With this configuration, hiding the information at the center of the image that was displayed before the menu buttons were shown can be avoided.
With this configuration, unintended operations caused by the operator's hand approaching a location other than a corner can be reduced.
With this configuration, the menu display area and the other output images can be shown on a single screen.
With this configuration, only the number of menu icons intended by the operator is displayed, which makes selection easy.
The in-vehicle device in the present embodiment is a so-called car navigation device provided with a navigation function that performs route guidance and the like and an audio playback function that plays back audio and video recorded on a recording medium such as a DVD (Digital Versatile Disc); in the following description it is used as an example of an in-vehicle device.
However, the DVD/CD drive 12, the GPS receiver 13, the vehicle speed sensor 14, the gyro 15, and the speaker 16 need not each be built into the in-vehicle device 10; each of them may be electrically connected to, and detachable from, the in-vehicle device 10.
Like a general storage device, the storage device 11 is provided with an area in which various programs and various data are loaded and an area in which images are rendered.
The control unit 20 further obtains from the signal processing unit 19 the position information derived from the pressed position on the touch panel 18 and checks this information against the touch area information of the touch panel 18 stored in the storage device 11. It then executes the function defined for the button, menu, switch, or the like associated in advance with the touch area corresponding to the pressed position.
In the following description, each function of the in-vehicle device is executed by the operator selecting choices arranged in a hierarchy.
In the present embodiment this is called a hierarchical operation. A hierarchical operation is an operation in which selecting a certain choice, such as a menu button, causes a plurality of items, such as menu icons, to be displayed, and a hierarchically organized choice (menu) is then selected.
A choice of an upper layer is a choice that comes earlier in the order of a hierarchical operation, and a choice of a lower layer is a choice that comes later in that order.
More specifically, the items displayed on the display screen for selecting each of the modes described above are called menu buttons, and the items displayed on the display screen for the operator to select the individual functions allocated to each mode, such as the destination setting function and the route editing function in the navigation mode, are called menu icons.
That is, menu buttons and menu icons may be allocated to the navigation mode alone. In that case, for example, menu buttons may be allocated to "route guidance", "map display method", "basic settings", and so on; if "route guidance" is allocated to a menu button, menu icons are allocated to "destination setting", "display of information around the destination", and the like, and if "map display" is allocated to a menu button, menu icons are allocated to "planar view", "bird's-eye view", and the like. The in-vehicle device may be configured in this way, and the functions allocated to the menu buttons and menu icons may be set as appropriate for situations in which operations are performed hierarchically.
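The allocation of upper-layer menu buttons to lower-layer menu icons described in this paragraph can be represented as a simple nested mapping; the sketch below uses the navigation-mode examples from the text, and the structure itself is an illustrative assumption:

```python
# Illustrative data structure for the hierarchical choices described
# above: menu buttons (upper layer) map to menu icons (lower layer).
# Entries mirror the navigation-mode examples given in the text.

MENU_HIERARCHY = {
    "route guidance": ["destination setting", "destination area info"],
    "map display":    ["planar view", "bird's-eye view"],
    "basic settings": [],   # icons would be allocated as appropriate
}

def icons_for(button: str) -> list[str]:
    """Lower-layer choices shown after the upper-layer button is selected."""
    return MENU_HIERARCHY.get(button, [])
```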
Here, an input operation is an expression that includes selecting one of a plurality of displayed choices such as menu buttons and menu icons, the operator's finger or the like approaching the touch panel 18, and the operator's finger or the like pressing the touch panel 18.
As shown in FIG. 3A, the image display unit 17 displays the roads 51 around the host vehicle position, the own-vehicle mark 52, and so on.
As shown in FIG. 3B, in the in-vehicle device 10 the touch panel 18 and the image display unit 17 are arranged in this order from the side that the operator's finger 53 touches.
Here, the state in which the sensitivity of the touch panel 18 has been raised is a sensitivity at which the presence or absence of an input operation by the operator can be judged even if the operator's finger 53 is not touching the touch panel 18.
FIG. 3C is a diagram showing a state in which the operator brings the finger 53 close to the touch panel 18, and FIG. 3D is a diagram showing the state of FIG. 3C viewed from the side.
Here, the distance L59 is set to about 1 centimeter, for example; "about 1 centimeter" is a distance set on the assumption that a vehicle occupant performs some movement without intending to operate the in-vehicle device 10.
That is, "about 1 centimeter" is a distance set, with general usage conditions taken into account, so that the in-vehicle device 10 will not perform unnecessary function executions even if a vehicle occupant performs some movement near the in-vehicle device 10 without intending to operate it; this value is set as appropriate in consideration of the installation location of the in-vehicle device 10, the interior layout of the vehicle, and so on.
When the approach of the operator's finger 53 to the touch panel 18 is detected in step S11 and the position information of the operator's finger 53 is calculated, this position information is input to the control unit 20; based on this position information, the control unit 20 sends a control signal to the signal processing unit 19 and then lowers the sensitivity of the touch panel 18 (step S12).
Next, in response to the position information of the operator's finger 53 being calculated and a signal being sent from the signal processing unit 19 to the control unit 20, the control unit 20 performs control processing so as to display the menu buttons 54 to 57 on the image display unit 17 (step S13).
If the distance L59 between the touch panel 18 and the operator's finger 53 is set large (that is, if the sensitivity is raised too far), the in-vehicle device 10 may malfunction because of operations that are not operations on the in-vehicle device 10, such as steering or air-conditioner operations, or because the occupants are affected by vehicle vibration and the like.
For this reason, the distance L59 is preferably chosen so that the approach of the finger 53 is detected at about 1 to 2 centimeters or less; if the sensitivity of the touch panel 18 is set based on this value of the distance L59, malfunctions of the touch panel caused by occupant behavior other than touch panel operation, for example vehicle vibration, can be reduced, and the touch panel 18 becomes easier for the operator to use.
FIG. 3G shows a state in which the menu button 54 has been pressed by the operator.
FIG. 3H is a diagram showing the state of FIG. 3G viewed from the side, with the finger 53 in contact with the touch panel 18.
FIG. 3I is a diagram showing a state in which the operator is moving the finger 53 while pressing the touch panel 18, with the menu display area 58 displayed on the image display unit 17.
FIG. 3J is a diagram showing a state in which the menu icons 101 are displayed in the menu display area 58. If any of the menu buttons 54 to 57 is pressed in step S14, the process proceeds to step S15.
The operator selects one of the menu icons 101 displayed in the menu display area 58 on the image display unit 17 and touches the selected menu icon 101 with the finger, whereupon the control unit 20 executes the function corresponding to the input operation. That is, the process assigned to the selected menu icon 101 is performed (step S16).
Here, performing the process assigned to the selected menu icon 101 means, for example, that if the selected menu icon reads "guidance start", the control unit 20 controls the device so as to start the route guidance operation of the navigation function, and if the selected menu icon shows the name of a music track recorded on the DVD being played, the control unit 20 controls the device so as to start playback of that track.
In step S17, when the control unit 20 determines that the content displayed on the image display unit 17 is the predetermined display, the sensitivity of the touch panel 18 is raised (step S18) and the process ends.
Here, after the process of step S16 is performed, if it is determined in step S17 that the content displayed on the image display unit 17 is not the predetermined display and the control unit 20 determines that there has been no operation from the operator within a preset time, the display content shown before the operation, such as the current location of the host vehicle, may be restored on the image display unit 17, after which the process proceeds to step S18.
In other words, according to the embodiment of the present invention, the sensitivity of the touch panel 18 for detecting operations by the operator is switched between the case where the finger 53 approaches the touch panel 18 and the case where the finger 53 touches the touch panel 18, so that the menu buttons are displayed only when the operator's finger 53 approaches the touch panel 18; while the touch panel 18 is not being operated, the entire image area of the image display unit 17 can be viewed, and furthermore the operator can change the number of menu icons displayed by pressing (touching) a menu button displayed on the image display unit 17 with the finger 53 and sliding the finger 53 while keeping this contact. Operation is therefore easy, and an in-vehicle device can be realized that secures a wide area outside the menu display area, that is, the area for displaying images other than menu icons.
FIGS. 4A and 4B are diagrams showing examples of the display screen when an area other than the menu buttons 54 to 57 is pressed in step S14.
For example, as shown in FIG. 4A, when the portion of the touch panel 18 corresponding to the intersection 80 where the displayed roads 51 cross is pressed, the control unit 20 controls the image display unit 17 so that the crossing point 80 is placed at the center of the display screen, and causes the image display unit 17 to display the screen shown in FIG. 4B.
In step S14, if the operator presses one of the menu buttons 54 to 57 but does not move the finger 53, the function allocated to the pressed position can be executed.
For example, a menu for CD operation is displayed when the menu button 54 is pressed, a radio broadcast menu is displayed when the menu button 55 is pressed, and a DVD video menu is displayed when the menu button 56 is pressed.
In this way, when the finger 53 is not moved after pressing any of the menu buttons 54 to 57, the operator can call up the intended menu.
Alternatively, the same function may be allocated to a plurality of the menu buttons 54 to 57. This allows the menu display area to be set starting from the position preferred by the operator, and the same operability can be ensured even in vehicles whose steering wheel is on the other side.
Further, after the menu buttons 54 to 57 are displayed in step S13, if there is no input operation for a certain period, for example the operator's finger 53 not touching the touch panel 18, the control unit 20 may stop displaying the menu buttons 54 to 57 and control the touch panel so that its sensitivity is raised.
In the screen display example shown in FIG. 3J, three menu icons 101 are displayed in the menu display area 58 as a result of the movement of the finger 53; display examples and display methods for other cases are described below.
As shown in FIGS. 5A to 5C, the image area of the display screen is divided into columns, for example columns A to D. Areas A to D are set in advance so that three menu icons 101 are displayed in each.
Note that the number of areas used to classify the point at which the movement of the finger 53 ends and the pressing of the touch panel 18 ends is not limited to four.
That is, the display screen need not be divided into only the four areas A to D; the display area may be divided into an arbitrary number of areas as appropriate, and the number of icons displayed in each display area may also be changed as appropriate.
The direction in which the operator moves the finger 53 may be the Y direction, where X is the horizontal direction of the image display unit 17 and Y is the vertical direction. FIG. 6 shows an embodiment in which the operator moves the finger 53 in the Y direction.
However, the number of divisions of the menu display area 58 and the number of menu icons 101 are not limited to the numbers shown in the above examples.
The direction in which the operator moves the finger 53 may also be both the X and Y directions, where X is the horizontal direction of the image display unit 17 and Y is the vertical direction.
This allows the operator to display the menu display area 58 while avoiding areas that should not be hidden by it, and to operate the menu icons 101 while keeping the current location displayed.
That is, even if the menu is displayed by an operation of the operator while content that the user wants to view is shown in an area other than the menu display area, the menu can be displayed with that content still visible.
Note that the number of divisions of the menu display area 58 and the number of menu icons 101 displayed are not limited to these numbers.
Next, a method of dividing the detection area on the touch panel 18 is described below.
FIG. 8 is a diagram for explaining the case where the area for detecting the approach of the operator's finger 53 to the touch panel 18 is divided.
FIG. 8B is a diagram showing a state in which the menu button 54 is displayed on the display screen when the finger 53 approaches the recognition area 60 and the approach of the finger 53 to the touch panel 18 is detected.
FIG. 8C is a schematic diagram of the state shown in FIG. 8B viewed from the side.
FIG. 8D is a diagram showing a state in which the menu button 55 is displayed on the display screen when the finger 53 approaches the recognition area 61 and the approach of the finger 53 to the touch panel 18 is detected.
FIG. 8E is a schematic diagram of the state shown in FIG. 8D viewed from the side.
Similarly, if the operator's finger 53 is over the recognition area 61 when the distance between the finger 53 and the touch panel 18 becomes the distance L59 or less, the signal processing unit 19 detects the position of the operator's finger 53 at that time and outputs the position information to the control unit 20, and the control unit 20 controls the in-vehicle device 10 so that the menu button (here, the menu button 55) is displayed only in the recognition area containing that position (here, the recognition area 61), that is, so that the content shown in FIG. 8D is displayed.
Next, another method of dividing the detection area on the touch panel 18 is described below.
FIG. 9 is a schematic diagram showing that, when the finger 53 approaches a corner of the display screen of the image display unit 17, a menu button is displayed at the corner of the display screen near the finger 53.
FIG. 9B is a diagram showing a state in which the menu button 54 is displayed when the finger 53 approaches the recognition area 65 and the approach of the finger 53 to the touch panel 18 is detected.
FIG. 9C is a schematic diagram of the state shown in FIG. 9B viewed from the side.
FIG. 9D is a diagram showing a state in which the menu button 55 is displayed when the finger 53 approaches the recognition area 66 and the approach of the finger 53 to the touch panel 18 is detected.
FIG. 9E is a schematic diagram of the state shown in FIG. 9D viewed from the side.
FIG. 9F is a diagram showing the state when the finger 53 approaches an area other than the recognition areas 65 to 68; since the operator's finger 53 is not in any of the recognition areas 65 to 68, the menu buttons 54 to 57 are not displayed.
Similarly, if the operator's finger 53 is over the recognition area 66 when the distance between the finger 53 and the touch panel 18 becomes the distance L59 or less, the signal processing unit 19 detects the position of the operator's finger 53 at that time and outputs the position information to the control unit 20, and the control unit 20 controls the in-vehicle device 10 so that the menu button (here, the menu button 55) is displayed only in the recognition area containing that position (here, the recognition area 66), that is, so that the content shown in FIG. 9D is displayed.
Then, when the distance between the finger 53 and the touch panel 18 becomes the distance L59 or less, if the operator's finger 53 is not over any of the recognition areas 65 to 68, the signal processing unit 19 detects the position of the operator's finger 53 at that time and outputs the position information to the control unit 20, and the control unit 20 controls the in-vehicle device 10 so that none of the menu buttons 54 to 57 is displayed (here, so that the state shown in FIG. 9F results).
In FIG. 10, the horizontal axis is the distance between the touch panel 18 and the finger 53, and the vertical axis is the capacitance from the touch panel 18 to ground via the finger 53. In a capacitive touch panel, when a capacitive load such as the finger 53 comes into contact with the panel, a current flows through the touch panel 18; this current is converted into a voltage, which the signal processing unit 19 detects, and the signal processing unit 19 performs arithmetic processing on the detection result to identify the contact position.
Therefore, when the finger 53 is brought close to the touch panel 18 from a state in which the distance between the touch panel 18 and the finger 53 is sufficiently large, it is judged that the finger 53 has touched the touch panel 18 when the capacitance between the touch panel 18 and the finger becomes larger than a certain capacitance (for example, C in the case of FIG. 10), and the touched position is calculated.
In terms of the curve shown in FIG. 10, the sensitivity can be changed by changing the threshold, such as A or B on the vertical axis, used to judge the magnitude of the detected capacitance.
That is, raising the sensitivity means setting the threshold so that even an input to the touch panel with a small capacitance is recognized; when the sensitivity is raised, the signal processing unit 19 notifies the control unit 20 of the position information when the operator's finger 53 approaches the touch panel 18, even while the finger 53 is away from the touch panel 18.
Lowering the sensitivity means setting a threshold that requires an input operation to the touch panel 18 with a large capacitance (for example, pressing or contact by the operator's finger 53, or an extreme approach to the touch panel); when the sensitivity is lowered, the signal processing unit 19 notifies the control unit 20 of the position information when the operator's finger 53 touches the touch panel 18.
When the finger 53 approaches the touch panel 18 and the capacitance between the touch panel and the finger 53 exceeds the capacitance value B shown in FIG. 10, the threshold is changed to the lower threshold corresponding to the capacitance value C.
If the capacitance at the time of normal contact with the touch panel is A, then setting a threshold corresponding to a capacitance value C lower than A is sufficient to detect contact of the finger 53 with the touch panel 18. This change from the threshold corresponding to the capacitance value B on the vertical axis to the threshold corresponding to the capacitance value C is what changes the sensitivity.
A state in which the sensitivity is lowered is synonymous with the sensitivity being low (low sensitivity), and a state in which the sensitivity is raised is synonymous with the sensitivity being high (high sensitivity).
11 Storage device
12 DVD/CD drive
13 GPS
14 Vehicle speed sensor
15 Gyro
16 Speaker
17 Image display unit
18 Touch panel
19 Signal processing unit
20 Control unit
51 Road
52 Own-vehicle mark
53 Finger
54 Menu button
55 Menu button
56 Menu button
57 Menu button
58 Menu display area
59 Distance L between finger and touch panel
80 Intersection of roads
101 Menu icon
Claims (7)
- An image display device in which choices for executing functions of the device are organized hierarchically, comprising:
an image display unit that displays the choices;
a capacitive touch panel arranged on the surface of the image display unit;
a signal processing unit that detects capacitance changed in response to an input operation on the touch panel; and
a control unit that executes a function corresponding to the input operation,
wherein, after the sensitivity of the touch panel has been set to a high sensitivity, when it is detected that an operator's finger has approached the touch panel and produced a capacitance of a predetermined amount or more, the control unit changes the sensitivity of the touch panel to a low sensitivity lower than the high sensitivity and displays choices of an upper layer on the image display unit, and when an input operation in which the operator's finger presses a choice of the upper layer is detected, the control unit sets a display area for displaying choices of a lower layer to a size corresponding to the distance the operator's finger has moved while pressing the touch panel, and displays the choices of the lower layer in the display area. - The image display device according to claim 1, wherein the signal processing unit converts the change in capacitance into a voltage, and when the control unit judges that the value of the voltage converted by the signal processing unit is larger than a first threshold, the control unit changes the sensitivity of the touch panel by comparing the value of the voltage converted by the signal processing unit with a second threshold larger than the first threshold.
- The image display device according to claim 1 or claim 2, wherein the control unit divides the image display unit into a plurality of areas and displays the choices of the upper layer corresponding to the area in which the operator's finger is detected.
- The image display device according to any one of claims 1 to 3, wherein the control unit displays the choices of the upper layer at at least one corner of the image display unit.
- The image display device according to any one of claims 1 to 4, wherein, when it is detected that the operator's finger has approached the touch panel and produced a capacitance of a predetermined amount or more, the control unit displays the choices of the upper layer on the image display unit at the corner of the image display unit closest to the position of the finger at which the input operation was detected.
- The image display device according to any one of claims 1 to 5, wherein the control unit sets the display area to an area on the touch panel that includes the position where the operator's finger started pressing and the position where it ended pressing.
- The image display device according to any one of claims 1 to 5, wherein the control unit sets the display area to a quadrangular area on the touch panel whose diagonal is the line segment connecting the position where the operator's finger started pressing and the position where it ended pressing.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201080002536.7A CN102150114B (zh) | 2009-02-06 | 2010-01-28 | 图像显示设备 |
JP2010549381A JP5288643B2 (ja) | 2009-02-06 | 2010-01-28 | 画像表示装置 |
US13/062,054 US8542214B2 (en) | 2009-02-06 | 2010-01-28 | Image display device |
EP10738323.4A EP2330486B1 (en) | 2009-02-06 | 2010-01-28 | Image display device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009-025745 | 2009-02-06 | ||
JP2009025745 | 2009-02-06 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010089980A1 true WO2010089980A1 (ja) | 2010-08-12 |
Family
ID=42541891
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/000513 WO2010089980A1 (ja) | 2009-02-06 | 2010-01-28 | 画像表示装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US8542214B2 (ja) |
EP (1) | EP2330486B1 (ja) |
JP (1) | JP5288643B2 (ja) |
CN (1) | CN102150114B (ja) |
WO (1) | WO2010089980A1 (ja) |
Families Citing this family (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101651859B1 (ko) | 2009-06-05 | 2016-09-12 | 삼성전자주식회사 | Method for providing a UI for each user, and device applying the same |
JP5545009B2 (ja) * | 2010-04-28 | 2014-07-09 | ソニー株式会社 | Sensor device and information display device |
KR101731346B1 (ko) * | 2010-11-12 | 2017-04-28 | 엘지전자 주식회사 | Method for providing a display screen of a multimedia device, and multimedia device therefor |
WO2013005586A1 (ja) * | 2011-07-04 | 2013-01-10 | Necカシオモバイルコミュニケーションズ株式会社 | Image processing device, image processing method, and image processing program |
JP6020450B2 (ja) | 2011-08-05 | 2016-11-02 | 日本電気株式会社 | Information input unit, information input method, and computer program |
KR101160681B1 | 2011-10-19 | 2012-06-28 | 배경덕 | Method for causing a specific operation to be performed upon activation of a mobile communication terminal, mobile communication terminal, and computer-readable recording medium |
JP5885501B2 (ja) * | 2011-12-28 | 2016-03-15 | キヤノン株式会社 | Electronic device, control method therefor, program, and recording medium |
JP6005417B2 (ja) | 2012-06-26 | 2016-10-12 | 株式会社東海理化電機製作所 | Operating device |
JP5751233B2 (ja) * | 2012-10-02 | 2015-07-22 | 株式会社デンソー | Operating device |
TWI535587B (zh) * | 2012-11-14 | 2016-06-01 | 義晶科技股份有限公司 | Method for controlling the display of vehicle images using a touch panel, and vehicle image system therefor |
JP5860838B2 (ja) * | 2013-05-30 | 2016-02-16 | 京セラドキュメントソリューションズ株式会社 | Display device, electronic device, and image forming device |
US9606664B2 (en) * | 2013-11-13 | 2017-03-28 | Dell Products, Lp | Dynamic hover sensitivity and gesture adaptation in a dual display system |
US20150153932A1 (en) * | 2013-12-04 | 2015-06-04 | Samsung Electronics Co., Ltd. | Mobile device and method of displaying icon thereof |
KR101655810B1 (ko) * | 2014-04-22 | 2016-09-22 | 엘지전자 주식회사 | Display device for a vehicle |
US20160179325A1 (en) | 2014-12-19 | 2016-06-23 | Delphi Technologies, Inc. | Touch-sensitive display with hover location magnification |
EP3040842B1 (de) * | 2015-01-02 | 2019-08-07 | Volkswagen AG | User interface and method for hybrid use of a display unit of a means of transportation |
CN104615302B (zh) * | 2015-01-30 | 2016-08-24 | 努比亚技术有限公司 | Method and device for preventing accidental touches on a mobile terminal, and mobile terminal |
DE102015201722A1 (de) * | 2015-02-02 | 2016-08-04 | Robert Bosch Gmbh | Method for operating an input device; input device |
CN105182676B (zh) * | 2015-09-08 | 2017-03-01 | 京东方科技集团股份有限公司 | Projection screen, and touch-screen projection display method and system |
KR20180051002A (ko) * | 2016-11-07 | 2018-05-16 | 삼성전자주식회사 | Method for controlling execution of an application in an electronic device using a touch screen, and electronic device therefor |
CN110231908A (zh) * | 2018-10-30 | 2019-09-13 | 蔚来汽车有限公司 | Interface control method and device, terminal, controller, and medium |
JP7194666B2 (ja) * | 2019-11-19 | 2022-12-22 | 株式会社ヴァレオジャパン | Display device |
CN111762023B (zh) * | 2020-05-29 | 2022-04-12 | 法雷奥舒适驾驶辅助系统(广州)有限公司 | Touch device and method therefor, and automobile steering wheel auxiliary switch |
CN111666031B (zh) * | 2020-06-05 | 2022-03-04 | 阿波罗智联(北京)科技有限公司 | Virtual key control method, apparatus, device, and storage medium for a vehicle-mounted terminal |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000242387A * | 1999-02-19 | 2000-09-08 | Casio Comput Co Ltd | Item selection device and program recording medium therefor
JP2002358162A | 2001-06-01 | 2002-12-13 | Sony Corp | Image display device
JP2003195998A * | 2001-12-26 | 2003-07-11 | Canon Inc | Information processing device, control method for the information processing device, control program for the information processing device, and storage medium
JP2007199980A * | 2006-01-26 | 2007-08-09 | Xanavi Informatics Corp | Display control device, map display device, and navigation device
JP2009025745A | 2007-07-23 | 2009-02-05 | Ricoh Co Ltd | Toner replenishing method, toner replenishing device, and image forming device
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6459424B1 (en) * | 1999-08-10 | 2002-10-01 | Hewlett-Packard Company | Touch-sensitive input screen having regional sensitivity and resolution properties |
JP2002055750A (ja) * | 2000-08-10 | 2002-02-20 | Canon Inc | Information processing device, function list display method, and storage medium |
JP2005088801A (ja) * | 2003-09-18 | 2005-04-07 | Denso Corp | Information processing system |
JP4855654B2 (ja) * | 2004-05-31 | 2012-01-18 | ソニー株式会社 | In-vehicle device, information providing method for the in-vehicle device, program for the information providing method, and recording medium recording that program |
WO2006003588A2 (en) * | 2004-06-29 | 2006-01-12 | Koninklijke Philips Electronics N.V. | Multi-layered display of a graphical user interface |
JP4642497B2 (ja) | 2005-02-10 | 2011-03-02 | クラリオン株式会社 | Navigation device |
JP2007108841A (ja) * | 2005-10-11 | 2007-04-26 | Matsushita Electric Ind Co Ltd | Data processing device |
DE102006028046B4 (de) * | 2006-06-19 | 2016-02-11 | Audi Ag | Combined display and operating device for a motor vehicle |
US8284165B2 (en) * | 2006-10-13 | 2012-10-09 | Sony Corporation | Information display apparatus with proximity detection performance and information display method using the same |
US8127254B2 (en) * | 2007-06-29 | 2012-02-28 | Nokia Corporation | Unlocking a touch screen device |
DE102007032472A1 (de) * | 2007-07-10 | 2009-01-22 | Volkswagen Ag | Contactlessly measuring position-determination unit and method for contactlessly determining a position of an actuating element of a user of a motor vehicle |
US8054300B2 (en) * | 2008-06-17 | 2011-11-08 | Apple Inc. | Capacitive sensor panel having dynamically reconfigurable sensor size and shape |
-
2010
- 2010-01-28 EP EP10738323.4A patent/EP2330486B1/en active Active
- 2010-01-28 WO PCT/JP2010/000513 patent/WO2010089980A1/ja active Application Filing
- 2010-01-28 US US13/062,054 patent/US8542214B2/en active Active
- 2010-01-28 CN CN201080002536.7A patent/CN102150114B/zh active Active
- 2010-01-28 JP JP2010549381A patent/JP5288643B2/ja active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP2330486A4 |
Cited By (36)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150169114A1 (en) * | 2010-08-27 | 2015-06-18 | Apple Inc. | Touch and hover sensor compensation |
US9836158B2 (en) * | 2010-08-27 | 2017-12-05 | Apple Inc. | Touch and hover sensor compensation |
CN102402384A (zh) * | 2010-09-07 | 2012-04-04 | 索尼公司 | Information processing device, information processing method, and computer program |
CN102402384B (zh) * | 2010-09-07 | 2017-04-12 | 索尼公司 | Information processing device and information processing method |
US11126333B2 (en) | 2010-12-23 | 2021-09-21 | Microsoft Technology Licensing, Llc | Application reporting in an application-selectable user interface |
US10969944B2 (en) | 2010-12-23 | 2021-04-06 | Microsoft Technology Licensing, Llc | Application reporting in an application-selectable user interface |
US9229918B2 (en) | 2010-12-23 | 2016-01-05 | Microsoft Technology Licensing, Llc | Presenting an application change through a tile |
CN102654824A (zh) * | 2011-03-04 | 2012-09-05 | 三星电子株式会社 | Display device and control method thereof |
EP2521013A1 (en) | 2011-05-03 | 2012-11-07 | HTC Corporation | Management and application methods and systems for touch-sensitive devices, and computer program products thereof |
CN102768594A (zh) * | 2011-05-03 | 2012-11-07 | 宏达国际电子股份有限公司 | Management and application methods and systems for touch-sensitive devices |
US9152255B2 (en) | 2011-05-03 | 2015-10-06 | Htc Corporation | Management and application methods and systems for touch-sensitive devices, and computer program products thereof |
JP2012238079A (ja) * | 2011-05-10 | 2012-12-06 | Kyocera Corp | Input device and electronic device |
US10303325B2 (en) | 2011-05-27 | 2019-05-28 | Microsoft Technology Licensing, Llc | Multi-application environment |
US9535597B2 (en) | 2011-05-27 | 2017-01-03 | Microsoft Technology Licensing, Llc | Managing an immersive interface in a multi-application immersive environment |
US11698721B2 (en) | 2011-05-27 | 2023-07-11 | Microsoft Technology Licensing, Llc | Managing an immersive interface in a multi-application immersive environment |
US9658766B2 (en) | 2011-05-27 | 2017-05-23 | Microsoft Technology Licensing, Llc | Edge gesture |
JP2014515519A (ja) * | 2011-05-27 | 2014-06-30 | マイクロソフト コーポレーション | Edge gesture |
JP2012248035A (ja) * | 2011-05-27 | 2012-12-13 | Sharp Corp | Touch panel system and electronic device using the same |
US9329774B2 (en) | 2011-05-27 | 2016-05-03 | Microsoft Technology Licensing, Llc | Switching back to a previously-interacted-with application |
JP2013009727A (ja) * | 2011-06-28 | 2013-01-17 | Kyocera Corp | Electronic device, control method, and control program |
JP2013011983A (ja) * | 2011-06-28 | 2013-01-17 | Kyocera Corp | Electronic device, control method, and control program |
US10579250B2 (en) | 2011-09-01 | 2020-03-03 | Microsoft Technology Licensing, Llc | Arranging tiles |
US10254955B2 (en) | 2011-09-10 | 2019-04-09 | Microsoft Technology Licensing, Llc | Progressively indicating new content in an application-selectable user interface |
JP2013089212A (ja) * | 2011-10-24 | 2013-05-13 | Kyocera Corp | Mobile terminal and low-sensitivity region setting program |
WO2013121629A1 (ja) * | 2012-02-14 | 2013-08-22 | Necカシオモバイルコミュニケーションズ株式会社 | Information processing device, malfunction prevention method, and program |
JP2014052988A (ja) * | 2012-09-10 | 2014-03-20 | Konica Minolta Inc | Touch panel input device, touch input method, and touch input control program |
JP2014147441A (ja) * | 2013-01-31 | 2014-08-21 | Panasonic Corp | Operating device and washing machine |
CN104461105B (zh) * | 2013-09-25 | 2017-08-29 | 联想(北京)有限公司 | Method for controlling an electronic device, and electronic device |
CN104461105A (zh) * | 2013-09-25 | 2015-03-25 | 联想(北京)有限公司 | Method for controlling an electronic device, and electronic device |
WO2015045090A1 (ja) * | 2013-09-27 | 2015-04-02 | 株式会社 東芝 | Electronic device and method |
JP2017056860A (ja) * | 2015-09-17 | 2017-03-23 | 日本精機株式会社 | Vehicle display device |
JP2018028543A (ja) * | 2017-09-19 | 2018-02-22 | パイオニア株式会社 | Object detection device, object detection method, object detection program, and information recording medium |
JP2019197190A (ja) * | 2018-05-11 | 2019-11-14 | 富士ゼロックス株式会社 | Information processing device, information processing system, and program |
JP2019194596A (ja) * | 2019-06-03 | 2019-11-07 | パイオニア株式会社 | Object detection device |
JP2021152539A (ja) * | 2019-06-03 | 2021-09-30 | パイオニア株式会社 | Object detection device |
JP2021077010A (ja) * | 2019-11-07 | 2021-05-20 | 株式会社東海理化電機製作所 | Instruction input device, control device, and computer program |
Also Published As
Publication number | Publication date |
---|---|
US20110187675A1 (en) | 2011-08-04 |
CN102150114B (zh) | 2014-01-22 |
EP2330486A1 (en) | 2011-06-08 |
JP5288643B2 (ja) | 2013-09-11 |
EP2330486A4 (en) | 2014-11-05 |
EP2330486B1 (en) | 2017-05-10 |
CN102150114A (zh) | 2011-08-10 |
US8542214B2 (en) | 2013-09-24 |
JPWO2010089980A1 (ja) | 2012-08-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5288643B2 (ja) | Image display device | |
JP6135942B2 (ja) | In-vehicle device and control method | |
JP5409657B2 (ja) | Image display device | |
JP4388359B2 (ja) | In-vehicle man-machine interface device, method, and program | |
WO2011129109A1 (ja) | Display device | |
JP2011111061A (ja) | In-vehicle display system | |
US20190322176A1 (en) | Input device for vehicle and input method | |
JP2006277588A (ja) | Touch panel and electronic device provided with a touch panel | |
WO2017169263A1 (ja) | Display processing device and display processing program | |
JP4548325B2 (ja) | In-vehicle display device | |
JP6177660B2 (ja) | Input device | |
KR102385060B1 (ko) | Input device and control method thereof | |
JP2009286175A (ja) | Vehicle display device | |
JP4526307B2 (ja) | Function selection device | |
JP6644500B2 (ja) | In-vehicle device | |
JP2010129070A (ja) | Display device | |
JP3831849B2 (ja) | Vehicle controller | |
JP2007226478A (ja) | Input/output control device, input/output control method, and navigation device | |
KR101804767B1 (ko) | Input device and vehicle including the same | |
JP2019139009A (ja) | Display control method, and display control device and display system using the same | |
JP4188628B2 (ja) | Image display device | |
WO2020196559A1 (ja) | Control device and control system | |
JP5294764B2 (ja) | Menu item selection device and menu item selection method | |
JP2020160787A (ja) | Control device | |
JPH09325845A (ja) | Operation instruction method using a touch panel |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201080002536.7 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10738323 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010549381 Country of ref document: JP |
|
REEP | Request for entry into the european phase |
Ref document number: 2010738323 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010738323 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13062054 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |