WO2014162699A1 - Input Device - Google Patents
- Publication number
- WO2014162699A1 (PCT/JP2014/001777)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- finger
- groove
- control unit
- input
- operation surface
- Prior art date
Classifications
- G06F3/03547—Touch pads, in which fingers can move on a surface
- B60K35/00—Arrangement of adaptations of instruments
- B60K35/10
- G01C21/3664—Details of the user input interface, e.g. buttons, knobs or sliders, including those provided on a touch screen; remote controllers; input using gestures
- G01C21/3688—Systems comprising multiple parts or multiple output devices (not client-server), e.g. detachable faceplates, key fobs or multiple output screens
- G06F1/169—Constructional details or arrangements related to integrated I/O peripherals, the I/O peripheral being an integrated pointing device, e.g. trackball in the palm rest area, mini-joystick integrated between keyboard keys, touch pads or touch stripes
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
- B60K2360/141
- B60K2360/1446
- B60K2360/146
- B60K2360/1468
- G06F2203/04106—Multi-sensing digitiser, i.e. digitiser using at least two different sensing technologies simultaneously or alternatively, e.g. for detecting pen and finger, for saving power or for improving position detection
- G06F2203/04108—Touchless 2D digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch but is proximate to the digitiser's interaction surface, without distance measurement in the Z direction
Definitions
- This disclosure relates to an input device that performs input for information display on a display unit by an operator's finger operation.
- As a conventional input device, the one disclosed in Patent Document 1, for example, is known.
- This device includes a remote touchpad unit for a user to perform a touch operation, a display unit for displaying various modes of the multimedia system according to a three-dimensional signal received from the remote touchpad unit, and a control unit for controlling the multimedia system to operate according to the three-dimensional signal of the remote touchpad unit.
- For example, the operation standby screen can be switched while the radio main screen is displayed.
- In Patent Document 1, however, the above screen operation is performed according to the distance (height) from the surface of the remote touchpad to the finger. Therefore, when the finger (hand) is inadvertently brought close to the remote touchpad, an unintended input (erroneous operation) may be made on the display of the display unit.
- The present disclosure has been made in view of the above, and its object is to provide an input device that can prevent input unintended by the user.
- To achieve this object, the input device adopts the following configuration.
- The input device is mounted on a vehicle and formed separately from a display unit that switchably displays images divided into a plurality of layers; it is an input device in which input for operating the images is performed by a user's finger operation on an operation surface.
- The input device includes a groove portion, formed as a depression into which the user's fingertip can be inserted and along whose continuous longitudinal direction the finger can slide, a detection unit that detects the presence or absence of a finger in the groove, and a control unit that controls input by finger operation on the operation surface.
- The control unit prohibits input by finger operation on the operation surface. Therefore, even if the user inadvertently brings a finger close to the operation surface, input for operating the image is not executed, so that input unintended by the user can be prevented.
- When the control unit determines, from the detection result of the detection unit, that the user's finger has slid in the groove portion, it cancels the prohibition of input. The user can therefore resume input by finger operation on the operation surface by sliding a finger along the groove; in this case, the finger operation is one intended by the user.
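As a rough paraphrase of this behavior, input on the operation surface can be modeled as locked by default, with a detected slide in the groove clearing the lock. The following Python sketch is illustrative only; the class and method names are assumptions and do not appear in the disclosure.

```python
class GrooveLock:
    """Minimal model of the input-prohibition behavior described above.

    Input by finger operation on the operation surface starts out
    prohibited; a finger slide detected in the groove cancels the
    prohibition.  All names here are illustrative assumptions.
    """

    def __init__(self):
        self.locked = True  # input is prohibited by default

    def on_groove_slide(self):
        # A deliberate slide along the groove cancels the prohibition.
        self.locked = False

    def accept_input(self, touch_event):
        # While locked, operation-surface events are ignored, so a finger
        # brought close inadvertently cannot operate the image.
        return None if self.locked else touch_event


lock = GrooveLock()
assert lock.accept_input("tap") is None   # locked: input not executed
lock.on_groove_slide()
assert lock.accept_input("tap") == "tap"  # unlocked: input accepted
```

The point of the design is that unlocking requires a gesture that is hard to perform accidentally, which the sketch models as a separate, explicit method call.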
- FIG. 1 is an explanatory diagram for explaining the arrangement of the navigation device and the remote operation device according to the first embodiment in a vehicle cabin.
- FIG. 2 is a configuration diagram showing the configurations of the navigation apparatus and the remote operation device in the first embodiment.
- FIG. 3 is a perspective view showing the remote control device of the first embodiment.
- FIG. 4 is a cross-sectional view taken along the line IV-IV.
- FIG. 5 is a perspective view showing the touch sensor and the first and second sensors.
- FIG. 6 is a diagram for explaining the relationship between the sensitivity value detected by the touch sensor and the operation state determined by the operation control unit in the remote operation device according to the first embodiment.
- FIG. 7 is a diagram illustrating a relationship between each sensitivity threshold value, an operation state, and a screen display stored in the operation control unit of the first embodiment.
- FIG. 8 is a flowchart illustrating an input process performed by the operation control unit in the remote operation device according to the first embodiment.
- FIG. 9 is a diagram for explaining the display state of the display screen accompanying the finger operation on the groove in the remote operation device according to the first embodiment.
- FIG. 10 is an explanatory diagram for explaining the arrangement of the navigation device and the remote operation device according to the second embodiment in the passenger compartment.
- FIG. 11 is a configuration diagram illustrating configurations of the navigation apparatus and the remote operation device according to the second embodiment.
- FIG. 12 is a flowchart illustrating an input process performed by the operation control unit in the remote operation device according to the second embodiment.
- FIG. 13 is a diagram for explaining a display state of a display screen associated with a finger operation on the groove in the remote operation device according to the second embodiment.
- FIG. 14 is an explanatory diagram for explaining the arrangement of the navigation device and the remote operation device according to the third embodiment in the vehicle interior.
- FIG. 15 is a configuration diagram showing configurations of the navigation apparatus and the remote operation device in the third embodiment.
- FIG. 16 is a flowchart illustrating an input process performed by the operation control unit in the remote operation device according to the third embodiment.
- FIG. 17 is a diagram for explaining the display state of the display screen accompanying the finger operation on the groove in the remote operation device according to the third embodiment.
- the input device of the present disclosure is applied to a remote operation device 100 for operating a navigation device 50.
- The remote operation device 100 is mounted on a vehicle and constitutes the display system 10 together with the navigation device 50 and the like.
- The remote operation device 100 is installed at a position adjacent to the palm rest 39 on the center console of the vehicle, within easy reach of the operator (a user, here the driver), with its operation surface 32 exposed.
- the remote operation device 100 has a touch sensor 31 (FIGS. 2 and 3), and the surface of the touch sensor 31 is formed as the operation surface 32 on which an operator's finger operation is performed. Note that “F” in FIG. 1 indicates an operator's finger.
- The navigation device 50 exposes the display screen 53 of the liquid crystal display 52 so that it can be viewed by the operator, and is installed at the center in the vehicle width direction, with the display screen 53 facing the driver's seat and passenger's seat side.
- the remote control device 100 is formed separately from the navigation device 50 and is set at a position away from the navigation device 50.
- the remote operation device 100 is connected to a Controller Area Network bus (hereinafter referred to as a CAN bus) 90, an external battery 91, and the like.
- the CAN bus 90 is a transmission path used for data transmission between in-vehicle devices in an in-vehicle communication network formed by connecting a plurality of in-vehicle devices mounted on a vehicle.
- the remote operation device 100 can communicate with the navigation device 50 located remotely via the CAN bus 90.
- the remote operation device 100 is formed in a rectangular box shape as a whole.
- The remote operation device 100 includes the power supply interfaces 21 and 22, the communication control unit 23, the communication interface 24, the touch sensor 31, the first and second sensors 34a and 34b, the operation control unit 35, and the like.
- Each power supply interface 21, 22 stabilizes the power supplied from the battery 91 and supplies it to the operation control unit 35.
- One power interface 21 is always supplied with power from the battery 91.
- The other power interface 22 is supplied with electric power from the battery 91 via the switch 92, which is energized when the accessory (ACC) power source of the vehicle is turned on.
- the communication control unit 23 and the communication interface 24 are configured to output information processed by the operation control unit 35 to the CAN bus 90 and to acquire information output to the CAN bus 90 from other in-vehicle devices.
- the communication control unit 23 and the communication interface 24 are connected to each other by a transmission signal line TX and a reception signal line RX.
- the touch sensor 31 is a capacitance type detection unit (an example of a detection unit), is formed in a rectangular flat plate shape, and detects an operation state with a finger on the sensor surface.
- the touch sensor 31 is provided at a position corresponding to the operation surface 32.
- the touch sensor 31 is formed by arranging electrodes extending along the x-axis direction and electrodes extending along the y-axis direction in a lattice shape. These electrodes are connected to the operation control unit 35.
- Each electrode is configured such that the generated capacitance changes according to the position of a finger close to the sensor surface (the x, y, z coordinate position in FIG. 3), and the generated capacitance signal (sensitivity value) is output to the operation control unit 35.
- the sensor surface is covered with an insulating sheet made of an insulating material.
- the operation surface 32 is a flat portion on which the operator performs finger operations, and is formed by, for example, a material that improves the sliding of the fingers over the entire insulating sheet on the sensor surface.
- the operation surface 32 is formed in a rectangular shape in a region on the vehicle rear side on the front surface of the rectangular parallelepiped remote operation device 100.
- Input for operating the display image 60 (see FIG. 9) displayed on the display screen 53 can be performed by the operator's finger operation on the operation surface 32 in the x-axis, y-axis, or z-axis direction.
- The groove portion 33 is a depression formed so that the driver's fingertip can be inserted into it and the finger can slide along the longitudinal direction in which the depression is continuous.
- the groove portion 33 is disposed in a region on the front side of the vehicle with respect to the operation surface 32 on the front surface of the rectangular parallelepiped remote operation device 100.
- the groove portion 33 (the bottom portion of the groove portion 33) is disposed so as to be flush with the operation surface 32.
- the longitudinal direction of the groove portion 33 is set in the vehicle front-rear direction and is disposed so as to face the operation surface 32.
- Groove portion 33 is arranged at approximately the center in the left-right direction of vehicle in remote operation device 100.
- the cross-sectional shape of the groove part 33 is formed in, for example, an arc shape (semi-circular shape) having the same width as the fingertip. Therefore, as shown in FIG. 4, a plurality of fingers, palms and the like cannot be inserted into the groove 33.
- the first and second sensors 34 a and 34 b are a plurality of detection units that detect the presence or absence of the driver's finger in the groove 33.
- the first and second sensors 34a and 34b are provided at positions corresponding to the groove 33 (further below the bottom of the groove 33).
- the first sensor 34 a is provided so as to correspond to one end side of the groove portion 33 which is on the side far from the operation surface 32.
- the second sensor 34 b is provided so as to correspond to the other end side of the groove 33 that is closer to the operation surface 32.
- the first sensor 34a corresponds to the far-side detection unit of the present disclosure
- the second sensor 34b corresponds to an example of the near-side detection unit of the present disclosure.
- The first and second sensors 34a and 34b are formed from the same electrodes as the touch sensor 31. As shown in FIG. 5, in the present embodiment, they are disposed on one end side of the touch sensor 31 and formed integrally with it. The electrodes of the first and second sensors 34a and 34b are connected to the operation control unit 35 via the touch sensor 31. Their capacitance changes when a finger approaches or comes into contact with the groove 33, and the generated capacitance signal (sensitivity value) is output to the operation control unit 35.
- the operation control unit 35 corresponds to an example of the control unit of the present disclosure, and includes a processor that performs various types of arithmetic processing, a RAM that functions as a work area for arithmetic processing, a flash memory that stores programs used for arithmetic processing, and the like It is constituted by.
- the operation control unit 35 is connected to each of the power supply interfaces 21 and 22, the communication control unit 23, the touch sensor 31, and the first and second sensors 34a and 34b.
- By executing a predetermined program, the operation control unit 35 measures the capacitance signal of each electrode of the touch sensor 31 and acquires a sensitivity value (Hth) as the measurement value of the touch sensor 31. When the operator's finger is close to the operation surface 32 (sensor surface), electric charge is stored between the electrode and the finger.
- From the sensitivity values, the operation control unit 35 calculates, by arithmetic processing, the x and y coordinates indicating the relative operation position of the finger in the plane of the operation surface 32 (hereinafter referred to as the "relative position"), and the z coordinate corresponding to the distance of the finger from the operation surface 32 (hereinafter referred to as the "operation distance").
- The operation control unit 35 calculates the position (x coordinate, y coordinate) at which the largest sensitivity value is obtained among the electrodes on the x axis and the y axis as the current relative position of the finger.
- the operation control unit 35 calculates the current z-coordinate position of the finger, that is, the operation distance based on the magnitude of the obtained sensitivity value.
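The calculation described above can be illustrated in a few lines: the electrode with the largest sensitivity on each axis gives the relative position, and the peak magnitude stands in for the operation distance. This is a sketch only; the inverse sensitivity-to-distance mapping below is an assumed placeholder, since the actual conversion is not specified in this description.

```python
def relative_position(x_sens, y_sens):
    """Return the (x, y) electrode indices with the largest sensitivity,
    taken as the current relative position of the finger (sketch)."""
    x = max(range(len(x_sens)), key=lambda i: x_sens[i])
    y = max(range(len(y_sens)), key=lambda j: y_sens[j])
    return x, y


def operation_distance(peak, scale=100.0):
    """Map the peak sensitivity value to a z-axis operation distance.

    The inverse relation (larger sensitivity -> smaller distance) and the
    scale constant are assumptions made for illustration only.
    """
    return scale / max(peak, 1e-9)


x_sens = [1, 3, 9, 2]  # capacitance signal per x-axis electrode
y_sens = [0, 8, 4]     # capacitance signal per y-axis electrode
assert relative_position(x_sens, y_sens) == (2, 1)
# A stronger peak signal means the finger is closer (smaller distance).
assert operation_distance(10.0) < operation_distance(5.0)
```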
- Based on the obtained sensitivity value, the operation control unit 35 associates the operation state of the finger (the operation distance of the finger in the z-axis direction) with the operation on the display image 60 described later.
- The operation control unit 35 stores in advance a plurality of sensitivity thresholds Hth1 and Hth2 for determining the operation state, and determines the operation state of the finger according to these thresholds.
- The operation states are a "contact state," in which the finger is in contact with the operation surface 32 (or not actually in contact but nearly so), a "proximity state," in which the finger is in proximity to the operation surface 32, and a "non-contact state," in which the finger is farther from the operation surface 32 than in the "proximity state."
- the “contact state” may be an operation state only when the finger substantially contacts the operation surface 32.
- the “contact state” is referred to as “in contact”
- the “proximity state” is referred to as “in proximity”
- the “non-contact state” is referred to as “non-contact”.
- Upper thresholds Hth1U and Hth2U, corresponding to positions a predetermined distance above (on the side closer to the operation surface 32), are set for the sensitivity thresholds Hth1 and Hth2, respectively, and lower thresholds Hth1D and Hth2D, corresponding to positions a predetermined distance below (on the side farther from the operation surface 32), are likewise set. Both the upper thresholds Hth1U and Hth2U and the lower thresholds Hth1D and Hth2D serve as sensitivity thresholds used for determining the operation state.
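The upper and lower thresholds can be read as a hysteresis band: a state change toward the surface is triggered only above the upper threshold, and a change away from it only below the lower threshold, so the determined state does not chatter when the sensitivity hovers near a boundary. The sketch below illustrates that reading; the transition rule and the numeric threshold values are assumptions, not values given in this description.

```python
def next_state(state, s, hth2=(15, 25), hth1=(45, 55)):
    """One hysteresis step for the finger's operation state (sketch).

    s is the current sensitivity value (larger = finger closer).
    hth2 and hth1 are assumed (lower, upper) threshold pairs for the
    non-contact/proximity and proximity/contact boundaries, standing in
    for (Hth2D, Hth2U) and (Hth1D, Hth1U).
    """
    if state == "non-contact" and s >= hth2[1]:
        state = "proximity"              # crossed the upper Hth2 threshold
    if state == "proximity" and s >= hth1[1]:
        state = "contact"                # crossed the upper Hth1 threshold
    elif state == "proximity" and s <= hth2[0]:
        state = "non-contact"            # fell below the lower Hth2 threshold
    elif state == "contact" and s <= hth1[0]:
        state = "proximity"              # fell below the lower Hth1 threshold
    return state


assert next_state("non-contact", 30) == "proximity"
assert next_state("non-contact", 60) == "contact"
assert next_state("proximity", 20) == "proximity"   # inside the band: no chatter
assert next_state("contact", 50) == "contact"       # inside the band: no chatter
assert next_state("contact", 40) == "proximity"
```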
- the operation control unit 35 detects a pressing operation (touch operation) when the operator gently presses the operation surface 32 with a finger. Then, the operation control unit 35 outputs the x, y, z coordinates indicating the position of the finger accompanying the finger slide operation and the presence / absence of the pressing operation to the CAN bus 90 through the communication control unit 23 and the communication interface 24.
- the operation control unit 35 measures the capacitance signals of the electrodes of the first and second sensors 34a and 34b by executing a predetermined program, whereby the first and second sensors 34a and 34b are measured. Get the sensitivity value as the measured value.
- the operation control unit 35 determines that the finger is in the groove 33 when the obtained sensitivity value is equal to or greater than a predetermined value. Conversely, if the obtained sensitivity value is less than the predetermined value, the operation control unit 35 determines that there is no finger in the groove 33.
- The operation control unit 35 determines that there has been a finger slide when the driver's finger is detected by both the first and second sensors 34a and 34b, in the order of the first sensor 34a and then the second sensor 34b. That is, when the operation control unit 35 determines that a finger is present from the signal of the first sensor 34a and then determines that a finger is present from the signal of the second sensor 34b, it determines that the driver has slid the finger from one end side of the groove 33 (the side far from the operation surface 32) toward the other end side (the side close to the operation surface 32).
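The ordering rule above can be sketched as a check over time-ordered sensor readings: both sensors must see the finger, and the far-side sensor must see it first. The threshold value and the sampling scheme in this Python sketch are illustrative assumptions.

```python
def detect_slide(samples, threshold=50):
    """Return True when a finger slide along the groove is recognized.

    samples is a time-ordered list of (far_value, near_value) sensitivity
    readings from the far-side sensor (34a) and near-side sensor (34b).
    A slide is recognized when both sensors detect the finger and the
    far-side sensor detects it first, mirroring the ordering rule above.
    """
    t_far = t_near = None
    for t, (far, near) in enumerate(samples):
        if t_far is None and far >= threshold:
            t_far = t    # first time the far-side sensor sees the finger
        if t_near is None and near >= threshold:
            t_near = t   # first time the near-side sensor sees the finger
    return t_far is not None and t_near is not None and t_far < t_near


# Finger traced from the far end toward the operation surface: slide.
assert detect_slide([(60, 0), (70, 10), (20, 65), (0, 80)]) is True
# Finger moving the other way, or touching only one sensor: no slide.
assert detect_slide([(0, 60), (10, 70), (65, 20)]) is False
assert detect_slide([(60, 0), (70, 5)]) is False
```

Requiring this specific order is what distinguishes a deliberate unlocking gesture from a hand merely resting over the groove.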
- The navigation device 50 has an air-conditioning operation setting function for the vehicle air conditioner, an audio operation setting function for the vehicle audio, and a browsing function for retrieving various information via the Internet.
- the navigation device 50 is connected to the CAN bus 90 so as to be able to communicate with the remote operation device 100 and the like, and includes a display control unit 51 and a liquid crystal display 52.
- the display control unit 51 includes a processor that performs various arithmetic processes, a RAM that functions as a work area for arithmetic processes, a graphic processor that performs image drawing processes, a graphic RAM that functions as a work area for drawing processes, and the like.
- The display control unit 51 also has a flash memory for storing data used for arithmetic and drawing processing, a communication interface connected to the CAN bus 90, and a video output interface for outputting drawn image data to the liquid crystal display 52.
- the display control unit 51 draws a display image 60 to be displayed on the display screen 53 based on information acquired from the CAN bus 90.
- the display control unit 51 sequentially outputs the image data of the drawn display image 60 to the liquid crystal display 52 through the video output interface.
- the liquid crystal display 52 is a dot matrix type display unit that realizes color display by controlling a plurality of pixels arranged on the display screen 53.
- the liquid crystal display 52 displays video by continuously forming image data sequentially acquired from the display control unit 51 on the display screen 53.
- the display image 60 displayed on the display screen 53 corresponds to an example of the image of the present disclosure, and is composed of images divided into a plurality of layers.
- the image of the first layer is a plurality of main images for using various functions (navigation, air conditioner, audio, Internet, etc.) of the navigation device 50.
- a main image for an air conditioner is shown as one main image.
- a menu 61 is provided in which a plurality of types (names) of main images are used as menu items and these menu items are arranged in the left-right direction.
- the menu 61 is always displayed in the same form regardless of the hierarchy of the display image 60.
- When the operator selects a desired menu item from the menu 61 by a finger operation, the selected main image is displayed on the display screen 53.
- While any main image is displayed, the operator can also scroll through the main images sequentially by sliding the finger on the operation surface 32, without operating the menu 61.
- the display image 60 is provided with a plurality of icons 62 for operating the image.
- an air volume setting icon, a temperature setting icon, a dual setting icon, a blowing mode setting icon, etc. in the air conditioner operation are shown.
- a pointer 63 is displayed on the display image 60.
- the pointer 63 indicates the position of the finger on the operation surface 32 so as to correspond to the display image 60.
- the pointer 63 indicates the position of the index finger when the finger operation is performed.
- the pointer 63 may be a pointer having a basic design. The pointer 63 is displayed on the display image 60 when the finger's operation state is in contact or in proximity.
- a frame-like focus 64 indicating that the icon 62 has been selected is displayed.
- when the finger lightly presses the operation surface 32 at the position corresponding to the selected icon 62 (touch operation), the icon 62 is confirmed, the display transitions to the image of the second layer, that is, the operation image corresponding to the confirmed icon, and various functions can be used in sequence.
- the display image 60 also displays lock information 65 (FIG. 9) indicating that input is prohibited.
- in step S100, the operation control unit 35 determines, from the detection signals of the first and second sensors 34a and 34b in the groove 33, whether or not the driver has slid (traced) a finger from one end side (point A in FIG. 9) toward the other end side (point B in FIG. 9).
- the operation control unit 35 determines that there is a finger slide in the groove 33 when a sensitivity value equal to or higher than a predetermined value is obtained in the order of the first sensor 34a and then the second sensor 34b; otherwise, it determines that there is no slide.
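The ordering rule above (a slide counts only when the far-side sensor fires before the near-side sensor) can be sketched as follows. This is a minimal illustration, not the patent's implementation; the threshold value and the event representation are assumptions.

```python
SLIDE_THRESHOLD = 100  # hypothetical sensitivity value; the patent only says "a predetermined value"

def detect_slide(events):
    """Return True if sensor 'A' (far side, 34a) exceeded the threshold
    before sensor 'B' (near side, 34b), i.e. a point-A-to-point-B slide.

    `events` is a time-ordered list of (sensor_id, sensitivity) tuples.
    """
    seen_a = False
    for sensor_id, sensitivity in events:
        if sensitivity < SLIDE_THRESHOLD:
            continue  # below the predetermined value: finger not present at this sensor
        if sensor_id == "A":
            seen_a = True
        elif sensor_id == "B" and seen_a:
            return True  # B fired after A: slide from point A toward point B
    return False
```

For example, `detect_slide([("A", 150), ("B", 150)])` recognizes a slide, while the reverse order does not.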
- if NO in step S100, that is, if it is determined that there is no finger slide, the operation control unit 35 assumes that the input-prohibited (locked) state set by step S240, described later, is continuing, and in step S110 displays on the display image 60 that input is currently locked so that the driver recognizes it. Specifically, lock information (LOCKED) 65 is displayed on the display image 60 as shown in the left frame of FIG. 9.
- in step S120, the operation control unit 35 flashes (blinks) the lock information 65 to draw the driver's attention.
- in step S140, the operation control unit 35 cancels the input prohibition (lock) by the finger operation. As a result, finger input operations on the operation surface 32 are enabled.
- an affirmative determination in step S100 means that the driver has slid a finger along the groove 33 from the point A side to the point B side, so that the driver's finger naturally ends up in the region of the operation surface 32.
- in step S150, the operation control unit 35 performs an acquisition process to obtain the sensitivity values detected by each electrode of the touch sensor 31, and proceeds to step S160.
- in step S160, the x and y coordinates indicating the relative position of the finger with respect to the operation surface 32 and the z coordinate indicating the operation distance are calculated from the sensitivity values acquired in step S150. Then, from the calculated z coordinate, it is determined whether the finger's operation state is in contact, in proximity, or non-contact.
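The patent does not spell out how the x, y, and z values are computed from the per-electrode sensitivity values; one common approach for capacitive grids, shown here purely as an assumed sketch, is a sensitivity-weighted centroid for x and y with the peak sensitivity standing in for proximity (z):

```python
def finger_position(sensitivities):
    """Estimate (x, y, z) from a 2-D grid of electrode sensitivity values.

    x, y: sensitivity-weighted centroid over the electrode grid.
    z   : peak sensitivity, which grows as the finger nears the surface,
          so it serves as an (inverse) stand-in for operation distance.
    """
    total = sum(v for row in sensitivities for v in row)
    if total == 0:
        return None  # no finger detected anywhere
    x = sum(c * v for row in sensitivities for c, v in enumerate(row)) / total
    y = sum(r * v for r, row in enumerate(sensitivities) for v in row) / total
    z = max(v for row in sensitivities for v in row)
    return (x, y, z)
```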
- in step S170, it is determined whether the calculated finger operation state is not in contact and the sensitivity value is Hth1 or more (in practice, the upper threshold Hth1U or more). If the determination is affirmative, the operator's finger has approached the operation surface 32 from a non-contact or proximity state and come into contact with it, so in step S180 the operation state is updated to in contact.
- the operation control unit 35 updates the display screen 53 to the contact screen.
- the contact screen is a screen on which the original menu 61, icon 62, pointer 63, and focus 64 are displayed on the display image 60, as shown in FIG. 9 (in the right frame).
- the focus 64 indicates the current operation state (operation setting state) of the device corresponding to the main image to be displayed.
- the operator can perform original screen operations, that is, selection and determination of each menu 61 and various icons 62 by finger operation (slide, touch operation, etc.).
- in step S190, the operation control unit 35 determines, as a first condition, whether the operation state is non-contact and the sensitivity value is Hth2 or more (in practice, the upper threshold Hth2U or more), and, as a second condition, whether the operation state is in contact and the sensitivity value is Hth1 or less (in practice, the lower threshold Hth1D or less). If either determination is affirmative, the operator's finger has approached the operation surface 32 from non-contact or moved slightly away from it during contact, so in step S200 the operation state is updated to in proximity.
- the operation control unit 35 updates the display screen 53 to the proximity screen.
- the proximity screen is a screen on which the pointer 63 and the focus 64 are not displayed, relative to the above-mentioned contact screen.
- the operator can switch the main image by the menu 61 by a finger operation (gesture such as flick) during proximity.
- in step S210, the operation control unit 35 determines whether or not the operation state is non-contact and the sensitivity value is Hth2 or less (in practice, the lower threshold Hth2D or less). If the determination is affirmative, the operator's finger has moved far away from the operation surface 32 from the contact or proximity state, and in step S220 the operation state is updated to non-contact.
- the operation control unit 35 updates the display screen 53 to a non-contact screen.
- the non-contact screen is a screen on which the pointer 63 is not displayed in the above-described contact screen.
- the focus 64 indicates the current operation state (operation setting state) of the device corresponding to the displayed main image.
- when the operator's finger is clearly separated from the operation surface 32, the operator has no intention of operating the display image 60, and the display image 60 can be regarded simply as a confirmation screen for checking the current operating state of the device.
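Steps S170 to S220 amount to a three-state machine with hysteresis: each boundary has an upper and a lower threshold (Hth1U/Hth1D around Hth1, Hth2U/Hth2D around Hth2) so the state does not flicker when the sensitivity hovers near a boundary. A minimal sketch, with placeholder threshold numbers since the patent gives none:

```python
# Placeholder threshold values; the patent only names them (Hth1U > Hth1D > Hth2U > Hth2D).
HTH1_UPPER, HTH1_LOWER = 200, 180   # around Hth1 (contact boundary)
HTH2_UPPER, HTH2_LOWER = 100, 80    # around Hth2 (proximity boundary)

def next_state(state, sensitivity):
    """Update the finger operation state ('contact'/'proximity'/'non-contact')
    from the current sensitivity value, with hysteresis so the state does not
    chatter when the value hovers near a boundary (steps S170-S220)."""
    if state != "contact" and sensitivity >= HTH1_UPPER:
        return "contact"                      # S170 affirmative -> S180
    if (state == "non-contact" and sensitivity >= HTH2_UPPER) or \
       (state == "contact" and sensitivity <= HTH1_LOWER):
        return "proximity"                    # S190 affirmative -> S200
    if state != "non-contact" and sensitivity <= HTH2_LOWER:
        return "non-contact"                  # S210 affirmative -> S220
    return state                              # no boundary crossed: keep state
```

Because the lower thresholds sit below the upper ones, a value drifting slightly below Hth1U does not immediately drop the state out of contact; it must fall below Hth1D first.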
- in step S230, the operation control unit 35 determines, based on the sensitivity value of the touch sensor 31, whether or not a state in which the driver performs no finger operation on the operation surface 32 has continued for a predetermined time or more. If an affirmative determination is made in step S230, the operation control unit 35 prohibits (locks) input by finger operation on the operation surface 32 in step S240. If a negative determination is made, step S230 is executed repeatedly.
- as described above, the operation control unit 35 prohibits input by finger operation on the operation surface 32 when the driver's finger operation on the operation surface 32 has not been performed for a predetermined time or longer (step S240). Therefore, even if the driver carelessly brings a finger close to the operation surface 32, no input for operating the display image 60 is executed, so inputs unintended by the driver can be prevented.
- when the detection unit detects the finger in the groove 33, the operation control unit 35 determines that the finger has slid in the groove 33 and cancels the input prohibition (step S140). Therefore, by sliding a finger along the groove 33, the driver can again perform input by finger operation on the operation surface 32; in this case, the finger operation is one intended by the driver.
- when input is prohibited (step S240), if the operation control unit 35 determines that there is no finger slide in the groove 33 (negative determination in step S100) even though the driver performs a finger operation on the operation surface 32, it displays the lock information 65 on the display image 60 (liquid crystal display 52) to indicate that input is prohibited (step S110). Thus, the driver can clearly understand that input operations cannot be performed, and is not confused when performing finger operations.
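Putting the pieces together, the lock behavior described above (idle timeout in S230/S240, unlock by groove slide in S100/S140, lock display in S110) could be modeled as a small controller. The timeout value, method names, and return strings are illustrative assumptions:

```python
import time

class InputLockController:
    """Sketch of the lock/unlock behavior: input on the operation surface is
    prohibited after a period of inactivity and re-enabled only by a slide
    in the groove. The 10-second timeout is a placeholder for the patent's
    unspecified 'predetermined time'."""

    IDLE_TIMEOUT = 10.0  # seconds (hypothetical)

    def __init__(self, clock=time.monotonic):
        self.clock = clock
        self.locked = False
        self.last_activity = clock()

    def on_surface_operation(self):
        """Finger operation on the operation surface 32."""
        if self.locked:
            return "show_lock_info"     # S110: display LOCKED, ignore the input
        self.last_activity = self.clock()
        return "accept_input"

    def on_groove_slide(self):
        """Finger slide detected in the groove 33 (affirmative step S100)."""
        self.locked = False             # S140: cancel the input prohibition
        self.last_activity = self.clock()

    def tick(self):
        """Periodic inactivity check corresponding to step S230."""
        if not self.locked and self.clock() - self.last_activity >= self.IDLE_TIMEOUT:
            self.locked = True          # S240: prohibit input
```

Injecting the clock makes the timeout behavior easy to exercise without waiting in real time.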
- a plurality of detection units are provided for detecting a finger slide in the groove 33. When both (all) of the first and second sensors 34a and 34b detect the driver's finger, the operation control unit 35 determines that the finger has slid. This improves the detection accuracy of a finger slide, so erroneous determinations can be reduced.
- the groove 33 is disposed on the same surface as the operation surface 32 so that its longitudinal direction faces the operation surface 32, and the operation control unit 35 determines that there is a finger slide when, of the first and second sensors 34a and 34b, the second sensor 34b on the near side (point B side) relative to the operation surface 32 detects the finger after the first sensor 34a on the far side (point A side).
- thereby, after the driver slides a finger along the groove 33 from the point A side to the point B side, a continuous finger operation on the operation surface 32 becomes possible.
- the first and second sensors 34a and 34b in the groove 33 are formed integrally with the touch sensor 31 of the operation surface 32. Thereby, the touch sensor 31 and the first and second sensors 34a and 34b can be formed easily without providing dedicated sensors, which keeps cost low.
- because of the shape of the groove 33, which serves as the operation part for canceling the input prohibition, multiple fingers, a palm, or the like cannot be inserted into the groove 33, so erroneous operations can be prevented. Furthermore, by performing a finger slide along the groove 33, a blind operation during driving can be performed without directly viewing the operation surface 32 or the display screen 53 of the liquid crystal display 52.
- a remote operation device 100A of the second embodiment is shown in FIGS. 10 to 13.
- in the second embodiment, input processing of the display image 60 targeting both the driver in the driver's seat and the passenger in the passenger seat as operators is added to the first embodiment.
- the remote operation device 100A is provided with a plurality of (here, two) grooves arranged in the left-right direction, namely a driver seat groove 33a and a passenger seat groove 33b.
- the driver seat groove 33a is located on the driver's seat side (right side) of the remote operation device 100A and is a groove associated with the driver. As in the first embodiment, the driver seat groove 33a is provided with first and second sensors 34a1 and 34b1. The signals detected by the first and second sensors 34a1, 34b1 are output to the operation control unit 35 as signals indicating that the driver's finger is in the driver's seat groove 33a.
- the passenger seat groove portion 33b is located on the passenger seat side (left side) of the remote operation device 100A and is a groove portion associated with the passenger seat passenger.
- first and second sensors 34a2 and 34b2 are provided in the passenger seat groove 33b. The signals detected by the first and second sensors 34a2, 34b2 are output to the operation control unit 35 as signals indicating that the passenger's finger is in the passenger seat groove 33b.
- the operation control unit 35 can determine, from the signals of the first and second sensors 34a1 and 34b1 or the first and second sensors 34a2 and 34b2, whether the operator at each of the grooves 33a and 33b is the driver or the passenger.
- steps S141, S142, and S143, which determine the operator and set the screen display according to the determination result, are added between step S140 and step S150.
- the operation control unit 35 first determines in step S100 whether or not the driver's or passenger's finger has slid in the grooves 33a and 33b; if the determination is negative, steps S110 to S130 are executed, and if affirmative, step S140 is executed.
- in step S141, the operation control unit 35 determines whether the operator in step S100 is the driver or the passenger from the output signals of the first and second sensors 34a1 and 34b1 or the first and second sensors 34a2 and 34b2. Specifically, when the operation control unit 35 determines from the output signals of the first and second sensors 34a1 and 34b1 that there is a finger slide in the driver seat groove 33a, it determines that the operator is the driver. Likewise, when it determines from the output signals of the first and second sensors 34a2 and 34b2 that there is a finger slide in the passenger seat groove 33b, it determines that the operator is the passenger. Note that "D seat" in FIG. 12 means the driver seat, and "P seat" means the passenger seat.
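The operator determination in step S141 reduces to a lookup from the sensor pair that reported the slide to the associated seat; the identifiers below simply mirror the reference numerals in the text and are otherwise illustrative:

```python
# Map each groove's sensor pair to the seat it is associated with.
# Sensor identifiers mirror the reference numerals in the description.
GROOVE_TO_SEAT = {
    ("34a1", "34b1"): "driver",      # driver seat groove 33a
    ("34a2", "34b2"): "passenger",   # passenger seat groove 33b
}

def identify_operator(sensor_pair):
    """Return which operator slid a finger, given the sensor pair that
    detected the slide (step S141)."""
    return GROOVE_TO_SEAT.get(tuple(sensor_pair), "unknown")
```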
- step S142 the operation control unit 35 performs screen display settings for the driver. That is, the original display image 60 is switched to the driver exclusive image 60a.
- the driver-specific image 60a is an image on which various icons 62, a pointer 63, and a focus 64 are displayed, as shown in the right frame of FIG. Then, in the driver-specific image 60a, instead of the menu 61, a driver-specific display 60a1 indicating that this screen is a driver-specific screen is additionally displayed.
- the external shape of the driver-specific image 60a is changed so as to face the driver side. That is, in the driver-specific image 60a, the originally horizontally long rectangular outline of the display image 60 becomes a parallelogram-like outline.
- the parallelogram outline is a shape in which the bottom side is shifted toward the driver side (right side), and the left and right sides incline toward the driver side from top to bottom. That is, the driver-specific image 60a lets the driver intuitively recognize that the original display image 60 is an image directed toward the driver.
- step S143 the operation control unit 35 performs screen display settings for the passenger seat. That is, the original display image 60 is switched to the passenger seat exclusive image 60b.
- the passenger seat exclusive image 60b is an image in which various icons 62, a pointer 63, and a focus 64 are displayed as shown in the left frame of FIG.
- a passenger seat dedicated display 60b1 indicating that this screen is a passenger seat dedicated screen is additionally displayed in the passenger seat dedicated image 60b.
- the external shape of the passenger seat exclusive image 60b is changed so as to face the passenger seat side.
- that is, the originally horizontally long rectangular outline of the display image 60 becomes a parallelogram-like outline.
- the parallelogram outline is a shape in which the bottom side is shifted toward the passenger seat side (left side), and the left and right sides incline toward the passenger seat side from top to bottom. That is, the passenger seat exclusive image 60b lets the passenger intuitively recognize that the original display image 60 is an image directed toward the passenger seat side.
- after step S142 or step S143, the operation control unit 35 executes steps S150 to S240 as in the first embodiment, performs input control according to the operation state of the operator's finger, and prohibits input to the operation surface 32 when a state without finger operation continues for a predetermined time or longer.
- as described above, in the second embodiment, the targeted operators are the driver and the passenger, and a plurality of grooves, namely the driver seat groove 33a associated with the driver and the passenger seat groove 33b associated with the passenger, are provided.
- the operation control unit 35 switches the display image 60 to either the driver-specific image 60a or the passenger seat exclusive image 60b so as to correspond to whichever of the grooves 33a and 33b is determined to have a finger slide, and displays it on the liquid crystal display 52.
- thereby, the operator can confirm that the current display image (the driver-specific image 60a or the passenger seat exclusive image 60b) is based on the operator's own input.
- a remote operation device 100B of the third embodiment is shown in FIGS. 14 to 17.
- in the third embodiment, a plurality of grooves are provided in place of the single groove of the first embodiment, and a function for selecting the image to be displayed by a finger operation on each groove is added.
- the remote operation device 100B is provided with a plurality of (here, three) grooves arranged in the left-right direction, namely a navigation groove 331, an audio groove 332, and an air conditioner groove 333.
- the navigation groove 331 is located on the left side of the remote operation device 100B, and is a groove associated with, for example, a navigation screen display in the first hierarchy. As in the first embodiment, the navigation groove 331 is provided with first and second sensors 34aA and 34bA. The signals detected by the first and second sensors 34aA and 34bA are output to the operation control unit 35 as signals indicating that the operator's finger is in the navigation groove 331.
- the audio groove 332 is located at the center in the left-right direction of the remote operation device 100B, and is, for example, a groove associated with the audio screen display in the first layer.
- the audio groove 332 is provided with first and second sensors 34aB and 34bB. The signals detected by the first and second sensors 34aB and 34bB are output to the operation control unit 35 as signals indicating that the operator's finger is in the audio groove 332.
- the air conditioner groove 333 is located on the right side of the remote operation device 100B, and is a groove associated with, for example, an air conditioner screen display in the first hierarchy.
- the air conditioner groove 333 is provided with first and second sensors 34aC and 34bC, as in the first embodiment.
- the signals detected by the first and second sensors 34aC and 34bC are output to the operation control unit 35 as signals indicating that the operator's finger is in the air conditioner groove 333.
- the operation control unit 35 can determine which groove the operator has operated (slid) based on the signals from the first and second sensors 34aA and 34bA, the first and second sensors 34aB and 34bB, or the first and second sensors 34aC and 34bC.
- steps S145, S146, S147, and S148, which set the screen display according to the determination result, are added between step S140 and step S150.
- the operation control unit 35 first determines in step S100 whether or not the operator's finger has slid in the grooves 331, 332, and 333; if the determination is negative, steps S110 to S130 are executed, and if affirmative, step S140 is executed.
- in step S145, the operation control unit 35 determines in which groove the operator slid a finger, from the output signals in step S100 of the first and second sensors 34aA and 34bA, the first and second sensors 34aB and 34bB, or the first and second sensors 34aC and 34bC.
- the operation control unit 35 determines that there is a finger slide in the navigation groove 331 based on the output signals from the first and second sensors 34aA and 34bA. Further, the operation control unit 35 determines that there is a finger slide in the audio groove 332 based on the output signals from the first and second sensors 34aB and 34bB. Further, the operation control unit 35 determines that there is a finger slide in the air conditioner groove 333 based on the output signals from the first and second sensors 34aC and 34bC.
- if it is determined in step S145 that the finger slid in the navigation groove 331, the operation control unit 35 performs screen display setting for switching the original display image 60 to the navigation image 601 in step S146.
- the navigation image 601 is an image corresponding to "Map" in the menu 61, showing a map of the vehicle's current surroundings, the position of the vehicle on this map, and the like.
- if it is determined in step S145 that the finger slid in the audio groove 332, the operation control unit 35 performs screen display setting for switching the original display image 60 to the audio image 602 in step S147.
- the audio image 602 is an image corresponding to "Media" in the menu 61, displayed with icons that allow a desired audio device to be selected and music, video, and the like to be reproduced.
- if it is determined in step S145 that the finger slid in the air conditioner groove 333, the operation control unit 35 performs screen display setting for switching the original display image 60 to the air conditioner image 603 in step S148.
- the air conditioner image 603 (right frame in FIG. 17) is an image for operating the air conditioner described in the first and second embodiments.
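Likewise, the image selection of steps S145 to S148 is essentially a mapping from the groove that reported a finger slide to the screen to display; the identifiers below are illustrative:

```python
# Which image each groove selects when a slide is detected in it (step S145).
GROOVE_TO_IMAGE = {
    "331": "navigation_image_601",       # navigation groove -> map screen (S146)
    "332": "audio_image_602",            # audio groove -> media screen (S147)
    "333": "air_conditioner_image_603",  # air conditioner groove (S148)
}

def select_image(groove_id, current_image):
    """Return the image to display after a slide in `groove_id`; keep the
    current image if the groove is unknown."""
    return GROOVE_TO_IMAGE.get(groove_id, current_image)
```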
- after step S146, step S147, or step S148, the operation control unit 35 executes steps S150 to S240 as in the first embodiment, performs input control according to the operation state of the operator's finger, and prohibits input to the operation surface 32 when a state without finger operation continues for a predetermined time or longer.
- a plurality of groove portions (navigation groove portion 331, audio groove portion 332, and air conditioner groove portion 333) corresponding to a plurality of images in a predetermined hierarchy are provided as groove portions.
- when canceling the input prohibition, the operation control unit 35 switches the display image 60 to the navigation image 601, the audio image 602, or the air conditioner image 603 so as to correspond to whichever of the navigation groove 331, the audio groove 332, and the air conditioner groove 333 is determined to have a finger slide, and displays it on the liquid crystal display 52.
- thereby, the operator can cancel the input prohibition on the operation surface 32 while simultaneously selecting the desired display image 60, improving operability.
- moreover, since the operator can select a desired display image 60 by sliding a finger in one of the grooves 331, 332, and 333, an image can be selected by a blind operation.
- in each of the above embodiments, the lock information 65 is flashed (blinked) in step S120 after step S110; however, if displaying the lock information 65 by itself suffices to alert the operator, step S120 may be omitted.
- alternatively, the return to the normal image corresponding to step S130 may be performed at the point when the input prohibition is canceled in step S140 after an affirmative determination in step S100.
- in the above embodiments, first and second sensors are provided for each groove, but a single sensor may be used for each groove.
- the direction in which the finger is slid may be from either end of the groove (the point A side or the point B side).
- if the sliding direction from the point A side to the point B side is determined in advance as the operation procedure, a continuous operation from each groove onto the operation surface 32 becomes possible.
- in the above embodiments, the operation surface 32 and each groove are provided on the same surface, with the longitudinal direction of the groove facing into the operation surface 32; however, the groove may instead be provided above the operation surface 32 with a step.
- although the first and second sensors are provided integrally with the touch sensor 31, they may instead be provided as dedicated sensors.
- a predetermined input function (for example, canceling the current input operation, returning to the main screen of the first layer, or shifting to a setting menu) may be assigned to sliding fingers in two or more of the plurality of grooves simultaneously.
- although the capacitive touch sensor 31 is used as the detection unit (detection means), the present disclosure is not limited to this, and other sensors such as pressure-sensitive touch sensors may be used.
- further, a push switch may be provided in the remote operation devices 100, 100A, and 100B, and the content (an icon or the like) selected by the finger operation may be confirmed by pressing this push switch.
Abstract
Description
The first embodiment (FIGS. 1 to 9) applies the input device of the present disclosure to a remote operation device 100 for operating a navigation device 50. The remote operation device 100 is mounted on a vehicle and, together with the navigation device 50 and the like, constitutes a display system 10.
The remote operation device 100A of the second embodiment is shown in FIGS. 10 to 13. In the second embodiment, input processing of the display image 60 targeting the driver in the driver's seat and the passenger in the passenger seat as operators is added to the first embodiment.
The remote operation device 100B of the third embodiment is shown in FIGS. 14 to 17. In the third embodiment, a plurality of grooves (a navigation groove 331, an audio groove 332, and an air conditioner groove 333) are provided as the groove portion, and a function of selecting the image to be displayed (for example, one of the plurality of main images in the first layer among the plurality of layers) by a finger operation on each groove is added to the first embodiment.
In each of the above embodiments, the lock information 65 is flashed (blinked) in step S120 after step S110; however, if displaying the lock information 65 by itself suffices to alert the operator, step S120 may be omitted.
Claims (8)
- An input device mounted on a vehicle and formed separately from a display unit (52) that switchably displays an image (60) divided into a plurality of layers, in which input for operating the image (60) is performed by a user's finger operation on an operation surface (32), the input device comprising: a groove (33) into whose recess the user's fingertip is inserted and which allows the finger to slide in the longitudinal direction along which the recess continues; a detection unit (34a, 34b) that detects the presence or absence of the finger in the groove (33); and a control unit (35) that prohibits input by the finger operation on the operation surface (32) when the finger operation on the operation surface (32) has not been performed for a predetermined time or more, and that, when the detection unit (34a, 34b) detects the finger in the groove (33), determines that the finger has slid in the groove (33) and cancels the prohibition of input.
- The input device according to claim 1, wherein, while prohibiting input, the control unit (35) causes the display unit (52) to display that input is prohibited when it determines that there is no finger slide in the groove (33) even though the user performs a finger operation on the operation surface (32).
- The input device according to claim 1 or 2, wherein a plurality of the detection units (34a, 34b) are provided along the longitudinal direction of the groove (33), and the control unit (35) determines that the finger has slid when all of the plurality of detection units (34a, 34b) detect the presence of the finger.
- The input device according to claim 3, wherein the groove (33) is disposed on the same surface as the operation surface (32) with its longitudinal direction facing into the operation surface (32), and the control unit (35) determines that the finger has slid when, of the plurality of detection units (34a, 34b), the detection unit (34b) on the side near the operation surface (32) detects the presence of the finger after the detection unit (34a) on the far side.
- The input device according to claim 1 or 2, wherein the groove (33) is disposed on the same surface as the operation surface (32) with its longitudinal direction facing into the operation surface (32).
- The input device according to any one of claims 1 to 5, wherein the operation surface (32) is provided with a touch sensor (31) that detects the position of the finger on the operation surface (32), and the detection units (34a, 34b) are formed integrally with the touch sensor (31).
- The input device according to any one of claims 1 to 6, wherein the user includes a driver and a front passenger of the vehicle, a plurality of the grooves (33) are provided, the plurality of grooves (33a, 33b) are associated with the driver and the front passenger respectively, and the control unit (35) causes the display unit (52) to display a dedicated image (60a, 60b) for the driver or for the front passenger corresponding to whichever of the plurality of grooves (33a, 33b) is determined to have a finger slide.
- The input device according to any one of claims 1 to 6, wherein a plurality of the grooves (33) are provided, the plurality of grooves (331, 332, 333) are respectively associated with the images (601, 602, 603) divided into a plurality of layers, and the control unit (35) causes the display unit (52) to display the image (601, 602, 603) corresponding to whichever of the plurality of grooves (331, 332, 333) is determined to have a finger slide.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE112014001851.1T DE112014001851T5 (de) | 2013-04-03 | 2014-03-27 | Eingabevorrichtung |
US14/780,984 US9778764B2 (en) | 2013-04-03 | 2014-03-27 | Input device |
CN201480018825.4A CN105121228B (zh) | 2013-04-03 | 2014-03-27 | 输入装置 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013-077849 | 2013-04-03 | ||
JP2013077849A JP5928397B2 (ja) | 2013-04-03 | 2013-04-03 | 入力装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014162699A1 true WO2014162699A1 (ja) | 2014-10-09 |
Family
ID=51658013
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/001777 WO2014162699A1 (ja) | 2013-04-03 | 2014-03-27 | 入力装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US9778764B2 (ja) |
JP (1) | JP5928397B2 (ja) |
CN (1) | CN105121228B (ja) |
DE (1) | DE112014001851T5 (ja) |
WO (1) | WO2014162699A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9594466B2 (en) | 2013-04-02 | 2017-03-14 | Denso Corporation | Input device |
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016086343A1 (en) * | 2014-12-01 | 2016-06-09 | Xiamen Zkteco Biometric Identification Technology Co., Ltd | System and method for personal identification based on multimodal biometric information |
WO2016086341A1 (en) * | 2014-12-01 | 2016-06-09 | Dongguan Zkteco Electronic Technology Co., Ltd | System and method for acquiring multimodal biometric information |
JP6473610B2 (ja) * | 2014-12-08 | 2019-02-20 | 株式会社デンソーテン | 操作装置、及び、操作システム |
US10429930B2 (en) * | 2015-01-30 | 2019-10-01 | Ncr Corporation | Accessible tactile features for blind and partially sighted PIN entry and touchscreen usage |
JP6747941B2 (ja) * | 2016-11-04 | 2020-08-26 | アルパイン株式会社 | タッチ式入力装置および操作検出方法 |
CA3049120A1 (en) * | 2017-01-03 | 2018-07-12 | Brilliant Home Technology, Inc. | Home device controller with a touch control groove |
JP2018169756A (ja) * | 2017-03-29 | 2018-11-01 | 富士フイルム株式会社 | タッチ式操作システムとその作動方法および作動プログラム |
JP2018169755A (ja) * | 2017-03-29 | 2018-11-01 | 富士フイルム株式会社 | タッチ式操作装置とその作動方法および作動プログラム |
KR20190041632A (ko) * | 2017-10-13 | 2019-04-23 | 현대자동차주식회사 | 차량 및 그 제어 방법 |
WO2019111515A1 (ja) * | 2017-12-08 | 2019-06-13 | パナソニックIpマネジメント株式会社 | 入力装置、及び、入力方法 |
USD944216S1 (en) | 2018-01-08 | 2022-02-22 | Brilliant Home Technology, Inc. | Control panel with sensor area |
DE102018001200B3 (de) * | 2018-02-14 | 2019-04-25 | Daimler Ag | Verfahren und Vorrichtung zur Bestimmung einer Nutzereingabe |
CN108762051A (zh) * | 2018-04-08 | 2018-11-06 | 天芯智能(深圳)股份有限公司 | 一种指针式智能手表的控制方法及装置 |
GB2577480B (en) * | 2018-09-11 | 2022-09-07 | Ge Aviat Systems Ltd | Touch screen display assembly and method of operating vehicle having same |
USD945973S1 (en) | 2019-09-04 | 2022-03-15 | Brilliant Home Technology, Inc. | Touch control panel with moveable shutter |
JP6806223B1 (ja) * | 2019-12-06 | 2021-01-06 | トヨタ自動車株式会社 | 表示制御装置、車両、表示の制御方法及びプログラム |
US11715943B2 (en) | 2020-01-05 | 2023-08-01 | Brilliant Home Technology, Inc. | Faceplate for multi-sensor control device |
US20220297704A1 (en) * | 2021-03-18 | 2022-09-22 | Wind River Systems, Inc. | Detecting Vehicle Infotainment User Position |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH02240715A (ja) * | 1989-03-15 | 1990-09-25 | Matsushita Electric Ind Co Ltd | 接触位置検出装置 |
JP2000353269A (ja) * | 1999-06-11 | 2000-12-19 | Nec Kofu Ltd | 現金自動預金支払機 |
JP2008541222A (ja) * | 2005-06-02 | 2008-11-20 | サムスン エレクトロニクス カンパニー リミテッド | ユーザ命令を3次元的に入力可能な電子装置 |
JP2010013081A (ja) * | 2008-07-01 | 2010-01-21 | Dong-A Univ Research Foundation For Industry-Academy Cooperation | タッチパッドモジュールを利用した自動車の機能制御装置及び方法 |
JP2010061256A (ja) * | 2008-09-02 | 2010-03-18 | Alpine Electronics Inc | 表示装置 |
Family Cites Families (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090273574A1 (en) * | 1995-06-29 | 2009-11-05 | Pryor Timothy R | Programmable tactile touch screen displays and man-machine interfaces for improved vehicle instrumentation and telematics |
JPH1115599A (ja) | 1997-06-26 | 1999-01-22 | Tokai Rika Co Ltd | タッチ操作情報出力装置 |
JP3792920B2 (ja) | 1998-12-25 | 2006-07-05 | 株式会社東海理化電機製作所 | タッチ操作入力装置 |
JP2006011614A (ja) * | 2004-06-23 | 2006-01-12 | Sharp Corp | 指紋認識機能付き指輪、指紋認識装置、およびこれらを用いた情報処理システム |
US8926535B2 (en) * | 2006-09-14 | 2015-01-06 | Martin B. Rawls-Meehan | Adjustable bed position control |
US8676007B2 (en) * | 2008-06-19 | 2014-03-18 | Neonode Inc. | Light-based touch surface with curved borders and sloping bezel |
JP5364925B2 (ja) * | 2009-02-27 | 2013-12-11 | Hyundai Motor Co. | Input device for in-vehicle equipment
JP5455557B2 (ja) | 2009-10-27 | 2014-03-26 | Kyocera Corp. | Mobile terminal device
KR101092722B1 (ko) | 2009-12-02 | 2011-12-09 | Hyundai Motor Co. | User interface device for operating a vehicle multimedia system
US20150309316A1 (en) * | 2011-04-06 | 2015-10-29 | Microsoft Technology Licensing, Llc | Ar glasses with predictive control of external device based on event input |
JP5638330B2 (ja) | 2010-09-28 | 2014-12-10 | Kyocera Corp. | Mobile terminal, program, and display control method
JP2012247890A (ja) | 2011-05-26 | 2012-12-13 | Nippon Seiki Co Ltd | Touch panel input operation device
JP5617783B2 (ja) * | 2011-07-17 | 2014-11-05 | Denso Corp. | Operation input device and control system for vehicle
JP5452566B2 (ja) * | 2011-10-31 | 2014-03-26 | Honda Motor Co Ltd | Vehicle input device
JP5812054B2 (ja) | 2012-08-23 | 2015-11-11 | Denso Corp. | Operating device
US9001035B2 (en) * | 2012-10-19 | 2015-04-07 | Red Bird Rising, Llc | Configured input display for communicating to computational apparatus |
US10185416B2 (en) * | 2012-11-20 | 2019-01-22 | Samsung Electronics Co., Ltd. | User gesture input to wearable electronic device involving movement of device |
US9477313B2 (en) * | 2012-11-20 | 2016-10-25 | Samsung Electronics Co., Ltd. | User gesture input to wearable electronic device involving outward-facing sensor of device |
US20140267282A1 (en) * | 2013-03-14 | 2014-09-18 | Robert Bosch Gmbh | System And Method For Context Dependent Level Of Detail Adjustment For Navigation Maps And Systems |
JP5858059B2 (ja) | 2013-04-02 | 2016-02-10 | Denso Corp. | Input device
JP2014203177A (ja) | 2013-04-02 | 2014-10-27 | Denso Corp. | Input device
- 2013
  - 2013-04-03 JP JP2013077849A patent/JP5928397B2/ja not_active Expired - Fee Related
- 2014
  - 2014-03-27 US US14/780,984 patent/US9778764B2/en not_active Expired - Fee Related
  - 2014-03-27 DE DE112014001851.1T patent/DE112014001851T5/de not_active Withdrawn
  - 2014-03-27 WO PCT/JP2014/001777 patent/WO2014162699A1/ja active Application Filing
  - 2014-03-27 CN CN201480018825.4A patent/CN105121228B/zh not_active Expired - Fee Related
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9594466B2 (en) | 2013-04-02 | 2017-03-14 | Denso Corporation | Input device |
Also Published As
Publication number | Publication date |
---|---|
CN105121228A (zh) | 2015-12-02 |
US20160054822A1 (en) | 2016-02-25 |
CN105121228B (zh) | 2017-03-15 |
JP5928397B2 (ja) | 2016-06-01 |
US9778764B2 (en) | 2017-10-03 |
DE112014001851T5 (de) | 2015-12-24 |
JP2014203206A (ja) | 2014-10-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5928397B2 (ja) | Input device | |
JP5812054B2 (ja) | Operating device | |
JP5858059B2 (ja) | Input device | |
JP6035828B2 (ja) | Display operation device and display system | |
JP6310787B2 (ja) | Vehicle input device and vehicle cockpit module | |
US10025402B2 (en) | Operating device for vehicle | |
US10137781B2 (en) | Input device | |
WO2014054208A1 (ja) | Operating device | |
WO2013153750A1 (ja) | Display system, display device, and operating device | |
JP5954145B2 (ja) | Input device | |
JP5754483B2 (ja) | Operating device | |
WO2016031152A1 (ja) | Vehicle input interface | |
WO2014162698A1 (ja) | Input device | |
EP3361367A1 (en) | In-vehicle input device, in-vehicle input system, and in-vehicle input device control method | |
WO2016031148A1 (ja) | Vehicle touchpad and vehicle input interface | |
JP2012063831A (ja) | Remote operation device | |
EP3352067A1 (en) | Vehicular input device and method of controlling vehicular input device | |
CN110554830 (zh) | Display device, display control method, and storage medium storing program | |
JP2015168268A (ja) | Vehicle operation device | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14779390 Country of ref document: EP Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase |
Ref document number: 14780984 Country of ref document: US |
WWE | Wipo information: entry into national phase |
Ref document number: 1120140018511 Country of ref document: DE
Ref document number: 112014001851 Country of ref document: DE
122 | Ep: pct application non-entry in european phase |
Ref document number: 14779390 Country of ref document: EP Kind code of ref document: A1 |