WO2020179362A1 - Control device, input device, and input control system - Google Patents

Control device, input device, and input control system

Info

Publication number
WO2020179362A1
Authority
WO
WIPO (PCT)
Prior art keywords
area
touch
input device
direct
operation area
Prior art date
Application number
PCT/JP2020/004987
Other languages
French (fr)
Japanese (ja)
Inventor
Shinobu Sasaki (しのぶ 佐々木)
Original Assignee
Tokai Rika Co., Ltd. (株式会社東海理化電機製作所)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tokai Rika Co., Ltd. (株式会社東海理化電機製作所)
Publication of WO2020179362A1 publication Critical patent/WO2020179362A1/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0354: Pointing devices with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • The present invention relates to a control device, an input device, and an input control system.
  • A touch pad is known in which a rib is provided on the boundary line between a first region for instructing a first function and a second region for instructing a second function on its upper surface (see, for example, Patent Document 1).
  • In this touch pad, the rib is formed on a PET (polyethylene terephthalate) resin sheet using UV (ultraviolet) ink or foaming ink. The rib allows the user to confirm the position of the boundary between the first region and the second region by the feel of the fingertip.
  • With the touch pad disclosed in Patent Document 1, for example, an operation on the first region may unintentionally extend into the second region, so that an unintended operation is accepted.
  • An object of the present invention is to provide a control device, an input device, and an input control system capable of suppressing the acceptance of unintended operations.
  • The control device according to one embodiment has a control unit that, when an operation on the first operation area is detected and an operation is subsequently detected on the second operation area, which is adjacent to the first operation area and has a different assigned function, does not accept the operation on the second operation area if it determines that the operation on the second operation area is an unintended operation.
  • FIG. 1A is an explanatory diagram showing the inside of a vehicle in which the input device according to the first embodiment is mounted.
  • FIG. 1B is a block diagram of the input device according to the first embodiment.
  • FIG. 2A is an explanatory diagram illustrating an example of a projection area of a head-up display that displays functions that can be controlled by operating the input device according to the first embodiment.
  • FIG. 2B is an explanatory view showing a steering wheel in which a touch pad of the input device according to the first embodiment is arranged.
  • FIG. 3A is an explanatory diagram showing a touch pad of the input device according to the first embodiment.
  • FIG. 3B is a cross-sectional view taken along line III(b)-III(b) of FIG. 3A, as seen from the direction of the arrows.
  • FIG. 3C is an explanatory diagram showing a case where the input device according to the modified example is used as a mirror switch.
  • FIG. 4A is an explanatory diagram illustrating a first display area that displays functions that can be controlled by a tracing operation on a menu area of the left touchpad according to the first embodiment.
  • FIG. 4B is an explanatory diagram showing a tracing operation on the menu area of the left touch pad according to the first embodiment.
  • FIG. 5A is an explanatory diagram showing a second display area in which the display is changed by a touch operation performed on the direct area of the left touch pad according to the first embodiment.
  • FIG. 5B is an explanatory diagram showing a touch operation on the direct region of the left touch pad according to the first embodiment.
  • FIG. 6A is an explanatory diagram showing a second display area in which the display is changed by a push operation made in the direct area of the left touchpad according to the first embodiment.
  • FIG. 6B is an explanatory diagram showing a push operation on the direct region of the left touch pad according to the first embodiment.
  • FIG. 7A is an explanatory diagram showing a locus of a tracing operation from a menu area made on the left touchpad according to the first embodiment to a direct area.
  • FIG. 7B is an explanatory diagram showing the timing of the operation detected in the menu area and the direct area in the tracing operation from the menu area on the left touchpad according to the first embodiment to the direct area.
  • FIG. 8 is a flowchart for explaining the operation of the input device according to the first embodiment.
  • FIG. 9A is an explanatory diagram showing a non-continuous locus of operations detected by the input device according to the second embodiment.
  • FIG. 9B is an explanatory diagram showing another locus of discontinuous operations detected by the input device according to the second embodiment.
  • FIG. 9C is an explanatory diagram showing a continuous locus of operations detected by the input device according to the second embodiment.
  • FIG. 10 is a flowchart for explaining the operation of the input device according to the second embodiment.
  • FIG. 11 is a block diagram of the input control system according to the third embodiment.
  • In the control device according to the embodiments, when an operation on the first operation area is detected and an operation is subsequently detected on the second operation area, which is adjacent to the first operation area and has a different assigned function, the control unit does not accept the operation on the second operation area if it determines that the operation on the second operation area is an unintended operation.
  • When the user's operation extends from the first operation area into the second operation area and the control device determines that the operation on the second operation area is unintended, the control device does not accept that operation. Compared with the case where this configuration is not adopted, the control device can therefore suppress the acceptance of unintended operations in which an operation intended for the first operation area spills over into the second operation area.
  • FIG. 1A is an explanatory diagram showing the inside of the vehicle
  • FIG. 1B is a block diagram of the input device.
  • FIG. 2A is an explanatory diagram showing a projection area of the head-up display
  • FIG. 2B is an explanatory diagram showing steering.
  • FIG. 3A is an explanatory diagram showing the left touch pad
  • FIG. 3B is a sectional view of the touch pad
  • FIG. 3C is an explanatory diagram showing a case where the input device is used as a mirror switch.
  • The input device 1 of the present embodiment is mounted on the steering wheel 85 of the vehicle 8, for example, as shown in FIG. 1A.
  • The user can operate the input device 1 while holding the steering wheel 85.
  • The input device 1 is configured to be able to operate an in-vehicle device of the vehicle 8, for example.
  • The in-vehicle device is, for example, an air conditioner, a music/video playback device, a navigation device, or a vehicle control device that can perform various settings for the vehicle 8 such as seat arrangement and automated driving.
  • As shown in FIG. 1A, for example, the vehicle 8 is provided with display devices such as a meter display 84 arranged on a meter panel 83, a main display 86 arranged on a center console 81b, a head-up display 87 arranged on an instrument panel 81c that projects display objects onto a projection area 870 of a windshield 88, and a room mirror monitor 880 arranged on the windshield 88.
  • The input device 1 is configured to display a hierarchical GUI (graphical user interface) on at least one of these display devices.
  • The input device 1 of the present embodiment is configured to display the GUI in the projection area 870 of the head-up display 87, as shown in FIG. 2A, for example. On this GUI, functions and setting values controllable by the input device 1 are displayed as images.
  • The input device 1 has a touch panel 3 that detects touch operations and that has a menu area 301 as a first operation area and a direct area 302 as a second operation area.
  • After an operation on the menu area 301 is detected, the control device 5 may detect an operation on the direct area 302, which is adjacent to the menu area 301 and has a different assigned function.
  • In that case, the control unit 50 does not accept the operation on the direct area 302 when it determines that the operation on the direct area 302 is an unintended operation.
  • The menu area 301 (first operation area) and the direct area 302 (second operation area) are operation areas that are adjacent to each other and have different operation methods. Specifically, the first operation area is configured to receive tracing operations, and the second operation area is configured to receive touch operations, which differ from tracing operations.
  • In the input device 1 of the present embodiment, the menu area 301 corresponds to the first operation area and the direct area 302 corresponds to the second operation area, but the arrangement is not limited to this: at least one first operation area and at least one second operation area may be arranged adjacent to each other. That is, in the input device 1, a plurality of second operation areas may be arranged around the first operation area, or first operation areas and second operation areas may be arranged alternately side by side.
  • The input device 1 of the present embodiment includes, for example, a touch pad 2a and a touch pad 2b on the left and right of the steering wheel 85, as shown in FIG. 2B.
  • Each of the touch pad 2a and the touch pad 2b has a touch panel 3 and a switch 4. Different functions are assigned to the touch pad 2a and the touch pad 2b.
  • The touch pad 2a and the touch pad 2b are arranged on the left and right spoke portions (spoke portion 851 and spoke portion 852) of the steering wheel 85, for example, as shown in FIG. 2B.
  • The spoke portion 851 and the spoke portion 852 connect the base portion 850, on which an alarm device, an airbag, and the like are arranged, to the ring portion 853 held by the user. Further, as shown in FIG. 2B, for example, the spoke portion 851 and the spoke portion 852 extend substantially horizontally and connect the base portion 850 and the ring portion 853 when the steering wheel is in the position for the vehicle 8 to travel straight ahead.
  • The touch pad 2a and the touch pad 2b can be operated by the user while gripping the ring portion 853 near its connection with the spoke portions 851 and 852.
  • The input device 1 may also be configured to include only one of the touch pad 2a and the touch pad 2b.
  • The functions assigned to the touch pad 2a and the touch pad 2b are displayed in the projection area 870 by the head-up display 87.
  • As an example, as shown in FIG. 2A, the head-up display 87 divides the projection area 870 into a first display area 871 to a third display area 873 for display.
  • FIG. 2A shows the projection area 870 when the menu area 301 of the touch pad 2a is operated.
  • The first display area 871 is an area in which an icon group 875 indicating the functions assigned to the menu area 301 of the operated touch pad is displayed. Since FIG. 2A shows the case where the touch pad 2a arranged on the left side of the steering wheel 85 is operated, the first display area 871 is displayed in the center-left.
  • The second display area 872 is an area in which an icon group 876 indicating the functions assigned to the direct area 302 of the operated touch pad is displayed. Since FIG. 2A shows the case where the touch pad 2a is operated, the second display area 872 is displayed on the left side.
  • The third display area 873 is an area in which an icon group 877 and the like indicating the functions of the touch pad that is not being operated are displayed. That is, the icon group 877 is composed of a plurality of icons indicating functions different from those of the icon group 875 and the icon group 876 displayed in the first display area 871 and the second display area 872. Since FIG. 2A shows the case where the touch pad 2a is operated, the third display area 873 is displayed on the right side.
  • When the right touch pad 2b is operated, the first display area 871 to the third display area 873 are displayed in the projection area 870 in the order of the third display area 873, the first display area 871, and the second display area 872 from the left side of the page of FIG. 2A.
  • The input device 1 is not limited to the steering wheel 85, and may be configured using the touch pad 82 of the floor console 81a located between the driver's seat 80a and the passenger seat 80b, or a touch pad arranged on the center console 81b or the door trim 89.
  • The input device 1 as a modified example may be configured as a mirror switch.
  • When the input device 1 is configured as a mirror switch using a touch pad 890 arranged on the door trim 89, as shown in FIG. 3C as an example, the mirror angle can be adjusted by a tracing operation on the first operation area 890a of the touch pad 890, and the left or right mirror can be selected by a touch operation on the left mirror selection area 890b or the right mirror selection area 890c, which serve as second operation areas of the touch pad 890.
  • FIG. 4A is an explanatory diagram showing a first display area for performing display related to the menu area
  • FIG. 4B is an explanatory diagram showing a tracing operation on the menu area
  • FIG. 5A is an explanatory diagram showing the second display area in which the display is changed by the touch operation performed on the direct area
  • FIG. 5B is an explanatory view showing the touch operation on the direct area
  • FIG. 6A is an explanatory diagram showing the second display area in which the display is changed by the push operation made in the direct area
  • FIG. 6B is an explanatory view showing the push operation in the direct area.
  • FIGS. 4B, 5B, and 6B show a case where the user performs tracing operations and touch operations using the thumb (operating finger 90) of the right hand (hand 9) while holding the steering wheel 85.
  • As shown in FIG. 3A, for example, the operation surface 30a of the touch pad 2a has a trapezoidal shape whose lower base is longer than its upper base. A rib 31 is provided around the touch pad 2a.
  • The touch pad 2b has, for example, a shape that is line-symmetric to that of the touch pad 2a.
  • The shapes of the touch pad 2a and the touch pad 2b are not limited to these. In the following, the touch pad 2a will mainly be described.
  • The touch panel 3 is configured to detect touch operations and tracing operations on the operation surface 30a.
  • For the touch panel 3, a resistive film type or capacitive type touch panel can be used, for example.
  • As an example, the touch panel 3 of the present embodiment is a capacitive touch panel, because push operations made on the touch panel 3 are detected by the switch 4.
  • In the touch panel 3, a plurality of drive electrodes and a plurality of detection electrodes are arranged so as to intersect each other while remaining insulated from each other.
  • The drive electrodes and the detection electrodes are attached to a member made of, for example, glass or polycarbonate.
  • Because push operations are performed on the operation surface 30a, the touch panel 3 of the present embodiment is configured, as an example, using a hard resin such as polycarbonate that does not readily flex under a push operation.
  • The touch panel 3 scans all combinations of the plurality of drive electrodes and the plurality of detection electrodes, reads out the capacitance for each combination, and outputs the capacitance information for one cycle to the control device 5 as detection information. The touch panel 3 of the touch pad 2a periodically outputs detection information S1 to the control device 5, and the touch panel 3 of the touch pad 2b periodically outputs detection information S3 to the control device 5.
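As a rough illustration of the scan cycle described in the preceding paragraph, the following sketch shows how one frame of mutual-capacitance readings could be collected and handed to the control device as detection information. The function names and interfaces are assumptions for illustration; the publication only states that every drive/detection electrode pair is scanned once per cycle.

```python
# Minimal sketch of one capacitance scan cycle (hypothetical interfaces).
from typing import Callable, List

def scan_cycle(num_drive: int, num_detect: int,
               read_capacitance: Callable[[int, int], float],
               send_to_controller: Callable[[List[List[float]]], None]) -> None:
    """Read the capacitance of every drive/detection electrode pair and
    output the whole frame as detection information (e.g. S1 or S3)."""
    frame = [[read_capacitance(d, s) for s in range(num_detect)]
             for d in range(num_drive)]
    send_to_controller(frame)  # one cycle's worth of detection information
```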
  • The touch panel 3 is attached to the housing 10 so that the operation surface 30a is exposed through the opening 11, as shown in FIG. 3B, for example.
  • The housing 10 has, for example, a box shape with the opening 11 in its upper surface.
  • The opening 11 is formed, for example, as shown in FIG. 3B, by a protruding portion 13 that protrudes inward from the inner side surface 12 of the upper portion of the housing 10.
  • The lower surface 130 of the protruding portion 13 faces the upper surface 310 of the rib 31 of the touch panel 3.
  • The switch 4 and the first elastic body 15 to the fourth elastic body 18 are arranged between the back surface 30b of the touch panel 3 and the bottom surface 14 of the housing 10.
  • The switch 4 and the first elastic body 15 to the fourth elastic body 18 apply an elastic force to the touch panel 3 and press the upper surface 310 of the rib 31 of the touch panel 3 against the lower surface 130 of the protruding portion 13 of the housing 10.
  • The touch panel 3 is prevented from coming out of the housing 10 by the contact between the rib 31 and the protruding portion 13.
  • The operation surface 30a of the touch panel 3 is divided into the menu area 301 and the direct area 302 by the dotted line connecting the upper base and the lower base in FIG. 3A, for example.
  • The dotted line shown in FIG. 3A is added for the purpose of explanation and is not actually drawn on the operation surface.
  • The menu area 301 is an area in which the menu displayed in the first display area 871 of the projection area 870 can be scrolled up and down, for example, as shown in FIGS. 4A and 4B.
  • A relative coordinate system is set in this menu area 301.
  • The menu area 301 is not limited to scrolling a menu or the like; a cursor or the like may also be moved.
  • An icon 875a to which a function related to music is assigned, an icon 875b to which a function related to a mobile phone is assigned, and an icon 875c to which a voice recognition function is assigned are illustrated as an example of the menu.
  • The user can display icons other than those currently displayed by a vertical tracing operation on the menu area 301.
  • The icons displayed as the menu are not limited to those shown in FIG. 4A, and may be icons to which other functions are assigned.
  • Starting from the icon 875b, the user can select the icon 875a by performing an upward tracing operation on the menu area 301, and can select the icon 875c by performing a downward tracing operation on the menu area 301. The user can then select an icon that is not currently displayed by performing a further upward or downward tracing operation from the state in which the icon 875a or the icon 875c is selected.
  • The icon 875a, displayed larger than the other icons in FIG. 4A, indicates that it is in the selected state.
  • The selection is confirmed by a push operation on the touch panel 3.
  • As shown in FIGS. 5A and 5B, for example, the direct area 302 has a first touch area 302a to a third touch area 302c corresponding to the icons 876a to 876c displayed in the second display area 872 of the projection area 870. The user can select an icon by performing a touch operation on any of the first touch area 302a to the third touch area 302c of the direct area 302.
  • An icon 876a to which a function of returning the displayed screen to the previous screen is assigned, an icon 876b to which a function of increasing the volume is assigned, and an icon 876c to which a function of decreasing the volume is assigned are illustrated.
  • The number of touch areas and the functions assigned to them are not limited to these.
  • Images corresponding to the icons 876a to 876c are printed on the first touch area 302a to the third touch area 302c.
  • The user can select an icon by a touch operation on the first touch area 302a to the third touch area 302c of the direct area 302. FIGS. 5A and 5B show a state in which a touch operation is performed on the second touch area 302b: the icon 876b corresponding to the second touch area 302b is enlarged and displayed, together with the numerical value (27) of the current volume.
  • The user can confirm the selection by performing a push operation on the touch panel 3.
  • The icon 876b is enlarged and the current volume is displayed.
  • The volume increases according to the number of push operations.
  • FIG. 6A shows that the volume has risen from "27" to "29".
  • The icon 876b shown in FIG. 6A is, for example, displayed in a manner indicating that the push operation has been accepted.
  • The switch 4 is arranged near the center of the operation surface 30a, for example, as shown in FIGS. 3A and 3B.
  • The switch 4 is a micro switch that detects pressing of the touch panel 3, that is, a push operation performed on the operation surface 30a.
  • The switch 4 applies an elastic force so that the touch panel 3 and the housing 10 come into contact with each other. Further, the switch 4 is provided with an initial load (preload).
  • The switch 4 is configured so that its button 40 is pushed in by a push operation and turns on, and when the push operation ends, the button 40 returns to its original position and turns off as the touch panel 3 rises.
  • The switch 4 is not limited to a mechanical switch such as a micro switch, and may be a non-contact switch using a magnetic sensor, or a load sensor that detects a push operation by detecting the load associated with it.
  • When the switch 4 is turned on, it outputs a switch signal to the control device 5.
  • The switch 4 of the touch pad 2a outputs a switch signal S2 to the control device 5, and the switch 4 of the touch pad 2b outputs a switch signal S4 to the control device 5.
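A minimal sketch of how the switch state might be turned into a single confirmation event per push operation is shown below; the class and method names are illustrative assumptions and do not appear in the publication.

```python
# Hedged sketch: detect the rising edge of the micro switch so that each
# push operation confirms the current selection exactly once.
class PushSwitch:
    def __init__(self) -> None:
        self.pressed = False

    def update(self, button_on: bool) -> bool:
        """Return True only at the moment the push operation begins."""
        confirmed = button_on and not self.pressed
        self.pressed = button_on
        return confirmed  # the controller would then act on S2 (pad 2a) or S4 (pad 2b)
```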
  • For the first elastic body 15 to the fourth elastic body 18, for example, a sponge, a coil spring, rubber, or the like can be used.
  • The first elastic body 15 to the fourth elastic body 18 of the present embodiment are coil springs, as an example.
  • The first elastic body 15 to the fourth elastic body 18 are arranged at the four corners of the touch panel 3.
  • The first elastic body 15 to the fourth elastic body 18 support the touch panel 3 and apply an elastic force to the touch panel 3 in the direction opposite to the push operation direction.
  • FIG. 7A shows the locus of a tracing operation extending from the menu area to the direct area.
  • In FIG. 7B, the upper row shows the timing of the operation detected in the menu area, the middle row and the lower row show the timing of the operations detected in the direct area, and the horizontal axis represents time.
  • The control unit 50 of the control device 5 includes, for example, a CPU (Central Processing Unit) that performs calculations and processing on acquired data according to a stored program, a RAM (Random Access Memory), which is a semiconductor memory, and a ROM (Read Only Memory) 51.
  • The ROM 51 stores, for example, a program for operating the control unit 50 and threshold value information 510.
  • The RAM is used, for example, as a storage area for temporarily storing calculation results and the like.
  • The control unit 50 also has an internal means for generating a clock signal and operates based on this clock signal.
  • The control unit 50 is configured to process operations on the menu area 301 in a relative coordinate system and operations on the direct area 302 in an absolute coordinate system.
  • A relative coordinate system is set in the menu area 301. Further, since the control unit 50 accepts touch operations performed on any one of the first touch area 302a to the third touch area 302c set in advance in the direct area 302, an absolute coordinate system is set in the direct area 302.
  • The control unit 50 is configured so that a plurality of functions can be switched and selected according to the operation performed on the menu area 301, while only the functions assigned in advance can be selected in the direct area 302.
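The split between relative and absolute handling could look roughly like the following sketch. The normalized coordinate space, the boundary position, and the three equal touch-area bands are assumptions made only for illustration; the publication does not specify the geometry.

```python
# Hedged sketch: relative handling (scroll delta) for the menu area 301,
# absolute handling (fixed touch-area lookup) for the direct area 302.
MENU_X_MAX = 0.5  # assumed x position of the boundary between the two areas

def handle_menu_move(prev_y: float, new_y: float) -> int:
    """Relative coordinate system: only the vertical movement matters (scrolling)."""
    return round((new_y - prev_y) * 10)  # number of scroll steps; scale is illustrative

def handle_direct_touch(x: float, y: float) -> str | None:
    """Absolute coordinate system: map the touch point to a preset touch area."""
    if x <= MENU_X_MAX:
        return None                  # the point lies in the menu area, not handled here
    if y < 1 / 3:
        return "back"                # first touch area 302a (icon 876a)
    if y < 2 / 3:
        return "volume_up"           # second touch area 302b (icon 876b)
    return "volume_down"             # third touch area 302c (icon 876c)
```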
  • Alternatively, the input device 1 may be configured by arranging, side by side, a first operation device having a first operation area that accepts tracing operations and a second operation device having a second operation area that accepts touch operations.
  • When a tracing operation is performed in the menu area 301 and, for example, the tracing operation extends from the menu area 301 into the direct area 302 as shown by the arrow in FIG. 7A, there is a problem that an operation on the direct area 302 is unintentionally accepted.
  • When such an operation on the direct area 302 is accepted, for example, as shown in FIG. 5A, the icon corresponding to the touch area reached by the tracing operation is unintentionally enlarged and displayed, which becomes a source of annoyance and distraction for the user.
  • Therefore, the control unit 50 is configured not to accept the operation on the direct area 302 when the tracing operation extends from the menu area 301 into the direct area 302 and a predetermined condition is satisfied.
  • Specifically, the control unit 50 is configured to determine that an unintended operation has been made when the time interval Td-m between the end time Tm, at which the detection of the operation on the menu area 301 ends, and the start time Td, at which the detection of the operation on the direct area 302 starts, is within a predetermined period T.
  • This predetermined period T is included in the threshold value information 510 stored in the ROM 51.
  • FIG. 7B shows the timing of operations in the menu area 301 and the direct area 302.
  • An operation is detected in the menu area 301 from time t1 to time t2.
  • In the middle row, an operation is detected in the direct area 302 from time t3 to time t4.
  • In the lower row, an operation is detected in the direct area 302 from time t5 to time t6.
  • When the start time Td (time t3) comes after the predetermined period T has elapsed from the end time Tm (time t2), T < Td-m is satisfied, so the control unit 50 accepts the operation performed in the direct area 302.
  • When the start time Td (time t5) comes before the predetermined period T has elapsed, Td-m ≤ T is satisfied, so the control unit 50 determines that an unintended operation has been detected and does not accept the operation performed in the direct area 302. That is, when an operation on the menu area 301 and an operation on the direct area 302 are detected at a short interval, the control unit 50 determines that an unintended operation has been detected and does not accept the operation on the direct area 302.
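Expressed as code, the timing rule above reduces to a single comparison. The sketch below assumes times measured in seconds; the period T itself corresponds to the value held in the threshold value information 510.

```python
# Hedged sketch of the timing rule: a direct-area operation that starts
# within the period T after the menu-area operation ended is treated as
# unintended and is not accepted.
def is_unintended(menu_end_time_tm: float,
                  direct_start_time_td: float,
                  period_t: float) -> bool:
    """True when the interval Td-m = Td - Tm is within the predetermined period T."""
    interval_tdm = direct_start_time_td - menu_end_time_tm
    return interval_tdm <= period_t
```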
  • When an operation on the menu area 301 is detected, the control unit 50 outputs menu operation information S5, which is information on the detected operation, to the connected electronic device. Further, when an operation on the direct area 302 is detected, the control unit 50 outputs direct operation information S6, which is information on the touch area where the operation was detected, to the connected electronic device.
  • When the time measured from the end time Tm exceeds the predetermined period T (Step 3: Yes), the control unit 50 discards the measured time (Step 4). The control unit 50 then becomes able to accept operations on the menu area 301 and the direct area 302 (Step 5), and ends the processing.
  • In Step 3, before the predetermined period T has elapsed (Step 3: No), if an operation is detected in the direct area 302 (Step 6: Yes), Td-m ≤ T is satisfied, so the control unit 50 determines that it is an unintended operation and does not accept the operation on the direct area 302 (Step 7).
  • In Step 6, when the operation is detected on the menu area 301 (Step 6: No), the control unit 50 accepts the operation (Step 8).
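Read as a loop, Steps 3 to 8 of FIG. 8 could be sketched roughly as follows. The polling interface is an assumption; it stands in for whatever event source the control unit actually uses.

```python
import time

def run_after_menu_operation(menu_end_time_tm: float, period_t: float,
                             poll_operation):
    """Hedged sketch of Steps 3 to 8 of FIG. 8. poll_operation() is an assumed
    non-blocking poll returning ("menu" | "direct", start_time) or None."""
    while True:
        if time.monotonic() - menu_end_time_tm > period_t:  # Step 3: Yes
            return "accept_all"    # Step 4: discard the measured time, Step 5: accept both areas
        op = poll_operation()
        if op is None:             # Step 3: No, and nothing detected yet
            continue
        area, start_time = op
        if area == "direct":       # Step 6: Yes
            return "reject_direct" # Step 7: Td-m <= T holds, so the operation is unintended
        return "accept_menu"       # Step 6: No -> Step 8: accept the menu-area operation
```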
  • The input device 1 can suppress the acceptance of unintended operations. Specifically, when the user's tracing operation extends from the menu area 301 into the direct area 302, the input device 1 determines whether the operation on the direct area 302 is an unintended operation. When the input device 1 determines that the operation is unintended, it does not accept the operation on the direct area 302, so the acceptance of unintended operations can be suppressed compared with the case where this configuration is not adopted.
  • When the time interval Td-m between the end time Tm, at which the detection of the operation on the menu area 301 ends, and the start time Td, at which the detection of the operation on the direct area 302 starts, is within the predetermined period T, the operation on the direct area 302 is determined to be an unintended operation. Therefore, the input device 1 can determine that an operation is unintended even when the direct area 302 is unintentionally touched after a tracing operation on the menu area 301.
  • Since the input device 1 uses both coordinate systems, it has better operability than when only one of the coordinate systems is set.
  • Since the input device 1 can operate a plurality of functions through the menu area 301 and the direct area 302, it saves space compared with arranging a plurality of switches corresponding to the plurality of functions.
  • Since the input device 1 does not accept unintended operations, the annoyance of an unintended operation on the direct area 302 being detected and an icon in the second display area 872 being enlarged is suppressed.
  • Since the input device 1 can detect push operations on the touch pad 2a and the touch pad 2b with the switch 4, a selection can be confirmed easily and operability is improved compared with the case where no switch is provided.
  • The second embodiment differs from the first embodiment in that whether an operation is unintended is determined based on the locus of the operation.
  • The control unit 50 is configured to determine that an operation is unintended when it determines that the locus of the operation on the menu area 301 and the locus of the operation on the direct area 302 are continuous.
  • FIGS. 9A and 9B show examples in which the operation on the direct area 302 is accepted.
  • In FIG. 9A, the locus 304 and the locus 305 are separated so as to sandwich the boundary 303 between the menu area 301 and the direct area 302.
  • In this case, the control unit 50 determines that an intended operation has been performed and accepts the operation on the direct area 302.
  • FIG. 9B shows a case where the user performs a tracing operation on the menu area 301 and then performs a touch operation on the direct area 302.
  • The locus 306 shows the locus of the tracing operation.
  • The locus 307 shows the locus of the touch operation.
  • As in the case where the loci of the tracing operation are divided, the control unit 50 determines that an intended operation has been performed because the loci are separated across the boundary 303, and accepts the operation on the direct area 302.
  • FIG. 9C shows an example in which the operation on the direct area 302 is not accepted.
  • The operation locus 308 is not divided at the boundary 303 but is continuously connected from the menu area 301 to the direct area 302.
  • When the control unit 50 determines that the locus 308 is continuously connected from the menu area 301 to the direct area 302 across the boundary 303, it does not accept the operation detected in the direct area 302.
  • The control unit 50 continuously calculates detection points based on the detection information acquired periodically, and determines that the loci are continuously connected when the locus connecting the calculated detection points straddles the boundary 303.
  • The control unit 50 may also determine whether an operation is unintended by combining the continuity of the locus with the predetermined period T of the first embodiment.
  • That is, when the locus is continuous and the time interval Td-m between the end time Tm and the start time Td is within the predetermined period T, the operation on the direct area 302 is not accepted.
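One possible way to implement the continuity test is sketched below, assuming that the detection points arrive as periodically sampled (x, y) coordinates and that the boundary 303 is a vertical line at a known position; both are assumptions for illustration, since the publication does not fix the geometry or the sampling details.

```python
from typing import List, Tuple

BOUNDARY_X = 0.5  # assumed x position of the boundary 303

def locus_is_continuous(points: List[Tuple[float, float]],
                        max_gap: float = 0.1) -> bool:
    """Return True when the locus of periodically sampled detection points runs
    from the menu area into the direct area without a break, i.e. two consecutive
    samples straddle the boundary 303 and lie close together."""
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        straddles = (x0 <= BOUNDARY_X) != (x1 <= BOUNDARY_X)
        close = abs(x1 - x0) + abs(y1 - y0) <= max_gap
        if straddles and close:
            return True   # continuous locus (FIG. 9C) -> do not accept the operation
    return False          # separated loci (FIGS. 9A and 9B) -> accept the operation
```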
  • When an operation extending from the menu area 301 to the direct area 302 is detected (Step 10), the control unit 50 of the input device 1 outputs the menu operation information S5 based on the operation performed on the menu area 301, and then checks the continuity of the locus (Step 11).
  • The timing of outputting the menu operation information S5 is not limited to this, and it may be output after the continuity of the locus has been determined.
  • When the locus is continuous (Step 11: Yes), the control unit 50 does not accept the operation on the direct area 302 (Step 12) and ends the processing.
  • When the locus is not continuous (Step 11: No), the control unit 50 accepts the operation detected in the direct area 302 (Step 13).
  • Since the input device 1 of the present embodiment determines whether the operation on the direct area 302 is an intended operation based on the locus of the operation, the determination can be made easily compared with the case where this configuration is not adopted.
  • The third embodiment differs from the other embodiments in that it includes a display device.
  • FIG. 11 is a block diagram showing the input control system.
  • As shown in FIG. 11, for example, the input control system 7 includes the above-described input device 1 and a display device 70 that displays the functions assigned to the menu area 301 and the direct area 302.
  • The display device 70 may be, for example, at least one of the meter display 84, the main display 86, the head-up display 87, and the room mirror monitor 880 shown in FIG. 1A, or may be another display device.
  • The control device 5 generates a display control signal S7 and outputs it to the display device 70 in order to display the functions assigned to the menu area 301 and the direct area 302.
  • The display device 70 performs display based on the display control signal S7.
  • With this configuration, the operable functions are easier to confirm and operability is better than when this configuration is not adopted.
  • According to the input device 1, the control device 5, and the input control system 7 of at least one of the embodiments described above, it is possible to suppress the acceptance of unintended operations.
  • 7 Input control system, 8 Vehicle, 84 Meter display, 86 Main display, 87 Head-up display, 301 Menu area, 302 Direct area, 880 Room mirror monitor, 890 Touch pad, 890a First operation area, 890b Left mirror selection area, 890c Right mirror selection area

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An input device 1 has: a touch panel 3 that detects a touch operation and has a menu region 301 and a direct region 302; a switch 4 that detects a push operation performed on the touch panel 3; and a control device 5. For cases wherein an operation is detected in the menu region 301 and an operation is thereafter detected in a direct region 302, which is adjacent to the menu region 301 and has been assigned a different function, this control device 5 has a control unit 50 which does not accept the operation in the direct region 302 if it is determined that the operation in the direct region 302 is an unintended operation.

Description

Control device, input device, and input control system

Cross-reference to related applications
This application claims priority from Japanese Patent Application No. 2019-039226 filed on March 5, 2019, the entire contents of which are incorporated herein by reference.

The present invention relates to a control device, an input device, and an input control system.

A touch pad is known in which a rib is provided on the boundary line between a first region for instructing a first function and a second region for instructing a second function on its upper surface (see, for example, Patent Document 1). In this touch pad, the rib is formed on a PET (polyethylene terephthalate) resin sheet using UV (ultraviolet) ink or foaming ink. The rib allows the user to confirm the position of the boundary between the first region and the second region by the feel of the fingertip.

Patent Document 1: Japanese Unexamined Patent Publication No. 2000-20194

With the touch pad disclosed in Patent Document 1, however, an operation on the first region may, for example, unintentionally extend into the second region, so that an unintended operation is accepted.

An object of the present invention is to provide a control device, an input device, and an input control system capable of suppressing the acceptance of unintended operations.

A control device according to one embodiment of the present invention has a control unit that, when an operation on a first operation area is detected and an operation is subsequently detected on a second operation area that is adjacent to the first operation area and has a different assigned function, does not accept the operation on the second operation area if it determines that the operation on the second operation area is an unintended operation.

According to one embodiment of the present invention, it is possible to provide a control device, an input device, and an input control system that suppress the acceptance of unintended operations.
FIG. 1A is an explanatory diagram showing the inside of a vehicle in which the input device according to the first embodiment is mounted. FIG. 1B is a block diagram of the input device according to the first embodiment. FIG. 2A is an explanatory diagram showing an example of the projection area of a head-up display that displays functions controllable by operating the input device according to the first embodiment. FIG. 2B is an explanatory diagram showing a steering wheel on which the touch pads of the input device according to the first embodiment are arranged. FIG. 3A is an explanatory diagram showing a touch pad of the input device according to the first embodiment. FIG. 3B is a cross-sectional view taken along line III(b)-III(b) of FIG. 3A, as seen from the direction of the arrows. FIG. 3C is an explanatory diagram showing a case where the input device according to a modified example is used as a mirror switch. FIG. 4A is an explanatory diagram showing a first display area that displays functions controllable by a tracing operation on the menu area of the left touch pad according to the first embodiment. FIG. 4B is an explanatory diagram showing a tracing operation on the menu area of the left touch pad according to the first embodiment. FIG. 5A is an explanatory diagram showing a second display area whose display is changed by a touch operation performed on the direct area of the left touch pad according to the first embodiment. FIG. 5B is an explanatory diagram showing a touch operation on the direct area of the left touch pad according to the first embodiment. FIG. 6A is an explanatory diagram showing a second display area whose display is changed by a push operation performed on the direct area of the left touch pad according to the first embodiment. FIG. 6B is an explanatory diagram showing a push operation on the direct area of the left touch pad according to the first embodiment. FIG. 7A is an explanatory diagram showing the locus of a tracing operation extending from the menu area to the direct area on the left touch pad according to the first embodiment. FIG. 7B is an explanatory diagram showing the timing of the operations detected in the menu area and the direct area during a tracing operation extending from the menu area to the direct area on the left touch pad according to the first embodiment. FIG. 8 is a flowchart for explaining the operation of the input device according to the first embodiment. FIG. 9A is an explanatory diagram showing a non-continuous locus of operations detected by the input device according to the second embodiment. FIG. 9B is an explanatory diagram showing another non-continuous locus of operations detected by the input device according to the second embodiment. FIG. 9C is an explanatory diagram showing a continuous locus of operations detected by the input device according to the second embodiment. FIG. 10 is a flowchart for explaining the operation of the input device according to the second embodiment. FIG. 11 is a block diagram of the input control system according to the third embodiment.
(Summary of the embodiments)
In the control device according to the embodiments, when an operation on the first operation area is detected and an operation is subsequently detected on the second operation area, which is adjacent to the first operation area and has a different assigned function, the control unit does not accept the operation on the second operation area if it determines that the operation on the second operation area is an unintended operation.

When the user's operation extends from the first operation area into the second operation area and the control device determines that the operation on the second operation area is unintended, the control device does not accept that operation. Compared with the case where this configuration is not adopted, the control device can therefore suppress the acceptance of unintended operations in which an operation intended for the first operation area spills over into the second operation area.
[First Embodiment]
(Outline of the input device 1)
FIG. 1A is an explanatory diagram showing the inside of the vehicle, and FIG. 1B is a block diagram of the input device. FIG. 2A is an explanatory diagram showing the projection area of the head-up display, and FIG. 2B is an explanatory diagram showing the steering wheel. FIG. 3A is an explanatory diagram showing the left touch pad, FIG. 3B is a cross-sectional view of the touch pad, and FIG. 3C is an explanatory diagram showing a case where the input device is used as a mirror switch.

In the drawings of the embodiments described below, the proportions between elements may differ from the actual proportions. In FIGS. 1B and 11, arrows indicate the main flows of signals and information.
The input device 1 of the present embodiment is mounted on the steering wheel 85 of the vehicle 8, for example, as shown in FIG. 1A. The user can operate the input device 1 while holding the steering wheel 85. The input device 1 is configured to be able to operate an in-vehicle device of the vehicle 8, for example. The in-vehicle device is, for example, an air conditioner, a music/video playback device, a navigation device, or a vehicle control device that can perform various settings for the vehicle 8 such as seat arrangement and automated driving.

As an example, as shown in FIG. 1A, the vehicle 8 is provided with display devices such as a meter display 84 arranged on a meter panel 83, a main display 86 arranged on a center console 81b, a head-up display 87 arranged on an instrument panel 81c that projects display objects onto a projection area 870 of a windshield 88, and a room mirror monitor 880 arranged on the windshield 88.

The input device 1 is configured to display a hierarchical GUI (graphical user interface) on at least one of these display devices. The input device 1 of the present embodiment is configured to display the GUI in the projection area 870 of the head-up display 87, as shown in FIG. 2A, for example. On this GUI, functions and setting values controllable by the input device 1 are displayed as images.
As shown in FIGS. 3A and 3B, for example, the input device 1 has a touch panel 3 that detects touch operations and that has a menu area 301 as a first operation area and a direct area 302 as a second operation area, a switch 4 that detects push operations made on the touch panel 3, and a control device 5 described later.

As shown in FIGS. 1B to 2B, for example, the control device 5 has a control unit 50 that, when an operation on the menu area 301 is detected and an operation is subsequently detected on the direct area 302, which is adjacent to the menu area 301 and has a different assigned function, does not accept the operation on the direct area 302 if it determines that the operation on the direct area 302 is an unintended operation.

The menu area 301 (first operation area) and the direct area 302 (second operation area) are operation areas that are adjacent to each other and have different operation methods. Specifically, the first operation area is configured to receive tracing operations, and the second operation area is configured to receive touch operations, which differ from tracing operations.

In the input device 1 of the present embodiment, the menu area 301 corresponds to the first operation area and the direct area 302 corresponds to the second operation area, but the arrangement is not limited to this: at least one first operation area and at least one second operation area may be arranged adjacent to each other. That is, in the input device 1, a plurality of second operation areas may be arranged around the first operation area, or first operation areas and second operation areas may be arranged alternately side by side.
The input device 1 of the present embodiment includes, for example, a touch pad 2a and a touch pad 2b on the left and right of the steering wheel 85, as shown in FIG. 2B. Each of the touch pad 2a and the touch pad 2b has a touch panel 3 and a switch 4. Different functions are assigned to the touch pad 2a and the touch pad 2b.

The touch pad 2a and the touch pad 2b are arranged on the left and right spoke portions (spoke portion 851 and spoke portion 852) of the steering wheel 85, for example, as shown in FIG. 2B. The spoke portion 851 and the spoke portion 852 connect the base portion 850, on which an alarm device, an airbag, and the like are arranged, to the ring portion 853 held by the user. Further, as shown in FIG. 2B, for example, the spoke portion 851 and the spoke portion 852 extend substantially horizontally and connect the base portion 850 and the ring portion 853 when the steering wheel is in the position for the vehicle 8 to travel straight ahead.

The touch pad 2a and the touch pad 2b can be operated by the user while gripping the ring portion 853 near its connection with the spoke portions 851 and 852. The input device 1 may also be configured to include only one of the touch pad 2a and the touch pad 2b.

The functions assigned to the touch pad 2a and the touch pad 2b are displayed in the projection area 870 by the head-up display 87. As an example, as shown in FIG. 2A, the head-up display 87 divides the projection area 870 into a first display area 871 to a third display area 873 for display. FIG. 2A shows the projection area 870 when the menu area 301 of the touch pad 2a is operated.
The first display area 871 is an area in which an icon group 875 indicating the functions assigned to the menu area 301 of the operated touch pad is displayed. Since FIG. 2A shows the case where the touch pad 2a arranged on the left side of the steering wheel 85 is operated, the first display area 871 is displayed in the center-left.

The second display area 872 is an area in which an icon group 876 indicating the functions assigned to the direct area 302 of the operated touch pad is displayed. Since FIG. 2A shows the case where the touch pad 2a is operated, the second display area 872 is displayed on the left side.

The third display area 873 is an area in which an icon group 877 and the like indicating the functions of the touch pad that is not being operated are displayed. That is, the icon group 877 is composed of a plurality of icons indicating functions different from those of the icon group 875 and the icon group 876 displayed in the first display area 871 and the second display area 872. Since FIG. 2A shows the case where the touch pad 2a is operated, the third display area 873 is displayed on the right side.

When the right touch pad 2b is operated, the first display area 871 to the third display area 873 are displayed in the projection area 870 in the order of the third display area 873, the first display area 871, and the second display area 872 from the left side of the page of FIG. 2A.
Here, the input device 1 is not limited to the steering wheel 85, and may be configured using the touch pad 82 of the floor console 81a located between the driver's seat 80a and the passenger seat 80b, or a touch pad arranged on the center console 81b or the door trim 89.

The input device 1 as a modified example may be configured as a mirror switch. When the input device 1 is configured as a mirror switch using a touch pad 890 arranged on the door trim 89, as shown in FIG. 3C as an example, the mirror angle can be adjusted by a tracing operation on the first operation area 890a of the touch pad 890, and the left or right mirror can be selected by a touch operation on the left mirror selection area 890b or the right mirror selection area 890c, which serve as second operation areas of the touch pad 890.
(Configuration of touch pad 2a and touch pad 2b)
FIG. 4A is an explanatory diagram showing the first display area that performs display related to the menu area, and FIG. 4B is an explanatory diagram showing a tracing operation on the menu area. FIG. 5A is an explanatory diagram showing the second display area whose display has changed due to a touch operation made on the direct area, and FIG. 5B is an explanatory diagram showing a touch operation on the direct area. FIG. 6A is an explanatory diagram showing the second display area whose display has changed due to a push operation made on the direct area, and FIG. 6B is an explanatory diagram showing a push operation on the direct area. FIGS. 4B, 5B, and 6B show the case where the user performs tracing and touch operations with the thumb (operating finger 90) of the right hand (hand 9) while gripping the steering wheel 85.
As an example, as shown in FIG. 3A, the touch pad 2a has an operation surface 30a with a trapezoidal shape whose lower base is longer than its upper base. A rib 31 is provided around the touch pad 2a. The touch pad 2b has, for example, a shape that is line-symmetric to the touch pad 2a. The shapes of the touch pad 2a and the touch pad 2b are not limited to these. The touch pad 2a is mainly described below.
The touch panel 3 is configured to detect a touch operation and a tracing operation on the operation surface 30a. For the touch panel 3, for example, a resistive or capacitive touch panel can be used. As an example, the touch panel 3 of the present embodiment is a capacitive touch panel, because a push operation made on the touch panel 3 is detected by the switch 4.
In the touch panel 3, a plurality of drive electrodes and a plurality of detection electrodes are arranged so as to intersect each other while remaining insulated. The drive electrodes and the detection electrodes are attached to a member made of, for example, glass or polycarbonate. As an example, since a push operation is performed on the operation surface 30a, the touch panel 3 of the present embodiment is made of a hard resin such as polycarbonate that does not easily flex under the push operation.
The touch panel 3 scans all combinations of the plurality of drive electrodes and the plurality of detection electrodes, reads out the capacitance for each combination, and outputs the capacitance information for one cycle to the control device 5 as detection information. The touch panel 3 of the touch pad 2a periodically outputs detection information S1 to the control device 5, and the touch panel 3 of the touch pad 2b periodically outputs detection information S3 to the control device 5.
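As a rough illustration of this scanning, the Python sketch below iterates over every drive/detection electrode pair and collects one cycle of capacitance readings into a single frame. It is not the patent's implementation; the read_capacitance driver function and the electrode counts are hypothetical.

```python
# Minimal sketch: scan all drive x detect combinations once and report the
# whole frame as one detection packet (corresponding to S1 or S3).
def scan_frame(num_drive: int, num_detect: int, read_capacitance) -> list:
    frame = []
    for d in range(num_drive):
        for s in range(num_detect):
            frame.append(read_capacitance(d, s))  # mutual capacitance at this crossing
    return frame

# Example with a dummy driver function:
frame = scan_frame(4, 3, lambda d, s: 100 + d * 10 + s)
print(len(frame))  # 12 values per scan cycle
```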
The touch panel 3 is attached to the housing 10 so that the operation surface 30a is exposed through the opening 11, as shown in FIG. 3B, for example. The housing 10 has, for example, a box shape whose upper surface forms the opening 11. The opening 11 is formed, for example, by a protruding portion 13 that protrudes inward from the inner side surface 12 of the upper part of the housing 10, as shown in FIG. 3B. The lower surface 130 of the protruding portion 13 faces the upper surface 310 of the rib 31 of the touch panel 3.
In the touch panel 3, the switch 4 and the first elastic body 15 to the fourth elastic body 18 are arranged between the back surface 30b and the bottom surface 14 of the housing 10, as shown in FIGS. 3A and 3B, for example. The switch 4 and the first elastic body 15 to the fourth elastic body 18 apply an elastic force to the touch panel 3 and press the upper surface 310 of the rib 31 of the touch panel 3 against the lower surface 130 of the protruding portion 13 of the housing 10. The contact between the rib 31 and the protruding portion 13 prevents the touch panel 3 from coming out of the housing 10.
The operation surface 30a of the touch panel 3 is divided into a menu area 301 and a direct area 302 by, for example, the dotted line connecting the upper base and the lower base on the page of FIG. 3A. Note that the dotted line shown in FIG. 3A is drawn for explanation and is not actually marked on the surface.
The menu area 301 is an area in which the menu displayed in the first display area 871 of the projection area 870 can be scrolled up and down, as shown in FIGS. 4A and 4B, for example. A relative coordinate system is set for the menu area 301. Note that the menu area 301 is not limited to scrolling a menu or the like and may also allow movement of a cursor or the like.
FIG. 4A illustrates, as an example of the menu, an icon 875a to which a function related to music is assigned, an icon 875b to which a function related to a mobile phone is assigned, and an icon 875c to which a voice recognition function is assigned.
The user can display icons other than those currently shown by a vertical tracing operation on the menu area 301. The icons displayed as the menu are not limited to those shown in FIG. 4A and may be icons to which other functions are assigned.
By performing an upward tracing operation on the menu area 301, the user can move the selection from the icon 875b, taken as a reference, to the icon 875a. By performing a downward tracing operation on the menu area 301, the user can move the selection from the icon 875b to the icon 875c. When the user performs a further upward or downward tracing operation while the icon 875a or the icon 875c is selected, an icon that is not currently displayed can be selected.
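The scrolling behaviour just described can be sketched as a simple direction-to-selection rule. The following Python example is illustrative only; the icon list, the pixel threshold, and the sign convention for the swipe delta are assumptions, not values from the patent.

```python
# Minimal sketch: move the menu selection up or down according to the vertical
# component of a tracing operation in the menu area (relative coordinates).
ICONS = ["music", "phone", "voice recognition"]   # icons 875a-875c as an example

def scroll(selected: int, dy: float, threshold: float = 20.0) -> int:
    """Return the new selected index after a swipe of dy pixels (negative = upward)."""
    if dy <= -threshold and selected > 0:
        return selected - 1          # upward swipe selects the icon above
    if dy >= threshold and selected < len(ICONS) - 1:
        return selected + 1          # downward swipe selects the icon below
    return selected

sel = scroll(1, -25.0)   # start on "phone" (875b), swipe upward
print(ICONS[sel])        # -> music (875a)
```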
The icon 875a, displayed larger than the other icons in FIG. 4A, indicates that it is in the selected state. The selection is confirmed by a push operation on the touch panel 3.
The direct area 302 has a first touch area 302a to a third touch area 302c corresponding to the icons 876a to 876c displayed in the second display area 872 of the projection area 870, as shown in FIGS. 5A and 5B, for example. The user can select an icon by performing a touch operation on any of the first touch area 302a to the third touch area 302c of the direct area 302.
FIG. 5A illustrates, as an example, an icon 876a to which a function of returning the displayed screen to the previous screen is assigned, an icon 876b to which a function of raising the volume is assigned, and an icon 876c to which a function of lowering the volume is assigned. The number of touch areas and the assigned functions are not limited to these.
Images corresponding to the icons 876a to 876c are printed on the first touch area 302a to the third touch area 302c.
The user can select an icon by a touch operation on the first touch area 302a to the third touch area 302c of the direct area 302. FIGS. 5A and 5B show a state in which a touch operation is performed on the second touch area 302b, the icon 876b corresponding to the second touch area 302b is displayed enlarged, and the current volume value (27) is also shown.
The user can confirm a selection by performing a push operation on the touch panel 3. When the user touches the second touch area 302b to raise the volume, the icon 876b is displayed enlarged and the current volume is shown, as in FIG. 5A, for example. When the user then performs push operations on the second touch area 302b until the desired volume is reached, the volume rises according to the number of push operations. FIG. 6A shows that the volume has risen from "27" to "29". The icon 876b shown in FIG. 6A is, for example, in a display state indicating that a push operation has been accepted.
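The push-count behaviour can be illustrated with a small state object. This Python sketch is an assumption-laden example, not the patent's implementation; the initial volume, the maximum value, and the zone names are made up for the demonstration.

```python
# Minimal sketch: each push operation detected while a volume touch area is
# selected changes the volume by one step.
class VolumeControl:
    def __init__(self, volume: int = 27, maximum: int = 40):
        self.volume = volume
        self.maximum = maximum

    def on_push(self, selected_area: str) -> int:
        """Apply one push operation for the currently touched direct-area zone."""
        if selected_area == "volume_up":
            self.volume = min(self.volume + 1, self.maximum)
        elif selected_area == "volume_down":
            self.volume = max(self.volume - 1, 0)
        return self.volume

vc = VolumeControl()
vc.on_push("volume_up")
print(vc.on_push("volume_up"))   # 27 -> 29 after two pushes, as in Fig. 6A
```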
(Configuration of switch 4)
The switch 4 is arranged near the center of the operation surface 30a, as shown in FIGS. 3A and 3B, for example. The switch 4 is a microswitch that detects depression of the touch panel 3, that is, a push operation performed on the operation surface 30a.
The switch 4 applies an elastic force so that the touch panel 3 and the housing 10 remain in contact. An initial load (preload) is also applied to the switch 4.
The switch 4 is configured so that its button 40 is pressed in by a push operation and the switch turns on, and when the push operation ends, the button 40 returns to its original position and the switch turns off as the touch panel 3 rises. The switch 4 is not limited to a mechanical switch such as a microswitch; it may be a non-contact switch using a magnetic sensor or the like, or a load sensor that detects a push operation by detecting the load accompanying the push operation.
When the switch 4 turns on, it outputs a switch signal to the control device 5. The switch 4 of the touch pad 2a outputs a switch signal S2 to the control device 5, and the switch 4 of the touch pad 2b outputs a switch signal S4 to the control device 5.
(Configuration of the first elastic body 15 to the fourth elastic body 18)
For the first elastic body 15 to the fourth elastic body 18, for example, sponge, coil springs, rubber, or the like can be used. As an example, the first elastic body 15 to the fourth elastic body 18 of the present embodiment are coil springs.
The first elastic body 15 to the fourth elastic body 18 are arranged at the four corners of the touch panel 3. They support the touch panel 3 and apply to it an elastic force directed opposite to the push operation direction.
(Configuration of control device 5)
FIG. 7A shows the trajectory of a tracing operation extending from the menu area to the direct area. In FIG. 7B, the upper row shows the timing of an operation detected in the menu area, and the middle and lower rows show the timing of operations detected in the direct area. In the upper, middle, and lower rows of FIG. 7B, the horizontal axis represents time.
The control unit 50 of the control device 5 is, for example, a microcomputer composed of a CPU (Central Processing Unit) that performs calculation and processing on acquired data according to a stored program, and semiconductor memories such as a RAM (Random Access Memory) and a ROM (Read Only Memory) 51. The ROM 51 stores, for example, a program for operating the control unit 50 and threshold information 510. The RAM is used, for example, as a storage area for temporarily storing calculation results and the like. The control unit 50 also has internal means for generating a clock signal and operates based on this clock signal.
The control unit 50 is configured to process operations made on the menu area 301 in a relative coordinate system and to process operations made on the direct area 302 in an absolute coordinate system.
As described above, the control unit 50 scrolls the menu according to a tracing operation, so a relative coordinate system is set for the menu area 301. The control unit 50 accepts a touch operation made on any of the first touch area 302a to the third touch area 302c set in advance in the direct area 302, so an absolute coordinate system is set for the direct area 302.
Accordingly, the control unit 50 is configured so that, for the menu area 301, a plurality of functions can be switched and selected according to the operation performed, whereas for the direct area 302, only the functions assigned in advance can be selected.
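The two coordinate treatments can be contrasted with a short sketch: the menu area only looks at how far the finger moved, while the direct area only looks at where it is. The Python example below is illustrative; the zone rectangles and coordinate values are made-up numbers, not values from the patent.

```python
# Minimal sketch of relative vs. absolute coordinate processing.
def process_menu(prev_xy, cur_xy):
    """Relative coordinates: report the vertical movement delta used for scrolling."""
    return ("scroll", cur_xy[1] - prev_xy[1])

def process_direct(cur_xy, zones):
    """Absolute coordinates: report which predefined touch zone contains the point."""
    x, y = cur_xy
    for name, (x0, y0, x1, y1) in zones.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return ("select", name)
    return ("select", None)

zones = {"back": (60, 0, 80, 20), "vol_up": (60, 20, 80, 40), "vol_down": (60, 40, 80, 60)}
print(process_menu((10, 30), (10, 12)))   # ('scroll', -18)
print(process_direct((70, 25), zones))    # ('select', 'vol_up')
```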
As a modification, the input device 1 may be configured by arranging side by side a first operation device having a first operation area on which a tracing operation can be performed and a second operation device having a second operation area on which a touch operation can be performed.
Since a tracing operation is performed in the menu area 301, there is a problem that an operation on the direct area 302 is accepted unintentionally when the tracing operation extends from the menu area 301 into the direct area 302, as indicated by the arrow in FIG. 7A. If such an operation on the direct area 302 is accepted, the icon corresponding to the touch area reached by the tracing operation is unintentionally displayed enlarged, as shown in FIG. 5A, for example, which is bothersome and distracting for the user.
Therefore, the control unit 50 is configured not to accept the operation on the direct area 302 when the tracing operation extends from the menu area 301 into the direct area 302 and a predetermined condition is satisfied.
The control unit 50 of the present embodiment is configured to determine that an operation is unintended when the time interval Td-m between the operation on the menu area 301 and the operation on the direct area 302 is within a predetermined period T. This predetermined period T is included in the threshold information 510 stored in the ROM 51. The time interval Td-m is the difference (= Td - Tm) between the start time Td at which an operation is detected in the direct area 302 and the end time Tm at which an operation is no longer detected in the menu area 301.
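The timing rule reduces to a single comparison. The Python sketch below is only a restatement of that rule; the numeric times and the value of the period T are examples, not values taken from the patent.

```python
# Minimal sketch: the direct-area operation starting at Td is treated as unintended
# when Td - Tm (the gap after the menu-area operation ended at Tm) is within T.
def is_unintended(t_menu_end: float, t_direct_start: float, period_t: float) -> bool:
    """Return True if the direct-area operation should be rejected."""
    return (t_direct_start - t_menu_end) <= period_t

print(is_unintended(t_menu_end=1.00, t_direct_start=1.15, period_t=0.3))  # True -> reject
print(is_unintended(t_menu_end=1.00, t_direct_start=1.50, period_t=0.3))  # False -> accept
```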
FIG. 7B shows the timing of operations on the menu area 301 and the direct area 302. In the upper row, an operation is detected in the menu area 301 from time t1 to time t2. In the middle row, an operation is detected in the direct area 302 from time t3 to time t4. In the lower row, an operation is detected in the direct area 302 from time t5 to time t6.
The control unit 50 measures time from the time t2 (= Tm) at which the operation in the menu area 301 ended, as shown in the upper and middle rows of FIG. 7B, for example. This time measurement ends at whichever comes first: the expiration of the predetermined period T or the arrival of the start time Td of an operation on the direct area 302.
When the predetermined period T expires first, as shown in the upper and middle rows of FIG. 7B, for example, T < Td-m holds, so the control unit 50 accepts the operation made on the direct area 302.
When the start time Td (time t5) arrives first, as shown in the upper and lower rows of FIG. 7B, for example, Td-m ≤ T holds, so the control unit 50 treats the operation as unintended and does not accept the operation made on the direct area 302. In other words, when an operation on the menu area 301 and an operation on the direct area 302 are detected at a short interval, the control unit 50 determines that an unintended operation has been detected and does not accept the operation on the direct area 302.
When an operation on the menu area 301 is detected, the control unit 50 outputs menu operation information S5, which is information on the detected operation, to the connected electronic device. When an operation on the direct area 302 is detected, the control unit 50 outputs direct operation information S6, which is information on the touch area in which the operation was detected, to the connected electronic device.
An example of the operation of determining whether to accept an operation on the direct area 302 after an operation has been detected in the menu area 301 will be described below with reference to the flowchart of FIG. 8. The following mainly describes the operation after an operation on the menu area 301 of the left touch pad 2a has been detected.
(Operation)
When an operation on the menu area 301 is detected based on the detection information S1 and the detection information S3 output from the left and right touch panels 3, the control unit 50 of the input device 1 outputs menu operation information S5 based on the detected operation. When the operation ends (Step 1), the control unit 50 starts measuring time (Step 2).
When the time interval measured from the end time Tm exceeds the predetermined period T (Step 3: Yes), the control unit 50 discards the measured time (Step 4). The control unit 50 then makes operations on the menu area 301 and the direct area 302 acceptable (Step 5) and ends the processing.
Here, if an operation is detected in the direct area 302 (Step 6: Yes) before the predetermined period T expires in Step 3 (Step 3: No), Td-m ≤ T is satisfied, so the control unit 50 determines that the operation is unintended and does not accept the operation on the direct area 302 (Step 7).
If an operation on the menu area 301 is detected in Step 6 (Step 6: No), the control unit 50 accepts the operation (Step 8).
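One way to read the flow of FIG. 8 is as an event-driven gate: a timer starts when the menu-area operation ends and decides whether the next direct-area operation is accepted. The Python sketch below is an interpretation under that assumption; the class name, the use of time.monotonic(), and the value of period T are not from the patent.

```python
# Minimal sketch of the Fig. 8 flow as an event handler.
import time

class DirectAreaGate:
    def __init__(self, period_t: float = 0.3):
        self.period_t = period_t
        self.menu_end_time = None                    # Tm

    def on_menu_operation_end(self):
        self.menu_end_time = time.monotonic()        # Step 2: start measuring

    def on_direct_operation_start(self) -> bool:
        """Return True if the direct-area operation is accepted."""
        now = time.monotonic()                       # Td
        if self.menu_end_time is None:
            return True
        if now - self.menu_end_time > self.period_t: # Step 3: period T has elapsed
            self.menu_end_time = None                # Step 4: discard the measurement
            return True                              # Step 5: operation accepted
        return False                                 # Step 7: rejected as unintended

gate = DirectAreaGate()
gate.on_menu_operation_end()
print(gate.on_direct_operation_start())  # False: the call follows the menu operation within T
```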
(Effects of the first embodiment)
The input device 1 according to the present embodiment can suppress the acceptance of unintended operations. Specifically, when the user's tracing operation extends from the menu area 301 into the direct area 302, the input device 1 determines whether the operation on the direct area 302 is an unintended operation. When the input device 1 determines that the operation is unintended, it does not accept the operation on the direct area 302, so that the acceptance of unintended operations can be suppressed compared with a configuration that does not adopt this feature.
When the time interval Td-m between the end time Tm at which detection of the operation on the menu area 301 ended and the start time Td at which detection of the operation on the direct area 302 started is within the predetermined period T, the input device 1 determines that the operation on the direct area 302 is unintended. Therefore, even when the direct area 302 is touched unintentionally after a tracing operation on the menu area 301, the input device 1 can determine that the operation is unintended.
Since the menu area 301 allows a plurality of functions to be switched and selected, a relative coordinate system, which makes a hierarchical GUI easy to operate, is set for it. Since only the functions assigned in advance can be selected in the direct area 302, an absolute coordinate system, which makes selection by touch operation easy, is set for it. Therefore, the input device 1 offers better operability than a configuration in which only one of the coordinate systems is set.
Since the input device 1 can operate a plurality of functions through the menu area 301 and the direct area 302, it saves space compared with arranging a plurality of switches corresponding to the plurality of functions.
Since the input device 1 does not accept unintended operations, the annoyance of an unintended operation on the direct area 302 being detected and an icon in the second display area 872 being enlarged is suppressed.
Since the input device 1 can detect push operations on the touch pad 2a and the touch pad 2b with a switch, confirming a selection is easier and operability is better than in a configuration without the switch.
[Second embodiment]
The second embodiment differs from the first embodiment in that whether an operation is unintended is determined based on the trajectory of the operation.
In the embodiment described below, parts having the same functions and configurations as in the first embodiment are given the same reference numerals as in the first embodiment, and their description is omitted.
The control unit 50 is configured to determine that an operation is unintended when it determines that the trajectory of the operation on the menu area 301 and the trajectory of the operation on the direct area 302 form a continuous trajectory.
FIGS. 9A and 9B show examples in which an operation on the direct area 302 is accepted. In FIG. 9A, a trajectory 304 and a trajectory 305 are separated across the boundary 303 between the menu area 301 and the direct area 302. When separate trajectories are detected on either side of the boundary 303, the control unit 50 determines that an intended operation has been performed and accepts the operation on the direct area 302.
FIG. 9B shows a case where the user performs a tracing operation on the menu area 301 and then performs a touch operation on the direct area 302. A trajectory 306 shows the trajectory of the tracing operation, and a trajectory 307 shows the trajectory of the touch operation. As in the case where the trajectories of tracing operations are separated, the control unit 50 determines that an intended operation has been performed because the trajectories are separated across the boundary 303, and accepts the operation on the direct area 302.
FIG. 9C shows an example of a case where an operation on the direct area 302 is not accepted. In FIG. 9C, the operation trajectory 308 is not divided at the boundary 303 but runs continuously from the menu area 301 into the direct area 302. When the control unit 50 determines that the trajectory 308 is continuously connected from the menu area 301 to the direct area 302 across the boundary 303, it does not accept the operation detected in the direct area 302.
For example, when detection points are calculated continuously based on the periodically acquired detection information and the trajectory connecting the calculated detection points straddles the boundary 303, the control unit 50 determines that the trajectory is continuously connected.
As a modification, the control unit 50 may set the predetermined period T of the first embodiment and determine an unintended operation by combining it with the continuity of the trajectory. In this case, even if the trajectories are not connected, the control unit 50 does not accept an operation on the direct area 302 when the difference (= Td - Tm: Td-m) between the end time Tm at which the operation on the menu area 301 ended and the start time Td at which the operation on the direct area 302 was detected is within the predetermined period T.
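The continuity check, and its optional combination with the period-T rule, can be sketched as follows. This Python example is an interpretation only; the point format, the vertical boundary, and the lift-off gap threshold are assumptions introduced for illustration.

```python
# Minimal sketch: reject the direct-area operation when the sampled detection points
# form one unbroken stroke that crosses from the menu area into the direct area,
# optionally combined with the period-T rule of the first embodiment.
def crosses_continuously(points, boundary_x: float, max_gap: float = 1.0) -> bool:
    """points: [(t, x, y), ...] sampled each scan cycle, in time order."""
    for (t0, x0, _), (t1, x1, _) in zip(points, points[1:]):
        same_stroke = (t1 - t0) <= max_gap           # no lift-off between samples
        if same_stroke and x0 < boundary_x <= x1:    # stroke runs across the boundary
            return True
    return False

def reject_direct(points, boundary_x, t_menu_end=None, t_direct_start=None, period_t=None):
    if crosses_continuously(points, boundary_x):
        return True
    if None not in (t_menu_end, t_direct_start, period_t):   # optional combined criterion
        return (t_direct_start - t_menu_end) <= period_t
    return False

stroke = [(0.00, 10, 5), (0.05, 30, 6), (0.10, 55, 7), (0.15, 70, 7)]
print(reject_direct(stroke, boundary_x=50))  # True: one continuous stroke crosses x = 50
```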
An example of the operation of the input device 1 of the present embodiment will be described below with reference to the flowchart of FIG. 10.
(Operation)
When the control unit 50 of the input device 1 detects an operation extending from the menu area 301 into the direct area 302 (Step 10), it outputs menu operation information S5 based on the operation performed on the menu area 301 and then checks the continuity of the trajectory. The timing of outputting the menu operation information S5 is not limited to this and may be after the continuity of the trajectory has been determined.
When the trajectory of the operation is continuous (Step 11: Yes), the control unit 50 does not accept the operation on the direct area 302 (Step 12) and ends the processing.
If the trajectory is not continuous in Step 11 (Step 11: No), the control unit 50 accepts the operation detected in the direct area 302 (Step 13).
(Effects of the second embodiment)
Since the input device 1 of the present embodiment determines whether an operation on the direct area 302 is an intended operation based on the trajectory of the operation, the determination can be made more easily than in a configuration that does not adopt this feature.
[Third embodiment]
The third embodiment differs from the other embodiments in that it includes a display device.
FIG. 11 is a block diagram showing an input control system. As shown in FIG. 11, the input control system 7 includes, for example, the input device 1 described above and a display device 70 that displays information about the functions assigned to the menu area 301 and the direct area 302.
The display device 70 may be, for example, at least one of the meter display 84, the main display 86, the head-up display 87, and the rear-view mirror monitor 880 shown in FIG. 1A, or it may be another display device.
The control device 5 generates a display control signal S7 and outputs it to the display device 70 in order to cause display related to the menu area 301 and the direct area 302 to be performed. The display device 70 performs display based on this display control signal S7.
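As a purely illustrative sketch, the display control signal S7 could be thought of as a message carrying the current state of both areas. The field names below are invented for the example and are not specified by the patent.

```python
# Minimal sketch: package the state of the menu and direct areas into a
# display-control message for the display device 70.
def build_display_control(operated_pad, selected_menu_icon, active_direct_zone, volume):
    return {
        "signal": "S7",
        "operated_pad": operated_pad,              # "left" or "right"
        "menu": {"selected": selected_menu_icon},
        "direct": {"active": active_direct_zone, "volume": volume},
    }

print(build_display_control("left", "music", "vol_up", 29))
```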
In the input control system 7 of the present embodiment, the functions assigned to the input device 1 can be confirmed on the display device 70, so the operable functions are easier to check and operability is better than in a configuration that does not adopt this feature.
According to the input device 1, the control device 5, and the input control system 7 of at least one embodiment described above, the acceptance of unintended operations can be suppressed.
Although several embodiments and modifications of the present invention have been described above, these embodiments and modifications are merely examples and do not limit the invention according to the claims. These novel embodiments and modifications can be implemented in various other forms, and various omissions, replacements, and changes can be made without departing from the gist of the present invention. Moreover, not all combinations of the features described in these embodiments and modifications are necessarily essential to the means for solving the problems of the invention. Furthermore, these embodiments and modifications are included in the scope and gist of the invention and are included in the invention described in the claims and their equivalents.
1 Input device
3 Touch panel
4 Switch
5 Control device
7 Input control system
8 Vehicle
84 Meter display
86 Main display
87 Head-up display
301 Menu area
302 Direct area
880 Rear-view mirror monitor
890 Touch pad
890a First operation area
890b Left mirror selection area
890c Right mirror selection area

Claims (10)

  1. A control device comprising a control unit that, when an operation on a second operation area that is adjacent to a first operation area and is assigned a different function is detected after an operation on the first operation area has been detected, and the operation on the second operation area is determined to be an unintended operation, does not accept the operation on the second operation area.
  2. The control device according to claim 1, wherein the control unit determines that the operation is an unintended operation when the time interval between the operation on the first operation area and the operation on the second operation area is within a predetermined period.
  3. The control device according to claim 1, wherein the control unit determines that the operation is an unintended operation when it determines that the trajectory of the operation on the first operation area and the trajectory of the operation on the second operation area form a continuous trajectory.
  4. The control device according to any one of claims 1 to 3, wherein the control unit processes operations made on the first operation area in a relative coordinate system and processes operations made on the second operation area in an absolute coordinate system.
  5. The control device according to any one of claims 1 to 4, wherein the control unit allows a plurality of functions to be switched and selected according to an operation performed on the first operation area, and allows only functions assigned in advance to be selected for the second operation area.
  6. An input device comprising: a touch panel that has the first operation area and the second operation area and detects a touch operation; a switch that detects a push operation made on the touch panel; and the control device according to any one of claims 1 to 5.
  7. The input device according to claim 6, wherein the touch panel detects, as the touch operation, a tracing operation made on the first operation area and a touch operation, different from the tracing operation, made on the second operation area.
  8. The input device according to claim 6 or 7, wherein the first operation area and the second operation area are each composed of at least one area.
  9. An input control system comprising: the input device according to any one of claims 6 to 8; and a display device that displays information about the functions assigned to the first operation area and the second operation area.
  10. The input control system according to claim 9, wherein the input device and the display device are mounted on a vehicle, and the input device operates an in-vehicle device of the vehicle.
PCT/JP2020/004987 2019-03-05 2020-02-07 Control device, input device, and input control system WO2020179362A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-039226 2019-03-05
JP2019039226A JP2020144496A (en) 2019-03-05 2019-03-05 Control device, input device, and input control system

Publications (1)

Publication Number Publication Date
WO2020179362A1 true WO2020179362A1 (en) 2020-09-10

Family

ID=72338303

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/004987 WO2020179362A1 (en) 2019-03-05 2020-02-07 Control device, input device, and input control system

Country Status (2)

Country Link
JP (1) JP2020144496A (en)
WO (1) WO2020179362A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000029586A (en) * 1998-07-08 2000-01-28 Sharp Corp Handwriting input device
JP2005011233A (en) * 2003-06-20 2005-01-13 Canon Inc Coordinate input control method
JP2013159273A (en) * 2012-02-07 2013-08-19 Denso Corp In-vehicle operation device
JP2015041189A (en) * 2013-08-21 2015-03-02 ソニー株式会社 Display control device, display control method, and program

Also Published As

Publication number Publication date
JP2020144496A (en) 2020-09-10

Similar Documents

Publication Publication Date Title
JP5882117B2 (en) In-vehicle operation display device
JP5565421B2 (en) In-vehicle operation device
US9346356B2 (en) Operation input device for vehicle
WO2009128148A1 (en) Remote control device
CN107797726B (en) Information terminal
JP6622264B2 (en) In-vehicle device operation support system
CN106314151B (en) Vehicle and method of controlling vehicle
JP2006264615A (en) Display device for vehicle
JP5852592B2 (en) Touch operation type input device
JP2014102656A (en) Manipulation assistance system, manipulation assistance method, and computer program
WO2020179362A1 (en) Control device, input device, and input control system
KR101422060B1 (en) Information display apparatus and method for vehicle using touch-pad, and information input module thereof
EP3361367A1 (en) In-vehicle input device, in-vehicle input system, and in-vehicle input device control method
JP6125582B2 (en) In-vehicle operation display device
WO2020179361A1 (en) Steering switching device and steering switch system
JP6429699B2 (en) Vehicle input system
JP2014102658A (en) Operation support system, operation support method, and computer program
WO2020196559A1 (en) Control device and control system
JP2014100998A (en) Operation support system, operation support method, and computer program
JP2007058426A (en) Input device
JP2020042621A (en) On-vehicle display system
JP2014102657A (en) Manipulation assistance system, manipulation assistance method, and computer program
EP3352067B1 (en) Vehicular input device and method of controlling vehicular input device
JP2020160787A (en) Control device
WO2016031148A1 (en) Touch pad for vehicle and input interface for vehicle

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20766460

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20766460

Country of ref document: EP

Kind code of ref document: A1