WO2023100537A1 - Control device, control method, and program - Google Patents

Control device, control method, and program

Info

Publication number
WO2023100537A1
Authority
WO
WIPO (PCT)
Prior art keywords
axis
finger
display device
control unit
display
Prior art date
Application number
PCT/JP2022/039665
Other languages
French (fr)
Japanese (ja)
Inventor
光雄 折戸
Original Assignee
株式会社東海理化電機製作所
Priority date
Filing date
Publication date
Application filed by 株式会社東海理化電機製作所
Publication of WO2023100537A1 publication Critical patent/WO2023100537A1/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0362 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor, with detection of 1D translations or rotations of an operating part of the device, e.g. scroll wheels, sliders, knobs, rollers or belts
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser

Definitions

  • the present invention relates to a control device, control method and program.
  • the present invention has been made in view of the above problems, and an object of the present invention is to make it possible to suppress deterioration in usability given to the user.
  • According to an aspect of the present invention, a control device is provided that includes a control unit configured to control display, by a display device, of first selection candidates respectively corresponding to a plurality of functions along a direction corresponding to a first axis, and to control display, by the display device, of second selection candidates respectively corresponding to the plurality of functions at positions moved from the positions of the first selection candidates in a direction corresponding to a predetermined direction of a second axis different from the first axis. The control unit acquires, from an operation input device provided separately from the display device, a detected position of the user's finger on the first axis and a moving direction of the finger along the second axis, both detected by the operation input device, and selects, from the first selection candidates and the second selection candidates, a selection candidate corresponding to the detected position and the moving direction.
  • According to another aspect, a control method is provided that includes controlling the display of the first selection candidates and the second selection candidates as described above, acquiring the detected position of the user's finger on the first axis and the moving direction of the finger along the second axis from the operation input device provided separately from the display device, and selecting, from the first selection candidates and the second selection candidates, a selection candidate corresponding to the detected position and the moving direction.
  • According to yet another aspect, a program is provided that causes a computer to function as the control device described above.
  • FIG. 1 is a diagram showing an example of the appearance of a system according to an embodiment of the present invention. FIG. 2 is a diagram for explaining an example of detection of the finger movement direction along the Y-axis. FIG. 3 is a diagram showing an example of the functional configuration of the control device. FIG. 4 is a flowchart showing an operation example of the control device according to the embodiment of the present invention.
  • a system 1 according to an embodiment of the present invention includes an operation input device 10 and a display device 20.
  • the system 1 is installed inside the vehicle.
  • the user using the system 1 may be a user in the vehicle (for example, a driver, a person sitting in a front passenger seat, etc.).
  • the place where the system 1 is installed is not limited to the interior of the vehicle.
  • the operation input device 10 and the display device 20 are provided separately. More specifically, in the embodiment of the present invention, unlike a configuration in which the operation input device 10 and the display device 20 are stacked (the configuration of a so-called touch panel), a configuration is employed in which the operation input device 10 and the display device 20 are not stacked but are separated from each other.
  • the display device 20 displays various information under the control of the control device 30 (FIG. 3).
  • the type of display device 20 is not limited.
  • the display device 20 may be an LCD (Liquid Crystal Display), an organic EL (Electro-Luminescence) display, or a PDP (Plasma Display Panel).
  • a selection candidate may be any object that is displayed to be visually perceived by the user.
  • Icons I1, I3, I5, and I7 form an upper row of icons.
  • icons I2, I4, I6, and I8 form the lower row of icons.
  • the number of icons forming the upper row of icons is four, and the number of icons forming the lower row of icons is four.
  • the number of icons forming each of the upper icon row and the lower icon row may be plural.
  • Icons I1 to I8 are icons corresponding to functions.
  • Icons I1, I2, I7, and I8 are icons for adjusting detailed contents of functions (hereinafter also referred to as "function adjustment icons"). Each time one of these function adjustment icons is selected, the details of the function are adjusted.
  • Icons I3 to I6 are icons for activating functions (hereinafter also referred to as "function activation icons"). Each time one of these function activation icons is selected, the function is activated or deactivated.
  • the icon I1 is an icon corresponding to control to increase the audio volume.
  • the icon I2 is an icon corresponding to control for decreasing the audio volume.
  • Icon I3 is an icon corresponding to the cooling function and the dehumidification function.
  • Icon I4 is an icon corresponding to the automatic setting function of the air conditioner.
  • Icon I5 is an icon that corresponds to the function (independent mode) of independently adjusting the set temperature on the driver's seat side and the front passenger's seat side. When this function is stopped, it switches to a function (interlocking mode) in which the set temperatures on the driver's side and passenger's side are interlocked.
  • Icon I6 is an icon corresponding to a defroster function for removing fog from the windshield.
  • the icon I7 is an icon corresponding to control for increasing the temperature, which is the reference for adjustment by the air conditioner.
  • the icon I8 is an icon corresponding to the control of lowering the temperature that is the reference for adjustment by the air conditioner.
  • icons I1 to I8 are displayed superimposed on the map.
  • the icons I1 to I8 may be displayed superimposed on display contents other than the map.
  • the icons I1 to I8 may be displayed without being superimposed on other display contents.
  • the operation input device 10 includes an operation unit 110 that receives an operation input by a user's finger.
  • the operation unit 110 has an operation surface 115 with which the user touches with a finger when inputting an operation.
  • the operation surface 115 is provided with a sensor that detects the contact of the user's finger.
  • an electrostatic sensor can be used, but the type of sensor is not particularly limited.
  • the operation unit 110 detects the position of the user's finger on the X axis (first axis) as the detection position. More specifically, the operation unit 110 detects the contact position of the user's finger on the operation surface 115 on the X axis as the detection position.
  • the operation unit 110 detects the moving direction of the user's finger along the Y axis (second axis). More specifically, the operation unit 110 detects the direction in which the user moves the finger along the Y-axis while touching the operation surface 115 as the finger movement direction.
  • an example of detection of the moving direction of the finger along the Y-axis will be described with reference to FIG.
  • FIG. 2 is a diagram for explaining an example of detection of the finger movement direction along the Y-axis.
  • display device 20 has display surface 215 .
  • a screen is displayed on the display surface 215 .
  • the operation unit 110 includes a tact switch 120a and a tact switch 120b.
  • when the user moves the finger in the +Y direction while touching the operation surface 115, the operation unit 110 rotates so that the operation surface 115 approaches the horizontal direction (that is, counterclockwise in FIG. 2), and the tact switch 120a is pressed.
  • the operation unit 110 detects the movement of the finger in the +Y direction as the movement direction of the finger by detecting the pressing of the tact switch 120a.
  • when the user moves the finger in the -Y direction while touching the operation surface 115, the operation unit 110 rotates so that the operation surface 115 approaches the vertical direction (that is, clockwise in FIG. 2), and the tact switch 120b is pressed. The operation unit 110 detects the movement of the finger in the -Y direction as the movement direction of the finger by detecting the pressing of the tact switch 120b.
  • the detection of the finger movement direction along the Y-axis is not limited to this example.
  • instead of the tact switch 120a and the tact switch 120b, other switches (e.g., two contact rubber switches) may be used.
  • a sensor capable of detecting a finger moved in the +Y direction and a sensor capable of detecting a finger moved in the -Y direction may be additionally provided.
  • the operation unit 110 does not have to rotate with the movement of the finger.
  • an electrostatic sensor can be used, but the type of sensor is not particularly limited.
  • the X-axis direction and the Y-axis direction are different directions. As shown in FIG. 1, the X and Y axes may be orthogonal. For example, as shown in FIG. 1 , the X-axis may be the longitudinal direction of the operating surface 115 and the Y-axis may be the lateral direction of the operating surface 115 . The Z-axis is orthogonal to both the X-axis and the Y-axis.
  • the operation surface 115 may be a concave surface curved along the Y-axis. This makes it easier for the operation surface 115 to fit the pad of the finger, so that the usability during operation by the user and the detection accuracy of the contact position of the finger on the operation surface 115 can be improved.
  • the operation input device 10 transmits the detected position of the user's finger on the X-axis to the control device 30 . Furthermore, the operation input device 10 transmits the direction of movement of the user's finger along the Y-axis to the control device 30 .
  • the user wants to activate the function corresponding to icon I3 or I4.
  • the user touches a position corresponding to icon I3 or icon I4 on the X-axis on operation surface 115 with a finger.
  • the user moves the finger in the +Y direction or the -Y direction along the Y-axis while touching the operation surface 115 with the finger.
  • the contact position of the finger on the operation surface 115 on the X-axis is acquired by the control device 30 via the operation input device 10 as a detection position. Also, the direction of movement of the finger along the Y-axis while touching the operation surface 115 is acquired by the control device 30 via the operation input device 10 . It should be noted that similar operations can be performed by the system 1 when an icon other than the icon I3 and the icon I4 is selected.
  • Controller 30 may be implemented by a computer.
  • the control device 30 is connected to the operation input device 10 wirelessly or by wire. Further, the control device 30 is connected to the display device 20 wirelessly or by wire. Note that the control device 30 may be integrated with the operation input device 10 or the display device 20 . A configuration example of the control device 30 will be described with reference to FIG.
  • FIG. 3 is a diagram showing an example of the functional configuration of the control device 30.
  • the control device 30 includes an input section 310 , a control section 320 , a storage section 330 and an output section 340 .
  • the input unit 310 functions as an input interface with the operation input device 10 . More specifically, the input unit 310 receives input of the detected position of the user's finger on the X-axis, which is transmitted from the operation input device 10 . The input unit 310 outputs the detected position of the user's finger on the X-axis to the control device 30 .
  • the input unit 310 receives an input of the moving direction of the user's finger along the Y-axis, which is transmitted from the operation input device 10 .
  • the input unit 310 outputs the movement direction of the user's finger along the Y-axis to the control device 30 .
  • Control unit 320 includes a processor, and its functions can be realized by the processor executing a program stored in a memory. At this time, a computer-readable recording medium recording the program may also be provided. Alternatively, these blocks may be composed of dedicated hardware, or may be composed of a combination of multiple pieces of hardware.
  • control unit 320 acquires the detected position of the user's finger on the X-axis output from the input unit 310 .
  • Control unit 320 outputs a control signal for controlling display by display device 20 to output unit 340 based on the detected position of the user's finger on the X axis. Details of such display control executed by the control unit 320 will be described later.
  • control unit 320 acquires the moving direction of the user's finger along the Y-axis output from the input unit 310 .
  • the control unit 320 outputs a control signal for controlling display by the display device 20 to the output unit 340 based on the moving direction of the user's finger along the Y-axis. Details of such display control executed by the control unit 320 will be described later.
  • the storage unit 330 appropriately stores data necessary for computation by the processor.
  • the storage unit 330 may be composed of a memory such as a RAM (Random Access Memory), a hard disk drive, or a flash memory.
  • the output unit 340 functions as an output interface with the display device 20 . More specifically, the output section 340 outputs the control signal output from the control section 320 to the display device 20 . Thereby, the control of the display by the display device 20, which is executed by the control unit 320, can be realized.
  • control unit 320 controls the display device 20 to display the upper row of icons (icons I1, I3, I5, I7). Furthermore, the control unit 320 controls the display device 20 so that the lower row of icons (icons I2, I4, I6, and I8) is displayed by the display device 20 .
  • control unit 320 controls the display device 20 so that the upper row of icons (icons I1, I3, I5, I7) is displayed along the direction corresponding to the X axis. Furthermore, the control unit 320 controls the display device 20 so that the lower row of icons (icons I2, I4, I6, I8) is displayed at a position moved from the upper row of icons in the direction corresponding to the predetermined direction of the Y axis (the -Y direction), here the downward direction of the screen displayed by the display device 20.
  • each of the upper icon row and the lower icon row corresponds to the X axis, which is the direction in which the finger detection position changes. Therefore, it is possible to give the user the feeling that the operation on the operation input device 10 and the display by the display device 20 are linked.
  • the direction from the upper icon to the lower icon corresponds to the Y-axis, which is the finger movement direction detected by the operation unit 110 . Therefore, it is possible to give the user the feeling that the operation on the operation input device 10 and the display by the display device 20 are linked.
  • Giving the user the feeling that the operation on the operation input device 10 and the display by the display device 20 are linked in this way can lead to the effect of suppressing the deterioration of the user's feeling of use.
  • the direction corresponding to the X-axis should be the left-right direction of the screen, and the direction of the X-axis should match the left-right direction of the screen.
  • as a result, the direction in which the finger detection position changes coincides with the direction in which the upper row of icons (or the lower row of icons) is displayed, so it is possible to give the user a stronger feeling that the operation on the operation input device 10 and the display by the display device 20 are linked.
  • the right direction is the +X direction and the left direction is the -X direction.
  • the direction corresponding to the Y-axis is preferably the vertical direction of the screen, and the direction of the Y-axis is preferably the depth direction as seen from the user.
  • as a result, the direction toward the upper icon or the lower icon, relative to the position sandwiched between the upper icon and the lower icon, coincides with the finger movement direction, so it is possible to give the user a stronger feeling that the operation on the operation input device 10 and the display by the display device 20 are linked.
  • the direction from front to back is the +Y direction
  • the direction from back to front is the -Y direction.
  • the operation input device 10 detects the contact position of the user's finger on the X axis as the detection position of the finger on the X axis, and the control unit 320 obtains the detected position of the finger on the X axis.
  • the control unit 320 controls display by the display device 20 of predetermined feedback information according to the detected position of the finger on the X-axis. This allows the user to grasp to which icon the detected position corresponds.
  • control unit 320 controls the display by the display device 20 of the feedback information F1 at a position sandwiched between the icon I3 corresponding to the detected position in the upper row of icons (icons I1, I3, I5, I7) and the icon I4 corresponding to the detected position in the lower row of icons (icons I2, I4, I6, I8).
  • FIG. 1 shows, as an example of the feedback information F1, a case where the feedback information is displayed with a higher luminance than the luminance of other places where the feedback information is not displayed.
  • the feedback information F1 is not limited to such an example.
  • the feedback information F1 may be displayed in some manner different from the display at other locations where the feedback information F1 is not displayed.
  • Control unit 320 selects an icon from icons I1 to I8 in accordance with the detection position and movement direction. Here, icon I3 or icon I4 is selected.
  • when the moving direction of the finger is the direction (+Y direction) opposite to the predetermined direction of the Y axis (-Y direction), the control unit 320 selects the icon I3 corresponding to the detected position from the upper row of icons (icons I1, I3, I5, I7).
  • when the moving direction of the finger is the predetermined direction of the Y axis (-Y direction), the control unit 320 selects the icon I4 corresponding to the detected position from the lower row of icons (icons I2, I4, I6, I8).
  • the control unit 320 activates the function corresponding to the selected icon I3 or icon I4. For example, when the icon I3 is selected, the control unit 320 activates the cooling function and the dehumidifying function if the cooling function and the dehumidifying function are stopped. On the other hand, when the icon I3 is selected, the control unit 320 stops the cooling function and the dehumidifying function if the cooling function and the dehumidifying function are activated.
  • Similarly, when the icon I4 is selected, the control unit 320 activates the automatic setting function of the air conditioner if the automatic setting function is stopped, and stops the automatic setting function if it is activated.
  • FIG. 4 is a flow chart showing an operation example of the control device 30 according to the embodiment of the present invention.
  • the control unit 320 controls the display device 20 so that the upper row of icons (icons I1, I3, I5, I7) is displayed along the direction corresponding to the X axis. Furthermore, the control unit 320 controls the lower row of icons (icons I2, I4, I6, and I8) to be displayed at a position shifted downward from the upper row of icons on the screen displayed by the display device 20. It controls the display device 20 .
  • the operation input device 10 detects the contact position of the user's finger on the X axis as the detection position of the finger on the X axis. Then, as shown in FIG. 4, the detected position of the finger on the X-axis is acquired by the controller 320 (S11).
  • the control unit 320 controls display by the display device 20 of predetermined feedback information according to the detected position of the finger on the X axis. More specifically, the control unit 320 controls the display by the display device 20 of the feedback information at a position sandwiched between the icon corresponding to the detected position in the upper row of icons (icons I1, I3, I5, I7) and the icon corresponding to the detected position in the lower row of icons (icons I2, I4, I6, I8).
  • the operation input device 10 detects the moving direction of the finger along the Y-axis (+Y direction or -Y direction).
  • the direction of finger movement along the Y-axis is obtained by the control unit 320.
  • Control unit 320 selects an icon from icons I1 to I8 in accordance with the detection position and movement direction.
  • when the moving direction is the +Y direction, the control unit 320 selects the icon corresponding to the detected position from the icon row (icons I1, I3, I5, I7) corresponding to the +Y direction (S14).
  • when the moving direction is the -Y direction, the control unit 320 selects the icon corresponding to the detected position from the icon row (icons I2, I4, I6, I8) corresponding to the -Y direction (S15).
  • the control unit 320 performs control corresponding to the function of the selected icon. For example, when the selected icon is a function adjustment icon, the control unit 320 performs control to adjust the detailed content of the function. Alternatively, when the selected icon is a function activation icon, the control unit 320 activates or deactivates the function. A minimal sketch of this S11-S15 flow appears after this list.
  • The operation example of the control device 30 according to the embodiment of the present invention has been described above.
  • control unit 320 controls the display device 20 so that the upper row of icons (icons I1, I3, I5, I7) is displayed along the direction corresponding to the X axis.
  • control unit 320 controls the display device 20 so that the lower row of icons (icons I2, I4, I6, I8) is displayed at a position moved from the upper row of icons in the direction corresponding to the predetermined direction of the Y axis (the -Y direction), here the downward direction of the screen displayed by the display device 20.
  • each of the upper icon row and the lower icon row corresponds to the X axis, which is the direction in which the finger detection position changes. Therefore, it is possible to give the user the feeling that the operation on the operation input device 10 and the display by the display device 20 are linked.
  • the direction from the upper icon to the lower icon corresponds to the Y-axis, which is the finger movement direction detected by the operation unit 110 . Therefore, it is possible to give the user the feeling that the operation on the operation input device 10 and the display by the display device 20 are linked.
  • Giving the user the feeling that the operation on the operation input device 10 and the display by the display device 20 are linked in this way can lead to the effect of suppressing the deterioration of the user's feeling of use.
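The flow summarized in the bullets above (steps S11 to S15) can be condensed into the following sketch. It is only an illustration, not the patented implementation: the helper callables (get_x_slot, show_feedback, get_y_direction, execute_function) and the assumption that the upper and lower icon rows share the same X-axis slots are hypothetical placeholders.

```python
# Minimal sketch of the S11-S15 flow described above. The icon tables and the
# helper callables stand in for the operation input device 10, the display
# device 20, and the vehicle functions; they are not taken from the publication.

UPPER_ROW = ["I1", "I3", "I5", "I7"]  # first selection candidates (upper row)
LOWER_ROW = ["I2", "I4", "I6", "I8"]  # second selection candidates (lower row)


def select_icon(x_slot: int, y_direction: str) -> str:
    """S14/S15: pick an icon from the detected X slot and the Y movement direction."""
    if y_direction == "+Y":   # opposite to the predetermined -Y direction -> upper row (S14)
        return UPPER_ROW[x_slot]
    if y_direction == "-Y":   # the predetermined -Y direction -> lower row (S15)
        return LOWER_ROW[x_slot]
    raise ValueError(f"unknown movement direction: {y_direction}")


def run_once(get_x_slot, show_feedback, get_y_direction, execute_function):
    """One pass of the flowchart: acquire the position (S11), show feedback, select, execute."""
    x_slot = get_x_slot()                                # S11: detected position on the X axis
    show_feedback(UPPER_ROW[x_slot], LOWER_ROW[x_slot])  # feedback info between the two icons
    icon = select_icon(x_slot, get_y_direction())        # S14 or S15
    execute_function(icon)                               # adjust or toggle the mapped function


if __name__ == "__main__":
    # Trivial stand-ins: the finger is over the I3/I4 column and moves toward the user (-Y).
    run_once(
        lambda: 1,
        lambda up, low: print(f"highlight between {up} and {low}"),
        lambda: "-Y",
        lambda icon: print(f"execute function of {icon}"),
    )
```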

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

[Problem] To enable suppression of deterioration in a feeling of use provided to a user. [Solution] Provided is a control device provided with a control unit for controlling a display, performed by a display device, of first selection candidates that respectively correspond to a plurality of functions along a direction that corresponds to a first axis, and controlling a display, performed by the display device, of second selection candidates that respectively correspond to a plurality of functions at positions moved from the positions of the first selection candidates in a direction that corresponds to a prescribed direction on a second axis different from the first axis. The control unit acquires, from an operation input device, a detection position of a finger of a user on the first axis and a movement direction of the finger that extends along the second axis, as detected by the operation input device provided separately from the display device, and selects a selection candidate that corresponds to the detection position and the movement direction from the first selection candidates and the second selection candidates.

Description

Control device, control method and program
The present invention relates to a control device, a control method, and a program.
In recent years, there has been known a technique for displaying, on a screen, selection candidates that can be selected based on a user's operation. For example, a technique is disclosed in which the front operation surface of a double-sided touch panel is arranged on the front side of a steering wheel and the back operation surface of the double-sided touch panel is arranged on the back side of the steering wheel (see, for example, Patent Document 1). With this technology, when it is detected that a finger has entered the back operation surface side, selection candidates that can be selected by operating the back operation surface are displayed.
Japanese Patent Application Laid-Open No. 2020-102066
However, it is required to suppress a deterioration in the usability given to the user.
Therefore, the present invention has been made in view of the above problem, and an object of the present invention is to make it possible to suppress a deterioration in the usability given to the user.
In order to solve the above problem, according to an aspect of the present invention, there is provided a control device including a control unit that controls display, by a display device, of first selection candidates respectively corresponding to a plurality of functions along a direction corresponding to a first axis, and controls display, by the display device, of second selection candidates respectively corresponding to the plurality of functions at positions moved from the positions of the first selection candidates in a direction corresponding to a predetermined direction of a second axis different from the first axis, wherein the control unit acquires, from an operation input device provided separately from the display device, a detected position of a user's finger on the first axis and a moving direction of the finger along the second axis, both detected by the operation input device, and selects, from the first selection candidates and the second selection candidates, a selection candidate corresponding to the detected position and the moving direction.
In order to solve the above problem, according to another aspect of the present invention, there is provided a control method including: controlling display, by a display device, of first selection candidates respectively corresponding to a plurality of functions along a direction corresponding to a first axis, and controlling display, by the display device, of second selection candidates respectively corresponding to the plurality of functions at positions moved from the positions of the first selection candidates in a direction corresponding to a predetermined direction of a second axis different from the first axis; acquiring, from an operation input device provided separately from the display device, a detected position of a user's finger on the first axis and a moving direction of the finger along the second axis, both detected by the operation input device; and selecting, from the first selection candidates and the second selection candidates, a selection candidate corresponding to the detected position and the moving direction.
In order to solve the above problem, according to another aspect of the present invention, there is provided a program that causes a computer to function as a control device including a control unit that controls display, by a display device, of first selection candidates respectively corresponding to a plurality of functions along a direction corresponding to a first axis, and controls display, by the display device, of second selection candidates respectively corresponding to the plurality of functions at positions moved from the positions of the first selection candidates in a direction corresponding to a predetermined direction of a second axis different from the first axis, wherein the control unit acquires, from an operation input device provided separately from the display device, a detected position of a user's finger on the first axis and a moving direction of the finger along the second axis, both detected by the operation input device, and selects, from the first selection candidates and the second selection candidates, a selection candidate corresponding to the detected position and the moving direction.
As described above, according to the present invention, it is possible to suppress a deterioration in the usability given to the user.
FIG. 1 is a diagram showing an example of the appearance of a system according to an embodiment of the present invention. FIG. 2 is a diagram for explaining an example of detection of the finger movement direction along the Y-axis. FIG. 3 is a diagram showing an example of the functional configuration of the control device. FIG. 4 is a flowchart showing an operation example of the control device according to the embodiment of the present invention.
Preferred embodiments of the present invention will be described in detail below with reference to the accompanying drawings. In the present specification and drawings, constituent elements having substantially the same functional configuration are denoted by the same reference numerals, and redundant description thereof is omitted.
<1. Embodiment>
Details of the embodiment of the present invention will be described.
(1.1. System configuration)
First, a configuration example of the system according to the embodiment of the present invention will be described with reference to FIG. 1.
FIG. 1 is a diagram showing an example of the appearance of the system according to the embodiment of the present invention. As shown in FIG. 1, a system 1 according to the embodiment of the present invention includes an operation input device 10 and a display device 20. Note that the system 1 further includes a control device 30 (FIG. 3), as will be described later.
In the embodiment of the present invention, it is mainly assumed that the system 1 is installed in a vehicle cabin. In such a case, the user using the system 1 may be a user in the vehicle (for example, a driver or a person sitting in the front passenger seat). However, the place where the system 1 is installed is not limited to the interior of the vehicle.
Also, as shown in FIG. 1, in the embodiment of the present invention, it is assumed that the operation input device 10 and the display device 20 are provided separately. More specifically, unlike a configuration in which the operation input device 10 and the display device 20 are stacked (the configuration of a so-called touch panel), a configuration is employed in which the operation input device 10 and the display device 20 are not stacked but are separated from each other.
Therefore, in the embodiment of the present invention, it is required to give the user a feeling that the operation on the operation input device 10 and the display by the display device 20 are linked.
(Display device 20)
The display device 20 displays various information under the control of the control device 30 (FIG. 3). Here, the type of the display device 20 is not limited. For example, the display device 20 may be an LCD (Liquid Crystal Display), an organic EL (Electro-Luminescence) display, or a PDP (Plasma Display Panel).
Referring to FIG. 1, an example in which icons I1 to I8 are displayed is shown. Each of the icons I1 to I8 is an example of a selection candidate. Therefore, some or all of the icons I1 to I8 displayed by the display device 20 may be replaced with selection candidates other than icons. For example, a selection candidate may be any object that is displayed so as to be visually perceived by the user.
The icons I1, I3, I5, and I7 (first selection candidates) form an upper row of icons. On the other hand, the icons I2, I4, I6, and I8 (second selection candidates) form a lower row of icons. Here, it is mainly assumed that the number of icons forming the upper row of icons is four and the number of icons forming the lower row of icons is four. However, the number of icons forming each of the upper row of icons and the lower row of icons may be any plural number.
Each of the icons I1 to I8 is an icon corresponding to a function. The icons I1, I2, I7, and I8 are icons for adjusting the detailed content of a function (hereinafter also referred to as "function adjustment icons"). Each time one of these function adjustment icons is selected, the detailed content of the function is adjusted. The icons I3 to I6 are icons for activating a function (hereinafter also referred to as "function activation icons"). Each time one of these function activation icons is selected, the function is activated or deactivated.
More specifically, the icon I1 is an icon corresponding to control for increasing the audio volume. The icon I2 is an icon corresponding to control for decreasing the audio volume. The icon I3 is an icon corresponding to a cooling function and a dehumidification function. The icon I4 is an icon corresponding to an automatic setting function of the air conditioner.
The icon I5 is an icon corresponding to a function (independent mode) of independently adjusting the set temperatures on the driver's seat side and the front passenger's seat side. When this function is stopped, operation switches to a function (interlocking mode) in which the set temperatures on the driver's seat side and the front passenger's seat side are interlocked. The icon I6 is an icon corresponding to a defroster function for removing fog from the windshield. The icon I7 is an icon corresponding to control for raising the temperature that serves as the reference for adjustment by the air conditioner. The icon I8 is an icon corresponding to control for lowering the temperature that serves as the reference for adjustment by the air conditioner.
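For reference, the icons described above can be tabulated as in the following sketch. This is only an illustrative (hypothetical) data layout; the split into "adjust" and "toggle" mirrors the function adjustment / function activation distinction in the description.

```python
# One possible (hypothetical) tabulation of the icons I1-I8 described above.
# "adjust" icons change a setting each time they are selected; "toggle" icons
# start or stop a function each time they are selected.

ICONS = {
    "I1": ("adjust", "audio volume up"),
    "I2": ("adjust", "audio volume down"),
    "I3": ("toggle", "cooling / dehumidification"),
    "I4": ("toggle", "air-conditioner auto setting"),
    "I5": ("toggle", "independent driver/passenger temperature (vs. interlocked)"),
    "I6": ("toggle", "windshield defroster"),
    "I7": ("adjust", "reference temperature up"),
    "I8": ("adjust", "reference temperature down"),
}

UPPER_ROW = ["I1", "I3", "I5", "I7"]  # first selection candidates
LOWER_ROW = ["I2", "I4", "I6", "I8"]  # second selection candidates
```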
Note that the types of functions corresponding to the icons I1 to I8 are not limited to these examples. Also, in the example shown in FIG. 1, the icons I1 to I8 are displayed superimposed on a map. However, the icons I1 to I8 may be displayed superimposed on display content other than the map. Alternatively, the icons I1 to I8 may be displayed without being superimposed on other display content.
(Operation input device 10)
The operation input device 10 includes an operation unit 110 that receives an operation input by a user's finger. The operation unit 110 has an operation surface 115 that the user touches with a finger when inputting an operation. The operation surface 115 is provided with a sensor that detects the contact of the user's finger. As an example of such a sensor, an electrostatic sensor can be used, but the type of sensor is not particularly limited.
The operation unit 110 detects the position of the user's finger on the X axis (first axis) as a detected position. More specifically, the operation unit 110 detects the contact position of the user's finger on the operation surface 115 on the X axis as the detected position.
Furthermore, the operation unit 110 detects the moving direction of the user's finger along the Y axis (second axis). More specifically, the operation unit 110 detects, as the finger movement direction, the direction in which the user moves the finger along the Y axis while touching the operation surface 115. Here, an example of detection of the finger movement direction along the Y axis will be described with reference to FIG. 2.
FIG. 2 is a diagram for explaining an example of detection of the finger movement direction along the Y axis. As shown in FIG. 2, the display device 20 has a display surface 215. A screen is displayed on the display surface 215. The operation unit 110 includes a tact switch 120a and a tact switch 120b.
For example, when the user moves the finger in the +Y direction while touching the operation surface 115, the operation unit 110 rotates so that the operation surface 115 approaches the horizontal direction (that is, counterclockwise in FIG. 2), and the tact switch 120a is pressed. The operation unit 110 detects the movement of the finger in the +Y direction as the finger movement direction by detecting the pressing of the tact switch 120a.
On the other hand, when the user moves the finger in the -Y direction while touching the operation surface 115, the operation unit 110 rotates so that the operation surface 115 approaches the vertical direction (that is, clockwise in FIG. 2), and the tact switch 120b is pressed. The operation unit 110 detects the movement of the finger in the -Y direction as the finger movement direction by detecting the pressing of the tact switch 120b.
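As a rough illustration of the tact-switch arrangement just described, the following sketch maps the two switch states to the +Y and -Y movement directions. The function name and the boolean inputs are assumptions made only for explanation.

```python
# Hypothetical sketch: deriving the finger movement direction along the Y axis
# from the two tact switches 120a/120b described above. The operation surface
# tilts with the finger, so at most one switch is pressed per gesture.

def y_movement_direction(switch_120a_pressed: bool, switch_120b_pressed: bool):
    if switch_120a_pressed:   # surface rotated toward horizontal -> finger moved in +Y
        return "+Y"
    if switch_120b_pressed:   # surface rotated toward vertical -> finger moved in -Y
        return "-Y"
    return None               # no Y movement detected yet
```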
Note that the detection of the finger movement direction along the Y axis is not limited to this example. For example, instead of the tact switch 120a and the tact switch 120b, other switches (for example, two contact rubber switches) may be used. Alternatively, a sensor capable of detecting a finger moved in the +Y direction and a sensor capable of detecting a finger moved in the -Y direction may be additionally provided. In this case, the operation unit 110 does not have to rotate with the movement of the finger. As an example of such a sensor, an electrostatic sensor can be used, but the type of sensor is not particularly limited.
Returning to FIG. 1, the description continues. The X-axis direction and the Y-axis direction are different directions. As shown in FIG. 1, the X axis and the Y axis may be orthogonal. For example, as shown in FIG. 1, the X axis may be the longitudinal direction of the operation surface 115, and the Y axis may be the lateral direction of the operation surface 115. The Z axis is orthogonal to both the X axis and the Y axis.
For example, the operation surface 115 may be a concave surface curved along the Y axis. This makes it easier for the operation surface 115 to fit the pad of the finger, so that the usability during operation by the user and the detection accuracy of the contact position of the finger on the operation surface 115 can be improved. The operation input device 10 transmits the detected position of the user's finger on the X axis to the control device 30. Furthermore, the operation input device 10 transmits the moving direction of the user's finger along the Y axis to the control device 30.
Here, as an example, it is mainly assumed that the user wants to activate the function corresponding to the icon I3 or the icon I4. At this time, as shown in FIG. 1, the user touches, with a finger, the position on the operation surface 115 corresponding to the icon I3 or the icon I4 on the X axis. Then, the user moves the finger in the +Y direction or the -Y direction along the Y axis while keeping the finger in contact with the operation surface 115.
The contact position of the finger on the operation surface 115 on the X axis is acquired as the detected position by the control device 30 via the operation input device 10. Also, the direction in which the finger moves along the Y axis while touching the operation surface 115 is acquired by the control device 30 via the operation input device 10. Note that a similar operation can be performed by the system 1 when an icon other than the icon I3 and the icon I4 is selected.
(Control device 30)
The control device 30 can be realized by a computer. The control device 30 is connected to the operation input device 10 wirelessly or by wire. Furthermore, the control device 30 is connected to the display device 20 wirelessly or by wire. Note that the control device 30 may be integrated with the operation input device 10 or the display device 20. A configuration example of the control device 30 will be described with reference to FIG. 3.
FIG. 3 is a diagram showing an example of the functional configuration of the control device 30. As shown in FIG. 3, the control device 30 includes an input unit 310, a control unit 320, a storage unit 330, and an output unit 340.
(Input unit 310)
The input unit 310 functions as an input interface with the operation input device 10. More specifically, the input unit 310 receives the input of the detected position of the user's finger on the X axis transmitted from the operation input device 10. The input unit 310 outputs the detected position of the user's finger on the X axis to the control device 30.
Furthermore, the input unit 310 receives the input of the moving direction of the user's finger along the Y axis transmitted from the operation input device 10. The input unit 310 outputs the moving direction of the user's finger along the Y axis to the control device 30.
(Control unit 320)
The control unit 320 includes a processor, and its functions can be realized by the processor executing a program stored in a memory. In this case, a computer-readable recording medium recording the program may also be provided. Alternatively, these blocks may be configured by dedicated hardware, or by a combination of multiple pieces of hardware.
For example, the control unit 320 acquires the detected position of the user's finger on the X axis output from the input unit 310. The control unit 320 outputs, to the output unit 340, a control signal for controlling display by the display device 20 based on the detected position of the user's finger on the X axis. Details of this display control executed by the control unit 320 will be described later.
Furthermore, the control unit 320 acquires the moving direction of the user's finger along the Y axis output from the input unit 310. The control unit 320 outputs, to the output unit 340, a control signal for controlling display by the display device 20 based on the moving direction of the user's finger along the Y axis. Details of this display control executed by the control unit 320 will be described later.
(Storage unit 330)
The storage unit 330 appropriately stores data necessary for computation by the processor. The storage unit 330 may be configured by a memory such as a RAM (Random Access Memory), a hard disk drive, or a flash memory.
(Output unit 340)
The output unit 340 functions as an output interface with the display device 20. More specifically, the output unit 340 outputs the control signal output from the control unit 320 to the display device 20. Thereby, the control of the display by the display device 20, which is executed by the control unit 320, can be realized.
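A skeleton of the four functional blocks described above (input unit 310, control unit 320, storage unit 330, output unit 340) might look as follows. This is an illustrative decomposition only; the class and method names, and the assumption that the input and output units are thin wrappers around duck-typed devices, are not taken from the publication.

```python
# Illustrative (hypothetical) skeleton of the functional configuration in FIG. 3:
# input unit 310, control unit 320, storage unit 330, output unit 340.

class InputUnit310:
    def __init__(self, operation_input):
        self.operation_input = operation_input        # operation input device 10

    def read_x_position(self):
        return self.operation_input.detected_x_position()

    def read_y_direction(self):
        return self.operation_input.detected_y_direction()


class OutputUnit340:
    def __init__(self, display):
        self.display = display                        # display device 20

    def send(self, control_signal):
        self.display.apply(control_signal)


class ControlUnit320:
    def __init__(self, storage, output_unit):
        self.storage = storage                        # working data (storage unit 330)
        self.output_unit = output_unit

    def handle(self, x_position, y_direction):
        # Display control and icon selection are detailed later in the description;
        # here the control signal is simply forwarded to the output unit.
        self.output_unit.send({"x": x_position, "dy": y_direction})


class ControlDevice30:
    def __init__(self, operation_input, display):
        self.input_unit = InputUnit310(operation_input)
        self.storage_unit = {}
        self.output_unit = OutputUnit340(display)
        self.control_unit = ControlUnit320(self.storage_unit, self.output_unit)

    def on_operation_event(self):
        x_position = self.input_unit.read_x_position()
        y_direction = self.input_unit.read_y_direction()
        self.control_unit.handle(x_position, y_direction)
```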
(Details of control unit 320)
Next, details of the functions of the control unit 320 will be described with reference to FIG. 1. As shown in FIG. 1, the control unit 320 controls the display device 20 so that the upper row of icons (icons I1, I3, I5, I7) is displayed by the display device 20. Furthermore, the control unit 320 controls the display device 20 so that the lower row of icons (icons I2, I4, I6, I8) is displayed by the display device 20.
More specifically, the control unit 320 controls the display device 20 so that the upper row of icons (icons I1, I3, I5, I7) is displayed along the direction corresponding to the X axis. Furthermore, the control unit 320 controls the display device 20 so that the lower row of icons (icons I2, I4, I6, I8) is displayed at a position moved from the upper row of icons in the direction corresponding to the predetermined direction of the Y axis (the -Y direction), here the downward direction of the screen displayed by the display device 20.
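The placement rule in the preceding paragraph (upper row along the direction corresponding to the X axis, lower row shifted downward, corresponding to the -Y direction) can be sketched as a simple layout computation. The pixel values and the function name below are arbitrary assumptions used only to illustrate the rule.

```python
# Hypothetical layout sketch: the upper row of icons is placed along the screen's
# left-right direction (corresponding to the X axis), and the lower row is placed
# at positions moved downward (corresponding to the predetermined -Y direction).

def layout_icon_rows(upper_row, lower_row, x0=100, y0=600, x_step=150, y_offset=80):
    positions = {}
    for slot, icon in enumerate(upper_row):
        positions[icon] = (x0 + slot * x_step, y0)             # along the X direction
    for slot, icon in enumerate(lower_row):
        positions[icon] = (x0 + slot * x_step, y0 + y_offset)  # moved downward on the screen
    return positions


# Example:
# layout_icon_rows(["I1", "I3", "I5", "I7"], ["I2", "I4", "I6", "I8"])
```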
According to this configuration, not only the upper row of icons but also the lower row of icons is displayed, so it is possible to increase the number of displayed icons.
Furthermore, the direction of each of the upper row of icons and the lower row of icons corresponds to the X axis, which is the direction in which the finger detection position changes. Therefore, it is possible to give the user the feeling that the operation on the operation input device 10 and the display by the display device 20 are linked.
Furthermore, according to this configuration, the direction from the upper icons to the lower icons corresponds to the Y axis, along which the operation unit 110 detects the finger movement direction. Therefore, it is possible to give the user the feeling that the operation on the operation input device 10 and the display by the display device 20 are linked.
Giving the user the feeling that the operation on the operation input device 10 and the display by the display device 20 are linked in this way can lead to the effect of suppressing a deterioration in the user's feeling of use.
As an example, the direction corresponding to the X axis is preferably the left-right direction of the screen, and the direction of the X axis preferably matches the left-right direction of the screen. As a result, the direction in which the finger detection position changes coincides with the direction in which the upper row of icons (or the lower row of icons) is displayed, so it is possible to give the user a stronger feeling that the operation on the operation input device 10 and the display by the display device 20 are linked. In the example shown in FIG. 1, the right direction is the +X direction and the left direction is the -X direction.
Furthermore, the direction corresponding to the Y axis is preferably the vertical direction of the screen, and the direction of the Y axis is preferably the depth direction as seen from the user. As a result, the direction toward the upper icon or the lower icon, relative to the position sandwiched between the upper icon and the lower icon, coincides with the finger movement direction, so it is possible to give the user a stronger feeling that the operation on the operation input device 10 and the display by the display device 20 are linked. In the example shown in FIG. 1, the direction from front to back is the +Y direction, and the direction from back to front is the -Y direction.
操作面115におけるアイコンI3またはアイコンI4に対応する位置にユーザが指を接触させると、操作入力装置10によってX軸におけるユーザの指の接触位置がX軸における指の検出位置として検出され、制御部320によってX軸における指の検出位置が取得される。制御部320は、X軸における指の検出位置に応じた所定のフィードバック情報の表示装置20による表示を制御する。これによって、どのアイコンに対応する位置が検出されているかがユーザに把握され得る。 When the user touches a position corresponding to the icon I3 or the icon I4 on the operation surface 115 with a finger, the operation input device 10 detects the contact position of the user's finger on the X axis as the detected position of the finger on the X axis, and the control unit 320 acquires the detected position of the finger on the X axis. The control unit 320 controls display, by the display device 20, of predetermined feedback information according to the detected position of the finger on the X axis. This allows the user to grasp which icon corresponds to the detected position.
より詳細に、制御部320は、上段のアイコン列(アイコンI1、I3、I5、I7)のうち検出位置に対応するアイコンI3と、下段のアイコン列(アイコンI2、I4、I6、I8)のうち検出位置に対応するアイコンI4とに挟まれる位置に、フィードバック情報F1の表示装置20による表示を制御する。 More specifically, the control unit 320 controls display, by the display device 20, of the feedback information F1 at a position sandwiched between the icon I3 corresponding to the detected position in the upper row of icons (icons I1, I3, I5, I7) and the icon I4 corresponding to the detected position in the lower row of icons (icons I2, I4, I6, I8).
 なお、図1には、フィードバック情報F1の例として、フィードバック情報が表示されていない他の場所の輝度よりも高輝度な表示を行うことによって実現される場合が示されている。しかし、フィードバック情報F1は、かかる例に限定されない。例えば、フィードバック情報F1は、フィードバック情報F1が表示されていない他の場所の表示と異なる態様による何らかの表示がなされればよい。 Note that FIG. 1 shows, as an example of the feedback information F1, a case where the feedback information is displayed with a higher luminance than the luminance of other places where the feedback information is not displayed. However, the feedback information F1 is not limited to such an example. For example, the feedback information F1 may be displayed in some manner different from the display at other locations where the feedback information F1 is not displayed.
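A minimal sketch of this feedback step is given below: the detected X position is resolved to an icon column, and the feedback information F1 is anchored between that column's upper and lower icons. The surface width, icon pitch, and row heights are illustrative values, not taken from the specification.

```python
# Sketch: resolve the detected X position to an icon column and place the
# feedback information F1 between that column's upper and lower icons.

UPPER_ROW = ["I1", "I3", "I5", "I7"]
LOWER_ROW = ["I2", "I4", "I6", "I8"]

def column_from_x(detected_x, surface_width=100.0):
    """Map the detected X position on the operation surface to a column index."""
    column_width = surface_width / len(UPPER_ROW)
    index = int(detected_x // column_width)
    return min(max(index, 0), len(UPPER_ROW) - 1)   # clamp to a valid column

def feedback_anchor(column, x_step=160, upper_row_y=100, lower_row_y=180):
    """Screen point, sandwiched between the column's icons, where F1 is drawn."""
    x = (column + 1) * x_step
    return (x, (upper_row_y + lower_row_y) // 2)

column = column_from_x(40.0)                  # falls in the I3/I4 column
print(UPPER_ROW[column], LOWER_ROW[column])   # -> I3 I4
print(feedback_anchor(column))                # -> (320, 140)
```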
ユーザがY軸に沿って指を操作面115に接触させながら指を移動させると、操作入力装置10によってY軸に沿った指の移動方向(+Y方向または-Y方向)が検出され、制御部320によってY軸に沿った指の移動方向が取得される。制御部320は、アイコンI1~I8から検出位置と移動方向とに応じたアイコンを選択する。ここでは、アイコンI3またはアイコンI4が選択される。 When the user moves a finger along the Y axis while keeping the finger in contact with the operation surface 115, the operation input device 10 detects the moving direction of the finger along the Y axis (+Y direction or -Y direction), and the control unit 320 acquires the moving direction of the finger along the Y axis. The control unit 320 selects, from the icons I1 to I8, an icon according to the detected position and the moving direction. Here, the icon I3 or the icon I4 is selected.
例えば、制御部320は、指の移動方向がY軸の所定の方向(-Y方向)と逆方向(+Y方向)である場合に、上段のアイコン列(アイコンI1、I3、I5、I7)のうち検出位置に対応するアイコンI3を選択する。一方、制御部320は、指の移動方向がY軸の所定の方向(-Y方向)である場合に、下段のアイコン列(アイコンI2、I4、I6、I8)のうち検出位置に対応するアイコンI4を選択する。 For example, when the moving direction of the finger is the direction (+Y direction) opposite to the predetermined direction of the Y axis (-Y direction), the control unit 320 selects the icon I3 corresponding to the detected position from the upper row of icons (icons I1, I3, I5, I7). On the other hand, when the moving direction of the finger is the predetermined direction of the Y axis (-Y direction), the control unit 320 selects the icon I4 corresponding to the detected position from the lower row of icons (icons I2, I4, I6, I8).
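The selection rule in the two preceding paragraphs reduces to a small decision function, sketched below; the direction labels and function name are assumptions of this sketch, and the column index 1 corresponds to the I3/I4 pair used in the example.

```python
# Sketch of the selection rule: +Y movement picks from the upper row,
# -Y movement (the predetermined direction) picks from the lower row.

UPPER_ROW = ["I1", "I3", "I5", "I7"]   # chosen when the finger moves in +Y
LOWER_ROW = ["I2", "I4", "I6", "I8"]   # chosen when the finger moves in -Y

def select_icon(column, move_direction):
    """Return the icon for the detected column given the Y movement direction."""
    if move_direction == "+Y":
        return UPPER_ROW[column]
    if move_direction == "-Y":
        return LOWER_ROW[column]
    raise ValueError(f"unexpected movement direction: {move_direction}")

assert select_icon(1, "+Y") == "I3"   # example from the text
assert select_icon(1, "-Y") == "I4"
```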
 制御部320は、選択したアイコンI3またはアイコンI4に対応する機能を起動させる。例えば、制御部320は、アイコンI3を選択した場合、冷房機能および除湿機能が停止している場合には、冷房機能および除湿機能を起動させる。一方、制御部320は、アイコンI3を選択した場合、冷房機能および除湿機能が起動している場合には、冷房機能および除湿機能を停止させる。 The control unit 320 activates the function corresponding to the selected icon I3 or icon I4. For example, when the icon I3 is selected, the control unit 320 activates the cooling function and the dehumidifying function if the cooling function and the dehumidifying function are stopped. On the other hand, when the icon I3 is selected, the control unit 320 stops the cooling function and the dehumidifying function if the cooling function and the dehumidifying function are activated.
 例えば、制御部320は、アイコンI4を選択した場合、エアーコンディショナーのオート設定機能が停止している場合には、エアーコンディショナーのオート設定機能を起動させる。一方、制御部320は、アイコンI4を選択した場合、エアーコンディショナーのオート設定機能が起動している場合には、エアーコンディショナーのオート設定機能を停止させる。 For example, when the icon I4 is selected, the control unit 320 activates the automatic setting function of the air conditioner if the automatic setting function of the air conditioner is stopped. On the other hand, when the icon I4 is selected, the control unit 320 stops the automatic setting function of the air conditioner when the automatic setting function of the air conditioner is activated.
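The start/stop behaviour described for icons I3 and I4 amounts to toggling a function's state; a minimal sketch follows, with the function identifiers being assumptions of this sketch rather than names defined in the specification.

```python
# Sketch: selecting an icon starts its function if stopped, stops it if running.

function_active = {
    "cooling_and_dehumidifying": False,   # assumed id for the icon I3 function
    "air_conditioner_auto": False,        # assumed id for the icon I4 function
}

def toggle(function_id):
    """Flip the running state of the function and report the new state."""
    function_active[function_id] = not function_active[function_id]
    state = "started" if function_active[function_id] else "stopped"
    print(f"{function_id}: {state}")

toggle("cooling_and_dehumidifying")   # -> cooling_and_dehumidifying: started
toggle("cooling_and_dehumidifying")   # -> cooling_and_dehumidifying: stopped
```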
 以上、本発明の実施形態に係るシステム1の構成例について説明した。 The configuration example of the system 1 according to the embodiment of the present invention has been described above.
 (1.2.制御装置の動作)
 続いて、図4を参照しながら(適宜図1~図3も参照しながら)、本発明の実施形態に係る制御装置30の動作例について説明する。
(1.2. Operation of the control device)
Next, an operation example of the control device 30 according to the embodiment of the present invention will be described with reference to FIG. 4 (also referring to FIGS. 1 to 3 as appropriate).
図4は、本発明の実施形態に係る制御装置30の動作例を示すフローチャートである。まず、制御部320は、上段のアイコン列(アイコンI1、I3、I5、I7)が、X軸に対応する方向に沿って表示されるように表示装置20を制御する。さらに、制御部320は、下段のアイコン列(アイコンI2、I4、I6、I8)が、上段のアイコン列から、表示装置20によって表示される画面の下方向に移動した位置に表示されるように表示装置20を制御する。 FIG. 4 is a flowchart showing an operation example of the control device 30 according to the embodiment of the present invention. First, the control unit 320 controls the display device 20 so that the upper row of icons (icons I1, I3, I5, I7) is displayed along the direction corresponding to the X axis. Furthermore, the control unit 320 controls the display device 20 so that the lower row of icons (icons I2, I4, I6, I8) is displayed at a position shifted downward, on the screen displayed by the display device 20, from the upper row of icons.
操作面115におけるアイコンに対応する位置にユーザが指を接触させると、操作入力装置10によってX軸におけるユーザの指の接触位置がX軸における指の検出位置として検出される。そして、図4に示されるように、制御部320によってX軸における指の検出位置が取得される(S11)。 When the user touches a position corresponding to an icon on the operation surface 115 with a finger, the operation input device 10 detects the contact position of the user's finger on the X axis as the detected position of the finger on the X axis. Then, as shown in FIG. 4, the control unit 320 acquires the detected position of the finger on the X axis (S11).
制御部320は、X軸における指の検出位置に応じた所定のフィードバック情報の表示装置20による表示を制御する。より詳細に、制御部320は、上段のアイコン列(アイコンI1、I3、I5、I7)のうち検出位置に対応するアイコンと、下段のアイコン列(アイコンI2、I4、I6、I8)のうち検出位置に対応するアイコンとに挟まれる位置に、フィードバック情報の表示装置20による表示を制御する。 The control unit 320 controls display, by the display device 20, of predetermined feedback information according to the detected position of the finger on the X axis. More specifically, the control unit 320 controls display, by the display device 20, of the feedback information at a position sandwiched between the icon corresponding to the detected position in the upper row of icons (icons I1, I3, I5, I7) and the icon corresponding to the detected position in the lower row of icons (icons I2, I4, I6, I8).
ユーザがY軸に沿って指を操作面115に接触させながら指を移動させると、操作入力装置10によってY軸に沿った指の移動方向(+Y方向または-Y方向)が検出され、制御部320によってY軸に沿った指の移動方向が取得される。制御部320は、アイコンI1~I8から検出位置と移動方向とに応じたアイコンを選択する。 When the user moves a finger along the Y axis while keeping the finger in contact with the operation surface 115, the operation input device 10 detects the moving direction of the finger along the Y axis (+Y direction or -Y direction), and the control unit 320 acquires the moving direction of the finger along the Y axis. The control unit 320 selects, from the icons I1 to I8, an icon according to the detected position and the moving direction.
例えば、制御部320は、指の移動方向が+Y方向である場合に(S13において「+Y方向」)、+Y方向に対応するアイコン列(アイコンI1、I3、I5、I7)のうち検出位置に対応するアイコンを選択する(S14)。一方、制御部320は、指の移動方向が-Y方向である場合に、-Y方向に対応するアイコン列(アイコンI2、I4、I6、I8)のうち検出位置に対応するアイコンを選択する(S15)。 For example, when the moving direction of the finger is the +Y direction ("+Y direction" in S13), the control unit 320 selects the icon corresponding to the detected position from the icon row (icons I1, I3, I5, I7) corresponding to the +Y direction (S14). On the other hand, when the moving direction of the finger is the -Y direction, the control unit 320 selects the icon corresponding to the detected position from the icon row (icons I2, I4, I6, I8) corresponding to the -Y direction (S15).
制御部320は、選択したアイコンに対応する機能に対応する制御を行う。例えば、制御部320は、選択したアイコンが機能調整アイコンである場合には、機能の詳細内容を調整する制御を行う。あるいは、制御部320は、選択したアイコンが機能起動アイコンである場合には、機能を起動または停止させる。 The control unit 320 performs control corresponding to the function of the selected icon. For example, when the selected icon is a function adjustment icon, the control unit 320 performs control to adjust the detailed content of the function. Alternatively, when the selected icon is a function activation icon, the control unit 320 starts or stops the function.
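Putting the steps of FIG. 4 together, the following sketch runs through S11 and S13 to S15 for a single touch event; the input arguments and numeric values are assumptions of this sketch, since the specification does not define concrete data structures.

```python
# End-to-end sketch of the flow in FIG. 4 for one touch event.

UPPER_ROW = ["I1", "I3", "I5", "I7"]
LOWER_ROW = ["I2", "I4", "I6", "I8"]

def handle_touch(detected_x, move_direction, surface_width=100.0):
    # S11: acquire the detected X position and resolve the icon column.
    column_width = surface_width / len(UPPER_ROW)
    column = min(int(detected_x // column_width), len(UPPER_ROW) - 1)
    # (Feedback information would be displayed between the column's icons here.)
    # S13: branch on the Y movement direction acquired from the input device.
    if move_direction == "+Y":
        icon = UPPER_ROW[column]   # S14: pick from the row linked to +Y
    else:
        icon = LOWER_ROW[column]   # S15: pick from the row linked to -Y
    # Control corresponding to the selected icon's function would follow here.
    return icon

print(handle_touch(detected_x=30.0, move_direction="+Y"))   # -> I3
print(handle_touch(detected_x=30.0, move_direction="-Y"))   # -> I4
```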
 以上、本発明の実施形態に係る制御装置30の動作例について説明した。 The operation example of the control device 30 according to the embodiment of the present invention has been described above.
 (1.3.効果)
 続いて、本発明の実施形態が奏する効果について説明する。上記したように、制御部320は、上段のアイコン列(アイコンI1、I3、I5、I7)が、X軸に対応する方向に沿って表示されるように表示装置20を制御する。
(1.3. Effect)
Next, the effects of the embodiment of the present invention will be described. As described above, the control unit 320 controls the display device 20 so that the upper row of icons (icons I1, I3, I5, I7) is displayed along the direction corresponding to the X axis.
さらに、制御部320は、下段のアイコン列(アイコンI2、I4、I6、I8)が、上段のアイコン列から、Y軸の所定の方向(-Y方向)に対応する方向(ここでは、表示装置20によって表示される画面の下方向)に移動した位置に表示されるように表示装置20を制御する。 Furthermore, the control unit 320 controls the display device 20 so that the lower row of icons (icons I2, I4, I6, I8) is displayed at a position shifted from the upper row of icons in the direction corresponding to the predetermined direction of the Y axis (the -Y direction), which here is the downward direction of the screen displayed by the display device 20.
 かかる構成によれば、上段のアイコン列だけではなく、下段のアイコン列も表示されるようになるため、表示されるアイコンの数を増加させることが可能となる。 According to this configuration, not only the upper row of icons but also the lower row of icons are displayed, so it is possible to increase the number of displayed icons.
 さらに、上段のアイコン列および下段のアイコン列それぞれの方向は、指の検出位置が変化する方向であるX軸に対応している。そのため、操作入力装置10への操作と表示装置20による表示とが連携している感覚をユーザに与えることが可能となる。 Furthermore, the direction of each of the upper icon row and the lower icon row corresponds to the X axis, which is the direction in which the finger detection position changes. Therefore, it is possible to give the user the feeling that the operation on the operation input device 10 and the display by the display device 20 are linked.
 さらに、かかる構成によれば、上段のアイコンから下段のアイコンへの方向が、操作部110によって検出される指の移動方向であるY軸に対応している。そのため、操作入力装置10への操作と表示装置20による表示とが連携している感覚をユーザに与えることが可能となる。 Furthermore, according to this configuration, the direction from the upper icon to the lower icon corresponds to the Y-axis, which is the finger movement direction detected by the operation unit 110 . Therefore, it is possible to give the user the feeling that the operation on the operation input device 10 and the display by the display device 20 are linked.
 このように、操作入力装置10への操作と表示装置20による表示とが連携している感覚をユーザに与えることは、ユーザの使用感の低下を抑制するという効果を奏することにつながり得る。 Giving the user the feeling that the operation on the operation input device 10 and the display by the display device 20 are linked in this way can lead to the effect of suppressing the deterioration of the user's feeling of use.
 <2.補足>
 以上、添付図面を参照しながら本発明の好適な実施形態について詳細に説明したが、本発明はかかる例に限定されない。本発明の属する技術の分野における通常の知識を有する者であれば、請求の範囲に記載された技術的思想の範疇内において、各種の変更例または修正例に想到し得ることは明らかであり、これらについても、当然に本発明の技術的範囲に属するものと了解される。
<2. Supplement>
Although the preferred embodiment of the present invention has been described in detail above with reference to the accompanying drawings, the present invention is not limited to such an example. It is obvious that a person having ordinary knowledge in the technical field to which the present invention belongs can conceive of various changes or modifications within the scope of the technical idea described in the claims, and it is understood that these also naturally belong to the technical scope of the present invention.
 10:操作入力装置、110:操作部、115:操作面、120:タクトスイッチ、20:表示装置、215:表示面、30:制御装置、310:入力部、320:制御部、330:記憶部、340:出力部 10: operation input device, 110: operation unit, 115: operation surface, 120: tact switch, 20: display device, 215: display surface, 30: control device, 310: input unit, 320: control unit, 330: storage unit , 340: output unit

Claims (9)

  1.  第1の軸に対応する方向に沿って複数の機能それぞれに対応する第1の選択候補の表示装置による表示を制御するとともに、前記第1の選択候補の位置から前記第1の軸とは異なる第2の軸の所定の方向に対応する方向に移動した位置に、複数の機能それぞれに対応する第2の選択候補の前記表示装置による表示を制御する制御部を備え、
     前記制御部は、前記表示装置とは別個に設けられた操作入力装置によって検出された、前記第1の軸におけるユーザの指の検出位置と、前記第2の軸に沿った前記指の移動方向とを、前記操作入力装置から取得し、前記第1の選択候補および前記第2の選択候補から前記検出位置と前記移動方向とに応じた選択候補を選択する、
     制御装置。
    A control device comprising:
    a control unit configured to control display, by a display device, of first selection candidates corresponding to each of a plurality of functions along a direction corresponding to a first axis, and to control display, by the display device, of second selection candidates corresponding to each of the plurality of functions at a position moved from a position of the first selection candidates in a direction corresponding to a predetermined direction of a second axis different from the first axis,
    wherein the control unit acquires, from an operation input device provided separately from the display device, a detected position of a user's finger on the first axis and a moving direction of the finger along the second axis, both detected by the operation input device, and selects, from the first selection candidates and the second selection candidates, a selection candidate according to the detected position and the moving direction.
  2.  前記制御部は、前記検出位置に応じた所定のフィードバック情報の前記表示装置による表示を制御する、
     請求項1に記載の制御装置。
    The control device according to claim 1, wherein the control unit controls display, by the display device, of predetermined feedback information according to the detected position.
  3.  前記制御部は、前記第1の選択候補のうち前記検出位置に対応する選択候補と、前記第2の選択候補のうち前記検出位置に対応する選択候補とに挟まれる位置に、前記フィードバック情報の前記表示装置による表示を制御する、
     請求項2に記載の制御装置。
    The control device according to claim 2, wherein the control unit controls display, by the display device, of the feedback information at a position sandwiched between the selection candidate corresponding to the detected position among the first selection candidates and the selection candidate corresponding to the detected position among the second selection candidates.
  4.  前記制御部は、前記移動方向が前記所定の方向と逆方向である場合に、前記第1の選択候補のうち前記検出位置に対応する選択候補を選択する、
     請求項1に記載の制御装置。
    The control device according to claim 1, wherein the control unit selects the selection candidate corresponding to the detected position from among the first selection candidates when the moving direction is the direction opposite to the predetermined direction.
  5.  前記制御部は、前記移動方向が前記所定の方向である場合に、前記第2の選択候補のうち前記検出位置に対応する選択候補を選択する、
     請求項1に記載の制御装置。
    The control device according to claim 1, wherein the control unit selects the selection candidate corresponding to the detected position from among the second selection candidates when the moving direction is the predetermined direction.
  6.  前記制御部は、選択した選択候補に対応する機能を起動させる、
     請求項1に記載の制御装置。
    The control device according to claim 1, wherein the control unit activates a function corresponding to the selected selection candidate.
  7.  前記制御部は、選択した選択候補に対応する機能の詳細内容を調整する、
     請求項1に記載の制御装置。
    The control device according to claim 1, wherein the control unit adjusts detailed content of a function corresponding to the selected selection candidate.
  8.  第1の軸に対応する方向に沿って複数の機能それぞれに対応する第1の選択候補の表示装置による表示を制御するとともに、前記第1の選択候補の位置から前記第1の軸とは異なる第2の軸の所定の方向に対応する方向に移動した位置に、複数の機能それぞれに対応する第2の選択候補の前記表示装置による表示を制御することと、
     前記表示装置とは別個に設けられた操作入力装置によって検出された、前記第1の軸におけるユーザの指の検出位置と、前記第2の軸に沿った前記指の移動方向とを、前記操作入力装置から取得することと、
     前記第1の選択候補および前記第2の選択候補から前記検出位置と前記移動方向とに応じた選択候補を選択することと、
     を含む、制御方法。
    A control method comprising:
    controlling display, by a display device, of first selection candidates corresponding to each of a plurality of functions along a direction corresponding to a first axis, and controlling display, by the display device, of second selection candidates corresponding to each of the plurality of functions at a position moved from a position of the first selection candidates in a direction corresponding to a predetermined direction of a second axis different from the first axis;
    acquiring, from an operation input device provided separately from the display device, a detected position of a user's finger on the first axis and a moving direction of the finger along the second axis, both detected by the operation input device; and
    selecting, from the first selection candidates and the second selection candidates, a selection candidate according to the detected position and the moving direction.
  9.  コンピュータを、
     第1の軸に対応する方向に沿って複数の機能それぞれに対応する第1の選択候補の表示装置による表示を制御するとともに、前記第1の選択候補の位置から前記第1の軸とは異なる第2の軸の所定の方向に対応する方向に移動した位置に、複数の機能それぞれに対応する第2の選択候補の前記表示装置による表示を制御する制御部を備え、
     前記制御部は、前記表示装置とは別個に設けられた操作入力装置によって検出された、前記第1の軸におけるユーザの指の検出位置と、前記第2の軸に沿った前記指の移動方向とを、前記操作入力装置から取得し、前記第1の選択候補および前記第2の選択候補から前記検出位置と前記移動方向とに応じた選択候補を選択する、
     制御装置として機能させるプログラム。
    A program for causing a computer to function as a control device comprising:
    a control unit configured to control display, by a display device, of first selection candidates corresponding to each of a plurality of functions along a direction corresponding to a first axis, and to control display, by the display device, of second selection candidates corresponding to each of the plurality of functions at a position moved from a position of the first selection candidates in a direction corresponding to a predetermined direction of a second axis different from the first axis,
    wherein the control unit acquires, from an operation input device provided separately from the display device, a detected position of a user's finger on the first axis and a moving direction of the finger along the second axis, both detected by the operation input device, and selects, from the first selection candidates and the second selection candidates, a selection candidate according to the detected position and the moving direction.
PCT/JP2022/039665 2021-12-01 2022-10-25 Control device, control method, and program WO2023100537A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021195108A JP2023081430A (en) 2021-12-01 2021-12-01 Control device, control method, and program
JP2021-195108 2021-12-01

Publications (1)

Publication Number Publication Date
WO2023100537A1 true WO2023100537A1 (en) 2023-06-08

Family

ID=86611910

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/039665 WO2023100537A1 (en) 2021-12-01 2022-10-25 Control device, control method, and program

Country Status (2)

Country Link
JP (1) JP2023081430A (en)
WO (1) WO2023100537A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080186287A1 (en) * 2007-02-05 2008-08-07 Nokia Corporation User input device
WO2015155955A1 (en) * 2014-04-10 2015-10-15 株式会社デンソー Input device for vehicle
JP2021054391A (en) * 2019-09-27 2021-04-08 株式会社東海理化電機製作所 Operating device
JP2021152832A (en) * 2020-03-25 2021-09-30 パナソニックIpマネジメント株式会社 Control apparatus, control system, control method, and program

Also Published As

Publication number Publication date
JP2023081430A (en) 2023-06-13

Similar Documents

Publication Publication Date Title
US10936108B2 (en) Method and apparatus for inputting data with two types of input and haptic feedback
US7747961B2 (en) Display device, user interface, and method for providing menus
JP4960127B2 (en) Operation device
US8384666B2 (en) Input device for operating in-vehicle apparatus
US11625145B2 (en) Automotive touchscreen with simulated texture for the visually impaired
US8723821B2 (en) Electronic apparatus and input control method
US10661682B2 (en) Vehicle seat haptic system and method
WO2009128148A1 (en) Remote control device
JP6653489B2 (en) Input device and input method
JP5778904B2 (en) Touch input device
KR102179289B1 (en) Large Display Interaction System and Method of Autonomous Vehicles
JP2008059796A (en) Operating system and operating apparatus
JP6896416B2 (en) In-vehicle system
CN111032414A (en) Control system for main display of autonomous vehicle
US8610668B2 (en) Computer keyboard with input device
JP2017045330A (en) Input device and on-vehicle device
KR20170029180A (en) Vehicle, and control method for the same
JP5852592B2 (en) Touch operation type input device
JP2009286175A (en) Display device for vehicle
JP2011162007A (en) Vehicle seat adjusting device
WO2023100537A1 (en) Control device, control method, and program
JP6473610B2 (en) Operating device and operating system
WO2023100584A1 (en) Control device, control method, and program
JP7378902B2 (en) Operation control device
JP2018128968A (en) Input device for vehicle and control method for input device for vehicle

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22900970

Country of ref document: EP

Kind code of ref document: A1