WO2014103221A1 - Operation device - Google Patents

Operation device

Info

Publication number
WO2014103221A1
Authority
WO
WIPO (PCT)
Prior art keywords
space
distance
operating
reaction force
finger
Prior art date
Application number
PCT/JP2013/007315
Other languages
French (fr)
Japanese (ja)
Inventor
江波 和也
健一 竹中
Original Assignee
株式会社デンソー
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社デンソー filed Critical 株式会社デンソー
Publication of WO2014103221A1 publication Critical patent/WO2014103221A1/en

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03547Touch pads, in which fingers can move on a surface
    • B60K35/10
    • B60K35/22
    • B60K35/53
    • B60K2360/1438
    • B60K2360/146
    • B60K2360/1468

Definitions

  • This disclosure relates to an operation device that operates an image portion displayed on a display screen by an input to an operation surface.
  • Patent Document 1 discloses a technique for moving an image portion such as a navigation pointer and a radio main screen displayed on a display screen in association with an operation performed on a remote touchpad portion.
  • The user interface device disclosed in Patent Document 1 includes a remote touchpad unit that detects an operation of moving the operator's finger or the like, and a control unit that associates the finger operation detected by the remote touchpad unit with movement of a map, a pointer, and the like.
  • In addition, the control unit obtains the distance from the remote touchpad unit to the finger.
  • the control unit can distinguish between a case where the distance to the finger is less than 3 centimeters (cm) and a case where the distance to the finger is within a range of 5 cm to 7 cm, for example.
  • In the configuration of Patent Document 1, when performing the switching operation described above, whether or not the finger is positioned within the range of 5 cm to 7 cm from the detection means such as the remote touchpad unit could only be judged by the operator's intuition and familiarity. Because the distance from the detection means to the finger is thus difficult to grasp, erroneous operations may be induced.
  • The present disclosure has been made in view of the above problems, and an object thereof is to provide an operation device capable of reducing erroneous operations even in a configuration in which operation spaces are distinguished by the distance from the detection means.
  • The operation device according to a first aspect of the present disclosure operates an image portion displayed on the display screen when an input by the operating body is made on the operation surface. It includes: detection means provided on the back side of the operation surface for detecting movement of the operating body; acquisition means for acquiring an operating body distance from the detection means to the operating body; discrimination means for distinguishing between movement of the operating body detected in a first operation space in which the operating body distance is less than a predetermined threshold distance and movement of the operating body detected in a second operation space in which the operating body distance exceeds the threshold distance; and a moving mechanism that, by the pressing force applied from the operating body to the operation surface, moves the operation surface positioned in the second operation space into the first operation space.
  • According to another aspect of the present disclosure, the operation device operates an image portion displayed on the display screen when an input by the operating body is made on the operation surface, and includes detection means provided on the back side of the operation surface, a moving mechanism that moves the detection means together with the operation surface, acquisition means for acquiring the facing surface distance, and discrimination means that uses a predefined threshold distance, which the facing surface distance crosses when the pressing force is applied, to distinguish the movement of the operating body.
  • When the pressing force applied from the operating body to the operation surface is weak, the operation surface or the detection means is located in the second operation space. When a stronger pressing force is applied, the operation surface or the detection means moves from the second operation space to the first operation space by the action of the moving mechanism.
  • the operation surface moves from the second operation space to the first operation space according to the strength of the pressing force applied from the operation body.
  • FIG. 5 is a diagram for explaining the configuration of the remote operation device according to the first embodiment, and is a cross-sectional view taken along line V-V in FIG. 4.
  • FIG. 6 is a view for explaining the shape of the elastic member, and is a cross-sectional view taken along line VI-VI in FIG. 5. FIG. 7 is a diagram showing the correlation between the deformation amount of the elastic member and its restoring force.
  • the remote operation device 100 is mounted on a vehicle and constitutes a display system 10 together with a navigation device 50 and the like as shown in FIG.
  • the remote operation device 100 is installed at a position adjacent to the palm rest 39 at the center console of the vehicle, and exposes the operation surface 70 in a range that can be easily reached by the operator.
  • An operation with an index finger (hereinafter simply referred to as “finger”) F or the like of the operator's hand is input to the operation surface 70.
  • the navigation device 50 is installed in the instrument panel of the vehicle in a posture in which the display screen 52 is exposed and viewed from the operator and the display screen 52 faces the driver's seat.
  • Various display images 60 are displayed on the display screen 52.
  • the display image 60 shown in FIG. 3 is one of a plurality of display images displayed on the display screen 52, and shows an air conditioning menu 60b for operating the air conditioning equipment mounted on the vehicle.
  • the display image 60 includes a plurality of icons 63 associated with a specific function, a focus 62 for selecting the icons 63, a background portion 64 serving as a background of the icons 63 and the focus 62, and the like.
  • the position where the focus 62 is displayed on the display screen 52 corresponds to the position where the finger F touches on the operation surface 70 shown in FIG.
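  • As a purely illustrative sketch that is not part of the original disclosure, this correspondence between the touch position on the operation surface 70 and the focus position on the display screen 52 could be expressed as a simple proportional mapping; the surface dimensions and screen resolution below are assumptions.

```python
# Hypothetical sketch: map a touch position on the operation surface to a
# focus position on the display screen by simple proportional scaling.
# All dimensions are illustrative, not taken from the patent.

SURFACE_W_MM, SURFACE_H_MM = 80.0, 60.0   # assumed operation-surface size
SCREEN_W_PX, SCREEN_H_PX = 800, 480       # assumed display resolution

def surface_to_screen(x_mm: float, y_mm: float) -> tuple[int, int]:
    """Convert finger coordinates on the operation surface (mm) to pixel
    coordinates of the focus on the display screen."""
    sx = int(round(x_mm / SURFACE_W_MM * (SCREEN_W_PX - 1)))
    sy = int(round(y_mm / SURFACE_H_MM * (SCREEN_H_PX - 1)))
    # Clamp so the focus never leaves the screen.
    return max(0, min(SCREEN_W_PX - 1, sx)), max(0, min(SCREEN_H_PX - 1, sy))

# Example: a touch at the centre of the surface lands near the screen centre.
print(surface_to_screen(40.0, 30.0))   # -> (400, 240) approximately
```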
  • the display image 60 described above is generated when the navigation device 50 superimposes a plurality of drawing layers.
  • the remote operation device 100 is connected to a controller area network (CAN, registered trademark) bus 90, an external battery 95, and the like.
  • the CAN bus 90 is a transmission path used for data transmission between in-vehicle devices in an in-vehicle communication network formed by connecting a plurality of in-vehicle devices mounted on a vehicle.
  • the remote operation device 100 is capable of CAN communication with the navigation device 50 located remotely via the CAN bus 90.
  • the remote operation device 100 includes power interfaces 21 and 22, a communication control unit 23, a communication interface 24, a detection unit 31, an operation control unit 33, and the like.
  • Each power interface 21, 22 stabilizes the power supplied from the battery 95 and supplies it to the operation control unit 33.
  • One power interface 21 is always supplied with power from the battery 95.
  • the other power interface 22 is supplied with electric power from the battery 95 when the switch 93 is energized when the accessory (ACC) power source of the vehicle is turned on.
  • the communication control unit 23 and the communication interface 24 are configured to output information processed by the operation control unit 33 to the CAN bus 90 and to acquire information output to the CAN bus 90 from other in-vehicle devices.
  • the communication control unit 23 and the communication interface 24 are connected to each other by a transmission signal line TX and a reception signal line RX.
  • the detection unit 31 includes a touch sensor 31a, a low-pass filter 31b, and an electrostatic detection IC 31c.
  • The touch sensor 31a is provided on the back side of the operation surface 70, is formed in a rectangular shape along the operation surface 70, and accumulates electric charge between itself and the finger F.
  • the touch sensor 31a is formed by arranging electrodes extending along the x-axis direction and electrodes extending along the y-axis direction in FIG. 4 in a grid pattern.
  • the low-pass filter 31b in FIG. 1 is a circuit formed by combining a passive resistor, a coil, a capacitor, and the like.
  • the low-pass filter 31b suppresses a high-frequency noise component generated in the touch sensor 31a from being input to the electrostatic detection IC 31c.
  • the electrostatic detection IC 31 c is connected to the touch sensor 31 a and the operation control unit 33. As shown in FIG. 5, electric charges are stored between the finger F and the touch sensor 31a that are close to each other.
  • The electrostatic detection IC 31c in FIG. 1 acquires, based on the output of each electrode, a sensitivity value (see FIG. 10) that increases or decreases according to the capacitance between the finger F (see FIG. 5) and each electrode, and outputs it to the operation control unit 33.
  • the operation control unit 33 includes a processor that performs various arithmetic processes, a RAM that functions as a work area for the arithmetic processes, and a flash memory that stores programs used for the arithmetic processes.
  • the operation control unit 33 is connected to the power supply interfaces 21 and 22, the communication control unit 23, the detection unit 31, and the like.
  • the operation control unit 33 includes an acquisition block 34 and a determination block 35 and an association block 36, which will be described in detail later, as functional blocks by executing a predetermined program.
  • the acquisition block 34 acquires the sensitivity value output from the detection unit 31.
  • The acquisition block 34 detects, by calculation processing based on the sensitivity values, an x coordinate and a y coordinate indicating the relative position of the finger F (see FIG. 5) with respect to the operation surface 70, and a z coordinate corresponding to the distance from the touch sensor 31a to the finger F (hereinafter referred to as the “operating body distance d”; see A in FIG. 8).
  • the operation control unit 33 outputs the x coordinate and the y coordinate indicating the relative position of the finger F to the CAN bus 90 through the communication control unit 23 and the communication interface 24.
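  • A rough sketch of this coordinate calculation is given below. It is an assumption for illustration only, not the actual implementation of the acquisition block 34: the x and y indices are taken from the electrode position with the largest sensitivity value, and the z coordinate is interpolated from an invented sensitivity-to-distance calibration table.

```python
# Hypothetical sketch of the acquisition-block computation: the grid of
# sensitivity values reported by the electrostatic detection IC is reduced
# to an (x, y) position and an operating-body distance d (z coordinate).
# The sensitivity-to-distance table below is invented for illustration.

from bisect import bisect_left

# Calibration points: (sensitivity value, operating-body distance d in mm),
# sorted by ascending sensitivity. Higher sensitivity means a closer finger.
SENSITIVITY_TO_DISTANCE = [(40, 50.0), (80, 20.0), (120, 10.0), (200, 1.0)]

def interpolate_distance(sensitivity: float) -> float:
    """Linearly interpolate the operating-body distance from the table."""
    pts = SENSITIVITY_TO_DISTANCE
    if sensitivity <= pts[0][0]:
        return pts[0][1]
    if sensitivity >= pts[-1][0]:
        return pts[-1][1]
    i = bisect_left([s for s, _ in pts], sensitivity)
    (s0, d0), (s1, d1) = pts[i - 1], pts[i]
    return d0 + (d1 - d0) * (sensitivity - s0) / (s1 - s0)

def locate_finger(sensitivity_grid: list[list[float]]):
    """Return (x_index, y_index, d_mm) from a 2-D grid of sensitivity values
    indexed as sensitivity_grid[y][x]."""
    peak, x, y = max((v, x, y) for y, row in enumerate(sensitivity_grid)
                               for x, v in enumerate(row))
    return x, y, interpolate_distance(peak)
```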
  • the navigation apparatus 50 shown in FIGS. 1 and 2 is connected to a CAN bus 90 so that it can communicate with the remote operation device 100 and the like.
  • the navigation device 50 includes a display control unit 53 and a liquid crystal display 51.
  • the display control unit 53 includes a processor that performs various types of arithmetic processing, a RAM that functions as a work area for arithmetic processing, a graphic processor that performs image drawing processing, a graphic RAM that functions as a work area for drawing processing, and the like.
  • The display control unit 53 also has a flash memory for storing data used for arithmetic processing and drawing processing, a communication interface connected to the CAN bus 90, and a video output interface for outputting drawn image data to the liquid crystal display 51.
  • the display control unit 53 draws a display image 60 to be displayed on the display screen 52 based on information acquired from the CAN bus 90. Then, the display control unit 53 sequentially outputs the image data of the drawn display image 60 to the liquid crystal display 51 through the video output interface.
  • the liquid crystal display 51 is a dot matrix type display that realizes color display by controlling a plurality of pixels arranged on the display screen 52.
  • the liquid crystal display 51 displays video by continuously forming image data sequentially acquired from the display control unit 53 on the display screen 52.
  • the lifting mechanism 40 includes a movable support member 41, a fixed support member 44, a first link 47 and a second link 48, an elastic member 80, and the like.
  • the movable support member 41 is formed in a rectangular plate shape, and is movable in the z-axis direction while maintaining a posture substantially parallel to the touch sensor 31a.
  • the sheet member 71 is affixed to the front surface located on the side away from the touch sensor 31a among both plate surfaces of the movable support member 41.
  • connection portions 42a and 42b are provided on the back surface located on the side close to the touch sensor 31a.
  • One set of connection portions 42 a and 42 b is provided on both edge portions of the movable support member 41 in the x-axis direction, and is erected from the back surface of the movable support member 41 toward the fixed support member 44.
  • the connecting portions 42a and 42b are positioned so as to be shifted from each other in the x-axis direction.
  • The fixed support member 44 is formed in a plate shape and is fixed to the center console of the vehicle. Of the two plate surfaces of the fixed support member 44, the touch sensor 31a is affixed to the back surface. Further, of the two plate surfaces of the fixed support member 44, connection portions 45a and 45b are provided on the front surface facing the movable support member 41. One set of connection portions 45a and 45b is provided on each side of the elastic member 80 in the x-axis direction, and is erected from the front surface of the fixed support member 44 toward the movable support member 41. The connection portions 45a and 45b are positioned so as to be offset from each other in the x-axis direction.
  • the first link 47 and the second link 48 are formed in a longitudinal shape by a metal member extending in a band shape, and one set is provided on each side of the elastic member 80 in the x-axis direction. Each longitudinal direction of the first link 47 and the second link 48 is oriented in the y-axis direction.
  • the first link 47 and the second link 48 are attached to the movable support member 41 and the fixed support member 44 in a posture in which the respective longitudinal directions are inclined with respect to the y-axis direction.
  • the first link 47 and the second link 48 are shifted from each other in the x-axis direction. Both end portions of the first link 47 and the second link 48 are connected to one of the movable support member 41 and the fixed support member 44 by hinges.
  • one end 47a of both ends of the first link 47 is rotatably attached to the connecting portion 42a of the movable support member 41, and the other 47b of both ends of the first link 47 is fixedly supported. It is rotatably attached to the connecting portion 45b of the member 44.
  • One end 48a of both end portions of the second link 48 is rotatably attached to the connection portion 45a of the fixed support member 44, and the other end 48b of both end portions of the second link 48 is rotatably attached to the connection portion 42b of the movable support member 41.
  • the hinge is allowed to slide in the y-axis direction.
  • the elastic member 80 shown in FIGS. 5 and 6 is formed in a container shape by a material such as rubber that is easily elastically deformed.
  • the elastic member 80 is placed at the center of the fixed support member 44 and is sandwiched between the movable support member 41 and the fixed support member 44.
  • the elastic member 80 is formed with a large diameter portion 81, a small diameter portion 83, a reduced diameter portion 82, and a stopper 85.
  • the large-diameter portion 81 is formed in a cylindrical shape, and is placed on the front surface of the fixed support member 44 in a posture in which the axial direction is aligned with the z-axis direction.
  • the small diameter portion 83 is formed in a cylindrical shape having a smaller outer diameter than the large diameter portion 81.
  • the small diameter portion 83 is in contact with the back surface of the movable support member 41 in a posture in which the axial direction is along the z-axis direction.
  • the small diameter portion 83 is positioned on the same axis as the large diameter portion 81.
  • the reduced diameter portion 82 is formed between the large diameter portion 81 and the small diameter portion 83.
  • The outer diameter of the reduced diameter portion 82 gradually decreases from the large diameter portion 81 toward the small diameter portion 83 along the z-axis direction.
  • the stopper 85 protrudes in a columnar shape from the center of the top wall portion 84 of the elastic member 80 toward the fixed support member 44.
  • the elastic member 80 described above is compressed in the z-axis direction, so that the large diameter portion 81 and the small diameter portion 83 are crushed. At this time, the restoring force in the z-axis direction of the elastic member 80 gradually increases as the deformation amount of the elastic member 80 increases (see FIG. 7). When the elastic member 80 is further compressed, the small diameter portion 83 sinks to the inner peripheral side of the large diameter portion 81. At this time, the restoring force of the elastic member 80 temporarily decreases (see FIG. 7). When the elastic member 80 is further compressed, the restoring force in the axial direction of the elastic member 80 increases again (see FIG. 7). Then, when the stopper 85 comes into contact with the front surface of the fixed support member 44, the compression in the z-axis direction is restricted.
  • When a pressing force Fp stronger than the restoring force of the elastic member 80 is applied from the finger F to the operation surface 70, the operation surface 70 moves downward toward the fixed support member 44 integrally with the movable support member 41. As a result, the distance between the touch sensor 31a and the operation surface 70, and hence the operating body distance d between the touch sensor 31a and the finger F, is shortened.
  • When the pressing force Fp applied to the operation surface 70 disappears, the operation surface 70 rises integrally with the movable support member 41 by the restoring force of the elastic member 80. As a result, the operating body distance d is extended.
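  • The push-in behaviour described above (reaction force rising, briefly falling as the small diameter portion collapses, rising again, then hard-limited by the stopper) can be illustrated with the following piecewise sketch; the stroke break points and force values are invented purely for illustration.

```python
# Hypothetical reaction-force profile of the elastic member 80 versus
# push-in stroke. Break points and force values are invented; only the
# shape (increase, temporary decrease, increase, hard stop) follows the
# description in the text.

def reaction_force(stroke_mm: float) -> float:
    """Return an illustrative reaction force Fre (N) for a push-in stroke."""
    stroke_mm = max(0.0, min(stroke_mm, 4.0))  # the stopper 85 limits compression (~4 mm assumed)
    if stroke_mm <= 1.5:                       # reaction force increasing section Zin
        return 2.0 * stroke_mm
    if stroke_mm <= 2.5:                       # reaction force decreasing section Zde
        return 3.0 - 1.5 * (stroke_mm - 1.5)   # small diameter portion buckles inward
    return 1.5 + 3.0 * (stroke_mm - 2.5)       # force rises again until the stopper hits

for s in (0.5, 1.5, 2.0, 3.0):
    print(f"stroke {s:.1f} mm -> Fre {reaction_force(s):.2f} N")
```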
  • The operation mode is switched according to the operating body distance d of the finger F that inputs the movement operation.
  • the determination block 35 shown in FIG. 1 distinguishes the detected movement of the finger F according to the z coordinate acquired by the acquisition block 34.
  • the association block 36 changes the image portion associated with the movement operation of the finger F on the display screen 52 shown in FIG.
  • the following operation modes (1) to (3) preliminarily defined in the remote operation device 100 will be described in detail.
  • the shallow portion operation mode is set when the finger F is located in the second operation space Sp2.
  • the second operation space Sp2 is a space in which the operating body distance d is not less than the first threshold distance Dth1 and less than the second threshold distance Dth2.
  • the first threshold distance Dth1 is set to be shorter than the distance between the touch sensor 31a and the operation surface 70 of the movable support member 41 in the state farthest from the fixed support member 44 (hereinafter referred to as the “uppermost position”).
  • the second threshold distance Dth2 is set to be, for example, about 0.5 to 1 cm longer than the distance from the touch sensor 31a to the operation surface 70 of the movable support member 41 at the uppermost position.
  • the movement operation for moving the finger F along the xy plane in the second operation space Sp2 is defined as “shallow operation”. Therefore, the movement operation that traces the operation surface 70 that has been raised to the uppermost position is a shallow operation.
  • the deep operation mode is set when the finger F is located in the first operation space Sp1.
  • the first operation space Sp1 is a space in which the operating body distance d is less than the first threshold distance Dth1.
  • The first threshold distance Dth1 is set to be, for example, about 0.5 to 1 cm longer than the distance between the operation surface 70 of the movable support member 41 and the touch sensor 31a in the state closest to the fixed support member 44 (hereinafter referred to as the “lowermost position”).
  • a virtual boundary surface BP between the first operation space Sp1 and the second operation space Sp2 (see A in FIG. 8) is defined.
  • the movement operation for moving the finger F along the xy plane in the first operation space Sp1 is defined as “deep part operation”. Therefore, the moving operation of tracing the operation surface 70 lowered to the lowest position is a deep operation.
  • Non-proximity mode: In the non-proximity mode, the moving operation with the finger F is not associated with any image portion of the display screen 52.
  • the non-proximity mode is set when the finger F is not positioned in either the first operation space Sp1 or the second operation space Sp2 (see A in FIG. 8).
  • the space excluding the first operation space Sp1 and the second operation space Sp2 is set as a non-proximity space.
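  • Put differently, the three operation modes follow purely from the operating body distance d and the two threshold distances. The sketch below makes that selection explicit; the threshold values are placeholders rather than the values actually tuned to the stroke of the lifting mechanism.

```python
# Hypothetical sketch of mode selection from the operating-body distance d.
# The threshold values are placeholders for the real Dth1 and Dth2.

DTH1_MM = 12.0   # first threshold distance Dth1 (assumed)
DTH2_MM = 20.0   # second threshold distance Dth2 (assumed, Dth1 < Dth2)

def operation_mode(d_mm: float) -> str:
    """Classify the finger position into one of the three operation modes."""
    if d_mm < DTH1_MM:
        return "deep"            # first operation space Sp1
    if d_mm < DTH2_MM:
        return "shallow"         # second operation space Sp2
    return "non-proximity"       # outside Sp1 and Sp2

print(operation_mode(5.0))    # finger pressed close to the sensor -> "deep"
print(operation_mode(15.0))   # hovering / surface raised          -> "shallow"
print(operation_mode(40.0))   # finger withdrawn                   -> "non-proximity"
```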
  • The operator who wants to start an icon selection operation moves the finger F (see the two-dot chain line in A of FIG. 8), located farther from the touch sensor 31a than the second threshold distance Dth2, toward the operation surface 70.
  • While the operation mode is the non-proximity mode, the operation with the finger F is not associated with any image portion.
  • When the finger F is moved from the non-proximity space into the second operation space Sp2 and the operator inputs a tap operation striking the operation surface 70, the association between the shallow portion operation by the finger F and the scroll control is started.
  • the operation mode is set to the shallow operation mode
  • the main menu 60a defined in the upper hierarchy is displayed on the display screen 52.
  • a plurality of submenu image portions 165 can be scrolled.
  • Reaction force increasing section Zin: a section in which the reaction force Fre gradually increases while the operation surface 70 moves from the uppermost position toward the lowermost position.
  • Reaction force decreasing section Zde: a section that follows the reaction force increasing section Zin in the pushing direction Dp and in which the reaction force Fre gradually decreases.
  • the first threshold distance Dth1 is adjusted in advance so that the operating surface 70 that descends passes through the boundary surface BP in the reaction force decrease zone Zde. Therefore, the operation surface 70 passes through the boundary surface BP so as to be interlocked with the timing at which the operation surface 70 is pulled downward due to the decrease in the reaction force Fre.
  • the finger F moves from the second operation space Sp2 (see A in FIG. 8) to the first operation space Sp1.
  • the operation mode of the remote operation device 100 is switched from the shallow operation mode to the deep operation mode.
  • the movement operation of the finger F is related to the focus control.
  • the submenu image portion 165 displayed in the central portion of the display screen 52 is displayed on the entire display screen 52 as the air conditioning menu 60b (see also B in FIG. 8).
  • the air conditioning menu 60b is a submenu defined in the lower hierarchy of the main menu 60a.
  • the operator can superimpose the focus 62 on the arbitrary icon 63 by inputting a deep operation of tracing the operation surface 70 with the finger F. Then, the operator weakens the pressing force Fp applied to the operation surface 70 in a state where the focus 62 is superimposed on the arbitrary icon 63, and moves the finger F away from the touch sensor 31a. Thus, the operator can select an arbitrary icon 63.
  • The operator who has completed the selection operation of the icon 63 further moves the finger F away from the operation surface 70 into the non-proximity space. As a result, the operation mode is switched to the non-proximity mode. Then, when a predetermined threshold time Tth (see FIG. 9) elapses after the operation mode is switched to the non-proximity mode, the remote operation device 100 enters a state of waiting for the next icon selection operation by the operator.
  • The operation mode selection processing performed by the operation control unit 33 to realize the icon selection operation described above will now be described in detail based on FIG. 9, with reference also to FIG. 1.
  • the operation mode selection process shown in FIG. 9 is started by the operation control unit 33 when the ACC power supply of the vehicle is turned on.
  • In S101, the presence or absence of a tap operation on the operation surface 70 at the uppermost position is determined based on the change in the output acquired from the touch sensor 31a. If it is determined in S101 that there is no tap operation, the standby state of the remote operation device 100 is maintained by repeating the determination of S101. On the other hand, if it is determined in S101 that there is a tap operation, the process proceeds to S102.
  • In S102, an acquisition process for acquiring the sensitivity values detected by the electrodes of the touch sensor 31a is performed, and the process proceeds to S103.
  • In S103, calculation processing is performed to obtain the x, y, and z coordinates (hereinafter referred to as “input position coordinates”) indicating the three-dimensional position of the finger F with respect to the touch sensor 31a from the sensitivity values acquired in S102, and the process proceeds to S104.
  • the sensitivity value is a value that increases as the capacitance stored between the operation surface 70 and the finger F increases. Therefore, each coordinate in the x-axis direction and the y-axis direction where the sensitivity value is maximum indicates the relative position of the finger F on the operation surface 70.
  • The sensitivity value increases as the operating body distance d (see A in FIG. 8) decreases, and decreases as the operating body distance d increases. Therefore, the maximum sensitivity value corresponds to the operating body distance d, and thus to the coordinate in the z-axis direction.
  • the operation control unit 33 stores in advance a table indicating a first sensitivity threshold value Hth1 corresponding to the first threshold distance Dth1 and a second sensitivity threshold value Hth2 corresponding to the second threshold distance Dth2. In the processing after S104 shown in FIG. 9, the operation control unit 33 performs processing for comparing the maximum sensitivity value acquired in S103 with the sensitivity threshold values Hth1 and Hth2.
  • In S104, it is determined whether or not the finger F is in the first operation space Sp1 based on whether or not the sensitivity value is greater than or equal to the first sensitivity threshold Hth1. If an affirmative determination is made in S104, the process proceeds to S105. In S105, the operation mode is set to the deep operation mode, and the process returns to S102.
  • In S106, which is reached when a negative determination is made in S104, it is determined whether or not the finger F is in the second operation space Sp2 based on whether or not the sensitivity value is greater than or equal to the second sensitivity threshold Hth2 and less than the first sensitivity threshold Hth1. If an affirmative determination is made in S106, the process proceeds to S107. In S107, the operation mode is set to the shallow operation mode, and the process returns to S102.
  • When a negative determination is made in S106, the process proceeds to S108, where the operation mode is set to the non-proximity mode and counting of the elapsed time t is started. In S109, it is determined whether or not the elapsed time t whose counting was started in the current or a previous S108 is equal to or greater than a predetermined threshold time Tth. If an affirmative determination is made in S109, the process returns to S101, whereby the remote operation device 100 shifts to the standby state. On the other hand, if a negative determination is made in S109, the process returns to S102.
  • the operation control unit 33 that has changed the operation mode in S105, S107, and S108 described above outputs a command signal notifying the change of the operation mode to the CAN bus 90 through the communication control unit 23 and the communication interface 24.
  • the display control unit 53 that has acquired the command signal activates the drawing layer corresponding to each operation mode based on the signal.
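  • A compact, hypothetical reconstruction of the S101 to S109 flow of FIG. 9 is sketched below. The sensor and CAN-bus objects are stand-ins with invented interfaces, and the threshold constants are assumed; the sketch only restates the control flow described above, not the actual firmware.

```python
# Hypothetical reconstruction of the S101-S109 flow of FIG. 9.
# `sensor` and `can_bus` stand in for the detection unit 31 and the
# communication interface; their APIs are invented for this sketch.

import time

HTH1 = 120.0   # first sensitivity threshold Hth1  (corresponds to Dth1; assumed value)
HTH2 = 80.0    # second sensitivity threshold Hth2 (corresponds to Dth2; assumed value)
TTH_S = 5.0    # threshold time Tth before returning to standby (assumed value)

def locate_peak(grid):
    """Return the (x, y) electrode indices and value of the largest sensitivity."""
    peak, x, y = max((v, x, y) for y, row in enumerate(grid)
                               for x, v in enumerate(row))
    return x, y, peak

def select_operation_mode(sensor, can_bus):
    while True:
        # S101: remain in standby until a tap on the uppermost operation surface is detected.
        while not sensor.tap_detected():
            time.sleep(0.01)

        mode, t_start = None, None
        while True:
            grid = sensor.read_sensitivity_grid()          # S102: acquire sensitivity values
            x, y, peak = locate_peak(grid)                 # S103: input position coordinates

            if peak >= HTH1:                               # S104 -> S105: deep operation mode
                new_mode, t_start = "deep", None
            elif peak >= HTH2:                             # S106 -> S107: shallow operation mode
                new_mode, t_start = "shallow", None
            else:                                          # S108: non-proximity mode, start timing
                new_mode = "non-proximity"
                if t_start is None:
                    t_start = time.monotonic()
                if time.monotonic() - t_start >= TTH_S:    # S109: threshold time Tth elapsed
                    break                                  # back to standby (S101)

            if new_mode != mode:                           # S105/S107/S108: mode changed
                can_bus.send_mode_change(new_mode)         # command signal onto the CAN bus
                mode = new_mode
```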
  • When the operation control unit 33 sets the operation mode to the deep operation mode, the display control unit 53 selects the focus layer as the active drawing layer. As a result, the operation control unit 33 can associate the deep operation with the finger F with the focus 62 and change the display mode of the focus 62.
  • When the operation control unit 33 sets the operation mode to the shallow operation mode, the display control unit 53 selects the submenu layer, on which the plurality of submenu image portions 165 are drawn, as the active drawing layer. As a result, the operation control unit 33 can associate the shallow portion operation with the finger F with the submenu image portions 165 and change their display mode.
  • When the operation control unit 33 sets the operation mode to the non-proximity mode, the display control unit 53 sets the active drawing layer to “none”. Therefore, the movement operation of the finger F is not associated with any image portion.
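  • In other words, the operation mode selects which drawing layer receives the finger movement. A minimal sketch of that dispatch, with invented layer names and a stand-in display object, might look as follows.

```python
# Hypothetical sketch: the display control unit picks the active drawing
# layer from the operation mode reported over the CAN bus. Layer names
# are placeholders for the focus layer and submenu layer of the text.

ACTIVE_LAYER_BY_MODE = {
    "deep": "focus_layer",        # deep operation moves the focus 62
    "shallow": "submenu_layer",   # shallow operation scrolls the submenu image portions 165
    "non-proximity": None,        # finger movement is associated with no image portion
}

def on_mode_change(mode: str, display) -> None:
    """Activate the drawing layer that corresponds to the reported operation mode."""
    display.set_active_layer(ACTIVE_LAYER_BY_MODE.get(mode))
```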
  • the operation surface 70 moves from the second operation space Sp2 to the first operation space Sp1 according to the strength of the pressing force Fp applied from the finger F.
  • The operator can grasp in which operation space the finger F is currently located based on the operational feeling of whether or not the operation surface 70 is pushed down. Therefore, even though the operation space is distinguished by the operating body distance d as described above and the image portion to be operated is switched accordingly, erroneous operations can be reduced.
  • The reaction force Fre that resists the pressing force Fp is generated by the restoring force of the elastic member 80. Therefore, even when the finger F moves in a direction away from the touch sensor 31a, the operation surface 70 can move following the finger F. By maintaining the contact state between the finger F and the operation surface 70 in this way, the operator can reliably keep track of the operational feeling of whether or not the operation surface 70 has been pushed. Therefore, the effect of reducing erroneous operations is exhibited with higher certainty.
  • The reaction force Fre turns from increasing to decreasing at the transition from the reaction force increasing section Zin to the reaction force decreasing section Zde.
  • Since a click feeling is transmitted to the finger by this change in the reaction force Fre, the operational feeling of whether or not the operation surface 70 has been pushed becomes even clearer.
  • the operation space in which the finger F is positioned can be more easily grasped, so that the effect of reducing erroneous operations can be reliably exhibited.
  • In addition, the operation surface 70 passes through the boundary surface BP in conjunction with this click feeling.
  • Since the click feeling transmitted to the finger F and the switching of the operation space are linked, the operator can more easily grasp the operation space in which the finger F is located. Therefore, the effect of reducing erroneous operations is exhibited more reliably.
  • With the moving mechanism of the first embodiment, even when the operation surface 70 is moved in the pushing direction Dp, the operation surface 70 hardly moves in the direction along the xy plane. Therefore, a situation in which the finger F dragged on the operation surface 70 moves in that plane direction during the pushing operation and causes movement of an image portion not intended by the operator can be avoided. Accordingly, erroneous operations caused by a configuration that allows the operation surface 70 to be raised and lowered can be reduced.
  • In the first embodiment, the display on the display screen 52 is constructed in a hierarchical structure. The submenu image portion 165 of the main menu 60a, which belongs to the upper hierarchy, is associated with the shallow portion operation in the second operation space Sp2, and the focus 62 of the air conditioning menu 60b, which belongs to the lower hierarchy, is associated with the deep operation in the first operation space Sp1. Accordingly, the upper-lower logical correlation in the hierarchical structure and the upper-lower physical correlation of the operation spaces Sp1 and Sp2 are easily linked in the operator's mind. Therefore, erroneous operations are more easily reduced by the association described above.
  • In the first embodiment, the finger F corresponds to the “operation body” described in the claims, and the touch sensor 31a corresponds to the “detection means” described in the claims.
  • The detection unit 31 and the acquisition block 34 cooperate to correspond to the “acquisition means” described in the claims.
  • The determination block 35 corresponds to the “determination means” described in the claims, and the association block 36 corresponds to the “association means” described in the claims.
  • The lifting mechanism 40 corresponds to the “movement mechanism” described in the claims.
  • The focus 62 corresponds to the “first image portion” described in the claims, and the submenu image portion 165 corresponds to the “second image portion” described in the claims.
  • The elastic member 80 corresponds to the “reaction force generating portion” described in the claims.
  • The main menu 60a corresponds to the “upper selection image” described in the claims, and the air conditioning menu 60b corresponds to the “lower selection image” described in the claims.
  • The remote operation device 100 corresponds to the “operation device” described in the claims.
  • the second embodiment of the present disclosure shown in FIGS. 11 to 15 is a modification of the first embodiment.
  • the elevating mechanism 240 is accommodated in a case 275 that forms a part of the center console.
  • the touch sensor 231a of the second embodiment can be moved along the z-axis direction together with the operation surface 70 by the elevating mechanism 240.
  • Case 275 is formed of a conductive resin material or the like.
  • the case 275 has a peripheral wall portion 275a and a lid wall portion 275b.
  • the peripheral wall part 275a is erected so as to surround the periphery of the elevating mechanism 240 over the entire circumference.
  • the lid wall portion 275b is formed so as to cover the lifting mechanism 240 from above, and is continuous with the tip portion of the peripheral wall portion 275a in the standing direction.
  • An opening 276 for exposing the operation surface 70 is provided in the center of the lid wall portion 275b.
  • the opening 276 is formed in a rectangular shape slightly smaller than the operation surface 70.
  • The lid wall portion 275b forms a facing surface 277 in a region of its back surface, which faces the operation surface 70, located around the opening 276.
  • the facing surface 277 is formed in a rectangular frame shape along the outer edge 72 of the operation surface 70, and is arranged to face the outer edge 72.
  • the facing surface 277 can contact the outer edge 72
  • the elevating mechanism 240 holds the touch sensor 231a by the movable support member 241.
  • the touch sensor 231 a is attached to the back surface of the movable support member 241 in a posture along the operation surface 70.
  • the outer edge region 232b of the touch sensor 231a faces the facing surface 277 in the z-axis direction with the movable support member 241 and the sheet member 71 interposed therebetween.
  • The outer edge region 232b can accumulate electric charge between itself and the electrically conductive facing surface 277.
  • The lifting mechanism 240 having the above configuration lowers the touch sensor 231a together with the operation surface 70, so that the distance from the touch sensor 231a to the facing surface 277 (hereinafter referred to as the “facing surface distance df”) is increased by the pressing force Fp applied from the finger F to the operation surface 70.
  • Conversely, the lifting mechanism 240 raises the touch sensor 231a together with the operation surface 70, so that the facing surface distance df is reduced by the restoring force of the elastic member 80.
  • The acquisition block 234 of the operation control unit 33 shown in FIG. 11 detects the operating body distance d in the central region 232a of the touch sensor 231a shown in FIGS. 12 and 13 by calculation processing based on the sensitivity values.
  • the acquisition block 234 shown in FIG. 11 detects the facing surface distance df from the outer edge region 232b to the facing surface 277 by calculation processing based on the sensitivity value. As described above, the acquisition block 234 acquires both the operating body distance d and the facing surface distance df shown in FIGS. 12 and 13 in cooperation with the detection unit 31 including the touch sensor 231a.
  • the central region 232a of the touch sensor 231a is a region that overlaps the opening 276 in the z-axis direction.
  • The image portion associated with the movement operation of the finger F is changed by the association block 36 (see FIG. 11) depending on the vertical position of the movable support member 241 in the z-axis direction.
  • the operation mode of the remote operation device 200 is the shallow operation mode.
  • the space in the range where the facing surface distance df is less than the first threshold distance DFth1 is set as the second operation space Sp2.
  • the operation of the finger F tracing the operation surface 70 in the state where the touch sensor 231a is positioned in the second operation space Sp2 is a shallow operation.
  • the operation mode of the remote operation device 200 is set to the deep operation mode.
  • the space in which the facing surface distance df exceeds the second threshold distance DFth2 is defined as the first operation space Sp1.
  • With the touch sensor 231a positioned in the first operation space Sp1, the operation of the finger F tracing the operation surface 70 is a deep operation.
  • In the discrimination block 235 shown in FIG. 11, the two threshold distances DFth1 and DFth2 described above are set in order to distinguish the movement operation of the finger F in the first operation space Sp1 from the movement operation of the finger F in the second operation space Sp2 shown in FIGS. 12 and 13.
  • the first threshold distance DFth1 is set to be shorter than the second threshold distance DFth2.
  • the first threshold distance DFth1 is a threshold distance used for switching from the deep operation mode to the shallow operation mode when the movable support member 241 is lifted by the reaction force Fre of the elastic member 80 (see B in FIG. 8).
  • the second threshold distance DFth2 is a threshold distance used for switching from the shallow operation mode to the deep operation mode when the movable support member 241 is lowered by the pressing force Fp. According to the setting of the two threshold distances DFth1 and DFth2, a situation in which the switching of the operation mode frequently occurs can be avoided.
  • the boundary surface BP of the second embodiment is defined at a position separated from the facing surface 277 by the second threshold distance DFth2 so as to face the first operation space Sp1 shown in FIG.
  • The operation mode selection processing performed by the operation control unit 33 will now be described in detail based on FIG. 14, with reference also to FIGS. 11 to 13.
  • the operation mode selection process shown in FIG. 14 is started by the operation control unit 33 when the ACC power supply of the vehicle is turned on.
  • S201 and S202 are substantially the same as S101 and S102 of the first embodiment.
  • In S203, calculation processing is performed to obtain the input position coordinates of the finger F detected in the central region 232a and the facing surface distance df detected in the outer edge region 232b from the sensitivity values acquired in S202, and the process proceeds to S204.
  • In S204, it is determined whether or not the maximum sensitivity value in the central region 232a acquired in S203 is equal to or greater than a predetermined contact sensitivity threshold HthT (see FIG. 15).
  • When the maximum sensitivity value is less than the contact sensitivity threshold HthT, it is estimated that the finger F is not touching the operation surface 70; the process returns to S201 and waits until a tap operation for starting the operation is performed.
  • When the maximum sensitivity value exceeds the contact sensitivity threshold HthT, it is estimated that the finger F is touching the operation surface 70, and the process proceeds to S205.
  • In S205, it is determined whether or not the previous operation mode is the shallow operation mode and the sensitivity value of the outer edge region 232b is equal to or lower than the second sensitivity threshold HFth2 (see FIG. 15).
  • When the touch sensor 231a moves from the second operation space Sp2 to the first operation space Sp1, an affirmative determination is made in S205 and the process proceeds to S206. In S206, the operation mode is set to the deep operation mode, and the process returns to S202.
  • In S207, which is reached when a negative determination is made in S205, it is determined whether or not the previous operation mode is the deep operation mode and the sensitivity value is equal to or higher than the first sensitivity threshold HFth1 (see FIG. 15).
  • When the touch sensor 231a moves from the first operation space Sp1 to the second operation space Sp2, an affirmative determination is made in S207 and the process proceeds to S208. In S208, the operation mode is set to the shallow operation mode, and the process returns to S202.
  • When a negative determination is made in S207, the previous operation mode is maintained, and the process returns to S202.
  • the contact sensitivity threshold value HthT used in S204 is a sensitivity value for detecting the contact of the finger F with the operation surface 70.
  • the contact sensitivity threshold value HthT is a value slightly smaller than the sensitivity value detected when the finger F is in contact with the central region 232 a of the operation surface 70.
  • The first sensitivity threshold HFth1 and the second sensitivity threshold HFth2 used in S205 and S207 of FIG. 14 correspond to the sensitivity values detected in the outer edge region 232b when the facing surface distance df equals the first threshold distance DFth1 and the second threshold distance DFth2, respectively.
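  • The S201 to S208 flow of FIG. 14, including the contact check with HthT and the hysteresis provided by the two thresholds HFth1 and HFth2, could be sketched as below. The constants and the form of the sensor readings are assumptions made only for illustration, not the actual implementation.

```python
# Hypothetical reconstruction of the S201-S208 flow of FIG. 14. The mode
# only switches when the outer-edge sensitivity crosses the threshold that
# belongs to the current mode, which gives the hysteresis described above.

HTH_T = 150.0   # contact sensitivity threshold HthT (assumed value)
HFTH1 = 90.0    # first sensitivity threshold HFth1, for facing surface distance DFth1 (assumed)
HFTH2 = 60.0    # second sensitivity threshold HFth2, for facing surface distance DFth2 (assumed)

def select_mode_second_embodiment(samples, mode="shallow"):
    """samples: iterable of (centre_sensitivity, edge_sensitivity) readings.
    Yields the operation mode after each reading and stops when the finger
    leaves the operation surface (negative determination in S204)."""
    for centre, edge in samples:
        if centre < HTH_T:                         # S204: finger no longer touching
            return                                 # back to standby (S201) in the real flow
        if mode == "shallow" and edge <= HFTH2:    # S205 -> S206: surface pushed into Sp1
            mode = "deep"
        elif mode == "deep" and edge >= HFTH1:     # S207 -> S208: surface raised back into Sp2
            mode = "shallow"
        yield mode                                 # otherwise the previous mode is kept (S202)

# Example: the surface is pushed down (edge sensitivity falls), held, then released.
readings = [(200, 95), (200, 55), (200, 70), (200, 95), (120, 95)]
print(list(select_mode_second_embodiment(readings)))
# -> ['shallow', 'deep', 'deep', 'shallow']  (the last reading ends the sequence)
```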
  • Also in the second embodiment described so far, as in the first embodiment, the operator can grasp in which operation space the finger F is positioned, so that erroneous operations can be reduced.
  • In addition, the operation control unit 33 can calculate the facing surface distance df based on the detection results at a plurality of locations in the outer edge region 232b. Since the detection accuracy of the facing surface distance df is thereby kept high, the operation mode switching based on the operation of pushing the operation surface 70 can be performed accurately.
  • the touch sensor 231a moves together with the operation surface 70, so that the touch sensor 231a can detect a moving operation by the finger F at a position close to the finger F. According to the above, the touch sensor 231a can easily detect a detailed movement operation by the finger F. Therefore, the moving operation with the finger F is accurately reflected in the movement of the image portion displayed on the display screen 52.
  • the elevating mechanism 240 corresponds to the “movement mechanism” described in the claims
  • the touch sensor 231a corresponds to the “detection means” described in the claims
  • The detection unit 31 and the acquisition block 234 cooperate to correspond to the “acquisition means” described in the claims.
  • the determination block 235 corresponds to “determination means” described in the claims
  • the remote operation device 200 corresponds to “operation device” described in the claims.
  • the third embodiment of the present disclosure shown in FIGS. 16 to 18 is another modification of the first embodiment.
  • the configuration of the lifting mechanism 340 is different from the configuration of the lifting mechanism 40 (see FIG. 5) in the first embodiment.
  • Regarding the lifting mechanism 340, the structure that differs from the lifting mechanism 40 of the first embodiment will be described in detail.
  • a pair of the first link 347 and the second link 348 of the elevating mechanism 340 are connected to each other by a central hinge 349 at the center portion thereof.
  • the first link 347 and the second link 348 are rotatable with respect to each other by a central hinge 349.
  • the end portion 347b of the first link 347 is connected to the connection portion 345b of the fixed support member 44 by a hinge.
  • the end portion 347b is allowed to slide the hinge in the y-axis direction at the connection portion with the connection portion 345b.
  • the end portion 348b of the second link 348 is connected to the connection portion 342b of the movable support member 41 by a hinge.
  • the end portion 348b is allowed to slide the hinge in the y-axis direction at the connection portion with the connection portion 342b. According to the connection configuration of the links 347 and 348 described above, the operation surface 70 moves only in the z-axis direction substantially orthogonal to the operation surface 70 when a push operation is performed.
  • the lifting mechanism 340 has five elastic members 380.
  • Each elastic member 380 is formed with a large diameter portion 381, a small diameter portion 383, a reduced diameter portion 382, and a stopper (not shown) that are substantially the same as those of the elastic member 80 (see FIG. 6) according to the first embodiment.
  • One of the plurality of elastic members 380 is placed at the center of the fixed support member 44.
  • the other four are arranged close to the four corners of the fixed support member 44 so as to surround the elastic member 380 placed at the center of the fixed support member 44.
  • When the operation surface 70 of the above lifting mechanism 340 is at the uppermost position, it protrudes upward with respect to the surrounding surface 375 (see A in FIG. 18) surrounding the movable support member 41.
  • the operation surface 70 is lowered toward the fixed support member 44 integrally with the movable support member 41 by applying a pressing force Fp stronger than the restoring force by the plurality of elastic members 380 from the finger F.
  • the operation surface 70 that has been moved to the lowest position by such a pushing operation is in a state where there is substantially no step with the surrounding surface 375 (see B in FIG. 18).
  • When the pressing force Fp applied to the operation surface 70 disappears, the operation surface 70 rises integrally with the movable support member 41 by the restoring force of the plurality of elastic members 380.
  • the icon selection operation input to the remote operation device 300 described so far will be described with reference to FIG.
  • the display image 60 displayed on the display screen 52 is a navigation image 360a showing a route to the destination set by the operator.
  • the navigation image 360a includes a plurality of icons 63 associated with the destination, a pointer 362 for selecting the icon 63, a map 364 indicating the form of the road around the vehicle, and the like.
  • the navigation image 360a includes a focus 62 that emphasizes the icon 63 on which the pointer 362 is superimposed.
  • the remote operation device 300 switches the operation mode to the shallow operation mode.
  • the shallow portion operation by the operator is associated with the map 364 displayed on the display screen 52. Therefore, the operator scrolls the map 364 up, down, left, and right by a shallow portion operation that traces the uppermost operation surface 70, and moves an arbitrary icon 63 arranged on the map 364 to the center of the display screen 52. Can do.
  • the operation surface 70 is lowered to the lowest position by the pushing operation in the pushing direction Dp.
  • the operation mode is switched to the deep operation mode.
  • the pointer 362 is displayed on the display screen 52.
  • the deep operation by the operator is associated with the pointer 362 displayed on the display screen 52. Therefore, the operator can superimpose the pointer 362 on the arbitrary icon 63 by deep operation of tracing the operation surface 70 at the lowest position. Under such a state, an arbitrary icon 63 is selected by weakening the pressing force Fp applied to the operation surface 70 and separating the finger F from the touch sensor 31a.
  • the plurality of elastic members 380 generate a restoring force at different locations. Therefore, even if the pressing force Fp is applied to a position deviated from the center of the operation surface 70, the elevating mechanism 340 stably and reliably generates the reaction force Fre and follows the operation surface 70 as the finger F moves. Can be made. By maintaining the contact state between the finger F and the operation surface 70 in this way, the operator can reliably keep track of the operational sensation as to whether or not the operation surface 70 has been pushed. Therefore, the effect of reducing erroneous operations is exhibited with higher certainty.
  • the operation surface 70 moves substantially only in the z-axis direction and does not move substantially in the direction along the xy plane during the pushing operation. Therefore, a situation in which the finger F dragged on the operation surface 70 moves in the above-described plane direction and causes the movement of the image part not intended by the operator can be avoided. Therefore, erroneous operation caused by the configuration that allows the operation surface 70 to be raised and lowered can be reduced with the above-described configuration.
  • the elevating mechanism 340 corresponds to the “movement mechanism” recited in the claims
  • each elastic member 380 corresponds to the “reaction force generating portion” recited in the claims
  • the operation device 300 corresponds to an “operation device” recited in the claims.
  • the lifting mechanisms having different configurations are used, but the mechanism for making the operation surface movable is not limited to the support mechanism (so-called pantograph) of the above embodiment.
  • the arrangement of the links in the elevating mechanism, the connection structure at both ends, and the like may be changed as appropriate.
  • a rail mechanism that allows the movable support member to slide along a guide rail extending along the z-axis direction, for example, may be provided. Further, movement of the operation surface in the xy plane direction accompanying the pressing operation may be allowed.
  • a rubber elastic member (so-called rubber dome) is provided as a configuration for generating a reaction force.
  • the material, hardness, shape, and the like of the elastic member may be changed as appropriate so that a light click feeling is produced.
  • an elastic member made of a resin material such as urethane may generate a reaction force.
  • reaction force may be generated by sandwiching elements such as a coil spring and a leaf spring between the movable support member and the fixed support member.
  • Alternatively, the reaction force may be generated by the bending of the links that support the movable support member.
  • the click feeling is produced by continuously providing the reaction force increasing section Zin and the reaction force decreasing section Zde.
  • a click feeling need not be produced.
  • hysteresis is provided for switching the operation mode by using two threshold distances, the first threshold distance DFth1 and the second threshold distance DFth2.
  • hysteresis may be provided for each threshold distance.
  • the first operation space Sp1 is expanded by extending the first threshold distance Dth1.
  • the second operation space Sp2 is expanded by extending the second threshold distance Dth2.
  • the facing surface is formed in a frame shape that surrounds the periphery of the opening.
  • the facing surface only needs to face at least part of the touch sensor, and the shape can be changed as appropriate.
  • The electrical conductivity required of the facing surface may be ensured by attaching a plate-shaped metal member or the like to the case.
  • a display using a plasma display panel, a display using an organic EL, and the like form a “display screen”.
  • A windshield or a combiner provided on the top surface of the instrument panel may be used as the “display screen”, and an image may be projected onto the “display screen” by a projection unit such as a projector.
  • A display device that projects an image onto a windshield, a combiner, or the like can also be included in the navigation device as a configuration that forms the display screen.
  • a push button for selecting an arbitrary icon is provided in the vicinity of the operation surface. The operator can select the function of the icon by pressing the push button while the pointer and focus are superimposed on an arbitrary icon.
  • The image portions exemplified as the “first image portion” and the “second image portion” may be changed as appropriate.
  • the function provided by the operation control unit 33 that executes the program may be provided by hardware and software different from the above-described control device, or a combination thereof.
  • a function such as “association means” may be provided by a circuit that performs a predetermined function without depending on a program.
  • Whether each sensitivity threshold serving as a boundary value is itself included may be changed as appropriate. Specifically, in S205 of the second embodiment, switching to the deep operation mode may be performed when the sensitivity value is less than the second sensitivity threshold HFth2. Similarly, in S207, switching to the shallow portion operation mode may be performed when the sensitivity value exceeds the first sensitivity threshold HFth1.
  • the present disclosure is applied to a remote operation device used in a display system mounted on a vehicle.
  • the present disclosure can also be applied to a so-called touch panel type operation device configured integrally with a display screen.
  • the operation device to which the present disclosure is applied can be applied not only to the vehicle but also to all display systems used for various transportation devices and various information terminals.

Abstract

This operation device operates an image section when an input is made with respect to an operation surface (70) by an operating body (F), said image section being displayed on a display screen (52). The operation device is provided with: a detecting means (31a), which is provided on the rear side of the operation surface, and which detects a move of the operating body; acquiring means (31, 34), which acquire an operating body distance (d) between the detecting means and the operating body; a discriminating means (35), which discriminates from each other an operating body move detected in a first operation space (Sp1) where the operating body distance is shorter than a predetermined threshold distance (Dth1), and an operating body move detected in a second operation space (Sp2) where the operating body distance exceeds the threshold distance; and a moving mechanism (40, 340), which moves the operation surface positioned in the second operation space to the inside of the first operation space by means of a pressing force (Fp) applied to the operation surface from the operating body.

Description

Cross-reference of related applications
This disclosure is based on Japanese Patent Application No. 2012-280406 filed on December 24, 2012 and Japanese Patent Application No. 2013-181483 filed on September 2, 2013, the contents of which are incorporated herein by reference.
This disclosure relates to an operation device that operates an image portion displayed on a display screen by an input to an operation surface.
Conventionally, Patent Document 1, for example, discloses a technique for moving image portions such as a navigation pointer and a radio main screen displayed on a display screen in association with operations performed on a remote touchpad unit. The user interface device disclosed in Patent Document 1 includes a remote touchpad unit that detects an operation of moving an operator's finger or the like, and a control unit that associates the finger operation detected by the remote touchpad unit with movement of a map, a pointer, and the like. In addition, the control unit obtains the distance from the remote touchpad unit to the finger. The control unit can thereby distinguish, for example, between a case where the distance to the finger is less than 3 centimeters (cm) and a case where the distance to the finger is within a range of 5 cm to 7 cm.
JP 2011-118857 A
In the configuration disclosed in Patent Document 1, when performing the switching operation described above, grasping whether or not the finger is positioned within the range of 5 cm to 7 cm from the detection means such as the remote touchpad unit inevitably depends on the operator's intuition and familiarity. Because the distance from the detection means to the finger is thus difficult to grasp, erroneous operations may be induced.
The present disclosure has been made in view of the above problems, and an object of the present disclosure is to provide an operation device capable of reducing erroneous operations even in a configuration in which an operation space is distinguished by the distance from a detection means.
An operation device according to a first aspect of the present disclosure operates an image portion displayed on a display screen when an input by an operating body is made on an operation surface, and includes: a detection means provided on the back side of the operation surface to detect movement of the operating body; an acquisition means that acquires an operating body distance from the detection means to the operating body; a discrimination means that distinguishes movement of the operating body detected in a first operation space, where the operating body distance is less than a predetermined threshold distance, from movement of the operating body detected in a second operation space, where the operating body distance exceeds the threshold distance; and a moving mechanism that moves the operation surface positioned in the second operation space into the first operation space by a pressing force applied to the operation surface from the operating body.
An operation device according to a second aspect of the present disclosure operates an image portion displayed on a display screen when an input by an operating body is made on an operation surface, and includes: a detection means provided on the back side of the operation surface to detect movement of the operating body along the operation surface; a facing surface arranged to face at least part of the operation surface; a moving mechanism that moves the detection means together with the operation surface so that a facing surface distance from the detection means to the facing surface increases under a pressing force applied to the operation surface from the operating body; an acquisition means that acquires the facing surface distance; and a discrimination means that distinguishes movement of the operating body detected by the detection means positioned in a first operation space, where the facing surface distance exceeds a predetermined threshold distance due to application of the pressing force, from movement of the operating body detected by the detection means positioned in a second operation space, where the facing surface distance is less than the threshold distance due to release of the pressing force.
According to these aspects, when the pressing force applied from the operating body to the operation surface is weak, the operation surface or the detection means is located in the second operation space. When the pressing force applied from the operating body to the operation surface is increased, the operation surface or the detection means moves from the second operation space into the first operation space by the action of the moving mechanism. In this way, the operation surface moves from the second operation space to the first operation space in accordance with the strength of the pressing force applied by the operating body. With such a configuration, the operator can grasp which operation space the operating body is currently located in based on the operational feeling of whether or not the operation surface is being pushed in. Therefore, erroneous operations can be reduced even in a configuration in which the operation space is distinguished by the operating body distance from the detection means to the operating body or by the facing surface distance from the detection means to the facing surface.
The above and other objects, features, and advantages of the present disclosure will become more apparent from the following detailed description with reference to the accompanying drawings, in which:
FIG. 1 is a diagram illustrating the configuration of a display system including a remote operation device according to a first embodiment of the present disclosure;
FIG. 2 is a diagram illustrating the arrangement of a display screen and an operation surface in a vehicle cabin;
FIG. 3 is a diagram illustrating an example of a display image shown on the display screen;
FIG. 4 is a plan view schematically showing the remote operation device according to the first embodiment;
FIG. 5 is a diagram illustrating the configuration of the remote operation device according to the first embodiment, taken along line V-V in FIG. 4;
FIG. 6 is a diagram illustrating the shape of an elastic member, taken along line VI-VI in FIG. 5;
FIG. 7 is a diagram showing the correlation between the deformation of the elastic member, which changes with the pushed displacement of the operation surface, and the restoring force of the elastic member, which produces the reaction force acting on a finger from the operation surface;
FIG. 8 is a diagram illustrating a series of icon selection operations by an operator;
FIG. 9 is a flowchart showing operation mode selection processing performed by an operation control unit in the remote operation device according to the first embodiment;
FIG. 10 is a diagram illustrating the relationship between the sensitivity values detected by a touch sensor and the operation state determined by the operation control unit in the remote operation device according to the first embodiment;
FIG. 11 is a diagram illustrating the configuration of a display system including a remote operation device according to a second embodiment;
FIG. 12 is a diagram illustrating the configuration of the remote operation device according to the second embodiment, with the operation surface at the uppermost position;
FIG. 13 is a diagram showing the operation surface at the lowermost position;
FIG. 14 is a flowchart showing operation mode selection processing performed by the operation control unit of the second embodiment;
FIG. 15 is a diagram illustrating the relationship between the sensitivity values detected by the touch sensor and the operation state determined by the operation control unit in the remote operation device according to the second embodiment;
FIG. 16 is a plan view schematically showing a remote operation device according to a third embodiment;
FIG. 17 is a diagram illustrating the configuration of the remote operation device according to the third embodiment, taken along line XVII-XVII in FIG. 16; and
FIG. 18 is a diagram illustrating a series of icon selection operations by the operator.
Hereinafter, a plurality of embodiments of the present disclosure will be described with reference to the drawings. Corresponding components in the embodiments are given the same reference numerals, and duplicate descriptions may be omitted. When only part of a configuration is described in an embodiment, the configuration of another previously described embodiment can be applied to the remaining part of that configuration. In addition to the combinations of configurations explicitly described in the embodiments, configurations of a plurality of embodiments can be partially combined even if not explicitly stated, as long as the combination poses no problem. Combinations of the configurations described in the embodiments and modifications that are not explicitly stated are also regarded as disclosed by the following description.
(First embodiment)
A remote operation device 100 according to a first embodiment of the present disclosure is mounted on a vehicle and, as shown in FIG. 1, constitutes a display system 10 together with a navigation device 50 and the like. As shown in FIG. 2, the remote operation device 100 is installed at a position adjacent to a palm rest 39 on the center console of the vehicle, and exposes an operation surface 70 within easy reach of the operator. Operations with the index finger (hereinafter simply referred to as the "finger") F or the like of the operator's hand are input to the operation surface 70. The navigation device 50 is installed in the instrument panel of the vehicle with its display screen 52 exposed so as to be visible to the operator and facing the driver's seat. Various display images 60 (see FIG. 8) are displayed on the display screen 52.
The display image 60 shown in FIG. 3 is one of a plurality of display images shown on the display screen 52, and shows an air conditioning menu 60b for operating the air conditioning equipment mounted on the vehicle. The display image 60 includes a plurality of icons 63 each associated with a specific function, a focus 62 for selecting an icon 63, and a background portion 64 serving as the background of the icons 63 and the focus 62. The position at which the focus 62 is displayed on the display screen 52 corresponds to the position at which the finger F touches the operation surface 70 shown in FIG. 2. The display image 60 is generated by the navigation device 50 superimposing a plurality of drawing layers.
Next, the configurations of the remote operation device 100 and the navigation device 50 shown in FIG. 1 will be described in detail.
The remote operation device 100 is connected to a Controller Area Network (CAN, registered trademark) bus 90, an external battery 95, and the like. The CAN bus 90 is a transmission path used for data transmission between in-vehicle devices in an in-vehicle communication network that interconnects a plurality of in-vehicle devices mounted on the vehicle. The remote operation device 100 can communicate by CAN with the remotely located navigation device 50 via the CAN bus 90.
The remote operation device 100 includes power interfaces 21 and 22, a communication control unit 23, a communication interface 24, a detection unit 31, an operation control unit 33, and the like. The power interfaces 21 and 22 stabilize the power supplied from the battery 95 and supply it to the operation control unit 33. One power interface 21 is always supplied with power from the battery 95. The other power interface 22 is supplied with power from the battery 95 when a switch 93 becomes conductive in response to the accessory (ACC) power supply of the vehicle being turned on.
The communication control unit 23 and the communication interface 24 output information processed by the operation control unit 33 to the CAN bus 90 and acquire information output to the CAN bus 90 from other in-vehicle devices. The communication control unit 23 and the communication interface 24 are connected to each other by a transmission signal line TX and a reception signal line RX.
As shown in FIGS. 1 and 2, the detection unit 31 includes a touch sensor 31a, a low-pass filter 31b, and an electrostatic detection IC 31c. The touch sensor 31a is provided on the back side of the operation surface 70, is formed in a rectangular shape along the operation surface 70, and accumulates electric charge with the finger F. The touch sensor 31a is formed by arranging electrodes extending along the x-axis direction and electrodes extending along the y-axis direction of FIG. 4 in a grid pattern. The low-pass filter 31b of FIG. 1 is a circuit composed of passive resistors, coils, capacitors, and the like, and suppresses input of high-frequency noise components generated in the touch sensor 31a to the electrostatic detection IC 31c. The electrostatic detection IC 31c is connected to the touch sensor 31a and the operation control unit 33. As shown in FIG. 5, electric charge is stored between the finger F and the touch sensor 31a when they are close to each other. The electrostatic detection IC 31c of FIG. 1 acquires, based on the output of each electrode, a sensitivity value (see FIG. 10) that increases or decreases according to the capacitance between the finger F (see FIG. 5) and that electrode, and outputs it to the operation control unit 33.
The operation control unit 33 includes a processor that performs various arithmetic operations, a RAM that functions as a work area for the arithmetic operations, and a flash memory that stores programs used for the arithmetic operations. The operation control unit 33 is connected to the power interfaces 21 and 22, the communication control unit 23, the detection unit 31, and the like.
By executing a predetermined program, the operation control unit 33 has, as functional blocks, an acquisition block 34 as well as a discrimination block 35 and an association block 36, which are described in detail later. The acquisition block 34 acquires the sensitivity values output from the detection unit 31. From the sensitivity values, the acquisition block 34 then detects, by calculation, the x and y coordinates indicating the relative position of the finger F (see FIG. 5) with respect to the operation surface 70, and the z coordinate corresponding to the distance from the touch sensor 31a to the finger F (hereinafter referred to as the "operating body distance d", see FIG. 8A). The operation control unit 33 outputs the x and y coordinates indicating the relative position of the finger F to the CAN bus 90 through the communication control unit 23 and the communication interface 24.
The navigation device 50 shown in FIGS. 1 and 2 is connected to the CAN bus 90 so that it can communicate with the remote operation device 100 and the like. The navigation device 50 includes a display control unit 53 and a liquid crystal display 51.
The display control unit 53 includes a processor that performs various arithmetic operations, a RAM that functions as a work area for the arithmetic operations, a graphics processor that performs image drawing processing, and a graphics RAM that functions as a work area for the drawing processing. The display control unit 53 further includes a flash memory that stores data used for the arithmetic and drawing processing, a communication interface connected to the CAN bus 90, and a video output interface that outputs drawn image data to the liquid crystal display 51. The display control unit 53 draws the display image 60 to be displayed on the display screen 52 based on information acquired from the CAN bus 90, and sequentially outputs the image data of the drawn display image 60 to the liquid crystal display 51 through the video output interface.
The liquid crystal display 51 is a dot-matrix display that realizes color display by controlling a plurality of pixels arranged on the display screen 52. The liquid crystal display 51 displays video by continuously forming, on the display screen 52, the image data sequentially acquired from the display control unit 53.
Next, the configuration of the lifting mechanism 40, which is provided in the remote operation device 100 and allows the operation surface 70 to move in the z-axis direction, will be described in detail with reference to FIGS. 4 to 7.
The lifting mechanism 40 shown in FIGS. 4 and 5 is provided between a sheet member 71, which forms the operation surface 70, and the touch sensor 31a. The lifting mechanism 40 includes a movable support member 41, a fixed support member 44, a first link 47 and a second link 48, an elastic member 80, and the like.
The movable support member 41 is formed in a rectangular plate shape and is movable in the z-axis direction while maintaining a posture substantially parallel to the touch sensor 31a. The sheet member 71 is affixed to the front surface of the movable support member 41, which is the plate surface located on the side away from the touch sensor 31a. Connection portions 42a and 42b are provided on the back surface of the movable support member 41, which is the plate surface located on the side close to the touch sensor 31a. One set of connection portions 42a and 42b is provided at each of the two edges of the movable support member 41 in the x-axis direction, standing from the back surface of the movable support member 41 toward the fixed support member 44. The connection portions 42a and 42b are offset from each other in the x-axis direction.
The fixed support member 44 is formed in a plate shape and is fixed to the center console of the vehicle. The touch sensor 31a is affixed to the back surface of the fixed support member 44, which is the plate surface on the touch sensor 31a side. Connection portions 45a and 45b are provided on the front surface of the fixed support member 44, which is the plate surface facing the movable support member 41. One set of connection portions 45a and 45b is provided on each side of the elastic member 80 in the x-axis direction, standing from the front surface of the fixed support member 44 toward the movable support member 41. The connection portions 45a and 45b are offset from each other in the x-axis direction.
The first link 47 and the second link 48 are each formed in an elongated shape from a strip of metal, and one set is provided on each side of the elastic member 80 in the x-axis direction. The longitudinal directions of the first link 47 and the second link 48 are oriented in the y-axis direction. The first link 47 and the second link 48 are attached to the movable support member 41 and the fixed support member 44 with their longitudinal directions inclined with respect to the y-axis direction, and are offset from each other in the x-axis direction. Each end of the first link 47 and the second link 48 is connected by a hinge to either the movable support member 41 or the fixed support member 44.
Specifically, one end 47a of the first link 47 is rotatably attached to the connection portion 42a of the movable support member 41, and the other end 47b of the first link 47 is rotatably attached to the connection portion 45b of the fixed support member 44. One end 48a of the second link 48 is rotatably attached to the connection portion 45a of the fixed support member 44, and the other end 48b of the second link 48 is rotatably attached to the connection portion 42b of the movable support member 41. At the connection between this other end 48b and the connection portion 42b, the hinge is allowed to slide in the y-axis direction.
The elastic member 80 shown in FIGS. 5 and 6 is formed in a container shape from a material such as rubber that deforms elastically with ease. The elastic member 80 is placed at the center of the fixed support member 44 and is sandwiched between the movable support member 41 and the fixed support member 44. The elastic member 80 has a large diameter portion 81, a small diameter portion 83, a reduced diameter portion 82, and a stopper 85.
The large diameter portion 81 is formed in a cylindrical shape and is placed on the front surface of the fixed support member 44 with its axial direction along the z-axis direction. The small diameter portion 83 is formed in a cylindrical shape with a smaller outer diameter than the large diameter portion 81, is in contact with the back surface of the movable support member 41 with its axial direction along the z-axis direction, and is coaxial with the large diameter portion 81. The reduced diameter portion 82 is formed between the large diameter portion 81 and the small diameter portion 83, and its outer diameter gradually decreases along the z-axis direction from the large diameter portion 81 toward the small diameter portion 83. The stopper 85 protrudes in a columnar shape from the center of a top wall portion 84 of the elastic member 80 toward the fixed support member 44.
When the elastic member 80 is compressed in the z-axis direction, the large diameter portion 81 and the small diameter portion 83 are each squashed. At this stage, the restoring force of the elastic member 80 in the z-axis direction gradually increases as the deformation of the elastic member 80 grows (see FIG. 7). When the elastic member 80 is compressed further, the small diameter portion 83 sinks into the inner peripheral side of the large diameter portion 81, and the restoring force of the elastic member 80 temporarily decreases (see FIG. 7). When the elastic member 80 is compressed still further, the restoring force of the elastic member 80 in the axial direction increases again (see FIG. 7). Compression in the z-axis direction is then limited by the stopper 85 coming into contact with the front surface of the fixed support member 44.
With the configuration of the lifting mechanism 40 described so far, as shown in FIG. 5, when a pressing force Fp stronger than the restoring force of the elastic member 80 is applied from the finger F to the operation surface 70, the operation surface 70 descends toward the fixed support member 44 together with the movable support member 41. This shortens the distance between the touch sensor 31a and the operation surface 70, and hence the operating body distance d between the touch sensor 31a and the finger F. When the pressing force Fp applied to the operation surface 70 disappears, the restoring force of the elastic member 80 raises the operation surface 70 together with the movable support member 41, which lengthens the operating body distance d.
Next, a plurality of operation modes defined in advance for the remote operation device 100 will be described in detail. In the remote operation device 100, the operation mode is switched according to the operating body distance d of the finger F that inputs a movement operation. Specifically, the discrimination block 35 shown in FIG. 1 distinguishes detected movements of the finger F according to the z coordinate acquired by the acquisition block 34. The association block 36 then changes, based on the distinction made by the discrimination block 35, the image portion associated with the movement operation of the finger F on the display screen 52 shown in FIG. 8. The following operation modes (1) to (3) defined in advance for the remote operation device 100 are described in detail below.
(1) Shallow operation mode
In the shallow operation mode, as shown in FIG. 8A, a movement operation with the finger F is associated with scroll control that moves a plurality of submenu image portions 165 on the display screen 52 in the horizontal direction (hereinafter "scrolling"). The shallow operation mode is selected when the finger F is located in the second operation space Sp2. The second operation space Sp2 is the space in which the operating body distance d is at least the first threshold distance Dth1 and less than the second threshold distance Dth2. The first threshold distance Dth1 is set shorter than the distance between the touch sensor 31a and the operation surface 70 of the movable support member 41 in its state farthest from the fixed support member 44 (hereinafter the "uppermost position"). The second threshold distance Dth2 is set longer, by about 0.5 to 1 cm for example, than the distance from the touch sensor 31a to the operation surface 70 of the movable support member 41 at the uppermost position. A movement operation in which the finger F is moved along the xy plane within the second operation space Sp2 is defined as a "shallow operation". A movement operation tracing the operation surface 70 raised to the uppermost position is therefore a shallow operation.
(2) Deep operation mode
In the deep operation mode, as shown in FIG. 8C, a movement operation with the finger F is associated with focus control that moves the focus 62 on the display screen 52. The deep operation mode is selected when the finger F is located in the first operation space Sp1. The first operation space Sp1 is the space in which the operating body distance d is less than the first threshold distance Dth1. The first threshold distance Dth1 is set longer, by about 0.5 to 1 cm for example, than the distance between the touch sensor 31a and the operation surface 70 of the movable support member 41 in its state closest to the fixed support member 44 (hereinafter the "lowermost position"). This first threshold distance Dth1 defines a virtual boundary surface BP between the first operation space Sp1 and the second operation space Sp2 (see FIG. 8A). A movement operation in which the finger F is moved along the xy plane within the first operation space Sp1 is defined as a "deep operation". A movement operation tracing the operation surface 70 lowered to the lowermost position is therefore a deep operation.
(3) Non-proximity mode
In the non-proximity mode, a movement operation with the finger F is not associated with any image portion on the display screen 52. The non-proximity mode is selected when the finger F is located in neither the first operation space Sp1 nor the second operation space Sp2 (see FIG. 8A). The space excluding the first operation space Sp1 and the second operation space Sp2 is referred to as the non-proximity space.
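For illustration only (not part of the patent text), the following short Python sketch shows how the three modes above could be selected from the operating body distance d using the two threshold distances. The concrete centimeter values are placeholder assumptions; the embodiment only specifies the thresholds relative to the uppermost and lowermost positions of the operation surface.

def select_operation_mode(d_cm, dth1_cm=1.5, dth2_cm=3.0):
    # dth1_cm and dth2_cm are hypothetical values for Dth1 and Dth2.
    if d_cm < dth1_cm:
        return "deep"           # first operation space Sp1: d < Dth1
    if d_cm < dth2_cm:
        return "shallow"        # second operation space Sp2: Dth1 <= d < Dth2
    return "non-proximity"      # outside both operation spaces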
As described above, in the display system 10 (see FIG. 1), movement of the finger F detected in the second operation space Sp2 shown in FIG. 8A is distinguished from movement of the finger F detected in the first operation space Sp1 shown in FIG. 8C. In this configuration, the series of icon selection operations performed until the operator selects a desired icon 63 will be described in order.
As shown in FIG. 8A, an operator who intends to start an icon selection operation moves the finger F, which has been farther from the touch sensor 31a than the second threshold distance Dth2 (see the two-dot chain line in FIG. 8A), toward the operation surface 70. While the operation mode is the non-proximity mode, operations with the finger F are not associated with any image portion.
When the operator moves the finger F from the non-proximity space into the second operation space Sp2 and inputs a tap operation striking the operation surface 70, the association between shallow operations with the finger F and scroll control is started. With the operation mode thus set to the shallow operation mode, the display screen 52 shows the main menu 60a defined at the upper level of the hierarchy. In the main menu 60a, the plurality of submenu image portions 165 can be scrolled.
While the pressing force Fp (see FIG. 8B) applied from the finger F to the operation surface 70 is weak, the operation surface 70 remains at the uppermost position and is located in the second operation space Sp2. The operator can therefore move the submenu image portion 165 containing a desired icon 63 to the center of the display screen 52 by a shallow operation tracing the operation surface 70 along the x-axis direction. With that submenu image portion 165 positioned at the center, the operator then performs an operation of pushing the operation surface 70 in the pushing direction Dp from the second operation space Sp2 toward the first operation space Sp1 (see FIG. 8C) (hereinafter a "push-in operation").
As shown in FIG. 8B, when the pressing force Fp applied from the finger F is increased, the operation surface 70 moves from within the second operation space Sp2 (see FIG. 8A) into the first operation space Sp1 (see FIG. 8C) by the action of the lifting mechanism 40. In this push-in operation, the reaction force Fre resisting the pressing force Fp of the finger F increases as the pushed displacement s of the operation surface 70 in the pushing direction Dp grows. When the push-in operation is continued further, the reaction force Fre gradually decreases as the pushed displacement s increases. The push-in operation ends when the stopper 85 (see FIG. 6) contacts the fixed support member 44 and the movement of the movable support member 41 is restricted.
As described above, the section in which the reaction force Fre gradually increases while the operation surface 70 moves from the uppermost position toward the lowermost position is defined as the "reaction force increase zone Zin" (see also FIG. 7). The section that follows the reaction force increase zone Zin in the pushing direction Dp and in which the reaction force Fre gradually decreases is defined as the "reaction force decrease zone Zde" (see also FIG. 7). The first threshold distance Dth1 is adjusted in advance so that the descending operation surface 70 passes through the boundary surface BP within the reaction force decrease zone Zde. The operation surface 70 therefore passes through the boundary surface BP in conjunction with the timing at which the drop in the reaction force Fre pulls the operation surface 70 downward.
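As a hedged illustration of the two zones just defined, the piecewise function below models a reaction force that rises over the reaction force increase zone Zin and then falls over the reaction force decrease zone Zde until the stopper ends the stroke. The breakpoints and force values are invented for the example, and the renewed rise in force just before the stopper described for the elastic member 80 is omitted for brevity.

def reaction_force(s_mm, zin_end_mm=2.0, stroke_mm=4.0, peak_n=3.0, end_n=1.0):
    # s_mm: pushed displacement s of the operation surface in the pushing direction Dp.
    s = min(max(s_mm, 0.0), stroke_mm)
    if s <= zin_end_mm:
        # reaction force increase zone Zin: force grows with displacement
        return peak_n * s / zin_end_mm
    # reaction force decrease zone Zde: force falls toward the end of the stroke
    return peak_n - (peak_n - end_n) * (s - zin_end_mm) / (stroke_mm - zin_end_mm)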
With this push-in operation, as shown in FIG. 8C, the finger F moves from the second operation space Sp2 (see FIG. 8A) into the first operation space Sp1. By this movement, the operation mode of the remote operation device 100 is switched from the shallow operation mode to the deep operation mode, and the movement operation of the finger F becomes associated with focus control. In addition, the submenu image portion 165 (see FIG. 8A) that was displayed in the center of the display screen 52 is displayed over the entire display screen 52 as the air conditioning menu 60b (see also FIG. 8B). The air conditioning menu 60b is a submenu defined at the level below the main menu 60a.
In this state, the operator can superimpose the focus 62 on a desired icon 63 by inputting a deep operation tracing the operation surface 70 with the finger F. With the focus 62 superimposed on that icon 63, the operator then weakens the pressing force Fp applied to the operation surface 70 and moves the finger F away from the touch sensor 31a. In this way, the operator can select the desired icon 63.
The operator who has completed the selection of the icon 63 moves the finger F further away from the operation surface 70 into the non-proximity space, whereby the operation mode is switched to the non-proximity mode. When a predetermined threshold time Tth (see FIG. 9) has elapsed after the operation mode is switched to the non-proximity mode, the remote operation device 100 enters a state of waiting for the operator's next icon selection operation.
The operation mode selection processing performed by the operation control unit 33 to realize the above icon selection operation will be described in detail based on FIG. 9 with reference to FIG. 1. The operation mode selection processing shown in FIG. 9 is started by the operation control unit 33 when the ACC power supply of the vehicle is turned on.
In S101, the presence or absence of a tap operation on the operation surface 70 at the uppermost position is determined based on changes in the output acquired from the touch sensor 31a. If it is determined in S101 that there is no tap operation, the determination of S101 is repeated and the standby state of the remote operation device 100 is maintained. If it is determined in S101 that a tap operation has occurred, the process proceeds to S102.
In S102, acquisition processing for acquiring the sensitivity values detected by the electrodes of the touch sensor 31a is performed, and the process proceeds to S103. In S103, calculation processing is performed to obtain, from the sensitivity values acquired in S102, the x, y, and z coordinates indicating the three-dimensional position of the finger F with respect to the touch sensor 31a (hereinafter the "input position coordinates"), and the process proceeds to S104.
The details of the calculation processing performed in S103 will be described with reference to FIG. 10. The sensitivity value increases as the capacitance stored between the operation surface 70 and the finger F increases. The x-axis and y-axis coordinates at which the sensitivity value is largest therefore indicate the relative position of the finger F on the operation surface 70. In addition, the sensitivity value increases as the operating body distance d (see FIG. 8A) becomes shorter and decreases as it becomes longer. The maximum sensitivity value therefore corresponds to the operating body distance d, and hence to the coordinate in the z-axis direction.
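A minimal sketch of the S103 calculation, assuming the sensitivity values are available as a two-dimensional grid indexed by the x- and y-direction electrodes: the x and y coordinates are read off the electrode crossing with the largest sensitivity value, and the z coordinate (operating body distance d) is obtained from that maximum through a calibration function, shown here only as a placeholder.

def input_position_coordinates(grid, z_from_sensitivity):
    # grid[yi][xi]: sensitivity value at the crossing of the xi-th x electrode
    # and the yi-th y electrode.
    best_y, best_x = 0, 0
    for yi, row in enumerate(grid):
        for xi, value in enumerate(row):
            if value > grid[best_y][best_x]:
                best_y, best_x = yi, xi
    peak = grid[best_y][best_x]
    # z_from_sensitivity: placeholder calibration mapping the peak sensitivity
    # to the operating body distance d (the z coordinate).
    return best_x, best_y, z_from_sensitivity(peak)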
The operation control unit 33 also stores in advance a table containing a first sensitivity threshold Hth1 corresponding to the first threshold distance Dth1 and a second sensitivity threshold Hth2 corresponding to the second threshold distance Dth2. In the processing from S104 onward shown in FIG. 9, the operation control unit 33 compares the maximum sensitivity value obtained in S103 with the sensitivity thresholds Hth1 and Hth2.
In S104, whether the finger F is in the first operation space Sp1 is determined based on whether the sensitivity value is equal to or greater than the first sensitivity threshold Hth1. If the determination in S104 is affirmative, the process proceeds to S105. In S105, the operation mode is set to the deep operation mode, and the process returns to S102.
If the determination in S104 is negative, then in S106 whether the finger F is in the second operation space Sp2 is determined based on whether the sensitivity value is equal to or greater than the second sensitivity threshold Hth2 and less than the first sensitivity threshold Hth1. If the determination in S106 is affirmative, the process proceeds to S107. In S107, the operation mode is set to the shallow operation mode, and the process returns to S102.
If the determination in S106 is negative, the operation mode is set to the non-proximity mode in S108, and the process proceeds to S109. In S108, when the operation mode has been switched to the non-proximity mode from either the deep operation mode or the shallow operation mode, counting of the elapsed time t since the transition to the non-proximity mode is started.
In S109, it is determined whether the elapsed time t, whose counting was started in the current S108 or in a previous S108, has reached the predetermined threshold time Tth. If the determination in S109 is affirmative, the process returns to S101, and the remote operation device 100 shifts to the standby state. If the determination in S109 is negative, the process returns to S102.
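The S101-S109 flow can be summarized by the following sketch, under the assumption that the touch sensor is wrapped in two placeholder callables, tap_detected() and read_peak_sensitivity(), and that set_mode() issues the mode-change command described below. It is an illustration of the flowchart, not the implementation running in the device.

import time

def operation_mode_selection(tap_detected, read_peak_sensitivity, set_mode,
                             hth1, hth2, tth_s):
    while True:
        while not tap_detected():                      # S101: wait in standby for a tap
            time.sleep(0.01)
        t_nonprox = None
        while True:
            h = read_peak_sensitivity()                # S102-S103: acquire sensitivity values
            if h >= hth1:                              # S104-S105: first operation space Sp1
                set_mode("deep"); t_nonprox = None
            elif h >= hth2:                            # S106-S107: second operation space Sp2
                set_mode("shallow"); t_nonprox = None
            else:                                      # S108: non-proximity mode
                set_mode("non-proximity")
                if t_nonprox is None:
                    t_nonprox = time.monotonic()
                if time.monotonic() - t_nonprox >= tth_s:
                    break                              # S109: Tth elapsed, back to standby (S101)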
Having changed the operation mode in S105, S107, or S108, the operation control unit 33 outputs a command signal notifying the change of the operation mode to the CAN bus 90 through the communication control unit 23 and the communication interface 24. On acquiring this command signal, the display control unit 53 activates the drawing layer corresponding to the current operation mode based on the signal.
Specifically, when the operation control unit 33 sets the operation mode to the deep operation mode, the display control unit 53 selects the focus layer as the active drawing layer. The operation control unit 33 can thereby associate deep operations with the finger F with the focus 62 and change the display state of the focus 62.
When the operation control unit 33 sets the operation mode to the shallow operation mode, the display control unit 53 selects the submenu layer, on which the plurality of submenu image portions 165 are drawn, as the active drawing layer. The operation control unit 33 can thereby associate shallow operations with the finger F with the submenu image portions 165 and change their display state.
When the operation control unit 33 sets the operation mode to the non-proximity mode, the display control unit 53 sets the active drawing layer to "none". As a result, movement operations of the finger F are not associated with any image portion.
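As a small illustrative sketch (names assumed), the layer activation described in the three preceding paragraphs amounts to a lookup from operation mode to active drawing layer:

ACTIVE_DRAWING_LAYER = {
    "deep": "focus_layer",          # deep operations are tied to the focus 62
    "shallow": "submenu_layer",     # shallow operations scroll the submenu image portions 165
    "non-proximity": None,          # no image portion is associated with the finger movement
}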
According to the first embodiment described so far, the operation surface 70 moves from the second operation space Sp2 to the first operation space Sp1 according to the strength of the pressing force Fp applied by the finger F. With this configuration, the operator can grasp which operation space the finger F is currently located in based on the operational feeling of whether or not the operation surface 70 is being pushed in. Therefore, even in a configuration in which the operation space is distinguished by the operating body distance d and the image portion to be operated is switched accordingly, erroneous operations can be reduced.
In addition, in the first embodiment, the restoring force of the elastic member 80 generates the reaction force Fre resisting the pressing force Fp. Even when the finger F moves in a direction away from the touch sensor 31a, the operation surface 70 can therefore follow the finger F. Because the contact between the finger F and the operation surface 70 is maintained in this way, the operator can reliably keep grasping the operational feeling of whether or not the operation surface 70 is being pushed in. The effect of reducing erroneous operations is therefore achieved with even greater certainty.
According to the first embodiment, when the operation surface 70 is displaced from the second operation space Sp2 to the first operation space Sp1, the transition from the reaction force increase zone Zin to the reaction force decrease zone Zde turns the reaction force Fre from increasing to decreasing. Because this change in the reaction force Fre transmits a click sensation to the finger, the operational feeling of whether or not the operation surface 70 is being pushed in becomes even clearer. This makes it easier to grasp the operation space in which the finger F is located, so the effect of reducing erroneous operations is reliably achieved.
Furthermore, the operation surface 70 passes through the boundary surface BP at the timing when the click sensation is transmitted to the finger F. With a configuration in which the click sensation transmitted to the finger F is linked with the switching of the operation space in this way, the operator can grasp the operation space in which the finger F is located even more easily, and the effect of reducing erroneous operations is achieved all the more reliably.
Furthermore, with the moving mechanism of the first embodiment, even when the operation surface 70 is moved in the pushing direction Dp, the movement of the operation surface 70 in directions along the xy plane is negligible. A situation in which the finger F dragged by the operation surface 70 moves in that plane direction during the push-in operation and causes an unintended movement of an image portion can therefore be avoided. Erroneous operations caused by making the operation surface 70 movable up and down can thus also be reduced with this configuration.
In addition, in the first embodiment, the display on the display screen 52 is organized in a hierarchical structure, as with the main menu 60a and the air conditioning menu 60b. In such a form, it is desirable that the submenu image portions 165 of the main menu 60a, which is the upper level, be associated with shallow operations in the second operation space Sp2, and that the focus 62 of the air conditioning menu 60b, which is the lower level, be associated with deep operations in the first operation space Sp1. With this association, the logical relation of upper and lower levels in the hierarchical structure is easily linked in the operator's mind with the physical relation of upper and lower positions in the operation spaces Sp1 and Sp2. This association therefore makes erroneous operations even easier to reduce.
In the first embodiment, the finger F corresponds to the "operating body" in the claims, the touch sensor 31a corresponds to the "detection means" in the claims, and the detection unit 31 and the acquisition block 34 together correspond to the "acquisition means" in the claims. The discrimination block 35 corresponds to the "discrimination means" in the claims, the association block 36 corresponds to the "association means" in the claims, and the lifting mechanism 40 corresponds to the "moving mechanism" in the claims. The focus 62 corresponds to the "first image portion" in the claims, the submenu image portion 165 corresponds to the "second image portion" in the claims, and the elastic member 80 corresponds to the "reaction force generation portion" in the claims. Furthermore, the main menu 60a corresponds to the "upper selection image" in the claims, the air conditioning menu 60b corresponds to the "lower selection image" in the claims, and the remote operation device 100 corresponds to the "operation device" in the claims.
(Second embodiment)
The second embodiment of the present disclosure, shown in FIGS. 11 to 15, is a modification of the first embodiment. In the remote operation device 200 of the second embodiment shown in FIGS. 12 and 13, the lifting mechanism 240 is housed in a case 275 that forms part of the center console. In addition, the touch sensor 231a of the second embodiment is movable along the z-axis direction together with the operation surface 70 by means of the lifting mechanism 240. Details of the remote operation device 200 according to the second embodiment are described below.
The case 275 is formed of an electrically conductive resin material or the like. The case 275 has a peripheral wall portion 275a and a lid wall portion 275b. The peripheral wall portion 275a stands so as to surround the entire circumference of the lifting mechanism 240. The lid wall portion 275b is formed so as to cover the lifting mechanism 240 from above and is continuous with the tip of the peripheral wall portion 275a in its standing direction. An opening 276 for exposing the operation surface 70 is provided at the center of the lid wall portion 275b; the opening 276 is formed in a rectangular shape slightly smaller than the operation surface 70. On the back surface facing the operation surface 70, the lid wall portion 275b forms a facing surface 277 in the region surrounding the opening 276. The facing surface 277 is formed as a rectangular frame along the outer edge 72 of the operation surface 70 and is arranged to face the outer edge 72; it can come into contact with the outer edge 72 of the operation surface 70.
The lifting mechanism 240 holds the touch sensor 231a by means of a movable support member 241. The touch sensor 231a is attached to the back surface of the movable support member 241 in a posture along the operation surface 70. The outer edge region 232b of the touch sensor 231a faces the facing surface 277 in the z-axis direction, with the movable support member 241 and the sheet member 71 interposed between them, and can accumulate electric charge between itself and the conductive facing surface 277. With this configuration, when the pressing force Fp is applied from the finger F to the operation surface 70, the lifting mechanism 240 lowers the touch sensor 231a together with the operation surface 70 so that the distance from the touch sensor 231a to the facing surface 277 (hereinafter, the "facing surface distance df") increases. When the movable support member 241 is released from the pressing force Fp applied to the operation surface 70, the lifting mechanism 240 raises the touch sensor 231a together with the operation surface 70 under the restoring force of the elastic member 80 so that the facing surface distance df decreases.
As in the first embodiment, the acquisition block 234 of the operation control unit 33 shown in FIG. 11 detects the operation body distance d at the central region 232a of the touch sensor 231a shown in FIGS. 12 and 13 by a calculation process based on the sensitivity values. In addition, in the second embodiment, the acquisition block 234 shown in FIG. 11 detects the facing surface distance df from the outer edge region 232b to the facing surface 277 by a calculation process based on the sensitivity values. In this way, the acquisition block 234, in cooperation with the detection unit 31 including the touch sensor 231a, acquires both the operation body distance d and the facing surface distance df shown in FIGS. 12 and 13. The central region 232a of the touch sensor 231a is the region that overlaps the opening 276 in the z-axis direction.
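The disclosure leaves the calculation process itself unspecified. Purely as an illustrative assumption, if the capacitive sensitivity falls off monotonically with distance, both d and df could be recovered by interpolating an inverse calibration curve, roughly as in the Python sketch below; the calibration points, numeric values, and function names are hypothetical and not taken from the patent.

    # Illustrative only: invert a hypothetical sensitivity-vs-distance calibration
    # curve to estimate the operating-body distance d and the facing-surface distance df.
    # The calibration points below are placeholder values, not from the disclosure.
    CALIBRATION = [(200, 0.0), (150, 2.0), (100, 5.0), (60, 10.0), (30, 20.0)]  # (count, mm)

    def distance_from_sensitivity(sensitivity: float) -> float:
        """Estimate distance in mm by linear interpolation on the calibration curve."""
        pts = CALIBRATION
        if sensitivity >= pts[0][0]:
            return pts[0][1]
        if sensitivity <= pts[-1][0]:
            return pts[-1][1]
        for (s_hi, d_near), (s_lo, d_far) in zip(pts, pts[1:]):
            if s_lo <= sensitivity <= s_hi:
                t = (s_hi - sensitivity) / (s_hi - s_lo)
                return d_near + t * (d_far - d_near)
        return pts[-1][1]

    d = distance_from_sensitivity(120)    # from the peak sensitivity in central region 232a
    df = distance_from_sensitivity(90)    # from the sensitivity in outer edge region 232b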
In the remote operation device 200 as well, as in the first embodiment, the image part that the association block 36 (see FIG. 11) links to the movement operation of the finger F is changed according to the vertical position of the movable support member 241 in the z-axis direction. Specifically, as shown in FIG. 12, when the facing surface distance df is less than the first threshold distance DFth1, the operation mode of the remote operation device 200 is set to the shallow operation mode. The space in which the facing surface distance df is less than the first threshold distance DFth1 is defined as the second operation space Sp2, and an operation of the finger F tracing the operation surface 70 while the touch sensor 231a is positioned within the second operation space Sp2 is treated as a shallow operation.
On the other hand, as shown in FIG. 13, when the facing surface distance df exceeds the second threshold distance DFth2, the operation mode of the remote operation device 200 is set to the deep operation mode. The space in which the facing surface distance df exceeds the second threshold distance DFth2 is defined as the first operation space Sp1, and an operation of the finger F tracing the operation surface 70 while the touch sensor 231a is positioned within the first operation space Sp1 is treated as a deep operation.
In the determination block 235 shown in FIG. 11, the two threshold distances DFth1 and DFth2 described above are set in order to distinguish a movement operation of the finger F in the first operation space Sp1 from a movement operation of the finger F in the second operation space Sp2 shown in FIGS. 12 and 13. The first threshold distance DFth1 is set shorter than the second threshold distance DFth2. The first threshold distance DFth1 is the threshold used for switching from the deep operation mode to the shallow operation mode when the movable support member 241 is raised by the reaction force Fre of the elastic member 80 (see B in FIG. 8). The second threshold distance DFth2, on the other hand, is the threshold used for switching from the shallow operation mode to the deep operation mode when the movable support member 241 is lowered by the pressing force Fp. Setting these two threshold distances avoids frequent toggling of the operation mode. The boundary surface BP of the second embodiment is defined at a position separated from the facing surface 277 by the second threshold distance DFth2, so as to face the first operation space Sp1 shown in FIG. 13.
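The effect of the two thresholds can be summarized as a simple hysteresis rule: because DFth1 is shorter than DFth2, the mode only changes once the facing surface distance has clearly crossed into the other operation space. The following sketch illustrates this idea; the numeric values are placeholders, not values from the disclosure.

    # Hysteresis on the facing-surface distance df (placeholder values in mm).
    DFTH1 = 1.0   # deep -> shallow switch: surface has risen back toward the lid
    DFTH2 = 3.0   # shallow -> deep switch: surface has been pushed down far enough

    def next_mode(mode: str, df: float) -> str:
        """Return the operation mode after observing the facing-surface distance df."""
        if mode == "shallow" and df > DFTH2:
            return "deep"        # touch sensor now lies in the first operation space Sp1
        if mode == "deep" and df < DFTH1:
            return "shallow"     # elastic member has returned the sensor into Sp2
        return mode              # within the hysteresis band: keep the current mode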
The operation mode selection process performed by the operation control unit 33 in the second embodiment described so far will now be explained in detail based on FIG. 14, with reference to FIGS. 11 to 13. The operation mode selection process shown in FIG. 14 is started by the operation control unit 33 when the ACC power supply of the vehicle is turned on.
S201 and S202 are substantially the same as S101 and S102 of the first embodiment. In S203, the input position coordinates of the finger F detected in the central region 232a and the facing surface distance df detected in the outer edge region 232b are calculated from the sensitivity values acquired in S202, and the process proceeds to S204.
In S204, it is determined whether the maximum sensitivity value in the central region 232a obtained in S203 is equal to or greater than a predetermined contact sensitivity threshold HthT (see FIG. 15). If it is determined in S204 that the maximum sensitivity value does not exceed the contact sensitivity threshold HthT, it is presumed that the finger F is not touching the operation surface 70, so the process returns to S201 and waits until a tap operation for starting an operation is made. If, on the other hand, it is determined in S204 that the maximum sensitivity value exceeds the contact sensitivity threshold HthT, it is presumed that the finger F is touching the operation surface 70, and the process proceeds to S205.
In S205, it is determined whether the operation mode up to that point is the shallow operation mode and the sensitivity value of the outer edge region 232b is equal to or less than a second sensitivity threshold HFth2 (see FIG. 15). If an affirmative determination is made in S205, meaning that the touch sensor 231a has moved from the second operation space Sp2 into the first operation space Sp1, the process proceeds to S206. In S206, the operation mode is set to the deep operation mode, and the process returns to S202.
If a negative determination is made in S205, then in S207 it is determined whether the operation mode up to that point is the deep operation mode and the sensitivity value is equal to or greater than a first sensitivity threshold HFth1 (see FIG. 15). If an affirmative determination is made in S207, meaning that the touch sensor 231a has moved from the first operation space Sp1 into the second operation space Sp2, the process proceeds to S208. In S208, the operation mode is set to the shallow operation mode, and the process returns to S202. If a negative determination is made in S207, the operation mode up to that point is maintained, and the process returns to S202.
In the operation mode selection process described above, the contact sensitivity threshold HthT used in S204 is a sensitivity value for detecting contact of the finger F with the operation surface 70. As shown in FIG. 15, the contact sensitivity threshold HthT is set slightly smaller than the sensitivity value detected when the finger F is touching the central region 232a of the operation surface 70. The first sensitivity threshold HFth1 and the second sensitivity threshold HFth2 used in S205 and S207 of FIG. 14 correspond, as shown in FIG. 15, to the sensitivity values detected in the outer edge region 232b when the facing surface distance df equals the first threshold distance DFth1 and the second threshold distance DFth2, respectively.
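Taken together, S201 to S208 amount to the loop sketched below, written directly in terms of the sensitivity values, as FIG. 15 relates them to HthT, HFth1 and HFth2 (a larger edge sensitivity corresponds to a smaller facing surface distance). The reading function, power check, initial mode, and threshold numbers are assumptions added for illustration, not part of the disclosure.

    # Rough sketch of the S201-S208 flow of FIG. 14 (threshold numbers are assumptions).
    HTH_T = 100   # contact sensitivity threshold HthT for the central region
    HFTH1 = 80    # edge sensitivity at df = DFth1 (higher: sensor close to facing surface)
    HFTH2 = 40    # edge sensitivity at df = DFth2 (lower: sensor pushed away from it)

    def mode_selection_loop(read_sensitivities, acc_power_on):
        """read_sensitivities() -> (max central sensitivity, edge sensitivity)."""
        mode = "shallow"
        while acc_power_on():                          # process runs while ACC power is on
            central_max, edge = read_sensitivities()   # S202/S203: acquire and compute
            if central_max < HTH_T:                    # S204: finger not on the surface
                continue                               # wait for a touch
            if mode == "shallow" and edge <= HFTH2:    # S205: moved from Sp2 into Sp1
                mode = "deep"                          # S206
            elif mode == "deep" and edge >= HFTH1:     # S207: moved from Sp1 into Sp2
                mode = "shallow"                       # S208
            # otherwise the current operation mode is maintained
        return mode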
As described so far, the second embodiment, in which the touch sensor 231a moves up and down integrally with the operation surface 70, provides the same effects as the first embodiment: the operator can grasp in which operation space the finger F is currently located. Erroneous operations can therefore be reduced.
In addition, in the second embodiment, since the outer edge 72 of the operation surface 70 and the facing surface 277 face each other in a frame shape, the operation control unit 33 can calculate the facing surface distance df based on detection results at a plurality of locations in the outer edge region 232b. As a result, the detection accuracy of the facing surface distance df is kept high, and the switching of the operation mode based on an operation of pushing in the operation surface 70 can be performed accurately.
Furthermore, in the second embodiment, since the touch sensor 231a moves together with the operation surface 70, the touch sensor 231a can detect a movement operation of the finger F at a position close to the finger F. The touch sensor 231a therefore readily detects fine movement operations of the finger F, so those operations are accurately reflected in the movement of the image part displayed on the display screen 52.
In the second embodiment, the lifting mechanism 240 corresponds to the "moving mechanism" recited in the claims, the touch sensor 231a corresponds to the "detection means", and the detection unit 31 and the acquisition block 234 together correspond to the "acquisition means". Furthermore, the determination block 235 corresponds to the "determination means", and the remote operation device 200 corresponds to the "operation device" recited in the claims.
(Third embodiment)
The third embodiment of the present disclosure, shown in FIGS. 16 to 18, is another modification of the first embodiment. In the remote operation device 300 according to the third embodiment, the configuration of the lifting mechanism 340 differs from that of the lifting mechanism 40 of the first embodiment (see FIG. 5). The parts of the lifting mechanism 340 that differ from the lifting mechanism 40 of the first embodiment are described in detail below.
As shown in FIGS. 16 and 17, each pair of a first link 347 and a second link 348 of the lifting mechanism 340 is connected at their central portions by a central hinge 349, about which the first link 347 and the second link 348 can rotate relative to each other. The end portion 347b of the first link 347 is connected to the connecting portion 345b of the fixed support member 44 by a hinge, and at this connection the hinge is allowed to slide in the y-axis direction. Similarly, the end portion 348b of the second link 348 is connected to the connecting portion 342b of the movable support member 41 by a hinge, and at this connection the hinge is allowed to slide in the y-axis direction. With this link arrangement, when a pushing operation is performed, the operation surface 70 moves only in the z-axis direction, which is substantially orthogonal to the operation surface 70.
The lifting mechanism 340 also has five elastic members 380. Each elastic member 380 is formed with a large-diameter portion 381, a small-diameter portion 383, a reduced-diameter portion 382, and a stopper (not shown) that are substantially the same as those of the elastic member 80 of the first embodiment (see FIG. 6). One of the elastic members 380 is placed at the center of the fixed support member 44; the other four are placed toward the four corners of the fixed support member 44 so as to surround the centrally placed elastic member 380. With this arrangement, each elastic member 380 applies the restoring force that produces the reaction force Fre to the movable support member 41 at a different location. The posture of the operation surface 70 during a pushing operation is therefore less likely to tilt with respect to the xy plane and is stabilized.
When the operation surface 70 of the lifting mechanism 340 is in its uppermost position, it protrudes upward from the surrounding surface 375 that surrounds the movable support member 41 (see A in FIG. 18). When a pressing force Fp stronger than the restoring force of the elastic members 380 is applied from the finger F, the operation surface 70 descends toward the fixed support member 44 together with the movable support member 41. The operation surface 70 moved to its lowermost position by such a pushing operation is substantially flush with the surrounding surface 375 (see B in FIG. 18). When the pressing force Fp applied to the operation surface 70 disappears, the operation surface 70 rises together with the movable support member 41 under the restoring force of the elastic members 380.
An icon selection operation input to the remote operation device 300 described so far will be explained based on FIG. 18. The display image 60 shown on the display screen 52 is a navigation image 360a showing the route to a destination set by the operator. The navigation image 360a includes a plurality of icons 63 associated with destinations, a pointer 362 for selecting an icon 63, a map 364 showing the layout of roads around the vehicle, and the like. The navigation image 360a also includes a focus 62 that highlights the icon 63 on which the pointer 362 is superimposed.
As shown in A of FIG. 18, when the pressing force Fp applied from the finger F to the operation surface 70 is weak, the operation surface 70 remains in its uppermost position and is located within the second operation space Sp2. In this state, the remote operation device 300 switches the operation mode to the shallow operation mode, in which a shallow operation by the operator is associated with the map 364 displayed on the display screen 52. The operator can therefore scroll the map 364 up, down, left and right by a shallow operation of tracing the operation surface 70 in its uppermost position, and move any icon 63 placed on the map 364 to the center of the display screen 52.
In B of FIG. 18, the operation surface 70 has been lowered to its lowermost position by a pushing operation in the pushing direction Dp. As the finger F thus moves from the second operation space Sp2 (see A in FIG. 18) into the first operation space Sp1, the operation mode is switched to the deep operation mode, and the pointer 362 is displayed on the display screen 52. In the deep operation mode, a deep operation by the operator is associated with the pointer 362 displayed on the display screen 52, so the operator can superimpose the pointer 362 on any icon 63 by a deep operation of tracing the operation surface 70 in its lowermost position. In this state, weakening the pressing force Fp applied to the operation surface 70 and moving the finger F away from the touch sensor 31a selects that icon 63.
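Concretely, the division of roles in this icon selection can be pictured as a small dispatch on the operation mode: shallow operations scroll the map 364, deep operations move the pointer 362, and lifting the finger in the deep mode selects the icon 63 under the pointer. The sketch below is illustrative only; the object and method names are assumptions, not names from the disclosure.

    # Illustrative dispatch of finger movement according to the operation mode.
    def apply_finger_motion(ui, mode: str, dx: float, dy: float) -> None:
        """Route a tracing movement on the operation surface to the right image part."""
        if mode == "shallow":
            ui.map.scroll(dx, dy)       # shallow operation: scroll the map 364
        else:
            ui.pointer.move(dx, dy)     # deep operation: move the pointer 362

    def on_finger_release(ui, mode: str) -> None:
        """Lifting the finger while in the deep mode selects the icon under the pointer."""
        if mode == "deep":
            ui.select(ui.pointer.icon_under())   # selects an icon 63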
In the third embodiment described so far as well, the same effects as in the first embodiment allow the operator to grasp in which operation space the finger F is currently located. Erroneous operations can therefore be reduced.
In addition, in the third embodiment, the plurality of elastic members 380 generate their restoring forces at different locations. Even if the pressing force Fp is applied at a position offset from the center of the operation surface 70, the lifting mechanism 340 can therefore generate the reaction force Fre stably and reliably, so that the operation surface 70 follows the movement of the finger F. Because the contact between the finger F and the operation surface 70 is maintained in this way, the operator can reliably keep track of whether the operation surface 70 is in the pushed-in state. The effect of reducing erroneous operations is thus achieved with even greater certainty.
Furthermore, in the lifting mechanism 340 according to the third embodiment, the operation surface 70 moves substantially only in the z-axis direction during a pushing operation and does not move substantially in directions along the xy plane. A situation in which the finger F dragged along the operation surface 70 moves in that plane direction and causes a movement of the image part unintended by the operator can therefore be avoided, and erroneous operations attributable to making the operation surface 70 movable up and down can likewise be reduced with this configuration.
In the third embodiment, the lifting mechanism 340 corresponds to the "moving mechanism" recited in the claims, each elastic member 380 corresponds to the "reaction force generation unit", and the remote operation device 300 corresponds to the "operation device" recited in the claims.
(Modification)
Although a plurality of embodiments according to the present disclosure have been described above, the present disclosure is not to be interpreted as limited to these embodiments and can be applied with modifications within a scope that does not depart from the gist of the present disclosure.
In the first to third embodiments described above, lifting mechanisms of mutually different configurations are used, but the mechanism for making the operation surface movable is not limited to the support mechanisms (so-called pantographs) of those embodiments. The arrangement of the links in the lifting mechanism, the connection structure at their ends, and so on may be changed as appropriate. Instead of such a support mechanism, a rail mechanism that allows the movable support member to slide along a guide rail extending in the z-axis direction, for example, may be provided. Movement of the operation surface in the xy-plane direction accompanying the pushing operation may also be allowed.
In the above embodiments, a rubber elastic member (a so-called rubber dome) is provided as the structure for generating the reaction force. However, the material, hardness, shape and the like of the elastic member may be changed as appropriate so that a light click feel is produced. For example, an elastic member made of a resin material such as urethane may generate the reaction force. An element such as a coil spring or a leaf spring sandwiched between the movable support member and the fixed support member may also generate the reaction force. Alternatively, flexure of the links supporting the movable support member may generate the reaction force.
In the above embodiments, a click feel is produced by providing the reaction force increasing section Zin and the reaction force decreasing section Zde in succession. However, such a click feel need not be produced.
In the second embodiment, hysteresis is provided for switching the operation mode by using the two threshold distances, the first threshold distance DFth1 and the second threshold distance DFth2. Hysteresis may likewise be provided for the threshold distances in the first and third embodiments. Specifically, in the deep operation mode, in which the finger F is located within the first operation space Sp1, the first operation space Sp1 is enlarged by extending the first threshold distance Dth1. Similarly, in the shallow operation mode, in which the finger F is located within the second operation space Sp2, the second operation space Sp2 is enlarged by extending the second threshold distance Dth2. With such a configuration, operation with the finger F within each of the operation spaces Sp1 and Sp2 becomes even easier. The switching between the deep operation mode and the shallow operation mode in the second embodiment may also be performed based on whether a single threshold distance is exceeded.
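One way to picture this variant is to make the effective boundaries depend on the current mode, so that whichever operation space the finger already occupies is enlarged. The sketch below uses illustrative distances and names; it is not taken from the disclosure.

    # Mode-dependent boundaries: the space the finger already occupies is enlarged.
    DTH1 = 3.0   # cm, nominal boundary of the first operation space Sp1 (placeholder)
    DTH2 = 7.0   # cm, nominal outer boundary of the second operation space Sp2 (placeholder)
    EXT = 1.0    # cm, extension applied to the currently active space (placeholder)

    def classify(mode: str, d: float) -> str:
        """Classify the operating-body distance d with hysteresis on the boundaries."""
        dth1 = DTH1 + (EXT if mode == "deep" else 0.0)     # deep mode widens Sp1
        dth2 = DTH2 + (EXT if mode == "shallow" else 0.0)  # shallow mode widens Sp2
        if d < dth1:
            return "Sp1"
        if d < dth2:
            return "Sp2"
        return "out_of_range"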
In the second embodiment, the facing surface is formed as a frame running around the opening. However, the facing surface only needs to face at least a part of the touch sensor, and its shape may be changed as appropriate. The conductivity required of the facing surface may also be secured by attaching a plate-shaped metal member or the like to the case.
In a modification of the above embodiments, a display using a plasma display panel, a display using organic EL, or the like forms the "display screen". In yet another modification, a combiner provided on the windshield or on the upper surface of the instrument panel serves as the "display screen", and an image is projected onto this "display screen" by projection means such as a projector. A display device that projects images onto a windshield, a combiner or the like in this way can thus also be included in the navigation device as a configuration that forms the display screen.
In a modification of the above embodiments, a push button for selecting an arbitrary icon is provided near the operation surface. The operator can select the function of an icon by pressing the push button while the pointer and the focus are superimposed on that icon.
In the above embodiments, the "first image portion" and the "second image portion", of which several examples have been given, may be changed as appropriate. The functions provided by the operation control unit 33 executing a program in the above embodiments may also be provided by hardware and software different from the control device described above, or by a combination of these. For example, functions such as the "association means" may be provided by circuits that perform the predetermined functions without relying on a program.
In the determinations of S104 and S106 of the first embodiment (see FIG. 9) and the determinations of S205 and S207 of the second embodiment (see FIG. 14), whether or not the sensitivity threshold serving as the boundary value is included may be changed as appropriate. Specifically, switching to the deep operation mode may be performed when the sensitivity value is less than the second sensitivity threshold HFth2 in S205 of the second embodiment. Similarly, switching to the shallow operation mode may be performed when the sensitivity value exceeds the first sensitivity threshold HFth1 in S207.
The above embodiments describe examples in which the present disclosure is applied to a remote operation device used in a display system mounted on a vehicle. However, the present disclosure is also applicable to a so-called touch panel type operation device configured integrally with a display screen. Furthermore, operation devices to which the present disclosure is applied can be adopted not only for vehicles but also in display systems in general used in various transportation equipment, various information terminals and the like.

Claims (9)

  1. An operation device that operates an image portion displayed on a display screen (52) in response to an input made to an operation surface (70) by an operation body (F), the operation device comprising:
     a detection means (31a) provided on the back side of the operation surface and detecting movement of the operation body;
     an acquisition means (31, 34) acquiring an operation body distance (d) from the detection means to the operation body;
     a determination means (35) distinguishing between movement of the operation body detected within a first operation space (Sp1), in which the operation body distance is less than a predetermined threshold distance (Dth1), and movement of the operation body detected within a second operation space (Sp2), in which the operation body distance exceeds the threshold distance; and
     a moving mechanism (40, 340) that, under a pressing force (Fp) applied to the operation surface from the operation body, moves the operation surface positioned within the second operation space into the first operation space.
  2. An operation device that operates an image portion displayed on a display screen (52) in response to an input made to an operation surface (70) by an operation body (F), the operation device comprising:
     a detection means (231a) provided on the back side of the operation surface and detecting movement of the operation body along the operation surface;
     a facing surface (277) arranged to face at least a part of the operation surface;
     a moving mechanism (240) moving the detection means together with the operation surface such that a facing surface distance (df) from the detection means to the facing surface increases under a pressing force (Fp) applied to the operation surface from the operation body;
     an acquisition means (31, 234) acquiring the facing surface distance; and
     a determination means (235) distinguishing between movement of the operation body detected by the detection means positioned within a first operation space (Sp1), in which the facing surface distance exceeds a predetermined threshold distance (DFth1, DFth2) through application of the pressing force, and movement of the operation body detected by the detection means positioned within a second operation space (Sp2), in which the facing surface distance falls below the threshold distance through release of the pressing force.
  3. The operation device according to claim 2, wherein the facing surface has a frame shape facing an outer edge of the operation surface.
  4. The operation device according to any one of claims 1 to 3, wherein the moving mechanism has a reaction force generation unit (80, 380) that generates a reaction force (Fre) against the pressing force.
  5. The operation device according to claim 4, wherein the reaction force generation unit sets:
     a reaction force increasing section (Zin) in which the reaction force increases as a pushing displacement amount (s) of the operation surface in a pushing direction (Dp) from the second operation space toward the first operation space increases; and
     a reaction force decreasing section (Zde), continuous with the reaction force increasing section in the pushing direction, in which the reaction force decreases as the pushing displacement amount increases.
  6. The operation device according to claim 5, wherein the operation surface moving in the pushing direction passes through a virtual boundary surface (BP) between the second operation space and the first operation space within the reaction force decreasing section.
  7. The operation device according to any one of claims 4 to 6, wherein the moving mechanism has a plurality of the reaction force generation units (380), and the plurality of reaction force generation units generate the reaction force at mutually different locations.
  8. The operation device according to any one of claims 1 to 7, wherein the moving mechanism moves the operation surface in a direction substantially orthogonal to the operation surface.
  9. The operation device according to any one of claims 1 to 8, wherein the display on the display screen switches, when a specific function is selected in an upper selection image (60a), to a lower selection image (60b) defined in a hierarchy below the upper selection image, the operation device further comprising an association means (36) that associates movement of the operation body within the first operation space with a first image portion (62) included in the lower selection image and associates movement of the operation body within the second operation space with a second image portion (165) included in the upper selection image.
PCT/JP2013/007315 2012-12-24 2013-12-12 Operation device WO2014103221A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2012280406 2012-12-24
JP2012-280406 2012-12-24
JP2013-181483 2013-09-02
JP2013181483A JP5754483B2 (en) 2012-12-24 2013-09-02 Operation device

Publications (1)

Publication Number Publication Date
WO2014103221A1 true WO2014103221A1 (en) 2014-07-03

Family

ID=51020345

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/007315 WO2014103221A1 (en) 2012-12-24 2013-12-12 Operation device

Country Status (2)

Country Link
JP (1) JP5754483B2 (en)
WO (1) WO2014103221A1 (en)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6304154B2 (en) * 2015-07-14 2018-04-04 Soken, Inc. Operating device
JP6477522B2 (en) * 2016-01-26 2019-03-06 Toyoda Gosei Co., Ltd. Touch sensor device
CN105930049A (en) * 2016-04-12 2016-09-07 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Method for avoiding incorrect operation, and terminal

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006350736A (en) * 2005-06-16 2006-12-28 Tokai Rika Co Ltd Pointer display controller and pointing device
JP2008016053A (en) * 2007-08-29 2008-01-24 Hitachi Ltd Display device having touch panel
JP2011128692A (en) * 2009-12-15 2011-06-30 Panasonic Corp Input device, input method, and input program


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110945462A (en) * 2017-07-21 2020-03-31 阿尔卑斯阿尔派株式会社 Input device, input device control method, and control program
CN110945462B (en) * 2017-07-21 2023-06-09 阿尔卑斯阿尔派株式会社 Input device, input device control method, and control program

Also Published As

Publication number Publication date
JP5754483B2 (en) 2015-07-29
JP2014142914A (en) 2014-08-07

Similar Documents

Publication Publication Date Title
USRE47028E1 (en) Information processing apparatus, method and computer readable medium for fixing an input position according to an input operation
JP4522475B1 (en) Operation input device, control method, and program
US9778764B2 (en) Input device
JP4787087B2 (en) Position detection apparatus and information processing apparatus
US20150205943A1 (en) Manipulation apparatus
EP2426581A2 (en) Information processing device, information processing method, and computer program
CN106164824B (en) Operating device for vehicle
JP5640486B2 (en) Information display device
JP2010224658A (en) Operation input device
JP5858059B2 (en) Input device
CN106372544B (en) Temporary secure access via an input object held in place
JP5754483B2 (en) Operation device
JP5751233B2 (en) Operation device
WO2013153750A1 (en) Display system, display device, and operation device
WO2012111227A1 (en) Touch input device, electronic apparatus, and input method
JP5954145B2 (en) Input device
US20130311945A1 (en) Input device
WO2015033682A1 (en) Manipulation input device, portable information terminal, method for control of manipulation input device, program, and recording medium
JP6115421B2 (en) Input device and input system
JP5772804B2 (en) Operation device and operation teaching method for operation device
JP2013073365A (en) Information processing device
JP2013088912A (en) Operation device for vehicle
WO2014162698A1 (en) Input device
US20180292924A1 (en) Input processing apparatus
WO2015093005A1 (en) Display system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13867826

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13867826

Country of ref document: EP

Kind code of ref document: A1