WO2014103221A1 - Operation device - Google Patents

Operation device

Info

Publication number
WO2014103221A1
Authority
WO
WIPO (PCT)
Prior art keywords
space
distance
operating
reaction force
finger
Application number
PCT/JP2013/007315
Other languages
English (en)
Japanese (ja)
Inventor
江波 和也
健一 竹中
Original Assignee
DENSO Corporation (株式会社デンソー)
Application filed by DENSO Corporation (株式会社デンソー)
Publication of WO2014103221A1

Classifications

    • G06F3/03547 Touch pads, in which fingers can move on a surface
    • B60K35/10 Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • B60K35/22 Display screens
    • B60K35/53 Movable instruments, e.g. slidable
    • B60K2360/1438 Touch screens
    • B60K2360/146 Instrument input by gesture
    • B60K2360/1468 Touch gesture

Definitions

  • This disclosure relates to an operation device that operates an image portion displayed on a display screen by an input to an operation surface.
  • Patent Document 1 discloses a technique for moving an image portion such as a navigation pointer and a radio main screen displayed on a display screen in association with an operation performed on a remote touchpad portion.
  • The user interface device disclosed in Patent Document 1 includes a remote touchpad unit that detects movement of an operator's finger and the like, and associates the finger operation detected by the remote touchpad unit with movement of a map, a pointer, and the like.
  • A control unit obtains the distance from the remote touchpad unit to the finger.
  • The control unit can distinguish, for example, between a case where the distance to the finger is less than 3 centimeters (cm) and a case where the distance to the finger is within a range of 5 cm to 7 cm.
  • In Patent Document 1, however, whether the finger is positioned within the range of 5 cm to 7 cm from the detection means such as the remote touchpad unit during the switching operation described above depends on the operator's intuition and familiarity. Because the distance from the detection means to the finger is difficult to judge, erroneous operations may be induced.
  • The present disclosure has been made in view of the above problems, and an object of the present disclosure is to provide an operation device capable of reducing erroneous operations even in a configuration in which operation spaces are distinguished by the distance from a detection unit.
  • According to one aspect, the operation device operates an image portion displayed on a display screen when an input by an operating body is made on an operation surface. The device includes: detection means, provided on the back side of the operation surface, for detecting movement of the operating body; acquisition means for acquiring an operating body distance from the detection means to the operating body; discrimination means for distinguishing between movement of the operating body detected in a first operation space, in which the operating body distance is less than a predefined threshold distance, and movement of the operating body detected in a second operation space, in which the operating body distance exceeds the threshold distance; and a moving mechanism that, by the pressing force applied from the operating body to the operation surface, moves the operation surface positioned in the second operation space into the first operation space.
  • According to another aspect, the operation device operates an image portion displayed on the display screen when an input by the operating body is made on the operation surface, and includes detection means provided on the back side of the operation surface, a moving mechanism that moves the detection means together with the operation surface by the pressing force applied from the operating body, acquisition means for acquiring a facing surface distance, and discrimination means that compares the facing surface distance with a predefined threshold distance.
  • When the pressing force applied from the operating body to the operation surface is weak, the operation surface or the detection means is located in the second operation space.
  • When a strong pressing force is applied, the operation surface or the detection means moves from the second operation space into the first operation space by the action of the moving mechanism.
  • the operation surface moves from the second operation space to the first operation space according to the strength of the pressing force applied from the operation body.
  • FIG. 5 is a diagram for explaining the configuration of the remote operation device according to the first embodiment, and is a cross-sectional view taken along line V-V in FIG. 4.
  • FIG. 6 is a view for explaining the shape of the elastic member, and is a cross-sectional view taken along line VI-VI in FIG. 5. FIG. 7 is a diagram showing the correlation between the deformation of the elastic member and its restoring force.
  • the remote operation device 100 is mounted on a vehicle and constitutes a display system 10 together with a navigation device 50 and the like as shown in FIG.
  • the remote operation device 100 is installed at a position adjacent to the palm rest 39 at the center console of the vehicle, and exposes the operation surface 70 in a range that can be easily reached by the operator.
  • An operation with an index finger (hereinafter simply referred to as “finger”) F or the like of the operator's hand is input to the operation surface 70.
  • the navigation device 50 is installed in the instrument panel of the vehicle in a posture in which the display screen 52 is exposed and viewed from the operator and the display screen 52 faces the driver's seat.
  • Various display images 60 are displayed on the display screen 52.
  • the display image 60 shown in FIG. 3 is one of a plurality of display images displayed on the display screen 52, and shows an air conditioning menu 60b for operating the air conditioning equipment mounted on the vehicle.
  • the display image 60 includes a plurality of icons 63 associated with a specific function, a focus 62 for selecting the icons 63, a background portion 64 serving as a background of the icons 63 and the focus 62, and the like.
  • the position where the focus 62 is displayed on the display screen 52 corresponds to the position where the finger F touches on the operation surface 70 shown in FIG.
  • the display image 60 described above is generated when the navigation device 50 superimposes a plurality of drawing layers.
  • the remote operation device 100 is connected to a controller area network (CAN, registered trademark) bus 90, an external battery 95, and the like.
  • the CAN bus 90 is a transmission path used for data transmission between in-vehicle devices in an in-vehicle communication network formed by connecting a plurality of in-vehicle devices mounted on a vehicle.
  • the remote operation device 100 is capable of CAN communication with the navigation device 50 located remotely via the CAN bus 90.
  • the remote operation device 100 includes power interfaces 21 and 22, a communication control unit 23, a communication interface 24, a detection unit 31, an operation control unit 33, and the like.
  • Each power interface 21, 22 stabilizes the power supplied from the battery 95 and supplies it to the operation control unit 33.
  • One power interface 21 is always supplied with power from the battery 95.
  • the other power interface 22 is supplied with electric power from the battery 95 when the switch 93 is energized when the accessory (ACC) power source of the vehicle is turned on.
  • the communication control unit 23 and the communication interface 24 are configured to output information processed by the operation control unit 33 to the CAN bus 90 and to acquire information output to the CAN bus 90 from other in-vehicle devices.
  • the communication control unit 23 and the communication interface 24 are connected to each other by a transmission signal line TX and a reception signal line RX.
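  • As a rough illustration of this data path, the sketch below packs a pair of processed coordinates into a single CAN frame using the python-can library. The arbitration ID, payload layout, and SocketCAN channel are assumptions made only for the example; the patent does not specify the CAN message format.

        import can  # python-can

        def send_finger_position(bus: can.BusABC, x: int, y: int) -> None:
            """Pack the relative finger position into one 4-byte CAN frame.

            Assumed payload layout: bytes 0-1 = x, bytes 2-3 = y, little-endian.
            """
            data = x.to_bytes(2, "little") + y.to_bytes(2, "little")
            msg = can.Message(arbitration_id=0x1A0,   # placeholder ID, not from the patent
                              data=data, is_extended_id=False)
            bus.send(msg)

        if __name__ == "__main__":
            # SocketCAN on Linux; the channel name is an assumption.
            with can.interface.Bus(channel="can0", bustype="socketcan") as bus:
                send_finger_position(bus, x=120, y=45)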
  • the detection unit 31 includes a touch sensor 31a, a low-pass filter 31b, and an electrostatic detection IC 31c.
  • The touch sensor 31a is provided on the back side of the operation surface 70, is formed in a rectangular shape along the operation surface 70, and stores electric charge between itself and the finger F.
  • the touch sensor 31a is formed by arranging electrodes extending along the x-axis direction and electrodes extending along the y-axis direction in FIG. 4 in a grid pattern.
  • the low-pass filter 31b in FIG. 1 is a circuit formed by combining a passive resistor, a coil, a capacitor, and the like.
  • the low-pass filter 31b suppresses a high-frequency noise component generated in the touch sensor 31a from being input to the electrostatic detection IC 31c.
  • The electrostatic detection IC 31c is connected to the touch sensor 31a and the operation control unit 33. As shown in FIG. 5, electric charge is stored between the finger F and the touch sensor 31a when they are close to each other.
  • The electrostatic detection IC 31c in FIG. 1 acquires, based on the output of each electrode, a sensitivity value (see FIG. 10) that increases or decreases according to the capacitance between the finger F (see FIG. 5) and that electrode, and outputs it to the operation control unit 33.
  • the operation control unit 33 includes a processor that performs various arithmetic processes, a RAM that functions as a work area for the arithmetic processes, and a flash memory that stores programs used for the arithmetic processes.
  • the operation control unit 33 is connected to the power supply interfaces 21 and 22, the communication control unit 23, the detection unit 31, and the like.
  • the operation control unit 33 includes an acquisition block 34 and a determination block 35 and an association block 36, which will be described in detail later, as functional blocks by executing a predetermined program.
  • the acquisition block 34 acquires the sensitivity value output from the detection unit 31.
  • The acquisition block 34 detects, by calculation processing based on the sensitivity values, an x coordinate and a y coordinate indicating the relative position of the finger F (see FIG. 5) with respect to the operation surface 70, and a z coordinate corresponding to the distance from the touch sensor 31a to the finger F (hereinafter referred to as the "operating body distance d", see A in FIG. 8).
  • the operation control unit 33 outputs the x coordinate and the y coordinate indicating the relative position of the finger F to the CAN bus 90 through the communication control unit 23 and the communication interface 24.
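  • A minimal sketch of this coordinate calculation, under simplifying assumptions: the electrode crossing with the maximum sensitivity value gives the x and y position of the finger, and the peak value itself is mapped to the z coordinate (the operating body distance d) through a calibration function. The grid layout and the toy calibration are illustrative only and are not taken from the patent.

        def locate_finger(sensitivity, z_lookup):
            """Return (x_index, y_index, z) from a 2-D grid of sensitivity values.

            sensitivity: list of rows, one value per electrode crossing.
            z_lookup:    assumed calibration function mapping a peak sensitivity
                         value to a distance (the operating body distance d).
            """
            best_x, best_y, best_val = 0, 0, float("-inf")
            for y, row in enumerate(sensitivity):
                for x, value in enumerate(row):
                    if value > best_val:
                        best_x, best_y, best_val = x, y, value
            # A higher peak sensitivity means the finger is closer to the sensor.
            return best_x, best_y, z_lookup(best_val)

        # Toy example: z shrinks as the peak grows (numbers are illustrative only).
        print(locate_finger([[10, 12, 11], [13, 42, 14], [11, 12, 10]],
                            z_lookup=lambda h: max(0.0, 60.0 - h)))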
  • the navigation apparatus 50 shown in FIGS. 1 and 2 is connected to a CAN bus 90 so that it can communicate with the remote operation device 100 and the like.
  • the navigation device 50 includes a display control unit 53 and a liquid crystal display 51.
  • the display control unit 53 includes a processor that performs various types of arithmetic processing, a RAM that functions as a work area for arithmetic processing, a graphic processor that performs image drawing processing, a graphic RAM that functions as a work area for drawing processing, and the like.
  • The display control unit 53 also has a flash memory for storing data used for the arithmetic and drawing processing, a communication interface connected to the CAN bus 90, and a video output interface for outputting the drawn image data to the liquid crystal display 51.
  • the display control unit 53 draws a display image 60 to be displayed on the display screen 52 based on information acquired from the CAN bus 90. Then, the display control unit 53 sequentially outputs the image data of the drawn display image 60 to the liquid crystal display 51 through the video output interface.
  • the liquid crystal display 51 is a dot matrix type display that realizes color display by controlling a plurality of pixels arranged on the display screen 52.
  • the liquid crystal display 51 displays video by continuously forming image data sequentially acquired from the display control unit 53 on the display screen 52.
  • the lifting mechanism 40 includes a movable support member 41, a fixed support member 44, a first link 47 and a second link 48, an elastic member 80, and the like.
  • the movable support member 41 is formed in a rectangular plate shape, and is movable in the z-axis direction while maintaining a posture substantially parallel to the touch sensor 31a.
  • the sheet member 71 is affixed to the front surface located on the side away from the touch sensor 31a among both plate surfaces of the movable support member 41.
  • connection portions 42a and 42b are provided on the back surface located on the side close to the touch sensor 31a.
  • One set of connection portions 42 a and 42 b is provided on both edge portions of the movable support member 41 in the x-axis direction, and is erected from the back surface of the movable support member 41 toward the fixed support member 44.
  • the connecting portions 42a and 42b are positioned so as to be shifted from each other in the x-axis direction.
  • The fixed support member 44 is formed in a plate shape and is fixed to the center console of the vehicle. The touch sensor 31a is affixed to the back surface of the fixed support member 44. On the front surface of the fixed support member 44, facing the movable support member 41, connection portions 45a and 45b are provided. One set of connection portions 45a and 45b is provided on both sides of the elastic member 80 in the x-axis direction, and is erected from the front surface of the fixed support member 44 toward the movable support member 41. The connection portions 45a and 45b are offset from each other in the x-axis direction.
  • the first link 47 and the second link 48 are formed in a longitudinal shape by a metal member extending in a band shape, and one set is provided on each side of the elastic member 80 in the x-axis direction. Each longitudinal direction of the first link 47 and the second link 48 is oriented in the y-axis direction.
  • the first link 47 and the second link 48 are attached to the movable support member 41 and the fixed support member 44 in a posture in which the respective longitudinal directions are inclined with respect to the y-axis direction.
  • the first link 47 and the second link 48 are shifted from each other in the x-axis direction. Both end portions of the first link 47 and the second link 48 are connected to one of the movable support member 41 and the fixed support member 44 by hinges.
  • one end 47a of both ends of the first link 47 is rotatably attached to the connecting portion 42a of the movable support member 41, and the other 47b of both ends of the first link 47 is fixedly supported. It is rotatably attached to the connecting portion 45b of the member 44.
  • One end 48a of both end portions of the second link 48 is rotatably attached to the connection portion 45a of the fixed support member 44, and the other end 48b of both end portions of the second link 48 is rotatably attached to the connection portion 42b of the movable support member 41.
  • the hinge is allowed to slide in the y-axis direction.
  • the elastic member 80 shown in FIGS. 5 and 6 is formed in a container shape by a material such as rubber that is easily elastically deformed.
  • the elastic member 80 is placed at the center of the fixed support member 44 and is sandwiched between the movable support member 41 and the fixed support member 44.
  • the elastic member 80 is formed with a large diameter portion 81, a small diameter portion 83, a reduced diameter portion 82, and a stopper 85.
  • the large-diameter portion 81 is formed in a cylindrical shape, and is placed on the front surface of the fixed support member 44 in a posture in which the axial direction is aligned with the z-axis direction.
  • the small diameter portion 83 is formed in a cylindrical shape having a smaller outer diameter than the large diameter portion 81.
  • the small diameter portion 83 is in contact with the back surface of the movable support member 41 in a posture in which the axial direction is along the z-axis direction.
  • the small diameter portion 83 is positioned on the same axis as the large diameter portion 81.
  • the reduced diameter portion 82 is formed between the large diameter portion 81 and the small diameter portion 83.
  • The outer diameter of the reduced diameter portion 82 gradually decreases from the large diameter portion 81 toward the small diameter portion 83 along the z-axis direction.
  • the stopper 85 protrudes in a columnar shape from the center of the top wall portion 84 of the elastic member 80 toward the fixed support member 44.
  • the elastic member 80 described above is compressed in the z-axis direction, so that the large diameter portion 81 and the small diameter portion 83 are crushed. At this time, the restoring force in the z-axis direction of the elastic member 80 gradually increases as the deformation amount of the elastic member 80 increases (see FIG. 7). When the elastic member 80 is further compressed, the small diameter portion 83 sinks to the inner peripheral side of the large diameter portion 81. At this time, the restoring force of the elastic member 80 temporarily decreases (see FIG. 7). When the elastic member 80 is further compressed, the restoring force in the axial direction of the elastic member 80 increases again (see FIG. 7). Then, when the stopper 85 comes into contact with the front surface of the fixed support member 44, the compression in the z-axis direction is restricted.
  • When a pressing force Fp stronger than the restoring force of the elastic member 80 is applied from the finger F to the operation surface 70, the operation surface 70 moves downward toward the fixed support member 44 integrally with the movable support member 41. Thereby the distance between the touch sensor 31a and the operation surface 70, and hence the operating body distance d between the touch sensor 31a and the finger F, is shortened.
  • When the pressing force Fp applied to the operation surface 70 is removed, the operation surface 70 rises integrally with the movable support member 41 by the restoring force of the elastic member 80, and the operating body distance d is extended accordingly.
  • the operation mode is switched according to the operation object distance d of the finger F that inputs the movement operation.
  • the determination block 35 shown in FIG. 1 distinguishes the detected movement of the finger F according to the z coordinate acquired by the acquisition block 34.
  • The association block 36 changes, according to the result of this discrimination, which image portion on the display screen 52 is associated with the movement operation of the finger F.
  • the following operation modes (1) to (3) preliminarily defined in the remote operation device 100 will be described in detail.
  • the shallow portion operation mode is set when the finger F is located in the second operation space Sp2.
  • the second operation space Sp2 is a space in which the operating body distance d is not less than the first threshold distance Dth1 and less than the second threshold distance Dth2.
  • the first threshold distance Dth1 is set to be shorter than the distance between the touch sensor 31a and the operation surface 70 of the movable support member 41 in the state farthest from the fixed support member 44 (hereinafter referred to as the “uppermost position”).
  • the second threshold distance Dth2 is set to be, for example, about 0.5 to 1 cm longer than the distance from the touch sensor 31a to the operation surface 70 of the movable support member 41 at the uppermost position.
  • the movement operation for moving the finger F along the xy plane in the second operation space Sp2 is defined as “shallow operation”. Therefore, the movement operation that traces the operation surface 70 that has been raised to the uppermost position is a shallow operation.
  • the deep operation mode is set when the finger F is located in the first operation space Sp1.
  • the first operation space Sp1 is a space in which the operating body distance d is less than the first threshold distance Dth1.
  • The first threshold distance Dth1 is set to be, for example, about 0.5 to 1 cm longer than the distance between the operation surface 70 of the movable support member 41 in the state closest to the fixed support member 44 (hereinafter referred to as the "lowermost position") and the touch sensor 31a.
  • a virtual boundary surface BP between the first operation space Sp1 and the second operation space Sp2 (see A in FIG. 8) is defined.
  • the movement operation for moving the finger F along the xy plane in the first operation space Sp1 is defined as “deep part operation”. Therefore, the moving operation of tracing the operation surface 70 lowered to the lowest position is a deep operation.
  • In the non-proximity mode, the moving operation with the finger F is not associated with any image portion of the display screen 52.
  • the non-proximity mode is set when the finger F is not positioned in either the first operation space Sp1 or the second operation space Sp2 (see A in FIG. 8).
  • the space excluding the first operation space Sp1 and the second operation space Sp2 is set as a non-proximity space.
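  • In short, the three modes follow from classifying the operating body distance d against the two threshold distances, as in the sketch below. The numeric thresholds are placeholders; the description above only fixes them relative to the uppermost and lowermost positions of the operation surface.

        DTH1 = 12.0   # first threshold distance Dth1 in mm (placeholder value)
        DTH2 = 25.0   # second threshold distance Dth2 in mm (placeholder value)

        def operation_mode(d: float) -> str:
            """Map the operating body distance d to an operation mode."""
            if d < DTH1:
                return "deep"            # first operation space Sp1
            if d < DTH2:
                return "shallow"         # second operation space Sp2
            return "non-proximity"       # outside both operation spaces

        for d in (5.0, 18.0, 40.0):
            print(d, operation_mode(d))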
  • The operator who wants to start an icon selection operation moves the finger F (see the two-dot chain line in A of FIG. 8), which is separated from the touch sensor 31a by more than the second threshold distance Dth2, toward the operation surface 70.
  • While the operation mode is the non-proximity mode, the operation with the finger F is not associated with any image portion.
  • When the finger F is moved from the non-proximity space into the second operation space Sp2 and the operator inputs a tap operation that strikes the operation surface 70, the association between the shallow portion operation by the finger F and the scroll control is started.
  • The operation mode is then set to the shallow operation mode, the main menu 60a defined in the upper hierarchy is displayed on the display screen 52, and a plurality of submenu image portions 165 can be scrolled.
  • A reaction force increasing section Zin is a section in which the reaction force Fre gradually increases while the operation surface 70 moves from the uppermost position toward the lowermost position.
  • A reaction force decreasing section Zde is a section that follows the reaction force increasing section Zin in the pushing direction Dp and in which the reaction force Fre gradually decreases.
  • The first threshold distance Dth1 is adjusted in advance so that the descending operation surface 70 passes through the boundary surface BP within the reaction force decreasing section Zde. The operation surface 70 therefore passes through the boundary surface BP in conjunction with the timing at which it is pulled downward by the decrease in the reaction force Fre.
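  • The sketch below models this reaction force profile only qualitatively: the force rises over the reaction force increasing section Zin, dips over the reaction force decreasing section Zde (producing the click), and rises again until the stopper limits the stroke. All break points and slopes are invented for illustration; only the increase-decrease-increase shape follows the description above.

        def reaction_force(stroke_mm: float) -> float:
            """Qualitative reaction force Fre versus push-in stroke (units arbitrary).

            0.0-2.0 mm : reaction force increasing section Zin
            2.0-2.6 mm : reaction force decreasing section Zde (boundary surface BP
                         would be tuned to be crossed somewhere in this range)
            2.6 mm --  : force rises again until the stopper stops the compression
            """
            if stroke_mm <= 2.0:
                return 1.5 * stroke_mm                  # Zin: roughly linear rise
            if stroke_mm <= 2.6:
                return 3.0 - 2.5 * (stroke_mm - 2.0)    # Zde: temporary drop
            return 1.5 + 4.0 * (stroke_mm - 2.6)        # re-stiffening toward the stopper

        for s in (0.5, 1.5, 2.3, 3.5):
            print(f"{s:.1f} mm -> {reaction_force(s):.2f}")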
  • the finger F moves from the second operation space Sp2 (see A in FIG. 8) to the first operation space Sp1.
  • the operation mode of the remote operation device 100 is switched from the shallow operation mode to the deep operation mode.
  • the movement operation of the finger F is related to the focus control.
  • the submenu image portion 165 displayed in the central portion of the display screen 52 is displayed on the entire display screen 52 as the air conditioning menu 60b (see also B in FIG. 8).
  • the air conditioning menu 60b is a submenu defined in the lower hierarchy of the main menu 60a.
  • the operator can superimpose the focus 62 on the arbitrary icon 63 by inputting a deep operation of tracing the operation surface 70 with the finger F. Then, the operator weakens the pressing force Fp applied to the operation surface 70 in a state where the focus 62 is superimposed on the arbitrary icon 63, and moves the finger F away from the touch sensor 31a. Thus, the operator can select an arbitrary icon 63.
  • The operator who has completed the selection of the icon 63 further moves the finger F away from the operation surface 70 into the non-proximity space, and the operation mode is switched to the non-proximity mode. When a predetermined threshold time Tth (see FIG. 9) then elapses, the remote operation device 100 returns to a state of waiting for the operator's next icon selection operation.
  • The operation mode selection processing performed by the operation control unit 33 to realize the above icon selection operation will be described in detail based on FIG. 9, with reference also to FIG. 1.
  • the operation mode selection process shown in FIG. 9 is started by the operation control unit 33 when the ACC power supply of the vehicle is turned on.
  • In S101, the presence or absence of a tap operation on the operation surface 70 at the uppermost position is determined based on the change in the output acquired from the touch sensor 31a. If it is determined in S101 that there is no tap operation, the standby state of the remote operation device 100 is maintained by repeating the determination of S101. If it is determined in S101 that there is a tap operation, the process proceeds to S102.
  • In S102, an acquisition process for acquiring the sensitivity values detected by the electrodes of the touch sensor 31a is performed, and the process proceeds to S103.
  • In S103, calculation processing is performed to obtain, from the sensitivity values acquired in S102, the x, y, and z coordinates (hereinafter referred to as the "input position coordinates") indicating the three-dimensional position of the finger F with respect to the touch sensor 31a, and the process proceeds to S104.
  • the sensitivity value is a value that increases as the capacitance stored between the operation surface 70 and the finger F increases. Therefore, each coordinate in the x-axis direction and the y-axis direction where the sensitivity value is maximum indicates the relative position of the finger F on the operation surface 70.
  • The sensitivity value increases as the operating body distance d (see A in FIG. 8) decreases, and decreases as the operating body distance d increases. The maximum sensitivity value therefore corresponds to the operating body distance d, and thus to the coordinate in the z-axis direction.
  • the operation control unit 33 stores in advance a table indicating a first sensitivity threshold value Hth1 corresponding to the first threshold distance Dth1 and a second sensitivity threshold value Hth2 corresponding to the second threshold distance Dth2. In the processing after S104 shown in FIG. 9, the operation control unit 33 performs processing for comparing the maximum sensitivity value acquired in S103 with the sensitivity threshold values Hth1 and Hth2.
  • In S104, it is determined whether or not the finger F is in the first operation space Sp1 based on whether or not the sensitivity value is greater than or equal to the first sensitivity threshold Hth1. If an affirmative determination is made in S104, the process proceeds to S105, where the operation mode is set to the deep operation mode, and the process then returns to S102.
  • When a negative determination is made in S104, the process proceeds to S106, where it is determined whether or not the finger F is in the second operation space Sp2 based on whether or not the sensitivity value is greater than or equal to the second sensitivity threshold Hth2 and less than the first sensitivity threshold Hth1. If an affirmative determination is made in S106, the process proceeds to S107, where the operation mode is set to the shallow operation mode, and the process then returns to S102. If a negative determination is made in S106, the process proceeds to S108, where the operation mode is set to the non-proximity mode and counting of the elapsed time t is started.
  • In S109, it is determined whether or not the elapsed time t, whose count was started in the current or a previous S108, is equal to or greater than the predetermined threshold time Tth. If an affirmative determination is made in S109, the process returns to S101 and the remote operation device 100 shifts to the standby state. If a negative determination is made in S109, the process returns to S102.
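  • Taken together, S101 to S109 amount to the loop sketched below, which compares the maximum sensitivity value against the stored thresholds Hth1 and Hth2. The sensor-reading and tap-detection callables, the numeric thresholds, and the use of a monotonic clock are assumptions of the sketch, not details from the patent.

        import time

        HTH1 = 80    # sensitivity threshold corresponding to Dth1 (placeholder value)
        HTH2 = 40    # sensitivity threshold corresponding to Dth2 (placeholder value)
        TTH = 3.0    # threshold time Tth in seconds (placeholder value)

        def select_mode_loop(read_peak_sensitivity, tap_detected, apply_mode):
            """Rough equivalent of S101-S109; all three callables are assumed interfaces."""
            while True:
                while not tap_detected():              # S101: wait for the starting tap
                    time.sleep(0.01)
                t_start = None                         # elapsed time t counted from S108
                while True:
                    h = read_peak_sensitivity()        # S102/S103
                    if h >= HTH1:                      # S104 -> S105: deep operation mode
                        apply_mode("deep")
                        t_start = None
                    elif h >= HTH2:                    # S106 -> S107: shallow operation mode
                        apply_mode("shallow")
                        t_start = None
                    else:                              # S108: non-proximity mode, start counting
                        apply_mode("non-proximity")
                        if t_start is None:
                            t_start = time.monotonic()
                        elif time.monotonic() - t_start >= TTH:
                            break                      # S109: return to the standby state
                    time.sleep(0.01)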
  • the operation control unit 33 that has changed the operation mode in S105, S107, and S108 described above outputs a command signal notifying the change of the operation mode to the CAN bus 90 through the communication control unit 23 and the communication interface 24.
  • the display control unit 53 that has acquired the command signal activates the drawing layer corresponding to each operation mode based on the signal.
  • When the operation control unit 33 sets the operation mode to the deep operation mode, the display control unit 53 selects the focus layer as the active drawing layer. The operation control unit 33 can thereby associate the deep operation with the finger F with the focus 62 and change the display mode of the focus 62.
  • When the operation control unit 33 sets the operation mode to the shallow operation mode, the display control unit 53 selects the submenu layer, on which the plurality of submenu image portions 165 are drawn, as the active drawing layer. The operation control unit 33 can thereby associate the shallow operation with the finger F with the submenu image portions 165 and change the display mode of the submenu image portions 165.
  • When the operation control unit 33 sets the operation mode to the non-proximity mode, the display control unit 53 sets the active drawing layer to "none", so the movement operation of the finger F is not associated with any image portion.
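  • The layer selection described above reduces to a small lookup from operation mode to the drawing layer that receives the finger movement, with everything else ignored. The layer names and the 'move' method below are assumptions used only to illustrate the dispatch.

        # Active drawing layer for each operation mode (names are assumptions).
        ACTIVE_LAYER = {
            "deep": "focus_layer",        # deep operation moves the focus 62
            "shallow": "submenu_layer",   # shallow operation scrolls the submenu images 165
            "non-proximity": None,        # finger movement is associated with no layer
        }

        def dispatch_motion(mode: str, dx: float, dy: float, layers: dict) -> None:
            """Forward a finger movement (dx, dy) only to the active drawing layer."""
            layer_name = ACTIVE_LAYER.get(mode)
            if layer_name is not None:
                layers[layer_name].move(dx, dy)   # 'move' is a hypothetical layer method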
  • the operation surface 70 moves from the second operation space Sp2 to the first operation space Sp1 according to the strength of the pressing force Fp applied from the finger F.
  • The operator can grasp in which operation space the finger F is currently located based on the operational feeling of whether or not the operation surface 70 is depressed. Therefore, even though the operation spaces are distinguished by the operating body distance d as described above and the image portion to be operated is switched accordingly, erroneous operations can be reduced.
  • The reaction force Fre that resists the pressing force Fp is generated by the restoring force of the elastic member 80. Therefore, even when the finger F moves in a direction away from the touch sensor 31a, the operation surface 70 can move following the finger F. By maintaining the contact state between the finger F and the operation surface 70 in this way, the operator can reliably keep track of the operational feeling of whether or not the operation surface 70 has been pushed in. Thus, the effect of reducing erroneous operations is exhibited with higher certainty.
  • At the transition from the reaction force increasing section Zin to the reaction force decreasing section Zde, the reaction force Fre turns from increasing to decreasing.
  • Since a click feeling is transmitted to the finger by such a change in the reaction force Fre, the operational feeling of whether or not the operation surface 70 has been pushed in becomes even clearer.
  • the operation space in which the finger F is positioned can be more easily grasped, so that the effect of reducing erroneous operations can be reliably exhibited.
  • In addition, the operation surface 70 passes through the boundary surface BP in conjunction with this click feeling.
  • Since the click feeling transmitted to the finger F and the switching of the operation space are linked, the operator can more easily grasp the operation space in which the finger F is located. Therefore, the effect of reducing erroneous operations is exhibited even more reliably.
  • With the moving mechanism of the first embodiment, even when the operation surface 70 is moved in the pushing direction Dp, the operation surface 70 hardly moves in the directions along the xy plane. This avoids a situation in which the finger F dragged on the operation surface 70 moves in that plane direction during the pushing operation and causes a movement of the image portion that the operator did not intend. Erroneous operations caused by the configuration that allows the operation surface 70 to be raised and lowered can thus be reduced.
  • The display on the display screen 52 is constructed in a hierarchical structure. The shallow portion operation performed in the upper, second operation space Sp2 is associated with the submenu image portions 165 of the main menu 60a, which belongs to the upper hierarchy, and the deep portion operation performed in the lower, first operation space Sp1 is associated with the focus 62 of the air conditioning menu 60b, which belongs to the lower hierarchy.
  • In this way, the logical upper/lower correlation of the hierarchical structure and the physical upper/lower correlation of the operation spaces Sp1 and Sp2 are easily linked in the operator's mind, so the association described above makes erroneous operations even easier to reduce.
  • In the first embodiment, the finger F corresponds to the "operating body" described in the claims, and the touch sensor 31a corresponds to the "detection means" described in the claims.
  • The acquisition block 34, in cooperation with the detection unit 31, corresponds to the "acquisition means" described in the claims.
  • The determination block 35 corresponds to the "determination means" described in the claims, and the association block 36 corresponds to the "association means" described in the claims.
  • The lifting mechanism 40 corresponds to the "moving mechanism" described in the claims.
  • The focus 62 corresponds to the "first image portion" described in the claims, and the submenu image portion 165 corresponds to the "second image portion" described in the claims.
  • The elastic member 80 corresponds to the "reaction force generating portion" described in the claims.
  • The main menu 60a corresponds to the "upper selection image" described in the claims, and the air conditioning menu 60b corresponds to the "lower selection image" described in the claims.
  • The remote operation device 100 corresponds to the "operation device" described in the claims.
  • the second embodiment of the present disclosure shown in FIGS. 11 to 15 is a modification of the first embodiment.
  • the elevating mechanism 240 is accommodated in a case 275 that forms a part of the center console.
  • the touch sensor 231a of the second embodiment can be moved along the z-axis direction together with the operation surface 70 by the elevating mechanism 240.
  • Case 275 is formed of a conductive resin material or the like.
  • the case 275 has a peripheral wall portion 275a and a lid wall portion 275b.
  • the peripheral wall part 275a is erected so as to surround the periphery of the elevating mechanism 240 over the entire circumference.
  • the lid wall portion 275b is formed so as to cover the lifting mechanism 240 from above, and is continuous with the tip portion of the peripheral wall portion 275a in the standing direction.
  • An opening 276 for exposing the operation surface 70 is provided in the center of the lid wall portion 275b.
  • the opening 276 is formed in a rectangular shape slightly smaller than the operation surface 70.
  • In the region of its back surface that faces the operation surface 70 and is located around the opening 276, the lid wall portion 275b forms a facing surface 277.
  • the facing surface 277 is formed in a rectangular frame shape along the outer edge 72 of the operation surface 70, and is arranged to face the outer edge 72.
  • the facing surface 277 can contact the outer edge 72
  • the elevating mechanism 240 holds the touch sensor 231a by the movable support member 241.
  • the touch sensor 231 a is attached to the back surface of the movable support member 241 in a posture along the operation surface 70.
  • the outer edge region 232b of the touch sensor 231a faces the facing surface 277 in the z-axis direction with the movable support member 241 and the sheet member 71 interposed therebetween.
  • The outer edge region 232b can store electric charge between itself and the electrically conductive facing surface 277.
  • In the lifting mechanism 240 having the above configuration, the pressing force Fp applied from the finger F to the operation surface 70 lowers the touch sensor 231a together with the operation surface 70, so that the distance from the touch sensor 231a to the facing surface 277 (hereinafter referred to as the "facing surface distance df") increases.
  • Conversely, the restoring force of the elastic member 80 raises the touch sensor 231a together with the operation surface 70, so that the facing surface distance df decreases.
  • The acquisition block 234 of the operation control unit 33 shown in FIG. 11 detects the operating body distance d in the central region 232a of the touch sensor 231a shown in FIGS. 12 and 13 by calculation processing based on the sensitivity values.
  • the acquisition block 234 shown in FIG. 11 detects the facing surface distance df from the outer edge region 232b to the facing surface 277 by calculation processing based on the sensitivity value. As described above, the acquisition block 234 acquires both the operating body distance d and the facing surface distance df shown in FIGS. 12 and 13 in cooperation with the detection unit 31 including the touch sensor 231a.
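  • The same electrode grid therefore yields two quantities: the operating body distance d from the peak sensitivity of the central region 232a, and the facing surface distance df from the outer edge region 232b. A sketch, with the region split and both calibration functions assumed:

        def acquire_distances(grid, is_outer_edge, d_from_peak, df_from_edge):
            """Split the electrode readings into central and outer-edge regions.

            grid:                dict mapping (x, y) electrode index -> sensitivity value.
            is_outer_edge(x, y): True for electrodes facing the conductive facing surface 277.
            d_from_peak / df_from_edge: assumed calibration functions (sensitivity -> distance).
            """
            central = [v for (x, y), v in grid.items() if not is_outer_edge(x, y)]
            edge = [v for (x, y), v in grid.items() if is_outer_edge(x, y)]
            d = d_from_peak(max(central))              # finger distance above the central region
            df = df_from_edge(sum(edge) / len(edge))   # facing surface distance, averaged over
                                                       # several outer-edge electrodes
            return d, df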
  • the central region 232a of the touch sensor 231a is a region that overlaps the opening 276 in the z-axis direction.
  • The image portion associated with the movement operation of the finger F is changed by the association block 36 (see FIG. 11) depending on the position of the movable support member 241 in the z-axis direction.
  • When the touch sensor 231a is positioned in the second operation space Sp2, the operation mode of the remote operation device 200 is the shallow operation mode.
  • The space in the range where the facing surface distance df is less than the first threshold distance DFth1 is set as the second operation space Sp2.
  • The operation of the finger F tracing the operation surface 70 in a state where the touch sensor 231a is positioned in the second operation space Sp2 is a shallow operation.
  • When the touch sensor 231a is positioned in the first operation space Sp1, the operation mode of the remote operation device 200 is set to the deep operation mode.
  • The space in which the facing surface distance df exceeds the second threshold distance DFth2 is defined as the first operation space Sp1.
  • With the touch sensor 231a positioned in the first operation space Sp1, the operation of the finger F tracing the operation surface 70 is a deep operation.
  • In the discrimination block 235 shown in FIG. 11, the two threshold distances DFth1 and DFth2 described above are set in order to distinguish the movement operation of the finger F made with the touch sensor in the first operation space Sp1 from the movement operation made with it in the second operation space Sp2.
  • the first threshold distance DFth1 is set to be shorter than the second threshold distance DFth2.
  • the first threshold distance DFth1 is a threshold distance used for switching from the deep operation mode to the shallow operation mode when the movable support member 241 is lifted by the reaction force Fre of the elastic member 80 (see B in FIG. 8).
  • the second threshold distance DFth2 is a threshold distance used for switching from the shallow operation mode to the deep operation mode when the movable support member 241 is lowered by the pressing force Fp. According to the setting of the two threshold distances DFth1 and DFth2, a situation in which the switching of the operation mode frequently occurs can be avoided.
  • The boundary surface BP of the second embodiment is defined at a position separated from the facing surface 277 by the second threshold distance DFth2, on the side facing the first operation space Sp1.
  • The operation mode selection processing performed by the operation control unit 33 will be described in detail based on FIG. 14, with reference also to FIGS. 11 to 13.
  • the operation mode selection process shown in FIG. 14 is started by the operation control unit 33 when the ACC power supply of the vehicle is turned on.
  • S201 and S202 are substantially the same as S101 and S102 of the first embodiment.
  • In S203, calculation processing is performed to obtain, from the sensitivity values acquired in S202, the input position coordinates of the finger F detected in the central region 232a and the facing surface distance df detected in the outer edge region 232b, and the process proceeds to S204.
  • In S204, it is determined whether or not the maximum sensitivity value in the central region 232a acquired in S203 is equal to or greater than a predetermined contact sensitivity threshold HthT (see FIG. 15).
  • When the maximum sensitivity value is less than the contact sensitivity threshold HthT, it is estimated that the finger F is not touching the operation surface 70; the process returns to S201 and waits until a tap operation for starting the operation is input.
  • When the maximum sensitivity value is equal to or greater than the contact sensitivity threshold HthT, it is estimated that the finger F is touching the operation surface 70, and the process proceeds to S205.
  • In S205, it is determined whether or not the previous operation mode is the shallow operation mode and the sensitivity value of the outer edge region 232b is equal to or lower than the second sensitivity threshold HFth2 (see FIG. 15).
  • When the touch sensor 231a has moved from the second operation space Sp2 to the first operation space Sp1 and an affirmative determination is therefore made in S205, the process proceeds to S206, where the operation mode is set to the deep operation mode, and the process returns to S202.
  • When a negative determination is made in S205, the process proceeds to S207, where it is determined whether or not the previous operation mode is the deep operation mode and the sensitivity value is equal to or higher than the first sensitivity threshold HFth1 (see FIG. 15).
  • When the touch sensor 231a has moved from the first operation space Sp1 to the second operation space Sp2 and an affirmative determination is made in S207, the process proceeds to S208, where the operation mode is set to the shallow operation mode, and the process returns to S202.
  • When a negative determination is made in S207, the previous operation mode is maintained, and the process returns to S202.
  • the contact sensitivity threshold value HthT used in S204 is a sensitivity value for detecting the contact of the finger F with the operation surface 70.
  • the contact sensitivity threshold value HthT is a value slightly smaller than the sensitivity value detected when the finger F is in contact with the central region 232 a of the operation surface 70.
  • The first sensitivity threshold HFth1 and the second sensitivity threshold HFth2 used in S205 and S207 of FIG. 14 correspond to the sensitivity values detected in the outer edge region 232b when the facing surface distance df equals the first threshold distance DFth1 and the second threshold distance DFth2, respectively.
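  • Putting S201 to S208 together, the switching logic behaves like a small state machine with hysteresis: the mode changes only when the outer-edge sensitivity crosses the threshold that belongs to the transition being made. The numeric thresholds below are placeholders chosen only to respect HFth2 < HFth1.

        HTHT = 90    # contact sensitivity threshold HthT (placeholder value)
        HFTH1 = 70   # outer-edge sensitivity at df = DFth1 (placeholder value)
        HFTH2 = 30   # outer-edge sensitivity at df = DFth2 (placeholder value)

        def next_mode(prev_mode: str, center_peak: float, edge_value: float) -> str:
            """One pass of S204-S208 of the second embodiment (simplified)."""
            if center_peak < HTHT:       # S204: finger not touching -> back to standby (S201)
                return "standby"
            if prev_mode == "shallow" and edge_value <= HFTH2:   # S205 -> S206
                return "deep"            # surface pushed down, df now beyond DFth2
            if prev_mode == "deep" and edge_value >= HFTH1:      # S207 -> S208
                return "shallow"         # surface lifted back, df now below DFth1
            return prev_mode             # otherwise the previous operation mode is kept

        # Pushing the surface down lowers the outer-edge sensitivity:
        print(next_mode("shallow", center_peak=120, edge_value=25))   # -> "deep"
        print(next_mode("deep", center_peak=120, edge_value=75))      # -> "shallow"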
  • As in the first embodiment, the operator can grasp in which operation space the finger F is located, so erroneous operations can be reduced.
  • In addition, the operation control unit 33 can calculate the facing surface distance df based on detection results at a plurality of locations in the outer edge region 232b. Since the detection accuracy of the facing surface distance df is thereby kept high, the operation mode switching based on the operation of pushing in the operation surface 70 can be performed accurately.
  • the touch sensor 231a moves together with the operation surface 70, so that the touch sensor 231a can detect a moving operation by the finger F at a position close to the finger F. According to the above, the touch sensor 231a can easily detect a detailed movement operation by the finger F. Therefore, the moving operation with the finger F is accurately reflected in the movement of the image portion displayed on the display screen 52.
  • In the second embodiment, the elevating mechanism 240 corresponds to the "moving mechanism" described in the claims, and the touch sensor 231a corresponds to the "detection means" described in the claims.
  • The acquisition block 234, in cooperation with the detection unit 31, corresponds to the "acquisition means" described in the claims.
  • The determination block 235 corresponds to the "determination means" described in the claims, and the remote operation device 200 corresponds to the "operation device" described in the claims.
  • the third embodiment of the present disclosure shown in FIGS. 16 to 18 is another modification of the first embodiment.
  • the configuration of the lifting mechanism 340 is different from the configuration of the lifting mechanism 40 (see FIG. 5) in the first embodiment.
  • Regarding the lifting mechanism 340, the structure that differs from the lifting mechanism 40 of the first embodiment will be described in detail.
  • a pair of the first link 347 and the second link 348 of the elevating mechanism 340 are connected to each other by a central hinge 349 at the center portion thereof.
  • the first link 347 and the second link 348 are rotatable with respect to each other by a central hinge 349.
  • the end portion 347b of the first link 347 is connected to the connection portion 345b of the fixed support member 44 by a hinge.
  • the end portion 347b is allowed to slide the hinge in the y-axis direction at the connection portion with the connection portion 345b.
  • the end portion 348b of the second link 348 is connected to the connection portion 342b of the movable support member 41 by a hinge.
  • the end portion 348b is allowed to slide the hinge in the y-axis direction at the connection portion with the connection portion 342b. According to the connection configuration of the links 347 and 348 described above, the operation surface 70 moves only in the z-axis direction substantially orthogonal to the operation surface 70 when a push operation is performed.
  • the lifting mechanism 340 has five elastic members 380.
  • Each elastic member 380 is formed with a large diameter portion 381, a small diameter portion 383, a reduced diameter portion 382, and a stopper (not shown) that are substantially the same as those of the elastic member 80 (see FIG. 6) according to the first embodiment.
  • One of the plurality of elastic members 380 is placed at the center of the fixed support member 44.
  • the other four are arranged close to the four corners of the fixed support member 44 so as to surround the elastic member 380 placed at the center of the fixed support member 44.
  • When the operation surface 70 of the above lifting mechanism 340 is at the uppermost position, it protrudes upward with respect to the surrounding surface 375 (see A in FIG. 18) surrounding the movable support member 41.
  • the operation surface 70 is lowered toward the fixed support member 44 integrally with the movable support member 41 by applying a pressing force Fp stronger than the restoring force by the plurality of elastic members 380 from the finger F.
  • the operation surface 70 that has been moved to the lowest position by such a pushing operation is in a state where there is substantially no step with the surrounding surface 375 (see B in FIG. 18).
  • When the pressing force Fp applied to the operation surface 70 is removed, the operation surface 70 rises integrally with the movable support member 41 by the restoring force of the plurality of elastic members 380.
  • the icon selection operation input to the remote operation device 300 described so far will be described with reference to FIG.
  • the display image 60 displayed on the display screen 52 is a navigation image 360a showing a route to the destination set by the operator.
  • the navigation image 360a includes a plurality of icons 63 associated with the destination, a pointer 362 for selecting the icon 63, a map 364 indicating the form of the road around the vehicle, and the like.
  • the navigation image 360a includes a focus 62 that emphasizes the icon 63 on which the pointer 362 is superimposed.
  • the remote operation device 300 switches the operation mode to the shallow operation mode.
  • the shallow portion operation by the operator is associated with the map 364 displayed on the display screen 52. Therefore, the operator scrolls the map 364 up, down, left, and right by a shallow portion operation that traces the uppermost operation surface 70, and moves an arbitrary icon 63 arranged on the map 364 to the center of the display screen 52. Can do.
  • When the operation surface 70 is lowered to the lowest position by a pushing operation in the pushing direction Dp, the operation mode is switched to the deep operation mode and the pointer 362 is displayed on the display screen 52.
  • the deep operation by the operator is associated with the pointer 362 displayed on the display screen 52. Therefore, the operator can superimpose the pointer 362 on the arbitrary icon 63 by deep operation of tracing the operation surface 70 at the lowest position. Under such a state, an arbitrary icon 63 is selected by weakening the pressing force Fp applied to the operation surface 70 and separating the finger F from the touch sensor 31a.
  • the plurality of elastic members 380 generate a restoring force at different locations. Therefore, even if the pressing force Fp is applied to a position deviated from the center of the operation surface 70, the elevating mechanism 340 stably and reliably generates the reaction force Fre and follows the operation surface 70 as the finger F moves. Can be made. By maintaining the contact state between the finger F and the operation surface 70 in this way, the operator can reliably keep track of the operational sensation as to whether or not the operation surface 70 has been pushed. Therefore, the effect of reducing erroneous operations is exhibited with higher certainty.
  • the operation surface 70 moves substantially only in the z-axis direction and does not move substantially in the direction along the xy plane during the pushing operation. Therefore, a situation in which the finger F dragged on the operation surface 70 moves in the above-described plane direction and causes the movement of the image part not intended by the operator can be avoided. Therefore, erroneous operation caused by the configuration that allows the operation surface 70 to be raised and lowered can be reduced with the above-described configuration.
  • the elevating mechanism 340 corresponds to the “movement mechanism” recited in the claims
  • each elastic member 380 corresponds to the “reaction force generating portion” recited in the claims
  • the remote operation device 300 corresponds to the "operation device" recited in the claims.
  • In the above embodiments, lifting mechanisms having different configurations are used; however, the mechanism for making the operation surface movable is not limited to the support mechanisms (so-called pantographs) of the above embodiments.
  • the arrangement of the links in the elevating mechanism, the connection structure at both ends, and the like may be changed as appropriate.
  • a rail mechanism that allows the movable support member to slide along a guide rail extending along the z-axis direction, for example, may be provided. Further, movement of the operation surface in the xy plane direction accompanying the pressing operation may be allowed.
  • a rubber elastic member (so-called rubber dome) is provided as a configuration for generating a reaction force.
  • the material, hardness, shape, and the like of the elastic member may be changed as appropriate so that a light click feeling is produced.
  • an elastic member made of a resin material such as urethane may generate a reaction force.
  • reaction force may be generated by sandwiching elements such as a coil spring and a leaf spring between the movable support member and the fixed support member.
  • the reaction force may also be generated by the bending of the links that support the movable support member.
  • the click feeling is produced by providing the reaction force increasing section Zin and the reaction force decreasing section Zde in succession.
  • a click feeling need not be produced.
  • hysteresis is provided for switching the operation mode by using two threshold distances, the first threshold distance DFth1 and the second threshold distance DFth2 (see the sketch after this list for a minimal illustration of such hysteresis).
  • hysteresis may be provided for each threshold distance.
  • the first operation space Sp1 is expanded by extending the first threshold distance Dth1.
  • the second operation space Sp2 is expanded by extending the second threshold distance Dth2.
  • the facing surface is formed in a frame shape that surrounds the periphery of the opening.
  • the facing surface only needs to face at least part of the touch sensor, and the shape can be changed as appropriate.
  • the conductivity required of the facing surface may be ensured by attaching a plate-shaped metal member or the like to the case.
  • a display using a plasma display panel, an organic EL display, or the like may form the “display screen”.
  • a windshield or a combiner provided on the top surface of the instrument panel may be used as the “display screen”, and an image may be projected onto the “display screen” by a projection unit such as a projector.
  • a display device that projects an image onto a windshield, a combiner, or the like can also be included in the navigation device as a configuration that forms the display screen.
  • a push button for selecting an arbitrary icon is provided in the vicinity of the operation surface. The operator can select the function of the icon by pressing the push button while the pointer and focus are superimposed on an arbitrary icon.
  • the image portions exemplified as the “first image portion” and the “second image portion” may be changed as appropriate.
  • the functions provided by the operation control unit 33 executing the program may instead be provided by hardware or software different from the above-described control device, or by a combination of these.
  • a function such as “association means” may be provided by a circuit that performs a predetermined function without depending on a program.
  • whether each sensitivity threshold value serving as a boundary value is itself included may be changed as appropriate. Specifically, in S205 of the second embodiment, switching to the deep operation mode may be performed when the sensitivity value is less than the second sensitivity threshold value HFth2. Similarly, in S207, switching to the shallow operation mode may be performed when the sensitivity value exceeds the first sensitivity threshold Hth1.
  • the present disclosure is applied to a remote operation device used in a display system mounted on a vehicle.
  • the present disclosure can also be applied to a so-called touch panel type operation device configured integrally with a display screen.
  • the operation device to which the present disclosure is applied can be used not only in vehicles but also in display systems for various other transportation devices and for various information terminals.
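
As an illustrative summary of the mode-switching behavior described in the remarks above (an operation mode selected from the finger-to-sensor distance, hysteresis between two threshold distances, and the association of the shallow mode with map scrolling and of the deep mode with pointer movement), the following Python sketch may help. It is not part of the disclosure: the class name, the numeric threshold values, and the scroll_map/move_pointer callbacks are hypothetical stand-ins for the elements referenced above (operation surface 70, map 364, pointer 362, and threshold distances such as Dth1 and Dth2).

    # Illustrative sketch only (not from the disclosure): operation-mode switching
    # with hysteresis based on the measured finger-to-sensor distance, and routing
    # of a tracing gesture to either map scrolling (shallow mode) or pointer
    # movement (deep mode). All names and numeric values are hypothetical.

    SHALLOW = "shallow"  # operation surface near its uppermost position
    DEEP = "deep"        # operation surface pushed to its lowest position

    class ModeSwitcher:
        def __init__(self, near_threshold_mm=1.0, far_threshold_mm=3.0):
            # Two different thresholds provide the hysteresis: the mode flips to
            # DEEP only when the distance falls below the near threshold, and
            # flips back to SHALLOW only when it rises above the far threshold.
            assert near_threshold_mm < far_threshold_mm
            self.near = near_threshold_mm
            self.far = far_threshold_mm
            self.mode = SHALLOW

        def update(self, distance_mm):
            """Update the operation mode from the finger-to-sensor distance."""
            if self.mode == SHALLOW and distance_mm < self.near:
                self.mode = DEEP
            elif self.mode == DEEP and distance_mm > self.far:
                self.mode = SHALLOW
            return self.mode

        def handle_trace(self, dx, dy, scroll_map, move_pointer):
            # In the shallow mode a tracing gesture scrolls the map; in the deep
            # mode the same gesture moves the pointer.
            if self.mode == SHALLOW:
                scroll_map(dx, dy)
            else:
                move_pointer(dx, dy)

    # Example: pushing the operation surface down shrinks the distance, switching
    # to the deep mode, so a subsequent trace moves the pointer instead of the map.
    switcher = ModeSwitcher()
    switcher.update(0.5)                       # -> "deep"
    switcher.handle_trace(5, 0, print, print)  # invokes move_pointer(5, 0)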

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention concerns an operation device that acts on an image portion when an input is made to an operation surface (70) by an operating body (F), said image portion being displayed on a display screen (52). The operation device comprises: detection means (31a) which is placed on the rear-face side of the operation surface and detects displacement of the operating body; acquisition means (31, 34) which acquire an operating-body distance (d) between the detection means and the operating body; discrimination means (35) which distinguishes from one another a displacement of the operating body detected in a first operation space (Sp1), in which the operating-body distance is less than a predetermined threshold distance (Dth1), and a displacement of the operating body detected in a second operation space (Sp2), in which the operating-body distance exceeds the threshold distance; and a movement mechanism (40, 340) which moves the operation surface, when positioned in the second operation space, into the first operation space by means of a pressing force (Fp) applied to the operation surface by the operating body.
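
As a rough illustration of the discrimination described in the abstract above, the following Python sketch classifies a detected movement of the operating body by comparing the acquired operating-body distance (d) with the threshold distance (Dth1). This is a sketch under assumptions rather than the claimed implementation; the function name and the returned labels are hypothetical.

    # Illustrative sketch only: classify a detected operating-body movement into
    # the first operation space Sp1 (distance below the threshold Dth1) or the
    # second operation space Sp2 (distance above it).
    FIRST_SPACE = "Sp1"   # operating-body distance below Dth1
    SECOND_SPACE = "Sp2"  # operating-body distance above Dth1

    def classify_operation_space(distance_d, threshold_dth1):
        """Return the operation space in which the movement was detected."""
        # The boundary case (distance_d == threshold_dth1) is assigned arbitrarily
        # here; the description notes that whether a boundary value is included
        # may be chosen as appropriate.
        return FIRST_SPACE if distance_d < threshold_dth1 else SECOND_SPACE
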
PCT/JP2013/007315 2012-12-24 2013-12-12 Dispositif d'action WO2014103221A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2012280406 2012-12-24
JP2012-280406 2012-12-24
JP2013181483A JP5754483B2 (ja) 2012-12-24 2013-09-02 操作デバイス
JP2013-181483 2013-09-02

Publications (1)

Publication Number Publication Date
WO2014103221A1 true WO2014103221A1 (fr) 2014-07-03

Family

ID=51020345

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/007315 WO2014103221A1 (fr) 2012-12-24 2013-12-12 Dispositif d'action

Country Status (2)

Country Link
JP (1) JP5754483B2 (fr)
WO (1) WO2014103221A1 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6304154B2 (ja) * 2015-07-14 2018-04-04 株式会社Soken Operating device
JP6477522B2 (ja) * 2016-01-26 2019-03-06 豊田合成株式会社 Touch sensor device
CN105930049A (zh) * 2016-04-12 2016-09-07 广东欧珀移动通信有限公司 Method and terminal for avoiding erroneous operation

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006350736A * 2005-06-16 2006-12-28 Tokai Rika Co Ltd Pointer display control device and pointing device
JP2008016053A * 2007-08-29 2008-01-24 Hitachi Ltd Display device equipped with a touch panel
JP2011128692A * 2009-12-15 2011-06-30 Panasonic Corp Input device, input method, and input program

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110945462A (zh) * 2017-07-21 2020-03-31 阿尔卑斯阿尔派株式会社 Input device, input device control method, and control program
CN110945462B (zh) * 2017-07-21 2023-06-09 阿尔卑斯阿尔派株式会社 Input device, input device control method, and control program

Also Published As

Publication number Publication date
JP5754483B2 (ja) 2015-07-29
JP2014142914A (ja) 2014-08-07

Similar Documents

Publication Publication Date Title
USRE47028E1 (en) Information processing apparatus, method and computer readable medium for fixing an input position according to an input operation
JP4522475B1 (ja) Operation input device, control method, and program
US9778764B2 (en) Input device
JP4787087B2 (ja) Position detection device and information processing device
US20150205943A1 (en) Manipulation apparatus
EP2426581A2 (fr) Information processing device, information processing method, and computer program
CN106164824B (zh) Operating device for a vehicle
JP5640486B2 (ja) Information display device
JP5858059B2 (ja) Input device
CN106372544B (zh) Temporary secure access via an input object held in place
JP5754483B2 (ja) Operation device
JP5751233B2 (ja) Operation device
WO2013153750A1 (fr) Display system, display device, and operating device
WO2012111227A1 (fr) Touch input device, electronic apparatus, and input method
JP5954145B2 (ja) Input device
US20130311945A1 (en) Input device
WO2015033682A1 (fr) Manipulation input device, portable information processing terminal, method for controlling a manipulation input device, program, and recording medium
JP6115421B2 (ja) Input device and input system
JP5772804B2 (ja) Operation device and operation teaching method for operation device
JP2013073365A (ja) Information processing device
JP2013088912A (ja) Vehicle operation device
WO2014162698A1 (fr) Input device
KR101573287B1 (ko) Method and apparatus for displaying a touch position in an electronic device
US20180292924A1 (en) Input processing apparatus
WO2015093005A1 (fr) Display system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13867826

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13867826

Country of ref document: EP

Kind code of ref document: A1