WO2021193633A1 - Touch input device and program - Google Patents

Touch input device and program

Info

Publication number
WO2021193633A1
Authority
WO
WIPO (PCT)
Prior art keywords
region
touch
mode
detection
detection mode
Prior art date
Application number
PCT/JP2021/011994
Other languages
English (en)
Japanese (ja)
Inventor
稜祐 米山
琢磨 江口
Original Assignee
日本精機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日本精機株式会社
Publication of WO2021193633A1

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/044 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means

Definitions

  • the present invention relates to a touch input device and a program.
  • Patent Document 1 describes a device that includes a touch panel on which the user performs touch input and an acceleration sensor, and that switches the detection sensitivity of the touch input when the acceleration sensor detects that the user has shaken the device.
  • the present invention has been made in view of the above circumstances, and an object of the present invention is to provide a touch input device and a program capable of easily realizing touch input with a plurality of detection sensitivities.
  • The touch input device according to the first aspect of the present invention includes an input detection unit that detects touch input based on a change in the capacitance of a touch unit on which the user performs the touch input, and a sensitivity setting unit that sets a detection sensitivity used when the input detection unit detects the touch input.
  • The sensitivity setting unit sets the detection sensitivity of a first region in the touch unit to a first sensitivity, and sets the detection sensitivity of a second region in the touch unit, different from the first region, to a second sensitivity higher than the first sensitivity. The first region and the second region can be simultaneously present in the touch unit.
  • The program according to the second aspect of the present invention causes a computer to function as an input detection means that detects touch input based on a change in the capacitance of a touch unit on which the user performs the touch input, and a sensitivity setting means that sets a detection sensitivity used when the input detection means detects the touch input. The sensitivity setting means sets the detection sensitivity of a first region in the touch unit to a first sensitivity, and sets the detection sensitivity of a second region in the touch unit, different from the first region, to a second sensitivity higher than the first sensitivity. The first region and the second region can be simultaneously present in the touch unit.
  • According to the present invention, touch input with a plurality of detection sensitivities can be easily realized.
  • the touch input device 100 includes a touch unit 10, a display unit 20, a touch control unit 30, and a main control unit 40.
  • the touch input device 100 enables the operation of a predetermined device based on the touch input made to the touch unit 10 by the user's hand.
  • the display unit 20 and the device 200 shown in FIG. 3 are objects that can be operated by touch input.
  • the device 200 is a device mounted on a construction machine such as a hydraulic excavator.
  • the user of the touch input device 100 is, for example, a driver / operator of a construction machine.
  • the touch input device 100 is detachably provided near the driver's seat of a construction machine, for example.
  • the touch portion 10 is a panel-shaped member that has translucency and has a rectangular shape in a plan view.
  • the touch unit 10 has a cover 11 and a touch sensor 12.
  • the cover 11 is a plate-shaped member formed of a translucent resin such as polymethyl methacrylate resin (PMMA).
  • the front surface of the cover 11 is an input surface that the user's hand performing touch input contacts.
  • the touch input means an operation in which the user touches the input surface of the touch unit 10 by hand.
  • the touch input device 100 is set to a mode in which the touch input can be detected regardless of whether the operation is performed by the user's bare hands or by the hands wearing gloves.
  • the touch sensor 12 is a sheet-shaped capacitive touch sensor, and is provided on the back surface of the cover 11.
  • the touch sensor 12 has a plurality of drive electrodes 12a and a plurality of detection electrodes 12b.
  • the drive electrodes 12a form a band extending in the X direction and are arranged at intervals in the Y direction.
  • the detection electrodes 12b form a band extending in the Y direction and are arranged at intervals in the X direction.
  • the drive electrode 12a and the detection electrode 12b are made of, for example, ITO (Indium Tin Oxide), are insulated from each other, and are arranged so as to intersect with each other.
  • the display unit 20 is composed of an LCD (Liquid Crystal Display), an OLED (Organic Light Emitting Diodes), etc., and is provided behind the translucent touch unit 10 as shown in FIG.
  • the display unit 20 displays an image on the display surface 20a under the control of the main control unit 40.
  • the image displayed on the display surface 20a is visually recognized by the user through the touch portion 10.
  • the touch unit 10 and the display unit 20 form a so-called touch panel. For example, as shown in FIG. 4, a region overlapping the rectangular display surface 20a is set as a touch input detectable region Ad.
  • The touch control unit 30 controls the operation of the touch sensor 12, and includes, for example, a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), a drive circuit for driving the touch sensor 12, and the like. Further, the touch control unit 30 can measure time with a built-in timer.
  • the ROM of the touch control unit 30 stores a program P1 for controlling the operation of the touch sensor 12, including a program for executing the detection mode switching process described later.
  • the touch control unit 30 is composed of, for example, an IC (Integrated Circuit) provided in the vicinity of the touch unit 10.
  • the touch control unit 30 includes an input detection unit 31, a sensitivity setting unit 32, and a state determination unit 33 as main functions.
  • The input detection unit 31 detects touch input by the user based on a change in the capacitance of the touch unit 10. For example, the input detection unit 31 sequentially supplies drive signals to the plurality of drive electrodes 12a of the touch sensor 12 and sequentially selects the plurality of detection electrodes 12b, thereby acquiring the capacitance at each intersection of the drive electrode 12a to which the drive signal is supplied and the selected detection electrode 12b. The input detection unit 31 then generates a capacitance distribution over the coordinates (X, Y) set in the touch sensor 12, each coordinate being defined as an intersection of a drive electrode 12a and a detection electrode 12b, and detects touch input at coordinates (X, Y) where the capacitance exceeds a predetermined threshold value. As described above, the input detection unit 31 and the touch sensor 12 are configured to detect touch input by the mutual capacitance method.
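  • As a rough illustration only, and not part of the original disclosure, this mutual capacitance scan could be sketched in C as follows. The electrode counts and the hardware-access helpers drive_select and read_capacitance are assumptions made for the sake of the example.

```c
#include <stdint.h>

#define NUM_DRIVE   16   /* number of drive electrodes 12a (value assumed)     */
#define NUM_DETECT  24   /* number of detection electrodes 12b (value assumed) */

/* Hypothetical hardware-access helpers; the real register interface is not disclosed. */
extern void     drive_select(int y);               /* supply a drive signal to electrode 12a[y]   */
extern uint16_t read_capacitance(int y, int x);    /* capacitance at the intersection with 12b[x] */

typedef struct { int x; int y; } touch_point_t;

/* Scan the electrode grid once and collect the coordinates (X, Y) whose
 * capacitance exceeds the given threshold; returns how many were found. */
int scan_touch_sensor(uint16_t threshold, touch_point_t *out, int max_out)
{
    int found = 0;
    for (int y = 0; y < NUM_DRIVE; ++y) {          /* sequentially drive each electrode 12a */
        drive_select(y);
        for (int x = 0; x < NUM_DETECT; ++x) {     /* sequentially read each electrode 12b  */
            uint16_t cap = read_capacitance(y, x);
            if (cap > threshold && found < max_out) {
                out[found].x = x;                  /* coordinate = electrode intersection    */
                out[found].y = y;
                ++found;
            }
        }
    }
    return found;
}
```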
  • The sensitivity setting unit 32 sets the detection sensitivity used when the input detection unit 31 detects touch input. Specifically, as shown in FIG. 4, the sensitivity setting unit 32 sets the sensitivity of the first region A1 of the detectable region Ad of the touch unit 10 to the first sensitivity, and sets the sensitivity of the second region A2 to the second sensitivity.
  • The first sensitivity is a sensitivity suitable for touch input by a bare hand, and is realized by setting the capacitance threshold value used when the input detection unit 31 detects touch input to a first threshold value.
  • The second sensitivity is set as a sensitivity suitable for touch input by a hand wearing a glove. The second sensitivity is set higher than the first sensitivity because a gloved hand has lower conductivity than a bare hand.
  • the second sensitivity is realized by setting the threshold value of the capacitance when the input detection unit 31 detects the touch input to a second threshold value lower than the first threshold value.
  • The sensitivity setting unit 32 can change the ratio occupied by each of the first region A1 and the second region A2 in the touch unit 10. The sensitivity setting unit 32 can then switch the detection mode of the touch unit 10 from one of a bare hand mode (an example of the first detection mode), in which the first region A1 is larger than the second region A2, and a glove mode (an example of the second detection mode), in which the second region A2 is larger than the first region A1, to the other.
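  • The region-dependent sensitivity can be pictured as a lookup that returns the capacitance threshold of whichever region currently contains a coordinate; in the following C sketch the small rectangle and the concrete threshold values are illustrative assumptions, and the two regions simply swap places when the detection mode is toggled.

```c
#include <stdbool.h>
#include <stdint.h>

typedef enum { MODE_BARE_HAND, MODE_GLOVE } detection_mode_t;

/* Illustrative threshold values: the second threshold is lower than the first,
 * which is what makes the second sensitivity higher. */
#define FIRST_THRESHOLD   800u   /* first sensitivity, suited to bare-hand input */
#define SECOND_THRESHOLD  300u   /* second sensitivity, suited to gloved input   */

/* Assumed geometry: a small rectangle that overlaps the switching button image. */
static bool in_button_rect(int x, int y)
{
    return (x >= 20 && x < 24 && y >= 0 && y < 4);
}

/* Threshold applied at coordinate (x, y) given the current detection mode. */
uint16_t threshold_for(detection_mode_t mode, int x, int y)
{
    bool small = in_button_rect(x, y);
    if (mode == MODE_BARE_HAND)
        /* Small rectangle = second region A2, remainder = first region A1. */
        return small ? SECOND_THRESHOLD : FIRST_THRESHOLD;
    /* Glove mode: the regions swap places, so A2 covers most of the panel. */
    return small ? FIRST_THRESHOLD : SECOND_THRESHOLD;
}
```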
  • The bare hand mode execution image 1 shown in FIG. 4 is an image displayed on the display surface 20a when the bare hand mode is executed, and includes a switching button image 1a displayed at a position overlapping the second region A2 of the touch unit 10, and a detection mode notification image 1b displayed in the area other than the switching button image 1a.
  • the ratio occupied by the first region A1 in the detectable region Ad is set to be larger than that of the second region A2.
  • The switching button image 1a is displayed in a manner that indicates the location of the second region A2 and notifies the user that the second region A2 has the second sensitivity (that is, that touch input with a glove is possible).
  • the detection mode notification image 1b is an image showing that a wide range of the detectable region Ad other than the second region A2 (that is, the first region A1) is currently suitable for touch input by a bare hand.
  • Specifically, the detection mode notification image 1b indicates, by the characters "BARE" displayed at an arbitrary position in the first region A1, that the area other than the switching button image 1a is suitable for touch input with a bare hand.
  • the glove mode execution image 2 shown in FIG. 4 is an image displayed on the display surface 20a when the glove mode is executed.
  • When the glove mode is executed, the sensitivity setting unit 32 sets the portion that was the second region A2 in the bare hand mode as the first region A1, and sets the portion that was the first region A1 in the bare hand mode as the second region A2. Therefore, when the glove mode is executed, the ratio occupied by the second region A2 in the detectable region Ad is set to be larger than that of the first region A1.
  • The glove mode execution image 2 includes a switching button image 2a displayed at a position overlapping the first region A1 of the touch unit 10, and a detection mode notification image 2b displayed in the area other than the switching button image 2a.
  • The switching button image 2a is displayed in a manner that indicates the location of the first region A1 and notifies the user that the first region A1 has the first sensitivity.
  • Specifically, the first region A1 is indicated by a frame line along its outer edge, and the characters "BARE" indicate that the detection mode can be switched back to the bare hand mode by performing touch input in the first region A1.
  • The detection mode notification image 2b is an image showing that the wide area other than the first region A1 in the detectable region Ad (that is, the second region A2) is currently capable of accepting touch input by a gloved hand.
  • the characters "GLOVE" displayed at an arbitrary position in the second area A2 allow the area other than the switching button image 2a to be touch-input by a hand wearing a glove. It shows that it is possible.
  • When touch input is detected in the first region A1 during execution of the glove mode, the detection mode is switched from the glove mode to the bare hand mode, and the bare hand mode execution image 1 is displayed again on the display surface 20a.
  • The state determination unit 33 determines whether the cover 11 of the touch unit 10 is in a water-covered state, in which water such as rainwater is presumed to be attached, or in a normal state, in which it is not water-covered. For example, the state determination unit 33 determines whether the touch unit 10 is in the water-covered state or the normal state based on the distribution of the capacitance at the coordinates (X, Y) set in the touch sensor 12, in a manner similar to the method by which the input detection unit 31 detects touch input.
  • As a method for determining the water-covered state, a known method can be appropriately adopted. As an example, when the capacitance generated in the detectable region Ad of the touch unit 10 exceeds a threshold value for determining the water-covered state, the state determination unit 33 determines that the touch unit 10 is in the water-covered state. This threshold value is set lower than the second threshold value used when touch input is detected with the second sensitivity described above. In determining the water-covered state, the distribution of the coordinates (X, Y) exceeding this threshold value may also be taken into consideration. Further, the state determination unit 33 may determine whether the touch unit 10 is in the water-covered state or the normal state based on information from a known rainwater sensor (not shown) using, for example, infrared rays.
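  • As one possible reading of this determination (a sketch under assumptions, not the disclosed implementation), the water-covered state could be judged by comparing each cell of a capacitance frame against a water-determination threshold set below the second threshold. The text notes that the distribution of the exceeding coordinates may also be considered, which a real implementation would need in order to distinguish a touch from water; the sketch omits that refinement.

```c
#include <stdint.h>

typedef enum { STATE_NORMAL, STATE_WATER_COVERED } panel_state_t;

/* Threshold for judging the water-covered state; set lower than the second
 * (glove) threshold, mirroring the description above. Value assumed. */
#define WATER_THRESHOLD  200u

/* Judge the panel state from one capacitance frame of n_cells values.
 * Here a single cell above the water threshold is enough; a real
 * implementation could also weigh how the exceeding coordinates are
 * distributed, as the text suggests. */
panel_state_t judge_panel_state(const uint16_t *frame, int n_cells)
{
    for (int i = 0; i < n_cells; ++i)
        if (frame[i] > WATER_THRESHOLD)
            return STATE_WATER_COVERED;
    return STATE_NORMAL;
}
```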
  • The touch control unit 30 having the above functions transmits, to the main control unit 40, detection information indicating the detection state of touch input, detection mode information indicating the current detection mode, and state information indicating whether the touch unit 10 is in the water-covered state or the normal state.
  • the main control unit 40 controls the overall operation of the touch input device 100 while communicating with the electrically connected touch control unit 30.
  • The main control unit 40 is composed of, for example, an MCU (Micro Controller Unit).
  • the main control unit 40 can be timed by a built-in timer.
  • the ROM of the main control unit 40 stores a program P2 for controlling the overall operation of the touch input device 100, including a program for executing the operation mode control process described later.
  • the main control unit 40 includes an information acquisition unit 41, an operation control unit 42, and a display control unit 43 as main functions.
  • the information acquisition unit 41 acquires the above-mentioned detection information, detection mode information, and state information from the touch control unit 30.
  • The operation control unit 42 enables the operation of the target device to be executed in accordance with the detected touch input, based on the detection information acquired by the information acquisition unit 41. For example, the operation control unit 42 creates a motion locus of the user's hand based on the coordinate information of the touch input, compares the created motion locus with the locus patterns stored in the ROM, and identifies the gesture made by the touch input.
  • the identifiable gesture may be various actions such as tapping, swiping, scrolling, pinch-in, and pinch-out.
  • The gesture identification method is arbitrary; for example, a known method such as pattern matching, the NN method (Nearest Neighbor algorithm), or the k-NN method (k-Nearest Neighbor algorithm) can be used.
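  • As a toy illustration of the trajectory-matching idea, rather than the matching actually used by the device, the following C sketch compares a touch trajectory that has already been resampled to a fixed number of points against stored locus patterns and picks the nearest one (a 1-nearest-neighbour classifier). All names and sizes are assumptions.

```c
#include <float.h>

#define PATTERN_POINTS 16   /* number of resampled points per trajectory (assumed) */

typedef struct { float x, y; } pt_t;

typedef struct {
    const char *name;                 /* e.g. "tap", "swipe" (illustrative labels) */
    pt_t points[PATTERN_POINTS];      /* stored locus pattern, already resampled   */
} gesture_pattern_t;

/* Sum of squared point-to-point distances between two resampled trajectories. */
static float trajectory_distance(const pt_t a[PATTERN_POINTS], const pt_t b[PATTERN_POINTS])
{
    float d = 0.0f;
    for (int i = 0; i < PATTERN_POINTS; ++i) {
        float dx = a[i].x - b[i].x;
        float dy = a[i].y - b[i].y;
        d += dx * dx + dy * dy;
    }
    return d;
}

/* 1-nearest-neighbour lookup over the stored patterns; returns the closest one. */
const gesture_pattern_t *identify_gesture(const pt_t locus[PATTERN_POINTS],
                                          const gesture_pattern_t *patterns, int n)
{
    const gesture_pattern_t *best = NULL;
    float best_d = FLT_MAX;
    for (int i = 0; i < n; ++i) {
        float d = trajectory_distance(locus, patterns[i].points);
        if (d < best_d) { best_d = d; best = &patterns[i]; }
    }
    return best;
}
```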
  • the operation control unit 42 outputs an operation command corresponding to the specified gesture to the target device.
  • the data indicating the content of the operation command corresponding to the gesture is stored in the ROM in advance.
  • the display unit 20 and the device 200 shown in FIG. 3 are objects that can be operated under the control of the operation control unit 42.
  • the device 200 is a device mounted on a construction machine as described above, and includes an ECU (Electronic Control Unit) that controls the operation of each part of the construction machine, and various systems configured in the construction machine.
  • the operation control unit 42 controls the operation of the device 200 according to the gesture specified as described above.
  • the operation control unit 42 can transmit the operation command to the device 200 not only by wired communication but also by wireless communication. Therefore, the user can perform a predetermined operation on the device 200 by using the touch input device 100 even when the user is at a position away from the construction machine.
  • the device 200 operates in response to an operation command from the operation control unit 42.
  • In the bare hand mode, the operation control unit 42 can control the operation of the device 200 or the like in accordance with touch input detected in the first region A1. When touch input is detected in the second region A2, the detection mode is switched from the bare hand mode to the glove mode as described above. In the glove mode, the operation control unit 42 can control the operation of the device 200 or the like in accordance with touch input detected in the second region A2. When touch input is detected in the first region A1, the detection mode is switched from the glove mode to the bare hand mode as described above.
  • The operation control unit 42 can switch the operation mode from one of a normal mode, in which the operation of the predetermined device is executed in response to touch input, and an operation invalid mode, in which the operation in response to touch input is disabled, to the other.
  • In the normal mode, the operation corresponding to touch input detected while the detection mode of the touch unit 10 is the bare hand mode or the glove mode is executed. In the operation invalid mode, the operation corresponding to detected touch input is invalidated regardless of whether the detection mode of the touch unit 10 is the bare hand mode or the glove mode.
  • FIG. 6 is a timing chart showing a transition example between the state of the touch unit 10 determined by the touch control unit 30 (state determination unit 33) and the operation mode switched by the main control unit 40 (operation control unit 42). The figure is shown assuming that time elapses to the right.
  • The operation control unit 42 of this embodiment executes the normal mode only when the touch unit 10 is in the normal state; however, the operation mode is not necessarily controlled to the normal mode simply because the touch unit 10 is in the normal state. As shown in FIG. 6, even when the touch unit 10 is in the normal state, the operation mode may be controlled to the operation invalid mode.
  • the trigger for switching the operation mode will be described in detail in the operation mode control process described later.
  • the display control unit 43 controls the display operation of the display unit 20.
  • The display control unit 43 refers to the detection mode information from the touch control unit 30, displays the bare hand mode execution image 1 on the display unit 20 while the bare hand mode is being executed, and displays the glove mode execution image 2 on the display unit 20 while the glove mode is being executed. Further, the display control unit 43 displays a target image (not shown) representing information about the target device that can be operated, in the area other than the switching button image 1a of the image 1 when the bare hand mode is executed, and in the area other than the switching button image 2a of the image 2 when the glove mode is executed.
  • the information indicated by the target image includes not only the information indicating the target device itself, but also the operation items of the target device, the information indicating some functions of the target device, and the like.
  • the target image may be configured to represent information about the target device with characters, figures, icons, and the like. The configuration of the touch input device 100 has been described above.
  • When the detection mode switching process is started, the detection mode of the touch unit 10 is first set to the bare hand mode (step S101).
  • While the bare hand mode is being executed, the bare hand mode execution image 1 shown in FIG. 4 is displayed on the display unit 20 under the control of the main control unit 40.
  • The touch control unit 30 then determines whether or not touch input has been detected in the second region A2 based on the capacitance of the touch sensor 12 (step S102). When touch input is not detected in the second region A2 (step S102; No), the touch control unit 30 continues the bare hand mode (step S101). On the other hand, when touch input is detected in the second region A2 (step S102; Yes), the touch control unit 30 switches the detection mode of the touch unit 10 from the bare hand mode to the glove mode (step S103). While the glove mode is being executed, the glove mode execution image 2 shown in FIG. 4 is displayed on the display unit 20 under the control of the main control unit 40.
  • During execution of the glove mode, the touch control unit 30 determines whether or not touch input has been detected in the first region A1 based on the capacitance of the touch sensor 12 (step S104). When touch input is not detected in the first region A1 (step S104; No), the touch control unit 30 continues the glove mode (step S103). On the other hand, when touch input is detected in the first region A1 (step S104; Yes), the touch control unit 30 switches the detection mode of the touch unit 10 from the glove mode to the bare hand mode (step S101). The above detection mode switching process is continuously executed, for example, while the touch input device 100 is in operation, and can be viewed as the small state machine sketched below. Subsequently, the operation mode control process will be described.
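  • Viewed as a two-state machine, the detection mode switching process of steps S101 to S104 could be sketched as follows. The helper functions and the polling structure are assumptions, and the detection_mode_t type repeats the one used in the earlier sensitivity sketch.

```c
#include <stdbool.h>

typedef enum { MODE_BARE_HAND, MODE_GLOVE } detection_mode_t;   /* as in the sensitivity sketch */

/* Hypothetical helpers reporting whether the latest detected touch lies in
 * the first region A1 or the second region A2. */
extern bool touch_detected_in_a1(void);
extern bool touch_detected_in_a2(void);
extern void show_execution_image(detection_mode_t mode);        /* image 1 or image 2 */

/* One pass of the detection mode switching process (steps S101-S104). */
void detection_mode_step(detection_mode_t *mode)
{
    if (*mode == MODE_BARE_HAND) {            /* S101: bare hand mode running     */
        if (touch_detected_in_a2()) {         /* S102: touch detected in A2?      */
            *mode = MODE_GLOVE;               /* S103: switch to glove mode       */
            show_execution_image(*mode);      /* glove mode execution image 2     */
        }
    } else {                                  /* S103: glove mode running         */
        if (touch_detected_in_a1()) {         /* S104: touch detected in A1?      */
            *mode = MODE_BARE_HAND;           /* back to S101: bare hand mode     */
            show_execution_image(*mode);      /* bare hand mode execution image 1 */
        }
    }
}
```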
  • When the operation mode control process shown in FIG. 8 is started, the main control unit 40 first controls the operation mode of the touch input device 100 to the normal mode (step S201). Subsequently, the main control unit 40 determines whether or not the touch unit 10 is in the water-covered state based on the state information supplied from the touch control unit 30 (step S202). When the touch unit 10 is not in the water-covered state (step S202; No), that is, when it is in the normal state, the main control unit 40 continues the control in the normal mode (step S201).
  • When the touch unit 10 is in the water-covered state (step S202; Yes), the main control unit 40 switches the operation mode from the normal mode to the operation invalid mode (step S203). This switching timing corresponds to t0 in the example of FIG. 6.
  • the main control unit 40 may display an image notifying that the operation is currently in the operation invalid mode on the display unit 20 when the operation mode is changed to the operation invalid mode.
  • Subsequently, the main control unit 40 determines whether or not a predetermined period T1 has elapsed since the operation mode was switched to the operation invalid mode (step S204). When the predetermined period T1 has not elapsed (step S204; No), the main control unit 40 continues the operation invalid mode (step S203).
  • the predetermined period T1 can be arbitrarily set, but can be set to, for example, about 30 seconds.
  • When the predetermined period T1 has elapsed (step S204; Yes), the main control unit 40 determines whether or not the touch unit 10 is in the normal state (step S205). When the touch unit 10 is not in the normal state (step S205; No), the main control unit 40 continues the operation invalid mode (step S203).
  • When the touch unit 10 is in the normal state (step S205; Yes), the main control unit 40 determines whether or not the normal state has continued for a predetermined specific period T2 (specifically, as shown in FIG. 6, whether the specific period T2 has elapsed in the normal state after the predetermined period T1) (step S206). When the normal state has not continued for the specific period T2 (step S206; No), that is, when the specific period T2 has not yet elapsed in the normal state, or when the touch unit 10 has become water-covered again before the specific period T2 elapsed, the main control unit 40 continues the operation invalid mode (step S203).
  • the specific period T2 can be arbitrarily set, but can be set to, for example, about 10 seconds.
  • When the normal state has continued for the specific period T2 (step S206; Yes), the main control unit 40 switches the operation mode from the operation invalid mode to the normal mode (step S201). This switching timing corresponds to t1 in the example of FIG. 6.
  • The above operation mode control process is continuously executed, for example, while the touch input device 100 is in operation; a rough sketch of the process follows.
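  • The operation mode control process of steps S201 to S206 can likewise be sketched as a small state machine driven by a millisecond tick. The period values follow the examples given above (T1 about 30 seconds, T2 about 10 seconds); the structure and names are otherwise assumptions, and the panel_state_t type repeats the one from the water-determination sketch.

```c
#include <stdbool.h>
#include <stdint.h>

typedef enum { STATE_NORMAL, STATE_WATER_COVERED } panel_state_t;   /* as in the water sketch */
typedef enum { OP_NORMAL, OP_INVALID } op_mode_t;

#define PERIOD_T1_MS  30000u   /* predetermined period T1, about 30 s in the example */
#define PERIOD_T2_MS  10000u   /* specific period T2, about 10 s in the example      */

typedef struct {
    op_mode_t mode;
    uint32_t  invalid_since_ms;   /* when the operation invalid mode was entered (t0) */
    uint32_t  normal_since_ms;    /* when the normal state was last regained after T1 */
    bool      counting_t2;        /* currently measuring the specific period T2       */
} op_ctrl_t;

/* One pass of the operation mode control process (steps S201-S206). */
void operation_mode_step(op_ctrl_t *c, panel_state_t state, uint32_t now_ms)
{
    if (c->mode == OP_NORMAL) {                               /* S201, S202 */
        if (state == STATE_WATER_COVERED) {
            c->mode = OP_INVALID;                             /* S203: switch at t0 */
            c->invalid_since_ms = now_ms;
            c->counting_t2 = false;
        }
        return;
    }
    /* Operation invalid mode */
    if (now_ms - c->invalid_since_ms < PERIOD_T1_MS)          /* S204: T1 not yet elapsed   */
        return;
    if (state != STATE_NORMAL) {                              /* S205: still water-covered  */
        c->counting_t2 = false;                               /* T2 restarts from scratch   */
        return;
    }
    if (!c->counting_t2) {                                    /* normal state regained      */
        c->counting_t2 = true;
        c->normal_since_ms = now_ms;
    }
    if (now_ms - c->normal_since_ms >= PERIOD_T2_MS)          /* S206: normal for T2        */
        c->mode = OP_NORMAL;                                  /* back to S201 at t1         */
}
```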
  • The touch input device 100 described above includes an input detection unit 31 that detects touch input based on a change in the capacitance of the touch unit 10 on which the user performs the touch input, and a sensitivity setting unit 32 that sets the detection sensitivity used when the input detection unit 31 detects the touch input.
  • Further, the program P1 causes a computer to function as an input detection means realized as the input detection unit 31 and a sensitivity setting means realized as the sensitivity setting unit 32. The sensitivity setting unit 32 sets the detection sensitivity of the first region A1 to the first sensitivity, sets the detection sensitivity of the second region A2 to the second sensitivity higher than the first sensitivity, and allows the first region A1 and the second region A2 to be present in the touch unit 10 at the same time. According to the touch input device 100 and the program P1, since the first region A1 and the second region A2 can be simultaneously present in the touch unit 10, touch input with a plurality of detection sensitivities can be easily realized.
  • Further, the touch input device 100 includes the translucent touch unit 10 and the display unit 20, which is provided behind the touch unit 10 and displays an image showing at least one of the first region A1 and the second region A2. According to this configuration, it is possible to notify the user that a plurality of touch input detection sensitivities are set in the touch unit 10.
  • For example, the switching button image 1a indicates the second region A2, and the detection mode notification image 1b indicates the first region A1.
  • the detection mode notification image 1b may be omitted. This is because if only one region of the first region A1 and the second region A2 can be notified by an image, the other region can be estimated by the user.
  • the display mode of the image 2 when the glove mode is executed is also arbitrary, and the region may be notified by color coding of the image, a background pattern, or the like.
  • the sensitivity setting unit 32 can change the ratio occupied by each of the first region A1 and the second region A2 in the touch unit 10. Further, the sensitivity setting unit 32 has a first detection mode in which the first region A1 is larger than the second region A2 (corresponding to the bare hand mode), and a second region A2 in which the second region A2 is larger than the first region A1. It is possible to switch the detection mode of the touch unit 10 from one of the two detection modes (corresponding to the glove mode) to the other. Then, when the touch input is detected in the second area A2 during the execution of the first detection mode, the sensitivity setting unit 32 switches the detection mode from the first detection mode to the second detection mode. Further, when the touch input is detected in the first area A1 during the execution of the second detection mode, the sensitivity setting unit 32 switches the detection mode from the second detection mode to the first detection mode.
  • It is preferable that the second region A2 in the first detection mode (bare hand mode) and the first region A1 in the second detection mode (glove mode) be set at the same location in the touch unit 10. This is because the user can intuitively understand that the detection mode can be switched by touching the region set at the same place.
  • The touch input device 100 described above includes an information acquisition unit 41 that acquires state information indicating whether the touch unit 10 on which the user performs touch input is in the water-covered state or the normal state, and an operation control unit 42 capable of executing an operation of a device (for example, the device 200) in response to touch input detected based on a change in the capacitance of the touch unit 10. Further, the program P2 causes a computer to function as an information acquisition means realized as the information acquisition unit 41 and an operation control means realized as the operation control unit 42.
  • The operation control unit 42 can switch the operation mode from one of the normal mode and the operation invalid mode to the other, sets the operation mode to the normal mode in the normal state, and sets the operation mode to the operation invalid mode in the water-covered state. When the touch unit 10 changes from the normal state to the water-covered state while the operation mode is the normal mode, the operation control unit 42 switches the operation mode from the normal mode to the operation invalid mode, and after switching to the operation invalid mode, continues the operation invalid mode for the predetermined period T1 even if the touch unit 10 returns to the normal state. According to the touch input device 100 and the program P2 described above, it is possible to suppress frequent switching of the operation mode.
  • Further, after the operation mode is switched from the normal mode to the operation invalid mode, the operation control unit 42 returns the operation mode to the normal mode when the predetermined period T1 has elapsed and the normal state has then continued for the specific period T2. According to this configuration, it is possible to suppress frequent switching of the operation mode when returning from the operation invalid mode to the normal mode.
  • the state information may be generated based on the change in the capacitance of the touch unit 10. According to this configuration, it is not necessary to provide a dedicated rainwater sensor, so that the number of parts can be reduced. As described above, a known rainwater sensor may be used to detect whether the touch portion 10 is in a normal state or a water-covered state.
  • The touch input device 100 may further include the translucent touch unit 10 and a display unit 20 that is provided behind the touch unit 10 and displays an image notifying the user that the operation mode is the operation invalid mode when the operation mode is the operation invalid mode. According to this configuration, it is possible to notify the user that the predetermined device cannot currently be operated based on touch input.
  • the operation control unit 42 can transmit an operation command to the device 200 by wireless communication, and the device 200 may include a device mounted on a construction machine.
  • the present invention is not limited to this.
  • In the second detection mode, a configuration may be adopted in which the entire detectable region Ad is set as the second region A2. That is, the sensitivity setting unit 32 may be able to switch the detection mode of the touch unit 10 from one of the first detection mode, in which the first region A1 is larger than the second region A2, and a second detection mode, in which the touch unit 10 does not have the first region A1 and the second region A2 is expanded compared to the first detection mode, to the other.
  • In this case as well, when touch input is detected in the second region A2 during execution of the first detection mode, the sensitivity setting unit 32 may switch the detection mode from the first detection mode to the second detection mode.
  • Further, the sensitivity setting unit 32 may switch the detection mode from the second detection mode back to the first detection mode after a certain period has elapsed since the detection mode was switched from the first detection mode to the second detection mode.
  • the fixed period can be set arbitrarily, but for example, it may be a period of several minutes or several tens of minutes.
  • Alternatively, when the configuration according to the above modification is adopted, the sensitivity setting unit 32 may switch the detection mode from the second detection mode to the first detection mode when, for example, touch input is detected in a switching region set in the second region A2 of the second detection mode. In this way, even when the user wears a glove, the detection mode can be returned from the second detection mode (glove mode) to the first detection mode (bare hand mode) by performing touch input in the switching region.
  • the second region A2 of the first detection mode and the switching region of the second detection mode can be set at the same location on the touch unit 10.
  • the switching area can be set at the display location of the switching button image 2a in the image 2 when the glove mode is executed.
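  • For this modification, a return path from the expanded second detection mode could combine both triggers described above, a touch in the switching region and the elapse of a fixed period. The following sketch does so; the period value, the helper function, and the combination itself are assumptions rather than part of the disclosure.

```c
#include <stdbool.h>
#include <stdint.h>

typedef enum { MODE_BARE_HAND, MODE_GLOVE } detection_mode_t;   /* repeated for self-containment */

#define FIXED_PERIOD_MS  600000u   /* e.g. ten minutes; the text only says minutes to tens of minutes */

/* Hypothetical helper: was touch input detected in the switching region
 * (for example, where the switching button image 2a is displayed)? */
extern bool touch_detected_in_switching_region(void);

/* Return paths from the modified second detection mode, in which the whole
 * detectable region Ad is the second region A2. */
void modified_mode_step(detection_mode_t *mode, uint32_t entered_ms, uint32_t now_ms)
{
    if (*mode != MODE_GLOVE)
        return;
    /* Either a touch in the switching region or the elapse of a fixed period
     * returns the detection mode to the first detection mode (bare hand mode). */
    if (touch_detected_in_switching_region() ||
        now_ms - entered_ms >= FIXED_PERIOD_MS) {
        *mode = MODE_BARE_HAND;
    }
}
```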
  • the functions of the touch control unit 30 and the main control unit 40 may be realized by one control unit, or may be realized by the cooperation of three or more control units. Further, the division of each function of each of the touch control unit 30 and the main control unit 40 is not limited to the examples shown in the above embodiments and is arbitrary. For example, some functions of the touch control unit 30 may be realized by the main control unit 40. Therefore, the division of each process constituting the program P1 and the program P2 is also arbitrary, and is not limited to the examples shown in the above embodiments, and either the touch control unit 30 or the main control unit 40 may execute the program P1 and the program P2.
  • Although the programs P1 and P2 are stored in advance in the ROM of the touch input device 100, they may be distributed and provided on a removable recording medium. Further, the programs P1 and P2 may be downloaded from another device connected to the touch input device 100. Further, the touch input device 100 may execute each process according to the programs P1 and P2 by exchanging various data with other devices via a telecommunication network or the like.
  • In the above embodiment, the state determination unit 33 determines whether the cover 11 of the touch unit 10 is in the water-covered state, in which water such as rainwater is presumed to be attached, or in the normal state, in which it is not water-covered. However, the state determination unit 33 may instead determine whether the cover 11 of the touch unit 10 is in a water-covered state (running-water state), in which a water flow is presumed to exist, or in a normal state, in which no water flow exists. In this case, the state determination unit 33 determines that a state in which water droplets whose position does not change are presumed to be attached to the cover 11 is also a normal state.
  • For example, the state determination unit 33 may determine whether a change in capacitance is caused by a water droplet or by a water flow based on the distribution over time of the amount of change in capacitance in the detectable region Ad.
  • In this case, the input detection unit 31 may detect touch input by distinguishing, based on the feature amount of the capacitance distribution, whether a change in capacitance is caused by touch input or by water droplets. Such a configuration may be adopted because it is relatively easy to discriminate between touch input and water whose position on the cover 11 does not change, compared with discriminating between touch input and a water flow.
  • Of course, the state determination unit 33 may determine that the touch unit 10 is in the normal state even when it is estimated that minute water droplets or a minute amount of water is attached to the cover 11.
  • The state determination unit 33 may also change the criterion for estimating the existence of a water flow based on the angle of the touch unit 10 with respect to the direction of gravity. This is because the behavior of a water flow is considered to become quicker as the surface direction of the touch unit 10 approaches the direction of gravity.
  • the state determination unit 33 may detect the angle of the touch unit 10 with respect to the gravity direction based on the input of a gyro sensor (not shown). In this way, the touch input device 100 can detect the water flow more accurately.
  • the sensitivity setting unit 32 can change the ratio occupied by each of the first region A1 and the second region A2 in the touch unit 10, but this ratio also includes 0% and 100%.
  • The ratio of each of the first region A1 and the second region A2 to the detectable region Ad realized by the control of the sensitivity setting unit 32 may be, for example, 70% for the first region A1 and 30% for the second region A2, or 0% for the first region A1 and 100% for the second region A2.
  • Alternatively, the ratio of each of the first region A1 and the second region A2 to the detectable region Ad may be, for example, 60% for the first region A1 and 10% for the second region A2, and the remaining 30% may be an area that does not accept touch input or an area that accepts touch input with another sensitivity.
  • the operation mode control process may be executed on condition that the detection mode of the touch unit 10 is the glove mode. Since the glove mode has high detection sensitivity, it is assumed that the frequency of erroneous detection of touch input due to water exposure of the touch unit 10 increases. Therefore, the operation mode control process is particularly effective in preventing such false detection.
  • the touch input device 100 may have a function of determining whether or not the own machine is attached to or removed from the construction machine. Then, the touch input device 100 may execute the detection mode switching process or the operation mode control process on the condition that the own machine is removed from the construction machine.
  • the shape and configuration of the touch panel composed of the touch unit 10 and the display unit 20 are arbitrary, and a known shape and configuration can be appropriately adopted.
  • the touch input device 100 is used in a construction machine, but the application and the target user are not limited, and it is arbitrary depending on the purpose.
  • the touch input device 100 is suitable for outdoor use.
  • In the above embodiment, touch input is performed by the user's hand. The hand is assumed to be a finger or a palm, but it may be any part of the body distal to the shoulder. Further, as long as touch input can be performed safely, touch input may be performed with a part other than the user's hand (for example, a foot).
  • touch input device 100 does not have to be provided with the operation invalid mode. That is, the touch input device 100 does not have to execute the operation mode control process shown in FIG.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention provides a touch input device and a program capable of easily realizing touch input with a plurality of detection sensitivities. A touch input device (100) includes an input detection unit (31) that detects touch input based on a change in the capacitance of a touch unit (10) on which a user performs the touch input, and a sensitivity setting unit (32) that sets a detection sensitivity used when the input detection unit (31) detects the touch input. The sensitivity setting unit (32) sets the detection sensitivity of a first region in the touch unit (10) to a first sensitivity, and sets the detection sensitivity of a second region in the touch unit (10), different from the first region, to a second sensitivity higher than the first sensitivity. The sensitivity setting unit (32) allows the first region and the second region to be simultaneously present in the touch unit (10).
PCT/JP2021/011994 2020-03-24 2021-03-23 Touch input device and program WO2021193633A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-052552 2020-03-24
JP2020052552 2020-03-24

Publications (1)

Publication Number Publication Date
WO2021193633A1 true WO2021193633A1 (fr) 2021-09-30

Family

ID=77892210

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/011994 WO2021193633A1 (fr) 2020-03-24 2021-03-23 Touch input device and program

Country Status (1)

Country Link
WO (1) WO2021193633A1 (fr)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013254331A (ja) * 2012-06-06 2013-12-19 Panasonic Corp Input device, input support method, and program
JP2014142894A (ja) * 2013-01-25 2014-08-07 Sharp Corp Portable information terminal device and method for setting sensitivity of touch panel
JP2018092497A (ja) * 2016-12-07 2018-06-14 三菱自動車工業株式会社 Operation display device

Similar Documents

Publication Publication Date Title
KR102091597B1 (ko) Portable device and control method therefor
JP5832784B2 (ja) Touch panel system and electronic device using the same
JP5779923B2 (ja) Information processing device, information processing method, and computer program
US8446376B2 (en) Visual response to touch inputs
JP5780438B2 (ja) Electronic device, position designation method, and program
KR101404234B1 (ko) Mobile terminal and control method thereof
KR20150065543A (ко) Mobile terminal and control method thereof
JP2010244132A (ja) User interface device with touch panel, user interface control method, and user interface control program
CN104199604A (zh) Electronic device with touch display screen and information processing method thereof
KR20140023402A (ко) Portable electronic device having interchangeable user interfaces and method thereof
JP5733634B2 (ja) Power management device, power management method, and power management program
JP2015038695A (ja) Information processing device and information processing method
US9201587B2 (en) Portable device and operation method thereof
JP2008107906A (ja) Method for invalidating touch operation on touch panel and touch panel electronic device
CN107621899B (zh) Information processing device, erroneous operation suppression method, and computer-readable storage medium
US20230409163A1 (en) Input terminal device and operation input method
US9727233B2 (en) Touch device and control method and method for determining unlocking thereof
JP5845585B2 (ja) Information processing device
CN206741452U (zh) Vehicle-mounted touch display device and vehicle
WO2021193633A1 (fr) Touch input device and program
KR101348696B1 (ко) Touch-pattern-based touch screen device and control method thereof
WO2021201200A1 (fr) Touch input device and program
US20100038151A1 (en) Method for automatic switching between a cursor controller and a keyboard of depressible touch panels
JP5128991B2 (ja) Information processing device and input device
JP2006085218A (ja) Touch panel operation device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21776856

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21776856

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP