WO2013021685A1 - Device and method for controlling device to be operated in vehicle, and steering wheel - Google Patents


Info

Publication number
WO2013021685A1
WO2013021685A1 (PCT/JP2012/059712)
Authority
WO
WIPO (PCT)
Prior art keywords
touch sensor
input operation
target device
driver
operation target
Prior art date
Application number
PCT/JP2012/059712
Other languages
French (fr)
Japanese (ja)
Inventor
学 唐沢
耕一 中島
良 近藤
亨 土井垣
俊介 福田
涼 井手上
Original Assignee
JVCKENWOOD Corporation (株式会社JVCケンウッド)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2012007894A external-priority patent/JP5821647B2/en
Priority claimed from JP2012043554A external-priority patent/JP5825146B2/en
Priority claimed from JP2012073562A external-priority patent/JP5765282B2/en
Application filed by JVCKENWOOD Corporation (株式会社JVCケンウッド)
Publication of WO2013021685A1 publication Critical patent/WO2013021685A1/en
Priority to US14/176,626 priority Critical patent/US9267809B2/en
Priority to US14/939,375 priority patent/US9886117B2/en


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B62LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62DMOTOR VEHICLES; TRAILERS
    • B62D1/00Steering controls, i.e. means for initiating a change of direction of the vehicle
    • B62D1/02Steering controls, i.e. means for initiating a change of direction of the vehicle vehicle-mounted
    • B62D1/04Hand wheels
    • B62D1/046Adaptations on rotatable parts of the steering wheel for accommodation of switches

Definitions

  • The present invention relates to a control device, a control method, and a steering wheel suitable for operating an operation target device, where the operation target device is an in-vehicle device such as a navigation device mounted on a vehicle, or a vehicle operation control device that controls vehicle operations such as a transmission or a direction indicator.
  • Vehicles in which operation switches for operating in-vehicle devices, such as navigation devices, are arranged on the steering wheel are widely used (see Patent Document 1). If the operation switches are arranged on the steering wheel, the driver does not need to reach toward the in-vehicle device when operating it, which improves operability. As described in Patent Document 1, however, the operation switches are usually arranged not on the annular portion of the steering wheel, which is the gripping portion held by the driver, but on or around the center portion that stores the airbag.
  • Patent Document 2 describes that an operation switch is arranged on the back surface or inner side surface of the annular portion.
  • Since the operation switch of Patent Document 2 is arranged on the annular portion, it can be operated without releasing the hand from the annular portion or greatly shifting the hand.
  • However, the operation switch described in Patent Document 2 is a push-button key or a key provided with unevenness, and this type of key may obstruct the driver's operation of the steering wheel. It is not preferable to provide large irregularities on the annular portion gripped by the driver.
  • Furthermore, when an operation unit such as an operation switch is arranged on the annular portion, it is necessary to prevent the operation target device from being operated inadvertently when the driver has no intention of operating it, for example when the driver merely holds the annular portion for normal driving.
  • The present invention therefore aims to provide a control device and a control method for an operation target device in a vehicle, and a steering wheel, with which the operation target device can be operated without releasing the hand from the gripping portion or greatly shifting the hand, and which greatly reduce the possibility of hindering the driver's operation of the steering wheel. It is another object of the present invention to provide a control device and a control method for an operation target device in a vehicle, and a steering wheel, that can greatly reduce erroneous operations.
  • In order to solve the above problems, there is provided a control device for an operation target device in a vehicle, comprising: a sensor data generation unit (22) that generates sensor data including position data indicating which detection regions are touched, based on a contact detection signal obtained from a touch sensor (21) that has a plurality of detection regions (R) and is mounted in a predetermined range of the gripping portion (200r, 201s) gripped by the driver in the steering wheel (200, 201); a detection unit (10a) that, based on the sensor data, detects whether or not the driver is gripping the gripping portion and detects an input operation on the touch sensor; and a control unit (10) that, when it is detected that the driver is gripping the gripping portion and a specific input operation on the touch sensor is detected, controls the operation target device to be operated by the touch sensor in response to the specific input operation.
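  • The claimed control flow can be sketched roughly as follows. This is a minimal illustrative sketch in Python, not the patent's implementation; the grip threshold, region coordinates, and gesture representation are all assumptions:

```python
from dataclasses import dataclass


@dataclass
class SensorData:
    # (x, y) coordinates of detection regions currently reporting contact
    touched_regions: set


def is_gripped(data: SensorData, grip_threshold: int = 6) -> bool:
    # Detection unit (10a): treat the gripping portion as held when
    # enough detection regions report contact (threshold is illustrative).
    return len(data.touched_regions) >= grip_threshold


def detect_specific_input(data: SensorData, gesture_regions: set) -> bool:
    # A "specific input operation" is simplified here to touching a
    # designated set of regions; the patent describes richer gestures.
    return gesture_regions <= data.touched_regions


def control_unit(data: SensorData, gesture_regions: set, on_operate) -> bool:
    # Control unit (10): operate the target device only when the grip
    # is detected AND the specific input operation is detected.
    if is_gripped(data) and detect_specific_input(data, gesture_regions):
        on_operate()
        return True
    return False
```

The key point mirrored from the claim is the conjunction: the device is controlled only when both the grip and the specific input operation are detected.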
  • There is also provided a method for controlling an operation target device in a vehicle, comprising: detecting, based on a touch sensor (21) that has a plurality of detection regions (R) and is mounted in a predetermined range of the gripping portion (200r, 201s) gripped by the driver in the steering wheel (200, 201), whether or not the driver is gripping the gripping portion; detecting whether or not a specific input operation is performed on the touch sensor; and controlling the operation target device to be operated by the touch sensor when it is detected that the driver is gripping the gripping portion and the specific input operation is detected.
  • There is further provided a steering wheel comprising: a gripping portion (200r) that is a portion gripped by the driver; a touch sensor (21) that has a plurality of detection regions (R) and is mounted so as to cover the gripping portion within a predetermined range thereof; a sensor data generation unit that generates sensor data including position data indicating which detection regions are touched, based on a contact detection signal obtained from the touch sensor; a detection unit (24a) that, based on the sensor data, detects whether or not the driver is gripping the part of the gripping portion where the touch sensor is mounted and detects an input operation on the touch sensor; and a control signal generation unit (24b) that, when the detection unit detects that the driver is gripping the part of the touch sensor and that a specific input operation is performed on the touch sensor, generates a control signal for controlling the operation target device to be operated by the touch sensor.
  • There is further provided a control device for an operation target device in a vehicle, comprising: a first detection unit (10a) that detects that a first area (Arg) of the touch sensor (21) mounted on the gripping portion (200r, 201s) gripped by the driver in the steering wheel (200, 201) is being touched; a second detection unit (10a) that detects that a specific input operation is performed on a second area (Arv) located above the first area of the touch sensor; and a control unit (10) that controls the operation target device to be operated by the touch sensor in accordance with the specific input operation when the first detection unit detects that the first area is touched and the second detection unit detects that the specific input operation is performed.
  • There is further provided a method for controlling an operation target device in a vehicle, in which it is detected that a first area (Arg) of the touch sensor (21) mounted on the gripping portion (200r, 201s) gripped by the driver in the steering wheel (200, 201) is being touched, it is detected that a specific input operation is performed on a second area (Arv) located above the first area of the touch sensor while the first area is touched, and the operation target device to be operated by the touch sensor is controlled in accordance with the specific input operation.
  • There is further provided a control device comprising: a sensor data generation unit (22) that generates sensor data including position data indicating which detection regions are touched, based on a contact detection signal obtained from a touch sensor (21) mounted so as to cover the gripping portion; a detection unit (10a) that detects an input operation on the touch sensor based on the sensor data; and a control unit that controls the operation target device when the detection unit detects that a specific input operation has been performed on the touch sensor.
  • There is further provided a method for controlling an operation target device, in which it is detected whether or not a specific input operation has been performed on a touch sensor (21) that has a plurality of detection regions (R) and is mounted so as to cover the gripping portion within a predetermined range of the gripping portions (200r, 201s) gripped by the driver in the steering wheel (200, 201) of the vehicle, it is detected whether or not the vehicle is in a specific state, and the operation target device to be operated by the touch sensor is controlled when it is detected that the vehicle is in the specific state and that the specific input operation has been performed.
  • There is further provided a control device for an operation target device in a vehicle, comprising: a sensor data generation unit (22) that generates sensor data including position data indicating which detection regions are touched, based on a touch sensor (21) that has a plurality of detection regions (R) and is mounted in a predetermined range of the gripping portion (200r, 201s) gripped by the driver in the steering wheel (200, 201); and a control unit (10) that performs control based on the sensor data.
  • There is further provided a method for controlling an operation target device in a vehicle, in which, based on a touch sensor (21) that has a plurality of detection regions (R) and is mounted in a predetermined range of the gripping portion (200r, 201s) gripped by the driver in the steering wheel (200, 201), a transition is made from a state in which a first specific input operation for operating the operation target device to be operated by the touch sensor is not accepted to a state in which it is accepted.
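  • The state transition described above, from not accepting the first specific input operation to accepting it, can be sketched as a minimal gate object. The unlock trigger below is an assumed placeholder (e.g. grip detection), not specified by the text:

```python
class OperationGate:
    """Minimal sketch of the claimed two-state behavior."""

    def __init__(self):
        # Initially the first specific input operation is NOT accepted.
        self.accepting = False

    def on_unlock_condition(self):
        # Assumed trigger (e.g. the grip on the touch sensor is detected):
        # transition to the accepting state.
        self.accepting = True

    def handle_specific_input(self) -> bool:
        # Returns True only after the gate has transitioned to accepting.
        return self.accepting
```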
  • the operation target device can be operated without releasing the hand from the gripping part or greatly shifting the hand.
  • the possibility of hindering the operation of the steering wheel can be greatly reduced.
  • erroneous operations can be greatly reduced.
  • FIG. 1 is a block diagram illustrating each embodiment of a control device for an operation target device in a vehicle.
  • FIG. 2 is a partial plan view illustrating an example of a vehicle including a control device for an operation target device according to each embodiment.
  • FIG. 3 is a diagram illustrating an example of a position and a range where the touch sensor according to each embodiment is mounted on the steering wheel.
  • FIG. 4 is a diagram illustrating another example of a position and a range in which the touch sensor according to each embodiment is mounted on the steering wheel.
  • FIG. 5 is a diagram illustrating an example in which the touch sensor is mounted on the modified steering wheel.
  • FIG. 6 is a partial perspective view showing an example of a portion where sensor data can be obtained in the state where the touch sensor portion of the steering wheel is gripped.
  • FIG. 7 is a cross-sectional view showing coordinates in the circumferential direction of the cross section in the touch sensor.
  • FIG. 8 is a plan view showing a state in which the touch sensor shown in FIG. 6 is developed.
  • FIG. 9 is a schematic diagram illustrating a state in which each area illustrated in FIG. 8 is converted into an equal size.
  • FIG. 10 is a diagram illustrating an example of requirements for determining that the touch sensor portion of the steering wheel is being gripped.
  • FIG. 11 is a schematic diagram illustrating an example of a specific input operation with respect to the touch sensor.
  • FIG. 12 is a schematic diagram illustrating another example of the specific input operation with respect to the touch sensor.
  • FIG. 13 is a schematic diagram illustrating still another example of the specific input operation with respect to the touch sensor.
  • FIG. 14 is a flowchart for explaining the operation of each embodiment.
  • FIG. 15 is a schematic perspective view illustrating a configuration example for changing the color when the touch sensor is operated.
  • FIG. 16 is a schematic perspective view showing a configuration example for changing the sense of touch when the touch sensor is operated.
  • FIG. 17 is a plan view showing an embodiment of a steering wheel.
  • FIG. 18 is a diagram for explaining the rotation angle of the steering wheel.
  • FIG. 19 is a flowchart showing an example of specific processing in step S4 of FIG.
  • FIG. 20 is a schematic diagram illustrating an example of a state where the driver is holding the touch sensor portion for normal driving.
  • FIG. 21 is a schematic diagram illustrating an example of a state where the driver is holding the touch sensor and is about to operate the operation target device.
  • FIG. 22 is a schematic diagram showing a state in which the operation invalid area Ariv in FIG. 11 is omitted.
  • FIG. 23 is a plan view for explaining another configuration example that shows a state where the touch sensor shown in FIG. 6 is developed and that distinguishes whether or not the driver is going to operate the operation target device.
  • FIG. 24 is a diagram illustrating an example of an input operation when the same input operation is performed with the left and right hands at the same timing with respect to the left and right touch sensors.
  • FIG. 25 is a diagram illustrating an example when the input operations are regarded as having the same timing.
  • FIG. 26 is a diagram illustrating an example of an input operation when a predetermined input operation is continuously performed with the left and right hands with respect to the left and right touch sensors.
  • FIG. 27 is a diagram illustrating an example in the case of being regarded as a continuous input operation.
  • FIG. 28 is a diagram illustrating a first example in which an operation mode is set by a combination of input operations by left and right hands with respect to left and right touch sensors.
  • FIG. 29 is a diagram illustrating a second example in which the operation mode is set by a combination of input operations by the left and right hands with respect to the left and right touch sensors.
  • FIG. 30 is a diagram illustrating an example in which each area of the touch sensor is color-coded.
  • FIG. 31 is a diagram illustrating an example in which a marker is attached to the boundary of the area of the touch sensor.
  • FIG. 32 is a diagram illustrating an example in which the diameter in the operation detection area of the touch sensor is reduced.
  • FIG. 33 is a diagram illustrating an example in which the diameter in the operation detection area of the touch sensor is increased.
  • FIG. 34 is a diagram illustrating an example in which a recess is provided at the boundary of the area of the touch sensor.
  • FIG. 35 is a diagram illustrating an example in which a convex portion is provided at the boundary of the area of the touch sensor.
  • FIG. 36 is a diagram illustrating an example in which the color of the operation detection area is changed when it is detected that the grip detection area of the touch sensor is gripped.
  • FIG. 37 is a diagram illustrating an example in which the tactile sensation of the operation detection area is changed when it is detected that the grip detection area of the touch sensor is gripped.
  • FIG. 38 is a diagram illustrating an example of a locus when a finger is slid in the left-right direction.
  • FIG. 39 is a diagram for explaining the correction of the locus when the finger is slid in the right direction.
  • FIG. 40 is a diagram for explaining the correction of the trajectory when the finger is slid downward.
  • FIG. 41 is a diagram for explaining an example of realizing diagonal dragging.
  • FIG. 42 is a partial perspective view for explaining the definition of dragging in the horizontal direction and the vertical direction in the eighth embodiment.
  • FIG. 43 is a plan view for explaining the definition of dragging in the horizontal direction and the vertical direction in the eighth embodiment with the touch sensor deployed.
  • FIG. 44 is a plan view showing a configuration example in which a modified steering wheel is developed.
  • FIG. 45 is a partially enlarged plan view of FIG. 44. FIG. 46 is a cross-sectional view taken along line A-A in FIG. 45.
  • FIG. 47 is a cross-sectional view taken along the line BB of FIG. 45 for explaining the on / off switching operation by the on / off switching mechanism.
  • FIG. 48 is a flowchart for explaining the operation of the eighth embodiment when the modified steering wheel of FIG. 44 is used.
  • FIG. 49 is a schematic diagram illustrating an example of gripping state identification data indicating how the driver is gripping the steering wheel of the part to which the touch sensor is attached.
  • FIG. 50 is a schematic diagram showing a modification of FIG. 49, presented in order to facilitate understanding of FIG. 49.
  • FIG. 51 is a schematic diagram illustrating another example of gripping state identification data indicating how the driver is gripping the steering wheel of the part to which the touch sensor is attached.
  • FIG. 52 is a diagram illustrating an example of driver specifying data registered in the driver database.
  • FIG. 53 is a flowchart for explaining an operation when a driver is specified.
  • FIG. 54 is a partial perspective view illustrating still another example of gripping state identification data indicating how the driver is gripping the steering wheel of the portion where the touch sensor is mounted.
  • the in-vehicle device 100 is mounted in a dashboard of a vehicle.
  • the in-vehicle device 100 includes a control unit 10, a navigation processing unit 11, an audio playback unit 12, a television (TV) tuner 13, a video signal processing unit 14, a video display unit 15, and an audio signal processing unit 16.
  • the control unit 10 includes a detection unit 10a.
  • the navigation processing unit 11 has a storage unit that holds map data, a GPS antenna, and the like, and the control unit 10 and the navigation processing unit 11 cooperate to provide route guidance.
  • the audio reproducing unit 12 reproduces an audio signal recorded on an optical disc such as a compact disc or a semiconductor memory according to control by the control unit 10.
  • the TV tuner 13 receives a TV broadcast wave signal of a predetermined broadcast station under the control of the control unit 10.
  • the video signal output from the navigation processing unit 11 or the TV tuner 13 is input to the video signal processing unit 14 via the control unit 10 and processed, and displayed on the video display unit 15 such as a liquid crystal panel.
  • the audio signal output from the navigation processing unit 11, the audio reproduction unit 12, and the TV tuner 13 is input to the audio signal processing unit 16 through the control unit 10, processed, and output from the external speaker 20.
  • the audio signal processing unit 16 includes an amplification unit.
  • the speaker 20 is installed inside the door of the vehicle.
  • the display element 17 is, for example, a light emitting diode (LED), and is turned on or off according to the contact state of the touch sensor 21 described later according to control by the control unit 10.
  • the display element 17 is disposed, for example, in a housing of the in-vehicle device 100 so that the driver can visually recognize the display element 17.
  • the display element 17 may be arranged away from the in-vehicle device 100 and in the vicinity of the steering wheel 200 of the vehicle.
  • the storage unit 18 is a nonvolatile memory.
  • the touch sensor 21 serving as the operation unit is attached to the annular portion 200r of the steering wheel 200.
  • the annular portion 200r is a gripping portion that is a portion that the driver grips during driving.
  • the touch sensor 21 is mounted in a predetermined angular range on each of the left and right sides of the annular portion 200r.
  • the touch sensor 21 is a so-called multi-point detection (multi-touch) touch sensor that can detect contact at a plurality of locations.
  • the touch sensor 21 is preferably mounted over the full 360° of the circumference of the radial cross section of the annular portion 200r. Even if the range is less than 360°, it suffices to cover substantially the entire circumference of the cross section of the annular portion 200r.
  • the driver is holding the portion of the annular portion 200r where the touch sensor 21 is attached.
  • the output of the touch sensor 21 is input to the sensor data generation unit 22.
  • a contact detection signal is input to the sensor data generation unit 22.
  • the sensor data generation unit 22 generates sensor data including position data indicating from which position of the touch sensor 21 the contact detection signal is obtained based on the input contact detection signal, and supplies the sensor data to the control unit 10.
  • the touch sensor 21 and the sensor data generation unit 22 may be integrated, or the sensor data generation unit 22 may be provided in the control unit 10.
  • a projected capacitive (mutual capacitance) type touch sensor can be used.
  • a flexible touch panel developed by the Micro Technology Research Institute can be employed. This flexible touch panel has a structure in which the sensor portion is made of ultra-thin glass with a thickness of 0.02 to 0.05 mm, bonded to a PET (polyethylene terephthalate) film. Even when this touch sensor 21 is attached to the annular portion 200r, it has no irregularities perceptible by a hand or finger, so there is almost no possibility of it hindering the driver's operation of the steering wheel 200.
  • the steering angle sensor 31 detects the rotation angle of the steering wheel 200.
  • the direction indicator sensor 32 detects an operation of the direction indicator 320.
  • the shift lever sensor 33 detects where the shift position by the shift lever 330 is.
  • the detection signals of the steering angle sensor 31, the direction indicator sensor 32, and the shift lever sensor 33 are supplied to the control unit 10 via the in-vehicle communication unit 34.
  • FIG. 3A shows an example in which the touch sensor 21 is attached to the entire circumference of the annular portion 200r.
  • FIG. 3B is the same as FIG. 2, and is an example in which the touch sensors 21 are mounted apart from each other in predetermined angular ranges on the left and right above the annular portion 200r.
  • FIG. 3C shows an example in which the touch sensor 21 is mounted in a predetermined angle range only on the right side above the annular portion 200r.
  • FIG. 3D shows an example in which the touch sensors 21 are mounted apart from each other within a predetermined angular range on the left and right sides below the annular portion 200r.
  • FIG. 3E shows an example in which the touch sensor 21 is mounted in a relatively wide angle range above including the top of the annular portion 200r.
  • FIG. 3E corresponds to a combination of the left and right touch sensors 21 in FIG. 3B.
  • FIG. 4 is an example in which the left and right touch sensors 21 in FIG. 3B are divided into an upper touch sensor 21a and a lower touch sensor 21b.
  • the upper touch sensor 21a mainly detects contact with the index finger and thumb, while the lower touch sensor 21b mainly detects contact with the palm, middle finger, and ring finger.
  • FIG. 5 shows an example in which the touch sensor 21 is mounted on a deformed steering wheel 201 that is not circular.
  • the touch sensor 21 is attached to the left and right straight portions 201 s of the modified steering wheel 201.
  • the driver operates by grasping the straight portion 201s that is the gripping portion, and the touch sensor 21 detects a contact with a palm or a finger.
  • FIG. 6 shows an example of a range in which the palm and the finger are in contact when the driver grips the right touch sensor 21 in FIG.
  • the manner in which the driver grips the annular portion 200r and the size of the driver's hand are not uniform, so FIG. 6 is merely an example.
  • in FIG. 6, the plurality of detection regions R indicated by hatching Tp are the portions where palm contact is detected, and the plurality of detection regions R indicated by hatching Tt are the portions where thumb contact is detected. These are referred to as the palm contact detection unit Tp and the thumb contact detection unit Tt, respectively.
  • An index finger contacts the back side of the touch sensor 21, which faces the traveling direction of the vehicle and is not visible in FIG. 6.
  • the touch sensor 21 has a plurality of detection regions R as detection portions for detecting contact of a palm or a finger, and coordinates are set in each detection region R. As shown in FIG. 6, in the circumferential direction of the annular portion 200r, the detection region R located at the lower end of the touch sensor 21 is set to coordinate 0, and coordinates 1, 2, ..., 30, 31 are set toward the detection region R located at the upper end. The coordinate in the circumferential direction of the annular portion 200r in the touch sensor 21 is defined as the Y coordinate.
  • FIG. 7 is a cross-sectional view of the annular portion 200r cut in the radial direction of the annular portion 200r at the portion where the touch sensor 21 is mounted.
  • in the circumferential direction of this cross section, the detection region R located on the inner diameter side of the annular portion 200r is set to coordinate 0, and coordinates 1, ..., 21, 22 are set around the circumference. The coordinate in the circumferential direction of the cross section in the touch sensor 21 is defined as the X coordinate.
  • the sensor data generation unit 22 can obtain position data indicating where the driver is touching the touch sensor 21 based on the X coordinate and Y coordinate of the detection region R from which the contact detection signal is obtained.
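  • The coordinate scheme above can be illustrated as follows. This sketch assumes a row-major numbering of the detection regions (X = 0..22 around the cross section, Y = 0..31 along the ring), which the text does not specify:

```python
X_REGIONS = 23  # coordinates 0..22 around the cross-section circumference
Y_REGIONS = 32  # coordinates 0..31 along the annular portion


def region_to_xy(index: int):
    # Assumed layout: regions numbered row-major, all X values for Y=0 first.
    y, x = divmod(index, X_REGIONS)
    if y >= Y_REGIONS:
        raise ValueError("region index out of range")
    return x, y


def position_data(active_region_indices):
    # Position data as the sensor data generation unit (22) might emit it:
    # the (X, Y) coordinates of every touched detection region.
    return sorted(region_to_xy(i) for i in active_region_indices)
```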
  • FIG. 9 schematically shows a state in which each area of the touch sensor 21 shown in FIG. 8 is converted into an equal size.
  • an index finger contact detection unit Ti which is a plurality of detection regions R with which the index finger is in contact, is also shown.
  • the touch sensor 21 also detects contact of those fingers.
  • the driver uses a thumb or index finger suitable for performing a specific input operation on the touch sensor 21 as a finger for the operation.
  • the detection unit 10 a of the control unit 10 detects that an input operation has been performed on the touch sensor 21 with the thumb or index finger based on the sensor data output from the sensor data generation unit 22. Based on the sensor data output from the sensor data generation unit 22, the detection unit 10a also detects that the annular portion 200r (touch sensor 21) is being gripped.
  • the control unit 10 controls the operation target device according to a specific input operation performed on the touch sensor 21.
  • the operation target device is the in-vehicle device 100 as an example.
  • the control unit 10 executes control related to route guidance in the navigation processing unit 11 according to a specific input operation, plays back or stops an audio signal in the audio playback unit 12, and advances or returns a track (piece of music). Further, the control unit 10 can switch the reception channel of the TV tuner 13 or control the amplification unit of the audio signal processing unit 16 to decrease or increase the volume according to a specific input operation.
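  • Such per-operation control could be organized as a dispatch table mapping recognized input operations to device actions. The sketch below is illustrative only; the gesture names and action methods are assumptions, not from the text:

```python
def make_dispatcher(device):
    # Map each recognized input operation to a device action.
    # Names are assumed for illustration (the text lists route guidance,
    # play/stop, track skip, TV channel switching, and volume control).
    actions = {
        "swipe_up": device.volume_up,
        "swipe_down": device.volume_down,
        "swipe_left": device.previous_track,
        "swipe_right": device.next_track,
        "double_tap": device.toggle_playback,
    }

    def dispatch(operation: str) -> bool:
        handler = actions.get(operation)
        if handler is None:
            return False  # unrecognized operation: ignore it
        handler()
        return True

    return dispatch
```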
  • the operation target device is a vehicle operation control device that controls the operation of the vehicle.
  • the control unit 10 may control a transmission, a direction indicator, an air conditioner on / off, a set temperature of the air conditioner, and the like via the in-vehicle communication unit 34.
  • the operation target device is a vehicle motion control device
  • the control unit that controls the operation target device may be the control unit 10 in the in-vehicle device 100 or may be a control unit outside the in-vehicle device 100 provided in the vehicle.
  • in the present embodiment, the extremely thin touch sensor 21 is attached to the annular portion 200r gripped by the driver, and the operation target device is operated by operating the touch sensor 21, so the operation target device can be operated without releasing the hand from the annular portion 200r or greatly shifting the hand. Further, since the touch sensor 21 creates no irregularities on the surface of the annular portion 200r, there is almost no possibility of it interfering with the driver's operation of the steering wheel 200.
  • however, it is necessary to prevent the operation target device from being operated inadvertently when the driver has no intention of operating it, for example when the driver holds the annular portion 200r for normal driving. Therefore, in this embodiment, the following is performed in order to avoid erroneous operations not intended by the driver.
  • the plurality of detection areas R on the touch sensor 21 are divided into: a grip detection area Arg for detecting palm contact; an operation detection area Arv for detecting operation input by the thumb or index finger; and an operation invalid area Ariv, an intermediate area between the grip detection area Arg and the operation detection area Arv in which operation input is invalidated.
  • the palm contact detection unit Tp is located in the grip detection area Arg
  • the thumb contact detection unit Tt and the index finger contact detection unit Ti are located in the operation detection area Arv.
  • the operation invalid area Ariv has detection regions R that detect palm or finger contact, but the control unit 10 (detection unit 10a) or the sensor data generation unit 22 performs processing so as to invalidate input operations from the operation invalid area Ariv, thereby making the area an operation invalid area. Alternatively, the touch sensor 21 may simply not be provided with detection regions R within the range of the operation invalid area Ariv. This case is substantially equivalent to the example shown in FIG.
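The handling of the operation invalid area Ariv described above can be pictured as a simple software filter that discards touches before gesture processing. The sketch below is illustrative only: the area boundaries, coordinate convention, and function names are assumptions, not taken from the embodiment.

```python
# Hypothetical realization of the operation invalid area Ariv in software:
# touches whose Y coordinate falls inside Ariv are discarded before any
# gesture processing. All boundary values are illustrative.

ARG_Y_MAX = 3    # grip detection area Arg: rows 0..3
ARIV_Y_MAX = 5   # operation invalid area Ariv: rows 4..5
                 # operation detection area Arv: rows 6 and above

def classify_touch(y):
    """Return which area a touch at row y belongs to."""
    if y <= ARG_Y_MAX:
        return "Arg"
    if y <= ARIV_Y_MAX:
        return "Ariv"
    return "Arv"

def filter_touches(touches):
    """Drop touches that fall in the operation invalid area Ariv."""
    return [(x, y) for (x, y) in touches if classify_touch(y) != "Ariv"]
```

Physically removing the detection regions R from the Ariv range, as the text also allows, would make this filter unnecessary.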
  • when the driver simply holds the annular portion 200r, the palm contact detection unit Tp, the thumb contact detection unit Tt, and the index finger contact detection unit Ti are relatively close to each other. Therefore, in this embodiment, the operation invalid area Ariv is provided in order to accurately distinguish between a case where the driver simply holds the annular portion 200r and a case where the driver touches the touch sensor 21 in order to operate the operation target device.
  • when the driver wants to operate the operation target device, the driver intentionally extends the thumb or index finger and touches the touch sensor 21 to perform a specific input operation described later.
  • the control unit 10 controls the operation target device according to the input operation when a specific input operation described later is performed in the operation detection area Arv.
  • the detection unit 10a determines that the annular portion 200r is gripped when the palm contact detection unit Tp having a predetermined area or more is obtained in the grip detection area Arg.
  • the control unit 10 controls the operation target device when the annular portion 200r is determined to be gripped and a specific operation is performed in the operation detection area Arv. The area of the palm contact detection unit Tp used to determine that the annular portion 200r is being gripped may be set appropriately by statistically examining the contact area when a plurality of drivers hold the steering wheel 200 in a normal manner.
  • the area of the palm contact detection portion Tp in the grip detection area Arg is an example of a requirement for determining that the driver is holding the annular portion 200r, and is not limited to this requirement.
  • FIG. 10 shows a cross section in which the annular portion 200r is cut at the grip detection area Arg of the touch sensor 21.
  • the detection unit 10a can determine that the annular portion 200r is gripped when the angle ⁇ in the circumferential direction of the palm contact detection unit Tp is equal to or greater than a predetermined angle.
  • the predetermined angle is, for example, 180 °.
  • since the operation detection area Arv is provided at a position separated from the grip detection area Arg by a predetermined distance, it is possible to accurately detect that the driver is intentionally performing a specific input operation on the touch sensor 21. Therefore, erroneous operations can be greatly reduced.
  • by using the area of the palm contact detection portion Tp in the grip detection area Arg and the angle θ in the circumferential direction of the cross section as requirements for determining whether the driver is holding the annular portion 200r, it is possible to accurately determine whether the annular portion 200r is gripped. Accordingly, it is possible to avoid an erroneous operation in the case where the driver carelessly touches the operation detection area Arv in a state where the driver is not holding the annular portion 200r.
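The two grip-determination requirements above (contact area in Arg, and circumferential angle θ of the cross section) could be combined as in the following sketch. The threshold values and the representation of the palm patch as a set of touched cells are assumptions for illustration.

```python
# Minimal sketch of the grip-determination logic: the annular portion is
# considered gripped only when the palm contact patch covers at least a
# predetermined area AND wraps at least a predetermined angle around the
# ring cross-section (FIG. 10). Threshold values are illustrative.

MIN_PALM_CELLS = 12       # area requirement within the grip detection area Arg
MIN_WRAP_ANGLE_DEG = 180  # circumferential angle requirement (the text's example)

def is_gripped(palm_cells, wrap_angle_deg):
    """palm_cells: set of touched cells attributed to the palm patch.
    wrap_angle_deg: circumferential angle covered by the patch."""
    return (len(palm_cells) >= MIN_PALM_CELLS
            and wrap_angle_deg >= MIN_WRAP_ANGLE_DEG)
```

As the text notes, either requirement alone could also be used; combining both simply makes the determination stricter.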
  • when the control unit 10 detects, based on the sensor data derived from the contact detection signal from the grip detection area Arg, that the driver is holding the annular portion 200r (touch sensor 21), the control unit 10 turns on the display element 17 to notify the driver that operation input through the operation detection area Arv is possible.
  • the driver can determine from the on/off state of the display element 17 whether or not the operation target device can be operated by the touch sensor 21. It is preferable to arrange the display element 17 in the vicinity of the steering wheel 200.
  • the control unit 10 sets an area including the palm contact detection unit Tp as the grip detection area Arg.
  • a predetermined range of the Y coordinate including the palm contact detection unit Tp may be set as the grip detection area Arg.
  • since the palm contact detection unit Tp has a predetermined area or more, the portion of the plurality of detection regions R on the touch sensor 21 that is detected as touched over a predetermined area or more becomes the palm contact detection unit Tp. Alternatively, the portion detected as touched over a predetermined angle or more in the circumferential direction of the cross section obtained by cutting the annular portion 200r at the touch sensor 21 becomes the palm contact detection unit Tp.
  • after the control unit 10 sets the grip detection area Arg, the control unit 10 sets a predetermined range of the Y coordinate above the grip detection area Arg as the operation detection area Arv. In this case, if necessary, a predetermined range of the Y coordinate adjacent to the grip detection area Arg is set as the operation invalid area Ariv, and the operation detection area Arv is set at a position separated from the grip detection area Arg.
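Deriving the three areas dynamically from the detected palm patch, as just described, might look like the following sketch: Arg spans the palm's Y range, Ariv is a small margin above it, and Arv begins above that. The margin size and coordinate convention are assumptions.

```python
# Illustrative derivation of Arg / Ariv / Arv from the palm contact patch.
# Y is assumed to increase upward along the developed sensor sheet.

ARIV_MARGIN = 2  # rows of operation invalid area between Arg and Arv (assumed)

def set_areas(palm_cells):
    """palm_cells: iterable of (x, y) cells of the palm contact patch.
    Returns (arg_y_range, ariv_y_range, arv_y_min)."""
    ys = [y for (_, y) in palm_cells]
    arg = (min(ys), max(ys))                   # grip detection area Arg
    ariv = (arg[1] + 1, arg[1] + ARIV_MARGIN)  # operation invalid area Ariv
    arv_min = ariv[1] + 1                      # operation detection area Arv starts here
    return arg, ariv, arv_min
```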
  • FIGS. 11A to 11E schematically show a half that is the front side or the back side of the touch sensor 21 facing the driver.
  • the operations shown in FIGS. 11A to 11E are performed with the thumb on the front side and with the index finger on the back side.
  • D R is a rightward drag that slides the thumb or index finger to the right on the touch sensor 21 (operation detection area Arv), and D L is a leftward drag that slides the thumb or index finger to the left. D U is an upward drag that slides the thumb or index finger upward, and D D is a downward drag that slides the thumb or index finger downward.
  • FIG. 11B shows a tap T that taps the touch sensor 21 with the thumb or index finger.
  • FIG. 11 (c) shows an arc drag D C that drags the thumb or index finger to draw an arc on the touch sensor 21. FIG. 11 (d) shows a zigzag drag D Z that drags the thumb or index finger in a zigzag shape on the touch sensor 21. FIG. 11 (e) shows a symbol input drag D S that drags the thumb or index finger to write a symbol.
  • FIG. 11E shows a state in which the numeral 3 is drawn as a symbol. As a symbol, it is preferable to use numbers and alphabets that are relatively easy to recognize.
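A hypothetical classifier for the simplest gestures of FIGS. 11 (a) and 11 (b) — the four straight drags and the tap — decided from a stroke's start and end coordinates. The travel threshold and the convention that Y increases upward are assumptions; arc, zigzag, and symbol drags would need trajectory analysis beyond this sketch.

```python
# Classify a stroke into tap T or one of the four straight drags
# (rightward D_R, leftward D_L, upward D_U, downward D_D) from its
# start and end points. Thresholds and axis orientation are assumed.

TAP_MAX_TRAVEL = 1  # strokes moving no further than this count as a tap

def classify_gesture(start, end):
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    if abs(dx) <= TAP_MAX_TRAVEL and abs(dy) <= TAP_MAX_TRAVEL:
        return "T"                            # tap
    if abs(dx) >= abs(dy):
        return "D_R" if dx > 0 else "D_L"     # dominant horizontal movement
    return "D_U" if dy > 0 else "D_D"         # dominant vertical movement
```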
  • FIGS. 12A to 12D schematically show a front part 21f that is a half on the front side of the touch sensor 21 and a back part 21r that is a half on the back side, with the touch sensor 21 opened.
  • the front portion 21f is a portion of 0 to 11 of the X coordinate shown in FIGS. 8 and 9, and the back portion 21r is a portion of 12 to 22 of the X coordinate.
  • strictly speaking, the front part 21f and the back part 21r do not have the same area, but in FIGS. 12 (a) to 12 (d) they are shown with the same area. In FIGS. 12 (a) to 12 (d), for easy understanding, the back surface portion 21r is shown not as seen from the back surface side of the annular portion 200r but as seen through the front surface portion 21f.
  • a specific input operation for the touch sensor 21 may be a combination of an input operation with the thumb for the front portion 21f and an input operation with the index finger for the back portion 21r.
  • FIG. 12 (a) is an example in which a rightward drag D TR sliding the thumb to the right at the front portion 21f is combined with a rightward drag D IR sliding the index finger to the right at the back surface portion 21r. FIG. 12 (b) is an example in which a leftward drag D TL sliding the thumb to the left at the front portion 21f is combined with a rightward drag D IR sliding the index finger to the right at the back surface portion 21r.
  • FIG. 12B is realized by dragging the thumb from the outer peripheral side of the annular part 200r to the inner peripheral side and dragging the index finger from the inner peripheral side of the annular part 200r to the outer peripheral side.
  • FIG. 12 (c) is an example in which a rightward drag D TR sliding the thumb to the right at the front portion 21f is combined with a leftward drag D IL sliding the index finger to the left at the back surface portion 21r.
  • FIG. 12C is realized by dragging the thumb from the inner peripheral side to the outer peripheral side of the annular portion 200r and dragging the index finger from the outer peripheral side to the inner peripheral side of the annular portion 200r.
  • FIG. 12 (d) is an example in which an upward drag D TU sliding the thumb upward at the front portion 21f is combined with a downward drag D ID sliding the index finger downward at the back surface portion 21r.
  • a pattern in which the thumb is dragged downward and the index finger is dragged upward may be used, or a pattern in which both the thumb and index finger are dragged upward or downward may be used.
  • as described above, various patterns combining an input operation with the thumb on the front surface portion 21f and an input operation with the index finger on the back surface portion 21r can be used.
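The split into the front portion 21f (X coordinates 0 to 11) and the back portion 21r (X coordinates 12 to 22), and the pairing of a thumb gesture with an index finger gesture as in FIG. 12, could be expressed as follows. The encoding of a combined pattern as a string key is purely an illustrative assumption.

```python
# Split touches into front portion 21f (thumb side) and back portion 21r
# (index finger side) using the X-coordinate boundary stated in the text,
# then encode a thumb+index gesture combination as one pattern key.

FRONT_X_MAX = 11  # X 0..11 -> front portion 21f; X 12..22 -> back portion 21r

def split_front_back(touches):
    front = [(x, y) for (x, y) in touches if x <= FRONT_X_MAX]
    back = [(x, y) for (x, y) in touches if x > FRONT_X_MAX]
    return front, back

def combined_pattern(thumb_gesture, index_gesture):
    """Encode a combination such as FIG. 12(a)'s (D_TR, D_IR) as one key."""
    return f"{thumb_gesture}+{index_gesture}"
```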
  • FIGS. 13A to 13D show examples of patterns combining operations by the left and right hands, where the left touch sensor 21 in FIG. 3B is the left touch sensor 21L and the right touch sensor 21 is the right touch sensor 21R.
  • in these figures, the surface corresponding to the front portion 21f of FIG. 12 operated by the thumb is shown as a schematic plane.
  • FIG. 13 (a) shows a pattern in which a leftward drag D TL sliding the thumb to the left on the left touch sensor 21L is combined with a rightward drag D TR sliding the thumb to the right on the right touch sensor 21R. FIG. 13 (b) shows a pattern in which a rightward drag D TR sliding the thumb to the right on the left touch sensor 21L is combined with a leftward drag D TL sliding the thumb to the left on the right touch sensor 21R.
  • FIG. 13C shows a pattern in which an upward drag DTU that slides the thumb upward is combined with both the left touch sensor 21L and the right touch sensor 21R.
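The two-hand patterns of FIGS. 13 (a) to 13 (c) could be represented as (left-sensor gesture, right-sensor gesture) pairs, for example as below. The pattern names are placeholders, not identifiers from the embodiment.

```python
# Representing the two-hand combinations of FIG. 13 as pairs of gestures
# detected on the left touch sensor 21L and right touch sensor 21R.

TWO_HAND_PATTERNS = {
    ("D_TL", "D_TR"): "pattern_13a",  # left thumb drags left, right thumb drags right
    ("D_TR", "D_TL"): "pattern_13b",  # left thumb drags right, right thumb drags left
    ("D_TU", "D_TU"): "pattern_13c",  # both thumbs drag upward
}

def match_two_hand(left_gesture, right_gesture):
    """Return the matched pattern name, or None if the pair is not a
    registered two-hand specific input operation."""
    return TWO_HAND_PATTERNS.get((left_gesture, right_gesture))
```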
  • when a pattern combining input operations by the left and right hands is used as the specific input operation for controlling the operation target device, the driver holds the annular portion 200r with both hands, which contributes to safe driving.
  • the example of FIG. 3B is most preferable in that it contributes to safe driving because the touch sensor 21 is mounted at the most appropriate position where the annular portion 200r is grasped with both hands.
  • an input operation may be accepted only when the left and right touch sensors 21 are held with both hands, and input operations may cease to be accepted when one hand leaves the touch sensor 21; alternatively, the state of accepting input operations may be continued when one hand moves away from the touch sensor 21. Even when a specific input operation with only one hand is used, accepting input operations only while the left and right touch sensors 21 are held with both hands contributes to safe driving.
  • when the specific input operation is a pattern combining an input operation with the thumb and an input operation with the index finger, or a pattern combining input operations by the left and right hands, it is considered relatively unlikely that such a pattern will occur unintentionally. Therefore, when only such combined patterns are used, some or all of the measures for avoiding the above-described erroneous operation may be omitted. Of course, even when only a specific pattern combining the input operations by the left and right hands is used, it is preferable to employ the measures for avoiding the above-described erroneous operation.
  • the storage unit 18 stores a table in which the above-described specific input operation or a combination of specific input operations is associated with the type of control for the operation target device.
  • the control unit 10 controls the operation target device according to the operation input to the touch sensor 21 according to the table stored in the storage unit 18.
  • the storage unit 18 may be provided in the control unit 10.
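A sketch of the kind of table the storage unit 18 might hold, associating specific input operations with control types for the operation target device. The gesture keys follow the notation used above, but the assigned commands are purely illustrative assumptions.

```python
# Hypothetical operation table: specific input operations (or thumb+index
# combinations) keyed to the type of control applied to the operation
# target device. The command assignments are illustrative only.

OPERATION_TABLE = {
    "D_R": "next_track",            # rightward drag
    "D_L": "previous_track",        # leftward drag
    "D_U": "volume_up",             # upward drag
    "D_D": "volume_down",           # downward drag
    "T": "play_pause",              # tap
    "D_TR+D_IR": "switch_channel",  # thumb+index combination (FIG. 12(a))
}

def lookup_control(operation):
    """Return the control type for an input operation, or None if the
    operation is not a registered specific input operation."""
    return OPERATION_TABLE.get(operation)
```

An operation not found in the table would simply not be permitted, matching the behavior described for step S4.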
  • the control unit 10 acquires sensor data output from the sensor data generation unit 22 in step S1.
  • in step S2, the control unit 10 determines whether or not the annular portion 200r is gripped based on the detection output from the detection unit 10a. If it is determined that the annular portion 200r is gripped (YES), the control unit 10 proceeds to step S3; if not (NO), the control unit 10 returns the process to step S1.
  • in step S3, the control unit 10 determines whether or not an input operation has been performed based on the detection output from the detection unit 10a. If it is determined that there is an input operation (YES), the control unit 10 moves the process to step S4; if not (NO), the control unit 10 returns the process to step S1. In step S4, the control unit 10 determines whether or not to permit an operation on the operation target device by the input operation in step S3. If it is determined that the operation is permitted (YES), the control unit 10 moves the process to step S5; if not (NO), the control unit 10 returns the process to step S1.
  • control unit 10 permits an operation on the operation target device when a specific input operation is performed in the operation detection area Arv, and operates even if a specific input operation is performed in the operation invalid area Ariv. Do not allow operations on the target device. In addition, even if any input operation is performed in the operation detection area Arv, the control unit 10 does not permit the operation on the operation target device if it is not the above-described specific input operation, and only when the specific input operation is performed. Allows operations on the operation target device.
  • in step S5, the control unit 10 determines an operation based on the input operation. In step S6, the control unit 10 executes control according to the determined operation on the operation target device, and returns the process to step S1.
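The flow of steps S1 to S6 can be summarized as one pass of a hypothetical control loop. The five callables below stand in for the behavior of the control unit 10 and detection unit 10a; they are not APIs from the embodiment.

```python
# One iteration of the S1-S6 flow: acquire data, check grip, check for an
# input operation, check permission, then determine and execute control.
# Returns True if control was executed on the operation target device.

def control_loop_once(get_sensor_data, is_gripped, detect_operation,
                      is_permitted, execute_control):
    data = get_sensor_data()      # S1: acquire sensor data
    if not is_gripped(data):      # S2: annular portion gripped?
        return False
    op = detect_operation(data)   # S3: input operation performed?
    if op is None:
        return False
    if not is_permitted(op):      # S4: permit operation on target device?
        return False
    execute_control(op)           # S5/S6: determine and execute control
    return True
```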
  • the operation according to this embodiment is summarized as follows.
  • the detection unit 10a (first detection unit) detects that the first area of the touch sensor 21, which is attached to the gripping part (the annular part 200r or the straight part 201s) gripped by the driver in the steering wheels 200 and 201, is in a touched state.
  • An example of the first area is the grip detection area Arg.
  • the detection unit 10a (second detection unit) detects that a specific input operation has been performed on the second area, located above the first area in the touch sensor 21, while the first area is being touched.
  • An example of the second area is the operation detection area Arv.
  • since the thumb or index finger is positioned above the palm when gripping, the area above the first area may be used as the second area.
  • the area located on the upper side is an area located on the upper side of the first area when the driver holds the gripping part without rotating the steering wheel 200.
  • it is preferable that the state in which the first area is touched means that the first area is touched over at least a predetermined area.
  • the detection unit 10a (first detection unit) detects that, in the first area on the touch sensor 21 mounted so as to cover a predetermined range of the gripping part (annular part 200r or linear part 201s) gripped by the driver on the steering wheels 200 and 201, the touch sensor 21 is touched over a predetermined angle in the circumferential direction of the cross section obtained when the gripping part is cut in the radial direction of the steering wheels 200 and 201.
  • the detection unit 10a (second detection unit) detects that a specific input operation has been performed on a second area different from the first area in the touch sensor 21 in a state where the first area is touched over a predetermined angle or more. When the first area is touched over the predetermined angle and a specific input operation is performed, the operation target device to be operated by the touch sensor 21 is controlled according to the specific input operation.
  • the second area is an area located above the first area.
  • the area located on the upper side is an area located on the upper side of the first area in a state where the driver grips the gripping part without rotating the steering wheel 200.
  • 15 and 16 show a configuration example for effectively notifying the driver that the touch sensor 21 has been operated.
  • 15 and 16 are schematic views in which the touch sensor 21 is developed and converted into a rectangular shape, as in FIG.
  • FIG. 15 is an example in which a color change sheet 41 containing a coloring material is provided on the lower surface side of the touch sensor 21.
  • the driver can see the color of the color change sheet 41 disposed on the lower surface of the touch sensor 21 via the touch sensor 21.
  • the driver can recognize that the touch sensor 21 has been operated by changing the color of the color change sheet 41 at the portion where the touch sensor 21 is touched under the control of the control unit 10.
  • FIG. 16 shows an example in which a tactile feedback sheet 42 for changing a tactile sensation (hand touch) is provided on the upper surface side of the touch sensor 21.
  • as the tactile feedback sheet 42, for example, a sheet called "E-Sense" developed by Senseg of Finland can be used; this sheet realizes tactile feedback by charging the film. Even if the tactile feedback sheet 42 is provided on the upper surface side of the touch sensor 21, the touch sensor 21 can detect contact with a finger or the like. When the touch sensor 21 is operated via the tactile feedback sheet 42, the driver can recognize that the touch sensor 21 has been operated by the change in the tactile sensation of the tactile feedback sheet 42 under the control of the control unit 10.
  • a steering wheel 210 according to an embodiment shown in FIG. 17 is configured to output a control signal for the operation target device from the steering wheel 210.
  • the same parts as those in FIGS. 1 and 2 are denoted by the same reference numerals, and description thereof will be omitted as appropriate.
  • the steering wheel 210 includes, for example, in a portion other than the annular portion 200r, a sensor data generation unit 23 similar to the sensor data generation unit 22 in FIG. 1 and a control unit 24 similar to the control unit 10.
  • the control unit 24 includes a detection unit 24a similar to the detection unit 10a and a control signal generation unit 24b.
  • when the steering wheel 210 is mounted on a vehicle, the control signal generator 24b generates a control signal for controlling the operation target device in accordance with a specific input operation to the touch sensor 21.
  • the control signal output from the control signal generator 24 b is output to the output terminal 26 via the cable 25. If the output terminal 26 is connected to the operation target device, the operation target device can be controlled by the control signal. Examples of specific input operations are the same as those in FIGS.
  • the requirements for the control signal generator 24b to generate the control signal are the same as described above.
  • the touch sensor 21 may be detachably attached to the annular portion 200r using a surface fastener.
  • although the annular portion 200r has been described as the gripping portion, the gripping portion is not necessarily circular.
  • the touch sensor 21 does not need to be configured by a single sheet, and the touch sensor 21 may be configured by a plurality of pieces of touch sensors. If the touch sensor 21 is composed of a plurality of pieces of touch sensors, the shape of each piece of the touch sensor can be simplified, which is advantageous when producing touch sensors. In addition, when the touch sensor 21 is configured by a plurality of pieces of touch sensors, the pieces of touch sensors do not necessarily have to be arranged without gaps.
  • the touch sensor 21 according to the present embodiment is mounted so as to cover the gripping portion; even when the touch sensor 21 is composed of a plurality of pieces of touch sensors with gaps between the pieces, the touch sensor 21 is regarded as being mounted so as to cover the gripping portion.
  • the range in which the touch sensor 21 is provided is not limited to the gripping portion (the annular portion 200r or the straight portion 201s) that the driver grips during driving; it may be extended to the surface of the connecting portion that connects the gripped portions.
  • the connecting portion is the portion located between the left and right hands in the state shown in FIG. 2; in FIG. 21, the sensor data generation unit 23 and the control unit 24 are provided in the connecting portion.
  • the touch sensor 21 may be extended to the surface of the connecting portion, and the operation detection area Arv may be set at a position close to the gripping portion in the connecting portion. If the position is close to the gripping part, the driver can operate the operation target device without releasing the hand from the gripping part or greatly shifting the hand during driving. Therefore, even when the touch sensor 21 is extended to the surface of the connecting portion, there is almost no possibility that the driver will interfere with the operation of the steering wheels 200, 201, and 210.
  • in step S4, it is preferable not to permit (that is, to invalidate) control of the operation target device when the vehicle is in a specific state.
  • a rotation angle of the steering wheel 200 is set for determining whether or not to allow control of the operation target device. As shown in FIG. 18, when the steering wheel 200 is not rotated, the rotation angle is 0°; when the wheel is rotated rightward, the rotation angle is positive, and when rotated leftward, it is negative. An input operation to the touch sensor 21 is permitted as valid within a range of ±30°, and if the rotation angle exceeds ±30°, the input operation to the touch sensor 21 is invalidated and not permitted.
  • when the rotation angle exceeds ±30°, the vehicle is in a specific state such as making a right or left turn or cornering. If the operation target device is controlled in such a specific state, there is a high possibility that an erroneous operation will occur; in other words, an operation input in such a specific state is likely to be an operation input not intended by the user. It is also not preferable in terms of safety. Therefore, in the present embodiment, when the vehicle is in a specific state, control on the operation target device is invalidated.
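The ±30° gate described above reduces to a one-line predicate; the sign convention (rightward rotation positive, leftward negative) follows the text, and the threshold is the text's example value.

```python
# Steering-angle gate: input operations on the touch sensor 21 are valid
# only while the steering wheel is within the predetermined angle of
# center (the text's example is 30 degrees either way).

MAX_VALID_ANGLE_DEG = 30

def input_enabled(rotation_angle_deg):
    """True while input operations to the touch sensor are honored."""
    return abs(rotation_angle_deg) <= MAX_VALID_ANGLE_DEG
```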
  • the rotation angle of the steering wheel 200 detected by the steering angle sensor 31 is input to the control unit 10.
  • the control unit 10 switches between a state where the input operation to the touch sensor 21 is enabled and a state where the input operation to the touch sensor 21 is disabled according to the rotation angle of the steering wheel 200 detected by the steering angle sensor 31.
  • a detection signal from the direction indicator sensor 32 is also input to the control unit 10. Therefore, the control unit 10 may invalidate the input operation to the touch sensor 21 when the direction indicator 320 is operated by the detection signal from the direction indicator sensor 32.
  • when the direction indicator 320 is operated, it can be considered that the steering wheel 200 will be in a specific state in which it is rotated beyond the predetermined rotation angle.
  • the direction indicator 320 may also be used for operations other than signaling a right or left turn; the operation of the direction indicator 320 referred to here is an operation for signaling a right or left turn.
  • the control unit 10 also invalidates the input operation to the touch sensor 21 when the detection signal from the shift lever sensor 33 indicates that the shift position of the shift lever 330 is reverse.
  • in addition to when the direction indicator 320 is operated, control of the operation target device is invalidated when the rotation angle of the steering wheel 200 exceeds a predetermined angle (for example, ±30°) or when the shift position of the shift lever 330 is reverse.
  • disabling control on the operation target device may mean invalidating the specific input operation described above even if it is performed, so that the operation target device is not controlled; alternatively, even if sensor data is input from the sensor data generation unit 22 to the control unit 10, the control unit 10 may invalidate the sensor data. In either case, as a result, control on the operation target device is invalidated.
  • an example of specific processing in step S4 in FIG. 14 will be described using the flowchart in FIG. 19. In FIG. 19, the control unit 10 determines in step S41 whether or not the shift position of the shift lever 330 is reverse. If the shift position is reverse (YES), in step S45 the control unit 10 disallows the input operation of step S3 and proceeds to step S1 in FIG. 14. If the shift position is not reverse (NO), the control unit 10 determines in step S42 whether or not the direction indicator 320 is operated. If the direction indicator 320 is operated (YES), in step S45 the control unit 10 disallows the input operation of step S3 and proceeds to step S1 in FIG. 14.
  • if the direction indicator 320 is not operated (NO), the control unit 10 determines in step S43 whether or not the rotation angle of the steering wheel 200 exceeds a predetermined angle. If the rotation angle exceeds the predetermined angle (YES), in step S45 the control unit 10 disallows the input operation of step S3 and proceeds to step S1 in FIG. 14. If the rotation angle does not exceed the predetermined angle (NO), the control unit 10 determines in step S44 whether or not a specific input operation has been performed in the operation detection area Arv. If a specific input operation has not been performed (NO), in step S45 the control unit 10 disallows the input operation of step S3 and proceeds to step S1 in FIG. 14; when the input operation is disallowed in step S45, control on the operation target device becomes invalid. If a specific input operation has been performed (YES), in step S46 the control unit 10 permits the input operation of step S3 and proceeds to step S5 in FIG. 14.
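The decision sequence of steps S41 to S44 can be collapsed into a single permission predicate, sketched below under the assumption that the set of registered specific input operations is known; the gesture names are placeholders.

```python
# Permission check corresponding to FIG. 19: the input operation is
# permitted (S46) only if the shift lever is not in reverse (S41), the
# direction indicator is off (S42), the steering angle is within range
# (S43), and the operation is a registered specific input operation (S44).

SPECIFIC_OPERATIONS = {"D_R", "D_L", "D_U", "D_D", "T"}  # placeholder set
MAX_ANGLE_DEG = 30

def permit_operation(shift_reverse, indicator_on, angle_deg, operation):
    if shift_reverse:                   # S41 -> S45: disallow
        return False
    if indicator_on:                    # S42 -> S45: disallow
        return False
    if abs(angle_deg) > MAX_ANGLE_DEG:  # S43 -> S45: disallow
        return False
    return operation in SPECIFIC_OPERATIONS  # S44 -> S46 or S45
```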
  • steps S41, S42, and S43 are provided, but only one or two of these steps may be provided. Further, when all of steps S41, S42, and S43 or two of these steps are provided, the order is arbitrary.
  • the shift lever 330 has been referred to, but the shape of the operation unit for switching between forward and reverse travel of the vehicle and changing the transmission gear ratio is arbitrary; a floor shift, a column shift, a paddle shift, and the like are all included in the shift lever here.
  • a third embodiment of the control device and control method for the operation target device in the vehicle will be described.
  • the basic configuration and operation in the third embodiment are the same as those in the first embodiment, and only different parts will be described.
  • in the third embodiment, when the driver is holding the annular portion 200r for normal driving and does not intend to operate the operation target device, an input operation on the touch sensor 21 with a finger is not accepted; when the driver tries to operate the operation target device, an input operation on the touch sensor 21 with a finger is accepted.
  • FIG. 20 shows an example of the state of the palm contact detection unit Tp and the thumb contact detection unit Tt when the driver is holding the annular portion 200r for normal driving.
  • FIG. 20 is a schematic diagram in which each area of the touch sensor 21 is converted into an equal size, as in FIG. 9.
  • the palm contact detection unit Tp has a relatively large area, and the thumb contact detection unit Tt is located at a position close to the palm contact detection unit Tp.
  • the index finger contact detection unit Ti is not shown, but the index finger contact detection unit Ti is also in a position close to the palm contact detection unit Tp.
  • FIG. 21 shows an example of states of the palm contact detection unit Tp and the thumb contact detection unit Tt when the driver tries to operate the operation target device.
  • the area of the palm contact detection unit Tp is smaller than that of FIG. 20, and the thumb contact detection unit Tt is located away from the palm contact detection unit Tp.
  • the index finger contact detection unit Ti is not shown, but the index finger contact detection unit Ti is also located away from the palm contact detection unit Tp.
  • it is necessary to distinguish between a state in which the driver is simply holding the annular portion 200r for normal driving and a state in which the driver is performing a specific input operation on the touch sensor 21 with the thumb or index finger.
  • the palm contact detection unit Tp may include a portion where the middle finger, the ring finger, and the little finger (in some cases, the index finger in addition to this) are in contact.
  • in FIG. 20, the portions of the palm contact detection unit Tp at X coordinate 8 and Y coordinates 4 to 8 are touched by the tips of the middle finger, the ring finger, and the little finger. In FIG. 21, the portion touched by the tips of the middle finger, the ring finger, and the little finger has moved to X coordinate 5 and Y coordinates 4 to 8, because the positions of these fingertips have shifted toward the back side of the annular portion 200r.
  • the state in which the operation target device is about to be operated may also be determined based on the change in the circumferential end position of the palm contact detection unit Tp in the cross section of the annular portion 200r.
  • when the area of the palm contact detection unit Tp is the relatively large area shown in FIG. 20, the control unit 10 determines that the driver is holding the annular portion 200r for normal driving and does not accept input operations to the touch sensor 21 by a finger. Further, when the area of the palm contact detection unit Tp becomes narrower than a predetermined ratio of that area as shown in FIG. 21, the control unit 10 determines that the driver is about to operate the operation target device and accepts input operations to the touch sensor 21 by a finger.
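The area-ratio criterion of this embodiment could be sketched as follows; the 0.8 ratio is an illustrative assumption for the "predetermined ratio", and the areas are assumed to be cell counts registered in advance.

```python
# Third-embodiment criterion: finger input is accepted only when the palm
# contact area has shrunk, relative to the registered normal-grip area, by
# at least a predetermined ratio (the driver loosened the grip to extend a
# finger, as in FIG. 21). The 0.8 ratio is an assumption.

ACCEPT_RATIO = 0.8  # accept input when current area < 80% of normal-grip area

def accepts_input(normal_grip_area, current_area):
    """True when input operations to the touch sensor are accepted."""
    return current_area < normal_grip_area * ACCEPT_RATIO
```

A tolerance around the registered areas, as the text later suggests, could be folded into the ratio.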
  • the operation invalid area Ariv may not be provided.
  • the operation invalid area Ariv is omitted, and a state is shown in which the grip detection area Arg and the operation detection area Arv are preset on the touch sensor 21, or are set on the touch sensor 21 by the control unit 10.
  • FIG. 22 shows a state where the driver is holding the annular portion 200r for normal driving, as in FIG. 20.
  • the control unit 10 can discriminate between the two states described above, so that an erroneous operation can be avoided.
  • The area of the palm contact detection unit Tp shown in FIG. 20 and the area of the palm contact detection unit Tp shown in FIG. 21 may be registered in advance in the control unit 10 or the storage unit 18, and the control unit 10 may switch between a state of accepting and a state of not accepting an input operation to the touch sensor 21.
  • Since the area of the palm contact detection unit Tp is not always constant, an allowable amount of area deviation is set.
  • a change in the shape of the palm contact detection unit Tp may be detected.
  • A change in the circumferential angle θ of the palm contact detection unit Tp shown in FIG. 10, or a change in the maximum length of the palm contact detection unit Tp in the X coordinate direction, may be detected instead.
  • The state in which no input operation is accepted may be a state in which, even if a specific input operation is performed, that operation is invalidated and control of the operation target device is not permitted. Alternatively, even if sensor data is input from the sensor data generation unit 22, the control unit 10 may invalidate the sensor data, with the result that control of the operation target device is invalidated.
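The area-based acceptance decision described above can be expressed as a minimal sketch (hypothetical names; the ratio and tolerance values are illustrative assumptions, as the patent does not specify an implementation):

```python
def accepts_input(current_area, first_area, ratio=0.8, tolerance=0.05):
    """Decide whether finger input to the touch sensor 21 is accepted.

    first_area   -- palm contact area registered for the normal driving grip
    current_area -- palm contact area currently reported by the sensor
    ratio        -- assumed "predetermined ratio": a palm area at or below
                    ratio * first_area is treated as the narrower second area
                    (driver about to operate the operation target device)
    tolerance    -- allowance for the palm area never being exactly constant
    """
    threshold = first_area * ratio * (1.0 + tolerance)
    return current_area <= threshold

# Normal grip: area close to the registered first area -> input not accepted
print(accepts_input(100.0, 100.0))  # False
# Fingers shifted to operate: palm contact area shrinks -> input accepted
print(accepts_input(70.0, 100.0))   # True
```

The same gate could instead compare the circumferential angle or the maximum X-direction length mentioned above; only the measured quantity changes.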
  • <Fourth embodiment> A fourth embodiment of the control device and control method for the operation target device in the vehicle will be described.
  • the basic configuration and operation in the fourth embodiment are the same as those in the first embodiment, and only different parts will be described.
  • FIG. 23 shows a state in which the touch sensor 21 is deployed as in FIG. In the configuration example shown in FIG. 23, the operation invalid area Ariv is omitted.
  • the thumb contact detection unit Tt and the index finger contact detection unit Ti are considered to be relatively close to the palm contact detection unit Tp.
  • the thumb contact detection unit Tt and the index finger contact detection unit Ti when the driver is not trying to operate the device to be operated and is simply holding the annular portion 200r are denoted by Tt0 and Ti0, respectively.
  • FIG. 23 shows that the thumb contact detection unit Tt0 and the index finger contact detection unit Ti0 are detected when the driver simply holds the annular portion 200r and does not attempt to operate the operation target device, and that the thumb contact detection unit Tt and the index finger contact detection unit Ti move to positions separated from the palm contact detection unit Tp when the driver operates the operation target device.
  • In FIG. 23, the X coordinates of the thumb contact detection unit Tt0 and the thumb contact detection unit Tt are the same, and the X coordinates of the index finger contact detection unit Ti0 and the index finger contact detection unit Ti are the same, but the X coordinates may shift in practice. Even in that case, it is only necessary to pay attention to the movement of the Y coordinate.
  • The control unit 10 stores, as a reference distance Δ1, the distance between the end of the palm contact detection unit Tp on the thumb contact detection unit Tt0 side and the end of the thumb contact detection unit Tt0 on the palm contact detection unit Tp side in a state where the driver normally holds the annular portion 200r.
  • The control unit 10 may store the reference distance Δ1 itself, or may store the reference distance Δ1 in the storage unit 18.
  • The distance between the end of the palm contact detection unit Tp on the thumb contact detection unit Tt side and the end of the thumb contact detection unit Tt on the palm contact detection unit Tp side when the driver is about to operate the operation target device is, for example, a distance Δ2 longer than the reference distance Δ1.
  • When the thumb contact detection unit Tt is detected at a position whose distance from the palm contact detection unit Tp exceeds the reference distance Δ1 by a predetermined distance or more, the control unit 10 determines that the driver is about to operate the operation target device. The control unit 10 then accepts as valid the input operation by the thumb detected by the thumb contact detection unit Tt in this state.
  • FIG. 23 shows only the distances Δ1 and Δ2 between the palm contact detection unit Tp and the thumb contact detection units Tt0 and Tt, but the distance between the palm contact detection unit Tp and the index finger contact detection unit Ti0 may be stored in the same manner and used to judge the position of the index finger contact detection unit Ti when the driver is about to operate the operation target device.
  • In other words, a reference distance between the palm contact detection unit Tp, where the driver's palm contacts the touch sensor 21, and the finger contact detection unit (thumb contact detection unit Tt0 or index finger contact detection unit Ti0), where the driver's finger (thumb or index finger) contacts the touch sensor 21, is stored, and an input operation with the finger may be validated in a state where the distance between the palm contact detection unit Tp and the finger contact detection unit is longer than the reference distance by a predetermined distance or more.
  • the operation invalid area Ariv is omitted, but the operation invalid area Ariv may be provided.
  • the distance of the operation invalid area Ariv may be shorter than that in FIG.
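The distance comparison of the fourth embodiment can be sketched as follows (hypothetical names and coordinate units; the reference distance and margin values are illustrative assumptions):

```python
def thumb_operation_active(palm_edge_y, thumb_edge_y, reference_distance,
                           margin=1.0):
    """Validate thumb input once the thumb region moves away from the palm.

    palm_edge_y        -- Y coordinate of the palm contact region end on the
                          thumb side
    thumb_edge_y       -- Y coordinate of the thumb contact region end on the
                          palm side
    reference_distance -- distance stored for the normal grip
    margin             -- assumed "predetermined distance" the current
                          distance must exceed the reference by
    """
    current_distance = abs(thumb_edge_y - palm_edge_y)
    return current_distance >= reference_distance + margin

# Normal grip: thumb region close to the palm region -> input not validated
print(thumb_operation_active(8.0, 9.0, reference_distance=1.0))   # False
# Thumb moved toward the operation area -> input validated
print(thumb_operation_active(8.0, 11.0, reference_distance=1.0))  # True
```

The same comparison applies unchanged to the index finger contact region.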
  • a fifth embodiment of the control device and control method for the operation target device in the vehicle will be described.
  • the basic configuration and operation in the fifth embodiment are the same as those in the first embodiment, and only different parts will be described.
  • the fifth embodiment is still another configuration example for reducing erroneous operations.
  • the control unit 10 may validate the input operation when the detection unit 10a detects that the same input operation has been performed with the left and right hands. It is preferable that the control unit 10 validates the input operation when the same input operation is performed at the same timing with the left and right hands.
  • An example of input operations in which the same input operation is performed at the same timing with the left and right hands will be described with reference to FIG. 24. Like FIG. 13(a), FIG. 24(a) shows a case where a leftward drag DTL, sliding the thumb leftward on the left touch sensor 21L, and a rightward drag DTR, sliding the thumb rightward on the right touch sensor 21R, are performed at the same timing.
  • When both the left and right thumbs are dragged from the inner periphery side to the outer periphery side of the annular portion 200r in this way, the drags may be defined as the same input operation. In other words, left-right symmetrical input operations as shown in FIG. 24(a) may be defined as the same input operation.
  • FIG. 24(b) shows a case where a downward drag DTD, sliding the thumb downward, is performed at the same timing on both the left touch sensor 21L and the right touch sensor 21R.
  • When an upward drag DTU, sliding the thumb upward, is performed at the same timing on both the left touch sensor 21L and the right touch sensor 21R, this may likewise be defined as the same input operation.
  • When the finger is dragged in the vertical direction, it is preferable to define the same input operation as dragging in the same direction on the left and right, rather than symmetrically.
  • the input operation is performed with the thumb, but an index finger may be used.
  • FIG. 24(c) shows a case where a tap T, striking the touch sensor 21 with the thumb or index finger, is performed at the same timing on both the left touch sensor 21L and the right touch sensor 21R.
  • The control unit 10 determines that the timing is the same in the following cases. For a drag, as shown in FIG. 25(a), the time TML from the start timing t1 to the end timing t3 of the drag by the left-hand finger and the time TMR from the start timing t2 to the end timing t4 of the drag by the right-hand finger can be regarded as the same timing when the two times overlap for a predetermined time (or a predetermined ratio) or more. Alternatively, as shown in FIG. 25(b), a predetermined time TMP1 may be measured from the start timing t1 of the drag by the finger (for example, of the left hand) that started dragging first, and a drag made by the right-hand finger within the time TMP1 can be regarded as the same timing. The criterion for determining that the timing is the same may be set as appropriate.
  • An allowable range is set within which operations are regarded as the same input operation.
  • For a drag, if the direction in which the finger slides is the same within a predetermined allowable range, the drags are regarded as the same input operation.
  • For a tap T, if the locations of the taps T are the same, they can be regarded as the same input operation. The locations can be regarded as the same when both taps are on the front surface portion 21f, or both are on the back surface portion 21r.
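The timing and direction criteria above can be sketched as follows (hypothetical names; the overlap criterion and the mirror table are illustrative choices, since the patent leaves the exact criterion open):

```python
def same_timing(start_l, end_l, start_r, end_r, min_overlap_ratio=0.5):
    """Treat two drags as performed at the same timing when their time spans
    overlap sufficiently.

    (start_l, end_l) -- drag interval by the left hand  (t1..t3 in the text)
    (start_r, end_r) -- drag interval by the right hand (t2..t4 in the text)
    """
    overlap = min(end_l, end_r) - max(start_l, start_r)
    shorter = min(end_l - start_l, end_r - start_r)
    return overlap > 0 and overlap >= min_overlap_ratio * shorter

def same_operation(dir_l, dir_r):
    """Left/right drags count as the same input operation when they are
    mirror images horizontally (inner-to-outer on both sides of the ring)
    or identical vertically."""
    mirror = {"left": "right", "right": "left", "up": "up", "down": "down"}
    return dir_r == mirror[dir_l]

print(same_timing(0.0, 1.0, 0.2, 1.1))  # True: 0.8 s overlap
print(same_timing(0.0, 0.3, 0.5, 0.9))  # False: no overlap
print(same_operation("left", "right"))  # True: symmetric horizontal pair
print(same_operation("down", "down"))   # True: same vertical direction
```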
  • As described above, the control unit 10 may set a reception mode for receiving a specific input operation to the touch sensor 21, so that the driver enters the reception mode intentionally. When shifting from a state other than the reception mode to the reception mode, it is also necessary to avoid shifting to the reception mode unintentionally. Therefore, when the detection unit 10a detects that the same input operation has been performed on the touch sensor 21 with both hands, the control unit 10 shifts from the state where the specific input operation is not accepted to the state where the specific input operation is accepted (reception mode). The same input operation is as described with reference to FIG. 24. Also in this case, as described with reference to FIG. 25, it is preferable to shift to the reception mode when it is detected that the same input operation is performed at the same timing.
  • When the detection unit 10a detects that the specific input operation (first specific input operation) described with reference to FIGS. 11 to 13 has been performed, and then detects the same input operation with both hands as described with reference to FIG. 24, the control unit 10 confirms the first specific input operation input immediately before.
  • Alternatively, the reception mode may be set by a continuous input operation. As shown in FIG. 27, if the time between an upward drag DIU by the left index finger and an upward drag DTU by the right thumb is within a predetermined time TMP2, the control unit 10 regards the upward drag DIU and the upward drag DTU as a continuous input operation and sets the reception mode.
  • the operation target can be switched according to the input operation pattern by the left and right hands.
  • the control unit 10 sets the audio operation mode in which the audio playback unit 12 is operated.
  • the control unit 10 sets a target to be operated based on a specific input operation as the audio playback unit 12 in the in-vehicle device 100.
  • the control unit 10 sets the navigation operation mode in which the navigation processing unit 11 is operated.
  • the control unit 10 sets a target to be operated based on a specific input operation as the navigation processing unit 11 in the in-vehicle device 100.
  • the combination of these input operations is merely an example, and is not limited to FIGS.
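The switching of the operation target by the two-hand input pattern might be organized as a simple lookup. The two modes follow the audio/navigation example above, but the pattern names and the mapping itself are hypothetical:

```python
# Hypothetical mapping from a (left-hand, right-hand) input pattern to the
# device that subsequent specific input operations will control.
OPERATION_MODES = {
    ("drag_left", "drag_right"): "audio",      # symmetric horizontal drags
    ("drag_down", "drag_down"): "navigation",  # identical vertical drags
}

def select_operation_mode(left_op, right_op, current_mode=None):
    """Switch the operation target according to the two-hand pattern; an
    unrecognized pattern leaves the current mode unchanged."""
    return OPERATION_MODES.get((left_op, right_op), current_mode)

print(select_operation_mode("drag_left", "drag_right"))  # audio
print(select_operation_mode("tap", "tap", "audio"))      # audio (unchanged)
```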
  • the driver is holding the annular portion 200r (touch sensor 21).
  • a sixth embodiment of the control device and control method for the operation target device in the vehicle will be described.
  • the basic configuration and operation in the sixth embodiment are the same as those in the first embodiment, and only different parts will be described.
  • the sixth embodiment is still another configuration example for reducing erroneous operations.
  • FIG. 30A shows an example in which the grip detection area Arg, the operation invalid area Ariv, and the operation detection area Arv of the touch sensor 21 are color-coded.
  • it may be color-coded by applying paint, or color-coded by pasting sheets of the respective colors.
  • It is also effective to color-code the portion of the touch sensor 21 and the portion other than the touch sensor 21 of the annular portion 200r.
  • a color may be given to the part of the touch sensor 21, or a color may be given to a part other than the touch sensor 21.
  • the parts other than the touch sensor 21, the grip detection area Arg, the operation invalid area Ariv, and the operation detection area Arv may be different colors.
  • FIG. 30B shows an example in which the operation invalid area Ariv is not provided and the grip detection area Arg and the operation detection area Arv are color-coded. It is preferable to color-code the portion of the touch sensor 21 and the portion of the annular portion 200r other than the touch sensor 21, because the driver can clearly and immediately recognize the position of the touch sensor 21. As shown in FIGS. 30A and 30B, it is more preferable to color-code each area, because the driver can clearly and immediately recognize the position of each area of the touch sensor 21. For the color coding of FIGS. 30A and 30B, the color change sheet 41 described above can also be used.
  • The color change sheet 41 can be used as follows.
  • the controller 10 sets a grip detection area Arg and an operation detection area Arv for the touch sensor 21 after the driver grips the portion of the touch sensor 21 in the annular portion 200r. Then, after setting the grip detection area Arg and the operation detection area Arv, the control unit 10 colors the grip detection area Arg and the operation detection area Arv.
  • the color coding may be performed by coloring each area, or may be color coded as a result by coloring some areas.
  • FIG. 31 (a) shows an example in which markers M1 and M2 of a predetermined color are attached to the boundaries of the grip detection area Arg, the operation invalid area Ariv, and the operation detection area Arv.
  • the markers M1 and M2 are examples of boundary identification means for identifying the boundary.
  • The markers M1 and M2 may be formed by, for example, paint or a sticker.
  • FIG. 31B shows an example in which the operation invalid area Ariv is not provided, and shows an example in which a marker M3 of a predetermined color is added to the boundary between the grip detection area Arg and the operation detection area Arv. As shown in FIGS. 31A and 31B, it is preferable to indicate the position of the boundary because the driver can clearly and immediately visually recognize the position of each area of the touch sensor 21.
  • FIG. 32 shows an example in which the diameter of the annular portion 200r in the operation detection area Arv is smaller than the diameter of the annular portion 200r in the grip detection area Arg.
  • FIG. 32 shows an example in which the operation invalid area Ariv is not provided. The diameter need only be made thin enough not to hinder the driver in operating the steering wheel 200, while allowing the driver to recognize by feel that the area is the operation detection area Arv. It is preferable to change the diameter gradually at the boundary between the operation detection area Arv and the grip detection area Arg.
  • FIG. 33 shows an example in which the diameter of the annular portion 200r in the operation detection area Arv is larger than the diameter of the annular portion 200r in the grip detection area Arg.
  • FIG. 33 shows an example in which the operation invalid area Ariv is not provided. The diameter need only be made thick enough not to hinder the driver in operating the steering wheel 200, while allowing the driver to recognize by feel that the area is the operation detection area Arv. It is preferable to change the diameter gradually at the boundary between the operation detection area Arv and the grip detection area Arg. In FIGS. 32 and 33, the diameter of the annular portion 200r changes at the boundary between the grip detection area Arg and the operation detection area Arv. This change in diameter can be interpreted as an example of boundary identification means for physically identifying the boundary.
  • FIG. 34 shows an example in which recesses B1 and B2 are provided at the boundaries of the grip detection area Arg, the operation invalid area Ariv, and the operation detection area Arv.
  • the driver can visually recognize the position of each area by the recesses B1 and B2, and can recognize the position of each area by touch when the touch sensor 21 is gripped.
  • a recess may be provided at the boundary between the grip detection area Arg and the operation detection area Arv.
  • the touch sensor 21 may be divided by the recesses B1 and B2, or may not be divided.
  • the recesses B1 and B2 are another example of boundary identification means for physically identifying the boundary.
  • FIG. 35 shows an example in which convex portions B3 and B4 are provided at the boundaries of the grip detection area Arg, the operation invalid area Ariv, and the operation detection area Arv.
  • the driver can visually recognize the position of each area by the convex portions B3 and B4, and can recognize the position of each area by tactile sense when holding the touch sensor 21.
  • a convex portion may be provided at the boundary between the grip detection area Arg and the operation detection area Arv.
  • The touch sensor 21 may be divided by the convex portions B3 and B4, or may not be divided.
  • the convex portions B3 and B4 are still another example of boundary identifying means for physically identifying the boundary.
  • FIG. 36A shows a state where the driver has not yet gripped the grip detection area Arg. In this example, the operation invalid area Ariv is not provided.
  • FIG. 36B shows a state where the driver holds the grip detection area Arg.
  • the color change sheet 41 described above is provided on the lower surface side of the operation detection area Arv.
  • FIG. 37A and 37B show an example in which the above-described tactile feedback sheet 42 is provided on the upper surface side of the operation detection area Arv.
  • FIG. 37A shows a state where the driver has not yet gripped the grip detection area Arg.
  • FIG. 37B shows a state where the driver has gripped the grip detection area Arg.
  • When it is detected that the grip detection area Arg is gripped, the control unit 10 controls the tactile feedback sheet 42 to change its tactile sensation, for example to a rough state as shown in FIG. 37B.
  • the tactile feedback sheet 42 may be in a rough state, and when it is detected that the grip detection area Arg is gripped, it may be changed to a smooth state.
  • the driver can clearly recognize the position of the operation detection area Arv by tactile sensation, so that it is possible to further reduce erroneous operations.
  • the tactile sensation of the tactile feedback sheet 42 it is not necessary to visually check the operation detection area Arv, which contributes to safe driving.
  • the method of changing the tactile sensation of the tactile feedback sheet 42 is arbitrary.
  • the portion of the operation detection area Arv may have a different tactile sense from the grip detection area Arg and the operation invalid area Ariv in advance.
  • the surface of the operation detection area Arv may be roughened, or surface treatment may be applied or a sheet of those tactile sensations may be pasted so as to have a tactile sensation different from the grip detection area Arg and the operation invalid area Ariv.
  • In this way, the grip detection area Arg and the operation detection area Arv are configured to be distinguishable at least when the detection unit detects that the grip detection area Arg is gripped.
  • The configurations shown in FIGS. 30 to 37 are examples. For example, the configurations shown in FIGS. 30 to 37 may be combined.
  • The grip detection area Arg and the operation detection area Arv may be configured to be always distinguishable. However, if they are configured to be distinguishable only when it is detected that the grip detection area Arg is gripped, it is possible to indicate whether or not an operation input to the operation detection area Arv is being accepted.
  • FIG. 38 shows an example of a locus when the finger of the left hand is slid in the left-right direction on the touch sensor 21.
  • the left side of FIG. 38 is the outside of the annular part 200r, and the right side is the inside of the annular part 200r. As shown in FIG. 38, the inner side tends to be lower than the outer side of the annular portion 200r.
  • In the seventh embodiment, when the difference dxh of the x component (the horizontal component) between the locus start point Ps and the end point Pe is equal to or greater than a predetermined threshold and the difference dyh of the y component (the vertical component) is less than a predetermined threshold, the control unit 10 regards the drag as having been made linearly in the horizontal direction, as shown in FIG. 39.
  • Similarly, when the difference dyv of the y component between the locus start point Ps and the end point Pe is equal to or greater than a predetermined threshold and the difference dxv of the x component is less than a predetermined threshold, the control unit 10 regards the drag as having been made linearly in the vertical direction, as shown in FIG. 40.
  • the threshold for the difference dxh is THxh
  • the threshold for the difference dyh is THyh
  • the threshold for the difference dyv is THyv
  • the threshold for the difference dxv is THxv
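The locus correction can be sketched with the four thresholds above (the threshold values are illustrative assumptions; coordinates are in the detection-region grid):

```python
def classify_drag(start, end, THxh=3.0, THyh=2.0, THyv=3.0, THxv=2.0):
    """Snap a curved drag locus to a straight horizontal or vertical drag.

    start, end -- (x, y) of the locus start point Ps and end point Pe
    Horizontal: |dx| >= THxh and |dy| < THyh
    Vertical:   |dy| >= THyv and |dx| < THxv
    """
    dx = abs(end[0] - start[0])
    dy = abs(end[1] - start[1])
    if dx >= THxh and dy < THyh:
        return "horizontal"
    if dy >= THyv and dx < THxv:
        return "vertical"
    return "undetermined"

# A locus drifting downward while sliding sideways is still horizontal
print(classify_drag((0, 0), (5, 1)))  # horizontal
print(classify_drag((0, 0), (1, 5)))  # vertical
print(classify_drag((0, 0), (3, 3)))  # undetermined
```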
  • FIG. 41A shows a state in which a finger is slid rightward on the left touch sensor 21L and a finger is slid downward on the right touch sensor 21R.
  • By the locus correction described with reference to FIGS. 39 and 40, the control unit 10 can regard the operation on the left touch sensor 21L as a rightward drag DR and the operation on the right touch sensor 21R as a downward drag DD, as shown in FIG. 41(b).
  • The example shown in FIG. 41(d) is a diagonal drag DO in the lower right direction; diagonal drags DO in the upper right, lower left, and upper left directions can likewise be realized. Realizing the diagonal drag DO as in the present embodiment improves operability.
  • When the drag operations on the left touch sensor 21L and the right touch sensor 21R are in the same direction (in this case, the upward direction), the control unit 10 may perform control so as to synthesize the two drag vectors and perform an operation based on the larger vector. By controlling in this way, when a map is scrolled according to the drag operation, the map can be scrolled largely by one drag operation, which improves operability. Further, when the drag operation vector for the left touch sensor 21L and the drag operation vector for the right touch sensor 21R are in opposite directions and the angle formed by the two vectors is close to 180° (for example, within 180° ± α, where α is an arbitrary angle), a special operation may be performed. For example, the map may be rotated.
  • control unit 10 of the present embodiment controls the operation target device according to a pattern based on a combination of the input operation for the left touch sensor 21L and the input operation for the right touch sensor 21R.
  • the vector composition based on the four directions of the upward direction, the downward direction, the left direction, and the right direction has been described.
  • the vector may be composed based on more directions.
  • The deviation between the locus of the drag operation intended by the user and the locus of the drag operation actually performed is often left-right symmetric, and performing vector synthesis can absorb the deviation of the locus. Therefore, it is also possible to perform only the correction that makes the drag operation linear by connecting the start point and the end point, without classifying it as either horizontal or vertical as described above.
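One reading of the vector handling above, sketched with hypothetical names (acting on the longer of two same-direction drags, and treating near-opposite drags as the special rotation operation; the angle tolerance α is an assumption):

```python
import math

def combine_drags(vec_l, vec_r, alpha_deg=15.0):
    """Combine non-zero drag vectors from the left and right touch sensors.

    Near-opposite drags (angle within 180 deg +/- alpha_deg) trigger the
    special operation (e.g. map rotation); otherwise the action is a scroll
    based on the larger of the two vectors.
    """
    dot = vec_l[0] * vec_r[0] + vec_l[1] * vec_r[1]
    norm_l = math.hypot(vec_l[0], vec_l[1])
    norm_r = math.hypot(vec_r[0], vec_r[1])
    # Clamp against floating-point error before taking the arc cosine
    cos_a = max(-1.0, min(1.0, dot / (norm_l * norm_r)))
    angle = math.degrees(math.acos(cos_a))
    if abs(angle - 180.0) <= alpha_deg:
        return ("rotate", None)
    larger = vec_l if norm_l >= norm_r else vec_r
    return ("scroll", larger)

print(combine_drags((0, 3), (0, 5)))   # ('scroll', (0, 5))
print(combine_drags((0, 3), (0, -3)))  # ('rotate', None)
```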
  • In the eighth embodiment, an operation of sliding a finger on the touch sensor 21 in the radial direction of the annular portion 200r is defined as a horizontal drag Dh, and an operation of sliding the finger in the circumferential direction of the annular portion 200r is defined as a vertical drag Dv.
  • FIG. 43 shows the horizontal drag Dh and the vertical drag Dv in the development view of the touch sensor 21 of FIG. 42. In FIGS. 42 and 43, the horizontal drag Dh and the vertical drag Dv are shown in only one column of detection regions R for each of the X and Y coordinates, but in practice the finger may touch detection regions R in a plurality of columns when dragging.
  • locus correction and vector synthesis in the eighth embodiment are the same as those in the seventh embodiment described with reference to FIGS.
  • a ninth embodiment of a control device and control method for an operation target device in a vehicle will be described.
  • the basic configuration and operation in the ninth embodiment are the same as those in the first embodiment, and only different parts will be described.
  • In the deformed steering wheel 202, the left and right parts of the annular portion 202r are columnar gripping portions 202s gripped by the driver.
  • the pair of left and right grips 202s are connected by an upper connecting portion 202c1 and a lower connecting portion 202c2 to form an annular portion 202r.
  • the touch sensor 21 is attached to the gripping part 202s.
  • FIG. 45 is an enlarged view of the boundary portion between the connecting portion 202c1 and the gripping portion 202s surrounded by the dashed-dotted ellipse in FIG.
  • FIG. 46 shows the A-A cross section of FIG. 45. Since the gripping portion 202s has a slightly smaller diameter than the connecting portions 202c1 and 202c2, when the touch sensor 21 is attached to the gripping portion 202s, there is almost no step at the boundary between the gripping portion 202s and the connecting portions 202c1 and 202c2, and the surface is continuous.
  • the gripping portion 202s is used to switch the input operation to the touch sensor 21 between on and off. Turning on the input operation means permitting (validating) the above-described specific input operation, and turning off the input operation means disallowing (invalidating) the above-mentioned specific input operation.
  • the gripper 202s has a built-in on / off switching mechanism, and the input operation is switched on and off by the on / off switching mechanism.
  • FIG. 47 shows the B-B cross section of FIG. 45.
  • the end of the connecting portion 202c1 on the gripping portion 202s side is a protruding portion 27.
  • An end portion of the gripping portion 202s on the side of the connecting portion 202c1 serves as a receiving portion 28 having a concave portion for accommodating the protruding portion 27.
  • As shown in FIGS. 47(a) to 47(c), a part of the protrusion 27 in the circumferential direction is notched, forming a recess 27cp.
  • An elastic deformation portion 29 having a protrusion 29p is fixed to the recess 27cp.
  • Two recesses 28 cp 1 and 28 cp 2 are formed on the inner peripheral surface of the receiving portion 28.
  • In the normal state of the deformed steering wheel 202, the gripping portion 202s is in the state shown in FIG. 47(a); that is, the protrusion 29p is engaged with the recess 28cp1.
  • the state shown in FIG. 47A is a state where the input operation to the touch sensor 21 is turned off. When the operation target device is not operated by the touch sensor 21 and the vehicle is normally driven, the off state shown in FIG. 47A is set.
  • the gripping portion 202s is turned to the outer peripheral side of the deformed steering wheel 202 from the OFF state shown in FIG. 47A, the projection 29p and the recess 28cp1 are disengaged as shown in FIG. 47B.
  • the projecting portion 29p comes into contact with the convex portion between the concave portions 28cp1, 28cp2. At this time, the elastic deformation portion 29 is pushed and deformed by the convex portion between the concave portions 28cp1 and 28cp2.
  • When the gripping portion 202s is further rotated to the outer peripheral side, the protrusion 29p engages with the recess 28cp2 and the input operation to the touch sensor 21 is turned on, as shown in FIG. 47(c). Although not shown, the off state of the input operation shown in FIG. 47(a) and the on state of the input operation shown in FIG. 47(c) are configured to be detected, and a state detection signal of the on/off switching mechanism of the gripping portion 202s is input to the control unit 10.
  • the driver When the driver does not operate the operation target device with the touch sensor 21 and normally drives the vehicle, the driver is in the state of FIG. 47A.
  • To operate the operation target device, the driver turns the gripping portion 202s to the outer peripheral side to obtain the state shown in FIG. 47(c). When switching from the state of FIG. 47(a) to the state of FIG. 47(c), a click feeling is obtained when the protrusion 29p engages with the recess 28cp2; when returning from the state of FIG. 47(c) to the state of FIG. 47(a), a click feeling is obtained when the protrusion 29p engages with the recess 28cp1. The driver can thus perceive that the on state and the off state have been switched.
  • the on / off switching mechanism shown in FIG. 47 may be provided on both the left and right gripping sections 202s, or may be provided only on one side.
  • When the on/off switching mechanism is provided on both sides, the input operation may be turned on when both the left and right gripping portions 202s are in the on state, or when either one of them is in the on state.
  • Conversely, the input operation may be turned on when the gripping portion 202s is turned to the inner peripheral side. In the configuration example of FIG. 44, the feeling with which the driver grips the gripping portion 202s (grip feeling) does not change between the state in which the input operation is on and the state in which it is off.
  • The shape of the touch sensor 21 does not have to be a complicated shape such as that described with reference to the earlier figures, and may be a simple shape such as a flat plane. Accordingly, since the shape of the touch sensor 21 can be simplified, the touch sensor 21 itself can be made inexpensive, the man-hours for mounting the touch sensor 21 on the steering wheel (deformed steering wheel 202) are reduced, and the control device of the operation target device can be realized at low cost.
  • the on / off switching mechanism is a rotation switch that rotates in the circumferential direction.
  • a grip detection area Arg, an operation detection area Arv, and an operation invalid area Ariv as described with reference to FIGS. 8 and 9 are set in the touch sensor 21 attached to the gripping part 202s having an on / off switching mechanism. Also good. However, since it is clear whether the driver intends to operate the operation target device by the on / off switching mechanism, the grip detection area Arg and the operation invalid area Ariv are not set, and only the operation detection area Arv is set. Also good. That is, the entire surface of the touch sensor 21 may be set as the operation detection area Arv.
  • The control unit 10 determines in step S21 whether or not the on/off switching mechanism is on. If it is not determined that the on/off switching mechanism is on (NO), the control unit 10 returns the process to step S21. If it is determined that the on/off switching mechanism is on (YES), the control unit 10 acquires, in step S22, the sensor data output from the sensor data generation unit 22. In step S23, the control unit 10 determines, based on the detection output from the detection unit 10a, whether or not an input operation has been performed.
• In step S24, the control unit 10 determines whether or not to permit an operation on the operation target device by the input operation detected in step S23. If it is determined that the operation is permitted (YES), the control unit 10 moves the process to step S25; if not (NO), it returns the process to step S21.
• The control unit 10 permits an operation on the operation target device when a specific input operation is performed on the touch sensor 21.
• In step S25, the control unit 10 determines an operation based on the input operation.
• In step S26, the control unit 10 performs control corresponding to the determined operation on the operation target device, and returns the process to step S21.
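The flow of steps S21 through S26 described above can be sketched as a simple polling pass. This is an illustrative sketch only, not the patented implementation; every callable passed in is a hypothetical placeholder standing in for the on/off switching mechanism, the sensor data generation unit 22, and the detection unit 10a.

```python
# Illustrative sketch of the S21-S26 flow described above.
# All callables are hypothetical placeholders, not APIs from this document.

def control_pass(switch_is_on, acquire_sensor_data, detect_input_operation,
                 operation_permitted, determine_operation, execute_control):
    """Run one pass of the S21-S26 flow; return True if control was executed."""
    if not switch_is_on():                        # S21: on/off switching mechanism on?
        return False                              # NO -> back to S21
    data = acquire_sensor_data()                  # S22: sensor data from unit 22
    op = detect_input_operation(data)             # S23: input operation performed?
    if op is None:
        return False
    if not operation_permitted(op):               # S24: permit the operation? NO -> S21
        return False
    action = determine_operation(op)              # S25: determine the operation
    execute_control(action)                       # S26: control the target device
    return True
```

In an actual device this pass would run in a loop, returning to step S21 after each iteration as the flow above describes.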
• In this flow, the process corresponding to step S2 of FIG. 4 is omitted, but a process of determining whether or not the gripping part 202s is gripped, corresponding to step S2 of FIG. 4, may be provided between steps S22 and S23.
• By means of the on/off switching mechanism, the control unit 10 can objectively determine whether the driver intends to operate the operation target device. Accordingly, erroneous operations can be greatly reduced.
• The gripper 202s may be returned to the normal state shown in FIG. 47(a). In this case, a motor or the like for returning from the state shown in FIG. 47(c) to the state shown in FIG. 47(a) is required.
• The touch sensor 21 may be detachably attached to the annular portion 200r or the annular portion 202r using a hook-and-loop (surface) fastener.
• Although the annular portion 200r serves as the gripping portion, the gripping portion is not necessarily circular; it may be deformed like the annular portion 202r, or it may not be annular at all.
• A receiving part having a recess may be provided on the connecting part 202c1 or 202c2 side and a protruding part on the gripping part 202s side, and the gripping part 202s and the connecting parts 202c1, 202c2 may be joined in this manner.
• The configurations shown in FIGS. 46 and 47 are merely examples of the on/off switching mechanism, which is not limited to these configurations.
• <Tenth Embodiment> A tenth embodiment of the control device for an operation target device in a vehicle will be described.
• The tenth embodiment is an embodiment of a driver identification method.
• The basic configuration and operation in the tenth embodiment are the same as those in the first embodiment, and only the differing parts will be described.
• By identifying the driver, the state of the in-vehicle device 100 or the state of the vehicle can be set to an optimum state for each driver.
• For example, a song frequently played by the identified driver may be played automatically by the audio playback unit 12, or frequently played songs may be displayed at the top when a list of songs is displayed. Setting the air-conditioner conditions or adjusting the seat position according to the driver is also conceivable.
  • FIG. 49 shows an example of the state of the palm contact detection unit Tp and the thumb contact detection unit Tt when the driver grips the annular portion 200r in an attempt to drive the vehicle.
• In FIG. 49, the thumb contact detection unit Tt is located close to the palm contact detection unit Tp.
• The index finger contact detection unit Ti is not shown.
• The control unit 10 detects the touched length in the X coordinate direction in the grip detection area Arg.
• In FIG. 49, the sum of the length Lx1 and the length Lx2 is the touched length in the X coordinate direction.
• FIG. 50 shows a state in which the thumb contact detection units Tt divided in FIG. 49 are connected, and shows the touched length Lx in the X coordinate direction.
• The length Lx is information indicating the circumferential length, in the cross section obtained by cutting the annular portion 200r in the radial direction, of the portion of the touch sensor 21 touched by the palm (palm contact detection portion Tp).
• The length Lx is a first example of gripping state identification data indicating how the driver grips the annular portion 200r where the touch sensor 21 is attached.
• Here, the control unit 10 detects the length Lx, but the number of detection regions R corresponding to the length Lx may be obtained instead. Of course, the number of detection regions R corresponding to the length Lx can also be converted into an actual distance.
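The conversion just mentioned — from a count of detection regions R to an actual distance — can be sketched as follows. The uniform region pitch is an assumption for illustration, and the 5 mm value is an example, not a figure from this document.

```python
REGION_PITCH_MM = 5.0  # assumed center-to-center pitch of detection regions R (example value)

def regions_to_length_mm(num_regions):
    """Convert a count of touched detection regions R into a length in mm."""
    return num_regions * REGION_PITCH_MM

# When the palm contact portion is divided as in FIG. 49, the touched length
# in the X coordinate direction is the sum Lx1 + Lx2:
lx = regions_to_length_mm(6) + regions_to_length_mm(3)  # Lx1 + Lx2
```

The same conversion applies to the lengths Lya and Lyb described below, with the pitch measured along the Y coordinate.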
• The control unit 10 detects the length Lya in the Y coordinate direction between the palm contact detection unit Tp and the thumb contact detection unit Tt.
• The length Lya is information indicating the length, in the circumferential direction of the steering wheel 200 (annular portion 200r), between the portion of the touch sensor 21 where the palm is in contact (palm contact detection unit Tp) and the portion where the thumb is in contact (thumb contact detection unit Tt).
• The length Lya is a second example of gripping state identification data indicating how the driver grips the annular portion 200r where the touch sensor 21 is attached.
• Here, the control unit 10 detects the length Lya, but the number of detection regions R corresponding to the length Lya may be obtained instead. Of course, the number of detection regions R corresponding to the length Lya can also be converted into an actual distance.
• Here, the length between the end of the palm contact portion on the side opposite to the thumb contact portion and the end of the thumb contact portion on the side opposite to the palm contact portion is set as the length Lya, but the length is not limited to this. However, the length shown in FIG. 49 is preferably used as the length Lya.
• The control unit 10 also detects the total number of detection regions R detected as being touched in the state of FIG. 49 (the total number of contact detection areas).
• The total number of contact detection areas corresponds to the area over which the driver's hand is in contact. It is also possible to calculate the actual area from which detection regions R are detected.
• The total number of contact detection areas may be the total number of detection regions R detected as touched over all detection regions R of the grip detection area Arg, the operation detection area Arv, and the operation invalid area Ariv, or it may be the total number of detection regions R detected as touched within the grip detection area Arg alone.
• Information corresponding to the area of the portion of the touch sensor 21 touched by the hand is a third example of gripping state identification data indicating how the driver grips the annular portion 200r where the touch sensor 21 is attached.
• The driver can be identified by the lengths Lx and Lya and the total number of contact detection areas. Although the identification accuracy is somewhat lowered, the driver may instead be identified only by the lengths Lx and Lya, only by the total number of contact detection areas, only by the length Lx, or only by the length Lya.
• The control unit 10 detects the length Lyb in the Y coordinate direction between the palm contact detection unit Tp and the thumb contact detection unit Tt in a state where the driver has extended the thumb to operate the operation target device.
• FIG. 51 shows the state where the driver has extended the thumb to operate the operation target device.
• As shown, the length Lyb in the Y coordinate direction between the palm contact detection unit Tp and the thumb contact detection unit Tt is longer than the length Lya.
• If voice guidance such as "The driver will now be identified. Please extend your thumb and operate the touch sensor." is given, the length Lyb can be detected immediately. There is no problem even if, without such guidance, the length Lyb is detected after waiting for the driver to actually operate the touch sensor 21.
• Here, the length between the end of the palm contact portion on the side opposite to the thumb contact portion and the end of the thumb contact portion on the side opposite to the palm contact portion is set as the length Lyb, but the length is not limited to this. However, the length shown in FIG. 51 is preferably used as the length Lyb.
• The length Lyb is a fourth example of gripping state identification data indicating how the driver grips the annular portion 200r where the touch sensor 21 is attached.
  • FIG. 52 shows an example of the driver database stored in the storage unit 18.
• In the driver database, the lengths Lx, Lya, and Lyb and the total number of contact detection areas are registered as driver specifying data. Even for the same driver, the lengths Lx, Lya, and Lyb and the total number of contact detection areas do not always take the same values, so it is preferable to register an updated average value each time the same driver is identified.
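Registering an average value each time the same driver is identified, as recommended above, can be done with an incremental mean so that past samples need not be stored. A minimal sketch; the record field names are hypothetical, not taken from this document.

```python
def update_driver_record(record, lx, lya, lyb, contact_count):
    """Fold a new set of measurements into a driver's registered averages.

    `record` is a dict with hypothetical keys "Lx", "Lya", "Lyb",
    "contact_count", plus a sample counter "n". None of these names
    come from the patent text; they illustrate the averaging only.
    """
    n = record.get("n", 0)
    for key, value in (("Lx", lx), ("Lya", lya),
                       ("Lyb", lyb), ("contact_count", contact_count)):
        old = record.get(key, 0.0)
        record[key] = (old * n + value) / (n + 1)  # incremental mean
    record["n"] = n + 1
    return record
```

Each call replaces the stored values with the running average over all identifications so far, which matches the suggestion of registering an average value per driver.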
• It suffices that the driver specifying data indicates how the annular portion 200r where the touch sensor 21 is attached is gripped; information corresponding to the gripping state identification data acquired by the control unit 10 to identify the driver may be registered.
• Here, the thumb contact detection unit Tt is used, but the index finger contact detection unit Ti may be used instead of, or in addition to, the thumb contact detection unit Tt.
• The control unit 10 acquires sensor data output from the sensor data generation unit 22 in step S21.
• The control unit 10 acquires the lengths Lx and Lya in step S22. Since the driver first grips the annular portion 200r in order to drive the vehicle, the lengths Lx and Lya can be acquired at this point. As described above, the detection unit 10a detects, based on the sensor data output from the sensor data generation unit 22, that the annular portion 200r (touch sensor 21) is gripped, so the lengths Lx and Lya may be acquired when this gripping is detected.
• The control unit 10 acquires the total number of contact detection areas in step S23. The order of steps S22 and S23 may be reversed.
• The control unit 10 acquires the length Lyb in step S24, after guiding the driver to extend the thumb or after waiting for the driver to operate the touch sensor 21. Step S24 can be omitted.
• In step S25, the control unit 10 compares the gripping state identification data consisting of the acquired lengths Lx, Lya, and Lyb and the total number of contact detection areas with the driver specifying data registered in the driver database, and determines whether the acquired gripping state identification data matches any of the drivers' data. Even for the same driver, the data does not always match completely; therefore, a predetermined allowable range is set around the registered driver specifying data, and if the acquired gripping state identification data falls within the allowable range, it is determined to match.
• If it is determined in step S25 that the data matches one of the drivers (YES), the control unit 10 identifies the driver in step S26, executes control corresponding to that driver in step S27, and ends the process.
• The control corresponding to the driver sets the state of the in-vehicle device 100 and the state of the vehicle to the optimum state for that driver. Of course, when the driver is identified while the vehicle is traveling, the seat position is not adjusted as part of the vehicle state.
• The control unit 10 determines in step S28 whether an instruction to register in the driver database has been given. If it is determined that such an instruction has been given (YES), the control unit 10, in step S29, associates the driver name input from an operation unit not shown in FIG. 1 with the gripping state identification data consisting of the acquired lengths Lx, Lya, and Lyb and the total number of contact detection areas, registers them in the driver database as driver specifying data, and ends the process. If it is not determined that a registration instruction has been given (NO), the process ends.
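The matching in steps S25 and S26 — checking whether the acquired gripping state identification data falls within a predetermined allowable range of each registered driver's data — can be sketched as follows. The ±10% tolerance and the field names are assumptions for illustration, not values from this document.

```python
TOLERANCE = 0.10  # assumed allowable range of +/-10% around registered values

def within_range(registered, measured,
                 keys=("Lx", "Lya", "Lyb", "contact_count")):
    """True if every measured value lies within the allowable range
    of the corresponding registered value (hypothetical field names)."""
    for key in keys:
        ref = registered[key]
        lo, hi = ref * (1 - TOLERANCE), ref * (1 + TOLERANCE)
        if not (lo <= measured[key] <= hi):
            return False
    return True

def identify_driver(database, measured):
    """Return the name of the first matching registered driver, or None
    when no driver matches (the NO branch leading toward registration)."""
    for name, registered in database.items():
        if within_range(registered, measured):
            return name
    return None
```

When `identify_driver` returns `None`, the flow would proceed to the registration branch of steps S28 and S29 described above.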
• As described above, the control unit 10 functions as a driver identifying unit that acquires, based on the sensor data output from the sensor data generation unit 22, gripping state identification data indicating how the driver grips the annular portion 200r where the touch sensor 21 is mounted, and that identifies the driver by comparing the gripping state identification data with the driver specifying data.
• Preferably, the control unit 10 learns in advance how each driver operates the in-vehicle device 100 and how each driver sets the vehicle, thereby grasping the characteristics of each driver.
• Although FIG. 1 does not show the information indicating the condition of the air conditioner or the information indicating the position of the seat being input to the control unit 10, these pieces of information are also supplied to the control unit 10 via the in-vehicle communication unit 34.
• The position where the annular portion 200r is gripped differs from driver to driver. Therefore, the position where the annular portion 200r is gripped can also be detected and used as gripping state identification data for identifying the driver.
• If the positions of the grip detection area Arg, the operation detection area Arv, and the operation invalid area Ariv (provided as necessary) are set dynamically according to the position where the driver grips the touch sensor 21, the position where the annular portion 200r is gripped can serve as gripping state identification data for identifying the driver.
  • FIG. 54A shows a state in which the driver grips the lower end of the touch sensor 21 and the grip detection area Arg is set at the lower end of the touch sensor 21.
  • FIG. 54B shows a state in which the driver holds the position slightly above the lower end of the touch sensor 21 and the grip detection area Arg is set at a position away from the lower end of the touch sensor 21.
• The position of the grip detection area Arg on the touch sensor 21 can be specified by the Y coordinate. As an example, if the Y-coordinate values of the grip detection area Arg are integrated, a smaller integrated value indicates that a lower part of the touch sensor 21 is gripped, and a larger integrated value indicates that a higher part is gripped.
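The Y-coordinate integration example above can be sketched as follows. The document describes integrating (summing) the Y-coordinate values of the grip detection area Arg; the mean-based variant shown alongside is an added assumption that makes the score independent of how many regions are touched.

```python
def grip_height_sum(touched_y_coords):
    """Integrated (summed) Y value of the touched grip-detection regions.
    A smaller sum indicates a lower grip position on the touch sensor,
    assuming Y increases upward and similar counts of touched regions."""
    return sum(touched_y_coords)

def grip_height_mean(touched_y_coords):
    """Mean-based variant (an assumption, not stated in the document);
    robust to differing numbers of touched regions."""
    ys = list(touched_y_coords)
    return sum(ys) / len(ys) if ys else None
```

Either score could be registered in the driver database and compared with an allowable range, like the other gripping state identification data.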
• Information indicating the position where the steering wheel 200 is gripped in its circumferential direction is registered as driver specifying data in the driver database of FIG. 52.
• The control unit 10 acquires, as gripping state identification data, information indicating the position where the steering wheel 200 is gripped in its circumferential direction.
• The information indicating the gripped position of the steering wheel 200 is a fifth example of gripping state identification data indicating how the driver grips the annular portion 200r where the touch sensor 21 is mounted. Although the identification accuracy decreases, the driver may also be identified based only on the information indicating the gripped position of the steering wheel 200.
• The first to fifth examples of gripping state identification data described above can be combined arbitrarily as appropriate; one or more may be selected in consideration of the required identification accuracy. Of course, using all of the first to fifth examples is preferable because the identification accuracy improves greatly.
• The present invention can be used as a control device for controlling an arbitrary operation target device in a vehicle, and can also be used for vehicles other than automobiles. Further, in a game device having an operation unit (controller) such as a steering wheel, it can be used as a control device for controlling the game.

Abstract

A touch sensor (21) is attached to a ring portion (200r) of a steering wheel (200). A sensor data generation unit (22) generates, based on a touch detection signal obtained from the touch sensor (21), sensor data including position data indicating which detection region is touched. A detection unit (10a) detects whether the driver grips the ring portion (200r) or not and an input operation to the touch sensor (21). When the driver grips the ring portion (200r) and it is detected that a specific input operation has been performed, a control unit (10) controls an in-vehicle device (100) according to the specific input operation.

Description

Control device and control method for operation target device in vehicle, and steering wheel
The present invention relates to a control device and a control method that control an operation target device, where the operation target device is an in-vehicle device such as a navigation device mounted on a vehicle, or a vehicle operation control device that controls vehicle operations such as a transmission or a direction indicator, and to a steering wheel suitable for operating the operation target device.
Vehicles in which operation switches for operating in-vehicle devices such as navigation devices are arranged on the steering wheel are in widespread use (see Patent Document 1). If the operation switches are arranged on the steering wheel, the driver does not need to reach for the in-vehicle device when operating it, so operability is improved. As described in Patent Document 1, the operation switches are usually arranged not on the annular portion of the steering wheel, which is the gripping portion held by the driver's hands, but on the connecting portion between the annular portion and the center portion in which the airbag is stored. Therefore, in order to operate an operation switch, the driver needs to release a hand from the annular portion being gripped, or to shift the hand considerably. Patent Document 2 describes arranging operation switches on the back surface or inner side surface of the annular portion.
Patent Document 1: JP 2007-106353 A; Patent Document 2: JP 2005-348123 A; Patent Document 3: JP 2008-195220 A
According to the invention described in Patent Document 2, since the operation switches are arranged on the annular portion, they can be operated without releasing the hand from the annular portion or shifting the hand considerably. However, the operation switches described in Patent Document 2 are push-button keys or keys provided with projections and recesses, and keys of this type may hinder the driver when operating the steering wheel. It is not preferable to provide large projections and recesses on the annular portion gripped by the driver. In addition, when an operation unit such as an operation switch is arranged on the annular portion, it is required that the operation target device not be operated inadvertently when the driver has no intention of operating it, for example when the driver is gripping the annular portion for normal driving.
In order to meet these demands, an object of the present invention is to provide a control device and a control method for an operation target device in a vehicle, and a steering wheel, with which the operation target device can be operated without releasing the hand from the gripping portion or shifting the hand considerably, and with which the possibility of hindering the driver's operation of the steering wheel is greatly reduced. Another object is to provide a control device and a control method for an operation target device in a vehicle, and a steering wheel, that can greatly reduce erroneous operations.
In order to solve the above problems of the conventional technology, according to a first aspect of the present invention, there is provided a control device for an operation target device in a vehicle, comprising: a sensor data generation unit (22) that generates, based on a contact detection signal obtained from a touch sensor (21) having a plurality of detection regions (R) and mounted on a predetermined range of a gripping portion (200r, 201s) of a steering wheel (200, 201) gripped by the driver, sensor data including position data indicating which detection region is touched; a detection unit (10a) that detects, based on the sensor data, whether or not the driver is gripping the gripping portion, and an input operation on the touch sensor; and a control unit (10) that, when the detection unit detects that the driver is gripping the gripping portion and that a specific input operation has been performed on the touch sensor, controls the operation target device operated by means of the touch sensor in accordance with the specific input operation.
According to a second aspect of the present invention, there is provided a control method for an operation target device in a vehicle, comprising: detecting whether or not the driver is gripping a touch sensor (21) having a plurality of detection regions (R) and mounted on a predetermined range of a gripping portion (200r, 201s) of a steering wheel (200, 201) gripped by the driver; detecting whether or not a specific input operation has been performed on the touch sensor; and controlling the operation target device operated by means of the touch sensor when it is detected that the driver is gripping the touch sensor and that the specific input operation has been performed.
According to a third aspect of the present invention, there is provided a steering wheel comprising: a gripping portion (200r) that is the portion gripped by the driver; a touch sensor (21) having a plurality of detection regions (R) and mounted on a predetermined range of the gripping portion so as to cover the gripping portion; a sensor data generation unit (23) that generates, based on a contact detection signal obtained from the touch sensor, sensor data including position data indicating which detection region is touched; a detection unit (24a) that detects, based on the sensor data, whether or not the driver is gripping the touch sensor portion of the gripping portion, and an input operation on the touch sensor; and a control signal generation unit (24b) that, when the detection unit detects that the driver is gripping the touch sensor portion and that a specific input operation has been performed on the touch sensor, generates a control signal for controlling the operation target device operated by means of the touch sensor in accordance with the specific input operation.
According to a fourth aspect of the present invention, there is provided a control device for an operation target device in a vehicle, comprising: a first detection unit (10a) that detects that a first area (Arg) of a touch sensor (21) mounted on a gripping portion (200r, 201s) of a steering wheel (200, 201) gripped by the driver is being touched; a second detection unit (10a) that detects that a specific input operation has been performed on a second area (Arv) of the touch sensor located above the first area; and a control unit (10) that, when the first detection unit detects that the first area is being touched and the second detection unit detects that the specific input operation has been performed, controls the operation target device operated by means of the touch sensor in accordance with the specific input operation.
According to a fifth aspect of the present invention, there is provided a control method for an operation target device in a vehicle, comprising: detecting that a first area (Arg) of a touch sensor (21) mounted on a gripping portion (200r, 201s) of a steering wheel (200, 201) gripped by the driver is being touched; detecting, while the first area is being touched, that a specific input operation has been performed on a second area (Arv) of the touch sensor located above the first area; and, when it is detected that the specific input operation has been performed, controlling the operation target device operated by means of the touch sensor in accordance with the specific input operation.
According to a sixth aspect of the present invention, there is provided a control device for an operation target device in a vehicle, comprising: a sensor data generation unit (22) that generates, based on a contact detection signal obtained from a touch sensor (21) having a plurality of detection regions (R) and mounted on a predetermined range of a gripping portion (200r, 201s) of a vehicle steering wheel (200, 201) so as to cover the gripping portion, sensor data including position data indicating which detection region is touched; a detection unit (10a) that detects an input operation on the touch sensor based on the sensor data; and a control unit (10) that, when the detection unit detects that a specific input operation has been performed on the touch sensor, controls the operation target device operated by means of the touch sensor in accordance with the specific input operation, wherein the control unit invalidates control of the operation target device when the vehicle is in a specific state.
According to a seventh aspect of the present invention, there is provided a control method for an operation target device in a vehicle, comprising: detecting whether or not a specific input operation has been performed on a touch sensor (21) having a plurality of detection regions (R) and mounted on a predetermined range of a gripping portion (200r, 201s) of a vehicle steering wheel (200, 201) so as to cover the gripping portion; detecting whether or not the vehicle is in a specific state; and controlling the operation target device operated by means of the touch sensor when it is detected that the vehicle is not in the specific state and that the specific input operation has been performed.
According to an eighth aspect of the present invention, there is provided a control method for an operation target device in a vehicle, comprising: detecting whether or not a specific input operation has been performed on a touch sensor (21) having a plurality of detection regions (R) and mounted on a predetermined range of a gripping portion (200r, 201s) of a vehicle steering wheel (200, 201) so as to cover the gripping portion; detecting whether or not the vehicle is in a specific state; and controlling the operation target device operated by means of the touch sensor when it is detected that the vehicle is not in the specific state and that the specific input operation has been performed.
According to a ninth aspect of the present invention, there is provided a control device for an operation target device in a vehicle, comprising: a sensor data generation unit (22) that generates, based on a contact detection signal obtained from a touch sensor (21) having a plurality of detection regions (R) and mounted on a predetermined range of a gripping portion (200r, 201s) of a steering wheel (200, 201) gripped by the driver, sensor data including position data indicating which detection region is touched; a detection unit (10a) that detects an input operation on the touch sensor based on the sensor data; and a control unit (10) that, when the detection unit detects that a predetermined input operation has been performed on the touch sensor with both the left and right hands of the driver, causes a transition from a state in which a first specific input operation for operating the operation target device operated by means of the touch sensor is not accepted to a state in which it is accepted.
According to a tenth aspect of the present invention, there is provided a method for controlling an operation target device in a vehicle, comprising: detecting an input operation by the driver's hand on a touch sensor (21) that has a plurality of detection regions (R) and is attached to a predetermined range of a grip portion (200r, 201s) of a steering wheel (200, 201) gripped by the driver; and, when it is detected that a predetermined input operation has been performed on the touch sensor with the driver's left and right hands, shifting from a state in which a first specific input operation for operating the operation target device to be operated by the touch sensor is not accepted to a state in which it is accepted.
According to the control device and control method for an operation target device in a vehicle and the steering wheel of the present invention, the operation target device can be operated without releasing the hands from the grip portion or shifting them significantly, so the possibility of hindering the driver's operation of the steering wheel can be greatly reduced. Erroneous operations can also be greatly reduced.
FIG. 1 is a block diagram illustrating embodiments of a control device for an operation target device in a vehicle.
FIG. 2 is a partial plan view illustrating an example of a vehicle including the control device for an operation target device of each embodiment.
FIG. 3 is a diagram illustrating an example of the position and range where the touch sensor of each embodiment is mounted on the steering wheel.
FIG. 4 is a diagram illustrating another example of the position and range where the touch sensor of each embodiment is mounted on the steering wheel.
FIG. 5 is a diagram illustrating an example in which the touch sensor is mounted on a modified steering wheel.
FIG. 6 is a partial perspective view showing an example of the portions from which sensor data is obtained while the touch sensor portion of the steering wheel is gripped.
FIG. 7 is a cross-sectional view showing coordinates in the circumferential direction of the cross section of the touch sensor.
FIG. 8 is a plan view showing the touch sensor of FIG. 6 in a developed state.
FIG. 9 is a schematic diagram showing a state in which the regions shown in FIG. 8 are converted to equal sizes.
FIG. 10 is a diagram illustrating an example of the requirements for determining that the touch sensor portion of the steering wheel is being gripped.
FIG. 11 is a schematic diagram illustrating an example of a specific input operation on the touch sensor.
FIG. 12 is a schematic diagram illustrating another example of a specific input operation on the touch sensor.
FIG. 13 is a schematic diagram illustrating still another example of a specific input operation on the touch sensor.
FIG. 14 is a flowchart for explaining the operation of each embodiment.
FIG. 15 is a schematic perspective view illustrating a configuration example for changing the color when the touch sensor is operated.
FIG. 16 is a schematic perspective view illustrating a configuration example for changing the tactile sensation when the touch sensor is operated.
FIG. 17 is a plan view showing an embodiment of the steering wheel.
FIG. 18 is a diagram for explaining the rotation angle of the steering wheel.
FIG. 19 is a flowchart showing a specific example of the processing in step S4 of FIG. 14.
FIG. 20 is a schematic diagram illustrating an example of a state where the driver grips the touch sensor portion for normal driving.
FIG. 21 is a schematic diagram illustrating an example of a state where the driver grips the touch sensor portion and is about to operate the operation target device.
FIG. 22 is a schematic diagram showing a state in which the operation invalid area Ariv of FIG. 11 is omitted.
FIG. 23 is a plan view showing the touch sensor of FIG. 6 in a developed state, for explaining another configuration example for distinguishing whether the driver intends to operate the operation target device.
FIG. 24 is a diagram illustrating an example of input operations when the same input operation is performed on the left and right touch sensors with the left and right hands at the same timing.
FIG. 25 is a diagram illustrating an example of input operations regarded as having the same timing.
FIG. 26 is a diagram illustrating an example of input operations when a predetermined input operation is performed continuously on the left and right touch sensors with the left and right hands.
FIG. 27 is a diagram illustrating an example of what is regarded as a continuous input operation.
FIG. 28 is a diagram illustrating a first example of setting an operation mode by a combination of input operations by the left and right hands on the left and right touch sensors.
FIG. 29 is a diagram illustrating a second example of setting an operation mode by a combination of input operations by the left and right hands on the left and right touch sensors.
FIG. 30 is a diagram illustrating an example in which the areas of the touch sensor are color-coded.
FIG. 31 is a diagram illustrating an example in which markers are attached to the boundaries of the touch sensor areas.
FIG. 32 is a diagram illustrating an example in which the diameter in the operation detection area of the touch sensor is reduced.
FIG. 33 is a diagram illustrating an example in which the diameter in the operation detection area of the touch sensor is increased.
FIG. 34 is a diagram illustrating an example in which recesses are provided at the boundaries of the touch sensor areas.
FIG. 35 is a diagram illustrating an example in which protrusions are provided at the boundaries of the touch sensor areas.
FIG. 36 is a diagram illustrating an example in which the color of the operation detection area is changed when it is detected that the grip detection area of the touch sensor is gripped.
FIG. 37 is a diagram illustrating an example in which the tactile sensation of the operation detection area is changed when it is detected that the grip detection area of the touch sensor is gripped.
FIG. 38 is a diagram illustrating an example of a locus when a finger is slid in the left-right direction.
FIG. 39 is a diagram for explaining correction of the locus when a finger is slid in the right direction.
FIG. 40 is a diagram for explaining correction of the locus when a finger is slid downward.
FIG. 41 is a diagram for explaining an example of realizing diagonal dragging.
FIG. 42 is a partial perspective view for explaining the definitions of horizontal and vertical drags in the eighth embodiment.
FIG. 43 is a plan view, with the touch sensor developed, for explaining the definitions of horizontal and vertical drags in the eighth embodiment.
FIG. 44 is a plan view showing a configuration example developed from the modified steering wheel.
FIG. 45 is a partially enlarged plan view of FIG. 44.
FIG. 46 is a cross-sectional view taken along line A-A of FIG. 45.
FIG. 47 is a cross-sectional view taken along line B-B of FIG. 45, for explaining the on/off switching operation by the on/off switching mechanism.
FIG. 48 is a flowchart for explaining the operation of the eighth embodiment when the modified steering wheel of FIG. 44 is used.
FIG. 49 is a schematic diagram showing an example of gripping state identification data indicating how the driver grips the portion of the steering wheel to which the touch sensor is attached.
FIG. 50 is a schematic diagram showing a modification of FIG. 49 to facilitate understanding of FIG. 49.
FIG. 51 is a schematic diagram showing another example of gripping state identification data indicating how the driver grips the portion of the steering wheel to which the touch sensor is attached.
FIG. 52 is a diagram illustrating an example of driver identification data registered in the driver database.
FIG. 53 is a flowchart for explaining the operation when identifying the driver.
FIG. 54 is a partial perspective view for explaining still another example of gripping state identification data indicating how the driver grips the portion of the steering wheel to which the touch sensor is attached.
<First Embodiment>
Hereinafter, a first embodiment of a control device and a control method for an operation target device in a vehicle will be described with reference to the accompanying drawings, together with an embodiment of a steering wheel. In FIGS. 1 and 2, the in-vehicle device 100 is mounted in the dashboard of a vehicle. In the example illustrated in FIG. 1, the in-vehicle device 100 includes a control unit 10, a navigation processing unit 11, an audio playback unit 12, a television (TV) tuner 13, a video signal processing unit 14, a video display unit 15, an audio signal processing unit 16, a display element 17, and a storage unit 18. The control unit 10 includes a detection unit 10a.
The navigation processing unit 11 has a storage unit that holds map data, a GPS antenna, and the like, and the control unit 10 and the navigation processing unit 11 cooperate to provide route guidance. The audio playback unit 12 plays back audio signals recorded on an optical disc such as a compact disc or on a semiconductor memory under the control of the control unit 10. The TV tuner 13 receives the TV broadcast wave signal of a predetermined broadcast station under the control of the control unit 10. A video signal output from the navigation processing unit 11 or the TV tuner 13 is input to the video signal processing unit 14 via the control unit 10, processed, and displayed on the video display unit 15, such as a liquid crystal panel.
Audio signals output from the navigation processing unit 11, the audio playback unit 12, and the TV tuner 13 are input to the audio signal processing unit 16 via the control unit 10, processed, and reproduced by the external speakers 20. The audio signal processing unit 16 includes an amplification unit. The speakers 20 are installed, for example, inside the vehicle doors. The display element 17 is, for example, a light emitting diode (LED), and is turned on or off under the control of the control unit 10 according to the contact state of the touch sensor 21 described later. The display element 17 is disposed, for example, on the housing of the in-vehicle device 100 so that it is visible to the driver. The display element 17 may also be placed away from the in-vehicle device 100, in the vicinity of the steering wheel 200 of the vehicle. The storage unit 18 is a nonvolatile memory.
As shown in FIG. 2, the touch sensor 21 serving as the operation unit is attached to the annular portion 200r of the steering wheel 200. The annular portion 200r is the grip portion that the driver holds while driving. In the example illustrated in FIG. 2, the touch sensor 21 is mounted over a predetermined angular range on each of the left and right sides of the annular portion 200r. The touch sensor 21 is a so-called multi-point detection (multi-touch) touch sensor capable of detecting contact at a plurality of locations. The touch sensor 21 is preferably mounted over the full 360° of the circumference of the radial cross section of the annular portion 200r. A range of less than 360° is acceptable as long as it covers substantially the entire circumference of the cross section of the annular portion 200r.
The driver grips the portion of the annular portion 200r to which the touch sensor 21 is attached. In FIG. 1, the output of the touch sensor 21 is input to the sensor data generation unit 22. When the touch sensor 21 is touched by a hand, a contact detection signal is input to the sensor data generation unit 22. Based on the input contact detection signal, the sensor data generation unit 22 generates sensor data including position data indicating the position on the touch sensor 21 from which the contact detection signal was obtained, and supplies the sensor data to the control unit 10. The touch sensor 21 and the sensor data generation unit 22 may be integrated, or the sensor data generation unit 22 may be provided in the control unit 10.
As the touch sensor 21, for example, a projected capacitive (mutual capacitance) touch sensor can be used. As an example of a touch sensor 21 suitable for mounting on the annular portion 200r, a flexible touch panel developed by the Micro Technology Research Institute can be employed. This flexible touch panel has a structure in which the sensor portion is an ultra-thin glass sheet with a thickness of 0.02 to 0.05 mm, bonded to a PET (polyethylene terephthalate) film. Even when the touch sensor 21 is attached to the annular portion 200r, it has no irregularities perceptible to the hand or fingers, so it hardly interferes with the driver's operation of the steering wheel 200.
As indicated by the broken line in FIG. 2, the electric wires connecting the touch sensor 21 and the in-vehicle device 100 are preferably routed through the interior of the steering wheel 200 and the dashboard.
In FIGS. 1 and 2, the steering angle sensor 31 detects the rotation angle of the steering wheel 200. The direction indicator sensor 32 detects operation of the direction indicator 320. The shift lever sensor 33 detects the shift position of the shift lever 330. The detection signals of the steering angle sensor 31, the direction indicator sensor 32, and the shift lever sensor 33 are supplied to the control unit 10 via the in-vehicle communication unit 34.
An example of the position and range where the touch sensor 21 is mounted on the annular portion 200r of the steering wheel 200 will be described with reference to FIG. 3. Here, up, down, right, and left refer to the directions as seen by the driver looking at the steering wheel 200 when the steering wheel 200 is not rotated (the vehicle is traveling straight). FIG. 3(a) is an example in which the touch sensor 21 is mounted over the entire circumference of the annular portion 200r. FIG. 3(b), which corresponds to FIG. 2, is an example in which the touch sensors 21 are mounted, separated from each other, over predetermined angular ranges on the upper left and upper right of the annular portion 200r. FIG. 3(c) is an example in which the touch sensor 21 is mounted over a predetermined angular range only on the upper right of the annular portion 200r.
FIG. 3(d) is an example in which the touch sensors 21 are mounted, separated from each other, over predetermined angular ranges on the lower left and lower right of the annular portion 200r. FIG. 3(e) is an example in which the touch sensor 21 is mounted over a relatively wide angular range in the upper part of the annular portion 200r, including the top. FIG. 3(e) corresponds to connecting the left and right touch sensors 21 of FIG. 3(b).
FIG. 4 is an example in which each of the left and right touch sensors 21 of FIG. 3(b) is divided into an upper touch sensor 21a and a lower touch sensor 21b. In the example of FIG. 4, the upper touch sensor 21a detects contact by the index finger and thumb, and the lower touch sensor 21b mainly detects contact by the palm, middle finger, and ring finger. FIG. 5 is an example in which the touch sensor 21 is mounted on a non-circular, modified steering wheel 201. The touch sensors 21 are attached to the left and right straight portions 201s of the modified steering wheel 201. The driver drives while gripping the straight portions 201s, which serve as grip portions, and the touch sensors 21 detect contact by the palms and fingers.
Here, a description will be given of how contact by the palm and fingers is detected by the touch sensor 21 when, as shown in FIG. 2, the touch sensors 21 are mounted on the upper left and upper right of the annular portion 200r and the driver is gripping a touch sensor 21 portion. FIG. 6 shows an example of the range over which the palm and fingers are in contact when the driver grips the right touch sensor 21 of FIG. 2. The way drivers grip the annular portion 200r and the size of their hands are not uniform; FIG. 6 is merely an example.
In the example shown in FIG. 6, the plurality of hatched detection regions R labeled Tp are the portions detecting palm contact, and the plurality of hatched detection regions R labeled Tt are the portions detecting thumb contact. Hereinafter, these are referred to as the palm contact detection portion Tp and the thumb contact detection portion Tt. The index finger contacts the back side of the touch sensor 21, on the vehicle traveling direction side not visible in FIG. 6.
As shown in FIG. 6, the touch sensor 21 has a plurality of detection regions R as detection portions for detecting contact by the palm and fingers. Coordinates are assigned to each detection region R of the touch sensor 21. As shown in FIG. 6, in the circumferential direction of the annular portion 200r, the detection region R located at the lower end of the touch sensor 21 is assigned coordinate 0, and coordinates 1, 2, ..., 30, 31 are assigned in the circumferential direction up to the detection region R located at the upper end. The coordinate in the circumferential direction of the annular portion 200r on the touch sensor 21 is defined as the Y coordinate.
FIG. 7 is a cross-sectional view of the annular portion 200r cut in its radial direction at a portion where the touch sensor 21 is mounted. As shown in FIG. 7, in the cross section of the annular portion 200r, the detection region R located, for example, on the inner diameter side of the annular portion 200r is assigned coordinate 0. Coordinates 1, 2, ..., 21, 22 are assigned to the detection regions R around the circumference of the cross section, counterclockwise in FIG. 7, proceeding from the inner diameter side to the front side, from the front side to the outer diameter side, from the outer diameter side to the back side, and from the back side to the inner diameter side. The coordinate in the circumferential direction of the cross section on the touch sensor 21 is defined as the X coordinate. The sensor data generation unit 22 can obtain position data indicating where the driver is touching the touch sensor 21 from the X and Y coordinates of the detection regions R from which contact detection signals are obtained.
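The coordinate scheme above can be sketched as follows. This is a minimal illustration, not the patent's implementation: it assumes the contact detection signals are available as a boolean grid indexed by the Y coordinate (0 to 31 along the wheel circumference, per FIG. 6) and the X coordinate (0 to 22 around the cross section, per FIG. 7), and the function names are hypothetical.

```python
# Illustrative sketch of the sensor data generation unit 22:
# turn raw contact detection signals into position data (x, y).
X_MAX = 22  # cross-section circumferential coordinate (FIG. 7)
Y_MAX = 31  # wheel circumferential coordinate (FIG. 6)

def generate_sensor_data(contact_grid):
    """Return position data: the (x, y) coordinates of every
    detection region R that reports a contact detection signal.

    contact_grid[y][x] is True when the detection region R at
    that coordinate is being touched.
    """
    positions = []
    for y in range(Y_MAX + 1):
        for x in range(X_MAX + 1):
            if contact_grid[y][x]:
                positions.append((x, y))
    return positions

# Example: two touched detection regions
grid = [[False] * (X_MAX + 1) for _ in range(Y_MAX + 1)]
grid[5][3] = True    # region at X=3, Y=5
grid[20][10] = True  # region at X=10, Y=20
print(generate_sensor_data(grid))  # [(3, 5), (10, 20)]
```

In practice the sensor data would also carry timing information so that the detection unit 10a can recognize gestures, but the position data shown here is the core of what the text describes.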
 図6に示すタッチセンサ21を展開すると、図8のようになる。図8に示すタッチセンサ21の各領域を均等の大きさに変換した状態を模式的に示すと、図9のようになる。図8,図9においては、手のひら接触検出部Tpと親指接触検出部Ttに加えて、人差し指が接触している複数の検出領域Rである人差し指接触検出部Tiも示している。なお、タッチセンサ21に中指や薬指または小指が接触すれば、タッチセンサ21はそれらの指の接触も検出する。本実施形態においては、運転者が、タッチセンサ21上で特定の入力操作を行うのに好適な親指または人差し指を操作のための指として用いることとする。 6 is developed as shown in FIG. FIG. 9 schematically shows a state in which each area of the touch sensor 21 shown in FIG. 8 is converted into an equal size. 8 and 9, in addition to the palm contact detection unit Tp and the thumb contact detection unit Tt, an index finger contact detection unit Ti, which is a plurality of detection regions R with which the index finger is in contact, is also shown. In addition, if a middle finger, a ring finger, or a little finger contacts the touch sensor 21, the touch sensor 21 also detects contact of those fingers. In the present embodiment, the driver uses a thumb or index finger suitable for performing a specific input operation on the touch sensor 21 as a finger for the operation.
 親指または人差し指による入力操作の詳細については後に詳述することとする。制御部10の検出部10aは、センサデータ生成部22から出力されるセンサデータに基づいて、親指または人差し指によってタッチセンサ21上で入力操作が行われたことを検出する。検出部10aは、センサデータ生成部22から出力されるセンサデータに基づいて、円環状部200r(タッチセンサ21)を握っていることも検出する。制御部10は、タッチセンサ21に対して行った特定の入力操作に応じて操作対象装置を制御する。 The details of the input operation with the thumb or index finger will be described in detail later. The detection unit 10 a of the control unit 10 detects that an input operation has been performed on the touch sensor 21 with the thumb or index finger based on the sensor data output from the sensor data generation unit 22. Based on the sensor data output from the sensor data generation unit 22, the detection unit 10a also detects that the annular portion 200r (touch sensor 21) is being gripped. The control unit 10 controls the operation target device according to a specific input operation performed on the touch sensor 21.
One example of the operation target device is the in-vehicle device 100. Specifically, according to a specific input operation, the control unit 10 can execute control related to route guidance in the navigation processing unit 11, play or stop an audio signal in the audio playback unit 12, or skip forward or back through tracks (songs). Also, according to a specific input operation, the control unit 10 can switch the reception channel of the TV tuner 13, or control the amplification unit of the audio signal processing unit 16 to decrease or increase the volume.
Another example of the operation target device is a vehicle operation control device that controls the operation of the vehicle. Specifically, the control unit 10 may control the transmission, the direction indicator, the on/off state of the air conditioner, the set temperature of the air conditioner, and the like via the in-vehicle communication unit 34. When the operation target device is a vehicle operation control device, it is preferable to input the sensor data output from the sensor data generation unit 22 to a control unit of the vehicle to control the vehicle operation control device. The control unit that controls the operation target device may be the control unit 10 in the in-vehicle device 100, or may be a control unit provided in the vehicle outside the in-vehicle device 100.
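The mapping from a detected input operation to an action on the operation target device can be pictured as a dispatch table. The following is a hypothetical sketch only: the gesture names, mode names, and command strings are illustrative assumptions and do not appear in the patent.

```python
# Hypothetical dispatch in the control unit 10: a detected specific
# input operation, together with the current operation mode, selects
# a command for the operation target device.
COMMAND_TABLE = {
    ("audio", "drag_right"): "next_track",
    ("audio", "drag_left"):  "previous_track",
    ("audio", "drag_up"):    "volume_up",
    ("audio", "drag_down"):  "volume_down",
    ("tv",    "drag_right"): "next_channel",
}

def dispatch(mode, gesture):
    """Map (operation mode, detected gesture) to a device command,
    or None when the gesture is not assigned in this mode."""
    return COMMAND_TABLE.get((mode, gesture))

print(dispatch("audio", "drag_right"))  # next_track
print(dispatch("tv", "drag_left"))      # None
```

Returning None for an unassigned gesture mirrors the requirement that unintended touches must not trigger the operation target device.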
According to the present embodiment, the extremely thin touch sensor 21 is attached to the annular portion 200r gripped by the driver, and the operation target device is operated by operating the touch sensor 21, so the operation target device can be operated without releasing the hands from the annular portion 200r or shifting them significantly. In addition, since the touch sensor 21 presents no irregularities on the surface of the annular portion 200r, there is almost no possibility of it interfering with the driver's operation of the steering wheel 200.
When the driver has no intention of operating the operation target device, as when gripping the annular portion 200r for normal driving, it is necessary to prevent the operation target device from being operated inadvertently. Therefore, in the present embodiment, the following measures are taken to avoid erroneous operations not intended by the driver.
As shown in FIGS. 8 and 9, the plurality of detection regions R on the touch sensor 21 are divided into a grip detection area Arg for detecting palm contact, an operation detection area Arv in which operation input by the thumb or index finger is treated as valid and detected, and an operation invalid area Ariv, an intermediate area between the grip detection area Arg and the operation detection area Arv, in which operation input is treated as invalid. The palm contact detection portion Tp lies within the grip detection area Arg, and the thumb contact detection portion Tt and the index finger contact detection portion Ti lie within the operation detection area Arv.
The operation invalid area Ariv, like the grip detection area Arg and the operation detection area Arv, has detection regions R for detecting palm or finger contact, but it can be made an operation invalid area by having the control unit 10 (detection unit 10a) or the sensor data generation unit 22 process input operations from the operation invalid area Ariv as invalid. Alternatively, the touch sensor 21 may be configured without detection regions R in the range of the operation invalid area Ariv, which also yields an operation invalid area. This case is substantially equivalent to the example shown in FIG. 4.
When the driver grips the annular portion 200r for normal driving, the palm contact detection portion Tp and the thumb contact detection portion Tt and index finger contact detection portion Ti come relatively close to one another. In this embodiment, the operation invalid area Ariv is therefore provided in order to distinguish accurately between the case where the driver is merely gripping the annular portion 200r and the case where the driver touches the touch sensor 21 in order to operate the operation target device. When the driver wishes to operate the operation target device, the driver deliberately extends the thumb or index finger, touches the touch sensor 21, and performs a specific input operation described later. When a specific input operation described later is performed within the operation detection area Arv, the control unit 10 controls the operation target device in accordance with that input operation.
It is also necessary to prevent the operation target device from being operated erroneously when the driver carelessly touches the operation detection area Arv while not gripping the annular portion 200r for driving. In this embodiment, therefore, the detection unit 10a determines that the annular portion 200r is being gripped when a palm contact detection portion Tp of at least a predetermined area is obtained in the grip detection area Arg. The control unit 10 is configured to control the operation target device only when the annular portion 200r is being gripped and a specific operation is performed within the operation detection area Arv. The area of the palm contact detection portion Tp from which a grip on the annular portion 200r is determined may be set appropriately by statistically examining the contact areas produced when a number of drivers grip the steering wheel 200 in the normal manner.
The area of the palm contact detection portion Tp within the grip detection area Arg is one example of a criterion for determining that the driver is gripping the annular portion 200r; the criterion is not limited to this. FIG. 10 shows a cross section of the annular portion 200r cut at the grip detection area Arg of the touch sensor 21. The detection unit 10a can determine that the annular portion 200r is being gripped when the circumferential angle θ of the palm contact detection portion Tp in this cross section is equal to or greater than a predetermined angle, for example 180°.
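The two grip-determination criteria above can be sketched as follows. This is an illustrative example only: the thresholds (minimum number of touched regions, 180°) are placeholder values, since the disclosure leaves the exact figures to statistical tuning, and the angle test assumes the palm contact is one contiguous arc:

```python
def gripped_by_area(touched_cells, min_cells=12):
    """Criterion 1: the palm contact patch Tp covers at least a predetermined
    area, approximated here as a minimum count of touched detection regions R."""
    return len(touched_cells) >= min_cells

def gripped_by_angle(touched_angles_deg, min_angle_deg=180.0):
    """Criterion 2: in the cross section of the annular portion 200r, the palm
    wraps around at least the predetermined circumferential angle theta.
    Assumes the touched angles form one contiguous arc around the rim."""
    if not touched_angles_deg:
        return False
    return max(touched_angles_deg) - min(touched_angles_deg) >= min_angle_deg
```

Either criterion (or both) can gate whether operations in the area Arv are accepted.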
As described above, in this embodiment it is determined whether the driver is gripping the annular portion 200r (the touch sensor 21), and operation inputs to the touch sensor 21 are accepted only while the annular portion 200r is being gripped, so erroneous operations caused by carelessly touching the operation detection area Arv can be avoided. As a preferable configuration of this embodiment, the operation detection area Arv is provided at a position separated from the grip detection area Arg by a predetermined distance, so that it can be detected accurately that the driver is deliberately performing a specific input operation on the touch sensor 21. Erroneous operations can therefore be greatly reduced.
Further, in this embodiment, the area of the palm contact detection portion Tp within the grip detection area Arg and the circumferential angle θ of the cross section are used as criteria for determining whether the driver is gripping the annular portion 200r, so whether the annular portion 200r is being gripped can be determined accurately. Accordingly, erroneous operations caused by the driver carelessly touching the operation detection area Arv while not gripping the annular portion 200r can also be avoided.
When the detection unit 10a detects, from the sensor data based on the contact detection signals from the grip detection area Arg, that the driver is gripping the annular portion 200r (the touch sensor 21), the control unit 10 lights the display element 17 to inform the driver that operation input via the operation detection area Arv is possible. From whether the display element 17 is lit or unlit, the driver can judge whether the operation target device can be operated via the touch sensor 21. The display element 17 is preferably arranged in the vicinity of the steering wheel 200.
When the touch sensor 21 is mounted around the entire circumference of the annular portion 200r as in FIG. 3(a), or over a relatively wide range as in FIG. 3(e), the position at which the driver grips the touch sensor 21 is not fixed. The positions of the grip detection area Arg and the operation detection area Arv described with reference to FIGS. 8 and 9, and of the operation invalid area Ariv provided as necessary, must therefore be set dynamically according to the position at which the driver grips the touch sensor 21.
Accordingly, when the touch sensor 21 is gripped while the grip detection area Arg and the other areas have not yet been set, the control unit 10 sets a region including the palm contact detection portion Tp as the grip detection area Arg. A predetermined range of Y coordinates including the palm contact detection portion Tp may be used as the grip detection area Arg. As described above, the palm contact detection portion Tp has at least a predetermined area, so the portion of the plurality of detection regions R on the touch sensor 21 detected as being touched over at least that area becomes the palm contact detection portion Tp. Alternatively, as described with reference to FIG. 10, the portion detected as being touched over at least a predetermined angle in the circumferential direction of the cross section of the annular portion 200r cut at the touch sensor 21 becomes the palm contact detection portion Tp.
Having set the grip detection area Arg, the control unit 10 sets a predetermined range of Y coordinates above the grip detection area Arg as the operation detection area Arv. In this case, if necessary, a predetermined range of Y coordinates adjacent to the grip detection area Arg is set as the operation invalid area Ariv, placing the operation detection area Arv at a position separated from the grip detection area Arg.
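The dynamic placement of the areas relative to the detected palm can be sketched as follows. This is a hypothetical illustration: the margin and area heights are placeholder values, and larger Y is assumed to mean "above" on the sensor grid:

```python
def set_zones(palm_y_min, palm_y_max, margin=2, op_height=4):
    """Given the Y-extent of the detected palm patch Tp, place the grip
    detection area Arg over it, an operation invalid area Ariv of `margin`
    rows directly above, and the operation detection area Arv above that.
    Each area is returned as an inclusive (y_min, y_max) range."""
    arg = (palm_y_min, palm_y_max)
    ariv = (palm_y_max + 1, palm_y_max + margin)
    arv = (palm_y_max + margin + 1, palm_y_max + margin + op_height)
    return {"Arg": arg, "Ariv": ariv, "Arv": arv}
```

Setting `margin=0` would correspond to omitting the operation invalid area and letting Arv directly adjoin Arg.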
Next, examples of specific input operations performed by the driver on the touch sensor 21 with the thumb or index finger will be described with reference to FIGS. 11 to 13. FIGS. 11(a) to 11(e) schematically show, as a plane, the half of the touch sensor 21 on either the front side facing the driver or the back side. The operations shown in FIGS. 11(a) to 11(e) are performed with the thumb on the front side and with the index finger on the back side. In FIG. 11(a), DR is a rightward drag in which the thumb or index finger is slid rightward on the touch sensor 21 (operation detection area Arv), and DL is a leftward drag in which the thumb or index finger is slid leftward. DU is an upward drag in which the thumb or index finger is slid upward, and DD is a downward drag in which the thumb or index finger is slid downward.
In FIG. 11(a), a flick, in which the thumb or index finger flicks the touch sensor 21 in the corresponding direction, may be used in place of the rightward drag DR, leftward drag DL, upward drag DU, or downward drag DD.
FIG. 11(b) shows a tap T in which the thumb or index finger taps the touch sensor 21. FIG. 11(c) shows an arc drag DC in which the thumb or index finger is dragged so as to draw an arc on the touch sensor 21. FIG. 11(d) shows a zigzag drag DZ in which the thumb or index finger is dragged in a zigzag pattern on the touch sensor 21. FIG. 11(e) shows a symbol input drag DS in which the thumb or index finger is dragged so as to write a symbol; the illustrated state shows the numeral 3 drawn as the symbol. As symbols, numerals and letters of the alphabet, which are comparatively easy to recognize, are preferably used.
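A minimal sketch of distinguishing the straight drags of FIG. 11(a) from a tap, using only the start and end coordinates of a touch trace, is shown below. This is illustrative only: the travel threshold is a placeholder, Y is assumed to increase upward, and recognizing the arc, zigzag, and symbol drags would require the full trace and is omitted:

```python
def classify_drag(x0, y0, x1, y1, min_travel=2):
    """Classify a touch trace by its endpoints into a tap or one of the
    four straight drags DR/DL/DU/DD."""
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) < min_travel and abs(dy) < min_travel:
        return "tap"                      # T: little or no travel
    if abs(dx) >= abs(dy):
        return "DR" if dx > 0 else "DL"   # rightward / leftward drag
    return "DU" if dy > 0 else "DD"       # upward / downward drag
```

A flick could be distinguished from a drag by additionally thresholding the trace duration.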
FIGS. 12(a) to 12(d) schematically show, as planes, the touch sensor 21 opened out into a front portion 21f, the half on the front side, and a back portion 21r, the half on the back side. The front portion 21f corresponds to X coordinates 0 to 11 in FIGS. 8 and 9, and the back portion 21r to X coordinates 12 to 22. Strictly speaking, in the example of FIGS. 8 and 9 the front portion 21f and the back portion 21r are not of equal area, but in FIGS. 12(a) to 12(d) they are shown with the same area. For ease of understanding, FIGS. 12(a) to 12(d) show the back portion 21r not as viewed from the back side of the annular portion 200r but as seen through from the front portion 21f.
As shown in FIGS. 12(a) to 12(d), a pattern combining an input operation by the thumb on the front portion 21f with an input operation by the index finger on the back portion 21r may be used as the specific input operation on the touch sensor 21. FIG. 12(a) is an example in which both a rightward drag DTR, sliding the thumb rightward on the front portion 21f, and a rightward drag DIR, sliding the index finger rightward on the back portion 21r, are performed. FIG. 12(a) is realized by dragging both the thumb and the index finger from the inner circumferential side of the annular portion 200r toward the outer circumferential side. A pattern in which both the thumb and the index finger are dragged in the direction opposite to that of FIG. 12(a) may also be used.
FIG. 12(b) is an example in which both a leftward drag DTL, sliding the thumb leftward on the front portion 21f, and a rightward drag DIR, sliding the index finger rightward on the back portion 21r, are performed. FIG. 12(b) is realized by dragging the thumb from the outer circumferential side of the annular portion 200r toward the inner circumferential side and dragging the index finger from the inner circumferential side toward the outer circumferential side. FIG. 12(c) is an example in which both a rightward drag DTR, sliding the thumb rightward on the front portion 21f, and a leftward drag DIL, sliding the index finger leftward on the back portion 21r, are performed. FIG. 12(c) is realized by dragging the thumb from the inner circumferential side of the annular portion 200r toward the outer circumferential side and dragging the index finger from the outer circumferential side toward the inner circumferential side.
FIG. 12(d) is an example in which both an upward drag DTU, sliding the thumb upward on the front portion 21f, and a downward drag DID, sliding the index finger downward on the back portion 21r, are performed. A pattern in which the thumb is dragged downward and the index finger upward may also be used, as may a pattern in which both the thumb and the index finger are dragged upward or both downward. Various patterns combining an input operation by the thumb on the front portion 21f with an input operation by the index finger on the back portion 21r have been shown here; a suitable pattern may be selected with ease of input in mind.
As shown in FIGS. 12(a) to 12(d), using a pattern that combines an input operation by the thumb with an input operation by the index finger as the specific input operation for controlling the operation target device makes it possible to reduce erroneous operations still further.
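Matching the combined front/back patterns of FIG. 12 can be sketched as a lookup over recognized per-finger gestures. The gesture labels and pattern names below are illustrative; only combinations registered in the table count as specific input operations, which is what suppresses coincidental touches:

```python
# Hypothetical table of the combined patterns of FIGS. 12(a)-(d): each entry
# pairs a thumb gesture on the front portion 21f with an index-finger gesture
# on the back portion 21r.
COMBINED_PATTERNS = {
    ("DTR", "DIR"): "fig12a",   # both fingers rightward (inner -> outer)
    ("DTL", "DIR"): "fig12b",   # thumb leftward, index finger rightward
    ("DTR", "DIL"): "fig12c",   # thumb rightward, index finger leftward
    ("DTU", "DID"): "fig12d",   # thumb upward, index finger downward
}

def match_combined(thumb_gesture, index_gesture):
    """Return the matched pattern name, or None so that any other
    combination is treated as no specific input operation."""
    return COMBINED_PATTERNS.get((thumb_gesture, index_gesture))
```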
Furthermore, when the touch sensor 21 (21a, 21b) can be operated with both the left and right hands, as in FIGS. 3(a), (b), (d) and (e) and FIGS. 4 and 5, a pattern combining operations by the left and right hands can also be used as the specific input operation for controlling the operation target device. FIGS. 13(a) to 13(d) show examples of patterns combining operations by the left and right hands, where the left touch sensor 21 in FIG. 3(b) is taken as the left touch sensor 21L and the right touch sensor 21 as the right touch sensor 21R. Here, the surface corresponding to the front portion 21f of FIG. 12, operated by the thumb, is shown schematically as a plane.
FIG. 13(a) is a pattern combining a leftward drag DTL, sliding the thumb leftward on the left touch sensor 21L, with a rightward drag DTR, sliding the thumb rightward on the right touch sensor 21R. FIG. 13(b) is a pattern combining a rightward drag DTR, sliding the thumb rightward on the left touch sensor 21L, with a leftward drag DTL, sliding the thumb leftward on the right touch sensor 21R.
FIG. 13(c) is a pattern combining upward drags DTU, sliding the thumb upward, on both the left touch sensor 21L and the right touch sensor 21R. FIG. 13(d) is a pattern combining downward drags DTD, sliding the thumb downward, on both the left touch sensor 21L and the right touch sensor 21R.
If a pattern combining input operations by the left and right hands is used as the specific input operation for controlling the operation target device, the driver grips the annular portion 200r with both hands, which contributes to safe driving. The example of FIG. 3(b) is particularly preferable in this respect, since the touch sensors 21 are mounted at the most appropriate positions for gripping the annular portion 200r with both hands. Input operations may be accepted only while the left and right touch sensors 21 are gripped with both hands. When one hand leaves the touch sensor 21, input operations may cease to be accepted, or alternatively the state of accepting input operations may be continued. Even when a specific input operation by only one hand is used, accepting input operations only while the left and right touch sensors 21 are gripped with both hands contributes to safe driving.
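The both-hands acceptance condition, including the two alternative behaviors when one hand leaves the sensor, can be sketched as follows. This is an illustrative state machine; the class and flag names are hypothetical:

```python
class TwoHandGate:
    """Accept input only once both the left and right touch sensors 21L/21R
    are gripped. `continue_on_release` selects between the two behaviors
    described above when one hand later leaves its sensor."""

    def __init__(self, continue_on_release=False):
        self.continue_on_release = continue_on_release
        self.accepting = False

    def update(self, left_gripped, right_gripped):
        if left_gripped and right_gripped:
            self.accepting = True          # both hands on: accept input
        elif not self.continue_on_release:
            self.accepting = False         # one hand off: stop accepting
        return self.accepting
```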
When the driver is gripping the annular portion 200r for normal driving with no intention of operating the operation target device, the probability of accidentally producing a specific pattern combining input operations by the thumb and the index finger, or a specific pattern combining input operations by the left and right hands, is considered comparatively low. Therefore, when only such combined patterns are used, some or all of the measures against erroneous operations described above may be omitted. Of course, even when only specific patterns combining input operations by the left and right hands are used, it is preferable to adopt the measures against erroneous operations described above as well.
The storage unit 18 stores a table that associates the specific input operations described above, or patterns combining specific input operations, with types of control of the operation target device. Following the table stored in the storage unit 18, the control unit 10 controls the operation target device according to the operation input to the touch sensor 21. The storage unit 18 may be provided within the control unit 10.
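Such a table can be sketched as a simple mapping from recognized gestures to control commands. The device names and command strings below are purely illustrative; the patent does not specify the table's contents:

```python
# Hypothetical contents of the table in the storage unit 18: each specific
# input operation maps to a (device, command) control type.
OPERATION_TABLE = {
    "DR": ("audio", "next_track"),
    "DL": ("audio", "previous_track"),
    "DU": ("audio", "volume_up"),
    "DD": ("audio", "volume_down"),
    "tap": ("navigation", "repeat_guidance"),
}

def control_for(gesture):
    """Look up the (device, command) pair for a recognized gesture; an
    unregistered gesture yields None and no device is controlled."""
    return OPERATION_TABLE.get(gesture)
```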
The processing executed by the control unit 10 in this embodiment will now be described again with reference to the flowchart of FIG. 14. In step S1, the control unit 10 acquires the sensor data output from the sensor data generation unit 22. In step S2, the control unit 10 determines, based on the detection output of the detection unit 10a, whether the annular portion 200r is being gripped. If it is determined that the annular portion 200r is being gripped (YES), the control unit 10 proceeds to step S3; if not (NO), the control unit 10 returns to step S1.
In step S3, the control unit 10 determines, based on the detection output of the detection unit 10a, whether an input operation has been performed. If it is determined that an input operation has been performed (YES), the control unit 10 proceeds to step S4; if not (NO), it returns to step S1. In step S4, the control unit 10 determines whether to permit an operation on the operation target device in response to the input operation of step S3. If it is determined that the operation is permitted (YES), the control unit 10 proceeds to step S5; if not (NO), it returns to step S1.
As described above, the control unit 10 permits an operation on the operation target device when a specific input operation is performed within the operation detection area Arv, and does not permit an operation on the operation target device even if a specific input operation is performed within the operation invalid area Ariv. Further, even if some input operation is performed within the operation detection area Arv, the control unit 10 does not permit an operation on the operation target device unless it is one of the specific input operations described above; it permits the operation only when a specific input operation is performed.
In step S5, the control unit 10 confirms the operation based on the input operation, and in step S6 it executes control of the operation target device according to the confirmed operation, after which it returns the processing to step S1.
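The flow of FIG. 14 (steps S1 to S6) can be sketched as a single processing pass. The objects `sensor`, `detector`, and `target` are hypothetical stand-ins for the sensor data generation unit 22, the detection unit 10a, and the operation target device; their method names are illustrative:

```python
def process_once(sensor, detector, target):
    """One pass of the FIG. 14 flowchart; returns True if a control was
    executed, False if processing fell back to step S1."""
    data = sensor.read()                      # S1: acquire sensor data
    if not detector.is_gripped(data):         # S2: annular portion gripped?
        return False
    op = detector.input_operation(data)       # S3: any input operation?
    if op is None:
        return False
    if not detector.is_permitted(op):         # S4: specific op in area Arv?
        return False
    command = detector.confirm(op)            # S5: confirm the operation
    target.execute(command)                   # S6: control the target device
    return True
```

In a real controller this pass would run in a loop, returning to S1 after every outcome.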
The operation of this embodiment can be summarized as follows. The detection unit 10a (first detection unit) detects that a first area of the touch sensor 21, which is mounted on the gripping portion (the annular portion 200r or the straight portion 201s) gripped by the driver of the steering wheel 200 or 201, is in a touched state. An example of the first area is the grip detection area Arg. With the first area touched, the detection unit 10a (second detection unit) detects that a specific input operation has been performed on a second area of the touch sensor 21 located above the first area. An example of the second area is the operation detection area Arv. During driving, the thumb and index finger are positioned above the palm, so the area above the first area may be used as the second area. When the first area is in a touched state and a specific input operation is performed, the operation target device operated via the touch sensor 21 is controlled according to the specific input operation.
Here, the area located above means the area located above the first area when the driver grips the gripping portion with the steering wheel 200 not rotated. It is preferable to regard the first area as being in a touched state when it is detected that the first area is touched over at least a predetermined area.
From another viewpoint, the operation is as follows. The detection unit 10a (first detection unit) detects that the touch sensor 21, which is mounted over a predetermined range of the gripping portion (the annular portion 200r or the straight portion 201s) gripped by the driver of the steering wheel 200 or 201 so as to cover the gripping portion, is being touched, in a first area on the touch sensor 21, over at least a predetermined circumferential angle in the cross section obtained by cutting the gripping portion in the radial direction of the steering wheel 200 or 201. With the first area touched over at least the predetermined angle, the detection unit 10a (second detection unit) detects that a specific input operation has been performed on a second area of the touch sensor 21 different from the first area. When the first area is touched over at least the predetermined angle and a specific input operation is performed, the operation target device operated via the touch sensor 21 is controlled according to the specific input operation.
The second area is preferably an area located above the first area, where the area located above means the area above the first area when the driver grips the gripping portion with the steering wheel 200 not rotated.
FIGS. 15 and 16 show configuration examples for effectively informing the driver that the touch sensor 21 has been operated. Like FIG. 9, FIGS. 15 and 16 are schematic diagrams in which the touch sensor 21 is opened out and shown as a rectangle. FIG. 15 is an example in which a color change sheet 41 containing a coloring substance is provided on the lower surface side of the touch sensor 21. By using a transparent conductive film for the touch sensor 21, the driver can see, through the touch sensor 21, the color of the color change sheet 41 arranged on its lower surface. By changing, under the control of the control unit 10, the color of the color change sheet 41 at the portion where the touch sensor 21 was touched, the driver can recognize that the touch sensor 21 has been operated.
FIG. 16 is an example in which a tactile feedback sheet 42 that changes the tactile sensation (feel) is provided on the upper surface side of the touch sensor 21. As the tactile feedback sheet 42, for example, a sheet called "E-Sense" developed by Senseg of Finland can be employed. This sheet realizes tactile feedback by electrically charging a film. Even with the tactile feedback sheet 42 provided on its upper surface side, the touch sensor 21 can still detect contact by a finger or the like. When the touch sensor 21 is operated through the tactile feedback sheet 42, the tactile sensation of the tactile feedback sheet 42 is changed under the control of the control unit 10, allowing the driver to recognize that the touch sensor 21 has been operated.
Next, an embodiment of a steering wheel will be described with reference to FIG. 17. The steering wheel 210 of the embodiment shown in FIG. 17 is configured so that control signals for the operation target device are output from the steering wheel 210 itself. In FIG. 17, parts identical to those in FIGS. 1 and 2 are given the same reference numerals, and their description is omitted as appropriate. As shown in FIG. 17, the steering wheel 210 includes, for example in a portion other than the annular portion 200r, a sensor data generation unit 23 similar to the sensor data generation unit 22 of FIG. 1, and a control unit 24 similar to the control unit 10. The control unit 24 has a detection unit 24a similar to the detection unit 10a, and a control signal generation unit 24b.
When the steering wheel 210 is mounted on a vehicle, the control signal generation unit 24b generates a control signal for controlling the operation target device in response to a specific input operation on the touch sensor 21. The control signal output from the control signal generation unit 24b is output to the output terminal 26 via the cable 25. If the output terminal 26 is connected to the operation target device, the operation target device can be controlled by the control signal. Examples of the specific input operations are the same as in FIGS. 11 to 13, and the conditions under which the control signal generation unit 24b generates the control signal are also as described above.
The present invention is not limited to the embodiment described above, and various modifications are possible without departing from the gist of the invention. The touch sensor 21 may be detachably attached to the annular portion 200r using a hook-and-loop fastener. Although the annular portion 200r serves as the gripping portion here, the gripping portion need not necessarily be annular.
The touch sensor 21 need not be formed of a single sheet; it may be composed of a plurality of touch sensor pieces. Composing the touch sensor 21 of a plurality of pieces allows each piece to have a simple shape, which is advantageous in manufacturing the touch sensors. When the touch sensor 21 is composed of a plurality of pieces, the pieces do not necessarily have to be arranged without gaps. The touch sensor 21 of the present embodiment is mounted so as to cover the gripping portion; in this embodiment, being mounted so as to cover the gripping portion also includes the case where a touch sensor 21 composed of a plurality of touch sensor pieces is mounted so as to cover the gripping portion with gaps between the pieces.
Furthermore, the range over which the touch sensor 21 is provided is not limited to the gripping portion that the driver grips while driving (the annular portion 200r or the straight portion 201s); it may extend, for example, to the surface of the connecting portion that links the annular portion 200r with the center portion housing the airbag and the like. In FIG. 2, the connecting portion is the part located between the left and right hands in the illustrated state; in FIG. 21, it is the part where the sensor data generation unit 23 and the control unit 24 are provided. The touch sensor 21 may thus be extended to the surface of the connecting portion, and the operation detection area Arv may be set at a position on the connecting portion close to the gripping portion. As long as the position is close to the gripping portion, the driver can operate the operation target device while driving without releasing the gripping portion or shifting the hand greatly. Therefore, even with the touch sensor 21 extended to the surface of the connecting portion, there is little possibility of it interfering with the driver's operation of the steering wheels 200, 201, and 210.
<Second Embodiment>
A second embodiment of the control device and control method for the operation target device in a vehicle will be described. The basic configuration and operation of the second embodiment are the same as those of the first embodiment, and only the differences from the first embodiment will be described.
In step S4 described above, it is preferable not to permit (that is, to invalidate) control of the operation target device when the vehicle is in a specific state. A rotation angle of the steering wheel 200 for determining whether to permit control of the operation target device is set in the control unit 10. As shown in FIG. 18, the state in which the steering wheel 200 is not rotated is defined as a rotation angle of 0°, rotation to the right is taken as positive, and rotation to the left as negative. As one example, input operations on the touch sensor 21 are permitted as valid within a rotation angle range of ±30°, and when the rotation angle exceeds ±30°, input operations on the touch sensor 21 are invalidated and not permitted.
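As a minimal sketch of this angle gating (the function name is hypothetical and the ±30° default is the example value given above, not a fixed requirement):

```python
STEERING_ANGLE_LIMIT_DEG = 30.0  # example threshold from the text (±30°)

def touch_input_enabled(steering_angle_deg: float) -> bool:
    """Return True while the steering wheel stays within the permitted window.

    Positive angles denote rightward rotation and negative angles leftward
    rotation, matching the sign convention described above; input operations
    on the touch sensor are treated as valid only inside the ±30° range.
    """
    return abs(steering_angle_deg) <= STEERING_ANGLE_LIMIT_DEG
```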
When the rotation angle exceeds ±30°, the vehicle is in a specific state such as making a right or left turn or cornering. Attempting to control the operation target device in such a specific state is highly likely to result in an erroneous operation. In other words, an operation input made in such a specific state is highly likely to be one not intended by the user. It is also undesirable from a safety standpoint. Therefore, in the present embodiment, control of the operation target device is invalidated while the vehicle is in a specific state.
As described above, the rotation angle of the steering wheel 200 detected by the steering angle sensor 31 is input to the control unit 10. The control unit 10 switches between a state in which input operations on the touch sensor 21 are valid and a state in which they are invalid, according to the rotation angle of the steering wheel 200 detected by the steering angle sensor 31. A detection signal from the direction indicator sensor 32 is also input to the control unit 10. Accordingly, the control unit 10 may invalidate input operations on the touch sensor 21 when the detection signal from the direction indicator sensor 32 indicates that the direction indicator 320 has been operated. When the direction indicator 320 is operated, it can be regarded that the steering wheel 200 will enter the specific state of being rotated beyond the predetermined rotation angle. Although the direction indicator 320 may also serve for operations other than signaling a right or left turn, the operation of the direction indicator 320 referred to here is the operation of signaling a right or left turn.
Furthermore, it is undesirable to control the operation target device by performing a specific input operation on the touch sensor 21 while the vehicle is reversing; the possibility of an erroneous operation is high, and it is also undesirable from a safety standpoint. Therefore, it is preferable that the control unit 10 also invalidates input operations on the touch sensor 21 when the detection signal from the shift lever sensor 33 indicates that the shift position of the shift lever 330 is in reverse. It is preferable to invalidate control of the operation target device when the shift position of the shift lever 330 is in reverse, in addition to when the rotation angle of the steering wheel 200 exceeds a predetermined angle of, for example, ±30° or when the direction indicator 320 is operated; however, only one of these conditions may be used.
Note that invalidating control of the operation target device may mean that, even if the specific input operation described above is performed, the specific input operation is invalidated and control of the operation target device is not permitted, or it may mean that, even if some sensor data is input from the sensor data generation unit 22 to the control unit 10, the control unit 10 invalidates the sensor data. It suffices that, as a result, control of the operation target device is invalidated.
An example of the specific processing of step S4 in FIG. 14 will be described using the flowchart of FIG. 19. In FIG. 19, the control unit 10 determines in step S41 whether the shift position of the shift lever 330 is in reverse. If the shift position is in reverse (YES), the control unit 10 disallows the input operation of step S3 in step S45 and returns the processing to step S1 of FIG. 14. If the shift position is not in reverse (NO), the control unit 10 determines in step S42 whether the direction indicator 320 has been operated. If the direction indicator 320 has been operated (YES), the control unit 10 disallows the input operation of step S3 in step S45 and returns the processing to step S1 of FIG. 14.
If the direction indicator 320 has not been operated (NO), the control unit 10 determines in step S43 whether the rotation angle of the steering wheel 200 exceeds the predetermined angle. If the rotation angle of the steering wheel 200 exceeds the predetermined angle (YES), the control unit 10 disallows the input operation of step S3 in step S45 and returns the processing to step S1 of FIG. 14. If the rotation angle of the steering wheel 200 does not exceed the predetermined angle (NO), the control unit 10 determines in step S44 whether a specific input operation has been performed within the operation detection area Arv. If no specific input operation has been performed (NO), the control unit 10 disallows the input operation of step S3 in step S45 and returns the processing to step S1 of FIG. 14. Disallowing the input operation in step S45 invalidates control of the operation target device. If a specific input operation has been performed (YES), the control unit 10 permits the input operation of step S3 in step S46 and advances the processing to step S5 of FIG. 14.
In the example shown in FIG. 19, all of steps S41, S42, and S43 are provided, but only one or two of these steps may be provided. When all of steps S41, S42, and S43, or two of them, are provided, their order is arbitrary. Although the term shift lever 330 is used here, the shape of the operation unit for switching the vehicle between forward and reverse travel and for changing the gear ratio of the transmission is arbitrary, and may be any of a floor shift, a column shift, a paddle shift, and the like; all of these are included in the term shift lever.
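The S41 through S44 decision sequence of FIG. 19 can be sketched as follows (a hypothetical illustration, not the patented implementation; the `VehicleState` field names and the 30° default are assumptions drawn from the examples above):

```python
from dataclasses import dataclass

@dataclass
class VehicleState:
    shift_reverse: bool              # shift position is reverse (step S41)
    turn_signal_active: bool         # direction indicator operated (step S42)
    steering_angle_deg: float        # signed steering-wheel angle (step S43)
    gesture_in_detection_area: bool  # specific input made inside Arv (step S44)

def step_s4_permit(state: VehicleState, angle_limit_deg: float = 30.0) -> bool:
    """Mirror the checks of FIG. 19: any failing check disallows the input
    operation (step S45); the operation is permitted (step S46) only when a
    specific input is made while the vehicle is not in a 'specific state'."""
    if state.shift_reverse:                              # S41: reversing
        return False
    if state.turn_signal_active:                         # S42: turn signal
        return False
    if abs(state.steering_angle_deg) > angle_limit_deg:  # S43: angle exceeded
        return False
    return state.gesture_in_detection_area               # S44: gesture check
```

As noted above, any one or two of the S41 to S43 checks could be dropped, and their order is interchangeable.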
As described above, if control of the operation target device is invalidated while the vehicle is in a specific state such as turning right, turning left, cornering, or reversing, the driver will not perform input operations on the touch sensor 21 in these specific states. This therefore contributes to improved safety.
<Third Embodiment>
A third embodiment of the control device and control method for the operation target device in a vehicle will be described. The basic configuration and operation of the third embodiment are the same as those of the first embodiment, and only the differences from the first embodiment will be described.
The third embodiment is configured so that, when the driver is gripping the annular portion 200r for normal driving and has no intention of operating the operation target device, input operations on the touch sensor 21 by a finger are not accepted, and when the driver attempts to operate the operation target device, input operations on the touch sensor 21 by a finger are accepted.
When the driver attempts to perform a specific input operation on the touch sensor 21 with the thumb or index finger, it is considered that the driver intentionally or unconsciously changes the way of gripping the annular portion 200r (touch sensor 21) so that the thumb or index finger can move easily. FIG. 20 shows an example of the states of the palm contact detection portion Tp and the thumb contact detection portion Tt when the driver is gripping the annular portion 200r for normal driving. Like FIG. 9, FIG. 20 is a schematic diagram in which each area of the touch sensor 21 is converted to an equal size. As shown in FIG. 20, the palm contact detection portion Tp occupies a relatively large area, and the thumb contact detection portion Tt is located close to the palm contact detection portion Tp. Although the index finger contact detection portion Ti is omitted from FIG. 20, it is likewise located close to the palm contact detection portion Tp.
FIG. 21 shows an example of the states of the palm contact detection portion Tp and the thumb contact detection portion Tt when the driver attempts to operate the operation target device. As shown in FIG. 21, the area of the palm contact detection portion Tp is smaller than in FIG. 20, and the thumb contact detection portion Tt is located away from the palm contact detection portion Tp. Although the index finger contact detection portion Ti is also omitted from FIG. 21, it too is located away from the palm contact detection portion Tp. As can be seen by comparing FIGS. 20 and 21, the area of the palm contact detection portion Tp differs greatly between the state in which the driver is simply gripping the annular portion 200r for normal driving and the state in which the driver attempts to perform a specific input operation on the touch sensor 21 with the thumb or index finger to operate the operation target device. Note that the palm contact detection portion Tp may include portions where the middle finger, ring finger, and little finger (and in some cases the index finger as well) are in contact.
In FIG. 20, the portion of the palm contact detection portion Tp at X coordinate 8, Y coordinates 4 to 8, is where the tips of the middle finger, ring finger, and little finger are touching. In FIG. 21, the portion touched by the tips of these fingers has moved to X coordinate 5, Y coordinates 4 to 8. This means that the positions of the tips of the middle finger, ring finger, and little finger have shifted toward the back side of the annular portion 200r. The state of attempting to operate the operation target device can therefore be determined from the change in the position of the circumferential end of the palm contact detection portion Tp in the cross section of the annular portion 200r, taken together with the change in the area of the palm contact detection portion Tp. It is also possible to determine this state from the change in the position of the circumferential end of the palm contact detection portion Tp in the cross section of the annular portion 200r alone.
If the area of the palm contact detection portion Tp is a first area equal to or larger than a predetermined area, as in FIG. 20, the control unit 10 determines that the driver is gripping the annular portion 200r for normal driving, and sets a state in which input operations on the touch sensor 21 by a finger are not accepted. When the area of the palm contact detection portion Tp narrows by a predetermined ratio or more compared with FIG. 20 to become a second area, as in FIG. 21, the control unit 10 determines that the driver is attempting to operate the operation target device, and sets a state in which input operations on the touch sensor 21 by a finger are accepted.
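A minimal sketch of this area-based switching (hypothetical names; the 0.6 shrink ratio is an assumed illustrative value, since the text only requires "a predetermined ratio"):

```python
def finger_input_accepted(palm_area: float, normal_grip_area: float,
                          shrink_ratio: float = 0.6) -> bool:
    """Accept finger input only when the palm-contact area has shrunk.

    normal_grip_area : palm area observed for the normal driving grip
                       (the first area, FIG. 20)
    shrink_ratio     : fraction of the reference area below which the grip
                       is judged to have shifted for an input operation
                       (the second area, FIG. 21); assumed value
    """
    return palm_area < normal_grip_area * shrink_ratio
```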
In the configuration example in which the normal driving state and the state of attempting to operate the operation target device are distinguished by the change in the area of the palm contact detection portion Tp, the operation invalid area Ariv need not be provided. FIG. 22 shows a state in which the operation invalid area Ariv is omitted and either the grip detection area Arg and the operation detection area Arv are preset on the touch sensor 21, or the control unit 10 sets the grip detection area Arg and the operation detection area Arv on the touch sensor 21. Like FIG. 20, FIG. 22 shows the state in which the driver is gripping the annular portion 200r for normal driving. As shown in FIG. 22, even when the operation invalid area Ariv is not provided, the control unit 10 can distinguish the two states described above, so erroneous operations can be avoided.
The area of the palm contact detection portion Tp shown in FIG. 20 and the area of the palm contact detection portion Tp shown in FIG. 21 may be registered in advance in the control unit 10 or the storage unit 18, and used to switch between the state of accepting input operations on the touch sensor 21 and the state of not accepting them. Of course, since the area of the palm contact detection portion Tp is not always constant, a tolerance for deviation in area is set. Instead of, or in addition to, the change in the area of the palm contact detection portion Tp, a change in its shape may be detected. Furthermore, a change in the angle θ of the palm contact detection portion Tp in the circumferential direction of the cross section shown in FIG. 10, or a change in the maximum length of the palm contact detection portion Tp in the X coordinate direction, may be detected.
Note that the state of not accepting input operations may mean that, even if a specific input operation is performed, the specific input operation is invalidated and control of the operation target device is not permitted, or it may mean that, even if some sensor data is input from the sensor data generation unit 22 to the control unit 10, the control unit 10 invalidates the sensor data. It suffices that, as a result, control of the operation target device is invalidated.
<Fourth Embodiment>
A fourth embodiment of the control device and control method for the operation target device in a vehicle will be described. The basic configuration and operation of the fourth embodiment are the same as those of the first embodiment, and only the differences from the first embodiment will be described.
A configuration example of the fourth embodiment, which accurately distinguishes between the case where the driver is simply gripping the annular portion 200r and the case where the driver touches the touch sensor 21 in an attempt to operate the operation target device, will be described with reference to FIG. 23. Like FIG. 8, FIG. 23 shows the touch sensor 21 in an unrolled state. In the configuration example shown in FIG. 23, the operation invalid area Ariv is omitted. When the driver is simply gripping the annular portion 200r, the thumb contact detection portion Tt and the index finger contact detection portion Ti are considered to be relatively close to the palm contact detection portion Tp. The thumb contact detection portion and the index finger contact detection portion when the driver is not attempting to operate the operation target device and is simply gripping the annular portion 200r are denoted Tt0 and Ti0, respectively.
FIG. 23 shows that, in the state where the driver is simply gripping the annular portion 200r and not attempting to operate the operation target device, the thumb contact detection portion Tt0 and the index finger contact detection portion Ti0 are detected, and that when the driver attempts to operate the operation target device, the contacts move to the thumb contact detection portion Tt and the index finger contact detection portion Ti at positions farther from the palm contact detection portion Tp. In FIG. 23, the X coordinates of the thumb contact detection portions Tt0 and Tt are the same, and the X coordinates of the index finger contact detection portions Ti0 and Ti are the same, but the X coordinates may shift. Even in that case, only the movement in the Y coordinate need be considered.
The control unit 10 stores, as a reference distance, the distance α1 between the end of the palm contact detection portion Tp on the Tt0 side and the end of the thumb contact detection portion Tt0 on the Tp side, measured in the state where the driver is gripping the annular portion 200r normally. The control unit 10 may itself serve as the storage unit for the reference distance α1, or the reference distance α1 may be stored in the storage unit 18.
In the state where the driver attempts to operate the operation target device, the distance between the end of the palm contact detection portion Tp on the Tt side and the end of the thumb contact detection portion Tt on the Tp side becomes, for example, a distance α2 longer than the reference distance α1. When the control unit 10 detects the thumb contact detection portion Tt at a position whose distance from the end of the palm contact detection portion Tp exceeds the reference distance α1 by a predetermined distance or more, it determines that the driver is attempting to operate the operation target device. The control unit 10 then accepts, as valid, the thumb input operation detected at the thumb contact detection portion Tt in this state.
Although FIG. 23 shows only the distances α1 and α2 between the palm contact detection portion Tp and the thumb contact detection portions Tt0 and Tt, the distance between the palm contact detection portion Tp and the index finger contact detection portion Ti0 may likewise be stored, and the position of the index finger contact detection portion Ti when the driver attempts to operate the operation target device may be judged in the same way. That is, in the state where the driver is gripping the annular portion 200r and not performing an input operation, a reference distance is stored between the palm contact detection portion Tp, where the driver's palm contacts the touch sensor 21, and the finger contact detection portion (thumb contact detection portion Tt0 or index finger contact detection portion Ti0), where the finger performing the input operation (thumb or index finger) contacts the touch sensor 21; a finger input operation is then made valid in the state where the distance between the palm contact detection portion Tp and the finger contact detection portion is longer than the reference distance by a predetermined distance.
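This reference-distance test can be sketched as follows (assumed names; distances are in sensor grid cells, and the `margin` value is illustrative since the text only requires "a predetermined distance"):

```python
def finger_operation_valid(palm_edge: int, finger_edge: int,
                           ref_distance: int, margin: int = 2) -> bool:
    """Validate a finger input when the finger-contact patch (Tt or Ti) has
    moved away from the palm-contact patch Tp by at least `margin` cells
    more than the stored reference distance (alpha1 in FIG. 23).

    palm_edge / finger_edge : Y coordinates of the facing patch edges
    ref_distance            : separation recorded during a normal grip
    """
    distance = abs(finger_edge - palm_edge)  # current separation (e.g. alpha2)
    return distance >= ref_distance + margin
```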
Although the operation invalid area Ariv is omitted in FIG. 23, it may be provided. When the operation invalid area Ariv is provided, its extent may be made shorter than in FIG. 8.
<Fifth Embodiment>
A fifth embodiment of the control device and control method for the operation target device in a vehicle will be described. The basic configuration and operation of the fifth embodiment are the same as those of the first embodiment, and only the differences from the first embodiment will be described. The fifth embodiment is yet another configuration example for reducing erroneous operations.
In connection with FIG. 13, it was explained that a pattern combining operations by the left and right hands can serve as the specific input operation for controlling the operation target device. To further reduce erroneous operations, the control unit 10 may validate an input operation when the detection unit 10a detects that the same input operation has been performed with both the left and right hands. It is preferable that the control unit 10 validates the input operation when the same input operation is performed with the left and right hands at the same timing.
Examples of input operations in which the same input operation is performed with the left and right hands at the same timing will be described with reference to FIGS. 24(a) to 24(c). FIG. 24(a), like FIG. 13(a), shows a case where a leftward drag DTL, sliding the thumb leftward on the left touch sensor 21L, and a rightward drag DTR, sliding the thumb rightward on the right touch sensor 21R, are performed at the same timing.
In the example shown in FIG. 24(a), both the left and right thumbs are dragged from the inner circumferential side toward the outer circumferential side of the annular portion 200r, and this is regarded as the same input operation. Of course, the same input operation may instead be defined as the case where a rightward drag DTR sliding the thumb rightward on the left touch sensor 21L and a rightward drag DTR sliding the thumb rightward on the right touch sensor 21R are performed at the same timing, or the case where a leftward drag DTL sliding the thumb leftward on the left touch sensor 21L and a leftward drag DTL sliding the thumb leftward on the right touch sensor 21R are performed at the same timing. However, it is preferable to treat bilaterally symmetrical input operations such as those shown in FIG. 24(a) as the same input operation.
In the example shown in FIG. 24(b), like FIG. 13(d), a downward drag DTD sliding the thumb downward is performed on both the left touch sensor 21L and the right touch sensor 21R at the same timing. The same input operation may also be defined as the case where an upward drag DTU sliding the thumb upward is performed on both the left touch sensor 21L and the right touch sensor 21R at the same timing. When dragging a finger in the vertical direction, it is preferable to define the same input operation as dragging in the same direction on both sides, rather than in mirror symmetry.
In FIGS. 24(a) and 24(b), the input operations are performed with the thumb, but the index finger may be used instead.
The example shown in FIG. 24(c) is a case where a tap T, striking the touch sensor 21 with the thumb or index finger, is performed on both the left touch sensor 21L and the right touch sensor 21R at the same timing.
The control unit 10 determines that the timings are the same, for example, in the following cases. In the case of a drag, as shown in FIG. 25(a), the timings can be regarded as the same when the period TML from the start timing t1 to the end timing t3 of the drag by the left-hand finger and the period TMR from the start timing t2 to the end timing t4 of the drag by the right-hand finger overlap for a predetermined time (a predetermined proportion) or more. Alternatively, as shown in FIG. 25(b), a predetermined time TMP1 may be measured from the start timing t1 of the drag that began first, for example the drag by the left-hand finger, and the timings can be regarded as the same when the drag by the right-hand finger is performed within the time TMP1. The criterion for judging the timings to be the same may be set as appropriate.
 Since the left and right fingers never perform exactly identical input operations, an allowable range is provided within which the operations are regarded as the same. For drags, the operations are regarded as the same input operation if the directions in which the fingers slide match within a predetermined allowable range. For taps T, the operations can be regarded as the same input operation if the taps T occur at the same location; the locations can be treated as the same when both taps T land on the front portion 21f or both land on the rear portion 21r. That is, when both hands tap with the thumbs at the same timing, or both hands tap with the index fingers at the same timing, it can be regarded that the same input operation has been performed at the same location and the same timing.
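The allowable-range idea can be illustrated like this; the ±30° drag tolerance and the "front"/"rear" face labels are assumptions chosen for the sketch, not values from the specification.

```python
def same_drag(angle_l_deg, angle_r_deg, tol_deg=30.0):
    # Two drags count as the same operation when their directions agree
    # within the allowable range (tolerance assumed to be +/-30 degrees).
    diff = abs((angle_l_deg - angle_r_deg + 180.0) % 360.0 - 180.0)
    return diff <= tol_deg

def same_tap(face_l, face_r):
    # Two taps count as the same operation when both land on the same
    # face of the sensor ("front" for 21f, "rear" for 21r).
    return face_l == face_r
```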
 Next, to reduce erroneous operations, the control unit 10 may set an acceptance mode in which the specific input operations to the touch sensor 21 described above are accepted, so that the driver enters the acceptance mode intentionally. When shifting from the non-acceptance state to the acceptance mode, it is also necessary to avoid shifting to the acceptance mode unintentionally by accident. Therefore, when the detection unit 10a detects that the same input operation has been performed on the touch sensor 21 with both hands, the control unit 10 shifts from the state in which the specific input operations are not accepted to the state in which they are accepted (the acceptance mode). The same input operation is as described with reference to Fig. 24. In this case as well, as described with reference to Fig. 25, it is preferable to shift to the acceptance mode when it is detected that the same input operation has been performed at the same timing.
 Furthermore, after one of the specific input operations described with reference to Figs. 11 to 13 has been performed, the operation target device may be controlled in accordance with that specific input operation only once an operation confirming it has been performed. When the detection unit 10a detects that a specific input operation described with reference to Figs. 11 to 13 (a first specific input operation) has been performed, and thereafter detects that a specific input operation defined as the same input operation as described with reference to Fig. 24 (a second specific input operation) has been performed, the control unit 10 confirms the first specific input operation entered immediately before. Preferably, as described with reference to Fig. 25, the control unit 10 confirms the immediately preceding first specific input operation when it is detected that the specific input operations defined as the same input operation have been performed at the same timing.
 Alternatively, as shown in Fig. 26, the acceptance mode may be entered when, for example, it is detected that an upward drag DIU with the left index finger is followed, within a predetermined time, by an upward drag DTU with the right thumb. As shown in Fig. 27, if the interval between the time TML of the upward drag DIU with the left index finger and the upward drag DTU with the right thumb is within a predetermined time TMP2, the control unit 10 treats the upward drag DIU and the upward drag DTU as a continuous input operation and enters the acceptance mode.
 Furthermore, the operation target can be switched according to the pattern of input operations by the left and right hands. For example, as shown in Fig. 28, when the detection unit 10a detects that an upward drag DIU, sliding the index finger upward on the left touch sensor 21L, is followed by two consecutive taps T striking the right touch sensor 21R with the thumb, the control unit 10 enters an audio operation mode for operating the audio playback unit 12. The control unit 10 sets the audio playback unit 12 of the in-vehicle device 100 as the target to be operated by the specific input operations.
 Similarly, as shown in Fig. 29, when the detection unit 10a detects that an upward drag DTU, sliding the thumb upward on the left touch sensor 21L, is followed by two consecutive taps T striking the right touch sensor 21R with the thumb, the control unit 10 enters a navigation operation mode for operating the navigation processing unit 11. The control unit 10 sets the navigation processing unit 11 of the in-vehicle device 100 as the target to be operated by the specific input operations. These combinations of input operations are merely examples; the invention is not limited to Figs. 28 and 29.
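One way to realize such pattern-based switching is a lookup table keyed by the recognized event sequence. The event names and the two entries below merely restate Figs. 28 and 29 in illustrative form; they are not identifiers from the specification.

```python
# Recognized two-hand gesture sequences mapped to an operation target.
PATTERNS = {
    ("L:index_drag_up", "R:thumb_tap", "R:thumb_tap"): "audio",       # Fig. 28
    ("L:thumb_drag_up", "R:thumb_tap", "R:thumb_tap"): "navigation",  # Fig. 29
}

def select_target(events):
    # Return the operation target for a recognized sequence, else None.
    return PATTERNS.get(tuple(events))
```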
 In the configuration examples described with reference to Figs. 24 to 29 as well, it is preferable to require that the driver is gripping the annular portion 200r (touch sensor 21).
<Sixth Embodiment>
 A sixth embodiment of the control device and control method for an operation target device in a vehicle will be described. The basic configuration and operation of the sixth embodiment are the same as those of the first embodiment, and only the differences will be described. The sixth embodiment is yet another configuration example for reducing erroneous operations.
 Fig. 30(a) shows an example in which the grip detection area Arg, the operation invalid area Ariv, and the operation detection area Arv of the touch sensor 21 are color-coded. The areas may be color-coded by applying paint, for example, or by affixing sheets of the respective colors. It is also effective to color-code the touch sensor 21 and the portion of the annular portion 200r other than the touch sensor 21; in this case, a color may be applied either to the touch sensor 21 or to the portion other than the touch sensor 21. The portion other than the touch sensor 21, the grip detection area Arg, the operation invalid area Ariv, and the operation detection area Arv may each be given a different color.
 Fig. 30(b) shows an example without the operation invalid area Ariv, in which the grip detection area Arg and the operation detection area Arv are color-coded. If, in addition, the touch sensor 21 and the portion of the annular portion 200r other than the touch sensor 21 are color-coded, the driver can clearly and immediately see the position of the touch sensor 21, which is preferable. Color-coding each area as in Figs. 30(a) and 30(b) is even more preferable because the driver can clearly and immediately see the position of each area of the touch sensor 21. In Figs. 30(a) and 30(b), the color change sheet 41 described above can also be used.
 When the positions of the grip detection area Arg and the operation detection area Arv are set dynamically according to the position at which the driver grips the touch sensor 21, the color change sheet 41 can be used as follows. After the driver grips the touch sensor 21 portion of the annular portion 200r, the control unit 10 sets the grip detection area Arg and the operation detection area Arv on the touch sensor 21. Then, after setting the grip detection area Arg and the operation detection area Arv, the control unit 10 color-codes them. In any of these configurations, the areas may be color-coded either by coloring each of them or by coloring only some of them so that they are distinguished as a result.
 Fig. 31(a) shows an example in which markers M1 and M2 of a predetermined color are applied to the boundaries between the grip detection area Arg, the operation invalid area Ariv, and the operation detection area Arv. The markers M1 and M2 are an example of boundary identification means for identifying the boundaries, and may be provided by paint or stickers, for example. Fig. 31(b) shows an example without the operation invalid area Ariv, in which a marker M3 of a predetermined color is applied to the boundary between the grip detection area Arg and the operation detection area Arv. Indicating the boundary positions as in Figs. 31(a) and 31(b) is preferable because the driver can clearly and immediately see the position of each area of the touch sensor 21.
 Fig. 32 shows an example in which the diameter of the annular portion 200r in the operation detection area Arv is made smaller than the diameter of the annular portion 200r in the grip detection area Arg; the operation invalid area Ariv is not provided in this example. The diameter need only be reduced to the extent that the driver can recognize the operation detection area Arv by touch without being hindered in operating the steering wheel 200. It is preferable to change the diameter gradually at the boundary between the operation detection area Arv and the grip detection area Arg.
 Fig. 33 shows an example in which the diameter of the annular portion 200r in the operation detection area Arv is made larger than the diameter of the annular portion 200r in the grip detection area Arg; the operation invalid area Ariv is not provided in this example. The diameter need only be increased to the extent that the driver can recognize the operation detection area Arv by touch without being hindered in operating the steering wheel 200. It is preferable to change the diameter gradually at the boundary between the operation detection area Arv and the grip detection area Arg. In the configuration examples of Figs. 32 and 33, the diameter of the annular portion 200r changes at the boundary between the grip detection area Arg and the operation detection area Arv; this change in diameter can be interpreted as an example of boundary identification means for physically identifying the boundary.
 Fig. 34 shows an example in which recesses B1 and B2 are provided at the boundaries between the grip detection area Arg, the operation invalid area Ariv, and the operation detection area Arv. The recesses B1 and B2 allow the driver to see the position of each area and to recognize it by touch when gripping the touch sensor 21. Although not shown, when the operation invalid area Ariv is not provided, a recess need only be provided at the boundary between the grip detection area Arg and the operation detection area Arv. The touch sensor 21 may or may not be divided by the recesses B1 and B2. The recesses B1 and B2 are another example of boundary identification means for physically identifying the boundaries.
 Fig. 35 shows an example in which projections B3 and B4 are provided at the boundaries between the grip detection area Arg, the operation invalid area Ariv, and the operation detection area Arv. The projections B3 and B4 allow the driver to see the position of each area and to recognize it by touch when gripping the touch sensor 21. Although not shown, when the operation invalid area Ariv is not provided, a projection need only be provided at the boundary between the grip detection area Arg and the operation detection area Arv. The touch sensor 21 may or may not be divided by the projections B3 and B4. The projections B3 and B4 are still another example of boundary identification means for physically identifying the boundaries.
 Next, a configuration example for further reducing erroneous operations by making the driver accurately perceive the position of the operation detection area Arv when gripping the grip detection area Arg will be described. Fig. 36(a) shows a state in which the driver has not yet gripped the grip detection area Arg; the operation invalid area Ariv is not provided in this example. Fig. 36(b) shows a state in which the driver has gripped the grip detection area Arg. In the configuration example of Figs. 36(a) and 36(b), the color change sheet 41 described above is provided on the lower surface side of the operation detection area Arv. When the detection unit 10a detects that the grip detection area Arg has been gripped, the control unit 10 changes the color of the color change sheet 41 as shown in Fig. 36(b). Since the driver can clearly see the position of the operation detection area Arv, erroneous operations can be further reduced.
 Figs. 37(a) and 37(b) show an example in which the tactile feedback sheet 42 described above is provided on the upper surface side of the operation detection area Arv. Fig. 37(a) shows a state in which the driver has not yet gripped the grip detection area Arg, and Fig. 37(b) shows a state in which the driver has gripped it. When the detection unit 10a detects that the grip detection area Arg has been gripped, the control unit 10 controls the tactile feedback sheet 42 to change its tactile feel, for example to a rough state, as shown in Fig. 37(b). Alternatively, the tactile feedback sheet 42 may be kept rough in the state of Fig. 37(a) and changed to a smooth state when it is detected that the grip detection area Arg has been gripped.
 By changing the tactile feel of the tactile feedback sheet 42, the driver can clearly recognize the position of the operation detection area Arv by touch, so erroneous operations can be further reduced. In a configuration that changes the tactile feel of the tactile feedback sheet 42, there is no need to look at the operation detection area Arv, which contributes to safe driving. The manner in which the tactile feel of the tactile feedback sheet 42 is changed is arbitrary.
 In the configuration examples of Figs. 36 and 37, if the color change sheet 41 or the tactile feedback sheet 42 is provided over the entire touch sensor 21 rather than only in the operation detection area Arv, the color or tactile feel of the operation detection area Arv can be changed even when the positions of the grip detection area Arg, the operation detection area Arv, and the operation invalid area Ariv provided as necessary are set dynamically according to the position at which the driver grips the touch sensor 21.
 In Figs. 36 and 37, only the operation detection area Arv changes in color or tactile feel, but the colors or tactile feels of the grip detection area Arg and the operation detection area Arv may be changed so that the two areas differ from each other. The color or tactile feel of each area may be changed, or the color or tactile feel of only one area may be changed so that the areas differ as a result.
 Although not shown, only the operation detection area Arv may be given in advance a tactile feel different from those of the grip detection area Arg and the operation invalid area Ariv. The operation detection area Arv need only be surface-treated, or have a sheet affixed to it, so as to feel rough, bumpy, smooth, or otherwise different from the grip detection area Arg and the operation invalid area Ariv. This approach cannot accommodate a dynamically set operation detection area Arv, but it is preferable because the driver can clearly recognize the position of the operation detection area Arv by touch.
 As described above with reference to Figs. 30 to 37, the grip detection area Arg and the operation detection area Arv are configured to be distinguishable at least when the detection unit detects that the grip detection area Arg has been gripped. The configurations shown in Figs. 30 to 37 are merely examples; for instance, they may be combined with one another.
 The grip detection area Arg and the operation detection area Arv may be configured to be always distinguishable. However, if they are made distinguishable only when it is detected that the grip detection area Arg has been gripped, then whether the two areas are distinguishable can also indicate whether operation inputs to the operation detection area Arv are currently being accepted.
<Seventh Embodiment>
 A seventh embodiment of the control device and control method for an operation target device in a vehicle will be described. The basic configuration and operation of the seventh embodiment are the same as those of the first embodiment, and only the differences will be described.
 Figs. 11 to 13 show various drag patterns in which a finger is slid left-right or up-down on the touch sensor 21. However, even when the driver intends to slide a finger in the left-right or up-down direction as seen from the driver, the trajectory formed by the finger contact may be an arc-shaped curve rather than a straight line, and the straight line connecting the start point and end point of the drag may deviate considerably from horizontal or vertical. This is because the surface of the annular portion 200r is not flat and because the finger's motion tends to be a pivoting motion about the base of the finger. Fig. 38 shows an example of the trajectory when a left-hand finger is slid in the left-right direction on the touch sensor 21; the left side of Fig. 38 is the outside of the annular portion 200r and the right side is the inside. As shown in Fig. 38, the trajectory tends to dip lower on the inside of the annular portion 200r than on the outside.
 Requiring the driver to draw a straight line rather than an arc-shaped curve such as that in Fig. 14 is undesirable in terms of operability. Therefore, in the present embodiment, as shown in Fig. 39(a), if the difference dxh in the x component (the horizontal component) between the start point Ps and the end point Pe of the trajectory is at least a predetermined threshold and the difference dyh in the y component (the vertical component) is less than a predetermined threshold, the control unit 10 regards the drag as a straight horizontal drag, as shown in Fig. 39(b).
 Alternatively, the drag may be regarded as a straight horizontal drag when the ratio of the difference dyh to the difference dxh (dyh/dxh) is less than a predetermined threshold; this threshold is, for example, 1/2.
 The same applies when a finger is slid up-down on the touch sensor 21: the trajectory tends to be a curve rather than a straight line. Therefore, in the present embodiment, as shown in Fig. 40(a), if the difference dyv in the y component between the start point Ps and the end point Pe of the trajectory is at least a predetermined threshold and the difference dxv in the x component is less than a predetermined threshold, the control unit 10 regards the drag as a straight vertical drag, as shown in Fig. 40(b). Let THxh be the threshold for the difference dxh, THyh the threshold for the difference dyh, THyv the threshold for the difference dyv, and THxv the threshold for the difference dxv; then it is preferable that THxv < THxh and THyh < THyv. These thresholds are set in the control unit 10 in advance.
 As with horizontal drags, the drag may be regarded as a straight vertical drag when the ratio of the difference dxv to the difference dyv (dxv/dyv) is less than a predetermined threshold; this threshold is, for example, 1/2.
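The ratio test for both axes can be sketched as one classifier. The 1/2 ratio follows the example value in the text, while the minimum-travel value used to reject movements too short to be drags is an added assumption.

```python
def classify_drag(xs, ys, xe, ye, ratio_limit=0.5, min_travel=10):
    # Classify a drag by its start (xs, ys) and end (xe, ye) points:
    # horizontal when |dy|/|dx| < 1/2, vertical when |dx|/|dy| < 1/2.
    # min_travel (sensor units, an assumed value) rejects contacts that
    # barely move and so should not count as drags at all.
    dx, dy = abs(xe - xs), abs(ye - ys)
    if dx >= min_travel and dy < dx * ratio_limit:
        return "horizontal"
    if dy >= min_travel and dx < dy * ratio_limit:
        return "vertical"
    return None  # neither axis dominates; treat as ambiguous
```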
 To increase the variety of drag patterns, it is conceivable to add a diagonal drag in which the finger is slid in an oblique direction. However, a diagonal drag can be difficult to distinguish from the case in which the driver slides a finger in the left-right direction and the trajectory unintentionally becomes an arc, as in Fig. 38. Operability is therefore further improved if the diagonal drag is realized as follows. Fig. 41(a) shows a state in which a finger is slid rightward on the left touch sensor 21L and a finger is slid downward on the right touch sensor 21R. In this case, by the trajectory correction described with reference to Figs. 39 and 40, the control unit 10 can regard these as a rightward drag DR on the left touch sensor 21L and a downward drag DD on the right touch sensor 21R, as shown in Fig. 41(b).
 As shown in Fig. 41(c), combining the vector VR of the rightward drag DR with the vector VD of the downward drag DD yields a diagonal vector VO. Thus, when a rightward drag DR is performed on the left touch sensor 21L and a downward drag DD is performed on the right touch sensor 21R, the control unit 10 combines the vectors and determines that a diagonal drag DO having the diagonal vector VO has been performed, as shown in Fig. 41(d). The example shown in Fig. 41(d) is a diagonal drag DO toward the lower right, but diagonal drags DO toward the upper right, lower left, and upper left can be realized in the same way. Realizing the diagonal drag DO as in the present embodiment improves operability.
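The composition in Fig. 41(c) is plain vector addition; a minimal sketch, assuming (x, y) coordinates with y increasing upward:

```python
def compose_drags(vec_left, vec_right):
    # Add the per-hand drag vectors componentwise: a rightward drag on
    # the left sensor, (1, 0), plus a downward drag on the right
    # sensor, (0, -1), yields the down-right diagonal (1, -1).
    return (vec_left[0] + vec_right[0], vec_left[1] + vec_right[1])
```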
 Also, for example, when an upward drag is performed on the left touch sensor 21L and an upward drag is also performed on the right touch sensor 21R, the control unit 10 may combine the two same-direction (here, upward) drag vectors and perform an operation based on the larger resulting vector. With this control, when scrolling a map in response to a drag operation, for example, the map can be scrolled a large distance with a single drag operation, improving operability. Further, when the vector of the drag operation on the left touch sensor 21L and the vector of the drag operation on the right touch sensor 21R point in opposite directions and the angle between the two vectors is close to 180° (for example, 180° ± α, where α is an arbitrary angle), a special action may be performed; for example, the map may be rotated.
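The 180° ± α test can be implemented from the angle between the two vectors. The 20° value for α below is an assumption for the sketch, since the text leaves α arbitrary.

```python
import math

def drags_opposite(v1, v2, alpha_deg=20.0):
    # True when the two hands' drag vectors point in nearly opposite
    # directions, i.e. the angle between them lies within 180 +/- alpha
    # degrees -- the condition tied above to a special action such as
    # rotating the map.
    n1, n2 = math.hypot(*v1), math.hypot(*v2)
    if n1 == 0 or n2 == 0:
        return False
    cos_a = (v1[0] * v2[0] + v1[1] * v2[1]) / (n1 * n2)
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
    return abs(angle - 180.0) <= alpha_deg
```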
 Thus, the control unit 10 of the present embodiment controls the operation target device according to patterns formed by combinations of input operations on the left touch sensor 21L and input operations on the right touch sensor 21R.
 In the present embodiment, vector composition based on the four directions of up, down, left, and right has been described, but vectors may be composed based on more directions. Also, the deviation between the trajectory the user intended and the trajectory of the drag operation actually performed is often left-right symmetric, and vector composition can absorb this deviation; therefore, the drag operation may be corrected only into the straight line connecting its start point and end point, without being regarded as either horizontal or vertical as described above.
<Eighth Embodiment>
 An eighth embodiment of the control device and control method for an operation target device in a vehicle will be described. The basic configuration and operation of the eighth embodiment are the same as those of the first embodiment, and only the differences will be described. The eighth embodiment differs from the seventh embodiment in how horizontal and vertical drags are defined; the trajectory correction and vector composition performed when a finger is slid are the same as in the seventh embodiment.
 In the present embodiment, as shown in FIG. 42, an operation of sliding a finger on the touch sensor 21 in the radial direction of the annular portion 200r (of the steering wheel 200) is defined as a horizontal drag Dh, and an operation of sliding a finger in the circumferential direction of the annular portion 200r is defined as a vertical drag Dv. FIG. 43 shows the horizontal drag Dh and the vertical drag Dv on the developed view of the touch sensor 21 of FIG. 8. In FIGS. 42 and 43, the horizontal drag Dh and the vertical drag Dv are each shown touching only a single column of detection regions R in the X and Y coordinate directions, but a finger may also touch and drag across detection regions R in multiple columns.
 FIGS. 11 to 13 showed various drag patterns in which a finger slides left-right or up-down on the touch sensor 21. However, even when the driver intentionally attempts the horizontal drag Dh or vertical drag Dv described above, the finger does not necessarily slide correctly in the radial or circumferential direction. As can be seen from the developed view of FIG. 43, when the horizontal drag Dh or vertical drag Dv is performed correctly, the trajectory formed by the contact is a straight line; in practice, however, it may become a curve rather than a straight line, and the line connecting the start point and end point of the drag may deviate considerably from the horizontal or vertical direction.
 The trajectory correction and vector composition operations of the eighth embodiment are the same as those of the seventh embodiment described with reference to FIGS. 38 to 41, so their description is omitted.
<Ninth Embodiment>
 A ninth embodiment of the device and method for controlling an operation target device in a vehicle will be described. The basic configuration and operation of the ninth embodiment are the same as those of the first embodiment, and only the differences will be described.
 A configuration example of the ninth embodiment, which further develops the modified steering wheel 201 of FIG. 5, will be described with reference to FIGS. 44 to 47. In FIG. 44, in the modified steering wheel 202, left and right parts of the annular portion 202r are right-cylindrical gripping portions 202s gripped by the driver. The pair of left and right gripping portions 202s are connected by an upper connecting portion 202c1 and a lower connecting portion 202c2 to form the annular portion 202r. The touch sensors 21 are attached to the gripping portions 202s.
 FIG. 45 is an enlarged view of the boundary between the connecting portion 202c1 and the gripping portion 202s, surrounded by the dash-dotted ellipse in FIG. 44. FIG. 46 shows the A-A cross section of FIG. 45. Since the gripping portion 202s has a slightly smaller diameter than the connecting portions 202c1 and 202c2, attaching the touch sensor 21 to the gripping portion 202s leaves almost no step at the boundary between the gripping portion 202s and the connecting portions 202c1 and 202c2, so the surface is continuous.
 In the modified steering wheel 202 shown in FIG. 44, the gripping portion 202s switches input operations on the touch sensor 21 between on and off. Turning input operations on means permitting (enabling) the specific input operations described above; turning them off means disallowing (disabling) those operations. The gripping portion 202s incorporates an on/off switching mechanism, which switches input operations on and off.
 The on/off switching mechanism and its switching operation will be described with reference to FIGS. 46 and 47. FIG. 47 shows the B-B cross section of FIG. 45. As shown in FIG. 46, the end of the connecting portion 202c1 on the gripping portion 202s side is a protruding portion 27. The end of the gripping portion 202s on the connecting portion 202c1 side is a receiving portion 28 having a recess that accommodates the protruding portion 27. As shown in FIGS. 47(a) to 47(c), part of the circumference of the protruding portion 27 is cut away to form a recess 27cp. An elastically deformable portion 29 having a protrusion 29p is fixed in the recess 27cp. Two recesses 28cp1 and 28cp2 are formed in the inner circumferential surface of the receiving portion 28.
 In the normal state of the modified steering wheel 202, the gripping portion 202s is in the state of FIG. 47(a); that is, the protrusion 29p is engaged with the recess 28cp1. The state of FIG. 47(a) is the state in which input operations on the touch sensor 21 are turned off. When the vehicle is driven normally without operating the operation target device via the touch sensor 21, the off state shown in FIG. 47(a) is used. When the gripping portion 202s is rotated toward the outer circumference of the modified steering wheel 202 from the off state of FIG. 47(a), the protrusion 29p disengages from the recess 28cp1 and comes to rest against the ridge between the recesses 28cp1 and 28cp2, as shown in FIG. 47(b). At this point the elastically deformable portion 29 is pressed and deformed by the ridge between the recesses 28cp1 and 28cp2.
 When the gripping portion 202s is rotated further toward the outer circumference, the protrusion 29p engages with the recess 28cp2, as shown in FIG. 47(c), putting input operations on the touch sensor 21 in the on state. Although not illustrated, the state of FIG. 47(a) in which input operations on the touch sensor 21 are off and the state of FIG. 47(c) in which they are on are each detected electrically. The state detection signal of the on/off switching mechanism of the gripping portion 202s is input to the control unit 10.
 When driving normally without operating the operation target device via the touch sensor 21, the driver keeps the gripping portion 202s in the state of FIG. 47(a); when the driver wants to operate the operation target device via the touch sensor 21, the driver rotates the gripping portion 202s toward the outer circumference into the state of FIG. 47(c). When switching from the state of FIG. 47(a) to that of FIG. 47(c), a click is felt as the protrusion 29p engages with the recess 28cp2, and when switching from the state of FIG. 47(c) to that of FIG. 47(a), a click is felt as the protrusion 29p engages with the recess 28cp1, so the driver can perceive that the on and off states have been switched.
 The on/off switching mechanism shown in FIG. 47 may be provided on both of the left and right gripping portions 202s, or on only one of them. When both of the left and right gripping portions 202s have on/off switching mechanisms, input operations may be turned on when both are in the on state, or when either one is in the on state. The mechanism may also be configured to turn on when the gripping portion 202s is rotated toward the inner circumference. In the configuration example of FIG. 44, the feel of gripping the gripping portion 202s (grip feel) does not change between the on and off states, so driving is not adversely affected.
 In a configuration in which the touch sensor 21 is attached to the gripping portion 202s, as in the modified steering wheel 202 shown in FIG. 44, the touch sensor 21 need not have the complex shape described with reference to FIG. 8 but may be a simple flat sheet as in FIG. 9. Since the shape of the touch sensor 21 can thus be simplified, the touch sensor 21 itself can be made inexpensive, and the labor of attaching the touch sensor 21 to the steering wheel (modified steering wheel 202) is also reduced, so the control device for the operation target device can be realized at low cost.
 The on/off switching mechanism is a rotary switch rotated in the circumferential direction. The touch sensor 21 attached to the gripping portion 202s having the on/off switching mechanism may be assigned the grip detection area Arg, the operation detection area Arv, and the operation invalid area Ariv described with reference to FIGS. 8 and 9. However, because the on/off switching mechanism makes it unambiguous whether the driver intends to operate the operation target device, the grip detection area Arg and the operation invalid area Ariv may be omitted, leaving only the operation detection area Arv; that is, the entire surface of the touch sensor 21 may be made the operation detection area Arv.
 The processing executed by the control unit 10 when the modified steering wheel 202 is used will be described with reference to the flowchart of FIG. 48. In FIG. 48, the control unit 10 determines in step S21 whether the on/off switching mechanism is on. If it is not determined to be on (NO), the control unit 10 returns to step S21. If it is determined to be on (YES), the control unit 10 acquires, in step S22, the sensor data output from the sensor data generation unit 22. In step S23, the control unit 10 determines, based on the detection output of the detection unit 10a, whether an input operation has been performed.
 If it is determined that an input operation has been performed (YES), the control unit 10 proceeds to step S24; if not (NO), it returns to step S21. In step S24, the control unit 10 determines whether to permit an operation on the operation target device based on the input operation of step S23. If the operation is to be permitted (YES), the control unit 10 proceeds to step S25; if not (NO), it returns to step S21. The control unit 10 permits an operation on the operation target device when a specific input operation has been performed on the touch sensor 21. In step S25, the control unit 10 finalizes the operation based on the input operation; in step S26, it executes control of the operation target device according to the finalized operation, and then returns to step S21.
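One pass through the FIG. 48 flow (steps S21 to S26) can be sketched as below. The five parameters are hypothetical callables standing in for the on/off switching mechanism, the sensor data generation unit 22, the detection unit 10a, the permission check of step S24, and the control of the operation target device; none of these names come from the patent itself.

```python
def control_pass(switch_on, get_sensor_data, detect_input, permit, execute):
    """One pass through the flow of FIG. 48; returning None corresponds
    to falling back to step S21 to wait for the next pass."""
    if not switch_on():            # S21: on/off switching mechanism off
        return None
    data = get_sensor_data()       # S22: acquire sensor data
    op = detect_input(data)        # S23: None means no input operation
    if op is None:
        return None
    if not permit(op):             # S24: permit only specific inputs
        return None
    execute(op)                    # S25/S26: finalize and execute control
    return op
```

In a real controller this pass would run in a loop; the sketch returns the executed operation so each pass can be inspected.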
 In the example shown in FIG. 48, the processing corresponding to step S2 of FIG. 4 is omitted, but processing corresponding to step S2 of FIG. 4, which determines whether the gripping portion 202s is being gripped, may be inserted between steps S22 and S23.
 By providing the gripping portion 202s with the on/off switching mechanism, as in the configuration example of FIG. 44, the control unit 10 can objectively determine whether the driver intends to operate the operation target device. Erroneous operations can therefore be greatly reduced. The gripping portion 202s may be returned to the normal state of FIG. 47(a) when the engine is stopped in an engine-powered vehicle, or when the power is cut off in an electric vehicle. In that case, a motor for returning the gripping portion 202s from the state of FIG. 47(c) to the state of FIG. 47(a) may be built into the gripping portion 202s.
 The touch sensor 21 may be detachably attached to the annular portion 200r or the annular portion 202r using a hook-and-loop fastener. Although the annular portion 200r serves as the gripped part, the gripped part need not be circular; it may be deformed like the annular portion 202r, or it need not be annular at all. In the modified steering wheel 202 shown in FIG. 44, the gripping portion 202s may instead be joined to the connecting portions 202c1 and 202c2 by providing a receiving portion with a recess on the connecting portion 202c1 or 202c2 side and a protruding portion on the gripping portion 202s side. The structure shown in FIGS. 46 and 47 is only one example of the on/off switching mechanism, and the mechanism is not limited to that structure.
<Tenth Embodiment>
 A tenth embodiment of the device for controlling an operation target device in a vehicle will be described. The tenth embodiment is an embodiment of a driver identification method. The basic configuration and operation of the tenth embodiment are the same as those of the first embodiment, and only the differences will be described.
 An embodiment of a driver identification method that identifies the driver of the vehicle using the control device for the operation target device of the present embodiment will be described. If the driver can be identified, the state of the in-vehicle device 100 and the state of the vehicle can each be set to the optimum state for that driver. For example, songs that the driver frequently plays on the audio playback unit 12 could be played automatically, or frequently played songs could be listed first when a song list is displayed. Setting the state of the air conditioner or adjusting the seat position to suit the driver is also conceivable.
 An example of identifying the driver will be described with reference to FIGS. 49 to 51. FIG. 49 shows an example of the state of the palm contact detection portion Tp and the thumb contact detection portion Tt when the driver grips the annular portion 200r in order to drive the vehicle. In this state the driver is not trying to operate the operation target device via the touch sensor 21, so the thumb contact detection portion Tt is close to the palm contact detection portion Tp. The index finger contact detection portion Ti is omitted from FIG. 49.
 The control unit 10 detects the touched length in the X coordinate direction within the grip detection area Arg. In FIG. 49, the sum of the lengths Lx1 and Lx2 is the touched length in the X coordinate direction. FIG. 50 shows the state in which the thumb contact detection portions Tt divided in FIG. 49 are joined, and shows the touched length Lx in the X coordinate direction. The length Lx is information indicating the length of the portion of the touch sensor 21 touched by the palm (the palm contact detection portion Tp), measured in the circumferential direction of the cross section obtained by cutting the annular portion 200r in the radial direction. The length Lx is a first example of grip state identification data indicating how the driver grips the part of the annular portion 200r to which the touch sensor 21 is attached.
 Here the control unit 10 is described as detecting the length Lx, but it suffices to count the number of detection regions R corresponding to the length Lx. Of course, the number of detection regions R corresponding to the length Lx can be converted into an actual distance.
 The control unit 10 also detects the length Lya in the Y coordinate direction between the palm contact detection portion Tp and the thumb contact detection portion Tt. The length Lya is information indicating the length determined, in the circumferential direction of the steering wheel 200 (annular portion 200r), by the portion of the touch sensor 21 that the palm contacts (palm contact detection portion Tp) and the portion that the thumb contacts (thumb contact detection portion Tt). The length Lya is a second example of grip state identification data indicating how the driver grips the part of the annular portion 200r to which the touch sensor 21 is attached. Here, too, the control unit 10 is described as detecting Lya, but it suffices to count the number of detection regions R corresponding to the length Lya; of course, that count can be converted into an actual distance.
 In FIG. 49, the length Lya is taken between the end of the palm contact portion opposite the thumb contact portion and the end of the thumb contact portion opposite the palm contact portion, but the definition is not limited to this. The length shown in FIG. 49 is, however, preferred as the length Lya.
 Further, in the state of FIG. 49, the control unit 10 detects the total number of detection regions R that detect being touched (the contact detection region total). The contact detection region total corresponds to the area over which the driver's hand is in contact; the actual area can also be calculated from the detection regions R that are detecting contact. The contact detection region total may be counted over all detection regions R in the grip detection area Arg, the operation detection area Arv, and the operation invalid area Ariv, or over the detection regions R in the grip detection area Arg only.
 Information corresponding to the area of the part of the touch sensor 21 touched by the hand is a third example of grip state identification data indicating how the driver grips the part of the annular portion 200r to which the touch sensor 21 is attached.
 The driver can be identified from the lengths Lx and Lya and the contact detection region total. Although identification accuracy drops somewhat, the driver may instead be identified from the lengths Lx and Lya alone, from the contact detection region total alone, from the length Lx alone, or from the length Lya alone.
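Extracting these three items of grip state identification data from the grid of detection regions R can be sketched as follows. This is an illustrative assumption about the data layout (a boolean grid `touched[y][x]`, True where contact is detected); the patent does not prescribe a representation. As the text notes, the results are counts of detection regions R, not physical distances.

```python
def grip_features(touched):
    """Return (Lx, Lya, total) from a grid of detection regions R.

    Lx: number of touched columns in the X coordinate direction
        (the divided thumb contact portions joined, as in FIG. 50).
    Lya: span in the Y coordinate direction between the outermost
         touched rows, corresponding to the palm-to-thumb length.
    total: the contact detection region total.
    """
    touched_cols = {x for row in touched for x, v in enumerate(row) if v}
    touched_rows = {y for y, row in enumerate(touched) if any(row)}
    total = sum(v for row in touched for v in row)
    lx = len(touched_cols)
    lya = max(touched_rows) - min(touched_rows) + 1 if touched_rows else 0
    return lx, lya, total
```

Converting these counts to actual distances would only require multiplying by the pitch of the detection regions R.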
 In the present embodiment, to further improve identification accuracy, the control unit 10 also detects the length Lyb in the Y coordinate direction between the palm contact detection portion Tp and the thumb contact detection portion Tt when the driver has extended the thumb to operate the operation target device. FIG. 51 shows the state in which the driver has extended the thumb to operate the operation target device. In this case the length Lyb in the Y coordinate direction between the palm contact detection portion and the thumb contact detection portion is longer than the length Lya. For example, after the lengths Lx and Lya and the contact detection region total have been detected, voice guidance such as "Identifying the driver. Please extend your thumb and operate the touch sensor." allows the length Lyb to be detected immediately. There is also no problem with omitting such guidance and detecting the length Lyb after waiting for the driver to actually operate the touch sensor 21.
 For the length Lyb as well, the length is taken between the end of the palm contact portion opposite the thumb contact portion and the end of the thumb contact portion opposite the palm contact portion, but the definition is not limited to this; the length shown in FIG. 51 is, however, preferred as the length Lyb. The length Lyb is a fourth example of grip state identification data indicating how the driver grips the part of the annular portion 200r to which the touch sensor 21 is attached.
 FIG. 52 shows an example of the driver database stored in the storage unit 18. For each of drivers A, B, and C, the lengths Lx, Lya, and Lyb and the contact detection region total are registered as driver identification data. Because the lengths Lx, Lya, and Lyb and the contact detection region total do not always take the same values even for the same driver, it is preferable to register an average value each time the same driver is identified. The driver identification data indicate how the part of the annular portion 200r to which the touch sensor 21 is attached is gripped. As described above, the items of driver identification data to register should correspond to the grip state identification data that the control unit 10 acquires in order to identify the driver.
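The averaging the text recommends when the same driver is re-identified can be sketched as a running average over the registered record. The dict representation of a record (keys such as Lx, Lya, Lyb, total) and the sample counter are illustrative assumptions, not part of the patent.

```python
def update_driver_record(registered, acquired, n):
    """Fold newly acquired grip state identification data into a
    registered driver record as a running average.

    registered: the averages currently stored in the driver database.
    acquired:   the measurements from this identification.
    n:          how many samples the stored averages already cover.
    Returns the updated averages and the new sample count.
    """
    merged = {k: (v * n + acquired[k]) / (n + 1)
              for k, v in registered.items()}
    return merged, n + 1
```

Each successful identification (step S26 of FIG. 53) would call this before writing the record back to the storage unit 18.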
 The control unit 10 may also grasp the shape of the palm contact detection portion Tp and use it as an element for identifying the driver. In the example above the thumb contact detection portion Tt was used, but the index finger contact detection portion Ti may be used instead of, or in addition to, the thumb contact detection portion Tt.
 The processing of the control unit 10 when identifying the driver will be described with reference to FIG. 53. In FIG. 53, the control unit 10 acquires, in step S21, the sensor data output from the sensor data generation unit 22. In step S22, the control unit 10 acquires the lengths Lx and Lya. Since the driver first grips the annular portion 200r in order to drive the vehicle, the lengths Lx and Lya can be acquired. As described above, the detection unit 10a detects, based on the sensor data output from the sensor data generation unit 22, that the annular portion 200r (touch sensor 21) is being gripped, so the lengths Lx and Lya may be acquired after gripping has been detected. In step S23, the control unit 10 acquires the contact detection region total. Steps S22 and S23 may be performed in either order.
 After guiding the driver to extend the thumb, or after waiting for the driver to operate the touch sensor 21, the control unit 10 acquires the length Lyb in step S24; step S24 may be omitted. In step S25, the control unit 10 compares the grip state identification data consisting of the acquired lengths Lx, Lya, and Lyb and the contact detection region total against the driver identification data registered in the driver database, and determines whether the acquired grip state identification data match any of the driver identification data. Since the data do not match exactly even for the same driver, a predetermined tolerance is set on the registered driver identification data, and a match is determined if the acquired grip state identification data fall within the tolerance.
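The tolerance match of step S25 can be sketched as follows. The field names and the 15% relative tolerance are illustrative assumptions; the patent only states that a predetermined tolerance is set on the registered values.

```python
def match_driver(features, database, tolerance=0.15):
    """Match acquired grip state identification data against the
    registered driver identification data (FIG. 53, step S25).

    features: dict of acquired measurements (e.g. Lx, Lya, Lyb, total).
    database: {driver_name: dict of registered averages}.
    A driver matches when every acquired value lies within
    +/-tolerance (as a fraction) of the registered value.
    Returns the matching driver's name, or None (leading to the
    registration branch of steps S28/S29).
    """
    for name, registered in database.items():
        if all(abs(features[k] - v) <= tolerance * v
               for k, v in registered.items() if k in features):
            return name
    return None
```

Measurements omitted from `features` (e.g. when step S24 is skipped and Lyb is unavailable) are simply not checked, matching the text's note that step S24 is optional.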
 If a match with one of the drivers is determined in step S25 (YES), the control unit 10 identifies the driver in step S26, executes control corresponding to that driver in step S27, and ends. Control corresponding to the driver means setting the state of the in-vehicle device 100 and the state of the vehicle to the optimum state for that driver. Naturally, when the driver is identified while the vehicle is moving, the seat position, among the vehicle states, is not adjusted.
 If no match with any driver is determined in step S25 (NO), the control unit 10 determines in step S28 whether an instruction to register in the driver database has been issued. If it is determined that a registration instruction has been issued (YES), the control unit 10, in step S29, associates the driver name entered via an operation unit not shown in FIG. 1 with the grip state identification data consisting of the acquired lengths Lx, Lya, and Lyb and the contact detection region total, registers them in the driver database as driver identification data, and ends. If no registration instruction is determined (NO), the processing simply ends.
 As is clear from the above description, the control unit serves as a driver identification unit that acquires, based on the sensor data output from the sensor data generation unit 22, grip state identification data indicating how the driver grips the part of the annular portion 200r to which the touch sensor 21 is attached, and identifies the driver by comparing the grip state identification data against the driver identification data.
 While a driver is identified, the control unit 10 learns how the in-vehicle device 100 is operated and what state the vehicle is in, thereby grasping the characteristics of each driver. Although FIG. 1 does not show information indicating the state of the air conditioner or the seat position being input to the control unit 10, such information may also be supplied to the control unit 10 via the in-vehicle communication unit 34.
 Another example of grip state identification data acquired to identify the driver will be described with reference to FIG. 54. The position at which the annular portion 200r is gripped differs from driver to driver. It is therefore also possible to detect the position at which the annular portion 200r is gripped and use it as grip state identification data for identifying the driver. When the positions of the grip detection area Arg, the operation detection area Arv, and, where provided, the operation invalid area Ariv are set dynamically according to where the driver grips the touch sensor 21, the gripping position on the annular portion 200r can serve as grip state identification data for identifying the driver.
 FIG. 54(a) shows a state in which the driver grips the lower end of the touch sensor 21, so that the grip detection area Arg is set at the lower end of the touch sensor 21. FIG. 54(b) shows a state in which the driver grips a position slightly above the lower end of the touch sensor 21, so that the grip detection area Arg is set at a position away from the lower end of the touch sensor 21. Which part of the touch sensor 21 serves as the grip detection area Arg can be identified from the Y coordinates. As an example, by summing the Y-coordinate values of the grip detection area Arg, a smaller sum indicates that the driver is gripping a lower part of the touch sensor 21, while a larger sum indicates a higher part.
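The Y-coordinate summing just described can be sketched in a few lines. This is a minimal sketch under assumed conventions (the grip detection area Arg modeled as a list of touched cells with integer grid coordinates); the function name is hypothetical.

```python
def grip_height_score(grip_cells):
    """Sum the Y coordinates of the cells forming the grip detection area Arg.

    A smaller sum means the driver grips lower on the touch sensor 21
    (FIG. 54(a)); a larger sum means higher (FIG. 54(b))."""
    return sum(y for (_x, y) in grip_cells)

# FIG. 54(a): grip at the lower end (small Y values)
low_grip = [(0, 0), (1, 0), (0, 1), (1, 1)]
# FIG. 54(b): grip slightly above the lower end (larger Y values)
high_grip = [(0, 3), (1, 3), (0, 4), (1, 4)]

print(grip_height_score(low_grip))   # 2
print(grip_height_score(high_grip))  # 14
```

Note that comparing raw sums is only meaningful when the number of touched cells is similar; normalizing by the cell count would give an average grip height instead.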
 In the driver database of FIG. 52, information indicating the position at which the steering wheel 200 is gripped in the circumferential direction of the steering wheel 200 is registered as driver identification data. The control unit 10 acquires, as grip state identification data, information indicating the position at which the steering wheel 200 is being gripped in its circumferential direction. The information indicating the gripping position on the steering wheel 200 is a fifth example of grip state identification data indicating how the driver grips the annular portion 200r where the touch sensor 21 is mounted. Although identification accuracy is lower, the driver may be identified based solely on the information indicating the gripping position on the steering wheel 200.
 The first to fifth examples of grip state identification data described above can be combined in any appropriate manner. One or more of them may be selected as appropriate in consideration of the required driver identification accuracy. Of course, using all of the first to fifth examples is preferable, since identification accuracy is then greatly improved.
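One way to combine several of the examples is to run a per-feature check for each selected kind of grip state identification data and require all enabled checks to pass. The following is a hypothetical sketch of that idea; the feature names, thresholds, and all-must-pass rule are assumptions, not taken from the patent.

```python
def identify(measured, registered, checks):
    """Return True when every enabled per-feature check passes.

    `checks` maps a feature name to a predicate comparing the measured
    value with the registered one; enabling more checks (up to all five
    examples) tightens identification, as the text notes."""
    return all(pred(measured[k], registered[k]) for k, pred in checks.items())

# Assumed features standing in for some of the five examples.
checks = {
    "lengths": lambda m, r: abs(m - r) <= 5,   # e.g. gripped lengths Lx, Lya, Lyb
    "areas": lambda m, r: m == r,              # total number of contact detection areas
    "grip_y": lambda m, r: abs(m - r) <= 2,    # vertical grip position score
}

registered = {"lengths": 80, "areas": 24, "grip_y": 3}
measured = {"lengths": 83, "areas": 24, "grip_y": 4}
print(identify(measured, registered, checks))  # True
```

Dropping an entry from `checks` corresponds to selecting fewer of the five examples, trading identification accuracy for robustness.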
 The first to tenth embodiments described above can be combined in any manner.
 The present invention can be used as a control device that controls an arbitrary operation target device in a vehicle, and it is applicable to vehicles other than automobiles. It can also be used as a control device for controlling a game in a game machine having an operation unit (controller) such as a steering wheel.

Claims (47)

  1.  A control device for an operation target device in a vehicle, comprising:
     a sensor data generation unit that generates sensor data including position data indicating which detection area is being touched, based on a contact detection signal obtained from a touch sensor that has a plurality of detection areas and is mounted on a predetermined range of a grip portion of a steering wheel gripped by a driver;
     a detection unit that detects, based on the sensor data, whether the driver is gripping the grip portion, and detects an input operation on the touch sensor; and
     a control unit that, when the detection unit detects that the driver is gripping the grip portion and that a specific input operation has been performed on the touch sensor, controls an operation target device to be operated via the touch sensor in accordance with the specific input operation.
  2.  The control device for an operation target device in a vehicle according to claim 1, wherein
     the touch sensor has set thereon a grip detection area that includes a part of the plurality of detection areas and detects contact by the driver's palm, and an operation detection area that includes another part of the plurality of detection areas and detects the specific input operation by finger contact, and
     the detection unit detects whether the driver is gripping the grip portion from sensor data based on the contact detection signal obtained from the grip detection area, and detects the specific input operation from sensor data based on the contact detection signal obtained from the operation detection area.
  3.  The control device for an operation target device in a vehicle according to claim 1, wherein
     the control unit sets, on the touch sensor, a grip detection area that includes a part of the plurality of detection areas and detects contact by the driver's palm, and an operation detection area that includes another part of the plurality of detection areas and detects the specific input operation by finger contact, and
     the detection unit detects whether the driver is gripping the grip portion from sensor data based on the contact detection signal obtained from the grip detection area, and detects the specific input operation from sensor data based on the contact detection signal obtained from the operation detection area.
  4.  The control device for an operation target device in a vehicle according to claim 2, wherein an operation invalid area, in which operations by palm or finger contact are invalidated, is set on the touch sensor between the grip detection area and the operation detection area.
  5.  The control device for an operation target device in a vehicle according to claim 3, wherein the control unit sets, between the grip detection area and the operation detection area, an operation invalid area in which operations by palm or finger contact are invalidated.
  6.  The control device for an operation target device in a vehicle according to any one of claims 2 to 5, wherein the control unit controls the operation target device in accordance with a pattern in which, as the specific input operation, two locations on the operation detection area that are separated from each other in the circumferential direction of a cross section taken by cutting the grip portion in the radial direction of the steering wheel are operated by two fingers.
  7.  The control device for an operation target device in a vehicle according to any one of claims 1 to 6, wherein the control unit controls the operation target device in accordance with a pattern formed, as the specific input operation, by a combination of a first input operation by the driver's left hand and a second input operation by the driver's right hand.
  8.  The control device for an operation target device in a vehicle according to claim 7, wherein
     the touch sensor includes a first touch sensor mounted on a predetermined range of a grip portion gripped by the driver's left hand and a second touch sensor mounted on a predetermined range of a grip portion gripped by the driver's right hand, and
     the control unit controls the operation target device in accordance with a pattern formed, as the specific input operation, by a combination of the first input operation on the first touch sensor and the second input operation on the second touch sensor.
  9.  A method of controlling an operation target device in a vehicle, comprising:
     detecting whether a driver is gripping a touch sensor that has a plurality of detection areas and is mounted on a predetermined range of a grip portion of a steering wheel gripped by the driver;
     detecting whether a specific input operation has been performed on the touch sensor; and
     controlling an operation target device to be operated via the touch sensor when it is detected that the driver is gripping the touch sensor and that the specific input operation has been performed.
  10.  The method of controlling an operation target device in a vehicle according to claim 9, wherein the touch sensor has set thereon a grip detection area that includes a part of the plurality of detection areas and detects contact by the driver's palm, and an operation detection area that includes another part of the plurality of detection areas and detects the specific input operation by finger contact, the method comprising:
     detecting that the driver is gripping the touch sensor from palm contact with the grip detection area; and
     detecting that the specific input operation has been performed from finger contact with the operation detection area.
  11.  The method of controlling an operation target device in a vehicle according to claim 9, comprising:
     setting, on the touch sensor, a grip detection area that includes a part of the plurality of detection areas and detects contact by the driver's palm, and an operation detection area that includes another part of the plurality of detection areas and detects the specific input operation by finger contact;
     detecting that the driver is gripping the touch sensor from palm contact with the grip detection area; and
     detecting that the specific input operation has been performed from finger contact with the operation detection area.
  12.  The method of controlling an operation target device in a vehicle according to claim 10, wherein
     the operation detection area is set on the touch sensor at a position separated from the grip detection area by a predetermined distance, and
     an input operation by palm or finger contact with an intermediate area between the grip detection area and the operation detection area is invalidated.
  13.  The method of controlling an operation target device in a vehicle according to claim 11, comprising:
     setting the operation detection area on the touch sensor at a position separated from the grip detection area by a predetermined distance; and
     invalidating an input operation by palm or finger contact with an intermediate area between the grip detection area and the operation detection area.
  14.  The method of controlling an operation target device in a vehicle according to any one of claims 10 to 13, wherein the operation target device is controlled in accordance with a pattern in which, as the specific input operation, two locations on the operation detection area that are separated from each other in the circumferential direction of a cross section taken by cutting the grip portion in the radial direction of the steering wheel are operated by two fingers.
  15.  The method of controlling an operation target device in a vehicle according to any one of claims 9 to 14, wherein the operation target device is controlled in accordance with a pattern formed, as the specific input operation, by a combination of a first input operation by the driver's left hand and a second input operation by the driver's right hand.
  16.  The method of controlling an operation target device in a vehicle according to claim 15, wherein
     the touch sensor includes a first touch sensor mounted on a predetermined range of a grip portion gripped by the driver's left hand and a second touch sensor mounted on a predetermined range of a grip portion gripped by the driver's right hand, and
     the operation target device is controlled in accordance with a pattern formed, as the specific input operation, by a combination of the first input operation on the first touch sensor and the second input operation on the second touch sensor.
  17.  A steering wheel comprising:
     a grip portion to be gripped by a driver;
     a touch sensor that has a plurality of detection areas and is mounted on a predetermined range of the grip portion so as to cover the grip portion;
     a sensor data generation unit that generates sensor data including position data indicating which detection area is being touched, based on a contact detection signal obtained from the touch sensor;
     a detection unit that detects, based on the sensor data, whether the driver is gripping the part of the grip portion where the touch sensor is mounted, and detects an input operation on the touch sensor; and
     a control signal generation unit that, when the detection unit detects that the driver is gripping the touch sensor part and that a specific input operation has been performed on the touch sensor, generates a control signal for controlling an operation target device to be operated via the touch sensor, in accordance with the specific input operation.
  18.  A control device for an operation target device in a vehicle, comprising:
     a first detection unit that detects that a first area of a touch sensor mounted on a grip portion of a steering wheel gripped by a driver is being touched;
     a second detection unit that detects that a specific input operation has been performed on a second area of the touch sensor located above the first area; and
     a control unit that controls an operation target device to be operated via the touch sensor in accordance with the specific input operation when the first detection unit detects that the first area is being touched and the second detection unit detects that the specific input operation has been performed.
  19.  The control device for an operation target device in a vehicle according to claim 18, wherein the first detection unit determines that the first area is being touched when it detects that the first area is touched over a predetermined area or more.
  20.  A method of controlling an operation target device in a vehicle, comprising:
     detecting that a first area of a touch sensor mounted on a grip portion of a steering wheel gripped by a driver is being touched;
     detecting, while the first area is being touched, that a specific input operation has been performed on a second area of the touch sensor located above the first area; and
     controlling an operation target device to be operated via the touch sensor in accordance with the specific input operation when it is detected that the specific input operation has been performed.
  21.  The method of controlling an operation target device in a vehicle according to claim 20, wherein the first area is determined to be touched when it is detected that the first area is touched over a predetermined area or more.
  22.  A control device for an operation target device in a vehicle, comprising:
     a sensor data generation unit that generates sensor data including position data indicating which detection area is being touched, based on a contact detection signal obtained from a touch sensor that has a plurality of detection areas and is mounted on a predetermined range of a grip portion of a vehicle steering wheel gripped by a driver so as to cover the grip portion;
     a detection unit that detects an input operation on the touch sensor based on the sensor data; and
     a control unit that, when the detection unit detects that a specific input operation has been performed on the touch sensor, controls an operation target device to be operated via the touch sensor in accordance with the specific input operation,
     wherein the control unit invalidates the control of the operation target device when the vehicle is in a specific state.
  23.  The control device for an operation target device in a vehicle according to claim 22, wherein
     the detection unit detects, based on the sensor data, whether the driver is gripping the grip portion, and
     the control unit controls the operation target device to be operated via the touch sensor in accordance with the specific input operation when the detection unit detects that the driver is gripping the grip portion and that the specific input operation has been performed on the touch sensor.
  24.  The control device for an operation target device in a vehicle according to claim 22 or 23, wherein the control unit regards the vehicle as being in the specific state and invalidates the control of the operation target device when the rotation angle of the steering wheel exceeds a predetermined angle.
  25.  The control device for an operation target device in a vehicle according to claim 22 or 23, wherein the control unit regards the vehicle as being in the specific state and invalidates the control of the operation target device when the direction indicator is operated.
  26.  The control device for an operation target device in a vehicle according to any one of claims 22 to 25, wherein the control unit regards the vehicle as being in the specific state and invalidates the control of the operation target device when the shift position of the shift lever is set to reverse.
  27.  The control device for an operation target device in a vehicle according to any one of claims 23 to 26, wherein
     the touch sensor has set thereon a grip detection area that includes a part of the plurality of detection areas and detects contact by the driver's palm, and an operation detection area that includes another part of the plurality of detection areas and detects the specific input operation by finger contact, and
     the detection unit detects whether the driver is gripping the grip portion from sensor data based on the contact detection signal obtained from the grip detection area, and detects the specific input operation from sensor data based on the contact detection signal obtained from the operation detection area.
  28.  The control device for an operation target device in a vehicle according to claim 27, wherein the control unit controls the operation target device in accordance with a pattern in which, as the specific input operation, two locations on the operation detection area that are separated from each other in the circumferential direction of a cross section taken by cutting the grip portion in the radial direction of the steering wheel are operated by two fingers.
  29.  The control device for an operation target device in a vehicle according to any one of claims 22 to 28, wherein
     the touch sensor includes a first touch sensor mounted on a predetermined range of a grip portion gripped by the driver's left hand and a second touch sensor mounted on a predetermined range of a grip portion gripped by the driver's right hand, and
     the control unit controls the operation target device in accordance with a pattern formed, as the specific input operation, by a combination of a first input operation on the first touch sensor and a second input operation on the second touch sensor.
  30.  A method of controlling an operation target device in a vehicle, comprising:
     detecting whether a specific input operation has been performed on a touch sensor that has a plurality of detection areas and is mounted on a predetermined range of a grip portion of a vehicle steering wheel gripped by a driver so as to cover the grip portion;
     detecting whether the vehicle is in a specific state; and
     controlling an operation target device to be operated via the touch sensor when it is detected that the vehicle is not in the specific state and that the specific input operation has been performed.
  31.  The method of controlling an operation target device in a vehicle according to claim 30, further comprising detecting whether the driver is gripping the touch sensor, wherein
     the operation target device to be operated via the touch sensor is controlled when it is detected that the vehicle is not in the specific state, that the driver is gripping the touch sensor, and that the specific input operation has been performed.
  32.  The method of controlling an operation target device in a vehicle according to claim 30 or 31, wherein the vehicle is regarded as being in the specific state and the control of the operation target device is invalidated when the rotation angle of the steering wheel exceeds a predetermined angle.
  33.  The method of controlling an operation target device in a vehicle according to claim 30 or 31, wherein the vehicle is regarded as being in the specific state and the control of the operation target device is invalidated when the direction indicator is operated.
  34.  The method of controlling an operation target device in a vehicle according to any one of claims 30 to 33, wherein the vehicle is regarded as being in the specific state and the control of the operation target device is invalidated when the shift position of the shift lever is set to reverse.
  35.  The method of controlling an operation target device in a vehicle according to any one of claims 31 to 34, wherein the touch sensor has set thereon a grip detection area that includes a part of the plurality of detection areas and detects contact by the driver's palm, and an operation detection area that includes another part of the plurality of detection areas and detects the specific input operation by finger contact, the method comprising:
     detecting that the driver is gripping the touch sensor from palm contact with the grip detection area; and
     detecting that the specific input operation has been performed from finger contact with the operation detection area.
  36.  The method of controlling an operation target device in a vehicle according to claim 35, wherein the operation target device is controlled in accordance with a pattern in which, as the specific input operation, two locations on the operation detection area that are separated from each other in the circumferential direction of a cross section taken by cutting the grip portion in the radial direction of the steering wheel are operated by two fingers.
  37.  The method of controlling an operation target device in a vehicle according to any one of claims 30 to 36, wherein
     the touch sensor includes a first touch sensor mounted on a predetermined range of a grip portion gripped by the driver's left hand and a second touch sensor mounted on a predetermined range of a grip portion gripped by the driver's right hand, and
     the operation target device is controlled in accordance with a pattern formed, as the specific input operation, by a combination of a first input operation on the first touch sensor and a second input operation on the second touch sensor.
  38.  A control apparatus for an operation target device in a vehicle, comprising:
     a sensor data generation unit that generates sensor data, including position data indicating which detection area is being touched, based on a contact detection signal obtained from a touch sensor that has a plurality of detection areas and is attached to a predetermined range of a gripping portion of a steering wheel gripped by a driver;
     a detection unit that detects an input operation on the touch sensor based on the sensor data; and
     a control unit that, when the detection unit detects that a predetermined input operation has been performed on the touch sensor with both the left and right hands of the driver, performs control to shift from a state of not accepting a first specific input operation for operating the operation target device via the touch sensor to a state of accepting it.
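As a minimal sketch (not part of the claims), the three units of claim 38 can be modelled as a pipeline: a contact detection signal becomes position data, the position data is classified as an input operation, and a two-handed predetermined operation unlocks acceptance of the first specific input operation. The data formats and classification thresholds below are assumptions.

```python
# Minimal sketch of the apparatus of claim 38. A contact detection signal is
# modelled as a list of booleans, one per detection area of the touch sensor.

def generate_sensor_data(contact_signal):
    """Sensor data generation unit: position data lists the indices of the
    detection areas currently being touched."""
    return {"touched_areas": [i for i, touched in enumerate(contact_signal) if touched]}

def detect_input_operation(sensor_data):
    """Detection unit: classify the touch pattern.
    Labels and the 3-area threshold are hypothetical."""
    n = len(sensor_data["touched_areas"])
    if n == 0:
        return None
    return "grip" if n >= 3 else "touch"

class ControlUnit:
    """Control unit: the first specific input operation is accepted only
    after the predetermined two-handed operation has been detected."""
    def __init__(self):
        self.accepting = False

    def on_operations(self, left_op, right_op):
        # Predetermined operation (assumed here): both hands gripping the wheel.
        if left_op == "grip" and right_op == "grip":
            self.accepting = True
        return self.accepting
```

Gating the device controls behind a deliberate two-handed operation reduces the chance that an ordinary steering grip is misread as a command.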
  39.  The control apparatus for an operation target device in a vehicle according to claim 38, wherein the control unit performs control to shift to the state of accepting the first specific input operation when it is detected that a second specific input operation, defined as the same input operation performed with the left and right hands, has been performed on the touch sensor.
  40.  The control apparatus for an operation target device in a vehicle according to claim 39, wherein the control unit performs control to shift to the state of accepting the first specific input operation when, as the second specific input operation, the input operations by the left and right hands are deemed to have been performed at the same timing.
  41.  The control apparatus for an operation target device in a vehicle according to claim 38, wherein the control unit performs control to shift to the state of accepting the first specific input operation when it is detected that a second specific input operation, consisting of a pattern formed by a combination of different input operations by the left and right hands, has been performed on the touch sensor.
  42.  The control apparatus for an operation target device in a vehicle according to claim 41, wherein the control unit performs control to shift to the state of accepting the first specific input operation when it is detected that, as the second specific input operation, an input operation by the left hand and an input operation by the right hand have been performed consecutively, in either order, within a predetermined time.
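For illustration only (not part of the claims), the timing conditions of claims 40 and 42 reduce to two timestamp comparisons: simultaneity within a short window, and consecutiveness in either order within a longer window. The threshold values below are hypothetical and are not specified in the description.

```python
# Illustrative sketch of the timing conditions of claims 40 and 42.
# Window lengths are assumed values, not taken from the description.

SIMULTANEITY_WINDOW_S = 0.2   # claim 40: operations "deemed the same timing"
SEQUENCE_WINDOW_S = 1.0       # claim 42: consecutive within a predetermined time

def same_timing(left_time_s, right_time_s, window_s=SIMULTANEITY_WINDOW_S):
    """Claim 40: the left- and right-hand operations are deemed simultaneous
    if their timestamps differ by less than the window."""
    return abs(left_time_s - right_time_s) < window_s

def consecutive_either_order(left_time_s, right_time_s, window_s=SEQUENCE_WINDOW_S):
    """Claim 42: the two operations may occur in either order, provided the
    second follows the first within the window."""
    return abs(left_time_s - right_time_s) <= window_s
```

Taking the absolute difference of the timestamps handles "either order" without branching on which hand acted first.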
  43.  A method for controlling an operation target device in a vehicle, comprising:
     detecting an input operation performed by a driver's hand on a touch sensor that has a plurality of detection areas and is attached to a predetermined range of a gripping portion of a steering wheel gripped by the driver; and
     when it is detected that a predetermined input operation has been performed on the touch sensor with both the left and right hands of the driver, shifting from a state of not accepting a first specific input operation for operating the operation target device via the touch sensor to a state of accepting it.
  44.  The method for controlling an operation target device in a vehicle according to claim 43, wherein the state of accepting the first specific input operation is entered when it is detected that a second specific input operation, defined as the same input operation performed with the left and right hands, has been performed on the touch sensor.
  45.  The method for controlling an operation target device in a vehicle according to claim 44, wherein the state of accepting the first specific input operation is entered when, as the second specific input operation, the same input operation is deemed to have been performed with the left and right hands at the same timing.
  46.  The method for controlling an operation target device in a vehicle according to claim 43, wherein the state of accepting the first specific input operation is entered when it is detected that a second specific input operation, consisting of a pattern formed by a combination of different input operations by the left and right hands, has been performed on the touch sensor.
  47.  The method for controlling an operation target device in a vehicle according to claim 46, wherein the state of accepting the first specific input operation is entered when it is detected that, as the second specific input operation, an input operation by the left hand and an input operation by the right hand have been performed consecutively, in either order, within a predetermined time.
PCT/JP2012/059712 2011-08-11 2012-04-09 Device and method for controlling device to be operated in vehicle, and steering wheel WO2013021685A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US14/176,626 US9267809B2 (en) 2011-08-11 2014-02-10 Control apparatus and method for controlling operation target device in vehicle, and steering wheel
US14/939,375 US9886117B2 (en) 2011-08-11 2015-11-12 Control apparatus and method for controlling operation target device in vehicle, and steering wheel

Applications Claiming Priority (22)

Application Number Priority Date Filing Date Title
JP2011176168 2011-08-11
JP2011-176168 2011-08-11
JP2011-200563 2011-09-14
JP2011200563 2011-09-14
JP2011-201354 2011-09-15
JP2011-201356 2011-09-15
JP2011201354 2011-09-15
JP2011201356 2011-09-15
JP2011206099 2011-09-21
JP2011206150 2011-09-21
JP2011-206096 2011-09-21
JP2011206096 2011-09-21
JP2011-206099 2011-09-21
JP2011-206150 2011-09-21
JP2011212025 2011-09-28
JP2011-212025 2011-09-28
JP2012-007894 2012-01-18
JP2012007894A JP5821647B2 (en) 2012-01-18 2012-01-18 Control device and control method for operation target device in vehicle
JP2012-043554 2012-02-29
JP2012043554A JP5825146B2 (en) 2011-09-14 2012-02-29 Control device and control method for operation target device in vehicle
JP2012073562A JP5765282B2 (en) 2012-03-28 2012-03-28 Control device and control method for operation target device in vehicle
JP2012-073562 2012-03-28

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/176,626 Continuation US9267809B2 (en) 2011-08-11 2014-02-10 Control apparatus and method for controlling operation target device in vehicle, and steering wheel

Publications (1)

Publication Number Publication Date
WO2013021685A1 true WO2013021685A1 (en) 2013-02-14

Family

ID=47668214

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/059712 WO2013021685A1 (en) 2011-08-11 2012-04-09 Device and method for controlling device to be operated in vehicle, and steering wheel

Country Status (1)

Country Link
WO (1) WO2013021685A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107851394A (en) * 2015-07-31 2018-03-27 松下知识产权经营株式会社 Drive assistance device, drive assist system, driving assistance method and automatic driving vehicle
CN115129221A (en) * 2021-03-26 2022-09-30 丰田自动车株式会社 Operation input device, operation input method, and computer-readable medium having operation input program recorded thereon

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS61161852U (en) * 1985-03-27 1986-10-07
JPH06156114A (en) * 1992-11-16 1994-06-03 Makoto Ueda Dozing driving prevention apparatus
JP2000228126A (en) * 1999-02-05 2000-08-15 Matsushita Electric Ind Co Ltd Steering input device
JP2007076491A (en) * 2005-09-14 2007-03-29 Hitachi Ltd Operation device for on-vehicle equipment


Similar Documents

Publication Publication Date Title
US9886117B2 (en) Control apparatus and method for controlling operation target device in vehicle, and steering wheel
JP5825146B2 (en) Control device and control method for operation target device in vehicle
JP5783126B2 (en) Control device and control method for operation target device in vehicle
EP2870528B1 (en) Light-based touch controls on a steering wheel and dashboard
US10203799B2 (en) Touch input device, vehicle comprising touch input device, and manufacturing method of touch input device
US8907778B2 (en) Multi-function display and operating system and method for controlling such a system having optimized graphical operating display
US20150077405A1 (en) Light-based controls on a toroidal steering wheel
US9346356B2 (en) Operation input device for vehicle
WO2013136776A1 (en) Gesture input operation processing device
JP5079582B2 (en) Touch sensor
US10967737B2 (en) Input device for vehicle and input method
JP2013075653A (en) Control device and control method of operation object device in vehicle
JP5821647B2 (en) Control device and control method for operation target device in vehicle
JP5776590B2 (en) Control device and control method for operation target device in vehicle
WO2013021685A1 (en) Device and method for controlling device to be operated in vehicle, and steering wheel
JP2013082423A (en) Device and method for controlling device to be operated in vehicle
JP2013079056A (en) Control device of device to be operated in vehicle, and method for specifying driver
JP5821696B2 (en) Control device and control method for operation target device in vehicle
JP5765282B2 (en) Control device and control method for operation target device in vehicle
JP6201864B2 (en) Vehicle control device
JP2010061256A (en) Display device
JP5790579B2 (en) Control device and control method for operation target device in vehicle
JP5768755B2 (en) Control device and control method for operation target device in vehicle
JP5783125B2 (en) Control device and control method for operation target device in vehicle
KR101876739B1 (en) In-vehicle command input system and method of controlling thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12821739

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12821739

Country of ref document: EP

Kind code of ref document: A1