WO2013021685A1 - Device and method for controlling a device to be operated in a vehicle, and steering wheel - Google Patents


Info

Publication number
WO2013021685A1
Authority
WO
WIPO (PCT)
Prior art keywords
touch sensor
input operation
target device
driver
operation target
Prior art date
Application number
PCT/JP2012/059712
Other languages
English (en)
Japanese (ja)
Inventor
学 唐沢
耕一 中島
良 近藤
亨 土井垣
俊介 福田
涼 井手上
Original Assignee
株式会社Jvcケンウッド
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2012007894A external-priority patent/JP5821647B2/ja
Priority claimed from JP2012043554A external-priority patent/JP5825146B2/ja
Priority claimed from JP2012073562A external-priority patent/JP5765282B2/ja
Application filed by 株式会社Jvcケンウッド filed Critical 株式会社Jvcケンウッド
Publication of WO2013021685A1 publication Critical patent/WO2013021685A1/fr
Priority to US14/176,626 priority Critical patent/US9267809B2/en
Priority to US14/939,375 priority patent/US9886117B2/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B62LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62DMOTOR VEHICLES; TRAILERS
    • B62D1/00Steering controls, i.e. means for initiating a change of direction of the vehicle
    • B62D1/02Steering controls, i.e. means for initiating a change of direction of the vehicle vehicle-mounted
    • B62D1/04Hand wheels
    • B62D1/046Adaptations on rotatable parts of the steering wheel for accommodation of switches

Definitions

  • The present invention treats as the operation target device an in-vehicle device such as a navigation device mounted on a vehicle, or a vehicle operation control device that controls vehicle components such as a transmission or a direction indicator, and controls that operation target device.
  • The present invention relates to a control device, a control method, and a steering wheel suitable for operating such an operation target device.
  • Vehicles in which operation switches for operating in-vehicle devices such as navigation devices are arranged on the steering wheel are widely used (see Patent Document 1). When the operation switches are on the steering wheel, the driver does not need to reach for the in-vehicle device to operate it, which improves operability. As described in Patent Document 1, however, the operation switches are usually arranged not on the annular portion of the steering wheel, which is the gripping portion held by the driver, but on the center portion that houses the airbag and the portions connecting it to the annular portion.
  • Patent Document 2 describes that an operation switch is arranged on the back surface or inner side surface of the annular portion.
  • Since the operation switch in Patent Document 2 is arranged on the annular portion, it can be operated without releasing the hand from the annular portion or greatly shifting the hand.
  • However, the operation switch described in Patent Document 2 is a push-button key or a key provided with unevenness, and this type of key may obstruct the driver's handling of the steering wheel. It is not preferable to provide large irregularities on the annular portion gripped by the driver.
  • Moreover, when an operation unit such as an operation switch is arranged on the annular portion, it is required to prevent the operation target device from being inadvertently operated in situations where the driver has no intention of operating it, such as when the driver is simply holding the annular portion for normal driving.
  • An object of the present invention is to provide a control device and control method for an operation target device in a vehicle, and a steering wheel, that allow the operation target device to be operated without releasing the hand from the gripping portion or greatly shifting the hand, and that greatly reduce the possibility of hindering the driver's handling of the steering wheel. It is another object of the present invention to provide a control device and control method for an operation target device in a vehicle, and a steering wheel, that greatly reduce erroneous operations.
  • According to one aspect, there is provided a control device for an operation target device in a vehicle, comprising: a sensor data generation unit (22) that generates sensor data including position data indicating which detection region is touched, based on a contact detection signal obtained from a touch sensor (21) having a plurality of detection regions (R) and mounted over a predetermined range of the gripping portion (200r, 201s) of the steering wheel (200, 201) gripped by the driver; a detection unit (10a) that, based on the sensor data, detects whether the driver is gripping the gripping portion and detects an input operation on the touch sensor; and a control unit (10) that, when it is detected that the driver is gripping the gripping portion and that a specific input operation has been performed on the touch sensor, controls the operation target device to be operated by the touch sensor in accordance with the specific input operation.
  • According to another aspect, there is provided a method for controlling an operation target device in a vehicle, comprising: detecting whether the driver is gripping a touch sensor (21) having a plurality of detection regions (R) and mounted over a predetermined range of the gripping portion (200r, 201s) of the steering wheel (200, 201) gripped by the driver; detecting whether a specific input operation has been performed on the touch sensor; and controlling the operation target device to be operated by the touch sensor when it is detected that the driver is gripping the touch sensor and that the specific input operation has been performed.
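The gating described in this method, in which a specific input operation is acted on only while the grip itself is also detected, can be sketched as follows. This is an illustrative sketch in Python, not the patent's implementation; the function name and operation strings are hypothetical.

```python
def control_decision(grip_detected, input_operation):
    """Return the command to pass to the operation target device, or None.

    Mirrors the claimed method: a specific input operation on the touch
    sensor is acted on only while the driver's grip on the sensor-covered
    portion is also detected, preventing inadvertent operation.
    """
    if grip_detected and input_operation is not None:
        return input_operation
    return None
```

For example, `control_decision(False, "drag_up")` yields no command, because the grip condition is not satisfied.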
  • According to another aspect, there is provided a steering wheel comprising: a gripping portion (200r) that is the portion gripped by the driver; a touch sensor (21) having a plurality of detection regions (R) and mounted so as to cover the gripping portion within a predetermined range thereof; a sensor data generation unit that generates sensor data including position data indicating which detection region is touched, based on a contact detection signal obtained from the touch sensor; a detection unit (24a) that, based on the sensor data, detects whether the driver is gripping the touch sensor portion of the gripping portion and detects an input operation on the touch sensor; and a control signal generation unit (24b) that, when the detection unit detects that the driver is gripping the touch sensor portion and that a specific input operation has been performed on the touch sensor, generates a control signal for controlling the operation target device to be operated by the touch sensor in accordance with the specific input operation.
  • According to another aspect, there is provided a control device for an operation target device in a vehicle, comprising: a first detection unit (10a) that detects that a first area (Arg) of the touch sensor (21) attached to the gripping portion (200r, 201s) gripped by the driver on the steering wheel (200, 201) is touched; a second detection unit (10a) that detects that a specific input operation has been performed on a second area (Arv) located above the first area of the touch sensor while the first area is touched; and a control unit (10) that, when the first detection unit detects that the first area is touched and the second detection unit detects that the specific input operation has been performed, controls the operation target device to be operated by the touch sensor in accordance with the specific input operation.
  • According to another aspect, it is detected that a first area (Arg) of the touch sensor (21) attached to the gripping portion (200r, 201s) gripped by the driver on the steering wheel (200, 201) is touched; it is detected that a specific input operation has been performed on a second area (Arv) located above the first area of the touch sensor while the first area is touched; and the operation target device to be operated by the touch sensor is controlled in accordance with the specific input operation.
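The two-area condition above can be sketched in Python using the circumferential Y coordinates introduced later in the description. The exact extents of the areas Arg and Arv below are hypothetical; the patent names the areas but this sketch chooses illustrative boundaries.

```python
# Hypothetical Y-coordinate bands for the two areas (Y runs along the
# grip circumference); the ranges are illustrative, not from the patent.
ARG_Y = range(0, 12)   # first area Arg: must be touched (e.g. by the palm)
ARV_Y = range(20, 32)  # second area Arv: where the input operation occurs

def gated_operation_detected(touched_y_coords, operation_y):
    """True only when at least one contact lies in the first area Arg
    while the input operation occurs in the second area Arv above it."""
    first_area_touched = any(y in ARG_Y for y in touched_y_coords)
    return first_area_touched and operation_y in ARV_Y
```

An operation at Y = 25 is accepted only if some other contact (such as the palm) lies within Arg at the same time.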
  • According to another aspect, there are provided: a sensor data generation unit (22) that generates sensor data including position data indicating which detection region is touched, based on a contact detection signal obtained from a touch sensor (21) mounted so as to cover the gripping portion; and a detection unit (10a) that, based on the sensor data, detects an input operation on the touch sensor and detects that a specific input operation has been performed on the touch sensor.
  • According to another aspect, it is detected whether a specific input operation has been performed on a touch sensor (21) having a plurality of detection regions (R) and attached so as to cover the gripping portion within a predetermined range of the gripping portions (200r, 201s) gripped by the driver on the steering wheel (200, 201) of the vehicle; it is detected whether the vehicle is in a specific state; and the operation target device to be operated by the touch sensor is controlled when it is detected that the vehicle is in the specific state and that the specific input operation has been performed. A method for controlling an operation target device is thus provided.
  • According to another aspect, there is provided a control device for an operation target device in a vehicle, comprising: a sensor data generation unit (22) that generates sensor data including position data indicating which detection region is touched, based on a contact detection signal obtained from a touch sensor (21) having a plurality of detection regions (R) and mounted over a predetermined range of the gripping portion (200r, 201s) gripped by the driver on the steering wheel (200, 201); and a control unit (10) that performs control based on the sensor data.
  • According to another aspect, there is provided a method for controlling an operation target device in a vehicle, in which, based on sensor data from a touch sensor (21) having a plurality of detection regions (R) and mounted over a predetermined range of the gripping portion (200r, 201s) gripped by the driver on the steering wheel (200, 201), a transition is made from a state in which a first specific input operation for operating the operation target device to be operated by the touch sensor is not accepted to a state in which it is accepted.
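The accept/reject transition named in this method can be sketched as a two-state gate. This is an illustrative Python sketch; the class name is hypothetical, and the trigger for the transition is abstracted as `unlock()` since the claim text here does not spell it out.

```python
class InputGate:
    """Two-state sketch of the claim: the first specific input operation
    is rejected until the gate transitions to the accepting state."""

    def __init__(self):
        self.accepting = False  # initial state: operation not accepted

    def unlock(self):
        # Transition to the state in which the first specific input
        # operation is accepted (trigger condition abstracted away).
        self.accepting = True

    def lock(self):
        self.accepting = False

    def accept_first_operation(self, operation):
        """Return True when the given operation would be accepted."""
        return self.accepting

def demo():
    gate = InputGate()
    before = gate.accept_first_operation("drag_up")  # rejected
    gate.unlock()
    after = gate.accept_first_operation("drag_up")   # accepted
    return before, after
```

The same operation is rejected before the transition and accepted after it.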
  • the operation target device can be operated without releasing the hand from the gripping part or greatly shifting the hand.
  • the possibility of hindering the operation of the steering wheel can be greatly reduced.
  • erroneous operations can be greatly reduced.
  • FIG. 1 is a block diagram illustrating each embodiment of a control device for an operation target device in a vehicle.
  • FIG. 2 is a partial plan view illustrating an example of a vehicle including a control device for an operation target device according to each embodiment.
  • FIG. 3 is a diagram illustrating an example of a position and a range where the touch sensor according to each embodiment is mounted on the steering wheel.
  • FIG. 4 is a diagram illustrating another example of a position and a range in which the touch sensor according to each embodiment is mounted on the steering wheel.
  • FIG. 5 is a diagram illustrating an example in which the touch sensor is mounted on the modified steering wheel.
  • FIG. 6 is a partial perspective view showing an example of a portion where sensor data can be obtained in the state where the touch sensor portion of the steering wheel is gripped.
  • FIG. 7 is a cross-sectional view showing coordinates in the circumferential direction of the cross section in the touch sensor.
  • FIG. 8 is a plan view showing a state in which the touch sensor shown in FIG. 6 is developed.
  • FIG. 9 is a schematic diagram illustrating a state in which each area illustrated in FIG. 8 is converted into an equal size.
  • FIG. 10 is a diagram illustrating an example of requirements for determining that the touch sensor portion of the steering wheel is being gripped.
  • FIG. 11 is a schematic diagram illustrating an example of a specific input operation with respect to the touch sensor.
  • FIG. 12 is a schematic diagram illustrating another example of the specific input operation with respect to the touch sensor.
  • FIG. 13 is a schematic diagram illustrating still another example of the specific input operation with respect to the touch sensor.
  • FIG. 14 is a flowchart for explaining the operation of each embodiment.
  • FIG. 15 is a schematic perspective view illustrating a configuration example for changing the color when the touch sensor is operated.
  • FIG. 16 is a schematic perspective view showing a configuration example for changing the sense of touch when the touch sensor is operated.
  • FIG. 17 is a plan view showing an embodiment of a steering wheel.
  • FIG. 18 is a diagram for explaining the rotation angle of the steering wheel.
  • FIG. 19 is a flowchart showing an example of specific processing in step S4 of FIG.
  • FIG. 20 is a schematic diagram illustrating an example of a state where the driver is holding the touch sensor portion for normal driving.
  • FIG. 21 is a schematic diagram illustrating an example of a state where the driver is holding the touch sensor and is about to operate the operation target device.
  • FIG. 22 is a schematic diagram showing a state in which the operation invalid area Ariv in FIG. 11 is omitted.
  • FIG. 23 is a plan view for explaining another configuration example that shows a state where the touch sensor shown in FIG. 6 is developed and that distinguishes whether or not the driver is going to operate the operation target device.
  • FIG. 24 is a diagram illustrating an example of an input operation when the same input operation is performed with the left and right hands at the same timing with respect to the left and right touch sensors.
  • FIG. 25 is a diagram illustrating an example when the input operations are regarded as having the same timing.
  • FIG. 26 is a diagram illustrating an example of an input operation when a predetermined input operation is continuously performed with the left and right hands with respect to the left and right touch sensors.
  • FIG. 27 is a diagram illustrating an example in the case of being regarded as a continuous input operation.
  • FIG. 28 is a diagram illustrating a first example in which an operation mode is set by a combination of input operations by left and right hands with respect to left and right touch sensors.
  • FIG. 29 is a diagram illustrating a second example in which the operation mode is set by a combination of input operations by the left and right hands with respect to the left and right touch sensors.
  • FIG. 30 is a diagram illustrating an example in which each area of the touch sensor is color-coded.
  • FIG. 31 is a diagram illustrating an example in which a marker is attached to the boundary of the area of the touch sensor.
  • FIG. 32 is a diagram illustrating an example in which the diameter in the operation detection area of the touch sensor is reduced.
  • FIG. 33 is a diagram illustrating an example in which the diameter in the operation detection area of the touch sensor is increased.
  • FIG. 34 is a diagram illustrating an example in which a recess is provided at the boundary of the area of the touch sensor.
  • FIG. 35 is a diagram illustrating an example in which a convex portion is provided at the boundary of the area of the touch sensor.
  • FIG. 36 is a diagram illustrating an example in which the color of the operation detection area is changed when it is detected that the grip detection area of the touch sensor is gripped.
  • FIG. 37 is a diagram illustrating an example in which the tactile sensation of the operation detection area is changed when it is detected that the grip detection area of the touch sensor is gripped.
  • FIG. 38 is a diagram illustrating an example of a locus when a finger is slid in the left-right direction.
  • FIG. 39 is a diagram for explaining the correction of the locus when the finger is slid in the right direction.
  • FIG. 40 is a diagram for explaining the correction of the trajectory when the finger is slid downward.
  • FIG. 41 is a diagram for explaining an example of realizing diagonal dragging.
  • FIG. 42 is a partial perspective view for explaining the definition of dragging in the horizontal direction and the vertical direction in the eighth embodiment.
  • FIG. 43 is a plan view for explaining the definition of dragging in the horizontal direction and the vertical direction in the eighth embodiment with the touch sensor deployed.
  • FIG. 44 is a plan view showing a configuration example in which a modified steering wheel is developed.
  • FIG. 45 is a partially enlarged plan view of FIG. 44. FIG. 46 is a cross-sectional view taken along the line A-A in FIG. 45.
  • FIG. 47 is a cross-sectional view taken along the line B-B of FIG. 45 for explaining the on/off switching operation by the on/off switching mechanism.
  • FIG. 48 is a flowchart for explaining the operation of the eighth embodiment when the modified steering wheel of FIG. 44 is used.
  • FIG. 49 is a schematic diagram illustrating an example of gripping state identification data indicating how the driver is gripping the steering wheel of the part to which the touch sensor is attached.
  • FIG. 50 is a schematic diagram showing a modification of FIG. 49 in order to facilitate understanding of FIG. 49.
  • FIG. 51 is a schematic diagram illustrating another example of gripping state identification data indicating how the driver is gripping the steering wheel of the part to which the touch sensor is attached.
  • FIG. 52 is a diagram illustrating an example of driver specifying data registered in the driver database.
  • FIG. 53 is a flowchart for explaining an operation when a driver is specified.
  • FIG. 54 is a partial perspective view illustrating still another example of gripping state identification data indicating how the driver is gripping the steering wheel of the portion where the touch sensor is mounted.
  • the in-vehicle device 100 is mounted in a dashboard of a vehicle.
  • the in-vehicle device 100 includes a control unit 10, a navigation processing unit 11, an audio playback unit 12, a television (TV) tuner 13, a video signal processing unit 14, a video display unit 15, and an audio signal processing unit 16.
  • the control unit 10 includes a detection unit 10a.
  • the navigation processing unit 11 has a storage unit that holds map data, a GPS antenna, and the like, and the control unit 10 and the navigation processing unit 11 cooperate to provide route guidance.
  • the audio reproducing unit 12 reproduces an audio signal recorded on an optical disc such as a compact disc or a semiconductor memory according to control by the control unit 10.
  • the TV tuner 13 receives a TV broadcast wave signal of a predetermined broadcast station under the control of the control unit 10.
  • the video signal output from the navigation processing unit 11 or the TV tuner 13 is input to the video signal processing unit 14 via the control unit 10 and processed, and displayed on the video display unit 15 such as a liquid crystal panel.
  • the audio signal output from the navigation processing unit 11, the audio reproduction unit 12, and the TV tuner 13 is input to the audio signal processing unit 16 through the control unit 10, processed, and output from the external speaker 20.
  • the audio signal processing unit 16 includes an amplification unit.
  • the speaker 20 is installed inside the door of the vehicle.
  • the display element 17 is, for example, a light emitting diode (LED), and is turned on or off according to the contact state of the touch sensor 21 described later according to control by the control unit 10.
  • the display element 17 is disposed, for example, in a housing of the in-vehicle device 100 so that the driver can visually recognize the display element 17.
  • the display element 17 may be arranged away from the in-vehicle device 100 and in the vicinity of the steering wheel 200 of the vehicle.
  • the storage unit 18 is a nonvolatile memory.
  • the touch sensor 21 serving as the operation unit is attached to the annular portion 200r of the steering wheel 200.
  • the annular portion 200r is a gripping portion that is a portion that the driver grips during driving.
  • the touch sensor 21 is mounted in a predetermined angular range on each of the left and right sides of the annular portion 200r.
  • the touch sensor 21 is a so-called multi-point detection (multi-touch) touch sensor that can detect contact at a plurality of locations.
  • the touch sensor 21 is preferably mounted over the full 360° circumference of the radial cross section of the annular portion 200r. A range of less than 360° is acceptable as long as it covers substantially the entire circumference of the cross section of the annular portion 200r.
  • the driver is holding the portion of the annular portion 200r where the touch sensor 21 is attached.
  • the output of the touch sensor 21 is input to the sensor data generation unit 22.
  • a contact detection signal is input to the sensor data generation unit 22.
  • the sensor data generation unit 22 generates sensor data including position data indicating from which position of the touch sensor 21 the contact detection signal is obtained based on the input contact detection signal, and supplies the sensor data to the control unit 10.
  • the touch sensor 21 and the sensor data generation unit 22 may be integrated, or the sensor data generation unit 22 may be provided in the control unit 10.
  • a projected capacitive (mutual capacitance) type touch sensor can be used.
  • a flexible touch panel developed by the Micro Technology Research Institute can be employed. This flexible touch panel has a structure in which the sensor portion is made of ultra-thin glass 0.02 to 0.05 mm thick, bonded to a PET (polyethylene terephthalate) film. Even when attached to the annular portion 200r, the touch sensor 21 presents no irregularities perceptible to the hand or fingers, so there is almost no possibility of it hindering the driver's handling of the steering wheel 200.
  • the steering angle sensor 31 detects the rotation angle of the steering wheel 200.
  • the direction indicator sensor 32 detects an operation of the direction indicator 320.
  • the shift lever sensor 33 detects where the shift position by the shift lever 330 is.
  • the detection signals of the steering angle sensor 31, the direction indicator sensor 32, and the shift lever sensor 33 are supplied to the control unit 10 via the in-vehicle communication unit 34.
  • FIG. 3A shows an example in which the touch sensor 21 is attached to the entire circumference of the annular portion 200r.
  • FIG. 3B is the same as FIG. 2, and is an example in which the touch sensors 21 are mounted apart from each other in predetermined angular ranges on the left and right above the annular portion 200r.
  • FIG. 3C shows an example in which the touch sensor 21 is mounted in a predetermined angle range only on the right side above the annular portion 200r.
  • FIG. 3D shows an example in which the touch sensors 21 are mounted apart from each other within a predetermined angular range on the left and right sides below the annular portion 200r.
  • FIG. 3E shows an example in which the touch sensor 21 is mounted in a relatively wide angle range above including the top of the annular portion 200r.
  • FIG. 3E corresponds to a combination of the left and right touch sensors 21 in FIG.
  • FIG. 4 is an example in which the left and right touch sensors 21 in FIG. 3B are divided into an upper touch sensor 21a and a lower touch sensor 21b.
  • the upper touch sensor 21a mainly detects contact by the index finger and thumb, and
  • the lower touch sensor 21b mainly detects contact by the palm, middle finger, and ring finger.
  • FIG. 5 shows an example in which the touch sensor 21 is mounted on a deformed steering wheel 201 that is not circular.
  • the touch sensor 21 is attached to the left and right straight portions 201 s of the modified steering wheel 201.
  • the driver operates by grasping the straight portion 201s that is the gripping portion, and the touch sensor 21 detects a contact with a palm or a finger.
  • FIG. 6 shows an example of a range in which the palm and the finger are in contact when the driver grips the right touch sensor 21 in FIG.
  • the manner in which the driver grips the annular portion 200r with his / her hand and the size of the hand are not uniform, and FIG. 6 is merely an example.
  • the plurality of detection regions R indicated by hatching Tp are the portions where palm contact is detected, and the plurality of detection regions R indicated by hatching Tt are the portions where thumb contact is detected.
  • these are referred to as the palm contact detection unit Tp and the thumb contact detection unit Tt, respectively.
  • The index finger contacts the back side of the touch sensor 21, i.e., the side facing the traveling direction of the vehicle, which is not visible in the figure.
  • the touch sensor 21 has a plurality of detection regions R as detection portions for detecting the contact of a palm or a finger. Coordinates are set for each detection region R of the touch sensor 21. As shown in FIG. 6, in the circumferential direction of the annular portion 200r, the detection region R located at the lower end of the touch sensor 21 is assigned coordinate 0, and circumferential coordinates 1, 2, ..., 30, 31 are assigned up to the detection region R located at the upper end. The coordinate in the circumferential direction of the annular portion 200r on the touch sensor 21 is defined as the Y coordinate.
  • FIG. 7 is a cross-sectional view of the annular portion 200r cut in the radial direction of the annular portion 200r at the portion where the touch sensor 21 is mounted.
  • a detection region R located on the inner diameter side of the annular portion 200r is assigned coordinate 0, and coordinates 1, ..., 21, 22 are assigned around the circumference of the cross section.
  • the coordinate in the circumferential direction of the cross section on the touch sensor 21 is defined as the X coordinate.
  • the sensor data generation unit 22 can obtain position data indicating where the driver is touching the touch sensor 21 based on the X coordinate and Y coordinate of the detection region R from which the contact detection signal is obtained.
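The position data described here can be sketched as a mapping from raw detection-region indices to (X, Y) pairs. This is an illustrative Python sketch; the row-major index layout and the grid sizes (32 regions along the ring, 23 around the cross section, matching the coordinate ranges above) are assumptions, not taken from the patent.

```python
# Hypothetical grid sizes: Y = 0..31 along the ring circumference,
# X = 0..22 around the cross-section circumference (23 columns).
NUM_X = 23

def to_position_data(active_region_indices):
    """Convert raw detection-region indices (assumed row-major, so
    index = Y * NUM_X + X) into (X, Y) position data, roughly what the
    sensor data generation unit 22 is described as producing."""
    positions = []
    for idx in active_region_indices:
        y, x = divmod(idx, NUM_X)
        positions.append((x, y))
    return positions
```

With this layout, index 23 is the first region of the second circumferential row, i.e. position (0, 1).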
  • FIG. 9 schematically shows a state in which each area of the touch sensor 21 shown in FIG. 8 is converted into an equal size.
  • an index finger contact detection unit Ti which is a plurality of detection regions R with which the index finger is in contact, is also shown.
  • the touch sensor 21 also detects contact of those fingers.
  • the driver uses a thumb or index finger suitable for performing a specific input operation on the touch sensor 21 as a finger for the operation.
  • the detection unit 10 a of the control unit 10 detects that an input operation has been performed on the touch sensor 21 with the thumb or index finger based on the sensor data output from the sensor data generation unit 22. Based on the sensor data output from the sensor data generation unit 22, the detection unit 10a also detects that the annular portion 200r (touch sensor 21) is being gripped.
  • the control unit 10 controls the operation target device according to a specific input operation performed on the touch sensor 21.
  • the operation target device is the in-vehicle device 100 as an example.
  • the control unit 10 executes control related to route guidance in the navigation processing unit 11 according to a specific input operation, plays or stops an audio signal in the audio playback unit 12, and advances or returns tracks (music pieces). Further, the control unit 10 can switch the reception channel in the TV tuner 13, or control the amplification unit of the audio signal processing unit 16 to decrease or increase the volume, according to a specific input operation.
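A dispatch from specific input operations to device commands, as described above, can be sketched with a simple lookup table. This is an illustrative Python sketch; the gesture names and command strings are hypothetical, not taken from the patent's figures.

```python
# Hypothetical mapping from specific input operations on the touch
# sensor to commands for the in-vehicle device 100.
COMMANDS = {
    "drag_up": "volume_up",
    "drag_down": "volume_down",
    "drag_right": "next_track",
    "drag_left": "previous_track",
    "double_tap": "play_pause",
}

def dispatch(specific_input_operation):
    """Return the command the control unit 10 would issue to the
    in-vehicle device, or None for an unrecognized input operation."""
    return COMMANDS.get(specific_input_operation)
```

Unrecognized gestures simply produce no command, so stray contact cannot trigger the device.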
  • the operation target device is a vehicle operation control device that controls the operation of the vehicle.
  • the control unit 10 may control, via the in-vehicle communication unit 34, a transmission, a direction indicator, the on/off state of an air conditioner, the set temperature of the air conditioner, and the like.
  • the operation target device is a vehicle motion control device
  • the control unit that controls the operation target device may be the control unit 10 in the in-vehicle device 100 or may be a control unit outside the in-vehicle device 100 provided in the vehicle.
  • the extremely thin touch sensor 21 is attached to the annular portion 200r gripped by the driver, and the operation target device is operated through the touch sensor 21, so the operation target device can be operated without releasing the hand from the annular portion 200r or greatly shifting the hand. Further, since the touch sensor 21 creates no irregularities on the surface of the annular portion 200r, there is almost no possibility of it hindering the driver's handling of the steering wheel 200.
  • it is necessary to prevent the operation target device from being inadvertently operated when the driver has no intention of operating it, such as when the driver holds the annular portion 200r for normal driving. In this embodiment, therefore, the following measures are taken to avoid erroneous operations not intended by the driver.
  • the plurality of detection regions R on the touch sensor 21 are divided into a grip detection area Arg for detecting palm contact, an operation detection area Arv for accepting operation input by the thumb or index finger, and an operation invalid area Ariv, an intermediate area between the grip detection area Arg and the operation detection area Arv in which operation input is invalidated.
  • the palm contact detection unit Tp is located in the grip detection area Arg
  • the thumb contact detection unit Tt and the index finger contact detection unit Ti are located in the operation detection area Arv.
• The operation invalid area Ariv has detection regions R that detect palm or finger contact, but the control unit 10 (detection unit 10a) or the sensor data generation unit 22 performs processing so as to invalidate any input operation from the operation invalid area Ariv, thereby realizing the operation invalid area. Alternatively, the touch sensor 21 may be configured without detection regions R in the range of the operation invalid area Ariv. This case is substantially equivalent to the example shown in FIG.
• When the driver simply holds the annular portion 200r, the palm contact detection unit Tp, the thumb contact detection unit Tt, and the index finger contact detection unit Ti are relatively close to each other. Therefore, in this embodiment, the operation invalid area Ariv is provided in order to accurately distinguish between the case where the driver simply holds the annular portion 200r and the case where the driver touches the touch sensor 21 in order to operate the operation target device.
• When the driver wants to operate the operation target device, the driver intentionally extends the thumb or index finger and touches the touch sensor 21 to perform a specific input operation described later.
  • the control unit 10 controls the operation target device according to the input operation when a specific input operation described later is performed in the operation detection area Arv.
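• The routing of touches among the three areas can be illustrated with the following sketch (illustrative only; the Y-coordinate boundaries are hypothetical values, as the embodiment leaves the concrete ranges to the implementation):

```python
# Illustrative sketch (not from the patent text): routing touch events by the
# three areas set on the developed touch sensor. The boundary Y values below
# are hypothetical.

GRIP_Y_MAX = 3        # Y <= 3 -> grip detection area Arg (palm contact)
INVALID_Y_MAX = 5     # 3 < Y <= 5 -> operation invalid area Ariv

def classify_area(y):
    """Map a touch Y coordinate to one of the three areas."""
    if y <= GRIP_Y_MAX:
        return "Arg"
    if y <= INVALID_Y_MAX:
        return "Ariv"   # contact is detected here but the input is invalidated
    return "Arv"        # operation detection area

def accept_input(y):
    """Only touches in the operation detection area Arv count as input."""
    return classify_area(y) == "Arv"
```

A touch at a grid Y of 4 would thus be detected but discarded, matching the role of the operation invalid area Ariv.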
  • the detection unit 10a determines that the annular portion 200r is gripped when the palm contact detection unit Tp having a predetermined area or more is obtained in the grip detection area Arg.
• The control unit 10 controls the operation target device when the annular portion 200r is determined to be gripped and a specific input operation is performed in the operation detection area Arv.
• The area of the palm contact detection unit Tp used to determine that the annular portion 200r is gripped may be set appropriately by statistically examining the contact areas obtained when a plurality of drivers hold the steering wheel 200 in a normal manner.
  • the area of the palm contact detection portion Tp in the grip detection area Arg is an example of a requirement for determining that the driver is holding the annular portion 200r, and is not limited to this requirement.
  • FIG. 10 shows a cross section in which the annular portion 200r is cut at the grip detection area Arg of the touch sensor 21.
• The detection unit 10a can determine that the annular portion 200r is gripped when the angle θ in the circumferential direction of the palm contact detection unit Tp is equal to or greater than a predetermined angle.
  • the predetermined angle is, for example, 180 °.
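• Combining the two grip requirements described above, the contact area in the grip detection area Arg and the circumferential angle θ, can be sketched as follows; the numeric thresholds are hypothetical stand-ins for the statistically determined values:

```python
def is_gripped(palm_area_cells, palm_arc_deg,
               min_area_cells=12, min_arc_deg=180.0):
    """Illustrative grip determination (thresholds hypothetical).

    Grip is recognized only when the palm contact covers enough detection
    regions R AND wraps far enough around the ring cross-section, which
    rejects a careless brush of the sensor without a real grip.
    """
    return palm_area_cells >= min_area_cells and palm_arc_deg >= min_arc_deg
```

Using both requirements together corresponds to the more accurate determination described in the text.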
• Since the operation detection area Arv is provided at a position separated from the grip detection area Arg by a predetermined distance, it is possible to accurately detect that the driver is intentionally performing a specific input operation on the touch sensor 21. Therefore, erroneous operations can be greatly reduced.
• By using both the area of the palm contact detection unit Tp in the grip detection area Arg and the angle θ in the circumferential direction of the cross section as requirements for determining whether or not the driver is holding the annular portion 200r, the grip on the annular portion 200r can be determined accurately. Accordingly, it is possible to avoid an erroneous operation in the case where the driver carelessly touches the operation detection area Arv without holding the annular portion 200r.
• When the control unit 10 detects, based on the sensor data derived from the contact detection signal from the grip detection area Arg, that the driver is holding the annular portion 200r (touch sensor 21), the control unit 10 turns on the display element 17 to notify the driver that an operation input in the operation detection area Arv is possible.
• The driver can determine whether or not the operation target device can be operated by the touch sensor 21 from whether the display element 17 is on or off. It is preferable to arrange the display element 17 in the vicinity of the steering wheel 200.
  • the control unit 10 sets an area including the palm contact detection unit Tp as the grip detection area Arg.
  • a predetermined range of the Y coordinate including the palm contact detection unit Tp may be set as the grip detection area Arg.
• When determining whether the palm contact detection unit Tp has a predetermined area or more, the portion of the plurality of detection regions R on the touch sensor 21 that is detected as touched over a predetermined area or more becomes the palm contact detection unit Tp.
• Alternatively, the portion that is detected as touched over a predetermined angle or more in the circumferential direction of the cross section obtained by cutting the annular portion 200r at the touch sensor 21 becomes the palm contact detection unit Tp.
• After setting the grip detection area Arg, the control unit 10 sets a predetermined range of the Y coordinate above the grip detection area Arg as the operation detection area Arv. In this case, if necessary, a predetermined range of the Y coordinate adjacent to the grip detection area Arg is set as the operation invalid area Ariv, and the operation detection area Arv is set at a position separated from the grip detection area Arg.
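• Deriving the three areas dynamically from the detected palm position might look like the following sketch (the margin and height values are hypothetical grid counts, not values from the embodiment):

```python
def set_areas(palm_y_min, palm_y_max, invalid_margin=2, arv_height=4):
    """Illustrative dynamic area setting from the palm's Y extent.

    Returns inclusive (y_min, y_max) ranges: the grip detection area Arg
    covers the palm itself, the operation invalid area Ariv sits just
    above it, and the operation detection area Arv is above that.
    """
    arg = (palm_y_min, palm_y_max)
    ariv = (palm_y_max + 1, palm_y_max + invalid_margin)
    arv = (ariv[1] + 1, ariv[1] + arv_height)
    return {"Arg": arg, "Ariv": ariv, "Arv": arv}
```

With a palm detected at Y coordinates 0 to 3, this sketch would place the invalid band at 4 to 5 and the operation detection area at 6 to 9, keeping Arv separated from Arg as described above.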
  • FIGS. 11A to 11E schematically show a half that is the front side or the back side of the touch sensor 21 facing the driver.
  • the operations shown in FIGS. 11A to 11E are performed with the thumb on the front side and with the index finger on the back side.
• D R is a rightward drag that slides the thumb or index finger rightward on the touch sensor 21 (operation detection area Arv), and D L is a leftward drag that slides the thumb or index finger leftward.
• D U is an upward drag that slides the thumb or index finger upward, and D D is a downward drag that slides the thumb or index finger downward.
  • FIG. 11B shows a tap T that taps the touch sensor 21 with the thumb or index finger.
• FIG. 11 (c) shows an arc drag D C that draws an arc on the touch sensor 21 with the thumb or index finger.
• FIG. 11 (d) shows a zigzag drag D Z that drags in a zigzag pattern on the touch sensor 21 with the thumb or index finger.
• FIG. 11 (e) shows a symbol input drag D S that writes a symbol with the thumb or index finger.
  • FIG. 11E shows a state in which the numeral 3 is drawn as a symbol. As a symbol, it is preferable to use numbers and alphabets that are relatively easy to recognize.
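• A minimal classifier for the tap and the four straight drags can be sketched as follows, assuming a finger trace given as grid coordinates with Y increasing upward (an assumption; arc, zigzag, and symbol recognition would need fuller trajectory analysis):

```python
def classify_stroke(points, drag_threshold=3):
    """Illustrative classification of a finger trace into tap / D_R / D_L /
    D_U / D_D from its net displacement (threshold in grid cells,
    hypothetical). `points` is a sequence of (x, y) samples."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) < drag_threshold and abs(dy) < drag_threshold:
        return "tap"                      # barely moved: a tap T
    if abs(dx) >= abs(dy):
        return "D_R" if dx > 0 else "D_L" # dominant horizontal motion
    return "D_U" if dy > 0 else "D_D"     # dominant vertical motion
```

A real implementation would also debounce short contacts and track the full trajectory for the arc and symbol drags.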
  • FIGS. 12A to 12D schematically show a front part 21f that is a half on the front side of the touch sensor 21 and a back part 21r that is a half on the back side, with the touch sensor 21 opened.
  • the front portion 21f is a portion of 0 to 11 of the X coordinate shown in FIGS. 8 and 9, and the back portion 21r is a portion of 12 to 22 of the X coordinate.
• Strictly speaking, the front portion 21f and the back portion 21r do not have the same area, but in FIGS. 12 (a) to 12 (d) they are shown as having the same area. In addition, for easy understanding, FIGS. 12 (a) to 12 (d) show the back portion 21r not as viewed from the back side of the annular portion 200r but as seen through the front portion 21f.
  • a specific input operation for the touch sensor 21 may be a combination of an input operation with the thumb for the front portion 21f and an input operation with the index finger for the back portion 21r.
• FIG. 12 (a) is an example in which a rightward drag D TR sliding the thumb rightward on the front portion 21f is combined with a rightward drag D IR sliding the index finger rightward on the back portion 21r.
• FIG. 12 (b) is an example in which a leftward drag D TL sliding the thumb leftward on the front portion 21f is combined with a rightward drag D IR sliding the index finger rightward on the back portion 21r.
  • FIG. 12B is realized by dragging the thumb from the outer peripheral side of the annular part 200r to the inner peripheral side and dragging the index finger from the inner peripheral side of the annular part 200r to the outer peripheral side.
• FIG. 12 (c) is an example in which a rightward drag D TR sliding the thumb rightward on the front portion 21f is combined with a leftward drag D IL sliding the index finger leftward on the back portion 21r.
  • FIG. 12C is realized by dragging the thumb from the inner peripheral side to the outer peripheral side of the annular portion 200r and dragging the index finger from the outer peripheral side to the inner peripheral side of the annular portion 200r.
• FIG. 12 (d) is an example in which an upward drag D TU sliding the thumb upward on the front portion 21f is combined with a downward drag D ID sliding the index finger downward on the back portion 21r.
  • a pattern in which the thumb is dragged downward and the index finger is dragged upward may be used, or a pattern in which both the thumb and index finger are dragged upward or downward may be used.
• As described above, various patterns combining an input operation with the thumb on the front portion 21f and an input operation with the index finger on the back portion 21r can be used.
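• A lookup for such thumb/index combinations might be sketched as below; the gesture names follow the figures, but the assigned commands are purely hypothetical examples and not taken from the embodiment:

```python
# Hypothetical mapping of (thumb gesture, index gesture) pairs to commands.
COMBO_TABLE = {
    ("D_TR", "D_IR"): "volume_up",    # FIG. 12(a)-style pattern
    ("D_TL", "D_IR"): "next_track",   # FIG. 12(b)-style pattern
    ("D_TR", "D_IL"): "prev_track",   # FIG. 12(c)-style pattern
    ("D_TU", "D_ID"): "mute",         # FIG. 12(d)-style pattern
}

def decode_combo(thumb_gesture, index_gesture):
    """Return the command for a recognized pair, or None if unmatched."""
    return COMBO_TABLE.get((thumb_gesture, index_gesture))
```

Requiring both fingers to move in a recognized pair is itself a guard against accidental input, as noted later in the text.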
• FIGS. 13A to 13D show example patterns that combine operations by the left and right hands, where the left touch sensor 21 in FIG. 3B is referred to as the left touch sensor 21L and the right touch sensor 21 as the right touch sensor 21R.
  • a surface corresponding to the front portion 21f of FIG. 12 operated by the thumb is shown by a schematic plane.
• FIG. 13 (a) shows a pattern combining a leftward drag D TL that slides the thumb leftward on the left touch sensor 21L with a rightward drag D TR that slides the thumb rightward on the right touch sensor 21R.
• FIG. 13 (b) shows a pattern combining a rightward drag D TR that slides the thumb rightward on the left touch sensor 21L with a leftward drag D TL that slides the thumb leftward on the right touch sensor 21R.
  • FIG. 13C shows a pattern in which an upward drag DTU that slides the thumb upward is combined with both the left touch sensor 21L and the right touch sensor 21R.
• If a pattern combining input operations by the left and right hands is used as the specific input operation for controlling the operation target device, the driver must hold the annular portion 200r with both hands, which contributes to safe driving.
  • the example of FIG. 3B is most preferable in that it contributes to safe driving because the touch sensor 21 is mounted at the most appropriate position where the annular portion 200r is grasped with both hands.
• An input operation may be accepted only while the left and right touch sensors 21 are both held, and input operations may stop being accepted when one hand leaves the touch sensor 21. Alternatively, acceptance of input operations may be continued even after one hand moves away from the touch sensor 21. Even when a specific input operation using only one hand is employed, accepting input operations only while the left and right touch sensors 21 are held with both hands contributes to safe driving.
• It is considered relatively unlikely that a specific pattern combining an input operation with the thumb and an input operation with the index finger, or a specific pattern combining input operations by the left and right hands, will occur unintentionally. Therefore, when only such patterns are used, some or all of the measures for avoiding the erroneous operations described above may be omitted. Of course, even when only a specific pattern combining input operations by the left and right hands is used, it is preferable to employ the measures for avoiding erroneous operations described above.
  • the storage unit 18 stores a table in which the above-described specific input operation or a combination of specific input operations is associated with the type of control for the operation target device.
  • the control unit 10 controls the operation target device according to the operation input to the touch sensor 21 according to the table stored in the storage unit 18.
  • the storage unit 18 may be provided in the control unit 10.
  • the control unit 10 acquires sensor data output from the sensor data generation unit 22 in step S1.
• In step S2, the control unit 10 determines whether or not the annular portion 200r is gripped based on the detection output from the detection unit 10a. If it is determined that the annular portion 200r is gripped (YES), the control unit 10 proceeds to step S3; if it is not determined that the annular portion 200r is gripped (NO), the control unit 10 returns the process to step S1.
  • step S3 the control unit 10 determines whether or not an input operation has been performed based on the detection output from the detection unit 10a. If it is determined that there is an input operation (YES), the control unit 10 moves the process to step S4. If it is not determined that there is an operation input (NO), the control unit 10 returns the process to step S1. In step S4, the control unit 10 determines whether or not to permit an operation on the operation target device by the input operation in step S3. If it is determined that the operation is permitted (YES), the control unit 10 moves the process to step S5. If it is not determined that the operation is permitted (NO), the control unit 10 returns the process to step S1.
  • control unit 10 permits an operation on the operation target device when a specific input operation is performed in the operation detection area Arv, and operates even if a specific input operation is performed in the operation invalid area Ariv. Do not allow operations on the target device. In addition, even if any input operation is performed in the operation detection area Arv, the control unit 10 does not permit the operation on the operation target device if it is not the above-described specific input operation, and only when the specific input operation is performed. Allows operations on the operation target device.
  • step S5 the control unit 10 determines an operation based on the input operation.
• In step S6, the control unit 10 executes control according to the determined operation on the operation target device, and returns the process to step S1.
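• The S1 to S6 flow can be summarized as the following sketch, with the unit behaviors passed in as callables (the names are illustrative, not from the embodiment):

```python
def control_cycle(get_sensor_data, is_gripped, detect_input,
                  permit, decide, execute):
    """One pass of the S1-S6 flow of FIG. 14 (illustrative sketch)."""
    data = get_sensor_data()          # S1: acquire sensor data
    if not is_gripped(data):          # S2: grip determination
        return "wait"
    op = detect_input(data)           # S3: was an input operation made?
    if op is None:
        return "wait"
    if not permit(op):                # S4: permit the operation?
        return "wait"
    execute(decide(op))               # S5: decide operation, S6: execute
    return "executed"
```

Returning "wait" corresponds to the NO branches that send the process back to step S1.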
  • the operation according to this embodiment is summarized as follows.
• The detection unit 10a (first detection unit) detects that the first area of the touch sensor 21, attached to the gripping portion (the annular portion 200r or the straight portion 201s) gripped by the driver of the steering wheel 200 or 201, is in a touched state.
  • An example of the first area is the grip detection area Arg.
• The detection unit 10a (second detection unit) detects that a specific input operation has been performed on the second area, located above the first area on the touch sensor 21, while the first area is being touched.
  • An example of the second area is the operation detection area Arv.
  • the thumb or index finger is positioned above the palm, so the upper side of the first area may be the second area.
  • the area located on the upper side is an area located on the upper side of the first area when the driver holds the gripping part without rotating the steering wheel 200.
• The touched state of the first area is preferably a state in which the first area is touched over a predetermined area or more.
• The detection unit 10a (first detection unit) detects that the first area on the touch sensor 21, which is mounted so as to cover a predetermined range of the gripping portion (annular portion 200r or straight portion 201s) gripped by the driver of the steering wheels 200 and 201, is touched over a predetermined angle or more in the circumferential direction of the cross section obtained when the gripping portion is cut in the radial direction of the steering wheels 200 and 201.
• The detection unit 10a (second detection unit) detects that a specific input operation has been performed on a second area of the touch sensor 21, different from the first area, while the first area is touched over the predetermined angle or more. When the first area is touched over the predetermined angle and a specific input operation is performed, the operation target device to be operated by the touch sensor 21 is controlled according to the specific input operation.
  • the second area is an area located above the first area.
  • the area located on the upper side is an area located on the upper side of the first area in a state where the driver grips the gripping part without rotating the steering wheel 200.
• FIGS. 15 and 16 show configuration examples for effectively notifying the driver that the touch sensor 21 has been operated.
• FIGS. 15 and 16 are schematic views in which the touch sensor 21 is developed and converted into a rectangular shape, as in FIG. 9.
  • FIG. 15 is an example in which a color change sheet 41 containing a coloring material is provided on the lower surface side of the touch sensor 21.
  • the driver can see the color of the color change sheet 41 disposed on the lower surface of the touch sensor 21 via the touch sensor 21.
• Under the control of the control unit 10, the color of the color change sheet 41 changes at the portion where the touch sensor 21 is touched, so the driver can recognize that the touch sensor 21 has been operated.
  • FIG. 16 shows an example in which a tactile feedback sheet 42 for changing a tactile sensation (hand touch) is provided on the upper surface side of the touch sensor 21.
• As the tactile feedback sheet 42, for example, a sheet called "E-Sense" developed by Senseg of Finland can be used. This sheet realizes tactile feedback by charging the film. Even if the tactile feedback sheet 42 is provided on the upper surface side of the touch sensor 21, the touch sensor 21 can detect contact with a finger or the like. When the touch sensor 21 is operated via the tactile feedback sheet 42, the driver can recognize that the touch sensor 21 has been operated by a change in the tactile sense of the tactile feedback sheet 42 under the control of the control unit 10.
  • a steering wheel 210 according to an embodiment shown in FIG. 17 is configured to output a control signal for the operation target device from the steering wheel 210.
  • the same parts as those in FIGS. 1 and 2 are denoted by the same reference numerals, and description thereof will be omitted as appropriate.
• The steering wheel 210 includes, in a portion other than the annular portion 200r, a sensor data generation unit 23 similar to the sensor data generation unit 22 in FIG. 1 and a control unit 24 similar to the control unit 10.
  • the control unit 24 includes a detection unit 24a similar to the detection unit 10a and a control signal generation unit 24b.
• When the steering wheel 210 is mounted on a vehicle, the control signal generation unit 24b generates a control signal for controlling the operation target device in accordance with a specific input operation on the touch sensor 21.
• The control signal output from the control signal generation unit 24b is output to the output terminal 26 via the cable 25. If the output terminal 26 is connected to the operation target device, the operation target device can be controlled by the control signal. Examples of specific input operations are the same as those in FIGS.
  • the requirements for the control signal generator 24b to generate the control signal are the same as described above.
  • the touch sensor 21 may be detachably attached to the annular portion 200r using a surface fastener.
  • the annular portion 200r is the gripping portion, the gripping portion is not necessarily circular.
  • the touch sensor 21 does not need to be configured by a single sheet, and the touch sensor 21 may be configured by a plurality of pieces of touch sensors. If the touch sensor 21 is composed of a plurality of pieces of touch sensors, the shape of each piece of the touch sensor can be simplified, which is advantageous when producing touch sensors. In addition, when the touch sensor 21 is configured by a plurality of pieces of touch sensors, the pieces of touch sensors do not necessarily have to be arranged without gaps.
• Although the touch sensor 21 according to the present embodiment has been described as being mounted so as to cover the gripping portion, this includes a state in which the touch sensor 21 covers the gripping portion with gaps between the pieces of the touch sensor.
• The range in which the touch sensor 21 is provided is not limited to the gripping portion (the annular portion 200r or the straight portion 201s) that the driver grips during driving; the touch sensor 21 may extend to the surface of the connecting portion that connects the gripping portions.
  • the connecting portion is a portion located between the left and right hands in the state shown in FIG. 2, and in FIG. 21, the sensor data generating portion 23 and the control portion 24 are provided.
• The touch sensor 21 may be extended to the surface of the connecting portion, and the operation detection area Arv may be set at a position in the connecting portion close to the gripping portion. If the position is close to the gripping portion, the driver can operate the operation target device during driving without releasing the hand from the gripping portion or greatly shifting it. Therefore, even when the touch sensor 21 is extended to the surface of the connecting portion, there is almost no possibility that operating it will interfere with the driver's handling of the steering wheels 200, 201, and 210.
• In step S4, when the vehicle is in a specific state, it is preferable not to permit (that is, to invalidate) control of the operation target device.
• A rotation angle of the steering wheel 200 is set for determining whether or not to permit control of the operation target device. As shown in FIG. 18, the rotation angle is 0° when the steering wheel 200 is not rotated; for example, the rotation angle is plus when the wheel is rotated rightward and minus when it is rotated leftward. An input operation to the touch sensor 21 is permitted as valid within a range of ±30°, and if the rotation angle exceeds ±30°, the input operation to the touch sensor 21 is invalidated and not permitted.
• When the rotation angle exceeds ±30°, the vehicle is in a specific state such as making a right or left turn or cornering. If the operation target device were controlled in such a specific state, there would be a high possibility of erroneous operation; in other words, an operation input made in such a state is highly likely not intended by the user. It is also not preferable in terms of safety. Therefore, in the present embodiment, control of the operation target device is invalidated when the vehicle is in a specific state.
  • the rotation angle of the steering wheel 200 detected by the steering angle sensor 31 is input to the control unit 10.
  • the control unit 10 switches between a state where the input operation to the touch sensor 21 is enabled and a state where the input operation to the touch sensor 21 is disabled according to the rotation angle of the steering wheel 200 detected by the steering angle sensor 31.
  • a detection signal from the direction indicator sensor 32 is also input to the control unit 10. Therefore, the control unit 10 may invalidate the input operation to the touch sensor 21 when the direction indicator 320 is operated by the detection signal from the direction indicator sensor 32.
• When the direction indicator 320 is operated, the steering wheel 200 can be considered to be in a specific state in which it is rotated beyond the predetermined rotation angle.
  • the direction indicator 320 may also be used for operations other than a right turn or left turn signal, and the operation of the direction indicator 320 here is an operation for making a right turn or left turn signal.
• The control unit 10 also invalidates the input operation to the touch sensor 21 when the detection signal from the shift lever sensor 33 indicates that the shift position of the shift lever 330 is reverse.
• In this way, control of the operation target device is invalidated not only when the direction indicator 320 is operated, but also when the rotation angle of the steering wheel 200 exceeds a predetermined angle of, for example, ±30°, or when the shift position of the shift lever 330 is reverse.
• Invalidating control of the operation target device may mean that even if the specific input operation described above is performed, the specific input operation is invalidated so that the operation target device is not controlled. Alternatively, even if sensor data is input from the sensor data generation unit 22 to the control unit 10, the control unit 10 may invalidate the sensor data, with the result that control of the operation target device is invalidated.
• An example of specific processing in step S4 in FIG. 14 will be described using the flowchart in FIG. 19. In step S41, the control unit 10 determines whether or not the shift position of the shift lever 330 is reverse. If the shift position is reverse (YES), the control unit 10 disallows the input operation of step S3 in step S45 and proceeds to step S1 in FIG. 14. If the shift position is not reverse (NO), the control unit 10 determines in step S42 whether or not the direction indicator 320 is operated. If the direction indicator 320 is operated (YES), the control unit 10 disallows the input operation of step S3 in step S45 and proceeds to step S1 in FIG. 14.
  • the control unit 10 determines whether or not the rotation angle of the steering wheel 200 exceeds a predetermined angle in step S43. If the rotation angle of the steering wheel 200 exceeds a predetermined angle (YES), the control unit 10 makes the input operation in step S3 not permitted in step S45 and shifts to step S1 in FIG. If the rotation angle of the steering wheel 200 does not exceed the predetermined angle (NO), the control unit 10 determines in step S44 whether or not a specific input operation has been performed in the operation detection area Arv. If the specific input operation is not performed (NO), the control unit 10 determines that the input operation in step S3 is not permitted in step S45, and proceeds to step S1 in FIG. If the input operation is not permitted in step S45, the control on the operation target device becomes invalid. When a specific input operation is performed (YES), the control unit 10 permits the input operation in step S3 in step S46, and proceeds to step S5 in FIG.
  • steps S41, S42, and S43 are provided, but only one or two of these steps may be provided. Further, when all of steps S41, S42, and S43 or two of these steps are provided, the order is arbitrary.
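• The S41 to S46 permit decision reduces to a short predicate, sketched below; the ±30° default mirrors the example above, while the argument names are illustrative:

```python
def permit_operation(shift_reverse, indicator_on, steering_deg,
                     specific_input, max_deg=30.0):
    """Illustrative S41-S46 decision: refuse while reversing (S41),
    while signalling a turn (S42), or while the steering wheel is rotated
    beyond +/-30 degrees (S43); otherwise permit only a recognized
    specific input operation (S44 -> S46)."""
    if shift_reverse or indicator_on or abs(steering_deg) > max_deg:
        return False            # S45: input operation not permitted
    return specific_input       # True only for a specific input operation
```

As noted above, any one or two of the three vehicle-state checks could be dropped, and their order is arbitrary.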
• The shift lever 330 has been referred to here, but the shape of the operation unit for switching between forward and reverse travel of the vehicle and for changing the transmission gear ratio is arbitrary; it may be a floor shift, a column shift, a paddle shift, or the like, all of which are included in the term shift lever.
  • a third embodiment of the control device and control method for the operation target device in the vehicle will be described.
  • the basic configuration and operation in the third embodiment are the same as those in the first embodiment, and only different parts will be described.
• In the third embodiment, when the driver is holding the annular portion 200r for normal driving and does not intend to operate the operation target device, input operations on the touch sensor 21 by a finger are not accepted; when the driver tries to operate the operation target device, input operations on the touch sensor 21 by a finger are accepted.
  • FIG. 20 shows an example of the state of the palm contact detection unit Tp and the thumb contact detection unit Tt when the driver is holding the annular portion 200r for normal driving.
  • FIG. 20 is a schematic diagram in which each area of the touch sensor 21 is converted into an equal size, as in FIG. 9.
  • the palm contact detection unit Tp has a relatively large area, and the thumb contact detection unit Tt is located at a position close to the palm contact detection unit Tp.
  • the index finger contact detection unit Ti is not shown, but the index finger contact detection unit Ti is also in a position close to the palm contact detection unit Tp.
  • FIG. 21 shows an example of states of the palm contact detection unit Tp and the thumb contact detection unit Tt when the driver tries to operate the operation target device.
  • the area of the palm contact detection unit Tp is smaller than that of FIG. 20, and the thumb contact detection unit Tt is located away from the palm contact detection unit Tp.
  • the index finger contact detection unit Ti is not shown, but the index finger contact detection unit Ti is also located away from the palm contact detection unit Tp.
• It is necessary to accurately distinguish between the state in which the driver simply holds the annular portion 200r for normal driving and the state in which a specific input operation is performed on the touch sensor 21 with the thumb or index finger.
  • the palm contact detection unit Tp may include a portion where the middle finger, the ring finger, and the little finger (in some cases, the index finger in addition to this) are in contact.
• For example, the portions of the palm contact detection unit Tp at X coordinate 8 and Y coordinates 4 to 8 are touched by the tips of the middle finger, the ring finger, and the little finger. When the driver tries to operate the operation target device, the portion touched by the tips of the middle finger, the ring finger, and the little finger moves to X coordinate 5 and Y coordinates 4 to 8. This is because the positions of the tips of the middle finger, the ring finger, and the little finger have shifted toward the back side of the annular portion 200r.
• Based on this, the state in which the driver intends to operate the operation target device may also be determined from a change in the circumferential position of the end of the palm contact detection unit Tp in the cross section of the annular portion 200r.
• When the area of the palm contact detection unit Tp is the relatively large first area, the control unit 10 determines that the driver is holding the annular portion 200r for normal driving and does not accept input operations to the touch sensor 21 by a finger. Further, when the area of the palm contact detection unit Tp becomes a second area narrower than the first area by a predetermined ratio or more, as shown in FIG. 21, the control unit 10 determines that the driver is trying to operate the operation target device and accepts input operations to the touch sensor 21 by a finger.
  • the operation invalid area Ariv may not be provided.
• Here, the operation invalid area Ariv is omitted, and a state is shown in which the grip detection area Arg and the operation detection area Arv are either preset on the touch sensor 21 or set on the touch sensor 21 by the control unit 10.
  • FIG. 22 shows a state where the driver is holding the annular portion 200r for normal driving, as in FIG.
  • the control unit 10 can discriminate between the two states described above, so that an erroneous operation can be avoided.
• The area of the palm contact detection unit Tp shown in FIG. 20 and the area of the palm contact detection unit Tp shown in FIG. 21 may be registered in advance in the control unit 10 or the storage unit 18, and the state of accepting an input operation to the touch sensor 21 and the state of not accepting one may be switched between accordingly.
• Since the area of the palm contact detection unit Tp is not always constant, an allowable amount of deviation in the area is set.
  • a change in the shape of the palm contact detection unit Tp may be detected.
• A change in the angle θ in the circumferential direction of the palm contact detection unit Tp shown in FIG. 10, or a change in the maximum length of the palm contact detection unit Tp in the X coordinate direction, may be detected.
• The state in which no input operation is accepted may mean that even if a specific input operation is performed, it is invalidated and control of the operation target device is not permitted. Alternatively, even if sensor data is input from the sensor data generation unit 22, the control unit 10 may invalidate the sensor data, with the result that control of the operation target device is invalidated.
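The area-based switching described in this embodiment can be illustrated with a small sketch. This is a hypothetical Python illustration, not the patent's implementation: the reference areas, the tolerance value, and the function name are all invented for the example.

```python
# Hypothetical sketch of the palm-contact-area discrimination described
# above.  The registered reference areas and the allowed deviation are
# illustrative values, not taken from the patent.

GRIP_AREA = 40      # first area: palm area while simply gripping (cells)
OPERATE_AREA = 25   # second area: narrower palm area while about to operate
TOLERANCE = 5       # allowable deviation, since the area is never constant

def accepts_input(palm_area):
    """Return True when finger input to the touch sensor is accepted.

    Input is accepted only when the measured palm contact area matches
    the registered 'about to operate' area within the tolerance; when it
    matches the normal-grip area, input operations are ignored.
    """
    if abs(palm_area - GRIP_AREA) <= TOLERANCE:
        return False        # normal driving grip: do not accept input
    if abs(palm_area - OPERATE_AREA) <= TOLERANCE:
        return True         # hand shifted to operate: accept input
    return False            # indeterminate state: safest to reject
```

Registering both reference areas, rather than a single threshold, lets the control unit reject indeterminate contact patterns outright, which matches the embodiment's goal of avoiding erroneous operation.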
• <Fourth embodiment> A fourth embodiment of the control device and control method for the operation target device in the vehicle will be described.
  • the basic configuration and operation in the fourth embodiment are the same as those in the first embodiment, and only different parts will be described.
• FIG. 23 shows a state in which the touch sensor 21 is developed, as in FIG. In the configuration example shown in FIG. 23, the operation invalid area Ariv is omitted.
  • the thumb contact detection unit Tt and the index finger contact detection unit Ti are considered to be relatively close to the palm contact detection unit Tp.
  • the thumb contact detection unit Tt and the index finger contact detection unit Ti when the driver is not trying to operate the device to be operated and is simply holding the annular portion 200r are denoted by Tt0 and Ti0, respectively.
• FIG. 23 shows that when the driver simply holds the annular portion 200r without attempting to operate the operation target device, the thumb contact detection unit Tt0 and the index finger contact detection unit Ti0 are detected, and that when the driver operates the operation target device, the thumb contact detection unit Tt and the index finger contact detection unit Ti move to positions separated from the palm contact detection unit Tp.
• In FIG. 23, the X coordinates of the thumb contact detection units Tt0 and Tt are the same, and the X coordinates of the index finger contact detection units Ti0 and Ti are the same, but there are also cases where the X coordinate shifts. Even in such cases, it is only necessary to pay attention to the movement of the Y coordinate.
• The control unit 10 stores, as a reference distance, the distance between the end of the palm contact detection unit Tp on the thumb contact detection unit Tt0 side and the end of the thumb contact detection unit Tt0 on the palm contact detection unit Tp side in a state where the driver normally holds the annular portion 200r.
• The control unit 10 may store the reference distance δ1 itself, or may store it in the storage unit 18.
• The distance between the end of the palm contact detection unit Tp on the thumb contact detection unit Tt side and the end of the thumb contact detection unit Tt on the palm contact detection unit Tp side when the driver is about to operate the operation target device is, for example, a distance δ2 that is longer than the reference distance δ1.
• When the control unit 10 detects the thumb contact detection unit Tt at a position longer than the reference distance δ1 by a predetermined distance or more, it determines that the driver is about to operate the operation target device. The control unit 10 then accepts as valid the input operation by the thumb detected by the thumb contact detection unit Tt in this state.
• FIG. 23 shows only the distances δ1 and δ2 between the palm contact detection unit Tp and the thumb contact detection units Tt0 and Tt, but the distance between the palm contact detection unit Tp and the index finger contact detection unit Ti0 may be stored in the same manner, and the position of the index finger contact detection unit Ti when the driver attempts to operate the operation target device may be judged likewise.
• That is, a reference distance between the palm contact detection unit Tp, where the driver's palm is in contact with the touch sensor 21, and the finger contact detection unit (thumb contact detection unit Tt0 or index finger contact detection unit Ti0), where the driver's finger (thumb or index finger) is in contact with the touch sensor 21, is stored, and an input operation with the finger may be validated in a state where the distance between the palm contact detection unit Tp and the finger contact detection unit is longer than the reference distance by a predetermined distance or more.
  • the operation invalid area Ariv is omitted, but the operation invalid area Ariv may be provided.
• In that case, the extent of the operation invalid area Ariv may be made shorter than that in FIG.
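The reference-distance logic of the fourth embodiment lends itself to a similar sketch. The following Python fragment is a hypothetical illustration; the cell coordinates, the stored value of δ1, and the required extra distance are assumptions, not values from the patent.

```python
# Hypothetical sketch of the fourth embodiment's reference-distance
# check.  Positions are Y-axis cell indices on the developed touch
# sensor; the concrete values and names are invented for illustration.

REFERENCE_DELTA1 = 2   # stored gap between palm edge and thumb edge (delta-1)
MIN_EXTRA = 2          # predetermined extra distance required to validate

def thumb_input_valid(palm_edge_y, thumb_edge_y):
    """Validate thumb input when the current palm-to-thumb gap (delta-2)
    exceeds the stored reference distance delta-1 by MIN_EXTRA or more."""
    delta2 = abs(thumb_edge_y - palm_edge_y)
    return delta2 >= REFERENCE_DELTA1 + MIN_EXTRA
```

With this scheme, a thumb resting in its normal gripping position (gap near δ1) produces no valid input, while a thumb deliberately stretched away from the palm does.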
  • a fifth embodiment of the control device and control method for the operation target device in the vehicle will be described.
  • the basic configuration and operation in the fifth embodiment are the same as those in the first embodiment, and only different parts will be described.
  • the fifth embodiment is still another configuration example for reducing erroneous operations.
  • the control unit 10 may validate the input operation when the detection unit 10a detects that the same input operation has been performed with the left and right hands. It is preferable that the control unit 10 validates the input operation when the same input operation is performed at the same timing with the left and right hands.
• An example of input operations in which the same input operation is performed at the same timing with the left and right hands will be described with reference to FIGS. Similar to FIG. 13(a), FIG. 24(a) shows a case where a leftward drag DTL, sliding the thumb to the left on the left touch sensor 21L, and a rightward drag DTR, sliding the thumb to the right on the right touch sensor 21R, are performed at the same timing.
• When both the left and right thumbs are dragged from the inner periphery side to the outer periphery side of the annular portion 200r in this way, the left-right symmetrical input operations as shown in FIG. may be defined as the same input operation.
• A case is also shown where a downward drag DTD, sliding the thumb downward, is performed at the same timing on both the left touch sensor 21L and the right touch sensor 21R.
• When an upward drag DTU, sliding the thumb upward, is performed at the same timing on both the left touch sensor 21L and the right touch sensor 21R, this may likewise be defined as the same input operation.
• When the finger is dragged in the vertical direction, it is preferable to define the same input operation as dragging in the same direction on the left and right, rather than symmetrically.
  • the input operation is performed with the thumb, but an index finger may be used.
• FIG. 24(c) shows a case where a tap T, striking the touch sensor 21 with the thumb or index finger, is performed at the same timing on both the left touch sensor 21L and the right touch sensor 21R.
• The control unit 10 determines that the timing is the same in the following cases. For example, in the case of a drag, as shown in FIG. 25(a), the time TML from the start timing t1 to the end timing t3 of the drag by the left-hand finger and the time TMR from the start timing t2 to the end timing t4 of the drag by the right-hand finger can be regarded as the same timing when they overlap for a predetermined time (a predetermined ratio) or more. Further, as shown in FIG. 25(b), a predetermined time TMP1 may be measured from the start timing t1 of the drag by the finger that started first (for example, the left hand), and the drags can be regarded as being at the same timing if the drag by the right-hand finger is made within the time TMP1. The criterion for determining that the timing is the same may be set as appropriate.
• An allowable range is also assumed for regarding operations as the same input operation.
• For a drag, if the direction in which the finger slides is the same within a predetermined allowable range, it is regarded as the same input operation.
• For a tap T, if the location of the tap T is the same, it can be regarded as the same input operation. The location can be regarded as the same when it is common to the front surface portion 21f or the back surface portion 21r.
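The "same input operation at the same timing" checks of the fifth embodiment can be sketched as follows. This is a hypothetical Python illustration; the minimum overlap ratio and the direction tolerance are invented values standing in for the patent's "predetermined time (predetermined ratio)" and "predetermined allowable range".

```python
# Hypothetical sketch of the fifth embodiment's checks: whether the
# left-hand interval [t1, t3] and right-hand interval [t2, t4] count as
# the same timing, and whether two drag directions count as the same
# input operation.  Thresholds are illustrative assumptions.

def overlap_ratio(t1, t3, t2, t4):
    """Fraction of the shorter drag interval covered by the overlap of
    the left-hand interval [t1, t3] and right-hand interval [t2, t4]."""
    overlap = min(t3, t4) - max(t1, t2)
    shorter = min(t3 - t1, t4 - t2)
    return max(0.0, overlap) / shorter

def same_timing(t1, t3, t2, t4, min_ratio=0.5):
    """Same timing when the intervals overlap for the required ratio."""
    return overlap_ratio(t1, t3, t2, t4) >= min_ratio

def same_direction(angle_left, angle_right, tolerance_deg=30.0):
    """Drag directions (in degrees) count as the same input operation
    when they agree within the allowable range, wrapping at 360."""
    diff = abs(angle_left - angle_right) % 360.0
    return min(diff, 360.0 - diff) <= tolerance_deg
```

Requiring both checks to pass before entering the reception mode is one way to realize the embodiment's goal of never shifting to the reception mode unintentionally.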
• The control unit 10 may set a reception mode for receiving a specific input operation to the touch sensor 21 so that the driver enters the reception mode intentionally. When shifting from a state other than the reception mode to the reception mode, it is necessary to avoid shifting to the reception mode unintentionally. Therefore, when the detection unit 10a detects that the same input operation has been performed on the touch sensor 21 with both hands, the control unit 10 shifts from the state in which the specific input operation is not accepted to the state in which it is accepted (reception mode). The same input operation is as described above. Also in this case, as described with reference to FIG. 25, it is preferable to shift to the reception mode when it is detected that the same input operation is performed at the same timing.
• When the detection unit 10a detects that the specific input operation (first specific input operation) described with reference to FIGS. 11 to 13 has been performed, and then detects the same input operation as described above, the control unit 10 confirms the first specific input operation input immediately before.
• The reception mode may also be set by continuous input operations. As shown in FIG. 27, if the time between an upward drag DIU by the left index finger and an upward drag DTU by the right thumb is within a predetermined time TMP2, the control unit 10 regards the upward drag DIU and the upward drag DTU as continuous input operations and sets the reception mode.
  • the operation target can be switched according to the input operation pattern by the left and right hands.
  • the control unit 10 sets the audio operation mode in which the audio playback unit 12 is operated.
  • the control unit 10 sets a target to be operated based on a specific input operation as the audio playback unit 12 in the in-vehicle device 100.
  • the control unit 10 sets the navigation operation mode in which the navigation processing unit 11 is operated.
  • the control unit 10 sets a target to be operated based on a specific input operation as the navigation processing unit 11 in the in-vehicle device 100.
• The combinations of these input operations are merely examples, and the combinations are not limited to those shown in FIGS.
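The pattern-based switching of the operation target can be sketched as a simple lookup. The pattern keys below are invented for illustration; the patent states only that certain combined left/right input patterns select the audio playback unit 12 (audio operation mode) or the navigation processing unit 11 (navigation operation mode).

```python
# Hypothetical dispatch of the operation target according to the
# combined left/right input pattern.  The pattern strings are invented
# for illustration and do not come from the patent.

MODE_TABLE = {
    ("left:drag_up", "right:drag_up"): "audio",           # audio playback unit 12
    ("left:drag_down", "right:drag_down"): "navigation",  # navigation processing unit 11
}

def select_operation_mode(left_op, right_op):
    """Return the operation mode for a left/right pattern, or None when
    the combination is not registered."""
    return MODE_TABLE.get((left_op, right_op))
```

A table keyed on the (left, right) pair keeps the mapping extensible: new combinations simply add entries, matching the statement that the combinations are not limited to the illustrated ones.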
  • the driver is holding the annular portion 200r (touch sensor 21).
  • a sixth embodiment of the control device and control method for the operation target device in the vehicle will be described.
  • the basic configuration and operation in the sixth embodiment are the same as those in the first embodiment, and only different parts will be described.
  • the sixth embodiment is still another configuration example for reducing erroneous operations.
  • FIG. 30A shows an example in which the grip detection area Arg, the operation invalid area Ariv, and the operation detection area Arv of the touch sensor 21 are color-coded.
  • it may be color-coded by applying paint, or color-coded by pasting sheets of the respective colors.
  • It is also effective to color-code the portion of the touch sensor 21 and the portion other than the touch sensor 21 of the annular portion 200r.
  • a color may be given to the part of the touch sensor 21, or a color may be given to a part other than the touch sensor 21.
  • the parts other than the touch sensor 21, the grip detection area Arg, the operation invalid area Ariv, and the operation detection area Arv may be different colors.
• FIG. 30B shows an example in which the operation invalid area Ariv is not provided and the grip detection area Arg and the operation detection area Arv are color-coded. Furthermore, it is preferable to color-code the portion of the touch sensor 21 and the portion of the annular portion 200r other than the touch sensor 21, because the driver can clearly and immediately recognize the position of the touch sensor 21. As shown in FIGS. 30A and 30B, it is more preferable to color-code each area, because the driver can then clearly and immediately recognize the position of each area of the touch sensor 21. In FIGS. 30A and 30B, the color change sheet 41 described above can also be used.
• The color change sheet 41 can be used as follows.
  • the controller 10 sets a grip detection area Arg and an operation detection area Arv for the touch sensor 21 after the driver grips the portion of the touch sensor 21 in the annular portion 200r. Then, after setting the grip detection area Arg and the operation detection area Arv, the control unit 10 colors the grip detection area Arg and the operation detection area Arv.
• The color coding may be performed by coloring each area, or may be achieved as a result of coloring only some areas.
  • FIG. 31 (a) shows an example in which markers M1 and M2 of a predetermined color are attached to the boundaries of the grip detection area Arg, the operation invalid area Ariv, and the operation detection area Arv.
  • the markers M1 and M2 are examples of boundary identification means for identifying the boundary.
  • the markers M1 and M2 may be provided by a paint or a seal, for example.
  • FIG. 31B shows an example in which the operation invalid area Ariv is not provided, and shows an example in which a marker M3 of a predetermined color is added to the boundary between the grip detection area Arg and the operation detection area Arv. As shown in FIGS. 31A and 31B, it is preferable to indicate the position of the boundary because the driver can clearly and immediately visually recognize the position of each area of the touch sensor 21.
  • FIG. 32 shows an example in which the diameter of the annular portion 200r in the operation detection area Arv is smaller than the diameter of the annular portion 200r in the grip detection area Arg.
• FIG. 32 shows an example in which the operation invalid area Ariv is not provided. The diameter need only be made thin enough not to hinder the driver when operating the steering wheel 200, while still allowing the driver to recognize by feel that the area is the operation detection area Arv. It is preferable that the diameter changes gradually at the boundary between the operation detection area Arv and the grip detection area Arg.
  • FIG. 33 shows an example in which the diameter of the annular portion 200r in the operation detection area Arv is larger than the diameter of the annular portion 200r in the grip detection area Arg.
• FIG. 33 shows an example in which the operation invalid area Ariv is not provided. The diameter need only be made thick enough not to hinder the driver when operating the steering wheel 200, while still allowing the driver to recognize by feel that the area is the operation detection area Arv. It is preferable that the diameter changes gradually at the boundary between the operation detection area Arv and the grip detection area Arg. In FIGS. 32 and 33, the diameter of the annular portion 200r changes at the boundary between the grip detection area Arg and the operation detection area Arv. The change in diameter can be interpreted as an example of boundary identification means for physically identifying the boundary.
  • FIG. 34 shows an example in which recesses B1 and B2 are provided at the boundaries of the grip detection area Arg, the operation invalid area Ariv, and the operation detection area Arv.
  • the driver can visually recognize the position of each area by the recesses B1 and B2, and can recognize the position of each area by touch when the touch sensor 21 is gripped.
  • a recess may be provided at the boundary between the grip detection area Arg and the operation detection area Arv.
  • the touch sensor 21 may be divided by the recesses B1 and B2, or may not be divided.
  • the recesses B1 and B2 are another example of boundary identification means for physically identifying the boundary.
  • FIG. 35 shows an example in which convex portions B3 and B4 are provided at the boundaries of the grip detection area Arg, the operation invalid area Ariv, and the operation detection area Arv.
  • the driver can visually recognize the position of each area by the convex portions B3 and B4, and can recognize the position of each area by tactile sense when holding the touch sensor 21.
  • a convex portion may be provided at the boundary between the grip detection area Arg and the operation detection area Arv.
• The touch sensor 21 may be divided by the convex portions B3 and B4, or may not be divided.
  • the convex portions B3 and B4 are still another example of boundary identifying means for physically identifying the boundary.
  • FIG. 36A shows a state where the driver has not yet gripped the grip detection area Arg. In this example, the operation invalid area Ariv is not provided.
  • FIG. 36B shows a state where the driver holds the grip detection area Arg.
  • the color change sheet 41 described above is provided on the lower surface side of the operation detection area Arv.
• FIGS. 37A and 37B show an example in which the above-described tactile feedback sheet 42 is provided on the upper surface side of the operation detection area Arv.
  • FIG. 37A shows a state where the driver has not yet gripped the grip detection area Arg.
  • FIG. 37B shows a state where the driver has gripped the grip detection area Arg.
• When it is detected that the grip detection area Arg is gripped, the control unit 10 controls the tactile feedback sheet 42 to change its tactile sense, for example to a rough state as shown in FIG.
• Conversely, the tactile feedback sheet 42 may initially be in a rough state and be changed to a smooth state when it is detected that the grip detection area Arg is gripped.
  • the driver can clearly recognize the position of the operation detection area Arv by tactile sensation, so that it is possible to further reduce erroneous operations.
• With the tactile sensation of the tactile feedback sheet 42, it is not necessary to visually check the operation detection area Arv, which contributes to safe driving.
  • the method of changing the tactile sensation of the tactile feedback sheet 42 is arbitrary.
  • the portion of the operation detection area Arv may have a different tactile sense from the grip detection area Arg and the operation invalid area Ariv in advance.
  • the surface of the operation detection area Arv may be roughened, or surface treatment may be applied or a sheet of those tactile sensations may be pasted so as to have a tactile sensation different from the grip detection area Arg and the operation invalid area Ariv.
• The grip detection area Arg and the operation detection area Arv are configured to be distinguishable at least when the detection unit detects that the grip detection area Arg is gripped.
• The configurations shown in FIGS. 30 to 37 are examples; for example, these configurations may be combined.
• The grip detection area Arg and the operation detection area Arv may be configured to be always distinguishable. However, if they are configured to be distinguishable only when it is detected that the grip detection area Arg is gripped, it becomes possible to indicate whether or not operation input to the operation detection area Arv is currently being accepted.
  • FIG. 38 shows an example of a locus when the finger of the left hand is slid in the left-right direction on the touch sensor 21.
• The left side of FIG. 38 is the outside of the annular portion 200r, and the right side is the inside of the annular portion 200r. As shown in FIG. 38, the locus tends to be lower on the inner side than on the outer side of the annular portion 200r.
• When the control unit 10 determines that the difference dxh of the x component, which is the horizontal component, between the locus start point Ps and end point Pe is greater than or equal to a predetermined threshold, and that the difference dyh of the y component, which is the vertical component, is less than a predetermined threshold, the drag is regarded as being made linearly in the horizontal direction, as shown in FIG.
• When the control unit 10 determines that the difference dyv of the y component between the locus start point Ps and end point Pe is greater than or equal to a predetermined threshold, and that the difference dxv of the x component is less than a predetermined threshold, the drag is regarded as being made linearly in the vertical direction, as shown in FIG.
  • the threshold for the difference dxh is THxh
  • the threshold for the difference dyh is THyh
  • the threshold for the difference dyv is THyv
  • the threshold for the difference dxv is THxv
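The threshold test above can be illustrated directly. The following Python sketch is hypothetical; the threshold values THxh, THyh, THyv, and THxv are invented for the example, since the patent leaves them as "predetermined thresholds".

```python
# Hypothetical sketch of the seventh embodiment's locus correction:
# a drag from start point Ps to end point Pe is snapped to a straight
# horizontal or vertical drag using four thresholds.  Values are
# illustrative assumptions, not taken from the patent.

THxh, THyh = 4, 3   # horizontal drag: x-difference >= THxh, y-difference < THyh
THyv, THxv = 4, 3   # vertical drag:   y-difference >= THyv, x-difference < THxv

def classify_drag(ps, pe):
    """Return 'horizontal', 'vertical', or None for a drag from Ps to Pe,
    where each point is an (x, y) tuple of touch-sensor coordinates."""
    dx = abs(pe[0] - ps[0])
    dy = abs(pe[1] - ps[1])
    if dx >= THxh and dy < THyh:
        return "horizontal"
    if dy >= THyv and dx < THxv:
        return "vertical"
    return None
```

A locus that satisfies neither condition (for example, a strongly diagonal stroke) is left unclassified here, which corresponds to the case the embodiment later handles by vector synthesis instead.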
• FIG. 41A shows a state in which a finger is slid in the right direction on the left touch sensor 21L and a finger is slid in the downward direction on the right touch sensor 21R.
• By the locus correction described with reference to FIGS. 39 and 40, the control unit 10 can regard these as a rightward drag DR on the left touch sensor 21L and a downward drag DD on the right touch sensor 21R, as shown in FIG. 41(b).
• The example shown in FIG. 41(d) is a diagonal drag DO in the lower-right direction; diagonal drags DO in the upper-right, lower-left, and upper-left directions can be achieved in the same way. Realizing the diagonal drag DO as in the present embodiment improves operability.
• The control unit 10 may perform control so as to synthesize the two drag vectors in the same direction (in this case, the upward direction) and perform an operation based on the larger combined vector. By controlling in this way, when a map is scrolled according to the drag operation, it can be scrolled by a large amount with a single drag operation, improving operability. Further, when the drag operation vector on the left touch sensor 21L and the drag operation vector on the right touch sensor 21R are in opposite directions and the angle formed by the two vectors is close to 180° (for example, 180° ± α, where α is an arbitrary angle), a special operation may be performed. For example, the map may be rotated.
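The vector synthesis and the near-180° special case can be sketched as follows. This hypothetical Python fragment uses `math.atan2` to measure the angle between the left and right drag vectors; the angle window α and the returned action names are assumptions made for the example.

```python
# Hypothetical sketch of the vector handling described above: two drag
# vectors in (roughly) the same direction are synthesized into a larger
# vector, while near-opposite vectors (about 180 degrees apart) trigger
# a special operation such as map rotation.  Names and the window alpha
# are illustrative assumptions.

import math

def combine_drags(v_left, v_right, alpha_deg=20.0):
    """Return ('scroll', combined_vector) for same-direction drags, or
    ('rotate', None) when the two vectors are roughly opposite."""
    # signed angle between the two vectors via cross and dot products
    angle = math.degrees(
        math.atan2(v_left[0] * v_right[1] - v_left[1] * v_right[0],
                   v_left[0] * v_right[0] + v_left[1] * v_right[1]))
    if abs(abs(angle) - 180.0) <= alpha_deg:
        return ("rotate", None)   # opposite drags: special operation
    # otherwise synthesize and act on the combined (larger) vector
    return ("scroll", (v_left[0] + v_right[0], v_left[1] + v_right[1]))
```

Summing the vectors doubles the effective stroke for same-direction drags, which is how a single two-handed gesture can scroll the map by a large amount.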
• The control unit 10 of the present embodiment controls the operation target device according to a pattern based on a combination of the input operation on the left touch sensor 21L and the input operation on the right touch sensor 21R.
  • the vector composition based on the four directions of the upward direction, the downward direction, the left direction, and the right direction has been described.
  • the vector may be composed based on more directions.
• The deviation between the trajectory of the drag operation intended by the user and the trajectory of the drag operation actually performed is often left-right symmetric, and this deviation can be absorbed by performing vector synthesis. It is therefore also possible to perform only the correction that makes the drag operation linear by connecting the start point and the end point, without regarding the drag as either horizontal or vertical as described above.
• An operation of sliding a finger on the touch sensor 21 in the radial direction of the annular portion 200r is defined as a horizontal drag Dh, and an operation of sliding the finger in the circumferential direction of the annular portion 200r is defined as a vertical drag Dv.
• FIG. 43 shows the horizontal drag Dh and the vertical drag Dv in the development of the touch sensor 21. In FIGS. 42 and 43, the horizontal drag Dh and the vertical drag Dv are shown in only one column of detection regions R for each of the X and Y coordinates, but a drag may be performed with the finger touching detection regions R in a plurality of columns.
  • locus correction and vector synthesis in the eighth embodiment are the same as those in the seventh embodiment described with reference to FIGS.
  • a ninth embodiment of a control device and control method for an operation target device in a vehicle will be described.
  • the basic configuration and operation in the ninth embodiment are the same as those in the first embodiment, and only different parts will be described.
• The left and right parts of the annular portion 202r are columnar gripping portions 202s gripped by the driver.
  • the pair of left and right grips 202s are connected by an upper connecting portion 202c1 and a lower connecting portion 202c2 to form an annular portion 202r.
  • the touch sensor 21 is attached to the gripping part 202s.
  • FIG. 45 is an enlarged view of the boundary portion between the connecting portion 202c1 and the gripping portion 202s surrounded by the dashed-dotted ellipse in FIG.
• FIG. 46 shows the A-A cross section of FIG. Since the gripping portion 202s has a slightly smaller diameter than the connecting portions 202c1 and 202c2, when the touch sensor 21 is attached to the gripping portion 202s there is almost no step at the boundary between the gripping portion 202s and the connecting portions 202c1 and 202c2, and the surface is continuous.
  • the gripping portion 202s is used to switch the input operation to the touch sensor 21 between on and off. Turning on the input operation means permitting (validating) the above-described specific input operation, and turning off the input operation means disallowing (invalidating) the above-mentioned specific input operation.
• The gripping portion 202s has a built-in on/off switching mechanism, and the input operation is switched on and off by this mechanism.
  • FIG. 47 shows a BB cross section of FIG.
  • the end of the connecting portion 202c1 on the gripping portion 202s side is a protruding portion 27.
  • An end portion of the gripping portion 202s on the side of the connecting portion 202c1 serves as a receiving portion 28 having a concave portion for accommodating the protruding portion 27.
• In FIGS. 47(a) to 47(c), a part of the protrusion 27 in the circumferential direction is notched, forming a recess 27cp.
  • An elastic deformation portion 29 having a protrusion 29p is fixed to the recess 27cp.
• Two recesses 28cp1 and 28cp2 are formed on the inner peripheral surface of the receiving portion 28.
• In the normal state of the deformed steering wheel 202, the gripping portion 202s is in the state shown in FIG. That is, the protrusion 29p is engaged with the recess 28cp1.
  • the state shown in FIG. 47A is a state where the input operation to the touch sensor 21 is turned off. When the operation target device is not operated by the touch sensor 21 and the vehicle is normally driven, the off state shown in FIG. 47A is set.
• When the gripping portion 202s is turned toward the outer peripheral side of the deformed steering wheel 202 from the off state shown in FIG. 47A, the projection 29p and the recess 28cp1 are disengaged, as shown in FIG. 47B.
  • the projecting portion 29p comes into contact with the convex portion between the concave portions 28cp1, 28cp2. At this time, the elastic deformation portion 29 is pushed and deformed by the convex portion between the concave portions 28cp1 and 28cp2.
• When the gripping portion 202s is further rotated toward the outer peripheral side, the projecting portion 29p engages with the recess 28cp2 and the input operation to the touch sensor 21 is turned on, as shown in FIG. 47(c). Although not shown, the off state and the on state of the input operation to the touch sensor 21 shown in FIGS. 47(a) and 47(c) are configured to be detected. A state detection signal of the on/off switching mechanism of the gripping portion 202s is input to the control unit 10.
• When the driver does not operate the operation target device with the touch sensor 21 and drives the vehicle normally, the gripping portion is in the state of FIG. 47A.
• To operate the operation target device, the gripping portion 202s is turned to the outer peripheral side to obtain the state shown in FIG.
• When switching from the state of FIG. 47(a) to the state of FIG. 47(c), a click feeling is obtained when the protrusion 29p engages with the recess 28cp2, and when returning from the state of FIG. 47(c) to the state of FIG. 47(a), a click feeling is obtained when the protrusion 29p engages with the recess 28cp1, so the driver can perceive that the on state and the off state have been switched.
  • the on / off switching mechanism shown in FIG. 47 may be provided on both the left and right gripping sections 202s, or may be provided only on one side.
• The input operation may be turned on when both the left and right gripping portions 202s are in the on state, or may be turned on when either one of them is in the on state.
• Alternatively, the input operation may be turned on when the gripping portion 202s is turned to the inner peripheral side. In the configuration example of FIG. 44, the feeling with which the driver grips the gripping portion 202s (grip feeling) does not change between the state in which the input operation is on and the state in which it is off.
• The shape of the touch sensor 21 does not have to be the complicated shape described with reference to FIG. , and can be a simple plane. Accordingly, since the shape of the touch sensor 21 can be simplified, the touch sensor 21 itself can be made inexpensive, the man-hours for mounting the touch sensor 21 on the steering wheel (deformed steering wheel 202) are reduced, and the control device for the operation target device can be realized at low cost.
  • the on / off switching mechanism is a rotation switch that rotates in the circumferential direction.
• The grip detection area Arg, operation detection area Arv, and operation invalid area Ariv described with reference to FIGS. 8 and 9 may also be set in the touch sensor 21 attached to the gripping portion 202s having the on/off switching mechanism. However, since the on/off switching mechanism makes it clear whether the driver intends to operate the operation target device, the grip detection area Arg and the operation invalid area Ariv need not be set, and only the operation detection area Arv may be set. That is, the entire surface of the touch sensor 21 may be set as the operation detection area Arv.
• The control unit 10 determines in step S21 whether or not the on/off switching mechanism is on. If it is not determined to be on (NO), the control unit 10 returns the process to step S21. If it is determined to be on (YES), the control unit 10 acquires the sensor data output from the sensor data generation unit 22 in step S22. In step S23, the control unit 10 determines whether or not an input operation has been performed based on the detection output from the detection unit 10a.
  • step S24 the control unit 10 determines whether or not to permit an operation on the operation target device by the input operation in step S23. If it is determined that the operation is permitted (YES), the control unit 10 moves the process to step S25. If it is not determined that the operation is permitted (NO), the control unit 10 returns the process to step S21.
  • the control unit 10 permits an operation on the operation target device when a specific input operation is performed on the touch sensor 21.
  • In step S25, the control unit 10 determines an operation based on the input operation.
  • In step S26, the control unit 10 performs control corresponding to the determined operation on the operation target device, and returns the process to step S21.
  • In this flow, the process corresponding to step S2 of FIG. 4 is omitted, but a process of determining whether or not the gripping part 202s is gripped, corresponding to step S2 of FIG. 4, may be provided between steps S22 and S23.
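The decision flow of steps S21 to S26 described above can be sketched as a single-cycle function. This is a minimal illustration only; the function and callback names are assumptions, not part of the patent.

```python
def process_cycle(switch_on, sensor_data, detect, permit, determine, execute):
    """One pass of the S21-S26 loop; returns True if an operation was executed."""
    if not switch_on:                  # S21: on/off switching mechanism is off
        return False
    operation = detect(sensor_data)    # S22-S23: acquire sensor data, detect input
    if operation is None:
        return False
    if not permit(operation):          # S24: permit only specific input operations
        return False
    execute(determine(operation))      # S25-S26: determine and execute the operation
    return True
```

In a real controller this function would run inside a loop that re-acquires sensor data each cycle, mirroring the return to step S21 in the flowchart.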
  • With the on/off switching mechanism, the control unit 10 can objectively judge whether the driver intends to operate the operation target device. Accordingly, erroneous operations can be greatly reduced.
  • The gripping part 202s may be returned to the normal state shown in FIG. 47(a). In this case, a motor or the like for returning the gripping part from the state shown in FIG. 47(c) to the state shown in FIG. 47(a) may be provided.
  • the touch sensor 21 may be detachably attached to the annular portion 200r or the annular portion 202r using a surface fastener.
  • Although the annular portion 200r serves as the gripping portion, the gripping portion is not necessarily circular. It may be deformed like the annular portion 202r, or may not be annular at all.
  • A receiving part having a recess may be provided on the connecting part 202c1 or 202c2 side and a protruding part on the gripping part 202s side, and the gripping part 202s may be joined to the connecting parts 202c1 and 202c2 in this way.
  • the configuration shown in FIGS. 46 and 47 is an example of the configuration of the on / off switching mechanism, and is not limited to the configuration shown in FIGS. 46 and 47.
  • <Tenth Embodiment> The tenth embodiment of the control device of the operation target device in a vehicle will be described.
  • the tenth embodiment is an embodiment of a driver specifying method.
  • the basic configuration and operation in the tenth embodiment are the same as those in the first embodiment, and only different parts will be described.
  • In the tenth embodiment, the driver is identified so that the state of the in-vehicle device 100 or the state of the vehicle can be set to an optimum state for each driver.
  • For example, a song frequently played by the identified driver may be played automatically by the audio playback unit 12, or frequently played songs may be displayed at the top when displaying a song list. Setting the air conditioner condition or adjusting the seat position according to the driver is also conceivable.
  • FIG. 49 shows an example of the state of the palm contact detection unit Tp and the thumb contact detection unit Tt when the driver grips the annular portion 200r in an attempt to drive the vehicle.
  • the thumb contact detection unit Tt is located close to the palm contact detection unit Tp.
  • the index finger contact detector Ti is not shown.
  • the control unit 10 detects the touched length in the X coordinate direction in the grip detection area Arg.
  • the sum of the length Lx1 and the length Lx2 is the touched length in the X coordinate direction.
  • FIG. 50 shows a state in which the thumb contact detection units Tt divided in FIG. 49 are connected, and shows a touched length Lx in the X coordinate direction.
  • The length Lx is information indicating, in the circumferential direction of the cross section obtained by cutting the annular portion 200r in the radial direction, the length of the portion of the touch sensor 21 touched by the palm (palm contact detection portion Tp).
  • the length Lx is a first example of gripping state identification data indicating how the driver is gripping the annular portion 200r where the touch sensor 21 is attached.
  • control unit 10 detects the length Lx, but the number of detection regions R corresponding to the length Lx may be obtained. Of course, it is possible to convert the number of detection regions R corresponding to the length Lx into an actual distance.
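As a sketch of how Lx might be computed from the detection regions R, assuming each region reports a boolean touched state and has a known pitch (both assumptions; the patent does not specify the data representation):

```python
def touched_length_x(touched_cols, pitch_mm=1.0):
    """Approximate Lx: the number of detection regions R detected as touched
    along the X axis of the grip detection area Arg, times the region pitch.
    Summing touched regions automatically adds split segments such as
    Lx1 + Lx2 in FIG. 49."""
    return sum(1 for touched in touched_cols if touched) * pitch_mm
```

As noted in the text, the raw region count can be used directly, or converted to an actual distance via the pitch.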
  • The control unit 10 also detects the length Lya in the Y coordinate direction between the palm contact detection unit Tp and the thumb contact detection unit Tt.
  • The length Lya is information indicating the length, in the circumferential direction of the steering wheel 200 (annular portion 200r), between the portion where the palm contacts the touch sensor 21 (palm contact detection unit Tp) and the portion where the thumb contacts it (thumb contact detection unit Tt).
  • the length Lya is a second example of gripping state identification data indicating how the driver is gripping the annular portion 200r of the portion where the touch sensor 21 is attached.
  • the control unit 10 detects Lya, but the number of detection regions R corresponding to the length Lya may be obtained. Of course, it is possible to convert the number of detection regions R corresponding to the length Lya into an actual distance.
  • Here, the length between the end of the palm-contact portion opposite the thumb-contact portion and the end of the thumb-contact portion opposite the palm-contact portion is taken as the length Lya, but the length is not limited to this. However, the length shown in FIG. 49 is preferable as the length Lya.
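One hedged way to obtain Lya from the sensor grid, assuming the Y coordinates of the touched detection regions in the palm and thumb contact portions are available as lists of region indices (an assumed representation):

```python
def length_lya(palm_ys, thumb_ys, pitch_mm=1.0):
    """Approximate Lya: the Y-direction span from the palm-contact end
    opposite the thumb to the thumb-contact end opposite the palm, i.e.
    the extent covered by both contact portions together (illustrative
    definition following FIG. 49). Region indices are counted inclusively."""
    ys = list(palm_ys) + list(thumb_ys)
    return (max(ys) - min(ys) + 1) * pitch_mm
```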
  • The control unit 10 further detects the total number of detection regions R detected as touched (the total number of contact detection regions) in the gripping state of FIG. 49.
  • The total number of contact detection regions corresponds to the area where the driver's hand is in contact. It is also possible to calculate the actual area based on the size of each detection region R.
  • The total number of contact detection regions may be counted over all detection regions R of the grip detection area Arg, the operation detection area Arv, and the operation invalid area Ariv, or over the detection regions R of the grip detection area Arg only.
  • Information corresponding to the area of the portion of the touch sensor 21 touched by the hand is a third example of gripping state identification data indicating how the driver grips the annular portion 200r where the touch sensor 21 is attached.
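A minimal sketch of this third example, counting touched detection regions over a boolean grid and converting the count into an approximate area; the per-region area is an assumed calibration value, not taken from the patent:

```python
def contact_detection_total(touched_grid, region_area_mm2=4.0):
    """Return (count, area): the total number of detection regions R
    detected as touched, and the approximate contact area obtained by
    multiplying the count by the area of one region."""
    count = sum(1 for row in touched_grid for touched in row if touched)
    return count, count * region_area_mm2
```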
  • The driver can be identified by the lengths Lx and Lya and the total number of contact detection regions. Although identification accuracy is somewhat lower, the driver may instead be identified only by the lengths Lx and Lya, only by the total number of contact detection regions, only by the length Lx, or only by the length Lya.
  • Furthermore, the control unit 10 detects the length Lyb in the Y coordinate direction between the palm contact detection unit Tp and the thumb contact detection unit Tt in a state where the driver has extended his thumb to operate the operation target device.
  • FIG. 51 shows a state where the driver has extended his thumb to operate the operation target device.
  • In this state, the length Lyb in the Y coordinate direction between the palm contact detection unit Tp and the thumb contact detection unit Tt is longer than the length Lya.
  • If voice guidance such as "The driver will now be identified; please extend your thumb and operate the touch sensor." is given, the length Lyb can be detected immediately. There is no problem even if the length Lyb is detected after waiting for the driver to actually operate the touch sensor 21 without such guidance.
  • Here, the length between the end of the palm-contact portion opposite the thumb-contact portion and the end of the thumb-contact portion opposite the palm-contact portion is taken as the length Lyb, but it is not limited to this. However, the length shown in FIG. 51 is preferable as the length Lyb.
  • the length Lyb is a fourth example of gripping state identification data indicating how the driver is gripping the annular portion 200r where the touch sensor 21 is attached.
  • FIG. 52 shows an example of the driver database stored in the storage unit 18.
  • In the driver database, the lengths Lx, Lya, and Lyb and the total number of contact detection regions are registered as driver specifying data. Even for the same driver, the lengths Lx, Lya, Lyb and the total number of contact detection regions are not always the same values, so it is preferable to update the registered average value each time the same driver is identified.
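The running-average registration mentioned above can be sketched as an incremental mean update; the function and parameter names are illustrative assumptions:

```python
def update_registered_average(old_avg, n_prior, new_value):
    """Fold a newly measured value (Lx, Lya, Lyb or the contact detection
    total) into the average registered for an identified driver.
    n_prior is the number of measurements already averaged."""
    return (old_avg * n_prior + new_value) / (n_prior + 1)
```

Storing the count alongside the average avoids keeping every past measurement while still weighting all of them equally.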
  • The driver specifying data may be any information registered in accordance with the gripping state identification data that the control unit 10 acquires to identify the driver, that is, data indicating how the annular portion 200r where the touch sensor 21 is attached is gripped.
  • Here the thumb contact detection unit Tt is used, but the index finger contact detection unit Ti may be used instead of the thumb contact detection unit Tt, or in addition to it.
  • the control unit 10 acquires sensor data output from the sensor data generation unit 22 in step S21.
  • The control unit 10 acquires the lengths Lx and Lya in step S22. Since the driver first grips the annular portion 200r in order to drive the vehicle, the lengths Lx and Lya can be acquired at that point. As described above, the detection unit 10a detects, based on the sensor data output from the sensor data generation unit 22, that the annular portion 200r (touch sensor 21) is gripped, so the lengths Lx and Lya may be acquired when that gripping is detected.
  • the controller 10 acquires the total number of contact detection areas in step S23. The order of step S22 and step S23 may be reversed.
  • the controller 10 acquires the length Lyb in step S24 after guiding the user to extend the thumb or waiting for the driver to operate the touch sensor 21. Step S24 can be omitted.
  • In step S25, the control unit 10 compares the gripping state identification data consisting of the acquired lengths Lx, Lya, Lyb and the total number of contact detection regions with the driver specifying data registered in the driver database, and determines whether the acquired gripping state identification data matches any driver's data. Even for the same driver, the data do not always match completely; therefore, a predetermined allowable range is set for the registered driver specifying data, and if the acquired gripping state identification data falls within that allowable range, the data are determined to match.
  • If it is determined in step S25 that the data match one of the drivers (YES), the control unit 10 identifies the driver in step S26, executes control corresponding to the driver in step S27, and ends the process.
  • Control corresponding to the driver means setting the state of the in-vehicle device 100 and the state of the vehicle to the optimum state for each driver. Of course, when the driver is identified while the vehicle is traveling, the seat position is not adjusted as part of the vehicle state.
  • The control unit 10 determines in step S28 whether or not an instruction to register in the driver database has been given. If it is determined that such an instruction has been given (YES), in step S29 the control unit 10 associates the driver name input via an operation unit not shown in FIG. 1 with the acquired gripping state identification data consisting of the lengths Lx, Lya, Lyb and the total number of contact detection regions, registers them in the driver database as driver specifying data, and ends the process. If it is not determined that a registration instruction has been given (NO), the process ends.
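The allowable-range matching of step S25 can be sketched as follows. The dictionary layout and the relative tolerance are assumptions; the patent only states that a predetermined allowable range is set for the registered data:

```python
def match_driver(measured, driver_db, rel_tolerance=0.1):
    """Return the first driver whose registered values (e.g. Lx, Lya, Lyb,
    contact detection total) all lie within +/- rel_tolerance (relative)
    of the measured gripping state identification data, or None if no
    registered driver matches."""
    for name, registered in driver_db.items():
        if all(abs(measured[key] - value) <= rel_tolerance * value
               for key, value in registered.items()):
            return name
    return None
```

Returning None corresponds to the NO branch of step S25, after which registration of a new driver may be offered (steps S28 and S29).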
  • As described above, the control unit 10 functions as a driver specifying unit that acquires, based on the sensor data output from the sensor data generation unit 22, gripping state identification data indicating how the driver grips the annular portion 200r where the touch sensor 21 is mounted, and identifies the driver by comparing the gripping state identification data with the driver specifying data.
  • To set the in-vehicle device 100 and the vehicle to the optimum state for each driver, the control unit 10 learns in advance how each driver operates the in-vehicle device 100 and what state the vehicle is set to, and thereby grasps the characteristics of each driver.
  • Although FIG. 1 does not show information indicating the air conditioner condition or the seat position being input to the control unit 10, these pieces of information are also supplied to the control unit 10 via the in-vehicle communication unit 34.
  • the position where the annular portion 200r is gripped differs depending on the driver. Therefore, it is also possible to detect the position where the annular portion 200r is gripped and use it as grip state identification data for identifying the driver.
  • The positions of the grip detection area Arg, the operation detection area Arv, and the operation invalid area Ariv provided as necessary are set dynamically according to the position where the driver grips the touch sensor 21 (annular portion 200r); the gripped position can therefore serve as gripping state identification data for identifying the driver.
  • FIG. 54A shows a state in which the driver grips the lower end of the touch sensor 21 and the grip detection area Arg is set at the lower end of the touch sensor 21.
  • FIG. 54B shows a state in which the driver holds the position slightly above the lower end of the touch sensor 21 and the grip detection area Arg is set at a position away from the lower end of the touch sensor 21.
  • The position of the grip detection area Arg on the touch sensor 21 can be specified by the Y coordinate. As an example, if the Y coordinate values of the grip detection area Arg are integrated, a smaller integrated value indicates that the touch sensor 21 is gripped lower, and a larger integrated value indicates that it is gripped higher.
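The Y-coordinate integration described here can be sketched as a simple sum over the regions assigned to the grip detection area (assuming Y coordinates increase toward the top of the touch sensor 21; the function name is illustrative):

```python
def grip_height_score(grip_area_ys):
    """Integrate (sum) the Y coordinates of the detection regions assigned
    to the grip detection area Arg; a smaller value means the touch sensor
    21 is gripped lower, a larger value means it is gripped higher."""
    return sum(grip_area_ys)
```

Note that comparing integrated values is only meaningful when the number of touched regions is similar; taking the mean instead of the sum would normalize for grip size.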
  • Information indicating the position where the steering wheel 200 is gripped in the circumferential direction of the steering wheel 200 is registered as driver specifying data in the driver database of FIG. 52.
  • the control unit 10 acquires information indicating the position where the steering wheel 200 is gripped in the circumferential direction of the steering wheel 200 as gripping state identification data.
  • Information indicating the position where the steering wheel 200 is gripped is a fifth example of gripping state identification data indicating how the driver grips the annular portion 200r where the touch sensor 21 is mounted. Although identification accuracy decreases, the driver may be identified based only on the information indicating the gripped position.
  • The first to fifth examples of gripping state identification data described above can be combined as appropriate. One or more may be selected in consideration of the required identification accuracy. Of course, using all of the first to fifth examples is preferable because it greatly improves identification accuracy.
  • the present invention can be used as a control device for controlling an arbitrary operation target device in a vehicle. It can also be used for vehicles other than automobiles. Further, in a game device having an operation unit (controller) such as a steering wheel, it can be used as a control device for controlling the game.

Landscapes

  • Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Steering Controls (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

According to the invention, a touch sensor (21) is attached to an annular portion (200r) of a steering wheel (200). A sensor data generation unit (22) generates, based on a touch detection signal obtained from the touch sensor (21), sensor data including position data indicating which detection region is touched. A detection unit (10a) detects whether or not the driver grips the annular portion (200r) and detects an input operation on the touch sensor (21). When the driver grips the annular portion (200r) and a specific input operation is detected, a control unit (10) controls an in-vehicle device (100) according to the specific input operation.
PCT/JP2012/059712 2011-08-11 2012-04-09 Dispositif et procédé de commande d'un dispositif à actionner dans un véhicule, et volant WO2013021685A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US14/176,626 US9267809B2 (en) 2011-08-11 2014-02-10 Control apparatus and method for controlling operation target device in vehicle, and steering wheel
US14/939,375 US9886117B2 (en) 2011-08-11 2015-11-12 Control apparatus and method for controlling operation target device in vehicle, and steering wheel

Applications Claiming Priority (22)

Application Number Priority Date Filing Date Title
JP2011-176168 2011-08-11
JP2011176168 2011-08-11
JP2011-200563 2011-09-14
JP2011200563 2011-09-14
JP2011201356 2011-09-15
JP2011-201354 2011-09-15
JP2011201354 2011-09-15
JP2011-201356 2011-09-15
JP2011-206099 2011-09-21
JP2011206096 2011-09-21
JP2011-206096 2011-09-21
JP2011206150 2011-09-21
JP2011-206150 2011-09-21
JP2011206099 2011-09-21
JP2011212025 2011-09-28
JP2011-212025 2011-09-28
JP2012007894A JP5821647B2 (ja) 2012-01-18 2012-01-18 車両における操作対象装置の制御装置及び制御方法
JP2012-007894 2012-01-18
JP2012-043554 2012-02-29
JP2012043554A JP5825146B2 (ja) 2011-09-14 2012-02-29 車両における操作対象装置の制御装置及び制御方法
JP2012073562A JP5765282B2 (ja) 2012-03-28 2012-03-28 車両における操作対象装置の制御装置及び制御方法
JP2012-073562 2012-03-28

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/176,626 Continuation US9267809B2 (en) 2011-08-11 2014-02-10 Control apparatus and method for controlling operation target device in vehicle, and steering wheel

Publications (1)

Publication Number Publication Date
WO2013021685A1 true WO2013021685A1 (fr) 2013-02-14

Family

ID=47668214

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/059712 WO2013021685A1 (fr) 2011-08-11 2012-04-09 Dispositif et procédé de commande d'un dispositif à actionner dans un véhicule, et volant

Country Status (1)

Country Link
WO (1) WO2013021685A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107851394A (zh) * 2015-07-31 2018-03-27 松下知识产权经营株式会社 驾驶辅助装置、驾驶辅助系统、驾驶辅助方法以及自动驾驶车辆
CN114390982A (zh) * 2019-09-11 2022-04-22 采埃孚股份公司 用于车辆的操作者控制设备、用于车辆的具有操作者控制设备的方向盘、仪表板、中央控制台或扶手、具有操作者控制设备的车辆以及用于操作操作者控制设备的方法
CN115129221A (zh) * 2021-03-26 2022-09-30 丰田自动车株式会社 操作输入装置、操作输入方法以及记录有操作输入程序的计算机可读介质

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS61161852U (fr) * 1985-03-27 1986-10-07
JPH06156114A (ja) * 1992-11-16 1994-06-03 Makoto Ueda 居眠り運転防止器
JP2000228126A (ja) * 1999-02-05 2000-08-15 Matsushita Electric Ind Co Ltd ステアリング入力装置
JP2007076491A (ja) * 2005-09-14 2007-03-29 Hitachi Ltd 車載設備の操作装置

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107851394A (zh) * 2015-07-31 2018-03-27 松下知识产权经营株式会社 驾驶辅助装置、驾驶辅助系统、驾驶辅助方法以及自动驾驶车辆
CN114390982A (zh) * 2019-09-11 2022-04-22 采埃孚股份公司 用于车辆的操作者控制设备、用于车辆的具有操作者控制设备的方向盘、仪表板、中央控制台或扶手、具有操作者控制设备的车辆以及用于操作操作者控制设备的方法
CN115129221A (zh) * 2021-03-26 2022-09-30 丰田自动车株式会社 操作输入装置、操作输入方法以及记录有操作输入程序的计算机可读介质
CN115129221B (zh) * 2021-03-26 2024-06-04 丰田自动车株式会社 操作输入装置、操作输入方法以及记录有操作输入程序的计算机可读介质

Similar Documents

Publication Publication Date Title
US9886117B2 (en) Control apparatus and method for controlling operation target device in vehicle, and steering wheel
JP5825146B2 (ja) 車両における操作対象装置の制御装置及び制御方法
JP5783126B2 (ja) 車両における操作対象装置の制御装置及び制御方法
US10203799B2 (en) Touch input device, vehicle comprising touch input device, and manufacturing method of touch input device
US8775023B2 (en) Light-based touch controls on a steering wheel and dashboard
US8907778B2 (en) Multi-function display and operating system and method for controlling such a system having optimized graphical operating display
US9346356B2 (en) Operation input device for vehicle
EP3466741A1 (fr) Commandes tactile lumineuses sur un volant de direction et tableau de bord
WO2013136776A1 (fr) Dispositif de traitement d'opération d'entrée gestuelle
JP5079582B2 (ja) タッチ式センサ
JP2013075653A (ja) 車両における操作対象装置の制御装置及び制御方法
JP5821647B2 (ja) 車両における操作対象装置の制御装置及び制御方法
WO2013021685A1 (fr) Dispositif et procédé de commande d'un dispositif à actionner dans un véhicule, et volant
US20160320960A1 (en) Touch input apparatus and vehicle having the same
JP2013095289A (ja) 入力装置
JP2009286175A (ja) 車両用表示装置
JP5776590B2 (ja) 車両における操作対象装置の制御装置及び制御方法
JP2013082423A (ja) 車両における操作対象装置の制御装置及び制御方法
JP2013079056A (ja) 車両における操作対象装置の制御装置及び運転者特定方法
JP5821696B2 (ja) 車両における操作対象装置の制御装置及び制御方法
JP5765282B2 (ja) 車両における操作対象装置の制御装置及び制御方法
JP2010061256A (ja) 表示装置
JP5790579B2 (ja) 車両における操作対象装置の制御装置及び制御方法
JP5768755B2 (ja) 車両における操作対象装置の制御装置及び制御方法
JP5783125B2 (ja) 車両における操作対象装置の制御装置及び制御方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12821739

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12821739

Country of ref document: EP

Kind code of ref document: A1