CN105739679B - Steering wheel control system

Steering wheel control system

Info

Publication number: CN105739679B
Authority: CN (China)
Prior art keywords: input, user, vehicle parameter, vehicle, sensor
Prior art date: 2014-12-31
Legal status: Active
Application number: CN201510994355.1A
Other languages: Chinese (zh)
Other versions: CN105739679A
Inventors: D.迪森索, S.马蒂
Current Assignee: Harman International Industries Inc
Original Assignee: Harman International Industries Inc
Priority date: 2014-12-31
Filing date: 2015-12-25
Publication date: 2021-02-19
Application filed by Harman International Industries Inc
Publication of CN105739679A
Application granted
Publication of CN105739679B

Classifications

    • B62D6/00 Arrangements for automatically controlling steering depending on driving conditions sensed and responded to, e.g. control circuits
    • B60K35/00 Arrangement of adaptations of instruments
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • B60K35/10
    • B60K35/80
    • B60R16/027 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for transmission of signals between vehicle parts or subsystems between relatively movable parts of the vehicle, e.g. between steering wheel and column
    • B62D1/046 Adaptations on rotatable parts of the steering wheel for accommodation of switches
    • B62D1/06 Rims, e.g. with heating means; Rim covers
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • B60K2360/143
    • B60K2360/146

Abstract

The present invention relates to a steering wheel control system. Embodiments of the present disclosure set forth a technique for modifying different vehicle parameters based on two-handed input. The technique includes acquiring sensor data associated with a first finger and a second finger of a user. The technique also includes analyzing the sensor data to determine a first position of the first finger and a second position of the second finger. The technique also includes selecting a first vehicle parameter based on the first position of the first finger and modifying the first vehicle parameter based on the second position of the second finger.

Description

Steering wheel control system
Cross Reference to Related Applications
This application claims the benefit of U.S. provisional patent application serial No. 62/098,967, filed on December 31, 2014 and having attorney docket number HRMN/0142 USL. The subject matter of this related application is hereby incorporated herein by reference.
Technical Field
Embodiments of the present disclosure relate generally to human-vehicle interfaces and, more particularly, to steering wheel control systems.
Background
Modern vehicles typically include a variety of subsystems including multimedia subsystems, climate control subsystems, vehicle throttle/steering subsystems, and other types of subsystems for controlling various aspects of the vehicle and related components. To enable a user to interact with each subsystem, a vehicle typically includes several control panels, each of which may be dedicated to controlling one or more of the subsystems. For example, a vehicle may include a dashboard-mounted control panel having a multimedia interface and a climate control interface. Further, the vehicle may include one or more panels located near the steering wheel and including an interface for controlling a throttle subsystem (such as cruise control).
One problem faced when interacting with the panels described above is that the driver of the vehicle needs to remove at least one hand from the steering wheel in order to control the corresponding subsystem. For example, the driver must remove one of his/her hands from the steering wheel in order to interact with the navigation system and/or adjust the temperature or fan speed through the climate control interface. Thus, the driver must divert his/her attention away from the driving task in order to interact with the vehicle subsystems. Such diversions of attention reduce the ability of the driver to safely operate the vehicle, which potentially compromises the safety of the occupants of the vehicle and those in the surrounding environment.
As the foregoing illustrates, techniques for controlling subsystems within a vehicle without requiring the driver to remove his or her hands from the steering wheel would be useful.
Disclosure of Invention
Embodiments of the present disclosure set forth a method for modifying vehicle parameters based on two-handed input (bimanual input). The method includes acquiring sensor data associated with a first finger and a second finger of a user. The method also includes analyzing the sensor data to determine a first position of the first finger and a second position of the second finger, and selecting a first vehicle parameter based on the first position of the first finger. The method also includes modifying the first vehicle parameter based on the second position of the second finger.
Further embodiments provide, among other things, systems and non-transitory computer-readable storage media configured to implement the techniques set forth above.
At least one advantage of the techniques of this disclosure is that a user is able to modify parameters associated with various types of vehicle systems without looking away from the road and/or without taking his or her hands off the steering wheel. Thus, the steering wheel controllers can be operated with a low cognitive load, reducing the degree to which operating the vehicle systems distracts the driver from the task of driving.
Drawings
So that the manner in which the above recited features of the present disclosure can be understood in detail, a more particular description of the disclosure, briefly summarized above, may be had by reference to embodiments, some of which are illustrated in the appended drawings. It is to be noted, however, that the appended drawings illustrate only typical embodiments and are therefore not to be considered limiting of its scope, for the disclosure may admit to other equally effective embodiments.
Fig. 1 illustrates a system for recognizing user input and modifying vehicle parameters, in accordance with various embodiments;
Fig. 2 illustrates an input area that may be implemented in conjunction with the left-hand controller and the right-hand controller of Fig. 1, in accordance with various embodiments;
Figs. 3A and 3B illustrate techniques for modifying different vehicle parameters through the input region of Fig. 2, in accordance with various embodiments;
Figs. 4A-4C illustrate a technique for modifying different vehicle parameters through one of the input regions of Fig. 2 and a force sensor coupled to the steering wheel of Fig. 1, in accordance with various embodiments;
Fig. 5 illustrates a technique for modifying different vehicle parameters through one or more gestures in the input area of Fig. 2, in accordance with various embodiments;
Fig. 6 is a flow diagram of method steps for modifying different vehicle parameters based on two-handed input, according to various embodiments.
Detailed Description
In the following description, numerous specific details are set forth to provide a more thorough understanding of embodiments of the present disclosure. It will be apparent, however, to one skilled in the art, that embodiments of the present disclosure may be practiced without one or more of these specific details.
Fig. 1 illustrates a system 100 for recognizing user input and modifying vehicle parameters, according to various embodiments. The system 100 includes a steering wheel 110, a computing device 120, and auxiliary components 180. The steering wheel 110 includes a left hand controller 112 and a right hand controller 114. Left hand controller 112 and right hand controller 114 include one or more types of sensors configured to acquire data associated with the user's hands (e.g., two-handed input). The auxiliary components 180 are coupled to the computing device 120 by a bus 190 and include one or more vehicle subsystems such as a multimedia system 140, a navigation system 150, a climate control system 160, and a cruise control system 170. Each of the auxiliary components 180 is associated with one or more vehicle parameters that may be modified by the left hand controller 112 and the right hand controller 114.
Computing device 120 includes a processing unit 122, an input/output (I/O) device 124, and a storage unit 126. In various embodiments, computing device 120 may be a mobile computer, a system on a chip (SoC), an infotainment system, a navigation system, a mobile device, a mobile phone, a personal digital assistant, or any other device for practicing one or more embodiments of the present disclosure. As shown, computing device 120 is configured to receive input from left hand controller 112 and right hand controller 114. Computing device 120 may also be coupled to one or more output devices associated with system 100, including one or more devices configured to generate feedback for the user, such as haptic devices and/or speakers.
In general, the computing device 120 is configured to coordinate the overall operation of the system 100. In other embodiments, the computing device 120 may be coupled to the system 100 but separate from the system 100. In such embodiments, the system 100 may include a separate processor that receives data (e.g., sensor data) from the computing device 120 and transmits data (e.g., vehicle parameters) to the computing device 120. In general, any technically feasible system configured to implement the functionality of the system 100 falls within the scope of the present disclosure.
The processing unit 122 may include a Central Processing Unit (CPU), a digital signal processing unit (DSP), and the like. In various embodiments, the processing unit 122 is configured to analyze sensor data obtained by the left hand controller 112 and the right hand controller 114 in order to detect user input. Additionally, the processing unit 122 may be configured to modify vehicle parameters associated with one or more auxiliary components 180 controlled by the system 100. For example and without limitation, the processing unit 122 may execute a control application 130, the control application 130 processing sensor data obtained by the left hand controller 112 and the right hand controller 114 and determining how one or more vehicle parameters should be modified.
The I/O devices 124 may include input devices, output devices, and devices capable of receiving input and providing output. For example and without limitation, the I/O devices 124 may include wired and/or wireless communication devices that transmit data to the left hand controller 112 and/or the right hand controller 114, and/or receive data from the left hand controller 112 and/or the right hand controller 114. The storage unit 126 may include a storage module or a collection of storage modules. Control application 130 within storage unit 126 may be executed by processing unit 122 to implement the overall functionality of system 100. The database 135 may store sensor data, gesture recognition data, display data, vehicle parameters, and the like.
In operation, sensor data obtained through left hand controller 112 and right hand controller 114 is processed by control application 130 to determine the position and/or orientation of the user's hand, arm, and/or fingers. The position and/or orientation of the user's hand, arm, and/or fingers is then analyzed by the control application 130 to determine which vehicle parameter the user selects and how the vehicle parameter will be modified (e.g., by switching the vehicle parameter or increasing/decreasing the vehicle parameter by an amount). For example and without limitation, as described in further detail below, the left-hand controller 112 may be used to select which vehicle parameter is to be modified, and the right-hand controller 114 may be used to specify how that vehicle parameter is to be modified. Alternatively, the right-hand controller 114 may be used to select which vehicle parameter is to be modified, and the left-hand controller 112 may be used to specify how that vehicle parameter is to be modified.
In various embodiments, the control application 130 modifies the vehicle parameter based on determining that user input is received through the left-hand controller 112 (e.g., to select the vehicle parameter) and, substantially simultaneously, user input is received through the right-hand controller 114 (e.g., to specify how the vehicle parameter is to be modified). Requiring that two-handed user input be received through left-hand controller 112 and right-hand controller 114 substantially simultaneously may enable control application 130 to ignore accidental user interactions with left-hand controller 112 and/or right-hand controller 114, and thus more accurately determine the intent of the user. In some embodiments, the control application 130 modifies the vehicle parameter based on determining that user input is received through the left-hand controller 112 within a threshold period of time (e.g., 1 to 3 seconds) before or after user input is received through the right-hand controller 114. Modifying the vehicle parameter only when two-handed user input is received through the left-hand controller 112 and the right-hand controller 114 within a threshold period of time enables the control application 130 to efficiently determine the user's intent without burdening the user with the requirement to input commands substantially simultaneously.
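For illustration only, the pairing check described in this paragraph can be reduced to a simple timestamp comparison, as in the following sketch; the threshold value, event timestamps, and function names are assumptions made for this example rather than details taken from this disclosure.
```python
# Illustrative sketch only (not the actual control application 130): treat two
# single-hand inputs as one two-handed command only if they arrive within a
# threshold window, which helps reject accidental contact with one controller.
THRESHOLD_SECONDS = 2.0  # e.g. somewhere in the 1-3 second range discussed above

def inputs_are_paired(left_timestamp: float, right_timestamp: float,
                      threshold: float = THRESHOLD_SECONDS) -> bool:
    """Return True if the left- and right-hand inputs should be paired."""
    return abs(left_timestamp - right_timestamp) <= threshold

# Example: a left-hand touch at t=10.2 s pairs with a right-hand touch at t=11.0 s,
# but not with one at t=14.0 s.
assert inputs_are_paired(10.2, 11.0)
assert not inputs_are_paired(10.2, 14.0)
```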
In some embodiments, left hand controller 112 and/or right hand controller 114 detect the user input through one or more visual sensors (e.g., a camera), and/or one or more touch sensors (e.g., capacitive sensors, resistive sensors, inductive sensors) configured to acquire data associated with the user's hand, arm, and/or fingers. For example and without limitation, left hand controller 112 and/or right hand controller 114 may detect user input through one or more cameras that capture images of the user's hands and/or steering wheel 110. The image may then be analyzed by control application 130 to determine the location of one or more fingers of the user relative to a particular area of left hand controller 112 and/or right hand controller 114. Subsequently, vehicle parameters and modifications thereto may be determined based on the location of one or more fingers of the user relative to particular areas of the left hand controller 112 and/or the right hand controller 114. In a non-limiting example, the control application 130 may determine (e.g., from sensor data obtained by a camera) that the user's left thumb is located within a first area of the left hand controller 112 and that the user's right thumb is located within a second area of the right hand controller 114. In response, the control application 130 may select a vehicle parameter associated with the first region of the left-hand controller 112 and modify the vehicle parameter by an amount associated with the second region of the right-hand controller 114.
In other embodiments, left hand controller 112 and/or right hand controller 114 may detect user input through one or more touch sensors. Subsequently, the sensor data obtained by the touch sensor may be analyzed by the control application 130 to determine that the user's finger (e.g., thumb) is located at a particular region within the left hand controller 112 and/or the right hand controller 114. Subsequently, the vehicle parameters and modifications thereto may be determined based on the regions of the left hand controller 112 and/or the right hand controller 114 selected by the user. For example and without limitation, as described above, the control application 130 may select a vehicle parameter associated with the region of the left hand controller 112 selected by the user's left thumb, and the control application 130 may modify the vehicle parameter by an amount associated with the region of the right hand controller 114 selected by the user's right thumb. Alternatively, as described above, the functionality of the left hand controller 112 and/or the right hand controller 114 may be switched such that the right thumb selects a vehicle parameter and the left thumb switches the vehicle parameter and/or selects the amount by which the vehicle parameter is modified.
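As an illustrative sketch of the region lookup described above, a touch coordinate reported by such a sensor could be hit-tested against a table of region boundaries; the normalized coordinates, boundary values, and region labels below are hypothetical and only loosely follow the regions of Fig. 2.
```python
from typing import Optional

# Hypothetical input-region boundaries on a normalized touch surface,
# expressed as (x_min, y_min, x_max, y_max).
INPUT_REGIONS = {
    "210-1": (0.00, 0.75, 1.00, 1.00),
    "210-2": (0.00, 0.50, 1.00, 0.75),
    "210-3": (0.00, 0.25, 1.00, 0.50),
    "210-4": (0.00, 0.00, 1.00, 0.25),
}

def region_for_touch(x: float, y: float) -> Optional[str]:
    """Return the input region containing the touch point, or None if the touch
    falls outside every region (e.g., an accidental brush of the controller)."""
    for name, (x0, y0, x1, y1) in INPUT_REGIONS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

# Example: a thumb resting in the lowest quarter of the surface selects region 210-4.
assert region_for_touch(0.5, 0.1) == "210-4"
```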
In other embodiments, the left hand controller 112 may be implemented by a rocker switch that allows the user to scan through and select between multiple vehicle parameters. Similarly, the right hand control 114 may be implemented by a rocker switch that allows the user to toggle, increase, and/or decrease selected vehicle parameters. Additionally, in some embodiments, left hand controller 112 and/or right hand controller 114 may include any other technically feasible sensor for detecting the position and/or orientation of a user's fingers, hands, or arms. Other types of sensors that may be implemented with left hand controller 112 and/or right hand controller 114 to detect user input include force sensors, depth sensors, infrared sensors, time-of-flight sensors, ultrasonic sensors, radar sensors, laser sensors, thermal sensors, structured light sensors, and/or other types of sensors. Force sensors that may be implemented with the left hand controller 112 and/or the right hand controller 114 include, without limitation, pressure sensors, pneumatic sensors, strain gauge load sensors, and/or piezoelectric crystals disposed in the steering wheel 110 and/or coupled to the steering wheel 110.
Fig. 2 illustrates an input area 210 that may be implemented in conjunction with the left-hand controller 112 and the right-hand controller 114 of fig. 1, in accordance with various embodiments. As shown, in some embodiments, left hand controller 112 and right hand controller 114 include a plurality of input regions 210 arranged in a semi-circular configuration. In general, left-hand controller 112 and right-hand controller 114 may implement any of the sensor types described above to detect user input within each input region 210. For illustrative purposes, the embodiments described below implement the left hand controller 112 to select the vehicle parameter to be modified, and the right hand controller 114 to specify how the vehicle parameter is to be modified. However, in various embodiments, some or all of the functionality described herein as being implemented by the left-hand controller 112 may alternatively be implemented by the right-hand controller 114, and some or all of the functionality described herein as being implemented by the right-hand controller 114 may alternatively be implemented by the left-hand controller 112.
In some embodiments, each of the input areas 210 is associated with a different vehicle parameter and/or vehicle parameter modifier. For example, in embodiments where the left hand controller 112 is configured to select a vehicle parameter to be modified, each of the input regions 210-1 through 210-4 may be associated with a different auxiliary component 180. In particular embodiments, input region 210-1 is associated with navigation system 150 parameters (e.g., pan, tilt, zoom, next/previous turn), input region 210-2 is associated with multimedia system 140 parameters (e.g., volume, track, folder, source), input region 210-3 is associated with climate control system 160 parameters (e.g., temperature, fan speed, zone, mode), and input region 210-4 is associated with cruise control system 170 parameters (e.g., toggle on/off, speed, lane change). Alternatively, two or more of the input areas 210-1 through 210-4 may be configured to select different vehicle parameters associated with the same auxiliary component 180.
Additionally, in embodiments where the right hand controller 114 is configured to modify a vehicle parameter (e.g., by toggling the vehicle parameter and/or by increasing/decreasing the vehicle parameter), each of input regions 210-5 and 210-6 may be associated with a different state (e.g., on/off) or modification. In particular embodiments, input region 210-5 is associated with an 'on' state and/or an increasing operation (e.g., to increase a selected vehicle parameter), and input region 210-6 is associated with an 'off' state and/or a decreasing operation (e.g., to decrease a selected vehicle parameter).
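The associations described in the two preceding paragraphs can be pictured as a pair of lookup tables, one per hand, as in the hypothetical sketch below; the component and parameter names are placeholders chosen for illustration, not identifiers used by this disclosure.
```python
# Left-hand regions select a vehicle parameter of an auxiliary component;
# right-hand regions select how that parameter is modified. Illustrative only.
PARAMETER_FOR_LEFT_REGION = {
    "210-1": ("navigation_system", "zoom"),
    "210-2": ("multimedia_system", "volume"),
    "210-3": ("climate_control_system", "temperature"),
    "210-4": ("cruise_control_system", "speed"),
}
MODIFIER_FOR_RIGHT_REGION = {
    "210-5": +1,   # 'on' state / increase
    "210-6": -1,   # 'off' state / decrease
}

def apply_two_handed_command(left_region: str, right_region: str,
                             vehicle_state: dict) -> None:
    """Select a parameter via the left-hand region, modify it via the right."""
    component, parameter = PARAMETER_FOR_LEFT_REGION[left_region]
    delta = MODIFIER_FOR_RIGHT_REGION[right_region]
    vehicle_state[component][parameter] += delta

# Example: left thumb on region 210-3 and right thumb on region 210-6 lowers the temperature.
state = {"climate_control_system": {"temperature": 22},
         "multimedia_system": {"volume": 10},
         "navigation_system": {"zoom": 5},
         "cruise_control_system": {"speed": 100}}
apply_two_handed_command("210-3", "210-6", state)
assert state["climate_control_system"]["temperature"] == 21
```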
In various embodiments, the input regions 210 associated with the left hand controller 112 and/or the right hand controller 114 include topographical features (e.g., unique surface features) that enable a user to tactilely identify the input regions 210. Thus, the user may be able to determine the location of the input regions 210 without having to divert his or her eyes away from the road, and/or may learn the location of each of the input regions 210 more quickly (e.g., through muscle memory). In other embodiments, left hand controller 112 and/or right hand controller 114 do not include any topographical features and/or visual markings delineating different input regions 210. Thus, although the input region 210 shown in fig. 2 is considered a visible region, in the embodiments described herein, there may not be any visible depiction of the input region 210. For example and without limitation, the input region 210 may alternatively correspond to a region of the steering wheel 110, such as a particular region located within the perimeter of the steering wheel 110.
Figs. 3A and 3B illustrate techniques for modifying different vehicle parameters through the input areas 210 of Fig. 2, according to various embodiments. As shown in Fig. 3A, a user may select a vehicle parameter (e.g., temperature, fan speed, zone, mode) associated with the climate control system 160 by selecting input region 210-3 with his or her left thumb. Subsequently, at substantially the same time or within a threshold period of time, the user may specify that the vehicle parameter should be decreased (or toggled) by selecting input area 210-6 with his or her right thumb. Additionally, as shown in Fig. 3B, the user can select a vehicle parameter (e.g., volume, track, folder, source) associated with the multimedia system 140 by selecting input area 210-2 with his or her left thumb. Subsequently, at substantially the same time or within a threshold period of time, the user may specify that the vehicle parameter should be increased (or toggled) by selecting input area 210-5 with his or her right thumb.
After the initial modification of the vehicle parameter, the vehicle parameter may remain selected for a predetermined period of time (e.g., 1-3 seconds) during which the user may continue to modify the vehicle parameter (e.g., via the right-hand controller 114) without reselecting the vehicle parameter. In other embodiments, after the initial modification to the vehicle parameter, the vehicle parameter may remain selected until the user removes his or her finger from the input area 210 associated with the vehicle parameter. For example, referring to FIG. 3A, after initially selecting and modifying the vehicle parameter associated with input area 210-3, the user may continue to modify the vehicle parameter, such as by increasing or decreasing the parameter via right-hand controller 114, without having to reselect the vehicle parameter.
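One possible way to keep a parameter selected for such a window is sketched below; the two-second hold time and the class structure are assumptions made for this example, not specifics from this disclosure.
```python
import time

class ParameterSelection:
    """Keeps a vehicle parameter selected for a hold window after each
    modification, so follow-up right-hand inputs need not re-select it."""

    def __init__(self, hold_seconds: float = 2.0):
        self.hold_seconds = hold_seconds
        self.name = None
        self.last_activity = 0.0

    def select(self, name: str) -> None:
        self.name = name
        self.last_activity = time.monotonic()

    def refresh(self) -> None:
        """Call after each modification to extend the hold window."""
        self.last_activity = time.monotonic()

    def current(self):
        """Return the selected parameter name, or None once the window lapses."""
        if self.name and time.monotonic() - self.last_activity <= self.hold_seconds:
            return self.name
        return None
```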
As described above, in some embodiments, the vehicle parameter may be modified via different input regions 210 (e.g., input regions 210-5 and 210-6 in Figs. 3A and 3B) that increase, decrease, and/or toggle the vehicle parameter. In other embodiments, the vehicle parameter may be modified via a single input area 210 representing a continuous range of values that may be assigned to the vehicle parameter. For example, referring to Figs. 3A and 3B, the right-hand controller 114 may implement a single input region 210 having a first end 220-1 corresponding to a higher value (e.g., a maximum value) and a second end 220-2 corresponding to a lower value (e.g., a minimum value). In such embodiments, the user may move his or her thumb within the single input area 210 in order to dynamically modify the vehicle parameter.
Additionally, in some embodiments, the continuous values associated with a single input area 210 may be statically mapped to specific locations within the input area 210, such that the selected vehicle parameter is modified based on the location within the input area 210 initially selected by the user's finger. In such implementations, the location within the input area 210 initially selected by the user's finger may be determined relative to a reference location (e.g., the center of the input area 210). For example, if the user initially selects a location at or near the first end 220-1 of the input area 210 (e.g., a maximum distance from a reference location at the center of the input area 210), the vehicle parameter may increase to a maximum value.
Alternatively, the continuous values associated with a single input area 210 may be dynamically mapped to locations within the input area 210. For example, the control application 130 may map the location within the input area 210 initially selected by the user to the current value of the vehicle parameter. In such embodiments, the vehicle parameter is not modified when the user's finger initially selects the input area 210, but increases or decreases as the user moves his or her finger toward the first end 220-1 or the second end 220-2 of the input area 210.
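The static and dynamic mappings described above might be expressed as follows; normalizing the thumb position to a 0-1 range and the particular parameter limits are assumptions made for illustration.
```python
def static_mapping(thumb_pos: float, min_value: float, max_value: float) -> float:
    """Static mapping: the absolute thumb position inside the input area
    (0.0 = second end 220-2, 1.0 = first end 220-1) maps directly to a value."""
    thumb_pos = max(0.0, min(1.0, thumb_pos))
    return min_value + thumb_pos * (max_value - min_value)

def dynamic_mapping(initial_pos: float, current_pos: float, current_value: float,
                    min_value: float, max_value: float) -> float:
    """Dynamic mapping: the initially touched position is anchored to the
    parameter's current value; sliding toward either end adjusts it from there."""
    delta = (current_pos - initial_pos) * (max_value - min_value)
    return max(min_value, min(max_value, current_value + delta))

# Example: fan speed on a 0-10 scale. With dynamic mapping, landing mid-area
# leaves it at 4; sliding halfway toward the first end raises it to 9.
assert dynamic_mapping(0.25, 0.75, 4.0, 0.0, 10.0) == 9.0
assert static_mapping(1.0, 0.0, 10.0) == 10.0
```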
Figs. 4A-4C illustrate techniques for modifying different vehicle parameters via one of the input regions 210 of Fig. 2 and a force sensor 410 coupled to the steering wheel of Fig. 1, according to various embodiments. One or more force sensors 410 may be coupled to the steering wheel 110 to detect when a user applies a physical force to the steering wheel 110. Subsequently, based on the force applied to the steering wheel 110, the control application 130 may modify the selected vehicle parameter. For example, the force sensors 410 may measure physical forces applied to the steering wheel 110 and output sensor data to the control application 130 reflecting the magnitude, direction, location, and other attributes associated with those forces. Subsequently, the control application 130 determines a modifier associated with the force and modifies the vehicle parameter based on the modifier.
Examples of forces that may be applied to the steering wheel 110 include, without limitation, torque inputs, pressure inputs, and shear inputs. For example, referring to fig. 4B, after selecting a vehicle parameter (e.g., via left hand controller 112), the user may pull the right side of steering wheel 110 toward himself or herself (from the user's perspective) to cause the vehicle parameter to increase. Additionally, referring to fig. 4C, after selecting the same vehicle parameter or a different vehicle parameter, the user may push the right side of the steering wheel 110 away from himself or herself, causing the vehicle parameter to decrease. Similarly, while the vehicle parameter is selected, the vehicle parameter may be increased or decreased based on the direction in which the torque input or shear input is applied to one or more portions of the steering wheel 110.
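A sketch of how a signed force reading could drive the selected parameter appears below; the sign convention (positive meaning the rim is pulled toward the driver), the dead band, and the gain are assumptions for this example, not values specified in this disclosure.
```python
def modifier_from_force(force_newtons: float, dead_band: float = 2.0,
                        gain: float = 0.1) -> float:
    """Map a signed force on the steering wheel rim to a parameter modifier.

    By the convention assumed here, a positive reading (e.g. the right side of
    the rim pulled toward the driver) increases the selected parameter and a
    negative reading (pushed away) decreases it. A dead band ignores the small
    forces that occur during normal steering.
    """
    if abs(force_newtons) < dead_band:
        return 0.0
    return gain * force_newtons

# Example: an ~8 N pull nudges the selected parameter upward by 0.8 units.
assert modifier_from_force(8.0) == 0.8
assert modifier_from_force(-8.0) == -0.8
assert modifier_from_force(1.0) == 0.0
```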
In general, force sensor 410 may be located in any technically feasible location relative to steering wheel 110 to detect the force applied by the user. For example, force sensor 410 may be positioned within an inner surface of steering wheel 110, coupled to an outer surface of steering wheel 110, and/or coupled to a steering column associated with steering wheel 110.
In various implementations, the right hand controller 114 (or left hand controller 112) implementing the force sensor 410 may operate in substantially the same manner as described above. For example, the control application 130 may modify a particular vehicle parameter only if the user selects the vehicle parameter via the left-hand controller 112 at substantially the same time (or within a threshold period of time thereof) that the force is applied via the right-hand controller 114. Additionally, as long as the user continues to select the vehicle parameter (e.g., by selecting the corresponding input region 210), the user may continue to modify the vehicle parameter via the force sensor 410.
Fig. 5 illustrates a technique for modifying different vehicle parameters through one or more gestures in the input area 210 of fig. 2, according to various embodiments. As described above, the system 100 may include one or more sensors (e.g., vision sensors, depth sensors, infrared sensors, time-of-flight sensors, ultrasonic sensors, radar sensors, laser sensors, thermal sensors, structured light sensors) that track the position of the user's hand. In such embodiments, the user gesture may be detected by the control application 130 through a sensor, and the selected vehicle parameter may be modified based on the gesture.
Examples of gestures that may be performed by a user to modify a vehicle parameter include, without limitation, an up gesture (e.g., by lifting a hand or palm), a down gesture (e.g., by lowering a hand or palm), a clockwise rotation gesture/counterclockwise rotation gesture (e.g., by rotating a hand clockwise or counterclockwise), and a next/previous gesture (e.g., by waving a hand right or left). For example, referring to fig. 5, after selecting the vehicle parameters (e.g., via left hand controller 112), the user may raise or lower his or her palm to perform an up gesture or a down gesture, respectively. Subsequently, the control application 130 may detect an up gesture or a down gesture through the sensors and in response increase or decrease, respectively, the selected vehicle parameter.
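The gesture-to-modification mapping described above might be realized as a small dispatch table, as in the hypothetical sketch below; the gesture labels are assumed to come from whatever recognizer processes the sensor data and are not identifiers used by this disclosure.
```python
# Hypothetical labels emitted by an upstream gesture recognizer; the mapping
# mirrors the examples given in the paragraph above.
GESTURE_TO_DELTA = {
    "palm_up": +1, "palm_down": -1,                        # raise / lower hand or palm
    "rotate_clockwise": +1, "rotate_counterclockwise": -1,
    "swipe_right": +1, "swipe_left": -1,                   # next / previous
}

def apply_gesture(gesture: str, selected_parameter: dict) -> None:
    """Increase or decrease the currently selected vehicle parameter in place."""
    selected_parameter["value"] += GESTURE_TO_DELTA.get(gesture, 0)

# Example: an up gesture raises the selected parameter by one step.
fan = {"name": "fan_speed", "value": 3}
apply_gesture("palm_up", fan)
assert fan["value"] == 4
```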
Alternatively, instead of (or in addition to) performing a gesture, the user may speak a command into a microphone coupled to the system 100. Subsequently, the control application 130 may process the command and modify the vehicle parameters based on the command. Examples of voice commands include, without limitation, "increase"/"decrease," "up"/"down," "next"/"previous," and "open"/"close."
Fig. 6 is a flow diagram of method steps for modifying different vehicle parameters based on two-handed control input, according to various embodiments. Although the method steps are described in conjunction with the systems of fig. 1-5, it will be understood by those of skill in the art that any system configured to perform the method steps in any order is within the scope of the present disclosure.
As shown, method 600 begins at step 610, where control application 130 obtains sensor data associated with a user's hand (e.g., two-handed input) through left-hand controller 112 and/or right-hand controller 114. As described above, the sensor data may include the position and orientation of the user's hand, fingers, and/or arms. At step 620, the control application 130 analyzes sensor data associated with a first set of controllers (e.g., left hand controller 112) to determine which vehicle parameter the user selects. In some embodiments, the control application 130 analyzes the sensor data to determine an input area selected by the user's finger and selects a vehicle parameter associated with the input area.
At step 630, the control application 130 analyzes sensor data associated with a second set of controllers (e.g., the right-hand controller 114) to determine how the vehicle parameter should be modified. As described above, the control application 130 may analyze the sensor data to determine an input area selected by the user's finger and, in some embodiments, a specific location within that input area. Subsequently, the control application 130 may select a parameter modifier associated with the input area or with the location within the input area. As described above, in various embodiments, steps 620 and 630 may be performed substantially simultaneously.
At step 640, the control application 130 determines whether the user input received through the second set of controllers is within a threshold period of time of the user input received through the first set of controllers. In some embodiments, the threshold time period is less than 1 second (e.g., user input is received by the first set of controllers and the second set of controllers substantially simultaneously), while in other embodiments, the threshold time period may be longer (e.g., 1 to 3 seconds). If the user input received through the second set of controllers is within the threshold period of time of the user input received through the first set of controllers, then the method 600 proceeds to step 650, where the control application 130 modifies the selected vehicle parameter based on the parameter modifier. If the user input received through the second set of controllers is not within the threshold period of time of the user input received through the first set of controllers, then the method 600 returns to step 610 where the control application 130 continues to acquire sensor data through the left hand controller 112 and/or the right hand controller 114.
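Putting the steps of method 600 together, a simplified control loop might look like the sketch below; the sensor-polling callables, region tables, and state dictionary are placeholders standing in for the control application 130 and database 135, not their actual implementations.
```python
def run_control_loop(read_left_input, read_right_input, vehicle_state,
                     region_to_parameter, region_to_modifier,
                     threshold_seconds=2.0, max_iterations=None):
    """Simplified rendering of steps 610-650: acquire sensor data from both
    controllers, determine the selected parameter and modifier, check that the
    two inputs fall within the threshold window, then apply the modification."""
    iteration = 0
    while max_iterations is None or iteration < max_iterations:
        iteration += 1
        left = read_left_input()     # steps 610/620: (region, timestamp) or None
        right = read_right_input()   # steps 610/630: (region, timestamp) or None
        if left is None or right is None:
            continue
        (left_region, left_time), (right_region, right_time) = left, right
        if abs(left_time - right_time) > threshold_seconds:
            continue                 # step 640: inputs not paired, keep sampling
        name = region_to_parameter.get(left_region)
        delta = region_to_modifier.get(right_region)
        if name is not None and delta is not None:
            vehicle_state[name] += delta   # step 650: modify the selected parameter
```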
In summary, sensor data associated with one or both hands of a user is acquired by one or more sensors. The sensor data is then analyzed by the control application to determine which vehicle parameter the user selected and how the vehicle parameter should be modified. The control application may further determine whether the selection of the vehicle parameter and the selection of the parameter modifier are received substantially simultaneously or within a threshold period of time of each other. The control application modifies the vehicle parameter based on a parameter modifier if the selection of the vehicle parameter and the selection of the parameter modifier are received substantially simultaneously or within a threshold time period of each other.
At least one advantage of the techniques described herein is that a user is able to modify parameters associated with various types of vehicle systems without looking away from the road and/or without taking his or her hands off the steering wheel. Thus, the steering wheel controllers can be operated with a low cognitive load, reducing the degree to which operating the vehicle systems distracts the driver from the task of driving. In addition, the techniques described herein enable the multiple sets of physical buttons typically present in conventional vehicle systems to be replaced by simpler and cheaper interfaces.
The description of the various embodiments has been presented for purposes of illustration, but is not intended to be exhaustive or limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments.
Aspects of the present embodiments may be embodied as a system, method, or computer program product. Accordingly, aspects of the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.), or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit," "module," or "system." Furthermore, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer-readable media having computer-readable program code embodied thereon.
Any combination of one or more computer-readable media may be used. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
Aspects of the present disclosure are described above with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. Such processors may be, without limitation, general purpose processors, special purpose processors, application specific processors, or field programmable processors.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The present disclosure has been described above with reference to specific embodiments. However, it will be appreciated by persons skilled in the art that various modifications and changes may be made to the specific embodiments described without departing from the broader spirit and scope as set forth in the appended claims. For example and without limitation, although many of the descriptions herein refer to specific types of sensors, input regions, and vehicle parameters, those skilled in the art will appreciate that the systems and techniques described herein may be applicable to other types of sensors, input regions, and vehicle parameters. The foregoing description and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.
While the foregoing is directed to embodiments of the present disclosure, other and further embodiments of the disclosure may be devised without departing from the basic scope thereof, and the scope thereof is determined by the claims that follow.

Claims (10)

1. A system for modifying vehicle parameters based on two-handed input, the system comprising:
at least one sensor configured to acquire first sensor data associated with a first finger of a first hand of a user and second sensor data associated with a second hand of the user;
a processor coupled to the at least one sensor and configured to:
determining that the first sensor data was acquired within a predetermined time period of acquiring the second sensor data;
analyzing the first sensor data to determine a first location of a first finger of the user;
analyzing the second sensor data to determine a first input performed by a second hand of the user;
selecting a first vehicle parameter based on the first position of the first finger;
determining a modified first vehicle parameter based on the first input performed by the user's second hand; and
controlling a first auxiliary component within the vehicle based on the modified first vehicle parameter.
2. The system of claim 1, wherein the processor is configured to determine the modified first vehicle parameter based on determining that the first input was performed by the second hand within a predetermined time period after the first finger was moved to the first position.
3. The system of claim 1, wherein the at least one sensor comprises a force sensor coupled to a steering wheel, and the first input comprises at least one of a torque input, a pressure input, and a shear input applied to the steering wheel.
4. The system of claim 3, wherein the processor is configured to increase the first vehicle parameter when the first input has a first direction and decrease the first vehicle parameter when the first input has a second direction.
5. The system of claim 1, wherein the at least one sensor comprises a touch sensor and the first input comprises at least one of a tap gesture and a swipe gesture.
6. The system of claim 1, wherein the at least one sensor comprises at least one of a camera, a depth sensor, an infrared sensor, a time-of-flight sensor, and an ultrasonic sensor for tracking a position of the second hand of the user, and the first input comprises at least one gesture performed by the second hand of the user.
7. The system of claim 1, wherein the processor is further configured to:
analyzing the sensor data to determine a second position of the first finger of the user and a second input performed by the second hand of the user;
selecting a second vehicle parameter based on the second position of the first finger; and
determining a modified second vehicle parameter based on the second input performed by the second hand of the user.
8. The system of claim 1, wherein the at least one sensor comprises a camera configured to capture one or more images, and the processor is configured to determine the first position of the user's first finger by analyzing the one or more images to determine at least one input area occupied by the user's first finger.
9. A method for modifying vehicle parameters based on two-handed input, the method comprising:
obtaining first sensor data associated with a first thumb of a first hand of a user and second sensor data associated with a second thumb of a second hand of the user;
determining that the first sensor data was acquired within a predetermined time period of acquiring the second sensor data;
analyzing the first sensor data to determine a first input region occupied by the first thumb;
analyzing the second sensor data to determine a second input area occupied by the second thumb;
selecting a first vehicle parameter based on the first input area; determining a modified first vehicle parameter based on the second input area; and
controlling a first auxiliary component within the vehicle based on the modified first vehicle parameter.
10. The method of claim 9, wherein modifying the first vehicle parameter is performed based on determining that the second thumb moved to the second input area within the predetermined time period after the first thumb moved to the first input area.
CN201510994355.1A 2014-12-31 2015-12-25 Steering wheel control system Active CN105739679B (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201462098967P 2014-12-31 2014-12-31
US62/098,967 2014-12-31
US14/805,377 2015-07-21
US14/805,377 US10035539B2 (en) 2014-12-31 2015-07-21 Steering wheel control system

Publications (2)

Publication Number Publication Date
CN105739679A CN105739679A (en) 2016-07-06
CN105739679B true CN105739679B (en) 2021-02-19

Family

ID=55027537

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510994355.1A Active CN105739679B (en) 2014-12-31 2015-12-25 Steering wheel control system

Country Status (3)

Country Link
US (1) US10035539B2 (en)
EP (1) EP3040252B1 (en)
CN (1) CN105739679B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10035539B2 (en) * 2014-12-31 2018-07-31 Harman International Industries, Incorporated Steering wheel control system
TWM504768U (en) * 2015-01-06 2015-07-11 J D Components Co Ltd Non-contact operating device of bicycle
DE102016109142A1 (en) * 2016-05-18 2017-11-23 Preh Gmbh Input device operated in an identification mode and an input mode
WO2018079096A1 (en) * 2016-10-28 2018-05-03 本田技研工業株式会社 Steering wheel unit
CN108509023A (en) * 2017-02-27 2018-09-07 华为技术有限公司 The control method and device of onboard system
US10937259B1 (en) * 2018-03-23 2021-03-02 Armorworks Holdings, Inc. Smart vehicle health system
JP2020138600A (en) * 2019-02-27 2020-09-03 本田技研工業株式会社 Vehicle control system
DE102021110691A1 (en) 2021-04-27 2022-10-27 Bayerische Motoren Werke Aktiengesellschaft Steering wheel and method for controlling a vehicle and vehicle including the steering wheel

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102700479A (en) * 2011-03-28 2012-10-03 株式会社本田阿克塞斯 Hand judgment method and judgment device, and operation device of vehicle-mounted equipment of motor vehicle
CN102785689A (en) * 2011-05-20 2012-11-21 罗伯特·博世有限公司 Haptic steering wheel, steering-wheel system and driver assistance system for a motor vehicle
WO2014085277A1 * 2012-11-27 2014-06-05 Neonode Inc. Light-based touch controls on a steering wheel and dashboard

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9272724B2 (en) * 2009-09-08 2016-03-01 Golomb Mercantile Company Llc Integrated vehicle control system and apparatus
KR20110049379A (en) * 2009-11-05 2011-05-12 삼성전자주식회사 Apparatus for multi touch and proximated object sensing using wedge wave guide
US8738224B2 (en) 2011-01-12 2014-05-27 GM Global Technology Operations LLC Steering wheel system
CN103717478B (en) * 2011-06-22 2017-02-15 Tk控股公司 Sensor system for steering wheel for vehicle
JP2013025620A (en) 2011-07-22 2013-02-04 Alpine Electronics Inc On-vehicle system
US9267809B2 (en) 2011-08-11 2016-02-23 JVC Kenwood Corporation Control apparatus and method for controlling operation target device in vehicle, and steering wheel
JP2013079058A (en) 2011-09-21 2013-05-02 Jvc Kenwood Corp Apparatus and method for controlling device to be operated in vehicle
EP2798439B1 (en) 2011-12-29 2020-11-25 Intel Corporation Methods and systems for typing on keyboards on a steering wheel
KR101490908B1 (en) * 2012-12-05 2015-02-06 현대자동차 주식회사 System and method for providing a user interface using hand shape trace recognition in a vehicle
US10035539B2 (en) * 2014-12-31 2018-07-31 Harman International Industries, Incorporated Steering wheel control system

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102700479A (en) * 2011-03-28 2012-10-03 株式会社本田阿克塞斯 Hand judgment method and judgment device, and operation device of vehicle-mounted equipment of motor vehicle
CN102785689A (en) * 2011-05-20 2012-11-21 罗伯特·博世有限公司 Haptic steering wheel, steering-wheel system and driver assistance system for a motor vehicle
WO2014085277A1 * 2012-11-27 2014-06-05 Neonode Inc. Light-based touch controls on a steering wheel and dashboard

Also Published As

Publication number Publication date
EP3040252B1 (en) 2020-09-30
US10035539B2 (en) 2018-07-31
US20160185385A1 (en) 2016-06-30
EP3040252A1 (en) 2016-07-06
CN105739679A (en) 2016-07-06

Similar Documents

Publication Publication Date Title
CN105739679B (en) Steering wheel control system
CN109478354B (en) Haptic guidance system
US10053110B2 (en) Systems and methodologies for controlling an autonomous vehicle
US9239624B2 (en) Free hand gesture control of automotive user interface
US9440537B2 (en) Method and device for operating functions displayed on a display unit of a vehicle using gestures which are carried out in a three-dimensional space, and corresponding computer program product
US9937839B2 (en) Feedback by modifying stiffness
JP5703800B2 (en) Fingertip touch determination device and fingertip touch determination method
US9983745B2 (en) Touch sensor device and controller
JP6323960B2 (en) Input device
CN107547738B (en) Prompting method and mobile terminal
JP6188998B2 (en) Touch panel control device and in-vehicle information device
US20130342443A1 (en) Gesture detecting apparatus and method for determining gesture according to variation of velocity
US10585487B2 (en) Gesture interaction with a driver information system of a vehicle
WO2018061603A1 (en) Gestural manipulation system, gestural manipulation method, and program
US20130176214A1 (en) Touch control method
JP2017191496A (en) Gesture determination device
KR101500412B1 (en) Gesture recognize apparatus for vehicle
JP6631329B2 (en) Information processing device and program
WO2020195835A1 (en) Touch position detection system
US11042293B2 (en) Display method and electronic device
JP6177627B2 (en) Touch operation device and touch operation program
JP6393604B2 (en) Operating device
JP6315443B2 (en) Input device, input detection method for multi-touch operation, and input detection program
JP6428382B2 (en) Input system, input device, and program
JP6447179B2 (en) Information input system and input device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant