JP6413647B2 - Operation input device - Google Patents

Operation input device

Publication number
JP6413647B2
Authority
JP
Japan
Prior art keywords
gesture
operation
steering wheel
vehicle
region
Prior art date
Legal status
Active
Application number
JP2014222788A
Other languages
Japanese (ja)
Other versions
JP2016091182A (en)
Inventor
忠孝 八幡
慎梧 豊留
Original Assignee
三菱自動車工業株式会社 (Mitsubishi Motors Corporation)
Priority date
Filing date
Publication date
Application filed by 三菱自動車工業株式会社 (Mitsubishi Motors Corporation)
Priority to JP2014222788A
Publication of JP2016091182A
Application granted
Publication of JP6413647B2
Application status: Active

Description

  The present invention relates to an operation input device that receives an operation input to an operation target device.

Conventionally, an operation input device that recognizes an operator's gesture and receives an operation input to an operation target device is known. Compared with operation input using an input unit such as a switch or a button, operation input using a gesture can be performed without directing the operator's line of sight or attention to the input unit, which makes it particularly effective for operation input to in-vehicle equipment.
For example, in Patent Document 1 below, a plurality of gesture regions provided in correspondence with a plurality of predetermined operation target devices are irradiated with light of different colors, and, based on the result of photographing the plurality of gesture regions, a gesture operation is detected together with the gesture region in which it was performed. Control is then performed so that the operation target device corresponding to the detected gesture region executes a predetermined operation corresponding to the detected gesture operation.

JP 2014-153554 A

A gesture is generally performed by moving the operator's hand; in particular, when the operation target device is an in-vehicle device, the hand is the driver's. While the vehicle is being driven, the driver's hands usually grip the steering wheel, but one hand may be released from the wheel while a gesture is performed. Even if a gesture is performed while holding the steering wheel, part of the driver's attention is directed toward the gesture.
Therefore, when a driving operation needs to be performed more carefully, for example when the steering wheel is being steered through a large angle, operation input using a gesture may not be preferable.
Further, when the steering amount of the steering wheel is large, the driving operation (steering of the steering wheel) may be erroneously recognized as a gesture operation, which can increase the erroneous recognition rate of the operation input device.

The present invention has been made in view of such circumstances, and an object of the present invention is to improve safety during steering and to reduce the erroneous recognition rate of operation input using gestures.

In order to achieve the above object, an operation input device according to the invention of claim 1 is an operation input device that recognizes an operator's gesture in a three-dimensional space and receives an operation input to an operation target device, wherein the operation target device is an on-vehicle device mounted on a vehicle. The device comprises: steering angle detection means for detecting a steering angle of a steering wheel of the vehicle; gesture operation propriety determination means for determining, based on the steering angle, whether to accept an operation input using the gesture; photographing means for continuously photographing the three-dimensional space; and gesture identification means for identifying, based on the photographed image photographed by the photographing means, whether the operator has performed a predetermined gesture when operation input using the gesture is permitted, and for outputting, when the predetermined gesture has been performed, a control signal that causes the operation target device to execute an operation corresponding to the predetermined gesture. A plurality of types of the predetermined gesture are set, and the gesture operation propriety determination means divides the plurality of types into a prohibited gesture group, whose acceptance is prohibited when the steering angle is equal to or greater than a predetermined angle, and a permitted gesture group, which is accepted even when the steering angle is equal to or greater than the predetermined angle. When the steering angle is equal to or greater than the predetermined angle, the gesture identification means identifies whether the operator has performed a gesture belonging to the permitted gesture group and, when such a gesture has been performed, outputs a control signal that causes the operation target device to execute the operation corresponding to the gesture. The prohibited gesture group includes at least an operation that is likely to be erroneously recognized as the operator's steering of the steering wheel.
In the operation input device according to the second aspect of the invention, the operation likely to be erroneously recognized as the operator's steering of the steering wheel is a gesture of reciprocating the operator's hand in the vehicle width direction along the steering wheel.
In the operation input device according to the third aspect of the invention, the predetermined gesture is specified by a gesture action and a gesture region indicating the position in the three-dimensional space where the gesture action is performed, and the gesture region is obtained by dividing the three-dimensional space into a plurality of regions along at least two coordinate axis directions.
In the operation input device according to the fourth aspect of the invention, a steering wheel of the vehicle is arranged in the three-dimensional space, and the gesture regions are obtained by dividing the three-dimensional space, with the center point of the steering wheel as the origin, along a substantially vertical axis substantially following the vertical direction of the vehicle and a substantially longitudinal axis substantially following the longitudinal direction of the vehicle. The space along the substantially vertical axis is divided into an inner peripheral region of the steering wheel, an outer edge region of the steering wheel, and an upper region above the outer edge region, and the space along the substantially longitudinal axis is divided into a front region on the side where the steering wheel faces the driver and a rear region on the side where the steering wheel is mounted to the vehicle body.
In the operation input device according to the fifth aspect of the invention, a steering wheel of the vehicle is arranged in the three-dimensional space, and the gesture regions are obtained by dividing the three-dimensional space, with the center point of the steering wheel as the origin, along a substantially vertical axis substantially following the vertical direction of the vehicle, a substantially longitudinal axis substantially following the longitudinal direction of the vehicle, and a substantially vehicle-width-direction axis substantially following the vehicle width direction. The space along the substantially vertical axis is divided into an inner peripheral region of the steering wheel, an outer edge region of the steering wheel, and an upper region above the outer edge region; the space along the substantially longitudinal axis is divided into a front region on the side where the steering wheel faces the driver and a rear region on the side where the steering wheel is mounted to the vehicle body; and the space along the substantially vehicle-width-direction axis is divided into a region on one vehicle side of the center point of the steering wheel and a region on the other vehicle side of the center point.

According to the present invention, whether to accept an operation input using a gesture is determined based on the steering angle of the steering wheel of the vehicle. The operator in the vehicle is in many cases the driver, and the physical and psychological state of the driver varies greatly with the traveling state of the vehicle, for example with the magnitude of the steering angle. By determining whether operation input using a gesture is possible based on the steering angle of the steering wheel, an operation input that matches the physical and psychological state of the operator can be realized, and operation input using gestures can function more effectively. In addition, in-vehicle devices can be operated without using an input unit or the like while driving a vehicle that requires forward gaze, improving safety during driving.
Also according to the present invention, when the steering angle is equal to or greater than the predetermined angle, that is, when the driving operation needs to be performed more carefully and the body of the operator (driver) is largely occupied by steering, some gestures are prohibited as the prohibited gesture group while the remaining gestures are still accepted as the permitted gesture group. Therefore, by prohibiting operation inputs that are less necessary during steering, the number of operation inputs during steering can be reduced, making it easier for the operator (driver) to direct attention to the driving operation.
Further, when an operator's gesture in the three-dimensional space is recognized and an operation input is accepted, a predetermined gesture is identified based on both whether a predetermined gesture action was performed and the region in which it was performed. Therefore, compared with the case where gestures are performed in a single undivided space, the possibility of erroneously recognizing an ordinary movement with no intention of operation input as an operation-input gesture can be reduced. In addition, since the gesture regions divide the three-dimensional space into a plurality of regions along at least two coordinate axis directions, the position where a gesture action is performed can be specified in more detail, improving the recognition rate of operation input. Further, the number of types of operation input can be increased by combining gesture actions and gesture regions.
Finally, according to the present invention, the operator can easily recognize the coordinate axis directions with reference to the steering wheel of the vehicle, so gesture operations can be performed easily and accurately.

FIG. 1 is a block diagram showing the configuration of the operation input device 10 according to the embodiment.
FIG. 2 is a front view of a steering wheel 20.
FIG. 3 is a side view of the steering wheel 20 (viewed in the direction of arrow A in FIG. 2).
FIG. 4 is a table schematically showing the contents of the operation identification database DB.
FIGS. 5 to 8 are explanatory drawings schematically showing gestures of operation input.
FIG. 9 is a flowchart illustrating the processing procedure performed by the operation input device 10.
FIG. 10 shows another example of region setting in the three-dimensional space.

  Exemplary embodiments of an operation input device according to the present invention will be explained below in detail with reference to the accompanying drawings.

(Embodiment 1)
FIG. 1 is a block diagram illustrating a configuration of an operation input device 10 according to the embodiment.
The operation input device 10 according to the embodiment is mounted on a vehicle (not shown), recognizes an operator's gesture in the three-dimensional space S, and accepts operation input to an in-vehicle device 30 (operation target device) in the vehicle.
In the present embodiment, the three-dimensional space S is the cabin space of the vehicle, and in particular the space around the steering wheel 20 near the driver's seat.
In the present embodiment, the surrounding space is divided into a plurality of regions (gesture regions) with the steering wheel 20 as a reference, and the operation content is identified according to which gesture (gesture action) is performed in which region.
That is, when an operation input using a gesture is performed, the operation input device 10 specifies the content of the operation input based on the gesture action and the gesture region indicating the position where the gesture action is performed.
By establishing a reference member (the steering wheel 20) in the space in this way, it becomes easy for the operator (driver) to recognize the reference position of a gesture and to perform the operation, and at the same time the gesture recognition rate of the operation input device 10 can be improved.

FIG. 2 is a front view of the steering wheel 20, and FIG. 3 is a side view of the steering wheel 20 (viewed in the direction of arrow A in FIG. 2).
As shown in FIG. 2, the steering wheel 20 includes an annular grip portion 22, a center portion 24 disposed at the center of the grip portion 22, and a plurality of spoke portions 26 extending radially left and right from the center portion 24 to connect the center portion 24 and the grip portion 22.
Further, as shown in FIG. 3, the steering wheel 20 is disposed, via a column cover 32, on an instrument panel 31 formed to extend in the vehicle width direction at the front of the vehicle interior.
The steering wheel 20 is connected to the upper end of the steering shaft 34 in the column cover 32. The steering shaft 34 is rotatably supported in a cylindrical steering column 36 and extends from the column cover 32 to the instrument panel 31.
The steering column 36 is provided with a tilt device 38 that changes the angle of the steering wheel 20 and a steering angle sensor 39 that is integrated with the tilt device 38 and detects the steering angle of the steering wheel 20. These mechanisms are accommodated in the instrument panel 31.

In the present embodiment, the gesture region is specified by dividing the three-dimensional space S around the steering wheel 20 into a plurality of regions along at least two coordinate axis directions.
In the present embodiment, the three-dimensional space S is divided along three coordinate axis directions with the center point of the steering wheel 20 as the origin O: the X axis, a substantially vertical axis substantially following the vehicle vertical direction; the Z axis, a substantially longitudinal axis substantially following the vehicle longitudinal direction; and the Y axis, a substantially vehicle-width-direction axis substantially following the vehicle width direction. The gesture regions are specified by this division.
Here, "substantially following the vertical direction or the front-rear direction of the vehicle" reflects the fact that, as shown in FIG. 3, the steering wheel 20 is usually inclined with respect to the front-rear direction of the vehicle; when the coordinate axes are set with reference to the wheel's ring, they do not strictly follow the vertical or front-rear direction of the vehicle.

As shown in FIG. 2, the space along the substantially vertical axis (X axis) is divided into an inner peripheral region X1 of the steering wheel 20, an outer edge region X2 of the steering wheel 20, and an upper region X3 above the outer edge region X2.
The outer edge region X2 is, for example, a region extending upward from the upper end 22A of the grip portion 22 of the steering wheel 20 by several centimeters (roughly the height of the operator's fist).
In the present embodiment, the regions X1 to X3 are set only in the space above the origin O, because the space below the origin O hardly enters the operator's field of view and is difficult to use for operation.
However, as shown in FIG. 10A, the regions X1 to X3 may also be set in the space below the origin O. Further, as shown in FIG. 10B, the regions X1 to X3 set in the space above the origin O and the corresponding regions X1′ to X3′ set in the space below the origin O may be treated as separate regions.

  As shown in FIG. 2, the space along the substantially vehicle-width-direction axis (Y axis) is divided into a region on one vehicle side of the center point (origin O) of the steering wheel 20 (the right region Y1, located on the right side as viewed from the operator in FIG. 2) and a region on the other vehicle side of the center point (the left region Y2, located on the left side as viewed from the operator in FIG. 2).

  As shown in FIG. 3, the space along the substantially longitudinal axis (Z axis) is divided into a front region Z1 on the side where the steering wheel 20 faces the driver and a rear region Z2 on the side where the steering wheel 20 is attached to the vehicle body.

  In FIGS. 2 and 3, the case where the three-dimensional space S is divided along three coordinate axis directions has been described; however, the three-dimensional space S may instead be divided along two coordinate axis directions, for example the X axis (substantially vertical axis) and the Z axis (substantially longitudinal axis).
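
As an illustration of this region division, the following sketch classifies a hand position, expressed in wheel-centered coordinates, into the X, Y, and Z regions described above; the coordinate convention, the metric units, and the numeric boundaries (wheel radius, width of the outer edge band) are assumptions for illustration and are not taken from the patent.

```python
# Illustrative sketch (not from the patent text): classifying a hand position,
# given in wheel-centered coordinates, into the gesture regions X1-X3, Y1/Y2,
# and Z1/Z2 described above. The numeric boundaries (meters) are assumptions.
from dataclasses import dataclass

WHEEL_RADIUS = 0.19  # assumed grip-portion radius
EDGE_BAND = 0.05     # assumed width of outer edge region X2 ("fist height")

@dataclass
class HandPosition:
    x: float  # along the substantially vertical axis (up positive)
    y: float  # along the substantially vehicle-width axis (right positive)
    z: float  # along the substantially longitudinal axis (toward driver positive)

def classify_regions(p: HandPosition) -> tuple[str, str, str]:
    """Return the (X, Y, Z) region labels for a hand position."""
    if p.x <= WHEEL_RADIUS:                  # within ring height (upper half,
        x_region = "X1"                      # as in Fig. 2): inner peripheral
    elif p.x <= WHEEL_RADIUS + EDGE_BAND:
        x_region = "X2"                      # outer edge region
    else:
        x_region = "X3"                      # upper region above the outer edge
    y_region = "Y1" if p.y >= 0 else "Y2"    # right / left of center point O
    z_region = "Z1" if p.z >= 0 else "Z2"    # driver side / body-mount side
    return x_region, y_region, z_region

print(classify_regions(HandPosition(x=0.26, y=0.05, z=-0.03)))  # ('X3', 'Y1', 'Z2')
```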

Returning to the description of FIG. 1, the operation input device 10 includes a steering angle detection unit 11, a gesture operation availability determination unit 12, a photographing unit 13, a gesture identification unit 14, and an operation identification database DB.
Among the above components, the gesture operation propriety determination means 12 and the gesture identification means 14 are implemented by a processing unit such as a vehicle ECU. The vehicle ECU includes a CPU, a ROM that stores a control program, a RAM serving as the working area of the control program, an EEPROM that holds various data rewritably, an interface unit that interfaces with peripheral circuits, and the like. When the CPU executes the control program, it functions as the gesture operation propriety determination means 12 and the gesture identification means 14.

The steering angle detection means 11 detects the steering angle of the steering wheel 20. In the present embodiment, the steering angle detection means 11 is the steering angle sensor 39 shown in FIG.
Based on the steering angle detected by the steering angle detection unit 11, the gesture operation propriety determination unit 12 determines whether to accept an operation input using a gesture.
More specifically, when the steering angle of the steering wheel 20 is equal to or greater than a predetermined angle, the gesture operation propriety determination means 12 prohibits at least some gesture operations, because more of the operator's attention needs to be directed to the driving operation and because erroneous gesture recognition becomes more likely.
In the following description, "equal to or greater than a predetermined angle" means that the steering wheel 20 is rotated by the predetermined angle or more, clockwise or counterclockwise, from the reference position (the straight-travel position); the rotation direction is not limited.

The determination by the gesture operation propriety determination means 12 takes, for example, one of the following two forms, of which <1> is mainly described below.
<1> When the steering angle is large (equal to or greater than the predetermined angle), some of the plural types of operation inputs are prohibited and the rest are accepted as usual.
That is, a plurality of types of gestures recognized as operation inputs (predetermined gestures) are set, and they are divided into a prohibited gesture group, whose acceptance is prohibited when the steering angle is equal to or greater than the predetermined angle, and a permitted gesture group, which is accepted even in that case.
<2> When the steering angle is large (equal to or greater than the predetermined angle), operation input using gestures is prohibited entirely.
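
A minimal sketch of this determination follows; form <2> corresponds to an empty permitted group. The group labels and the 30-degree threshold are illustrative assumptions (the patent says only "predetermined angle"), and the absolute value is compared because the rotation direction is not limited.

```python
# Minimal sketch of the gesture operation propriety determination.
PREDETERMINED_ANGLE_DEG = 30.0  # assumption; the patent only says "predetermined angle"

PERMITTED_GROUP = {"op1_small_tail_lamps", "op5_headlamps_on_off"}
PROHIBITED_GROUP = {"op2_high_low_beam", "op3_left_turn_lamp", "op4_right_turn_lamp"}

def accepted_gestures(steering_angle_deg: float) -> set[str]:
    """Return the set of operation inputs accepted at this steering angle.

    The rotation direction is not limited, so the absolute value is compared
    with the predetermined angle.
    """
    if abs(steering_angle_deg) >= PREDETERMINED_ANGLE_DEG:
        return set(PERMITTED_GROUP)            # type <1>: only the permitted group
    return PERMITTED_GROUP | PROHIBITED_GROUP  # below threshold: everything

print(accepted_gestures(5.0))    # all five operation inputs
print(accepted_gestures(-45.0))  # only the permitted gesture group
```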

In order to describe the gesture operation availability determination unit 12 in more detail, the operation identification database DB will be described.
The operation identification database DB records data that identifies the operation content from which gesture operation is performed in which region of the three-dimensional space S.
FIG. 4 is a table schematically showing the contents of the operation identification database DB.
In the table of FIG. 4, for five types of operation inputs (first to fifth operation inputs), the operation target device 1481, the operation content 1482, the gesture content 1483, and the operation input propriety 1484 during large steering (when the steering angle is equal to or greater than the predetermined angle) are shown.
Note that the gesture content 1483 is specified from two elements: a gesture action and an area in which the gesture action is performed (gesture area).

The operation target device 30 of the first operation input is the small lamps and tail lamps of the vehicle, and the operation content is switching them on and off: when this gesture is performed with the small lamps and tail lamps off, they are turned on, and when it is performed with them lit, they are turned off.
The gesture content of the first operation input is a gesture action of changing the operator's hand from "goo" (a closed fist) to "par" (an open palm) in the upper region X3 and the rear region Z2 of the steering wheel 20.
That is, as shown in FIG. 5, it is a gesture in which the operator's hand H is positioned in the region near the instrument panel 31 above the steering wheel 20 and the shape of the hand H is changed from goo to par.
In FIGS. 5, 6, and 8, view (A) is a front view of the steering wheel 20 and view (B) is a side view of the steering wheel 20. FIGS. 7A and 7B are both front views of the steering wheel 20.
The first operation input is permitted even when the steering angle of the steering wheel 20 is a predetermined angle or more. That is, the first operation input belongs to the permission gesture group.

The operation target device 30 of the second operation input is the headlights of the vehicle. When the gesture is performed with the headlights on, the operation content is switching the headlight beam between high and low; when it is performed with the headlights off, passing illumination (momentary high-beam lighting) is performed.
The gesture content of the second operation input is a gesture action of moving the operator's hand, within the inner peripheral region X1 of the steering wheel 20, from the front region Z1 to the rear region Z2.
That is, as shown in FIG. 6, it is a gesture of passing the hand through the inner periphery of the steering wheel 20.
The second operation input is prohibited when the steering angle of the steering wheel 20 is equal to or greater than the predetermined angle; that is, the second operation input belongs to the prohibited gesture group. This is because, when the steering angle is equal to or greater than the predetermined angle, the position of the inner peripheral region X1 is shifted greatly to the left or right, which makes the gesture difficult to perform and may also cause erroneous recognition in the operation input device 10.

The operation target device 30 of the third operation input is the left turn lamp as viewed from the operator, and the operation content is turning it on.
The gesture content of the third operation input is a gesture action of moving the operator's hand from the right region Y1 to the left region Y2 along the outer edge region X2 of the steering wheel 20 and then returning it to the right region Y1.
That is, as shown in FIG. 7A, it is a gesture of reciprocating the operator's right hand H in the vehicle width direction along the outer edge of the steering wheel 20, which simulates the action of rotating the steering wheel 20 to the left when the operator makes a left turn.
The third operation input is prohibited when the steering angle of the steering wheel 20 is equal to or greater than the predetermined angle; that is, the third operation input belongs to the prohibited gesture group. This is because the turn lamp is normally turned on before the vehicle turns left or right; when the steering angle is equal to or greater than the predetermined angle, the vehicle has already begun to turn and the timing for turning on the turn lamp has passed, and there is a high possibility that the driver's (operator's) steering of the steering wheel 20 would be erroneously recognized as this gesture.

The operation target device 30 of the fourth operation input is the right turn lamp as viewed from the operator, and the operation content is turning it on.
The gesture content of the fourth operation input is a gesture action of moving the operator's hand from the left region Y2 to the right region Y1 along the outer edge region X2 of the steering wheel 20 and then returning it to the left region Y2.
That is, as shown in FIG. 7B, the gesture is a reciprocation of the operator's left hand H in the vehicle width direction along the outer edge of the steering wheel 20. This simulates the operation of rotating the steering wheel 20 to the right when the operator performs a right turn operation.
Similarly to the third operation input, the fourth operation input is also prohibited when the steering angle of the steering wheel 20 is greater than or equal to a predetermined angle. That is, the fourth operation input belongs to the prohibited gesture group.

The operation target device 30 of the fifth operation input is the headlamps of the vehicle, and the operation content is switching them on and off: when this gesture is performed with the headlamps off, they are turned on, and when it is performed with the headlamps lit, they are turned off.
The gesture content of the fifth operation input is a gesture action of holding the hand toward the steering wheel 20 in the inner peripheral region X1 and the front region Z1 of the steering wheel 20.
That is, as shown in FIG. 8, it is a gesture of holding the operator's open hand H up so that it directly faces the steering wheel 20.
The fifth operation input is permitted even when the steering angle of the steering wheel 20 is greater than or equal to a predetermined angle. That is, the fifth operation input belongs to the permission gesture group.
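
The five operation inputs above can be summarized, in the spirit of FIG. 4, as a plain lookup table. The sketch below is an assumed encoding for illustration: the action labels, region codes, and record fields are invented names, not the patent's notation.

```python
# Sketch of the operation identification database DB of Fig. 4 as a mapping:
# (action label, required gesture regions) -> operation record.
OPERATION_DB = {
    ("fist_to_open", frozenset({"X3", "Z2"})): {
        "device": "small lamps / tail lamps", "operation": "toggle on-off",
        "allowed_during_large_steering": True,   # permitted gesture group
    },
    ("pass_through_ring", frozenset({"X1", "Z1", "Z2"})): {
        "device": "headlights", "operation": "high/low switch or passing",
        "allowed_during_large_steering": False,  # prohibited gesture group
    },
    ("reciprocate_right_left", frozenset({"X2", "Y1", "Y2"})): {
        "device": "left turn lamp", "operation": "turn on",
        "allowed_during_large_steering": False,
    },
    ("reciprocate_left_right", frozenset({"X2", "Y1", "Y2"})): {
        "device": "right turn lamp", "operation": "turn on",
        "allowed_during_large_steering": False,
    },
    ("open_palm_facing_wheel", frozenset({"X1", "Z1"})): {
        "device": "headlamps", "operation": "toggle on-off",
        "allowed_during_large_steering": True,
    },
}
```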

Returning to the description of FIG. 1, the photographing unit 13 continuously photographs a three-dimensional space. Continuous shooting refers to shooting a group of images (video data) in which the behavior (gesture) of the operator in a three-dimensional space can be recognized.
The photographing means 13 is, for example, an infrared camera, and can photograph a video in a three-dimensional space even at night. Further, the photographing means 13 may be composed of a plurality of cameras (stereo cameras or the like).
The installation position of the photographing means 13 is arbitrary, provided that the steering wheel 20 and the above-described regions X1 to X3, Y1, Y2, Z1, and Z2 fall within its photographing range.

When operation input using a gesture is permitted, the gesture identification means 14 identifies, based on the captured image photographed by the photographing means 13, whether the operator has performed a predetermined gesture and, when the predetermined gesture has been performed, outputs a control signal that causes the operation target device 30 to perform an operation corresponding to the gesture.
The gesture identification unit 14 includes an operation identification unit 140, a region identification unit 142, an operation content identification unit 144, and a device control unit 146.

The action identifying means 140 identifies, based on the captured image photographed by the photographing means 13, whether the operator has performed a predetermined gesture action in the three-dimensional space.
In the present embodiment, the operator is the driver and the part performing the action is the hand H. The action identifying means 140 identifies a hand appearing in the captured image using a known image recognition technique (pattern recognition technique), and further recognizes the hand's movement based on how the position of each point of the hand changes over time. It then determines whether the recognized hand movement matches a predetermined gesture action.
The predetermined gesture actions are all gesture actions recorded in the operation identification database DB when the steering angle of the steering wheel 20 is less than the predetermined angle, and only the gesture actions belonging to the permitted gesture group in the operation identification database DB when the steering angle is equal to or greater than the predetermined angle.
Further, in the configuration in which all operation inputs are prohibited when the steering angle of the steering wheel 20 is equal to or greater than the predetermined angle, no predetermined gesture action remains in that state. Image recognition of the captured image by the action identifying means 140 could therefore be stopped, but since the steering angle changes frequently, it is preferable to continue image recognition at all times.
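
The patent leaves the recognition step to known image-recognition techniques; purely for illustration, the sketch below classifies an already-tracked hand trajectory (a wheel-centered position plus an open/closed flag per frame, produced by some hypothetical upstream detector) into the action labels of the earlier DB sketch. The thresholds are arbitrary assumptions.

```python
# Illustrative sketch only: hand tracking is assumed to be done by a known
# image-recognition technique; this classifies its per-frame output.
from typing import List, Optional, Tuple

Sample = Tuple[float, float, float, bool]  # (x, y, z, hand_is_open) per frame

def classify_action(traj: List[Sample]) -> Optional[str]:
    """Heuristically map a tracked hand trajectory to a gesture action label."""
    if len(traj) < 2:
        return None
    _, ys, zs, opens = zip(*traj)
    if not opens[0] and opens[-1]:
        return "fist_to_open"               # goo -> par (1st operation input)
    if zs[0] > 0 and zs[-1] < 0:
        return "pass_through_ring"          # front Z1 -> rear Z2 (2nd)
    y_swing = max(ys) - min(ys)
    if y_swing > 0.20 and abs(ys[-1] - ys[0]) < 0.05:
        # large sideways excursion that returns: reciprocation along the rim
        going_left = ys[1] < ys[0]
        return "reciprocate_right_left" if going_left else "reciprocate_left_right"
    if all(opens) and y_swing < 0.05:
        return "open_palm_facing_wheel"     # hand held toward the wheel (5th)
    return None
```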

The region identifying means 142 identifies in which of the plurality of regions X1 to X3, Y1, Y2, Z1, and Z2 the predetermined gesture action was performed.
It does so based on the relative relationship between the position of the steering wheel 20 and the position of the operator's hand in the captured image during the gesture action.

The operation content specifying unit 144 specifies the content (operation content) of the operation input to the operation target device 30 based on the predetermined gesture operation and the area where the predetermined gesture is performed.
The operation content specifying unit 144 determines whether the combination of the type of gesture action identified by the action identifying unit 140 and the area identified by the area identifying unit 142 matches the operation input recorded in the operation identification database DB. If they match, the operation content to the operation target device 30 corresponding to the operation input is specified.
For example, it is assumed that the gesture action identified by the action identification unit 140 is a gesture action (a gesture action in the first operation input) that changes the shape of the operator's hand from goo to par.
At this time, if the regions identified by the region identifying means 142 are the upper region X3 and the rear region Z2 of the steering wheel 20, the operation content specifying means 144 determines that the first operation input has been performed and specifies that the small lamps and tail lamps of the vehicle are to be switched on or off.
On the other hand, if the region identified by the region identifying means 142 is, for example, the inner peripheral region X1 or the front region Z1 of the steering wheel 20, there is no corresponding operation input, so the operation content specifying means 144 determines that no valid operation input has been performed.
In other words, the operation content specifying means 144 recognizes a gesture as a valid operation input only when the combination of the gesture action identified by the action identifying means 140 and the region identified by the region identifying means 142 matches one of the operation inputs recorded in the operation identification database DB.
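
A minimal sketch of this validity check, reusing the assumed encoding of the earlier DB sketch; since the database records list only the regions relevant to each gesture, a subset test is used here (an illustrative design choice, not stated in the patent).

```python
# Sketch of the check in the operation content specifying means 144: a gesture
# is a valid operation input only if (action, regions) matches a DB record.
from typing import FrozenSet, Optional

def identify_operation(action: str, observed: FrozenSet[str],
                       db: dict) -> Optional[dict]:
    """Return the record whose action matches and whose regions were all observed."""
    for (db_action, db_regions), record in db.items():
        if db_action == action and db_regions <= observed:
            return record
    return None

# One-entry example mirroring the first operation input:
db = {("fist_to_open", frozenset({"X3", "Z2"})):
      {"device": "small lamps / tail lamps", "operation": "toggle on-off"}}
print(identify_operation("fist_to_open", frozenset({"X3", "Y1", "Z2"}), db))  # match
print(identify_operation("fist_to_open", frozenset({"X1", "Z1"}), db))        # None
```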

The device control unit 146 outputs a control signal that causes the operation target device 30 to perform an operation corresponding to the operation input.
For example, when the first operation input is performed, the device control unit 146 outputs a control signal for switching on / off the small lamp and tail lamp of the vehicle.

In the above description, acceptance of operation input using gestures is switched by limiting the gesture actions recognized by the action identifying means 140 (the predetermined gesture actions) to the permitted gesture group when the steering angle of the steering wheel 20 is equal to or greater than the predetermined angle; however, the stage of the operation input device 10's processing at which this switching is performed is arbitrary.
For example, the switching may instead be performed in another component, such as the device control means 146, by controlling whether its control signal is output.
In that case, when the steering angle of the steering wheel 20 is less than the predetermined angle, the device control means 146 outputs control signals for all recognized operation inputs, but when the steering angle is equal to or greater than the predetermined angle, it outputs control signals only for operation inputs belonging to the permitted gesture group and does not output control signals for operation inputs belonging to the prohibited gesture group.
In the configuration in which all operation inputs are prohibited when the steering angle of the steering wheel 20 is equal to or greater than the predetermined angle, no control signal is output for any operation input while the steering angle is equal to or greater than the predetermined angle.
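
A brief sketch of this variant, with the gating applied at the output stage instead; the names and the threshold are the same illustrative assumptions as before.

```python
# Variant: the determination is applied inside the device control means 146,
# by suppressing the control signal rather than limiting the recognition step.
def output_control_signal(op_id: str, steering_angle_deg: float,
                          permitted: set, threshold_deg: float = 30.0) -> bool:
    """Emit the signal unless the input is prohibited during large steering."""
    if abs(steering_angle_deg) >= threshold_deg and op_id not in permitted:
        return False                     # prohibited gesture group: suppress
    print(f"control signal -> {op_id}")  # stand-in for driving the device 30
    return True
```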

FIG. 9 is a flowchart illustrating a procedure of processing performed by the operation input device 10.
The operation input device 10 repeats the process shown in FIG. 9 while the vehicle is traveling.
First, the steering angle detection means 11 detects the steering angle of the steering wheel 20 (step S800), and determines whether or not the steering angle is equal to or greater than a predetermined angle (step S801).
If the steering angle is equal to or greater than the predetermined angle (step S801: Yes), the gesture actions of the permitted gesture group are set as the gesture actions (predetermined gesture actions) to be identified by the action identifying means 140 (step S802). If the steering angle is less than the predetermined angle (step S801: No), all gesture actions included in the operation identification database DB are set as the gesture actions (predetermined gesture actions) to be identified by the action identifying means 140 (step S803).
Next, the vehicle interior space around the steering wheel 20 is photographed by the photographing means 13 (step S804).
The motion identification unit 140 analyzes the captured image captured by the capturing unit 13 and determines whether or not the operator has performed the predetermined gesture operation (step S805).
If the operator is not performing a predetermined gesture operation (step S805: No), the process returns to step S800 and the subsequent processing is repeated.
If the operator performs a predetermined gesture operation (step S805: Yes), the region identification unit 142 identifies the region where the gesture operation has been performed (step S806).
Next, the operation content identification unit 144 determines whether or not the combination of the gesture action performed by the operator and the region where the gesture action is performed matches any operation input (step S807).
If it does not match any of the operation inputs (step S807: No), the process returns to step S800 and the subsequent processing is repeated.
If it matches one of the operation inputs (step S807: Yes), the operation target device 30 and the control content of the operation input are specified (step S808), and the device control means 146 outputs a control signal to the operation target device 30 (step S809).
Thereafter, the process returns to step S800, and the subsequent processing is repeated.
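
Composing the earlier sketches, one pass of the FIG. 9 loop might look as follows; here the steering-angle check is folded into the final gating via the record's group flag, one of the arbitrary placements noted above, and all names remain the illustrative assumptions introduced earlier.

```python
# One iteration of the Fig. 9 loop, reusing classify_action, classify_regions,
# HandPosition, identify_operation, and OPERATION_DB from the earlier sketches.
def process_once(steering_angle_deg: float, trajectory, db,
                 threshold_deg: float = 30.0) -> None:
    action = classify_action(trajectory)               # S804-S805
    if action is None:
        return                                         # no predetermined gesture
    observed = frozenset(r for s in trajectory         # S806: regions the hand
                         for r in classify_regions(HandPosition(*s[:3])))
    record = identify_operation(action, observed, db)  # S807
    if record is None:
        return                                         # matches no operation input
    if (abs(steering_angle_deg) >= threshold_deg
            and not record["allowed_during_large_steering"]):
        return                                         # S801-S802: prohibited group
    # S808-S809: specify the control content and output the control signal
    print(f"control signal -> {record['device']}: {record['operation']}")
```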

  In FIG. 9 and the above description, the region is identified by the region identifying means 142 after the action identifying means 140 has identified that a predetermined gesture action was performed; however, the present invention is not limited to this, and whether a predetermined gesture action has been performed may instead be identified after the region is identified. For example, it may be determined whether a predetermined gesture action has been performed while the position of the operator's hand is constantly tracked.

As described above, the operation input device 10 according to the embodiment determines whether to accept an operation input using a gesture based on the steering angle of the steering wheel 20 of the vehicle. The operator in the vehicle is in many cases the driver, and the physical and psychological state of the driver varies greatly with the traveling state of the vehicle, for example with the magnitude of the steering angle. By determining whether operation input using a gesture is possible based on the steering angle of the steering wheel 20, an operation input that matches the physical and psychological state of the operator can be realized, and operation input using gestures can function more effectively.
In addition, in-vehicle devices can be operated without using an input unit or the like while driving a vehicle that requires forward gaze, improving safety during driving.
When the steering angle is equal to or greater than the predetermined angle, that is, when the driving operation needs to be performed more carefully and the operator's (driver's) body is largely occupied by steering, the operation input device 10 prohibits some gestures as the prohibited gesture group while still accepting the remaining gestures as the permitted gesture group.
Therefore, by prohibiting operation inputs that are less necessary during steering, the number of operation inputs during steering can be reduced, making it easier for the operator (driver) to direct attention to the driving operation.
In addition, if the operation input device 10 prohibits all operation input using gestures when the steering angle is equal to or greater than the predetermined angle, it becomes even easier for the operator (driver) to direct attention to the driving operation during steering.
Further, when the operation input device 10 recognizes the operator's gesture in the three-dimensional space and accepts an operation input, a predetermined gesture is identified based on both whether a predetermined gesture action was performed and the region in which it was performed.
Therefore, compared with the case where gestures are performed in a single undivided space, the possibility of erroneously recognizing an ordinary movement with no intention of operation input as an operation-input gesture can be reduced.
In addition, since the gesture regions divide the three-dimensional space into a plurality of regions along at least two coordinate axis directions, the position where a gesture action is performed can be specified in more detail, improving the recognition rate of operation input.
Further, the number of types of operation input can be increased by combining gesture actions and gesture regions.
Finally, in the operation input device 10, the operator can easily recognize the coordinate axis directions with reference to the steering wheel 20 of the vehicle, so gesture operations can be performed easily and accurately.

  DESCRIPTION OF SYMBOLS 10... Operation input device, 11... Steering angle detection means, 12... Gesture operation propriety determination means, 13... Photographing means, 14... Gesture identification means, 140... Action identification means, 142... Region identification means, 144... Operation content specifying means, 146... Device control means, DB... Operation identification database, 20... Steering wheel, 22... Grip portion, 24... Center portion, 26... Spoke portion, 30... Operation target device (in-vehicle device).

Claims (5)

  1. An operation input device that recognizes an operator's gesture in a three-dimensional space and receives an operation input to an operation target device,
    The operation target device is an in-vehicle device mounted on a vehicle,
    Steering angle detection means for detecting the steering angle of the steering wheel of the vehicle;
    Gesture operation propriety determining means for determining whether or not to accept an operation input using the gesture based on the steering angle;
    Photographing means for continuously photographing the three-dimensional space;
    Gesture identifying means for identifying, when operation input using the gesture is permitted, whether the operator has performed a predetermined gesture based on a photographed image photographed by the photographing means, and for outputting, when the predetermined gesture has been performed, a control signal that causes the operation target device to perform an operation corresponding to the predetermined gesture, wherein
    A plurality of types of the predetermined gestures are set,
    The gesture operation propriety determining means divides the plurality of types of predetermined gestures into a prohibited gesture group, whose acceptance is prohibited when the steering angle is equal to or greater than a predetermined angle, and a permitted gesture group, which is accepted even when the steering angle is equal to or greater than the predetermined angle,
    When the steering angle is equal to or greater than the predetermined angle, the gesture identifying means identifies whether the operator has performed a gesture belonging to the permitted gesture group and, when such a gesture has been performed, outputs a control signal that causes the operation target device to perform the operation corresponding to the gesture, and
    The prohibited gesture group includes at least an operation that is likely to be erroneously recognized as the operator's steering of the steering wheel.
    An operation input device characterized by the above.
  2. The operation likely to be erroneously recognized as the operator's steering of the steering wheel is a gesture of reciprocating the operator's hand in the vehicle width direction along the steering wheel,
    The operation input device according to claim 1.
  3. The predetermined gesture is specified by a gesture action and a gesture region indicating a position in the three-dimensional space where the gesture action is performed,
    The gesture region is obtained by dividing the three-dimensional space into a plurality of regions along at least two coordinate axis directions.
    The operation input device according to claim 1.
  4. A steering wheel of a vehicle is arranged in the three-dimensional space,
    The gesture regions are obtained by dividing the three-dimensional space, with the center point of the steering wheel as the origin, along a substantially vertical axis substantially following the vertical direction of the vehicle and a substantially longitudinal axis substantially following the longitudinal direction of the vehicle,
    The space along the substantially vertical axis is divided into an inner peripheral region of the steering wheel, an outer edge region of the steering wheel, and an upper region above the outer edge region,
    The space along the substantially longitudinal axis is divided into a front region, which is a surface where the steering wheel faces the driver, and a rear region, which is a mounting surface of the steering wheel to the vehicle body.
    The operation input device according to claim 3.
  5. A steering wheel of a vehicle is arranged in the three-dimensional space,
    The gesture regions are obtained by dividing the three-dimensional space, with the center point of the steering wheel as the origin, along a substantially vertical axis substantially following the vertical direction of the vehicle, a substantially longitudinal axis substantially following the longitudinal direction of the vehicle, and a substantially vehicle-width-direction axis substantially following the vehicle width direction,
    The space along the substantially vertical axis is divided into an inner peripheral region of the steering wheel, an outer edge region of the steering wheel, and an upper region above the outer edge region,
    The space along the substantially longitudinal axis is divided into a front region, which is a surface where the steering wheel faces the driver, and a rear region, which is a mounting surface of the steering wheel to the vehicle body,
    The space along the substantially vehicle width direction axis is divided into a region on one vehicle side from the center point of the steering wheel and a region on the other vehicle side from the center point.
    The operation input device according to claim 3.
JP2014222788A 2014-10-31 2014-10-31 Operation input device Active JP6413647B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2014222788A JP6413647B2 (en) 2014-10-31 2014-10-31 Operation input device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2014222788A JP6413647B2 (en) 2014-10-31 2014-10-31 Operation input device

Publications (2)

Publication Number Publication Date
JP2016091182A JP2016091182A (en) 2016-05-23
JP6413647B2 true JP6413647B2 (en) 2018-10-31

Family

ID=56018631

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2014222788A Active JP6413647B2 (en) 2014-10-31 2014-10-31 Operation input device

Country Status (1)

Country Link
JP (1) JP6413647B2 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106828349B (en) * 2016-12-25 2019-03-05 重庆路格科技有限公司 Onboard navigation system
KR101983892B1 (en) * 2017-03-08 2019-05-29 전자부품연구원 Gesture Recognition Method, Device, and recording medium for Vehicle using Wearable device

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3900122B2 (en) * 2003-07-30 2007-04-04 日産自動車株式会社 Non-contact type information input device
JP4512999B2 (en) * 2004-09-30 2010-07-28 マツダ株式会社 Vehicle information display device
JP5201480B2 (en) * 2009-04-27 2013-06-05 株式会社デンソー Vehicle control device
JP5232889B2 (en) * 2011-03-30 2013-07-10 本田技研工業株式会社 Vehicle control device
JP5958876B2 (en) * 2011-10-21 2016-08-02 スズキ株式会社 Vehicle input device
WO2013074919A2 (en) * 2011-11-16 2013-05-23 Flextronics Ap , Llc Universal bus in the car
JP6202810B2 (en) * 2012-12-04 2017-09-27 アルパイン株式会社 Gesture recognition apparatus and method, and program

Also Published As

Publication number Publication date
JP2016091182A (en) 2016-05-23

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20171006

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20180427

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20180508

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20180709

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20180904

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20180917

R151 Written notification of patent or utility model registration

Ref document number: 6413647

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R151