WO2020137592A1 - Gesture detection device and gesture detection method - Google Patents

Gesture detection device and gesture detection method

Info

Publication number
WO2020137592A1
WO2020137592A1 (PCT/JP2019/048757)
Authority
WO
WIPO (PCT)
Prior art keywords
gesture
gaze
control unit
operator
pointing
Prior art date
Application number
PCT/JP2019/048757
Other languages
French (fr)
Japanese (ja)
Inventor
Yoshiyuki Tsuda (佳行 津田)
Original Assignee
DENSO Corporation (株式会社デンソー)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by DENSO Corporation (株式会社デンソー)
Publication of WO2020137592A1 publication Critical patent/WO2020137592A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer

Definitions

  • the present disclosure relates to a gesture detection device that detects a gesture used for an input operation, and a gesture detection method.
  • In the gesture detection device (information input device) of Patent Document 1, the device controller specifies the device to be controlled based on the operator's gesture (hand shape, state, or change in state), determines a control command, and performs input (remote operation) to the controlled device. For example, when the display (TV) screen is pointed at by a pointing gesture with the index finger raised, the device controller recognizes that the controlled device is the TV, and the pointing gesture can move the cursor on the display located on the extension line of the finger.
  • In the gesture detection device (operation input device) of Patent Document 2, a deployment surface is set by the display screen and an extension surface that is the outer area of the display screen. A virtual line connecting the position of the operator's eyes (reference point) and the fingertip (pointer) of the hand held in front of the deployment surface defines the pointed position (target point) on the deployment surface.
  • one of the objects of the present disclosure is to provide a gesture detection device and a gesture detection method that can reduce erroneous operations without causing the operator to feel bothered.
  • A gesture detection device according to the present disclosure performs input on an operation screen remote from the operator based on an instruction gesture performed by the operator, and includes: a detection unit that detects the operator's line-of-sight direction and the instruction gesture; and a control unit that, when the detection data from the detection unit indicates a gaze on the operation screen for a predetermined gaze time or longer and an instruction gesture for a predetermined instruction time or longer, executes a fine adjustment mode in which an operation cursor for operation is displayed with the gaze direction as its initial position on the operation screen and the operation cursor is moved according to the movement of the instruction gesture.
  • A gesture detection method according to the present disclosure performs input on an operation screen remote from the operator based on an instruction gesture performed by the operator, and includes: a detection step of detecting the operator's gaze direction and gesture; and an execution step of, when the detection data from the detection step indicates a gaze on the operation screen for a predetermined gaze time or longer and a gesture for a predetermined instruction time or longer, displaying an operation cursor with the gaze direction as its initial position on the operation screen and executing a fine adjustment mode in which the operation cursor is moved according to the movement of the instruction gesture.
  • With this, the operator can move the operation cursor displayed at the gazed initial position by performing an instruction gesture. There is therefore no need to impose detailed gesture restrictions on the operator, and the operator can move the operation cursor accurately. Overall, erroneous operations can be reduced without making the operator feel bothered.
  • FIG. 1 is a block diagram showing the overall configuration of the gesture detection device according to the first embodiment.
  • FIG. 2 is a flowchart showing the control contents performed by the control unit.
  • FIG. 3 is an explanatory diagram showing the procedure for determining that there is a gaze and a pointing gesture when the gaze time = instruction time.
  • FIG. 4 is an explanatory diagram showing the procedure for allowing a time difference between the gaze and the pointing gesture when the gaze time ≠ instruction time.
  • FIG. 5 is an explanatory diagram showing a modified example 1 in which the operation cursor is moved according to the position of the finger.
  • FIG. 6 shows a second modification and is an explanatory diagram showing a movable range of the operation cursor.
  • FIG. 7 shows the third modification and is an explanatory diagram showing the procedure when the advance notice is displayed.
  • FIG. 8 is a diagram illustrating a third modification and is an explanatory diagram illustrating a procedure of performing a notice display when there is a time difference between the gaze and the pointing gesture.
  • FIG. 9 is a diagram illustrating a fourth modification and is an explanatory diagram illustrating a procedure for canceling an operation.
  • the gesture detection device 100 according to the first embodiment will be described with reference to FIGS. 1 to 4.
  • The gesture detection device 100 according to the present embodiment is mounted in, for example, a vehicle; it detects the line-of-sight direction of the operator's (driver's) gaze and the pointing direction and position of a pointing gesture, and issues input instructions to, for example, various vehicle devices (operation screens).
  • The gaze direction of the operator is mainly understood as the direction of the line of sight toward the vehicle device (operation screen) that the operator wants to operate.
  • The pointing gesture indicates a direction or position desired by the operator using, for example, a finger, hand, arm, eyes (line of sight), or face.
  • Here, a finger pointing gesture is mainly used in the description.
  • The finger pointing gesture indicates a desired direction by the extending direction of a finger; for example, the index finger is used.
  • Examples of the various vehicle devices include an air conditioner that air-conditions the vehicle interior, a car navigation device that displays the current position of the vehicle and guidance to a destination, and an audio device that reproduces TV broadcasts, radio broadcasts, CDs/DVDs, and the like.
  • an air conditioner will be described as a typical example of the vehicle equipment.
  • The air conditioner has, for example, a display unit 11 using a liquid crystal display, an organic EL display, or the like; the display unit 11 displays, for example, an operation cursor 11a for input operation, icons 11b, and the operating state.
  • The display unit 11 corresponds to the operation screen of the present disclosure that is separated from the operator.
  • the gesture detection device 100 includes a detection unit 110, a control unit 120, and the like.
  • The control unit 120 issues input instructions (instructions for input operation) to the display unit 11 (air conditioner), which is remote from the operator, based on the operator's line-of-sight direction and pointing gesture detected by the detection unit 110.
  • the set temperature and the amount of conditioned air are changed by pointing gestures.
  • For the car navigation device, the current position is displayed, the destination is set, the map is enlarged or reduced, and so on; for the audio device, the TV station or radio station is changed, songs are selected, and the volume is changed.
  • The detection unit 110 continuously detects the operator's gaze state (gaze direction) with respect to the display unit 11 and the finger pointing gesture (direction of the index finger), and outputs the gaze-direction data and the finger pointing gesture data to the control unit 120.
  • the detection unit 110 is configured to acquire the position of the operator's eyes (the direction of the line of sight) and the posture of the finger.
  • As the detection unit 110, a camera that forms a luminance image of an object, a range image sensor that forms a range image, or a combination thereof can be used.
  • Examples of the camera include a near-infrared camera that captures near-infrared rays and a visible-light camera that captures visible light.
  • Examples of the range image sensor include a stereo camera that captures images simultaneously with a plurality of cameras and measures depth information from parallax, and a time-of-flight sensor that determines depth from the time until light from a light source is reflected by the object and returns.
  • In the present embodiment, the camera that forms a luminance image of the object as described above is used as the detection unit 110.
  • the control unit 120 is, for example, a computer including a CPU, ROM, RAM and the like.
  • The control unit 120 displays the operation cursor 11a on the display unit 11 based on the data detected by the detection unit 110 (gaze-direction data, finger pointing gesture data) and preset positional-relationship data between the display unit 11 and the detection unit 110, and executes the fine adjustment mode in which the operation cursor 11a is moved according to the movement of the pointing gesture.
  • the operation cursor 11a has, for example, a ring-shaped design.
  • the design of the operation cursor 11a may be an arrow, a triangle, a cross, or the like, instead of the ring-shaped one.
  • the gesture detection device 100 of the present embodiment is configured as described above, and the operation and effect will be described below with reference to the flowchart of FIG. 2 and FIGS. 3 and 4.
  • In step S100 of FIG. 2, the control unit 120 acquires the operator's line-of-sight direction data and gesture data from the detection unit 110.
  • Step S100 corresponds to the detection step of the present disclosure.
  • In step S110, the control unit 120 determines from the line-of-sight direction data whether the operator is gazing within the area of the display unit 11 and whether there is a pointing gesture.
  • The control unit 120 determines that there is a gaze when the line of sight has been focused on an arbitrary position in the display unit 11 for a predetermined gaze time or longer. Further, when the operator's finger points toward the display unit 11 side for a predetermined instruction time or longer, the control unit 120 determines that there is a pointing gesture.
  • When both conditions are satisfied, the control unit 120 makes an affirmative determination in step S110. That is, even if the gaze and the pointing are not performed at the same time, if the time difference between them is within a certain time, the control unit 120 proceeds to steps S120 and S130 below, allowing the operation cursor 11a to be displayed and the fine adjustment mode to be executed.
  • If a negative determination is made in step S110, the control unit 120 repeats steps S100 and S110. If an affirmative determination is made, the control unit 120 proceeds to step S120.
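The determination in steps S100 to S110 can be sketched as follows. This is an illustrative sketch only, not the patented implementation: the function name, the timestamp bookkeeping, and the threshold names and values (GAZE_TIME, INSTRUCTION_TIME, MAX_TIME_DIFF) are assumptions.

```python
# Hypothetical sketch of the step S110 determination: the gaze and the
# pointing gesture each need a minimum dwell time, and their onsets may
# differ by up to a tolerated time difference (values are placeholders).
GAZE_TIME = 1.0         # predetermined gaze time (s)
INSTRUCTION_TIME = 1.0  # predetermined instruction time (s)
MAX_TIME_DIFF = 2.0     # tolerated gap between gaze and pointing onsets (s)

def gaze_and_pointing_detected(gaze_start, gaze_end, point_start, point_end):
    """Return True when both dwell conditions hold and the onsets are
    close enough in time (they need not be simultaneous)."""
    gaze_ok = (gaze_end - gaze_start) >= GAZE_TIME
    point_ok = (point_end - point_start) >= INSTRUCTION_TIME
    time_diff_ok = abs(gaze_start - point_start) <= MAX_TIME_DIFF
    return gaze_ok and point_ok and time_diff_ok

# Gaze begins at t=0.0 and pointing at t=1.5: still accepted.
print(gaze_and_pointing_detected(0.0, 1.2, 1.5, 2.8))  # True
print(gaze_and_pointing_detected(0.0, 0.5, 1.5, 2.8))  # False (gaze too short)
```

The tolerance term corresponds to the behavior of FIG. 4, where the gaze and the pointing gesture may start at different times.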
  • In step S120, the control unit 120 sets the gazed position as the initial position on the display unit 11 and displays the operation cursor 11a at that initial position.
  • In step S130, the control unit 120 executes the fine adjustment mode for moving the operation cursor 11a. The control unit 120 first performs an offset (correction) so that the direction initially pointed by the finger pointing gesture (the pointing direction at initialization) matches the initial position. Then, the control unit 120 moves the operation cursor 11a according to the movement of the operator's finger in the pointing gesture; that is, the operation cursor 11a moves in the direction indicated by the pointing gesture (execution of the fine adjustment mode).
  • The control unit 120 may incorporate attenuation of the movement amount so that the movement amount of the operation cursor 11a becomes smaller than the movement amount of the finger.
  • Let the pointing direction in the yaw direction (left-right) after the offset be θn, and the pointing direction in the pitch direction (up-down) after the offset be φn.
  • Let the pointing direction in the yaw direction before the offset be θo, and the pointing direction in the pitch direction before the offset be φo.
  • Let the physical pointing direction at initialization be θp (yaw) and φp (pitch).
  • Let the direction pointing to the initial position of the operation cursor 11a be θg (yaw) and φg (pitch).
  • The control unit 120 may set the magnifications (movement magnifications) α and β of the movement of the operation cursor 11a during execution of the fine adjustment mode higher as the detection level of the movement by the gaze and the instruction gesture is higher.
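The offset and the magnified cursor movement can be sketched as follows. Note this is a hedged reconstruction: the publication names the quantities θn, φn, θo, φo, θp, φp, θg, φg and the magnifications α, β, but this excerpt does not spell out the formula. A natural reading is that the offset maps the physical pointing direction at initialization (θp, φp) onto the direction of the cursor's initial position (θg, φg), and later samples move the cursor by the magnified deviation from that starting direction.

```python
# Hedged sketch of the offset (correction) at initialization and the
# magnified movement in the fine adjustment mode. The formula
# theta_n = theta_g + alpha * (theta_o - theta_p) is an assumption.
def corrected_direction(theta_o, phi_o, theta_p, phi_p,
                        theta_g, phi_g, alpha=1.0, beta=1.0):
    """Yaw/pitch pointing direction after the offset (theta_n, phi_n)."""
    theta_n = theta_g + alpha * (theta_o - theta_p)  # yaw (left-right)
    phi_n = phi_g + beta * (phi_o - phi_p)           # pitch (up-down)
    return theta_n, phi_n

# At initialization the raw direction equals (theta_p, phi_p), so the
# corrected direction coincides with the cursor's initial position:
print(corrected_direction(10.0, 5.0, 10.0, 5.0, 30.0, 20.0))  # (30.0, 20.0)
# With attenuation (alpha, beta < 1) the cursor moves less than the finger:
print(corrected_direction(14.0, 7.0, 10.0, 5.0, 30.0, 20.0, 0.5, 0.5))  # (32.0, 21.0)
```

Raising α and β when the detection level is high, and lowering them when noise is expected, reproduces the trade-off described in the text.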
  • In step S130, when a desired icon 11b is selected with the operation cursor 11a and a predetermined determination gesture (for example, a tap gesture) is performed, the control unit 120 outputs an input instruction for that icon 11b to the air conditioner; that is, an input operation is performed on the air conditioner.
  • In step S140, the control unit 120 determines whether there is movement in the operator's line-of-sight direction; if there is, it determines in step S150 whether the operator has made a predetermined gesture determined in advance.
  • When a negative determination is made in step S140, or when a positive determination is made in step S140 but a negative determination is made in step S150, the control unit 120 continues the fine adjustment mode (movement of the operation cursor 11a based on the pointing gesture) in step S160.
  • The predetermined gesture determined in advance is a gesture meaning that the operator intentionally stops the pointing gesture, and can be, for example, a gesture of clenching the hand, a gesture of breaking the hand shape (the pointing shape), the predetermined determination gesture for input, a gesture of shaking the head, a gesture of redirecting the finger toward the moved line-of-sight direction, a gesture of tapping another finger with the thumb, or a gesture of pointing again with a finger other than the index finger.
  • If a positive determination is made in both steps S140 and S150, the control unit 120 ends execution of the fine adjustment mode in step S170.
  • As described above, the control unit 120 sets the gaze direction as the initial position on the display unit 11, displays the operation cursor 11a there, and moves the operation cursor 11a according to the movement of the pointing gesture (execution of the fine adjustment mode).
  • Thus, the operator can move the operation cursor 11a displayed at the gazed initial position by performing a pointing gesture. The operator therefore does not need to control the gesture minutely, and can move the operation cursor 11a accurately. Overall, erroneous operations can be reduced without making the operator feel bothered.
  • The control unit 120 continues executing the fine adjustment mode according to the movement of the instruction gesture even if there is movement in the line-of-sight direction, until the predetermined gesture is made. This prevents the fine adjustment mode from being interrupted by re-initialization even if the operator makes an unintended movement in the line-of-sight direction.
  • In step S130, the control unit 120 may incorporate attenuation of the movement amount (reducing the magnifications α and β) so that the movement amount of the operation cursor 11a becomes smaller than the movement amount of the finger. Keeping the magnifications α and β relatively low suppresses the influence of detection noise; when detection conditions are good, the magnifications α and β can be made relatively high to move the operation cursor 11a quickly.
  • The control unit 120 allows the fine adjustment mode to be executed when a pointing gesture is performed within a certain time after the gaze. This eases the operating conditions and makes the device easier for the operator to use.
  • Modification 1 is shown in FIG.
  • In the first embodiment, the control unit 120 executes the fine adjustment mode based on a change in the direction of the finger pointing gesture; however, it is not limited to this, and the fine adjustment mode may be based on a change in the position of the pointing finger.
  • To convert the movement amount (mm) of the finger from the physical coordinates at initialization into the movement amount (pix, pixels) of the operation cursor 11a, a constant magnification may be used, or, as in the first embodiment, a polynomial, a sigmoid curve, or the like may be used.
  • FIG. 5 shows how the operation cursor 11a moves when the position of the finger (fingertip) is moved during execution of the fine adjustment mode.
  • The plane in which the finger moves need not be parallel to the display unit 11.
  • For example, the finger may be tilted 90 degrees so that the direction away from the body corresponds to the upward direction on the display unit 11.
  • Modification 2 is shown in FIG. Modification 2 is different from the first embodiment in that the control unit 120 displays the movable range of the operation cursor 11a.
  • the movable range is displayed, for example, as a circle including the operation cursor 11a with the position of the operation cursor 11a as the center.
  • the size of the movable range may be increased or decreased according to the physique of the operator.
  • the movable range may be displayed under a specific condition, for example, when the operation cursor 11a is near the boundary of the movable range for a certain time after initialization.
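Modification 2's movable range can be sketched as a circle centered on the cursor's initial position, with movement beyond the radius clamped to the boundary. The clamping behavior and the radius value are assumptions for illustration; the publication only says the range is displayed as a circle and may be scaled with the operator's physique.

```python
import math

# Sketch for Modification 2: the operation cursor's movable range is a
# circle centered at (cx, cy); a requested position outside the radius
# is clamped to the circle boundary. Radius is a placeholder value.
def clamp_to_range(x, y, cx, cy, radius):
    """Clamp cursor position (x, y) into the circle of given radius
    centered at (cx, cy)."""
    dx, dy = x - cx, y - cy
    dist = math.hypot(dx, dy)
    if dist <= radius:
        return x, y                          # already inside the range
    scale = radius / dist
    return cx + dx * scale, cy + dy * scale  # project onto the boundary

print(clamp_to_range(103.0, 104.0, 100.0, 100.0, 10.0))  # inside: unchanged
print(clamp_to_range(130.0, 140.0, 100.0, 100.0, 10.0))  # clamped to boundary
```

Detecting when the clamped position sits on the boundary would also give the "cursor near the boundary" condition under which the range display could be triggered.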
  • Modification 3 is shown in FIGS. 7 and 8. Modification 3 differs from the first embodiment in that the control unit 120 performs a notice display indicating the initial position while the operator is gazing and the predetermined instruction time has not yet elapsed.
  • The control unit 120 displays the operation cursor 11a in advance at the initial position after time t1 ≥ threshold T1 is satisfied and before time t2 ≥ threshold T2 is satisfied, and displays the original operation cursor 11a after t2 ≥ threshold T2 is satisfied.
  • the design of the operation cursor 11a for notice display and the original design (look) of the operation cursor 11a may be different.
  • To differentiate the design of the operation cursor 11a, the color, brightness, size, or transmittance may be changed, a specific figure or an outer frame may be displayed in an overlapping manner, or the movable range described in Modification 2 may be shown as the notice display.
  • Modification 4 is shown in FIG.
  • Modification 4 differs from the first embodiment in that the control unit 120 cancels execution of the fine adjustment mode when a stationary state of the instruction gesture is not recognized after the gaze.
  • The control unit 120 performs the notice display after time t1 ≥ threshold T1 is satisfied; however, when t3 ≥ threshold T3 is satisfied without t2 ≥ threshold T2 having been satisfied, it cancels the series of operations and erases the operation cursor 11a. That is, unless the pointing gesture becomes stationary after the gaze, the control unit 120 cancels execution of the fine adjustment mode when time t3 exceeds threshold T3.
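The notice/cancel timing of Modifications 3 and 4 can be sketched as a small state function. The state names, the function signature, and the threshold values are assumptions; only the t1/T1, t2/T2, t3/T3 comparisons come from the text.

```python
# Hedged sketch of the notice/cancel timing in Modifications 3 and 4.
# t1: elapsed gaze time, t2: elapsed still-pointing time, t3: elapsed
# time since the notice appeared; T1, T2, T3 are the thresholds named
# in the text (values here are placeholders).
T1, T2, T3 = 1.0, 1.0, 3.0

def cursor_state(t1, t2, t3):
    """Return 'hidden', 'notice', 'cursor', or 'cancelled'."""
    if t1 < T1:
        return "hidden"     # not yet gazed long enough
    if t2 >= T2:
        return "cursor"     # pointing held still: fine adjustment mode starts
    if t3 >= T3:
        return "cancelled"  # no still pointing in time: cursor erased
    return "notice"         # advance-notice display at the initial position

print(cursor_state(0.5, 0.0, 0.0))  # hidden
print(cursor_state(1.5, 0.2, 1.0))  # notice
print(cursor_state(1.5, 1.2, 2.0))  # cursor
print(cursor_state(1.5, 0.2, 3.5))  # cancelled
```

The ordering of the checks encodes the rule that a completed still-pointing (t2 ≥ T2) takes precedence, while T3 acts as a timeout on the notice state.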
  • In the embodiments above, the operation target is various vehicle devices; however, the invention is not limited to this and may be applied to devices provided in homes and facilities.
  • Examples of household devices include televisions and audio equipment.
  • Examples of devices in facilities include ATMs at banks and automatic ticket vending machines at stations.
  • When installed in a vehicle, the target operator is not limited to the driver and may be an occupant in the passenger seat.
  • The passenger-seat occupant can also operate various vehicle devices through the gesture detection device 100 by performing the instruction gesture.
  • the disclosure in this specification and drawings is not limited to the illustrated embodiment.
  • the disclosure encompasses the illustrated embodiments and variations based on them.
  • the disclosure is not limited to the combination of parts and/or elements shown in the embodiments.
  • the disclosure can be implemented in various combinations.
  • the disclosure may have additional parts that may be added to the embodiments.
  • the disclosure includes omissions of parts and/or elements of the embodiments.
  • the disclosure includes replacements or combinations of parts and/or elements between one embodiment and another.
  • The disclosed technical scope is not limited to the description of the embodiments. It is indicated by the description of the claims and should be understood to include all modifications within the meaning and scope equivalent to the description of the claims.
  • The control unit 120 and the method thereof described in the present disclosure may be realized by a dedicated computer constituting a processor programmed to execute one or more functions embodied by computer programs.
  • the apparatus and method described in the present disclosure may be realized by a dedicated hardware logic circuit.
  • the device and the method described in the present disclosure may be realized by one or more dedicated computers configured by a combination of a processor that executes a computer program and one or more hardware logic circuits.
  • The computer program may be stored in a computer-readable non-transitory tangible recording medium as instructions executed by the computer.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

This gesture detection device, which performs input on a manipulation screen (11) remote from an operator based on an instruction gesture performed by the operator, comprises: a detection unit (110) that detects the visual-line direction and the instruction gesture of the operator; and a control unit (120) that, when the detection data detected by the detection unit indicates a gaze on the manipulation screen for a predetermined gazing time or longer and an instruction gesture for a predetermined instruction time or longer, executes a fine adjustment mode in which a manipulation cursor (11a) for manipulation is displayed with the gaze direction as an initial position on the manipulation screen and the manipulation cursor moves according to motion of the instruction gesture.

Description

Gesture detection device and gesture detection method

Cross-reference of related applications

 This application is based on Japanese Patent Application No. 2018-244368 filed on December 27, 2018, the disclosure of which is incorporated herein by reference.

 The present disclosure relates to a gesture detection device that detects a gesture used for an input operation, and a gesture detection method.

 As a conventional gesture detection device, for example, the one described in Patent Document 1 is known. In the gesture detection device (information input device) of Patent Document 1, the device controller specifies the device to be controlled based on the operator's gesture (hand shape, state, or change in state), determines a control command, and performs input (remote operation) to the controlled device. For example, when the display (TV) screen is pointed at by a pointing gesture with the index finger raised, the device controller recognizes that the controlled device is the TV, and the pointing gesture can move the cursor on the display located on the extension line of the finger.

 In the gesture detection device (operation input device) of Patent Document 2, a deployment surface is set by the display screen and an extension surface that is the outer area of the display screen. A virtual line connecting the position of the operator's eyes (reference point) and the fingertip (pointer) of the hand held in front of the deployment surface defines the pointed position (target point) on the deployment surface.

JP 2013-205983 A; JP 2005-321870 A

 However, in Patent Documents 1 and 2, a predetermined pointing method is determined in advance, and the pointing direction is specified on the premise of that method. Therefore, if the operator makes a pointing gesture different from the predetermined pointing method, an erroneous operation occurs. On the other hand, if the operator is restricted as to which body part to point with and how, in order to avoid erroneous operation, the operator feels bothered.

 In view of the above problems, one of the objects of the present disclosure is to provide a gesture detection device and a gesture detection method that can reduce erroneous operations without making the operator feel bothered.
 A gesture detection device according to an example of the present disclosure is a gesture detection device that performs input on an operation screen remote from the operator based on an instruction gesture performed by the operator, and includes:
 a detection unit that detects the operator's line-of-sight direction and the instruction gesture; and
 a control unit that, when the detection data detected by the detection unit indicates a gaze on the operation screen for a predetermined gaze time or longer and an instruction gesture for a predetermined instruction time or longer, executes a fine adjustment mode in which an operation cursor for operation is displayed with the gaze direction as its initial position on the operation screen and the operation cursor is moved according to the movement of the instruction gesture.
 A gesture detection method according to an example of the present disclosure is a gesture detection method of performing input on an operation screen remote from the operator based on an instruction gesture performed by the operator, and includes:
 a detection step of detecting the operator's gaze direction and gesture; and
 an execution step of, when the detection data detected in the detection step indicates a gaze on the operation screen for a predetermined gaze time or longer and a gesture for a predetermined instruction time or longer, displaying an operation cursor for operation with the gaze direction as its initial position on the operation screen and executing a fine adjustment mode in which the operation cursor is moved according to the movement of the instruction gesture.
 With this, the operator can move the operation cursor displayed at the gazed initial position by performing an instruction gesture. There is therefore no need to impose detailed gesture restrictions on the operator, and the operator can move the operation cursor accurately. Overall, erroneous operations can be reduced without making the operator feel bothered.
 The above and other objects, features and advantages of the present disclosure will become more apparent from the following detailed description with reference to the accompanying drawings. In the drawings,
 FIG. 1 is a block diagram showing the overall configuration of the gesture detection device according to the first embodiment. FIG. 2 is a flowchart showing the control contents performed by the control unit. FIG. 3 is an explanatory diagram showing the procedure for determining that there is a gaze and an instruction gesture when the gaze time = instruction time. FIG. 4 is an explanatory diagram showing the procedure for allowing a time difference between the gaze and the pointing gesture when the gaze time ≠ instruction time. FIG. 5 is an explanatory diagram showing Modification 1, in which the operation cursor is moved according to the position of the finger. FIG. 6 shows Modification 2 and is an explanatory diagram showing the movable range of the operation cursor. FIG. 7 shows Modification 3 and is an explanatory diagram showing the procedure for performing the notice display. FIG. 8 shows Modification 3 and is an explanatory diagram showing the procedure for performing the notice display when there is a time difference between the gaze and the pointing gesture. FIG. 9 shows Modification 4 and is an explanatory diagram showing the procedure for canceling an operation.
 A plurality of embodiments will be described below with reference to the drawings. In each embodiment, portions corresponding to matters described in a preceding embodiment may be given the same reference numerals, and redundant description may be omitted. When only part of a configuration is described in an embodiment, the other embodiments described earlier can be applied to the remaining parts of the configuration. Embodiments may be combined not only where a combination is explicitly stated to be possible, but also partially even where not explicitly stated, provided the combination causes no problem.
 (First embodiment)
 The gesture detection device 100 according to the first embodiment will be described with reference to FIGS. 1 to 4. The gesture detection device 100 of the present embodiment is mounted in, for example, a vehicle; it detects the gaze direction of an operator (driver) and the pointing direction and pointing position indicated by a pointing gesture, and issues input instructions to, for example, various vehicle devices (operation screens).
 The gaze direction of the operator is understood mainly as the direction of the line of sight toward the vehicle device (operation screen) that the operator wants to operate.
 The pointing gesture is a gesture that indicates a direction or position desired by the operator using, for example, a finger, hand, arm, eyes (line of sight), or face. Here, the description mainly assumes a finger-pointing gesture, which indicates the desired direction by the direction in which a finger, for example the index finger, extends.
 Examples of vehicle devices include an air conditioner that conditions the vehicle interior, a car navigation device that displays the vehicle's current position or guidance to a destination, and an audio device that provides television broadcasts, radio broadcasts, CD/DVD playback, and the like. Here, the air conditioner is described as a representative example.
 The air conditioner has a display unit 11 using, for example, a liquid crystal display or an organic EL display. The display unit 11 displays, for example, an operation cursor 11a for input operations, icons 11b, and the operating state. The display unit 11 corresponds to the operation screen separated from the operator in the present disclosure.
 As shown in FIG. 1, the gesture detection device 100 includes a detection unit 110, a control unit 120, and the like. The control unit 120 issues input instructions (instructions for input operations) to the display unit 11 (air conditioner), which is separated from the operator, based on the operator's gaze direction and finger-pointing gesture detected by the detection unit 110.
 In the air conditioner, finger-pointing gestures are used to change the set temperature, the volume of conditioned air, and so on. In the car navigation device, they are used for current position display, destination setting, map scaling, and the like; in the audio device, for changing the television or radio station, selecting music, changing the volume, and the like.
 The detection unit 110 continuously (over time) detects the operator's gaze state (gaze direction) with respect to the display unit 11 and the finger-pointing gesture (direction of the index finger), and outputs the gaze-direction data and the finger-pointing-gesture data to the control unit 120. The detection unit 110 acquires the position of the operator's eyes (gaze direction) and the posture of the finger.
 As the detection unit 110, a camera that forms a luminance image of the object, a range image sensor that forms a range image, or a combination thereof can be used. Cameras include near-infrared cameras that capture near-infrared light and visible-light cameras that capture visible light. Range image sensors include, for example, a stereo camera that captures images with a plurality of cameras simultaneously and measures depth information from parallax, and a ToF (Time of Flight) camera that measures depth from the time taken for light from a light source to be reflected by the object and return. In the present embodiment, a camera that forms a luminance image of the object, as described above, is used as the detection unit 110.
 The control unit 120 is, for example, a computer including a CPU, ROM, RAM, and the like. Based on the data detected by the detection unit 110 (gaze-direction data and finger-pointing-gesture data) and preset positional-relationship data between the display unit 11 and the detection unit 110, the control unit 120 displays the operation cursor 11a on the display unit 11 and executes a fine adjustment mode in which the operation cursor 11a is moved according to the movement of the finger-pointing gesture. The operation cursor 11a has, for example, a ring-shaped design; an arrow, triangle, cross, or the like may be used instead.
 The gesture detection device 100 of the present embodiment is configured as described above. Its operation and effects will be described below with reference to the flowchart of FIG. 2 together with FIGS. 3 and 4.
 First, in step S100 of FIG. 2, the control unit 120 acquires the operator's gaze-direction data and gesture data from the detection unit 110. Step S100 corresponds to the detection step of the present disclosure.
 Next, in step S110, the control unit 120 determines from the gaze-direction data whether the operator is gazing within the area of the display unit 11 and whether there is a finger-pointing gesture. The control unit 120 determines that a gaze is present when the line of sight has remained on an arbitrary position within the display unit 11 for at least a predetermined gaze time, and that a finger-pointing gesture is present when the operator's finger has pointed toward the display unit 11 for at least a predetermined pointing time. The gaze time and the pointing time are, for example, about 200 ms each, and may be set equal to each other or different.
 Case: gaze time = pointing time
 Let t1 be the time during which both the gaze and the finger pointing (movement allowed) are performed, T1 the threshold for determining that t1 is established, t2 the time during which both the gaze and the finger pointing (held still) are performed, and T2 the threshold for determining that t2 is established.
 As shown in FIG. 3, when t2 ≥ T2 is satisfied after t1 ≥ T1 has been satisfied (that is, when T1 + T2 of gazing and finger pointing has been achieved), the control unit 120 makes an affirmative determination in step S110. Alternatively, as shown in FIG. 4, when the finger-pointing gesture is performed within a certain time after the gaze, the control unit 120 also makes an affirmative determination in step S110.
 Case: gaze time ≠ pointing time
 When the gaze time differs from the pointing time, the gaze and the finger pointing are unlikely to occur simultaneously. As shown in FIG. 4, when the finger-pointing gesture is performed within a certain time after the gaze, the control unit 120 makes an affirmative determination in step S110. That is, even if the gaze and the finger pointing are not performed simultaneously, as long as the time difference is within the certain time, the control unit 120 proceeds to steps S120 and S130 below and permits the display of the operation cursor 11a and the execution of the fine adjustment mode.
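The two-stage timing condition above can be sketched as a small per-frame accumulator. This is a hypothetical illustration: the sampling scheme, class name, and the use of 200 ms for both thresholds are assumptions for the sketch, not specified by the document.

```python
T1 = 0.2  # seconds of simultaneous gaze + pointing (movement allowed)
T2 = 0.2  # seconds of simultaneous gaze + *still* pointing, after T1

class GazePointTimer:
    """Accumulates co-occurrence time of gaze and pointing samples
    and reports when the affirmative condition of step S110 holds."""

    def __init__(self):
        self.t1 = 0.0  # gaze + pointing (moving allowed)
        self.t2 = 0.0  # gaze + pointing held still, counted once t1 >= T1

    def update(self, dt, gazing, pointing, pointing_still):
        # dt: elapsed time since the previous sample, in seconds
        if gazing and pointing:
            self.t1 += dt
            if self.t1 >= T1 and pointing_still:
                self.t2 += dt
        else:
            # condition broken: reset both accumulators
            self.t1 = self.t2 = 0.0
        return self.t1 >= T1 and self.t2 >= T2  # affirmative S110
```

Feeding 50 ms samples, four moving-pointing frames satisfy t1 ≥ T1, and four subsequent still frames then satisfy t2 ≥ T2.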
 If the determination in step S110 is negative, the control unit 120 repeats steps S100 and S110. If it is affirmative, the control unit 120 proceeds to step S120.
 In step S120, the control unit 120 sets the gazed position as the initial position within the display unit 11 and displays the operation cursor 11a at that initial position.
 Next, in step S130, the control unit 120 executes the fine adjustment mode for moving the operation cursor 11a. First, the control unit 120 applies an offset (correction) that aligns the direction first indicated by the finger-pointing gesture (the pointing direction at initialization) with the initial position. The control unit 120 then moves the operation cursor 11a according to the movement of the finger in the pointing gesture; that is, the operation cursor 11a is moved in the direction indicated by the operator's finger-pointing gesture (execution of the fine adjustment mode).
 When moving the operation cursor 11a, the control unit 120 may incorporate attenuation of the movement amount so that the movement of the operation cursor 11a is smaller than the movement of the finger.
 Here, let
 θn be the pointing direction in the yaw direction (left-right) after the offset,
 φn be the pointing direction in the pitch direction (up-down) after the offset,
 θo be the pointing direction in the yaw direction before the offset,
 φo be the pointing direction in the pitch direction before the offset,
 θp be the physical pointing direction (yaw) at initialization,
 φp be the physical pointing direction (pitch) at initialization,
 θg be the direction (yaw) pointing to the initial position of the operation cursor 11a,
 φg be the direction (pitch) pointing to the initial position of the operation cursor 11a,
 α be the magnification for the yaw angle (greater than 0 and at most 1), and
 β be the magnification for the pitch angle (greater than 0 and at most 1). Then
 θn = α(θo − θp) + θg
 φn = β(φo − φp) + φg
 Thus, θn and φn can be expressed by linear correction equations, and the magnifications α and β may each be set to a value of 1 or less (incorporating attenuation of the movement amount). A polynomial, sigmoid curve, or the like may be used instead of the linear correction equations.
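The linear correction equations above translate directly into code. The following sketch is a plain transcription of those equations; the damping values α = β = 0.5 are arbitrary illustrations within the stated range (greater than 0 and at most 1).

```python
def corrected_direction(theta_o, phi_o, theta_p, phi_p, theta_g, phi_g,
                        alpha=0.5, beta=0.5):
    """Map the raw pointing direction (theta_o, phi_o) to the corrected
    one, anchored so that the pointing direction at initialization
    (theta_p, phi_p) lands on the cursor's initial position
    (theta_g, phi_g). alpha/beta damp the yaw/pitch movement."""
    theta_n = alpha * (theta_o - theta_p) + theta_g  # yaw
    phi_n = beta * (phi_o - phi_p) + phi_g           # pitch
    return theta_n, phi_n
```

At initialization (θo = θp, φo = φp) the corrected direction coincides with the cursor's initial position, as the offset requires.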
 Conversely, the control unit 120 may set the magnifications (movement magnifications) α and β for moving the operation cursor 11a in the fine adjustment mode higher as the detection level of the gaze and the movement of the pointing gesture becomes higher.
 Returning to step S130, when the desired icon 11b is selected with the operation cursor 11a and a predetermined decision gesture (for example, a tap gesture) is performed, the control unit 120 outputs an input instruction related to the icon 11b to the air conditioner. That is, an input operation is performed on the air conditioner. Steps S110 to S130 correspond to the execution step of the present disclosure.
 The fine adjustment mode is continued or terminated as follows. In step S140, the control unit 120 determines whether the operator's gaze direction has moved, and if it has, determines in step S150 whether the operator has made a predetermined gesture.
 If the determination in step S140 is negative, or if the determination in step S140 is affirmative but that in step S150 is negative, the control unit 120 continues the fine adjustment mode (movement of the operation cursor 11a based on the finger-pointing gesture) in step S160. That is, because the operator's gaze direction moves rapidly, an unexpected movement could otherwise trigger re-initialization; however, even if the gaze direction moves after the operation cursor 11a is displayed, the fine adjustment mode continues until the predetermined gesture is made. Step S160 is executed repeatedly according to the determinations in steps S140 and S150.
 The predetermined gesture is a gesture meaning that the operator intentionally stops the pointing gesture. Examples include lowering the hand, breaking the hand shape (the pointing shape), the predetermined decision gesture for input described above, shaking the head, redirecting the finger toward the moved gaze direction, tapping another finger with the thumb, and re-gripping the fingers other than the index finger.
 On the other hand, if both steps S140 and S150 are affirmative, the control unit 120 ends the fine adjustment mode in step S170.
 As described above, in the present embodiment, when the operator gazes at the display unit 11 and makes a finger-pointing gesture, the control unit 120 displays the operation cursor 11a using the gaze direction as the initial position on the display unit 11, and moves the operation cursor 11a according to the movement of the finger-pointing gesture (execution of the fine adjustment mode).
 With this arrangement, the operator can move the operation cursor 11a, displayed at the gazed initial position, by performing a finger-pointing gesture. The operator therefore does not need to follow detailed gesture restrictions and can move the operation cursor 11a accurately. Overall, erroneous operations can be reduced without making the operator feel bothered.
 In addition, after displaying the operation cursor 11a, the control unit 120 continues the fine adjustment mode according to the movement of the pointing gesture, even if the gaze direction moves, until the predetermined gesture is made. This prevents the fine adjustment mode from being interrupted by re-initialization even if the operator's gaze moves unexpectedly.
 Furthermore, in step S130, the control unit 120 incorporates attenuation of the movement amount (by reducing the magnifications α and β) so that the movement of the operation cursor 11a is smaller than the movement of the finger; conversely, the higher the detection level of the gaze and the finger-pointing movement, the higher the magnifications α and β for moving the operation cursor 11a in the fine adjustment mode. This improves the operability of the operation cursor 11a.
 Specifically, when the conditions for gaze and finger-pointing detection are poor (strong ambient light, low confidence in the detection processing, etc.), the magnifications α and β are made relatively low to suppress the influence of detection noise. When the conditions are good, the magnifications α and β can be made relatively high so that the operation cursor 11a can be moved quickly.
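One way to realize this condition-dependent magnification is a simple clamp-and-interpolate mapping from a detection-confidence score to the movement gain. The score range, bounds, and linear interpolation below are illustrative assumptions; the document only states that the magnification should rise with the detection level.

```python
def movement_gain(confidence, lo=0.3, hi=1.0):
    """Map a detection-confidence score in [0, 1] to a movement
    magnification (alpha or beta). Low confidence (noisy tracking)
    gives a small gain that suppresses detection noise; high
    confidence gives a gain near 1 so the cursor moves quickly."""
    c = min(max(confidence, 0.0), 1.0)  # clamp out-of-range scores
    return lo + (hi - lo) * c
```

The same mapping can be applied independently to the yaw gain α and the pitch gain β.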
 Furthermore, the control unit 120 permits execution of the fine adjustment mode when the finger-pointing gesture is performed within a certain time after the gaze. This relaxes the operating conditions and makes the device easier for the operator to use.
 (Modification 1)
 Modification 1 is shown in FIG. 5. In the first embodiment, the control unit 120 executes the fine adjustment mode based on a change in the pointing direction of the finger-pointing gesture, but the mode may instead be based on a change in the pointing position.
 For example, a fixed scale factor may be used to convert the movement amount (mm) of the finger from its physical coordinates at initialization into the movement amount (pix (pixels)) of the operation cursor 11a, or, as in the first embodiment, a polynomial, sigmoid curve, or the like may be used.
 FIG. 5 shows how the operation cursor 11a moves when the position of the finger (fingertip) is moved in the fine adjustment mode. The plane in which the fingers move need not be parallel to the display unit 11; for example, the hand may be tilted 90 degrees so that the direction away from the body corresponds to the upward direction on the display unit 11.
 This increases the operator's freedom in performing the pointing gesture (finger-pointing gesture).
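This position-based variant can be sketched with a fixed millimetre-to-pixel scale factor. The value of 4 px/mm and the coordinate layout are assumptions for illustration; the document states only that a fixed magnification (or a polynomial or sigmoid curve) may be used.

```python
PX_PER_MM = 4.0  # assumed fixed conversion factor, px of cursor per mm of finger

def cursor_delta(finger_xy_mm, init_xy_mm, px_per_mm=PX_PER_MM):
    """Convert the fingertip's displacement from its physical position
    at initialization (both in mm) into a cursor displacement in
    pixels, using a fixed scale factor."""
    dx = (finger_xy_mm[0] - init_xy_mm[0]) * px_per_mm
    dy = (finger_xy_mm[1] - init_xy_mm[1]) * px_per_mm
    return dx, dy
```

The cursor's new position is then the initial position (from the gaze) plus this delta, mirroring the offset used in the direction-based mode.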
 (Modification 2)
 Modification 2 is shown in FIG. 6. In Modification 2, unlike the first embodiment, the control unit 120 displays the movable range of the operation cursor 11a. The movable range is displayed, for example, as a circle containing the operation cursor 11a, centered on its position.
 The size of the movable range may be increased or decreased according to the operator's physique. The movable range may also be displayed only under specific conditions, for example, for a certain time after initialization, or when the operation cursor 11a is near the boundary of the movable range.
 With this, the operator can see the possible range of finger operation, improving operability.
 (Modification 3)
 Modification 3 is shown in FIGS. 7 and 8. In Modification 3, unlike the first embodiment, the control unit 120 displays an advance notice indicating the initial position when the operator's gaze is present and the predetermined pointing time is still in progress.
 As shown in FIG. 7, after t1 ≥ T1 is satisfied and before t2 ≥ T2 is satisfied, the control unit 120 displays the operation cursor 11a at the initial position as an advance notice; after t2 ≥ T2 is satisfied, the original operation cursor 11a is displayed.
 Alternatively, as shown in FIG. 8, even when there is a time difference between the gaze and the finger-pointing gesture, the control unit 120 displays the operation cursor 11a at the initial position as an advance notice after t1 ≥ T1 is satisfied and before t2 ≥ T2 is satisfied, and displays the original operation cursor 11a after t2 ≥ T2 is satisfied.
 The design of the advance-notice operation cursor 11a may differ from the design (appearance) of the original operation cursor 11a. To make them differ, the color, brightness, size, or transmittance can be changed, a specific figure or outer frame can be superimposed, or the movable range described in Modification 2 can be displayed only during the advance notice or only with the original operation cursor 11a.
 With this, the initial position can be confirmed before the operation cursor 11a is moved by finger pointing, making subsequent adjustment of the cursor's movement easier.
 (Modification 4)
 Modification 4 is shown in FIG. 9. In Modification 4, unlike the first embodiment, the control unit 120 cancels execution of the fine adjustment mode if no stationary state of the pointing gesture is recognized after the gaze.
 Let t3 be the time elapsed while waiting for t2 ≥ T2 to be satisfied, and T3 the threshold for determining that t3 is established.
 After t1 ≥ T1 is satisfied, the control unit 120 displays the advance notice, but if t3 ≥ T3 is satisfied without t2 ≥ T2 being satisfied, the control unit 120 cancels the series of operations and erases the operation cursor 11a. That is, if the finger-pointing gesture is not held still after the gaze, the control unit 120 cancels execution of the fine adjustment mode once t3 exceeds the threshold T3.
 With this, even if the operator is gazing, if the finger-pointing gesture is not held still, it is judged that there is no clear intention to operate, unnecessary execution of the fine adjustment mode is avoided, and the control load can be reduced.
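The cancellation rule of Modification 4 can be sketched as a three-way decision on the two accumulated times. The function name and the T3 value of 1.0 s are assumptions for illustration; the document leaves T3 unspecified.

```python
T2 = 0.2  # required still-pointing time after t1 >= T1 (assumed, ~200 ms)
T3 = 1.0  # timeout for reaching t2 >= T2 after t1 >= T1 (assumed value)

def fine_tune_decision(t2, t3):
    """After t1 >= T1 (advance notice shown): return 'start' once the
    still-pointing time t2 reaches T2, 'cancel' if the waiting time t3
    reaches T3 first (cursor erased), and 'wait' otherwise."""
    if t2 >= T2:
        return "start"
    if t3 >= T3:
        return "cancel"
    return "wait"
```

Checking t2 before t3 ensures that a gesture completed exactly at the timeout still starts the fine adjustment mode rather than being cancelled.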
 (Other embodiments)
 In the first embodiment and Modifications 1 to 4, the operation targets are vehicle devices, but the disclosure is not limited to these and may also be applied to devices for homes or facilities. Examples of household devices include televisions and audio equipment; examples of facility devices include bank ATMs and automatic ticket vending machines at stations.
 In a vehicle-mounted configuration, the target operator is not limited to the driver and may be a front passenger. In this case, the passenger can also perform pointing gestures, which are detected by the gesture detection device 100, to operate the various vehicle devices.
 The disclosure in this specification and the drawings is not limited to the illustrated embodiments. The disclosure encompasses the illustrated embodiments and variations of them by those skilled in the art. For example, the disclosure is not limited to the combinations of parts and/or elements shown in the embodiments; it can be implemented in various combinations. The disclosure may include additional parts that can be added to the embodiments, and encompasses embodiments from which parts and/or elements have been omitted, as well as replacements or combinations of parts and/or elements between one embodiment and another. The disclosed technical scope is not limited to the description of the embodiments. Some of the disclosed technical scope is indicated by the description of the claims and should be understood to include all modifications within the meaning and scope equivalent to the description of the claims.
 The control unit 120 and the method described in the present disclosure may be realized by a dedicated computer comprising a processor programmed to execute one or more functions embodied by a computer program. Alternatively, the device and method described in the present disclosure may be realized by dedicated hardware logic circuits, or by one or more dedicated computers configured by a combination of a processor that executes a computer program and one or more hardware logic circuits. The computer program may be stored in a computer-readable non-transitory tangible recording medium as instructions to be executed by a computer.

Claims (10)

  1.  A gesture detection device that performs input to an operation screen (11) separated from an operator based on a pointing gesture made by the operator, the gesture detection device comprising:
     a detection unit (110) that detects the gaze direction of the operator and the pointing gesture; and
     a control unit (120) that, when the detection data detected by the detection unit indicates a gaze on the operation screen for at least a predetermined gaze time and the pointing gesture for at least a predetermined pointing time, displays an operation cursor (11a) for operation at an initial position on the operation screen given by the gaze direction, and executes a fine adjustment mode in which the operation cursor is moved according to the movement of the pointing gesture.
  2.  The gesture detection device according to claim 1, wherein, after displaying the operation cursor, the control unit continues execution of the fine adjustment mode according to the movement of the pointing gesture, even if the gaze direction moves, until a predetermined gesture is made.
  3.  前記制御部は、前記指示ジェスチャによって示される向きの変化、および位置の変化の少なくとも一方に応じて前記微調整モードを実行する請求項1または請求項2に記載のジェスチャ検出装置。 The gesture detection device according to claim 1 or 2, wherein the control unit executes the fine adjustment mode in accordance with at least one of a change in a direction indicated by the instruction gesture and a change in a position.
  4.  前記制御部は、前記注視、および前記指示ジェスチャによる動きの検出レベルが高いほど、前記微調整モードの実行時の前記操作カーソルの移動倍率を高くする請求項1~請求項3のいずれか1つに記載のジェスチャ検出装置。 4. The control unit increases the moving magnification of the operation cursor when executing the fine adjustment mode, as the detection level of the gaze and the movement by the pointing gesture is higher. The gesture detection device according to 1.
  5.  前記制御部は、前記操作カーソルの移動可能範囲を表示する請求項1~請求項4のいずれか1つに記載のジェスチャ検出装置。 The gesture detection device according to any one of claims 1 to 4, wherein the control unit displays a movable range of the operation cursor.
  6.  前記制御部は、前記注視があり、且つ、前記所定の指示時間の途中で、前記初期位置を示す予告表示を行う請求項1~請求項5のいずれか1つに記載のジェスチャ検出装置。 The gesture detection device according to any one of claims 1 to 5, wherein the control unit performs a notice display indicating the initial position in the middle of the predetermined instruction time with the gaze.
  7.  前記制御部は、前記操作カーソルと、前記予告表示とでは、見栄えが異なるように表示する請求項6に記載のジェスチャ検出装置。 The gesture detection device according to claim 6, wherein the control unit displays the operation cursor and the notice display so that they look different.
  8.  前記制御部は、前記注視の後に、前記指示ジェスチャの静止状態が認められないと、前記微調整モードの実行をキャンセルする請求項1~請求項7のいずれか1つに記載のジェスチャ検出装置。 The gesture detection device according to any one of claims 1 to 7, wherein the control unit cancels execution of the fine adjustment mode when the stationary state of the instruction gesture is not recognized after the gaze.
  9.  前記制御部は、前記注視の後に、一定時間以内に前記指示ジェスチャが行われたときは、前記微調整モードの実行を許容する請求項1~請求項8のいずれか1つに記載のジェスチャ検出装置。 9. The gesture detection according to claim 1, wherein the control unit permits execution of the fine adjustment mode when the instruction gesture is performed within a certain time after the gaze. apparatus.
  10.  A gesture detection method for performing input to an operation screen (11) remote from an operator, based on a pointing gesture made by the operator, the method comprising:
     a detection step (S100) of detecting a gaze direction of the operator and the pointing gesture; and
     an execution step (S110 to S130) of, when the detection data from the detection step indicates a gaze at the operation screen lasting at least a predetermined gaze time and the pointing gesture lasting at least a predetermined pointing time, displaying an operation cursor (11a) at an initial position on the operation screen given by the gaze direction of the gaze, and executing a fine adjustment mode in which the operation cursor is moved according to movement of the pointing gesture.
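The claimed method amounts to a dwell-then-refine loop: accumulate gaze and pointing dwell, seed the cursor from the gaze point once both thresholds are met, then let gesture deltas drive fine adjustment. The following is a minimal, hypothetical sketch of that control flow — the function name `detect`, the per-frame input format, and the constants `GAZE_FRAMES`, `POINT_FRAMES`, and `GAIN` are illustrative assumptions, not taken from the publication:

```python
GAZE_FRAMES = 10   # predetermined gaze time, in sample frames -- assumed value
POINT_FRAMES = 5   # predetermined pointing time, in sample frames -- assumed value
GAIN = 0.3         # fine-adjustment gain (screen units per gesture unit) -- assumed

def detect(frames):
    """frames: iterable of (gaze_on_screen, gaze_xy, pointing, finger_xy) samples."""
    gaze_n = point_n = 0
    cursor = prev_finger = None
    for gaze_on, gaze_xy, pointing, finger_xy in frames:
        # Dwell counters reset whenever the gaze leaves the screen
        # or the pointing gesture is dropped.
        gaze_n = gaze_n + 1 if gaze_on else 0
        point_n = point_n + 1 if pointing else 0
        if cursor is None:
            # Both dwell conditions met: show the cursor at the gaze point.
            if gaze_n >= GAZE_FRAMES and point_n >= POINT_FRAMES:
                cursor = list(gaze_xy)      # initial position from gaze direction
                prev_finger = finger_xy
        elif pointing:
            # Fine adjustment mode: the cursor follows scaled gesture deltas,
            # independent of any further gaze movement.
            cursor[0] += GAIN * (finger_xy[0] - prev_finger[0])
            cursor[1] += GAIN * (finger_xy[1] - prev_finger[1])
            prev_finger = finger_xy
    return cursor
```

Counting frames rather than accumulating fractional seconds keeps the dwell test free of floating-point drift; a real implementation would also need the cancel and timeout conditions of claims 8 and 9.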

PCT/JP2019/048757 2018-12-27 2019-12-12 Gesture detection device and gesture detection method WO2020137592A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018244368A JP7024702B2 (en) 2018-12-27 2018-12-27 Gesture detection device and gesture detection method
JP2018-244368 2018-12-27

Publications (1)

Publication Number Publication Date
WO2020137592A1 true WO2020137592A1 (en) 2020-07-02

Family

ID=71129046

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/048757 WO2020137592A1 (en) 2018-12-27 2019-12-12 Gesture detection device and gesture detection method

Country Status (2)

Country Link
JP (1) JP7024702B2 (en)
WO (1) WO2020137592A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112817443A (en) * 2021-01-22 2021-05-18 歌尔科技有限公司 Display interface control method, device and equipment based on gestures and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015118531A * 2013-12-18 2015-06-25 株式会社デンソー Display control device and program
JP2015519673A * 2012-06-14 2015-07-09 Qualcomm, Incorporated Interaction with user interface for transparent head mounted display
WO2018077491A1 * 2016-10-26 2018-05-03 Harman Becker Automotive Systems Gmbh Combined eye and gesture tracking
JP2018516422A * 2015-05-28 2018-06-21 Eyesight Mobile Technologies Ltd. Gesture control system and method for smart home
US20180364810A1 * 2013-06-20 2018-12-20 Uday Parshionikar Gesture control via eye tracking, head tracking, facial expressions and other user actions


Also Published As

Publication number Publication date
JP7024702B2 (en) 2022-02-24
JP2020107030A (en) 2020-07-09

Similar Documents

Publication Publication Date Title
US9189094B2 (en) Display control apparatus and display system with pointer correction
US8384666B2 (en) Input device for operating in-vehicle apparatus
US10496236B2 (en) Vehicle display device and method for controlling vehicle display device
US20170108988A1 (en) Method and apparatus for recognizing a touch drag gesture on a curved screen
JP4789885B2 (en) Interface device, interface method, and interface program
WO2015125213A1 (en) Gesture guidance device for mobile body, gesture guidance system for mobile body, and gesture guidance method for mobile body
JP6406088B2 (en) Operation system
EP3776159A1 (en) Information processing apparatus, information processing system, information processing method, and program
US9996242B2 (en) Composite gesture for switching active regions
US10953749B2 (en) Vehicular display device
WO2020137592A1 (en) Gesture detection device and gesture detection method
JP2020107031A (en) Instruction gesture detection apparatus and detection method therefor
WO2020110706A1 (en) Gesture detection device and gesture detection method
JP2017197015A (en) On-board information processing system
WO2017188098A1 (en) Vehicle-mounted information processing system
CN117136347A (en) Method, system and computer program for touch stabilization
US10452225B2 (en) Vehicular input device and method of controlling vehicular input device
WO2022168579A1 (en) Control device for vehicle and control method for vehicle
WO2019097843A1 (en) Virtual image display system, virtual image display device, operation input device, method for displaying virtual image, program, and recording medium
WO2019189403A1 (en) Information processing apparatus, information processing system, information processing method, and program
WO2017175666A1 (en) In-vehicle information processing system
CN117555503A (en) Multi-screen cabin display system
JP2015156082A (en) operation input device
JP2014154052A (en) Scroll control device and scroll control method
JP2017199203A (en) Vehicle-mounted information processing system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19902608

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19902608

Country of ref document: EP

Kind code of ref document: A1