WO2019146032A1 - Gesture operation device and gesture operation method - Google Patents

Gesture operation device and gesture operation method

Info

Publication number
WO2019146032A1
Authority
WO
WIPO (PCT)
Prior art keywords
gesture operation
operator
gesture
display
unit
Prior art date
Application number
PCT/JP2018/002242
Other languages
English (en)
Japanese (ja)
Inventor
直志 宮原
下谷 光生
Original Assignee
三菱電機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社
Priority to PCT/JP2018/002242 (WO2019146032A1)
Priority to JP2019567457A (JP6900133B2)
Publication of WO2019146032A1

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance

Definitions

  • The present invention relates to a gesture operation device and a gesture operation method, and more particularly to a gesture operation device and a gesture operation method for guiding an operator through a gesture operation.
  • There is known a gesture operation device with which an operator such as a driver performs volume control or responds to an incoming call by performing gesture operations.
  • A gesture operation is an operation for executing a desired function by moving a hand or the like in an arbitrary direction in three-dimensional space.
  • There is also a technology for displaying information that guides the gesture operation to be performed by the operator when executing a desired function (see, for example, Patent Document 1).
  • In Patent Document 1, the information for guiding a gesture operation is displayed in two dimensions. Therefore, in the case of a gesture operation involving movement in a three-dimensional direction, the information for guiding the gesture operation is expressed in two dimensions, and the operator may not be able to correctly understand the gesture operation. In this case, the gesture operation device may not be able to accurately determine the gesture operation performed by the operator.
  • The present invention has been made to solve such a problem, and it is an object of the present invention to provide a gesture operation device and a gesture operation method capable of accurately determining a gesture operation performed by an operator.
  • The gesture operation device according to the present invention includes an event detection unit that detects an event, and a display control unit that performs control to display, in three dimensions, an operation object that guides a gesture operation indicating the intention of the operator with respect to an operation that the operator can perform to advance the event detected by the event detection unit.
  • The gesture operation method according to the present invention detects an event and performs control to display, in three dimensions, an operation object that guides a gesture operation indicating the intention of the operator with respect to an operation that the operator can perform to advance the detected event.
  • Since the gesture operation device includes the event detection unit that detects an event and the display control unit that performs control to display in three dimensions the operation object that guides the gesture operation indicating the intention of the operator with respect to an operation that the operator can perform to advance the detected event, it is possible to accurately determine the gesture operation performed by the operator.
  • Likewise, since the gesture operation method detects an event and performs control to display in three dimensions the operation object that guides the gesture operation indicating the intention of the operator with respect to an operation that the operator can perform to advance the detected event, it is possible to accurately determine the gesture operation performed by the operator.
  • FIG. 1 is a block diagram showing an example of the configuration of the gesture operation device 1 according to the first embodiment.
  • FIG. 1 shows the minimum required configuration of the gesture operation device according to this embodiment.
  • Although a case where the operator performs a gesture operation by moving his or her own hand is described below, the present invention is not limited to this.
  • the gesture operation device 1 includes an event detection unit 2 and a display control unit 3.
  • The event detection unit 2 detects an event. Examples of events include, as described later, an incoming mail, an incoming call, and an operation of an in-vehicle device.
  • The display control unit 3 is connected to the display device 4 and, for the event detected by the event detection unit 2, performs control to display in three dimensions an operation object that guides a gesture operation indicating the intention of the operator with respect to an operation that the operator can perform to advance the event. A rough sketch of this minimum configuration is given below.
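As a rough illustration of this minimum configuration, the following Python sketch models the event detection unit and the display control unit. The class and method names (Event, EventDetector, DisplayController, show_operation_object_3d) are hypothetical and are not defined in the patent; the sketch only mirrors the division of roles described above.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Event:
    """An event to be advanced by the operator, e.g. an incoming mail or call."""
    kind: str      # e.g. "mail", "call", "device_operation"
    payload: dict  # event-specific data (sender, subject, ...)

class EventDetector:
    """Stands in for the event detection unit 2: reports a detected event, if any."""
    def __init__(self, sources):
        self.sources = sources  # e.g. a terminal communication unit, vehicle information, ...

    def poll(self) -> Optional[Event]:
        for source in self.sources:
            event = source.next_event()
            if event is not None:
                return event
        return None

class DisplayController:
    """Stands in for the display control unit 3: asks a 3D-capable display to show
    an operation object that guides the gesture for the detected event."""
    def __init__(self, display):
        self.display = display  # a display device capable of three-dimensional display

    def guide(self, event: Event) -> None:
        # Select the operation object for the operation the operator can perform
        # to advance this event, then display it in three dimensions.
        operation_object = self.select_operation_object(event)
        self.display.show_operation_object_3d(operation_object)

    def select_operation_object(self, event: Event):
        return {"event": event.kind, "gesture": "guidance for " + event.kind}
```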
  • FIGS. 2 to 4 are diagrams showing an example of the gesture operation.
  • the X axis indicates the horizontal direction as viewed from the operator
  • the Y axis indicates the vertical direction as viewed from the operator
  • the Z axis indicates the longitudinal direction as viewed from the operator.
  • the operator can perform a gesture operation to rotate his / her hand in the front-rear direction and in the vertical direction.
  • the operator can perform a gesture operation to move his / her hand in the front-rear direction.
  • the operator can perform a gesture operation to move his or her hand in the left and right direction. In this way, the operator can perform the gesture operation by moving the hand in any direction in the three-dimensional space.
  • FIGS. 5 to 7 show an example of the display device 4.
  • the X-axis indicates the horizontal direction as viewed from the operator
  • the Y-axis indicates the vertical direction as viewed from the operator
  • the Z-axis indicates the longitudinal direction as viewed from the operator.
  • FIG. 5 shows an example of the display device 4 that performs autostereoscopic display.
  • Autostereoscopic display means that an image appears stereoscopic when the operator 5 looks at the display screen with the naked eye, that is, the image is perceived as having depth in a three-dimensional display space.
  • By alternately displaying a display object for the right eye of the operator 5 and a display object for the left eye on the XY plane, the operator 5 recognizes the display object as being displayed three-dimensionally.
  • FIG. 6 shows an example of a display device 4 that displays a virtual image, such as a HUD (Head Up Display).
  • FIG. 7 shows an example of the display device 4 configured by stacking a plurality of transmissive display surfaces in the z-axis direction. By displaying one display object superimposed on each display surface, the operator 5 can recognize that the display object is three-dimensionally displayed.
  • The display device 4 is any one of, or any combination of, the display devices shown in FIGS. 5 to 7, and, under the control of the display control unit 3, displays in three dimensions an operation object that guides a gesture operation performed in three-dimensional space as shown in FIGS. 2 to 4. Details of the operation object will be described later.
  • FIG. 8 is a block diagram showing an example of the configuration of the gesture operation device 6 according to another configuration.
  • As shown in FIG. 8, the gesture operation device 6 includes an event detection unit 2, a display control unit 3, a communication unit 7, a gesture operation acquisition unit 8, an operation determination unit 9, and a device control unit 10.
  • the communication unit 7 has a terminal communication unit 11 that can be communicably connected to the mobile communication terminal 12.
  • the terminal communication unit 11 receives, from the mobile communication terminal 12, for example, information indicating that a mail has arrived, and information indicating that a call has arrived. Further, the terminal communication unit 11 transmits information for operating the mobile communication terminal 12 to the mobile communication terminal 12 in accordance with the determination result of the operation determining unit 9.
  • the event detection unit 2 detects, as an event, information that the terminal communication unit 11 has received an e-mail, information that a call has been received, or the like from the mobile communication terminal 12.
  • The display control unit 3 controls the display device 4 to display in three dimensions an operation object that guides a gesture operation indicating the intention of the operator with respect to an action that the operator can perform to advance the event detected by the event detection unit 2.
  • the gesture operation acquisition unit 8 is connected to the gesture operation detection device 13 and acquires the gesture operation of the operator 5 from the gesture operation detection device 13.
  • FIG. 9 is a diagram showing an example of the configuration of the gesture operation detection device 13.
  • the gesture operation detection device 13 is installed, for example, on a floor console in a car, and includes at least one of a ToF (Time of Flight) sensor, an image sensor, and a proximity sensor.
  • the gesture operation space 17 corresponds to a detection range by the gesture operation detection device 13. That is, the gesture operation detection device 13 detects the gesture operation performed by the operator 5 in the gesture operation space 17.
  • Note that the gesture operation detection device 13 detects not only gesture operations intended to perform an operation that the operator can perform to advance an event, but also other hand movements as gesture operations.
  • The gesture operation detection device 13 may comprise a single sensor, in which case it detects a gesture operation from one direction. It may also comprise a plurality of sensors; for example, if three sensors are provided at three points A, B, and C as shown in FIG. 10, the gesture operation detection device 13 detects a gesture operation from three directions, and the detection accuracy of the gesture operation can be improved compared with the case of a single sensor. A simple way of combining such multi-sensor detections is sketched below.
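As a hedged illustration of why several sensors can improve accuracy, the sketch below fuses per-sensor 3D hand-position estimates by requiring agreement from a minimum number of sensors and averaging their readings. The function name, the tuple format of the readings, and the agreement rule are assumptions made for illustration; the patent does not prescribe a particular fusion method.

```python
import numpy as np

def fuse_hand_positions(detections, min_sensors=2):
    """Combine per-sensor 3D hand-position estimates into a single estimate.

    `detections` is a list of (sensor_id, position) pairs, where `position` is an
    (x, y, z) estimate in the gesture operation space, or None when that sensor
    did not detect the hand.
    """
    positions = [np.asarray(p, dtype=float) for _, p in detections if p is not None]
    if len(positions) < min_sensors:
        return None  # not enough sensors saw the hand
    return np.mean(positions, axis=0)

# Example: sensors at points A, B, and C each report a slightly different estimate.
readings = [("A", (0.10, 0.22, 0.31)), ("B", (0.12, 0.20, 0.33)), ("C", None)]
print(fuse_hand_positions(readings))  # averaged estimate from sensors A and B
```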
  • the operation determination unit 9 determines whether the gesture operation acquired by the gesture operation acquisition unit 8 is a movement according to the operation object. Further, the operation determination unit 9 is connected to the voice input device 14, and can determine what kind of operation to perform based on the content that the operator 5 utters via the voice input device 14. Thus, the operation determination unit 9 can determine the content of the voice operation by the operator 5.
  • the device control unit 10 controls devices such as the on-vehicle device 15 and the voice output device 16 based on the gesture operation.
  • the device control unit 10 controls the mobile communication terminal 12 in communication with the communication unit 7 based on the gesture operation.
  • Examples of the on-vehicle device 15 include a navigation device, an audio device, and various control devices provided in the vehicle.
  • Examples of the voice output device 16 include a speaker.
  • FIG. 11 is a block diagram showing an example of the hardware configuration of the gesture operation device 6. The same applies to the gesture operation device 1 shown in FIG. 1.
  • The functions of the terminal communication unit 11, the event detection unit 2, the display control unit 3, the gesture operation acquisition unit 8, the operation determination unit 9, and the device control unit 10 in the gesture operation device 6 are realized by a processing circuit. That is, the gesture operation device 6 includes a processing circuit for communicating with the mobile communication terminal 12, detecting an event, controlling display, acquiring a gesture operation, determining a gesture operation, and controlling an apparatus.
  • The processing circuit is a processor 18 (also called a central processing unit, processing unit, arithmetic unit, microprocessor, microcomputer, or DSP (Digital Signal Processor)) that executes a program stored in the memory 19.
  • Each function of the terminal communication unit 11, the event detection unit 2, the display control unit 3, the gesture operation acquisition unit 8, the operation determination unit 9, and the device control unit 10 in the gesture operation device 6 is realized by software, firmware, or a combination of software and firmware. The software or firmware is described as a program and stored in the memory 19.
  • The processing circuit implements the functions of the respective units by reading and executing the program stored in the memory 19. That is, the gesture operation device 6 includes the memory 19 for storing programs which, when executed, result in communicating with the mobile communication terminal 12, detecting an event, controlling display, acquiring a gesture operation, determining a gesture operation, and controlling an apparatus.
  • The memory 19 is, for example, a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory), a magnetic disk, a flexible disk, an optical disk, a compact disc, a mini disc, a DVD, or any storage medium to be used in the future.
  • FIG. 12 is a flowchart showing an example of the operation of the gesture operation device 1 shown in FIG.
  • In step S101, the event detection unit 2 determines whether an event has been detected.
  • The event detection unit 2 repeats the process of step S101 until an event is detected; when it determines that an event has been detected, the process proceeds to step S102.
  • In step S102, the display control unit 3 controls the display device 4 to display in three dimensions an operation object that guides a gesture operation indicating the intention of the operator 5 with respect to an operation that the operator 5 can perform to advance the event detected by the event detection unit 2.
  • That is, a three-dimensional operation object for guiding a three-dimensional gesture operation as shown in FIGS. 2 to 4 is displayed on the display device 4, which is capable of three-dimensional display. A minimal rendering of this flow is sketched below.
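A minimal rendering of the flow of FIG. 12, reusing the hypothetical EventDetector and DisplayController classes sketched earlier, could look like the following loop; the polling interval and return value are illustrative choices.

```python
import time

def run_gesture_guidance(detector, controller, poll_interval_s=0.05):
    """Step S101: wait until an event is detected; step S102: display the
    three-dimensional operation object that guides the gesture for that event."""
    while True:
        event = detector.poll()          # S101: repeat until an event is detected
        if event is not None:
            controller.guide(event)      # S102: display the operation object in 3D
            return event
        time.sleep(poll_interval_s)
```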
  • FIG. 13 is a flowchart showing an example of the operation of the gesture operation device 6 shown in FIG. 8. Note that FIG. 13 describes, with reference to FIG. 14, a case where the operation objects are displayed as a tutorial.
  • Here, a tutorial means displaying, as operation objects on the display device 4, a series of gesture operations to be performed by the operator 5 in response to an event.
  • In step S201, the event detection unit 2 determines whether an event has been detected. Specifically, the event detection unit 2 repeats the process of step S201 until the terminal communication unit 11 receives, from the mobile communication terminal 12, information indicating that a mail or a call has been received. When the terminal communication unit 11 receives such information from the mobile communication terminal 12, the event detection unit 2 determines that an event has been detected, and the process proceeds to step S202.
  • When the terminal communication unit 11 receives information indicating that a mail has arrived from the mobile communication terminal 12, an unread icon 20 is displayed on the display device 4.
  • The example in FIG. 14 indicates that there are two unread mails.
  • In step S202, the display control unit 3 controls the display device 4 to three-dimensionally display an operation object indicating a series of gesture operations to be performed by the operator 5 in order to advance the event detected by the event detection unit 2.
  • In FIG. 14, the tutorial shows that, when the operator 5 performs a gesture operation to select the unread icon 20 displayed on the display device 4, a list of unread mails is displayed, and that, when the operator 5 further selects a specific mail from the unread mail list, the contents of that mail are read out by voice.
  • Specifically, an operation object 21 indicating a gesture operation for selecting the unread icon 20 is three-dimensionally displayed on the display device 4. At this time, the operation object 21 moves from the front to the back of the display device 4 so as to select the unread icon 20.
  • In addition, the text "Select a mail and it can be read out by voice" is displayed.
  • In step S203, the gesture operation acquisition unit 8 determines whether it has acquired, from the gesture operation detection device 13, a gesture operation by the operator 5 detected by the gesture operation detection device 13, that is, whether the operator 5 has performed a gesture operation.
  • If the gesture operation has been acquired, the process proceeds to step S205.
  • If the gesture operation has not been acquired, the process proceeds to step S204.
  • In step S204, the gesture operation acquisition unit 8 determines whether a predetermined time has elapsed. Note that this time may be set in advance or may be set arbitrarily by the user. If the predetermined time has elapsed, the operation shown in FIG. 13 ends. If the predetermined time has not elapsed, the process returns to step S203.
  • In step S205, the operation determination unit 9 determines whether the gesture operation by the operator 5 acquired by the gesture operation acquisition unit 8 is a movement according to the tutorial displayed on the display device 4. That is, the operation determination unit 9 determines whether the gesture operation by the operator 5 is an operation according to the operation objects shown in the tutorial. If the gesture operation is not an operation according to the tutorial, the process proceeds to step S206; if it is, the process proceeds to step S207.
  • In step S206, the display control unit 3 performs control to notify the operator that the gesture operation has failed. Specifically, the display control unit 3 performs control to display on the display device 4 that the gesture operation by the operator 5 is not an operation according to the tutorial.
  • In step S207, the device control unit 10 controls the target device based on the gesture operation determined in step S205 to be an operation according to the tutorial.
  • For example, the contents of the mail selected by the gesture operation of the operator 5 are read out.
  • The device control unit 10 may control the voice output device 16 so that the contents of the mail are read out from the voice output device 16, or may control the terminal communication unit 11 so that the contents of the mail are read out from the mobile communication terminal 12.
  • In step S208, the operation determination unit 9 determines whether the gesture operation by the operator 5 has ended. Specifically, the operation determination unit 9 determines whether all the operations according to the tutorial displayed on the display device 4 have been performed by the gesture operations acquired by the gesture operation acquisition unit 8. When the gesture operation has ended, the operation shown in FIG. 13 ends; if not, the process returns to step S207.
  • In this way, the operator 5 can correctly understand what kind of gesture operation should be performed when a mail is received.
  • When the display device 4 is a HUD as shown in FIG. 6, a driver who is the operator 5 can perform gesture operations without taking his or her eyes off the view ahead. The overall flow of FIG. 13 is sketched below.
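The following sketch is a hedged rendering of the tutorial flow of FIG. 13 (steps S201 to S208). The object names (detector, controller, acquirer, determiner, device_ctrl) and their methods are hypothetical stand-ins for the corresponding units; the timeout value is illustrative.

```python
import time

def run_tutorial_flow(detector, controller, acquirer, determiner, device_ctrl,
                      timeout_s=10.0):
    event = None
    while event is None:                          # S201: wait for an event (e.g. incoming mail)
        event = detector.poll()

    tutorial = controller.show_tutorial(event)    # S202: display the series of gestures in 3D

    deadline = time.time() + timeout_s
    gesture = acquirer.acquire()                  # S203: has the operator gestured?
    while gesture is None:
        if time.time() > deadline:                # S204: predetermined time elapsed
            return
        time.sleep(0.05)
        gesture = acquirer.acquire()

    if not determiner.follows_tutorial(gesture, tutorial):   # S205
        controller.notify_failure()                           # S206: gesture did not match
        return

    while True:
        device_ctrl.apply(gesture, event)            # S207: e.g. read the selected mail aloud
        if determiner.tutorial_completed(tutorial):  # S208: all tutorial operations done?
            return
        gesture = acquirer.acquire()
```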
  • FIG. 15 is a flowchart showing an example of the operation of the gesture operation device 6 shown in FIG. 8.
  • Here, a case where an operation object for guiding the single gesture operation that the operator 5 should perform next when an event occurs is displayed will be described using FIG. 16.
  • In step S301, the event detection unit 2 repeats the process until an event is detected, as in step S201 of FIG. 13. When the event detection unit 2 determines that an event has been detected, the process proceeds to step S302.
  • When a call is received, the display device 4 displays information on the caller together with a handset icon.
  • In the example of FIG. 16, "A call from Mr. Iwasaki" is displayed as the caller information.
  • In step S302, the display control unit 3 controls the display device 4 to three-dimensionally display an operation object indicating the single gesture operation that the operator 5 should perform next in response to the event detected by the event detection unit 2.
  • In the example of FIG. 16, an operation object 21 in which a hand touches the handset icon is displayed.
  • Note that the display control unit 3 may perform control to display the operation object 21 so as to express a movement of placing a hand on the handset icon.
  • In step S303, the gesture operation acquisition unit 8 determines whether it has acquired, from the gesture operation detection device 13, a gesture operation by the operator 5 detected by the gesture operation detection device 13, that is, whether the operator 5 has performed a gesture operation.
  • At this time, a detection icon 22 corresponding to the hand of the operator 5 detected by the gesture operation detection device 13 is displayed on the display device 4.
  • The display device 4 may display the handset icon, the operation object 21, and the detection icon 22 in order from the back to the front. That is, the display control unit 3 performs control to display the operation object 21 on the back side of the three-dimensional display space and to display the detection icon 22, which is an icon of the hand of the operator 5 acquired by the gesture operation acquisition unit 8, on the front side of the three-dimensional display space.
  • If the gesture operation has been acquired, the process proceeds to step S305.
  • If the gesture operation has not been acquired, the process proceeds to step S304.
  • In step S304, as in step S204 of FIG. 13, the gesture operation acquisition unit 8 determines whether a predetermined time has elapsed. If the predetermined time has elapsed, the operation shown in FIG. 15 ends; if not, the process returns to step S303.
  • In step S305, the operation determination unit 9 determines whether the gesture operation by the operator 5 acquired by the gesture operation acquisition unit 8 is a movement according to the operation object displayed on the display device 4. If the gesture operation is not an operation according to the operation object, the process proceeds to step S306; if it is, the process proceeds to step S307.
  • In step S306, as in step S206 of FIG. 13, the display control unit 3 performs control to notify the operator that the gesture operation has failed. Specifically, the display control unit 3 performs control to display on the display device 4 that the gesture operation by the operator 5 is not an operation according to the operation object.
  • In step S307, the device control unit 10 controls the target device based on the gesture operation determined in step S305 to be an operation according to the operation object.
  • In the example of FIG. 16, when the detection icon 22 overlaps the handset icon, the call is connected. At this time, the device control unit 10 controls the terminal communication unit 11 so that a call can be made with the mobile communication terminal 12.
  • In step S308, the operation determination unit 9 determines whether the gesture operation by the operator 5 has ended. Specifically, the operation determination unit 9 determines whether the gesture operation acquired by the gesture operation acquisition unit 8 has completed the operation according to the operation object displayed on the display device 4. When the gesture operation has ended, the operation shown in FIG. 15 ends; if not, the process returns to step S302.
  • FIG. 13 shows the operation in the case of displaying the operation objects as a tutorial.
  • FIG. 15 shows the operation in the case of displaying an operation object that guides the single gesture operation the operator 5 should perform next when an event occurs.
  • These operations may be combined.
  • For example, when an event is experienced for the first time, the operation of FIG. 13 may be performed to display the operation objects as a tutorial, and when the same event is experienced the next time or later, the operation of FIG. 15 may be performed to display an operation object that guides a single gesture operation.
  • The operation of FIG. 13 or FIG. 15 may also be selected for each operator 5. In this case, the operators 5 need to be individually recognized. One way to realize this combination is sketched below.
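One way to realize the combination described above is to keep, per recognized operator, a record of which events have already been presented as a tutorial. The class name and the set-based store below are illustrative only; the patent does not specify how the per-operator history is held.

```python
class GuidanceSelector:
    """Choose between the tutorial flow (FIG. 13) and single-gesture guidance
    (FIG. 15) depending on whether this operator has seen the event before."""

    def __init__(self):
        self._seen = set()   # (operator_id, event_kind) pairs already tutored

    def guide(self, operator_id, event, controller):
        key = (operator_id, event.kind)
        if key not in self._seen:
            controller.show_tutorial(event)        # first encounter: full tutorial
            self._seen.add(key)
        else:
            controller.show_next_gesture(event)    # later encounters: one gesture only
```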
  • In FIG. 17, a dial is displayed together with an operation object 21 that guides a clockwise or counterclockwise gesture operation.
  • Examples of the dial include a dial for adjusting the volume.
  • In FIG. 18, folders A to C are displayed together with an operation object 21 that guides a gesture operation of rotating them in the front-rear direction as viewed from the operator 5.
  • Examples of the folders include a telephone directory.
  • The operation objects in FIGS. 17 and 18 are displayed as still images in step S202 of FIG. 13 or step S302 of FIG. 15. However, when the operator 5 actually performs the gesture operation in step S205 of FIG. 13 or step S305 of FIG. 15, the dial rotates, or the folders move in the order of folders A to C, in accordance with the actual gesture operation. That is, when the operation determination unit 9 determines that the gesture operation by the operator 5 is a movement according to the operation object, the display control unit 3 performs control so that the operation object moves according to the gesture operation acquired by the gesture operation acquisition unit 8. A sketch of this coupling is given below.
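As a hedged sketch of coupling the displayed operation object to the acquired gesture, the function below rotates a dial-like object by the angle derived from the operator's hand motion once the gesture is judged to follow the guidance. The attribute names (angle_deg, rotation_deg) and the method names are assumptions made for illustration.

```python
def update_operation_object(determiner, controller, operation_object, gesture):
    """Once the gesture is judged to follow the operation object, animate the
    object (e.g. a volume dial) in step with the acquired gesture."""
    if determiner.follows(gesture, operation_object):
        # Rotate the displayed dial by the same angle as the operator's hand.
        operation_object.angle_deg += gesture.rotation_deg
        controller.render(operation_object)
    else:
        # The gesture did not follow the guidance; notify the operator instead.
        controller.notify_failure()
```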
  • In FIG. 19, a list of operation items is displayed; the operation items correspond to, for example, functions of the on-vehicle device 15.
  • The list of operation items is arranged so as to form a ring in the front-rear direction as viewed from the operator 5.
  • When the operator 5 selects "Setting", the screen changes to a screen for configuring various settings.
  • When "AM Radio" is selected, a function for listening to AM radio is executed.
  • When "FM Radio" is selected, a function for listening to FM radio is executed.
  • When "Music" is selected, a function for playing music is executed, and when "Navigation" is selected, a navigation function such as route guidance to a destination is executed.
  • When the operator 5 actually performs the gesture operation, the list of operation items moves in accordance with the actual gesture operation.
  • The display is not limited to the example shown in FIG. 19; for example, a display in which the dial shown in FIG. 17 is rotated may be used.
  • Example 1: The gesture operation may be performed with the same operation as that of the mobile communication terminal 12 that the operator 5 normally uses.
  • FIG. 20 illustrates an example in which the operation screen of the mobile communication terminal 12 is three-dimensionally displayed on the display device 4.
  • The operation screen of the mobile communication terminal 12 shown in FIG. 20 is a screen displayed when a call is received.
  • The terminal communication unit 11 acquires, from the mobile communication terminal 12, information on the operation screen displayed on the mobile communication terminal 12 when the call is received, together with the information indicating that a call has been received.
  • In this case, the display control unit 3 performs control to display, on the display device 4, an operation object indicating that the handset icon for answering is to be slid to the right.
  • The operator 5 can answer the incoming call by performing a gesture operation according to the displayed operation object. At this time, since the operator 5 performs the gesture operation while looking at the operation screen displayed in three dimensions, the gesture operation can be performed with the same feel of operation as the mobile communication terminal 12 that the operator 5 normally uses.
  • Alternatively, an operation screen as shown in FIG. 22 may be displayed on the display device 4.
  • In this case, the display control unit 3 performs control to display, on the display device 4, an operation object indicating that the handset icon for answering is to be selected.
  • The operator 5 can answer the incoming call by performing a gesture operation according to FIG. 22.
  • Note that the gesture operation device 6 may store the operation screen of the mobile communication terminal 12 in a storage unit (not shown), or the communication unit 7 may acquire information on the operation screen corresponding to the mobile communication terminal 12 from an external server or the like.
  • Example 2: When the on-vehicle device 15 has a radio and a function for playing music, these functions may be executed by gesture operations. For example, when the operator 5 says "music", "radio", "mobile music", or the like while holding up a hand, the event detection unit 2 detects this as an event.
  • Then, the display control unit 3 displays on the display device 4, for example, a music playback list and a hand icon as an operation object.
  • The display control unit 3 may also display an operation object indicating that the list can be scrolled by a flicking gesture operation in the up-down direction.
  • The operator 5 scrolls the list by performing a flicking gesture operation up and down according to the operation object displayed on the display device 4, and selects a specific piece of music by reading out its name.
  • A piece of music may also be selected by a gesture operation, or, when the display device 4 is equipped with a touch panel, by a touch operation.
  • In addition, an operation object as shown in FIG. 17 may be displayed on the display device 4.
  • In this case, the operator 5 can adjust the volume by performing a gesture operation such as rotating the dial with the thumb, forefinger, and middle finger.
  • The rotation axis of the dial may be inclined toward the operator 5.
  • Example 3: When the on-vehicle device 15 has a navigation function, the function may be executed by a gesture operation. For example, when the operator 5 says "navigation", "destination", or "map" while holding up a hand, the event detection unit 2 detects this as an event.
  • Then, the display control unit 3 displays, for example, a list of destinations and a hand icon as an operation object.
  • The display control unit 3 may display an operation object indicating that the list can be scrolled by a flicking gesture operation in the vertical direction and that a specific destination can be selected by a touching gesture operation. The operator 5 scrolls the list by flicking up and down according to the operation object displayed on the display device 4, and selects a specific destination by performing a touching gesture operation.
  • The destination may also be selected by reading out the name of the destination, for example.
  • Alternatively, the display control unit 3 displays, for example, a map and a hand icon as an operation object.
  • The display control unit 3 may display an operation object indicating that the map can be enlarged or reduced by performing a pinch-in or pinch-out gesture operation.
  • An operation object may also be displayed indicating that the map can be scrolled by performing a gesture operation of moving the hand in the front, back, left, and right directions with the palm facing downward.
  • The display control unit 3 may also display an operation object indicating that the order of the home screen 23, the current location screen 24, and the map scroll screen 25 changes each time a gesture operation of moving the hand upward with the palm facing upward is performed.
  • The home screen 23 may be displayed at the front when the hand is turned over.
  • The display control unit 3 may also display an operation object indicating that a specific item is selected when, for example, a touching gesture operation is performed.
  • In this way, the gesture operation device 6 can accurately determine the gesture operation performed by the operator 5.
  • FIG. 25 is a block diagram showing an example of the configuration of the gesture operation device 26 according to the second embodiment of the present invention.
  • the gesture operation device 26 is characterized in that the device control unit 10 has a sound image control unit 27.
  • the other configuration and operation are the same as in the first embodiment, and thus detailed description will be omitted here.
  • the sound image control unit 27 is connected to the sound output device 16 and controls the sound image of the sound output from the sound output device 16. Specifically, the sound image control unit 27 controls the sound image according to the display position of the operation object displayed on the display device 4 by the control of the display control unit 3.
  • FIG. 26 shows operation objects for sorting an incoming mail into a storage folder or a trash can.
  • In FIG. 26, the incoming mail is displayed at the front left, the trash can is displayed at the front right, and the storage folder is displayed at the back right.
  • The hand icon moves along the arrow from the incoming mail to the trash can or to the storage folder.
  • The sound image control unit 27 controls the sound image according to the movement of the hand icon, so that the sound heard when the hand icon moves toward the trash can differs from the sound heard when the hand icon moves toward the storage folder. A simple stand-in for such position-dependent sound control is sketched below.
  • the sound image control unit 27 may control the sound image according to the gesture operation when the operator 5 actually performs the gesture operation.
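As a hedged illustration of sound image control tied to the display position of an operation object, the helper below maps the object's horizontal position in the three-dimensional display space to a simple stereo pan value. The coordinate convention and the linear mapping are assumptions for illustration; actual sound image control could involve full 3D audio rendering.

```python
def pan_for_object(object_position, listener_x=0.0, max_offset=1.0):
    """Map the horizontal display position of an operation object to a stereo pan
    value in [-1.0, 1.0] (-1 = full left, +1 = full right)."""
    x, _, _ = object_position
    pan = (x - listener_x) / max_offset
    return max(-1.0, min(1.0, pan))

# Example: a trash can displayed at the front right pans its sound cue to the right,
# while an incoming mail displayed at the front left pans to the left.
print(pan_for_object((0.8, 0.0, 0.2)))   # ~0.8 (mostly right channel)
print(pan_for_object((-0.7, 0.0, 0.1)))  # ~-0.7 (mostly left channel)
```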
  • In this way, the gesture operation device 26 can accurately determine the gesture operation performed by the operator 5.
  • FIG. 27 is a block diagram showing an example of a configuration of the gesture operation device 28 according to Embodiment 3 of the present invention.
  • As shown in FIG. 27, the gesture operation device 28 according to the third embodiment is characterized by including a state acquisition unit 29.
  • The other configuration and operation are the same as in the first embodiment, and detailed description is therefore omitted here.
  • The state acquisition unit 29 is connected to the state detection device 30 and acquires the state of the operator 5, including at least the eye position or line of sight of the operator 5, detected by the state detection device 30.
  • The state detection device 30 is configured by, for example, a camera, and specifies the eye position or line of sight of the operator 5 from an image captured by the camera.
  • The display control unit 3 controls the display position of the operation object based on the state of the operator 5 acquired by the state acquisition unit 29, as sketched below.
  • In this way, an operation object that the operator 5 can easily see can be displayed. The operator 5 can thereby correctly understand what kind of gesture operation should be performed, and the gesture operation device 28 can accurately determine the gesture operation performed by the operator 5.
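A hedged sketch of one way the display position could follow the operator's state: place the operation object on the operator's line of sight, a fixed distance in front of the eyes. The fixed-distance policy and the coordinate handling are illustrative assumptions, not a placement rule prescribed by the patent.

```python
import numpy as np

def place_operation_object(eye_position, gaze_direction, distance=0.6):
    """Return a 3D display position for the operation object on the operator's
    line of sight, `distance` ahead of the eyes.

    `eye_position` and `gaze_direction` correspond to the state of the operator
    acquired from the state detection device (e.g. a camera)."""
    eye = np.asarray(eye_position, dtype=float)
    gaze = np.asarray(gaze_direction, dtype=float)
    gaze = gaze / np.linalg.norm(gaze)
    return eye + distance * gaze

# Example: the operator is looking slightly down and to the right.
print(place_operation_object((0.0, 1.2, 0.0), (0.3, -0.1, 1.0)))
```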
  • FIG. 28 is a block diagram showing an example of the configuration of the gesture operation device 31 according to the fourth embodiment of the present invention.
  • As shown in FIG. 28, the gesture operation device 31 according to the fourth embodiment is characterized by including an operation history information storage unit 32.
  • the other configuration and operation are the same as in the third embodiment, and thus detailed description will be omitted here.
  • The operation history information storage unit 32 is composed of a storage device such as a hard disk drive (HDD) or a semiconductor memory, for example, and stores the results of the determination by the operation determination unit 9 as to whether a gesture operation follows the operation object.
  • FIG. 29 is a flowchart showing an example of the operation of the gesture operation device 31. Steps S401 to S406 in FIG. 29 correspond to steps S201 to S206 in FIG. 13, and steps S408 and S409 in FIG. 29 correspond to steps S207 and S208 in FIG. 13; description of these steps is therefore omitted. Hereinafter, step S407 will be described.
  • In step S407, the operation history information storage unit 32 stores the result determined by the operation determination unit 9 as operation history information. Specifically, when the operation determination unit 9 determines in step S405 that the gesture operation by the operator 5 is not an operation according to the operation objects shown in the tutorial, the determination result is stored in the operation history information storage unit 32.
  • The operation history information stored in the operation history information storage unit 32 may be stored for each operator 5. Each operator 5 may be identified based on, for example, an image of the face of the operator 5 detected by the state detection device 30.
  • In the subsequent step S405, the operation determination unit 9 can use the operation history information stored in the operation history information storage unit 32 to determine whether the gesture operation by the operator 5 is an operation according to the operation objects shown in the tutorial. The operation determination unit 9 can also perform machine learning using the accumulated operation history information. Thereby, even if the gesture operation by the operator 5 differs slightly from the operation objects shown in the tutorial, a determination that takes the operator 5's gesture habits into account can be made, and the gesture operation can still be judged to be an operation according to the operation objects. A hedged sketch of such habit-aware matching is given below.
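The following sketch is one simple, hedged stand-in for the habit-aware determination described above: it stores each operator's past deviations from the guided motion and subtracts the habitual offset before testing against a tolerance. The class, the averaging scheme, and the tolerance are illustrative; the patent only states that accumulated operation history information (possibly with machine learning) can be used.

```python
import numpy as np

class HabitAwareMatcher:
    """Judge whether a gesture follows the guided operation object, loosening the
    match using the operator's stored operation history."""

    def __init__(self, base_tolerance=0.15):
        self.base_tolerance = base_tolerance
        self.history = {}   # operator_id -> list of past deviation vectors

    def record(self, operator_id, deviation):
        """Store the deviation between the guided trajectory and the acquired gesture."""
        self.history.setdefault(operator_id, []).append(np.asarray(deviation, dtype=float))

    def follows(self, operator_id, deviation):
        """Subtract the operator's habitual offset before testing the deviation."""
        past = self.history.get(operator_id, [])
        habitual_offset = np.mean(past, axis=0) if past else 0.0
        corrected = np.asarray(deviation, dtype=float) - habitual_offset
        return float(np.linalg.norm(corrected)) <= self.base_tolerance
```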
  • FIG. 30 is a flowchart showing an example of the operation of the gesture operation device 31.
  • Steps S501 to S506 in FIG. 30 correspond to steps S201 to S206 in FIG. 13, and steps S509 and S510 in FIG. 30 correspond to steps S207 and S208 in FIG. 13; description of these steps is therefore omitted.
  • Hereinafter, steps S507 and S508 will be described.
  • In step S507, the state acquisition unit 29 acquires, as the state of the operator 5, the reaction of the operator 5 to the result determined by the operation determination unit 9. Specifically, when the operation determination unit 9 determines in step S505 that the gesture operation by the operator 5 is not an operation according to the operation objects shown in the tutorial and notification of the failure is given in step S506, the state detection device 30 detects the reaction of the operator 5 and the state acquisition unit 29 acquires the detected reaction.
  • The reaction of the operator 5 includes the facial expression or movement of the operator 5.
  • In step S508, the operation history information storage unit 32 associates the determination result by the operation determination unit 9 with the reaction of the operator 5 acquired by the state acquisition unit 29 and stores them as operation history information.
  • The operation history information stored in the operation history information storage unit 32 may be stored for each operator 5.
  • Each operator 5 may be identified based on, for example, an image of the face of the operator 5 detected by the state detection device 30.
  • In the subsequent step S505, the operation determination unit 9 can use the operation history information stored in the operation history information storage unit 32 to determine whether the gesture operation by the operator 5 is an operation according to the operation objects shown in the tutorial. The operation determination unit 9 can also perform machine learning using the accumulated operation history information. Thereby, even if the gesture operation by the operator 5 differs slightly from the operation objects shown in the tutorial, a determination that takes the operator 5's gesture habits into account can be made.
  • Although FIGS. 29 and 30 describe the case of displaying a tutorial, the same applies to the case of displaying an operation object that guides the single gesture operation the operator 5 should perform next when an event occurs.
  • In this case, the operation determination unit 9 may not perform the process of step S405 in FIG. 29 or step S505 in FIG. 30.
  • As described above, when determining whether the gesture operation by the operator 5 follows the operation objects shown in the tutorial, a determination that takes the operator 5's gesture habits into account can be made.
  • FIG. 31 is a block diagram showing an example of the configuration of the gesture operation device 33 according to the fifth embodiment of the present invention.
  • As shown in FIG. 31, the gesture operation device 33 according to the fifth embodiment is characterized in that the communication unit 7 includes an external information acquisition unit 34.
  • the other configuration and operation are the same as in the first embodiment, and thus detailed description will be omitted here.
  • the external information acquisition unit 34 acquires various external information from the outside.
  • the external information include information related to the operator 5, traffic information, weather information, and the like.
  • Information related to the operator 5 includes, for example, the schedule of the operator 5, contact information, favorite music, information on SNS (Social Networking Service), traffic information or weather information for a set destination, POI (Point Of Interest) information, purchasing behavior information, information on home appliances connected to the home network, and the like.
  • For example, the event detection unit 2 detects, as an event, that the external information acquisition unit 34 has acquired information on a home appliance. Then, as shown in FIG. 32, the display control unit 3 performs control to display on the display device 4 an operation object 21 indicating an operation of turning the power of an air conditioner installed on the first floor of the operator 5's home on or off. For example, an operation object may be displayed such that the power is turned on and off alternately each time the hand icon presses the switch icon. As a result, the operator 5 can easily grasp in which room of the house the home appliance to be operated is located, and can accurately understand what kind of gesture operation should be performed.
  • Therefore, the gesture operation device 33 can accurately determine the gesture operation performed by the operator 5.
  • FIG. 33 is a block diagram showing an example of the configuration of the gesture operation device 35 according to the sixth embodiment of the present invention.
  • As shown in FIG. 33, the gesture operation device 35 according to the sixth embodiment is characterized in that the device control unit 10 includes a vehicle information acquisition unit 36.
  • the other configuration and operation are the same as in the first embodiment, and thus detailed description will be omitted here.
  • The vehicle information acquisition unit 36 acquires vehicle information from various control devices, which are in-vehicle devices 15, provided in the vehicle. Vehicle information is various information regarding the vehicle.
  • For example, the event detection unit 2 detects, as an event, that the vehicle information acquisition unit 36 has acquired vehicle information. Then, as shown in FIG. 34, the display control unit 3 performs control to display on the display device 4 an operation object 21 indicating an operation of closing the trunk of the vehicle.
  • Thus, the operator 5 can correctly understand the gesture operation for closing the trunk of the vehicle.
  • As another example, the event detection unit 2 detects, as an event, that the terminal communication unit 11 has acquired information on the mobile communication terminal 12.
  • In this case, the display control unit 3 controls the display device 4 to display an operation object 21 indicating an operation of switching the display of the information related to the mobile communication terminal 12.
  • Thus, the operator 5 can accurately understand the gesture operation for controlling the in-vehicle device 15 or the mobile communication terminal 12. Therefore, the gesture operation device 35 can accurately determine the gesture operation performed by the operator 5.
  • The gesture operation device described above can be applied not only to an on-vehicle navigation device, that is, a car navigation device, but also to a PND (Portable Navigation Device) that can be mounted on a vehicle, to a navigation device constructed as a system by appropriately combining it with a server provided outside the vehicle, and to devices other than navigation devices.
  • In that case, each function or each constituent element of the gesture operation device is distributed among the functions that construct the system.
  • For example, the functions of the gesture operation device can be arranged on a server.
  • the user side includes the display device 4, the mobile communication terminal 12, the gesture operation detection device 13, the voice input device 14, the in-vehicle device 15, and the voice output device 16.
  • the server 37 includes an event detection unit 2, a display control unit 3, a communication unit 7, a gesture operation acquisition unit 8, an operation determination unit 9, a device control unit 10, and a terminal communication unit 11.
  • The same applies to the gesture operation device 26 shown in FIG. 25, the gesture operation device 28 shown in FIG. 27, the gesture operation device 31 shown in FIG. 28, the gesture operation device 33 shown in FIG. 31, and the gesture operation device 35 shown in FIG. 33.
  • In addition, software for executing the operations in the above embodiments may be incorporated into, for example, a server.
  • The gesture operation method realized by the server executing this software detects an event and performs control to display in three dimensions an operation object that guides a gesture operation indicating the intention of the operator with respect to an operation that the operator can perform to advance the detected event. A hedged sketch of such a server-side arrangement follows.
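The sketch below illustrates, under stated assumptions, how the server-side arrangement could be split: the vehicle forwards detected events and acquired gesture data, and the server decides what operation object to display and whether the gesture follows the guidance. All message fields, function names, and the JSON protocol are illustrative and are not defined by the patent.

```python
import json

GUIDANCE = {"mail": "unread_mail_tutorial", "call": "answer_call_object"}

def handle_request(message: str) -> str:
    """Hypothetical server-side entry point for the gesture operation device."""
    request = json.loads(message)
    if request["type"] == "event":
        obj = GUIDANCE.get(request["kind"], "generic_guidance")
        return json.dumps({"type": "show_object_3d", "object": obj})
    if request["type"] == "gesture":
        # A stand-in check: compare the reported gesture direction with the
        # direction the displayed operation object guided.
        follows = request["direction"] == request["guided_direction"]
        return json.dumps({"type": "determination", "follows_object": follows})
    return json.dumps({"type": "error", "reason": "unknown request type"})

# Example exchange: the vehicle reports an incoming mail, then a forward-moving gesture.
print(handle_request(json.dumps({"type": "event", "kind": "mail"})))
print(handle_request(json.dumps({"type": "gesture", "direction": "forward",
                                 "guided_direction": "forward"})))
```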
  • each embodiment can be freely combined, or each embodiment can be appropriately modified or omitted.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention aims to provide a gesture operation device and a gesture operation method capable of accurately determining a gesture operation performed by an operator. The gesture operation device according to the present invention comprises an event detection unit that detects an event, and a display control unit that performs control to display a three-dimensional operation object in order to guide the operator to a gesture operation that the operator can perform to cause the event detected by the event detection unit to progress and that indicates an intention of the operator.
PCT/JP2018/002242 2018-01-25 2018-01-25 Gesture operation device and gesture operation method WO2019146032A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2018/002242 WO2019146032A1 (fr) 2018-01-25 2018-01-25 Gesture operation device and gesture operation method
JP2019567457A JP6900133B2 (ja) 2018-01-25 2018-01-25 Gesture operation device and gesture operation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/002242 WO2019146032A1 (fr) 2018-01-25 2018-01-25 Gesture operation device and gesture operation method

Publications (1)

Publication Number Publication Date
WO2019146032A1 true WO2019146032A1 (fr) 2019-08-01

Family

ID=67395344

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/002242 WO2019146032A1 (fr) 2018-01-25 2018-01-25 Gesture operation device and gesture operation method

Country Status (2)

Country Link
JP (1) JP6900133B2 (fr)
WO (1) WO2019146032A1 (fr)



Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102099728A (zh) * 2008-07-15 2011-06-15 株式会社Ip舍路信 裸眼立体画面显示系统、裸眼立体画面显示装置、游戏机、视差屏障薄片
US8970484B2 (en) * 2010-07-23 2015-03-03 Nec Corporation Three dimensional display device and three dimensional display method
CN107148614B (zh) * 2014-12-02 2020-09-08 索尼公司 信息处理设备、信息处理方法和程序

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008089985A (ja) * 2006-10-02 2008-04-17 Pioneer Electronic Corp 画像表示装置
JP2009089068A (ja) * 2007-09-28 2009-04-23 Victor Co Of Japan Ltd 電子機器の制御装置、制御方法及び制御プログラム
WO2012011263A1 (fr) * 2010-07-20 2012-01-26 パナソニック株式会社 Dispositif d'entrée de gestes et procédé d'entrée de gestes
JP2013211712A (ja) * 2012-03-30 2013-10-10 Sony Corp 出力制御装置、出力制御方法、及びプログラム
JP2017501500A (ja) * 2013-09-17 2017-01-12 アマゾン テクノロジーズ インコーポレイテッド 三次元オブジェクト表示のためのアプローチ
JP2017027401A (ja) * 2015-07-23 2017-02-02 株式会社デンソー 表示操作装置

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11531459B2 (en) 2016-05-16 2022-12-20 Google Llc Control-article-based control of a user interface
US11841933B2 (en) 2019-06-26 2023-12-12 Google Llc Radar-based authentication status feedback
US12093463B2 (en) 2019-07-26 2024-09-17 Google Llc Context-sensitive control of radar-based gesture-recognition
US11868537B2 (en) 2019-07-26 2024-01-09 Google Llc Robust radar-based gesture-recognition by user equipment
US11288895B2 (en) 2019-07-26 2022-03-29 Google Llc Authentication management through IMU and radar
US11360192B2 (en) 2019-07-26 2022-06-14 Google Llc Reducing a state based on IMU and radar
US11385722B2 (en) 2019-07-26 2022-07-12 Google Llc Robust radar-based gesture-recognition by user equipment
US11790693B2 (en) 2019-07-26 2023-10-17 Google Llc Authentication management through IMU and radar
US11402919B2 (en) 2019-08-30 2022-08-02 Google Llc Radar gesture input methods for mobile devices
US11687167B2 (en) 2019-08-30 2023-06-27 Google Llc Visual indicator for paused radar gestures
US11467672B2 (en) 2019-08-30 2022-10-11 Google Llc Context-sensitive control of radar-based gesture-recognition
US11281303B2 (en) 2019-08-30 2022-03-22 Google Llc Visual indicator for paused radar gestures
US12008169B2 (en) 2019-08-30 2024-06-11 Google Llc Radar gesture input methods for mobile devices
KR20210040243A (ko) * 2019-10-03 2021-04-13 구글 엘엘씨 전자 디바이스와 인터렉션하기 위해 레이더 제스처를 사용하는 것에서 사용자 숙련도 지원
KR102320754B1 (ko) * 2019-10-03 2021-11-02 구글 엘엘씨 전자 디바이스와 인터렉션하기 위해 레이더 제스처를 사용하는 것에서 사용자 숙련도 지원

Also Published As

Publication number Publication date
JP6900133B2 (ja) 2021-07-07
JPWO2019146032A1 (ja) 2020-07-02

Similar Documents

Publication Publication Date Title
WO2019146032A1 (fr) Gesture operation device and gesture operation method
US10123300B2 (en) Tactile feedback in an electronic device
EP3000013B1 (fr) Télécommande tactile multipoint interactive
JP5694204B2 (ja) グラフィカルユーザインターフェース装置においてテクスチャを用いるためのシステム及び方法
WO2009131089A1 (fr) Terminal d'informations portable, programme lisible par ordinateur et support d'enregistrement
US20170068418A1 (en) Electronic apparatus, recording medium, and operation method of electronic apparatus
WO2014208691A1 (fr) Appareil portable, et procédé de commande de celui-ci
EP3657311B1 (fr) Appareil comprenant un écran tactile et son procédé de changement d'écran
JP2010003307A (ja) 携帯情報端末、コンピュータ読取可能なプログラムおよび記録媒体
JP6747835B2 (ja) 画像表示装置
US20150253887A1 (en) Information processing apparatus
JP2019175449A (ja) 情報処理装置、情報処理システム、移動体、情報処理方法、及びプログラム
JP2015170282A (ja) 車両用操作装置
JP6078375B2 (ja) 電子機器及び制御プログラム並びに電子機器の動作方法
US9733725B2 (en) Control unit, input apparatus and method for an information and communication system
KR101511118B1 (ko) 분할 화면 표시 장치 및 그 방법
KR20150009695A (ko) 어플리케이션 운용 방법 및 그 전자 장치
KR101422003B1 (ko) 단말기의 메뉴 표시 방법 및 이를 이용한 단말기
US9582150B2 (en) User terminal, electronic device, and control method thereof
KR102117450B1 (ko) 디스플레이 장치 및 그 제어 방법
JP6046562B2 (ja) 携帯機器、携帯機器の制御方法およびプログラム
JP5227356B2 (ja) 情報端末および情報入力方法
US20200272325A1 (en) Input control device, input device, and input control method
JP2016063366A (ja) 携帯表示端末
JP2014164388A (ja) 情報呈示装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18901876

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019567457

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18901876

Country of ref document: EP

Kind code of ref document: A1