JP2006209563A - Interface device - Google Patents

Interface device

Info

Publication number
JP2006209563A
Authority
JP
Japan
Prior art keywords
user
operation
movement
detection means
target part
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2005022365A
Other languages
Japanese (ja)
Inventor
Masato Horiuchi
正人 堀内
Original Assignee
Victor Co Of Japan Ltd
日本ビクター株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Victor Co Of Japan Ltd, 日本ビクター株式会社 filed Critical Victor Co Of Japan Ltd
Priority to JP2005022365A
Publication of JP2006209563A
Application status: Pending

Abstract

PROBLEM TO BE SOLVED: To provide an interface device that allows a user to operate an electronic apparatus intuitively and simply, without requiring any operating device.

SOLUTION: This interface device for operation input by a user of an electronic apparatus having a display device 11 is provided with: an imaging means for photographing the user M facing the display screen; a target part detection means for detecting a specific target part W (a palm) from the moving image of the photographed user M; a position information detection means for detecting position information about the detected target part W; a motion information detection means for detecting the direction and the speed or acceleration of the movement of the target part W from the detected position information; an operation command output means for outputting to the electronic apparatus, in accordance with the movement of the target part W, an operation command corresponding to the detected motion information; and a display control means for displaying on the display screen of the display device 11 the state transition of the electronic apparatus caused by the operation command.

COPYRIGHT: (C)2006,JPO&NCIPI

Description

  The present invention belongs to the technical field of interface devices for electronic apparatus that include a display device, such as personal computer systems and television systems.

  As consumer electronic devices such as personal computer systems and television/video systems gain functionality, their operating procedures are becoming increasingly complex. In the operating environment of consumer electronics in particular, an infrared remote controller is used as an alternative to the operating devices (pointing devices), typified by the mouse, that are common in computer environments, and manipulating a graphical user interface (GUI) with it has become the conventional means of operation.

  However, the operating environment of such a GUI is not always good; the operation is complicated and hard to understand, particularly for the elderly. In addition, there is the annoyance that an infrared remote controller must always be kept at hand for each electronic device to be operated.

  As one way of overcoming these drawbacks of interface devices in conventional electronic equipment, methods have been proposed that use the movement of the user's hand (hand gestures) to operate the electronic device without an operating device such as the mouse or the infrared remote controller.

  For example, [Patent Document 1] below discloses, as shown in FIG. 6, an interface device for a host computer 42 in which the shape and movement of the hand of a user M facing a CCD camera 41 attached to a display device 44 are captured and recognized by the image processing device of the host computer 42. An icon 43 of a specific shape (an arrow cursor indicating pointing, etc.) corresponding to the hand of the user M is displayed on the display screen of the display device 44, and by moving the hand vertically and horizontally, perpendicular to the optical axis of the CCD camera 41, as indicated by the hatched arrows F1 and F2 in FIG. 6, the user M can select with the icon 43 one of the virtual switches 45, 46 and 47 displayed on that screen, whereupon an operation command is output to the electronic device such as the host computer 42.

JP-A-8-44490

  In the interface device disclosed in [Patent Document 1], the position information of the hand of the user M, which serves as the operating means, is detected and plotted as an icon 43 in the X-Y coordinate system of the operation menu screen of the display device 44; objects such as the virtual switches 45, 46 and 47 displayed in that X-Y coordinate system are then operated by means of the icon.

  In other words, by recognizing the shape of the hand of the user M, the hand takes the place of a conventional mouse and the function of a pointing device is realized. This style of operation, in which the corresponding icon (cursor) is moved about the screen, is nothing more than an indirect transmission of the operation command from the hand to the electronic device through the icon (cursor) and the displayed object.

  As a result, although the hand of the user M replaces the mouse and no operating device is required, it is often still difficult to operate the electronic device intuitively. For example, when a large number of virtual-switch objects 45, 46, ... are displayed as options on the display device 44 of FIG. 6 and arranged both vertically and horizontally, precise and detailed operation of the icon 43 by the hand of the user M, used as a mouse, is required.

  Further, with respect to the positional relationship along the optical axis between the target part (the hand of the user M) and the CCD camera 41 serving as the imaging means, and the correspondence between the distance the target part moves in real space and the distance the icon (cursor) moves on the display device 44, fine adjustment is needed to adapt to individual differences in hand gestures so that the behaviour matches the intuition of the user M.

  The present invention has been made in view of the above circumstances. Its object is to provide an interface device for operating an electronic device, without an operating device, by gestures of a part of the body such as the user's hand, and to realize simple operation better adapted to the user's intuitive sense of operation by further reducing the indirectness between the mapping of the hand onto a coordinate system and the output of operation commands in response to the user's actions.

The present invention solves the above-mentioned problem by providing the following.
(1) An interface device for operation input by a user M in an electronic apparatus including a display device 11, comprising: an imaging means (12) for imaging the user M facing the display screen of the display device 11; a target part detection means (22) for detecting a specific target part W from the moving image of the user M captured by the imaging means (12); a position information detection means (24) for detecting position information of the target part W detected by the target part detection means (22); a motion information detection means (25) for detecting, as motion information, the direction of movement and the speed or acceleration of the target part W based on the position information detected by the position information detection means (24); an operation command output means (26) for outputting to the electronic device, in accordance with the movement of the target part W, an operation command corresponding to the motion information detected by the motion information detection means (25); and a display control means (27) for displaying on the display screen of the display device 11 the state transition of the electronic device caused by the operation command.
(2) The interface device according to (1), wherein the motion information detection means (25) detects the motion information in a three-dimensional space that includes movement of the target part W in orthogonal directions on a plane parallel to the display screen of the display device 11 and movement in a direction perpendicular to the display screen.

  Note that the numbers in parentheses in the above means correspond to the reference numerals given to the specific members described in this specification.

Since the interface device according to the present invention is configured as described above, the following effects are obtained.
(1) In an electronic device equipped with a display device, it provides an interactive interface environment for user operation input in which operation commands are output to the electronic device by simpler, more natural actions matching the user's intuition, without using an operating device such as a mouse or infrared remote controller.
(2) It can be applied to electronic devices including all information processing systems configured to receive operation commands through a display device.

  An embodiment of an interface device according to the present invention will be described with reference to the drawings.

  FIG. 1 is a diagram for explaining the sequence (initial state) of the object selection operation in the interface device according to the first embodiment of the present invention. FIG. 2 is a diagram for explaining the sequence (state after transition) of the object selection operation in the same interface device. FIG. 3 is a diagram for explaining the sequence of the object confirmation operation in the same interface device. FIG. 4 is a diagram for explaining the sequence of the object selection operation in the interface device according to the second embodiment of the present invention. FIG. 5 is a block configuration diagram of the interface device that realizes the first and second embodiments of the present invention.

  First, the first embodiment, which is the basic form of the present invention, will be described.

  In FIG. 1, reference numeral 11 denotes a display device such as a CRT, liquid-crystal, PDP or organic EL display, attached to or built into an electronic apparatus typified by an information processing system such as a personal computer system or a television system; M denotes the user facing the display screen of the display device 11; 12 denotes an imaging camera, incorporated in the display device 11 or in the electronic apparatus main body (not shown), serving as the imaging means that captures the user M facing the display screen; and 14a, 14b and 14c denote objects on the operation menu screen shown on the display screen of the display device 11.

  The main circuit block of the interface device of the present invention is housed inside the electronic apparatus and has the system configuration shown in FIG. 5. That is, it comprises: the imaging camera 12 incorporated in the display device 11 as the imaging means that captures a moving image of the user M facing the display screen of the display device 11; a frame memory 21, such as a DRAM, that temporarily stores the captured data; a target part extraction unit 22, serving as the target part detection means, that extracts a specific target part W (preferably the user's palm) by recognizing its shape in the moving image of the user M captured by the imaging means; a position information detection unit 24, serving as the position information detection means, that detects the position information of the target part W detected by the target part detection means; a motion information detection unit 25, serving as the motion information detection means, that detects, from the change over time of the position information of the target part W, the direction of movement and the speed or acceleration of the target part W (the palm of the user M) as motion information; an operation command generation output unit 26, serving as the operation command output means, that generates an operation command corresponding to the motion information of the target part W detected by the motion information detection means and outputs it to the electronic apparatus (the control unit of the electronic apparatus main body) in accordance with the movement of the target part W, the units 22, 24, 25 and 26 together constituting the image information processing device 23; and a display control means 27 that, as part of the instructions given by the operation command, displays the state transition of the electronic apparatus on the display screen of the display device 11 as feedback information on the result of the operation.
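  As an aid to understanding the block configuration above, the following is a minimal sketch, under stated assumptions, of how the chain from extraction (22) through position detection (24), motion detection (25), command generation (26) and display control (27) could be wired per captured frame. It is not the patent's implementation; the class, the injected extractor and display objects, and the numeric threshold are illustrative assumptions.

```python
from typing import Optional, Tuple

class InterfacePipeline:
    def __init__(self, extractor, display):
        self.extractor = extractor   # plays the role of the target part extraction unit (22)
        self.display = display       # plays the role of the display control means (27)
        self.prev: Optional[Tuple[Tuple[float, float], float]] = None  # (centroid, timestamp)

    def process_frame(self, frame, t: float) -> None:
        centroid = self.extractor.find_palm(frame)      # (22): e.g. centroid of a skin-colour blob
        if centroid is None:
            self.prev = None
            return
        if self.prev is not None:                       # (24)/(25): direction and speed over time
            (px, py), pt = self.prev
            dt = max(t - pt, 1e-6)
            vx, vy = (centroid[0] - px) / dt, (centroid[1] - py) / dt
            command = self.to_command(vx, vy)           # (26): map motion to an operation command
            if command is not None:
                self.display.apply(command)             # (27): show the resulting state transition
        self.prev = (centroid, t)

    @staticmethod
    def to_command(vx: float, vy: float) -> Optional[str]:
        # Illustrative rule: a fast, predominantly horizontal stroke shifts the object band.
        if abs(vx) > 400 and abs(vx) > 2 * abs(vy):
            return "shift_band_right" if vx > 0 else "shift_band_left"
        return None
```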

  Here, for the image information processing device 23 comprising the target part extraction unit 22, the position information detection unit 24, the motion information detection unit 25 and the operation command generation output unit 26, the hardware of an automatic tracking device using a video camera, such as one mounted on a humanoid robot, can be reused. In that case, the shape and colour information of the specific target part are stored in advance, the target part matching the stored shape and colour can be recognized and extracted from the captured image, and its movement can be tracked.
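  For the recognition based on "shape and colour stored in advance" mentioned above, one plausible (assumed) realisation is normalised cross-correlation against a stored palm template. The OpenCV call below is used purely as an illustration, and the threshold value is an arbitrary assumption.

```python
import cv2

def locate_palm_template(frame_gray, palm_template_gray, threshold=0.7):
    """Return the top-left corner of the best template match, or None below threshold."""
    result = cv2.matchTemplate(frame_gray, palm_template_gray, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    return max_loc if max_val >= threshold else None
```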

  Next, the operation by which the user M selects and confirms an object on the display screen of the display device 11 is performed as follows.

  First, as shown in FIG. 1, with the user M facing the display device 11, the state in which the palm, which naturally becomes the target part W for operation instructions, is raised is taken as the reference position. For example, by holding the raised palm still for a few seconds (about 2 to 3 seconds is appropriate), the reference position is recognized by the target part extraction unit 22 and the position information detection unit 24 and is input to the electronic apparatus main body. As a result, the menu operation screen is brought up on the screen of the display device 11 by the display control means 27, and the object placed at approximately its centre, that is, the object 14b displayed as "No. 2" in FIG. 1 which is currently in the confirmed or selected state, is taken to be at the reference position and is highlighted as selected in correspondence with the palm.
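  The recognition of the reference position by holding the palm still for a few seconds can be thought of as a dwell test. The sketch below is an illustration under assumed values, not the patented method: the palm is treated as at rest while its centroid stays within a small radius, and the reference position is registered once that rest has lasted roughly 2 to 3 seconds.

```python
import time

class DwellDetector:
    """Registers the reference position after the palm rests for ~2-3 s (assumed values)."""

    def __init__(self, hold_seconds=2.5, radius_px=20.0):
        self.hold_seconds = hold_seconds
        self.radius_px = radius_px
        self.anchor = None          # centroid where the current rest started
        self.anchor_time = None

    def update(self, centroid, now=None):
        """centroid: (x, y) of the detected palm. Returns True once the dwell completes."""
        now = time.monotonic() if now is None else now
        if self.anchor is None:
            self.anchor, self.anchor_time = centroid, now
            return False
        dx = centroid[0] - self.anchor[0]
        dy = centroid[1] - self.anchor[1]
        if dx * dx + dy * dy > self.radius_px ** 2:
            # the palm moved too far: restart the dwell at the new position
            self.anchor, self.anchor_time = centroid, now
            return False
        return (now - self.anchor_time) >= self.hold_seconds
```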

  Second, starting from the above state, when the user M wishes, for example, to select and confirm the "No. 1" object 14a on the left instead of the "No. 2" object 14b, the sequence of operations is as follows. The user M first slides the palm quickly to the right with respect to the screen of the display device 11 (in the direction of the thick solid arrow 15), then relaxes the hand and slowly returns it to the reference position (in the direction of the thick broken arrow 16). During the return movement along the broken arrow 16, it is preferable not to deliberately show the palm to the imaging camera 12. After the return movement along the broken arrow 16 is finished, the palm is raised again at the reference position and held still.

  Third, from the series of palm movements of the user M from the arrow 15 to the arrow 16, that is, from the direction of the palm movement and its speed or acceleration, it is determined that the command is an operation command to shift the displayed objects to the right by one unit (one object), and the display control means 27 changes the content of the display screen as shown in FIG. 2. That is, in response to the palm movement of the arrow 15, the objects on the display screen of the display device 11 are also moved in the same direction as the arrow 15: in place of the "No. 2" object 14b at the display position corresponding to the reference position, the "No. 1" object 14a is displayed at the centre of the screen, at the reference position, and enters the selected state. At the original position of the "No. 1" object 14a, the "No. 0" object 14z, previously off-screen to the left, appears, while the "No. 3" object 14c on the right is shifted further to the right and disappears from the screen.
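  The asymmetry the above sequence relies on, namely that only the quick stroke of arrow 15 produces a command while the slow return of arrow 16 is ignored, could be captured with a simple speed threshold. The thresholds below are illustrative assumptions, not values taken from the patent.

```python
FAST_SPEED = 500.0   # px/s; illustrative threshold for the quick stroke (arrow 15)

def classify_horizontal_gesture(vx, vy):
    """vx, vy: palm centroid velocity in px/s, +x toward the right of the screen."""
    if abs(vx) >= FAST_SPEED and abs(vx) > 2 * abs(vy):
        # the object band moves with the hand: a rightward stroke brings the left
        # neighbour (e.g. the "No. 1" object) to the central reference position
        return "shift_band_right" if vx > 0 else "shift_band_left"
    return None   # slow motions, e.g. the hand returning along arrow 16, are ignored
```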

  Fourth, as shown in FIG. 3, in order to confirm the selected object the user M performs a specific action toward the display device 11, for example sliding the palm in the depth direction (the Z-axis direction) toward the screen (solid arrow 17). Intuitively speaking, it is the image of pushing the spread palm toward the display device 11 that one is facing.

  Fifth, the motion information detection unit 25 detects the palm movement of the arrow 17 of the user M and determines that it is a confirmation operation. An operation command (instruction information) corresponding to the transition of the "No. 1" object 14a from the selected state to the confirmed state is generated and output to the main body of the electronic device; the operation command intended by the user M is thus input to the control unit of the electronic device, and the display control means 27 displays the resulting state transition, that is, the confirmation of the "No. 1" object, on the operation screen of the display device 11 as feedback information on the result of the operation, completing the state corresponding to the object 14a.
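  The patent does not specify how the push along the Z axis (arrow 17) is measured with a single camera; one common proxy, assumed here purely for illustration, is that the apparent area of the palm region grows rapidly as the hand approaches the lens.

```python
def is_push_gesture(areas, grow_ratio=1.4, window=5):
    """areas: apparent palm area (pixels^2) for recent frames, oldest first.
    Returns True when the area has grown by ~40% within the last `window` frames,
    which is taken here (as an assumption) to indicate a push toward the camera."""
    if len(areas) < window:
        return False
    recent = areas[-window:]
    return recent[-1] >= grow_ratio * recent[0]
```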

  As described above, the selection operation is performed with the feeling that the user M, looking at the display device 11, moves by hand the band of objects numbered ..., X-2, X-1, X, X+1, X+2, ... (X a natural number), arranged continuously left and right on the operation screen of the display device 11, so that the numbers shift one at a time to the left or right and the object with the desired number comes to the central reference position. The confirmation operation is performed as if pressing the selected object at the central reference position of the screen like a push-button switch. These operations match the palm movements of the user M well and can be said to give a natural feel of operation that is intuitive and easy to understand even for the elderly.
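  The band of numbered objects with its central reference position can be modelled as a simple selection index, as in the illustrative sketch below (names and structure are assumptions, not the patent's design).

```python
class ObjectBand:
    """A band of numbered objects; the selected one is always at the central reference position."""

    def __init__(self, labels):
        self.labels = list(labels)             # e.g. ["No. 0", "No. 1", "No. 2", "No. 3"]
        self.selected = len(self.labels) // 2  # start at the centre object

    def shift(self, direction):
        """direction=+1: the band slides right, so the left neighbour becomes selected."""
        self.selected = max(0, min(len(self.labels) - 1, self.selected - direction))

    def confirm(self):
        """Return the label of the object confirmed by the push gesture."""
        return self.labels[self.selected]

band = ObjectBand(["No. 0", "No. 1", "No. 2", "No. 3"])
band.shift(+1)         # rightward stroke: "No. 1" comes to the reference position
print(band.confirm())  # -> "No. 1"
```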

  Next, a second embodiment will be described with reference to FIG. 4. The system configuration is the same as that shown in FIG. 5 and described for the first embodiment, and the position at which the user M holds the palm still while facing the display device 32 provided with the imaging camera 12 is set as the reference position.

  Taking the state of rest at this reference position as the initial state, the procedure for selecting and confirming an object presented as a menu item on the two-dimensional operation menu screen displayed on the display device 32 is described below.

  The reference position corresponds to the central area of the display device 32 (the position of the object 33 in FIG. 4). The plurality of objects are modelled as if mapped side by side onto the surface of a huge virtual sphere 31 that includes the display surface, and the object selection operation consists of rotating the virtual sphere 31 vertically, horizontally or diagonally, for example, so that the object to be selected becomes visible in the "window" formed by the display screen of the display device.

  If the menu items pasted onto the virtual spherical surface 31 are arranged not only horizontally but also vertically, one of many objects can easily be selected by panning the "window" across the two-dimensional menu area.

  Further, by changing the size of the virtual spherical surface 31 to which the menu items are attached, the resolution of the information shown in the "window" of the display screen of the display device 32 can be controlled. If the rotating virtual spherical surface 31 is made smaller, global information can be shown in the "window"; if it is made larger, detailed information can be shown. In this regard, a complicated, cluttered operation menu is often unnecessary for the elderly; in such cases it is conceivable to select in advance a display setting that shows only a limited, minimum set of operation menu objects, displayed large in the window of the display screen.
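  The relationship between the size of the virtual sphere 31 and the resolution shown in the "window" can be illustrated with a small calculation: for a window of fixed physical width, the angular span it covers is roughly the width divided by the sphere radius, so a smaller sphere shows more items at once (global view) and a larger sphere shows fewer, enlarged items (detail view). The numbers below are illustrative assumptions.

```python
import math

def visible_columns(window_width_m, sphere_radius_m, item_spacing_deg):
    """How many columns of menu items fit across the display 'window' (illustrative model)."""
    window_angle = window_width_m / sphere_radius_m           # radians subtended by the window
    return max(1, int(window_angle / math.radians(item_spacing_deg)))

# Example with assumed numbers: a 0.4 m wide window, menu items every 10 degrees.
# sphere radius 1.0 m  -> 2 columns visible (detail view)
# sphere radius 0.25 m -> 9 columns visible (global view)
```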

  Incidentally, as the target part detection means, a known video-camera based recognition apparatus for a specific target part can be applied. In particular, when the target part W is a palm or the like, as in the present embodiment, it is preferable to use a skin-colour filter so that the shape of the same-coloured region in the moving image is recognized and the difference between its shape in the steady state and its shape in a non-steady state can be discriminated.
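  One way to realise such a skin-colour filter, offered here as an assumption rather than the patent's specified method, is to threshold the frame in HSV space with OpenCV, take the largest connected blob as the candidate palm, and return its centroid and apparent area (the centroid feeds the position information detection; the area can serve the push detection sketched earlier).

```python
import cv2
import numpy as np

LOWER_SKIN = np.array([0, 40, 60], dtype=np.uint8)      # illustrative HSV lower bound
UPPER_SKIN = np.array([25, 180, 255], dtype=np.uint8)   # illustrative HSV upper bound

def find_palm(frame_bgr):
    """Return ((cx, cy), area) of the largest skin-coloured blob, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_SKIN, UPPER_SKIN)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4.x
    if not contours:
        return None
    palm = max(contours, key=cv2.contourArea)
    m = cv2.moments(palm)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"]), cv2.contourArea(palm)
```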

  Further, as the specific target part W, a part of the body other than the palm of the user M, such as a fingertip (for example the index finger) or the head, may be used as the target part, and objects can likewise be selected and confirmed by swinging it from side to side and moving it back and forth.

  Further, the number of imaging cameras is not limited to one; two or three may be used. Recognizing the target part W with a plurality of imaging cameras arranged to the left and right of the display device 11 provides more detailed and accurate information for identifying the target part and for tracking its three-dimensional movement; although the analysis becomes more complex, this realizes a stable interface device that is reliable and rarely misoperates.
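  If two cameras are used as described, depth can in principle be recovered by triangulation. The standard rectified-stereo relation Z = f * B / d (focal length in pixels, baseline between the cameras, horizontal disparity of the palm centroid between the two images) is a textbook formula rather than something the patent specifies; the helper below simply evaluates it.

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Rectified-stereo depth of the palm: Z = f * B / d, in metres."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a point in front of both cameras")
    return focal_px * baseline_m / disparity_px
```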

  In addition, one hand gesture (one operation) basically moves the objects by one unit, as in the above embodiments, but depending on the specific operation, the objects may be made to move up, down, left or right on the operation screen continuously or intermittently.

  In both of the above embodiments, the objects displayed simultaneously on the display device are arranged as three objects in one row of three columns, but it is also possible to display, for example, a total of nine objects in three rows and three columns, or a total of 25 objects in five rows and five columns, at the same time.

FIG. 1 is a diagram for explaining the sequence (initial state) of the object selection operation in the interface device of the first embodiment of the present invention.
FIG. 2 is a diagram for explaining the sequence (state after transition) of the object selection operation in the interface device of the first embodiment of the present invention.
FIG. 3 is a schematic diagram for explaining the sequence of the object confirmation operation in the interface device of the first embodiment of the present invention.
FIG. 4 is a diagram for explaining the sequence of the object selection operation in the interface device of the second embodiment of the present invention.
FIG. 5 is a block configuration diagram of the interface device that realizes the first and second embodiments of the present invention.
FIG. 6 is a diagram explaining the structure of the interface device disclosed in the known literature.

Explanation of symbols

11, 32, 44  Display device
12  Imaging camera
14a, 14b, 14c  Object
15  Object selection operation arrow
16  Object selection operation arrow
17  Object confirmation operation arrow
21  Frame memory
22  Target part extraction unit
23  Image information processing device
24  Position information detection unit
25  Motion information detection unit
26  Operation command generation output unit
27  Display control means
31  Virtual spherical surface
33  Object selected at the reference position
41  CCD camera
42  Host computer
43  Icon
45, 46, 47  Virtual switch
M  User

Claims (2)

  1. An interface device for user operation input in an electronic device having a display device, the interface device comprising:
    Imaging means for imaging the user facing the display screen of the display device;
    Target part detection means for detecting a specific target part from the moving image of the user imaged by the imaging means;
    Position information detection means for detecting position information of the target part detected by the target part detection means;
    Motion information detection means for detecting, as movement information, the direction of movement and the speed or acceleration of the movement of the target part based on the position information detected by the position information detection means;
    Operation command output means for outputting an operation command corresponding to the movement information detected by the movement information detection means to the electronic device in accordance with the movement of the target part; and
    Display control means for displaying a state transition of the electronic device caused by the operation command on the display screen of the display device.
  2. The interface device according to claim 1, wherein the movement information detection means detects the movement information in a three-dimensional space that includes movement of the target part in orthogonal directions on a plane parallel to the display screen of the display device and movement in a direction perpendicular to the display screen.
JP2005022365A 2005-01-31 2005-01-31 Interface device Pending JP2006209563A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2005022365A JP2006209563A (en) 2005-01-31 2005-01-31 Interface device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2005022365A JP2006209563A (en) 2005-01-31 2005-01-31 Interface device

Publications (1)

Publication Number Publication Date
JP2006209563A true JP2006209563A (en) 2006-08-10

Family

ID=36966339

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2005022365A Pending JP2006209563A (en) 2005-01-31 2005-01-31 Interface device

Country Status (1)

Country Link
JP (1) JP2006209563A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08315154A (en) * 1995-02-21 1996-11-29 Mitsubishi Electric Res Lab Inc Gesture recognition system
JPH10255052A (en) * 1997-03-14 1998-09-25 Atr Chinou Eizo Tsushin Kenkyusho:Kk Gesture interface device
JP2000075991A (en) * 1998-08-28 2000-03-14 Aqueous Research:Kk Information input device
JP2003067108A (en) * 2001-08-23 2003-03-07 Hitachi Ltd Information display device and operation recognition method for the same
JP2003186596A (en) * 2001-12-13 2003-07-04 Seiko Epson Corp Display device and input method of the same

Cited By (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010515170A (en) * 2006-12-29 2010-05-06 ジェスチャー テック,インコーポレイテッド Manipulating virtual objects using an enhanced interactive system
US8102380B2 (en) 2007-08-30 2012-01-24 Kabushiki Kaisha Toshiba Information processing device, program and method to detect hand rotation gestures
JP2010176510A (en) * 2009-01-30 2010-08-12 Sanyo Electric Co Ltd Information display device
EP2241964A2 (en) 2009-04-14 2010-10-20 Sony Corporation Information processing apparatus, information processing method, and information processing program
EP2241964A3 (en) * 2009-04-14 2011-01-05 Sony Corporation Information processing apparatus, information processing method, and information processing program
US8947463B2 (en) 2009-04-14 2015-02-03 Sony Corporation Information processing apparatus, information processing method, and information processing program
JP2011003136A (en) * 2009-06-22 2011-01-06 Sony Corp Operation control device and method
JP2011145744A (en) * 2010-01-12 2011-07-28 Nintendo Co Ltd Information processing apparatus, information processing program, information processing system, and method of selecting object to be selected
US8917236B2 (en) 2010-01-12 2014-12-23 Nintendo Co., Ltd. Storage medium having information processing program stored therein, information processing apparatus, and information processing system
US9965028B2 (en) 2010-04-07 2018-05-08 Samsung Electronics Co., Ltd. Method for suspension sensing in interactive display, method for processing suspension sensing image, and proximity sensing apparatus
US8756508B2 (en) 2010-06-24 2014-06-17 Sony Corporation Gesture recognition apparatus, gesture recognition method and program
EP2400371A2 (en) 2010-06-24 2011-12-28 Sony Corporation Gesture recognition apparatus, gesture recognition method and program
US9134800B2 (en) 2010-07-20 2015-09-15 Panasonic Intellectual Property Corporation Of America Gesture input device and gesture input method
US9411410B2 (en) 2010-08-19 2016-08-09 Sony Corporation Information processing device, method, and program for arranging virtual objects on a curved plane for operation in a 3D space
US10241582B2 (en) 2010-08-19 2019-03-26 Sony Corporation Information processing device, information processing method, and program for graphical user interface
JP2012043195A (en) * 2010-08-19 2012-03-01 Sony Corp Information processor, information processing method, and program
JP2012053557A (en) * 2010-08-31 2012-03-15 Sony Corp Information processing unit, information processing method and program
US9836190B2 (en) 2010-12-22 2017-12-05 Jason Douglas Pickersgill Method and apparatus for restricting user operations when applied to cards or windows
US9990112B2 (en) 2010-12-22 2018-06-05 Thomson Licensing Method and apparatus for locating regions of interest in a user interface
JP2014504751A (en) * 2010-12-22 2014-02-24 トムソン ライセンシングThomson Licensing Method and apparatus for constraining user operations when applied to a card or window
US10514832B2 (en) 2010-12-22 2019-12-24 Thomson Licensing Method for locating regions of interest in a user interface
US9746931B2 (en) 2010-12-27 2017-08-29 Hitachi Maxell, Ltd. Image processing device and image display device
US9086726B2 (en) 2010-12-27 2015-07-21 Hitachi Maxell, Ltd. Image processing device and image display device
JP2012146304A (en) * 2011-01-06 2012-08-02 Samsung Electronics Co Ltd Display unit controlled by motion and motion control method for the same
JP2014509103A (en) * 2011-01-12 2014-04-10 ミュエシュトロ インターアクティーフェ ゲーエムベーハー Remote control device and interface module for controlling mechanism based on moving object
US9451237B2 (en) 2011-01-12 2016-09-20 Myestro Interactive Gmbh Remote control device for controlling a mechanism with the aid of a movable object and an interface module based on movement and distance of the movable object with respect to a camera
JP2012234317A (en) * 2011-04-28 2012-11-29 Ntt Docomo Inc Display device, control method for display device and program
JP2013037499A (en) * 2011-08-05 2013-02-21 Toshiba Corp Gesture recognition device, gesture recognition method, and gesture recognition program
JP2016194957A (en) * 2011-10-07 2016-11-17 カシオ計算機株式会社 Electronic apparatus and program
JP5791812B2 (en) * 2012-08-30 2015-10-07 楽天株式会社 Clothes image processing device, clothes image display method, and program
WO2014034188A1 (en) * 2012-08-30 2014-03-06 楽天株式会社 Clothing image-processing device, clothing image display method and program
US9996909B2 (en) 2012-08-30 2018-06-12 Rakuten, Inc. Clothing image processing device, clothing image display method and program
KR101713784B1 (en) 2013-01-07 2017-03-08 삼성전자주식회사 Electronic apparatus and Method for controlling electronic apparatus thereof
KR20140089845A (en) * 2013-01-07 2014-07-16 삼성전자주식회사 Electronic apparatus and Method for controlling electronic apparatus thereof
JP2014096804A (en) * 2013-12-02 2014-05-22 Hitachi Consumer Electronics Co Ltd Operation control device and operation display method
US9939912B2 (en) 2014-03-05 2018-04-10 Denso Corporation Detection device and gesture input device

Legal Events

Code  Title  Free format text  Effective date
A621  Written request for application examination  JAPANESE INTERMEDIATE CODE: A621  20070629
A977  Report on retrieval  JAPANESE INTERMEDIATE CODE: A971007  20090602
A131  Notification of reasons for refusal  JAPANESE INTERMEDIATE CODE: A131  20090619
A521  Written amendment  JAPANESE INTERMEDIATE CODE: A523  20090804
A02   Decision of refusal  JAPANESE INTERMEDIATE CODE: A02  20100824