JP2004078977A - Interface device - Google Patents

Interface device

Info

Publication number
JP2004078977A
Authority
JP
Japan
Prior art keywords
unit
object
shape
display
interface device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
JP2003327809A
Other languages
Japanese (ja)
Inventor
Taro Imagawa
Toshiyuki Koda
Susumu Maruno
Michiyo Moriya
Original Assignee
Matsushita Electric Ind Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Matsushita Electric Ind Co Ltd
Priority to JP2003327809A
Publication of JP2004078977A
Application status: Withdrawn

Abstract

PROBLEM TO BE SOLVED: To allow equipment to be operated easily without requiring an input device such as a keyboard or mouse.
SOLUTION: The device is provided with a host computer 1 that recognizes the shape and motion of an object in an image captured by a CCD camera 3, and a display 2 that displays the shape and motion of the object recognized by the host computer 1. When a user faces the CCD camera 3 and gives an instruction, for example by hand gesture, the given gesture is displayed on the screen of the display 2, and the virtual switches 201, 202, and 203 displayed on the screen can be selected by gesture using the arrow cursor icon 200, allowing very easy operation of the equipment without an input device such as a mouse.
COPYRIGHT: (C)2004,JPO

Description

The present invention relates to an interface device for input and output of information devices such as computers and word processors, and of devices having a display such as televisions.

As a conventional interface device of this type, there is one that displays a cursor at the coordinate position detected by a mouse on a display screen and lets the user select an arbitrary operation button displayed on the screen.

FIG. 7 shows an outline of this type of conventional interface device. In FIG. 7, reference numeral 71 denotes a display on which virtual operation buttons 711, 712, and 713 are displayed by a host computer 72; 73 denotes a mouse, and 710 a mouse cursor. Based on the amount of movement detected by the mouse 73, the host computer 72 controls the display so that the mouse cursor 710 moves on the screen in synchronization with the movement of the mouse 73. The user moves the mouse 73 to bring the mouse cursor 710 onto an arbitrary virtual operation button on the display screen and presses the switch 731 on the mouse 73 to select that operation button, thereby giving an operation instruction.

However, in the conventional apparatus described above, an input device, the mouse, is required separately from the main body of the apparatus, along with a surface such as a table on which to operate it, which makes it unsuitable for portable information devices and the like. In addition, since every operation passes through the mouse, the interface is not always direct and easy to understand.

In view of the above, an object of the present invention is to provide an interface device that requires no input device such as a keyboard or mouse and that allows devices to be operated easily.

In order to achieve the above object, the present invention provides an interface device that includes at least an imaging unit, a motion recognition unit that recognizes the shape and motion of an object in an image captured by the imaging unit, and a display unit that generates and displays an instruction icon, different from the object and associated in advance with the shape of the object recognized by the motion recognition unit, and that displays the instruction icon so as to move in response to the motion of the object recognized by the motion recognition unit. Information is input on the display unit by means of the instruction icon.

When the user faces the imaging unit of the interface device of the present invention configured as described above and gives an instruction by hand gesture, for example, the given hand gesture is displayed on the display screen, and a virtual switch or the like displayed on the display screen can be selected by that gesture. This makes it possible to operate the device very easily without an input device such as a mouse.

Hereinafter, embodiments of the present invention will be described with reference to the drawings. FIG. 1 shows the appearance of an embodiment of the interface device according to the present invention.

In FIG. 1, reference numeral 1 denotes a host computer, 2 a display, and 3 a CCD camera for capturing images. The imaging surface of the CCD camera 3 faces the same direction as the display direction of the display 2, so that the user's hand gestures are imaged simply by the user facing the display surface. On the imaging surface of the CCD camera 3, an optical filter (not shown) that transmits only light with wavelengths in the skin-color region is arranged, so that the CCD camera 3 mainly captures images of the user's face and hands while the background and the like are cut out. On the display 2, virtual switches 201, 202, and 203 and an arrow cursor 200 for selecting the virtual switches can be displayed.
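The skin-color filter here is optical, mounted in front of the sensor. For readers who want to experiment, a rough software analogue is to threshold each frame in HSV color space. The following is a minimal sketch, assuming OpenCV; the HSV bounds are illustrative assumptions, not values from the patent:

import cv2
import numpy as np

def skin_mask(frame_bgr):
    """Software stand-in for the optical skin-color filter: keep only pixels
    whose hue/saturation/value fall inside a rough skin-color band."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    lower = np.array([0, 40, 60], dtype=np.uint8)     # illustrative bounds,
    upper = np.array([25, 180, 255], dtype=np.uint8)  # not from the patent
    mask = cv2.inRange(hsv, lower, upper)
    return cv2.bitwise_and(frame_bgr, frame_bgr, mask=mask)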

FIG. 2 is a detailed block diagram of the present embodiment. In FIG. 2, the image input from the CCD camera 3 is stored in a frame memory 21. The motion recognition unit 22 extracts, for example, the part corresponding to the user's hand from the image stored in the frame memory 21 and determines whether the shape of that part is the one-finger shape shown in FIG. 3(A) or the clenched-fist shape shown in FIG. 3(B).

FIG. 4 shows a detailed embodiment of the motion recognition unit 22, in which the motion recognition unit 22 is composed of a contour extraction unit 221, an object extraction unit 222, a shape change identification unit 223, and a position detection unit 224.

The contour extraction unit 221 extracts the contour shapes of the objects present in the image. As a specific example of the method, the contour shapes can be extracted easily by binarizing the image and extracting its edges.
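As a concrete illustration of this binarize-and-trace approach, a contour extraction unit could be sketched as follows; OpenCV is used only for illustration, since the patent names no particular library:

import cv2

def extract_contours(gray):
    """Contour extraction unit 221: binarize the image, then trace the edges
    of the binary regions to obtain contour shapes."""
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return contours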

The object extraction unit 222 extracts, from the contour shapes of the plurality of objects extracted by the contour extraction unit 221, the contour shape corresponding to, for example, the user's hand. As specific methods, the area of the closed region surrounded by each contour line can be evaluated, or the contour shape of the hand can be extracted using a neural network.
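The first of those two methods, selecting by the area of the enclosed region, might look like this; the area threshold is an illustrative assumption:

import cv2

def extract_hand(contours, min_area=2000.0):
    """Object extraction unit 222: among all extracted contours, pick the
    closed region with the largest enclosed area as a proxy for the hand."""
    candidates = [c for c in contours if cv2.contourArea(c) >= min_area]
    return max(candidates, key=cv2.contourArea) if candidates else None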

The shape change identification unit 223 identifies the contour shape of the hand extracted by the object extraction unit 222 in more detail, for example determining whether it is the one-finger shape shown in FIG. 3(A) or the clenched-fist shape shown in FIG. 3(B). At the same time, the position detection unit 224 calculates the barycentric coordinates of the contour shape of the user's hand.
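The patent does not spell out how the two shapes are told apart. One plausible heuristic, given only a contour, is solidity (contour area divided by convex-hull area): a fist is nearly convex, while an extended finger lowers the ratio. A sketch of that heuristic, which is an assumption rather than the patent's method, together with the centroid computation:

import cv2

def classify_hand(contour):
    """Shape change identification unit 223 (heuristic sketch, not the
    patent's method): nearly-convex contours are treated as a fist."""
    area = cv2.contourArea(contour)
    hull_area = cv2.contourArea(cv2.convexHull(contour))
    solidity = area / hull_area if hull_area > 0 else 0.0
    return "fist" if solidity > 0.9 else "one_finger"

def centroid(contour):
    """Position detection unit 224: barycentric coordinates of the contour."""
    m = cv2.moments(contour)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])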

The icon generation unit 24 generates the icon image to be displayed on the display based on the result of the hand shape identification performed by the shape change identification unit 223. For example, if the identification result is the one-finger shape, an arrow icon as shown in FIG. 5(A) is generated; if it is the fist shape, an icon with an X mark as shown in FIG. 5(B) is generated.
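Since the icons are associated with shapes in advance, the icon generation unit reduces to a lookup; the file names below are placeholders, not assets from the patent:

ICONS = {
    "one_finger": "arrow_cursor.png",  # cf. FIG. 5(A): arrow icon
    "fist": "x_mark.png",              # cf. FIG. 5(B): X-mark icon
}

def generate_icon(hand_shape):
    """Icon generation unit 24: map the identified hand shape to the icon
    image registered in advance for that shape."""
    return ICONS[hand_shape]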

The display control unit 23 controls the display position on the display of the icon generated by the icon generation unit 24, and includes a coordinate conversion unit 231 and a coordinate inversion unit 232. The coordinate conversion unit 231 converts coordinates in the captured image into display coordinates on the display, and the coordinate inversion unit 232 inverts the converted display coordinates left and right. That is, the barycentric coordinates of the part corresponding to the user's hand, detected in the image by the position detection unit 224, are converted into display coordinates, the left-right coordinates are inverted, and the icon is displayed at the resulting position. By this operation, when the user moves a hand to the right, the icon on the display screen also moves to the right, as if the motion were reflected in a mirror.
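This two-step mapping, scale then horizontal flip, can be written directly; the sketch assumes simple linear scaling between the camera frame and the display:

def to_display_coords(cx, cy, cam_w, cam_h, disp_w, disp_h):
    """Coordinate conversion unit 231 (scale camera coordinates to display
    coordinates) followed by coordinate inversion unit 232 (mirror left-right
    so the icon moves like the user's reflection)."""
    x = cx * disp_w / cam_w
    y = cy * disp_h / cam_h
    return disp_w - x, y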

An operation example using the present embodiment configured as described above is given below. As shown in FIG. 1, when a user faces a device equipped with the interface device according to the present embodiment and moves a hand in the one-finger shape, the arrow cursor icon 200 displayed on the display moves to the position corresponding to the hand movement. Next, by moving the hand, the user brings the arrow cursor icon 200 over an arbitrary virtual switch displayed on the display; when the hand is then clenched into the fist shape, that virtual switch is selected and an instruction can be given to the host computer 1.
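Tying the sketches above together, a capture-recognize-display loop for this operation example might look as follows; the display resolution, key handling, and the drawing and hit-testing of the virtual switches 201-203 are illustrative or elided:

import cv2

def main():
    cap = cv2.VideoCapture(0)      # stands in for the CCD camera 3
    disp_w, disp_h = 1280, 720     # illustrative display resolution
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(skin_mask(frame), cv2.COLOR_BGR2GRAY)
        hand = extract_hand(extract_contours(gray))
        if hand is not None:
            shape = classify_hand(hand)   # "one_finger" moves the cursor,
            c = centroid(hand)            # "fist" selects a virtual switch
            if c is not None:
                h, w = frame.shape[:2]
                x, y = to_display_coords(c[0], c[1], w, h, disp_w, disp_h)
                # draw generate_icon(shape) at (x, y) and, on "fist",
                # test which virtual switch the icon is over
        cv2.imshow("display", frame)  # stands in for the display 2
        if cv2.waitKey(1) == 27:      # ESC quits
            break
    cap.release()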

FIG. 6 shows another embodiment of the motion recognition unit of the interface device according to the present invention. In this embodiment, the motion recognition unit 22 is composed of a contour extraction unit 221, a shape change identification unit 223, and a position detection unit 224; that is, the function of the object extraction unit 222 is taken over entirely by the shape change identification unit 223. In the embodiment described above, an object with a hand shape is first extracted by the object extraction unit 222, and small differences in hand shape are then identified in detail by the shape change identification unit 223. If the changes in hand shape to be handled are limited to, say, the two types shown in FIG. 5, one finger and a fist, it is sufficient in practice for the shape change identification unit 223 to identify and extract, from the plurality of objects in the image, only an object corresponding to either of those two shapes.

As examples of the icons to be displayed, if the shape of the hand itself is turned into an icon, as shown in FIGS. 5(C) and 5(D), the icon corresponds directly to the actual hand movement and is easy to understand. Specifically, images such as those shown in FIGS. 5(C) and 5(D) may be registered in advance, or the contour data of the hand extracted by the contour extraction unit 221 may be reduced or enlarged to an arbitrary size and used as the icon image.
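The second option, reusing the extracted contour itself as the icon, can be sketched by normalizing the contour into a small fixed-size image; the icon size is an illustrative choice:

import cv2
import numpy as np

def contour_to_icon(contour, size=32):
    """Reduce or enlarge the hand contour from contour extraction unit 221 to
    an arbitrary size and draw it as an icon image (cf. FIGS. 5(C), 5(D))."""
    icon = np.zeros((size, size), dtype=np.uint8)
    x, y, w, h = cv2.boundingRect(contour)
    scale = (size - 2) / float(max(w, h))
    pts = ((contour - [x, y]) * scale + 1).astype(np.int32)
    cv2.drawContours(icon, [pts], -1, 255, 1)
    return icon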

As described above, the present embodiment includes at least an imaging unit, a motion recognition unit that recognizes the shape and/or motion of an object in an image captured by the imaging unit, and a display unit that displays the shape and/or motion of the object recognized by the motion recognition unit. When the user faces the imaging unit and gives an instruction by hand gesture, for example, the given hand gesture is displayed on the display screen, and a virtual switch or the like displayed on the display screen can be selected by that hand gesture. No input device such as a mouse is required, and the equipment can be operated very simply.

As described above, the present invention provides an interface device that includes at least an imaging unit, a motion recognition unit that recognizes the shape and motion of an object in an image captured by the imaging unit, and a display unit that generates and displays an instruction icon, different from the object and associated in advance with the shape of the object recognized by the motion recognition unit, and that moves the instruction icon in accordance with the motion of the object; information is input on the display unit by means of the instruction icon. The device therefore requires no input device such as a mouse and enables very simple operation of equipment.

FIG. 1 is an external view of an interface device according to an embodiment of the present invention.
FIG. 2 is a detailed block diagram of the interface device according to the embodiment.
FIGS. 3(A) and 3(B) are diagrams showing examples of the hand shapes determined by the interface device.
FIG. 4 is a block diagram showing a detailed embodiment of the motion recognition unit of the interface device.
FIGS. 5(A) to 5(D) are diagrams showing examples of icons displayed on the display screen by the interface device.
FIG. 6 is a block diagram showing another detailed embodiment of the motion recognition unit of the interface device according to the present invention.
FIG. 7 is an external view showing a conventional mouse-based interface device.

Explanation of reference numerals

1 Host computer
2 Display
3 CCD camera
21 Frame memory
22 Motion recognition unit
23 Display control unit
24 Icon generation unit
200 Icon
201, 202, 203 Virtual switches
221 Contour extraction unit
222 Object extraction unit
223 Shape change identification unit
224 Position detection unit
231 Coordinate conversion unit
232 Coordinate inversion unit

Claims (11)

1. An interface device comprising at least: an imaging unit; a motion recognition unit that recognizes the shape and motion of an object in an image captured by the imaging unit; and a display unit that generates and displays an instruction icon, different from the object, associated in advance with the shape of the object recognized by the motion recognition unit, and that displays the instruction icon so as to move in accordance with the motion of the object recognized by the motion recognition unit, wherein information is input on the display unit by means of the instruction icon.
2. An interface device comprising: an imaging unit; a motion recognition unit that recognizes the shape and motion of an object in an image captured by the imaging unit; and a display unit that displays a selection instruction screen for giving selection instructions, wherein the display unit generates and displays an instruction icon, different from the object, associated in advance with the shape of the object recognized by the motion recognition unit, displays the instruction icon so as to move in accordance with the motion of the object recognized by the motion recognition unit, and uses the instruction icon to operate the selection instruction screen.
3. The interface device according to claim 1 or 2, wherein the imaging surface of the imaging unit is arranged facing the same direction as the display direction of the display unit.
4. The interface device according to claim 1 or 2, wherein the display unit is arranged so as to face the eyes of the user.
5. The interface device according to claim 1 or 2, further comprising a display control unit that maps the imaging range of the imaging unit onto the display range of the display unit, wherein the display control unit reverses the imaging range of the imaging unit left and right before mapping it onto the display range of the display unit.
6. The interface device according to claim 1 or 2, further comprising an icon generation unit that generates, from the shape of the object identified by the motion recognition unit, one of a plurality of icon images associated with it in advance.
7. The interface device according to claim 6, wherein the icon generation unit generates a schematic shape of the object identified by the motion recognition unit as the icon image.
8. The interface device according to claim 1 or 2, wherein the motion recognition unit includes a shape change identification unit that identifies a shape change of a predetermined object in the image and a position detection unit that detects the center of gravity of the object.
9. The interface device according to claim 1 or 7, wherein an object extraction unit is further provided, and after a predetermined object has first been extracted from the plurality of objects in the image, the shape change identification unit identifies the shape change of the object extracted by the object extraction unit.
10. The interface device according to claim 1 or 2, wherein the imaging unit includes an imaging element and a skin-color transmission filter that is disposed in front of the imaging element and transmits light of wavelengths near the color of human skin.
11. An interface device comprising: a motion recognition unit that recognizes the shape and motion of an object in an image captured by an imaging unit; and a display unit that displays a selection instruction screen for giving selection instructions, wherein the display unit generates and displays an instruction icon, different from the object, associated in advance with the shape of the object recognized by the motion recognition unit, and displays the instruction icon so as to move in accordance with the motion of the object recognized by the motion recognition unit; when the selection instruction screen is operated, the instruction icon is generated based on the recognition result of the object, and input of information on the display unit is executed using the instruction icon.
Application JP2003327809A, filed 2003-09-19, published as JP2004078977A (Interface device); status: Withdrawn.

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2003327809A 2003-09-19 2003-09-19 Interface device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2003327809A 2003-09-19 2003-09-19 Interface device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
JP06176851 Division

Publications (1)

Publication Number Publication Date
JP2004078977A 2004-03-11

Family

ID=32025822

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2003327809A Withdrawn JP2004078977A (en) 2003-09-19 2003-09-19 Interface device

Country Status (1)

Country Link
JP (1) JP2004078977A (en)

Cited By (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7844921B2 (en) 2006-08-25 2010-11-30 Kabushiki Kaisha Toshiba Interface apparatus and interface method
KR100851977B1 (en) 2006-11-20 2008-08-12 삼성전자주식회사 Controlling Method and apparatus for User Interface of electronic machine using Virtual plane.
KR100846210B1 (en) 2006-12-05 2008-07-15 한국전자통신연구원 System and method for actively inputting
US8405712B2 (en) 2007-09-25 2013-03-26 Kabushiki Kaisha Toshiba Gesture recognition apparatus and method thereof
US8237654B2 (en) 2007-12-18 2012-08-07 Samsung Electroncis Co., Ltd. Display apparatus and control method thereof
KR101079598B1 (en) 2007-12-18 2011-11-03 삼성전자주식회사 Display apparatus and control method thereof
KR100960020B1 (en) 2008-05-28 2010-05-28 전자부품연구원 Vision network system and method for serving image thereof
EP2853991A1 (en) 2008-06-03 2015-04-01 Shimane Prefectural Government Image recognizing device, operation judging method, and program
WO2009148064A1 (en) 2008-06-03 2009-12-10 島根県 Image recognizing device, operation judging method, and program
US8456416B2 (en) 2008-06-03 2013-06-04 Shimane Prefectural Government Image recognition apparatus, and operation determination method and program therefor
KR101284797B1 (en) 2008-10-29 2013-07-10 한국전자통신연구원 Apparatus for user interface based on wearable computing environment and method thereof
US9891754B2 (en) 2009-02-24 2018-02-13 Samsung Electronics Co., Ltd. Method for controlling display and device using the same
KR101617645B1 (en) * 2009-02-24 2016-05-04 삼성전자주식회사 Method for controlling display and apparatus using the same
JP2010237765A (en) * 2009-03-30 2010-10-21 Toshiba Corp Information processing apparatus, focus movement control method, and focus movement control program
US8890809B2 (en) 2009-08-12 2014-11-18 Shimane Prefectural Government Image recognition apparatus, operation determining method and computer-readable medium
US9535512B2 (en) 2009-08-12 2017-01-03 Shimane Prefectural Government Image recognition apparatus, operation determining method and computer-readable medium
JP2011081469A (en) * 2009-10-05 2011-04-21 Hitachi Consumer Electronics Co Ltd Input device
US8384011B2 (en) 2009-12-10 2013-02-26 Sharp Kabushiki Kaisha Optical detection device and electronic equipment for detecting at least one of an X-coordinate and a Y-coordinate of an object
US8413053B2 (en) 2009-12-22 2013-04-02 Kabushiki Kaisha Toshiba Video reproducing apparatus and video reproducing method
US9134800B2 (en) 2010-07-20 2015-09-15 Panasonic Intellectual Property Corporation Of America Gesture input device and gesture input method
WO2012039140A1 (en) 2010-09-22 2012-03-29 島根県 Operation input apparatus, operation input method, and program
US9329691B2 (en) 2010-09-22 2016-05-03 Shimane Prefectural Government Operation input apparatus and method using distinct determination and control areas
US9323339B2 (en) 2011-04-27 2016-04-26 Nec Solution Innovators, Ltd. Input device, input method and recording medium
KR101514170B1 (en) * 2011-04-27 2015-04-21 엔이씨 솔루션 이노베이터 가부시키가이샤 Input device, input method and recording medium
US9329673B2 (en) 2011-04-28 2016-05-03 Nec Solution Innovators, Ltd. Information processing device, information processing method, and recording medium
WO2012147960A1 (en) 2011-04-28 2012-11-01 Necシステムテクノロジー株式会社 Information processing device, information processing method, and recording medium
US9761196B2 (en) 2011-09-22 2017-09-12 Seiko Epson Corporation Head-mount display apparatus
JP2013070227A (en) * 2011-09-22 2013-04-18 Seiko Epson Corp Head-mounted type display device
US20130246968A1 (en) * 2012-03-05 2013-09-19 Toshiba Tec Kabushiki Kaisha Operation supporting display apparatus and method
US8938124B2 (en) 2012-05-10 2015-01-20 Pointgrab Ltd. Computer vision based tracking of a hand
CN103970455A (en) * 2013-01-28 2014-08-06 联想(北京)有限公司 Information processing method and electronic equipment
US9560272B2 (en) 2014-03-24 2017-01-31 Samsung Electronics Co., Ltd. Electronic device and method for image data processing
US9962143B2 (en) 2015-04-30 2018-05-08 Olympus Corporation Medical diagnosis apparatus, ultrasound observation system, method for operating medical diagnosis apparatus, and computer-readable recording medium
WO2017099311A1 (en) * 2015-12-09 2017-06-15 원광대학교 산학협력단 Automated application execution apparatus

Similar Documents

Publication Publication Date Title
Malik et al. Visual touchpad: a two-handed gestural input device
TWI534661B (en) Image recognition device and operation determination method and computer program
EP2480955B1 (en) Remote control of computer devices
CN102915111B (en) A kind of wrist gesture control system and method
AU2013239179B2 (en) Enhanced virtual touchpad and touchscreen
KR100687737B1 (en) Apparatus and method for a virtual mouse based on two-hands gesture
CN101375235B (en) Information processing device
US9389779B2 (en) Depth-based user interface gesture control
JP4771183B2 (en) Operating device
US5594469A (en) Hand gesture machine control system
US9030498B2 (en) Combining explicit select gestures and timeclick in a non-tactile three dimensional user interface
US7810050B2 (en) User interface system
US20130082922A1 (en) Tactile glove for human-computer interaction
JP4965653B2 (en) Virtual controller for visual display
US9910498B2 (en) System and method for close-range movement tracking
TWI398818B (en) Method and system for gesture recognition
US10203812B2 (en) Systems, devices, and methods for touch-free typing
US20030174125A1 (en) Multiple input modes in overlapping physical space
US10209881B2 (en) Extending the free fingers typing technology and introducing the finger taps language technology
JP3785902B2 (en) Device, device control method, pointer movement method
US6043805A (en) Controlling method for inputting messages to a computer
JP3777650B2 (en) Interface equipment
CN104898879B (en) Method and device for data input
US9274551B2 (en) Method and apparatus for data entry input
KR100851977B1 (en) Controlling Method and apparatus for User Interface of electronic machine using Virtual plane.

Legal Events

Date Code Title Description
RD01 Notification of change of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7421

Effective date: 20050708

A761 Written withdrawal of application

Free format text: JAPANESE INTERMEDIATE CODE: A761

Effective date: 20060508