USRE43447E1 - Method of application menu selection and activation using image cognition - Google Patents
- Publication number
- USRE43447E1 (granted from U.S. Application No. 13/048,945)
- Authority
- US
- United States
- Prior art keywords
- pattern
- menu
- activating
- user
- image
- Prior art date
- Legal status (an assumption, not a legal conclusion)
- Expired - Lifetime
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
Definitions
- FIG. 1 is a schematic view for realizing an image cognition system according to the present invention;
- FIG. 2 is a flow chart illustrating steps for selecting and activating an application menu using image cognition according to a first embodiment of the present invention;
- FIG. 3 is a flow chart illustrating steps for selecting and activating an application menu using image cognition according to a second embodiment of the present invention;
- FIG. 4A is a view illustrating a ring-type pattern applicable to the first embodiment of the present invention;
- FIG. 4B is a view illustrating a rod-type pattern applicable to the second embodiment of the present invention;
- FIG. 5 is a schematic view illustrating a user's image together with a plurality of menu lists displayed on a system monitor, wherein the user's image is positioned on the central portion of the monitor screen; and
- FIG. 6 is a schematic view illustrating a user's image together with a plurality of menu lists displayed on a system monitor, wherein the user's image is positioned on a corner portion of the monitor screen.
- FIG. 1 is a schematic view illustrating an apparatus for realizing the present invention. The apparatus includes a camera 1 for capturing a user's image, and a system 2, such as a personal computer or an HDTV set, for digitally processing the images captured by the camera 1.
- FIG. 6 shows a composition different from that of FIG. 5 with regard to the image and menus displayed on the monitor screen of the system 62.
- An image block 63, similar to the image block 53, is positioned at a corner portion of the screen, and a pointer 64 is displayed on the rest of the initial screen.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
A method for selecting and activating a particular menu displayed in a client's region of a monitor screen using image cognition is disclosed. Using an image-capturing device such as a camera attached to a system, a user's image is recognized in real time and displayed on an initial screen of a monitor. The user makes a direct hand motion while viewing his own image displayed on the initial screen, and when a desired menu icon is designated among a variety of menu icons arrayed on the initial screen, the system guides the user's hand image to the corresponding menu icon for its selection. When the user makes a particular body motion to activate the selected menu, the system recognizes the motion, thereby activating the selected menu.
Description
This application is a Reissue of U.S. Pat. No. 6,160,899. More than one reissue application has been filed for the reissue of U.S. Pat. No. 6,160,899. The Reissue application numbers are Ser. Nos. 13/027,619 and 13/048,945 (the present application).
1. Field of the Invention
The present invention relates to a method of selecting and activating an application menu, and more particularly, to an improved method of application menu selection and activation through image cognition, wherein a menu is selected and activated in correspondence with a user's motion while the motion image of the user is recognized in real time by an image-capturing device such as a camera.
2. Description of the Background Art
In order to select and activate a particular item from an application menu list displayed on a monitor screen, a computer generally employs an input device such as a keyboard, mouse, or touchpad.
Under the touch-screen method, the moment a user directly touches a desired menu item among the menu list displayed on the monitor screen, the menu item becomes activated.
As another example, a pointer-type wireless control device employing infrared transmission is used to select and activate a menu item. Such a device is provided with a plurality of sensors at the corner portions of a monitor. It calculates a phase difference using an infrared signal generated from a transmission unit and thereby obtains coordinate values, so that the transmitter can move the pointer to a desired position, selecting and activating the required menu item.
However, such conventional technologies require an additional external device for menu selection and activation.
Further, the touch-screen and pointer-type wireless control devices disadvantageously require a plurality of sensors at the corner portions of the monitor.
The present invention is directed to solving the conventional disadvantages.
Accordingly, it is an object of the present invention to provide a method of application menu selection and activation using image cognition which is capable of selecting and activating a menu item in response to a user's motion or the movement of a particular device while recognizing the user's image in real time by use of an image-capturing device such as a camera.
According to an embodiment of the present invention, using an image-capturing device such as a camera attached to a system, a user's image is recognized in real time and displayed on an initial screen of a monitor. The user makes a direct hand motion while viewing his own image displayed on the initial screen, and when a desired menu icon is designated among a variety of menu icons arrayed on the initial screen, the system guides the user's hand image to the corresponding menu icon for its selection. When the user makes a particular body motion to activate the selected menu, the system recognizes the motion, thereby activating the selected menu.
In the above-described embodiment, a pattern wearable on a finger may be employed so as to accurately recognize the user's specific motion. When the user indicates a desired menu icon while wearing the pattern on his finger, the system guides the user's hand image on the screen toward the corresponding menu icon for the menu selection. As in the above-described embodiment, when the user makes a particular body motion to activate the selected menu, the system recognizes the motion, thereby activating the selected menu.
According to another embodiment of the present invention, a particular pattern grabbable by the user is employed. When the user indicates a desired menu icon, the system guides the user's hand image displayed on the screen to the corresponding menu icon for its selection, and when the user operates a menu activating member provided in the pattern itself, the system responds, whereby the selected menu becomes activated.
The objects and advantages of the present invention will become more readily apparent from the detailed description given hereinafter. However, it should be understood that the detailed description and specific examples, while indicating preferred embodiments of the invention, are given by way of illustration only, since various changes and modifications within the spirit and scope of the invention will become apparent to those skilled in the art from this detailed description.
The present invention will become better understood with reference to the accompanying drawings which are given only by way of illustration and thus are not limitative of the present invention, wherein:
On the initial screen serving as a client's window region, there are displayed a plurality of menu lists in the form of icons 11, 12, 13, . . . , 16. A user's image is displayed on the entire initial screen together with the menu lists.
The menu icons 11, 12, 13, . . . , 16 are displayed on the left of the screen, and the dotted squares B1, B2, B3, . . . , B6 enclosing the icons 11, 12, 13, . . . , 16, respectively, are pattern regions for pattern cognition and they do not appear on the real screen.
When the system 2 starts operation, the user's image captured by the camera 1 is displayed on the monitor screen. Accordingly, the user can view his own image being displayed on the screen as shown in FIG. 1 (Step S21).
Likewise, as the user's own image is displayed on the screen, the user can easily notice his hand's location, as if he were standing in front of a mirror.
Then, following the user's hand movement, a menu icon is selected and the selected menu icon is activated; the relevant steps will now be described.
When the user moves his hand toward the region of menu icons 11, 12, 13, . . . , 16, the user's hand image on the screen also moves toward the menu icons.
In the meantime, the system 2 continuously checks the screen color within the plurality of pattern regions B1, B2, B3, . . . , B6 (Step S22). Since the user's hand is flesh-colored and the screen background is not, when the user moves his hand into a certain pattern region B2, the color in the pattern region B2 changes to flesh color. The system 2 checks whether the screen color within the pattern regions B1, B2, B3, . . . , B6 has been converted to flesh color, thereby determining that the user's hand is positioned on a particular menu icon (Step S23).
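The region-color check of Steps S22 and S23 can be sketched as follows. This is a minimal illustrative sketch only: the flesh-tone thresholds, function names, and frame representation are assumptions for the example, not taken from the patent.

```python
def is_flesh(pixel):
    # Crude flesh-tone heuristic on an (r, g, b) tuple: red dominates green/blue.
    r, g, b = pixel
    return r > 95 and g > 40 and b > 20 and r > g and r > b

def region_selected(frame, region, threshold=0.5):
    # True if more than `threshold` of the region's pixels look like skin.
    # `frame` is a 2-D list of (r, g, b) tuples; `region` is (top, left, bottom, right).
    top, left, bottom, right = region
    pixels = [frame[y][x] for y in range(top, bottom) for x in range(left, right)]
    flesh = sum(1 for p in pixels if is_flesh(p))
    return flesh / len(pixels) > threshold

def find_selected_menu(frame, pattern_regions):
    # Scan the pattern regions B1, B2, ... and return the index of the first
    # region the hand covers, or None if no region is covered.
    for i, region in enumerate(pattern_regions):
        if region_selected(frame, region):
            return i
    return None
```

In a real system the frame would come from the camera each capture cycle; here the scan simply compares each region's dominant color against the stored skin heuristic.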
In FIG. 1 , the user's hand is positioned in the second pattern region B2. Likewise, when the user's hand moves into the particular pattern region B2 selected from the plurality of pattern regions B1, B2, B3, . . . , B6, the system 2 recognizes that the pattern region B2 has been selected by the user and thereby converts the color of the menu icon 12. Accordingly, the user recognizes that the menu icon 12 he is indicating has been selected (Step S24).
In the next step, if the user nods his head, the system 2 recognizes the nodding through a gesture cognition device provided within the system 2 and accordingly activates the selected menu icon 12 (Steps S25, S26).
Meanwhile, in order for the system to recognize the user's gesture, pattern cognition using a moving image is required. That is, the system continuously captures the user's image, the captured moving image is preprocessed, and the previously captured image is compared with the presently captured image to extract the characteristics of the two images, whereby the nodding of the user's head can be determined based on the extracted characteristics.
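The previous-frame versus current-frame comparison can be sketched as simple frame differencing, assuming grayscale frames stored as 2-D lists. The thresholds and the nod criterion (a sufficient up-and-down swing of the motion center) are illustrative assumptions, not the patent's algorithm.

```python
def frame_difference(prev, curr, threshold=30):
    # Binary motion mask: 1 where grayscale frames differ by more than `threshold`.
    return [[1 if abs(p - c) > threshold else 0 for p, c in zip(pr, cr)]
            for pr, cr in zip(prev, curr)]

def vertical_motion_center(mask):
    # Mean row index of changed pixels; tracking it across frames reveals
    # the up/down oscillation characteristic of a nod. None if nothing moved.
    rows = [y for y, row in enumerate(mask) for v in row if v]
    return sum(rows) / len(rows) if rows else None

def detect_nod(centers, min_swing=2.0):
    # Treat a large vertical swing of the motion center over a short
    # window of frames as a nod.
    cs = [c for c in centers if c is not None]
    if len(cs) < 3:
        return False
    return max(cs) - min(cs) >= min_swing
```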
The method in the above-described embodiment activates the menu by recognizing the user's gesture. Alternatively, the menu activation can also be performed when a particular pattern stays within a certain pattern region for a certain time period. By adding a function to the system, the time during which the pattern remains stationed may be counted, so that the menu becomes activated after a predetermined time elapses.
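The dwell-time alternative described above can be sketched as a small activator fed once per capture cycle. The class and parameter names are hypothetical, and the clock value is injected rather than read from the system so the logic is easy to test.

```python
class DwellActivator:
    # Activates a menu when the pattern stays in the same region long enough.
    def __init__(self, dwell_seconds=2.0):
        self.dwell = dwell_seconds
        self.region = None    # region currently occupied by the pattern
        self.entered = None   # time at which that region was entered

    def update(self, region, now):
        # Feed the currently occupied pattern region (or None) each frame,
        # with `now` as the current time in seconds. Returns the region
        # index once the pattern has stayed put for `dwell_seconds`.
        if region != self.region:
            self.region, self.entered = region, now
            return None
        if region is not None and now - self.entered >= self.dwell:
            self.entered = float('inf')  # fire once until the region changes
            return region
        return None
```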
In the menu selection mode using recognition of the user's hand motion, an erroneous operation may occur when the system inaccurately mistakes an arm motion of the user for a hand motion. In order to overcome such erroneous operation, a simple pattern can be worn on the user's finger.
As shown in FIG. 4A , a ring-type pattern is provided to be worn on a user's finger. The user's hand with the ring-type pattern worn on the finger enables the system to accurately select, without error, a desired menu item displayed on the initial screen in response to the pattern motion.
As further shown in FIG. 4B , the second embodiment of the present invention allows the user to grab a rod-type pattern; the user selects a desired menu item and activates it by driving a menu activating member provided in the rod-type pattern.
The indication rod as shown in FIG. 4B includes a body 11 grabbable by the user, a first pattern portion 12 formed on an end of the body 11, a second pattern portion 13 disposed at an outer end of the first pattern portion 12 and guidable through the first pattern portion 12, and a button 14 for guiding the second pattern portion 13 into and out of the first pattern portion 12. Here, the indication rod illustrated on the left in FIG. 4B denotes a state before the button 14 is pressed, and that on the right denotes a state at which the second pattern portion 13 is exposed from the first pattern portion 12 in correspondence to the pressing of the button 14.
With reference to FIGS. 1 and 3 , the application menu selection method using image cognition together with the indication rod will now be described.
First, the data with regard to the first pattern portion 12 and the second pattern portion 13 are set in the system.
Step S31 is identical to Step S21 in FIG. 2 . In Step S32, the user moves the indication rod while viewing his own image displayed on the monitor screen, so that the first pattern portion 12 at the end of the indication rod reaches toward the pattern region B2 on the left side of the screen. At this time, the system checks the color within the plurality of pattern regions B1, B2, B3, . . . , B6 (Step S32). Since the data corresponding to the first pattern portion 12 are already stored in the system, it can be determined whether the background color within the pattern regions B1, B2, B3, . . . , B6 has been converted to a color corresponding to the first pattern portion 12 (Step S33). When the first pattern portion 12 of the indication rod is moved into the pattern region B2, the system recognizes the first pattern portion 12 and converts the color of the menu icon 12, whereby the user recognizes that the desired menu item is selected.
Next, when the button 14 of the indication rod is pressed by the user, the second pattern portion 13 is externally exposed from the first pattern portion 12. When the exposure of the second pattern portion 13 is detected by the system, the selected menu icon 12 becomes activated. Likewise, if such an indication rod having the first and second pattern portions is employed, the system does not require the gesture cognition function described in the first embodiment of the present invention.
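The two-pattern select-then-activate protocol of the indication rod can be sketched as a small state machine. The labels 'first' and 'second' for the recognized pattern portions, and the dictionary-based state, are assumptions for illustration; the patent does not specify this code.

```python
def rod_menu_step(state, visible_pattern, region):
    # One step of the indication-rod logic. `visible_pattern` is what the
    # camera currently recognizes inside `region`: 'first' (rod tip),
    # 'second' (exposed by the button press), or None. Selection requires
    # the first pattern portion; activation requires the second.
    selected = state.get('selected')
    if visible_pattern == 'first' and region is not None:
        # First pattern portion seen in a pattern region: select that menu.
        return {'selected': region, 'activated': None}
    if visible_pattern == 'second' and selected is not None:
        # Second pattern portion exposed: activate the selected menu.
        return {'selected': selected, 'activated': selected}
    return {'selected': selected, 'activated': None}
```

Because activation is signaled by the second pattern portion rather than a body gesture, no gesture cognition is needed in this variant, matching the remark above.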
In FIGS. 5 and 6 , the user's image and a menu composition are displayed on the monitor screens of the systems 52, 62, which are easily applicable to electronic devices such as those employed in karaoke systems.
As shown in FIG. 5 , an image block 53 of a predetermined size, arranged at the central portion of the screen, is set to display the user's image, and a plurality of menus with song titles are displayed on each side of the image block 53. A pattern region 54 is set at the inner ends of the song-title portions adjoining the image block 53.
When the user moves his hand leftward to select a menu, the system 52 recognizes the leftward motion so that the hand in the user's image displayed in the image block 53 makes a leftward movement, and accordingly the user's desired menu is selected by checking the screen color of the pattern region 54.
When the user moves his hand, the user's image is displayed inside the image block 63 and the system causes the pointer 64 to move in response to the user's hand movement. Here, the pointer 64 serves like the mouse pointer employed in the Windows operating system of a computer.
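The pointer behavior described here amounts to mapping the hand position inside the image block to full-screen pointer coordinates. The function name and the mirror-style horizontal flip (so the on-screen pointer moves the way a mirror image would) are assumptions for illustration.

```python
def map_to_pointer(hand_xy, block_rect, screen_size):
    # Linearly map a hand position inside the image block to full-screen
    # pointer coordinates, mirroring x so the display behaves like a mirror.
    hx, hy = hand_xy
    bx, by, bw, bh = block_rect      # block origin and size in pixels
    sw, sh = screen_size
    u = (hx - bx) / bw               # normalized position within the block
    v = (hy - by) / bh
    return ((1.0 - u) * sw, v * sh)  # flip horizontally, scale to screen
```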
The method of menu selection and activation using image cognition according to the preferred embodiments of the present invention may also replace mouse-oriented menu selection and activation in prevalent computers running windowed operating systems.
As the present invention may be embodied in several forms without departing from the spirit or essential characteristics thereof, it should also be understood that the above-described embodiments are not limited by any of the details of the foregoing description, unless otherwise specified, but rather should be construed broadly within the spirit and scope of the invention as defined in the appended claims; therefore, all changes and modifications that fall within the metes and bounds of the claims, or equivalents of such metes and bounds, are intended to be embraced by the appended claims.
Claims (25)
1. An application menu selecting and activating method using image cognition, comprising the steps of:
recognizing a pattern position on a screen using a pattern cognition function executed per predetermined time period;
selecting a menu when the recognized pattern position is within a certain pattern region on the screen, the pattern region containing the menu; and
activating the selected menu.
2. The application menu selecting and activating method of claim 1 , wherein a user's image is displayed on a client's region prior to the recognizing of the pattern position.
3. The application menu selecting and activating method of claim 2 , wherein the user's image is displayed on a predetermined position in the client's region.
4. The application menu selecting and activating method of claim 1 , wherein the pattern is a user's hand.
5. The application menu selecting and activating method of claim 1 , wherein the pattern is a ring wearable by the user.
6. The application menu selecting and activating method of claim 1 , wherein the pattern is an indication rod.
7. The application menu selecting and activating method of claim 6 , wherein the indication rod comprises:
a body grabbable by the user;
a first pattern portion formed on a side end of the body;
a second pattern portion disposed at an outer end of the first pattern portion and guidable through the first pattern portion; and
a button for guiding the second pattern portion into and out of the first pattern portion.
8. The application menu selecting and activating method of claim 1 , wherein the pattern is an indication rod having two different patterns.
9. The application menu selecting and activating method of claim 1 or 7 , wherein the recognizing of the pattern position is performed on the ground of one of the two different patterns, and the activating of the menu is performed on the ground of the other of the two different patterns.
10. The application menu selecting and activating method of claim 1 , wherein the selecting of the menu is performed when a background color of the pattern is converted in accordance with conversion of a user's pattern position.
11. The application menu selecting and activating method of claim 1 , wherein the activating of the menu is performed when a user's particular body motion is executed.
12. The application menu selecting and activating method of claim 1 , wherein the activating of the menu is performed after a particular pattern is positioned within the predetermined pattern region for a predetermined time period.
13. An application menu selecting and activating method using image cognition, comprising the steps of:
determining a pattern position on a screen by scanning a predetermined pattern region on the screen;
selecting a menu in the pattern region in which the pattern is positioned; and
activating the selected menu.
14. The application menu selecting and activating method of claim 13 , wherein the pattern is a user's hand.
15. The application menu selecting and activating method of claim 13 , wherein the pattern is a ring wearable by the user.
16. An application menu selecting and activating method, comprising:
determining whether a pattern on a screen is positioned in a predetermined pattern region by scanning the predetermined pattern region on the screen;
selecting a menu in the pattern region in which the pattern is positioned; and
activating the selected menu, wherein the pattern is an indication rod, the indication rod including a first pattern portion and a second pattern portion, the first pattern portion being used in the selecting of the menu, the second pattern portion being used in the activating of the selected menu.
17. The application menu selecting and activating method of claim 13 , wherein the pattern is an indication rod having two different patterns.
18. The application menu selecting and activating method of claim 13 or 17 , wherein the recognizing of the pattern position is performed on the ground of one of the two different patterns, and the activating of the menu is performed on the ground of the other of the two different patterns.
19. An application menu selecting and activating apparatus using image cognition, comprising:
a camera for capturing an image; and
display means for displaying the image received from the camera on a screen, for designating particular regions of the screen for displaying respectively a plurality of predetermined menus, and for selecting a menu from the plurality of predetermined menus when a pattern is positioned on its corresponding region.
20. An application menu selecting and activating method using image cognition, comprising the steps of:
recognizing a user's image in real time;
displaying the user's image on a client region of a display screen;
recognizing a pattern position by a pattern cognition per predetermined time period;
selecting a menu when the recognized pattern position is within a certain pattern region containing predetermined menus; and
activating the selected menu.
21. An application menu selecting and activating method using image cognition, comprising the steps of:
displaying a user's image on a client region of a display screen, the user's image being displayed on a first area of the display screen;
displaying a menu image on pattern regions of the display screen, the menu image being displayed on a second area of the display screen, wherein the first area and the second area are separately positioned on the display screen;
determining whether a pattern on a screen is positioned in a predetermined pattern region by scanning the predetermined pattern region;
selecting a menu in the pattern region in which the pattern is positioned; and
activating the selected menu, wherein the pattern is displayed and moved on the second area of the display screen, the pattern being moved between the pattern regions in response to a user's gesture displayed on the first area of the display screen.
22. An application menu selecting and activating apparatus using image cognition, comprising:
a camera for capturing a user's image in real time;
display means for displaying the user's image received from the camera on a client region and for designating a particular region of the externally applied image;
means for selecting a required menu when a pattern is positioned on a corresponding region; and
means for activating the selected menu.
23. The method of claim 21, wherein the activating is performed by recognizing the user's gesture.
24. The method of claim 23, wherein the recognizing the user's gesture is performed by comparing a previously captured user image with a currently captured user image.
25. The method of claim 21, wherein the activating is performed by determining a predetermined stationed time lapse.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/048,945 USRE43447E1 (en) | 1997-07-22 | 2011-03-16 | Method of application menu selection and activation using image cognition |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR97/34165 | 1997-07-22 | ||
KR1019970034165A KR19990011180A (en) | 1997-07-22 | 1997-07-22 | How to select menu using image recognition |
US09/119,636 US6160899A (en) | 1997-07-22 | 1998-07-21 | Method of application menu selection and activation using image cognition |
US13/048,945 USRE43447E1 (en) | 1997-07-22 | 2011-03-16 | Method of application menu selection and activation using image cognition |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/119,636 Reissue US6160899A (en) | 1997-07-22 | 1998-07-21 | Method of application menu selection and activation using image cognition |
Publications (1)
Publication Number | Publication Date |
---|---|
USRE43447E1 true USRE43447E1 (en) | 2012-06-05 |
Family
ID=19515251
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/119,636 Ceased US6160899A (en) | 1997-07-22 | 1998-07-21 | Method of application menu selection and activation using image cognition |
US13/027,619 Expired - Lifetime USRE43184E1 (en) | 1997-07-22 | 2011-02-15 | Method of application menu selection and activation using image cognition |
US13/048,945 Expired - Lifetime USRE43447E1 (en) | 1997-07-22 | 2011-03-16 | Method of application menu selection and activation using image cognition |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/119,636 Ceased US6160899A (en) | 1997-07-22 | 1998-07-21 | Method of application menu selection and activation using image cognition |
US13/027,619 Expired - Lifetime USRE43184E1 (en) | 1997-07-22 | 2011-02-15 | Method of application menu selection and activation using image cognition |
Country Status (2)
Country | Link |
---|---|
US (3) | US6160899A (en) |
KR (1) | KR19990011180A (en) |
Families Citing this family (93)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3260653B2 (en) * | 1997-03-25 | 2002-02-25 | ヤマハ株式会社 | Karaoke equipment |
US20020036617A1 (en) | 1998-08-21 | 2002-03-28 | Timothy R. Pryor | Novel man machine interfaces and applications |
US6750848B1 (en) | 1998-11-09 | 2004-06-15 | Timothy R. Pryor | More useful man machine interfaces and applications |
US6514083B1 (en) * | 1998-01-07 | 2003-02-04 | Electric Planet, Inc. | Method and apparatus for providing interactive karaoke entertainment |
US6971882B1 (en) | 1998-01-07 | 2005-12-06 | Electric Planet, Inc. | Method and apparatus for providing interactive karaoke entertainment |
US10051298B2 (en) | 1999-04-23 | 2018-08-14 | Monkeymedia, Inc. | Wireless seamless expansion and video advertising player |
US6393158B1 (en) * | 1999-04-23 | 2002-05-21 | Monkeymedia, Inc. | Method and storage device for expanding and contracting continuous play media seamlessly |
US7015950B1 (en) | 1999-05-11 | 2006-03-21 | Pryor Timothy R | Picture taking method and apparatus |
US7406214B2 (en) | 1999-05-19 | 2008-07-29 | Digimarc Corporation | Methods and devices employing optical sensors and/or steganography |
KR100608284B1 (en) * | 1999-06-30 | 2006-08-02 | 삼성전자주식회사 | A remote pointing system |
US8391851B2 (en) | 1999-11-03 | 2013-03-05 | Digimarc Corporation | Gestural techniques with wireless mobile phone devices |
JP2001209487A (en) * | 2000-01-25 | 2001-08-03 | Uw:Kk | Handwriting communication system, and handwriting input and handwriting display device used for the system |
US20020071277A1 (en) * | 2000-08-12 | 2002-06-13 | Starner Thad E. | System and method for capturing an image |
JP3725460B2 (en) * | 2000-10-06 | 2005-12-14 | 株式会社ソニー・コンピュータエンタテインメント | Image processing apparatus, image processing method, recording medium, computer program, semiconductor device |
JP2002157606A (en) * | 2000-11-17 | 2002-05-31 | Canon Inc | Image display controller, composite reality presentation system, image display control method, and medium providing processing program |
KR100396924B1 (en) * | 2001-02-27 | 2003-09-03 | 한국전자통신연구원 | Apparatus and Method for Controlling Electrical Apparatus by using Bio-signal |
US6895520B1 (en) | 2001-03-02 | 2005-05-17 | Advanced Micro Devices, Inc. | Performance and power optimization via block oriented performance measurement and control |
US6931596B2 (en) * | 2001-03-05 | 2005-08-16 | Koninklijke Philips Electronics N.V. | Automatic positioning of display depending upon the viewer's location |
US20040125076A1 (en) * | 2001-06-08 | 2004-07-01 | David Green | Method and apparatus for human interface with a computer |
US20030043271A1 (en) * | 2001-09-04 | 2003-03-06 | Koninklijke Philips Electronics N.V. | Computer interface system and method |
KR100457929B1 (en) * | 2001-11-05 | 2004-11-18 | 한국과학기술원 | System of Soft Remote Controller Using Hand Pointing Recognition |
US20030095154A1 (en) * | 2001-11-19 | 2003-05-22 | Koninklijke Philips Electronics N.V. | Method and apparatus for a gesture-based user interface |
KR100511044B1 (en) * | 2001-12-26 | 2005-08-30 | 이문기 | Pointing apparatus using camera |
US20030132913A1 (en) * | 2002-01-11 | 2003-07-17 | Anton Issinski | Touchless computer input device to control display cursor mark position by using stereovision input from two video cameras |
DE60305662T2 (en) * | 2002-03-08 | 2007-04-05 | Revelations in Design, LP, Austin | CONTROL CONSOLE FOR ELECTRICAL EQUIPMENT |
DE20300882U1 (en) * | 2003-01-21 | 2003-03-13 | Fraunhofer Ges Forschung | Device for the interactive control of a mouse pointer of a graphical user interface |
US7426329B2 (en) | 2003-03-06 | 2008-09-16 | Microsoft Corporation | Systems and methods for receiving, storing, and rendering digital video, music, and pictures on a personal media player |
US20040201595A1 (en) * | 2003-04-11 | 2004-10-14 | Microsoft Corporation | Self-orienting display |
US8345001B2 (en) * | 2004-01-06 | 2013-01-01 | Sony Computer Entertainment Inc. | Information processing system, entertainment system, and information processing system input accepting method |
US7755608B2 (en) * | 2004-01-23 | 2010-07-13 | Hewlett-Packard Development Company, L.P. | Systems and methods of interfacing with a machine |
US20050273201A1 (en) * | 2004-06-06 | 2005-12-08 | Zukowski Deborra J | Method and system for deployment of sensors |
JP4419768B2 (en) * | 2004-09-21 | 2010-02-24 | 日本ビクター株式会社 | Control device for electronic equipment |
EP1645944B1 (en) * | 2004-10-05 | 2012-08-15 | Sony France S.A. | A content-management interface |
GB2419433A (en) * | 2004-10-20 | 2006-04-26 | Glasgow School Of Art | Automated Gesture Recognition |
KR100987248B1 (en) * | 2005-08-11 | 2010-10-12 | 삼성전자주식회사 | User input method and apparatus in mobile communication terminal |
US20070057912A1 (en) * | 2005-09-14 | 2007-03-15 | Romriell Joseph N | Method and system for controlling an interface of a device through motion gestures |
US8549442B2 (en) * | 2005-12-12 | 2013-10-01 | Sony Computer Entertainment Inc. | Voice and video control of interactive electronically simulated environment |
JP2007272067A (en) * | 2006-03-31 | 2007-10-18 | Brother Ind Ltd | Image display device |
US8972902B2 (en) * | 2008-08-22 | 2015-03-03 | Northrop Grumman Systems Corporation | Compound gesture recognition |
TW200816798A (en) * | 2006-09-22 | 2008-04-01 | Altek Corp | Method of automatic shooting by using an image recognition technology |
US8144121B2 (en) * | 2006-10-11 | 2012-03-27 | Victor Company Of Japan, Limited | Method and apparatus for controlling electronic appliance |
US8089455B1 (en) * | 2006-11-28 | 2012-01-03 | Wieder James W | Remote control with a single control button |
JP2008146243A (en) * | 2006-12-07 | 2008-06-26 | Toshiba Corp | Information processor, information processing method and program |
JP4720738B2 (en) * | 2006-12-20 | 2011-07-13 | 日本ビクター株式会社 | Electronics |
WO2008134745A1 (en) * | 2007-04-30 | 2008-11-06 | Gesturetek, Inc. | Mobile video-based therapy |
US20080276792A1 (en) * | 2007-05-07 | 2008-11-13 | Bennetts Christopher L | Lyrics superimposed on video feed |
US8726194B2 (en) * | 2007-07-27 | 2014-05-13 | Qualcomm Incorporated | Item selection using enhanced control |
JP4636064B2 (en) | 2007-09-18 | 2011-02-23 | ソニー株式会社 | Image processing apparatus, image processing method, and program |
CN103442201B (en) | 2007-09-24 | 2018-01-02 | 高通股份有限公司 | Enhancing interface for voice and video communication |
US8471868B1 (en) * | 2007-11-28 | 2013-06-25 | Sprint Communications Company L.P. | Projector and ultrasonic gesture-controlled communicator |
US8555207B2 (en) | 2008-02-27 | 2013-10-08 | Qualcomm Incorporated | Enhanced input using recognized gestures |
US9772689B2 (en) * | 2008-03-04 | 2017-09-26 | Qualcomm Incorporated | Enhanced gesture-based image manipulation |
US20090254855A1 (en) * | 2008-04-08 | 2009-10-08 | Sony Ericsson Mobile Communications, Ab | Communication terminals with superimposed user interface |
US8514251B2 (en) * | 2008-06-23 | 2013-08-20 | Qualcomm Incorporated | Enhanced character input using recognized gestures |
KR100931403B1 (en) * | 2008-06-25 | 2009-12-11 | 한국과학기술연구원 | Device and information controlling system on network using hand gestures |
US8146020B2 (en) * | 2008-07-24 | 2012-03-27 | Qualcomm Incorporated | Enhanced detection of circular engagement gesture |
CN102165396B (en) * | 2008-07-25 | 2014-10-29 | 高通股份有限公司 | Enhanced detection of waving engagement gesture |
JP4720874B2 (en) | 2008-08-14 | 2011-07-13 | ソニー株式会社 | Information processing apparatus, information processing method, and information processing program |
CA2735992A1 (en) * | 2008-09-04 | 2010-03-11 | Extreme Reality Ltd. | Method system and software for providing image sensor based human machine interfacing |
KR101602363B1 (en) * | 2008-09-11 | 2016-03-10 | 엘지전자 주식회사 | 3 Controling Method of 3 Dimension User Interface Switchover and Mobile Terminal using the same |
CN101751771B (en) * | 2008-12-09 | 2012-09-05 | 联想(北京)有限公司 | Infrared control device and method |
US20100289912A1 (en) * | 2009-05-14 | 2010-11-18 | Sony Ericsson Mobile Communications Ab | Camera arrangement with image modification |
US20100295782A1 (en) | 2009-05-21 | 2010-11-25 | Yehuda Binder | System and method for control based on face ore hand gesture detection |
US9277021B2 (en) * | 2009-08-21 | 2016-03-01 | Avaya Inc. | Sending a user associated telecommunication address |
US20110107216A1 (en) * | 2009-11-03 | 2011-05-05 | Qualcomm Incorporated | Gesture-based user interface |
JP5413673B2 (en) * | 2010-03-08 | 2014-02-12 | ソニー株式会社 | Information processing apparatus and method, and program |
US8620024B2 (en) * | 2010-09-17 | 2013-12-31 | Sony Corporation | System and method for dynamic gesture recognition using geometric classification |
US9304592B2 (en) * | 2010-11-12 | 2016-04-05 | At&T Intellectual Property I, L.P. | Electronic device control based on gestures |
JP5327211B2 (en) | 2010-12-28 | 2013-10-30 | カシオ計算機株式会社 | Imaging apparatus, imaging control method, and program |
WO2012123033A1 (en) * | 2011-03-17 | 2012-09-20 | Ssi Schaefer Noell Gmbh Lager Und Systemtechnik | Controlling and monitoring a storage and order-picking system by means of movement and speech |
AU2012245285A1 (en) | 2011-04-22 | 2013-11-21 | Pepsico, Inc. | Beverage dispensing system with social media capabilities |
US20120293546A1 (en) * | 2011-05-18 | 2012-11-22 | Tomi Lahcanski | Augmented-reality mobile communicator with orientation |
US9727132B2 (en) * | 2011-07-01 | 2017-08-08 | Microsoft Technology Licensing, Llc | Multi-visor: managing applications in augmented reality environments |
TWI448147B (en) | 2011-09-06 | 2014-08-01 | Hon Hai Prec Ind Co Ltd | Electronic device and method for selecting menus |
JP2013080413A (en) * | 2011-10-05 | 2013-05-02 | Sony Corp | Input apparatus and input recognition method |
WO2013067020A1 (en) | 2011-11-01 | 2013-05-10 | Stephen Lim | Dispensing system and user interface |
CN103135755B (en) * | 2011-12-02 | 2016-04-06 | 深圳泰山在线科技有限公司 | Interactive system and method |
KR20130081580A (en) * | 2012-01-09 | 2013-07-17 | 삼성전자주식회사 | Display apparatus and controlling method thereof |
TWI454968B (en) * | 2012-12-24 | 2014-10-01 | Ind Tech Res Inst | Three-dimensional interactive device and operation method thereof |
US10152136B2 (en) * | 2013-10-16 | 2018-12-11 | Leap Motion, Inc. | Velocity field interaction for free space gesture interface and control |
US9740296B2 (en) | 2013-12-16 | 2017-08-22 | Leap Motion, Inc. | User-defined virtual interaction space and manipulation of virtual cameras in the interaction space |
KR20150110032A (en) * | 2014-03-24 | 2015-10-02 | 삼성전자주식회사 | Electronic Apparatus and Method for Image Data Processing |
US9696795B2 (en) | 2015-02-13 | 2017-07-04 | Leap Motion, Inc. | Systems and methods of creating a realistic grab experience in virtual reality/augmented reality environments |
US10429923B1 (en) | 2015-02-13 | 2019-10-01 | Ultrahaptics IP Two Limited | Interaction engine for creating a realistic experience in virtual reality/augmented reality environments |
US11875012B2 (en) | 2018-05-25 | 2024-01-16 | Ultrahaptics IP Two Limited | Throwable interface for augmented reality and virtual reality environments |
CN110652190B (en) * | 2018-06-29 | 2022-12-06 | 浙江绍兴苏泊尔生活电器有限公司 | Cooking menu display method and device and cooking appliance |
TWI696093B (en) * | 2019-07-26 | 2020-06-11 | 香港商冠捷投資有限公司 | Display device and its control method |
US11908243B2 (en) * | 2021-03-16 | 2024-02-20 | Snap Inc. | Menu hierarchy navigation on electronic mirroring devices |
US11809633B2 (en) | 2021-03-16 | 2023-11-07 | Snap Inc. | Mirroring device with pointing based navigation |
US11978283B2 (en) * | 2021-03-16 | 2024-05-07 | Snap Inc. | Mirroring device with a hands-free mode |
US11798201B2 (en) | 2021-03-16 | 2023-10-24 | Snap Inc. | Mirroring device with whole-body outfits |
US11734959B2 (en) | 2021-03-16 | 2023-08-22 | Snap Inc. | Activating hands-free mode on mirroring device |
US20230116341A1 (en) * | 2021-09-30 | 2023-04-13 | Futian ZHANG | Methods and apparatuses for hand gesture-based control of selection focus |
- 1997-07-22 KR KR1019970034165A patent/KR19990011180A/en active Search and Examination
- 1998-07-21 US US09/119,636 patent/US6160899A/en not_active Ceased
- 2011-02-15 US US13/027,619 patent/USRE43184E1/en not_active Expired - Lifetime
- 2011-03-16 US US13/048,945 patent/USRE43447E1/en not_active Expired - Lifetime
Patent Citations (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4565999A (en) * | 1983-04-01 | 1986-01-21 | Prime Computer, Inc. | Light pencil |
US5252951A (en) * | 1989-04-28 | 1993-10-12 | International Business Machines Corporation | Graphical user interface with gesture recognition in a multiapplication environment |
US5319747A (en) | 1990-04-02 | 1994-06-07 | U.S. Philips Corporation | Data processing system using gesture-based input data |
US5898434A (en) * | 1991-05-15 | 1999-04-27 | Apple Computer, Inc. | User interface system having programmable user interface elements |
JPH05324181A (en) | 1992-05-26 | 1993-12-07 | Takenaka Komuten Co Ltd | Hand pointing type input device |
US5553277A (en) | 1992-12-29 | 1996-09-03 | Fujitsu Limited | Image search method for searching and retrieving desired image from memory device |
US5511148A (en) * | 1993-04-30 | 1996-04-23 | Xerox Corporation | Interactive copying system |
US5454043A (en) | 1993-07-30 | 1995-09-26 | Mitsubishi Electric Research Laboratories, Inc. | Dynamic and static hand gesture recognition through low-level image analysis |
US5617312A (en) * | 1993-11-19 | 1997-04-01 | Hitachi, Ltd. | Computer system that enters control information by means of video camera |
US6094197A (en) * | 1993-12-21 | 2000-07-25 | Xerox Corporation | Graphical keyboard |
US6525749B1 (en) * | 1993-12-30 | 2003-02-25 | Xerox Corporation | Apparatus and method for supporting the implicit structure of freeform lists, outlines, text, tables and diagrams in a gesture-based input system and editing system |
US5528263A (en) | 1994-06-15 | 1996-06-18 | Daniel M. Platzker | Interactive projected video image display system |
US5732227A (en) * | 1994-07-05 | 1998-03-24 | Hitachi, Ltd. | Interactive information processing system responsive to user manipulation of physical objects and displayed images |
US5926168A (en) * | 1994-09-30 | 1999-07-20 | Fan; Nong-Qiang | Remote pointers for interactive televisions |
US5926264A (en) * | 1994-10-12 | 1999-07-20 | The Secretary Of State For Defence In Her Britannic Majesty's Government Of The United Kingdom Of Great Britain And Northern Ireland | Position sensing of a remote target |
US5900863A (en) * | 1995-03-16 | 1999-05-04 | Kabushiki Kaisha Toshiba | Method and apparatus for controlling computer without touching input device |
US5802220A (en) * | 1995-12-15 | 1998-09-01 | Xerox Corporation | Apparatus and method for tracking facial motion through a sequence of images |
Non-Patent Citations (3)
Title |
---|
Korean Office Action dated Sep. 22, 1999. (Application No. 10-1997-0034165). |
Notice of Allowance issued in U.S. Appl. No. 13/027,619 dated Dec. 13, 2011. |
Office Action issued in U.S. Appl. No. 13/027,619 dated Jul. 1, 2011. |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150177843A1 (en) * | 2013-12-23 | 2015-06-25 | Samsung Electronics Co., Ltd. | Device and method for displaying user interface of virtual input device based on motion recognition |
US9965039B2 (en) * | 2013-12-23 | 2018-05-08 | Samsung Electronics Co., Ltd. | Device and method for displaying user interface of virtual input device based on motion recognition |
Also Published As
Publication number | Publication date |
---|---|
USRE43184E1 (en) | 2012-02-14 |
KR19990011180A (en) | 1999-02-18 |
US6160899A (en) | 2000-12-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
USRE43447E1 (en) | Method of application menu selection and activation using image cognition | |
US10606441B2 (en) | Operation control device and operation control method | |
EP2045694B1 (en) | Portable electronic device with mouse-like capabilities | |
EP0953934B1 (en) | Pen like computer pointing device | |
US8311370B2 (en) | Portable terminal and data input method therefor | |
JP5802667B2 (en) | Gesture input device and gesture input method | |
US7738916B2 (en) | Portable terminal device with built-in fingerprint sensor | |
US9400560B2 (en) | Image display device and display control method thereof | |
JPH0844490A (en) | Interface device | |
JP2000298544A (en) | Input/output device and its method | |
JP2010079332A (en) | Remote operation device and remote operation method | |
CN110944139A (en) | Display control method and electronic equipment | |
JP2008181198A (en) | Image display system | |
JPH09167049A (en) | Line of sight input device for console | |
KR101911676B1 (en) | Apparatus and Method for Presentation Image Processing considering Motion of Indicator | |
JPH1039995A (en) | Line-of-sight/voice input device | |
KR0135852B1 (en) | Operation data control device and method with remote contact input controller | |
JP2000207118A (en) | Coordinate input indicator | |
KR19990061763A (en) | Method and device for interface between computer and user using hand gesture | |
KR101085391B1 (en) | Method and Apparatus of Pointing Image | |
JP2009205525A (en) | Input device for finger-disabled people | |
JP2005190158A (en) | Method and device for selecting object on screen | |
JP2001075734A (en) | Coordinate input device | |
JPH1031549A (en) | Viewpoint tracking display method and viewpoint tracking terminal equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |