WO2012047206A1 - Entering a command - Google Patents

Entering a command Download PDF

Info

Publication number
WO2012047206A1
WO2012047206A1 (application PCT/US2010/051487)
Authority
WO
WIPO (PCT)
Prior art keywords
pattern
command
sensor
program
template
Prior art date
Application number
PCT/US2010/051487
Other languages
French (fr)
Inventor
Robert Campbell
Original Assignee
Hewlett-Packard Development Company, L.P.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hewlett-Packard Development Company, L.P.
Priority to GB1307602.1A (GB2498485A)
Priority to DE112010005854T (DE112010005854T5)
Priority to PCT/US2010/051487 (WO2012047206A1)
Priority to US13/877,380 (US20130187893A1)
Priority to CN2010800695176A (CN103221912A)
Priority to TW100127893A (TWI595429B)
Publication of WO2012047206A1

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 Control or interface arrangements specially adapted for digitisers
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • G06F3/0426 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, tracking fingers with respect to a virtual keyboard projected or printed on the surface
    • G06F3/0428 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by sensing at the edges of the touch surface the interruption of optical paths, e.g. an illumination plane, parallel to the touch surface which may be virtual
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

An embodiment provides for entering a command into a system. The method includes detecting a pattern placed in view of a sensor. The pattern can be recognized and associated with an operational code sequence. The operational code sequence may be executed when the sensor detects an intersection between the recognized pattern and an object.

Description

ENTERING A COMMAND
BACKGROUND
[0001] Early systems for entering commands into programs used keyboards to enter text strings that included the names of the commands, any input parameters, and any switches to modify operation of the commands. Over the last couple of decades, these systems have been nearly replaced by graphical input systems that use a pointing device to move an icon, such as a graphical representation of an arrow, to point at objects displayed on the screen and, then, select them for further operations. The selection may be performed, for example, by setting the icon over the object and clicking a button on the pointing device. In recent years, systems for entering commands have been developed that more strongly emulate physical reality, for example, allowing physical selection of items on a touch sensitive screen.
BRIEF DESCRIPTION OF THE DRAWINGS
[0002] Certain exemplary embodiments are described in the following detailed description and in reference to the drawings, in which:
[0003] Fig. 1 is a drawing of a system, in accordance with an embodiment;
[0004] Fig. 2 is a block diagram of a system that may be used to implement an embodiment;
[0005] Fig. 3 is a drawing of a command template in accordance with an embodiment;
[0006] Fig. 4 is an example of a template in accordance with an embodiment;
[0007] Fig. 5 is a method for entering commands into a system, in accordance with an embodiment;
[0008] Fig. 6 is a method that may be used to enter commands to a system, in accordance with an embodiment; and
[0009] Fig. 7 is a non-transitory computer readable medium that may be used to hold code modules configured to direct a processor to enter commands, in accordance with some embodiments.
DETAILED DESCRIPTION OF SPECIFIC EMBODIMENTS
[0010] Embodiments described herein provide an optical command entry system that can use an optical sensor system to enter commands selected from a template. The optical sensor system may be configured to monitor a three dimensional space in front of a monitor to determine locations of objects with respect to the display. A pattern recognition module can monitor an image of the area in front of the display as collected by the optical sensor system. If a template having printed patterns is placed in view of the sensor, the pattern recognition module may identify the patterns, map their locations, and associate them with particular commands, such as for an application. A command module may determine a location of an object, such as a finger, hand, or other object, in front of the display and, if the location of the object intersects one of the patterns, the command associated with that pattern can be passed to an application. In some embodiments, if one of the patterns is associated with a particular application, placing the template in front of the display may cause the pattern recognition module to start the associated application.
[0011] Fig. 1 is a drawing of a system 100, for example, an all-in-one computer system that can obtain control inputs from one or more sensors 102, in accordance with an embodiment. As used herein, an all-in-one computer system is a computer that includes a display, processor, memory, drives, and other functional units in a single case. However, embodiments are not limited to the all-in-one computer system, as embodiments may include a stand-alone monitor comprising sensors, or a stand-alone monitor with separate sensors attached. The sensors 102 may be constructed into the easel 104 of the system 100 or may be attached as separate units. In an embodiment, the sensors 102 can be positioned in each of the upper corners of a display 106. In this embodiment, each sensor 102 can cover an overlapping volume 108 of a three dimensional space in front of the display 106.
[0012] The sensors 102 may include motion sensors, infrared sensors, cameras, infrared cameras, or any other device capable of capturing an image. In an embodiment, the sensors 102 may include an infrared array or camera that senses the locations of targets using a time-of-flight calculation for each pixel in the infrared array. In this embodiment, an infrared emitter can emit pulses of infrared light, which are reflected from a target and returned to the infrared array. A computational system associated with the infrared array uses the time it takes for the infrared light to reach a target and be reflected back to the infrared sensor array to generate a distance map, indicating the distance from the sensor to the target for each pixel in the infrared sensor array. The infrared array can also generate a raw infrared image, in which the brightness of each pixel represents the infrared reflectivity of the target image at that pixel. However, embodiments are not limited to an infrared sensor array, as any number of other sensors that generate an image may be used in some embodiments.
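To make the time-of-flight arithmetic concrete, here is a minimal sketch, not taken from the patent, of converting per-pixel round-trip times into a distance map; the array shape and the nanosecond values are illustrative assumptions.

```python
import numpy as np

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_map(round_trip_times: np.ndarray) -> np.ndarray:
    """Convert per-pixel round-trip times (seconds) into distances (metres).

    The pulse travels to the target and back, so the one-way distance is half
    the round-trip time multiplied by the speed of light.
    """
    return 0.5 * SPEED_OF_LIGHT * round_trip_times

# Example: a 4x4 sensor patch reporting ~6.7 ns round trips (target about 1 m away).
times = np.full((4, 4), 6.7e-9)
print(distance_map(times))  # roughly 1.0 metre per pixel
```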
[0013] The volume 108 imaged by the sensors 102 can extend beyond the display 106, for example, to a surface 110 which may be supporting the system 100, a keyboard 112, or a mouse 114. A template 116 may be placed on the surface 110 in front of the system 100 in view of the sensors 102. The system 100 may be configured to note the presence of the template 116, for example, by recognizing patterns 118 on the template. For example, the system may recognize an identifying pattern 120 associated with a particular program, such as a drawing application or a computer aided drafting program, among others, or may recognize patterns associated with individual commands. The pattern recognition may be performed by any number of techniques known in the art, for example, generating a hash code from the pattern and comparing the hash code to a library of codes. Any number of other techniques may also be used.
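The patent leaves the hashing technique open; the sketch below assumes, purely for illustration, an average-hash over a grayscale pattern image with a plain dictionary standing in for the library of codes. All names here are invented for the example.

```python
from typing import Optional

import numpy as np

def average_hash(pattern: np.ndarray, size: int = 8) -> int:
    """Downsample the pattern to size x size blocks, threshold at the mean, and pack the bits."""
    h, w = pattern.shape
    bh, bw = h // size, w // size
    blocks = pattern[:bh * size, :bw * size].reshape(size, bh, size, bw).mean(axis=(1, 3))
    bits = (blocks > blocks.mean()).flatten()
    return int("".join("1" if b else "0" for b in bits), 2)

# Hypothetical library mapping hash codes to commands or program identifiers.
pattern_library: dict[int, str] = {}

def register(pattern: np.ndarray, command: str) -> None:
    pattern_library[average_hash(pattern)] = command

def recognize(pattern: np.ndarray) -> Optional[str]:
    return pattern_library.get(average_hash(pattern))

# Toy usage: a bright square on a dark background registered as the "fill" command.
template = np.zeros((64, 64))
template[16:48, 16:48] = 255
register(template, "fill")
print(recognize(template))  # fill
```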
[0014] The system 100 may respond in a number of ways to recognizing a pattern, for example, the identifying pattern 120 on the template 116. In one embodiment, the system 100 may start a program associated with the identifying pattern 120. The system 100 may analyze the template 116 for other patterns, which can be associated with specific functions, such as save 122, undo 124, redo 126, or fill 128, among many others.
[0015] The system 100 can allow gestures to be used for interfacing with programs. For example, an item 130 in a program, shown on the display 106, may be selected by a gesture, such as by using a finger 132 to touch the location of the item 130 on the display 106. Further, a function identified on the template 116 may be selected, for example, by using a finger 132 to touch the relevant pattern 128. Touching the pattern 128 may trigger an operational code sequence associated with the pattern 128, for example, filling a previously selected item 130 with a color. Any number of functions and/or shapes may be used in association with a selected item, or with open documents, the operating system itself, and the like, such as printing, saving, deleting, or closing programs, among others. Removing the template 116, or other patterns, from the view of the sensors 102 may trigger actions, such as querying the user about closing the program, saving the document, and the like.
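As a rough illustration of how a touched pattern might be routed to its operational code sequence, the following sketch uses made-up region bounds and handlers; it is not the implementation described in the patent.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Optional, Tuple

@dataclass
class PatternRegion:
    name: str
    bounds: Tuple[float, float, float, float]  # x_min, y_min, x_max, y_max on the surface

    def contains(self, x: float, y: float) -> bool:
        x_min, y_min, x_max, y_max = self.bounds
        return x_min <= x <= x_max and y_min <= y <= y_max

# Hypothetical handlers standing in for the "operational code sequences".
COMMANDS: Dict[str, Callable[[], None]] = {
    "save": lambda: print("save document"),
    "fill": lambda: print("fill selected item with color"),
}

def dispatch_touch(x: float, y: float, regions: List[PatternRegion]) -> Optional[str]:
    """Run the command whose pattern region contains the touch point, if any."""
    for region in regions:
        if region.contains(x, y):
            COMMANDS[region.name]()
            return region.name
    return None

regions = [PatternRegion("save", (0.0, 0.0, 0.2, 0.1)),
           PatternRegion("fill", (0.3, 0.0, 0.5, 0.1))]
print(dispatch_touch(0.35, 0.05, regions))  # prints the fill handler output, returns "fill"
```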
[0016] Fig. 2 is a block diagram of a system 200 that may be used to implement an embodiment. The system 200 may be implemented by an all-in-one computer system 202, or may be implemented using a modular computer system. In a modular system, for example, the sensors can be built into a monitor, can be constructed to fit over a top surface of the monitor, or may be free standing sensors placed in proximity to the monitor.
[0017] In the all-in-one computer system 202, a bus 204 can provide communications between a processor 206 and a sensor system 208, such as the sensors 102 described with respect to Fig. 1. The bus 204 may be a PCI, PCIe, or any other suitable bus or communications technology. The processor 206 may be a single core processor, a multi-core processor, or a computing cluster. The processor 206 can access a storage system 210 over the bus 204. The storage system 210 may include any combination of non-transitory, computer readable media, including random access memory (RAM), read only memory (ROM), hard drives, optical drives, RAM drives, and the like. The storage system 210 can hold code and data structures used to implement embodiments of the present techniques, including, for example, a sensor operations module 212 configured to direct the processor 206 to operate the sensor system 208. A pattern recognition module 214 may include code to direct the processor 206 to obtain a pattern from the sensor system 208 and convert the pattern to a mathematical representation that can identify the pattern. The pattern recognition module 214 may also include a data structure that holds a library of patterns, for example, converted into mathematical representations. A command entry module 216 may use the sensor operations module 212 to determine if a command on a template has been selected and pass the appropriate command string on to an application 218.
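The division of labor among the modules held in the storage system 210 might be organized roughly as follows; the classes and the toy "frames" are stand-ins invented for illustration, not code from the patent.

```python
class SensorOperationsModule:
    """Directs reads from the sensor system (a canned list of frames in this toy)."""
    def __init__(self, sensor_system):
        self._frames = sensor_system

    def next_frame(self):
        return self._frames.pop(0) if self._frames else None


class PatternRecognitionModule:
    """Holds a library of pattern representations and identifies frames against it."""
    def __init__(self, library):
        self._library = library  # representation -> command string

    def identify(self, frame):
        return self._library.get(frame)


class CommandEntryModule:
    """Determines whether a command was selected and passes the command string to the application."""
    def __init__(self, recognizer, application):
        self._recognizer = recognizer
        self._application = application

    def process(self, frame):
        command = self._recognizer.identify(frame)
        if command is not None:
            self._application(command)


# Toy usage: "frames" are labels a real recognizer would derive from sensor images.
sensor_ops = SensorOperationsModule(["save_pattern", "unknown", "undo_pattern"])
entry = CommandEntryModule(
    PatternRecognitionModule({"save_pattern": "save", "undo_pattern": "undo"}),
    application=print,
)
while (frame := sensor_ops.next_frame()) is not None:
    entry.process(frame)  # prints "save" then "undo"
```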
[0018] Other units are generally included in the all-in-one computer system 202 to provide functionality. For example, a human-machine interface may be included to interface to a keyboard or a pointing device. In some embodiments, one or both of the pointing device and keyboard may be omitted in favor of using the functionality provided by the sensor system, for example, using an onscreen keyboard or a keyboard provided, or projected, as a template. A display 220 will generally be built into the all-in-one computer system 202. As shown herein, the display 220 includes driver electronics, coupled to the bus 204, as well as the screen itself. Other units that may be present include a network interface card (NIC) for coupling the all-in-one computer to a network 226. The NIC can include an Ethernet card, a wireless network card, a mobile broadband card, or any combination thereof.
[0019] Fig. 3 is a drawing of a command template 300 that can be used to operate programs, in accordance with an embodiment. In this embodiment, no specific pattern identifies a program for use with the template. Instead, the application can be manually started or may be automatically triggered by recognition of an ensemble of patterns, for example, an ensemble used to operate a media player, such as WINDOWS MEDIA PLAYER®, REAL PLAYER®, iTUNES®, and the like. The patterns may include buttons for play 302, stop 304, rewind 306, pause 308, volume up 310, and volume down 312, among others. It will be recognized that the controls are not limited to these buttons or this arrangement, as any number of other controls may be used. Such additional controls may include further icons or may include text buttons, such as a button 314 for selecting other media, or a button 316 for getting information on a program. The template 300 may be printed and distributed with a system. Alternatively, the template 300 may be printed out or hand drawn by a user. For example, for a computer system using an infrared sensor, the patterns may be created using an infrared absorbing material, such as the toner in a laser printer or a graphite pencil. Templates may also be supplied by software companies with programs, as discussed with respect to Fig. 4.
[0020] Fig. 4 is an example of a template 400 that may be supplied with a commercial program, in accordance with an embodiment. As discussed previously, the template 400 may have a program pattern 402 that can identify a program. Placing the template 400 in view of the sensors 102 (Fig. 1) may result in automatic activation of the associated program. Alternatively, a user may activate the program manually.
[0021] Command patterns 404 on the template 400 may be recognized and associated with commands for the associated program. For example, the command patterns 404 may include commands such as save 406, open 408, line draw 410, and the like. Selecting a command, such as by touching a command pattern 404 on the template, can be used to activate the associated command, for example, generally following the method shown in Fig. 5.
[0022] Fig. 5 is a method 500 for entering commands into a system, in accordance with embodiments of the present techniques. The system may be the system discussed with respect to Figs. 1 and 2. The method 500 begins at block 502 when the system detects that a template or pattern is present. The detection may be based on identifying a pattern present in view of an imaging sensor. The pattern may be drawn or printed on the template, but is not limited to any particular implementation. Indeed, the pattern may be hand drawn on the desktop in front of the system, so long as the computer can recognize the shape as identifying a program or command.
[0023] At block 504, the patterns on the template may be recognized, for example, by comparing a hash code generated from the pattern to a library of codes stored for various patterns. Once a pattern is identified, at block 506, it may be associated with an operational code sequence, such as a command for a program. The program may be manually selected by the user or may be automatically selected by a pattern on the template. Further, equivalent patterns may be associated with different commands depending on the program selected. For example, the play 302 and rewind 306 patterns discussed with respect to Fig. 3 may be associated with channel up and channel down, respectively, in a television tuner application. If a user should select a different program, the patterns may be automatically associated with the correct command, for example, for the program currently selected for display.
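One simple way to realize this program-dependent mapping is a per-program lookup table, sketched here with hypothetical pattern identifiers and program names.

```python
from typing import Optional

# Hypothetical per-program command maps: the same pattern resolves to different
# commands depending on which program is currently selected for display.
COMMAND_MAPS = {
    "media_player": {"triangle": "play", "double_bar": "pause", "left_arrows": "rewind"},
    "tv_tuner": {"triangle": "channel_up", "left_arrows": "channel_down"},
}

def command_for(pattern_id: str, active_program: str) -> Optional[str]:
    """Resolve a recognized pattern to the command for the currently active program."""
    return COMMAND_MAPS.get(active_program, {}).get(pattern_id)

print(command_for("triangle", "media_player"))  # play
print(command_for("triangle", "tv_tuner"))      # channel_up
```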
[0024] Fig. 6 is a method 600 that may be used to enter commands to a computer system, in accordance with an embodiment. The method 600 begins at block 602 with the computer system detecting a template. The detection may look for all of the patterns present in a library of patterns or may look for patterns that identify specific programs. The latter situation may be used for lowering computational costs on a system when a large number of patterns are present. If a template is recognized as being present at block 604, flow proceeds to block 606, at which the patterns are recognized and associated with relevant commands. At block 608, a program associated with a pattern on the template may be automatically loaded. However, embodiments are not limited to the automatic loading of a program. In some embodiments, a user may manually select a program to be used with the template.
[0025] After patterns are associated with commands for a loaded program, at block 610, the computer system may identify an input corresponding to a user action. The input may include the user touching a pattern on a template with a finger or other object. For example, a detection system within the computer system may locate an object in the three dimensional space in front of the screen. When the object and a command location, such as a pattern on the template, intersect, the detection system may send a command to the program through the operating system. In some embodiments, the object may include three dimensional shapes that activate specific commands, or code modules, that are relevant to the shape and the location selected.
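A minimal sketch of the intersection test follows, assuming the detection system reports a fingertip position in surface-aligned coordinates and that each pattern occupies a rectangle on the desk; the region bounds and the 1 cm touch threshold are assumptions, not values from the patent.

```python
from typing import Optional, Tuple

# Each pattern occupies an axis-aligned region on the desk surface (z = 0 in these coordinates).
PATTERNS = {
    "save": (0.10, 0.30, 0.05, 0.15),  # x_min, x_max, y_min, y_max in metres
    "fill": (0.35, 0.55, 0.05, 0.15),
}
TOUCH_THRESHOLD = 0.01  # a fingertip within 1 cm of the surface counts as touching it

def intersected_pattern(fingertip: Tuple[float, float, float]) -> Optional[str]:
    """Return the pattern whose region the tracked fingertip is touching, if any."""
    x, y, z = fingertip
    if z > TOUCH_THRESHOLD:
        return None  # hovering above the surface, no intersection yet
    for name, (x_min, x_max, y_min, y_max) in PATTERNS.items():
        if x_min <= x <= x_max and y_min <= y <= y_max:
            return name
    return None

print(intersected_pattern((0.40, 0.10, 0.005)))  # fill
print(intersected_pattern((0.40, 0.10, 0.200)))  # None (object not touching the surface)
```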
[0026] An example of such a shape could be a pyramidal object that represents a printer. If the printer shape is touched to a pattern on the template, the associated command may be executed with a parameter controlled by the shape. Such shapes may also represent a program parameter, such as an operational selection. For example, touching a first shape to a pattern on a template may initiate a code module that prints the object, while touching a second shape to a pattern on a template may initiate a code module that saves the current file. Other shapes may activate code modules that modify the object or transmit the data representing the object to another system or location.
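A shape-dependent parameter could be modeled as a second lookup keyed on the recognized shape, as in this hypothetical fragment.

```python
# Hypothetical mapping from a recognized 3-D shape to the action it stands for.
SHAPE_ACTIONS = {
    "pyramid": "print",  # the pyramidal object representing a printer
    "cube": "save",
}

def execute(pattern_target: str, shape: str) -> str:
    """Combine the touched pattern with the shape that touched it into one action string."""
    action = SHAPE_ACTIONS.get(shape, "select")
    return f"{action} {pattern_target}"

print(execute("current file", "pyramid"))  # print current file
print(execute("current file", "cube"))     # save current file
```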
[0027] If a template pattern has been selected at block 612, process flow proceeds to block 614 where an associated command can be entered into the program. At block 616, the system may determine if the template has been removed from the scanned area. If not, process flow may return to block 610 to continue looking for user input. While the computer system is specifically looking for input relevant to the template present, it may detect the placement of another template in view of the imaging sensors, for example, by continuing to execute block 602 in parallel.
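The loop spanning blocks 610 through 618 might look roughly like the sketch below, with the four callables standing in for the system's real detection, command-entry, and shutdown behavior.

```python
def run_template_session(template_present, get_selection, enter_command, close_program):
    """Keep entering commands while the template stays in view, then close out.

    Mirrors blocks 610-618 of Fig. 6 at a very high level; all four callables
    are assumptions supplied by the caller.
    """
    while template_present():          # block 616: has the template been removed?
        selection = get_selection()    # blocks 610/612: look for a selected pattern
        if selection is not None:
            enter_command(selection)   # block 614: pass the command to the program
    close_program()                    # block 618: template removed, close out

# Toy usage: two selections are made, then the template disappears from view.
events = iter(["save", None, "undo"])
presence = iter([True, True, True, False])
run_template_session(
    template_present=lambda: next(presence, False),
    get_selection=lambda: next(events, None),
    enter_command=lambda cmd: print("enter", cmd),
    close_program=lambda: print("closing program"),
)
```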
[0028] If at block 616 it is determined that the template is no longer in the imaged volume in front of the computer system, process flow may proceed to block 618, at which the system may perform a series of actions to close the program. However, embodiments are not limited to automatically closing the program, as the user may manually close the program at any time. In an embodiment, removing the template may have no effect except to eliminate selection of the associated commands using the template. The system may also take other actions to close out the program, such as saving the files in the program or prompting a user to save the files.
[0029] Fig. 7 is a non-transitory computer readable medium 700 that may be used to hold code modules configured to direct a processor 702 to enter commands, in accordance with some embodiments. The processor 702 may include a single core processor, a multi-core processor, or a computing cluster. The processor 702 may access the non-transitory computer readable medium 700 over a bus 704, including, for example, a PCI bus, a PCIe bus, an Ethernet connection, or any number of other communications technologies. The code modules may include a pattern detection module 706, configured to direct a processor to detect a pattern placed in view of a sensor, as described herein. A pattern recognition module 708 may recognize the pattern, and, in some embodiments, start an associated program. A pattern association module 710 may recognize patterns in view of the sensor and associate the patterns with particular operational code sequences, such as commands. A command entry module 712 may detect an intersection of an object, such as a hand or other three dimensional shape, with a pattern, and enter the associated command to a program.

Claims

CLAIMS What is claimed is:
1. A method for entering a command into a system, comprising:
detecting a pattern placed in view of a sensor;
recognizing the pattern;
associating the recognized pattern with an operational code sequence; and
executing the operational code sequence, based, at least in part, on an intersection of the recognized pattern and an object detected by the sensor.
2. The method of claim 1, wherein detecting a pattern comprises analyzing an image obtained from the sensor.
3. The method of claim 2, comprising changing a parameter provided to the operational code sequence based, at least in part, on a shape of an object contacting the recognized pattern.
4. The method of claim 3, wherein the parameter may determine an action taken by the operational code sequence.
5. The method of claim 1, comprising activating a program when a pattern associated with the program is detected.
6. The method of claim 1, comprising:
detecting when the recognized pattern is removed from view of the system; and
performing actions to close the program.
7. A command entry system, comprising:
a processor;
a display;
a sensor configured to obtain input from a volume;
a command module configured to direct the processor to:
identify a command based, at least in part, on an image identified in the volume by a pattern recognition module; and
determine if the command has been selected, based, at least in part, on an intersection of the pattern and an object detected by the sensor.
8. The command entry system of claim 7 comprising a template comprising a plurality of patterns.
9. The command entry system of claim 8, wherein an identifying pattern in the plurality of patterns is associated with one of a plurality of applications, and, when the pattern recognition module identifies the identifying pattern, the command module starts the associated one of the plurality of programs.
10. The command entry system of claim 7, comprising an all-in-one computer system.
11. The command entry system of claim 8, wherein the plurality of patterns are printed in an infrared absorbing material.
12. The command entry system of claim 7, wherein the object represents an action that may be taken by a program.
13. The command entry system of claim 7, comprising a stand-alone monitor having an associated sensor.
14. A non-transitory, computer readable medium comprising code configured to direct a processor to:
detect a pattern placed in view of a sensor;
recognize the pattern;
associate the recognized pattern with an operational code sequence; and
execute the operational code sequence, based, at least in part, on an intersection of the recognized pattern and an object detected by the sensor.
15. The non-transitory, computer readable medium of claim 14, comprising code configured to direct the processor to analyze images obtained from the sensor.
PCT/US2010/051487 2010-10-05 2010-10-05 Entering a command WO2012047206A1 (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
GB1307602.1A GB2498485A (en) 2010-10-05 2010-10-05 Entering a command
DE112010005854T DE112010005854T5 (en) 2010-10-05 2010-10-05 Enter a command
PCT/US2010/051487 WO2012047206A1 (en) 2010-10-05 2010-10-05 Entering a command
US13/877,380 US20130187893A1 (en) 2010-10-05 2010-10-05 Entering a command
CN2010800695176A CN103221912A (en) 2010-10-05 2010-10-05 Entering a command
TW100127893A TWI595429B (en) 2010-10-05 2011-08-05 Entering a command

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2010/051487 WO2012047206A1 (en) 2010-10-05 2010-10-05 Entering a command

Publications (1)

Publication Number Publication Date
WO2012047206A1 true WO2012047206A1 (en) 2012-04-12

Family

ID=45927996

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2010/051487 WO2012047206A1 (en) 2010-10-05 2010-10-05 Entering a command

Country Status (6)

Country Link
US (1) US20130187893A1 (en)
CN (1) CN103221912A (en)
DE (1) DE112010005854T5 (en)
GB (1) GB2498485A (en)
TW (1) TWI595429B (en)
WO (1) WO2012047206A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140152622A1 (en) * 2012-11-30 2014-06-05 Kabushiki Kaisha Toshiba Information processing apparatus, information processing method, and computer readable storage medium

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10353488B2 (en) 2014-05-30 2019-07-16 Hewlett-Packard Development Company, L.P. Positional input on displays
US20170351336A1 (en) * 2016-06-07 2017-12-07 Stmicroelectronics, Inc. Time of flight based gesture control devices, systems and methods

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050012721A1 (en) * 2003-07-18 2005-01-20 International Business Machines Corporation Method and apparatus for providing projected user interface for computing device
KR100631779B1 (en) * 2005-10-07 2006-10-11 삼성전자주식회사 Data input apparatus and method for data input detection using the same
KR20070071187A (en) * 2005-12-29 2007-07-04 삼성전자주식회사 Method and apparatus of multi function virtual user interface
KR100756521B1 (en) * 2006-05-03 2007-09-10 포텍마이크로시스템(주) Projection keyboard system for child education and method thereof

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
MY118364A (en) * 1996-11-26 2004-10-30 Sony Corp Information input method and apparatus using a target pattern and an access indication pattern
US5909211A (en) * 1997-03-25 1999-06-01 International Business Machines Corporation Touch pad overlay driven computer system
US6104604A (en) * 1998-01-06 2000-08-15 Gateway 2000, Inc. Modular keyboard
US6614422B1 (en) * 1999-11-04 2003-09-02 Canesta, Inc. Method and apparatus for entering data using a virtual input device
US8035612B2 (en) * 2002-05-28 2011-10-11 Intellectual Ventures Holding 67 Llc Self-contained interactive video display system
CN1926497A (en) * 2003-12-09 2007-03-07 雷阿卡特瑞克斯系统公司 Interactive video display system
US7664323B2 (en) * 2005-01-28 2010-02-16 Microsoft Corporation Scalable hash-based character recognition
KR100987248B1 (en) * 2005-08-11 2010-10-12 삼성전자주식회사 User input method and apparatus in mobile communication terminal
US7770118B2 (en) * 2006-02-13 2010-08-03 Research In Motion Limited Navigation tool with audible feedback on a handheld communication device having a full alphabetic keyboard
CN101589425A (en) * 2006-02-16 2009-11-25 Ftk技术有限公司 A system and method of inputting data into a computing system
JP2009245392A (en) * 2008-03-31 2009-10-22 Brother Ind Ltd Head mount display and head mount display system
US20100177035A1 (en) * 2008-10-10 2010-07-15 Schowengerdt Brian T Mobile Computing Device With A Virtual Keyboard
US20110307842A1 (en) * 2010-06-14 2011-12-15 I-Jen Chiang Electronic reading device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050012721A1 (en) * 2003-07-18 2005-01-20 International Business Machines Corporation Method and apparatus for providing projected user interface for computing device
KR100631779B1 (en) * 2005-10-07 2006-10-11 삼성전자주식회사 Data input apparatus and method for data input detection using the same
KR20070071187A (en) * 2005-12-29 2007-07-04 삼성전자주식회사 Method and apparatus of multi function virtual user interface
KR100756521B1 (en) * 2006-05-03 2007-09-10 포텍마이크로시스템(주) Projection keyboard system for child education and method thereof

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140152622A1 (en) * 2012-11-30 2014-06-05 Kabushiki Kaisha Toshiba Information processing apparatus, information processing method, and computer readable storage medium

Also Published As

Publication number Publication date
US20130187893A1 (en) 2013-07-25
GB201307602D0 (en) 2013-06-12
DE112010005854T5 (en) 2013-08-14
TW201222425A (en) 2012-06-01
CN103221912A (en) 2013-07-24
TWI595429B (en) 2017-08-11
GB2498485A (en) 2013-07-17

Similar Documents

Publication Publication Date Title
US11048333B2 (en) System and method for close-range movement tracking
JP5991041B2 (en) Virtual touch screen system and bidirectional mode automatic switching method
US9207806B2 (en) Creating a virtual mouse input device
US9910498B2 (en) System and method for close-range movement tracking
US8433138B2 (en) Interaction using touch and non-touch gestures
US20130191768A1 (en) Method for manipulating a graphical object and an interactive input system employing the same
US7904837B2 (en) Information processing apparatus and GUI component display method for performing display operation on document data
US9035882B2 (en) Computer input device
JP5306528B1 (en) Electronic device and handwritten document processing method
CN101636711A (en) Gesturing with a multipoint sensing device
US20140161309A1 (en) Gesture recognizing device and method for recognizing a gesture
Zhang et al. Gestkeyboard: enabling gesture-based interaction on ordinary physical keyboard
WO2012094740A1 (en) Method for supporting multiple menus and interactive input system employing same
US10402080B2 (en) Information processing apparatus recognizing instruction by touch input, control method thereof, and storage medium
US20150169134A1 (en) Methods circuits apparatuses systems and associated computer executable code for providing projection based human machine interfaces
TW201421322A (en) Hybrid pointing device
CN103164160A (en) Left hand and right hand interaction device and method
US9183276B2 (en) Electronic device and method for searching handwritten document
US9940536B2 (en) Electronic apparatus and method
US20130187893A1 (en) Entering a command
US9542040B2 (en) Method for detection and rejection of pointer contacts in interactive input systems
US20150139549A1 (en) Electronic apparatus and method for processing document
JP6821998B2 (en) Electronic blackboard, program, method
US20230070034A1 (en) Display apparatus, non-transitory recording medium, and display method
KR20140086805A (en) Electronic apparatus, method for controlling the same and computer-readable recording medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10858229

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 13877380

Country of ref document: US

WWE Wipo information: entry into national phase

Ref document number: 112010005854

Country of ref document: DE

Ref document number: 1120100058547

Country of ref document: DE

ENP Entry into the national phase

Ref document number: 1307602

Country of ref document: GB

Kind code of ref document: A

Free format text: PCT FILING DATE = 20101005

WWE Wipo information: entry into national phase

Ref document number: 1307602.1

Country of ref document: GB

Ref document number: 2010858229

Country of ref document: EP