WO2015188268A1 - Gestural interface with virtual control layers - Google Patents

Gestural interface with virtual control layers

Info

Publication number
WO2015188268A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
program
gesture
robot
virtual
Application number
PCT/CA2015/050493
Other languages
French (fr)
Inventor
Hsien-Hsiang Chiu
Original Assignee
Hsien-Hsiang Chiu
Priority claimed from US 14/723,435 (external priority: US9696813B2)
Application filed by Hsien-Hsiang Chiu
Priority to CA2917590A1
Publication of WO2015188268A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482: Interaction with lists of selectable items, e.g. menus
    • G: PHYSICS
    • G08: SIGNALLING
    • G08C: TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C17/00: Arrangements for transmitting signals characterised by the use of a wireless electrical link
    • G08C17/02: Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
    • G: PHYSICS
    • G08: SIGNALLING
    • G08C: TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C2201/00: Transmission systems of control signals via wireless link
    • G08C2201/30: User interface
    • G08C2201/32: Remote control based on movements, attitude of remote control device

Definitions

  • This invention relates to an intelligent gesture-interface robot equipped with a video vision sensor that reads the user's hand gestures to operate computers, machines, and intelligent robots.
  • The unique gesture-reading method is the Gesture Interface Robot's puzzle-cell mapping: dynamic, multiple sandwiched layers of work-zone virtual touch-screen mouse, keyboard, and control panel placed in the user's comfortable gesture area, where the user simply moves a hand and pushes out to click. It is easy to operate and does not require sweeping arm motions or abnormal body postures with which users could hurt themselves or hit objects or people around them; the puzzle-cell mapping gesture method proposed here is therefore a safe and efficient solution. The user can control all kinds of computers and machines together with simple gesture actions, without having to remember which body posture maps to which command.
  • The Gesture Interface Robot's puzzle-cell mapping places dynamic, multiple sandwiched layers of virtual touch-screen mouse, keyboard, and control panel in the user's comfortable gesture area; simple hand movements and push-to-click gestures control complex machines in real time and prevent injury.
  • Too many controllers are a problem: keyboards, mice, remote controllers, smart phones, and tablet controllers each have their own key functions and operating quirks, and just turning on a TV or DVD player can require many key presses. Remembering which key is on which controller is a further problem.
  • A single gesture interface can replace controller and control-panel interfaces on equipment, transportation, cars, airplanes, spaceships, control centers, and so on, reducing wasted resources and pollution and saving money.
  • Ordinary gesture devices lack sufficient functions, require large body gestures, and cannot truly control the complex computer actions the user needs. Reducing unnecessary equipment-interface installation also benefits a spaceship by reducing weight and freeing room space.
  • The Gesture Interface Robot makes gesture control possible in all areas, and the interaction goes both ways: the robot has the intelligence to make gesture operation easy, improving our lifestyle and changing the way people operate computers, machines, and robots all around the world.
  • The IRIID Gesture Interface Robot can be the communication bridge between humans and the world of autonomous robotic machines. IRIID will change the way people operate computers, machines, and intelligent robots the world over.
  • To solve these problems and provide a better way for humans to control computers, machines, and intelligent robots, I propose the Gesture Interface Robot, which lets the user move their hands within a comfortable area to select a virtual puzzle-cell keyboard, mouse, or control-panel key and push the hand outward as the click-select action; the Gesture Interface Robot's video vision recognizes the user's selection gesture and its location relative to the center point assigned on the user.
  • The Gesture Interface Robot uses the hands' positions relative to the center point to determine the X, Y position of the selected puzzle cell. It also recognizes the push from the change in the Z-dimension distance between the hand and the user's body: when the hand pushes out, the hand-to-body distance increases, up to a maximum equal to the total hand-plus-arm length that a person can normally extend.
  • The Gesture Interface Robot virtually divides this total push-out distance into three selection zones: the first zone unlocks the user's key selection, in the second zone the hand moves up, down, left, and right to select a virtual key, and in the third zone pushing out clicks the selection. A sketch of the zone test follows.
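  • The zone test can be sketched minimally as follows (the zone-boundary fractions of the measured arm length and all names are illustrative assumptions, not the patent's code):

    // Classify the hand's push-out depth into the three selection zones.
    // pushDepth: bodyZ - handZ; armLength: measured arm-plus-hand length.
    enum SelectionZone { Unlock, Select, Click }

    static SelectionZone ClassifyZone(double pushDepth, double armLength)
    {
        // Illustrative boundaries: first third unlocks the selection,
        // middle third moves/selects, final third clicks (kept short of
        // full extension so the user does not strain the arm).
        if (pushDepth < armLength * 0.33) return SelectionZone.Unlock;
        if (pushDepth < armLength * 0.66) return SelectionZone.Select;
        return SelectionZone.Click;
    }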
  • The robot updates the visual puzzle-cell map in real time, displaying both hands' positions on the graphic puzzle-cell keyboard shown on the monitor, so the user knows which virtual key the left hand is selecting and which the right hand is selecting.
  • The selected virtual keys are highlighted with enlarged font size on the graphic puzzle-cell map keyboard as an indication; for example, the left hand's selection is highlighted in red with an enlarged font, and the right hand's selection in white with an enlarged font.
  • The robot's video sensor recognizes the user's click action, matches the X, Y position against its puzzle-cell map, translates it into a computer command, and sends the command to an automation program whose command scripts, functions, or macros are run by action triggers.
  • The robot's web-server function can open a web browser and enter a URL plus a command text code; the specific web page then opens carrying whatever command text code the user selected, embedded in the web-link format.
  • The robot's main computer can run an automation program such as EventGhost that detects the trigger action and executes the command included in the corresponding macro, so each web-page URL with a particular text code activates a different trigger and executes a different command.
  • An automation program such as EventGhost can also recognize a key-click event as a trigger, so the robot can send key clicks to trigger actions.
  • Only a limited number of computer keys can be assigned to particular commands, and an assigned physical key can no longer be used for normal typing; I therefore recommend using the IIS web-server service and activating a specific web page with a specific text code. Commands can then be assigned without limit, organized in different folders for each controlled machine, triggering macro actions while the keys remain clickable for normal computer functions. A sketch of this idea follows.
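  • A minimal sketch of the URL-trigger idea (the local URL, query form, and command code are hypothetical examples, not taken from the patent):

    using System.Diagnostics;

    // Open the default browser on a local web-server page whose query
    // string carries the command code; the automation program watches
    // for the page-open event and fires the matching macro.
    static void SendCommandViaUrl(string commandCode)
    {
        Process.Start("http://localhost/gir/command.html?code=" + commandCode);
    }

    // Example: SendCommandViaUrl("MU"); // fires the "move mouse up" macro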
  • An automation program such as EventGhost can contain many folders of saved macros with trigger actions and can detect the specific trigger command.
  • A macro can execute commands such as sending text-key commands (A to Z, 0 to 9, symbol keys, function keys) and opening computer programs: an internet browser, word processor, calculator, 3D graphic CAD drawing program, and so on.
  • An automation program such as EventGhost can also use a USB-UIRT cable to learn the signal of each function key of a physical infrared remote controller and record it in a macro action.
  • EventGhost then sends the infrared signal out through the USB-UIRT cable device; the IR signal can control a physical machine such as a computer, another machine, or an intelligent robot.
  • For example, the robot can send an IR signal out to turn a TV on or off.
  • Another computer can be equipped with an IR receiver, and the Gesture Interface Robot can then send IR signals to control it: displaying a to z, 0 to 9, symbol and function keys; opening computer programs and media; running a DVD player; playing music, video, and games; browsing the internet; and moving the mouse position with right click, left click, double click, wheel up, wheel down, and other computer functions.
  • The Gesture Interface Robot can control self-intelligent machines and intelligent robots. Soon self-driving cars, jets, spaceships, and intelligent robots will be part of people's daily lives: home, health care, education, medicine, transportation, public services, and more.
  • The robot program can instead link the USB-UIRT cable's API library directly into its own code as available functions, so the robot program itself controls the USB-UIRT cable to learn IR signals and send IR signal commands; the robot can then directly control physical machines such as TVs, computers, and other machines without a third-party automation program such as EventGhost. Similarly, the robot program can send key commands directly to the active program, for example an Enter key or typed text to Notepad or Microsoft Word, displaying the typed words in the writing program without a third-party automation program, as sketched below.
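  • A minimal sketch of the direct key-sending idea using the standard .NET SendKeys API (the text sent is only an example):

    using System.Windows.Forms;

    // Type text into whichever program currently has focus (for example
    // Notepad), then press Enter, with no third-party automation program.
    static void TypeToActiveProgram(string text)
    {
        SendKeys.SendWait(text);        // send the text characters
        SendKeys.SendWait("{ENTER}");   // send the Enter key
    }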
  • The left and right shoulder edge points can be added to the program to enhance the accuracy of the X, Y selection reading, and the hand-palm size value (fingers open versus closed and held) can be added to enhance the accuracy of the click-selection reading.
  • The robot creates a perfectly comfortable work-space zone for the user, who can move their hands in all directions within it without difficulty, preventing problems such as self-injury or hitting people or objects nearby.
  • The Gesture Interface Robot uses the puzzle-cell mapping method.
  • The robot can instantly draw virtually any virtual mouse, keyboard, or control panel the user wants, and the gesture video sensor requires only simple hand movements and click actions. The whole Gesture Interface Robot can be built from a regular computer or laptop with a video camera, with low electricity consumption and low equipment cost, and it can be used conveniently by everyone, whether walking, moving, sitting, and everywhere.
  • The Gesture Interface Robot can be used in all areas on Earth; furthermore, in a zero-gravity environment, where physical movement is difficult, it is useful in a spaceship: an astronaut can move their hands in front of them to control computers, machines, and intelligent robots easily, while also freeing room space and reducing spaceship weight.
  • The Gesture Interface Robot's vision lets the user move a hand in front of them like a fish swimming with its fins, smoothly and softly waving each finger up and down, to control a continuous click action in the 3rd (click) selection zone.
  • In the 3rd selection zone the user's palm makes this fin-waving swimming gesture as a hand sign, and the robot vision program detects the distance changes: the visible palm center blinks like a star in the night sky, each wave making one blink, and the robot clicks once per blink, continuing the click action automatically without the user pulling the hand back to the 1st zone to unlock and push out to reselect.
  • This unique fin-waving palm hand sign makes it very easy to control machines when continuous clicks are required, such as TV volume up/down or moving the computer mouse up, down, left, and right. A sketch of the detection idea follows.
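  • A minimal sketch of detecting the fin-wave "blink" (the threshold value and the frame-to-frame bookkeeping are illustrative assumptions, not the patent's code):

    // Fire one click per wave: each time the palm-center depth dips and
    // recovers by more than a small threshold while the hand stays in
    // the 3rd (click) zone, count one continuous-click event.
    class FinWaveClicker
    {
        const double WaveThreshold = 0.02; // meters, illustrative
        double lastDepth;
        bool dipped;

        public bool Update(double palmDepth)
        {
            bool click = false;
            if (lastDepth - palmDepth > WaveThreshold)
            {
                dipped = true;                 // palm pulled back slightly
            }
            else if (dipped && palmDepth - lastDepth > WaveThreshold)
            {
                click = true;                  // wave completed: one click
                dipped = false;
            }
            lastDepth = palmDepth;
            return click;
        }
    }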
  • The Gesture Interface Robot can support an advanced gesture mode: a TouchScreen Mouse whose virtual sandwich layers combine with the virtual control-panel keys zone function.
  • The robot vision program lets the user decide which hand drives the TouchScreen Mouse while the other hand clicks the virtual puzzle-cell mouse keys, which can be assigned any commands; the mouse functions include Double Click, Left Click, Right Click, Mouse Left Click Up, Mouse Left Click Down, Mouse Right Click Up, Mouse Right Click Down, Wheel Up, Wheel Down, and so on.
  • The robot program activates the virtual TouchScreen Mouse function and tracks the user's right-hand location, moving the mouse position on the display monitor accordingly.
  • The robot program moves the mouse cursor position up on the monitor according to the distance the hand moves.
  • The move distance is determined from the hand's location in the right side of the Work Zone space: the robot program calculates the ratio of the X, Y distances from the virtual center point and moves the mouse cursor the same ratio distance in the same direction. Therefore, if the user's right hand traces a circle, the mouse cursor traces a circle on the monitor in real time.
  • When the robot recognizes the click-select gesture, it performs a mouse LEFT click as the default selection click action.
  • If another mouse click action is required, the other hand can move to and click the corresponding virtual mouse puzzle-cell key. For example, the other hand clicks Double Click; the user then moves the right hand to place the TouchScreen Mouse cursor on a program icon and pushes the hand out, and the robot program performs the Double Click for that click instead of the default Left Click; the program icon is therefore double-clicked and run.
  • The other virtual mouse puzzle-cell keys are also useful when the mouse click action needs to be specific. For example, when viewing a large page or a drawing-image page, performing Left Click Down makes the whole image sheet follow the right hand in every direction; when the user has moved the image sheet to the right location, clicking the virtual Left Click Up key releases the TouchScreen Mouse grip action and returns to the default.
  • The TouchScreen Mouse can be operated by the right or left hand, and each hand's mouse cursor preferably starts at its own initial location. The robot program's vision calibrates the user's working-space zone into four sections, with X and Y dimension lines crossing at the virtual center point: section I (X+, Y+), section II (X-, Y+), section III (X+, Y-), and section IV (X-, Y-). The right hand is therefore positioned using the X, Y values of sections I and III, and the Right Hand TouchScreen Mouse function preferably starts the cursor at the monitor's LEFT-TOP corner, the video card's 0,0 position.
  • The LEFT Hand TouchScreen Mouse function preferably starts the cursor at the monitor's Right-Bottom corner; if the monitor's video card uses a resolution of 1900 x 1200, the cursor start position is 1900 x 1200 on the monitor.
  • The robot program determines its video view frame's width-to-height ratio, compares it with the monitor's screen resolution ratio, and moves the mouse cursor distance accordingly as the hand moves in any direction through 360 degrees.
  • The TouchScreen Mouse can also use gesture click actions with the computer's virtual keyboard key buttons, clicking key buttons on the computer monitor. If the computer's Windows desktop screen is tiled with clickable buttons, the user can use the TouchScreen Mouse to select which button to click by gesture action.
  • The Gesture Interface Robot can be equipped with output display device options such as a display monitor, a visual image projector that projects on any surface, or wireless monitor glasses the user wears to see the projected monitor screen in the lenses.
  • The robot can control a wireless Bluetooth card attached to a micro controller board, or a smart phone, to blink an LED light on and off displaying the selected text command in Morse code, or to generate long and short vibration signals encoding the same Morse-code text command.
  • The user can wear the wireless Morse-code text-command display device on the back of their palm with the LED lights facing toward them, or like a watch.
  • The robot program sends commands to the wireless micro controller boards to blink the LED on/off in long and short pulses indicating which command is selected, and/or to make long and short motor-vibration signals for silent reading of the text command, as sketched below.
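  • A minimal sketch of the robot-side sending code over a Bluetooth virtual serial port (the port name, baud rate, and one-character-per-signal encoding are illustrative assumptions):

    using System.IO.Ports;

    // Send a Morse pattern (for example ".-" for 'A') to the wearable
    // indicator; the microcontroller blinks the LED and/or pulses the
    // vibration motor for each dot and dash it receives.
    static void SendMorse(string pattern)
    {
        using (var port = new SerialPort("COM5", 9600)) // illustrative port
        {
            port.Open();
            port.Write(pattern);
        }
    }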
  • The Gesture Interface Robot can be equipped with wireless equipment such as Bluetooth and Wi-Fi network hardware that sends signals to control other wireless devices: smart phones, micro controller boards, machines, a car's Bluetooth system, other computers, other machines, and other network nodes, through the World Wide Web and Internet TCP/IP protocol, using server-client network software to remotely operate and diagnose and configure other robot machines, or even connecting to a space signal-transmitter station to remotely control the Hubble Space Telescope or a rover robot on Mars.
  • The Gesture Interface Robot will change the way people use computers, machines, and intelligent robots all around the world.
  • Figure 1 is a drawing showing the hardware components of the Gesture Interface Robot, its peripheral wireless network devices and display devices, and the robot vision tracking software programs.
  • Figure 2 is a drawing showing the robot vision program automatically measuring the user's work space, assigning the virtual center point, creating the work-space zone in a comfortable area, and establishing the puzzle-cell mapping keys and virtual control panel in front of the user to click.
  • Figure 3 is a drawing showing the hand pushing out in the Z dimension to click a virtual key. The Z-dimension distance between the hand palm and the user's body is divided into three zones: the 1st zone unlocks the selected-key gate, in the 2nd zone the hand moves to select a virtual key, and in the 3rd zone the hand pushes out to click the selected virtual key. The drawing also shows the unique special GIR finger hand signs that enhance selection control accuracy.
  • Figure 4 is a drawing showing a special GIR hand-sign gesture of moving the fingers like a fish swimming its fins, waving them up and down one by one in the 3rd selection zone; the vision program detects this and continuously clicks the virtual key without the hand being pulled back to the unlock-selected-key gate zone.
  • Figure 5 is a drawing showing the robot vision program tracking the user's hand positions in the work zone.
  • The robot vision program draws the virtual puzzle-cell map-key control panel as a graphic image on the display monitor.
  • Using the tracked hand locations, the vision program determines which keys are selected and highlights those puzzle cells on the display monitor as a visual indication, so the user knows which keys their right and left hands have selected.
  • Figure 6 is a drawing showing the robot vision program drawing the virtual puzzle-cell map-key control panel as a graphic image, like a watercolor painting artist (Picasso). The program draws the virtual keys as grid row and column cells, inserts a TextBlock field into each grid cell, and fills each TextBlock with the text of the command the user can select; for example, a standard QWERTY virtual puzzle-cell keyboard.
  • The robot vision program can work with the automation program to control the USB-UIRT cable and send infrared signals that remotely control another computer's keyboard and mouse operation.
  • Figure 7 is a drawing showing the vision program drawing a mouse keyboard and control panel; the user can select the virtual keys to control the mouse position and mouse click functions.
  • The virtual puzzle-cell map keyboard/control panel preferably uses a special interface arrangement divided into left-hand and right-hand zones, with the center area of the work space reserved to display a real-time video image of the user's actions; the user sees themselves on the display monitor together with the virtual keyboards, an arrangement that gives good visual feedback and indication and is easy on the eyes during operation.
  • Figure 8 is a drawing showing that the Gesture Interface Robot can create any keyboard and control panel the user wants.
  • A variety of virtual keyboards and control panels can be created, each with its own control commands filled into its row-column puzzle cells; the virtual keyboard drawings are shown as examples.
  • Figure 9 is a drawing showing more examples of virtual keyboard drawings, demonstrating that the Gesture Interface Robot can support computer operation functions.
  • Figure 10 is a drawing showing further examples of virtual keyboard drawings demonstrating computer operation functions; in addition, the robot uses peripheral devices to control network devices, computers, machines, and intelligent robots.
  • The robot can be equipped with a speech-recognition program function, using an array of microphones as a sound sensor, and with a voice-speaking program function, using speakers for voice feedback.
  • Figure 11 is a drawing showing the advanced TouchScreen Mouse combined with the puzzle-cell virtual keyboard in the sandwich-layers method.
  • Figure 12 is a drawing showing the enhanced wireless selected-key indication device, worn on the user's palm, arm, or body, which displays the selected keys by blinking an LED in Morse-code signals and/or making long-short vibrations with a vibration motor, so the user does not need to watch the display monitor to know which keys they selected; this feature is especially useful for users with poor eyesight and blind users.
  • Figure 13 is a drawing showing wireless display glasses with network protocol equipment that connect to the robot; the robot sends the puzzle-cell map display with the hand selection positions, and the glasses project the puzzle-cell image on their lenses so the user can see which keys they select.
  • Figure 14 is a drawing of the robot equipped with a mobile platform, for example using a micro controller board to control various motors, so the robot vision program can intelligently control the motors' rotation; as a result, the robot intelligently drives itself around and can aim its display projector to project puzzle-cell keyboard images on any surface.
  • The various motor control modules can be built into the robot's neck, body, arms, hands, and legs, so the robot can be built in human shape, with physical body movement combined with the Gesture Interface Robot puzzle-cell map function.
  • The Gesture Interface Robot thus becomes the communication bridge between humans and the world of intelligent robotic machines.
  • FIG. 1 is a drawing showing the hardware components of the Gesture Interface Robot (GIR), the vision tracking software programs, and the peripheral wireless network devices and display devices.
  • GIR: Gesture Interface Robot
  • A complete working example model of the Gesture Interface Robot (GIR) uses the following components.
  • Main Computer 1 serves as the robot's brain, processing video and running the robot vision puzzle cell map virtual keyboard control program (PCMVKCP) 3, automation program 2 (such as EventGhost), and web-server function 41, such as an IIS server.
  • The video vision sensor is built from several types of sensor modules 8, combining multiple microphones as sound sensor 7, Infrared Emitter 9, RGB video camera 10 (or a web camera instead), infrared-reflection detection sensor 11, 3-dimensional movement accelerometer sensor 12, speakers 13, and motor control module 17, with connected signal control line 15 for intelligent rotation in directions 16, 18; a Microsoft Kinect sensor 6, an available vision-sensor component sold on the market, can be used for this video sensor module system.
  • This invention proposes building a Universal Infrared Receiver Transmitter (UIRT) 14 into the video sensor module as an additional IR remote-control feature for physically operating machines.
  • UIRT: Universal Infrared Receiver Transmitter
  • Micro Controller Board 21 can be, for example, an Arduino board.
  • With intelligent rotation in directions 19, various types of sensor modules 24 and GPS sensor 22 can be attached to the board with connect cables 23, 25; Micro Controller Board 21 reads the external sensor signals and sends them to Main Computer 1 for processing.
  • A USB Universal Infrared Receiver Transmitter (USB-UIRT) 34, built in or attached by USB adapter cable 33, can learn, record, and send infrared signals recorded from any physical IR remote controller.
  • USB-UIRT cables can send and receive IR signals.
  • An additional IR receiver 36, built in or attached by USB adapter cable 35, can also be connected to Main Computer 1.
  • Wireless network equipment can include a Bluetooth network card 38 (built in, or via USB adapter cable 37), a Wi-Fi network card 39 (built in, or via USB adapter cable 40), and all other wireless network protocol card devices: TCP/IP and Internet protocols such as XBee, Ethernet, Wi-Fi, Bluetooth, cell-phone channels (3G, 4G, GSM, CDMA, TDMA), space telecommunication channels, and satellite channels.
  • Display devices include display monitor 43 with monitor cable 42, image projector 44, and wireless network display monitor glasses 46.
  • The wireless network can be, for example, TCP/IP or Bluetooth.
  • The Main Computer's power source is wall power plug 32 when the robot is in a fixed installed position, as is the Kinect sensor 6 power plug source.
  • The micro controller's power source can be independent or drawn from Main Computer 1 through the USB connection.
  • Mobile motor-wheel platform 28 is equipped with motorized wheels 26, 30 and motor signal control lines 27, 29 for controlling motor rotation direction and speed.
  • All of the robot's components can be placed on platform 28, and the robot can use its video vision function to drive itself and move around.
  • The portable power source 31 can be rechargeable batteries, solar cells, fuel cells, a rotation generator, a wind turbine, a thermoelectric generator (TEG), and so on, regenerating electric power for the robot's movement and operation.
  • TEG: thermoelectric generator
  • Motor modules can be built into various robot body parts (neck, center body, arms, hands, hips, legs, feet) to mimic human physical body movement; the result is a human-form Gesture Interface Robot that supports the puzzle cell map virtual keyboard gesture functions.
  • Main Computer 1 serves as the robot's brain to process the video images.
  • The 3-dimensional X, Y, Z values of the user's body-part joint locations can be read by a program written in Microsoft Visual C# 4 (or VB) that calls the Kinect and other system assembly libraries, enabling the Kinect sensor's user-joint values to be read in the program.
  • Because these basic video-sensor readings of the user's 3D body-joint values are available now, we can write a specific puzzle cell map virtual keyboard control program (PCMVKCP) 3 that transforms the basic 3D joint values, intelligently measures and calibrates a new gesture-interface input work-space zone, and establishes the puzzle-cell virtual keyboard in that zone, so the user can move their hands and push out to click virtual keys.
  • The Kinect sensor functions that read joint values can be coded into the PCMVKCP program 3, which can be a class program (for example MainWindow.xaml.cs) included with Microsoft Visual Studio C# 4 as one project and built into one project solution, preferably a WPF Application project, so that all the Kinect sensor readings are available to the PCMVKCP program in real time for creating the dynamic user graphic interface, as sketched below.
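  • A minimal sketch of reading joint values with the Kinect for Windows SDK v1 (the event and type names are that SDK's; the handler body is illustrative):

    using Microsoft.Kinect;

    // Subscribe to skeleton frames and read the right-hand joint position.
    void StartKinect()
    {
        KinectSensor sensor = KinectSensor.KinectSensors[0];
        sensor.SkeletonStream.Enable();
        sensor.SkeletonFrameReady += OnSkeletonFrameReady;
        sensor.Start();
    }

    void OnSkeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
    {
        using (SkeletonFrame frame = e.OpenSkeletonFrame())
        {
            if (frame == null) return;
            var skeletons = new Skeleton[frame.SkeletonArrayLength];
            frame.CopySkeletonDataTo(skeletons);
            foreach (var s in skeletons)
            {
                if (s.TrackingState != SkeletonTrackingState.Tracked) continue;
                SkeletonPoint hand = s.Joints[JointType.HandRight].Position;
                // hand.X, hand.Y, hand.Z feed the puzzle-cell mapping.
            }
        }
    }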
  • The Gesture Interface Robot uses the vision's 3-dimensional X, Y, Z body-part values in the robot vision puzzle cell map virtual keyboard control program (PCMVKCP) 3 to create the work zone, establish the puzzle-cell map virtual keyboards, track the user's hand locations in real time, convert them to puzzle-cell positions, match each row-column position against its puzzle-cell command map list, translate the cell position into a computer command, and send the command to automation program 2 (such as EventGhost), which runs a pre-recorded macro script to execute it: typing displayed text, running a computer program, or sending infrared signals to remote-control a TV, a DVD player, or another computer's typing, mouse movement, mouse clicks, program launching, internet browser, and other computer operations.
  • PCMVKCP: robot vision puzzle cell map virtual keyboard control program
  • Main computer 1 includes web server function 41, such as an IIS server, and can establish an internal server-client network with DNS server, TCP/IP, URL, and namespace support, hosting web sites and providing HTML, XAML, and scripting functions.
  • The PCMVKCP program 3 can activate a web browser and send a web-page URL that includes a specific text code; when that particular web page runs and opens, automation program 2 (such as EventGhost) detects the text-code trigger and fires the macro action in the corresponding folder.
  • Figure 2 illustrates the Gesture Interface Robot PCMVKCP program 3 automatically measuring the user's workspace, assigning a virtual center point, creating workspace zone 76 within comfortable area 47, and establishing the puzzle-cell mapping keys (such as 85, 86, 87, 82, 91, 92, and all other cells) as a virtual control-panel keyboard in front of the user to click.
  • Program 4 can read the user's 50 body-joint 3D values from the video sensor.
  • The length of the right shoulder 51 can be calculated from shoulder center 79 and right shoulder edge joint 52; the left shoulder 49 from shoulder center 79 and left shoulder edge joint 48; the right upper arm 53 from right shoulder edge joint 52 and right elbow joint 57; the left upper arm 75 from left shoulder edge joint 48 and left elbow joint 74; and the right lower arm 56 from right elbow joint 57 and right hand palm joint 77.
  • The user's total height can be approximated by adding all these lengths, and the user's maximum width is roughly the distance between the two shoulder edge joints. Because humans use both arms, the comfortable movement space has a limited area.
  • The comfortable area 47 is in front of the user, curving in a circle around both sides: moving the left hand past the right shoulder edge is difficult, as is moving the right hand past the left shoulder. The two comfortable-area circles create an overlap area 59 between the shoulders, and the two circles' intersection point 60 falls on the user's body center line 58.
  • The PCMVKCP program 3 assigns a virtual center point on the user, preferably the shoulder center joint point 79.
  • The workspace zone is tracked in front of the user according to the shoulder center joint point 79; therefore, when the user walks or moves, workspace zone 76 stays in the same place in front of them. While the user is inside the video view area the software keeps tracking digitally, and when the user walks past the edge of the video view area, the PCMVKCP program 3 activates the intelligent motor module to rotate the video sensor and keep it aimed at the user.
  • The program can determine that the palm center 82 of the user's left hand 73 is at puzzle cell (row 4, column 2). If left hand 73 moves up to puzzle cell 91, the position becomes (row 2, column 2); if it moves to puzzle cell 92, it becomes (row 1, column 1). A sketch of this conversion follows.
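  • A minimal sketch of the hand-to-cell conversion (the zone origin is taken at the work zone's corner; all names are illustrative, not the patent's code):

    // Map a hand's X, Y (relative to the work-zone corner) to a puzzle
    // cell; cell size = workspace-zone size divided by rows and columns.
    struct Cell { public int Row, Col; }

    static Cell HandToCell(double handX, double handY,
                           double zoneWidth, double zoneHeight,
                           int rows, int cols)
    {
        double cellW = zoneWidth / cols;
        double cellH = zoneHeight / rows;
        Cell c;
        c.Col = (int)(handX / cellW) + 1; // 1-based, as in the map list
        c.Row = (int)(handY / cellH) + 1;
        return c;
    }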
  • The maximum length of the selection click zone 88 is limited by the user's total arm-plus-hand length (75 + 72, or 53 + 56), the farthest the user can push a hand out.
  • The program defines the maximum hand push-out surface 84. For example, the user pushes left hand 73 out in direction 90, and the PCMVKCP program 3 reads the left hand palm joint's Z-dimension length value 81 growing longer (bigger) relative to the user's body Z-dimension values.
  • The reference Z value can be assigned anywhere on the user's body Z-dimension surface 80 (the center point, left shoulder joint 48, or right shoulder joint 52) when a special measurement is needed. This is useful for a handicapped user who might, for example, bite a watercolor pen to select a virtual key: vision tracking program 3 can use any specially assigned point on Z surface 80 to determine that user's click action.
  • The program recognizes the hand push-out selection click action, locks the puzzle cell (row 4, column 2) position, and matches it against the puzzle-cell map's 2-dimensional string-array code to translate the position into a computer command.
  • The selection click zone 88 is divided into three selection-mode zones. The hand push-out edge 89 at which the click is detected is preferably shorter than the maximum Z push-out surface 84, so the user does not have to push the hand muscles too far, or too rapidly and often, which could cause arm injury.
  • FIG. 3 illustrates the hand pushing out in the Z dimension to click a virtual key. The Z-dimension distance 88 between hand palm 82 and user body point 93 is divided into three zones: the 1st, unlock-selected-key gate zone 99, between user body point 93 and 1st select zone edge point 98; the 2nd, move-to-select-virtual-key zone 94, between 1st select zone edge point 98 and 2nd select zone edge point 95; and the 3rd, push-to-click zone 96, between 2nd select zone edge point 95 and 3rd select zone edge 89. The figure also shows the unique special GIR finger hand signs that enhance selection control accuracy.
  • Program 3 can detect the user's left hand palm center 82 pulling and pushing in action direction 90. In the 2nd select-key zone, the user can move the hands freely within the zone to select and change any key. By default, when the hand makes the push-out motion and the program detects the "push" action, it locks the puzzle-cell position so that it does not change even if X, Y drift during the push.
  • A special GIR gesture hand sign moves the fingers like a spider walking its legs, to change to a nearby puzzle-cell selection.
  • The user's left palm 82 can stay in the 2nd select-key zone 94 while hand fingers 103, 105, 106, 107, and 108 move like a spider's walking legs across the puzzle-cell row-column lines, as if across a spider-web net. With tiny finger steps in the up, down, left, or right direction, the program detects which puzzle cell carries the greatest weighted area of palm 82, so the user does not need a big hand movement to change to a puzzle cell just beside the currently selected one.
  • Each finger has two joint sections; for example, finger 103 has two joints 101, 102 and connects to the hand palm at joint 100.
  • When the fingers are open, the detected circle area 109 of left palm 82 has a larger diameter 104 than when all fingers are closed and held 111, where the vision tracking program detects a smaller hand-area circle 113 with diameter 112. This difference usefully enhances puzzle-cell selection: when the user locates the desired command, then closes all fingers and pushes the hand out, the program locks the puzzle row-column value, and even if the hand moves in the X, Y direction the puzzle-cell position will not change.
  • This hold-to-grip click, a special GIR hand-gesture feature, is useful when the user must click a virtual key with certainty in a rush, for example in a spaceship that is out of control, or when the user has a hand-tremor illness; the program supports that need. A sketch of the open/closed palm test follows.
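  • A minimal sketch of the open/closed palm test using the detected palm-circle diameter (the threshold factor is an illustrative assumption):

    // Grip detection: the palm circle's diameter shrinks when all
    // fingers close and hold. Compare with the calibrated open diameter.
    static bool IsGripClosed(double currentDiameter, double openDiameter)
    {
        const double ClosedFactor = 0.7; // illustrative threshold
        return currentDiameter < openDiameter * ClosedFactor;
    }

    // While IsGripClosed(...) holds during a push, the program locks the
    // selected puzzle cell even if the hand drifts in X, Y.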
  • Figure 4 is a drawing showing the special GIR hand-sign gesture for continuous clicking without pulling the hand back to unlock: fingers 100, 105, 106, 107, 108 move up and down across horizontal line 117, one by one like a fish swimming its fins, making the waving-fingers GIR hand sign in the 3rd selected click zone 96; the vision-tracking function in program 3 detects the hand-size area 116 and the distance value of hand palm center point 82.
  • The robot vision program draws the virtual puzzle-cell map-key control panel graphic image 141 on display monitor 43.
  • Using the tracked locations of the user's hands 77, 82, the vision program determines which keys are selected and updates display monitor 43 in real time, highlighting 130, 138 in different colors and enlarging the font size on the selected puzzle cells 132, 136 as visual indication, so the user knows which keys the right hand palm 77 and left hand palm 82 have selected.
  • The graphic puzzle-cell map image's center point 133 is matched with virtual workspace center point 79, so the hands' X, Y values map correctly onto the selected keys of the graphic puzzle-cell map image. If left hand palm 82 moves in up direction 139, the highlight changes to puzzle cell 140; if the left hand moves down and out of the puzzle-cell map area 137, the program indicates no selected key.
  • FIG. 6 is a drawing showing the robot vision program drawing the virtual puzzle-cell map-key control panel graphic image 141 like a watercolor painting artist.
  • The robot vision program can work with automation program 2 (for example EventGhost) to control USB-UIRT cable 34, sending infrared signals 171 that remote-control another computer 164 fitted with IR receiver 172: controlling its keyboard to type and display "X" and "0" in notepad program 167 on its monitor; and, when the puzzle cell loads the mouse keys, the user can click to send mouse-movement IR signals 171 that move the other computer's 164 mouse 168 position and perform mouse 168 click operations.
  • The command-execution signal can also be sent through a Bluetooth device to control a Bluetooth micro controller board worn by the user, blinking an LED in Morse code or vibrating long-short as a Morse-code signal; or through Wi-Fi network device 39 and a TCP/IP Internet server-client program to control another node on the network, whether computer, machine, or intelligent robot.
  • Automation program 2, such as EventGhost, can create many folders holding saved macro scripts with trigger actions and can detect specific trigger command events. The macros can execute commands such as sending text-key commands (A to Z, 0 to 9, symbol keys, function keys) and opening computer programs (internet browser, word processor, calculator, 3D graphic CAD drawing program, and so on); in addition, automation program 2 can use USB-UIRT cable 34 to learn each function key's signal from a physical infrared remote controller, record it, and send it out through a macro script action.
  • EventGhost sends infrared signal 171 out through USB-UIRT cable device 34; the IR signal can control a physical machine such as computer 164, another machine, or an intelligent robot.
  • For example, the robot sends IR signal 171 out to turn a TV on or off.
  • Another computer 164 can be equipped with IR receiver 172, and the Gesture Interface Robot can then send IR signals 171 to control it: displaying a to z, 0 to 9, symbol and function keys; opening computer programs and media; running a DVD player; playing music, video, and games; browsing the internet; and moving mouse 168 with right click, left click, double click, wheel up, wheel down, and other computer functions.
  • The Gesture Interface Robot can control self-intelligent machines and intelligent robots. Soon self-driving cars, jets, spaceships, and intelligent robots will be part of people's daily lives: home, health care, education, medicine, transportation, public services, and more.
  • The robot program can include its own automation functions within robot program 3: by linking the USB-UIRT cable's API library directly into the PCMVKCP program 3 code as available functions, the robot program can itself control the USB-UIRT cable to learn, record, and send IR signal commands, directly controlling physical machines such as TVs, computers, and other machines without a third-party automation program 2 such as EventGhost. Similarly, the robot program can send key commands directly to the active program, for example an Enter key or typed text to Notepad or Microsoft Word, displaying the typed words in the writing program, again without a third-party automation program.
  • Figure 7 is a drawing showing the vision program drawing a mouse keyboard control panel on the virtual puzzle-cell map graphic image, divided into two mouse sections: Left Hand mouse 186 and Right Hand mouse 174.
  • The program fills the command text into each cell's TextBlock, beginning with TextBlock 185: "Mouse Up" into TextBlock 184, "Mouse Left" into TextBlock 183, "Mouse 225" into TextBlock 182, "Double Click" into TextBlock 181, "Left Click" into TextBlock 180, "Right Click" into TextBlock 179, and all other keys.
  • The virtual puzzle-cell map keyboard/control panel preferably uses a special interface arrangement divided into Left and Right hand zones. The center area 173 of the graphic image is reserved for a real-time video image 187 showing the user's actions 188, so the user can see themselves and all the control virtual keys together on the monitor; this special virtual gesture-interface arrangement gives good visual feedback and indication and is easy on the eyes during operation.
  • In a real example of the GIR robot program's mouse-key control interface arrangement, the panel preferably supports both a Left Hand Mouse Key area and a Right Hand Mouse Key area, each with movement keys in all directions: UP, DOWN, LEFT, RIGHT, and 45-, 135-, 225-, and 315-degree keys.
  • The mouse movement can have one small-move key per direction (UP, Down, Left, Right, 45, 135, 225, 315). This is useful when the mouse is near the target to be clicked: a tiny movement settles the mouse onto the target.
  • The mouse movement can also have one large-move key per direction (UP8, DOWN8, LEFT8, RIGHT8, 45-8, 135-8, 225-8, 315-8), where "8" means eight times the small move distance. This is useful when the mouse is some distance from the target: the large movement reaches the target with fewer click gestures.
  • Mouse-key selection clicks are not locked in the 3rd selection click zone; all mouse keys can be clicked again in the 3rd select click zone without pulling the hand back.
  • Combined with the fish-fin swimming gesture, the user can very easily steer the mouse accurately onto the target and perform the mouse click functions. See the "//" comments beside the code below for the key move distances and multiple-speed keys.
  • The puzzle-cell size (H x W) is calculated by dividing the workspace-zone size (H x W) by the number of rows and columns.
    // Excerpt from the mouse-control-panel puzzle-cell map, a 2-D string
    // array whose entries name the command for each cell.
    puzzleCellMapList[1, 1]  = "";        // first row reserved for robot menu
    puzzleCellMapList[2, 1]  = "MKEY";
    puzzleCellMapList[3, 1]  = "";
    puzzleCellMapList[4, 1]  = "";
    puzzleCellMapList[3, 2]  = "MU8";     // re-clickable, large move
    puzzleCellMapList[4, 2]  = "MU8";     // move mouse large distance up
    puzzleCellMapList[5, 2]  = "MU8";
    puzzleCellMapList[13, 2] = "MU8";
    puzzleCellMapList[14, 2] = "MU8";     // move mouse large distance up
    puzzleCellMapList[15, 2] = "MU8";
    puzzleCellMapList[16, 2] = "MR45-8";  // move mouse large distance at 45 degrees
    puzzleCellMapList[17, 2] = "";
    puzzleCellMapList[1, 3]  = "";
    puzzleCellMapList[2, 3]  = "ML8";
    puzzleCellMapList[6, 3]  = "MR8";
    puzzleCellMapList[7, 3]  = "";
    puzzleCellMapList[8, 3]  = "";
    puzzleCellMapList[6, 4]  = "MR8";     // move mouse large distance right
    puzzleCellMapList[9, 4]  = "";
    puzzleCellMapList[10, 4] = "";
    puzzleCellMapList[12, 4] = "ML8";
    puzzleCellMapList[16, 4] = "MR8";
    puzzleCellMapList[17, 4] = "ENTER";
    puzzleCellMapList[1, 5]  = "";
    puzzleCellMapList[6, 5]  = "MR8";
    puzzleCellMapList[12, 5] = "";
    puzzleCellMapList[13, 5] = "ML225-8"; // large move at 225 degrees (source fragment reads "ML225")
    puzzleCellMapList[16, 5] = "MR8";
    puzzleCellMapList[17, 5] = "";
    puzzleCellMapList[1, 6]  = "";
    puzzleCellMapList[14, 6] = "MD8";     // move mouse large distance down
    puzzleCellMapList[15, 6] = "MD8";
    puzzleCellMapList[17, 6] = "";
    puzzleCellMapList[1, 7]  = "";
    puzzleCellMapList[2, 7]  = "DCLICK";  // mouse double click
    puzzleCellMapList[4, 7]  = "WWT";     // change to WWT control panel
    puzzleCellMapList[5, 7]  = "SLOT";    // change to SLOT control panel
    puzzleCellMapList[6, 7]  = "DJING";   // change to DJING control panel
    puzzleCellMapList[12, 7] = "DCLICK";
    puzzleCellMapList[13, 7] = "LCLICK";
  • The vision-tracking function in program 3 uses the puzzle-cell position (row, column) to look up that (row, column) string value in the puzzle-cell 2-dimensional string array and obtain the text command. For example, the user's right hand moves to "MU" and clicks;
  • the program then activates the specific web page, generating an HTTP browser command,
  • and the web-page link triggers the automation program EventGhost's trigger event (KEYS folder, MU event), which runs the MU macro script to send an infrared signal telling the other computer to move its mouse up a small distance. If the key is "MU8", the mouse moves up a large distance; if "ML225", it moves a small distance at 225 degrees; and if "ML225-8", it moves 8 times the small distance at 225 degrees.
  • The puzzle-cell keys can be defined in the software function coding to allow multiple clicks, multiple speeds, and different move distances; to enable several clicks from one gesture action; and to control whether a key is locked or unlocked for re-clicking in the 3rd zone.
  • The special GIR gesture hand signs make it easy to click virtual keys continuously in the 3rd selection click zone.
  • This key-definition method is used for all other keys and actions in all virtual control panels and keyboards.
  • The first row of a virtual keyboard controller is reserved for the robot function menu, and the last row is reserved for program controls, controller switching, and so on.
  • Figure 8 is a drawing showing that the Gesture Interface Robot can create any keyboard and control panel the user wants.
  • If the user selects the WWT controller, the program draws a new virtual puzzle-cell map control-panel graphic image: virtual control panel WWT 189 for the Microsoft WorldWide Telescope program.
  • It draws mouse control keys such as "Left Click" 180 and "Right Click" 179, and all other keys, each on its cell.
  • If the user selects the SLOT controller, the program redraws the graphic image as virtual control panel SLOT 196 for controlling a slot-machine simulation program.
  • If the user selects the DJING controller, the program redraws the graphic image as virtual control panel DJING 197 for controlling a disco DJ machine simulation program.
  • If the user selects the 2ndLife controller, the program redraws the graphic image as virtual control panel 2ndLife 198 for controlling a virtual 3D-world avatar in the Second Life viewer program.
  • If the user selects the ROVER controller, the program redraws the graphic image as virtual control panel ROVER 199 for a Mars rover simulation program: driving the rover robot, taking pictures, transmitting pictures back to Earth, using the claw and driller to take rock samples, and other intelligent robot operations.
  • Figure 9 is a drawing showing more examples of virtual keyboard drawings, demonstrating that the Gesture Interface Robot can use the USB-UIRT to remote-control machines such as a TV, DVD player, SIRIUS radio, and disco light, plus a special Morse keyboard.
  • If the user selects the TV controller, the program redraws the graphic image as virtual control panel TV 200 for controlling TV functions.
  • If the user selects the SIRIUS controller, the program redraws the graphic image as virtual control panel SIRIUS 203 for controlling Sirius radio functions.
  • If the user selects the Morse code Keyboard controller, the program redraws the graphic image as virtual control panel Morse code 204 for entering key functions in Morse code.
  • Puzzle cell (row 2, column 2) holds "." representing "Di", and puzzle cell (row 2, column 4) holds "-" representing "DHA"; the user clicks these cells to make "Di" and "DHA" signals.
  • The PCMVKCP program 3 includes functions that convert Morse-code signals to A to Z and 0 to 9, so the user enters Morse code and then clicks CONVERT 193 to translate it into a character and execute the command.
  • The Read command is used while entering Morse code: the user can review what code has been entered so far, can Erase it all to re-enter, and can click BKSP 190 to delete just one "Di" or "DHA" signal. A sketch of the conversion follows.
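  • A minimal sketch of the Morse-to-character conversion (only a few table entries are shown; "Di" maps to '.' and "DHA" to '-'):

    using System.Collections.Generic;

    // Convert the accumulated Morse pattern to a character on CONVERT.
    static readonly Dictionary<string, char> MorseTable =
        new Dictionary<string, char>
    {
        { ".-", 'A' }, { "-...", 'B' }, { "-.-.", 'C' }, // ... rest of A-Z
        { ".----", '1' }, { "..---", '2' },              // ... rest of 0-9
    };

    static char? MorseToChar(string pattern)
    {
        char c;
        return MorseTable.TryGetValue(pattern, out c) ? (char?)c : null;
    }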
  • This GIR Morse Code Keyboard is useful for users with poor eyesight and blind users, who can enter commands and control machines with the simplest gesture actions, "Di" and "DHA". If the user selects the SYMBOLS controller, the program redraws the graphic image as virtual control panel SYMBOLS 205 for making another computer enter and display symbol keys.
  • Figure 10 is a drawing showing more examples of virtual keyboard drawings, demonstrating that the Gesture Interface Robot can support computer operation functions.
  • If the user selects the PROGRAM controller, the program redraws the graphic image as virtual control panel PROGRAM 209 for making another computer execute and run programs.
  • For example, the user clicks "Take Picture" and the robot takes a picture of the user and saves it. If the user clicks the "LOOK UP", "LOOK RIGHT", "LOOK LEFT", or "LOOK DOWN" keys, the robot controls its motor module to rotate its video sensor up, right, left, or down.
  • In the special arrangement of the virtual puzzle-cell map graphic image, the first row area 211 is reserved for the robot operation function menu, and the last row area 212 is reserved for the program-type control panels. This makes things easier for the user: to switch to a different controller, look in the last row; to configure the robot's support functions, look in the first row of the puzzle-cell map image.
  • Using peripheral devices, the robot can control network devices, computers, machines, and intelligent robots.
  • The robot can be equipped with speech-recognition program function 213, using an array of microphones as a sound sensor, and with voice-speaking program function 214, using speakers for voice feedback.
  • The robot vision program can support a Hand Sign Language function 179. Each hand and finger gesture and position value in each video frame is compared to distinguish the hand sign within the puzzle-cell area and determine the sign-language meaning, and the program executes the command.
  • Figure 11 is a drawing showing the advanced TouchScreen Mouse 224 combined with puzzle-cell virtual keyboard 221 in the sandwich-layers method.
  • The Gesture Interface Robot supports a revolutionary new gesture-input computer interface method: an advanced gesture mode of the TouchScreen Mouse.
  • Robot vision program 3 lets the user decide which hand drives the TouchScreen Mouse 221, 222 while the other hand clicks the virtual puzzle-cell mouse keys, which can be assigned any commands; the mouse functions include Double Click 195, 175, Left Click 193, Right Click 177, Mouse Left Click Up 194, Mouse Left Click Down 192, Mouse Right Click Up 176, Mouse Right Click Down 178, 190, Wheel Up, Wheel Down, and so on.
  • When robot program 3 activates the virtual TouchScreen Mouse 224 function, it disables right-hand key selection, enables left-hand selection only on the virtual keys, and tracks the user's right hand 77, moving mouse 224 on display monitor 43 accordingly. If the user's right hand 77 moves up, the robot program moves the mouse 224 cursor up on monitor 43 in proportion to the hand's move distance 78.
  • The move distance is determined from the hand's location in the right side of Work Zone space 76: the robot program calculates the ratio of the X 234, Y 224 distances from virtual center point 79 and moves the mouse 224 cursor the same ratio distance 232 in the same direction.
  • If the user's right hand traces a circle, the mouse 224 cursor traces a circle on monitor 43 in real time.
  • If another mouse click action is required, the other hand can move to and click the corresponding virtual mouse puzzle-cell key. For example, the other hand 82 clicks Double Click 195; the user then moves right hand 77 to place the TouchScreen Mouse 224 cursor on a program icon and pushes the hand out, and robot program 3 performs Double Click 195 for that click instead of the default left click; the program icon is therefore double-clicked to open and run.
  • The other virtual mouse puzzle-cell keys are also useful when the mouse click action needs to be specific. For example, when viewing a large page or a drawing-image page, performing Left Click Down 192 makes the whole image sheet follow right hand 77 in every direction; when the user has moved the image sheet to the right location, clicking virtual Left Click Up 194 releases the TouchScreen Mouse 224 grip action and returns to the default.
  • The TouchScreen Mouse 224 can be operated by right hand 77 or left hand 82, and each hand's mouse 224 cursor preferably starts at its own initial location.
  • The robot program's vision calibrates the user's working-space zone 76 into four sections, with the X 218 and Y 216 dimension lines crossing at virtual center point 79: section I (X+, Y+) 217, section II (X-, Y+) 215, section III (X+, Y-) 219, and section IV (X-, Y-) 220.
  • The right hand's 77 position is therefore determined using the X, Y values of sections I, II, III, and IV.
    // Cursor position formula, reconstructed from a garbled fragment of the
    // prototype listing (the exact original expressions are partly illegible):
    mouseScreenX = leftOfScreen + mouseScreenSetUpX + (int)(mouseSelectHandX * ratioX);
    mouseScreenY = topOfScreen + mouseScreenSetUpY + (int)(mouseSelectHandY * ratioY);
  • the Right Hand 77 TouchScreen Mouse 224 program function can be setup to start cursor position will be in monitor LEFT-TOP corner position 231 that is video card monitor 0,0 position.
  • the LEFT Hand TouchScreen Mouse 224 program function can setup to start cursor position in monitor Right-Bottom corner position 227, For example, if a monitor video card use resolution as 1900 x 1200, 228, 230, then the cursor start position is 1900 xl200 on the monitor.
  • The robot program determines its video view frame width-to-height ratio, compares it with the monitor screen resolution ratio, and moves the mouse cursor distance accordingly with the hand in all directions, 360 degrees.
  • The TouchScreen Mouse 224 can also use the gesture click action with the computer's virtual keyboard key buttons, clicking key buttons shown on the computer monitor.
  • The mouse option key selection zone can be coded as in the listing given later in this description, which is copied directly from the robot working prototype C# program. Copyright.
  • When the user moves the mouse cursor to the target position, for example on a web page, a gesture push performs the click.
  • The hand gestures thus control both the mouse movement and the choice of mouse click action for operating the computer and its programs.
  • Figure 12 is a drawing showing the enhanced wireless select-key indication device 235, 236 that is worn on the user's hand palm 82, arms, or body.
  • The wireless indication device has 2 styles. Style 235 includes a micro controller 240, Bluetooth 239, LED light 242, vibration motor 244, and power source 237, with a flexible belt 245 that holds tightly on the hand palm 82.
  • The 2nd style 236 includes a micro controller 240, wireless Wi-Fi TCP/IP network card 246, LCD display screen 247, vibration motor 244, power source 237, and a watch belt to hold the device on the hand 72.
  • When the user pushes hand 82 out, the program sends wireless network signals to the device to indicate which key is selected, for example by blinking the LED light 242 in Morse code signals and/or using the vibration motor 244 to make long-short Morse code vibrations, so the user does not need to watch the display monitor 43 to know which keys they selected. This feature is especially useful for users with poor eyesight and blind users.
  • The LCD screen can display the real-time monitor content, i.e. the puzzle cell map image.
  • Figure 13 is a drawing showing a wireless display glass 46 with network protocol equipment 45, including wireless network card equipment 249 and video image process card equipment 250, connected with a projector 252 and power source 247, and running a wireless server-client program to connect with the robot. The robot sends the display signals of the puzzle cell map image with the hands' selection positions 253, 265, and the wireless display glass projector 252 projects the puzzle cell image keys on its lenses 246, so the user can see which keys they select.
  • The left side 269 area is for the left hand keys 270, 271, 272, 273, 274, 275; the right side 266 area is for the right hand keys 259, 260, 261, 262, 263, 264.
  • The center area of the lenses can optionally be reserved to display robot text feedback 268 and a real-time video image of the user's actions 267.
  • Figure 14 is a drawing of the robot equipped with a mobile platform, for example using a micro controller board to control a variety of motors 26, 30, so the robot main computer 1 vision program can intelligently control these motors' rotation; as a result, the robot intelligently drives itself around 276 and can steer its display projector 44 direction to project puzzle cell keyboard images 277 on any surface 278.
  • The motor control sketch can be downloaded to the Arduino micro controller and connected to the motor module; the motor module can be used for video sensor tilt and pan rotation, robot body movement, neck, arms, legs, and mobile wheels.
  • The variety of motor control modules can be built into the robot's neck, body, arms, hands, and legs, so the robot can be built in human shape, with physical body movement ability, together with the Gesture Interface Robot puzzle cell map function.
  • The Gesture Interface Robot becomes the communication bridge between the human and intelligent robot machine worlds.
  • This invention proposal's Gesture Interface Robot example uses the Microsoft Kinect sensor, Microsoft Visual Studio C# programming, and an Arduino micro controller board to build a complete working Gesture Interface Robot demonstration. Alternative methods are also available to custom-build the Gesture Interface Robot.

Abstract

In a virtual, three-dimensional working space a gesture sensing input device is operative to translate hand gestures of a user into commands for operating a computer or various machines. The input device tracks the user and recognizes the user's hand gestures by correlating the gestures with defined "puzzle-cell" positions established in virtual working space zones, the "puzzle-cell" positions being mapped for converting the hand gestures into computer commands. In the virtual working space, a mouse zone, keyboard zone, and hand sign language zone are defined. The working space is further defined by virtual, layered control zones whereby a plane in which a zone lies may be used to determine whether an actuation has occurred by the crossing of a boundary.

Description

GESTURAL INTERFACE WITH VIRTUAL CONTROL LAYERS
BACKGROUND OF THE INVENTION
Field of the invention
This invention relates to an intelligent gesture interface robot equipped with a video vision sensor that reads the user's hand gestures to operate computers, machines, and intelligent robots. The unique gesture reading method is the Gesture Interface Robot's Puzzle Cell Mapping: a dynamic, multiple sandwich-layer work zone of virtual touch screen mouse, keyboard, and control panel placed in the user's comfortable gesture action area, where the user simply moves the hands and pushes to click. It is easy to operate and does not require the user to make wide hand swings or abnormal body postures with which users could hurt themselves or hit objects or people around them; the puzzle cell mapping gesture method proposed in this invention is therefore a safe and efficient approach. The user can use simple gesture actions to control all kinds of computers and machines together, without having to remember which body posture maps to which command. The Gesture Interface Robot highlights the selected key in real time on the keyboard graphic image on the display monitor as visual indication, so the user knows which command is selected, and pushes the hand forward to confirm the selection. The puzzle cell mapping gesture command method lets simple move-and-click gesture actions easily control multiple complex computers, machines, and robots at the same time.
Background Art
The Gesture Interface Robot uses Puzzle Cell Mapping: a dynamic, multiple sandwich-layer work zone of virtual touch screen mouse, keyboard, and control panel in the user's comfortable gesture action area. The user simply moves the hands and pushes to click; these easy gesture actions control complex machines in real time and prevent injury.
Problems to solve and benefits:
1. Current gesture systems require the user to make big gesture actions that can cause injury, such as hitting an object or someone nearby; in addition, extending the hands or body muscles rapidly can also lead to injury. Abnormal gesture body actions can hurt the user.
2. In our houses, work offices, and businesses, everywhere we go, there are tons of remote controllers. Having so many keyboards, mice, remote controllers, smart phone and tablet device controllers is troublesome. Each controller has its own key functions, and operating them is needlessly complicated: many keys must be clicked just to turn on a TV or DVD player to watch. Remembering which key is on which controller is a problem too.
3. Eliminate the need to build tons of physical mice, keyboards, remote controllers, and control panel interfaces on equipment, transportation, cars, airplanes, spaceships, control office centers, etc. Stop wasting resources, reduce pollution, and save money.
4. Regular gesture devices do not have sufficient functions, also require big body gesture actions, and cannot truly be used to control the complex computer actions the user needs. Reducing unnecessary equipment interface installation can benefit a spaceship by reducing weight and freeing room space.
In space, under zero gravity, the Gesture Interface Robot Puzzle Cell map method is a perfect solution: an astronaut can use simple gestures to control computers, machines, and intelligent robots in a zero gravity environment.
The Gesture Interface Robot makes gesture control possible in all areas, and the relationship works both ways: the robot has the intelligence to make gesture operation easy. It improves our lifestyle and changes the way people operate computers, machines, and robots all around the world.
Soon, self-driving cars, flight jets, and spaceships will become self-intelligent. People need to be able to communicate with autonomous robots by gesture action. The IRIID Gesture Interface Robot can be the communication bridge between the human and autonomous robot machine worlds. IRIID will change the way people operate computers, machines, and intelligent robots across the entire world.
SUMMARY OF THE INVENTION
The above problems are solved by the present invention, the Gesture Interface Robot. To solve these problems and provide a better way for humans to control computers, machines, and intelligent robots, I propose a Gesture Interface Robot that lets the user move the hands within a comfortable area to select a virtual puzzle cell keyboard, mouse, or control panel key and push the hand outward as the click-select action; the Gesture Interface Robot's video vision recognizes the user's selection gesture and its location relative to the center point assigned on the user. The Gesture Interface Robot uses the relative distance between the hand location and the center point to determine the X, Y position of the corresponding puzzle cell. In addition, the Gesture Interface Robot recognizes the user's hand push location as a change in the Z-dimension distance between the hand and the user's body; for example, when the hand is pushed out, the distance between hand and body increases, up to a maximum equal to the total hand and arm length that a normal human can push the hand out.
The Gesture Interface Robot virtually divides this total hand push-out distance into 3 selection zones: the first selection zone unlocks the user's hand-selected key; the second selection zone is for moving the hand in all directions, UP, DOWN, LEFT, RIGHT, to select a virtual key; and the 3rd selection zone is push out to click the selection. As a result, when the user moves the hands in the 2nd selection zone, the robot updates the visual puzzle cell map in real time to display both hands' positions on the graphic puzzle cell map control keyboard shown on the monitor, so the user knows which virtual key the left hand has selected and which the right hand has selected. The selected virtual keys are highlighted with an increased font size on the graphic puzzle cell map keyboard as indication; for example, the left hand selection is highlighted in red with an enlarged font size, and the right hand selection key is highlighted in white with an enlarged font size. When the user's hand is on the desired command, the user pushes the hand out in the Z dimension into the 3rd selection zone; the robot video sensor recognizes the click action, matches the X, Y against its puzzle cell map, translates it into a computer command, and sends the command to an automation program whose command scripts, functions, or macros have action triggers that execute the command. The robot's web server function can activate a web browser and enter a URL plus a command text code, so the specific web page opens carrying whatever command text code the user selected, embedded in web link format. When the browser opens that specific web page, an automation program on the robot main computer, such as EventGhost, detects the trigger action and executes the command included in the same macro. Each web page URL with its particular text code therefore activates a different trigger action and executes a different command accordingly.
An automation program such as EventGhost can also recognize a key click event as a trigger, so the robot can send key clicks to trigger actions. However, only a limited number of computer keys can be assigned to particular commands, and an assigned physical key can no longer be used for normal typing; therefore, using the web server IIS service to activate a specific web page with a specific text code is recommended as the best way. It allows unlimited command assignment through separate folders for each controlled machine, triggers macro actions, and leaves the keys clickable for normal computer functions. An automation program such as EventGhost can hold many folders of saved macros with trigger actions; when it detects the specific trigger command, the macro can execute commands such as sending text key commands, displaying A-Z, 0-9, symbol keys, and function keys, or opening computer programs: an internet browser, a word processor, a calculator, a 3D graphic drawing CAD program, etc. In addition, an automation program such as EventGhost can use a USB-UIRT cable to learn each function key signal of a physical infrared remote controller and record it in a macro action.
When the robot program fires the trigger action, EventGhost sends the infrared signal out through the USB-UIRT cable device; the IR signal can control a physical machine such as a computer, another machine, or an intelligent robot. For example, the robot sends an IR signal to turn a TV ON/OFF. As another example, another computer can be equipped with an IR receiver, and the Gesture Interface Robot can send IR signals to control that computer: displaying a-z, 0-9, symbols, and function keys; opening computer programs and media; running a DVD player; playing music, video, an internet browser, and games; and moving the mouse position, Right Click, Left Click, Double Click, wheel up, wheel down, etc. As a result, the Gesture Interface Robot can control self-intelligent machines and intelligent robots. Soon, self-driving cars, flight jets, spaceships, and intelligent robots will be used in people's daily lives: home, health care, education, medical, transportation, public services, etc.
If private automation control features are desired inside the robot program itself, the robot program can be coded directly against the USB-UIRT cable's API library, added as an assembly of available functions, so the robot program can directly drive the USB-UIRT cable for IR signal learning and for sending IR signal commands. The robot can then directly control physical machines such as TVs, computers, and other machines without needing a 3rd-party automation program such as EventGhost. Similarly, the robot program can send key commands directly to the active program, for example sending the Enter key or text key commands to Notepad or Microsoft Word to type words in the writing program directly, again without a 3rd-party automation program.
Current regular computer interface control devices cannot support the input function efficiently, and currently available gesture input systems use traditional vision of user body postures, like reading art picture images, or require the user to push the hands rapidly and often and make wide swinging actions that can cause body injury and hit people and objects nearby. Abnormal body postures or rapidly pushed-out hands demand muscle exertion that is not safe for normal people over long hours of operating machines. Another problem is that those traditional gesture systems require high image-processing CPU speed and costly computer vision programs and hardware just to recognize some simple gesture posture images, with high electricity usage; even then, their video vision still cannot detect every user gesture accurately and must be specifically tuned for each user and a fixed location. These are the problems of current traditional gesture systems, and the reasons those systems are not in wide public use for real applications.
My proposed Gesture Interface Robot, on the other hand, uses the puzzle cell mapping method. The Gesture Interface Robot acts like a graphic image painter (a Picasso): the robot program draws the graphic picture of the virtual control panel keys. On the display monitor, the puzzle cell virtual control panel keys can be drawn as a grid image of row and column cells, each tiled with a TextBlock field; a text word is filled into the TextBlock field of each grid row and column cell of the graphic image as a command. Inside the program, those text command words can be arranged in a 2-dimension array of text strings and loaded word by word into the row and column cells, so they display on the graphic puzzle cell image and are virtually assigned to the user's working space zone. The user can work freely, walking around or sitting in a chair: the robot keeps total user tracking in its video view, assigns a virtual center point on the user, creates the work zone, and establishes the virtual control panel keys in front of the user; an intelligent motor module can physically rotate the video sensor to keep the vision tracking aimed at the user if the user walks out of the video view. The preferred work space virtual center point is the user's shoulder center point, where the shoulders join the throat and neck, and the work zone size is established by width and height: the preferred work space width is each shoulder length x 1.5 on each side, so the total workspace zone width is preferably 1.5 + 1.5 = 3 shoulder lengths, and the preferred work space height is 2 to 3 times the distance from the shoulder center point up to the center of the user's head. Additional virtual points can be assigned anywhere on the body for handicapped users with special needs; a user without arms could bite a pointer stick or a watercolor painting pen in the mouth to make gesture selections. The left and right shoulder edge points can be added into the program to enhance the accuracy of reading the selection X, Y values, and the hand palm size value (fingers open versus held closed) can be added to enhance the click selection reading accuracy. The robot therefore creates a perfectly comfortable work space zone for the user, who can move the hands in comfortable space, in all directions, without difficulty, preventing problems such as injuring themselves or hitting other people or objects nearby. Because the Gesture Interface Robot uses the puzzle cell mapping method, it can graphically draw virtually any virtual mouse, keyboard, or control panel the user wants, instantly; the gesture video sensor requires only simple hand moving and click actions from the user; and the whole Gesture Interface Robot can be built from a regular computer or laptop with a video camera, with low system electricity consumption and low equipment cost, so the Gesture Interface Robot can be used conveniently by everyone, walking, moving, sitting, and everywhere.
The Gesture Interface Robot can be used in all areas on the Earth; furthermore, in a zero gravity environment, where physical movement is difficult, the Gesture Interface Robot is useful in a spaceship: an astronaut can use gesture actions, moving the hands in front of them, to control computers, machines, and intelligent robots easily in zero gravity, while also freeing room space and reducing spaceship weight.
Plus the unique continuous-click gesture action: the Gesture Interface Robot's vision lets the user move a hand in front of them like a fish swimming with its fins, smoothly and softly moving each finger UP and DOWN like a waving fin, to drive a continuous click action in the 3rd click selection zone. In the 3rd selection zone, the user's hand palm makes this fish-fin-waving swimming gesture as a hand sign, and the robot vision program detects the distance change: the hand palm center blinks in and out of view like a star in the night sky, each wave makes one blink, and the robot clicks once per blink automatically, continuing the click action without requiring the user to pull the hand back to the 1st select zone to unlock and push out to reselect. This unique fish-fin-waving hand palm gesture sign makes it very easy for the user to control machines when continuous clicks are required, such as TV volume UP/DOWN or moving the computer mouse UP, DOWN, LEFT, RIGHT, etc.
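A minimal sketch of this continuous-click detection, assuming the vision loop reports each frame whether the hand palm center point is visible (the class and method names are illustrative, not from the prototype):

// C# sketch: one virtual key click per palm-center "blink" in the 3rd zone.
class FinWaveClicker
{
    private bool palmWasVisible;

    // Called once per video frame while the hand stays in the 3rd click zone.
    // Returns true when the palm center reappears after being covered by the
    // waving fingers, i.e. one fish-fin wave = one click.
    public bool Update(bool palmCenterVisible)
    {
        bool click = palmCenterVisible && !palmWasVisible;
        palmWasVisible = palmCenterVisible;
        return click;
    }
}

Each rising edge of palm-center visibility fires one click, so the user can hold the hand in the 3rd zone and keep waving the fingers to repeat clicks, for example for TV volume UP.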
A new and distinctive revolution in computer interface methods: the Gesture Interface Robot can support an advanced gesture action, a TouchScreen Mouse whose virtual sandwich layers combine with the virtual control panel keys zone function. The robot vision program lets the user decide which hand drives the TouchScreen Mouse while the other hand virtually clicks the virtual puzzle cell mouse keys, which can be assigned any commands; the mouse functions include Mouse Double Click, Left Click, Right Click, Mouse Left Click UP, Mouse Left Click DOWN, Mouse Right Click UP, Mouse Right Click DOWN, Wheel UP, Wheel DOWN, etc. For example, if the user uses the right hand to click the virtual mouse function on the title menu of the virtual puzzle cell control panel, the robot program activates the virtual TouchScreen Mouse function, tracks the user's right hand location, and moves the mouse position on the display monitor accordingly. If the user's right hand moves UP, the robot program moves the mouse cursor UP on the monitor in proportion to the hand's move distance. The move distance is determined from the hand's location on the right side of the Work Zone space: the robot program calculates the ratio of the X, Y distances from the virtual center point and moves the mouse cursor the same ratio distance in the same direction. Therefore, if the user's right hand traces a circle, the mouse cursor traces a circle on the monitor in real time. When the user moves the mouse cursor onto a specific position, which could be an internet browser web page on the computer desktop screen, the user can push the right hand out; the robot recognizes the click select and performs the Mouse LEFT click as the default selection click action. Sometimes another mouse click action is required; the other hand can then move to and click one of the virtual mouse puzzle cell keys. For example, the other hand clicks Double Click, the user moves the right hand to place the TouchScreen Mouse cursor on a program icon and pushes the hand out, and the robot program performs a Double Click for that click instead of the default Left Click; the program icon is therefore double-clicked and runs. The other virtual mouse puzzle cell keys are also useful when a specific mouse click action is needed: for example, when viewing a large page or a drawing image page, performing Left Click DOWN makes the whole image sheet follow the right hand in every direction, and when the user has moved the sheet to the right location, a virtual Left Click UP click releases the TouchScreen Mouse grip action and returns to the default. The TouchScreen Mouse can be operated by the right hand or the left hand, and each hand's mouse cursor preferably initializes at its own start location. The robot program vision calibrates the user working space zone into 4 sections, with X and Y dimension lines crossing at the virtual center point: section I (X+, Y+), section II (X-, Y+), section III (X+, Y-), and section IV (X-, Y-). The right hand's position is determined using the X, Y values of sections I and III, and the Right Hand TouchScreen Mouse program function preferably starts the cursor at the monitor LEFT-TOP corner position, which is video card monitor position 0, 0.
The left hand's position, on the other hand, is determined using the X, Y values of sections II and IV, and the LEFT Hand TouchScreen Mouse program function preferably starts the cursor at the monitor RIGHT-BOTTOM corner position; if the monitor video card uses a resolution of 1900 x 1200, the cursor start position is 1900 x 1200 on the monitor. The robot program determines its video view frame width-to-height ratio, compares it with the monitor screen resolution ratio, and moves the mouse cursor distance accordingly with the hand in all directions, 360 degrees. The TouchScreen Mouse can also use the gesture click action with the computer's virtual keyboard key buttons, clicking key buttons on the computer monitor. If the computer's Windows desktop screen is tiled full of clickable buttons, the user can use the TouchScreen Mouse to select which button to click by gesture action. In summary, the TouchScreen Mouse combined with the virtual puzzle cell keys control panels in sandwich layers is an advanced gesture system that folds all current computer interface device methods into one true universal computer interface device, enabling the user to control all machine functions together with easy gestures, without physically built mice, keyboards, remote controllers, or control interfaces on equipment, machines, and robots. The Gesture Interface Robot will replace the need to build physical control panels and interface devices, reduce high-tech device pollution, and save material resources on the Earth.
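A minimal sketch of this hand-to-cursor mapping, assuming the palm offset from the virtual center point and the work zone dimensions are already available from the vision program (the names and clamping are illustrative; SetCursorPos is the standard Win32 call):

// C# sketch: move the Windows cursor by the same ratio the hand moves in the work zone.
using System;
using System.Runtime.InteropServices;

static class TouchScreenMouseSketch
{
    [DllImport("user32.dll")]
    private static extern bool SetCursorPos(int x, int y);

    // handX/handY: palm offset from the virtual center point (meters, Y up).
    // The right hand starts from the LEFT-TOP corner (0, 0), per the preferred setup.
    public static void MoveRightHandCursor(double handX, double handY,
                                           double zoneWidth, double zoneHeight,
                                           int screenWidth, int screenHeight)
    {
        int px = (int)(handX / zoneWidth * screenWidth);
        int py = (int)(-handY / zoneHeight * screenHeight); // screen Y grows downward
        SetCursorPos(Math.Max(0, Math.Min(screenWidth - 1, px)),
                     Math.Max(0, Math.Min(screenHeight - 1, py)));
    }
}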
The Gesture Interface Robot can be equipped with output display device options, such as a display monitor, a visual image projector usable on any surface, or wireless monitor glasses the user can wear to see the projected monitor screen in the lenses. The robot can control a wireless Bluetooth card attached to a micro controller board, or a smart phone, to blink an LED light ON and OFF displaying the Morse code of the selected text command, or to generate long and short vibration signals as the Morse code of the text command. The user can wear this wireless Morse-code text command display device on the back of the palm, with the LED lights facing the user, or like a watch. When the hand moves onto a puzzle cell, the robot program sends a command to the wireless micro controller board to blink the LED light ON/OFF, long and short, to indicate the selected command, and/or to vibrate the motor in long and short signals for silent reading of the text command. The user therefore does not need to watch the display monitor; this feature is especially useful for users with poor eyesight and blind users, who can perform gesture selection like everyone else. The Gesture Interface Robot can also carry wireless equipment such as Bluetooth and Wi-Fi network cards to send signals controlling other wireless networked smart phones, micro controller boards, machines, a car's Bluetooth system, other computers, other machines, and other nodes on the networks, through the World Wide Web and Internet TCP/IP protocol, using server-client network software to remotely control, operate, and diagnose or configure other robot machines, or to connect to a space signal transmitter station to send remote control signals into space, to the Hubble Space Telescope or a rover robot on Mars, etc.
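A minimal sketch of the Morse-code key indication, assuming the wearable micro controller listens on a Bluetooth serial link and renders dots and dashes on its LED and vibration motor (the port name and the abbreviated code table are assumptions):

// C# sketch: send the selected key as Morse code to the worn indicator device.
using System.Collections.Generic;
using System.IO.Ports;

class MorseIndicatorSketch
{
    private static readonly Dictionary<char, string> Morse = new Dictionary<char, string>
    {
        ['A'] = ".-", ['B'] = "-...", ['C'] = "-.-.", ['W'] = ".--", ['1'] = ".----"
        // ... remaining letters, digits, and symbols
    };

    private readonly SerialPort port = new SerialPort("COM5", 9600); // assumed Bluetooth serial port

    public void SendKey(char key)
    {
        if (!port.IsOpen) port.Open();
        string code;
        if (Morse.TryGetValue(char.ToUpperInvariant(key), out code))
            port.WriteLine(code); // device blinks LED / pulses motor: '.' short, '-' long
    }
}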
The Gesture Interface Robot will change the way people use computers, machines, and intelligent robots all around the world.
BRIEF DESCRIPTION OF THE DRAWINGS
In the drawings,
Figure 1 is a drawing showing the hardware components of the Gesture Interface Robot, its peripheral wireless network devices and display devices, and the robot vision tracking software programs.
Figure 2 is a drawing showing the robot vision program automatically measuring the user's work space, assigning a virtual center point, creating the work space zone in a comfortable area, and establishing the puzzle cell mapping keys and virtual control panel in front of the user to click.
Figure 3 is a drawing showing the hand pushed out in the Z dimension to click a virtual key. The Z-dimension distance between the hand palm and the user's body is divided into 3 zones: the 1st zone unlocks the selected key gate, the 2nd zone is for moving to select a virtual key, and the 3rd zone is push-to-click the selected virtual key; in addition, it shows the unique special GIR finger hand signs that enhance selection control accuracy.
Figure 4 is a drawing showing a special GIR hand sign gesture moving like a fish swimming with its fins: the fingers move up and down, rotating one by one, making a waving-fingers hand sign in the 3rd selection zone; the vision program detects it and continuously clicks the virtual key without the hand being pulled back to the unlock-selected-key gate zone.
Figure 5 is a drawing showing the robot vision program tracking the user's hand positions in the work zone, using each hand's X, Y distance from the center point to determine which virtual puzzle cell position is selected. The robot vision program draws the virtual puzzle cell map keys control panel graphic image on the display monitor, uses the tracked hand locations to determine which keys are selected, and highlights those puzzle cells on the display monitor as visual indication, so the user knows which keys the right hand and left hand have selected.
Figure 6 is a drawing showing the robot vision program drawing the virtual puzzle cell map keys control panel as a graphic image, like a watercolor painting artist (Picasso). The program draws the virtual keys as Grid row and column cells, inserts a TextBlock field into each grid cell, and then fills a text word into each TextBlock field as the command for the user to select; for example, a QWERTY standard virtual puzzle cell keyboard.
In addition, the robot vision program can work with an automation program to control a USB-UIRT cable to send infrared signals that remotely control another computer's keyboard and mouse operation.
Figure 7 is a drawing showing the vision program drawing a mouse keyboard control panel; the user can select the virtual keys to control the mouse position and mouse click functions. In addition, the virtual puzzle cell map keyboard control panel preferably uses a special interface section arrangement divided into left- and right-hand zones, with the center area of the work space reserved to display real-time user video showing the user's actions; the user can see themselves on the display monitor together with the virtual keyboards. This special virtual gesture interface arrangement gives good visual feedback and indication and is easy on the eyes during user operation.
Figure 8 is a drawing showing that the Gesture Interface Robot can create any keyboard and control panel the user wants. A variety of virtual keyboards and control panels can be produced, each keyboard with its own control commands filled into each row-column puzzle cell; the virtual keyboard drawings are shown as examples.
Figure 9 is a drawing showing more examples of virtual keyboard drawings, showing that the Gesture Interface Robot is able to support computer operation functions.
Figure 10 is a drawing showing more examples of virtual keyboard drawings, showing that the Gesture Interface Robot is able to support computer operation functions; in addition, the robot uses peripheral devices to control network devices, computers, machines, and intelligent robots. The robot can be equipped with a speech recognition program function using an array of microphones as a sound sensor, and with a voice speaking program function using speakers for voice feedback.
Figure 11 is a drawing showing an advanced TouchScreen Mouse combined with the puzzle cell virtual keyboard using the sandwich layers method.
Figure 12 is a drawing showing the enhanced wireless select-key indication device worn on the user's hand palm, arms, or body. It indicates the selected keys by blinking an LED light in Morse code signals and/or by making long-short Morse code vibrations with a vibration motor, so the user does not need to watch the display monitor to know which keys they selected; this feature is especially useful for users with poor eyesight and blind users.
Figure 13 is a drawing showing a wireless display glass with network protocol equipment that connects with the robot; the robot sends the display puzzle cell map with the hands' selection positions, and the wireless display glass projects the puzzle cell image on its lenses, so the user can see which keys they select.
Figure 14 is a drawing of the robot equipped with a mobile platform, for example using a micro controller board to control a variety of motors, so the robot vision program can intelligently control the motors' rotation; as a result, the robot intelligently drives itself around and can steer its display projector direction to project puzzle cell keyboard images on any surface. The variety of motor control modules can be built into the robot's neck, body, arms, hands, and legs, so the robot can be built in human shape with physical body movement ability together with the Gesture Interface Robot puzzle cell map function. The Gesture Interface Robot becomes the communication bridge between the human and intelligent robot machine worlds.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
In the drawings,
With reference to the drawings, Figure 1 shows the hardware components of the Gesture Interface Robot (GIR), the vision tracking software programs, and the peripheral wireless network devices and display devices.
The complete working example model of the Gesture Interface Robot (GIR) uses components that include:
1. Main Computer 1, used as the robot's brain, to process video and run the robot vision puzzle cell map virtual keyboard control program (PCMVKCP) 3, the automation program 2 (such as EventGhost), and the web server function 41, such as an IIS server. The video vision sensor is built from a variety of sensor modules 8, combining multiple microphones as a sound sensor 7, infrared emitter 9, RGB video camera 10 (or a web camera instead), infrared signal reflection detect sensor 11, 3-dimension movement accelerometer sensor 12, speakers 13, and motor control module 17, with connecting signal control line 15 and intelligent rotation directions 16, 18; this particular video sensor module system can use the Microsoft Kinect sensor 6, an available vision sensor component sold on the market. In addition, this invention (GIR) proposes building a Universal Infrared Receiver Transmitter (UIRT) 14 into this video sensor module as an additional IR remote control feature to physically operate machines.
The Micro Controller Board 21 can be an Arduino board.
A variety of motor modules 20 attach to the Micro Controller Board 21.
Intelligent rotation directions 19, a variety of sensor modules 24, and a GPS 22 sensor can be attached to the board with connecting cables 23, 25; the Micro Controller Board 21 reads the external sensor signals and sends them to the Main Computer 1 to process.
A USB Universal Infrared Receiver Transmitter (UIRT) 34, built in or on a USB adapter cable 33, can learn, record, and send infrared signals recorded from any physical IR remote controller. USB-UIRT cables can usually both send and receive IR signals. An additional IR receiver 36, built in or on a USB adapter cable 35, can also be attached to the Main Computer 1.
Wireless network equipment such as a Bluetooth network card 38, built in or on a USB adapter cable 37, and a Wi-Fi network card 39, built in or on a USB adapter cable 40, etc.: all wireless network protocol card devices, TCP/IP and Internet protocols such as XBee, Ethernet, Wi-Fi, Bluetooth, cell phone channels 3G, 4G, GSM, CDMA, TDMA, etc., space telecommunication channels, and satellite channels.
7. Display monitor devices such as a display monitor 43 with monitor cable 42, an image projector 44, and a wireless network (for example TCP/IP or Bluetooth) display monitor glass 46.
8. Main Computer power source: a wall power plug 32 when the robot is in a fixed installed position, plus a power plug source for the Kinect sensor 6. The micro controller power source can be independent or drawn from the Main Computer 1 through the USB connection.
9. Mobile motor wheel platform 28, equipped with motor wheels 26, 30 and motor signal control lines 27, 29 to control motor rotation direction and speed.
All the robot components can be placed on platform 28, and the robot can use its video vision function to drive itself and move around. The portable power source 31 can be rechargeable batteries, solar cells, fuel cells, a rotation generator, a wind turbine, a thermoelectric generator (TEG), etc., to regenerate electric power for the robot to move and operate. Because the motor modules can be built into a variety of robot body parts, motor control can cover the neck, robot center body, arms, hands, hip, legs, and feet, mimicking human physical body part movement. It therefore becomes a human-form Gesture Interface Robot that supports the puzzle cell map virtual keyboard gesture functions.
10. Main Computer 1 is used as the robot's brain to process the video image. The 3-dimension X, Y, Z values of the user's body part joint locations can be read by a program written in Microsoft Visual C# 4 (or VB) that calls the Kinect and other system assembly libraries and enables the Kinect sensor to read the user joint values in the program. These basic video sensor readings of the user's 3D body joint values are available now, so we can write a specific puzzle cell map virtual keyboard control program (PCMVKCP) 3 that transforms the basic 3D joint values, intelligently measures and calibrates them into a new gesture interface input work space zone, and establishes the puzzle cell virtual keyboard in that zone. The user is then able to move the hands and push out to click virtual keys. The Kinect sensor functions that read joint values can be coded into the PCMVKCP program 3, and program 3 can be a class program (example: MainWindow.xaml.cs) included in the Microsoft Visual Studio C# 4 project and built into one project solution, preferably a WPF Application type project, so that all the Kinect video sensor reading values are available to the PCMVKCP program in real time for creating the dynamic user graphic interface.
The Gesture Interface Robot uses the vision 3-dimension X, Y, Z body part values so the robot vision puzzle cell map virtual keyboard control program (PCMVKCP) 3 can create the work zone, establish the puzzle cell map virtual keyboards, provide real-time user hand locations, convert them to puzzle cell positions, match each puzzle cell row-column against its puzzle cell command map list, transfer the cell position to a computer command, and send the command to automation program 2 (such as EventGhost) to run a pre-recorded macro script that executes the command, such as typing text, running a computer program, or sending an infrared signal to remotely control a TV, DVD player, or another computer for typing, mouse movement, mouse clicks, running computer programs, an internet browser, etc.
13. The main computer 1 includes the web server function 41, such as an IIS server, and can establish an internal server-client network, DNS server, TCP/IP URL, namespace, etc., host web sites, and provide HTML, XAML, and scripting functions. The PCMVKCP program 3 can activate a web browser and send a web page URL that includes a specific text code; when that particular web page runs and opens, automation program 2 (such as EventGhost) detects the particular text code trigger and triggers the macro action in the folder.
Figure 2 illustrates the Gesture Interface Robot PCMVKCP program 3 automatically measuring the user's workspace, assigning a virtual center point, creating the workspace zone 76 in the comfortable area 47, and establishing the puzzle cell mapping keys (such as 85, 86, 87, 82, 91, 92, and all other cells) as a virtual control panel keyboard in front of the user to click.
Using Microsoft Visual Studio C# with the Microsoft Kinect and system assembly libraries, program 4 can read the user's body joint 3D values from the video sensor, such as:
User Head center 50,
Comfortable left and right hand moving circle space area 47,
User Right Shoulder Edge Joint 52,
User Left Shoulder Edge Joint 48,
User Shoulder Center Joint 79, User Right Elbow Joint 57,
User Left Elbow Joint 74,
User Right Hand 54,
User Left Hand 73,
User Right Hand Palm Center 77,
User Left Hand Palm Center 82,
Here is example C# program code using Kinect 2 body joint 3D reading values to calculate the distance between two joints.
This listing is copied directly from the robot working prototype C# program. Copyright.

userWorkZoneHead2CenterLength = Math.Sqrt(
    Math.Pow(userMeasureCenterPoint.Position.X - userMeasureHeadPoint.Position.X, 2) +
    Math.Pow(userMeasureCenterPoint.Position.Y - userMeasureHeadPoint.Position.Y, 2) +
    Math.Pow(userMeasureCenterPoint.Position.Z - userMeasureHeadPoint.Position.Z, 2));

userWorkZoneCenter2LeftShoulderLength = Math.Sqrt(
    Math.Pow(userMeasureCenterPoint.Position.X - userMeasureLeftEdge.Position.X, 2) +
    Math.Pow(userMeasureCenterPoint.Position.Y - userMeasureLeftEdge.Position.Y, 2) +
    Math.Pow(userMeasureCenterPoint.Position.Z - userMeasureLeftEdge.Position.Z, 2));

userWorkZoneCenter2RightShoulderLength = Math.Sqrt(
    Math.Pow(userMeasureCenterPoint.Position.X - userMeasureRightEdge.Position.X, 2) +
    Math.Pow(userMeasureCenterPoint.Position.Y - userMeasureRightEdge.Position.Y, 2) +
    Math.Pow(userMeasureCenterPoint.Position.Z - userMeasureRightEdge.Position.Z, 2));
The PCMVKCP program can use this formula to calculate all the body lengths. The length of the right shoulder 51 is calculated from the shoulder center 79 and the right shoulder edge joint 52; the length of the left shoulder 49 is calculated from the shoulder center 79 and the left shoulder edge joint 48.
The length of the right upper arm 53 is calculated from the right shoulder edge joint 52 and the right elbow joint 57; the length of the left upper arm 75 from the left shoulder edge joint 48 and the left elbow joint 74; the length of the right lower arm 56 from the right elbow joint 57 and the right hand palm joint 77; and the length of the left lower arm 72 from the left elbow joint 74 and the left hand palm joint 82. In simplified form, the other body lengths can be calculated from joint values: the user's center body length (58 = 79 to 61), hip length (71 to 62), right upper leg (63 = 64 to 62), left upper leg (70 = 71 to 69), right lower leg (65 = 64 to 66), left lower leg (68 = 69 to 67), right leg length (63 + 65), left leg length (70 + 68), head length = head center 50 length x 2, and the neck joint length (79 to 50, or the upper neck joint point).
The total user height can be approximated by adding all these lengths, and the maximum user width is likely the distance between the two shoulder edge joints. Because humans use both arms, the comfortable movement space has limits. The comfortable areas 47 are defined in front of the user and in circles around each side: moving the left hand past the right shoulder edge is difficult, and moving the right hand past the left shoulder is difficult. The two comfortable circles 47 create an overlap area 59 (between the shoulders), and the 2 circles have an intersection point 60 on the user's body center line 58.
With the user body joint values and body lengths, the Gesture Interface Robot PCMVKCP program 3 can use these video sensor readings to create a perfect workspace zone 76 matched to the user's body measurements. The PCMVKCP program 3 assigns a virtual center point on the user, preferably the shoulder center joint point 79. The preferred workspace zone width is 1.5 times each shoulder length on each side (1.5 + 1.5 = 3 shoulder lengths in total), and the preferred workspace zone height is the length from the shoulder center 79 to the head face center times 2. The workspace zone tracks in front of the user, anchored to the user's shoulder center joint point 79; therefore, when the user walks or moves, the workspace zone 76 is always at the same place in front of the user. While the user is within the video viewable area, the software keeps tracking digitally; when the user walks out past the video view edge, the PCMVKCP program 3 activates the intelligent motor module to rotate the video sensor and keep it aimed at the user.
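A minimal sketch of the preferred work zone sizing, reusing the lengths computed in the listing above (the zone variable names are illustrative):

// C# sketch: size the workspace zone 76 from the user's measured body lengths.
double workZoneWidth  = 1.5 * (userWorkZoneCenter2LeftShoulderLength
                             + userWorkZoneCenter2RightShoulderLength); // 1.5 + 1.5 = 3 shoulder lengths
double workZoneHeight = 2.0 * userWorkZoneHead2CenterLength;            // center-to-head x 2
// The zone is centered on shoulder center joint 79 and re-anchored each frame,
// so it stays in front of the user as the user moves.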
Once the workspace zone 76 size is defined, the PCMVKCP program 3 divides the workspace zone into puzzle cell rows and columns; for example, if the virtual keyboard needs 4 rows and 10 columns, 40 puzzle cells in total, program 3 divides the width by 10 and the height by 4. As a result, it can determine each puzzle cell's area location relative to the virtual center point 79. For example, when the user's right hand 54 moves to puzzle cell 85 (Row 1, Column 10), program 3 calculates the X, Y values of the right hand palm center point 77 relative to the virtual center point 79; knowing the two side lengths X and Y, it can calculate the distance 78 to the center point, and those values determine that the user's right hand is at the Row 1, Column 10 location. If the user's right hand 54 moves down to puzzle cell 87, it is at (Row 4, Column 8), and at puzzle cell 86 it is at (Row 4, Column 10).
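A minimal sketch of converting a palm offset into a puzzle cell position, assuming the zone is centered on point 79 with X growing to the user's right and Y growing upward (the method and parameter names are illustrative):

// C# sketch: map a palm offset (dx, dy) from center point 79 to (row, column).
static void ToPuzzleCell(double dx, double dy,
                         double zoneWidth, double zoneHeight,
                         int rows, int cols,
                         out int row, out int col)
{
    double x = dx + zoneWidth / 2.0;   // shift origin to the zone's left edge
    double y = zoneHeight / 2.0 - dy;  // row 1 is the top of the zone
    col = Math.Min(cols, Math.Max(1, (int)(x / (zoneWidth / cols)) + 1));
    row = Math.Min(rows, Math.Max(1, (int)(y / (zoneHeight / rows)) + 1));
}

For the 4 x 10 keyboard above, a right-hand palm near the zone's top-right corner yields (Row 1, Column 10), matching puzzle cell 85.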
By the same method, the program can determine that the user's left hand 73, palm center 82, is at puzzle cell (Row 4, Column 2). If the user's left hand 73 moves up to puzzle cell 91, it is at (Row 2, Column 2), and at puzzle cell 92 it is at (Row 1, Column 1).
The selection click zone 88: the total maximum length of selection click zone 88 is limited by the total user arm and hand length (75 + 72), (53 + 56), which is the farthest the user can push the hands out. The program defines the maximum hand push-out surface 84. For example, the user pushes the left hand 73 out in direction 90, and the PCMVKCP program 3 reads the left hand palm joint Z-dimension length value 81 becoming longer (bigger) relative to the user body Z-dimension values. The reference Z-dimension value can be assigned on the user body Z-dimension surface 80, the center point, the left shoulder joint 48, or the right shoulder joint 52 when a special measurement is needed. This is useful for a handicapped user who might, for example, bite a watercolor pen in the mouth to select a virtual key to enter; vision tracking program 3 can use any specially assigned point on the Z surface 80 to determine that user's select click action. The program recognizes the hand push-out selection click action, locks the puzzle cell Row 4, Column 2 position, and matches the puzzle cell map 2-dimension array string code to transfer the position into a computer command. The selection click zone 88 is divided into 3 selection mode zones. The hand push-out edge 89 required to detect a click is preferably shorter than the maximum Z push-out surface 84, to keep the user from pushing the hand muscles too far and too often, which could cause arm injury; this shorter select-click length keeps the arm and hand in a flexible position and makes it easier for the user to move the hands to select virtual keys.

Figure 3 illustrates the hand pushed out in the Z dimension to click a virtual key. The Z-dimension distance 88 between the hand palm 82 and the user body point 93 is divided into 3 zones: the 1st, unlock-selected-key gate zone 99, between the user body point 93 and the 1st select zone edge point 98; the 2nd, moving-to-select virtual key zone 94, between the 1st select zone edge point 98 and the 2nd select zone edge point 95; and the 3rd, push-hand-to-click zone 96, between the 2nd select zone edge point 95 and the 3rd select zone edge 89. In addition, it shows the unique special GIR finger hand signs that enhance selection control accuracy. Program 3 can detect the user's left hand palm center 82 in the pull and push action directions 90. In the 2nd select key zone, the user moves the hands, staying within the 2nd zone area, to select and change any key freely. By default, when the user's hand makes a push-out motion, the program detects the "Push" action and locks the puzzle cell position, so it does not change even if X, Y change while the hand is pushed out.
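A minimal sketch of classifying the push depth into the three selection zones of Figure 3, assuming zone boundaries at fixed fractions of the user's total arm length (the 0.25 and 0.75 fractions are assumptions; the prototype derives points 98 and 95 from the measured body lengths):

// C# sketch: classify palm push-out depth into the 3 selection zones.
enum SelectZone { Unlock, Select, Click }

static SelectZone ClassifyPush(double palmZ, double bodyZ, double armLength)
{
    double push = bodyZ - palmZ;                            // how far the hand is in front of body surface 80
    if (push < 0.25 * armLength) return SelectZone.Unlock;  // 1st zone 99: unlock / reselect
    if (push < 0.75 * armLength) return SelectZone.Select;  // 2nd zone 94: move to select a key
    return SelectZone.Click;                                // 3rd zone 96: push to click
}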
A special GIR gesture hand sign moves the fingers like a spider walking its legs to change to a nearby puzzle cell selection. For example, the user's left hand palm 82 can stay in the 2nd select key zone 94 while the hand fingers 103, 105, 106, 107, and 108 move like spider legs walking; the puzzle cell row-column lines are like a spider web net. With tiny finger movements in the walking directions up, down, left, and right, the program detects which puzzle cell carries most of the hand palm 82 area, so the user does not need to make a big hand movement to change to a puzzle cell right beside the currently selected one. Each finger has 2 joint sections; for example, finger 103 has two joints 101, 102 and connects to the hand palm at joint 100.
When all fingers 103, 105, 106, 107, 108 are open, the detected circle area 109 of the left hand palm 82 has a diameter 104 larger than when all fingers are closed and held 111, where the vision tracking program detects a smaller hand area circle 113 with diameter 112. This difference usefully enhances puzzle cell selection: when the user locates the selected command, then closes all fingers and pushes the hand out, the program locks the puzzle row-column value, so even if the hand moves in the X, Y directions, the puzzle cell position does not change. This hold-to-grip click, a special GIR hand gesture feature, is useful when the user must rush to click a virtual key reliably in an emergency, such as a spaceship out of control, or when the user has a hand-shaking illness; the program supports that need.
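A minimal sketch of the hold-to-grip lock, assuming the vision program reports the palm circle diameter each frame (the 0.6 threshold is an assumption):

// C# sketch: lock the selected cell when the palm circle shrinks (fingers closed).
static bool IsGripLocked(double palmCircleDiameter, double openHandDiameter)
{
    // Closed-hand diameter 112 is noticeably smaller than open-hand diameter 104.
    return palmCircleDiameter < 0.6 * openHandDiameter;
}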
In the 2nd select key zone 94, the user's left hand can change to pointing fingers, for example fingers 105, 106, a special GIR hand sign that looks like a gun gesture pointing at a puzzle cell; the program sees the hand holding and then the fingers pointing out, recognizes the difference, and locks the key. The tiny gun-gesture point area improves vision tracking accuracy, so the user can rotate the finger-gun point with small movements to change the key selection. If the user wants to choose another key, simply pulling the hand 82 back to the 1st zone makes the program detect the "Pull" action and unlock the selected key, freeing the user to reselect any key. In the 3rd select click zone 96, tiny finger movements, different finger-gun points, holding tight, opening all fingers, or closing and holding all fingers make a puzzle cell select-to-click 97.
Figure 4 is a drawing showing a special GIR hand sign gesture for continuous clicking without pulling the hand back to unlock: moving like a fish swimming with its fins, the fingers 100, 105, 106, 107, 108 move up and down around the horizontal line 117, rotating one by one, making a waving-fingers GIR hand sign gesture in the 3rd selected click zone 96. The vision tracking function in program 3 detects the hand size area 116 and the hand palm center point 82 distance value. When the fingers wave to the down positions 118, 119, 120, 121, and 122, the palm faces down over area 73 and the hand palm center point 82 is covered, so the program cannot see point 82; when the fingers move up to positions 123, 124, 125, 126, 127, the hand palm center 82 shows again in the vision tracking function of program 3. This causes a blinking difference in the distance Z value, which the program detects, performing a continuous virtual key click on each blink without the user having to pull the right hand 73 back to the 1st unlock-selected-key gate zone and push to click again.

Figure 5 is a drawing showing the robot vision program tracking the user's hand positions 77, 82 in the workspace zone 76, using the right hand's X, Y distance 78 between center point 79 and right hand palm 77 to determine which virtual puzzle cell position 85 is selected, and using the left hand's X, Y distance between center point 79 and left hand palm 82 to determine which virtual puzzle cell position (Row 4, Column 2) is selected.
The robot vision program draws the virtual puzzle cell map keys control panel graphic image 141 on the display monitor 43. The vision program uses the tracked user hand 77, 82 locations to determine which keys are selected and updates the display monitor 43 in real time: it highlights 130, 138 in different colors and enlarges the font size on the particular puzzle cells 132, 136 as visual indication, so the user knows which keys are selected by the right hand palm 77 and the left hand palm 82. The graphic puzzle cell map image center point 133 matches the virtual workspace center point 79, so the hands' X, Y values map onto the correct select keys on the graphic puzzle cell map image. If the left hand palm 82 moves in the up direction 139, the highlight changes to puzzle cell 140; if the left hand moves down and out of the puzzle cell map area 137, the program indicates no selected key, since the user may have put the hand down with no intention of selecting a key. If the user's right hand 77 moves down, the X, Y values and distance 128 change too, and the program highlights puzzle cell 134 or 135 where the user selects. If the right hand moves out of the workspace zone, no key is selected 129. The user can decide to select keys with both hands, with the left hand only, or with the right hand only; the vision tracking function in program 3 can recognize all hand inputs.

Figure 6 is a drawing showing the robot vision program drawing the virtual puzzle cell map keys control panel graphic image 141 like a watercolor painting artist
(Picasso). Using a WPF project in Visual C#, which has a dynamic graphic user interface toolkit, the vision program can use the Grid command to draw the puzzle cell virtual keys in Grid row and column cells, insert a TextBlock field into each grid cell, and then fill a text word (0-9, a-z) into each TextBlock field as the command for the user to select; for example, loading 1 - 0 into TextBlocks 142, 143, 144, 145, 146, 147, 148, 149, 150, 151 placed on the puzzle cells of row 1, and "Z" into TextBlock 162, "X" into TextBlock 161, "C" into TextBlock 160, "V" into TextBlock 159, "B" into TextBlock 158, "N" into TextBlock 157, "M" into TextBlock 156, "," into TextBlock 155, "." into TextBlock 154, and "SP" (Space) into TextBlock 153 placed on the puzzle cells of row 4. All other keys are loaded in the same way from a puzzle cell 2-dimension string array code, loading each character to its corresponding cell position; as a result, a QWERTY standard virtual puzzle cell keyboard 141 is created. The vision program highlights 130, 138 in different colors and enlarges the font size 152, 163 on the particular puzzle cell commands "X" and "0" as visual indication, so the user knows which keys are selected. When the user clicks, program 3 uses the puzzle cell position (Row, Column) to index the puzzle cell 2-dimension string array code and obtain the text word command. If the user's right hand moves to "SP" and clicks, the program types a space; if "," is selected, the program types ",".
If the left hand selects "W" 139, the program sends the key and types "W"; if it selects "1", the program sends "1" to the display.
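A minimal sketch of drawing the puzzle cell keyboard as described, building a WPF Grid of TextBlocks from a 2-dimension string array (the method name and layout values are illustrative):

// C# sketch: paint the puzzle cell map image 141 from a 2-D command array.
using System.Windows;
using System.Windows.Controls;

static Grid DrawPuzzleCellKeyboard(string[,] keys)
{
    var grid = new Grid();
    int rows = keys.GetLength(0), cols = keys.GetLength(1);
    for (int r = 0; r < rows; r++) grid.RowDefinitions.Add(new RowDefinition());
    for (int c = 0; c < cols; c++) grid.ColumnDefinitions.Add(new ColumnDefinition());

    for (int r = 0; r < rows; r++)
        for (int c = 0; c < cols; c++)
        {
            var cell = new TextBlock
            {
                Text = keys[r, c],
                TextAlignment = TextAlignment.Center // highlight color / font size set on selection
            };
            Grid.SetRow(cell, r);
            Grid.SetColumn(cell, c);
            grid.Children.Add(cell);
        }
    return grid;
}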
In addition, the robot vision program can work with automation program 2 (for example EventGhost) to control the USB-UIRT cable 34 to send infrared signals 171 that remotely control another computer 164 with an IR receiver 172, driving its keyboard to type "X" and "0" in the Notepad program 167 on its monitor; and when the puzzle cells are loaded with mouse keys, the user can click to send mouse-movement IR signals 171 so the other computer 164 moves its mouse 168 position and performs mouse 168 click operations. The command execution signal can also be sent by a Bluetooth device to control a Bluetooth micro controller board device worn by the user, blinking an LED light in Morse code or vibrating long-short Morse code signals. It can also send signals through the Wi-Fi network device 39 and a TCP/IP Internet network server-client program to control another node on the network: a computer, machine, or intelligent robot.
Using the web server 41 IIS service to activate a specific web page 169 with a specific text code is the best way: it allows unlimited command assignment through separate folders for each controlled machine, triggering macro actions individually, and it keeps the keys free and clickable for normal computer functions. An automation program 2 such as EventGhost can create many folders to save macro scripts with trigger actions; when it detects the specific trigger command event, the macro can execute commands such as sending text key commands, displaying A-Z, 0-9, symbol keys, and function keys, or opening computer programs: an internet browser, a word processor, a calculator, a 3D graphic drawing CAD program, etc. In addition, automation program 2 such as EventGhost can use the USB-UIRT cable 34 to learn each function key signal of a physical infrared remote controller, record it, and send it out via a macro script action.
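A minimal sketch of the web page trigger, assuming the IIS server hosts one page per command under a local folder and that the automation program watches for the resulting event (the host name, folder, and code naming are assumptions):

// C# sketch: open a command-specific web page so the automation program
// (e.g. EventGhost) detects the text code and runs the matching macro.
using System.Diagnostics;

static void TriggerCommand(string textCode)
{
    // e.g. textCode = "TV_ON" opens http://localhost/gir/TV_ON.html
    Process.Start("http://localhost/gir/" + textCode + ".html");
}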
When the robot program fires the trigger action, EventGhost sends the infrared signal 171 out through the USB-UIRT cable device 34; the IR signal can control a physical machine such as a computer 164, another machine, or an intelligent robot. For example, the robot sends IR signal 171 to turn a TV ON/OFF. As another example, another computer 164 can be equipped with an IR receiver 172, and the Gesture Interface Robot can send IR signals 171 to control that computer 164: displaying a-z, 0-9, symbols, and function keys; opening computer programs and media; running a DVD player; playing music, video, an internet browser, and games; and moving the mouse 168 position, Right Click, Left Click, Double Click, wheel up, wheel down, etc. As a result, the Gesture Interface Robot can control self-intelligent machines and intelligent robots. Soon, self-driving cars, flight jets, spaceships, and intelligent robots will be used in people's daily lives: home, health care, education, medical, transportation, public services, etc.
In addition, the robot program can include its own automation program 2 control features inside robot program 3: the robot program can be coded directly against the USB-UIRT cable's API library, added as an assembly of available functions directly in the PCMVKCP program 3 code, so the robot program can directly drive the USB-UIRT cable for IR signal learning, recording, and sending IR signal commands. The robot can then directly control physical machines such as TVs, computers, and other machines without needing a 3rd-party automation program 2 such as EventGhost. Similarly, the robot program can send key commands directly to the active program, for example sending the Enter key or text key commands to Notepad or Microsoft Word to type words in the writing program directly, again without a 3rd-party automation program.
When the user selects a key, the program can enable the speaker 170 to read the character aloud as voice feedback.

Figure 7 is a drawing showing the vision program drawing a mouse keyboard and control panel on the virtual puzzle cell map keys control panel graphic image, divided into two mouse sections: a Left Hand mouse 186 and a Right Hand mouse 174. The program loads mouse command words into TextBlock fields: "Mouse 315" into TextBlock 185, "Mouse Up" into TextBlock 184, "Mouse Left" into TextBlock 183, "Mouse 225" into TextBlock 182, "Double Click" into TextBlock 181, "Left Click" into TextBlock 180, "Right Click" into TextBlock 179, and so on for all other keys.
The user can select the virtual keys to control the mouse position and the mouse click functions. In addition, the virtual puzzle cell map keyboard and control panel preferably use a special interface section arrangement divided into Left and Right hand zones. The center area 173 of the virtual puzzle cell map keys control panel graphic image is reserved for a real-time video image 187 showing the user's actions 188, so the user can see themselves and all the virtual control keys together on the monitor. This special virtual gesture interface arrangement gives good visual feedback indication and keeps everything within easy eyesight during user operation.
In a real example of the GIR robot program, the mouse key control interface is preferably arranged so that both the Left Hand Mouse Key area and the Right Hand Mouse Key area carry a full set of direction-moving keys: UP, DOWN, LEFT, RIGHT, plus 45-degree, 135-degree, 225-degree, and 315-degree keys.
The mouse movement can have one small-move key per direction: UP, DOWN, LEFT, RIGHT, 45, 135, 225, and 315. This is useful when the mouse is near the click target, since a tiny movement lets the mouse settle on the target. It can also have one large-move key per direction: UP8, DOWN8, LEFT8, RIGHT8, 45-8, 135-8, 225-8, and 315-8, where "8" means eight times the distance of a small movement. This is useful when the mouse is some distance from the target, since a large movement reaches it with fewer click gesture actions.
No mouse key selection click is locked in the 3rd selection click zone; that is, every mouse key can be clicked again in the 3rd selection click zone without pulling the hand back. Combined with the Fish Swimming Fin gesture, the user can very easily steer the mouse, point accurately at the target, and perform the mouse click functions. See the "//" comments beside the array key definitions below for the move distances and multiple-speed keys.
A 7-row, 17-column puzzle cell Mouse Key Controller map is preferred. The puzzle cell size (H x W) is calculated by dividing the Workspace Zone size (H x W) by the number of rows and columns, as sketched below.
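As a minimal sketch of that division, assuming the zone dimensions are held in pixel-like units (the variable names and values here are illustrative, not the prototype's):

// Workspace Zone dimensions divided by the grid give one cell's size.
double workZoneHeight = 600;                  // assumed zone height
double workZoneWidth  = 900;                  // assumed zone width
int rows = 7, columns = 17;                   // Mouse Key Controller grid

double cellHeight = workZoneHeight / rows;    // puzzle cell height
double cellWidth  = workZoneWidth / columns;  // puzzle cell width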
Here is an example C# coding function that arranges the Puzzle Cell Map List for the Mouse Key Controller commands as a two-dimensional string array in C# code. This is directly copied from the robot working prototype C# program. Copyright.
string[,] puzzleCellMapList = new string[18, 8]; // [column 1-17, row 1-7]

// Start every cell empty; columns 7-11 stay empty because the center
// area of the panel is reserved for the real-time video image.
for (int col = 1; col <= 17; col++)
    for (int row = 1; row <= 7; row++)
        puzzleCellMapList[col, row] = "";

// First row reserved for the Robot menu.
puzzleCellMapList[2, 1] = "MKEY";

// Row 2: large (8x) moves; re-clickable, multiple clicks in one gesture.
puzzleCellMapList[2, 2]  = "ML315-8"; // move mouse large 315 degrees
puzzleCellMapList[3, 2]  = "MU8";     // move mouse large Up
puzzleCellMapList[4, 2]  = "MU8";
puzzleCellMapList[5, 2]  = "MU8";
puzzleCellMapList[6, 2]  = "MR45-8";  // move mouse large 45 degrees
puzzleCellMapList[12, 2] = "ML315-8";
puzzleCellMapList[13, 2] = "MU8";
puzzleCellMapList[14, 2] = "MU8";
puzzleCellMapList[15, 2] = "MU8";
puzzleCellMapList[16, 2] = "MR45-8";

// Row 3: small moves.
puzzleCellMapList[2, 3]  = "ML8";
puzzleCellMapList[3, 3]  = "ML315";   // move mouse small 315 degrees
puzzleCellMapList[4, 3]  = "MU";      // move mouse small Up
puzzleCellMapList[5, 3]  = "MR45";    // move mouse small 45 degrees
puzzleCellMapList[6, 3]  = "MR8";
puzzleCellMapList[12, 3] = "ML8";
puzzleCellMapList[13, 3] = "ML315";
puzzleCellMapList[14, 3] = "MU";
puzzleCellMapList[15, 3] = "MR45";
puzzleCellMapList[16, 3] = "MR8";

// Row 4: horizontal moves, with ENTER at the outer edges.
puzzleCellMapList[1, 4]  = "ENTER";   // Enter key
puzzleCellMapList[2, 4]  = "ML8";     // move mouse large Left
puzzleCellMapList[3, 4]  = "ML";      // move mouse small Left
puzzleCellMapList[5, 4]  = "MR";      // move mouse small Right
puzzleCellMapList[6, 4]  = "MR8";     // move mouse large Right
puzzleCellMapList[12, 4] = "ML8";
puzzleCellMapList[13, 4] = "ML";
puzzleCellMapList[15, 4] = "MR";
puzzleCellMapList[16, 4] = "MR8";
puzzleCellMapList[17, 4] = "ENTER";

// Row 5: small downward and diagonal-down moves.
puzzleCellMapList[2, 5]  = "ML8";
puzzleCellMapList[3, 5]  = "ML225";   // move mouse small 225 degrees
puzzleCellMapList[4, 5]  = "MD";      // move mouse small Down
puzzleCellMapList[5, 5]  = "MR135";   // move mouse small 135 degrees
puzzleCellMapList[6, 5]  = "MR8";
puzzleCellMapList[12, 5] = "ML8";
puzzleCellMapList[13, 5] = "ML225";
puzzleCellMapList[14, 5] = "MD";
puzzleCellMapList[15, 5] = "MR135";
puzzleCellMapList[16, 5] = "MR8";

// Row 6: large (8x) downward and diagonal-down moves.
puzzleCellMapList[2, 6]  = "ML225-8"; // move mouse large 225 degrees
puzzleCellMapList[3, 6]  = "MD8";     // move mouse large Down
puzzleCellMapList[4, 6]  = "MD8";
puzzleCellMapList[5, 6]  = "MD8";
puzzleCellMapList[6, 6]  = "MR135-8"; // move mouse large 135 degrees
puzzleCellMapList[12, 6] = "ML225-8";
puzzleCellMapList[13, 6] = "MD8";
puzzleCellMapList[14, 6] = "MD8";
puzzleCellMapList[15, 6] = "MD8";
puzzleCellMapList[16, 6] = "MR135-8";

// Last row reserved for program controls and controller changes.
puzzleCellMapList[2, 7]  = "DCLICK";   // mouse Double Click
puzzleCellMapList[3, 7]  = "LCLICK";   // mouse Left Click
puzzleCellMapList[4, 7]  = "WWT";      // change to WWT control
puzzleCellMapList[5, 7]  = "SLOT";     // change to SLOT control
puzzleCellMapList[6, 7]  = "DJING";    // change to DJING control
puzzleCellMapList[12, 7] = "DCLICK";
puzzleCellMapList[13, 7] = "LCLICK";
puzzleCellMapList[14, 7] = "RCLICK";   // mouse Right Click
puzzleCellMapList[15, 7] = "2NDLIFE";  // change to 2ndLife control
puzzleCellMapList[16, 7] = "ROVER";    // change to ROVER control
} // MKEY
When the user makes a gesture click, the vision tracking function in program 3 uses the puzzle cell position (Row, Column) to look up that (Row, Column) entry in the puzzle cell two-dimensional string array and obtain the text word command. For example, if the user's right hand moves to "MU" and clicks, the program activates the specific web page by generating an HTTP browser command. Example HTTP coding from the working prototype (Copyright): the program places "http://localhost:8000/index.html?HTTP.KEYS_MU" in the browser URL and enters it.
The activated web page link triggers the automation program EventGhost's trigger event (KEYS folder, MU event), which exercises the MU macro script to send out an infrared signal that moves the other computer's mouse position UP a small distance; if "MU8", it moves the mouse UP a large distance; if "ML225", it moves the mouse a small distance at 225 degrees; if "ML225-8", it moves the mouse at 225 degrees by eight times the small distance.
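A hedged sketch of that dispatch step in C# follows: opening the trigger URL in the default browser so the EventGhost HTTP event catches it. The helper name and the folder/event naming convention are assumptions for illustration.

using System.Diagnostics;

static class HttpTrigger
{
    // Build the trigger URL from the puzzle cell command word and open it,
    // so the EventGhost HTTP event (KEYS folder, <command> event) fires.
    public static void Send(string commandWord) // e.g. "MU", "MU8", "ML225-8"
    {
        string url = "http://localhost:8000/index.html?HTTP.KEYS_" + commandWord;
        Process.Start(url); // opens in the default browser on .NET Framework
    }
}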
The puzzle cell keys can be defined in the software function coding to allow multiple clicks, multiple speeds, and different move distances; a key can permit multiple clicks from one gesture action, and each key can be locked or unlocked to enable re-clicking in the 3rd zone. Using the GIR special gesture hand sign, the user can continuously click virtual keys easily inside the 3rd selection click zone.

This key control definition method is used for all other keys and actions in all virtual control panels and keyboards. The first row of a virtual keyboard controller is reserved for the robot function menu, and the last row is reserved for program controls, controller changes, and so on.
Figure 8 is a drawing showing that the Gesture Interface Robot can create any keyboard and control panel the user wants.

If the user selects the WWT controller, the program draws a new virtual puzzle cell map keys control panel graphic image as a virtual control panel WWT 189 for the Microsoft World Wide Telescope program, filling in special WWT command words: "Zoom In" 195, "Up" 194, "Left" 193, "BKSP" 192, "QWERT" 191, and "Enter" 190. On the right side 174, it draws mouse control keys such as "Left Click" 180 and "Right Click" 179, along with all other keys in their cells.

Inside the program function, those text command words can be arranged in code as a two-dimensional array of text strings and then loaded word by word into the row and column cells, so they display on the graphic puzzle cell image and are virtually mapped onto the user's working space zone. Each of the variety of virtual keyboards and control panels has its own control commands filled into its row-column puzzle cells; the virtual keyboard drawings are shown as examples.
If the user selects the SLOT controller, the program redraws the virtual puzzle cell map keys control panel graphic image as a virtual control panel SLOT 196 for controlling a slot machine simulation program.

If the user selects the DJING controller, the program redraws it as a virtual control panel DJING 197 for controlling a disco DJ machine simulation program.

If the user selects the 2ndLife controller, the program redraws it as a virtual control panel 2ndLife 198 for controlling a virtual 3D-world avatar in the 2ndLife viewer program.

If the user selects the ROVER controller, the program redraws it as a virtual control panel ROVER 199 for controlling a Mars rover simulation program: driving the rover robot, taking pictures, transmitting pictures back to Earth, and using the claw and driller to take rock samples, among other intelligent robot operations.
Figure 9 is a drawing showing further examples of virtual keyboards, demonstrating that the Gesture Interface Robot can support a computer using the USB-UIRT to remote-control machines such as a TV, DVD player, SIRIUS radio, and disco light, as well as a special Morse keyboard.

For example, if the user selects the TV controller, the program redraws the virtual puzzle cell map keys control panel graphic image as a virtual control panel TV 200 for controlling TV functions.

If the user selects the DVD controller, the program redraws it as a virtual control panel DVD 201 for controlling DVD functions.

If the user selects the LIGHT controller, the program redraws it as a virtual control panel LIGHT 202 for controlling light functions.

If the user selects the SIRIUS controller, the program redraws it as a virtual control panel SIRIUS 203 for controlling Sirius radio functions.
If the user selects the Morse code keyboard controller, the program redraws it as a virtual control panel Morse code 204 for entering key functions in Morse code. In puzzle cell row 2, column 2, a "." represents "Di", and in puzzle cell row 2, column 4, a "-" represents "DHA"; the user clicks these cells to make "Di" and "DHA" signals.

The (PCMVKCP) program 3 includes functions that convert Morse code signals to A-Z and 0-9, so the user enters Morse code and then clicks CONVERT 193, which translates the code into a character and executes the command. The READ command can be used while entering Morse code to read back what has been entered so far; the user can erase everything to re-enter from scratch, or click BKSP 190 to delete just one "Di" or "DHA" signal. This GIR Morse code keyboard is useful for users with poor eyesight, and blind users, who can enter commands with the simplest gesture actions, "Di" and "DHA", to control machines.
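A hedged sketch of that Morse-to-character conversion in C#, using a lookup table; the table is partial and the class and method names are illustrative, not the prototype's.

using System.Collections.Generic;

static class MorseConverter
{
    // "Di" is recorded as '.', "DHA" as '-'.
    static readonly Dictionary<string, char> Table = new Dictionary<string, char>
    {
        { ".-", 'A' }, { "-...", 'B' }, { "-.-.", 'C' }, { "-..", 'D' },
        { ".", 'E' },  { "..-.", 'F' }, { "--.", 'G' },  { "....", 'H' },
        { "..", 'I' }, { ".---", 'J' }, { ".----", '1' }, { "-----", '0' }
        // ... remaining letters and digits omitted for brevity
    };

    // Returns the converted character, or null if the sequence is unknown.
    public static char? Convert(string morse) // e.g. ".-" -> 'A'
    {
        return Table.TryGetValue(morse, out char c) ? c : (char?)null;
    }
}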
If the user selects the SYMBOLS controller, the program redraws it as a virtual control panel SYMBOLS 205 for making another computer enter and display symbol keys.

Figure 10 is a drawing showing further examples of virtual keyboards, demonstrating that the Gesture Interface Robot can support computer operation functions.
If the user selects the ABC controller, the program redraws the virtual puzzle cell map keys control panel graphic image as a virtual control panel ABC 206 for making another computer enter and display A-Z keys.

If the user selects the 123 controller, the program redraws it as a virtual control panel 123 207 for making another computer enter and display 0-9 keys.

If the user selects the FN controller, the program redraws it as a virtual control panel FN 208 for making another computer enter the function keys F1-F12.

If the user selects the PROGRAM controller, the program redraws it as a virtual control panel PROGRAM 209 for making another computer execute programs. For example, clicking "Take Picture" makes the robot take a picture of the user and save it. If the user clicks the "LOOK UP", "LOOK RIGHT", "LOOK LEFT", or "LOOK DOWN" keys, the robot controls its motor module to rotate its video sensor up, right, left, or down. In the special arrangement of the virtual puzzle cell map keys control panel graphic image, the first row area 211 is reserved for the robot operation function menu and the last row area 212 is reserved for program-type control panels. This makes things easier for the user: to switch to a different controller, look in the last row; to configure a robot support function, look in the first row of the puzzle cell map image. A special "HOME" 210 link can be clicked to return quickly to the root starting position of the program, for example when the user is lost in the menu structure and wishes to jump back to the start.
In addition, the robot uses peripheral devices to control network devices, computers, machines, and intelligent robots. The robot can be equipped with a speech recognition program function 213 using an array of microphones as a sound sensor, and with a voice speaking program function 214 using speakers for voice feedback. The robot vision program can also support a hand sign language function 179: each hand and finger gesture, and its position value on each video frame, is compared against the puzzle cell area to distinguish which hand sign it is, and the program executes the corresponding command.
Figure 11 is a drawing showing an advanced TouchScreen Mouse 224 combined with the puzzle cell virtual keyboard 221 by the sandwich layers method.

The Gesture Interface Robot supports a new, revolutionary gesture input method for the computer interface: an advanced gesture action, the TouchScreen Mouse 224, whose virtual sandwich layers combine with the virtual control panel key zone function. Robot vision program 3 lets the user decide which hand drives the TouchScreen Mouse 221, 222 while the other hand clicks the virtual puzzle cell mouse keys, which can be assigned any commands; for mouse functions these include Double Click 195, 175, Left Click 193, Right Click 177, Left Click UP 194, Left Click DOWN 192, Right Click UP 176, Right Click DOWN 178, 190, Wheel UP, Wheel DOWN, and so on.

For example, if the user clicks the virtual mouse 222 function with the right hand on the title menu 211 of the virtual mouse, robot program 3 activates the virtual TouchScreen Mouse 224 function, disables right-hand selection and enables left-hand selection only on the virtual keys, and tracks the user's right hand 77 location, moving the mouse 224 position accordingly on the display monitor 43. If the user's right hand 77 moves UP, the robot program moves the mouse 224 cursor position UP on the monitor 43 in proportion to the hand's move distance 78. The move distance is determined by the hand's location on the right side of the Work Zone space 76: the robot program calculates the ratio of the X 234, Y 224 distances from the virtual center point 79 and moves the mouse 224 cursor by the same ratio distance 232 in the same direction. Therefore, if the user's right hand 77 traces a circle, the mouse 224 cursor traces a circle on the monitor 43 in real time.

When the user moves the mouse 224 cursor onto a specific position, which could be an Internet browser web page on the computer desktop screen 226, the user can push the right hand out; the robot recognizes the click selection and performs the mouse LEFT click as the default selection click action. Sometimes another mouse click action is required. For example, the other hand 82 can click the virtual key Double Click 195; the user then moves the right hand 77 to place the TouchScreen Mouse 224 cursor on a program icon and pushes the hand out, and robot program 3 performs a Double Click 195 for that click instead of the default Left click, so the program icon is double-clicked and the program opens and runs. The other virtual mouse puzzle cell keys are also useful when a specific mouse click action is needed. For example, when viewing a large page or a drawing image page, performing Left Click DOWN 192 makes the whole drawing sheet follow the right hand 77 in every direction; when the user has moved the image sheet to the right location, clicking the virtual Left Click UP key 194 releases the TouchScreen Mouse 224 grip action and returns to the default.

The TouchScreen Mouse 224 can be driven by the right hand 77 or the left hand 82, and each hand's mouse 224 cursor preferably starts at its own initial location. Because the robot program's vision calibrates the user working space zone 76 into 4 sections, with X 218 and Y 216 dimension lines crossing at the virtual center point 79, the zone divides into section I (X+, Y+) 217, section II (X-, Y+) 215, section III (X+, Y-) 219, and section IV (X-, Y-) 220. The right hand 77 position is therefore determined using the X, Y values across sections I, II, III, and IV.
Here are the steps for controlling the mouse position in each section: obtain the current mouse X, Y position on the monitor screen, then add the right hand 77 gesture distance X, Y multiplied by the screen resolution ratio. This is directly copied from the robot working prototype C# program. Copyright. The first step obtains the current mouse X, Y position; the program then recalculates the new position on screen:

// (current mouse X position + gesture distance X * screen resolution width ratio)
leftofscreen = mouseScreenSetUpX + (int)mouseSelectHandX * mouseScreenResoultionRatioX;

// (current mouse Y position + gesture distance Y * screen resolution height ratio)
topofscreen = mouseScreenSetUpY + (int)mouseSelectHandY * mouseScreenResoultionRatioY;

// Assign the new mouse X, Y values to move the mouse to its new position.
mouseSeletX = leftofscreen;
mouseSeletY = topofscreen;

Then the program moves the mouse to the new position on the display monitor.
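To actually move the cursor, one hedged option on Windows is the Win32 SetCursorPos call via P/Invoke; this sketch is an assumption about how the move could be performed, not the prototype's confirmed code.

using System.Runtime.InteropServices;

static class CursorMover
{
    // Win32 API: place the cursor at absolute screen coordinates.
    [DllImport("user32.dll")]
    static extern bool SetCursorPos(int X, int Y);

    public static void MoveTo(int leftofscreen, int topofscreen)
    {
        SetCursorPos(leftofscreen, topofscreen);
    }
}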
For the user's convenience in moving the hand that controls the mouse pointer, the Right Hand 77 TouchScreen Mouse 224 program function can be set up so that the cursor starts in the monitor's LEFT-TOP corner position 231, which is the video card's 0,0 monitor position. On the other hand, the left hand 82 position is determined using the X 223, Y 229 values around the center point 79 of sections I, II, III, and IV, and the LEFT Hand TouchScreen Mouse 224 program function can set the cursor's start position to the monitor's RIGHT-BOTTOM corner position 227. For example, if a monitor video card uses a resolution of 1900 x 1200, 228, 230, the cursor start position is 1900 x 1200 on the monitor. The robot program determines its video view frame's width and height ratio, compares it with the monitor screen resolution ratio, and moves the mouse cursor distance accordingly with the hand in all directions through 360 degrees. The TouchScreen Mouse 224 can also use the gesture click action with the computer's virtual keyboard key buttons, clicking key buttons on the computer monitor.

If the right hand mouse is selected, the right hand moves the mouse position and the left hand works the key zone; if the left hand mouse is selected, the left hand moves the mouse position and the right hand works the key selection zone.

If the computer Windows desktop screen 226 is tiled with clickable buttons, the user can use the TouchScreen Mouse 224 to select which button to click by gesture action.
The various mouse option keys in the selection zone can be coded this way. This is directly copied from the robot working prototype C# program. Copyright.

Example:

if (mouseClickTypeSelection == 0)
{
    DoMouseClick();          // default Left mouse click
}
else if (mouseClickTypeSelection == 1)
{
    DoMouseLeftClickUp();    // if key Left Up selected
}
else if (mouseClickTypeSelection == 2)
{
    DoMouseDoubleClick();    // if key Double Click selected
}
else if (mouseClickTypeSelection == 3)
{
    DoMouseLeftClickDown();  // if key Left Down selected
}
else if (mouseClickTypeSelection == 4)
{
    DoMouseRightClickUp();   // if key Right Up selected
}
else if (mouseClickTypeSelection == 5)
{
    DoMouseRightClick();     // if key Right Click selected
}
else if (mouseClickTypeSelection == 6)
{
    DoMouseRightClickDown(); // if key Right Down selected
}
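The DoMouse... helpers are the prototype's own. As a hedged illustration of how such a helper could be implemented on Windows, this sketch uses the Win32 mouse_event function via P/Invoke; it is an assumption, not the prototype's confirmed code.

using System.Runtime.InteropServices;

static class MouseActions
{
    [DllImport("user32.dll")]
    static extern void mouse_event(uint dwFlags, uint dx, uint dy,
                                   uint dwData, int dwExtraInfo);

    const uint MOUSEEVENTF_LEFTDOWN = 0x0002;
    const uint MOUSEEVENTF_LEFTUP   = 0x0004;

    // Press and release the left button at the current cursor position.
    public static void DoMouseClick()
    {
        mouse_event(MOUSEEVENTF_LEFTDOWN, 0, 0, 0, 0);
        mouse_event(MOUSEEVENTF_LEFTUP, 0, 0, 0, 0);
    }
}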
When the user has moved the mouse to the target position, for example a web page, the user gestures a push to click. In another example, with a program icon on the desktop screen, the user clicks the virtual Double Click key with the left hand and then pushes the right hand to click on the program icon, opening and running the program. The hand gestures therefore control both the mouse movement and the choice of mouse click action for operating the computer and its programs.

In summary, this TouchScreen Mouse combined with virtual puzzle cell key control panels in sandwich interface layers is an advanced gesture system that subsumes all current computer interface device input methods into one true universal computer interface device. It enables the user to control all machine functions together with easy gestures, without building a physical mouse, keyboard, or remote controller, and without building control interfaces onto equipment, machines, or robots. The Gesture Interface Robot will replace the need to build physical control panels and interface devices, reducing high-tech device pollution and saving material resources on Earth.
Figure 12 is a drawing showing the enhanced wireless select-key indication device 235, 236 worn on the user's hand palm 82, arms, or body. The wireless indication device has two styles. Style 235 includes a micro controller 240, Bluetooth 239, an LED light 242, a vibration motor 244, and a power source 237, with a flexible belt 245 that holds tight on the hand palm 82. The second style 236 includes a micro controller 240, a wireless Wi-Fi TCP/IP network card 246, an LCD display screen 247, a vibration motor 244, a power source 237, and a watch belt to hold the device on the hand 72.
When the user pushes the hand 82 out, the program sends wireless network signals to the device to indicate which key is selected, for example by blinking the LED light 242 in Morse code signals and/or using the vibration motor 244 to make long-short Morse code vibrations. The user therefore does not need to watch the display monitor 43 to know which keys they selected, a feature especially useful for users with poor eyesight, and blind users. The LCD screen can display real-time monitor content, showing the puzzle cell map image.
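As a hedged sketch of the host side of that feedback, this C# fragment turns a Morse pattern into the blink or vibration durations that could be transmitted to the wearable device; the 150 ms / 450 ms timings are assumed for illustration, not taken from the prototype.

using System.Collections.Generic;

static class MorsePulse
{
    // Convert a Morse pattern (e.g. "..-" for U) into pulse durations
    // in milliseconds: short "Di" = 150 ms, long "DHA" = 450 ms (assumed).
    public static int[] ToPulseDurations(string pattern)
    {
        var durations = new List<int>();
        foreach (char c in pattern)
            durations.Add(c == '-' ? 450 : 150);
        return durations.ToArray();
    }
}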
Figure 13 is a drawing showing a wireless display glass 46 whose network protocol equipment 45 includes wireless network card equipment 249 and video image processing card equipment 250, connected to a projector 252 and a power source 247, with a wireless server-client program to connect with the robot. The robot sends the display signals of the puzzle cell map image with the hand selection positions 253, 265, and the wireless display glass projector 252 projects the puzzle cell image keys onto its lenses 246, so the user can see which keys they select. The left side area 269 is for the left hand keys 270, 271, 272, 273, 274, 275, and the right side area 266 is for the right hand keys 259, 260, 261, 262, 263, 264. The center area of the lenses can optionally be reserved for displaying robot text feedback 268 and a real-time video image of the user's actions 267.

Figure 14 is a drawing of the robot equipped with a mobile platform, for example using a micro controller board to control a variety of motors 26, 30, so that the robot main computer 1 vision program can intelligently control these motors' rotation; as a result, the robot intelligently drives itself around 276 and can steer its display projector 44 direction to project puzzle cell keyboard images 277 onto any surface 278.
Here is the Arduino programming code showing how the micro controller drives the motor module rotation, controlling more than one motor with a single signal array. This is directly copied from the robot working prototype Arduino code. Copyright.
#include <Servo.h>

Servo servo;   // pan servo
Servo servoY;  // tilt servo

void setup() {
  servo.attach(11);   // digital pin 11
  servoY.attach(10);  // digital pin 10
  Serial.begin(9600);
  servo.write(90);    // center both servos on startup
  servoY.write(90);
}

void loop() {
  if (Serial.available() >= 2) {
    byte pos = Serial.read();    // which servo: 1 = pan, 2 = tilt
    byte posXY = Serial.read();  // target angle, 0-180
    if (pos == 1) {
      servo.write(posXY);
      delay(5);
    }
    else if (pos == 2) {
      servoY.write(posXY);
      delay(5);
    }
  }
}
This Arduino sketch can be downloaded to the Arduino micro controller, and the Arduino board's COM port connected to the (PCMVKCP) program 3, so the robot vision program can intelligently send value strings to the Arduino to rotate its motors' direction and speed. The motor module can be used for the video sensor's tilt and pan rotation, robot body movement, neck, arms, legs, and mobile wheels.
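A hedged sketch of the computer side of that COM-port link in C#, writing the two bytes (servo selector, angle) that the Arduino loop above expects; the port name is an assumption.

using System.IO.Ports;

static class ServoLink
{
    // Send one two-byte command: which servo (1 = pan, 2 = tilt) and an angle 0-180.
    public static void SendAngle(string portName, byte servo, byte angle)
    {
        using (var port = new SerialPort(portName, 9600))
        {
            port.Open();
            port.Write(new byte[] { servo, angle }, 0, 2);
        }
    }
}

// Usage: ServoLink.SendAngle("COM3", 1, 120); // pan servo to 120 degrees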
The variety of motor control modules can be built into the robot's neck, body, arms, hands, and legs, so the robot can be built in human shape with physical body movement ability alongside the Gesture Interface Robot puzzle cell map function. The Gesture Interface Robot becomes the communication bridge between humans and the intelligent robot machine world. This invention proposal's Gesture Interface Robot example uses the Microsoft Kinect sensor, Microsoft Visual Studio C# programming, and an Arduino micro controller board to build a complete working Gesture Interface Robot demonstration. Alternative methods are also available for building a customized Gesture Interface Robot.

Claims

I Claim:
1. A completed working example model of a Gesture Interface Robot (GIR) that intelligently creates a comfortable hand gesture moving and push area for the user, preventing injury, with sandwiched virtual control layer zones, and that is able to send infrared signals to control another computer, machine, or intelligent robot through simple hand-gesture click selection of virtual puzzle cell command keys, using components that include:
1. Main Computer 1
2. A video vision sensor module, which can be a Microsoft Kinect sensor, with a built-in Universal Infrared Receiver Transmitter (UIRT) 14 added to the video sensor module as an additional IR remote control feature for physically operating machines.
3. Micro Controller Board 21, which can be an Arduino board.
4. A variety of motor modules.
5. USB Universal Infrared Receiver Transmitter (USB-UIRT).
6. Display monitor.
2. The Gesture Interface Robot according to Claim 1, optionally equipped with wireless network equipment such as a Bluetooth network card, a Wi-Fi network card, or any wireless network protocol card device: TCP/IP and Internet protocols such as XBee, Ethernet, Wi-Fi, Bluetooth, cell phone channels (3G, 4G, GSM, CDMA, TDMA, etc.), space telecommunication channels, and satellite channels. The Gesture Interface Robot according to Claim 1 may optionally be equipped with display devices such as a display monitor, an image projector, or wireless network (for example TCP/IP or Bluetooth) display monitor glasses. The Gesture Interface Robot according to Claim 1 Main Computer power source can be a wall power plug or a portable power source: rechargeable batteries, solar cells, fuel cells, a rotation generator, a wind turbine, a thermoelectric generator (TEG), etc., to regenerate electric power for the robot to move and operate. The Gesture Interface Robot according to Claim 1 may optionally be equipped with a mobile motor wheel platform, with motorized wheels whose rotation direction and speed are controlled; all robot components can be placed on the platform, and the robot can use its video vision function to drive itself and move around.
3. The Gesture Interface Robot according to Claim 1, whose Main Computer 1 serves as the robot's brain to process video images. The 3-dimensional X, Y, Z values of the user's body joint locations can be read by a Microsoft Visual C# program 4 (or VB) that calls the Kinect and other system assembly libraries, enabling the Kinect sensor to read user joint values in the program. The Gesture Interface Robot according to Claim 1: because these basic video sensor readings of the user's 3D body joint values are available, we can write a specific puzzle cell map virtual keyboard control program (PCMVKCP) 3 that transforms the basic 3D joint values, intelligently measures and calibrates them into a new gesture interface input workspace zone, and establishes a puzzle cell virtual keyboard in that zone, so the user can move their hands and push out to click virtual keys. The Kinect sensor functions that read joint values can be coded into the (PCMVKCP) program 3, and program 3 can be a class program (for example MainWindow.xaml.cs) included in the Microsoft Visual Studio C# 4 project and built into one project solution, preferably a WPF Application type project, so that all the Kinect video sensor reading values are available to the (PCMVKCP) program in real time for creating a dynamic user graphic interface.
4. The Gesture Interface Robot according to Claim 1, which uses the 3-dimensional X, Y, Z body part vision values so that the robot vision puzzle cell map virtual keyboard control program (PCMVKCP) 3 can create the work zone, establish puzzle cell map virtual keyboards in a comfortable hand moving and push gesture area, prevent injury, provide real-time user hand locations, convert them to puzzle cell positions, match each puzzle cell row-column against its puzzle cell command map list, transfer the cell position into a computer command, and send the command to automation program 2 (such as EventGhost) to run a pre-recorded macro script that executes commands such as typing displayed text, running a computer program, or sending infrared signals to remote-control a TV, DVD player, or another computer for typing, mouse movement, mouse clicks, running computer programs, Internet browsing, and other computer operations.
5. The Gesture Interface Robot according to Claim 1, whose main computer 1 includes a web server function 41, such as an IIS server, that can establish an internal server-client network, DNS server, TCP/IP URL, namespace, etc., host web sites, and provide HTML, XAML, and scripting functions. The (PCMVKCP) program 3 can activate a web browser and send a web page URL that includes a specific text code; when that particular web page is run and opened, automation program 2 (such as EventGhost) detects the particular text code trigger and triggers the macro action in the corresponding folder. Example: place "http://localhost:8000/index.html?HTTP.KEYS_MU" in the browser URL and enter it. The activated web page link triggers the automation program EventGhost trigger event (KEYS folder, MU event) and exercises the MU macro script to send out an infrared signal that moves the other computer's mouse position UP a small distance; if "MU8", it moves the mouse UP a large distance; if "ML225", it moves the mouse a small distance at 225 degrees; if "ML225-8", it moves the mouse at 225 degrees by eight times the small distance.
6. The Gesture Interface Robot according to Claim 4, regarding the comfortable hands moving and push-out area: the Gesture Interface Robot's vision (PCMVKCP) program can use the video sensor reading values to create a well-fitted Workspace zone according to the user's body measurements. The (PCMVKCP) program assigns a virtual center point on the user, preferably the shoulder center joint point. The preferred Workspace zone width is the total length of each shoulder length x 1.5 (1.5 + 1.5 = 3), and the preferred Workspace zone height is the total length from Shoulder Center to the top of the head x 2. The Workspace zone tracks in front of the user according to the user's shoulder center joint point 79; therefore, when the user walks or moves, the Workspace zone is always in the same place in front of the user. While the user walks within the video viewable area, the software keeps tracking digitally, and when the user walks past the edge of the video view area, the (PCMVKCP) program activates the intelligent motor module to rotate the video sensor and keep it aimed at the user.
7. The Gesture Interface Robot according to Claim 1, wherein once the Workspace zone size is defined, the (PCMVKCP) program divides the Workspace zone into a puzzle cell row-column formation. The selection click zone, the Z-dimension space in which the user's hand gesture clicks are detected, is divided into 3 selection mode zones: the 1st zone unlocks the selected key gate, the 2nd zone is for moving to select a virtual key, and the 3rd zone is the select click zone.
8. The Gesture Interface Robot according to Claim 1, which uses the puzzle cell mapping method. The Gesture Interface Robot acts like a graphic image painter (a Picasso): the robot program draws the graphic picture of the virtual control panel keys. On the display monitor, the puzzle cell virtual control panel keys can be drawn as a grid image of row and column cells tiled with TextBlock fields; text words are then filled into the TextBlock field of each grid row and column cell on the graphic image as commands. Inside the program function, those text command words can be arranged in code as a two-dimensional array of text strings and loaded word by word into the row and column cells, so they display on the graphic puzzle cell image and are virtually assigned onto the user's working space zone. The Gesture Interface Robot can create any keyboard or control panel the user wants, including a computer QWERT keyboard, ABC, 123, function keys, symbol keys, mouse keys, and program keys, and can create virtual control panels for machines such as a TV, DVD player, Sirius radio, light, slot machine, space telescope, disco DJ machine, disco light, virtual 3D-world avatar, Mars rover, and intelligent robot. The robot vision program draws the virtual puzzle cell map keys control panel graphic image on the display monitor and highlights the selected keys as a visual indication, so the user knows which keys they have selected.
9. A special Gesture Interface Robot with special unique finger hand sign gestures that enhance puzzle cell selection control accuracy: a special GIR gesture hand sign that moves the fingers like a spider walking its legs to change the nearby puzzle cell selection. The gesture moves like spider legs walking, with the puzzle cell row-column lines like a spider web net; with tiny finger movements walking up, down, left, or right, the program detects the change of selected key.
10. The special Gesture Interface Robot according to Claim 9, with special unique finger hand sign gestures that enhance puzzle cell selection control accuracy: a special hold-to-grip click hand sign. This special GIR hand gesture feature is useful when the user must click a virtual key with certainty in an emergency, such as a spaceship out of control, or when the user has a hand-shaking illness; the program supports this need.
11. The special Gesture Interface Robot according to Claim 9, with special unique finger hand sign gestures that enhance puzzle cell selection control accuracy: a special GIR hand sign gesture for continuous clicking without pulling the hand back to unlock. By moving like a fish swimming its fins, the fingers 100, 105, 106, 107, 108 move up and down so the hand palm center blinks; the vision tracking function detects this and performs continuous clicks without the hand being pulled back to the 1st zone.
12. The special Gesture Interface Robot according to Claim 9, with special unique finger hand sign gestures that enhance puzzle cell selection control accuracy: a special GIR hand sign gesture that looks like a gun pointing at a puzzle cell. The program sees the hand held closed and then the finger pointed out, which makes the difference, and locks the key selection. The tiny gun-point area improves vision-tracking accuracy, so the user can rotate the finger-gun point with small movements to change the key selection.
13. The Gesture Interface Robot according to Claim 1, with a special interface arrangement: the virtual puzzle cell map keyboard and control panel preferably use a special interface section arrangement divided into Left and Right hand zones. The center area of the virtual puzzle cell map keys control panel graphic image is reserved for a real-time video image showing the user's actions, so the user can see themselves and all the virtual control keys together on the monitor. This special virtual gesture interface arrangement gives good visual feedback indication controls and is easy on the eyes during user operation.
14. The Gesture Interface Robot according to Claim 1, able to listen to voice commands and speak voice feedback. If the user says the keywords "LOOK UP", "LOOK RIGHT", "LOOK LEFT", or "LOOK DOWN", the robot controls its motor module to rotate its video sensor up, right, left, or down; if the user calls the robot's name, the robot determines the sound source, automatically locates the user, and rotates the motor to point the video sensor's view at the user.
15. The Gesture Interface Robot according to Claim 13, whose puzzle cell Morse code keyboard is useful for users with poor eyesight, and blind users, to enter commands with the simplest gesture actions, "Di" and "DHA", to control machines.
16. The Gesture Interface Robot enables an advanced gesture interface in which the TouchScreen Mouse combines with the puzzle cell virtual keyboard by the sandwich layers method. It is a virtual touch screen on which the user can use hand gestures to move the mouse position and select the mouse click action to operate the computer.
17. The Gesture Interface Robot according to Claim 16, regarding the Gesture Interface Robot TouchScreen Mouse: for the user's convenience in moving the hand that controls the mouse pointer, the Right Hand TouchScreen Mouse program function can be set up so the cursor starts in the monitor's LEFT-TOP corner position, the video card's 0,0 monitor position. On the other hand, the left hand position is determined using the X, Y values around the center point of sections I, II, III, and IV, and the LEFT Hand TouchScreen Mouse program function can set the cursor's start position to the monitor's RIGHT-BOTTOM corner position. The Gesture Interface Robot puzzle cell keys can be defined in the software function coding to allow multiple clicks, multiple speeds, and different move distances, to enable multiple clicks from one gesture action, and to control locking or unlocking of a key so it can be re-clicked in the 3rd zone; using the GIR special gesture hand sign, the user can continuously click virtual keys easily in the 3rd selection click zone.
18. The Gesture Interface Robot according to Claim 1, optionally equipped with the enhanced wireless select-key indication device worn on the user's hand palm, arms, or body. The wireless indication device has two styles: the 1st style includes a micro controller, Bluetooth, an LED light, a vibration motor, and a power source, with a flexible belt that holds tight on the hand palm; the 2nd style includes a micro controller, a wireless Wi-Fi TCP/IP network card, an LCD display screen, a vibration motor, a power source, and a watch belt to hold the device on the hand. When the user pushes a hand out, the program sends wireless network signals to the device to indicate which key is selected, for example by blinking the LED light in Morse code signals and/or using the vibration motor to make long-short Morse code vibrations, so the user does not need to watch the display monitor to know which keys they selected; this feature is especially useful for users with poor eyesight, and blind users. The LCD screen can display real-time monitor content, showing the puzzle cell map image.
19. The Gesture Interface Robot according to Claim 1, equipped with a mobile platform, for example using a micro controller board to control a variety of motors, so that the robot main computer vision program can intelligently control the motors' rotation; as a result, the robot intelligently drives itself around and can steer its display projector direction to project puzzle cell keyboard images onto any surface. The Gesture Interface Robot is equipped with a micro controller board such as an Arduino, which can be programmed and connected by COM port to the (PCMVKCP) program in the main computer, so the robot vision program can intelligently send value strings to the Arduino board to rotate its motors' direction and control their speed. The motor module can be used for video sensor tilt and pan rotation, robot body movement, neck, arms, legs, and mobile wheels. The variety of motor control modules can be built into the robot's neck, body, arms, hands, and legs, so the robot can be built in human shape with physical body movement ability alongside the Gesture Interface Robot puzzle cell map function; the Gesture Interface Robot becomes the communication bridge between humans and the intelligent robot machine world.
20. The Gesture Interface Robot according to Claim 1, with a wireless display glass whose network protocol equipment includes wireless network card equipment and video image processing card equipment, connected to a projector, a power source, and a wireless server-client program to connect with the robot. The robot sends the display signals of the puzzle cell map image with the hand selection positions, and the wireless display glass projector projects the puzzle cell image key information onto its lenses, so the user can see which keys they select. The left side screen area is for the left hand keys and the right side screen area is for the right hand keys; the center area of the lenses can optionally be reserved for displaying robot text feedback and a real-time video image of the user's actions.
PCT/CA2015/050493 2014-06-08 2015-05-29 Gestural interface with virtual control layers WO2015188268A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CA2917590A CA2917590A1 (en) 2014-06-08 2015-05-29 Gestural interface with virtual control layers

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201462009302P 2014-06-08 2014-06-08
US62/009,302 2014-06-08
US14/723,435 US9696813B2 (en) 2015-05-27 2015-05-27 Gesture interface robot
US14/723,435 2015-05-27

Publications (1)

Publication Number Publication Date
WO2015188268A1 true WO2015188268A1 (en) 2015-12-17

Family

ID=54832656

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2015/050493 WO2015188268A1 (en) 2014-06-08 2015-05-29 Gestural interface with virtual control layers

Country Status (2)

Country Link
CA (3) CA2917590A1 (en)
WO (1) WO2015188268A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112894857B (en) * 2021-03-02 2024-04-09 路邦科技授权有限公司 Key control method for clinical auxiliary robot in hospital

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030156756A1 (en) * 2002-02-15 2003-08-21 Gokturk Salih Burak Gesture recognition system using depth perceptive sensors
US20100231522A1 (en) * 2005-02-23 2010-09-16 Zienon, Llc Method and apparatus for data entry input
US8552983B2 (en) * 2007-07-11 2013-10-08 Hsien-Hsiang Chiu Intelligent robotic interface input device
US20140006997A1 (en) * 2011-03-16 2014-01-02 Lg Electronics Inc. Method and electronic device for gesture-based key input

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105773633A (en) * 2016-04-14 2016-07-20 中南大学 Mobile robot man-machine control system based on face location and flexibility parameters
CN105999670A (en) * 2016-05-31 2016-10-12 山东科技大学 Shadow-boxing movement judging and guiding system based on kinect and guiding method adopted by same
CN106514667A (en) * 2016-12-05 2017-03-22 北京理工大学 Human-computer cooperation system based on Kinect skeletal tracking and uncalibrated visual servo
CN106826846B (en) * 2017-01-06 2020-02-14 南京赫曼机器人自动化有限公司 Intelligent service robot and method based on abnormal sound and image event driving
CN106826846A (en) * 2017-01-06 2017-06-13 南京赫曼机器人自动化有限公司 The intellect service robot and method driven based on abnormal sound and image event
CN107193385A (en) * 2017-06-29 2017-09-22 云南大学 It is a kind of based on methods of the Kinect to keyboard Behavior modeling
US11949533B2 (en) 2017-09-15 2024-04-02 Kohler Co. Sink device
US11921794B2 (en) 2017-09-15 2024-03-05 Kohler Co. Feedback for water consuming appliance
US10448762B2 (en) 2017-09-15 2019-10-22 Kohler Co. Mirror
US11314214B2 (en) 2017-09-15 2022-04-26 Kohler Co. Geographic analysis of water conditions
US10663938B2 (en) 2017-09-15 2020-05-26 Kohler Co. Power operation of intelligent devices
US11892811B2 (en) 2017-09-15 2024-02-06 Kohler Co. Geographic analysis of water conditions
US10887125B2 (en) 2017-09-15 2021-01-05 Kohler Co. Bathroom speaker
US11314215B2 (en) 2017-09-15 2022-04-26 Kohler Co. Apparatus controlling bathroom appliance lighting based on user identity
US11099540B2 (en) 2017-09-15 2021-08-24 Kohler Co. User identity in household appliances
CN107639620A (en) * 2017-09-29 2018-01-30 西安交通大学 A kind of control method of robot, body feeling interaction device and robot
CN108638069B (en) * 2018-05-18 2021-07-20 南昌大学 Method for controlling accurate motion of tail end of mechanical arm
CN108638069A (en) * 2018-05-18 2018-10-12 南昌大学 A kind of mechanical arm tail end precise motion control method
CN108829252A (en) * 2018-06-14 2018-11-16 吉林大学 Gesture input computer character device and method based on electromyography signal
CN111694428B (en) * 2020-05-25 2021-09-24 电子科技大学 Gesture and track remote control robot system based on Kinect
CN111694428A (en) * 2020-05-25 2020-09-22 电子科技大学 Gesture and track remote control robot system based on Kinect
US20230071312A1 (en) * 2021-09-08 2023-03-09 PassiveLogic, Inc. External Activation of Quiescent Device

Also Published As

Publication number Publication date
CA2917590A1 (en) 2015-12-17
CA3204400A1 (en) 2015-12-17
CA3204405A1 (en) 2015-12-17

Similar Documents

Publication Publication Date Title
US20160350589A1 (en) Gesture Interface Robot
WO2015188268A1 (en) Gestural interface with virtual control layers
CN104520787B (en) Wearing-on-head type computer is as the secondary monitor inputted with automatic speech recognition and head-tracking
CN105144057B (en) For moving the equipment, method and graphic user interface of cursor according to the cosmetic variation of the control icon with simulation three-dimensional feature
Kamel Boulos et al. Web GIS in practice X: a Microsoft Kinect natural user interface for Google Earth navigation
KR102184269B1 (en) Display apparatus, portable apparatus and method for displaying a screen thereof
Lifton et al. Metaphor and manifestation cross-reality with ubiquitous sensor/actuator networks
US20120310622A1 (en) Inter-language Communication Devices and Methods
CN102812417A (en) Wireless hands-free computing headset with detachable accessories controllable by motion, body gesture and/or vocal commands
CN102541256A (en) Position aware gestures with visual feedback as input method
CN103064514A (en) Method for achieving space menu in immersive virtual reality system
US9870139B2 (en) Portable apparatus and method for sharing content with remote device thereof
CN105872201A (en) Method for remotely controlling document display through intelligent terminal, intelligent terminal and computer equipment
CN108037885A (en) A kind of operation indicating method and mobile terminal
Grill et al. ConWIZ: a tool supporting contextual wizard of Oz simulation
JP2018005663A (en) Information processing unit, display system, and program
CN109857299A (en) A kind of display methods and terminal
EP3285143A1 (en) Ar/vr device virtualisation
JP2018005660A (en) Information processing device, program, position information creation method, and information processing system
Takeuchi Synthetic space: inhabiting binaries
Klein A Gesture Control Framework Targeting High-Resolution Video Wall Displays
CN106687917A (en) Full screen pop-out of objects in editable form
KR102468096B1 (en) Electronic board including usb terminal and operating method therefor
Keller et al. A prototyping and evaluation framework for interactive ubiquitous systems
RU2783486C1 (en) Mobile multimedia complex

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2917590

Country of ref document: CA

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15806120

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15806120

Country of ref document: EP

Kind code of ref document: A1