CA3204405A1 - Gestural interface with virtual control layers - Google Patents
- Publication number
- CA3204405A1
- Authority
- CA
- Canada
- Prior art keywords
- input device
- sensing input
- gesture sensing
- user
- virtual
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C17/00—Arrangements for transmitting signals characterised by the use of a wireless electrical link
- G08C17/02—Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C2201/00—Transmission systems of control signals via wireless link
- G08C2201/30—User interface
- G08C2201/32—Remote control based on movements, attitude of remote control device
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Networks & Wireless Communication (AREA)
- Position Input By Displaying (AREA)
- User Interface Of Digital Computer (AREA)
- Manufacture Of Alloys Or Alloy Compounds (AREA)
- Ceramic Products (AREA)
- Glass Compositions (AREA)
Abstract
In a virtual, three-dimensional working space a gesture sensing input device is operative to translate hand gestures of a user into commands for operating a computer or various machines.
The input device tracks the user and recognizes the user's hand gestures by correlating the gestures with defined "puzzle-cell" positions established in virtual working space zones, the "puzzle-cell" positions being mapped for converting the hand gestures into computer commands.
In the virtual working space, a mouse zone, keyboard zone, and hand sign language zone are defined. The working space is further defined by virtual, layered control zones whereby a plane in which a zone lies may be used to determine whether an actuation has occurred by the crossing of a boundary.
Description
SPECIFICATION
CROSS-REFERENCE TO RELATED APPLICATIONS
This patent application is a divisional of Canadian Patent Application No. 2,917,590, filed on January 6, 2016, which claims priority to and the benefit of U.S. Provisional Patent Application Ser. No. 62/009,302, filed on June 8, 2014, and U.S. Non-Provisional Patent Application Ser. No. 14/723,435, filed on May 27, 2015, the entire contents of which are hereby incorporated herein by reference.
Title of the Invention GESTURAL INTERFACE WITH VIRTUAL CONTROL LAYERS
[0001] Field of the Invention
[0002] This invention relates to an intelligent gesture sensing input device using a method of gestural interface with virtual control layers (IGSID-GIVCL), equipped with a video vision sensor that reads the user's hand gestures in order to operate computers, machines, and intelligent robots. The IGSID-GIVCL provides vision puzzle-cell-map virtual keyboard control program (PCMVKCP) functions, which use a puzzle-cell mapping, dynamic, multiple-sandwich-layer work zone of a virtual touch screen, mouse, keyboard, and control panel established within the user's comfortable gesture action area. The IGSID-GIVCL allows the user to simply move the hands and push to click, and it is easy to operate. It does not require the user to make wide hand swings or abnormal body postures through which the user could get hurt or hit objects or people nearby. The puzzle-cell mapping gesture method proposed here is a safe and efficient approach: the user applies simple gesture actions to control all kinds of computer machines together, without having to remember which body posture corresponds to which command. The IGSID-GIVCL displays a real-time highlight on a keyboard graphic image on a display monitor as a visual indication, so the user knows which command is selected and extends the hand forward to confirm the selection. The puzzle-cell mapping gesture command method thus enables simple move-and-click gesture actions to control multiple complex computer machines and robots at the same time.
Date Recue/Date Received 2021-11-21
[0003] Background Art
[0004] The IGSID-GIVCL uses a puzzle-cell mapping, dynamic, multiple-sandwich-layer work zone of a virtual touch screen, mouse, keyboard, and control panel within the user's comfortable gesture action area. The user simply moves the hands and pushes to click, applying easy gesture actions to control complex machines in real time. The IGSID-GIVCL also helps prevent injury.
Problems to Solve and Benefits
[0005] 1. Current gesture systems require users to make large gesture actions that could cause injury. For example, users may hit objects or people nearby. In addition, extending the hands or body muscles rapidly can also lead to injury, and abnormal gesture body actions can hurt the user.
[0006] 2. In our houses, offices, businesses, and everywhere we go, there are countless remote controllers. Too many keyboards, mice, remote controllers, smartphones, and tablet controllers cause trouble: each controller has its own key functions, imposes an unnecessary burden to operate, and requires clicking many keys just to turn on a TV or DVD player to watch. It is difficult to remember which key is on which controller.
[0007] 3. It eliminates the need to build countless physical mice, keyboards, remote controllers, and control-panel interfaces on equipment, transportation, cars, airplanes, spaceships, control office centers, etc. It stops the waste of resources, reduces pollution, and saves money.
[0008] 4. A regular gesture device does not have sufficient functions. It requires big body gesture actions, and it cannot be used to control the complex computer actions that users need.
[0009] 5. A reduction in unnecessary equipment interface installation can benefit a spaceship by reducing weight. It also frees up room space.
[0010] 6. In space, under zero gravity, the IGSID-GIVCL puzzle-cell map method is an ideal solution for an astronaut, because it uses simple gestures to control computers, machines, and intelligent robots in a zero-gravity environment.
[0011] 7. The IGSID-GIVCL makes gesture control possible in all areas. It has the intelligence to make gesture operation easy. It improves our lifestyle and changes the way people operate computers, machines, and robots all around the world.
[0012] 8. Soon, self-driving cars, flight jets, and spaceships will become self-intelligent. People need to be able to communicate with autonomous robots by gesture actions. The IGSID-GIVCL can be the communication bridge between humans and the autonomous robot machine world, and it will change how people operate computers, machines, and intelligent robots throughout the entire world.
Summary of the Invention
Summary of the Invention
[0013] The above problems are solved by the present invention of an IGSID-GIVCL. To solve these problems and provide a better way for humans to control computers, machines, and intelligent robots, the proposed IGSID-GIVCL allows a user to move the hands within a comfortable area to select a virtual puzzle-cell keyboard, mouse, or control-panel key, and to push the hand out toward a selection as a click-select action. The video vision of the IGSID-GIVCL recognizes the user's selection gesture and its location according to a center point assigned on the user. The IGSID-GIVCL uses the relative distance of the hand's location from the center point to determine the X, Y position of the corresponding puzzle cell.
In addition, the IGSID-GIVCL can also recognize the pushed-out location of a hand in the Z direction from the change in distance between the hand and the user's body. For example, when a hand is pushed out, the distance between the hand and the body increases. The maximum of this push distance is the total length of hand and arm that a normal human can extend.
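The X, Y mapping from hand position to puzzle cell can be sketched as follows. This is an illustrative example only, not code from the patent: the grid size, work-zone dimensions, and coordinate units are assumed values.

```python
# Sketch: map a tracked hand position to a puzzle-cell (row, col) index,
# relative to a virtual center point assigned on the user. Grid size and
# work-zone dimensions below are assumptions for illustration.

def hand_to_cell(hand_x, hand_y, center_x, center_y,
                 zone_width, zone_height, cols, rows):
    """Convert a hand position (same units as the video tracker, e.g. cm)
    into a (row, col) cell index inside a work zone centered on the user."""
    # Offset of the hand from the user's assigned center point.
    dx = hand_x - center_x
    dy = hand_y - center_y
    # Normalize into [0, 1) across the work zone, clamping at the edges.
    nx = min(max((dx + zone_width / 2) / zone_width, 0.0), 0.999)
    ny = min(max((dy + zone_height / 2) / zone_height, 0.0), 0.999)
    return int(ny * rows), int(nx * cols)

# Example: a 3x4 grid in a 60-cm-wide, 40-cm-tall work zone.
print(hand_to_cell(10, 0, 0, 0, 60, 40, cols=4, rows=3))  # → (1, 2)
```

A hand at the exact center point lands in the middle cell, and positions outside the zone clamp to the nearest edge cell rather than raising an error.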
[0014] The IGSID-GIVCL can virtually divide this total hand push-out distance into three selection zones. The first selection zone unlocks the user's hand key selection. The second selection zone is for moving a hand in all directions, including up, down, left, and right, to select a virtual key. The third selection zone is the push-out clicking zone. As a result, when the user moves the hands in the second selection zone, the IGSID-GIVCL updates a real-time visual puzzle-cell map to display both hand positions on a graphic puzzle-cell-map control keyboard shown on the monitor as a visual indication to the user. The user therefore knows which virtual key the left hand has selected and which virtual key the right hand has selected. The selected virtual keys are highlighted and increased in font size as indications on the graphic puzzle-cell-map keyboard displayed on the monitor; for example, the left-hand selection key is highlighted in red with an enlarged font size, and the right-hand selection key is highlighted in white with an enlarged font size. When the user places a hand on the desired selection command, the user pushes the hand out in the Z direction into the third selection zone. The video sensor of the IGSID-GIVCL recognizes the user's click action, matches the X, Y position with its puzzle-cell map, translates it into a computer command, and then sends the command to an automation program that has a command script, functions, or macro with an action trigger to execute the command. A web server function of the IGSID-GIVCL can then activate a web browser and enter a URL plus a command text code. The specific web page is opened with the specific command text code corresponding to whatever the user selected, embedded in a web-link format.
When the browser opens the specific web page, the main computer of the IGSID-GIVCL can run an automation program such as EVENTGHOST that detects the trigger action and executes the command included in the same macro. Each web page URL with a particular text code thus activates a different trigger action and executes a different command accordingly.
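The three-zone division of the push-out distance can be sketched as below. The zone boundary fractions (thirds of the full reach) are an assumption for illustration; the patent text says only that the total hand-plus-arm reach is divided into three zones.

```python
# Sketch: classify the hand's Z-direction position into the three selection
# zones described above. Equal thirds of the user's full reach are an
# assumed split, not specified by the patent text.

def classify_zone(hand_body_distance, max_reach):
    """Return which selection zone the hand occupies, based on the
    hand-to-body distance as a fraction of the user's full arm reach."""
    f = hand_body_distance / max_reach
    if f < 1 / 3:
        return "unlock"   # zone 1: unlock the hand key selection
    elif f < 2 / 3:
        return "select"   # zone 2: move up/down/left/right to pick a key
    else:
        return "click"    # zone 3: push out to confirm the selection

print(classify_zone(15, 60))  # hand near the body → unlock
print(classify_zone(55, 60))  # hand pushed far out → click
```

Because the fractions are relative to each user's measured reach, the same gesture works for users with different arm lengths.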
[0015] An automation program such as EVENTGHOST can also recognize a key-click event as a trigger, so the IGSID-GIVCL can send a key click to trigger an action.
However, only a limited number of computer keys can be assigned to particular commands, and any physical key so assigned can no longer be used for its normal typing function.
Therefore, using the web server ITS service and activating a specific web page with a specific text code is the recommended approach. It assigns commands without limit, via different folders on each controlled machine, triggers macro actions, and frees the keys to remain clickable for normal computer functions. An automation program such as EVENTGHOST can have many folders containing saved macros with trigger actions; when it detects the specific trigger command, the macro can execute commands such as sending text-key commands; displaying A-Z, 0-9, symbol keys, and function keys; and opening a computer program, internet browser, word processor, calculator, 3D graphic CAD drawing program, etc. In addition, the automation program such as EVENTGHOST can use a USB-UIRT cable to learn each function-key signal of a physical infrared remote controller and record it in a macro action.
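The URL-as-trigger scheme above can be sketched as a small dispatcher: each trigger URL carries a command text code, and a macro table (standing in for EVENTGHOST's folders and macros) maps that code to an action. The URL shape, code names, and macro bodies here are all illustrative assumptions.

```python
# Sketch of the web-trigger dispatch described above. The macro table
# stands in for EVENTGHOST folders/macros; codes and actions are assumed.

from urllib.parse import urlparse, parse_qs

MACROS = {
    "TV_ON":  lambda: "IR: power toggle sent to TV",
    "VOL_UP": lambda: "IR: volume up sent to TV",
}

def handle_trigger(url):
    """Extract the command text code from a trigger URL and run its macro."""
    code = parse_qs(urlparse(url).query).get("cmd", [""])[0]
    macro = MACROS.get(code)
    return macro() if macro else f"no macro for {code!r}"

print(handle_trigger("http://localhost:8080/trigger?cmd=TV_ON"))
```

Each distinct text code selects a distinct macro, which is the unlimited-assignment property the paragraph claims over the key-click trigger method.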
[0016] When the PCMVKCP program triggers the action, EVENTGHOST sends an infrared signal out through the USB-UIRT cable device. The IR signal can be used to control a physical machine such as a computer, a machine, or an intelligent robot. For example, a robot can send an IR signal to turn a TV on or off. As another example, another computer can be equipped with an IR receiver, and the IGSID-GIVCL can then send IR signals to control that computer: displaying a-z, 0-9, symbols, and function keys; opening computer programs and media; running a DVD player; playing music, video, and games; opening an internet browser; and moving the mouse position with right-click, left-click, double-click, wheel-up, and wheel-down computer functions, etc. As a result, the IGSID-GIVCL can control self-intelligent machines and intelligent robots. Soon, self-driving cars, flight jets, spaceships, and intelligent robots will be used in people's daily lives: at home and in health care, education, medicine, transportation, public services, etc.
[0017] If privately owned automation control features are desired within the robot program, the IGSID-GIVCL PCMVKCP program can be coded directly against the USB-UIRT cable's API library, with an add-in assembled from the available functions.
The IGSID-GIVCL PCMVKCP program can then directly control the USB-UIRT cable to improve the IR signal learning and to send out IR signal commands. The IGSID-GIVCL can directly control a physical machine such as a TV, computer, or other machine from within the PCMVKCP program, without needing a third-party automation program such as EVENTGHOST to run it. Similarly, the IGSID-GIVCL PCMVKCP program can send and enter key commands directly to the active program; for example, it can send an Enter key to Notepad or MICROSOFT WORD, or send text-key commands so that typed words are displayed in the writing program directly, again without a third-party automation program.
[0018] Current regular computer interface control device methods are not able to support input functions efficiently. Currently available gesture input systems use a traditional vision method that reads the user's body postures like looking at art picture images, or they require the user to push the hands rapidly and make wide swinging actions that could injure the user's body or hit people and objects nearby. Abnormal body postures, or pushing the hands out rapidly, require extensive muscle exertion that is not safe for normal people over long hours of controlling or operating machines.
Another problem is that those traditional gesture systems require high image-processing CPU speed and costly computer-vision programs and hardware just to recognize some simple gesture-posture images. They demand high electricity usage, and their video vision still cannot detect every user gesture action accurately; the system must be specifically calibrated for each user and for a specific fixed location. These are the problems of current traditional gesture systems, and they are the reasons why those systems are not widely used in public for real applications.
[0019] By contrast, the proposed IGSID-GIVCL uses the method of gestural interface with virtual control layers and has vision puzzle-cell-map virtual keyboard control program (PCMVKCP) functions. The IGSID-GIVCL acts like a graphic image painter (a Picasso): the PCMVKCP program draws the graphic picture of the virtual control-panel keys.
On the display monitor, the puzzle-cell virtual control-panel keys can be drawn as a grid image of row and column cells, each titled with a text-block field; a text word is then filled into the text-block field of each grid row and column cell on the graphic image as a command.
Therefore, inside the program function, those text command words can be coded and arranged into a two-dimensional array of text strings; each text word is then loaded into its row and column cell, displayed on the graphic puzzle-cell image, and virtually assigned to the user's working-space zone. The user can move around freely or sit in a chair: the IGSID-GIVCL tracks the user in the video view, assigns a virtual center point on the user, creates the work zone, and establishes the virtual control-panel keys in front of the user. An intelligent motor module can additionally rotate the video sensor physically to keep vision tracking aimed at the user if the user walks out of the video view.
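The two-dimensional command array described above can be sketched as follows. The particular layout and command words are an assumed example, not the patent's actual key arrangement.

```python
# Sketch: a two-dimensional array of command text strings, one per
# puzzle-cell row/column, as described above. The layout is an assumed
# example for illustration.

LAYOUT = [
    ["TV ON", "TV OFF", "VOL+", "VOL-"],
    ["A",     "B",      "C",    "D"],
    ["1",     "2",      "3",    "ENTER"],
]

def command_at(row, col):
    """Look up the command word mapped to a puzzle cell."""
    return LAYOUT[row][col]

# Loading loop sketch: each (row, col, word) triple would be written into
# the text-block field of the corresponding grid cell in the UI toolkit.
cells = [(r, c, word)
         for r, row_cells in enumerate(LAYOUT)
         for c, word in enumerate(row_cells)]

print(command_at(0, 2))  # → VOL+
```

Because the same array drives both the on-screen graphic and the virtual work-zone mapping, a recognized cell index translates directly into its command word.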
The IGSID-GIVCL can assign the virtual center point on the user; the preferred workspace virtual center point is the center point of the user's shoulders, where they join the neck, and the work-zone size is established by width and height. The preferred workspace width is 1.5 shoulder lengths on each side, so the total workspace-zone width is 1.5 + 1.5 = 3 shoulder lengths; the preferred workspace height is two to three times the distance from the shoulders' center point up to the center of the user's head. Additional virtual points can be assigned on the user's body for handicapped or disabled users who require special assignments, which could be anywhere on the user's body; a user without arms could use the mouth to hold a pointer stick or a watercolor painting pen to make gesture selections. The left and right shoulder edge points can be added into the PCMVKCP program to enhance the accuracy of reading the selection's X, Y values, and the hand's palm-size value (fingers open versus closed) can be added to enhance the accuracy of reading the click selection.
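The preferred work-zone sizing above reduces to a short calculation; a minimal sketch, assuming body measurements in centimeters from the video tracker:

```python
# Sketch: preferred work-zone size per the description above — width is
# 1.5 shoulder lengths on each side of the shoulder center point (3x total),
# height is 2 to 3 times the shoulder-center-to-head-center distance.

def work_zone(shoulder_length, shoulder_to_head, height_factor=2.0):
    """Return (width, height) of the virtual work zone, in the same
    units as the body measurements (e.g. centimeters)."""
    width = 1.5 * shoulder_length * 2      # 1.5 shoulder lengths per side
    height = height_factor * shoulder_to_head  # factor in the 2-3 range
    return width, height

print(work_zone(40, 30))  # → (120.0, 60.0)
```

Since the zone is derived from each user's own measurements, it scales automatically to users of different sizes without per-user manual calibration.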
Therefore, the IGSID-GIVCL creates a perfectly comfortable workspace zone for the user: the user can move the hands in a comfortable space, in all directions, without difficulty, preventing problems such as self-injury or hitting other people or objects nearby.
Because the IGSID-GIVCL uses the puzzle-cell mapping method, it can instantly draw any virtual mouse, keyboard, or control panel that the user wants. The gesture video sensor requires the user only to make simple hand movements and click actions. The IGSID-GIVCL can be built using a regular computer or a laptop with a video camera, with low system electricity consumption and low equipment cost. It can be used conveniently by everyone, while walking, moving, or sitting, and everywhere.
The IGSID- GIVCL, PCMVKCP program draws the graphic picture of virtual control panels keys.
On the display monitor, the puzzle cell virtual control panel keys can be drawn as a grid image of rows and columns cells and be tilted with Text Block field, then fill in text word to Text Block field on each grid row and column cells on the graphic image as command.
Therefore, inside the program function those text command words can be coded and arranged into a two-dimensional array text strings, then loading each text word into row and column cells, so that display on the graphic puzzle cell image and virtually have assign on user's working space zone. The user can freely work around, sit in chair, the IGSID-GIVCL provides user tracking in video view of the user, assign virtual center point on user and create work zone and establish virtual control panel keys in front of user and plus intelligent motor module that can physical rotate video sensor to aim vision tracking on user accordingly when if use walk out of it video view.
[0020] The IGSID-GIVCL can be used in all areas on the Earth. Furthermore, in a zero-gravity environment, where physical motion is difficult, the IGSID-GIVCL is useful in a spaceship: an astronaut can use gesture actions, moving the hands in front of the body, to control computers, machines, and intelligent robots. The IGSID-GIVCL also frees room space and reduces spaceship weight.
[0021] In addition, for a unique gesture continuous-click action, the IGSID-GIVCL PCMVKCP vision program enables the user to move the hand in front of the body like a fish swimming smoothly with its fins, softly moving each finger up and down like a waving fin, to control a continuous click action in the 3rd click selection zone. In the 3rd selection zone, the user's hand palm makes the fish-fin waving swimming gesture as a hand sign, and the PCMVKCP vision program detects the changing distance: the hand's palm center blinks in and out of visibility like a star in the night sky, each wave producing one blink. The PCMVKCP automatically detects the blinks and continues the click action without requiring the user to pull the hand back to the 1st selection zone to unlock and push out again to reselect. This unique fish-fin waving hand-palm sign makes it very easy for the user to control machines when continuous clicks are required, such as TV volume Up/Down, or moving a computer mouse Up, Down, Left, or Right, etc.
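The blink-counting idea above can be sketched as follows: each finger wave makes the palm depth dip and recover, and each completed dip is counted as one more click. This is an illustrative Python sketch under assumed names and thresholds, not the patent's vision code.

```python
def count_fin_wave_clicks(palm_depths, zone3_threshold, wave_amplitude=0.02):
    """Count continuous clicks from the fish-fin waving gesture (a sketch).

    palm_depths: successive palm-to-body Z distances while the hand stays
    pushed out in the 3rd (click) zone. Each wave makes the depth dip and
    recover ("blink"); each dip past the amplitude counts as one click,
    with no need to pull back to the 1st zone.
    """
    clicks = 0
    last_peak = None
    dipped = False
    for depth in palm_depths:
        if depth < zone3_threshold:
            continue                      # hand not pushed into the 3rd zone
        if last_peak is None or depth > last_peak:
            last_peak = depth
        if last_peak - depth >= wave_amplitude and not dipped:
            clicks += 1                   # one wave "blink" = one click
            dipped = True
        elif depth >= last_peak - wave_amplitude / 2:
            dipped = False                # palm recovered; ready for next wave
    return clicks

# Two finger waves while the hand is held in the 3rd zone -> two clicks.
clicks = count_fin_wave_clicks([0.50, 0.55, 0.52, 0.55, 0.52, 0.55], 0.4)
```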
Date Recue/Date Received 2021-11-21
[0022] A distinguishing new computer interface method is that the IGSID-GIVCL PCMVKCP supports an advanced TouchScreen Mouse gesture action that combines, in virtual sandwich layers, the virtual control panel key zone functions. The PCMVKCP vision program enables the user to decide which hand controls the TouchScreen Mouse, while the other hand can virtually click the virtual puzzle cell mouse keys, to which any commands can be assigned. The mouse functions can include Double Click, Left Click, Right Click, Left Click Up, Left Click Down, Right Click Up, Right Click Down, Wheel Up, Wheel Down, etc. For example, if the user uses the right hand to click a virtual mouse function on the title menu of the virtual puzzle cell control panel, the PCMVKCP program activates the virtual TouchScreen Mouse function: it tracks the user's right-hand location and moves the mouse position on the display monitor accordingly. If the user's right hand moves up, the PCMVKCP program moves the mouse cursor position up on the monitor by a distance corresponding to the hand's moving distance. The moving distance is determined from the hand's location in the right side of the work-zone space: the PCMVKCP program calculates the ratio of the X, Y distance from the virtual center point and moves the mouse cursor position by the same ratio in the same direction. Therefore, if the user's right hand draws a circle, the mouse cursor draws a circle on the monitor in real time. When the user moves the mouse cursor onto a specific position, which could be an internet browser web page on the computer desktop screen, the user can push the right hand out, and the IGSID-GIVCL recognizes the click selection.
It performs a mouse Left click as the default selection click action. Sometimes another mouse click action is required; for one example, the other hand can move to and click the virtual mouse puzzle cell keys.
For another example, if the other hand clicks Double Click, and the user then moves the right hand to place the TouchScreen Mouse cursor on a program icon and pushes the hand out, the PCMVKCP program performs a Double click for that click instead of the default Left click.
Therefore, the program icon is double-clicked and runs. The other virtual mouse puzzle cell keys are also useful when a specific mouse click action is needed. For example, when the user is viewing a large page or a drawing image page, performing Left Click Down makes the whole drawing image sheet follow the right hand's movement in all directions. When the user has moved the image sheet to the right location, virtually clicking Left Click Down again releases the TouchScreen Mouse grip action and returns to the default.
The TouchScreen Mouse can be operated by the right hand or the left hand. Each hand's mouse cursor starting position is preferably initialized at a corresponding starting location.
The IGSID-GIVCL PCMVKCP vision program calibrates the user's working space zone into 4 sections, with X and Y dimension lines crossing at the virtual center point. This divides the zone into 4 sections with the values: section I (X+, Y+), section II (X-, Y+), section III (X+, Y-), and section IV (X-, Y-). The right hand's position is determined using the X, Y values of sections I and III, so the Right-Hand TouchScreen Mouse program function preferably starts the cursor at the monitor's top-left corner, which is the video card's 0,0 position. The left hand's position, on the other hand, is determined using the X, Y values of sections II and IV, so the Left-Hand TouchScreen Mouse program function preferably starts the cursor at the monitor's bottom-right corner. If the monitor video card uses a resolution of 1900x1200, the cursor start position is 1900,1200 on the monitor. The IGSID-GIVCL PCMVKCP program determines its video view frame's width and height ratio, compares it with the monitor screen resolution ratio, and moves the mouse cursor distance accordingly with the hand in all directions, covering 360 degrees. The TouchScreen Mouse can also use gesture click actions with the computer's virtual keyboard key buttons, clicking key buttons on the computer monitor.
If the computer's Windows desktop screen is filled with clickable buttons, the user can use the TouchScreen Mouse to select which button to click by gesture action.
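The quadrant calibration and frame-to-screen cursor scaling described above can be sketched as follows. This is an illustrative Python sketch under assumed names and units (hand offsets in camera-frame pixels), not the PCMVKCP implementation.

```python
def quadrant(dx, dy):
    """Work-zone section for a hand offset from the virtual center point:
    I (X+, Y+), II (X-, Y+), III (X+, Y-), IV (X-, Y-), as in the text."""
    if dx >= 0:
        return 1 if dy >= 0 else 3
    return 2 if dy >= 0 else 4

def cursor_move(dx, dy, frame_size, screen_size):
    """Scale a hand offset in camera-frame pixels to a cursor offset in
    screen pixels using the frame-to-screen width/height ratios."""
    fw, fh = frame_size
    sw, sh = screen_size
    return dx * sw / fw, -dy * sh / fh    # screen Y grows downward

# Right hand below and right of center, 640x480 frame, 1900x1200 monitor.
section = quadrant(64, -48)               # falls in section III (X+, Y-)
move = cursor_move(64, -48, (640, 480), (1900, 1200))
```

Because both axes are scaled by the same fixed ratios, a circle drawn by the hand maps to a (stretched) circle of cursor positions in real time, as the text describes.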
[0023] In summary, the TouchScreen Mouse combined with the virtual puzzle cell key control panels, in sandwich-layer functions, forms an advanced gesture system that includes all current computer interface device methods, to be the one true universal computer interface device. It enables the user to perform gesture control of all machine functions together, and to control a computer by easy gestures, without the need to physically build mice, keyboards, remote controllers, or control interfaces on equipment, machines, and robots. The IGSID-GIVCL will replace the need for building physical control panels and interface devices, reduce high-tech device pollution, and save material resource usage on the Earth.
[0024] The IGSID-GIVCL can be equipped with output display device options, such as a display monitor, a visual image projector that projects on any surface, or wireless monitor glasses that the user can wear to see the projected monitor screen in the lenses. The IGSID-GIVCL can control a wireless BLUETOOTH card attached to a microcontroller board, or a smart phone, to turn an LED light on and off to display the selected text command in Morse code, or to generate long and short vibration signals of the Morse code of the text command. The user can wear the wireless Morse-code text command display device on the back of the palm, with the LED lights facing the user, or like a watch. When the user's hand moves on a puzzle cell, the IGSID-GIVCL PCMVKCP program sends a command to the wireless microcontroller boards to blink the LED light on/off, long and short, to indicate which command is selected, and/or to make long and short motor vibration signals for silently reading the text command. So the user does not need to watch the display monitor, and this feature is especially useful for users with poor eyesight and blind users, so they can perform gesture selection as everyone else does.
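The Morse-code indication described above can be sketched as follows: the selected command word is encoded as short (dot) and long (dash) signals, which the wearable device plays as LED blinks or vibrations. This is an illustrative Python sketch; the patent does not specify the microcontroller protocol, and the partial Morse table here is an assumption.

```python
# Minimal Morse table for the sketch; a real table covers the full alphabet.
MORSE = {"T": "-", "V": "...-", "U": "..-", "P": ".--."}

def morse_signals(command):
    """Turn a selected text command into the long/short blink (or vibration)
    pattern the wireless indicator device would play (illustrative only)."""
    return " ".join(MORSE[ch] for ch in command.upper())

print(morse_signals("TV"))  # -> "- ...-"
```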
[0025] The IGSID-GIVCL can be equipped with wireless equipment, such as BLUETOOTH or Wi-Fi network equipment, that can send signals to control other wireless network smart phones, microcontroller boards, machines, a car's BLUETOOTH system, other computers, other machines, and other network nodes on the networks, through the World Wide Web and the Internet TCP/IP protocol, using server-client network software to remotely control the operation, and to diagnose the configuration, of other robot machines; or it can connect to a space signal transmitter station to send signals for space remote control, for example to the Hubble Space Telescope, or to a rover robot on Mars, etc.
[0026] The IGSID-GIVCL will change how people use computers, machines, and intelligent robots all around the world.
Brief Description of the Drawings
[0027] In the drawings,
[0028] FIG. 1 is a drawing showing the hardware components of an IGSID-GIVCL, including peripheral wireless network devices, display devices, and the robot vision tracking software programs.
[0029] FIG. 2 is a drawing showing the IGSID-GIVCL PCMVKCP vision program automatically measuring the user's work space, assigning a virtual center point, creating the work-space zone in a comfortable area, establishing the puzzle cell mapping keys, and placing the virtual control panel in front of the user to be clicked.
[0030] FIG. 3 is a drawing showing the hand pushing out in the Z direction to click a virtual key. The Z-dimension distance between the hand palm and the user's body is divided into 3 zones: the 1st zone, which unlocks the selected-key gate; the 2nd zone, for moving to select a virtual key; and the 3rd zone, where pushing the hand out clicks the selected virtual key. In addition, it shows the unique special IGSID-GIVCL PCMVKCP finger hand signs that enhance selection control accuracy.
[0031] FIG. 4 is a drawing showing a special IGSID-GIVCL PCMVKCP hand-sign gesture moving like a fish swimming with its fins, moving the fingers up and down one by one in rotation to make a waving-fingers hand sign in the 3rd selection zone; the vision program detects it and continuously clicks the virtual key without the hand being pulled back to the unlock selected-key gate zone.
[0032] FIG. 5 is a drawing showing the IGSID-GIVCL PCMVKCP vision program tracking the user's hand positions in the work zone, using the hands' X, Y distances from the center point to determine which virtual puzzle cell position is being selected. The PCMVKCP vision program draws the virtual puzzle cell map keys control panel as a graphic image on the display monitor, tracks the user's hand locations to determine which keys are selected, and highlights the corresponding puzzle cells on the display monitor as a visual indication, so the user knows which keys are selected by the right hand and the left hand.
[0033] FIG. 6 is a drawing showing the IGSID-GIVCL PCMVKCP vision program drawing the virtual puzzle cell map keys control panel as a graphic image, like a watercolor painting artist (Picasso). The PCMVKCP program draws the virtual keys in grid row and column cells, inserts a Text Block field into each grid cell, and then fills a text word into the Text Block field as the indication command for the user to select, for example a standard QWERTY virtual puzzle cell keyboard. In addition, the IGSID-GIVCL PCMVKCP vision program is able to work with an automation program to control a USB-UIRT cable and send infrared signals to remotely control another computer's keyboard and mouse operation.
[0034] FIG. 7 is a drawing showing the PCMVKCP vision program drawing a mouse keyboard and a control panel. The user can select the virtual keys to control the mouse position and the mouse click functions. In addition, the virtual puzzle cell map keyboard and control panel, in a preferred special interface arrangement, can be divided into left-hand and right-hand zones, with the center area of the work space reserved to display the real-time user video image showing the user's actions. The user can thus see himself or herself on the display monitor together with the virtual keyboards. This special virtual gesture interface arrangement provides good visual feedback and indication and is easy on the eyes during the user's operation.
[0035] FIG. 8 is a drawing showing that the IGSID-GIVCL can create any keyboard and control panel that the user wants. It includes varieties of virtual keyboards and control panels; each keyboard has its own control commands filled into each row-column puzzle cell. The virtual keyboard drawings are shown as examples.
[0036] FIG. 9 is a drawing showing examples of virtual keyboard drawings to show that the IGSID-GIVCL is able to support computer operation functions.
[0037] FIG. 10 is a drawing showing examples of virtual keyboard drawings to show that the IGSID-GIVCL is able to support computer operation functions. In addition, the IGSID-GIVCL uses peripheral devices to control network devices, computers, machines, and intelligent robots.
The IGSID-GIVCL can be equipped with a speech recognition program function, an array of microphones used as sound sensors, and a voice speaking program function with speakers for voice feedback.
[0038] FIG. 11 is a drawing showing an advanced TouchScreen Mouse combined with a puzzle cell virtual keyboard in sandwich layers method.
[0039] FIG. 12 is a drawing showing the enhanced wireless selected-key indication device, worn on the user's hand palm, arms, or body, which indicates which keys are selected by blinking an LED light in Morse code signals and/or by using a vibration motor to make long-short Morse code vibration signals. So the user does not need to watch the display monitor to know which keys they select.
This feature is especially useful for users with poor eyesight and blind users.
[0040] FIG. 13 is a drawing showing wireless display glasses that have network protocol equipment to connect with the IGSID-GIVCL. The IGSID-GIVCL sends the display puzzle cell map with the hands' selection positions, and the wireless display glasses project the puzzle cell image on their lenses, so the user can see which keys are selected.
[0041] FIG. 14 is a drawing showing that the IGSID-GIVCL is equipped with a mobile platform, for example using a microcontroller board to control varieties of motors, so the IGSID-GIVCL PCMVKCP vision program can intelligently control these motors to rotate. As a result, the IGSID-GIVCL intelligently drives itself around and is able to move its display projector to project puzzle cell keyboard images on any surface. The various motor control modules can be built into the IGSID-GIVCL's neck, body, arms, hands, and legs, so the IGSID-GIVCL can be built in a human shape, having physical body movement ability together with the IGSID-GIVCL PCMVKCP puzzle cell map function. The IGSID-GIVCL becomes the communication bridge between humans and the intelligent robot machine world.
Detailed Description of the Preferred Embodiments
[0042] In the drawings,
[0043] With reference to the drawings, FIG. 1 shows the hardware components of the IGSID-GIVCL: the PCMVKCP, the vision tracking software programs, peripheral wireless network devices, and display devices.
[0044] The complete working example model of the IGSID-GIVCL includes,
[0045] 1. Main Computer 1, used as the IGSID-GIVCL's brain to process video, runs the IGSID-GIVCL PCMVKCP vision puzzle cell map virtual keyboard control program (PCMVKCP) 3, an automation program 2 (such as EVENTGHOST), and a web server function 41, such as an IIS server.
[0046] 2. Video vision sensor built with a variety of sensor modules 8, combining multiple microphones as a sound sensor 7, an Infrared Emitter 9, an RGB video camera 10 (or a web camera instead), an infrared signal reflection detection sensor 11, a three-dimensional movement accelerometer sensor 12, speakers 13, and a motor control module 17, with connecting signal control line 15 and intelligent rotating directions 16, 18. This particular video sensor module system can use a MICROSOFT KINECT sensor 6 as an available vision sensor component sold on the market.
In addition, this invention proposes building a Universal Infrared Receiver Transmitter (UIRT) 14 into this video sensor module as an additional IR remote control feature to physically operate machines.
[0047] 3. Micro Controller Board 21 can use an Arduino board.
[0048] 4. Variety of motor modules 20 attached to Micro Controller Board 21.
Intelligent rotating directions 19, a variety of sensor modules 24, and a GPS sensor 22, with connecting cables 23, 25, can be attached to the board; Micro Controller Board 21 reads the external sensor signals and sends them to Main Computer 1 to process.
[0049] 5. USB Universal Infrared Receiver Transmitter (UIRT) 34, built in, or on a USB adapter cable 33, that can learn, record, and send infrared signals recorded from any physical IR remote controller. Usually, USB-UIRT cables can both send and receive IR signals.
An additional IR receiver 36, built in, or on a USB adapter cable 35, can be attached to Main Computer 1 too.
[0050] 6. Wireless network equipment, such as a BLUETOOTH network card 38 (built in, or on a USB adapter cable 37) and a Wi-Fi network card 39 (built in, or on a USB adapter cable 40), etc., covering all wireless network protocol card devices: TCP/IP and Internet protocols such as Xbee, Ethernet, Wi-Fi, BLUETOOTH, cell phone channels (3G, 4G, GSM, CDMA, TDMA, etc.), space telecommunication channels, and satellite channels.
[0051] 7. Display monitor devices, such as a display monitor 43 with monitor cable 42, an image projector 44, and wireless network (for example TCP/IP or BLUETOOTH) display monitor glasses 46.
[0052] 8. Main Computer power source: a wall power plug 32 when the IGSID-GIVCL is in a fixed installation position, and a power plug source for the Kinect sensor 6 as well. The micro controller power source can be independent, or drawn from Main Computer 1 through the USB connection.
[0053] 9. Mobile motor wheel platform 28, equipped with motor wheels 26, 30 and motor signal control lines 27, 29 for controlling motor rotation direction and speed.
All components of the IGSID-GIVCL can be placed on the platform 28, and the IGSID-GIVCL is able to use its video vision function to drive itself and move around. The portable power source 31 can be rechargeable batteries, a solar cell, a fuel cell, a rotation generator, a wind turbine, a thermoelectric generator (TEG), etc., to regenerate electric power for the IGSID-GIVCL to move and operate.
Because motor modules can be built into a variety of the IGSID-GIVCL's body parts, the motor control can be in the neck, center body, arms, hands, hip, legs, and feet, mimicking human physical body part movement. Therefore, it becomes a human form of the IGSID-GIVCL that can support the puzzle cell map virtual keyboard gesture functions.
[0054] 10. Main Computer 1, used as the IGSID-GIVCL's brain, has the PCMVKCP to process the video image. The user's body part joint locations, as 3-dimensional X, Y, Z values, can be programmed using the MICROSOFT VISUAL STUDIO C# program 4 (or VB), calling the KINECT and other system assembly libraries, to enable the KINECT sensor to read the user's joint values in the program.
[0055] 11. These basic video sensor readings of the user's 3D body joint values are now available to the PCMVKCP. Therefore, we can write a specific puzzle cell map virtual keyboard control program (PCMVKCP) 3 that transforms the basic 3D joint values, intelligently measures and calibrates them into a new gesture interface input work-space zone, and establishes the puzzle cell virtual keyboard in that zone, so the user is able to move the hands and point out to click virtual keys.
The functions enabling the KINECT sensor to read joint values can be coded into the PCMVKCP program 3.
The program 3 can be a class program (for example, MainWindow.xaml.cs) included in MICROSOFT VISUAL STUDIO C# 4 as one project and built into one project solution, preferably a WPF Application type project. So all the KINECT video sensor reading values are available for the PCMVKCP program to use in real-time programming, creating a dynamic user graphic interface.
[0056] 12. The IGSID-GIVCL uses the vision three-dimensional X, Y, Z body part values so that the IGSID-GIVCL vision puzzle cell map virtual keyboard control program (PCMVKCP) 3 can create the work zone, establish the puzzle cell map virtual keyboards, provide real-time user hand locations, convert them to puzzle cell positions, match each puzzle cell row-column against its puzzle cell command map list, transfer the cell position to a computer command, and send the command to the automation program 2 (such as EVENTGHOST) to run a prerecorded macro script that executes the command, such as displaying typed text, running a computer program, sending an infrared signal to remotely control a TV, DVD player, or another computer, mouse movement, mouse clicks, internet browser computer operations, etc.
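The pipeline in this paragraph (hand location -> puzzle cell -> command -> automation macro) can be sketched as follows. This is an illustrative Python sketch: the cell geometry, the command map, and the dispatch function are assumptions standing in for the PCMVKCP command list and the EVENTGHOST macro trigger.

```python
def hand_to_cell(dx, dy, cell_w, cell_h, rows, cols):
    """Convert a hand offset from the virtual center point into a puzzle
    cell (row, col), clamped to the panel bounds (illustrative geometry)."""
    col = min(max(int((dx + cols * cell_w / 2) / cell_w), 0), cols - 1)
    row = min(max(int((rows * cell_h / 2 - dy) / cell_h), 0), rows - 1)
    return row, col

# Hypothetical command map: each cell position maps to a command word.
command_map = {(0, 0): "Volume Up", (0, 1): "Volume Down"}

def dispatch(cell, run_macro):
    """Look up the cell's command and hand it to the automation program."""
    command = command_map.get(cell)
    if command is not None:
        run_macro(command)   # e.g. trigger the prerecorded macro script
    return command

fired = []
result = dispatch(hand_to_cell(0.05, 0.05, 0.1, 0.1, 2, 2), fired.append)
```

Separating the geometry (`hand_to_cell`) from the command lookup (`command_map`) mirrors the text: the same click mechanics serve any panel layout by swapping the map.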
[0057] 13. Main Computer 1 includes a web server function 41, such as an IIS server, and can establish an internal server-client network, DNS server, TCP/IP URL, namespace, etc., host web sites, and provide HTML, XAML, and scripting functions. The PCMVKCP program 3 can activate a web browser and send a web page URL that includes a specific text code; when that particular web page is running and opened, the automation program 2 (such as EVENTGHOST) detects the particular text code trigger. It will then trigger the macro action in the folder.
[0058] FIG. 2 illustrates the IGSID-GIVCL (PCMVKCP) program 3 automatically measuring the user's workspace, assigning a virtual center point, creating workspace zone 76 in comfortable area 47, and establishing puzzle cell mapping keys (such as 85, 86, 87, 82, 91, 92, and all other cells), a virtual control panel keyboard in front of the user to click. Using MICROSOFT VISUAL STUDIO C# with the MICROSOFT KINECT assembly and system libraries, program 4 with the video sensor can read the user 50 body joint 3D values, such as:
[0059] User Head center 50, comfortable left and right hand moving circle space area 47, User Right Shoulder Edge Joint 52, User Left Shoulder Edge Joint 48, User Shoulder Center Joint 79, User Right Elbow Joint 57, User Left Elbow Joint 74, User Right Hand 54, User Left Hand 73, User Right Hand Palm Center 77, and User Left Hand Palm Center 82.
[0060] Here is example C# program code using the KINECT 2 body joint 3D reading values to calculate the distance between two joints; it is a direct copy from the working robot prototype C# program. See Code section 1 at the end of the specification.
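Code section 1 is not reproduced in this excerpt. The distance computation it refers to is, in its standard form, the Euclidean distance between two 3-D joint positions; a minimal Python sketch of that formula (the prototype itself is C#) is:

```python
import math

def joint_distance(j1, j2):
    """Euclidean distance between two body joints given as (x, y, z) tuples."""
    return math.sqrt((j1[0] - j2[0]) ** 2 +
                     (j1[1] - j2[1]) ** 2 +
                     (j1[2] - j2[2]) ** 2)
```

For example, two joints at (0, 0, 0) and (3, 4, 0) are 5 units apart.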
[0061]
Date Recue/Date Received 2021-11-21
[0062]
[0063]
[0064]
[0065]
[0066]
[0067]
[0068]
[0069]
[0070]
[0071] So the (PCMVKCP) program can use this formula to calculate all the body segment lengths. The length of Right shoulder 51 can be calculated from shoulder Center 79 and Right shoulder edge joint 52.
The length of Left shoulder 49 can be calculated by shoulder Center 79 and Left shoulder edge joint 48. The length of Right Up Arm 53 can be calculated by Right shoulder edge joint 52 and Right hand elbow joint 57. The length of Left Up Arm 75 can be calculated by Left shoulder edge joint 48 and Left hand elbow joint 74. The length of Right Lower Arm 56 can be calculated by Right hand elbow joint 57 and Right Hand Palm Joint 77. The length of Left Lower Arm 72 can be calculated by Left hand elbow joint 74 and Left Hand Palm Joint 82.
Similarly, the other body joint values can be used to calculate the user's center body length (58 = 79 to 61), hip length (71-62), Right upper leg (63 = 64-62), Left upper leg (70 = 71-69), Right lower leg (65 = 64-66), Left lower leg (68 = 69-67), Right leg length (63+65), and Left leg length (70+68). Head length is Head center 50 length x 2, and the neck joint length is (79-50, or the upper neck joint point). Total user height can be approximated by adding all of these lengths, and the maximum user width is likely the distance between both edge joints of the user's shoulders. A user uses both arms, and the comfortable movement space has limits. The comfortable areas 47 can be defined as in front of the user and in a circle around each side: moving the Left hand over the right shoulder edge is difficult, and moving the right hand over the Left shoulder likewise becomes difficult. The two circles of comfortable areas 47 create an overlapped layer area 59 (between the shoulders), and the two circles have an intersection point 60 aligned with the user body center line 58. When the user body joint values and the body lengths are available, the IGSID-GIVCL
(PCMVKCP) program 3 can use these video sensor reading values to create a Workspace zone 76 perfectly fitted to the user's body measurements. The (PCMVKCP) program 3 will assign a virtual center point on the user, which is preferably the shoulder center joint point 79. The preferred Workspace zone width is each shoulder length x 1.5 (1.5 + 1.5 = 3 shoulder lengths in total), and the preferred Workspace zone height is the Shoulder Center 79 to Head Face Center length x 2. The Workspace zone is tracked in front of the user, anchored at the user's shoulder center joint point 79. Therefore, while the user walks or moves, the Workspace zone 76 is always at the same place in front of the user. If the user walks within the video viewable area, the software keeps tracking digitally. When the user walks out of the video view area edge, the (PCMVKCP) program 3 will activate the intelligent motor module to rotate the video sensor to follow and aim at the user. When the Workspace zone 76 size is defined, the (PCMVKCP) program 3 will divide the Workspace zone into Puzzle Cell Rows and Columns. For example, if the virtual keyboard needs 4 rows and 10 columns, the total is 40 puzzle cells. PCMVKCP program 3 will divide the width by 10 and the height by 4. As a result, it can determine each puzzle cell's area and location relative to the virtual center point 79.
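The sizing rules above (width of 1.5 shoulder lengths per side, height of twice the shoulder-center-to-head length, then division into a row-column grid) can be sketched as follows; this is an illustrative Python restatement, with arbitrary units, of the arithmetic described, not the patent's C# code:

```python
def workspace_zone(shoulder_len, shoulder_to_head_len):
    """Preferred Workspace zone size: each shoulder length x 1.5 on both
    sides for width (1.5 + 1.5 = 3 shoulder lengths total), and the
    shoulder-center-to-head length x 2 for height."""
    width = shoulder_len * 1.5 * 2
    height = shoulder_to_head_len * 2
    return width, height

def cell_size(width, height, rows, cols):
    """Divide the Workspace zone into equal puzzle cells."""
    return width / cols, height / rows
```

For a 4-row, 10-column virtual keyboard, the width is divided by 10 and the height by 4, exactly as the paragraph states.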
For example, when the user's Right hand 54 moves to Puzzle Cell 85 (Row 1, Column 10), the PCMVKCP program 3 calculates the X, Y values of the Right Hand Palm center point 77 relative to the virtual center point 79; knowing the X and Y side lengths, the PCMVKCP program can calculate the distance 78 to the center point. From those values it can be determined that the user's Right hand is at the Row 1, Column 10 location. If the user's Right hand 54 moves down to Puzzle Cell 87, it will be (Row 4, Column 8), and if it moves to Puzzle Cell 86, it will be (Row 4, Column 10). By the same method, the PCMVKCP program can determine that the user's Left hand 73, palm center 82, is at the Puzzle Cell (Row 4, Column 2) location. If the user's Left hand 73 moves up to Puzzle Cell 91, it will be (Row 2, Column 2), and if it moves to Puzzle Cell 92, it will be (Row 1, Column 1). The selection click zone's 88 maximum length is limited by the total arm-hand length (75+72), (53+56), that is, the longest distance the user can push their hands out. The PCMVKCP program defines the maximum hand push-out surface 84. For example, the user pushes the Left hand 73 out and in along direction 90, and the (PCMVKCP) program 3 reads the Left hand palm joint z dimension length value 81 changing and becoming longer (bigger) relative to the user body z dimension values. The comparison z dimension value can be assigned as the user body z dimension surface 80, the center point, the left shoulder joint 48, or the right shoulder joint 52 when a special measurement is needed. This is useful for a handicapped user who might use the mouth to hold a watercolor pen to select a virtual key to enter. The vision tracking PCMVKCP program 3 can use a special assignment of any point of the z surface 80 value to determine the handicapped user's selection click action. The PCMVKCP program recognizes the hand push-out selection click action, locks the Puzzle Cell Row 4-Column 2 position, and matches the Puzzle Cell Map 2-dimension array string code to transfer the position into a computer command. The selection click zone 88 is divided into 3 selection mode zones. The required click-detection length at the hand push-out edge 89 is preferably shorter than the maximum z push-out surface 84, to prevent the user from straining the hand muscles too much or too rapidly, which could cause arm injury. This shorter selection click action length keeps the arm and hand in a flexible position and makes it easier for the user to move the hands to select virtual keys.
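The palm-to-cell mapping described above can be sketched in Python as below. This is an illustrative reconstruction, not the patent's C# code; in particular, it assumes the virtual center point sits at the middle of the Workspace zone, with Row 1 at the top and Column 1 at the left (consistent with the Row 1, Column 10 example for the upper-right Right hand):

```python
def hand_to_cell(hand_x, hand_y, center_x, center_y, width, height, rows, cols):
    """Map a palm position to a 1-based (row, column) puzzle cell.

    Assumes the virtual center point is at the middle of the Workspace
    zone, rows numbered from the top and columns from the left; returns
    None when the hand is outside the zone (no key selected).
    """
    dx = hand_x - (center_x - width / 2)   # offset from the zone's left edge
    dy = (center_y + height / 2) - hand_y  # offset from the zone's top edge
    if not (0 <= dx < width and 0 <= dy < height):
        return None
    col = int(dx / (width / cols)) + 1
    row = int(dy / (height / rows)) + 1
    return row, col
```

With a 60 x 60 zone centered at the origin and a 4 x 10 grid, a Right hand near the top-right corner resolves to (Row 1, Column 10), matching the example in the text.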
[0072] FIG. 3 illustrates the hand pushing out in the z dimension to click a virtual key. The z dimension distance 88 between hand palm 82 and user body point 93 is divided into 3 zones: the 1st unlock-selected-key gate zone 99, between user body point 93 and 1st select zone edge point 98; the 2nd moving-to-select virtual key zone 94, between 1st select zone edge point 98 and 2nd select zone edge point 95; and the 3rd push-hand-to-click selected virtual key zone 96, between 2nd select zone edge point 95 and 3rd select zone edge 89. In addition, FIG. 3 shows a unique special IGSID-GIVCL, PCMVKCP finger hand sign enhancing selection control accuracy.
PCMVKCP program 3 can detect the user's Left hand palm center 82 pulling and pushing along direction 90. In the 2nd select key zone, the user moves the hands and keeps them in the 2nd zone area to select and change any key freely. By default, when the user's hand extends in the push-out direction, the PCMVKCP program detects a "Push" action and locks the puzzle cell position, so the selection will not change even when X, Y change while the hand pushes out.
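The three z-zones and the push-lock behaviour just described can be sketched as a small classifier plus a lock; the zone boundary values are placeholders, and the class name is illustrative rather than anything from the patent's C# program:

```python
def z_zone(z, gate_edge, select_edge):
    """Classify the hand's z distance from the body into the three zones:
    1 = unlock gate zone, 2 = select zone, 3 = click zone."""
    if z < gate_edge:
        return 1
    if z < select_edge:
        return 2
    return 3

class SelectionLock:
    """Locks the puzzle cell on a push, per the default rule that X, Y
    changes are ignored while the hand pushes out to click."""
    def __init__(self):
        self.locked_cell = None

    def update(self, zone, cell):
        if zone == 1:
            self.locked_cell = None          # "Pull" action: unlock
        elif zone == 3 and self.locked_cell is None:
            self.locked_cell = cell          # "Push" into click zone: lock
        return cell if self.locked_cell is None else self.locked_cell
```

Once the hand enters the 3rd zone the reported cell stays fixed, and pulling back to the 1st gate zone frees the selection again.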
[0073] A special IGSID-GIVCL, PCMVKCP gesture hand sign, moving the fingers like a spider walking its legs, is described for changing the selection to a nearby puzzle cell. For example, the user's Left hand palm 82 can stay in the 2nd select key zone 94 while hand fingers 103, 105, 106, 107, and 108 move like a spider's legs walking; the puzzle cell row-column lines are like a spider web net. With tiny finger movements in the walking direction of up, down, left, or right, the PCMVKCP program can detect which puzzle cell most of the hand palm 82 is on. So, the user does not need to make a big hand movement to change to a puzzle cell just beside the currently selected puzzle cell position. Each finger has 2 joint sections. For example, finger 103 has two joints 101, 102, connecting to the hand palm at joint 100.
[0074] When all fingers 103, 105, 106, 107, 108 are open, the detected circle area 109 around the Left hand palm 82 has a larger diameter length 104 than when all fingers are closed and holding 111, in which case the vision tracking PCMVKCP program detects that hand area circle 113, with diameter 112, becomes smaller. This difference becomes useful for enhancing puzzle cell selection. When the user locates the select command, then closes all fingers and pushes the hand out, the PCMVKCP program will lock the puzzle row-column value even if the hand moves in the X, Y directions; the puzzle cell position will not change. This special IGSID-GIVCL, PCMVKCP hand gesture feature is useful when the user needs to click a virtual key quickly and reliably in an emergency, such as a spaceship out of control, or when the user has a hand tremor condition. The PCMVKCP program supports these needs.
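The open-versus-closed distinction above reduces to comparing the tracked hand circle's diameter against the calibrated open-hand and fist diameters; a minimal Python sketch, with a midpoint threshold as an assumed heuristic, is:

```python
def hand_state(diameter, open_ref, closed_ref):
    """Classify the tracked hand circle as "open" or "closed".

    open_ref / closed_ref are calibration diameters (fingers spread vs.
    fist, e.g. diameters 104 and 112 in FIG. 3); the midpoint threshold
    is an assumed heuristic, not from the patent.
    """
    threshold = (open_ref + closed_ref) / 2
    return "open" if diameter >= threshold else "closed"
```

A "closed" result while the hand pushes out would then engage the row-column lock described above.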
[0075] In the 2nd select key zone 94, the user's Left hand can change finger poses. For example, fingers 105, 106 make the special IGSID-GIVCL hand sign that looks like a gun gesture pointing at a puzzle cell. The PCMVKCP program sees the hand holding and then the fingers pointing out, which makes the difference, and selects and locks the key. The tiny gun-gesture point area makes vision tracking accurate, so the user can move or rotate the finger gun point and apply a small movement to change the key selection. If the user wants to select another key, the user simply pulls hand 82 back to the 1st zone; the PCMVKCP program detects the "Pull" action and unlocks the selection key, leaving the user free to reselect any key. In the 3rd selection click zone 96, the user makes tiny finger movements, pointing different fingers out, holding tight, opening all fingers, or closing all fingers, to make a puzzle cell selection click 97.
[0076] FIG. 4 is a drawing showing a special IGSID-GIVCL, PCMVKCP hand sign gesture to click continuously without pulling the hand back to unlock. By moving like a fish swimming with its fins, the user waves fingers 100, 105, 106, 107, 108 up and down across horizontal line 117, one by one, making the waving-fingers IGSID-GIVCL, PCMVKCP hand sign gesture in the 3rd selected click zone 96. The PCMVKCP vision tracking function in program 3 detects the hand size area 116 and the hand palm center point 82 distance value. While the fingers wave down to positions 118, 119, 120, 121, and 122, with palm face-down area 73, the hand palm center point 82 is covered, so the PCMVKCP program cannot see point 82; when the fingers move up to positions 123, 124, 125, 126, 127, the hand palm center 82 appears again to the vision tracking function in program 3. This makes the detected z distance value blink, and the program performs a continuous virtual key click on each blink, without the user needing to pull the right hand 73 back to the 1st unlock-selected-key gate zone and push to click again.
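The fin-waving repeat-click behaviour amounts to counting reappearances of the palm center point in the tracking stream; a hypothetical Python sketch over a sequence of per-frame visibility flags is:

```python
def count_blink_clicks(palm_visible_frames):
    """Count continuous clicks from the fin-waving gesture: each time the
    palm center reappears after being covered counts as one click.

    palm_visible_frames is an assumed per-frame boolean sequence, not a
    data structure from the patent's C# program.
    """
    clicks = 0
    prev = True  # assume the palm is visible at the start
    for visible in palm_visible_frames:
        if visible and not prev:
            clicks += 1
        prev = visible
    return clicks
```

Two full waves of the fingers (visible, covered, visible, covered, visible) thus yield two clicks on the selected key.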
[0077] FIG. 5 is a drawing showing the IGSID-GIVCL, PCMVKCP vision program tracking the user's hand positions 77, 82 in the Workspace zone 76. Using the Right hand's X, Y distance 78 between center point 79 and Right hand palm 77, it determines which virtual puzzle cell position 85 is being selected. Using the Left hand's X, Y distance between center point 79 and Left hand palm 82, it determines which virtual puzzle cell position (Row 4, Column 2) is being selected.
[0078] The IGSID-GIVCL, PCMVKCP vision program draws a virtual puzzle cell map keys control panel graphic image 141 on display monitor 43. The PCMVKCP vision program uses the tracked locations of the user's hands 77, 82 to determine which keys are selected, and updates display monitor 43 in real time. The PCMVKCP vision program highlights 130, 138 in different colors and enlarges the font sizes on the particular puzzle cells 132, 136 as visual indication, so the user knows which keys are selected by right hand palm 77 and left hand palm 82. The graphic puzzle cell map image center point 133 matches the virtual workspace center point 79, so the hands' X, Y values can be matched onto the graphic puzzle cell map image to select keys correctly.
If Left hand palm 82 moves in an up direction 139, the highlight will change to puzzle cell 140.
If the left hand moves down and out of the puzzle cell map area 137, the program will not indicate any selected key; this could be because the user has put the hand down with no intention of selecting a key. If the user's Right hand 77 moves down, the X, Y values and distance 128 also change. The PCMVKCP program will highlight puzzle cell 134 or 135, wherever the user selects. If the Right hand moves out of the workspace zone, then no key is selected 129. The user can decide to use both hands to select keys, or to use only the left hand or only the right hand; the PCMVKCP vision tracking function in program 3 can recognize all of these hand inputs.
[0079] FIG. 6 is a drawing showing the IGSID-GIVCL, PCMVKCP vision program drawing the virtual puzzle cell map keys control panel graphic image 141 like a watercolor painting artist (Picasso). Using a WPF project in Visual C#, which has dynamic graphical user interface tools, the PCMVKCP vision program can use the Grid command to draw the puzzle cell virtual keys in Grid Row and Column cells, insert a TextBlock field into each grid cell, and then fill a text word (0-9, a-z) into the TextBlock field as the command indication for the user to select. For example, it loads 1-0 into TextBlocks 142, 143, 144, 145, 146, 147, 148, 149, 150, 151, placed on the puzzle cells of row 1, and "Z" into TextBlock 162, "X" into TextBlock 161, "C" into TextBlock 160, "V" into TextBlock 159, "B" into TextBlock 158, "N" into TextBlock 157, "M" into TextBlock 156, "," into TextBlock 155, "." into TextBlock 154, and "SP" (Space) into TextBlock 153, placed on the puzzle cells of row 4. All other keys are loaded in the same way from a puzzle cell 2-dimension string array code, loading each character into its corresponding cell position; as a result, a standard QWERTY virtual puzzle cell keyboard 141 is created. The vision program highlights 130, 138 in different colors and enlarges the font sizes 152, 163 on the particular puzzle cell commands "X" and "0" as visual indication, so the user knows which keys are selected. When the user clicks, the PCMVKCP program 3 uses the puzzle cell position (Row, Column) to index the puzzle cell 2-dimension string array code and obtain the text word command. If the user's Right hand moves to "SP" and clicks, the program types and displays a space; if to ",", the program types and displays ",".
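The 2-dimension string array behind this keyboard can be sketched in Python as below. Rows 1 and 4 follow the text above; rows 2 and 3 are filled in from the standard QWERTY layout as an assumption, since the specification only says the other keys are loaded "in the same way":

```python
# 4x10 string array for the virtual QWERTY keyboard; "SP" is space,
# per the text. Rows 2-3 are assumed from the standard QWERTY layout.
PUZZLE_KEYS = [
    ["1", "2", "3", "4", "5", "6", "7", "8", "9", "0"],
    ["Q", "W", "E", "R", "T", "Y", "U", "I", "O", "P"],
    ["A", "S", "D", "F", "G", "H", "J", "K", "L", ";"],
    ["Z", "X", "C", "V", "B", "N", "M", ",", ".", "SP"],
]

def key_at(row, col):
    """Return the text command for a puzzle cell (1-based row and column)."""
    return PUZZLE_KEYS[row - 1][col - 1]
```

A click locked at (Row 4, Column 10) thus resolves to "SP" and produces a typed space.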
[0080] If the Left hand selects "W" 139, then the PCMVKCP program sends the key and types and displays "W"; if it selects "1", then it sends "1" to display.
[0081] In addition, the IGSID-GIVCL PCMVKCP vision program is able to work with automation program 2 (for example, EVENTGHOST) to control USB-UIRT cable 34 to send infrared signals 171 to remotely control another computer 164 with IR receiver 172, controlling its keyboard to type and display "X" and "0" in the notepad program 167 on its monitor. With the puzzle cells loaded with mouse keys, the user is able to click to send a mouse-moving IR signal 171 to control the other computer 164 to move its mouse 168 position and perform mouse 168 click operations. The command execution signal can also be sent by a BLUETOOTH device to control a BLUETOOTH microcontroller board device that the user wears, to blink an LED light as MORSE code, or to vibrate long-short as a MORSE code signal. It can send the signal through Wi-Fi network device 39, TCP/IP, or an Internet network server-client program to control another node on the network: a computer, machines, or an intelligent robot.
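The wearable LED/vibration feedback above encodes characters as long-short MORSE patterns; a minimal Python encoder, showing only a small subset of the standard MORSE table for illustration, is:

```python
# Minimal MORSE encoder for the wearable LED/vibration feedback described
# above; only a small subset of the standard MORSE table is included.
MORSE = {"S": "...", "O": "---", "X": "-..-", "0": "-----"}

def to_morse(text):
    """Convert text to dot (short) / dash (long) signal patterns,
    skipping characters not in the table."""
    return " ".join(MORSE[ch] for ch in text.upper() if ch in MORSE)
```

Each dot would map to a short blink or vibration and each dash to a long one on the microcontroller board.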
[0082] Using the web server 41 IIS service and activating a specific web page 169 with a specific text code is the preferred approach. It allows unlimited command assignment through separate folders for each controlled machine, triggering macro actions individually, and it keeps keys free and the keyboard clickable for normal computer functions. Automation program 2, such as EVENTGHOST, can create many folders to save macro scripts with trigger actions and can detect the specific trigger command events; the macros can execute commands such as sending text key commands, displaying A-Z, 0-9, symbol keys, and function keys, or opening computer programs, an internet browser, a word processor, a calculator, a 3D graphic drawing CAD program, etc. In addition, automation program 2 such as EVENTGHOST can use USB-UIRT cable 34 to learn each function key signal of a physical infrared remote controller, record it, and send it out by a macro script action.
[0083] When the IGSID-GIVCL, PCMVKCP program triggers the action, EVENTGHOST will send infrared signal 171 out through USB-UIRT cable device 34. The IR signal can control a physical machine such as computer 164, another machine, or an intelligent robot. For example, the IGSID-GIVCL PCMVKCP sends IR signal 171 out to turn a TV ON/OFF. As another example, another computer 164 can be equipped with IR receiver 172; the IGSID-GIVCL can then send IR signal 171 to control the other computer 164 to display a-z, 0-9, symbols, and function keys, open computer programs and media, run a DVD player, play music or video, browse the internet, play games, and move the mouse 168 position, with Right click, Left click, Double click, wheel up, and wheel down computer functions, etc. As a result, the IGSID-GIVCL can control self-intelligent machines and intelligent robots. Soon, self-driving cars, flight jets, spaceships, and intelligent robots will be used in people's daily home life, health care, education, medicine, transportation, public services, etc.
[0084] In addition, the IGSID-GIVCL PCMVKCP program can have the automation program 2 control functions built into the IGSID-GIVCL, PCMVKCP program 3 itself. The IGSID-GIVCL PCMVKCP program can code directly against the USB-UIRT cable's API library, assembling its available functions directly into the (PCMVKCP) program 3 function code. So, the IGSID-GIVCL program can directly control the USB-UIRT cable for IR signal learning, recording, and sending out IR signal commands. The IGSID-GIVCL can directly control physical machines such as a TV, computer, or other machines in the IGSID-GIVCL, PCMVKCP program without needing a 3rd party automation program 2 such as EVENTGHOST to run it. Similarly, the IGSID-GIVCL PCMVKCP program can send an enter key command directly to the active program; for example, the PCMVKCP program can send enter key and text key commands to notepad or MICROSOFT WORD to type and display words in the writing program directly, also without a 3rd party automation program.
[0085] When the user selects a key, the PCMVKCP program can enable speaker 170 to read the character aloud and give voice feedback.
[0086] FIG. 7 is a drawing showing the PCMVKCP vision program drawing a mouse keyboard control panel on the virtual puzzle cell map keys control panel graphic image, divided into 2 mouse sections: Left Hand Mouse 186 and Right Hand Mouse 174.
[0087] Mouse command words are loaded into the TextBlock fields: "Mouse 315" to TextBlock 185, "Mouse Up" to TextBlock 184, "Mouse Left" to TextBlock 183, "Mouse 225" to TextBlock 182, "Double Click" to TextBlock 181, "Left Click" to TextBlock 180, "Right Click" to TextBlock 179, and all other keys.
[0088] The user can select the virtual keys to control the mouse position and mouse click functions. In addition, the virtual puzzle cell map keyboard control panel, in a preferred special interface arrangement, can be divided into Left and Right hand zones. The center area 173 of the virtual puzzle cell map keys control panel graphic image is reserved for a real-time video image 187 that shows the user's actions 188, so the user can see himself or herself together with all the control virtual keys on the monitor. This special virtual gesture interface arrangement provides good visual feedback indication and is easy on the eyes during user operation.
In a real example of the IGSID-GIVCL PCMVKCP program mouse key control interface arrangement, a preferred arrangement has the mouse keys control panel interface support both a Left Hand Mouse Key area and a Right Hand Mouse Key area with all direction moving keys: UP, DOWN, LEFT, RIGHT, 45 degree, 135 degree, 225 degree, and 315 degree keys. The mouse movement can have 1 small move key for each of UP, Down, Left, Right, 45, 135, 225, and 315. This is useful when the mouse is near the target to be clicked, providing tiny mouse movement for the mouse to select the target. The mouse movement can also have 1 large move key for each of UP8, Down8, Left8, Right8, 45-8, 135-8, 225-8, and 315-8, where "8" means 8 times the moving distance of the small mouse movement. This is useful when the mouse is some distance from the target, providing large mouse movement to reach the target with fewer click gesture actions.
The Mouse Keys selection click is not locked in the 3rd selection click zone, meaning that all mouse keys can be re-clicked in the 3rd selection click zone without pulling the hand back.
Combining the Fish Swimming Fin gesture, the user can very easily control the mouse location with accuracy to point at the target and perform mouse click functions. Please see the "//" comments beside the array key code defining the distance and multiple speed keys. A preferred arrangement is a 7 Rows, 17 Columns puzzle cell Mouse Key Controller map.
[0089] The puzzle cell size (HxW) is calculated by dividing the Workspace Zone size (HxW) by the number of rows and columns. Here is an example C# function that arranges the Puzzle Cell Map List for the Mouse Key Controller commands as a 2-dimension string array in C# code.
See code section 2 at the end of the specification.
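The cell-size arithmetic and (row, column) lookup described in [0089] can be restated outside of C#. The following Python sketch is illustrative only: the workspace dimensions and the key placements are assumed example values, not values from the prototype; only the 7x17 grid shape comes from the text.

```python
# Illustrative sketch: puzzle cell sizing and (row, column) command lookup.
# The 7x17 grid matches the preferred Mouse Key Controller arrangement;
# workspace dimensions and key labels here are made-up example values.

ROWS, COLS = 7, 17

def cell_size(zone_height, zone_width, rows=ROWS, cols=COLS):
    """Puzzle cell size (H x W) = Workspace Zone size divided by rows/columns."""
    return zone_height / rows, zone_width / cols

def make_map(rows=ROWS, cols=COLS):
    """Build the 2-dimension string array; unused cells stay empty."""
    grid = [["" for _ in range(cols)] for _ in range(rows)]
    grid[1][2] = "MU"    # small mouse-up key (hypothetical placement)
    grid[1][3] = "MU8"   # large mouse-up key, 8x distance (hypothetical)
    return grid

def lookup(grid, row, col):
    """Return the text word command stored at the clicked (row, column) cell."""
    return grid[row][col]

grid = make_map()
print(cell_size(700, 1700))   # -> (100.0, 100.0)
print(lookup(grid, 1, 3))     # -> MU8
```

A gesture click then only needs the cell's (row, column) index to recover the command word, exactly as the vision tracking function does in [0090].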
[0090] When the user applies a gesture to click, the vision tracking function in the PCMVKCP program 3 uses the puzzle cell position (Row, Column) to read the corresponding (Row, Column) string value from the puzzle cell 2-dimension string array and obtain the text word command. For example, if the user's right hand moves to "MU" and clicks, the PCMVKCP program activates a specific web page and generates an HTTP browser command.
[0091] Example HTTP coding from the working prototype. Copyright. The program loads "http://localhost:8000/index.html?HTTP.KEYS MU" in the browser URL and presses enter. The web page activates a link to trigger the automation program EVENTGHOST trigger event (KEYS Folder, MU event), and executes the MU Macro script to send out an infrared signal to control another computer to move its mouse position UP a small distance. If "MU8", then the other computer moves the corresponding mouse position UP a large distance. If "ML225", then the other computer moves the corresponding mouse position 225 degrees for a small distance. If "ML225-8", then the other computer moves the corresponding mouse position 225 degrees for 8 times the small distance.
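The command tokens above follow a simple convention: a base direction or angle, with an "8" or "-8" suffix selecting the large (8x) move. A small parser for that convention can be sketched as follows; this is a hedged illustration in Python, not the prototype's code, and the token grammar is inferred only from the examples in the text (the pixel step size is an assumption).

```python
# Illustrative parser for mouse command tokens like "MU", "MU8", "ML225", "ML225-8".
# Returns (base_token, move_distance); distance 8x the small step for large moves.

SMALL_STEP = 5  # assumed small-move distance in pixels

def parse_mouse_token(token):
    mult = 1
    if token.endswith("-8"):
        # Angle tokens use an explicit "-8" suffix, e.g. "ML225-8".
        mult, token = 8, token[:-2]
    elif token.endswith("8") and token[:-1].isalpha():
        # Direction tokens append a bare "8", e.g. "MU8".
        mult, token = 8, token[:-1]
    return token, mult * SMALL_STEP

print(parse_mouse_token("MU"))       # -> ('MU', 5)
print(parse_mouse_token("MU8"))      # -> ('MU', 40)
print(parse_mouse_token("ML225"))    # -> ('ML225', 5)
print(parse_mouse_token("ML225-8"))  # -> ('ML225', 40)
```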
[0092] The puzzle cell keys can be defined in the software function coding to be capable of multiple clicks, multiple speeds, and different move distances. This enables multiple clicks from 1 gesture action, and also allows control of the lock or unlock state of a key to enable re-clicking in the 3rd zone. When the user uses the IGSID-GIVCL special gesture hand sign, virtual keys can be clicked continuously and easily in the 3rd selection click zone.
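The per-key properties just described (multiple clicks, a speed/distance multiplier, and a 3rd-zone re-click lock) can be modeled as a small record. This Python sketch is illustrative; the field names and the frame-counting rule are assumptions, not the prototype's definitions.

```python
# Illustrative key definition: each puzzle cell key can allow multiple clicks,
# carry a move-distance multiplier, and be locked or unlocked for re-click
# in the 3rd selection click zone. Field names are assumptions.
from dataclasses import dataclass

@dataclass
class PuzzleKey:
    command: str
    multi_click: bool = False      # may fire repeatedly from one gesture action
    speed_multiplier: int = 1      # e.g. 8 for the large-move keys
    locked_in_zone3: bool = False  # True: only one click until the hand pulls back

    def clicks_for_gesture(self, held_frames):
        """Clicks produced while the hand stays pushed into the 3rd zone."""
        if self.locked_in_zone3 or not self.multi_click:
            return 1
        return max(1, held_frames)

key = PuzzleKey("MU8", multi_click=True, speed_multiplier=8)
print(key.clicks_for_gesture(4))  # -> 4 (re-clicks without pulling the hand back)
```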
[0093] This key control definition method is used for all other keys and actions in all virtual control panels and keyboards. The first row of the virtual keyboard controller is reserved for the robot function menu, and the last row is reserved for PCMVKCP program controls, changing the controller, etc.
[0094] FIG. 8 is a drawing showing the IGSID-GIVCL creating any keyboard and control panel that the user wants.
[0095] If the user selects the WWT controller, then the PCMVKCP program draws a new virtual puzzle cell map keys control panel graphic image as a virtual control panel WWT 189 for the MICROSOFT WORLDWIDE TELESCOPE program. It fills in special WWT command words: "Zoom In" 195, "Up" 194, "Left" 193, "BKSP" 192, "QWERT" 191, "Enter" 190. On the right side 174, it draws mouse control keys such as "Left Click" 180, "Right Click" 179, and all other keys in their cells.
[0096] Inside the PCMVKCP program function, those text command words can be coded and arranged into a two-dimensional array of text strings. The program loads each text word into its row and column cell, so the display on the graphic puzzle cell image virtually matches the user's working space zone.
[0097] There are varieties of virtual keyboards and control panels, and each keyboard has its own control commands, filled into each row-column puzzle cell. The virtual keyboard drawings are shown as examples.
[0098] If the user selects the SLOT controller, then the PCMVKCP program re-draws a new virtual puzzle cell map keys control panel graphic image as a virtual control panel SLOT 196 for controlling a SLOT machine simulation program.
[0099] If the user selects the DJING controller, then the PCMVKCP program re-draws a new virtual puzzle cell map keys control panel graphic image as a virtual control panel DJING 197 for controlling a Disco DJ machine simulation program.
[0100] If the user selects 2nd Life controller, then the PCMVKCP program re-draws a new virtual puzzle cell map keys control panel graphic image to a virtual control panel 2ndLife 198 for controlling a virtual 3D world avatar in 2nd Life viewer program.
[0101] If the user selects the ROVER controller, then the PCMVKCP program re-draws a new virtual puzzle cell map keys control panel graphic image as a virtual control panel ROVER 199 for controlling a Mars Rover simulation program: driving the rover robot, taking pictures, transmitting pictures back to Earth, using the Claw and Driller to take rock samples, intelligent robot operations, etc.
[0102] FIG. 9 is a drawing showing examples of virtual keyboard drawings with which the IGSID-GIVCL can support the computer using USB-UIRT to remotely control machines such as a TV, DVD, SIRIUS radio, Disco Light, and a special MORSE Keyboard, etc.
[0103] For example,
[0104] If the user selects the TV controller, then the PCMVKCP program re-draws a new virtual puzzle cell map keys control panel graphic image as a virtual control panel TV 200 for controlling TV functions.
[0105] If the user selects DVD controller, then the PCMVKCP program re-draws a new virtual puzzle cell map keys control panel graphic image to a virtual control panel DVD 201 for controlling DVD functions.
[0106] If the user selects LIGHT controller, then the PCMVKCP program re-draws a new virtual puzzle cell map keys control panel graphic image to a virtual control panel LIGHT 202 for controlling LIGHT functions.
[0107] If the user selects SIRIUS controller, then the PCMVKCP program re-draws a new virtual puzzle cell map keys control panel graphic image to a virtual control panel SIRIUS 203 for controlling Sirius radio functions.
[0108] If the user selects the MORSE code Keyboard controller, then the PCMVKCP program re-draws a new virtual puzzle cell map keys control panel graphic image as a virtual control panel MORSE code 204 for using MORSE Code to enter key functions. In puzzle cell Row 2, Column 2 a "." represents "Di", and in puzzle cell Row 2, Column 4 a "-" represents "DHA".
The user can click on the cells to make "Di" and "DHA" signals. The (PCMVKCP) program 3 includes functions that convert MORSE code signals to A-Z and 0-9. So, the user enters MORSE Code, then clicks CONVERT 193, and the code is converted to a character to execute the command. The Read command is used during the MORSE code entry stage: the user can read what code has been entered so far, can Erase all and re-enter, and can click BKSP 190 to delete just one signal, "Di" or "DHA". This IGSID-GIVCL MORSE Code Keyboard is useful for poor-eyesight and blind users, who can enter commands with the simplest gesture actions, "Di" and "DHA", to control machines.
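The CONVERT step, turning an entered sequence of "Di"/"DHA" signals into an A-Z or 0-9 character, amounts to a Morse table lookup. A hedged Python sketch follows; the table shows only a few letters for brevity and is not the prototype's code.

```python
# Illustrative Morse decoder: "Di" = dot, "DHA" = dash. A full table would
# cover A-Z and 0-9; only a few entries are shown here.
MORSE = {".-": "A", "-...": "B", "-.-.": "C", "...": "S", "---": "O"}

def convert(signals):
    """Map a list like ["Di", "DHA"] to a character, as the CONVERT key does."""
    code = "".join("." if s == "Di" else "-" for s in signals)
    return MORSE.get(code, "?")

entered = ["Di", "DHA"]                  # user clicked Di, then DHA
print(convert(entered))                  # -> A
entered.pop()                            # BKSP deletes just the last signal
print(convert(entered + ["Di", "Di"]))   # -> S
```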
[0109] If the user selects the SYMBOLS controller, then the PCMVKCP program re-draws a new virtual puzzle cell map keys control panel graphic image as a virtual control panel SYMBOLS 205 for controlling another computer to enter and display symbol keys.
[0110] FIG. 10 is a drawing showing examples of virtual keyboard drawings with which the IGSID-GIVCL can support computer operation functions.
[0111] If the user selects the ABC controller, then the PCMVKCP program re-draws a new virtual puzzle cell map keys control panel graphic image as a virtual control panel ABC 206 for controlling another computer to enter and display A-Z keys.
[0112] If the user selects the 123 controller, then the PCMVKCP program re-draws a new virtual puzzle cell map keys control panel graphic image as a virtual control panel 123 207 for controlling another computer to enter and display 0-9 keys.
[0113] If the user selects the FN controller, then the PCMVKCP program re-draws a new virtual puzzle cell map keys control panel graphic image as a virtual control panel FN 208 for controlling another computer to enter Function F1-F12 keys.
[0114] If the user selects the PROGRAM controller, then the PCMVKCP program re-draws a new virtual puzzle cell map keys control panel graphic image as a virtual control panel PROGRAM 209 for controlling another computer to execute a computer program. For example, clicking "Take Picture" makes the Robot take a picture of the user and save it. If the user clicks the "LOOK UP", "LOOK RIGHT", "LOOK LEFT", or "LOOK DOWN" keys, the IGSID-GIVCL will control its motor module to rotate its video sensor UP, RIGHT, LEFT, or DOWN.
[0115] In the specially arranged area of the virtual puzzle cell map keys control panel graphic image, the first-row area 211 is reserved for the IGSID-GIVCL operation function menu, and the last-row area 212 is reserved for PCMVKCP program types of control panels. This makes it easier when the user wants to use a different controller: the user can find it in the last row. When the user wants to configure an IGSID-GIVCL PCMVKCP support function, the user searches the first row of the puzzle cell map image. A special "HOME" 210 link is available for a fast return to the program start position, or for when the user gets lost in the menu structure and wishes to jump back to the start.
[0116] In addition, the IGSID-GIVCL uses peripheral devices to control network devices, computers, machines, and intelligent robots. The IGSID-GIVCL can be equipped with a speech recognition program function 213, using an array of microphones as sound sensors. The IGSID-GIVCL is equipped with a voice speaking program function 214 that uses speakers for voice feedback. The IGSID-GIVCL PCMVKCP vision program can support a Hand Sign Language function 179: the gestures and position values of each hand and finger on each video frame are compared and distinguished against the hand signs in the puzzle cell area to determine which hand sign language command the program will execute.
[0117] FIG. 11 is a drawing showing an advanced TouchScreen Mouse 224 combined with the puzzle cell virtual keyboard 221 in a sandwich layers method.
The IGSID-GIVCL supports a new, revolutionary gesture input computer interface method. The IGSID-GIVCL vision puzzle cell map virtual keyboard control program (PCMVKCP) functions can support an advanced gesture action TouchScreen Mouse 224 in which virtual sandwich layers combine the virtual control panel keys zone function. The IGSID-GIVCL PCMVKCP vision program 3 enables the user to decide which hand controls the TouchScreen Mouse 221, 222, while the other hand virtually clicks the virtual puzzle cell Mouse keys, which can be assigned any commands. The mouse functions can be, for example, Mouse Double Click 195, 175, Left Click 193, Right Click 177, Mouse Left Click UP 194, Mouse Left Click Down 192, Mouse Right Click UP 176, Mouse Right Click Down 178, 190, Wheel UP, Wheel Down, etc.
[0118] For example, if the user uses the right hand to click the virtual mouse 222 function on the title menu 211 of the virtual mouse, then the IGSID-GIVCL PCMVKCP program 3 activates the virtual TouchScreen Mouse 224 function, disables Right Hand select and enables Left Hand select only on the virtual keys, and tracks the user's right hand 77 location, moving the mouse 224 position accordingly on the display monitor 43. If the user's right hand 77 moves UP, the IGSID-GIVCL PCMVKCP program moves the mouse 224 cursor position UP on the monitor 43 accordingly, by the distance 78 of the hand movement. The movement distance can be determined from its location on the right side of the Work Zone space 76. The IGSID-GIVCL PCMVKCP program calculates the ratio of the X 234, Y 224 distance from the virtual center point 79, and moves the mouse 224 cursor position the same ratio distance 232 in the same direction. Therefore, if the user's moving right hand 77 makes a circle, the mouse 224 cursor will move in a circle on the monitor 43 in real time. When the user moves the mouse 224 cursor to a specific position, which could be an internet browser web page on the computer desktop screen 226, the user can push the right hand out; the IGSID-GIVCL PCMVKCP recognizes the click select and performs the Mouse LEFT click as the default selection click action. Sometimes another mouse click action is required. In that case, the other hand can move and click the virtual mouse puzzle cell keys. For example, the other hand 82 clicks Double Click 195, then the user moves the right hand 77 to place the TouchScreen Mouse 224 cursor on a program icon and pushes the hand out.
The IGSID-GIVCL PCMVKCP program 3 will then perform the Double Click 195 for that click instead of the default Left Click; the program icon will be double-clicked 195 to open and run. The other virtual mouse puzzle cell keys are also useful when a specific mouse click action is needed. For example, if the user is viewing a large page or a drawing image page, performing the Left Click Down 192 makes the whole drawing image page sheet move and follow the right hand 77 in all directions. When the user has moved the image sheet to the right location, a virtual Left Click Up click 194 releases the TouchScreen Mouse 224 Grip action and returns to the default. The TouchScreen Mouse 224 can be operated by the right hand 77 or the left hand 82, and each hand's mouse 224 cursor start position is preferably at its own initial start location.
The IGSID-GIVCL PCMVKCP program vision calibrates the user working space zone 76 into 4 sections, with X 218 and Y 216 dimension lines crossing at the virtual center point 79. It divides the space into 4 sections whose values are section I (X+, Y+) 217, section II (X-, Y+) 215, section III (X+, Y-) 219, and section IV (X-, Y-) 220. This means that for the right hand 77, the program determines position using the X, Y values of sections I, II, III, and IV.
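The four-section calibration above is a sign test on the hand position relative to the virtual center point. A minimal Python sketch of that test (the coordinate values are assumed example inputs):

```python
# Illustrative quadrant test for the calibrated work zone: the X/Y dimension
# lines cross at the virtual center point, giving section I (X+, Y+),
# II (X-, Y+), III (X+, Y-), and IV (X-, Y-), as numbered in the text.

def work_zone_section(hand_x, hand_y, center_x=0.0, center_y=0.0):
    dx, dy = hand_x - center_x, hand_y - center_y
    if dx >= 0 and dy >= 0:
        return "I"
    if dx < 0 and dy >= 0:
        return "II"
    if dx >= 0 and dy < 0:
        return "III"
    return "IV"

print(work_zone_section(0.3, 0.2))    # -> I
print(work_zone_section(-0.3, -0.2))  # -> IV
```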
[0119] Here are the steps regarding how to control mouse position in each section.
[0120] How does the program obtain the current mouse X, Y position on the monitor screen, add the Right hand 77 gesture distance X, Y, and multiply by the screen resolution ratio? The following is a direct copy and paste from the IGSID-GIVCL PCMVKCP working prototype C# program. Copyright. The first step is to obtain the current mouse X, Y position and determine where to move the mouse on screen, then recalculate the new position.
leftofscreen = mouseScreenSetUpX + (int)mouseSelectHandX * mouseScreenResolutionRatioX;
(Current mouse X position + gesture distance X * screen resolution width ratio)
topofscreen = mouseScreenSetUpY + (int)mouseSelectHandY * mouseScreenResolutionRatioY;
(Current mouse Y position + gesture distance Y * screen resolution height ratio).
To assign the new mouse X, Y values and move the mouse to the new position: mouseSeletX = leftofscreen; mouseSeletY = topofscreen. Then the PCMVKCP program moves the mouse to the new position on the display monitor. For the user to conveniently move the hand-controlled mouse point, the Right Hand 77 TouchScreen Mouse 224 program function can be set up so that the start cursor position is the monitor LEFT-TOP corner position 231, which is video card monitor position 0,0. On the other hand, for the left hand 82, the program determines position using the X 223, Y 229 values from the center point 79 of sections I, II, III, and IV, and the LEFT Hand TouchScreen Mouse 224 program function can set the start cursor position at the monitor Right-Bottom corner position 227.
For example, if the monitor video card uses a resolution of 1900x1200, 228, 230, then the cursor start position is 1900x1200 on the monitor. The IGSID-GIVCL PCMVKCP program determines its video view frame width and height ratio, compares it with the monitor screen resolution ratio, and moves the mouse cursor distance accordingly with the hand in all directions through 360 degrees.
The TouchScreen Mouse 224 can use the gesture click action with computer virtual keyboard key buttons as well, to click key buttons on the computer monitor.
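The cursor update in [0120], current position plus the hand's gesture distance scaled by the screen-to-frame resolution ratio, can be restated as a runnable sketch. The Python below is illustrative; variable names follow the C# fragment, while the frame size and the clamping behavior are assumptions.

```python
# Illustrative version of the prototype's cursor update:
# new position = current setup position + gesture distance * resolution ratio.
# A 1900x1200 monitor and a 380x240 video view frame are assumed example values.

SCREEN_W, SCREEN_H = 1900, 1200
FRAME_W, FRAME_H = 380, 240
RATIO_X, RATIO_Y = SCREEN_W / FRAME_W, SCREEN_H / FRAME_H  # 5.0, 5.0

def move_cursor(setup_x, setup_y, hand_dx, hand_dy):
    left_of_screen = setup_x + int(hand_dx * RATIO_X)
    top_of_screen = setup_y + int(hand_dy * RATIO_Y)
    # Clamp to the monitor so the cursor never leaves the screen (assumption).
    return (max(0, min(SCREEN_W, left_of_screen)),
            max(0, min(SCREEN_H, top_of_screen)))

# Right hand starts at the LEFT-TOP corner (0, 0); the left hand would start
# at the RIGHT-BOTTOM corner (1900, 1200), per the text.
print(move_cursor(0, 0, 40, 24))        # -> (200, 120)
print(move_cursor(1900, 1200, 10, 10))  # -> (1900, 1200) after clamping
```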
[0121] With the Right hand mouse, the right hand moves the mouse position, combined with the left hand key selection zone. If the Left hand mouse is selected, then the left hand moves the mouse position, combined with the right hand key selection zone.
[0122] If the computer windows desktop screen 226 is filled with clickable buttons, then the user can use the TouchScreen Mouse 224 to select which button to click by gesture action.
[0123] The variety of mouse option key selection zones can be coded in this way; this is a direct copy and paste from the IGSID-GIVCL PCMVKCP working prototype C# program. Copyright. Example:
[0124] See code section 3 at the end of specification.
[0125] When the user moves the mouse to the target position, for example in a web page, a gesture push then performs the click.
[0126] In another example, with a program icon on the desktop screen, the user uses the left hand to click the virtual Double Click key and uses the right hand to push and click on the program icon to open and run the program.
[0127] So, the hand gesture can control mouse movement and decide which mouse click action to use for operating the computer and its programs.
[0128] In summary, this TouchScreen Mouse combined with Virtual Puzzle Cell keys control panels, using sandwich interface layer functions, is an advanced gesture system that includes all current computer interface device input methods and becomes one truly universal computer interface device. It enables the user to perform gesture control of all machine functions together, using easy gestures to control the computer, without a need to build a physical mouse, keyboard, or remote controller, or to build control interfaces on equipment, machines, or robots. The IGSID-GIVCL will replace the need for building physical control panels and interface devices, reducing high-tech device pollution and saving material resource usage on the Earth.
[0129] FIG. 12 is a drawing showing the enhanced wireless select key indication devices 235, 236 worn on the user's hand palm 82, arms, or body. The wireless indication device has 2 styles. The first style 235 includes a micro controller 240, BLUETOOTH 239, LED light 242, vibration motor 244, and power source 237, with a flexible belt 245 that can hold tightly on the hand palm 82. The 2nd style 236 includes a micro controller 240, wireless Wi-Fi TCP/IP network card 246, LCD display screen 247, vibration motor 244, power source 237, and a watch belt to hold the device on the hand 72.
[0130] When the user pushes the hand 82 out, the PCMVKCP program sends wireless network signals to the device to indicate the selected keys, for example by blinking the LED light 242 in MORSE Code signals, and/or by using the vibration motor 244 to make long-short MORSE Code vibrations. The user does not need to watch the display monitor 43 to know which keys they selected. This feature is especially useful for poor-eyesight and blind users.
The LCD screen can display the real-time monitor content; see the puzzle cell map image.
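The MORSE feedback in [0130] amounts to mapping a selected key's character to a long-short on/off sequence for the LED or vibration motor. A hedged Python sketch of that encoding follows; the durations are common Morse conventions assumed here, not values from the prototype.

```python
# Illustrative encoder: turn a selected key's character into an LED blink /
# vibration pattern. Durations (in ms) are assumptions; the device would
# drive the LED or motor through this on/off sequence.
MORSE = {"M": "--", "U": "..-"}   # partial table for illustration
DIT, DAH, GAP = 100, 300, 100    # ms: short on, long on, off between signals

def blink_pattern(char):
    """Return a list of (state, ms) steps for one character."""
    steps = []
    for symbol in MORSE[char]:
        steps.append(("on", DIT if symbol == "." else DAH))
        steps.append(("off", GAP))
    return steps

print(blink_pattern("U"))
# -> [('on', 100), ('off', 100), ('on', 100), ('off', 100), ('on', 300), ('off', 100)]
```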
[0131] FIG. 13 is a drawing showing a wireless display glass 46 that has network protocol equipment 45, including wireless network card equipment 249 and video image process card equipment 250, connected with a projector 252, power source 247, and a wireless server-client program to connect with the IGSID-GIVCL. The IGSID-GIVCL sends the display signals of the puzzle cell map image with the hand selection positions 253, 265. The wireless display glass projector 252 projects the puzzle cell image keys on its lenses 246, which are wearable by the user.
So, the user can see which keys they are selecting. The left side 269 area is for the left hand keys 270, 271, 272, 273, 274, 275, and the right side 266 area is for the right hand keys 259, 260, 261, 262, 263, 264. The lens center area can optionally be reserved to display the IGSID-GIVCL text feedback 268 and a real-time video image of the user's actions 267.
[0132] FIG. 14 is a drawing showing the IGSID-GIVCL being equipped with a mobile platform.
For example, the IGSID-GIVCL uses a micro controller board to control a variety of motors 26, 30. So, the IGSID-GIVCL main computer 1 vision program can intelligently control the rotation of these motors. As a result, the IGSID-GIVCL intelligently drives itself to move around 276 and is able to control the movement of its display projector 44 to project puzzle cell keyboard images 277 on any surface 278.
[0133] Here is the Arduino programming code showing how to enable the micro controller to control the motor module rotation.
[0134] To control more than 1 motor using 1 string signal array.
[0135] See code section 4 at the end of specification.
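Driving more than one motor from one string signal, as [0134] describes, can be sketched as a simple send/split protocol: the PC encodes all motor values into one string, and the micro controller splits it back into per-motor values. The Python below is an illustration of that idea only; the delimiter format and field names are assumptions, not the prototype's exact protocol.

```python
# Illustrative single-string multi-motor protocol. Each field is
# "motor:angle"; one message drives every motor in a single send.

def encode_motors(angles):
    """angles: dict of motor index -> servo angle, e.g. {1: 90, 2: 45}."""
    return ";".join(f"{m}:{a}" for m, a in sorted(angles.items()))

def decode_motors(message):
    """Micro-controller-side split of the string signal array."""
    out = {}
    for field in message.split(";"):
        motor, angle = field.split(":")
        out[int(motor)] = int(angle)
    return out

msg = encode_motors({1: 90, 2: 45})
print(msg)                 # -> 1:90;2:45
print(decode_motors(msg))  # -> {1: 90, 2: 45}
```

In the prototype, the encoded string would be written to the Arduino's COM port; the Arduino-side split appears in code section 4.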
[0136] This Arduino code can be downloaded to the Arduino micro controller, and the Arduino board COM port connected to the (PCMVKCP) program 3. So, the IGSID-GIVCL PCMVKCP vision program can intelligently send a value string to the Arduino to control its motor direction and speed. The motor module can be used for tilting and rotating the video sensor, and for the IGSID-GIVCL body movement: neck, arms, legs, and mobile wheels.
[0137] A variety of motor control modules can be built into the IGSID-GIVCL's neck, body, arms, hands, and legs. So, the IGSID-GIVCL can be built in human shape, with physical body movement ability combined with the IGSID-GIVCL puzzle cell map function. The IGSID-GIVCL becomes the communication bridge between the human world and the intelligent robot machine world.
This invention proposed IGSID-GIVCL example is to use MICROSOFT KINECT sensor, MICROSOFT VISUAL STUDIO C# programming, Arduino micro control board as demonstration to build a completed working IGSID-GIVCL demonstration. There are alternative methods available to build the IGSID-GIVCL as well.
Code section 1:
Copyright.
userWorkZoneHead2CenterLength = Math.Sqrt(
    Math.Pow(userMeasureCenterPoint.Position.X - userMeasureHeadPoint.Position.X, 2) +
    Math.Pow(userMeasureCenterPoint.Position.Y - userMeasureHeadPoint.Position.Y, 2) +
    Math.Pow(userMeasureCenterPoint.Position.Z - userMeasureHeadPoint.Position.Z, 2));
userWorkZoneCenter2LeftShoulderLength = Math.Sqrt(
    Math.Pow(userMeasureCenterPoint.Position.X - userMeasureLeftEdge.Position.X, 2) +
    Math.Pow(userMeasureCenterPoint.Position.Y - userMeasureLeftEdge.Position.Y, 2) +
    Math.Pow(userMeasureCenterPoint.Position.Z - userMeasureLeftEdge.Position.Z, 2));
userWorkZoneCenter2RightShoulderLength = Math.Sqrt(
    Math.Pow(userMeasureCenterPoint.Position.X - userMeasureRightEdge.Position.X, 2) +
    Math.Pow(userMeasureCenterPoint.Position.Y - userMeasureRightEdge.Position.Y, 2) +
    Math.Pow(userMeasureCenterPoint.Position.Z - userMeasureRightEdge.Position.Z, 2));
Code section 2: This is copied directly from the working robot prototype C# program.
Copyright.
puzzleCellMapList[1, 1] = "";       // First row reserved for Robot menu
puzzleCellMapList[2, 1] = "MKEY";
puzzleCellMapList[3, 1] = "";
puzzleCellMapList[4, 1] = "";
puzzleCellMapList[5, 1] = "";
puzzleCellMapList[6, 1] = "";
puzzleCellMapList[7, 1] = "";
puzzleCellMapList[8, 1] = "";
puzzleCellMapList[9, 1] = "";
puzzleCellMapList[10, 1] = "";
puzzleCellMapList[11, 1] = "";
puzzleCellMapList[12, 1] = "MKEY";
puzzleCellMapList[13, 1] = "";
puzzleCellMapList[14, 1] = "";
puzzleCellMapList[15, 1] = "";
puzzleCellMapList[16, 1] = "";
puzzleCellMapList[17, 1] = "";
puzzleCellMapList[1, 2] = "";
puzzleCellMapList[2, 2] = "ML315-8";  // Multiple clicks in one
puzzleCellMapList[3, 2] = "MU8";      // Re-clickable, large move
puzzleCellMapList[4, 2] = "MU8";      // Move mouse large up
puzzleCellMapList[5, 2] = "";
puzzleCellMapList[6, 2] = "MR45-8";   // Move 45 degree large
puzzleCellMapList[7, 2] = "";
puzzleCellMapList[8, 2] = "";
puzzleCellMapList[9, 2] = "";
puzzleCellMapList[10, 2] = "";
puzzleCellMapList[11, 2] = "";
puzzleCellMapList[12, 2] = "ML315-8"; // Move mouse large 315 degree
puzzleCellMapList[13, 2] = "MU8";
puzzleCellMapList[14, 2] = "MU8";     // Move mouse large up
puzzleCellMapList[15, 2] = "MU8";
puzzleCellMapList[16, 2] = "MR45-8";  // Move mouse large 45 degree
puzzleCellMapList[17, 2] = "";
puzzleCellMapList[1, 3] = "";
puzzleCellMapList[2, 3] = "ML8";
puzzleCellMapList[3, 3] = "ML315";    // Move mouse small 315 degree
puzzleCellMapList[4, 3] = "MU";       // Move mouse small up
puzzleCellMapList[5, 3] = "MR45";     // Move mouse small 45 degree
puzzleCellMapList[6, 3] = "MR8";
puzzleCellMapList[7, 3] = "";
puzzleCellMapList[8, 3] = "";
puzzleCellMapList[9, 3] = "";
puzzleCellMapList[10, 3] = "";
puzzleCellMapList[11, 3] = "";
puzzleCellMapList[12, 3] = "ML8";
puzzleCellMapList[13, 3] = "ML315";
puzzleCellMapList[14, 3] = "MU";
puzzleCellMapList[15, 3] = "MR45";
puzzleCellMapList[16, 3] = "MR8";
puzzleCellMapList[17, 3] = "";
puzzleCellMapList[1, 4] = "ENTER";    // Enter key
puzzleCellMapList[2, 4] = "ML8";      // Move mouse large left
puzzleCellMapList[3, 4] = "ML";       // Move mouse small left
puzzleCellMapList[4, 4] = "";
puzzleCellMapList[5, 4] = "MR";       // Move mouse small right
puzzleCellMapList[6, 4] = "MR8";      // Move mouse large right
puzzleCellMapList[7, 4] = "";
puzzleCellMapList[8, 4] = "";
puzzleCellMapList[9, 4] = "";
puzzleCellMapList[10, 4] = "";
puzzleCellMapList[11, 4] = "";
puzzleCellMapList[12, 4] = "ML8";
puzzleCellMapList[13, 4] = "ML";
puzzleCellMapList[14, 4] = "";
puzzleCellMapList[15, 4] = "MR";
puzzleCellMapList[16, 4] = "MR8";
puzzleCellMapList[17, 4] = "ENTER";
puzzleCellMapList[1, 5] = "";
puzzleCellMapList[2, 5] = "ML8";
puzzleCellMapList[3, 5] = "ML225";    // Move mouse 225 degree
puzzleCellMapList[4, 5] = "MD";       // Move mouse down
puzzleCellMapList[5, 5] = "MR135";    // Move mouse 135 degree
puzzleCellMapList[6, 5] = "MR8";
puzzleCellMapList[7, 5] = "";
puzzleCellMapList[8, 5] = "";
puzzleCellMapList[9, 5] = "";
puzzleCellMapList[10, 5] = "";
puzzleCellMapList[11, 5] = "";
puzzleCellMapList[12, 5] = "ML8";
puzzleCellMapList[13, 5] = "ML225";
puzzleCellMapList[14, 5] = "MD";
puzzleCellMapList[15, 5] = "MR135";
puzzleCellMapList[16, 5] = "MR8";
puzzleCellMapList[17, 5] = "";
puzzleCellMapList[1, 6] = "";
puzzleCellMapList[2, 6] = "ML225-8";  // Move mouse 225 multiple
puzzleCellMapList[3, 6] = "MD8";      // Move mouse large down
puzzleCellMapList[4, 6] = "MD8";
puzzleCellMapList[5, 6] = "MD8";
puzzleCellMapList[6, 6] = "MR135-8";  // Move mouse 135 multiple
puzzleCellMapList[7, 6] = "";
puzzleCellMapList[8, 6] = "";
puzzleCellMapList[9, 6] = "";
puzzleCellMapList[10, 6] = "";
puzzleCellMapList[11, 6] = "";
puzzleCellMapList[12, 6] = "ML225-8";
puzzleCellMapList[13, 6] = "MD8";
puzzleCellMapList[14, 6] = "MD8";
puzzleCellMapList[15, 6] = "MD8";
puzzleCellMapList[16, 6] = "MR135-8";
puzzleCellMapList[17, 6] = "";
puzzleCellMapList[1, 7] = "QWERT";    // Last row reserved for controls
puzzleCellMapList[2, 7] = "DCLICK";   // Mouse double click
puzzleCellMapList[3, 7] = "LCLICK";   // Mouse left click
puzzleCellMapList[4, 7] = "WWT";      // Change to WWT control
puzzleCellMapList[5, 7] = "SLOT";     // Change to SLOT control
puzzleCellMapList[6, 7] = "DENG";     // Change to DENG control
puzzleCellMapList[7, 7] = "";
puzzleCellMapList[8, 7] = "";
puzzleCellMapList[9, 7] = "";
puzzleCellMapList[10, 7] = "";
puzzleCellMapList[11, 7] = "";
puzzleCellMapList[12, 7] = "DCLICK";
puzzleCellMapList[13, 7] = "LCLICK";
puzzleCellMapList[14, 7] = "RCLICK";
puzzleCellMapList[15, 7] = "2NDLIFE"; // Change to 2ndLife control
puzzleCellMapList[16, 7] = "ROVER";   // Change to ROVER control
puzzleCellMapList[17, 7] = "";
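The listing above binds each cell of the 17-column by 7-row puzzle grid to an action code (mouse moves such as "MU8", clicks such as "DCLICK", or mode switches). The lookup itself can be sketched in Python as a simple dictionary; the helper name and the subset of codes shown are illustrative, following the C# listing above rather than a complete reproduction of it.

```python
# Sketch of the 17x7 puzzle cell map as a dict keyed by (column, row).
# Unassigned cells map to the empty string, as in the C# prototype.
puzzle_cell_map = {(x, y): "" for x in range(1, 18) for y in range(1, 8)}
puzzle_cell_map[(2, 1)] = "MKEY"
puzzle_cell_map[(12, 1)] = "MKEY"
puzzle_cell_map[(1, 4)] = "ENTER"
puzzle_cell_map[(17, 4)] = "ENTER"
puzzle_cell_map[(4, 3)] = "MU"      # move mouse small up
puzzle_cell_map[(2, 7)] = "DCLICK"  # mouse double click

def action_for_cell(column, row):
    """Return the action code bound to a puzzle cell, or '' if unassigned."""
    return puzzle_cell_map.get((column, row), "")

print(action_for_cell(4, 3))  # MU
```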
Code section 3:
if (mouseClickTypeSelection == 0) {
    DoMouseClick();            // default Left Mouse click
}
else if (mouseClickTypeSelection == 1) {
    DoMouseLeftClickUp();      // if Key Left Up selected
}
else if (mouseClickTypeSelection == 2) {
    DoMouseDoubleClick();      // if Key Double click selected
}
else if (mouseClickTypeSelection == 3) {
    DoMouseLeftClickDown();    // if Key Left Down selected
}
else if (mouseClickTypeSelection == 4) {
    DoMouseRightClickUp();     // if Key Right Up selected
}
else if (mouseClickTypeSelection == 5) {
    DoMouseRightClick();       // if Key Right Click selected
}
else if (mouseClickTypeSelection == 6) {
    DoMouseRightClickDown();   // if Key Right Down selected
}
Code section 4: Copied directly from the working robot prototype Arduino code.
Copyright
#include <Servo.h>
Servo servo;   // X-axis servo
Servo servoY;  // Y-axis servo
void setup() {
  servo.attach(11);   // digital pin 11
  servoY.attach(10);  // digital pin 10
  Serial.begin(9600);
  servo.write(90);
  servoY.write(90);
}
void loop() {
  if (Serial.available() >= 2) {
    byte pos = Serial.read();
    byte posXY = Serial.read();
    if (pos == 1) {
      servo.write(posXY);
      delay(5);
    }
    else if (pos == 2) {
      servoY.write(posXY);
      delay(5);
    }
  }
}
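The Arduino sketch in code section 4 expects two bytes per command: a servo selector (1 for the X-axis servo, 2 for the Y-axis servo) followed by a target angle. The host-side packet can be sketched in Python as follows; the function name is illustrative, and the commented-out `serial` usage assumes the third-party pyserial package and a placeholder port name.

```python
def servo_packet(servo_id, angle):
    """Build the two-byte command the Arduino sketch expects:
    first byte selects the servo (1 = X axis, 2 = Y axis),
    second byte is the target angle (0-180)."""
    if servo_id not in (1, 2):
        raise ValueError("servo_id must be 1 or 2")
    if not 0 <= angle <= 180:
        raise ValueError("angle must be 0-180")
    return bytes([servo_id, angle])

# With pyserial installed, the packet could be written to the board, e.g.:
#   import serial
#   port = serial.Serial("COM3", 9600)  # "COM3" is an assumed port name
#   port.write(servo_packet(1, 90))
print(list(servo_packet(1, 90)))  # [1, 90]
```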
Claims (62)
1. An intelligent gesture sensing input device comprising:
a main computer to process sensory data and to control the system;
a video vision sensor module communicatively coupled to the main computer to sense gestures of a user and send data to the main computer;
one or more display monitors or projectors communicatively coupled with the main computer to project and display playing computer contents or show puzzle cell keys being selected by the user; and
a computer readable puzzle cell map virtual keyboard control program (PCMVKCP);
wherein the control program includes functions for establishing and arranging a virtual 3-D interactive interface space between the input device and the user by dividing a three-dimensional virtual workspace zone between the user and the input device into a puzzle cell row-column formation;
wherein the PCMVKCP program includes functions to divide the space of the three-dimensional virtual workspace zone into first, second and third selectable gate zones along a direction perpendicular to a surface of the puzzle cell row-column formation;
wherein the first selectable gate zone is to unlock a selected key gate zone;
wherein the second selectable gate zone is to select a virtual key zone; and
wherein the third selectable gate zone is to click a selected zone.
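Claim 1 stacks three gate zones along the depth axis of the workspace: unlock, select, and click. A minimal sketch of classifying a hand's depth into these zones follows; the equal thresholds, coordinate convention (depth measured from the sensor, so a smaller value means the hand is pushed farther out), and return labels are illustrative assumptions, not terms from the claim.

```python
def gate_zone(hand_z, zone_near, zone_far):
    """Classify a hand depth into the three selectable gate zones of
    claim 1. Splits [zone_near, zone_far] into three equal slices;
    thresholds and labels are illustrative assumptions."""
    span = (zone_far - zone_near) / 3.0
    if hand_z > zone_far - span:
        return "unlock"   # first gate zone: unlock a selected key gate zone
    elif hand_z > zone_far - 2 * span:
        return "select"   # second gate zone: select a virtual key zone
    else:
        return "click"    # third gate zone: click the selected zone

print(gate_zone(1.9, 1.2, 2.1))  # unlock
```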
2. The intelligent gesture sensing input device of claim 1, wherein the three-dimensional virtual workspace zone is further divided into virtual layered control zones, whereby the plane in which a zone lies may be used to determine whether an actuation has occurred by the crossing of a boundary; and wherein the three-dimensional virtual workspace zone having sandwich layers functions includes a keyboard zone layer, a mouse zone layer, a hand sign language zone layer, and touchscreen mouse layers.
3. The intelligent gesture sensing input device of claim 1, wherein the gesture sensing input device allows automatic measurement of the three-dimensional virtual workspace zone of the user, assigns a virtual center point, creates the three-dimensional virtual workspace zone in a conformable area, establishes puzzle cell mapping keys, and presents a virtual control panel in front of the user to be clicked.
4. The intelligent gesture sensing input device of claim 3, wherein said creating the three-dimensional virtual workspace zone in the conformable area is to establish a width of the three-dimensional virtual workspace zone being a pre-determined multiple of the sum of two shoulder widths of the user and to establish a length of the three-dimensional virtual workspace zone being a function of a length of an arm of the user, and wherein a direction of the length is along an arm pushing out direction of the user.
5. The intelligent gesture sensing input device of claim 1, wherein a puzzle cell size is calculated by a workspace zone size divided by a number of rows and a number of columns.
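Claim 5's cell-size rule can be stated in two lines of Python. The 17-column by 7-row grid matches the prototype map in code section 2; the argument names and the example workspace dimensions (in metres) are illustrative assumptions.

```python
def puzzle_cell_size(zone_width, zone_height, columns=17, rows=7):
    """Cell size per claim 5: the workspace zone size divided by the
    number of columns and the number of rows."""
    return zone_width / columns, zone_height / rows

w, h = puzzle_cell_size(1.7, 0.7)
print(round(w, 2), round(h, 2))  # 0.1 0.1
```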
6. The intelligent gesture sensing input device of claim 1, wherein the gesture sensing input device divides the three-dimensional virtual workspace zone into four sections including section I of (X+, Y+), section II of (X-, Y+), section III of (X+, Y-), and section IV of (X-, Y-).
7. The intelligent gesture sensing input device of claim 1, wherein a puzzle cell virtual keyboard in the three-dimensional virtual workspace zone is presented to the user and a motion of a hand and fingers of the user selecting a selected key of the puzzle cell virtual keyboard is captured by the video vision sensor module.
8. The intelligent gesture sensing input device of claim 1, wherein the intelligent gesture sensing input device defines a hand gesture to move fingers of the user to change a selection of a selected puzzle cell.
9. The intelligent gesture sensing input device of claim 1, wherein the defined hand gesture improves a control accuracy of the selection of the selected puzzle cell.
10. The intelligent gesture sensing input device of claim 1, wherein the gesture sensing input device recognizes a gesture to control a lock or an unlock action.
11. The intelligent gesture sensing input device of claim 1, wherein the puzzle cell row-column formation includes a keyboard.
12. The intelligent gesture sensing input device of claim 1 further comprising:
a plurality of pre-recorded macro scripts, wherein the intelligent gesture sensing input device executes a selected pre-recorded macro script of the plurality of pre-recorded macro scripts corresponding to the selected key of the puzzle cell virtual keyboard to send infrared signals, BLUETOOTH signals or Wi-Fi signals to remotely control a computer, an electronic device and a machine.
13. The intelligent gesture sensing input device of claim 1, wherein the video vision sensor module and the gesture sensing input device are configured to detect three-dimensional positional values of joints of the user and to provide real-time tracking of locations of hands of the user, respectively.
14. The intelligent gesture sensing input device of claim 1, wherein the gesture sensing input device tracks X, Y, Z values of hands of the user and tracks X, Y, Z values between a body of the user and the hands of the user.
15. The intelligent gesture sensing input device of claim 1, wherein the gesture sensing input device uses relative distances of hand locations to a center point to determine the X, Y, Z position of a puzzle cell position and also recognizes user push hand locations in the X, Y, and Z directions.
1 6. The intelligent gesture sensing input device of claim 1, wherein the gesture sensing input device divides the three-dimensional virtual workspace zone into four sections including section I of (X+, Y+), section II of (X-, Y+), section III of (X+, Y-), and section IV of (X-, Y-) where the section I and the section III are operable by a right hand of the user and the section II and the section IV are operable by a left hand of the user.
17. The intelligent gesture sensing input device of claim 1, wherein the gesture sensing input device detects a hand of the user moving up, down, left, right, push, and pull.
18. The intelligent gesture sensing input device of claim 1, wherein the gesture sensing input device recognizes a gesture of pushing out a hand of the user as to perform a click action.
19. The intelligent gesture sensing input device of claim 1, wherein the gesture sensing input device detects motions of fingers of the user.
20. The intelligent gesture sensing input device of claim 1, wherein the gesture sensing input device recognizes a gesture of pushing out a finger of the user as to perform a click action.
21. The intelligent gesture sensing input device of claim 1, wherein the gesture sensing input device detects continuous finger movements that click virtual keys without the user pulling a hand back.
22. The intelligent gesture sensing input device of claim 1, wherein each hand and finger gesture and its position values on each video frame are compared and distinguished by a respective hand sign on a puzzle cell area.
23. The intelligent gesture sensing input device of claim 1, wherein the gesture sensing input device includes a virtual mouse, a keyboard, control panels, a computer interface method, a touchscreen mouse, a touchscreen, a Morse Code keyboard, virtual puzzle cell mouse keys, a virtual control panel keyboard, a mouse key controller or a hand sign language function.
24. The intelligent gesture sensing input device of claim 1, wherein the three-dimensional virtual workspace zone includes a touchscreen mouse having sandwich layers functions.
25. The intelligent gesture sensing input device of claim 1, wherein the gesture sensing input device activates a virtual touch screen mouse function and the intelligent gesture sensing input device disables a right-hand selection function and enables a left-hand selection function.
26. The intelligent gesture sensing input device of claim 1, wherein the gesture sensing input device calculates a ratio of X, Y distance between virtual center points and applies the ratio to move a mouse cursor position.
27. The intelligent gesture sensing input device of claim 1, wherein an initial cursor position is at an upper left corner or at a lower right corner of a virtual touch screen.
28. The intelligent gesture sensing input device of claim 1, wherein the gesture sensing input device determines a first ratio of a first width to a first height of a video view frame, compares the first ratio with a second ratio of a second width to a second height of a monitor screen and moves a mouse cursor by a distance based on the comparison of the first ratio with the second ratio.
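Claims 26-28 move the cursor by comparing the width/height ratios of the video view frame and the monitor screen. A minimal sketch of such a mapping follows; the proportional scaling scheme and function name are illustrative assumptions rather than the exact method of the claims.

```python
def map_cursor(hand_x, hand_y, frame_w, frame_h, screen_w, screen_h):
    """Map a hand position in the video view frame to a monitor cursor
    position by width and height ratios (illustrative sketch)."""
    return (hand_x * screen_w / frame_w, hand_y * screen_h / frame_h)

# A hand at the center of a 640x480 frame lands at the center
# of a 1920x1080 screen.
x, y = map_cursor(320, 240, 640, 480, 1920, 1080)
print(x, y)  # 960.0 540.0
```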
29. The intelligent gesture sensing input device of claim 1, wherein the gesture sensing input device defines a hand gesture for a touchscreen mouse grip function.
30. The intelligent gesture sensing input device of claim 1, wherein the touchscreen mouse is combined with a virtual puzzle cell keys control panel using the sandwich layers functions.
31. The intelligent gesture sensing input device of claim 1, wherein the gesture sensing input device supports a hand sign language function.
32. The intelligent gesture sensing input device of claim 1, wherein the three-dimensional virtual workspace zone includes a touchscreen mouse having sandwich layers functions and wherein the puzzle cell row-column formation includes a keyboard.
33. The intelligent gesture sensing input device of claim 1, wherein the three-dimensional virtual workspace zone includes a touchscreen mouse having sandwich layers functions, the gesture sensing input device supports a hand sign language function and the puzzle cell row-column formation includes a keyboard.
34. The intelligent gesture sensing input device of claim 1, wherein the puzzle cell row-column formation includes a keyboard and the gesture sensing input device supports a hand sign language function.
35. The intelligent gesture sensing input device of claim 1, wherein the three-dimensional virtual workspace zone includes a touchscreen mouse having sandwich layers functions and the gesture sensing input device supports a hand sign language function.
36. The intelligent gesture sensing input device of claim 1, wherein the gesture sensing input device includes a virtual mouse, a keyboard, Morse Code keyboard, control panels, a computer interface method, a touchscreen, virtual puzzle cell mouse keys, a virtual control panel keyboard, a mouse key controller or a hand sign language function.
37. The intelligent gesture sensing input device of claim 1, wherein the video vision sensor module includes a vision sensor, a Microsoft Kinect-like camera sensor or 3D depth camera sensor, an RGB video camera, a web camera, or a video sensor camera equipped with an infrared emitter and an infrared signal reflection detection sensor, or a video sensor camera built with various types of sensor modules combining a selective plurality of RGB video cameras, and/or web cameras, and/or infrared emitters, and/or infrared signal reflection detection sensors, and/or microphones, sound sensors, and/or speakers, and/or three-dimensional movement accelerometer sensors, and/or motor control modules, and/or a universal infrared receiver transmitter.
38. The intelligent gesture sensing input device of claim 1 further comprising a variety of motor modules attached to a micro controller board for intelligent rotation in various directions, wherein various types of sensor modules, a GPS sensor, and connecting cables can be attached to the board so that external sensor reading signals are read by the micro controller board and sent to the main computer for processing.
39. The intelligent gesture sensing input device of claim 1 further comprising one or more display monitors or projectors.
40. The intelligent gesture sensing input device of claim 1, wherein the gesture sensing input device sends an enter key command directly to activate and operate a computer readable program.
41. The intelligent gesture sensing input device of claim 1, wherein the main computer includes a web server function so as to control a computer, a machine and other electronic devices through a computer readable automation program.
42. The intelligent gesture sensing input device of claim 1 further comprising a universal infrared receiver transmitter.
43. The intelligent gesture sensing input device of claim 1 further comprising at least one network equipment.
44. The intelligent gesture sensing input device of claim 1 further comprising a wireless key-indication device configured to be worn on a palm, an arm or a body of the user.
45. The intelligent gesture sensing input device of claim 1 further comprising a wireless display glass projector to project key information of a puzzle cell image on lenses.
46. The intelligent gesture sensing input device of claim 1 further comprising a computer readable speech recognition program function and an array of microphones.
47. The intelligent gesture sensing input device of claim 1, wherein the gesture sensing input device is equipped with a computer readable voice speaking program function.
48. The intelligent gesture sensing input device of claim 1, wherein the gesture sensing input device recognizes a pointer stick configured to be held by a mouth of the user.
49. The intelligent gesture sensing input device of claim 1 further comprising a wireless display glass.
50. The intelligent gesture sensing input device of claim 1 further comprising a micro controller board.
51. The intelligent gesture sensing input device of claim 1 further comprising a plurality of motor modules controllable by the micro controller board.
52. The intelligent gesture sensing input device of claim 1 further comprising a portable power source selected from the group consisting of a rechargeable battery, a solar cell, a fuel cell, a rotation generator, a wind turbine and a thermoelectric generator.
53. The intelligent gesture sensing input device of claim 1, wherein the micro controller board includes sensor modules and sends signals to the main computer.
54. The intelligent gesture sensing input device of claim 1, wherein the micro controller board includes sensor modules, sends signals to the main computer, and also controls the rotation and speed of a plurality of motors.
55. The intelligent gesture sensing input device of claim 1 further comprising a platform having wheels.
56. The intelligent gesture sensing input device of claim 1, wherein the one or more display monitors or projectors or a visual image projector displays letters a to z, numbers 0 to 9, symbols, function keys, open computer readable programs, media, a running DVD player, playing music, a video, an internet browser, playing games, or computer functions.
57. The intelligent gesture sensing input device of claim 1, wherein a virtual puzzle cell map keys control panel graphic image is drawn on the one or more display monitors or projectors and is divided into left, center and right zones, and wherein a real time video image of the user is displayed in the center zone of the virtual puzzle cell map keys control panel graphic image.
58. The intelligent gesture sensing input device of claim 1, wherein the gesture sensing input device determines a center, a width and a length of the three-dimensional virtual workspace zone and also activates a selected motor module of the plurality of motor modules to rotate the video vision sensor module when the user walks out of a video viewable area.
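The camera-following behavior described in claim 58 can be illustrated with a minimal sketch: when the tracked user's position drifts outside the video viewable area, a motor module rotates the video vision sensor to re-center the user. The function name, margin, and pan step below are hypothetical, not taken from the patent text.

```python
# Hypothetical sketch of claim 58's camera-following logic. The margin and
# pan step values are illustrative assumptions, not values from the patent.

def follow_user(user_x: float, frame_width: float, margin: float = 0.1) -> int:
    """Return a pan command in degrees: negative pans left, positive pans
    right, zero keeps the camera still. user_x is the user's horizontal
    pixel coordinate in the current video frame."""
    left_edge = frame_width * margin
    right_edge = frame_width * (1.0 - margin)
    if user_x < left_edge:
        return -10   # user near the left edge: rotate sensor left
    if user_x > right_edge:
        return 10    # user near the right edge: rotate sensor right
    return 0         # user inside the viewable area: no rotation needed

print(follow_user(30.0, 640.0))   # user far left of a 640-pixel frame
print(follow_user(320.0, 640.0))  # user centered
```

In a real device the returned pan command would be forwarded to the micro controller board driving the selected motor module.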
59. The intelligent gesture sensing input device of claim 1, wherein the gesture sensing input device can be built in a human shape that moves from a first location to a second location on the wheels, and can alternatively be built as a human-shaped robot capable of human body movement and walking.
60. A system for Gestural Interface with Virtual Control Layers (GIVCL) comprising an intelligent gesture sensing input device comprising:
a main computer to process sensory data and to control the system;
a video vision sensor module communicatively coupled to the main computer to sense gestures of a user and send data to the main computer;
one or more display monitors or projectors communicatively coupled with the main computer to project and display playing computer contents or show puzzle cell keys being selected by the user;
a micro controller board that reads signals from other external sensors and sends them to the main computer for processing;
a plurality of motor modules controllable by the micro controller board;
a computer readable puzzle cell map virtual keyboard control program (PCMVKCP) for measuring the user's workspace, assigning a virtual center point, creating a workspace zone and establishing puzzle cell mapping keys;
wherein the PCMVKCP divides a three-dimensional virtual workspace zone into virtual control layers;
wherein the PCMVKCP divides the three-dimensional virtual workspace zone into a puzzle cell row-column formation;
wherein the PCMVKCP divides the three-dimensional virtual workspace zone into first, second and third selectable gate zones along a direction perpendicular to a surface of the puzzle cell row-column formation, wherein the first selectable gate zone is to unlock a selected key gate zone;
wherein the second selectable gate zone is to select a virtual key zone;
and wherein the third selectable gate zone is to click a selected zone.
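The three selectable gate zones of claim 60 form a depth-staged selection mechanism: pushing a hand toward the virtual key surface passes first through an unlock zone, then a select zone, then a click zone. The sketch below is an illustrative assumption of how such zones might be classified; the zone boundaries and function name are hypothetical, not from the patent.

```python
# Illustrative sketch (boundaries assumed, not specified in the patent) of
# the three gate zones along the axis perpendicular to the puzzle cell grid.

def gate_zone(depth_cm: float) -> str:
    """Map the hand's depth of travel toward the virtual key surface (cm)
    to one of the selectable gate zones from claim 60."""
    if depth_cm < 5.0:
        return "idle"      # hand has not yet entered the first gate zone
    if depth_cm < 10.0:
        return "unlock"    # first gate zone: unlock the selected key gate
    if depth_cm < 15.0:
        return "select"    # second gate zone: select a virtual key
    return "click"         # third gate zone: click the selected key

print(gate_zone(12.0))  # hand pushed 12 cm in: in the select zone
```

Staging the gesture across three depth zones is what lets the system distinguish an intentional key press from a hand merely passing through the workspace.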
61. The system for Gestural Interface with Virtual Control Layers (GIVCL) of claim 60 further comprising a platform having wheels.
62. The system for Gestural Interface with Virtual Control Layers (GIVCL) of claim 60 further comprising a universal infrared receiver transmitter.
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201462009302P | 2014-06-08 | 2014-06-08 | |
US62/009,302 | 2014-06-08 | ||
US14/723,435 | 2015-05-27 | ||
US14/723,435 US9696813B2 (en) | 2015-05-27 | 2015-05-27 | Gesture interface robot |
CA2917590A CA2917590A1 (en) | 2014-06-08 | 2015-05-29 | Gestural interface with virtual control layers |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA2917590A Division CA2917590A1 (en) | 2014-06-08 | 2015-05-29 | Gestural interface with virtual control layers |
Publications (1)
Publication Number | Publication Date |
---|---|
CA3204405A1 true CA3204405A1 (en) | 2015-12-17 |
Family
ID=54832656
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA2917590A Pending CA2917590A1 (en) | 2014-06-08 | 2015-05-29 | Gestural interface with virtual control layers |
CA3204405A Pending CA3204405A1 (en) | 2014-06-08 | 2015-05-29 | Gestural interface with virtual control layers |
CA3204400A Pending CA3204400A1 (en) | 2014-06-08 | 2015-05-29 | Gestural interface with virtual control layers |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA2917590A Pending CA2917590A1 (en) | 2014-06-08 | 2015-05-29 | Gestural interface with virtual control layers |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA3204400A Pending CA3204400A1 (en) | 2014-06-08 | 2015-05-29 | Gestural interface with virtual control layers |
Country Status (2)
Country | Link |
---|---|
CA (3) | CA2917590A1 (en) |
WO (1) | WO2015188268A1 (en) |
Families Citing this family (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105773633B (en) * | 2016-04-14 | 2018-04-20 | 中南大学 | Mobile robot man-machine control system based on face location and sensitivity parameter |
CN105999670B (en) * | 2016-05-31 | 2018-09-07 | 山东科技大学 | Taijiquan action based on kinect judges and instructs system and its guidance method |
CN106514667B (en) * | 2016-12-05 | 2020-12-08 | 北京理工大学 | Man-machine cooperation system based on Kinect skeleton tracking and calibration-free visual servo |
CN106826846B (en) * | 2017-01-06 | 2020-02-14 | 南京赫曼机器人自动化有限公司 | Intelligent service robot and method based on abnormal sound and image event driving |
CN107193385A (en) * | 2017-06-29 | 2017-09-22 | 云南大学 | It is a kind of based on methods of the Kinect to keyboard Behavior modeling |
US11093554B2 (en) | 2017-09-15 | 2021-08-17 | Kohler Co. | Feedback for water consuming appliance |
US10887125B2 (en) | 2017-09-15 | 2021-01-05 | Kohler Co. | Bathroom speaker |
US10448762B2 (en) | 2017-09-15 | 2019-10-22 | Kohler Co. | Mirror |
US11099540B2 (en) | 2017-09-15 | 2021-08-24 | Kohler Co. | User identity in household appliances |
US11314214B2 (en) | 2017-09-15 | 2022-04-26 | Kohler Co. | Geographic analysis of water conditions |
CN107639620A (en) * | 2017-09-29 | 2018-01-30 | 西安交通大学 | A kind of control method of robot, body feeling interaction device and robot |
CN108638069B (en) * | 2018-05-18 | 2021-07-20 | 南昌大学 | Method for controlling accurate motion of tail end of mechanical arm |
CN108829252A (en) * | 2018-06-14 | 2018-11-16 | 吉林大学 | Gesture input computer character device and method based on electromyography signal |
CN111694428B (en) * | 2020-05-25 | 2021-09-24 | 电子科技大学 | Gesture and track remote control robot system based on Kinect |
CN112894857B (en) * | 2021-03-02 | 2024-04-09 | 路邦科技授权有限公司 | Key control method for clinical auxiliary robot in hospital |
US20230071312A1 (en) * | 2021-09-08 | 2023-03-09 | PassiveLogic, Inc. | External Activation of Quiescent Device |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7340077B2 (en) * | 2002-02-15 | 2008-03-04 | Canesta, Inc. | Gesture recognition system using depth perceptive sensors |
US9760214B2 (en) * | 2005-02-23 | 2017-09-12 | Zienon, Llc | Method and apparatus for data entry input |
CA2591808A1 (en) * | 2007-07-11 | 2009-01-11 | Hsien-Hsiang Chiu | Intelligent object tracking and gestures sensing input device |
WO2012124844A1 (en) * | 2011-03-16 | 2012-09-20 | Lg Electronics Inc. | Method and electronic device for gesture-based key input |
2015
- 2015-05-29 WO PCT/CA2015/050493 patent/WO2015188268A1/en active Application Filing
- 2015-05-29 CA CA2917590A patent/CA2917590A1/en active Pending
- 2015-05-29 CA CA3204405A patent/CA3204405A1/en active Pending
- 2015-05-29 CA CA3204400A patent/CA3204400A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
WO2015188268A1 (en) | 2015-12-17 |
CA2917590A1 (en) | 2015-12-17 |
CA3204400A1 (en) | 2015-12-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9696813B2 (en) | Gesture interface robot | |
CA3204405A1 (en) | Gestural interface with virtual control layers | |
Kamel Boulos et al. | Web GIS in practice X: a Microsoft Kinect natural user interface for Google Earth navigation | |
Ishii | Tangible user interfaces | |
Ishii | Tangible bits: beyond pixels | |
Richards-Rissetto et al. | Kinect and 3D GIS in archaeology | |
Lifton et al. | Metaphor and manifestation cross-reality with ubiquitous sensor/actuator networks | |
KR20120072126A (en) | Visual surrogate for indirect experience, apparatus and method for providing thereof | |
CN103975290A (en) | Methods and systems for gesture-based petrotechnical application control | |
Chen et al. | ARPilot: designing and investigating AR shooting interfaces on mobile devices for drone videography | |
JP2018005663A (en) | Information processing unit, display system, and program | |
Yang et al. | Turn a nintendo wiimote into a handheld computer mouse | |
Schier et al. | ViewR: Architectural-Scale Multi-User Mixed Reality with Mobile Head-Mounted Displays | |
Billinghurst et al. | Tangible interfaces for ambient augmented reality applications | |
Klein | A Gesture Control Framework Targeting High-Resolution Video Wall Displays | |
TW201447643A (en) | Enhanced presentation environments | |
Bergé et al. | Smartphone based 3D navigation techniques in an astronomical observatory context: implementation and evaluation in a software platform | |
Roudaki et al. | PhoneLens: A low-cost, spatially aware, mobile-interaction device | |
Ballagas | Bringing Iterative Design to Ubiquitous Computing: Interaction Techniques, Toolkits, and Evaluation Methods | |
Linder | LuminAR: a compact and kinetic projected augmented reality interface | |
Niemelä | Mobile augmented reality client for citizen participation | |
Geng et al. | Perceptual user interface in virtual shopping environment | |
Tobita et al. | Open-Finger: Mobile Application Platform Enhanced by Physical Finger | |
De Sousa et al. | 5* magic wand: An RGBD camera-based 5 DoF user interface for 3D interaction | |
Mane et al. | CONTROL-WAVE: Gesture Control Glove |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
EEER | Examination request |
Effective date: 20211121 |