EP3052945A1 - Apparatuses for controlling electrical devices and software programs and methods for making and using same - Google Patents
- Publication number
- EP3052945A1 (application number EP14851104.1A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- systems
- motion
- objects
- function
- select
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01P—MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
- G01P13/00—Indicating or recording presence, absence, or direction, of movement
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0485—Scrolling or panning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- H—ELECTRICITY
- H05—ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
- H05B—ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
- H05B47/00—Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
- H05B47/10—Controlling the light source
- H05B47/105—Controlling the light source in response to determined parameters
- H05B47/115—Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings
-
- H—ELECTRICITY
- H05—ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
- H05B—ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
- H05B47/00—Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
- H05B47/10—Controlling the light source
- H05B47/155—Coordinated control of two or more light sources
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01P—MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
- G01P15/00—Measuring acceleration; Measuring deceleration; Measuring shock, i.e. sudden change of acceleration
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02B—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
- Y02B20/00—Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
- Y02B20/40—Control techniques providing energy savings, e.g. smart controller or presence detection
Definitions
- TITLE: APPARATUSES FOR CONTROLLING ELECTRICAL DEVICES AND SOFTWARE PROGRAMS AND METHODS FOR MAKING AND USING SAME
- Embodiments of the present invention relate to apparatuses for controlling virtual or real objects including electrical devices, hardware devices, software programs, software products, software systems, and/or software objects included in software programs, products, and/or systems, and to methods for making and using same.
- More particularly, embodiments of this invention relate to apparatuses for controlling virtual or real objects including electrical devices, hardware devices, software programs, software products, software systems, and/or software objects included in software programs, products, and/or systems, where the apparatuses include (1) one object or a plurality of objects, (2) at least one motion sensor capable of sensing linear and/or angular motion, linear and/or angular velocity, linear and/or angular acceleration, changes in linear and/or angular motion, changes in linear and/or angular velocity, and changes in linear and/or angular acceleration, and (3) at least one processing unit in control communication with the object or objects for converting sensor output into commands for controlling some or all of the objects and/or some or all of the attributes associated with some or all of the objects.
- The present invention also relates to methods for making and using the apparatuses.
2. Description of the Related Art
- United States Patent No. 2,421,881 to Heasty discloses the use of a rotatable disk with a number of recesses around its periphery.
- The disk is supported on a hub, and two electric contact arms provide electric current through conductor rods in alternately spaced recesses. As the disk is rotated, electrical contact is made and then broken.
- United States Patent No. 2,863,010 to Riedl discloses a spring loaded push plate that is designed to activate all electrical contacts underneath the plate at once or to selectively engage electric contacts underneath the plate by rocking the plate in the direction of the desired electrical contact.
- Dimmer switches function in a manner well-known in the art: the dimmer switch is activated by the well-known lever or, in some cases, by a knob that is simply twisted.
- United States Patent Nos. 7,861,188, 7,831,932, and 8,788,966 disclose apparatuses and methods for controlling devices and/or programs and/or objects based on motion and changes in a direction of motion.
- Although motion-based systems and methods have been disclosed, there is still a need in the art for motion-based apparatuses, systems, and methods, especially apparatuses, systems, and methods that are capable of monitoring and acting on linear and/or angular velocity, linear and/or angular acceleration, changes in linear and/or angular velocity, and/or changes in linear and/or angular acceleration to effect changes in real and/or virtual objects or collections of real and/or virtual objects.
- Embodiments of the present invention also provide control systems for controlling real and/or virtual objects such as electrical devices, hardware devices, software programs and/or software objects, where the systems convert movement(s) into commands for controlling the real and/or virtual objects, where the movement(s) include linear and/or angular motion, linear and/or angular velocity, linear and/or angular acceleration, changes in linear and/or angular motion, changes in linear and/or angular velocity, changes in linear and/or angular acceleration, rates of change in direction, rates of change in linear and/or angular velocity, rates of change of linear and/or angular acceleration, and/or mixtures or combinations thereof.
- The changes in movement direction, velocity, and/or acceleration may also include stops and/or timed holds in conjunction with the changes in direction, velocity, and/or acceleration.
- Embodiments of the present invention provide control systems for controlling real and/or virtual objects and/or virtual systems.
- The systems include at least one motion sensor or a motion sensing apparatus including a motion-sensing component for sensing movement within at least one active sensing field, a processing unit for converting sensor output into commands for controlling the real and/or virtual objects and/or virtual systems in communication with the processing unit, and optionally a user interface for human interaction.
- These processors and components may be combined into one or more units.
- The movement may result from movement of an animal or a human, an animal or human body part, or an object under the control of an animal, a human, an autonomous robot, or a robotic system.
- Movement may occur in or around a sensor or array of sensors, the sensor(s) itself may move, or a combination thereof.
- The movement may include linear and/or angular motion in any direction, linear and/or angular velocity in any direction, linear and/or angular acceleration, changes in linear and/or angular motion, changes in linear and/or angular velocity, changes in linear and/or angular acceleration, rates of change in direction, rates of change in linear and/or angular velocity, rates of change of linear and/or angular acceleration, and/or mixtures or combinations thereof.
- The commands permit users to scroll through lists or menus, to simultaneously select and scroll through lists or menus, to simultaneously select and scroll through sublists or submenus, to select, control, or simultaneously select and control at least one real object or at least one list of real objects and/or at least one virtual object or at least one list of virtual objects, to simultaneously select and control at least one real object attribute and/or at least one virtual object attribute, or to simultaneously select and proportionally control at least one real object, at least one virtual object, at least one real object attribute, and/or at least one virtual object attribute through discernible changes in movement sensed by the sensor(s) and/or the processing unit(s).
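The patent does not give the motion quantities above in computational form; as a minimal sketch, velocity and acceleration (and, by extension, their rates of change) can be estimated from timestamped position samples by finite differences. The class and function names here are illustrative, not from the source.

```python
from dataclasses import dataclass

@dataclass
class MotionSample:
    t: float  # timestamp in seconds
    x: float  # position along one axis (the same idea applies to angles)

def velocity(a: MotionSample, b: MotionSample) -> float:
    """Finite-difference estimate of (linear or angular) velocity."""
    return (b.x - a.x) / (b.t - a.t)

def acceleration(s0: MotionSample, s1: MotionSample, s2: MotionSample) -> float:
    """Finite-difference estimate of acceleration from three samples.

    The two velocity estimates are centered on the midpoints of their
    intervals, so they are separated in time by (s2.t - s0.t) / 2.
    """
    return (velocity(s1, s2) - velocity(s0, s1)) / ((s2.t - s0.t) / 2)
```

With samples following x = t², this yields velocities of 1 and 3 over the two intervals and an acceleration of 2, matching d²x/dt².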
- The systems may also include at least one remote control unit.
- The communication between the various components may be direct via wires or hardware connections and/or indirect via wireless connections.
- Embodiments of the present invention also provide systems including at least one motion sensor capable of sensing movement, where movement within at least one sensing zone of the sensor(s) produces at least one output signal for selecting, controlling or simultaneously selecting and controlling one or a plurality of the real objects and/or one or a plurality of the virtual objects or for simultaneously selecting and controlling one or a plurality of attributes of the one or a plurality of the real and/or virtual objects.
- The systems may allow attribute selection and control based on linear and/or angular motion, linear and/or angular velocity, linear and/or angular acceleration, changes in linear and/or angular motion, changes in linear and/or angular velocity, changes in linear and/or angular acceleration, rates of change in direction, rates of change in linear and/or angular velocity, rates of change of linear and/or angular acceleration, and/or mixtures or combinations thereof.
- The systems may include a preset or programmable sequence of motions within the motion sensor sensing zone, where each sequence causes a preset or pre-programmed response of selected devices and/or programs.
- Sequences may also include changes in velocity or acceleration within the sequences, where these changes are used to give different commands.
- The same preset or programmable sequence of motions may have different outcomes based upon differences in velocity or acceleration within the sequence of motions, yielding different output commands.
- The systems may utilize the preset or programmable sequences to control all of the real and/or virtual objects or lists, or any subset of the real and/or virtual objects, where different patterns or sequences may result in activating preset real and/or virtual object or list settings or a pre-programmed global or partial global preset setting such as mood lighting, music settings, virtual object selections and settings, etc.
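One way to realize "same sequence, different speed, different command" is a lookup keyed on both the direction sequence and a speed class. The threshold, command strings, and table below are illustrative assumptions, not the patent's implementation.

```python
def classify_speed(avg_speed: float, slow_max: float = 0.5) -> str:
    # Hypothetical threshold: below 0.5 units/s the gesture counts as "slow".
    return "slow" if avg_speed < slow_max else "fast"

# Illustrative command table: the same direction sequence yields a different
# command depending on how fast it was performed.
COMMANDS = {
    (("up", "right"), "slow"): "select next light group",
    (("up", "right"), "fast"): "activate mood-lighting preset",
}

def dispatch(segments):
    """segments: list of (direction, speed) pairs sensed along the gesture."""
    directions = tuple(d for d, _ in segments)
    speed_class = classify_speed(sum(s for _, s in segments) / len(segments))
    return COMMANDS.get((directions, speed_class))
```

An up-then-right gesture performed slowly selects the next light group, while the same gesture performed quickly activates the mood-lighting preset.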
- Embodiments of the present invention provide methods for controlling at least one real object and/or at least one virtual list or object or a plurality of real objects and/or virtual lists or objects using systems of this invention.
- The methods include sensing movement within at least one sensing zone of at least one motion sensor or motion sensing apparatus, where the movement includes linear and/or angular motion, linear and/or angular velocity, linear and/or angular acceleration, changes in linear and/or angular motion, changes in linear and/or angular velocity, changes in linear and/or angular acceleration, rates of change in direction, rates of change in linear and/or angular velocity, rates of change of linear and/or angular acceleration, and/or mixtures or combinations thereof.
- The methods also include producing at least one output signal from the sensor(s) based on the movement detected within the sensing zone(s).
- The methods also include converting the output signals into control functions for controlling the real objects, the virtual objects, real object attributes, and/or virtual object attributes, or any combination thereof.
- The control functions include scroll functions, selection functions, activate functions, attribute control functions, simultaneous select and scroll functions, simultaneous select and device and/or software program activate functions, simultaneous select and attribute activate functions, simultaneous select and attribute control functions, and/or simultaneous device and/or software program activate and attribute control functions, or any combination thereof.
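The control functions above can be modeled as an enumeration together with a dispatcher that combines simultaneous selection and scrolling. The feature flags and the mapping are illustrative assumptions, not the patent's algorithm.

```python
from enum import Enum, auto

class ControlFunction(Enum):
    SCROLL = auto()
    SELECT = auto()
    SELECT_AND_SCROLL = auto()  # simultaneous select and scroll
    ATTRIBUTE_CONTROL = auto()

def interpret(over_object: bool, moving: bool, direction_changed: bool) -> ControlFunction:
    """Map hypothetical sensed-motion features to one control function."""
    if over_object and moving:
        return ControlFunction.SELECT_AND_SCROLL
    if over_object:
        return ControlFunction.SELECT
    if direction_changed:
        return ControlFunction.ATTRIBUTE_CONTROL
    return ControlFunction.SCROLL
```

For example, motion over an object while still moving triggers the combined select-and-scroll function rather than two separate steps.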
- Embodiments of the present invention also provide user interfaces including at least one motion sensor or motion sensing component, at least one processing unit or at least one combined motion sensor/processing unit, and at least one communication hardware and software unit. That is, the motion sensor and processing unit may be combined; in fact, all of these components may be combined into one unit. Sensors may be moved (such as in a phone), the sensors may detect an object or objects moving, or a combination thereof.
- The motion sensors sense movement within at least one sensing zone and produce at least one output signal corresponding to the sensed movement, where the movement includes linear and/or angular motion, linear and/or angular velocity, linear and/or angular acceleration, changes in linear and/or angular motion, changes in linear and/or angular velocity, changes in linear and/or angular acceleration, rates of change in direction, rates of change in linear and/or angular velocity, rates of change of linear and/or angular acceleration, or mixtures or combinations thereof.
- The processing units convert the sensor output signals into command and control functions for controlling one real object and/or list, one virtual object and/or list, a plurality of real objects and/or lists, a plurality of virtual objects and/or lists, attributes of the real objects and/or attributes of the virtual objects and/or lists, and/or mixtures and combinations thereof.
- The communication units send the command and control functions to selected real and/or virtual objects.
- The motion sensors are capable of sensing movement of an animal or human, an animal or human body part, an object controlled by an animal or human within the sensing zones, autonomous robots or robotic systems, or any combination thereof.
- Embodiments of the present invention provide methods for manipulating one real object and/or list and/or virtual object and/or list or a plurality of real objects and/or lists and/or virtual objects and/or lists and/or associated attributes (executable or controllable) using interfaces or apparatuses of this invention, including the steps of sensing movement within at least one sensing zone of at least one motion sensor.
- The movements include linear motion and/or angular motion, linear velocity and/or angular velocity, linear acceleration and/or angular acceleration, changes in linear motion and/or angular motion, changes in linear velocity and/or angular velocity, changes in linear acceleration and/or angular acceleration, rates of change in direction of motion, rates of change in linear velocity and/or angular velocity, rates of change of linear acceleration and/or angular acceleration, and mixtures or combinations thereof.
- The methods also include producing at least one output signal from the sensors.
- The methods also include converting the output signals into control functions for controlling the real and/or virtual objects, real object attributes, and/or virtual object attributes.
- The control functions include scroll functions, simultaneous select and scroll functions, simultaneous select and activate functions, simultaneous select and attribute activate functions, simultaneous select and attribute control functions, and/or simultaneous activate and attribute control functions, or any combination thereof.
- Multiple outputs from multiple sensors and/or inputs may produce multiple output commands, a single command, or a combination of commands at different output rates and sequences.
- One example is dynamic wings in aircraft, including advanced swept-wing designs, where the wings will fold down and/or sweep at different rates, providing amazing turning capabilities.
- For example, a right hand controlling a UAV from a domed surface may accelerate the hand and all the fingers forward while simultaneously moving the thumb away from the center of the hand, causing the left wing to drop down slightly and banking the plane to the left.
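The banking example above combines two simultaneous inputs into per-surface commands. The sketch below assumes hypothetical input names and gains (0.1 and 5.0); nothing here is specified by the patent.

```python
def wing_commands(hand_accel_fwd: float, thumb_offset: float) -> dict:
    """Combine two simultaneous motion inputs into UAV surface commands.

    hand_accel_fwd: forward acceleration of the hand and fingers (units/s^2)
    thumb_offset:   thumb displacement away from the hand's center (units)
    The gains are illustrative placeholders, not values from the source.
    """
    throttle = max(0.0, hand_accel_fwd) * 0.1          # forward motion -> thrust
    left_wing_drop_deg = max(0.0, thumb_offset) * 5.0  # thumb motion -> left bank
    return {"throttle": throttle, "left_wing_drop_deg": left_wing_drop_deg}
```

Accelerating the hand forward while moving the thumb outward thus produces both a thrust command and a slight left-wing drop in a single sensed gesture.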
- Figure 1A depicts an embodiment of a system of this invention sensing an arcuate path illustrating simple angular motion.
- Figure 1B depicts another embodiment of a system of this invention sensing a sinuous path illustrating complex angular motion.
- Figure 1C depicts another embodiment of a system of this invention sensing an arcuate path including segments having different velocities and/or accelerations, but the same direction.
- Figure 1D depicts another embodiment of a system of this invention sensing a sequential path including a plurality of arcuate segments having different directions, velocities, and/or accelerations.
- Figure 1E depicts another embodiment of a system of this invention sensing a spiral path, where the spiral path may have constant or variable velocities and/or accelerations.
- Figure 1F depicts another embodiment of a system of this invention sensing a sequential path including a plurality of straight segments having different directions with different velocities and/or accelerations.
- Figure 1G depicts another embodiment of a system of this invention sensing a straight path having segments having different velocities and/or accelerations, but the same direction.
- Figure 1H depicts another embodiment of a system of this invention sensing a gesture including up, right, down and left segments having different directions with different velocities, and/or accelerations.
- Figure 2 depicts an embodiment of a system of this invention including a sensor and two separate movements within an active zone in two different directions by two different entity objects.
- Figure 3A depicts an embodiment of a system of this invention including a central processing unit and a plurality (here, four) of motion sensors having active zones pointing in the +x direction.
- Figure 3B depicts an embodiment of a system of this invention including a central processing unit and a plurality (here, four) of motion sensors having active zones pointing in the +x, -y, -x, and +y directions.
- Figure 3C depicts an embodiment of a system of this invention including a central processing unit and a plurality (here, nine) of motion sensors having active zones pointing in the +x, +x-y, -y, -x-y, -x, -x+y, +y, +x+y, and +z directions.
- Figure 3D depicts an embodiment of a system of this invention including a central processing unit and a plurality (here, four) of motion sensors.
- Figures 4A-F depict uses of the apparatuses, systems, and methods of this invention to control lights within a room.
- Figures 5A-D depict other uses of the apparatuses, systems, and methods of this invention to control lights within a room.
- Figures 6A-B depict other uses of the apparatuses, systems, and methods of this invention to control lights within a room.
- The term "about" means that a value of a given quantity is within ±20% of the stated value. In other embodiments, the value is within ±15% of the stated value. In other embodiments, the value is within ±10% of the stated value. In other embodiments, the value is within ±5% of the stated value. In other embodiments, the value is within ±2.5% of the stated value. In other embodiments, the value is within ±1% of the stated value.
- The term "substantially" means that a value of a given quantity is within ±10% of the stated value. In other embodiments, the value is within ±5% of the stated value. In other embodiments, the value is within ±2.5% of the stated value. In other embodiments, the value is within ±1% of the stated value.
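These tolerance-based definitions translate directly into a small numeric check. The defaults below follow the broadest stated values (±20% for "about", ±10% for "substantially"); the function names are illustrative.

```python
def within(value: float, stated: float, tolerance: float) -> bool:
    """True if value lies within ±tolerance (a fraction) of the stated value."""
    return abs(value - stated) <= tolerance * abs(stated)

def is_about(value: float, stated: float, tolerance: float = 0.20) -> bool:
    return within(value, stated, tolerance)

def is_substantially(value: float, stated: float, tolerance: float = 0.10) -> bool:
    return within(value, stated, tolerance)
```

So 119 is "about" 100 (within ±20%) but 121 is not, and 109 is "substantially" 100 (within ±10%) but 111 is not.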
- The terms "motion" and "movement" are often used interchangeably and mean motion or movement that is capable of being detected by a motion sensor or motion sensing component within an active zone of the sensor, such as a sensing area or volume of a motion sensor or motion sensing component.
- For example, if the sensor is a forward-viewing sensor capable of sensing motion within a forward-extending conical active zone, then movement of anything within that active zone that meets certain threshold detection criteria will result in a motion sensor output, where the output may include at least direction, velocity, and/or acceleration.
- The sensors do not need to have threshold detection criteria, but may simply generate output any time motion of any nature is detected.
- The processing units can then determine whether the motion is an actionable motion or movement or a non-actionable motion or movement.
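The two variants described above, thresholding at the sensor versus filtering in the processing unit, can be sketched as follows. The numeric thresholds are hypothetical, not from the patent.

```python
def sensor_output(speed: float, threshold: float = 0.05):
    """Variant 1: threshold at the sensor; emit output only when motion
    exceeds a minimum speed (0.05 units/s is a hypothetical threshold)."""
    return {"speed": speed} if speed >= threshold else None

def is_actionable(output, min_speed: float = 0.2) -> bool:
    """Variant 2: the sensor emits everything it detects, and the processing
    unit decides whether the motion is actionable."""
    return output is not None and output["speed"] >= min_speed
```

A very slow drift produces no sensor output at all in the first variant, while a slightly faster one produces output that the processing unit may still classify as non-actionable.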
- The term "motion sensor" or "motion sensing component" means any sensor or component capable of sensing motion of any kind by anything within an active zone (an area or volume), regardless of whether the sensor's or component's primary function is motion sensing.
- The term "real object" or "real world object" means any real world device or article that is capable of being controlled by a processing unit.
- Real objects include objects or articles that have real world presence including physical, mechanical, electro-mechanical, magnetic, electro-magnetic, electrical, or electronic devices or any other real world device that can be controlled by a processing unit.
- The term "virtual object" means any construct generated in a virtual world or by a computer and displayed by a display device that is capable of being controlled by a processing unit.
- Virtual objects include objects that have no real world presence but are still controllable by a processing unit. These objects include elements within a software system, product, or program such as icons, list elements, menu elements, generated graphic objects, 2D and 3D graphic images or objects, generated real world objects such as generated people, generated animals, generated devices, generated plants, generated landscapes and landscape objects, generated seascapes and seascape objects, generated skyscapes or skyscape objects, or any other generated real world or imaginary objects. Haptic, audible, and other attributes may be associated with these virtual objects in order to make them more like "real" objects.
- The term "entity" means a human or an animal.
- The term "entity object" means a human or a part of a human (fingers, hands, toes, feet, arms, legs, eyes, head, body, etc.), an animal or a part of an animal (fingers, hands, toes, feet, arms, legs, eyes, head, body, etc.), or a real world object under the control of a human or an animal, or robotics under the control of a system, computer, or software system or systems, or autonomously controlled (including with artificial intelligence), and includes such articles as pointers, sticks, mobile devices, or any other real world object or virtual object representing a real entity object that can be directly or indirectly controlled by a human, animal, robot, or robotic system.
- The term "at least one" means one or more or one or a plurality; these three terms may be used interchangeably within this application.
- For example, "at least one device" means one or more devices or one device and a plurality of devices.
- The term "motion sensor" means any sensor capable of sensing motion directly or having a component capable of sensing motion.
- Multiple inputs may produce different combinations of output commands, where differences in the velocity or acceleration of each input, or differences in combinations of rates of change from multiple inputs of sensed linear and/or angular motion, linear and/or angular velocity, linear and/or angular acceleration, changes in linear and/or angular motion, changes in linear and/or angular velocity, or changes in linear and/or angular acceleration, produce different output commands.
- The same output command could be reached through multiple different input combinations.
- apparatuses may be constructed to control real and/or virtual objects such as electrical devices, appliances, software programs, software routines, software objects, or other real or virtual objects
- the apparatuses may include (1) at least one motion sensor capable of sensing movement or motion within at least one sensing zone of the at least one motion sensor and at least one processing unit for receiving sensor output or (2) at least one combination sensor/processing unit or (3) any combination of these, or any device that combines these components into a single device, where the processing unit converts sensor output into command and control function for controlling at least one real and/or virtual object or a plurality of real and/or virtual objects.
- the movement or motion includes linear and/or angular motion, linear and/or angular velocity, linear and/or angular acceleration, changes in linear and/or angular motion, changes in linear and/or angular velocity, changes linear and/or angular acceleration, rates of change in direction, rates of change in linear and/or angular velocity, rates of change of linear and/or angular acceleration, and/or mixtures or combinations thereof.
- the apparatuses may also include a user interface for animal and/or human interaction.
- the apparatuses may also include remote control devices, where the remote control devices act as the motion sensor or motion sensor and processing unit of the application.
- Embodiments of the present invention broadly relate to control systems for controlling real and/or virtual objects such as electrical devices, appliances, software programs, software routines, software objects, or other real or virtual objects, where the systems include at least one motion sensor, at least one processing unit or a sensor/processing combined unit, and optionally at least one user interface.
- the motion sensors detect movement within sensing zones, areas, and/or volumes and produce output signals of the sensed movement.
- the processing units receive the output signals and convert the output signals into control and/or command functions for controlling one real and/or virtual object or a plurality of real and/or virtual objects.
- the control functions include scroll functions, select functions, attribute functions, simultaneous select and scroll functions, simultaneous select and activate functions, simultaneous select and attribute activate functions, simultaneous select and attribute control functions, simultaneous select, activate, and attribute control functions, and mixtures or combination thereof.
- the systems may also include remote control units.
- the systems of this invention may also include security units and associated software such as finger print readers, hand print readers, biometric reader, bio-kinetic readers, biomedical readers, retinal readers, voice recognition devices, other electronic security systems, key locks, any other type of mechanical locking mechanism, or mixtures or combinations thereof.
- security devices may include separate sensors or may use the motion sensors.
- an active pad sensor may be used not only to sense motion, but may also be able to process a finger print or hand print image, or bio-kinetic print, image or pattern, while an optical sensor may also support a retinal scan function.
- bio-kinetic means that the movement of a user is specific to that user, especially when considering the shape of the hand, fingers, or body parts used by the motion sensor to detect movement, and the unique EMF, optical, acoustic, and/or any other wave interference patterns associated with the biology and movement of the user.
- Embodiments of the present invention broadly relate to at least one user interface to allow the system to interact with an animal and/or a human and/or robot or robotic systems based on sensed motion.
- Embodiments of the present invention broadly relate to control systems for controlling real and/or virtual objects such as electrical devices, appliances, software programs, software routines, software objects, sensors, projected objects, or other real or virtual objects, where the systems includes at least one motion sensor, at least one processing unit, and at least one user interface.
- the motion sensors detect movement or motion within one or a plurality of sensing zones, areas, and/or volumes associated with the sensors, and the motion sensors produce output signals of the sensed movement.
- the processing units receive output signals from the motion sensors and convert the output signals into control and/or command functions for controlling one real and/or virtual object or a plurality of real and/or virtual objects.
- the motion sensors and processing units may be combined into single units sometimes referred to as sensor/processing units.
- the control and/or command functions include scroll functions, select functions, attribute functions, simultaneous select and scroll functions, simultaneous select and activate functions, simultaneous select and attribute activate functions, simultaneous select and attribute control functions, simultaneous select, activate functions and attribute control functions, simultaneous activate and attribute control functions or any combination thereof.
- the systems may also include remote units.
- the systems of this invention may also include security units and associated software such as finger print readers, hand print readers, biometric readers, bio-kinetic readers, biomedical readers, EMF detection units, optical detection units, acoustic detection units, audible detection units, or other type of wave form readers, retinal readers, voice recognition devices, other electronic security systems, key locks, any other type of mechanical locking mechanism, or mixtures or combinations thereof.
- security devices may include separate sensors or may use the motion sensors.
- an active pad sensor may be used not only to sense motion, but also to able to process a finger print or hand print image, while an optical sensor may also support a retinal scan function, or an acoustic sensor may be able to detect the motions as well as voice commands, or a combination thereof.
- Embodiments of the present invention broadly relate to control systems for real and/or virtual objects such as electrical devices, appliances, software programs, software routines, software objects, or other real or virtual objects, where the systems includes at least one remote control device including at least one motion sensor, at least one processing unit, and at least one user interface, or a unit or units that provide these functions.
- the motion sensor(s) detect movement or motion within sensing zones, areas, and/or volumes and produce output signals of the sensed movement or motion.
- the processing units receive output signals from the sensors and convert the output signals into control and/or command functions for controlling one real and/or virtual object or a plurality of real and/or virtual objects.
- the control and/or command functions include scroll functions, select functions, attribute functions, simultaneous select and scroll functions, simultaneous select and activate functions, simultaneous select and attribute activate functions, simultaneous select and attribute control functions, simultaneous select, activate functions and attribute control functions, and/or simultaneous activate and attribute control functions or any combination thereof.
- the systems may also include remote units.
- the system of this invention may also include security units and associated software such as finger print readers, hand print readers, biometric readers, bio-kinetic readers, biomedical readers, EMF detection units, optical detection units, acoustic detection units, audible detection units, or other type of wave form readers, retinal readers, voice recognition devices, other electronic security systems, key locks, any other type of mechanical locking mechanism, or mixtures or combinations thereof.
- security devices may include separate sensors or may use the motion sensors.
- an active pad sensor may be used not only to sense motion, but also to able to process a finger print or hand print image, while an optical sensor may also support a retinal scan function.
- the systems of this invention allow users to control real and/or virtual objects such as electrical devices, appliances, software programs, software routines, software objects, sensors, or other real or virtual objects based solely on movement detected with the motion sensing zones of the motion sensors without invoking any hard selection protocol, such as a mouse click or double click, touch or double touch of a pad, or any other hard selection process.
- the systems simply track movement or motion in the sensing zone, converting the sensed movement or motion into output signals that are processed into command and/or control function(s) for controlling devices, appliances, software programs, and/or real or virtual objects.
- the motion sensors and/or processing units are capable of discerning attributes of the sensed motion including direction, velocity, and/or acceleration, sensed changes in direction, velocity, and/or acceleration, or rates of change in direction, velocity, and/or acceleration. These attributes generally only trigger a command and/or control function, if the sensed motion satisfies software thresholds for movement or motion direction, movement or motion velocity, movement or motion acceleration and/or changes in movement direction, velocity, and/or acceleration and/or rates of change in direction, rates of change in linear and/or angular velocity, rates of change of linear and/or angular acceleration, and/or mixtures or combinations thereof.
- the discrimination criteria may be no discrimination (all motion generates an output signal), may be preset, may be manually adjusted or may be automatically adjust depending on the sensing zones, the type of motion being sensed, the surrounding (noise, interference, ambient light, temperature, sound changes, etc.), or other conditions that could affect the motion sensors and/or the processing unit by design or inadvertently.
- a user or robot or robotic system moves, moves a body part, moves a sensor or sensor/processing unit or moves an object under user control within one or more sensing zones, the movement and attributes thereof including at least direction, linear and/or angular velocity, linear and/or angular acceleration and/or changes in direction, linear and/or angular velocity, and/or linear and/or angular acceleration including stops and times holds are sensed.
- the sensed movement or motion is then converted by the processing units into command and control function as set forth above.
- Embodiments of the systems of this invention include motion sensors that are capable of detecting movement or motion in one dimension, two dimensions, and/or three dimensions including over time and in different conditions.
- the motion sensors may be capable of detecting motion in x, y, and/or z axes or equivalent systems such as volumes in a space, volumes in a liquid, volumes in a gas, cylindrical coordinates, spherical coordinates, radial coordinates, and/or any other coordinate system for detecting movement in three directions, or along vectors or other motion paths.
- the motion sensors are also capable of determining changes in movement or motions in one dimension (velocity and/or acceleration), two dimension (direction, area, velocity and/or acceleration), and/or three dimension (direction, area, volume, velocity and/or acceleration).
- the sensors may also be capable of determining different motions over different time spans and areas/volumes of space, combinations of inputs such as audible, tactile, environmental and other waveforms, and combinations thereof.
- the changes in movement may be changes in direction, changes in velocity, changes in acceleration and/or mixtures of changes in direction, changes in velocity or changes in acceleration and/or rates of change in direction, rates of change in velocity, rates of change of acceleration, and/or mixtures or combinations thereof, including from multiple motion sensors, sensors with motion sensing ability, or multiple sensor outputs, where the velocity and/or acceleration may be linear, angular or mixtures and combinations thereof, especially when movement or motion is detected by two or more motion sensors or two or more sensor outputs.
- the movement or motion detected by the sensor(s) is(are) used by one or move processing units to convert the sensed motion into appropriate command and control functions as set forth herein.
- the systems of this invention may also include security detectors and security software to limit access to motion detector output(s), the processing unit(s), and/or the real or virtual object(s) under the control of the processing unit(s).
- the systems of this invention include wireless receivers and/or transceivers capable of determining all or part of the controllable real and/or virtual objects within the range of the receivers and/or transceivers in the system.
- the systems are capable of polling a zone to determine numbers and types of all controllable objects within the scanning zone of the receivers and/or transceivers associated with the systems.
- the systems will poll their surroundings in order to determine the numbers and types of controllable objects, where the polling may be continuous, periodic, and/or intermittent.
- These objects whether virtual or real, may also be used as a sensor array, creating a dynamic sensor for the user to control these and other real and/or virtual objects.
- the motion sensors are capable of sensing movement of a body (e.g. , animal or human), a part of an animal or human (e.g.
- Another example of this would be to sense if multiple objects, such as people in a public assembly change their rate of walking (a change of acceleration or velocity is sensed) in an egress corridor, thus, indicating a panic situation, whereby additional egress doors are automatically opened, additional egress directional signage may also be illuminated, and/or voice commands may be activated, with or without other types of sensors being made active.
- a timed hold in front of a sensor can be used to activate different functions, e.g., for a sensor on a wall, holding a finger or object briefly in front of sensor causes lights to be adjusted to a preset level, causes TV and/or stereo equipment to be activated, or causes security systems to come on line or be activated, or begins a scroll function through submenus or subroutines. While, continuing to hold, begins a bright/ dim cycle that ends, when the hand or other body part is removed.
- the timed hold causes an attribute value to change, e.g., if the attribute is at its maximum value, a timed hold would cause the attribute value to decrease at a predetermined rate, until the body part or object is removed from or within the active zone.
- the attribute value is at its minimum value, then a timed hold would cause the attribute value to increase at a predetermined rate, until the body part or object is removed from or within the active zone.
- the software may allow random selection or may select the direction, velocity, acceleration, changes in these motion properties or rates of changes in these motion properties that may allow maximum control.
- the interface may allow for the direction, velocity, acceleration, changes in these motion properties, or rates of changes of these motion properties to be determined by the initial direction of motion, while the timed hold would continue to change the attribute value until the body part or object is removed from or within the active zone.
- a stoppage of motion maybe included, such as in the example of a user using a scroll wheel motion with a body part, whereby a list is scrolled through on a display.
- a linear scroll function begins, and remains so until a circular motion begins, at which point a circular scroll function remains in effect until stoppage of this kind of motion occurs.
- a change of direction, and/or a change of speed alone has caused a change in selection of control functions and/or attribute controls.
- an increase in acceleration might cause the list to not only accelerate in the scroll speed, but also cause the font size to appear smaller, while a decrease in acceleration might cause the scroll speed to decelerate and the font size to increase.
- Another example might be that as a user moves towards a virtual or real object, the object would move towards the user based upon the user's rate of acceleration; i.e., as the user moves faster towards the object, the object would move faster towards the user, or would change color based upon the change of speed and/or direction of the user.
- the term “brief or “briefly” means that the timed hold or cessation of movement occurs for a period to time of less than a second. In certain embodiments, the term “brief or “briefly” means for a period of time of less than 2.5 seconds. In other embodiments, the term “brief or “briefly” means for a period of time of less than 5 seconds.
- the term “brief or “briefly” means for a period of time of less than 7.5 seconds. In other embodiments, the term “brief or “briefly” means for a period of time of less than 10 seconds. In other embodiments, the term “brief or “briefly” means for a period of time of less than 15 seconds. In other embodiments, the term “brief or “briefly” means for a period of time of less than 20 seconds. In other embodiments, the term “brief or “briefly” means for aperiod of time of less than 30 seconds.
- the angle deviation can be any value, the value is may be about ⁇ 1 ° from the initial direction or about ⁇ 2.5 ° from the initial direction or about ⁇ 5 ° from the initial direction, or about ⁇ 10° from the initial direction or about ⁇ 15 ° from the initial direction.
- the deviation can be as great as about ⁇ 45 ° or about ⁇ 35 ° or about ⁇ 25 ° or about ⁇ 15 ° or about ⁇ 5 ° or about ⁇ 2.5 ° or about ⁇ 1 ° .
- movement in a given direction within an angle deviation of ⁇ x° will result in the control of a single device, while movement in a direction half way between two devices within an angle deviation of ⁇ x ° will result in the control of both devices, where the magnitude of value change may be the same or less than that for a single device and where the value of x will depend on the number of device directions active, but will preferably be less than or equal to 1 ⁇ 4 of the angle separating adjacent devices.
- changes in speed of one cm per second, or combinations of speed change and angular changes as described above will provide enough change in acceleration that the output command or control of the object(s) will occur as desired.
- the systems of the present inventions may also include gesture processing.
- the systems of this invention will be able to sense a start pose, a motion, and an end pose, where the sensed gesture may be referenced to a list of gestures stored in a look-up table.
- a gesture in the form of this invention may contain all the elements listed herein ⁇ i.e., any motion or movement, changes in direction of motion or movement, velocity and/ or acceleration of the motion or movement) and may also include the sensing of a change of in any of these motion properties to provide a different output based upon differences in the motion properties associated with a given gesture.
- the pattern of motion incorporated in the gesture say the moving of a fist or pointed finger in a circular clock-wise direction causes a command of "choose all” or “play all' rom a list of objects to be issues
- speeding up the circular motion of the hand or finger while making the circular motion may provide a different command to be issued, such as “choose all but increase the lighting magnitude as well” or "play all but play in a different order”.
- a change of linear and/or angular velocity and/or acceleration could be used as a gestural command or a series of gestures, as well as a motion-based commands where selections, controls and commands are given when a change in motion properties are made, or where any combination of gestures and motions of these is made.
- an accelerometer For purposes of measuring acceleration or changes in velocity, an accelerometer may be used.
- An accelerometer is a device that measures "proper acceleration". Proper acceleration is physical acceleration ⁇ i.e., measurable acceleration as by an accelerometer) experienced by an object and is the acceleration felt by occupants associated with an accelerating object, and which is described as a G-force, which is not a force, but rather an acceleration.
- an accelerometer therefore, is a device that measures acceleration and changes in acceleration by any means.
- Velocity and acceleration are vector quantities, consisting of magnitude (amount) and direction. Acceleration is typically thought of as a change in velocity, when the direction of velocity remains the same. However, acceleration also occurs when the velocity is constant, but the direction of the velocity changes, such as when a car makes a turn or a satellite orbits the earth. If a car's velocity remains constant, but the radius is continuously reduced in a turn, the force resulting from the acceleration increases. This force is called G-force. Acceleration rate may change, such as when a satellite keeps its same orbit with reference to the earth, but increases or decreases its speed along that orbit in order to be moved to a different location at a different time.
- a motion sensor is capable of sensing velocity and/or acceleration
- the output of such a device would include sampling to measure units of average velocity and/or accelerations over a given time or as close to instantaneous velocity and/or accelerations as possible.
- These changes may also be used for command and control function generation and determination including all acceptable command and control functions.
- average or instantaneous accelerations or velocities may be used to determine states or rates of change of motion, or may be used to provide multiple or different attribute or command functions concurrently or in a compounded manner.
- a command may be issued, either in real time, or as an average of change over time (avg da/dt), or as an "acceleration gesture" where an acceleration has been sensed and incorporated into the table values relevant to pose-movement-pose then look-up table value recognized and command sent, as is the way gestures are defined.
- Gestures are currently defined as pose, then a movement, then a pose as measured over a given time, which is then paired with a look-up table to see if the values match, and if they do, a command is issued.
- a velocity gesture and an acceleration gesture would include the ability to incorporate velocity or changes in velocity or acceleration or changes in acceleration as sensed and identified between the poses, offering a much more powerful and natural identifier of gestures, as well as a more secure gesture where desired.
- the addition of changes in motion properties during a gesture can be used to greatly expand the number of gesture and the richness of gesture processing and on-the-fly gesture modification during processing so that the look-up table would identify the "basic" gesture type and the system would then invoke routines to augment the basic response in a pre-determined or adaptive manner.
- Embodiments of this invention relate to methods that are capable of measuring a person, a person's body part(s), or object(s) under the control of a person moving in a continuous direction, but undergoing a change in velocity in such a manner that a sensor is capable of discerning the change in velocity represented by ⁇ or dv or acc.
- the sensor output is forwarded to a processing unit that issues a command function in response to the sensor output, where the command function comprises functions previously disclosed.
- the communication may be wired or wireless, if wired, the communication may be electrical, optical, sonic, or the like, if the communication is wireless, the communication may be: 1) light, light waveforms, or pulsed light transmissions such as Rf, microwave, infra-red (IR), visible, ultraviolet, or other light communication formats, 2) acoustic, audile, sonic, or acoustic waveforms such as ultrasound or other sonic communication formats, or 3) any other type of wireless communication format.
- the processing unit includes an obj ect list having an obj ect identifier for each object and an object specific attribute list for each object having one or a plurality of attributes, where each object specific attribute has an attribute identifier.
- command functions for selection and/or control of real and/or virtual objects may be generated based on a change in velocity at constant direction, a change in direction at constant velocity, a change in both direction and velocity, a change in a rate of velocity, or a change in a rate of acceleration.
- these changes may be used by a processing unit to issue commands for controlling real and/or virtual objects.
- a selection or combination scroll, selection, and attribute selection may occur upon the first movement.
- Such motion may be associated with doors opening and closing in any direction, golf swings, virtual or real world games, light moving ahead of a runner, but staying with a walker, or any other motion having compound properties such as direction, velocity, acceleration, and changes in any one or all of these primary properties; thus, direction, velocity, and acceleration may be considered primary motion properties, while changes in these primary properties may be considered secondary motion properties.
- the system may then be capable of differentially handling of primary and secondary motion properties.
- the primary properties may cause primary functions to be issued, while secondary properties may cause primary function to be issued, but may also cause the modification of primary function and/or secondary functions to be issued. For example, if a primary function comprises a predetermined selection format, the secondary motion properties may expand or contract the selection format.
- this primary/secondary format for causing the system to generate command functions may involve an object display.
- the state of the display may change, such as from a graphic to a combination graphic and text, to a text display only, while moving side to side or moving a finger or eyes from side to side could scroll the displayed objects or change the font or graphic size, while moving the head to a different position in space might reveal or control attributes or submenus of the object.
- these changes in motions may be discrete, compounded, or include changes in velocity, acceleration and rates of these changes to provide different results for the user.
- the present invention while based on the use of sensed velocity, acceleration, and changes and rates of changes in these properties to effect control of real world objects and/or virtual objects, the present invention may also use other properties of the sensed motion in combination with sensed velocity, acceleration, and changes in these properties to effect control of real world and/or virtual objects, where the other properties include direction and change in direction of motion, where the motion has a constant velocity.
- the motion sensor(s) senses velocity, acceleration, changes in velocity, changes in acceleration, and/or combinations thereof that is used for primary control of the objects via motion of a primary sensed human, animal, part thereof, real world object under the control of a human or animal, or robots under control of the human or animal
- sensing motion of a second body part may be used to confirm primary selection protocols or may be used to fine tune the selected command and control function.
- the secondary motion properties may be used to differentially control object attributes to achieve a desired final state of the objects.
- the user may move within the motion sensor active area to map out a downward concave arc, which would cause the lights on the right wall to dim proportionally to the arc distance from the lights.
- the right lights would be more dimmed in the center of the wall and less dimmed toward the ends of the wall.
- the apparatus may also use the velocity of the movement of the mapping out the concave or convex movement to further change the dimming or brightening of the lights.
- velocity starting off slowly and increasing speed in a downward motion would cause the lights on the wall to be dimmed more as the motion moved down.
- the lights at one end of the wall would be dimmed less than the lights at the other end of the wall.
- This differential control through the use of sensed complex motion permits a user to nearly instantaneously change lighting configurations, sound configurations, TV configurations, or any configuration of systems having a plurality of devices being simultaneously controlled or of a single system having a plurality of objects or attributes capable of simultaneous control.
- sensed complex motion would permit the user to quickly deploy, redeploy, rearrangement, manipulated and generally quickly reconfigure all controllable objects and/or attributes by simply conforming the movement of the objects to the movement of the user sensed by the motion detector.
- Embodiments of systems of this invention include a motion sensor or sensor array, where each sensor includes an active zone and where each sensor senses movement, movement direction, movement velocity, and/or movement acceleration, and/or changes in movement direction, changes in movement velocity, and/ or changes in movement acceleration, and/ or changes in a rate of a change in direction, changes in a rate of a change in velocity and/or changes in a rate of a change in acceleration within the active zone by one or a plurality of body parts or objects and produces an output signal.
- the systems also include at least one processing unit including communication software and hardware, where the processing units convert the output signal or signals from the motion sensor or sensors into command and control functions, and one or a plurality of real objects and/or virtual objects in communication with the processing units.
- the command and control functions comprise at least (1) a scroll function or a plurality of scroll functions, (2) a select function or a plurality of select functions, (3) an attribute function or plurality of attribute functions, (4) an attribute control function or a plurality of attribute control functions, or (5) a simultaneous control function.
- the simultaneous control function includes (a) a select function or a plurality of select functions and a scroll function or a plurality of scroll functions, (b) a select function or a plurality of select functions and an activate function or a plurality of activate functions, and (c) a select function or a plurality of select functions and an attribute control function or a plurality of attribute control functions.
- the processing unit or units (1) processes a scroll function or a plurality of scroll functions, (2) selects and processes a scroll function or a plurality of scroll functions, (3) selects and activates an object or a plurality of objects in communication with the processing unit, or (4) selects and activates an attribute or a plurality of attributes associated with an object or a plurality of objects in communication with the processing unit or units, or any combination thereof.
- the objects comprise electrical devices, electrical systems, sensors, hardware devices, hardware systems, environmental devices and systems, energy and energy distribution devices and systems, software systems, software programs, software objects, or combinations thereof.
- the attributes comprise adjustable attributes associated with the devices, systems, programs and/or objects.
- the sensor(s) is(are) capable of discerning a change in movement, velocity and/or acceleration of ±5%. In other embodiments, the sensor(s) is(are) capable of discerning a change in movement, velocity and/or acceleration of ±10%. In other embodiments, the system further comprises a remote control unit or remote control system in communication with the processing unit to provide remote control of the processing unit and all real and/or virtual objects under the control of the processing unit.
- the motion sensor is selected from the group consisting of digital cameras, optical scanners, optical roller ball devices, touch pads, inductive pads, capacitive pads, holographic devices, laser tracking devices, thermal devices, touch or touchless sensors, acoustic devices, and any other device capable of sensing motion, arrays of such devices, and mixtures and combinations thereof.
- the objects include environmental controls, lighting devices, cameras, ovens, dishwashers, stoves, sound systems, display systems, alarm systems, control systems, medical devices, robots, robotic control systems, hot and cold water supply devices, air conditioning systems, heating systems, ventilation systems, air handling systems, computers and computer systems, chemical or manufacturing plant control systems, computer operating systems and other software systems, remote control systems, mobile devices, electrical systems, sensors, hardware devices, hardware systems, environmental devices and systems, energy and energy distribution devices and systems, software programs or objects or mixtures and combinations thereof.
- Embodiments of methods of this invention for controlling objects include the step of sensing movement, movement direction, movement velocity, and/or movement acceleration, and/or changes in movement direction, changes in movement velocity, and/or changes in movement acceleration, and/or changes in a rate of a change in direction, changes in a rate of a change in velocity and/or changes in a rate of a change in acceleration by one or a plurality of body parts or objects within an active sensing zone of a motion sensor or within active sensing zones of an array of motion sensors.
- the methods also include the step of producing an output signal or a plurality of output signals from the sensor or sensors and converting the output signal or signals into a command function or a plurality of command functions.
- the command and control functions comprise at least (1) a scroll function or a plurality of scroll functions, (2) a select function or a plurality of select functions, (3) an attribute function or plurality of attribute functions, (4) an attribute control function or a plurality of attribute control functions, or (5) a simultaneous control function.
- the simultaneous control function includes (a) a select function or a plurality of select functions and a scroll function or a plurality of scroll functions, (b) a select function or a plurality of select functions and an activate function or a plurality of activate functions, and (c) a select function or a plurality of select functions and an attribute control function or a plurality of attribute control functions.
- the objects comprise electrical devices, electrical systems, sensors, hardware devices, hardware systems, environmental devices and systems, energy and energy distribution devices and systems, software systems, software programs, software objects, or combinations thereof.
- the attributes comprise adjustable attributes associated with the devices, systems, programs and/or objects.
- a brief timed hold or a brief cessation of movement causes the attribute to be adjusted to a preset level, a selection to be made, a scroll function to be implemented, or a combination thereof. In other embodiments, a continued timed hold causes the attribute to undergo a high value/low value cycle that ends when the hold is removed.
- the timed hold causes an attribute value to change so that (1) if the attribute is at its maximum value, the timed hold causes the attribute value to decrease at a predetermined rate until the timed hold is removed, (2) if the attribute value is at its minimum value, then the timed hold causes the attribute value to increase at a predetermined rate until the timed hold is removed, (3) if the attribute value is not at the maximum or minimum value, then the timed hold randomly selects the rate and direction of attribute value change or changes the attribute to allow maximum control, or (4) the timed hold causes a continuous change in the attribute value or scroll function in a direction of the initial motion until the timed hold is removed.
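The four timed-hold rules above amount to a small per-tick update rule. The sketch below is illustrative only; the function name and the unit rate constant are assumptions made for the example.

```python
import random

# Illustrative encoding of the four timed-hold rules described in the text.
def timed_hold_step(value: float, vmin: float, vmax: float,
                    initial_direction: float = 0.0, rate: float = 1.0) -> float:
    """Return the per-tick attribute change while a timed hold persists."""
    if value >= vmax:
        return -rate                      # (1) at maximum: decrease
    if value <= vmin:
        return +rate                      # (2) at minimum: increase
    if initial_direction:
        return rate * initial_direction   # (4) continue in the initial direction
    return random.choice([-rate, rate])   # (3) otherwise: random rate/direction

# While the hold continues, the value walks toward the opposite bound,
# producing the high value/low value cycle described in the text.
level = 10.0
level += timed_hold_step(level, 0.0, 10.0)   # at maximum, so level drops to 9.0
```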
- the motion sensor is selected from the group consisting of sensors of any kind including digital cameras, optical scanners, optical roller ball devices, touch pads, inductive pads, capacitive pads, holographic devices, laser tracking devices, thermal devices, touch or touchless sensors, acoustic devices, and any other device capable of sensing motion or changes in any waveform due to motion or arrays of such devices, and mixtures and combinations thereof.
- the objects include lighting devices, cameras, ovens, dishwashers, stoves, sound systems, display systems, alarm systems, control systems, medical devices, robots, robotic control systems, hot and cold water supply devices, air conditioning systems, heating systems, ventilation systems, air handling systems, computers and computer systems, chemical plant control systems, computer operating systems and other software systems, remote control systems, sensors, or mixtures and combinations thereof.
- Embodiments of this invention relate to methods for controlling objects that include sensing motion including motion properties within an active sensing zone of a motion sensor, where the motion properties include a direction, a velocity, an acceleration, a change in direction, a change in velocity, a change in acceleration, a rate of change of direction, a rate of change of velocity, a rate of change of acceleration, stops, holds, timed holds, or mixtures and combinations thereof.
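The listed motion properties can, for illustration, be derived from successive position samples. The sampling model and function name below are assumptions, not the application's method.

```python
import math

# A minimal sketch deriving three of the listed motion properties (velocity,
# acceleration, change in direction) from successive 2-D position samples.
def motion_properties(p0, p1, p2, dt):
    """p0, p1, p2: successive (x, y) positions sampled dt seconds apart."""
    v1 = ((p1[0] - p0[0]) / dt, (p1[1] - p0[1]) / dt)
    v2 = ((p2[0] - p1[0]) / dt, (p2[1] - p1[1]) / dt)
    speed = math.hypot(*v2)
    accel = (math.hypot(*v2) - math.hypot(*v1)) / dt   # change in speed per second
    # Change in direction: signed angle between the two velocity vectors.
    a1, a2 = math.atan2(v1[1], v1[0]), math.atan2(v2[1], v2[0])
    dtheta = math.degrees((a2 - a1 + math.pi) % (2 * math.pi) - math.pi)
    return speed, accel, dtheta

# Moving right, then turning 45 degrees upward:
speed, accel, dtheta = motion_properties((0, 0), (1, 0), (2, 1), dt=1.0)
```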
- the methods also include producing an output signal or a plurality of output signals corresponding to the sensed motion and converting the output signal or signals via a processing unit in communication with the motion sensor into a command function or a plurality of command functions.
- the command functions include a scroll function, a select function, an attribute function, an attribute control function, a simultaneous control function, or mixtures and combinations thereof.
- the simultaneous control functions include a select and scroll function, a select, scroll and activate function, a select, scroll, activate, and attribute control function, a select and activate function, a select and attribute control function, a select, activate, and attribute control function, or mixtures or combinations thereof.
- the methods also include processing the command function or the command functions by (1) processing a scroll function, (2) selecting and processing a scroll function, (3) selecting and activating an object or a plurality of objects in communication with the processing unit, (4) selecting and activating an attribute or a plurality of attributes associated with the object or the plurality of objects in communication with the processing unit, or (5) selecting, activating an object or a plurality of objects in communication with the processing unit, and activating an attribute or a plurality of attributes associated with the object or the plurality of objects in communication with the processing unit.
- the objects comprise real world objects, virtual objects and mixtures or combinations thereof, where the real world objects include physical, mechanical, electro- mechanical, magnetic, electro-magnetic, electrical, or electronic devices or any other real world device that can be controlled by a processing unit and the virtual objects include any construct generated in a virtual world or by a computer and displayed by a display device and that are capable of being controlled by a processing unit.
- the attributes comprise activatable, executable and/or adjustable attributes associated with the objects.
- changes in motion properties are changes discernible by the motion sensors and/or the processing units.
- the motion sensor is selected from the group consisting of digital cameras, optical scanners, optical roller ball devices, touch pads, inductive pads, capacitive pads, holographic devices, laser tracking devices, thermal devices, any other device capable of sensing motion, arrays of motion sensors, and mixtures or combinations thereof.
- the objects include lighting devices, cameras, ovens, dishwashers, stoves, sound systems, display systems, alarm systems, control systems, medical devices, robots, robotic control systems, hot and cold water supply devices, air conditioning systems, heating systems, ventilation systems, air handling systems, computers and computer systems, chemical plant control systems, computer operating systems, graphics systems, business software systems, word processor systems, internet browsers, accounting systems, military systems, control systems, other software systems, remote control systems, or mixtures and combinations thereof.
- if the timed hold is brief, then the processing unit causes an attribute to be adjusted to a preset level. In other embodiments, if the timed hold is continued, then the processing unit causes an attribute to undergo a high value/low value cycle that ends when the hold is removed.
- the timed hold causes an attribute value to change so that (1) if the attribute is at its maximum value, the timed hold causes the attribute value to decrease at a predetermined rate until the timed hold is removed, (2) if the attribute value is at its minimum value, then the timed hold causes the attribute value to increase at a predetermined rate until the timed hold is removed, (3) if the attribute value is not at the maximum or minimum value, then the timed hold randomly selects the rate and direction of attribute value change or changes the attribute to allow maximum control, or (4) the timed hold causes a continuous change in the attribute value in a direction of the initial motion until the timed hold is removed.
- Embodiments of this invention relate to methods for controlling real world objects that include sensing motion including motion properties within an active sensing zone of a motion sensor, where the motion properties include a direction, a velocity, an acceleration, a change in direction, a change in velocity, a change in acceleration, a rate of change of direction, a rate of change of velocity, a rate of change of acceleration, stops, holds, timed holds, or mixtures and combinations thereof.
- the methods also include producing an output signal or a plurality of output signals corresponding to the sensed motion and converting the output signal or signals via a processing unit in communication with the motion sensor into a command function or a plurality of command functions.
- the command functions include a scroll function, a select function, an attribute function, an attribute control function, a simultaneous control function, or mixtures and combinations thereof.
- the simultaneous control functions include a select and scroll function, a select, scroll and activate function, a select, scroll, activate, and attribute control function, a select and activate function, a select and attribute control function, a select, activate, and attribute control function, or mixtures or combinations thereof.
- the methods also include (1) processing a scroll function, (2) selecting and processing a scroll function, (3) selecting and activating an object or a plurality of objects in communication with the processing unit, (4) selecting and activating an attribute or a plurality of attributes associated with the object or the plurality of objects in communication with the processing unit, or (5) selecting, activating an object or a plurality of objects in communication with the processing unit, and activating an attribute or a plurality of attributes associated with the object or the plurality of objects in communication with the processing unit.
- the objects comprise real world objects and mixtures or combinations thereof, where the real world objects include physical, mechanical, electro-mechanical, magnetic, electro-magnetic, electrical, or electronic devices or any other real world device that can be controlled by a processing unit or units.
- the attributes comprise activatable, executable and/or adjustable attributes associated with the objects.
- changes in motion properties are changes discernible by the motion sensors and/or the processing units.
- the motion sensor is selected from the group consisting of digital cameras, optical scanners, optical roller ball devices, touch pads, inductive pads, capacitive pads, holographic devices, laser tracking devices, thermal devices, any other device capable of sensing motion, arrays of motion sensors, and mixtures or combinations thereof.
- the objects include lighting devices, cameras, ovens, dishwashers, stoves, sound systems, display systems, alarm systems, control systems, medical devices, robots, robotic control systems, hot and cold water supply devices, air conditioning systems, heating systems, ventilation systems, air handling systems, computers and computer systems, chemical plant control systems, remote control systems, or mixtures and combinations thereof.
- Embodiments of this invention relate to methods for controlling virtual objects that include sensing motion including motion properties within an active sensing zone of a motion sensor, where the motion properties include a direction, a velocity, an acceleration, a change in direction, a change in velocity, a change in acceleration, a rate of change of direction, a rate of change of velocity, a rate of change of acceleration, stops, holds, timed holds, or mixtures and combinations thereof.
- the methods also include producing an output signal or a plurality of output signals corresponding to the sensed motion and converting the output signal or signals via a processing unit in communication with the motion sensor into a command function or a plurality of command functions.
- the command functions include a scroll function, a select function, an attribute function, an attribute control function, a simultaneous control function, or mixtures and combinations thereof.
- the simultaneous control functions include a select and scroll function, a select, scroll and activate function, a select, scroll, activate, and attribute control function, a select and activate function, a select and attribute control function, a select, activate, and attribute control function, or mixtures or combinations thereof.
- the methods also include (1) processing a scroll function, (2) selecting and processing a scroll function, (3) selecting and activating an object or a plurality of objects in communication with the processing unit, (4) selecting and activating an attribute or a plurality of attributes associated with the object or the plurality of objects in communication with the processing unit, or (5) selecting, activating an object or a plurality of objects in communication with the processing unit, and activating an attribute or a plurality of attributes associated with the object or the plurality of objects in communication with the processing unit.
- the objects comprise virtual objects and mixtures or combinations thereof, where the virtual objects include any construct generated in a virtual world or by a computer and displayed by a display device and that are capable of being controlled by a processing unit.
- the attributes comprise activatable, executable and/or adjustable attributes associated with the objects.
- changes in motion properties are changes discernible by the motion sensors and/or the processing units.
- the motion sensor is selected from the group consisting of digital cameras, optical scanners, optical roller ball devices, touch pads, inductive pads, capacitive pads, holographic devices, laser tracking devices, thermal devices, any other device capable of sensing motion, arrays of motion sensors, and mixtures or combinations thereof.
- the software products include computer operating systems, graphics systems, business software systems, word processor systems, internet browsers, accounting systems, military systems, control systems, or mixtures and combinations thereof.
- Embodiments of this invention relate to systems and apparatuses for controlling objects that include one or a plurality of motion sensors including an active zone, where the sensor senses motion including motion properties within an active sensing zone of a motion sensor, where the motion properties include a direction, a velocity, an acceleration, a change in direction, a change in velocity, a change in acceleration, a rate of change of direction, a rate of change of velocity, a rate of change of acceleration, stops, holds, timed holds, or mixtures and combinations thereof to produce an output signal or a plurality of output signals.
- the systems and apparatuses also include one or a plurality of processing units including communication software and hardware, where the processing unit or units convert the outputs into command and control functions, and one or a plurality of controllable objects in communication with the processing unit or units.
- the command functions include a scroll function, a select function, an attribute function, an attribute control function, a simultaneous control function, or mixtures and combinations thereof.
- the simultaneous control functions include a select and scroll function, a select, scroll and activate function, a select, scroll, activate, and attribute control function, a select and activate function, a select and attribute control function, a select, activate, and attribute control function, or mixtures or combinations thereof.
- the processing unit or units (1) process scroll functions, (2) select and process scroll functions, (3) select and activate one controllable object or a plurality of controllable objects in communication with the processing unit, (4) select and activate one controllable attribute or a plurality of controllable attributes associated with the controllable objects in communication with the processing unit, or (5) select, activate an object or a plurality of objects in communication with the processing unit, and activate an attribute or a plurality of attributes associated with the object or the plurality of objects in communication with the processing unit.
- the objects comprise real world objects, virtual objects and mixtures or combinations thereof, where the real world objects include physical, mechanical, electromechanical, magnetic, electro-magnetic, electrical, or electronic devices or any other real world device that can be controlled by a processing unit and the virtual objects include any construct generated in a virtual world or by a computer and displayed by a display device and that are capable of being controlled by a processing unit.
- the attributes comprise activatable, executable and/or adjustable attributes associated with the objects.
- changes in motion properties are changes discernible by the motion sensors and/or the processing units.
- the motion sensor is selected from the group consisting of digital cameras, optical scanners, optical roller ball devices, touch pads, inductive pads, capacitive pads, holographic devices, laser tracking devices, thermal devices, any other device capable of sensing motion, arrays of motion sensors, and mixtures or combinations thereof.
- the objects include lighting devices, cameras, ovens, dishwashers, stoves, sound systems, display systems, alarm systems, control systems, medical devices, robots, robotic control systems, hot and cold water supply devices, air conditioning systems, heating systems, ventilation systems, air handling systems, computers and computer systems, chemical plant control systems, computer operating systems, graphics systems, business software systems, word processor systems, internet browsers, accounting systems, military systems, control systems, other software systems, remote control systems, or mixtures and combinations thereof.
- the sensor and/or the processing unit are capable of discerning a change in direction of motion of ±15°. In other embodiments, the sensor and/or the processing unit are capable of discerning a change in direction of motion of ±10°.
- the sensor and/or the processing unit are capable of discerning a change in direction of motion of ±5°.
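The ±15°, ±10°, and ±5° figures above can be read as an angular resolution test. The sketch below is an assumed illustration of that reading, and the function name is hypothetical.

```python
# Hedged sketch: a sensor "discerns" a change in direction only when the
# angular change meets its resolution (±15°, ±10°, or ±5° in the
# embodiments above).
def discerns_direction_change(delta_deg: float, resolution_deg: float) -> bool:
    """True if the angular change is large enough for the sensor to discern."""
    return abs(delta_deg) >= resolution_deg

# A ±15° sensor misses a 12° turn that a ±10° sensor reports.
discerns_direction_change(12.0, 15.0)   # False
discerns_direction_change(12.0, 10.0)   # True
```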
- the systems and apparatuses further include a remote control unit in communication with the processing unit to provide remote control of the processing unit and the objects in communication with the processing unit.
- Embodiments of this invention relate to systems and apparatuses for controlling real world objects that include one or a plurality of motion sensors including an active zone, where the sensor senses motion including motion properties within an active sensing zone of a motion sensor, where the motion properties include a direction, a velocity, an acceleration, a change in direction, a change in velocity, a change in acceleration, a rate of change of direction, a rate of change of velocity, a rate of change of acceleration, stops, holds, timed holds, or mixtures and combinations thereof.
- the systems and apparatuses also include one or a plurality of processing units including communication software and hardware, where the unit or units convert the output into command and control functions, and one or a plurality of controllable objects in communication with the processing unit.
- the command functions include a scroll function, a select function, an attribute function, an attribute control function, a simultaneous control function, or mixtures and combinations thereof.
- the simultaneous control functions include a select and scroll function, a select, scroll and activate function, a select, scroll, activate, and attribute control function, a select and activate function, a select and attribute control function, a select, activate, and attribute control function, or mixtures or combinations thereof.
- the processing unit or units (1) process scroll functions, (2) select and process scroll functions, (3) select and activate one controllable object or a plurality of controllable objects in communication with the processing unit, (4) select and activate one controllable attribute or a plurality of controllable attributes associated with the controllable objects in communication with the processing unit, or (5) select, activate an object or a plurality of objects in communication with the processing unit, and activate an attribute or a plurality of attributes associated with the object or the plurality of objects in communication with the processing unit.
- the objects comprise real world objects and mixtures or combinations thereof, where the real world objects include physical, mechanical, electro-mechanical, magnetic, electro-magnetic, electrical, or electronic devices or any other real world device that can be controlled by a processing unit.
- the attributes comprise activatable, executable and/or adjustable attributes associated with the objects.
- changes in motion properties are changes discernible by the motion sensors and/or the processing units.
- the motion sensor is selected from the group consisting of digital cameras, optical scanners, optical roller ball devices, touch pads, inductive pads, capacitive pads, holographic devices, laser tracking devices, thermal devices, any other device capable of sensing motion, arrays of motion sensors, and mixtures or combinations thereof.
- the objects include lighting devices, cameras, ovens, dishwashers, stoves, sound systems, display systems, alarm systems, control systems, medical devices, robots, robotic control systems, hot and cold water supply devices, air conditioning systems, heating systems, ventilation systems, air handling systems, computers and computer systems, chemical plant control systems, remote control systems, or mixtures and combinations thereof.
- the sensor and/or the processing unit are capable of discerning a change in direction of motion of ±15°.
- the sensor and/or the processing unit are capable of discerning a change in direction of motion of ±10°. In certain embodiments, the sensor and/or the processing unit are capable of discerning a change in direction of motion of ±5°. In certain embodiments, the methods further include a remote control unit in communication with the processing unit to provide remote control of the processing unit and the objects in communication with the processing unit.
- Embodiments of this invention relate to systems and apparatuses for controlling virtual objects that include one or a plurality of motion sensors including an active zone, where the sensor senses motion including motion properties within an active sensing zone of a motion sensor, where the motion properties include a direction, a velocity, an acceleration, a change in direction, a change in velocity, a change in acceleration, a rate of change of direction, a rate of change of velocity, a rate of change of acceleration, stops, holds, timed holds, or mixtures and combinations thereof.
- the systems and apparatuses also include one or a plurality of processing units including communication software and hardware, where the unit or units convert the output into command and control functions, and one or a plurality of controllable objects in communication with the processing unit or units.
- the command functions include a scroll function, a select function, an attribute function, an attribute control function, a simultaneous control function, or mixtures and combinations thereof.
- the simultaneous control functions include a select and scroll function, a select, scroll and activate function, a select, scroll, activate, and attribute control function, a select and activate function, a select and attribute control function, a select, activate, and attribute control function, or mixtures or combinations thereof.
- the processing unit or units (1) process scroll functions, (2) select and process scroll functions, (3) select and activate one controllable object or a plurality of controllable objects in communication with the processing unit, (4) select and activate one controllable attribute or a plurality of controllable attributes associated with the controllable objects in communication with the processing unit, or (5) select, activate an object or a plurality of objects in communication with the processing unit, and activate an attribute or a plurality of attributes associated with the object or the plurality of objects in communication with the processing unit.
- the objects comprise virtual objects and mixtures or combinations thereof, where the virtual objects include any construct generated in a virtual world or by a computer and displayed by a display device and that are capable of being controlled by a processing unit.
- the attributes comprise activatable, executable and/or adjustable attributes associated with the objects.
- changes in motion properties are changes discernible by the motion sensors and/or the processing units.
- the sensor and/or the processing unit are capable of discerning a change in direction of motion of ±15°.
- the sensor and/or the processing unit are capable of discerning a change in direction of motion of ±10°.
- the sensor and/or the processing unit are capable of discerning a change in direction of motion of ±5°.
- the systems and apparatuses further include a remote control unit in communication with the processing unit to provide remote control of the processing unit and the objects in communication with the processing unit.
- the motion sensor is selected from the group consisting of digital cameras, optical scanners, optical roller ball devices, touch pads, inductive pads, capacitive pads, holographic devices, laser tracking devices, thermal devices, any other device capable of sensing motion, arrays of motion sensors, and mixtures or combinations thereof.
- the software products include computer operating systems, graphics systems, business software systems, word processor systems, internet browsers, accounting systems, military systems, control systems, or mixtures and combinations thereof.
- the motion sensors may also be used in conjunction with displays, keyboards, touch pads, touchless pads, sensors of any type, or other devices associated with a computer, a notebook computer or a drawing tablet or any mobile or stationary device.
- the motion sensors may be optical sensors, acoustic sensors, thermal sensors, optoacoustic sensors, acoustic devices, any other sensor that senses movement or changes in movement, or mixtures or combinations thereof.
- the sensors may be digital, analog or a combination of digital and analog. For camera systems, the systems may sense motion within a zone, area or volume in front of the lens.
- Optical sensors may operate in any region of the electromagnetic spectrum including, without limitation, RF, microwave, near IR, IR, far IR, visible, UV or mixtures or combinations thereof.
- Acoustic sensors may operate over the entire sonic range, which includes the human audio range, animal audio ranges, or combinations thereof. EMF sensors may be used and operate in any region of a discernable wavelength or magnitude where motion can be discerned. Moreover, LCD screen(s) may be incorporated to identify which devices are chosen, the temperature setting, etc. Moreover, the interface may project a virtual control surface and sense motion within the projected image and invoke actions based on the sensed motion.
- the motion sensor associated with the interfaces of this invention can also be an acoustic motion sensor using any acceptable region of the sound spectrum. A volume of a liquid or gas, where a user's body part or an object under the control of a user may be immersed, may be used, where sensors associated with the liquid or gas can discern motion.
- any sensor being able to discern differences in transverse, longitudinal, pulse, compression or any other waveform could be used to discern motion and any sensor measuring gravitational, magnetic, electro-magnetic, or electrical changes relating to motion or contact while moving (resistive and capacitive screens) could be used.
- the interfaces can include mixtures or combinations of any known or yet to be invented motion sensors.
- Suitable physical mechanical, electro-mechanical, magnetic, electro-magnetic, electrical, or electronic devices, hardware devices, appliances, and/or any other real world device that can be controlled by a processing unit include, without limitation, any electrical and/or hardware device or appliance having attributes which can be controlled by a switch, a joy stick or similar type controller, or software program or object.
- Exemplary examples of such attributes include, without limitation, ON, OFF, intensity and/or amplitude, impedance, capacitance, inductance, software attributes, lists or submenus of software programs or objects, or any other controllable electrical and/or electromechanical function and/or attribute of the device.
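The controllable attributes enumerated above (ON/OFF, intensity and/or amplitude, etc.) can be modeled, for illustration, as an adjustable attribute object. The class name and clamping bounds below are assumptions made for the sketch.

```python
# Illustrative model of a controllable device attribute as described above.
class DeviceAttribute:
    """An adjustable attribute of a controlled device, e.g. lamp intensity."""

    def __init__(self, name: str, vmin: float = 0.0, vmax: float = 100.0):
        self.name, self.vmin, self.vmax = name, vmin, vmax
        self.on = False          # ON/OFF attribute
        self.value = vmin        # intensity/amplitude-style attribute

    def activate(self) -> None:
        self.on = True

    def adjust(self, delta: float) -> float:
        # Motion-driven adjustment, clamped to the attribute's range.
        self.value = max(self.vmin, min(self.vmax, self.value + delta))
        return self.value

lamp = DeviceAttribute("intensity")
lamp.activate()
lamp.adjust(150.0)   # clamps to 100.0
```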
- Exemplary examples of devices include, without limitation, environmental controls, building systems and controls, lighting devices such as indoor and/or outdoor lights or light fixtures, cameras, ovens (conventional, convection, microwave, etc.), dishwashers, stoves, sound systems, mobile devices, display systems (TVs, VCRs, DVDs, cable boxes, satellite boxes, etc.), alarm systems, control systems, air conditioning systems (air conditioners and heaters), energy management systems, medical devices, vehicles, robots, robotic control systems, UAVs, equipment and machinery control systems, hot and cold water supply devices, heating systems, fuel delivery systems, product delivery systems, ventilation systems, air handling systems, computers and computer systems, chemical plant control systems, manufacturing plant control systems, computer operating systems and other software systems, programs, routines, objects, and/or elements, remote control systems, or the like, or mixtures or combinations thereof.
- Suitable software systems, software products, and/or software objects that are amenable to control by the interface of this invention include, without limitation, any analog or digital processing unit or units having a single software product or a plurality of software products installed thereon, where each software product has one or more adjustable attributes associated therewith, or singular software programs or systems with one or more adjustable attributes, menus, lists, or other functions or display outputs.
- Exemplary examples of such software products include, without limitation, operating systems, graphics systems, business software systems, word processor systems, business systems, online merchandising, online merchandising systems, purchasing and business transaction systems, databases, software programs and applications, internet browsers, accounting systems, military systems, control systems, or the like, or mixtures or combinations thereof.
- Software objects generally refer to all components within a software system or product that are controllable by at least one processing unit.
- Suitable processing units for use in the present invention include, without limitation, digital processing units (DPUs), analog processing units (APUs), any other technology that can receive motion sensor output and generate command and/or control functions for objects under the control of the processing unit, or mixtures and combinations thereof.
- Suitable digital processing units include, without limitation, any digital processing unit capable of accepting input from a plurality of devices and converting at least some of the input into output designed to select and/or control attributes of one or more of the devices.
- Exemplary examples of such DPUs include, without limitation, microprocessors, microcontrollers, or the like manufactured by Intel, Motorola, Ericsson, HP, Samsung, Hitachi, NRC, Applied Materials, AMD, Cyrix, Sun Microsystems, Philips, National Semiconductor, Qualcomm, or any other manufacturer of microprocessors or microcontrollers.
- Suitable analog processing units include, without limitation, any analog processing unit capable of accepting input from a plurality of devices and converting at least some of the input into output designed to control attributes of one or more of the devices. Such analog devices are available from manufacturers such as Analog Devices Inc.
- Suitable motion sensing apparatus include, without limitation, motion sensors of any form such as digital cameras, optical scanners, optical roller ball devices, touch pads, inductive pads, capacitive pads, holographic devices, laser tracking devices, thermal devices, EMF sensors, wave form sensors, any other device capable of sensing motion, changes in EMF, changes in wave form, or the like or arrays of such devices or mixtures or combinations thereof.
- In FIG. 1A-H, an embodiment of a motion sensing apparatus of this invention, generally 100, is shown to include a motion sensor 102.
- the motion sensor 102 has a field of view or active sensing zone 104, shown here as a cone. Within the field of view or active sensing zone 104, motion or movement may be detected or sensed.
- The apparatus 100 also includes a processing unit 106 in communication via communication path 108 with the motion sensor 102 for receiving output from the motion sensor 102 and generating command and/or control functions.
- An arcuate path 110 is shown. Because the path 110 is arcuate, the motion sensor 102 is capable of detecting various components of motion within the field of view or active sensing zone 104. These components include direction along the path 110, changes in direction along the path 110, velocity along the path 110, changes in the velocity along the path 110, acceleration along the path 110, and changes in acceleration along the path 110. It should be recognized that velocity and acceleration are vectorial values having a magnitude and a direction. Thus, the motion sensor 102 may also separately determine the magnitude and direction of the velocity and/or acceleration. As the motion is arcuate, the sensor 102 would generate all these types of path data.
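The vectorial quantities listed above can be recovered from raw sensor output by finite differences over timestamped position samples. A minimal sketch in Python (the sampling format and helper names are illustrative assumptions, not part of the patent):

```python
import math

def motion_components(samples):
    """Derive velocity and acceleration vectors from timestamped
    (t, x, y) position samples via finite differences."""
    velocities = []
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        dt = t1 - t0
        velocities.append((t1, (x1 - x0) / dt, (y1 - y0) / dt))
    accelerations = []
    for (t0, vx0, vy0), (t1, vx1, vy1) in zip(velocities, velocities[1:]):
        dt = t1 - t0
        accelerations.append((t1, (vx1 - vx0) / dt, (vy1 - vy0) / dt))
    return velocities, accelerations

def magnitude_and_direction(vx, vy):
    """Split a vector into a scalar magnitude and a direction in radians,
    mirroring the separate magnitude/direction determination above."""
    return math.hypot(vx, vy), math.atan2(vy, vx)
```

A processing unit could then compare successive magnitudes and directions to detect changes in velocity or acceleration along the path.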
- the processing unit may use each data element individually and/or collectively (any combination) to cause an effect such as executing a command function to control devices, software programs, and/or any other object electrically or electro-mechanically.
- the velocity or acceleration may be linear, radial (linear from a center), angular (circular, spiral, elliptical, etc.) or arcuate, or any mixture thereof, or of any type that one might use to interface with objects.
- Random motions may be used for security purposes, where such motions may be duplicated later for unlocking, securing, or providing unique identifiers for users, including using bio-kinetic signatures, where motion and biometrics (such as the joint lengths of two fingers) are used to provide unique identifiers for individuals.
- the motion sensor(s) may be able to sense movement of multiple body parts or multiple objects in the field of view.
- Each individual sensed movement corresponding to a velocity, an acceleration, a change of velocity, a change of acceleration, a rate of change of velocity, and/or a rate of change of acceleration, or any collection of movements may be used to cause the processing unit to issue a command and the nature of the command may be based on the movement of multiple body parts or objects.
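A processing unit receiving several simultaneously sensed movements might dispatch commands from a table keyed on which body parts or objects moved and in which directions. The mapping below is purely illustrative; the patent leaves the command assignment open:

```python
def choose_command(movements):
    """Map simultaneously sensed movements to one command.
    `movements`: dict of body part -> (direction, speed)."""
    key = tuple(sorted((part, direction)
                       for part, (direction, _speed) in movements.items()))
    # Hypothetical command table -- any assignment could be configured.
    table = {
        (("left_hand", "up"), ("right_hand", "up")): "raise_all_lights",
        (("left_hand", "up"),): "raise_left_lights",
    }
    return table.get(key, "no_op")
```

The speeds carried alongside each direction could further scale the magnitude of the issued command, as described for dimming and brightening below.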
- A path 112 is shown to be S-shaped. Because the path 112 is S-shaped, the motion sensor 102 will detect components of motion including direction, changes in direction, velocity, changes in velocity, acceleration, and changes in acceleration. It should be recognized that velocity and acceleration are vectorial values having a magnitude and a direction. Thus, the motion sensor 102 may also separately determine the magnitude and direction of the velocity and/or acceleration vectors. As the motion is curved, the sensor 102 would generate all these types of path data. Moreover, the processing unit may use each data element individually and/or collectively (in any combination) to cause an effect such as executing a command function to control devices, software programs, and/or any other object electrically or electro-mechanically.
- An arcuate path 114 is shown, where the path 114 includes four segments 114a, 114b, 114c, and 114d. Each segment 114a-d has an increasing velocity and an increasing acceleration, as indicated by the thickness of the lines. Because the path 114 is arcuate and includes ever increasing velocities, the motion sensor 102 is capable of detecting direction, changes in direction, velocity, changes in velocity, acceleration, and changes in acceleration. It should be recognized that velocity and acceleration are vectorial values having a magnitude and a direction. Thus, the motion sensor 102 may also separately determine the magnitude and direction of the velocity and/or acceleration vectors. As the motion is arcuate, the sensor 102 would generate all these types of path data. Moreover, the processing unit may use each data element individually and/or collectively (in any combination) to cause an effect such as executing a command function to control devices, software programs, and/or any other object electrically or electro-mechanically.
- A complex arcuate path 116 is shown, where the path 116 includes four segments 116a, 116b, 116c, and 116d. Each segment 116a-d has an increasing velocity and an increasing acceleration, as indicated by the thickness of the lines, but with different directions as compared to the path 110. Because the path 116 is arcuate and includes ever increasing velocities, the motion sensor 102 is capable of detecting direction, changes in direction, velocity, changes in velocity, acceleration, and changes in acceleration. Thus, the motion sensor 102 may also separately determine the magnitude and direction of the velocity and/or acceleration vectors. As the motion is arcuate, the sensor 102 would generate all these types of path data.
- the processing unit may use each data element individually and/or collectively (any combination) to cause an effect such as executing a command function to control devices, software programs, and/or any other virtual or real object electrically or electro-mechanically.
- The motion represents an acceleration gesture, where the totality of the parts is used to provide an output, and the uniqueness of the gesture is provided by the changes of velocity and/or acceleration within the gesture.
- a spiral motion path 118 is shown. Because the path 118 is spiral shaped, the motion sensor 102 will detect components of motion including direction, changes in direction, velocity, changes in the velocity, acceleration, and changes in acceleration. It should be recognized, the velocity and acceleration are vectorial values that have a magnitude and a direction. Thus, the motion sensor 102 may also separately determine the magnitude and direction of the velocity and/or acceleration vectors. As the motion is arcuate, the sensor 102 would generate all these types of path data.
- The processing unit may use each data element individually and/or collectively (in any combination) to cause an effect such as executing a command function to control devices, software programs, and/or any other virtual object, or any other real object such as electrical objects or electro-mechanical objects.
- A path 120 is shown, where the path 120 includes six segments 120a, 120b, 120c, 120d, 120e, and 120f. Each segment 120a-f has a different direction and a different velocity and/or acceleration, as indicated by the thickness of the lines. Because the path 120 includes different segments, the motion sensor 102 is capable of detecting direction, changes in direction, velocity, changes in velocity, acceleration, and changes in acceleration. It should be recognized that velocity and acceleration are vectorial values having a magnitude and a direction. Thus, the motion sensor 102 may also separately determine the magnitude and direction of the velocity and/or acceleration vectors. As the motion proceeds, the sensor 102 would generate all these types of path data. Moreover, the processing unit may use each data element individually and/or collectively (in any combination) to cause an effect such as executing a command function to control devices, software programs, and/or any other object electrically or electro-mechanically.
- The motion sensor 102 is capable of detecting the direction of the motion, the velocity, the acceleration, changes in direction, changes in velocity, changes in acceleration, rates of velocity changes, and/or rates of acceleration changes. It should be recognized that velocity and acceleration are vectorial values having a magnitude and a direction. Thus, the motion sensor 102 may also separately determine the magnitude and direction of the velocity and/or acceleration vectors.
- the processing unit may use each data element individually and/or collectively (any combination) to cause an effect such as executing a command function to control devices, software programs, and/or any other object electrically, optically, or electro-mechanically, or through any other medium by which commands or information may be communicated.
- The path 122 may have a smooth change in velocity, where the processing unit, the sensor, or both interpret the path 122 as indicating a constantly changing velocity or acceleration, which may cause the processing unit to issue a command different from that issued for a series of segments, each segment having a constant velocity different from the previous or later segment.
- A gesture 124 is shown, where the gesture 124 includes a sequence of segments 124a, 124b, 124c, and 124d having different directions, different velocities, and/or different accelerations, illustrated by different line thicknesses. While the gesture 124 shown here includes segments 124a-d that increase in velocity and/or acceleration and change in direction in going from 124a to 124d, the segments 124a-d may have any direction, velocity, and/or acceleration change profile, where each combination of directions, velocities, and/or accelerations may represent a different gesture. Thus, a gesture including motion up, right, down, and left may represent a number of different gestures depending upon the velocity and/or acceleration of each segment.
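The idea that one directional shape yields distinct gestures under different velocity profiles can be captured by folding a speed class into each segment. A sketch (the speed threshold is an assumed tuning value, not from the patent):

```python
def gesture_signature(segments, slow=0.5):
    """Encode a gesture as (direction, speed-class) pairs so that
    up-right-down-left traced slowly differs from the same shape
    traced quickly."""
    return tuple((direction, "slow" if speed < slow else "fast")
                 for direction, speed in segments)
```

Two gestures with identical directions but different velocity profiles then produce different signatures and can be bound to different commands.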
- an embodiment of a motion sensing apparatus of this invention is shown to include a motion sensor 202.
- The motion sensor 202 has a field of view or active sensing zone 204, shown here as a dashed circle. Within the field of view or active sensing zone 204, motion or movement may be detected or sensed, as the active zone 204 points in the +z direction, the -z direction, or both the +z and -z directions.
- A first entity, or a first entity object 206 under the control of the first entity in the real world, is sensed by the motion sensor 202 moving in a first direction 208, here shown as motion in the x-direction.
- The system 200 also includes a second entity, or a second entity object 210 under the control of the second entity in the real world, that is sensed by the motion sensor 202 moving in a second direction 212, here shown as motion in the y-direction.
- the apparatus 200 also includes a processing unit 214 in communication with the sensor 202 via a communication path 216. While in this figure, the two directions are in the x-direction and y-direction, the two directions do not have to be different nor at right angles to each other.
- the two sensed motions or movements may result in separate sensor output signals or a combined sensor output signal, where the separate and/or combined sensor output signals are used by the processing unit or units to generate a command and/or control functions as set forth above.
- One of the separate sensor outputs could be used by the processing unit to generate a command and/or control function, while the second could be used as a confirmation of the function, or could cause a modification of the function, a further specification of the function, or rejection of the function.
- the two motions could be separated by a delay so that the second motion would represent a confirmatory motion or a rejection of the selection.
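Such a delayed second motion could be resolved as follows; the gesture names and the timing threshold are illustrative assumptions, not taken from the patent:

```python
def resolve_selection(second_motion, delay, max_delay=2.0):
    """Interpret a second motion arriving `delay` seconds after a
    selection: confirm, reject, or modify it, or ignore it if too late."""
    if delay > max_delay:
        return "ignored"             # too late to count against the selection
    if second_motion == "nod":       # assumed confirmatory gesture
        return "confirmed"
    if second_motion == "wave_off":  # assumed rejection gesture
        return "rejected"
    return "modified"                # any other motion refines the function
```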
- an embodiment of a motion sensing apparatus of this invention is shown to include four motion sensors 302a-d having fields of view or active sensing zones 304a-d and a processing unit 306 in active communication with the sensors 302a-d via communication pathways 308a-d.
- The four sensors 302a-d comprise a sensor array 310.
- The sensor array 310 is shown here with all sensors 302a-d having their active zones 304a-d pointing in only one direction, +x. Of course, it should be recognized that the sensor array 310 may have any desired unidirectional configuration.
- An embodiment of a motion sensing apparatus of this invention is shown to include four motion sensors 302a-d having fields of view or active sensing zones 304a-d and a processing unit 306 in active communication with the sensors 302a-d via communication pathways 308a-d.
- The four sensors 302a-d comprise a sensor array 312.
- The sensor array 312 is shown here with the four sensors 302a-d having their active zones 304a-d pointing in four different directions, +x, -x, +y, and -y, respectively.
- the sensor array 312 may have any desired four directional configuration.
- In FIG. 3C, an embodiment of a motion sensing apparatus of this invention, generally 300, is shown to include nine motion sensors 302a-i having fields of view or active sensing zones 304a-i and a processing unit 306 in active communication with the sensors 302a-i via communication pathways 308a-i. Within the fields of view or active sensing zones 304a-i, motion or movement may be detected or sensed by the respective sensors 302a-i.
- The nine sensors 302a-i comprise a sensor array 314.
- The sensor array 314 is shown here with the nine sensors 302a-i having their active zones 304a-i pointing in nine different directions, +x, +x-y, -y, -x-y, -x, -x+y, +y, +x+y, and +z.
- The apparatus 300 may also include a tenth motion sensor 302j (not shown) having an active zone 304j (not shown) pointing in the -z direction.
- The sensor array 314 may have any desired directional configuration.
- An embodiment of a motion sensing apparatus of this invention is shown to include a motion sensor 302 having a field of view or active sensing zone 304 and a processing unit 306 in active communication with the sensor 302 via direct contact.
- the motion sensor 302 has a field of view or active sensing zone 304, shown here as a hemisphere. Within the field of view or active sensing zone 304, motion or movement may be detected or sensed.
- the apparatus 300 is mounted on a wall or a ceiling 316.
- In FIG. 4A, the apparatuses and systems of this invention are used to control lights in a room 400 including a left wall 402, a right wall 404, a bottom wall 406, and a top wall 408.
- the left wall 402 includes lights 410; the right wall 404 includes lights 412; the bottom wall 406 includes lights 414; and the top wall 408 includes lights 416.
- the user has already used the apparatuses and systems of this invention to select lights in the room 400, instead of a sound system, a TV system, a security system, or any other controllable system associated with room 400 and controllable from within the room 400.
- All of the lights 410, 412, 414, and 416 are in their maximum intensity state. It should be recognized that the starting point of each light may be the same or different, and the effect of the motion will proportionally change the intensity of each light in accord with the properties of the motion.
- The apparatuses or systems of this invention recognize motion 418 in a downward direction to the right of a center of the room 400.
- The motion 418 is at a constant velocity with no acceleration, causing all of the left wall lights 410 to dim based on the velocity of the motion 418.
- Slower downward motion would cause the lights 410 to dim less, while faster downward motion would cause the lights 410 to dim more.
- The user could also start the motion and hold, which would cause the lights to dim until the user moves again, at which point the dimming would stop.
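That speed-proportional dimming with a start-and-hold continuation could be modeled as below; the rate constant is an assumed tuning parameter, not from the patent:

```python
def dim_level(start_level, speed, moving, hold_time, rate=0.1):
    """Dim proportionally to the motion's speed; if the user starts
    the motion and then holds still, keep dimming for the hold time."""
    level = start_level - speed * rate
    if not moving:                 # motion started, then held
        level -= speed * rate * hold_time
    return max(0.0, level)         # intensity cannot go below off
```

Brightening (FIG. 5A below) would be the mirror image, adding rather than subtracting and clamping at full intensity.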
- The apparatuses or systems of this invention recognize motion 420 in a concave downward direction to the right of a center of the room 400.
- The motion 420 is at a constant angular velocity with no angular acceleration, causing the left wall lights 410 to dim in a pattern 422, which differentially dims the lights 410 from the left wall edges to its center, with the greatest dimming at the center of the wall 402 and the least dimming at the edges of the wall 402.
- The apparatuses or systems of this invention recognize motion 424 in a convex downward direction to the right of a center of the room 400.
- The motion 424 is at a constant angular velocity with no angular acceleration, causing the left wall lights 410 to dim in a pattern 426, which differentially dims the lights 410 from the left wall edges to its center, with the greatest dimming at the edges of the wall 402 and the least dimming at the center of the wall 402.
- The apparatuses or systems of this invention recognize motion 428 in a variable convex downward direction to the right of a center of the room 400.
- the motion 428 is variable in that the angular velocity increases as the motion proceeds downward, i.e., the motion 428 includes angular acceleration.
- The motion 428 causes the left wall lights 410 to dim in a pattern 430, which differentially dims the lights 410 from the left wall edges to its center, with the greatest dimming at the lower edge, less dimming at the upper edge, and the least dimming at the center of the wall 402.
- the dimming pattern of the lights conforms to the changes in the velocity of the motion.
- the apparatuses and systems will adjust the intensity of the lights accordingly. Therefore, the user can achieve very complex lighting configurations simply by changing the motion properties sensed by the motion sensor(s).
- The apparatuses or systems of this invention recognize motion 432 in a sinusoidal downward motion to the right of a center of the room 400.
- The motion 432 is at a constant angular velocity with no angular acceleration, causing the left wall lights 410 to dim in a pattern 434, which differentially dims the lights 410 from the left wall edges to its center in conformity with the closeness of the motion 432 to each of the lights 410.
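Dimming in conformity with the closeness of the path to each light might be approximated by the closest approach of the sampled path, as in this sketch (the coordinates and the `max_dim` scale are illustrative assumptions):

```python
import math

def pattern_dim(light_positions, path_points, max_dim=0.8):
    """Dim each light according to how close the sensed motion path
    passes to it: the nearer the path, the deeper the dimming."""
    levels = []
    for lx, ly in light_positions:
        # closest approach of the sampled path to this light
        d = min(math.hypot(lx - px, ly - py) for px, py in path_points)
        levels.append(max(0.0, 1.0 - max_dim / (1.0 + d)))
    return levels
```

A sinusoidal path such as 432 would then leave a rippled intensity pattern along the wall, lights near the crests dimmed most.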
- the apparatuses and systems will adjust the intensity of the lights accordingly. Therefore, the user can achieve very complex lighting configurations simply by changing the motion properties sensed by the motion sensor(s).
- In FIG. 5A, the apparatuses and systems of this invention are used to control lights in a room 500 including a left wall 502, a right wall 504, a bottom wall 506, and a top wall 508.
- the left wall 502 includes lights 510; the right wall 504 includes lights 512; the bottom wall 506 includes lights 514; the top wall 508 includes lights 516.
- the user has already used the apparatuses and systems of this invention to select lights in the room 500, instead of a sound system, a TV system, a security system, or any other controllable system associated with room 500 and controllable from within the room 500.
- All of the lights 510, 512, 514, and 516 are in their minimum intensity or off state. It should be recognized that the starting point of each light may be the same or different, and the effect of the motion will proportionally change the intensity of each light in accord with the properties of the motion.
- The apparatuses or systems of this invention recognize motion 518 in an upward direction to the right of a center of the room 500.
- The motion 518 is at a constant velocity with no acceleration, causing all of the left wall lights 510 to brighten based on the velocity of the motion 518.
- Slower upward motion would cause the lights 510 to brighten less, while faster upward motion would cause the lights 510 to brighten more.
- The user could also start the motion and hold, which would cause the lights to brighten until the user moves again, at which point the brightening would stop.
- The apparatuses or systems of this invention recognize motion 520 in a circular direction.
- The motion 520 is at a constant angular velocity with no angular acceleration, causing all of the lights 510, 512, 514, and 516 to brighten based on the velocity of the motion 520.
- Slower motion would cause the lights to brighten less, while faster motion would cause the lights to brighten more.
- The user could also start the motion and hold, which would cause the lights to brighten until the user moves again, at which point the brightening would stop.
- The apparatuses or systems of this invention recognize motion 522 in a variable circular direction.
- The motion 522 has a variable angular velocity or an angular acceleration, causing all of the lights 510, 512, 514, and 516 to brighten based on the variable velocity or acceleration properties of the motion 522.
- The velocity starts out high and continuously decreases, so that the lights 510, 512, 514, and 516 brighten accordingly.
- the apparatuses and systems will adjust the intensity of the lights accordingly. Therefore, the user can achieve very complex lighting configurations simply by changing the motion properties sensed by the motion sensor(s).
- The apparatuses and systems of this invention are used to control lights in a room 600 including a left wall 602, a right wall 604, a bottom wall 606, a top wall 608, and a ceiling 610.
- the left wall 602 includes lights 612; the right wall 604 includes lights 614; the bottom wall 606 includes lights 616; the top wall 608 includes lights 618, and the ceiling 610 includes lights 620.
- the user has already used the apparatuses and systems of this invention to select lights in the room 600, instead of a sound system, a TV system, a security system, or any other controllable system associated with room 600 and controllable from within the room 600.
- The apparatuses or systems of this invention recognize motion 622 in an upward pointing spiral (not visible from the flat perspective of the figure).
- The motion 622 has a regular spiral angular velocity, causing all of the lights 612, 614, 616, 618, and 620 to brighten in a pattern 624 in accord with the upward pointing spiral motion 622.
- the apparatuses and systems will adjust the intensity of the lights accordingly. Therefore, the user can achieve very complex lighting configurations simply by changing the motion properties sensed by the motion sensor(s).
- The apparatuses, systems, and methods of this invention may be used to select and simultaneously control one, a plurality, or all objects and/or attributes associated with the objects in accord with the nature of the motions.
- the motion properties can be used to differentially control the objects and/or attributes associated therewith in conformity to the motions.
- Each property of the motions may be used to control all of the objects based on distance, direction, velocity, acceleration, and/or changes thereof, so that complex selection and control of the objects can occur quickly, effectively, and efficiently.
- The previous figures and associated description are designed to illustrate the control of a large number of devices using properties and/or characteristics of the sensed motion including, without limitation, the relative distance of the motion from each object (real, like a person in a room using his/her hand as the object whose motion is being sensed, or virtual representations of the objects in a virtual or rendered room on a display apparatus), direction of motion, speed of motion, acceleration of motion, changes in any of these properties, rates of change in any of these properties, or mixtures and combinations thereof, to control a single controllable attribute of the objects such as lights.
- the systems, apparatuses, and methods of this invention are also capable of using motion properties and/or characteristics to control two, three, or more attributes of an object.
- the systems, apparatuses, and methods of this invention are also capable of using motion properties and/or characteristics from a plurality of moving objects within a motion sensing zone to control different attributes of a collection of objects.
- The motion properties and/or characteristics may be used to simultaneously change color and intensity of the lights, or one sensed motion could control intensity while another sensed motion could control color.
- The motion properties and/or characteristics would allow an artist to control the pixel properties of each pixel on the display using the properties of one, two, three, etc. sensed motions.
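Driving two attributes from two separately sensed motions might look like the sketch below, where one motion's speed adjusts intensity and the other's shifts hue; the speed-to-delta scalings are illustrative assumptions:

```python
def apply_motions(pixel, intensity_motion, color_motion):
    """Update a pixel from two independent sensed motions:
    one controls intensity (clamped to [0, 1]), the other
    controls colour as a hue angle wrapped to [0, 360)."""
    intensity = min(1.0, max(0.0,
        pixel["intensity"] + intensity_motion["speed"] * 0.1))
    hue = (pixel["hue"] + color_motion["speed"] * 10.0) % 360.0
    return {"intensity": intensity, "hue": hue}
```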
- The systems, apparatuses, and methods of this invention are capable of converting the motion properties associated with each and every object being controlled based on the instantaneous property values as the motion traverses the objects in real space or virtual space.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- User Interface Of Digital Computer (AREA)
- Image Analysis (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201361885453P | 2013-10-01 | 2013-10-01 | |
PCT/US2014/058706 WO2015051046A1 (en) | 2013-10-01 | 2014-10-01 | Apparatuses for controlling electrical devices and software programs and methods for making and using same |
Publications (2)
Publication Number | Publication Date |
---|---|
EP3052945A1 true EP3052945A1 (de) | 2016-08-10 |
EP3052945A4 EP3052945A4 (de) | 2017-05-03 |
Family
ID=52779121
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP14851104.1A Ceased EP3052945A4 (de) | 2013-10-01 | 2014-10-01 | Vorrichtungen zur steuerung von elektrischen vorrichtungen und software-programmen sowie verfahren zur herstellung und verwendung davon |
Country Status (6)
Country | Link |
---|---|
EP (1) | EP3052945A4 (de) |
JP (1) | JP6749837B2 (de) |
KR (2) | KR102408940B1 (de) |
CN (1) | CN105814442A (de) |
CA (1) | CA2926193A1 (de) |
WO (1) | WO2015051046A1 (de) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3583827B1 (de) * | 2017-02-16 | 2020-07-15 | Signify Holding B.V. | Steuergerät zur anzeige des vorhandenseins eines virtuellen objekts über eine beleuchtungsvorrichtung und verfahren dafür |
JP6560801B1 (ja) * | 2018-09-26 | 2019-08-14 | 株式会社Cygames | プログラム、電子装置、及び方法 |
US11507096B2 (en) * | 2020-02-11 | 2022-11-22 | Sphero, Inc. | Method and system for controlling movement of a device |
CN113064359A (zh) * | 2021-06-02 | 2021-07-02 | 北京奇岱松科技有限公司 | 模型建立方法和实体控制方法、装置、设备和介质 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1483653A1 (de) * | 2002-03-08 | 2004-12-08 | Revelations in Design, LP | Steuerkonsole für elektrische geräte |
US20090241052A1 (en) * | 2008-03-19 | 2009-09-24 | Computime, Ltd. | User Action Remote Control |
WO2010009575A1 (en) * | 2008-07-24 | 2010-01-28 | Lite-On It Corporation | Lighting system |
WO2011007325A1 (en) * | 2009-07-15 | 2011-01-20 | Koninklijke Philips Electronics N.V. | Luminaire with touch pattern control interface |
US20110301934A1 (en) * | 2010-06-04 | 2011-12-08 | Microsoft Corporation | Machine based sign language interpreter |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4406826B2 (ja) * | 2003-09-30 | 2010-02-03 | 東芝ライテック株式会社 | 照明制御システム |
JP2007334737A (ja) * | 2006-06-16 | 2007-12-27 | Canon Inc | 情報処理装置及び情報処理方法 |
US9050528B2 (en) * | 2006-07-14 | 2015-06-09 | Ailive Inc. | Systems and methods for utilizing personalized motion control in virtual environment |
US9772689B2 (en) * | 2008-03-04 | 2017-09-26 | Qualcomm Incorporated | Enhanced gesture-based image manipulation |
US7996793B2 (en) * | 2009-01-30 | 2011-08-09 | Microsoft Corporation | Gesture recognizer system architecture |
JP5434638B2 (ja) * | 2010-01-29 | 2014-03-05 | ソニー株式会社 | Information processing apparatus and information processing method |
IL204436A (en) * | 2010-03-11 | 2016-03-31 | Deutsche Telekom Ag | A system and method for remote control of online TV by waving hands |
US8457353B2 (en) * | 2010-05-18 | 2013-06-04 | Microsoft Corporation | Gestures and gesture modifiers for manipulating a user-interface |
JP6261984B2 (ja) * | 2011-02-04 | 2018-01-17 | Koninklijke Philips N.V. | Gesture-controllable system using proprioception to create an absolute frame of reference |
US9030303B2 (en) * | 2011-03-30 | 2015-05-12 | William Jay Hotaling | Contactless sensing and control system |
US9218058B2 (en) * | 2011-06-16 | 2015-12-22 | Daniel Bress | Wearable digital input device for multipoint free space data collection and analysis |
JP5547139B2 (ja) * | 2011-07-29 | 2014-07-09 | 株式会社東芝 | Recognition apparatus, method, and program |
JP5770654B2 (ja) * | 2012-02-16 | 2015-08-26 | シャープ株式会社 | Screen display device, control method thereof, program, and computer-readable recording medium |
2014
- 2014-10-01 CA CA2926193A patent/CA2926193A1/en not_active Abandoned
- 2014-10-01 KR KR1020217017559A patent/KR102408940B1/ko active IP Right Grant
- 2014-10-01 JP JP2016546874A patent/JP6749837B2/ja active Active
- 2014-10-01 CN CN201480063274.3A patent/CN105814442A/zh active Pending
- 2014-10-01 KR KR1020167011548A patent/KR20160092993A/ko active Application Filing
- 2014-10-01 EP EP14851104.1A patent/EP3052945A4/de not_active Ceased
- 2014-10-01 WO PCT/US2014/058706 patent/WO2015051046A1/en active Application Filing
Non-Patent Citations (1)
Title |
---|
See also references of WO2015051046A1 * |
Also Published As
Publication number | Publication date |
---|---|
JP2016541074A (ja) | 2016-12-28 |
JP6749837B2 (ja) | 2020-09-02 |
EP3052945A4 (de) | 2017-05-03 |
KR102408940B1 (ko) | 2022-06-14 |
KR20210072828A (ko) | 2021-06-17 |
WO2015051046A1 (en) | 2015-04-09 |
CA2926193A1 (en) | 2015-04-09 |
KR20160092993A (ko) | 2016-08-05 |
CN105814442A (zh) | 2016-07-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20190391664A1 (en) | Apparatuses for controlling electrical devices and software programs and methods for making and using same | |
US11886694B2 (en) | Apparatuses for controlling unmanned aerial vehicles and methods for making and using same | |
US10901578B2 (en) | Selection attractive interfaces, systems and apparatuses including such interfaces, methods for making and using same | |
US10263967B2 (en) | Apparatuses, systems and methods for constructing unique identifiers | |
US11972609B2 (en) | Interfaces, systems and apparatuses for constructing 3D AR environment overlays, and methods for making and using same | |
EP3053008B1 (de) | Selection-attractive interfaces and systems including such interfaces |
EP3384367A1 (de) | Motion-based interface systems and apparatuses with directionally activatable attributes or attribute control objects, and methods for making and using same |
US10628977B2 (en) | Motion based calendaring, mapping, and event information coordination and interaction interfaces, apparatuses, systems, and methods making and implementing same | |
JP6749837B2 (ja) | Methods and systems using motion sensors having active sensing zones |
AU2014329561A1 (en) | Apparatuses for controlling electrical devices and software programs and methods for making and using same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20160408 |
|
AK | Designated contracting states |
Kind code of ref document: A1 |
Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
RIN1 | Information on inventor provided before grant (corrected) |
Inventor name: JOSEPHSON, JONATHAN |
|
DAX | Request for extension of the european patent (deleted) | ||
A4 | Supplementary search report drawn up and despatched |
Effective date: 20170330 |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: G06F 3/01 20060101AFI20170325BHEP |
Ipc: H05B 37/02 20060101ALI20170325BHEP |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20180226 |
|
REG | Reference to a national code |
Ref country code: DE |
Ref legal event code: R003 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED |
|
18R | Application refused |
Effective date: 20190628 |