WO2016036543A1 - Multi-surface controller - Google Patents
- Publication number
- WO2016036543A1 (PCT/US2015/046790)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- button
- controller
- thumb
- buttons
- index
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/24—Constructional details thereof, e.g. game controllers with detachable joystick handles
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03543—Mice or pucks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03547—Touch pads, in which fingers can move on a surface
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C23/00—Non-electrical signal transmission systems, e.g. optical systems
- G08C23/04—Non-electrical signal transmission systems, e.g. optical systems using light waves, e.g. infrared
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/18—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 for optical projection, e.g. combination of mirror and condenser and objective
- G02B27/20—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 for optical projection, e.g. combination of mirror and condenser and objective for imaging minute objects, e.g. light-pointer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/033—Indexing scheme relating to G06F3/033
- G06F2203/0333—Ergonomic shaped mouse for one hand
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/038—Indexing scheme relating to G06F3/038
- G06F2203/0381—Multimodal input, i.e. interface arrangements enabling the user to issue commands by simultaneous use of input devices of different nature, e.g. voice plus gesture on digitizer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04104—Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
- G06F3/167—Audio in a user interface, e.g. using voice commands for navigating, audio feedback
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C2201/00—Transmission systems of control signals via wireless link
- G08C2201/30—User interface
- G08C2201/32—Remote control based on movements, attitude of remote control device
Definitions
- the present specification relates to a control device.
- a personal computer or workstation may use an electronic mouse as a control device. Regardless of different shapes, sizes, and designs, most of the electronic mice are held by human hands to move on a flat surface, and the movement of the electronic mouse is produced by movements of the wrist and/or arm of a user. Either a rubberized ball or a laser beneath the electronic mouse is used to measure the displacement over the underlying surface, which is then translated to the movement of a cursor on the computer display.
- a traditional mouse may include two or three actuator buttons on the top surface, which may be clicked by the index finger and/or middle finger to control the operations of the computer. The thumb in most cases is just for the purpose of holding and moving the electronic mouse.
- a wireless mouse differs from the wired mouse (serial or USB) in communication methods used to connect the electronic mouse to the computer, while the wireless mouse still requires a flat surface to operate on and thereby causes similar health issues for the users.
- Personal computers and workstations may include separate hardware/software units to implement various accessory functionalities that include, but are not limited to, a mouse, a keyboard, a microphone and speakers, and a monitor.
- a user may have to switch among various interface devices to interact with a computer, for example using a mouse for navigation, a keyboard for text input, a microphone and speakers for giving a voice command or receiving the computer’s audio output, and a monitor for receiving the computer’s visual output.
- multiple electronic devices may be used at home, in different locations using different platforms. A user may have to walk around the house to approach different devices for different purposes.
- the user may sit in front of a desktop computer or a laptop to work on documents or do some reading, browse the Internet using a tablet in the bedroom, go to the living room and sit in front of a flat screen television to watch TV or play video games, or use a smart phone to text and make phone calls wherever the user goes. Therefore, the user experience across this variety of computing/electronic systems is segmented.
- a control device is provided for accommodating multiple interfacing requirements.
- a control device is provided for controlling multiple computing/electronic devices.
- the controller includes multiple buttons with touch sensors connected at the top of the buttons, forming touch pads.
- the buttons and touch pads detect various finger operations caused by the movements and/or gestures of fingers, and translate the detected finger operations into control commands for controlling various devices.
- the controller does not need a flat surface to operate on, and thus allows the user to hold and manipulate the controller in a natural posture without stressing his neck, shoulder, back, and/or wrist.
- the controller communicates with a variety of computing devices and/or electronic devices via wireless and/or wired communication.
- the controller includes a thumb button with a thumb touch pad for the thumb to operate, and two buttons, each having a touch pad, for the index and middle finger to operate, respectively.
- the two buttons that are dedicated for the index finger and middle finger may be combined into one button.
- the buttons detect click operations and the touch pads detect touch operations, when the fingers move and/or make gestures on the buttons and/or touch pads.
- some or all of the buttons’ resistance may be adjustable.
- buttons may have haptic/tactile feedback.
- the controller can recognize handwriting through the thumb’s moving trajectory on the thumb touch pad.
- the controller may operate in a variety of user selectable configurations and/or modes, according to which the click and/or touch operations of the fingers are translated to various control commands for controlling the computing devices and/or other devices.
- the controller generates control commands based on a combination of movements of different fingers on different buttons and/or touch pads.
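- As an illustration only (not taken from the specification), this combination logic might be modeled as a lookup from simultaneous finger events to a command; every event name and table entry below is a hypothetical example:

```python
# Hypothetical sketch of combining finger operations into control commands.
# The event names and the command table are illustrative assumptions.

FINGER_COMBO_COMMANDS = {
    # (thumb event, index event, middle event) -> control command
    ("swipe_left", None, None): "PAGE_BACK",
    ("touch_hold", "scroll_up", None): "ZOOM_IN",
    (None, "scroll_up", "scroll_up"): "FAST_SCROLL",
}

def translate(thumb_event, index_event, middle_event):
    """Return the control command for a combination of detected finger
    events, or NO_COMMAND if the combination is not mapped."""
    key = (thumb_event, index_event, middle_event)
    return FINGER_COMBO_COMMANDS.get(key, "NO_COMMAND")

print(translate("swipe_left", None, None))  # -> PAGE_BACK
```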
- the controller includes a microphone for receiving voice commands that are translated into control commands.
- the controller includes a light that lights up in different manners for indicating different types of information (e.g., power status, operation status, etc.).
- the controller includes a laser for presentation and/or pointing purposes.
- the controller includes at least one connector for connecting to external devices such as a headset and/or a set of goggles.
- the controller includes motion sensors for detecting the motion and/or orientation of the controller, which may be interpreted individually or in combination with other detected signals for generating control commands. In at least one embodiment when the controller is equipped with motion sensors, the controller may recognize handwriting based on the controller’s moving trajectory in the air.
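- A minimal sketch of how in-air handwriting capture might work, assuming hypothetical sampling and a stubbed recognizer (the specification does not describe a specific algorithm):

```python
class AirHandwritingCapture:
    """Accumulates (x, y) positions derived from the motion sensors
    while writing in the air, then hands the trajectory to a
    handwriting recognizer (stubbed below)."""
    def __init__(self):
        self.trajectory = []

    def on_sample(self, x, y):
        # Called for each motion-sensor sample; units are arbitrary here.
        self.trajectory.append((x, y))

    def finish(self):
        points, self.trajectory = self.trajectory, []
        return recognize(points)

def recognize(points):
    # Stub: a real recognizer would classify the stroke as a character.
    return f"<{len(points)}-point stroke>"

cap = AirHandwritingCapture()
for p in [(0, 0), (1, 2), (2, 4)]:
    cap.on_sample(*p)
print(cap.finish())  # -> <3-point stroke>
```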
- the controller includes a touchscreen that may display contents for the user to interact with.
- the controller may control a soft keyboard on a display for inputting text and/or other content.
- the controller may be mirror symmetric so that it can be manipulated by a right hand as well as a left hand.
- the controller serves as a human-machine interface for the user to interact with multiple computing/electronic devices.
- the controller is configured to remotely control personal computers, laptop computers, workstations, smart phones, tablet computers, mobile devices, and/or televisions (TV), for example.
- the controller provides a single display platform for a variety of different display devices (e.g., a flat screen TV, a projector, a set of projection goggles, and/or a headset).
- FIG. 1 shows an embodiment of a controller
- FIG. 2 shows a top view of an embodiment of the controller of FIG. 1;
- FIG. 3 shows a side view of an embodiment of the controller of FIG. 1;
- FIG. 4 shows a bottom view of an embodiment of the controller of FIG. 1;
- FIG. 5 shows a view of an embodiment of the controller having a light
- FIG. 6 shows a view of an embodiment of the controller having a laser
- FIG. 7A shows a view of an embodiment of the controller having a microphone
- FIG. 7B shows a view of the tail end of an embodiment of the controller of FIG. 7A
- FIG. 8 shows a view of another embodiment of a controller having two buttons
- FIG. 9 shows a view of an embodiment of the controller held in a right hand
- FIG. 10 shows an example of a user manipulating the controller to activate a scrolling operation
- FIG. 11 shows an example of a user manipulating the controller to activate a swiping operation
- FIG. 12 shows an example of a user manipulating the controller to activate a zoom-out operation
- FIG. 13 shows an example of a user manipulating the controller to activate a zoom-in operation
- FIG. 14 shows an example of a user manipulating the controller to activate a switch operation
- FIG. 15 shows an example of a user manipulating the controller to activate a rotation operation
- FIG. 16 shows an example of a user manipulating the controller using the right hand
- FIG. 17 shows an example of a user manipulating the controller using the left hand
- FIG. 18 shows a view of an embodiment of the thumb touch pad having different regions for performing different operations
- FIG. 19 shows a view of an embodiment of the thumb button that is configured to perform operations during the text input mode
- FIG. 20 shows a view of an embodiment of the thumb button that is configured, in combination with index button, to perform operations during the text input mode
- FIG. 21 shows a view of an embodiment of the thumb touch pad that recognizes handwriting
- FIG. 22 shows a view of an embodiment of the thumb button having a touchscreen
- FIG. 23 shows a view of an embodiment of the controller connected to a headset
- FIG. 24 shows a view of an embodiment of the controller connected to a set of goggles
- FIG. 25 shows a block diagram of display systems that may be controlled by the controller
- FIG. 26 shows a block diagram of an embodiment of the controller of FIG. 1;
- FIG. 27A is a flowchart of an embodiment of a method of using the controller to receive user inputs
- FIG. 27B is a flowchart of an embodiment of a method of using the controller in various modes
- FIG. 27C is a flowchart of an embodiment of a method of using the controller to display video and/or audio data
- FIG. 28 is a flowchart of an embodiment of a method of assembling the controller.
- FIG. 29 shows a cross-sectional side view of an embodiment of the controller.
- at the beginning of the discussion of each of FIGs. 1-26 and 29 is a brief description of each element, which may have no more than the name of each of the elements in the figure being discussed. After the brief description of each element, each element is further discussed in numerical order. In general, each of FIGs. 1-29 is discussed in numerical order, and the elements within FIGs. 1-29 are also usually discussed in numerical order, to facilitate easily locating the discussion of a particular element. Nonetheless, there is no one location where all of the information about any element of FIGs. 1-29 is necessarily located. Unique information about any particular element or any other aspect of any of FIGs. 1-29 may be found in, or implied by, any part of the specification.
- FIG. 1 shows an embodiment of a controller 100.
- Controller 100 includes at least a housing 101, a top surface 102, a front slope 103, a concave area 104, a thumb button 105, an index button 106, a middle button 107, a tail end 108, and a head end 109.
- the controller 100 may not have all of the elements or features listed and/or may have other elements or features instead of or in addition to those listed.
- FIG. 1 shows a perspective view of a controller that has multiple buttons and/or touch pads for detecting movement and/or gestures of fingers.
- Controller 100 is a handheld control device that is configured to control multiple computing devices and/or electronic devices to perform functions such as, but not limited to, pointing, selection, zooming, rotation, menu activation, navigation, and inputting contents and/or commands.
- the controller 100 includes sensors that are connected to multiple buttons for detecting finger movements and/or gestures.
- the controller 100 has an ergonomic shape, and is mirror symmetric and thus capable of accommodating the left hand (e.g., in a left hand mode) and a right hand (e.g., in a right hand mode).
- a left hand mode and a right hand mode are provided.
- the controller 100 does not need to be rested on a flat surface (e.g., table surface) to operate, and thus minimizes stress and fatigue of the muscles of the user.
- the controller 100 does not include a rubberized ball or a laser that detects the motion of the controller 100 with respect to a flat surface.
- the controller 100 may implement operations that can be performed by a traditional electronic mouse, as well as many other operations.
- the controller 100 communicates with various computing/electronic devices wirelessly and/or via cables or wires.
- the controller 100 provides an integrated user interface for the user to interact with various computing devices, and thus enhances user experience.
- the terms “controller,” “controlling device,” “control device,” “pointing device,” “mouse,” and “interface device” are used interchangeably, and may be substituted one for the other to obtain different embodiments.
- Housing 101 is a housing that retains and/or supports the mechanical and electronic components of the controller 100.
- the housing 101 has a curved handle shape that fits the arches of a half-way closed human hand.
- the housing 101 may have different sizes for different sized hands.
- the housing 101 is mirror symmetric.
- the housing 101 may include other shapes and/or structures.
- the housing 101 may have a ball shape or close to a ball shape with two buttons (or one button) located in the front end and one button on the top.
- Top surface 102 is the top surface of the housing 101, in the orientation as shown in FIG. 1. In at least one embodiment, when the controller 100 is held in the hand of a user, the thumb may rest on the top surface 102. In at least one embodiment, the top surface 102 is defined as the surface that is in contact with the thumb when the user holds the controller 100. In other embodiments, top surface 102 may face in different directions when the controller 100 is in normal use.
- Front slope 103 is a slope on a head end of the controller 100, extending at an angle from the top surface 102 of the controller 100. In at least one embodiment, the front slope 103 is located between a thumb button on the top surface 102 and an index button on the head end of the controller 100. In at least one embodiment, the front slope 103 is curved.
- Concave area 104 is a concave area on a bottom surface of the housing 101.
- the ring finger and the little finger of the hand wrap underneath the concave area 104 for providing better support and grip.
- the terms “concave area,” “concave,” and “trough” are used interchangeably, and may be substituted one for the other to obtain different embodiments.
- Thumb button 105 is a button that is located on the top surface 102, close to the head end of the controller 100.
- the thumb button 105 has a circular or oval shape.
- when the user holds the controller 100 with his right hand or left hand, the thumb of the right hand or left hand rests on the thumb button 105 in a natural posture, and may click and/or move on the thumb button 105.
- the thumb button 105 is connected to a button switch and/or other sensors for detecting click operations on the thumb button 105 (e.g., single click, double click, holding down, etc.).
- the resistance of the buttons of controller 100 in response to user’s click operations is adjustable.
- a pressure threshold determines when the click/pressing is registered by the button (the lower the threshold, the less resistance of the button, and the more sensitive the button).
- the user may choose and/or change the threshold of the buttons of the controller 100 so as to adjust the sensitivity of the buttons in different operation modes or circumstances (e.g., the user may need more sensitive buttons to play a video game than in the regular control mode or TV mode).
- there is a game mode and one or more other modes (e.g., a TV mode and a control mode) for performing other types of tasks; when in the game mode, the default pressure threshold for the buttons is preset lower than the default threshold during the control mode and/or TV mode.
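- A minimal sketch of this threshold logic, with hypothetical pressure units and default values (the actual thresholds are user-selectable per the description):

```python
# Hypothetical per-mode pressure thresholds; a lower threshold means
# less button resistance and a more sensitive button.
DEFAULT_THRESHOLDS = {
    "game": 0.2,     # game mode defaults lower than control/TV mode
    "control": 0.5,
    "tv": 0.5,
}

def click_registered(pressure, mode, thresholds=DEFAULT_THRESHOLDS):
    """A press counts as a click only when the measured pressure
    exceeds the threshold configured for the current mode."""
    return pressure > thresholds[mode]

print(click_registered(0.3, "game"))     # True: sensitive in game mode
print(click_registered(0.3, "control"))  # False: stiffer in control mode
```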
- the thumb button 105 may have other shapes and/or sizes.
- the terms “main button,” “function button,” and “thumb button” are used interchangeably, and may be substituted one for the other to obtain different embodiments.
- a top surface of the thumb button 105 includes high resolution touch sensors, and thus the top surface of the thumb button 105 serves as a thumb touch pad.
- the touch sensors of the thumb touch pad detect touch operations (e.g., single tap, double-tap, scroll, swipe, etc.) when the thumb makes movements and/or gestures on the thumb touch pad.
- the thumb touch pad may have haptic/tactile feedback (e.g., a vibration or a click may be created in response to inputting a command into the touch pad) to notify the user that his/her touch operation has been executed.
- the thumb touch pad may have different types of haptic/tactile feedback for different touch operations.
- the click operations and/or touch operations detected by the thumb button 105 and thumb touch pad, respectively, may be combined with other data (e.g., mode of operation, input data from other buttons and/or touch pads, voice commands, motion signals, etc.) for the controller 100 to generate control commands for controlling various devices.
- the controller 100 includes a circuit that collects and analyzes signals and generates control instructions to control a variety of devices. Throughout this specification the terms “data,” “information,” and “content” are used interchangeably, and may be substituted one for the other to obtain different embodiments.
- Throughout this specification the terms “thumb touch pad,” “main pad,” “main touch pad,” “function pad,” and “thumb pad” are used interchangeably, and may be substituted one for the other to obtain different embodiments.
- Throughout this specification the terms “touch pad,” “touch surface,” “touchscreen,” and “touch screen” are used interchangeably, and may be substituted one for the other to obtain different embodiments.
- a triple click anywhere on the thumb button 105 brings up a setup menu.
- the setup menu allows the user to choose from different configurations, operation modes, and/or various settings such as, but not limited to, left-hand and right-hand configurations, a control mode, a text input mode, a presentation mode, a television mode, a game mode, a three-dimension (3-D) mode, a media control mode, a handwriting mode, a voice command mode, an idle mode, a sleep mode, soft keyboard settings, energy saving settings, motion sensors settings, light indication settings, and button resistance settings.
- the configurations, modes, and/or settings will be discussed in conjunction with FIGs. 6, 7A, 7B, and 9-22.
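- One plausible way to detect the triple click that brings up the setup menu is to count clicks arriving within a short window; the 0.5-second gap below is an assumption, not a value from the specification:

```python
import time

class TripleClickDetector:
    """Counts consecutive clicks; three clicks, each within max_gap
    seconds of the previous one, trigger the setup menu."""
    def __init__(self, max_gap=0.5):  # assumed gap between clicks
        self.max_gap = max_gap
        self.times = []

    def on_click(self):
        now = time.monotonic()
        # Discard the chain if too much time passed since the last click.
        if self.times and now - self.times[-1] > self.max_gap:
            self.times = []
        self.times.append(now)
        if len(self.times) >= 3:
            self.times = []
            return "OPEN_SETUP_MENU"
        return None
```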
- Index button 106 is a button that is located on the head end of the controller 100, next to the front slope 103.
- the index button 106 has a shape of a narrow long strip that is curved to align with the arc shape of the head end of the controller 100.
- the index button 106 extends along the arc-shaped front side of the head end, allowing the index button 106 to be effectively activated by the index finger regardless of whether the controller 100 is held by a right hand or a left hand.
- the index button 106 is connected to a button switch and/or other sensors for detecting click operations when the index finger clicks or presses down on the index button 106.
- the index button 106 may have other shapes and/or sizes. In at least one embodiment, the index button 106 may include the functionalities of a “left button” of a traditional electronic mouse, in addition to many other functions. Throughout this specification the terms “index button,” “index finger button,” and “left button” are used interchangeably, and may be substituted one for the other to obtain different embodiments.
- the surface of the index button 106 faces outward, perpendicular to the top surface of the thumb button 105.
- high resolution touch sensors are connected on top of the index button 106, and thus serve as an index touch pad.
- the touch sensors of the index touch pad detect touch operations when the index finger moves and/or makes gestures on the index touch pad.
- the index touch pad may have haptic/tactile feedback to notify the user that his/her touch operation has been executed.
- the index touch pad may have different types of haptic/tactile feedback for different touch operations. For example, the haptic feedback for a scroll operation could be different from that of a swipe operation.
- the click operations and touch operations detected by the index button 106 and the index touch pad may be combined with other data for the controller 100 to generate control commands.
- the terms “index touch pad,” “index finger touch pad,” “index pad,” and “left touch pad” are used interchangeably, and may be substituted one for the other to obtain different embodiments.
- Middle button 107 is a button that is located on the head end of the controller 100, parallel to the index button 106 and close to the bottom surface of the housing 101.
- the middle button 107 has a similar shape as the index button 106, and is placed next to the index button 106.
- the middle button 107 extends along the arc-shaped head end to a similar extent to which the index button 106 extends, allowing the middle button 107 to be effectively activated by the middle finger regardless of whether a right hand or a left hand is used.
- the middle button 107 is connected to a button switch and/or other sensors for detecting click operations when the middle finger clicks or presses down on the middle button 107.
- the middle button 107 may include the functionalities of a “right button” of a traditional electronic mouse, in addition to many other functions. In other embodiments, the middle button 107 may have other shapes and/or sizes. Throughout this specification the terms “middle button,” “middle finger button,” “middle pad,” and “right button” are used interchangeably, and may be substituted one for the other to obtain different embodiments.
- the surface of the middle button 107 faces outward in a similar direction as the surface of the index button 106.
- high resolution touch sensors are connected on top of the middle button 107, and thus serve as a middle touch pad.
- the touch sensors on the middle touch pad detect touch operations when the middle finger moves and/or makes gestures on the middle touch pad.
- the middle touch pad may have haptic/tactile feedback to notify the user that his/her touch operation has been executed.
- the middle touch pad may have different types of haptic/tactile feedback for different touch operations. For example, the haptic feedback for a scroll operation could be different from that of a swipe operation.
- the click operations and touch operations detected by the middle button 107 and the middle touch pad may be combined with other data for the controller 100 to generate control commands.
- the terms “middle touch pad,” “middle finger touch pad,” and “right touch pad” are used interchangeably, and may be substituted one for the other to obtain different embodiments.
- when the user holds the controller 100 in his hand, the index finger and middle finger are in contact with the index button 106 and middle button 107, respectively, in a natural posture.
- the touch sensors of the index touch pad and middle touch pad are capable of detecting, and distinguishing between, when the index finger and middle finger move along the index touch pad and middle touch pad, respectively (which may cause the screen to scroll), and when the index finger and middle finger move across the index touch pad and middle touch pad, respectively (which may cause the screen to display a swiping action, which is the sliding from one page to another in response to the cursor being moved in a fashion mimicking a swipe by a finger on the touchscreen of a smart phone).
- the touch sensors of the thumb touch pad may have higher precision than those of the index and/or middle touch pads, and thus can detect more complex movements of the thumb with higher precision (e.g., during fine pointing control, for handwriting recognition, etc.). In an embodiment, it may be cost-effective to attach touch sensors having different resolutions to different buttons, depending on the desired functionality of each button.
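- The along-the-pad versus across-the-pad distinction described above might be implemented by comparing displacement components, as in this sketch (the axis convention and dead-zone value are assumptions):

```python
def classify_motion(dx, dy):
    """Classify a finger displacement on the index or middle touch pad.
    x is assumed to run along the pad (scroll direction) and y across
    it (swipe direction); the dominant component decides the gesture."""
    if abs(dx) < 1 and abs(dy) < 1:
        return "NONE"  # ignore jitter below a small dead zone
    return "SCROLL" if abs(dx) >= abs(dy) else "SWIPE"

print(classify_motion(12, 2))   # SCROLL: movement along the pad
print(classify_motion(1, -9))   # SWIPE: movement across the pad
```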
- the buttons (e.g., thumb button 105, index button 106, and middle button 107) detect click operations, and the touch sensors of the touch pads (e.g., thumb touch pad, index touch pad, and middle touch pad) detect touch operations.
- each of the buttons of the controller 100 is dedicated to be manipulated by a different finger.
- when each button is dedicated to a different finger, the touch pad on top of each button may not require multi-touch technology that is configured to detect multiple touches simultaneously on the same touch pad.
- it may be cost-effective not to include complex logic and/or circuits for detecting and processing multi-touch signals on each touch pad.
- one or more of the touch pads of the controller 100 may be operated by multiple fingers (e.g., the index button 106 and middle button 107 may be replaced by a single button that can be operated by both the index and middle fingers), and thus multi-touch technology may be adopted to detect and process the multi-touch signals.
- Tail end 108 is the end of the controller 100 that is closest to the wrist when the controller 100 is held in the hand of the user.
- Head end 109 is the end of the controller 100 that is wrapped by the index and middle fingers when the controller 100 is held in the hand of the user.
- the controller 100 has a length of 5 inches, a width of 5/8 inches at the narrowest point and 7/4 inches at the widest point, and a height of 7/4 inches.
- the concave area 104 has a depth of 1/4 inches. It should be understood that modifications may be made without departing from the essential teachings of the invention.
- the dimensions of the controller 100 may have a tolerance of 10%. Of course, components that are intended to fit snugly within one another need to vary together so that those components still fit within one another, snugly. In other embodiments other dimensions may be used that are outside of the 10% tolerances of the dimensions.
- FIG. 2 shows a top view 200 of an embodiment of the controller 100 of FIG. 1.
- the view 200 shows at least housing 101, top surface 102, front slope 103, thumb button 105, tail end 108, and head end 109.
- the device in view 200 may not have all of the elements or features listed and/or may have other elements or features instead of or in addition to those listed.
- FIG. 2 shows a top view 200 when the user looks at the top surface 102 of the controller 100.
- the directions with respect to the thumb touch pad are indicated by arrows in FIG. 2 and defined as follows. Left is defined as the direction toward the left edge of the thumb touch pad. Right is defined as the direction toward the right edge of the thumb touch pad. Up is defined as the direction toward the front slope 103. Down is defined as the direction toward the tail end 108.
- FIG. 3 shows a side view 300 of an embodiment of the controller 100 of FIG. 1.
- View 300 shows at least housing 101, top surface 102, front slope 103, concave area 104, thumb button 105, index button 106, middle button 107, tail end 108, and head end 109.
- the device in view 300 may not have all of the elements or features listed and/or may have other elements or features instead of or in addition to those listed.
- FIG. 3 shows a side view 300 of the controller 100 with the thumb button 105 located on the top surface while the index button 106 and middle button 107 are on the front side of the head end 109.
- the head end 109 is curved so that the index and middle fingers wrap around the head end 109 in a comfortable and natural posture.
- the front slope 103 between the thumb button 105 and index button 106 is at an angle that is less than 90 degrees from the top surface 102.
- the surfaces of the index button 106 and middle button 107 are in a plane that is perpendicular to the surface of the thumb button 105.
- the front slope 103 between the thumb button 105 and index button 106 is at an angle that is more than 60 degrees but less than 120 degrees with respect to the top surface 102. In an embodiment, the front slope 103 is at an angle that is between 60 degrees and 70 degrees with respect to the top surface 102. In an embodiment, the front slope 103 is at an angle that is between 70 degrees and 80 degrees with respect to the top surface 102. In an embodiment, the front slope 103 is at an angle that is between 80 degrees and 90 degrees with respect to the top surface 102. In an embodiment, the front slope 103 is at an angle that is between 90 degrees and 100 degrees with respect to the top surface 102.
- the front slope 103 is at an angle that is between 100 degrees and 110 degrees with respect to the top surface 102. In an embodiment, the front slope 103 is at an angle that is between 110 degrees and 120 degrees with respect to the top surface 102. In an embodiment, the front slope 103 is at an angle that is less than 60 degrees with respect to the top surface 102. In an embodiment, the front slope 103 is at an angle that is greater than 120 degrees with respect to the top surface 102.
- the index button 106 and middle button 107 face the same direction. In another embodiment, the index button 106 faces a direction that is different from the direction of the middle button 107.
- the index button 106 may face a direction at an angle upwards and middle button 107 may face a direction at an angle downwards.
- FIG. 4 shows a bottom view 400 of an embodiment of the controller 100 of FIG. 1.
- View 400 shows at least housing 101, tail end 108, and head end 109.
- View 400 further shows a bottom surface 402 and a switch 404.
- the device in view 400 may not have all of the elements or features listed and/or may have other elements or features instead of or in addition to those listed.
- FIG. 4 shows a view 400 from the bottom of the controller 100.
- Bottom surface 402 is the surface at the bottom of the controller 100.
- the bottom surface 402 is curved at the concave area 104, where the ring finger and little finger wrap to support the holding of the controller 100.
- the bottom surface 402 may include other shapes and/or structures.
- the concave area 104 of the bottom surface 402 may have structures such as bumps and/or convex curved lines for a better grip.
- Switch 404 is an on-and-off switch that is located on the bottom surface, close to the tail end 108.
- the switch 404 is a slide switch that can be pushed to stop at two positions, in which the controller 100 is either turned on or off.
- switch 404 may be a push button, toggle, rotary selector switch, or any other switch.
- the switch 404 is located away from the ring finger and little finger that wrap underneath the concave area 104 of the controller 100 when in operation. In other embodiments, the switch 404 may be in any other location on controller 100.
- the switch 404 may be located in a location where the switch is unlikely to interfere with normal finger and/or hand operations, so that the switch is unlikely to be turned on and/or off inadvertently while performing other tasks, and unlikely to cause discomfort.
- FIG. 5 shows a view 500 of an embodiment of the controller having a light.
- View 500 shows at least housing 101, top surface 102, front slope 103, thumb button 105, tail end 108, and head end 109.
- View 500 further shows a light 502.
- the device in view 500 may not have all of the elements or features listed and/or may have other elements or features instead of or in addition to those listed.
- FIG. 5 shows a view 500 of an embodiment of the controller 100 having a light for displaying indications.
- Light 502 is a light that is installed on the top surface 102 for indication purposes.
- the light 502 has a ring shape that outlines the edge of the thumb button 105.
- light 502 includes a light-emitting diode (LED).
- light 502 is powered by one or more batteries or a built-in battery inside the housing 101 of the controller 100.
- the light 502 may be in other shapes, sizes, and/or locations.
- the light 502 may light up in various manners indicating the operation mode and/or status of the controller 100.
- the light emitted by the light 502 may have different colors, may illuminate different portions of the surface of the light 502, and/or may flash at different frequencies, for indicating different information to the user. In at least one embodiment, the light 502 may be capable of illuminating different portions in different colors simultaneously and/or having different portions change colors.
- light 502 may emit light of different colors to indicate the power status of the battery/batteries.
- the light 502 may glow in blue when the battery has sufficient power (e.g., more than 30% power remains in the battery). When the remaining power runs lower than 30%, the light 502 may emit yellow light to remind the user to recharge the controller 100 soon. When the power is below 10%, the light 502 may blink in red to remind the user to recharge the controller 100 immediately. When the battery is fully charged, the light 502 may glow in green to indicate the completion of charging. In at least one embodiment, the light 502 may have an ambient glow of soft light.
- the percentage of the light 502 that lights up may indicate the power status.
- the percentage of the light 502 is proportional to the remaining power. For example, when the power of the battery is full, the entire ring of the light 502 glows. When 50% of power remains, only half of the light 502 lights up.
- the light 502 may also indicate the operation modes, for example, by using different colors. For example, a blue light emitted by the light 502 may indicate that the controller 100 is in the control mode, while a red light indicates the presentation mode, and a yellow light indicates the television mode. In at least one embodiment, what color/percentage of light indicates what mode/status may be selected or reset using the controller 100 in the setup menu.
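- The color and percentage indications above could be driven by a mapping such as this sketch; the 30%/10% breakpoints come from the examples given, while the function names are hypothetical:

```python
def battery_led_state(level, charging_complete=False):
    """Map remaining battery level (0.0-1.0) to an LED indication,
    following the example breakpoints in the description."""
    if charging_complete:
        return ("green", "solid")    # charging finished
    if level > 0.30:
        return ("blue", "solid")     # sufficient power
    if level > 0.10:
        return ("yellow", "solid")   # recharge soon
    return ("red", "blinking")       # recharge immediately

def lit_fraction(level):
    """Fraction of the ring-shaped light 502 to illuminate,
    proportional to remaining power (e.g., half the ring at 50%)."""
    return max(0.0, min(1.0, level))

print(battery_led_state(0.5))  # ('blue', 'solid')
print(lit_fraction(0.5))       # 0.5
```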
- the light 502 may indicate whether the controller 100 is active or inactive. For example, when no activity is detected by the controller 100 within a time threshold (e.g., 30 seconds, 1 minute, 2 minutes, 5 minutes, etc.), the controller 100 may automatically go into an idle mode or a sleep mode, with the light 502 turned off.
- the idle mode and the sleep mode are distinct from one another.
- the idle mode is activated when the controller 100 is inactive for a short period of time (for example, 30 seconds, 1 minute). In the idle mode the cursor may disappear from the display.
- a movement of the controller 100, a touch on any of the touch pads, or a click on any of the buttons may bring the controller 100 out of the idle mode, with the light 502 turned back on and the cursor showing up on the display.
- the controller 100 may go further into the sleep mode, and optionally the wireless connection to the computing device is turned off.
- to bring the controller 100 out of the sleep mode, a user may press and hold down on all three buttons simultaneously for more than 2 seconds, for example.
- the controller 100 is configured not to go into the sleep mode while a movie and/or music, etc., is playing.
- the idle mode may be similar to a sleep mode on a computer, in which only those components needed to detect that the user is attempting to interact with the controller 100 remain on, while all other components may be shut off and/or not provided with power.
- the terms “sleep mode” and “idle mode” may be used interchangeably and either term may be substituted for the other to obtain different embodiments.
- the time thresholds (e.g., 2 seconds, 30 seconds, 1 minute, 5 minutes) are adjustable; the user may modify the settings in the setup menu to change the time thresholds or to use the default threshold settings for various operations and/or modes.
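- The idle/sleep behavior might be modeled as a small inactivity state machine; the 30-second idle default below comes from the examples, while the 5-minute sleep default is an assumption:

```python
class PowerStateMachine:
    """Steps through active -> idle -> sleep based on inactivity time.
    Thresholds are user-adjustable in the setup menu; defaults here
    are illustrative."""
    def __init__(self, idle_after=30.0, sleep_after=300.0):
        self.idle_after = idle_after    # e.g., 30 seconds
        self.sleep_after = sleep_after  # assumed 5 minutes

    def state(self, seconds_inactive, media_playing=False):
        if media_playing:               # never sleep during movie/music
            return "active"
        if seconds_inactive >= self.sleep_after:
            return "sleep"              # light off; wireless may drop
        if seconds_inactive >= self.idle_after:
            return "idle"               # light off; cursor hidden
        return "active"

fsm = PowerStateMachine()
print(fsm.state(45))                      # idle
print(fsm.state(45, media_playing=True))  # active
```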
- FIG. 6 shows a view 600 of an embodiment of the controller 100 having a laser.
- View 600 shows at least housing 101, top surface 102, front slope 103, concave area 104, thumb button 105, index button 106, middle button 107, tail end 108, and head end 109.
- View 600 further shows a laser 602.
- the device in view 600 may not have all of the elements or features listed and/or may have other elements or features instead of or in addition to those listed.
- FIG. 6 shows a side view 600 of an embodiment of the controller 100 having a laser to be used in the presentation mode.
- the presentation mode may be activated by selecting from the setup menu using the controller 100.
- the controller 100 may detect the opening up of presentation slides and automatically activate the presentation mode.
- Laser 602 is a component that emits a laser beam, a beam of coherent light, allowing the laser beam to stay narrow and focused to a tight spot.
- the controller 100 may be used as a presentation device (which is a device to assist with presenting presentations) and/or a laser pointer when in the presentation mode.
- the laser 602 is installed in the middle of the front slope 103, between the thumb button 105 and index button 106. In the presentation mode, the laser 602 is activated when the user clicks on any portion of the thumb button 105 and holds down for longer than a preset threshold (e.g., 0.5 second, 1 second, two seconds, etc.). In at least one embodiment, when activated, the laser 602 emits a light beam to pinpoint a desired target shown on the display or in other locations.
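- Detecting the press-and-hold that activates the laser might look like this sketch; the hold thresholds are the examples given in the description:

```python
def laser_should_activate(press_duration, hold_threshold=1.0):
    """In the presentation mode, the laser 602 turns on once the thumb
    button has been held longer than the preset threshold (e.g.,
    0.5 second, 1 second, or 2 seconds, per the examples above)."""
    return press_duration > hold_threshold

print(laser_should_activate(1.2))  # True: beam on
print(laser_should_activate(0.3))  # False: treated as a normal click
```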
- FIG. 7A shows a view 700a of an embodiment of the controller 100 having a microphone.
- View 700a shows at least housing 101, top surface 102, front slope 103, thumb button 105, tail end 108, and head end 109.
- View 700a further shows a microphone 702.
- the device in view 700a may not have all of the elements or features listed and/or may have other elements or features instead of or in addition to those listed.
- FIG. 7A shows a top view 700a of an embodiment of the controller 100 installed with a microphone on the top surface 102.
- Microphone 702 is a microphone or mini microphone for detecting sound signals.
- the microphone 702 is installed on the top surface 102, away from the thumb button 105 and close to the tail end 108.
- the microphone 702 receives voice data that may be recognized and translated into control commands using speech recognition and natural language understanding methods, in the voice command mode.
- the voice command feature of the controller 100 provides an alternative way for the user to quickly and conveniently input instructions, which enhances user experience and user interaction with the system.
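- Once speech recognition produces a transcript, translating it into a control command could be a simple lookup; this sketch treats the recognizer as a black box, and the phrase table is hypothetical:

```python
# Hypothetical phrase-to-command table; a recognizer (not shown) is
# assumed to return a transcript of the user's speech.
VOICE_COMMANDS = {
    "open browser": "LAUNCH_BROWSER",
    "volume up": "VOLUME_UP",
    "next slide": "PRESENTATION_NEXT",
}

def translate_voice(transcript):
    """Map a recognized utterance to a control command, if any."""
    return VOICE_COMMANDS.get(transcript.strip().lower())

print(translate_voice("Next Slide"))  # -> PRESENTATION_NEXT
```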
- the controller 100 may include another number of microphones, in other locations where the microphone is not blocked by the hand that holds the controller 100.
- FIG. 7B shows a view 700b of the tail end of an embodiment of the controller of FIG. 7A.
- View 700b shows at least housing 101, top surface 102, tail end 108, head end 109, and microphone 702.
- View 700b further shows speakers 704a-b and a connector 706.
- the device in view 700b may not have all of the elements or features listed and/or may have other elements or features instead of or in addition to those listed.
- FIG. 7B shows a view 700b of the tail end 108, on which a connector and a plurality of speakers are installed.
- Speakers 704a-b include electroacoustic transducers for converting electrical signals into sound and playing audio content.
- the speakers 704a-b include a pair of speakers located on the tail end 108, on either side of a connector.
- the speakers 704a-b may be in other locations.
- the speakers 704a-b may be in other locations where the speakers are not blocked by the hand that holds the controller 100.
- the controller 100 may include another number of speakers.
- Connector 706 is an electro-mechanical device for connecting the controller 100 to other devices.
- the connector 706 includes an interface, a port, and/or a plug.
- the connector 706 may act as a power charging port for connecting to a power outlet.
- the connector 706 may be used as an audio/video extension connector for connecting to audio/video devices.
- the connector 706 has sufficient bandwidth for transmitting audio and/or video content to a head-wearing device, such as a headset and/or a set of goggles.
- the connector 706 is both a charging port and a data connector.
- the connector 706 is a Universal Serial Bus (USB) connector or a Micro USB connector.
- the connector 706 may be connected, via a data cable (e.g., a USB cable), to the computing device, allowing the controller 100 to be connected locally, with the control commands transmitted via the data cable instead of wireless communication.
- FIG. 8 shows a view 800 of another embodiment of the controller having two touch surfaces.
- Controller 800 includes at least a housing 801, a top surface 802, a front slope 803, a concave area 804, a thumb button 805, an auxiliary button 806, a tail end 808, and a head end 809.
- the controller 800 may not have all of the elements or features listed and/or may have other elements or features instead of or in addition to those listed.
- FIG. 8 shows a view of a controller 800 that includes only two touch buttons, each connected to touch sensors.
- controller 800 may be another version of the controller 100 in which the index button 106 and the middle button 107 are merged into one button, and the index touch pad and middle touch pad are merged into one touch pad.
- the controller 800 is symmetric and can fit in both a left hand and a right hand.
- Housing 801, top surface 802, front slope 803, concave area 804, thumb button 805, and tail end 808 may be similar to the housing 101, top surface 102, front slope 103, concave area 104, thumb button 105, and tail end 108, which were discussed in conjunction with FIG. 1.
- Auxiliary button 806 is a single button that is located on the head end 809 of the controller 800, on which the index and middle fingers rest in a natural posture when the controller 800 is held in the hand of a user.
- the auxiliary button 806 takes the place of the index button 106 and middle button 107 of the controller 100.
- high resolution touch sensors are connected to the top of the auxiliary button 806, and thus serve as an auxiliary touch pad.
- the auxiliary touch pad is configured to detect multi-touch operations when both the index finger and the middle finger click or move on the auxiliary touch pad.
- top portions of the auxiliary button 806 and auxiliary touch pad that are close to the front slope 803 correspond to the index button 106 and index touch pad, respectively, while bottom portions of the auxiliary button 806 and auxiliary touch pad that are close to the bottom surface correspond to the middle button 107 and middle touch pad, respectively.
- the index finger may click the top portion of the auxiliary button 806, while the middle finger may click the bottom portion of the auxiliary button 806.
- the controller 800 may interpret the click as either a left-click by the index finger or a right-click by the middle finger.
- scrolling or swiping on the auxiliary touch pad with either one or both of the index and middle fingers may be interpreted in similar manners as scrolling or swiping on the index touch pad and/or middle touch pad of the controller 100.
- finger operations on the two buttons/touch pads of controller 800 and on the three buttons/touch pads of controller 100 may generate control commands and instructions in similar manners.
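- Disambiguating index-finger versus middle-finger clicks on the single auxiliary button could use the touch position reported by the auxiliary touch pad, as in this sketch (the coordinate convention is an assumption):

```python
def interpret_aux_click(touch_y, pad_height):
    """On controller 800, a click on the top portion of the auxiliary
    button (nearer the front slope 803) is treated as a left click by
    the index finger, and a click on the bottom portion as a right
    click by the middle finger. touch_y = 0 is assumed at the top edge."""
    return "LEFT_CLICK" if touch_y < pad_height / 2 else "RIGHT_CLICK"

print(interpret_aux_click(10, 40))  # LEFT_CLICK (index-finger region)
print(interpret_aux_click(30, 40))  # RIGHT_CLICK (middle-finger region)
```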
- Head end 809 is the end of the controller 800 which may be wrapped by the index finger and middle finger when the controller 800 is held in the hand of a user.
- the head end 809 has similar curve and/or shape as the head end 109 of the controller 100.
- the auxiliary button 806 is located on the front side of the head end 809. In other embodiments, the head end 809 may have other shapes and/or sizes.
- FIG. 9 shows a view 900 of an embodiment of the controller 100 held in a right hand.
- View 900 shows at least controller 100, top surface 102, front slope 103, thumb button 105, index button 106, middle button 107, tail end 108, and head end 109.
- View 900 further shows a right hand 901 that includes thumb 902, index finger 903, middle finger 904, ring finger 905, and little finger 906.
- the view 900 may not have all of the elements or features listed and/or may have other elements or features instead of or in addition to those listed.
- FIG. 9 shows a view 900 of the controller 100 that is held in the right hand of a user, with the thumb resting on the thumb button 105, the index finger and middle finger in contact with the index button 106 and middle button 107, respectively, and the ring finger and little finger wrapping underneath the concave area 104.
- Right hand 901 holds the controller 100 if the user is right-handed.
- the controller 100 may be held by a left-handed user in his left hand.
- the right hand 901 may hold the controller 100 in the air, on the lap, or anywhere near the body of the user. In at least one embodiment, the right hand 901 does not have to operate the controller 100 on a flat surface.
- Thumb 902 is in contact with the thumb button 105, while index finger 903 and middle finger 904 are in contact with the index button 106 and middle button 107, respectively, in a natural posture when the right hand 901 is holding the controller 100.
- the directions of movements of the thumb 902 with respect to the thumb button 105 were defined in conjunction with FIG. 2.
- the directions of movements of the index finger 903 and middle finger 904 on the index touch pad and middle touch pad, respectively, are defined in FIG. 9 as follows.
- a clockwise movement is defined as the movement of the index finger 903 and middle finger 904 along the index touch pad and middle touch pad, respectively, in a clockwise direction starting from the left side of the controller 100 toward the right side as shown in FIG. 9.
- a counter clockwise movement is defined as the movement of the index finger 903 and middle finger 904 in a counter clockwise direction starting from the right side of the controller 100 toward the left side as shown in FIG. 9.
- An up movement is defined as the movement of the index finger 903 and middle finger 904 across the index touch pad and middle touch pad, respectively, toward the top surface 102.
- a down movement is defined as the movement of the index finger 903 and middle finger 904 across the index touch pad and middle touch pad, respectively, toward the bottom surface 402.
- Ring finger 905 and little finger 906 wrap underneath the concave area 104 for support.
- FIG. 10 shows an example 1000 of a user manipulating the controller 100 to activate a scrolling operation.
- the view 1000 shows at least controller 100, top surface 102, front slope 103, thumb button 105, index button 106, middle button 107, tail end 108, right hand 901, thumb 902, index finger 903, middle finger 904, ring finger 905, and little finger 906.
- View 1000 also shows arrows 1002.
- the view 1000 may not have all of the elements or features listed and/or may have other elements or features instead of or in addition to those listed.
- FIG. 10 shows an example 1000 in which the index finger 903 of the right hand 901 scrolls along the index touch pad to activate scrolling operations.
- when a clockwise scrolling movement is detected on the index touch pad, an opened web page or document on the display is scrolled down (e.g., similar to scrolling down on a scroll wheel or a scroll ball of a traditional electrical mouse).
- a counter clockwise scrolling movement on the index touch pad activates a scroll up operation (e.g., similar to scrolling up on the scroll wheel or scroll ball of a traditional electrical mouse).
- Arrows 1002 show the directions in which the index finger 903 scrolls on the index touch pad.
- FIG. 10 shows that, when the controller 100 is held in the right hand 901, a clockwise scrolling movement corresponds to a scroll down operation, while a counter clockwise scrolling movement corresponds to a scroll up operation.
- the controller 100 may adopt a different configuration so as to implement similar operations. For example, to activate the scrolling down operation, the left index finger may scroll in a counter clockwise direction. A clockwise movement of the left index finger along the index touch pad may correspond to a scroll up operation.
- a user may select either the right hand configuration or left hand configuration in the setup menu.
- the controller 100 may cause a message to be prompted automatically on the display asking whether the user wants to switch to the left hand mode, and vice versa for switching back to the right hand mode.
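- The patent does not specify an implementation, but the scroll mapping of FIG. 10 and the handedness mirroring described above can be summarized in a short sketch; the function and direction names below are hypothetical:

```python
# Hypothetical sketch: map index-finger scrolling on the index touch
# pad to scroll commands (FIG. 10), honoring the selected hand mode.

def scroll_command(direction, hand_mode="right"):
    """direction: 'cw' or 'ccw', as detected on the index touch pad."""
    if hand_mode == "left":
        # The left hand configuration mirrors the right hand mapping.
        direction = "ccw" if direction == "cw" else "cw"
    return "scroll_down" if direction == "cw" else "scroll_up"

assert scroll_command("cw") == "scroll_down"                     # right hand
assert scroll_command("ccw", hand_mode="left") == "scroll_down"  # left hand
```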
- FIG. 11 shows an example 1100 of a user manipulating the controller 100 to activate a swiping operation.
- the view 1100 shows at least controller 100, top surface 102, front slope 103, thumb button 105, index button 106, middle button 107, tail end 108, right hand 901, thumb 902, index finger 903, middle finger 904, ring finger 905, and little finger 906.
- View 1100 also shows arrows 1102.
- the view 1100 may not have all of the elements or features listed and/or may have other elements or features instead of or in addition to those listed.
- FIG. 11 shows an example 1100 in which the index finger 903 swipes across the index touch pad to activate swiping operations.
- Arrows 1102 show the directions in which the index finger 903 swipes across the index touch pad that correspond to swiping operations.
- FIG. 11 shows that, when the controller 100 is held in the right hand 901, swiping upwards on the index touch pad corresponds to a left swipe operation (e.g., the opened webpage or document on the display is swiped to the left), while swiping downwards on the index touch pad corresponds to a right swipe operation.
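- As a minimal sketch of the swipe mapping of FIG. 11 (hypothetical names; right hand mode assumed):

```python
# Hypothetical sketch: swiping up across the index touch pad swipes the
# displayed content left, and swiping down swipes it right (FIG. 11).

SWIPE_MAP = {"up": "swipe_left", "down": "swipe_right"}

def swipe_command(direction):
    """direction: 'up' or 'down' swipe across the index touch pad."""
    return SWIPE_MAP[direction]

assert swipe_command("up") == "swipe_left"
```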
- FIG. 12 shows an example 1200 of a user manipulating the controller 100 to activate a zoom-out operation.
- the view 1200 shows at least controller 100, top surface 102, front slope 103, thumb button 105, index button 106, middle button 107, tail end 108, right hand 901, thumb 902, index finger 903, middle finger 904, ring finger 905, and little finger 906.
- View 1200 also shows arrows 1202 and 1204.
- the view 1200 may not have all of the elements or features listed and/or may have other elements or features instead of or in addition to those listed.
- FIG. 12 shows an example 1200 in which both the thumb touch pad and the index touch pad are manipulated simultaneously to activate a zoom-out operation.
- Arrow 1202 shows the direction in which the thumb 902 moves on the thumb touch pad toward the upper left side of the controller 100 (the direction with respect to the controller was discussed in conjunction with FIG. 2).
- Arrow 1204 shows the direction in which the index finger 903 moves along the index touch pad toward the left side of the controller 100.
- the controller 100 combines the signals detected by the thumb touch pad and index touch pad to generate a zoom-out control command to zoom out the image (e.g., the image of a document, webpage, or map) shown on the display, thereby reducing the size of the features of the image.
- Although a right hand is depicted in FIG. 12 (and in the following FIG. 13), it should be understood that the fingers of a left hand can also manipulate the thumb touch pad and index touch pad (the thumb of the left hand moving toward the upper right side while the index finger of the left hand moves toward the right side) to activate a zoom-out operation in the left hand mode.
- FIG. 13 shows an example 1300 of a user manipulating the controller 100 to activate a zoom-in operation.
- the view 1300 shows at least controller 100, top surface 102, front slope 103, thumb button 105, index button 106, middle button 107, tail end 108, right hand 901, thumb 902, index finger 903, middle finger 904, ring finger 905, and little finger 906.
- View 1300 also shows arrows 1302 and 1304.
- the view 1300 may not have all of the elements or features listed and/or may have other elements or features instead of or in addition to those listed.
- FIG. 13 shows an example 1300 in which both the thumb touch pad and the index touch pad are manipulated simultaneously to activate a zoom-in operation, thereby increasing the size of the features of the image.
- Arrow 1302 shows the direction in which the thumb 902 moves on the thumb touch pad toward the lower right side of the controller 100.
- Arrow 1304 shows the direction in which the index finger 903 moves along the index touch pad toward the right side of the controller 100.
- a zoom-in control command is generated by the controller 100 to zoom in the image (e.g., the image of a document, webpage, or map) shown on the display.
- the zoom-out and zoom-in operations using the controller 100 do not require the movement of the middle finger 904.
- the thumb of the left hand may move toward the lower left side while the index finger of the left hand may move toward the left side to activate the zoom-in operation, for example.
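- A sketch of how the two simultaneous movements of FIGs. 12-13 might be combined into a single zoom command (hypothetical names; right hand mode assumed):

```python
# Hypothetical sketch: simultaneous thumb and index movements are
# combined into one zoom command; other combinations are ignored here.

def zoom_command(thumb_dir, index_dir):
    if thumb_dir == "upper_left" and index_dir == "left":
        return "zoom_out"   # FIG. 12
    if thumb_dir == "lower_right" and index_dir == "right":
        return "zoom_in"    # FIG. 13
    return None  # not a recognized two-finger zoom gesture

assert zoom_command("upper_left", "left") == "zoom_out"
assert zoom_command("lower_right", "right") == "zoom_in"
```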
- FIG. 14 shows an example 1400 of a user manipulating the controller 100 to activate a switch operation.
- the view 1400 shows at least controller 100, top surface 102, front slope 103, thumb button 105, index button 106, middle button 107, tail end 108, right hand 901, thumb 902, index finger 903, middle finger 904, ring finger 905, and little finger 906.
- View 1400 also shows arrows 1402.
- the view 1400 may not have all of the elements or features listed and/or may have other elements or features instead of or in addition to those listed.
- FIG. 14 shows an example 1400 in which swiping operations on both the index touch pad and middle touch pad are combined to perform a switch operation.
- Arrows 1402 show the directions in which both the index finger 903 and middle finger 904 are swiped.
- index finger 903 and middle finger 904 are swiped simultaneously across the index touch pad (which is on top of index button 106) and middle touch pad (which is on top of middle button 107), respectively.
- the controller 100 controls the computing device or an application to switch the selected content to the previous content (e.g., switching from the current webpage to the previous webpage in a browser, switching from the current photo to the previous photo in a photo viewing application, showing the previous slide in a presentation application, etc.).
- swiping both the index finger 903 and middle finger 904 simultaneously toward the bottom surface 402 causes a switch operation to the next content.
- swiping both the index finger 903 and middle finger 904 may quickly switch among different soft keyboards (e.g., a keyboard having alphanumeric symbols - English or international, a keyboard showing digits and signs, or a keyboard showing special characters).
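- A sketch of the switch gesture of FIG. 14, assuming upward swipes (toward the top surface) select the previous content, consistent with the toward-the-bottom-surface mapping for next content given above (names are hypothetical):

```python
# Hypothetical sketch: the index and middle fingers must swipe in the
# same direction at the same time to trigger a switch operation.

def switch_command(index_swipe, middle_swipe):
    if index_swipe != middle_swipe:
        return None  # the two fingers must move together
    return {"up": "previous_content", "down": "next_content"}.get(index_swipe)

assert switch_command("down", "down") == "next_content"
```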
- FIG. 15 shows an example 1500 of a user manipulating the controller 100 to activate a rotation operation.
- the view 1500 shows at least controller 100, top surface 102, front slope 103, thumb button 105, index button 106, middle button 107, tail end 108, right hand 901, thumb 902, index finger 903, middle finger 904, ring finger 905, and little finger 906.
- View 1500 also shows arrows 1502 and arrows 1504.
- the view 1500 may not have all of the elements or features listed and/or may have other elements or features instead of or in addition to those listed.
- FIG. 15 shows an example 1500 in which the index finger 903 and middle finger 904 are moved in reverse directions simultaneously (e.g., one moving in a clockwise direction, or toward the right side of FIG. 15, while the other moves in a counter clockwise direction, or toward the left side of FIG. 15) along the index touch pad and middle touch pad, respectively, to activate a rotation operation.
- Arrows 1502 include a solid arrow that shows the direction that the index finger 903 moves along the index touch pad, which is leftward, and a dashed arrow that shows the direction that the middle finger 904 moves along the middle touch pad, which is rightward.
- In at least one embodiment, when the index finger 903 and middle finger 904 move simultaneously in the directions indicated by the arrows 1502, respectively, the controller 100 generates a control command to rotate the selected object shown in the user interface in a clockwise direction with respect to the display.
- Arrows 1504 include a solid arrow that shows the direction that the index finger 903 moves, which is rightward, and a dashed arrow that shows the direction that the middle finger 904 moves, which is leftward, to cause a counter clockwise rotation of a selected object.
- moving the index finger 903 and middle finger 904 simultaneously in the directions as indicated by the arrows 1504, respectively, causes the selected object to rotate in a counter clockwise direction with respect to the display.
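- A sketch of the rotation mapping of FIG. 15 (hypothetical names):

```python
# Hypothetical sketch: opposite movements of the index and middle
# fingers along their touch pads rotate the selected object.

def rotate_command(index_dir, middle_dir):
    if index_dir == "left" and middle_dir == "right":
        return "rotate_clockwise"         # arrows 1502
    if index_dir == "right" and middle_dir == "left":
        return "rotate_counterclockwise"  # arrows 1504
    return None  # same-direction movement is not a rotation gesture

assert rotate_command("left", "right") == "rotate_clockwise"
```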
- the controller 100 may use various combinations of finger operations on the buttons and/or touch pads to control the movements and orientations of a 3-D object.
- the user may use the index finger 903 and middle finger 904 to click and hold down on both the index button 106 and middle button 107, respectively, and meanwhile move the thumb 902 on the thumb touch pad to the left, right, up, or down, so as to shift the 3-D view point to the left, right, forward, or backward, respectively.
- the user may use the thumb 902 to press down on the thumb button 105, and meanwhile swipe both the index finger 903 and middle finger 904 upward or downward across the index touch pad and middle touch pad, so as to increase or decrease the 3-D view angle, respectively.
- the user may use the thumb 902 to press down on the thumb button 105, and meanwhile move both the index finger 903 and middle finger 904 leftwards or rightwards along the index touch pad and middle touch pad, respectively, so as to rotate the 3-D view angle in a clockwise or counter clockwise direction.
- the user may shift the 3-D view point by moving the controller 100 left, right, forward, or backward on the same horizontal plane, change the view angle by raising or lowering the controller 100 to a different horizontal plane, and/or rotate the 3-D view angle clockwise or counter clockwise by rotating the controller 100 in the same horizontal plane.
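- A sketch of how the 3-D mode gestures above might be dispatched; the patent does not prescribe an implementation, and all names below are hypothetical:

```python
# Hypothetical sketch: which buttons are held down selects between
# shifting the 3-D viewpoint and changing or rotating the view angle.

SHIFT_MAP = {"left": "left", "right": "right",
             "up": "forward", "down": "backward"}

def view3d_command(held_buttons, thumb_dir=None, finger_dir=None):
    """held_buttons: set of names of buttons currently pressed down."""
    if {"index", "middle"} <= held_buttons and thumb_dir:
        # Hold both finger buttons, move the thumb: shift the viewpoint.
        return "shift_viewpoint_" + SHIFT_MAP[thumb_dir]
    if "thumb" in held_buttons and finger_dir:
        if finger_dir in ("up", "down"):    # both fingers swipe vertically
            return "increase_view_angle" if finger_dir == "up" else "decrease_view_angle"
        if finger_dir in ("left", "right"): # both fingers slide sideways
            return "rotate_view_cw" if finger_dir == "left" else "rotate_view_ccw"
    return None

assert view3d_command({"index", "middle"}, thumb_dir="up") == "shift_viewpoint_forward"
```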
- the 3-D mode may be activated in the setup menu.
- the controller 100 may use the operations in the 3-D mode to control 3-D objects or views in a computer application, such as, but not limited to, Google Earth and Computer Aided Design (CAD) software.
- the presentation mode is automatically deactivated when the 3-D mode is on.
- FIG. 16 shows an example 1600 of a user manipulating the controller 100 using the right hand 901.
- the view 1600 shows at least controller 100, thumb button 105, right hand 901, and thumb 902.
- the view 1600 also shows a cursor 1601, a display 1602, a computing device 1604, a display 1605, and coordinate frames 1606 and 1608.
- the view 1600 may not have all of the elements or features listed and/or may have other elements or features instead of or in addition to those listed.
- FIG. 16 shows an example 1600 in which the user uses his right hand 901 to manipulate the controller 100 to control a computing device and a separate display.
- Cursor 1601 is an indicator that indicates the position on a computer monitor or other display device in response to input from a pointing device or text input.
- cursor 1601 may be a pointer or a mouse cursor, and have an arrow shape as shown in FIG. 16.
- cursor 1601 may be a flashing text cursor or a caret.
- the movement of the cursor 1601 corresponds to the thumb movement on the thumb touch pad and/or the movement of the controller 100 as a whole.
- a single tap on the thumb touch pad wakes or locates the cursor 1601 on the display.
- a tap on the thumb touch pad or a click on the index button 106 locates the text cursor.
- Display 1602 is a display device that is separate from the display device or monitor of the computing device. In an embodiment, the display 1602 has a larger screen compared to the monitor of the computing device. In an embodiment, the display 1602 is in another location away from the computing device. In an embodiment, the display 1602 is a screen of a television. In an embodiment, the display 1602 may be connected, via a cable or wirelessly, to the computing device, so that a view of the user interface (e.g., a desktop view in a Windows-based system) that is intended to be displayed on the monitor of the computing device can be mirrored onto the display 1602.
- the display 1602 may need a wireless communication module or hardware (e.g., a wireless dongle) for establishing wireless connections to the computing device 1604.
- the display 1602 may be a television that includes at least a screen and speakers, while the computing device 1604 transmits image/video content as well as audio content via the wireless connection or data cable.
- the image/video content may be displayed on the screen of the television and the audio content may be played via the speakers of the TV.
- Computing device 1604 is a device that can be programmed to carry out a set of arithmetic or logical operations automatically. Although the computing device 1604 is illustrated as a desktop computer in FIG. 16, it should be understood that the computing device 1604 may represent a variety of computing devices such as, but not limited to, mobile devices, laptop computers, tablet computers, workstations, game consoles, and smart phones.
- the controller 100 is paired up with the computing device 1604 to control the operations on the computing device 1604 in the control mode.
- the computing device 1604 may need a wireless communication module or hardware (e.g., a wireless dongle) for establishing wireless connections with the controller 100.
- the computing device 1604 may be connected with the controller 100 via cables or wires.
- the computing device 1604 includes its own display, and can be connected to one or more external display devices via data cables or wirelessly.
- the controller 100 includes a wireless communication module.
- the user may turn on the switch 404 (FIG. 4), click on all three buttons (e.g., thumb button 105, index button 106, and middle button 107) of the controller 100, and hold down all three buttons for more than 1 second, for example, to trigger a process to establish a wireless connection with the computing device 1604 and/or other devices.
- applications of the controller 100 and/or computing device 1604 may be automatically activated, thereby allowing the detected finger operations of the controller 100 to control the computing device 1604.
- multiple display devices may be on and connected to the computing device 1604, and upon connecting the controller 100 to the computing device, a list of connected display devices is presented on all connected display devices. The user may choose at least one from the list to show the user interface of the computing device 1604. After the user selects the display device and/or devices of interest, the display devices that were not selected are automatically turned off (or put into sleep mode) by the computing device 1604.
- the user may hold down on all three buttons for more than 0.5 second, for example, to bring up a task manager on the selected display device (e.g., similar to the “Ctrl-Alt-Del” shortcut operation in the Windows system).
- a window will pop up asking whether the user wants to close the control session of the controller 100 or to turn off the computing device 1604.
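- A sketch of how a three-button hold might be dispatched by duration and connection state, using the example thresholds given above (names are hypothetical):

```python
# Hypothetical sketch: a long three-button hold starts pairing when not
# yet connected, and brings up the task manager once connected.

def three_button_hold(duration_s, connected):
    if not connected and duration_s > 1.0:
        return "establish_wireless_connection"
    if connected and duration_s > 0.5:
        return "show_task_manager"
    return None

assert three_button_hold(1.2, connected=False) == "establish_wireless_connection"
```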
- Display 1605 is the display device or monitor of the computing device 1604.
- the display 1605 displays a user interface for the user to interact with the computing device 1604.
- the display 1605 may or may not show the user interface simultaneously with the display 1602.
- Directions 1606 show the directions in which the thumb 902 may move on the thumb touch pad of the controller 100.
- directions 1606 include directions such as left (l), right (r), up (u), and down (d).
- the directions 1606 are similar to the directions as defined in FIG. 2.
- the controller 100 may move as a whole in the directions 1606, when the user holds and moves the controller 100 as a whole.
- Directions 1608 show the directions in which the cursor 1601 may move on the display 1602.
- directions 1608 include directions, such as left (l), right (r), up (u), and down (d).
- the directions 1608 correspond to the directions 1606, respectively, and the movements of the thumb 902 with respect to the thumb touch pad correspond to the movements of the cursor 1601 with respect to the display 1602.
- the distance of the movement of thumb 902 on the thumb touch pad may be proportional to the distance of the movement of cursor 1601 on the display 1602.
- the speed of the movement of thumb 902 on the thumb touch pad may be measured and incorporated into the calculation of the displacement of the cursor 1601 on the display 1602. For example, the faster the thumb 902 moves on the thumb touch pad, the further the cursor 1601 may travel.
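- A sketch of a speed-scaled cursor model; the patent gives no formula, so the gain and acceleration terms below are illustrative:

```python
# Hypothetical sketch: cursor displacement is proportional to thumb
# travel and scaled by thumb speed (a simple pointer-acceleration model).

def cursor_delta(thumb_dx, thumb_dy, speed, gain=10.0, accel=0.5):
    scale = gain * (1.0 + accel * speed)  # faster thumb -> further cursor
    return thumb_dx * scale, thumb_dy * scale

# A fast thumb movement travels further than a slow one of equal length.
assert cursor_delta(1.0, 0.0, speed=4.0) > cursor_delta(1.0, 0.0, speed=0.0)
```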
- the controller 100 may include motion sensors such as accelerometers and/or gyroscopes to detect the direction and speed of the controller 100 when the user holds and moves the controller 100.
- the movements of the controller 100 in combination with the movements of the thumb 902, determine how far and/or how fast the cursor 1601 travels. For example, the user may flick or move the controller 100 toward left, and concurrently move the thumb 902 to the left. As a result, the cursor 1601 may move further to the left than when only the thumb 902 is moved.
- the controller 100 may adopt another configuration in which the motion sensors and the thumb touch pad independently control the movements of the cursor 1601. For example, the user may wave the controller 100 to quickly relocate the cursor 1601 to a desired region on the display 1602, and then move the thumb 902 on the thumb touch pad for fine control of the movements of the cursor 1601.
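- A sketch of this coarse/fine combination, with the motion sensors relocating the cursor and the thumb touch pad refining it (gains are illustrative):

```python
# Hypothetical sketch: motion sensors relocate the cursor coarsely,
# while the thumb touch pad provides fine control; the two inputs
# adjust the cursor position independently.

class CursorController:
    def __init__(self):
        self.x, self.y = 0.0, 0.0

    def on_controller_motion(self, dx, dy, coarse_gain=50.0):
        # Waving the controller quickly relocates the cursor.
        self.x += dx * coarse_gain
        self.y += dy * coarse_gain

    def on_thumb_move(self, dx, dy, fine_gain=5.0):
        # Thumb movement then fine-tunes the cursor position.
        self.x += dx * fine_gain
        self.y += dy * fine_gain

cursor = CursorController()
cursor.on_controller_motion(2.0, 0.0)  # coarse jump to a screen region
cursor.on_thumb_move(-0.5, 0.1)        # fine adjustment within it
```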
- the controller 100 is not required to point to the computing device 1604 and/or the display 1602 during the operation of the controller 100.
- the user may hold the controller 100 in his hand and rest his hand on the lap or on an armrest of a couch, with the controller 100 facing sideways and pointing away from the display 1602.
- the directions of movement of the cursor 1601 are determined by the movements of the thumb 902 with respect to the thumb touch pad and/or the movements of the controller 100, but may not be affected by the direction in which the controller 100 points.
- the display 1602 may be a screen of a television, and the controller 100 may be used to remotely control the television in the television mode.
- the controller 100 includes an infrared transmitter and can function as a TV remote control.
- the user may activate the television mode in the setup menu, and may operate on the controller 100 while pointing the controller 100 to the television to switch channels, adjust volume, etc.
- the controller 100 may be used to toggle through various modes (control mode, TV mode, presentation mode, for example) when the user holds down the thumb button 105 while clicking the index button 106 and the middle button 107 simultaneously.
- the color of light emitted by light 502 acts as an indicator of the mode the controller 100 is in.
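- A sketch of the mode toggle described above; the mode list and the color assignments below are illustrative, as the patent does not bind specific colors to specific modes:

```python
# Hypothetical sketch: holding the thumb button while clicking both
# finger buttons cycles through the modes, and the color of light 502
# indicates the current mode.

MODES = ["control", "tv", "presentation"]
MODE_COLORS = {"control": "blue", "tv": "green", "presentation": "red"}

class ModeState:
    def __init__(self):
        self.index = 0

    def on_toggle_chord(self):
        """Called when the thumb is held and both finger buttons click."""
        self.index = (self.index + 1) % len(MODES)
        mode = MODES[self.index]
        return mode, MODE_COLORS[mode]  # mode name and indicator color

state = ModeState()
assert state.on_toggle_chord() == ("tv", "green")
```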
- when the TV is currently turned off, the user may toggle to or activate the TV mode using the controller 100, and then hold down on all three buttons for more than 0.5 second, for example, to turn the TV on, and vice versa to turn the TV off.
- the controller 100 may include various settings to translate click and/or touch operations into control commands in the television mode. For example, to switch channels, the user may click on the top/bottom portion of the thumb button 105, swipe the thumb 902 across the main touch pad to the left/right, or swipe the index finger 903 upward/downward on the index touch pad, depending on which settings are selected by the user.
- to adjust the volume, the user may click on the right/left portion of the thumb button 105, swipe the thumb 902 across the main touch pad upward/downward, or scroll the index finger 903 in a clockwise/counter clockwise direction, respectively, along the index touch pad, depending on the selected settings.
- the user may hold down on both the index button 106 and middle button 107, and meanwhile use the thumb 902 to write a digit on the thumb touch pad. The user may wait until the TV has registered this digit, and then write the next digit while still holding down the two buttons. The user may repeat these steps until all digits are input.
- the user may hold down the thumb button and move the controller 100 in the air to write the channel number in the air, and the handwriting recognition software of the controller 100 may recognize the input channel number and switch to the desired channel on the television.
- for less frequently used functions (e.g., Guide, Menu, Mute, etc.), a regular TV remote control may be used.
- the user may choose from the setup menu which settings to be used to control the television.
- the controller 100 may be used to control a game console or video games on the computing device 1604 when in the game mode.
- the game mode may be activated in the setup menu.
- the buttons and/or touch pads of the controller 100 may be used for selection and/or navigation in the games.
- the motion sensors of the controller 100 provide extra dimensions of user input when the user waves and/or rotates the controller 100 to control the video games.
- the controller 100 is used to control the display of the multimedia content in the media control mode.
- multimedia content may be any of, or any combination of, content forms including, but not limited to, text, still images, animation, video, audio, and/or interactive content forms.
- the media control mode may be activated in the setup menu, or by selecting a video or audio file to play.
- in the media control mode, the user may move the thumb 902 upward or downward on the thumb touch pad to increase or decrease the volume of the audio content, respectively.
- the user may move the thumb 902 in a counter clockwise or clockwise direction on the thumb touch pad to rewind or to fast forward a video clip, respectively.
- FIG. 17 shows an example 1700 of a user manipulating the controller 100 using the left hand.
- the view 1700 shows at least controller 100, thumb button 105, cursor 1601, display 1602, and coordinate frames 1606 and 1608.
- the view 1700 further shows a left hand 1701, thumb 1702, index finger 1703, middle finger 1704, ring finger 1705, and little finger 1706.
- the view 1700 may not have all of the elements or features listed and/or may have other elements or features instead of or in addition to those listed.
- FIG. 17 shows an example 1700 in which the user uses his left hand to manipulate the controller 100.
- Left hand 1701 holds the controller 100 if the user is left-handed. In at least one embodiment, the left hand 1701 does not have to operate the controller 100 on a flat surface. In an embodiment, the left hand 1701 holds the controller to move in directions 1606 that correspond to the directions 1608 in which the cursor 1601 moves on the display 1602, in a similar manner as discussed in conjunction with FIG. 16.
- Thumb 1702 of the left hand 1701 is in contact with the thumb button 105 in a natural posture when the left hand 1701 is holding the controller 100.
- the thumb 1702 may operate on the thumb button 105 and/or the thumb touch pad for inputting commands.
- the thumb 1702 moves in the directions 1606 to control the cursor movements accordingly, in a similar manner as discussed in conjunction with FIG. 16.
- Index finger 1703 and middle finger 1704 of the left hand 1701 are in contact with the index button 106 and middle button 107, respectively, in a natural posture when the left hand 1701 is holding the controller 100.
- the index finger 1703 may operate on the index button 106 and/or the index touch pad
- the middle finger 1704 may operate on the middle button 107 and/or the middle touch pad.
- the index finger 1703 and middle finger 1704 of the left hand 1701 may move along or swipe across the index touch pad and middle touch pad, respectively, in manners as discussed in conjunction with FIGs. 10-15.
- Ring finger 1705 and little finger 1706 wrap underneath the concave area 104 for support.
- FIG. 18 shows a view 1800 of an embodiment of the thumb touch pad that is configured to perform various operations.
- the view 1800 shows at least top surface 102, front slope 103, thumb button 105, and head end 109.
- the view 1800 further shows a left portion 1801, a top portion 1802, a right portion 1803, a bottom portion 1804, and a center portion 1805.
- the view 1800 may not have all of the elements or features listed and/or may have other elements or features instead of or in addition to those listed.
- FIG. 18 shows a view 1800 of the thumb touch pad that is used to control the computing device 1604 when browsing a webpage or a map.
- a double-tap on the thumb touch pad navigates or realigns the content of the user interface shown on the display 1602 or 1605, depending on which part of the thumb touch pad is tapped.
- Left portion 1801 is a portion of the thumb touch pad close to the left side.
- a double-tap on the left portion 1801 brings the user back to the home page (e.g., similar to clicking the Home key on a keyboard).
- a double-tap on the left portion 1801 moves the focus of the image (e.g. a map) to the left.
- Top portion 1802 is a portion of the thumb touch pad close to the front slope 103.
- a double-tap on the top portion 1802 brings the user to the previous page (e.g., similar to clicking the Page Up key on a keyboard).
- a double-tap on the top portion 1802 moves the focus of the image (e.g., a map) upwards.
- Right portion 1803 is a portion of the thumb touch pad close to the right side.
- a double-tap on the right portion 1803 brings the user to the end of the webpage or document (e.g., similar to clicking the End key on a keyboard).
- a double-tap on the right portion 1803 moves the focus of the map to the right.
- Bottom portion 1804 is a portion of the thumb touch pad close to the bottom side.
- a double-tap on the bottom portion 1804 brings the user to the next page (e.g., similar to clicking the Page Down key on a keyboard).
- a double-tap on the bottom portion 1804 moves the focus of the image downwards.
- Center portion 1805 is a portion of the thumb touch pad in the center.
- a double-tap on the center portion 1805 realigns the image (e.g., the webpage or map) to the center of the user interface shown on the display 1602 or 1605.
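- The region-to-action mapping of FIG. 18 can be summarized in a small table (hypothetical names):

```python
# Hypothetical sketch: a double-tap on a thumb touch pad region maps to
# a navigation action when browsing a webpage or map (FIG. 18).

REGION_ACTIONS = {
    "left": "Home",        # back to the home page
    "top": "PageUp",       # previous page
    "right": "End",        # end of the webpage or document
    "bottom": "PageDown",  # next page
    "center": "realign_to_center",
}

def double_tap_command(region):
    return REGION_ACTIONS[region]

assert double_tap_command("top") == "PageUp"
```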
- FIG. 19 shows a view 1900 of an embodiment of the thumb button 105 that is configured to perform operations during the text input mode.
- the view 1900 shows at least top surface 102, front slope 103, thumb button 105, and head end 109.
- the view 1900 further shows a left portion 1901, a top portion 1902, a right portion 1903, and a bottom portion 1904.
- the view 1900 may not have all of the elements or features listed and/or may have other elements or features instead of or in addition to those listed.
- FIG. 19 shows a view 1900 of the thumb button 105 that controls the text input location during the text input mode.
- FIG. 19 shows that the click operations on the thumb button 105 may be translated to various arrow key operations during the text input mode.
- the user may move the thumb 902 on the thumb touch pad to relocate the cursor 1601 over a text input area, and then the user may tap on the thumb touch pad or click the index button 106 to highlight the text cursor in the text input area, causing a soft keyboard to appear on the display 1602 or 1605.
- the user may further move the thumb 902 on the thumb touch pad to move the cursor 1601 over a desired key on the soft keyboard.
- the key, upon selection, becomes ballooned or highlighted. Then the user may click the index button 106 to select the highlighted key. If a letter or number is selected, the letter/number is inserted at the current location of the text cursor 1601.
- if a function key is selected, the corresponding operation will be activated. For example, if the “Caps Lock” key on the soft keyboard is selected, the next letter input may be capitalized.
- when the cursor 1601 is relocated over a non-input area and the index button 106 is clicked, the soft keyboard disappears from the display 1602 or 1605.
- the user may click the middle button 107 to activate a context menu next to the cursor 1601 (e.g., similar to a right click using a traditional electrical mouse).
- Left portion 1901 is a portion of the thumb button 105 close to the left side.
- clicking on the left portion 1901 causes the cursor 1601 to move to the left of the previous cursor position (e.g., similar to clicking the Left Arrow key on the soft keyboard or a real keyboard).
- each single click on the thumb button 105 causes the text cursor 1601 to move a distance of just one position (e.g., the distance of the space occupied by one character).
- Top portion 1902 is a portion of the thumb button 105 close to the front slope 103. During the text input mode, clicking on the top portion 1902 causes the cursor 1601 to move up to the previous text line (e.g., similar to clicking the Up Arrow key on the soft keyboard or a real keyboard).
- Right portion 1903 is a portion of the thumb button 105 close to the right side. During the text input mode, clicking on the right portion 1903 causes the cursor 1601 to move to the right of the previous cursor position (e.g., similar to clicking the Right Arrow key on the soft keyboard or a real keyboard). In an embodiment, when the cursor 1601 is at the end of a text line, clicking the right portion 1903 moves the cursor 1601 to the beginning of the next text line, if available.
- Bottom portion 1904 is a portion of the thumb button 105 close to the bottom side. During the text input mode, clicking on the bottom portion 1904 causes the cursor 1601 to move down to the next text line (e.g., similar to clicking the Down Arrow key on the soft keyboard or a real keyboard).
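- A sketch of the arrow-key mapping of FIG. 19 (hypothetical names):

```python
# Hypothetical sketch: in the text input mode, clicks on the four
# portions of the thumb button act as arrow keys (FIG. 19).

ARROW_KEYS = {"left": "ArrowLeft", "top": "ArrowUp",
              "right": "ArrowRight", "bottom": "ArrowDown"}

def thumb_click_key(portion):
    return ARROW_KEYS[portion]

assert thumb_click_key("bottom") == "ArrowDown"  # move down one text line
```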
- FIG. 20 shows a view 2000 of an embodiment of the thumb button 105 that is configured, in combination with index button 106, to activate operations during the text input mode.
- the view 2000 shows at least front slope 103, thumb button 105, and head end 109.
- the view 2000 further shows a left portion 2001, a top portion 2002, a right portion 2003, and a bottom portion 2004.
- the view 2000 may not have all of the elements or features listed and/or may have other elements or features instead of or in addition to those listed.
- FIG. 20 shows a view 2000 of the thumb button 105 that controls the text input in combination with the index button 106.
- clicking on various portions of the thumb button 105 while holding down on the index button 106 may be translated to various commands related to manipulating text and/or navigating through text.
- Left portion 2001 may be an embodiment of the left portion 1901.
- clicking on the left portion 2001 when the index button 106 is pressed down deletes the text to the left of the cursor 1601 (e.g., similar to clicking the backspace key on the soft keyboard or a real keyboard).
- Top portion 2002 may be an embodiment of the top portion 1902. In an embodiment, clicking on the top portion 2002 when the index button 106 is pressed down deletes the text to the right of the cursor 1601 (e.g., similar to clicking the delete key on the soft keyboard or a real keyboard).
- Right portion 2003 may be an embodiment of the right portion 1903. In an embodiment, clicking on the right portion 2003 when the index button 106 is pressed down causes one space to be inserted to the current text cursor position (e.g., similar to pressing the space key on the soft keyboard or a real keyboard).
- Bottom portion 2004 may be an embodiment of the bottom portion 1904. In an embodiment, clicking on the bottom portion 2004 when the index button 106 is pressed down starts a new line and moves the cursor 1601 to the beginning of the new line (e.g., similar to pressing the return key or enter key on the soft keyboard or a real keyboard).
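- A sketch extending the previous one to the chorded editing keys of FIG. 20 (hypothetical names):

```python
# Hypothetical sketch: the same thumb button portions emit editing keys
# while the index button is held down, and arrow keys otherwise.

ARROW_KEYS = {"left": "ArrowLeft", "top": "ArrowUp",
              "right": "ArrowRight", "bottom": "ArrowDown"}
EDIT_KEYS = {"left": "Backspace", "top": "Delete",
             "right": "Space", "bottom": "Enter"}

def thumb_click_key(portion, index_held=False):
    return (EDIT_KEYS if index_held else ARROW_KEYS)[portion]

assert thumb_click_key("left", index_held=True) == "Backspace"
```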
- FIG. 21 shows a view 2100 of an embodiment of the thumb touch pad that recognizes hand writing.
- the view 2100 shows at least letters 2101, 2102, 2103, 2104, 2105, and 2106, and thumb touch pad 2108.
- the view 2100 may not have all of the elements or features listed and/or may have other elements or features instead of or in addition to those listed.
- FIG. 21 shows a view 2100 of the thumb touch pad that may recognize hand writing to be translated by the controller 100 to control commands.
- FIG. 21 shows examples of handwriting letters on the thumb touch pad, which, when both the index button 106 and middle button 107 are held down, may be translated to various “short-cut” operations.
- Letter 2101 shows a letter “C” written by the thumb 902 on the thumb touch pad.
- writing letter 2101 while holding down on both the index button 106 and middle button 107 causes the selected content to be copied into the clipboard (e.g., similar to a “Ctrl-C” short-cut on a keyboard in a Windows system).
- Letter 2102 shows a letter “V” written by the thumb 902 on the thumb touch pad, which, when both the index button 106 and middle button 107 are pressed down, causes the content stored in the clipboard to be pasted at the cursor position (e.g., similar to a “Ctrl-V” short-cut on a keyboard in a Windows system).
- Letter 2103 shows a letter “Z” written by the thumb 902 on the thumb touch pad, which, when both the index button 106 and middle button 107 are pressed down, causes the last operation to be reverted/undone (e.g., similar to a “Ctrl-Z” short-cut on a keyboard in a Windows system).
- Letter 2104 shows a letter “S” written by the thumb 902 on the thumb touch pad, which, when both the index button 106 and middle button 107 are pressed down, causes a document/webpage to be saved (e.g., similar to a “Ctrl-S” short-cut on a keyboard in a Windows system).
- Letter 2105 shows a letter “A” written by the thumb 902 on the thumb touch pad, which, when both the index button 106 and middle button 107 are pressed down, causes the entire content to be selected (e.g., similar to a “Ctrl-A” short-cut on a keyboard in a Windows system).
- Letter 2106 shows a letter “X” written by the thumb 902 on the thumb touch pad, which, when both the index button 106 and middle button 107 are pressed down, causes the selected content to be cut and stored in the clipboard (e.g., similar to a “Ctrl-X” short-cut on a keyboard in a Windows system).
- when in a certain mode, just drawing the letter on the touch pad, without holding down the buttons, is interpreted as the keyboard shortcut associated with the letter drawn, so that, for example, drawing an X is interpreted as “Ctrl-X.”
- each letter input may correspond to a different short-cut operation.
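- A sketch of the letter-to-shortcut mapping of FIG. 21 (hypothetical names); without the two-button chord, the handwriting is treated as ordinary input unless the single-letter mode above is active:

```python
# Hypothetical sketch: a letter drawn on the thumb touch pad while both
# finger buttons are held maps to a keyboard short-cut.

LETTER_SHORTCUTS = {"C": "Ctrl-C", "V": "Ctrl-V", "Z": "Ctrl-Z",
                    "S": "Ctrl-S", "A": "Ctrl-A", "X": "Ctrl-X"}

def handwriting_shortcut(letter, buttons_held):
    if not buttons_held:
        return None  # without the chord, handwriting is ordinary input
    return LETTER_SHORTCUTS.get(letter.upper())

assert handwriting_shortcut("x", buttons_held=True) == "Ctrl-X"
```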
- Thumb touch pad 2108 may be an embodiment of the thumb touch pad that is connected on top of the thumb button 105, as discussed in conjunction with FIG. 1.
- the controller 100 may recognize hand writing of characters, letters, shapes, and/or drawings that are directly drawn by the thumb 902 on the thumb touch pad 2108.
- the user may choose to input text using the soft keyboard or real keyboard, or may activate a handwriting mode (e.g., by selecting from the setup menu) to write/draw content directly on the thumb touch pad 2108.
- the thumb touch pad 2108 can recognize the hand writing of foreign language characters.
- the controller 100 may recognize hand writing of characters, letters, shapes, and/or drawings that are directly drawn by moving the controller 100 itself in the air.
- FIG. 22 shows a view 2200 of an embodiment of the thumb button 105 mounted with a touchscreen.
- the view 2200 shows at least top surface 102, front slope 103, and head end 109.
- the view 2200 further shows a touchscreen 2202 and a keypad 2204.
- the view 2200 may not have all of the elements or features listed and/or may have other elements or features instead of or in addition to those listed.
- FIG. 22 shows a view 2200 of the thumb button 105 that is mounted with a touchscreen that can display information for the user to directly interact with.
- the thumb touch pad on the thumb button 105 may not display information.
- Touchscreen 2202 is an input device that is layered on the top of an electronic visual display.
- the touchscreen 2202 displays images and/or other content for the user to interact with by touching the touchscreen 2202.
- the display functionality of the touchscreen 2202 may operate independently from the touch sensors that detect finger movements. For example, the display of the touchscreen 2202 does not have to be activated all the time. When in certain modes, such as in the television mode, the touchscreen 2202 may display a numerical panel for the user to touch and input channel numbers. In another example, when a Voice over IP (VoIP) application is activated, the touchscreen 2202 may display a phone dialing panel for the user to dial a phone number to make a call over the internet. In other embodiments, the touchscreen 2202 may display other keypads, buttons, and/or content for the user to select and/or input.
- Keypad 2204 includes a set of buttons or keys that are displayed on the touchscreen 2202. The user may touch the buttons or keys to select the corresponding number/letter. Although shown as a telephone keypad in FIG. 22, the keys of the keypad 2204 may bear any type of information such as, but not limited to, numbers, letters, symbols, and/or other characters.
- FIG. 23 shows a view 2300 of an embodiment of the controller 100 connected to a headset.
- the view 2300 shows at least controller 100, right hand 901, display 1602, computing device 1604, and display 1605.
- the view 2300 further shows a user 2301, a headset 2302, and a cable 2304.
- the view 2300 may not have all of the elements or features listed and/or may have other elements or features instead of or in addition to those listed.
- FIG. 23 shows a view 2300 of an embodiment of the controller 100 controlling the multimedia content to be displayed by the display 1602 and a headset.
- User 2301 is a person who may use the controller 100 to control various devices at his/her home. In at least one embodiment, the user 2301 does not have to walk to different locations/rooms at home to interact with different computing/electronic devices, but can stay wherever he/she wants and access multiple devices using a single controller 100. For example, the user 2301 may sit comfortably on the couch in the living room and watch the display 1602 of the flat screen TV that shows the user interface of the computing device 1604, or may put on a headset and/or a set of goggles to view the user interface of the computing device 1604.
- Headset 2302 is a device that includes a pair of small loudspeakers that are designed to be held in place close to the ears of the user 2301, allowing the user 2301 to listen to an audio source privately.
- the headset 2302 may include a double-earpiece or a single-earpiece.
- the headset 2302 is connected to the controller 100, via a cable.
- the headset 2302 may be wirelessly connected to the controller 100 and/or the computing device 1604.
- the audio data may be transmitted from the computing device 1604 to the controller 100 and then played by the headset 2302.
- the headset 2302 may play audio data that is stored in a memory system of the controller 100.
- the headset 2302 may have its own volume control.
- the playing of multimedia content may be controlled by the controller 100 in the media control mode, as discussed in conjunction with FIG. 16.
- the headset 2302 includes a microphone for detecting voice data that may be translated to voice commands.
- the controller 100 may be connected to a headphone or earphone that is not equipped with a microphone.
- the user may input voice commands via the microphone 702 installed on the controller 100.
- Cable 2304 may include a plurality of electrical wires for connecting the headset 2302 to the connector 706 of the controller 100.
- the cable 2304 transmits audio data from the controller 100 to the headset 2302 to be played, and/or transmits voice data from the headset 2302 to the controller 100.
- FIG. 24 shows a view 2400 of an embodiment of the controller 100 connected to a set of goggles.
- the view 2400 shows at least controller 100, right hand 901, computing device 1604, display 1605, and user 2301.
- the view 2400 further shows a set of goggles 2402, a headset 2403, and a cable 2404.
- the view 2400 may not have all of the elements or features listed and/or may have other elements or features instead of or in addition to those listed.
- FIG. 24 shows a view 2400 of an embodiment of the controller 100 that is connected to a set of goggles to display multimedia content.
- Goggles 2402 are a type of eyewear that functions as a display device for displaying images and/or videos to be viewed by the user 2301 when wearing the set of goggles 2402.
- the set of goggles 2402 takes the form of one or two display screens that are worn in front of the face of the user 2301 and project graphical images.
- the user 2301 may see two separate images - one in each eye which the brain combines to form a three dimensional (3D) image.
- the set of goggles 2402 may show an illusion of depth, which is a three-dimensional (3D) effect.
- the set of goggles 2402 are projection goggles or a set of virtual reality (VR) goggles.
- the set of goggles 2402 allows the user 2301 to view and interact with images and/or videos when wearing the set of goggles 2402.
- the set of goggles 2402 may include an integrated headset for playing audio data to go with the video display.
- the set of goggles 2402 is connected to the controller 100, via a cable.
- the set of goggles 2402 may be wirelessly connected to the controller 100 and/or the computing device 1604.
- In at least one embodiment, when the controller 100 or the set of goggles 2402 is communicatively connected to the computing system 1604, the set of goggles 2402 may project the user interface that was intended to be displayed by the display 1605 of the computing system 1604.
- the set of goggles 2402 is powered by batteries, or may be installed with a built-in battery unit.
- the set of goggles 2402 may be charged separately.
- the goggles 2402 are battery operated.
- the set of goggles 2402 and the controller 100 can form an independent computing system, without having to be connected to an external computer such as the computing system 1604.
- the user may view the setup menu or a user interface displayed via the set of goggles 2402 to configure the controller 100 to perform various operations.
- the user may use the controller 100 to control the set of goggles 2402 to play music or video content that is stored in the memory system of the controller 100, to edit documents, and/or to browse the Internet (when the controller 100 is connected to the Internet), in similar manners as operating on a personal computer (PC), but with more ease and flexibility (e.g., the user 2301 may be lying on a couch instead of sitting in front of a PC).
- the set of goggles 2402 if connected to the controller 100, is configured as the default display device for the controller 100.
- the set of goggles 2402 may display the setup menu or a start menu once the controller 100 is turned on.
- the user can simply select from the setup menu or start menu shown, via the set of goggles 2402, using the controller 100, to establish a connection to the computing device 1604 and start operating on the computing device 1604.
- the user 2301 may establish the connection between the controller 100 and the computing device 1604 by pressing down all three buttons simultaneously, as discussed in conjunction with FIG. 1.
- the set of goggles 2402 projects contents (e.g., user interface, web pages, documents, images, videos, etc.) privately and exclusively to the user 2301 who is wearing the set of goggles 2402, while the headset of the set of goggles 2402 plays audio content privately to the ears of the user 2301, preventing a third party from seeing and hearing the communication content.
- using the set of goggles 2402 with the headset effectively protects the privacy of the user 2301.
- Headset 2403 may be similar to the headset 2302, except that the headset 2302 may be used as a standalone device while the headset 2403 may be integrated in or connected to the set of goggles 2402.
- the set of goggles 2402 and the headset 2403 may be independent of one another, and the controller 100 may be connected individually to the set of goggles 2402 and headset 2403.
- Cable 2404 connects the controller 100 to the set of goggles 2402 and the headset 2403.
- Cable 2404 may be similar to the cable 2304, except that the cable 2404 transmits video data as well as audio data to the set of goggles 2402 and the headset 2403, while the cable 2304 transmits only audio data/voice data to the headset 2302.
- the controller 100 receives data (user interface, video, audio, for example) from the computing device 1604 via wireless connection, and then the controller 100 transmits the received data via cable 2404 to the goggles 2402.
- the controller 100 may be wirelessly connected to the goggles 2402 and/or other display systems, and the controller 100 may transmit data to the goggles 2402 (and/or to other display systems for display).
- the controller 100 coordinates the display content, such as the videos or movies (e.g., on the display 1602, via the set of goggles 2402, or on other display devices), and the sound track of the videos or movies (e.g., via the headset 2403, or other speakers), to achieve a synchronized display effect.
- the controller 100 controls display of video on the display 1602 or the set of goggles 2402 to pause, and also controls the audio via the headset 2403 or speakers to stop playing.
- the display content on the display 1602 or goggles 2402 may be independent from the playing of audio content (e.g., the user may switch between different tasks while listening to music), and the controller 100 may individually control the display 1602/goggles 2402 and the headset 2403 or other speakers.
- FIG. 25 shows a block diagram 2500 of display systems that may be controlled directly or indirectly by the controller 100.
- the block diagram 2500 shows the controller 100, a computing device 2501, a computer display 2502, a projector 2503, a television display 2504a, a television 2504b, and a set of goggles 2508.
- the block diagram 2500 may not have all of the elements or features listed and/or may have other elements or features instead of or in addition to those listed.
- FIG. 25 shows a block diagram 2500 of the controller 100 that provides a single display platform for a variety of different display systems.
- Computing device 2501 and computer display 2502 may be an embodiment of the computing device 1604 and the display 1605 of the computing device 1604, which were discussed in conjunction with FIG. 16.
- the computer display 2502 and other display systems of FIG. 25 may be connected, via cables or wirelessly, to the computing device 2501.
- the computing device 2501 may instruct each connected display system to show a list of all the connected display systems.
- the display systems need to be on and connected to the computing device 2501 so that each of the display systems may display the list of connected display systems.
- the user 2301 may choose from the list a display system to view by watching the display closest to him/her. For example, the user 2301 may select the television display from the list, and the user interface on the computer display 2502 will then be mirrored onto the television display exclusively.
- Projector 2503 is an optical device that projects images or videos onto a surface (e.g., a projection screen).
- the projector 2503 is connected to the computing device 2501 via a cable or wirelessly.
- the controller 100 controls the computing device 2501 to send data to the projector 2503 to project the user interface of the computing device 2501 on the projection screen.
- Television display 2504a is a display system of a television.
- the television display 2504a is connected to the computing device 2501 via cable or wirelessly, and the computing device 2501 mirrors the user interface onto the television display 2504a.
- Television 2504b may be any type of television (TV) including, but not limited to, electronic television, digital television, flat screen TV, smart TV, 3D TV, cable TV, satellite TV, and web TV/internet TV.
- television 2504b includes at least a screen (e.g., television display 2504a) and one or more speakers.
- television 2504b may be controlled using a remote controller or a device that can transmit infrared signals or other wireless signals.
- television 2504b may be controlled by the controller 100 in the TV mode, while the user points the controller 100 that has an infrared module toward the television 2504b.
- the television 2504b may be located in the living room, in front of a couch or chairs.
- the controller 100 controls the television 2504b to watch TV (in the television mode) or to display the user interface of the computing device 2501 on the television display 2504a (in the control mode).
- Goggles 2508 are embodiments of the set of goggles 2402, which was discussed in conjunction with FIG. 24.
- the goggles 2508 are connected to the controller 100, via at least a data cable, and the controller 100 sends user interface and/or video content that is received from the computing device 2501 to the goggles 2508 for display.
- the controller 100 sends video content that is stored in the memory system of the controller 100 to the goggles 2508.
- the goggles 2508 may be wirelessly connected to the computing device 2501 and receive data from the computing device 2501.
- the controller 100 may be directly connected to and control the display system (the goggles 2508, or the television 2504b in the TV mode, for example). In another embodiment, the controller 100 may indirectly manipulate the display content on the display system by instructing the computing device 1604 to mirror image/video content onto the display system (the television display 2504a in the control mode, or the projector 2503, for example).
- FIG. 26 shows a block diagram 2600 of an embodiment of the controller 100 of FIG. 1.
- the controller 100 includes at least a processor system 2602, a transceiver 2604, a memory system 2606, a power management 2608, a light module 2610, an infrared module 2612, a laser 2614, a network control 2616, motion sensors 2618, a thumb touch pad 2620, an index touch pad 2622, a middle touch pad 2624, a thumb button 2626, an index button 2628, a middle button 2630, an audio system 2632, a microphone 2634, a speaker 2636, and a video system 2638.
- the system in block diagram 2600 may not have all of the elements or features listed and/or may have other elements or features instead of or in addition to those listed.
- FIG. 26 shows a block diagram 2600 of the components in the controller 100.
- Processor system 2602 may include any one of, some of, any combination of, or all of multiple parallel processors, a single processor, a system of processors having one or more central processors and/or one or more specialized processors dedicated to specific tasks.
- processor system 2602 may implement machine instructions stored in a memory system, such as software for analyzing input information and accordingly controlling various peripherals.
- processor system 2602 may collect and analyze input data from buttons, touch pads, motion sensors, and/or microphone of the controller 100, and translate the input data into control commands.
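To make the translation step concrete, the following is a minimal sketch (in Python, which the patent does not use) of how a processor system might aggregate raw inputs and map them to control commands; all names here (RawInput, translate, the command strings) are hypothetical.

```python
# A minimal sketch (not the patented firmware) of aggregating raw inputs
# from buttons, touch pads, motion sensors, and microphone, and translating
# them into high-level control commands.
from dataclasses import dataclass
from typing import Optional

@dataclass
class RawInput:
    button_clicks: tuple          # e.g., ("index",) for a single index click
    touch_gesture: Optional[str]  # e.g., "thumb_swipe_left"
    motion: Optional[tuple]       # (dx, dy) derived from the motion sensors
    voice_text: Optional[str]     # text produced by speech recognition

def translate(raw: RawInput) -> list:
    """Map raw input signals to high-level control commands."""
    commands = []
    if raw.button_clicks == ("index",):
        commands.append("LEFT_CLICK")
    if raw.touch_gesture == "thumb_swipe_left":
        commands.append("PAGE_BACK")
    if raw.motion:
        commands.append(("MOVE_CURSOR", raw.motion))
    if raw.voice_text:
        commands.append(("VOICE", raw.voice_text))
    return commands
```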
- Transceiver 2604 is a device that includes both a transmitter and a receiver. In at least one embodiment, transceiver 2604 includes at least an antenna. In at least one embodiment, transceiver 2604 receives signals, and transmits signals to external devices. Alternatively or additionally, controller 100 includes a separate transmitter and receiver.
- Memory system 2606 may include, for example, any one of, some of, any combination of, or all of a long term storage system, such as a hard drive; a short term storage system, such as random access memory; a removable storage system, such as a floppy drive or a removable drive; and/or flash memory.
- Memory system 2606 may include cache memory accessible by the processor 2602.
- Memory system 2606 may include one or more machine-readable mediums that may store a variety of different types of information.
- the term machine-readable medium is used to refer to any non-transient medium capable of carrying information that is readable by a machine.
- One example of a machine-readable medium is a non-transient computer-readable medium.
- memory system 2606 may include software for communicating with the computing device 1604, television 2504b, projector 2503, goggles 2402/2508, and/or other electronic devices.
- the software in the memory system 2606 may include machine instructions for transmitting and processing signals and generating control instructions based on input data.
- memory system 2606 may store various settings and/or configurations related to the various operation modes of the controller 100. For example, memory system 2606 may include configurations for the right hand mode and left hand mode.
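For illustration, here is a sketch of how such per-mode configurations might be stored and fetched; the keys, values, and structure are assumptions, not taken from the specification.

```python
# A minimal sketch, assuming the controller stores per-mode settings as
# key/value configuration; names and fields are illustrative.
HAND_MODES = {
    "right": {"primary_button": "index", "secondary_button": "middle",
              "horizontal_swipe_axis": +1},
    "left":  {"primary_button": "middle", "secondary_button": "index",
              "horizontal_swipe_axis": -1},  # mirror horizontal gestures
}

def load_config(memory: dict, hand_mode: str) -> dict:
    """Fetch the stored configuration for the selected hand mode."""
    return memory.get("hand_modes", HAND_MODES)[hand_mode]
```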
- Power management 2608 is a system that monitors and/or manages power status and/or consumption. In at least one embodiment, power management 2608 records and updates the power status for the light 502 to show indications of the power status. In at least one embodiment, power management 2608 manages power supply to different modules of the controller 100 in different modes of operation. In at least one embodiment, power management 2608 turns off the power or switches the controller 100 to a low-power state (e.g., idle mode, sleep mode, etc.) when the controller 100 is inactive.
- Light module 2610 is a module that is configured to control the light 502. In at least one embodiment, the light module 2610 controls whether the light 502 is on or off, its color, and/or the percentage of the light 502 that is illuminated.
- Infrared module 2612 is a module that is configured to generate and transmit digitally-coded pulses of infrared radiation to control external devices (e.g., the television 2504b) remotely.
- the transmitter of the infrared module 2612 may be a light emitting diode (LED) that is built into the head end 109 of the controller 100.
- in the television mode, when the user points the transmitter of the infrared module 2612 toward the television 2504b, the infrared module 2612 sends control commands remotely to the television 2504b to control functions and/or settings such as power, volume, channels, etc.
- infrared module 2612 may also send wireless signals to other electronic devices and/or computing devices.
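As an illustration of digitally-coded infrared pulses, the sketch below assumes an NEC-style remote-control protocol; the patent does not specify a coding scheme, and the address/command values are made up.

```python
# A minimal sketch of digitally-coded infrared pulses, assuming an
# NEC-style remote-control protocol. Durations are in microseconds.
def nec_pulse_train(address: int, command: int) -> list:
    """Return (mark, space) pairs for an 8-bit address and command."""
    def bits(byte):
        return [(byte >> i) & 1 for i in range(8)]  # LSB first
    frame = bits(address) + bits(address ^ 0xFF) + \
            bits(command) + bits(command ^ 0xFF)
    pairs = [(9000, 4500)]                 # leading mark and space
    for bit in frame:
        pairs.append((560, 1690 if bit else 560))  # long space encodes a 1
    pairs.append((560, 0))                 # final stop mark
    return pairs

# Example: power toggle for a hypothetical TV (address/command are made up).
pulses = nec_pulse_train(address=0x20, command=0x12)
```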
- Laser module 2614 is a module that is configured to control the laser 602, which was discussed in conjunction with FIG. 6. In an embodiment, laser module 2614 controls and/or activates the laser 602 in the presentation mode.
- Network control 2616 is a system that controls the network communication between the controller 100 and various electronic devices and/or computing devices.
- network control 2616 controls and manages the wireless and/or wired connections between the controller 100 and external devices.
- the network control 2616 may include a network interface to communicate with a mobile communication network, local networks, and/or the internet.
- controller 100 may include any one of, some of, any combination of, or all of electrical cables, fiber optic cables, and/or means of sending signals through air or water (e.g. wireless communications), for example.
- Some examples of means of sending signals through air include systems for transmitting electromagnetic waves such as infrared and/or radio waves and/or systems for sending sound waves.
- Motion sensors 2618 may include accelerometers and/or gyroscopes to detect movement and/or rotation of the controller 100.
- motion sensors 2618 may include single- and/or multi-axis models of accelerometers to detect magnitude and direction of the proper acceleration, and thus to sense orientation of the controller 100.
- motion sensors may include a gyroscope to help determine orientation and to measure rotational movement of the controller 100.
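One hedged example of sensor fusion consistent with this description is a complementary filter, sketched below; the specification does not mandate any particular fusion method, and the constants are illustrative.

```python
# A minimal sketch, assuming the controller fuses accelerometer tilt with
# integrated gyroscope rate using a complementary filter.
import math

def fuse_tilt(tilt_prev: float, gyro_rate: float,
              accel_y: float, accel_z: float,
              dt: float, alpha: float = 0.98) -> float:
    """Estimate tilt about one axis (radians) from gyro rate (rad/s)
    and accelerometer readings (in g)."""
    gyro_tilt = tilt_prev + gyro_rate * dt       # short-term: integrate gyro
    accel_tilt = math.atan2(accel_y, accel_z)    # long-term: gravity reference
    return alpha * gyro_tilt + (1 - alpha) * accel_tilt
```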
- Thumb touch pad 2620, index touch pad 2622, and middle touch pad 2624 are the touch pads formed by the touch sensors that are connected on top of the thumb button 105, index button 106, and middle button 107, respectively, which were discussed in conjunction with FIG. 1.
- Thumb button 2626, index button 2628, and middle button 2630 may be embodiments of the thumb button 105, index button 106, and middle button 107, respectively, which were discussed in conjunction with FIG. 1.
- Audio system 2632 is a system that is configured to receive, analyze, manage, interpret, and/or transmit audio data.
- the audio system 2632 receives voice data from the microphone 702 of the controller 100 or the microphone of the headset 2302, and then the processor system translates the data into voice commands using speech recognition and natural language understanding algorithms and methods.
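A minimal sketch of the second half of that pipeline, mapping already-recognized text to voice commands; the phrase table is hypothetical, and the speech-to-text step is assumed to be handled elsewhere.

```python
# A minimal sketch of mapping recognized speech to voice commands; the
# recognition step (speech-to-text) is assumed to be provided by the
# audio system, and the phrase table here is illustrative.
VOICE_COMMANDS = {
    "volume up": ("VOLUME", +1),
    "volume down": ("VOLUME", -1),
    "open browser": ("LAUNCH", "browser"),
    "next slide": ("PRESENTATION", "next"),
}

def to_voice_command(recognized_text: str):
    """Translate recognized text into a control command, if known."""
    return VOICE_COMMANDS.get(recognized_text.strip().lower())
```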
- the audio system 2632 receives audio data from computing device 1604 and transmits the audio data to be played by the headset 2302/2403 or speakers.
- the audio system 2632 receives audio data from the computer 1604 and transmits the audio data to the speaker of the television 2504b, or the headset 2302/2403, or the speakers 704a-b of the controller 100, depending on user selection.
- the audio system 2632 may control the volume and/or channels of audio data.
- Microphone 2634 may be an embodiment of the microphone 702, which was discussed in conjunction with FIG. 7A. In at least one embodiment, microphone 2634 receives voice data from the user and transmits the voice data to the audio system 2632 to be interpreted and translated to control commands.
- Speaker 2636 may be an embodiment of the speakers 704a-b, which were discussed in conjunction with FIG. 7B. In at least one embodiment, speaker 2636 may play the audio content and/or sound of system alerts and/or notification ringtones.
- Video system 2638 is a system that is configured to receive, control, manage, and/or transmit video data.
- the video system 2638 receives video data from computing device 1604 and transmits the video data to be played by selected display systems. For example, when the user opens up a movie or a video file on the computer 1604 using the controller 100, the video system 2638 receives video data from the computer 1604 and transmits the video data to the display 1602 or the set of goggles 2402 to play the movie or video file.
- FIG. 27A is a flowchart of an embodiment of a method 2700a of using the controller 100 to receive user inputs.
- step 2702 the controller 100 determines whether voice data is received by the microphone 702. If voice data is received, the method 2700a proceeds to step 2704. If not, the method 2700a proceeds to step 2708.
- step 2704 the microphone 702 sends the received voice data to the audio system 2632.
- step 2706 the audio system 2632 analyzes and interprets the received voice data using speech recognition and interpretation algorithms and methods, and generates voice commands.
- step 2708 the controller 100 determines whether the signals from the motion sensors 2618 are above a preset threshold. If the signals are above the threshold, the method 2700a proceeds to step 2710. If the signals are below the threshold, the method 2700a proceeds to step 2712.
- step 2710 if the signals detected by the motion sensors 2618 are above the threshold, the processor system 2602 analyzes the signals and translates them into parameters related to the orientation and/or rotation of the movement. As part of step 2710, the processor system 2602 translates the signals into cursor movements.
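A sketch of one plausible translation from angular rates to cursor displacements, with a dead zone playing the role of the preset threshold of step 2708; the gain and threshold values are assumptions.

```python
# A minimal sketch of turning motion-sensor readings into cursor movement,
# assuming yaw/pitch angular rates map to x/y cursor deltas.
def to_cursor_delta(yaw_rate: float, pitch_rate: float,
                    dt: float, gain: float = 400.0,
                    threshold: float = 0.02) -> tuple:
    """Return (dx, dy) in pixels for one sensor sample."""
    def axis(rate):
        if abs(rate) < threshold:   # ignore sensor noise / hand tremor
            return 0.0
        return rate * dt * gain
    return (axis(yaw_rate), axis(pitch_rate))
```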
- step 2712 the thumb touch pad 2620, index touch pad 2622, and middle touch pad 2624 determine whether touch operations are detected. If touch operations are detected, the method 2700a proceeds to step 2714. If not, the method 2700a proceeds to step 2716.
- step 2714 the processor system 2602 registers each touch operation signal to the corresponding touch pad.
- a single gesture (e.g., a single tap) is recorded on one touch pad.
- two tap operations are recorded consecutively on the same touch pad (e.g., double tap on the thumb touch pad 2620).
- two gestures are detected on two different touch pads simultaneously (e.g., in a zoom-in operation).
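The three touch cases above could be distinguished roughly as follows; the timing window and event representation are illustrative assumptions.

```python
# A minimal sketch of classifying the touch cases above; timing values
# and event layout are illustrative, not from the patent.
DOUBLE_TAP_WINDOW = 0.3  # seconds between taps to count as a double tap

def classify(events):
    """events: list of (timestamp, pad, kind) tuples, kind in {"tap","move"},
    sorted by timestamp."""
    taps = [e for e in events if e[2] == "tap"]
    pads = {e[1] for e in events}
    if len(pads) == 2 and all(e[2] == "move" for e in events):
        return "ZOOM"                      # two pads moving simultaneously
    if len(taps) == 2 and taps[0][1] == taps[1][1] \
            and taps[1][0] - taps[0][0] <= DOUBLE_TAP_WINDOW:
        return "DOUBLE_TAP"                # two consecutive taps, same pad
    if len(taps) == 1:
        return "SINGLE_TAP"
    return "UNKNOWN"
```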
- step 2716 the thumb button 2626, index button 2628, and middle button 2630 determine whether click operations are received. If click operations are received, the method 2700a proceeds to step 2718. If not, the method 2700a proceeds to step 2720.
- step 2718 if click operations are received, the processor system 2602 registers each click operation signal to the corresponding button.
- a single click on one button is interpreted as a command.
- a prolonged signal indicating that a button is pressed down for a period of time is interpreted as another command.
- two consecutive clicks on the same button are interpreted as another command (e.g., double click on the index button 2628).
- commands on different buttons are received simultaneously.
- the steps 2712 and 2716 take place simultaneously, and touch operations and click operations may be detected simultaneously. For example, the index button 2628 is held down while the thumb swipes or writes the letter "C" on the thumb touch pad 2620.
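A sketch of such chorded input, interpreting a gesture differently while a button is held; the gesture names and resulting commands are hypothetical.

```python
# A minimal sketch of combining a held button with a touch gesture
# (the chorded input described above); names are illustrative.
def chorded_command(buttons_held: set, gesture: str):
    """Interpret a touch gesture differently while a button is held down."""
    if "index" in buttons_held and gesture == "thumb_swipe":
        return "SCROLL"                  # swipe while the index button is held
    if "index" in buttons_held and gesture == "thumb_write_C":
        return ("SHORTCUT", "copy")      # hypothetical handwriting shortcut
    return ("GESTURE", gesture)          # no chord: plain gesture
```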
- step 2720 the processor system 2602 combines the recorded voice commands, cursor movements, touch operations, and/or click operations to generate control commands.
- the processor system 2602 translates the input signals to control commands (while in one mode a particular signal may be interpreted as one command, and while in another mode the same signal is interpreted as another command).
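A minimal sketch of that mode-dependent interpretation, where one signal yields a different command in each mode; the mode names follow the specification, but the signal and command names are made up.

```python
# A minimal sketch of mode-dependent interpretation: the same raw signal
# maps to different commands depending on the current operation mode.
MODE_MAP = {
    "tv":           {"thumb_swipe_up": "CHANNEL_UP"},
    "presentation": {"thumb_swipe_up": "NEXT_SLIDE"},
    "control":      {"thumb_swipe_up": "SCROLL_UP"},
}

def interpret(mode: str, signal: str) -> str:
    return MODE_MAP.get(mode, {}).get(signal, "IGNORED")
```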
- each of the steps of method 2700a is a distinct step.
- steps 2702-2720 may not be distinct steps.
- method 2700a may not have all of the above steps and/or may have other steps in addition to or instead of those listed above. The steps of method 2700a may be performed in another order.
- FIG. 27B is a flowchart of an embodiment of a method 2700b of using the controller 100 in various modes.
- the processor system 2602 determines and/or records which operation mode the controller 100 is in.
- step 2722 and the step 2720 of FIG. 27A may be merged into one step.
- step 2724 if the controller 100 is determined to be in the TV mode, the processor system 2602 sends the control commands to the infrared module 2612. As part of the step 2724, the infrared module 2612 sends wireless signals to control the television 2504b.
- step 2726 if the controller 100 is determined to be in the presentation mode, the processor system 2602 sends the control commands to the laser module 2614 as well as sending control commands to the computing device 1604 to display images or presentation slides on a selected display device. As part of the step 2726, the laser module 2614 activates the laser 602 based on the user commands.
- step 2728 if the controller 100 is determined to be in the control mode, the processor system 2602 further determines whether the controller 100 is connected locally to the computing device 1604 or other devices. If no data cable is detected as connecting the controller 100 to the computing device 1604 or other devices, the method 2700b proceeds to step 2730. If a data cable is detected that connects the controller 100 to the computing device 1604 or other devices, the method 2700b proceeds to step 2732. As part of step 2728, a wireless connection is established when no data cable is detected.
- step 2730 the transceiver 2604 transmits the control commands wirelessly to the computing device 1604 or other devices (e.g., the set of goggles 2402 or the headset 2302).
- step 2732 the controller 100 transmits the control commands to the computing device 1604 or other devices via cables.
- the controller 100 may communicate with one device wirelessly and with another device via a cable, and thereby steps 2730 and 2732 may take place simultaneously.
- the controller 100 may control the computing device 1604 to display the user interface on the display 1602 and/or the set of goggles 2402, and meanwhile the controller 100 may transmit audio data to the headset 2302/2403 via a cable.
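A sketch of the transport selection in steps 2728-2732, preferring a detected data cable and falling back to the wireless transceiver, with both transports usable at once for different devices; the device interface here is an assumption.

```python
# A minimal sketch of the transport choice in steps 2728-2732; the device
# object interface (.cable_connected, .send_via_cable, .send_via_transceiver)
# is assumed, not from the patent.
def send_commands(commands, devices):
    """Send control commands over the best transport for each device."""
    for dev in devices:
        if dev.cable_connected:           # step 2732: wired path
            dev.send_via_cable(commands)
        else:                             # step 2730: wireless path
            dev.send_via_transceiver(commands)
```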
- step 2734 the controller 100 records data about controller mode, power status, and error status.
- step 2736 the controller 100 sends the data about controller mode, power status, and error status to the light module 2610.
- the light module 2610 or the processor system 2602 analyzes the data received and generates control instructions based on the data received.
- the light module 2610 or the processor system 2602 sends the control instructions to the light 502.
- the light 502 receives the control instructions and displays light indications to indicate the controller mode, power status, and error status to the user.
- each of the steps of method 2700b is a distinct step.
- steps 2722-2738 may not be distinct steps.
- method 2700b may not have all of the above steps and/or may have other steps in addition to or instead of those listed above. The steps of method 2700b may be performed in another order.
- FIG. 27C is a flowchart of an embodiment of a method 2700c of using the controller 100 to display video and/or play audio data.
- step 2740 the controller 100 determines if video data is received. If video data is received, the method 2700c proceeds to step 2742. If not, the method 2700c proceeds to step 2746.
- step 2742 the controller 100 sends the video data to the video system 2638 for processing.
- the video system 2638 converts the video data to a compatible format that may be played by a display system that is selected by the user (e.g., the display 1602, the set of goggles 2402, the display 1605 of the computing device 1604, the projector 2503).
- step 2744 the controller 100 sends the video data to the display that is selected by the user.
- step 2746 the controller 100 determines if audio data is received. If audio data is received, the method 2700c proceeds to step 2748. If not, the method 2700c ends.
- step 2748 the controller 100 sends the audio data to the audio system 2632 for processing.
- the audio system 2632 processes and/or converts the audio data to a compatible format that may be played by speakers that are selected by the user (e.g., the speakers 704a-b of the controller 100, the speakers of the television 1603, the speakers of the headset 2302, the speakers that are connected to the computing device 1604).
- step 2750 the controller 100 sends the audio data to the speakers that are selected by the user.
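A sketch of the routing in method 2700c, forwarding converted media to the user-selected sinks; the packet layout and conversion helper are assumptions.

```python
# A minimal sketch of method 2700c's routing: received media is converted
# by the video/audio systems and forwarded to the user-selected sinks.
def route_media(packet, selected_display, selected_speakers):
    """packet: dict with optional "video" and "audio" payloads."""
    if "video" in packet:                         # steps 2740-2744
        frame = convert_for(selected_display, packet["video"])
        selected_display.play(frame)
    if "audio" in packet:                         # steps 2746-2750
        samples = convert_for(selected_speakers, packet["audio"])
        selected_speakers.play(samples)

def convert_for(sink, payload):
    # Placeholder for conversion to a format the selected sink can play.
    return payload
```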
- each of the steps of method 2700c is a distinct step.
- steps 2740-2750 may not be distinct steps.
- method 2700c may not have all of the above steps and/or may have other steps in addition to or instead of those listed above. The steps of method 2700c may be performed in another order.
- the controller 100 may control the display and/or speakers to play multimedia content using the method 2700c, while the user inputs commands using the method 2700a during playback (e.g., to control the volume, pause, play back, fast forward, etc.).
- the input user commands may be sent out in various modes as depicted in method 2700b.
- FIG. 28 is a flowchart of an embodiment of a method 2800 of assembling the controller 100.
- step 2802 the bottom part and upper part of housing 101 are formed.
- the housing 101 may include a bottom part that is covered by an upper part.
- the upper part may include at least the upper surface 102 and front slope 103.
- the bottom part includes the rest of housing 101.
- the upper part fits on the bottom part similar to the way a cover fits on a container.
- step 2804 touch sensors are connected to the top of thumb button 105, index button 106, and middle button 107 to form thumb touch pad 2620, index touch pad 2622, and middle touch pad 2624, respectively.
- step 2806 the index button 106 and middle button 107 are connected to the bottom part of housing 101, at the head end 109. As part of step 2806, the index button 106 and middle button 107 are connected to two button switches, respectively.
- step 2808 the speakers 704a-b and connector 706 are connected to the bottom part of the housing 101, at the tail end 108.
- the switch 404 is connected to the bottom part of the housing 101.
- the motion sensors 2618 and transceiver 2604 are affixed in the bottom part of housing 101.
- a built-in rechargeable battery is affixed in the bottom part of housing 101.
- a battery holder may be connected in the bottom part of housing 101.
- the battery holder in the housing 101 may include leads from the battery to a power supply/adapter system that formats (e.g., converts the voltage, current, and/or frequency of) the electricity to one that is appropriate for the rest of the system of the controller 100.
- a battery (batteries) is (are) placed inside the battery holder for providing electricity to the controller 100.
- batteries may not be installed during manufacture and/or for sale, and a user may insert the batteries before starting to use the controller 100.
- step 2812 the circuit board is affixed inside the bottom part of the housing 101.
- the circuit board is connected via wires with the button switches and touch sensors that are connected to the index button 106 and middle button 107, respectively, the switch 404, the built-in battery, the motion sensor 2618, the transceiver 2604, the speakers 704a-b, and connector 706.
- step 2814 the light 502 is affixed on the top surface 102 of the upper part of the housing 101, close to the head end 109.
- the thumb button 105 is connected to the upper part of the housing 101, inside the light 502.
- the buttons may be connected to the housing 101 first, and then the touch sensors may be attached to the top of the buttons.
- the thumb button 105 is connected to a button switch.
- the button switch for the thumb button 105 may be connected in the bottom part of the housing 101.
- step 2818 the microphone 702 is connected to the upper part of the housing 101, close to the tail end 108.
- the laser 602 is connected to the front slope 103 on the upper part of the housing 101.
- step 2820 the circuit board is connected via wires to the light 502, touch sensors and button switch of the thumb button 105, microphone 702, and laser 602.
- step 2822 the upper part of the housing 101 is connected to the bottom part of the housing 101.
- the upper part is glued to the bottom part of the housing 101 to permanently connect the two parts of the housing 101 together.
- each of the steps of method 2800 is a distinct step.
- steps 2802-2822 may not be distinct steps.
- method 2800 may not have all of the above steps and/or may have other steps in addition to or instead of those listed above. The steps of method 2800 may be performed in another order.
- FIG. 29 shows a cross-sectional side view 2900 of an embodiment of the controller 100 of FIG. 1.
- the device in the view 2900 includes at least a housing 2901, a top surface 2902, a front slope 2903, a concave area 2904, a thumb button 2905, an index button 2906, a middle button 2907, a tail end 2908, a head end 2909, a rack 2910, a thumb button bar 2911, a pivot 2912, an index button bar 2913, a pivot 2914, a middle button bar 2915, a pivot 2916, button switches 2917, 2918, and 2919, a circuit 2920, supporting structures 2922, a light 2930, a thumb touch pad 2931, an index touch pad 2932, and a middle touch pad 2933.
- the device in view 2900 may not have all of the elements or features listed and/or may have other elements or features instead of or in addition to those listed.
- FIG. 29 shows a cross-sectional side view 2900 of an embodiment of the controller 100, demonstrating internal mechanical and electrical components.
- the housing 2901, top surface 2902, front slope 2903, concave area 2904, thumb button 2905, index button 2906, middle button 2907, tail end 2908, and head end 2909 are embodiments of the housing 101, top surface 102, front slope 103, concave area 104, thumb button 105, index button 106, middle button 107, tail end 108, and head end 109, which were discussed in conjunction with FIG. 1.
- the light 2930 is an embodiment of the light 502, which was discussed in conjunction with FIG. 5.
- touch sensors are connected on the thumb button 2905, index button 2906, middle button 2907 to form touch pads for detecting finger movements and/or gestures, respectively, in manners that were discussed throughout the specification.
- Rack 2910 is a structure that is fixed inside the housing 2901, close to the head end 2909. In at least one embodiment, the rack 2910 is perpendicular to the top surface 2902. In at least one embodiment, the rack 2910 is affixed to the inner surface of the housing 2901 at the bottom of the head end 2909. Alternatively or additionally, the rack 2910 is connected to a portion of the housing 2901 in between the index button 2906 and middle button 2907, and/or to the front slope 2903 from inside the housing 2901. In at least one embodiment, the rack 2910 supports button switches of the thumb button 2905, index button 2906, and/or middle button 2907. In other embodiments, the rack 2910 may include other shapes and/or structures.
- Thumb button bar 2911 is connected on one end to a bottom side of the thumb button 2905, and the other end is pivotally connected to the inner side of the top surface 2902. In an embodiment, when the thumb button 2905 is pressed (and released), the thumb button 2905 and thumb button bar 2911 pivot to activate (and deactivate) a button switch, which may make or break a circuit, thereby indicating when thumb button 2905 was depressed.
- Pivot 2912 is a pivot that holds the thumb button bar 2911 and a protrusion on the inner surface of the housing 2901 together. Pivot 2912 allows the thumb button bar 2911 to rotate about the pivot 2912 while one end is attached to the housing 2901. In at least one embodiment, the pivot 2912 is spring loaded to bias the thumb button 2905 to stay in an unpressed position until the user clicks the thumb button 2905.
- Index button bar 2913 is connected on one end to an inner side of the index button 2906, and the other end is pivotally connected to a portion of the rack 2910 close to the front slope 2903.
- the index button bar 2913 is pivotally connected to a protrusion inside the housing 2901 close to the front slope 2903.
- when the index button 2906 is pressed (and released), the index button 2906 and index button bar 2913 pivot to activate (and deactivate) a button switch, which may make or break a circuit, thereby indicating when index button 2906 was depressed.
- Pivot 2914 is a pivot that holds the index button bar 2913 and the rack 2910 together. Pivot 2914 allows the index button bar 2913 to rotate about the pivot 2914 while one end is attached to the rack 2910. In at least one embodiment, the pivot 2914 is spring loaded to bias the index button 2906 to stay in an unpressed position until the user clicks the index button 2906.
- Middle button bar 2915 is connected on one end to an inner side of the middle button 2907, and the other end is pivotally connected to a protrusion inside the housing 2901 at the bottom of the head end 2909. Alternatively, the middle button bar 2915 may be pivotally connected to the rack 2910. In an embodiment, when the middle button 2907 is pressed (and released), the middle button 2907 and middle button bar 2915 pivot to activate (and deactivate) a button switch, which may make or break a circuit that indicates when middle button 2907 was depressed.
- Pivot 2916 is a pivot that holds the middle button bar 2915 and the protrusion of the housing 2901 together. Pivot 2916 allows the middle button bar 2915 to rotate about the pivot 2916 while one end is attached to the housing 2901. In at least one embodiment, the pivot 2916 is spring loaded to bias the middle button 2907 to stay in an unpressed position until the user clicks the middle button 2907.
- Button switches 2917, 2918, and 2919 are electric switches or sensing devices that detect click operations on the thumb button 2905, index button 2906, and middle button 2907, respectively.
- the button switches 2917, 2918, and 2919 are micro-switches.
- the button switch 2917 is connected to the bottom side of the thumb button 2905, and is activated when the thumb button 2905 is clicked or held down.
- the button switches 2918 and 2919 are connected to the inner side of the index button 2906 and middle button 2907, respectively, and are activated when the index button 2906 and middle button 2907 are clicked or held down, respectively.
- Circuit 2920 includes a circuit board that mechanically supports and electrically connects electronic components of the circuit 2920.
- Circuit 2920 may include the circuitry shown in block diagram 2600 (FIG. 26).
- the circuit board of circuit 2920 may be single sided (one layer), or double sided (two layers), or multi-layer (outer and inner layers).
- the circuit 2920 receives and processes signals corresponding to the finger operations, voice commands, and/or motion of the controller 100 to communicate with and control the computing device 1604 and/or other devices.
- the circuit board may be attached to the rack 2910 and/or other supporting structures inside the housing 2901. In at least one embodiment, more than one circuit board may be affixed in the housing 101 of the controller 100.
- Supporting structures 2922 are structures that hold and support the circuit 2920 inside the housing 2901.
- the supporting structures 2922 include a plurality of bars and/or tabs that are fixed to the inner surface of the housing 2901, and the circuit 2920 is attached to the supporting structures 2922.
- Thumb touch pad 2931, index touch pad 2932, and middle touch pad 2933 are embodiments of the thumb touch pad, index touch pad, and middle touch pad that were attached on top of the thumb button 105, index button 106, and middle button 107, respectively, which were discussed in conjunction with FIG. 1.
- the index button 106 and middle button 107 may be merged into one button, which is connected with touch sensors.
- the merged single button may be manipulated by both the index and middle fingers, or may be controlled using the index finger only.
- the shape and/or size of the controller may be different from that of the controller 100 depicted in FIGs. 1-7, as long as the controller can be held in human hands in a comfortable posture so as to reduce the muscle stress of the user when operating the controller.
- the position of each button may also be different from that of the controller 100, as long as the dedicated finger for each button can rest and move comfortably on the button during the operation of the controller.
- the home computing experience is segmented when the user needs to work on different computing/electronic devices that may be located in different places in a house.
- a personal computer may be set up on a desk in the study room
- a flat screen TV may be set up as the entertainment center in the living room
- laptop computers or tablets may be found in various places such as bedroom or dining room
- a smart phone may be carried by the user wherever he goes. Therefore, it may be desired in at least some circumstances for the user to communicate with a computing device through a single control device that incorporates navigation, text input, audio communication, and display mirroring capabilities. It may also be desired in at least some circumstances for the user to control various computing/electronic devices using a single control device.
- the user can use the controller 100 as a single human-machine interface to interact with multiple computing/electronic devices.
- the user may sit comfortably on a couch in the living room and use the controller 100 to control the operations on the personal computer, viewing the user interface on the TV screen or via a projector, or watch movies or play video games using a pair of virtual reality goggles and a headset.
- the user may hold the controller 100 and relax his arm and wrist in a natural posture, while using his fingers to click and/or make gestures on the buttons and/or touch pads of the controller 100.
- the user may also move the controller 100 in the air for further control of the cursor movement and/or handwriting recognition. Furthermore, the controller 100 may detect voice commands from the user.
- the controller 100 allows the user to control various devices and/or operations in different modes (e.g., to control the TV in the television mode, to control the laser in the presentation mode, to control a soft keyboard and text editing in the text input mode, etc.).
- At least one type of software may be used to assist the controller 100 in achieving all user-intended operations.
- the driver translates the control commands into corresponding operations, such as moving the cursor, activating the soft keyboard, opening an application, or playing video and audio streams.
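A sketch of such a host-side driver dispatch; the command names match the earlier sketches, and the os_api object is hypothetical.

```python
# A minimal sketch of a host-side driver loop that dispatches the
# controller's commands to OS-level actions; names are illustrative.
def dispatch(command, os_api):
    kind, *args = command if isinstance(command, tuple) else (command,)
    if kind == "MOVE_CURSOR":
        os_api.move_cursor(*args[0])       # (dx, dy) pixel delta
    elif kind == "LEFT_CLICK":
        os_api.click("left")
    elif kind == "LAUNCH":
        os_api.open_application(args[0])   # e.g., "browser"
    elif kind == "VOLUME":
        os_api.adjust_volume(args[0])      # +1 / -1 step
```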