US20140024456A1 - Changing icons on user input device - Google Patents

Changing icons on user input device

Info

Publication number
US20140024456A1
US20140024456A1 (Application US 13/553,636)
Authority
US
United States
Prior art keywords
display device
pushable
display
button
icon
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/553,636
Inventor
Bryon Ashley
Quintin Morris
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Microsoft Corp filed Critical Microsoft Corp
Priority to US13/553,636 priority Critical patent/US20140024456A1/en
Assigned to MICROSOFT CORPORATION reassignment MICROSOFT CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: Ashley, Bryon, MORRIS, QUINTIN
Priority to PCT/US2013/051169 priority patent/WO2014015198A1/en
Publication of US20140024456A1 publication Critical patent/US20140024456A1/en
Assigned to MICROSOFT TECHNOLOGY LICENSING, LLC reassignment MICROSOFT TECHNOLOGY LICENSING, LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MICROSOFT CORPORATION
Abandoned legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/24 Constructional details thereof, e.g. game controllers with detachable joystick handles
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/25 Output arrangements for video game devices
    • A63F13/26 Output arrangements for video game devices having at least one additional display device, e.g. on the game controller or outside a game booth
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/23 Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console
    • A63F13/235 Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console using a wireless connection, e.g. infrared or piconet
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1043 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being characterized by constructional details
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/20 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform
    • A63F2300/203 Image generating hardware
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/301 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device using an additional display connected to the game console, e.g. on the controller

Definitions

  • User input devices, such as hand-held game controllers, include one or more input mechanisms for effecting control over a computing device or applications presented thereby.
  • Input mechanisms, such as buttons and directional pads, may include generic, fixed markings to identify the input mechanisms.
  • Embodiments are disclosed that relate to pushable buttons comprising individually controllable display devices.
  • a hand-held game controller comprising a body and a plurality of pushable buttons movably coupled to the body, each pushable button comprising a transparent cap and an individually controllable display device arranged behind the transparent cap such that the display device is viewable through the transparent cap.
  • the hand-held game controller further comprises a controller configured to receive an input from an application requesting display of a first icon on a first display device of a first pushable button of the plurality of pushable buttons and a second icon on a second display device of a second button of the plurality of pushable buttons, and, in response, display the first icon on the first display device and display the second icon on the second display device.
  • FIG. 1 shows an embodiment of a hand-held video game controller in an example use environment.
  • FIG. 2 shows an example of an embodiment of a pushable button comprising an individually controllable display device.
  • FIG. 3 shows a top-view of an embodiment of a pushable button comprising an individually controllable display device.
  • FIGS. 4-5 show an example use case scenario for an embodiment of a hand-held video game controller comprising a plurality of pushable buttons.
  • FIG. 6 shows a process flow depicting an embodiment of a method of operating a hand-held video game controller.
  • FIG. 7 shows a non-limiting example of an embodiment of a computing device.
  • an input device may include one or more input mechanisms (e.g., buttons, directional pads, triggers, etc.) that may be user-actuatable in order to effect control over a computing device and/or over applications presented thereby (e.g., video games).
  • Said input mechanisms typically include one or more fixed markings, such as arrows, alphanumeric characters, symbols, etc., to identify the input mechanism.
  • an input mechanism may not be utilized in a consistent manner, for example, for different applications. Instead, customized functionality may be defined by the computing device manufacturer, the application developer, the user, and/or other entity. For example, the “up” direction of a directional pad may effect forward motion of an on-screen character in a first set of applications, whereas the “up” direction may fire an on-screen weapon in a second set of applications.
  • a user of an input device may rely on their own memory and/or may access on-screen help (e.g., a controller legend) in order to ascertain which input mechanism is associated with a given activity, function, weapon, ability, etc.
  • various embodiments are disclosed herein that relate to the display of context-specific icons on such input devices to allow the user to quickly glance at the controller to recall the assignments.
  • a user may be able to quickly and accurately ascertain the functionality of a given input mechanism without referring to additional instructional materials (e.g., instruction manuals, etc.).
  • FIG. 1 shows an example use environment 100 comprising an embodiment of a hand-held video game controller 102 .
  • Controller 102 comprises body 104 configured to be held in one or both hands.
  • body 104 may be “hollow” such that body 104 forms one or more cavities in which other elements of controller 102 , in whole or in part, are oriented.
  • Body 104 may be further configured to provide support and/or positioning for one or more input mechanisms.
  • controller 102 may include input mechanisms such as joystick 106 and/or buttons 108 movably coupled to body 104 . These input mechanisms may be usable to effect control over computing device 110 and/or over applications presented thereby.
  • Controller 102 may be used, for example, to control the motion of game character 112 of application 113 rendered by computing device 110 and displayed via display device 114 .
  • controller 102 may further comprise a communication subsystem configured to provide unidirectional or bidirectional communication with computing device 110 to allow controller 102 to send control information to computing device 110 for controlling the computing device, and also to receive communications from the computing device 110 .
  • the control information sent to the computing device 110 may comprise, for example, state information regarding the current state of joystick 106 and/or buttons 108 .
  • controller 102 may communicate with computing device 110 via a wireless connection, a wired connection, or a combination thereof.
  • Wireless communication may be performed via infrared light, visible light, radio-frequency (e.g., 802.11 or mobile telephony), combinations thereof, or via any other suitable mechanism.
  • controller 102 may further comprise one or more input mechanisms each comprising an individually controllable display device, embodiments of which are shown as pushable buttons 116 .
  • pushable button refers to any input mechanism configured to be user-actuatable by a pressing motion (e.g., via thumb or other finger) that causes movement of the button into a body of the controller to actuate an actuatable component.
  • Independently controllable displays may be configured to provide information, such as icons, to a user of controller 102 to convey the functionality provided by actuation of the corresponding pushable button.
  • the term “icon” as used herein refers to any image, textual and/or graphical, displayed via an individually controllable display device.
  • buttons 116 may be configured to alternatively display alphanumeric icons 118 and graphical icons 120 .
  • during use of computing device 110 (e.g., during interaction with application 113), specific actions or activities (e.g., jump, crouch, run, change weapon, etc.) may be assigned to buttons 116, and corresponding icons may be displayed thereon.
  • the icons may be updated to reflect an updated state (e.g., updated functionality of the associated input mechanism).
  • Controller 102 may further include one or more directional pads 122 each comprising an individually controllable display device and disposed on body 104 apart from pushable buttons 116 .
  • directional pad 122 may be configured to display a variety of icons, including, but not limited to, directional icons 124 (e.g., arrows similar to “standard” directional pad markings), graphical icons 126 (e.g., “run” icon), and textual icons (e.g., international and/or extended character sets to facilitate user input of text).
  • directional pad 122 may comprise a unitary body that is user-actuatable in a plurality of directions 123 (e.g., up, down, left, right, etc.). Accordingly, each of directions 123 may comprise a corresponding actuatable component in order to register user-actuation of the direction. Regardless of the number and orientation of the directions, directional pad 122 may comprise a monolithic transparent directional cap spanning the plurality of directions 123 such that a single directional pad display device may be utilized for the plurality of directions.
  • the display device may be substantially coextensive with the directional cap (e.g., cross-shaped, etc.) such that the display device is disposed substantially within the transparent directional cap.
  • the display device may comprise a quadrilateral display area (e.g., 16:9 display, 4:3 display, square display, etc.) or other shape such that a portion of the display device is viewable through the directional cap while the remainder of the display device is obscured via the directional cap and/or the controller body.
  • the underside of the display device may be configured to interact with the plurality of actuatable components.
  • a plurality of individually controllable display devices may be disposed beneath the transparent cap.
  • directional pads may have various configurations without departing from the scope of the present disclosure.
  • an input mechanism comprising an individually controllable display device may be configured to display any configuration and combination of “fixed” or “dynamic” icons.
  • Fixed icons are predefined icons that may be included with controller 102 and/or computing device 110 (e.g., via one or more data structures within non-volatile memory).
  • each of pushable buttons may be configured to display one or more alphanumeric icons (e.g., icon 118 ) and/or one or more graphical icons (e.g., icon 120 ).
  • Applications configured to interact with the individually controllable display devices (e.g., application 113) may be capable of selecting and displaying one of said fixed icons on each individually controllable display device.
  • fixed icons may have any suitable configuration, including, but not limited to, generic alphanumeric characters, generic images (e.g., media controls, etc.), and/or geometric shapes.
  • icons may be “dynamic.” Dynamic icons may be specific to one or more applications presented by computing device 110, and may enable application developers to provide custom icons suited for the particular digital experience (e.g., video game, media player, etc.). For example, in media player scenarios, one or more media functions may be presented to the user (e.g., playback controls, playlist controls, etc.) in order to provide a more customized and intuitive user experience. It will be appreciated that in such scenarios, there may exist any number and configuration of dynamic icons as provided by applications utilizing the controller 102.
  • each of pushable buttons 116 may be configured to display one or more “default” icons. Said default icons may be displayed by one or more pushable buttons 116 upon startup of controller 102 , during interaction with applications that do not support the individually controllable display devices, during interaction with applications that are not currently utilizing a given button, and/or in any other scenario where display of a fixed icon or a dynamic icon via pushable buttons 116 is not requested.
  • a default icon may comprise an alphanumeric indicator (e.g., A, B, X, Y, etc.).
  • a default icon may comprise a blank image (i.e. all pixels on or all pixels off). It will be appreciated that these scenarios are presented for the purpose of example, and are not intended to be limiting in any manner.
  • any suitable display technologies may be utilized to display icons as disclosed herein.
  • in the instance of fixed icons, as the set of fixed icons does not change over time, it may be advantageous to use a “segmented” display technology, such as electronic paper.
  • each of the fixed icons may be represented by activating/de-activating one or more display “segments.”
  • to provide greater flexibility in providing dynamic icons, a pixel-based display technology (e.g., active matrix) may be used.
  • It will be appreciated that these configurations are presented for the purpose of example, and that any suitable display technology may be utilized without departing from the scope of the present disclosure.
  • Other example display technologies include, but are not limited to, liquid crystal displays, organic light emitting device displays, and projection displays.
  • a game controller in accordance with embodiments of the present disclosure may comprise any suitable combination and configuration of pushable buttons and individually controllable display devices, and said individually controllable display devices may be configured to display any configuration of dynamic icons, fixed icons, and/or default icons.
  • the behavior (e.g., timing, update triggers, etc.) may be determined by the controller, a communicatively coupled computing device (e.g., computing device 110), an application presented by the computing device (e.g., application 113), and/or any combination thereof.
  • FIG. 2 shows a sectional view of an embodiment of a pushable button 200 comprising an individually controllable display device 202.
  • Button 200 comprises transparent cap 204 configured to be actuated by the user.
  • transparent cap 204 may have any suitable configuration (e.g., concave, flat, angled, etc.) without departing from the scope of the present disclosure.
  • individually controllable display device 202 is disposed under transparent cap 204 such that display device 202 is viewable through the top of transparent cap 204 .
  • button 200 further comprises a support structure 206 located behind the individually controllable display device.
  • Support structure 206 may take any suitable form and may be coupled to transparent cap 204 via any suitable mechanism or combination of mechanisms.
  • transparent cap 204 may comprise one or more retaining tabs 208 disposed along the inner surface of the transparent cap and configured to interact with edge 210 of support structure 206 .
  • tabs 208 may comprise plural discrete features formed on an inside surface of transparent cap 204, whereas in other embodiments, the inner surface of transparent cap 204 may comprise a substantially continuous feature (e.g., flange) configured to interact with edge 210 of support structure 206 instead of, or in addition to, retaining tabs 208.
  • one or more tabs may be formed on support structure 206 for engagement with complementary features on transparent cap 204 .
  • support structure 206 may be retained within transparent cap 204 via an adhesive.
  • Display device 202 may be further retained within the transparent cap 204 via any suitable mechanism or combination of mechanisms. In some embodiments, display device 202 may be retained via friction with transparent cap 204 and support structure 206. In other embodiments, display device 202 may be affixed to transparent cap 204 and/or support structure 206 via adhesive. In yet other embodiments, display device 202 may be mechanically coupled to transparent cap 204 and/or support structure 206 via one or more mechanical features (e.g., complementary features, pressure fittings, screws, etc.).
  • button 200 is disposed within an opening of body 212 (e.g., body 104 of FIG. 1). Accordingly, upon user-actuation of transparent cap 204, button 200 is configured to move in a direction substantially perpendicular to body 212. In order to retain button 200 within the opening, the button may further comprise flange 214 configured to interact with an inner surface of body 212. In other embodiments, button 200 may comprise one or more discrete tabs configured to retain the button within the opening instead of, or in addition to, flange 214.
  • Button 200 further comprises actuatable component 216 coupled to board 218 (e.g., multi-layer printed circuit board) and configured to translate user actuation of the button into one or more representative analog and/or digital electrical signals.
  • Actuatable component 216 may comprise any suitable mechanism or combination of mechanisms, including, but not limited to, mechanical sensors (e.g., tactile switch, membrane switch, etc.), optical sensors (e.g., optical encoder, optical break sensor, etc.), magnetic sensors (e.g., magnetic reed switch), and/or capacitive sensors.
  • button 200 may further comprise one or more mechanisms configured to mechanically couple actuatable component 216 and one or more of transparent cap 204 , display device 202 , and support structure 206 .
  • button 200 may comprise rod 220 configured to couple actuatable component 216 to support structure 206 .
  • actuatable component 216 may be configured to interact directly with transparent cap 204 and/or support structure 206 .
  • actuatable component 216 may be disposed below flange 214 such that user actuation of transparent cap 204 effects interaction between flange 214 and actuatable component 216. It will be appreciated that these configurations are presented for the purpose of example, and are not intended to be limiting in any manner.
  • Board 218 may be disposed within body 212 to provide structural and/or electrical interfaces for button 200. Board 218 may be further configured to electrically couple display device 202 to one or more electrical components (e.g., display controller, non-volatile memory, communication subsystem, etc.). Accordingly, display device 202 may comprise one or more electrical connections 222 (e.g., ribbon cable, flexible flat cable, etc.) to provide said coupling. Further, support structure 206 may comprise one or more vias 224 to accommodate electrical connections 222. As illustrated, via 224 may comprise an opening through a top portion of support structure 206.
  • via 224 may further comprise one or more features extending below the top portion of support structure 206 and configured to at least partially enclose electrical connections 222 . Said features may provide additional routing and support of electrical connections 222 within button 200 , and may therefore decrease mechanical fatigue experienced by electrical connections 222 (e.g., via pinching, friction, etc.).
  • via 224 is illustrated as being disposed through the top portion of support structure 206 , it will be appreciated that vias 224 and electrical connections 222 may have any suitable configuration, examples of which are discussed in greater detail below with reference to FIG. 3 .
  • button 200 may represent a single “direction” of the directional control pad.
  • the directional pad may comprise a transparent directional pad cap (e.g., cap 204) and a directional pad display device (e.g., display device 202) located behind the transparent directional pad cap such that the directional pad display device is viewable through the transparent directional pad cap.
  • a directional pad support structure (e.g., support structure 206) may be included to support the directional pad display device against the transparent cap.
  • a directional pad may comprise a single directional pad cap and a plurality of actuatable components (e.g., one actuatable component per “direction”).
  • FIG. 3 shows a top view of pushable button 200 comprising individually controllable display device 202 (illustrated as displaying an “X” icon) viewable through transparent cap 204.
  • button 200 comprises support structure 206 retained within transparent cap 204 , the support structure being configured to support display device 202 against transparent cap 204 .
  • button 200 is illustrated as being substantially circular, it will be appreciated that pushable buttons as disclosed herein may comprise any suitable shape without departing from the scope of the present disclosure.
  • support structure 206 may comprise one or more vias to accommodate electrical connections to display device 202 .
  • Such vias may be disposed through the top portion of the support structure, as illustrated at 224 .
  • such embodiments may further comprise one or more features extending below the top portion in order to provide additional routing.
  • such vias also may be formed along one or more side portions of the support structure such that the one or more electrical connections are disposed between the support structure and transparent cap, as illustrated at 226 .
  • via 226 may further comprise one or more elements configured to at least partially enclose the electrical connections.
  • the electrical connections may be disposed freely within via 226 and/or may be mechanically coupled thereto via one or more mechanisms (e.g., tabs, loops, adhesive, etc.). It will be appreciated that pushable buttons may comprise any combination and configuration of vias to accommodate the one or more electrical connections of the individually controllable display device without departing from the scope of the present disclosure.
  • FIGS. 4 and 5 show a plurality of pushable buttons 400 displaying various icons (e.g., icons for a first-person combat video game).
  • first pushable button 402 of the plurality of pushable buttons 400 is displaying icon 404 , illustrated as a default “X” icon.
  • button 402 may be configured to display an icon different than icon 404 .
  • FIG. 5 shows pushable buttons 400 after user actuation of button 402 .
  • button 402 is displaying icon 408 in the form of a grenade.
  • further actuation of button 402 may result in use of a grenade by an in-game character.
  • Although the icon change of FIGS. 4 and 5 is presented with respect to user-actuation of a button during interaction with a video game, it will be appreciated that a change in display state for one or more of the individually controllable displays may be effected by any mechanism (e.g. user actuation, change in application state, change in computing device state, etc.) and that icons may be presented for any suitable user experience (e.g., media player, etc.) without departing from the scope of the present disclosure.
  • method 600 comprises receiving an input from an application (e.g. application 113 ) requesting display of a first icon on a first display device of a first pushable button of a plurality of pushable buttons and a second icon on a second display device of a second button of the plurality of pushable buttons.
  • icons displayed by each individually controllable display device may comprise fixed icons 604 and/or dynamic icons 606 in any combination and configuration.
  • the custom icon image data may be provided by the application and/or computing device in communication with the controller.
  • method 600 may further comprise, at 608 , providing image data (e.g., pixel array or other suitable data structure) to each display specifying an image to be displayed.
  • said image data may be provided to the controller, and thus stored by the controller, upon initialization of the computing device and/or the application.
  • the image data may be dynamically provided.
  • method 600 further comprises, in response, displaying the first icon on the first display device and displaying the second icon on the second display device, thereby causing the first icon to appear on the first pushable button and the second icon to appear on the second pushable button.
  • pushable buttons comprising an individually controllable display device may be configured to display one or more “default” icons (e.g., generic text, generic images, blank image, etc.). Default icons may be displayed, for example, when no fixed or dynamic icon has been requested for a given pushable button.
  • method 600 may further comprise, at 612 , displaying a default icon on a third display device of a third pushable button, thereby causing the default icon to appear on the third pushable button.
  • method 600 may further comprise detecting user actuation of the third pushable button.
  • Method 600 may further comprise, at 616 , receiving an input from the first application requesting display of a first dynamic icon on the third display device of the third pushable button.
  • method 600 may further comprise displaying the first dynamic icon on the third display device of the third pushable button.
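  • As a concrete illustration of the method 600 flow in the bullets above, the following Python sketch models the controller side: an icon request for two buttons is applied to their display devices, a third button keeps a default icon until it is actuated, and a follow-up request then changes its icon. The class, method, and icon names are illustrative assumptions, not an interface defined by this disclosure.

        class ControllerButtons:
            """Controller-side sketch of method 600 (FIG. 6); all names are illustrative."""

            def __init__(self, count, default_icons):
                # Each display device starts out showing its default icon (cf. step 612).
                self.displayed = [default_icons[i] for i in range(count)]

            def handle_icon_request(self, requests):
                """Apply an application's request by showing each requested icon on the
                corresponding button's display device."""
                for button_index, icon in requests.items():
                    self.displayed[button_index] = icon

            def handle_actuation(self, button_index, report):
                """Report user actuation of a button so the application may respond
                with a further icon request (cf. step 616)."""
                report(button_index)

        buttons = ControllerButtons(3, default_icons=["A", "B", "X"])
        buttons.handle_icon_request({0: "jump", 1: "crouch"})    # first and second icons
        assert buttons.displayed == ["jump", "crouch", "X"]      # third button keeps its default
        buttons.handle_actuation(2, report=lambda i: buttons.handle_icon_request({i: "grenade"}))
        assert buttons.displayed[2] == "grenade"
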
  • the above described methods and processes may be tied to a computing system including one or more computers.
  • the methods and processes described herein may be implemented as a computer application, computer service, computer API, computer library, and/or other computer program product.
  • FIG. 7 schematically shows a nonlimiting computing system 700 that may perform one or more of the above described methods and processes.
  • Controller 102 and computing device 110 are non-limiting examples of computing system 700 .
  • Computing system 700 is shown in simplified form. It is to be understood that virtually any computer architecture may be used without departing from the scope of this disclosure.
  • computing system 700 may take the form of a mainframe computer, server computer, desktop computer, laptop computer, tablet computer, home entertainment computer, network computing device, mobile computing device, mobile communication device, gaming device, user input device, etc.
  • Computing system 700 includes a logic subsystem 702 and a data-holding subsystem 704 .
  • Computing system 700 may optionally include a display subsystem 706 , communication subsystem 708 , and/or other components not shown in FIG. 7 .
  • Computing system 700 may also optionally include user input devices such as keyboards, mice, game controllers, cameras, microphones, and/or touch screens, for example.
  • Logic subsystem 702 may include one or more physical devices configured to execute one or more instructions.
  • the logic subsystem may be configured to execute one or more instructions that are part of one or more applications, services, programs, routines, libraries, objects, components, data structures, or other logical constructs.
  • Such instructions may be implemented to perform a task, implement a data type, transform the state of one or more devices, or otherwise arrive at a desired result.
  • the logic subsystem may include one or more processors that are configured to execute software instructions. Additionally or alternatively, the logic subsystem may include one or more hardware or firmware logic machines configured to execute hardware or firmware instructions. Processors of the logic subsystem may be single core or multicore, and the programs executed thereon may be configured for parallel or distributed processing. The logic subsystem may optionally include individual components that are distributed throughout two or more devices, which may be remotely located and/or configured for coordinated processing. One or more aspects of the logic subsystem may be virtualized and executed by remotely accessible networked computing devices configured in a cloud computing configuration.
  • Data-holding subsystem 704 may include one or more physical, non-transitory, devices configured to hold data and/or instructions executable by the logic subsystem to implement the herein described methods and processes. When such methods and processes are implemented, the state of data-holding subsystem 704 may be transformed (e.g., to hold different data).
  • Data-holding subsystem 704 may include removable media and/or built-in devices.
  • Data-holding subsystem 704 may include optical memory devices (e.g., CD, DVD, HD-DVD, Blu-Ray Disc, etc.), semiconductor memory devices (e.g., RAM, EPROM, EEPROM, etc.) and/or magnetic memory devices (e.g., hard disk drive, floppy disk drive, tape drive, MRAM, etc.), among others.
  • Data-holding subsystem 704 may include devices with one or more of the following characteristics: volatile, nonvolatile, dynamic, static, read/write, read-only, random access, sequential access, location addressable, file addressable, and content addressable.
  • logic subsystem 702 and data-holding subsystem 704 may be integrated into one or more common devices, such as an application specific integrated circuit or a system on a chip.
  • FIG. 7 also shows an aspect of the data-holding subsystem in the form of removable computer-readable storage media 710 which may be used to store and/or transfer data and/or instructions executable to implement the herein described methods and processes.
  • Removable computer-readable storage media 710 may take the form of CDs, DVDs, HD-DVDs, Blu-Ray Discs, EEPROMs, and/or floppy disks, among others.
  • data-holding subsystem 704 includes one or more physical, non-transitory devices.
  • aspects of the instructions described herein may be propagated in a transitory fashion by a pure signal (e.g., an electromagnetic signal, an optical signal, etc.) that is not held by a physical device for at least a finite duration.
  • data and/or other forms of information pertaining to the present disclosure may be propagated by a pure signal.
  • application may be used to describe an aspect of computing system 700 that is implemented to perform one or more particular functions.
  • an application and/or program may be instantiated via logic subsystem 702 executing instructions held by data-holding subsystem 704 .
  • different applications and/or programs may be instantiated from the same service, code block, object, library, routine, API, function, etc.
  • the same application and/or programs may be instantiated by different services, code blocks, objects, routines, APIs, functions, etc.
  • application and/or program are meant to encompass individual or groups of executable files, data files, libraries, drivers, scripts, etc.
  • display subsystem 706 may be used to present a visual representation of data held by data-holding subsystem 704 . As the herein described methods and processes change the data held by the data-holding subsystem, and thus transform the state of the data-holding subsystem, the state of display subsystem 706 may likewise be transformed to visually represent changes in the underlying data.
  • Display subsystem 706 may include one or more display devices utilizing virtually any type of technology. Such display devices may be combined with logic subsystem 702 and/or data-holding subsystem 704 in a shared enclosure, or such display devices may be peripheral display devices.
  • communication subsystem 708 may be configured to communicatively couple computing system 700 with one or more other computing devices.
  • Communication subsystem 708 may include wired and/or wireless communication devices compatible with one or more different communication protocols.
  • the communication subsystem may be configured for communication via a wireless telephone network, a wireless local area network, a wired local area network, a wireless wide area network, a wired wide area network, etc.
  • the communication subsystem may allow computing system 700 to send and/or receive messages to and/or from other devices via a network such as the Internet.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments are disclosed that relate to pushable buttons comprising individually controllable display devices. For example, one disclosed embodiment provides a hand-held game controller comprising a body and a plurality of pushable buttons movably coupled to the body, each pushable button comprising a transparent cap and an individually controllable display device arranged behind the transparent cap such that the display device is viewable through the transparent cap. The hand-held game controller further comprises a controller configured to receive an input from an application requesting display of a first icon on a first display device of a first pushable button of the plurality of pushable buttons and a second icon on a second display device of a second button of the plurality of pushable buttons, and, in response, display the first icon on the first display device and display the second icon on the second display device.

Description

    BACKGROUND
  • User input devices, such as hand-held game controllers, include one or more input mechanisms for effecting control over a computing device or applications presented thereby. Input mechanisms, such as buttons and directional pads, may include generic, fixed markings to identify the input mechanisms.
  • SUMMARY
  • Embodiments are disclosed that relate to pushable buttons comprising individually controllable display devices. For example, one disclosed embodiment provides a hand-held game controller comprising a body and a plurality of pushable buttons movably coupled to the body, each pushable button comprising a transparent cap and an individually controllable display device arranged behind the transparent cap such that the display device is viewable through the transparent cap. The hand-held game controller further comprises a controller configured to receive an input from an application requesting display of a first icon on a first display device of a first pushable button of the plurality of pushable buttons and a second icon on a second display device of a second button of the plurality of pushable buttons, and, in response, display the first icon on the first display device and display the second icon on the second display device.
  • This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 shows an embodiment of a hand-held video game controller in an example use environment.
  • FIG. 2 shows an example of an embodiment of a pushable button comprising an individually controllable display device.
  • FIG. 3 shows a top-view of an embodiment of a pushable button comprising an individually controllable display device.
  • FIGS. 4-5 show an example use case scenario for an embodiment of a hand-held video game controller comprising a plurality of pushable buttons.
  • FIG. 6 shows a process flow depicting an embodiment of a method of operating a hand-held video game controller.
  • FIG. 7 shows a non-limiting example of an embodiment of a computing device.
  • DETAILED DESCRIPTION
  • As mentioned above, an input device may include one or more input mechanisms (e.g., buttons, directional pads, triggers, etc.) that may be user-actuatable in order to effect control over a computing device and/or over applications presented thereby (e.g., video games). Said input mechanisms typically include one or more fixed markings, such as arrows, alphanumeric characters, symbols, etc., to identify the input mechanism.
  • In some instances, an input mechanism may not be utilized in a consistent manner, for example, for different applications. Instead, customized functionality may be defined by the computing device manufacturer, the application developer, the user, and/or other entity. For example, the “up” direction of a directional pad may effect forward motion of an on-screen character in a first set of applications, whereas the “up” direction may fire an on-screen weapon in a second set of applications. A user of an input device may rely on their own memory and/or may access on-screen help (e.g., a controller legend) in order to ascertain which input mechanism is associated with a given activity, function, weapon, ability, etc.
  • Accordingly, various embodiments are disclosed herein that relate to the display of context-specific icons on such input devices to allow the user to quickly glance at the controller to recall the assignments. In this way, a user may be able to quickly and accurately ascertain the functionality of a given input mechanism without referring to additional instructional materials (e.g., instruction manuals, etc.). Such a configuration may thus enable novice and experienced users alike to accurately and efficiently operate the input mechanism.
  • FIG. 1 shows an example use environment 100 comprising an embodiment of a hand-held video game controller 102. Controller 102 comprises body 104 configured to be held in one or both hands. In some embodiments, body 104 may be “hollow” such that body 104 forms one or more cavities in which other elements of controller 102, in whole or in part, are oriented.
  • Body 104 may be further configured to provide support and/or positioning for one or more input mechanisms. For example, controller 102 may include input mechanisms such as joystick 106 and/or buttons 108 movably coupled to body 104. These input mechanisms may be usable to effect control over computing device 110 and/or over applications presented thereby. Controller 102 may be used, for example, to control the motion of game character 112 of application 113 rendered by computing device 110 and displayed via display device 114. As such, controller 102 may further comprise a communication subsystem configured to provide unidirectional or bidirectional communication with computing device 110 to allow controller 102 to send control information to computing device 110 for controlling the computing device, and also to receive communications from the computing device 110. The control information sent to the computing device 110 may comprise, for example, state information regarding the current state of joystick 106 and/or buttons 108.
  • Although illustrated as a wireless connection, it will be appreciated that controller 102 may communicate with computing device 110 via a wireless connection, a wired connection, or a combination thereof. Wireless communication may be performed via infrared light, visible light, radio-frequency (e.g., 802.11 or mobile telephony), combinations thereof, or via any other suitable mechanism.
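  • To make the communication described above concrete, the Python sketch below packs a hypothetical controller-to-device state report and a device-to-controller icon request. The message layout, field names, and format strings are assumptions made for illustration; the disclosure does not define a wire format.

        import struct

        # Hypothetical message layouts (not specified by the disclosure):
        #   state report : id 0x01, 16-bit button bitmask, signed joystick x/y
        #   icon request : id 0x02, button index, identifier of a fixed icon
        STATE_REPORT_FMT = ">BHbb"
        ICON_REQUEST_FMT = ">BBB"

        def pack_state_report(button_mask, joy_x, joy_y):
            """Controller 102 -> computing device 110: current state of buttons and joystick 106."""
            return struct.pack(STATE_REPORT_FMT, 0x01, button_mask, joy_x, joy_y)

        def pack_icon_request(button_index, icon_id):
            """Computing device 110 -> controller 102: show a fixed icon on one pushable button."""
            return struct.pack(ICON_REQUEST_FMT, 0x02, button_index, icon_id)

        # Example: report the first button pressed, then request icon 7 on button 2.
        report = pack_state_report(button_mask=0b0001, joy_x=12, joy_y=-3)
        request = pack_icon_request(button_index=2, icon_id=7)
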
  • In addition to joystick 106 and/or buttons 108, controller 102 may further comprise one or more input mechanisms each comprising an individually controllable display device, embodiments of which are shown as pushable buttons 116. It will be appreciated that the term “pushable button” as used herein refers to any input mechanism configured to be user-actuatable by a pressing motion (e.g., via thumb or other finger) that causes movement of the button into a body of the controller to actuate an actuatable component.
  • Independently controllable displays may be configured to provide information, such as icons, to a user of controller 102 to convey the functionality provided by actuation of the corresponding pushable button. It will be appreciated that the term “icon” as used herein refers to any image, textual and/or graphical, displayed via an individually controllable display device. For example, buttons 116 may be configured to alternatively display alphanumeric icons 118 and graphical icons 120. Accordingly, during use of computing device 110 (e.g., during interaction with application 113), specific actions or activities (e.g., jump, crouch, run, change weapon, etc.) may be assigned to buttons 116, and corresponding icons may be displayed thereon. Further, upon the occurrence of user actuation of buttons 116 and/or upon the occurrence of various triggers (e.g., video game game play situations), the icons may be updated to reflect an updated state (e.g., updated functionality of the associated input mechanism).
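  • The assignment of actions and icons to buttons 116 can be pictured as a small per-button table maintained by the application, as in the sketch below; the action names, icon identifiers, and the game-play trigger are hypothetical examples rather than anything prescribed by the disclosure.

        from dataclasses import dataclass

        @dataclass
        class ButtonAssignment:
            action: str   # e.g., "jump", "crouch", "change weapon"
            icon: str     # identifier of the icon shown on the button's display device

        # Hypothetical assignments for four pushable buttons.
        assignments = {
            0: ButtonAssignment("jump", "icon_jump"),
            1: ButtonAssignment("crouch", "icon_crouch"),
            2: ButtonAssignment("run", "icon_run"),
            3: ButtonAssignment("change weapon", "icon_weapon"),
        }

        def on_game_trigger(event):
            """Update an assignment (and hence the icon to display) when game play changes."""
            if event == "weapon_picked_up":
                assignments[3] = ButtonAssignment("fire weapon", "icon_fire")
            # A full implementation would then send the new icon to the controller so
            # that the corresponding display device is refreshed.

        on_game_trigger("weapon_picked_up")
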
  • Controller 102 may further include one or more directional pads 122 each comprising an individually controllable display device and disposed on body 104 apart from pushable buttons 116. As with pushable buttons 116, directional pad 122 may be configured to display a variety of icons, including, but not limited to, directional icons 124 (e.g., arrows similar to “standard” directional pad markings), graphical icons 126 (e.g., “run” icon), and textual icons (e.g., international and/or extended character sets to facilitate user input of text).
  • In contrast to buttons 116, it will be appreciated that directional pad 122 may comprise a unitary body that is user-actuatable in a plurality of directions 123 (e.g., up, down, left, right, etc.). Accordingly, each of directions 123 may comprise a corresponding actuatable component in order to register user-actuation of the direction. Regardless of the number and orientation of the directions, directional pad 122 may comprise a monolithic transparent directional cap spanning the plurality of directions 123 such that a single directional pad display device may be utilized for the plurality of directions.
  • In some embodiments, the display device may be substantially coextensive with the directional cap (e.g., cross-shaped, etc.) such that the display device is disposed substantially within the transparent directional cap. In other embodiments, the display device may comprise a quadrilateral display area (e.g., 16:9 display, 4:3 display, square display, etc.) or other shape such that a portion of the display device is viewable through the directional cap while the remainder of the display device is obscured via the directional cap and/or the controller body. In such embodiments, the underside of the display device may be configured to interact with the plurality of actuatable components. In yet other embodiments, a plurality of individually controllable display devices (e.g., one display per direction 123) may be disposed beneath the transparent cap. It will be appreciated that directional pads may have various configurations without departing from the scope of the present disclosure.
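  • One possible way to drive a single directional pad display device for several directions is to composite one small icon per direction into a shared frame, as sketched below; the frame size, tile size, and tile positions are assumptions chosen only for illustration.

        # Hypothetical 32x32, 1-bit frame for a single directional pad display device,
        # with one 8x8 icon tile composited per direction (up, down, left, right).
        WIDTH = HEIGHT = 32
        TILE = 8

        def blank_frame():
            return [[0] * WIDTH for _ in range(HEIGHT)]

        def blit(frame, tile, top, left):
            """Copy an 8x8 tile of 0/1 pixels into the frame at (top, left)."""
            for r in range(TILE):
                for c in range(TILE):
                    frame[top + r][left + c] = tile[r][c]

        # A simple upward-pointing arrow used as the "up" direction icon.
        arrow_up = [[1 if abs(c - 3.5) <= r / 2 else 0 for c in range(TILE)] for r in range(TILE)]

        frame = blank_frame()
        blit(frame, arrow_up, top=0, left=12)   # "up" region
        # Tiles for down, left, and right would be blitted at (24, 12), (12, 0), and (12, 24).
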
  • In various embodiments, an input mechanism comprising an individually controllable display device may be configured to display any configuration and combination of “fixed” or “dynamic” icons. Fixed icons are predefined icons that may be included with controller 102 and/or computing device 110 (e.g., via one or more data structures within non-volatile memory). For example, each of pushable buttons may be configured to display one or more alphanumeric icons (e.g., icon 118) and/or one or more graphical icons (e.g., icon 120). Applications configured to interact with the individually controllable display devices (e.g., application 113) may therefore be capable of selecting and displaying one of said fixed icons on each individually controllable display device. It will be appreciated that fixed icons may have any suitable configuration, including, but not limited to, generic alphanumeric characters, generic images (e.g., media controls, etc.), and/or geometric shapes.
  • In other embodiments, icons may be “dynamic.” Dynamic icons may be specific to one or more applications presented by computing device 110, and may enable application developers to provide custom icons suited for the particular digital experience (e.g., video game, media player, etc.). For example, in media player scenarios, one or more media functions may be presented to the user (e.g., playback controls, playlist controls, etc.) in order to provide a more customized and intuitive user experience. It will be appreciated that in such scenarios, there may exist any number and configuration of dynamic icons as provided by applications utilizing the controller 102.
  • Further, each of pushable buttons 116 may be configured to display one or more “default” icons. Said default icons may be displayed by one or more pushable buttons 116 upon startup of controller 102, during interaction with applications that do not support the individually controllable display devices, during interaction with applications that are not currently utilizing a given button, and/or in any other scenario where display of a fixed icon or a dynamic icon via pushable buttons 116 is not requested. For example, in some embodiments, a default icon may comprise an alphanumeric indicator (e.g., A, B, X, Y, etc.). In other embodiments, a default icon may comprise a blank image (i.e. all pixels on or all pixels off). It will be appreciated that these scenarios are presented for the purpose of example, and are not intended to be limiting in any manner.
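  • The precedence implied above (a requested dynamic icon, otherwise a requested fixed icon, otherwise the button's default icon) can be captured in a short selection routine; the function name and icon values below are illustrative assumptions.

        DEFAULT_ICONS = {0: "A", 1: "B", 2: "X", 3: "Y"}   # e.g., alphanumeric defaults

        def resolve_icon(button_index, dynamic_request=None, fixed_request=None):
            """Pick the icon to show on one pushable button's display device.

            A dynamic icon supplied by the application takes precedence over a fixed
            icon, which in turn takes precedence over the default icon shown at
            startup or when the running application does not use the button."""
            if dynamic_request is not None:
                return dynamic_request
            if fixed_request is not None:
                return fixed_request
            return DEFAULT_ICONS.get(button_index, "")   # "" ~ a blank image

        print(resolve_icon(2))                             # -> "X" (default)
        print(resolve_icon(2, fixed_request="run"))        # -> "run"
        print(resolve_icon(2, dynamic_request="grenade"))  # -> "grenade"
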
  • Any suitable display technologies may be utilized to display icons as disclosed herein. For example, in the instance of fixed icons, as the set of fixed icons does not change over time, it may be advantageous to use a “segmented” display technology, such as electronic paper. In such scenarios, each of the fixed icons may be represented by activating/de-activating one or more display “segments.” Further, to provide greater flexibility in providing dynamic icons, a pixel-based display technology (e.g., active matrix) may be used. It will be appreciated that these configurations are presented for the purpose of example, and that any suitable display technology may be utilized without departing from the scope of the present disclosure. Other example display technologies include, but are not limited to, liquid crystal displays, organic light emitting device displays, and projection displays.
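  • The contrast drawn above between segmented and pixel-based display technologies can be seen in the two data representations below: a fixed icon reduces to a set of segments to activate, while a dynamic icon is an arbitrary bitmap. The segment layout and bitmap size are assumptions made for the example.

        # Segmented display: a fixed icon is just the set of segments to switch on,
        # here indexed 0..6 in a hypothetical seven-segment-style layout.
        FIXED_ICON_SEGMENTS = {
            "A": {0, 1, 2, 4, 5, 6},
            "minus": {6},
        }

        def segment_mask(icon_name):
            """Encode a fixed icon as a bitmask of active segments."""
            return sum(1 << s for s in FIXED_ICON_SEGMENTS[icon_name])

        # Pixel-based display: a dynamic icon is an arbitrary bitmap supplied by the
        # application, e.g. an 8x8 grid of 0/1 values flattened to a row-major list.
        def dynamic_icon_bitmap(rows):
            assert len(rows) == 8 and all(len(r) == 8 for r in rows)
            return [bit for row in rows for bit in row]

        print(bin(segment_mask("A")))   # -> 0b1110111
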
  • It will be appreciated that a game controller in accordance with embodiments of the present disclosure may comprise any suitable combination and configuration of pushable buttons and individually controllable display devices, and that said individually controllable display devices may be configured to display any configuration of dynamic icons, fixed icons, and/or default icons. Further, the behavior (e.g., timing, update triggers, etc.) may be determined by the controller, a communicatively coupled computing device (e.g., computing device 110), an application presented by the computing device (e.g., application 113), and/or any combination thereof.
  • Turning now to FIG. 2, a sectional view of an embodiment of a pushable button 200 comprising an individually controllable display device 202 is shown. Button 200 comprises transparent cap 204 configured to be actuated by the user. Although illustrated as comprising a convex top surface, it will be appreciated that transparent cap 204 may have any suitable configuration (e.g., concave, flat, angled, etc.) without departing from the scope of the present disclosure.
  • As illustrated, individually controllable display device 202 is disposed under transparent cap 204 such that display device 202 is viewable through the top of transparent cap 204. In order to support display device 202 against transparent cap 204, button 200 further comprises a support structure 206 located behind the individually controllable display device. Support structure 206 may take any suitable form and may be coupled to transparent cap 204 via any suitable mechanism or combination of mechanisms. For example, as illustrated, transparent cap 204 may comprise one or more retaining tabs 208 disposed along the inner surface of the transparent cap and configured to interact with edge 210 of support structure 206. In some embodiments, tabs 208 may comprise plural discrete features formed on an inside surface of transparent cap 204, whereas in other embodiments, the inner surface of transparent cap 204 may comprise a substantially continuous feature (e.g., flange) configured to interact with edge 210 of support structure 206 instead of, or in addition to, retaining tabs 208. In other embodiments, one or more tabs may be formed on support structure 206 for engagement with complementary features on transparent cap 204. In yet other embodiments, support structure 206 may be retained within transparent cap 204 via an adhesive.
  • Display device 202 may be further retained within the transparent cap 204 via any suitable mechanism or combination of mechanisms. In some embodiments, display device 202 may be retained via friction with transparent cap 204 and support structure 206. In other embodiments, display device 202 may be affixed to transparent cap 204 and/or support structure 206 via adhesive. In yet other embodiments, display device 202 may be mechanically coupled to transparent cap 204 and/or support structure 206 via one or more mechanical features (e.g., complementary features, pressure fittings, screws, etc.).
  • As illustrated, button 200 is disposed within an opening of body 212 (e.g., body 104 of FIG. 1). Accordingly, upon user-actuation of transparent cap 204, button 200 is configured to move in a direction substantially perpendicular to body 212. In order to retain button 200 within the opening, the button may further comprise flange 214 configured to interact with an inner surface of body 212. In other embodiments, button 200 may comprise one or more discrete tabs configured to retain the button within the opening instead of, or in addition to, flange 214.
  • Button 200 further comprises actuatable component 216 coupled to board 218 (e.g., multi-layer printed circuit board) and configured to translate user actuation of the button into one or more representative analog and/or digital electrical signals. Actuatable component 216 may comprise any suitable mechanism or combination of mechanisms, including, but not limited to, mechanical sensors (e.g., tactile switch, membrane switch, etc.), optical sensors (e.g., optical encoder, optical break sensor, etc.), magnetic sensors (e.g., magnetic reed switch), and/or capacitive sensors.
  • In some embodiments, button 200 may further comprise one or more mechanisms configured to mechanically couple actuatable component 216 and one or more of transparent cap 204, display device 202, and support structure 206. For example, in some embodiments, button 200 may comprise rod 220 configured to couple actuatable component 216 to support structure 206. In other embodiments, actuatable component 216 may be configured to interact directly with transparent cap 204 and/or support structure 206. For example, actuatable component 216 may be disposed below flange 214 such that user actuation of transparent cap 204 effects interaction between flange 214 and actuatable component 216. It will be appreciated that these configurations are presented for the purpose of example, and are not intended to be limiting in any manner.
  • Board 218 may be disposed within body 212 to provide structural and/or electrical interfaces for button 200. Board 218 may be further configured to electrically couple display device 202 to one or more electrical components (e.g., display controller, non-volatile memory, communication subsystem, etc.). Accordingly, display device 202 may comprise one or more electrical connections 222 (e.g., ribbon cable, flexible flat cable, etc.) to provide said coupling. Further, support structure 206 may comprise one or more vias 224 to accommodate electrical connections 222. As illustrated, via 224 may comprise an opening through a top portion of support structure 206. In some embodiments, via 224 may further comprise one or more features extending below the top portion of support structure 206 and configured to at least partially enclose electrical connections 222. Said features may provide additional routing and support of electrical connections 222 within button 200, and may therefore decrease mechanical fatigue experienced by electrical connections 222 (e.g., via pinching, friction, etc.). Although via 224 is illustrated as being disposed through the top portion of support structure 206, it will be appreciated that vias 224 and electrical connections 222 may have any suitable configuration, examples of which are discussed in greater detail below with reference to FIG. 3.
  • Although the discussion of FIG. 2 is with reference to a single pushable button, it will be appreciated that the discussion is also applicable to the directional pads mentioned above, as button 200 may represent a single “direction” of the directional control pad. In other words, the directional pad may comprise a transparent directional pad cap (e.g., cap 204) and a directional pad display device (e.g., display device 202) located behind the transparent directional pad cap such that the directional pad display device is viewable through the transparent directional pad cap. Further, a directional pad support structure (e.g., support structure 206) may be included to support the directional pad display device against the transparent cap. As mentioned above, in contrast to an individual pushable button, a directional pad may comprise a single directional pad cap and a plurality of actuatable components (e.g., one actuatable component per “direction”).
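  • Purely for illustration, the contrast between an individual pushable button and a directional pad described above can be summarized in a small C data model; the struct names, field names, and direction count below are assumptions and do not correspond to any structure defined in this disclosure.

```c
#include <stdint.h>

/* One display device and one actuatable component per pushable button. */
typedef struct {
    uint8_t display_id;   /* individually controllable display behind the cap */
    uint8_t switch_id;    /* single actuatable component (e.g., 216) */
} pushable_button_t;

enum { DPAD_UP, DPAD_DOWN, DPAD_LEFT, DPAD_RIGHT, DPAD_DIRECTIONS };

/* One shared cap and display device, but one actuatable component per direction. */
typedef struct {
    uint8_t display_id;                   /* single display behind the shared cap */
    uint8_t switch_ids[DPAD_DIRECTIONS];  /* one actuatable component per direction */
} directional_pad_t;

int main(void)
{
    directional_pad_t dpad = { .display_id = 0, .switch_ids = { 1, 2, 3, 4 } };
    pushable_button_t button = { .display_id = 1, .switch_id = 5 };
    return (dpad.switch_ids[DPAD_UP] == 1 && button.switch_id == 5) ? 0 : 1;
}
```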
  • Turning now to FIG. 3, a top view of an embodiment of pushable button 200 comprising individually controllable display device 202 (illustrated as displaying an “X” icon) viewable through transparent cap 204 is shown. As described above, button 200 comprises support structure 206 retained within transparent cap 204, the support structure being configured to support display device 202 against transparent cap 204. Although button 200 is illustrated as being substantially circular, it will be appreciated that pushable buttons as disclosed herein may comprise any suitable shape without departing from the scope of the present disclosure.
  • As mentioned above, support structure 206 may comprise one or more vias to accommodate electrical connections to display device 202. Such vias may be disposed through the top portion of the support structure, as illustrated at 224. As mentioned above, such embodiments may further comprise one or more features extending below the top portion in order to provide additional routing. Alternatively, such vias also may be formed along one or more side portions of the support structure such that the one or more electrical connections are disposed between the support structure and transparent cap, as illustrated at 226. In such embodiments, via 226 may further comprise one or more elements configured to at least partially enclose the electrical connections. The electrical connections may be disposed freely within via 226 and/or may be mechanically coupled thereto via one or more mechanisms (e.g., tabs, loops, adhesive, etc.). It will be appreciated that pushable buttons may comprise any combination and configuration of vias to accommodate the one or more electrical connections of the individually controllable display device without departing from the scope of the present disclosure.
  • FIGS. 4 and 5 show a plurality of pushable buttons 400 displaying various icons (e.g., icons for a first-person combat video game). First, as shown in the example of FIG. 4, first pushable button 402 of the plurality of pushable buttons 400 is displaying icon 404, illustrated as a default “X” icon. In response to actuation of button 402 by a user (e.g., via thumb 406), button 402 may be configured to display an icon different than icon 404. FIG. 5 shows pushable buttons 400 after user actuation of button 402. As illustrated, button 402 is displaying icon 408 in the form of a grenade. As such, further actuation of button 402 may result in use of a grenade by an in-game character. Although the icon change of FIGS. 4 and 5 is presented with respect to user-actuation of a button during interaction with a video game, it will be appreciated that a change in display state for one or more of the individually controllable displays may be effected by any mechanism (e.g., user actuation, change in application state, change in computing device state, etc.) and that icons may be presented for any suitable user experience (e.g., media player, etc.) without departing from the scope of the present disclosure.
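  • As a non-limiting sketch of how an application might decide which icon to request for a button as game state changes (mirroring the “X”-to-grenade change of FIGS. 4 and 5), the following C example maps a hypothetical equipped-item state to an icon identifier; the enumerations and function names are illustrative assumptions only.

```c
#include <stdio.h>

/* Hypothetical icon and item identifiers; none of these values are
 * defined by the disclosure. */
typedef enum { ICON_DEFAULT_X, ICON_GRENADE, ICON_MEDKIT } icon_id_t;
typedef enum { ITEM_NONE, ITEM_GRENADE, ITEM_MEDKIT } equipped_item_t;

/* Map the in-game state to the icon the application would request
 * for the corresponding pushable button. */
static icon_id_t icon_for_button(equipped_item_t equipped)
{
    switch (equipped) {
    case ITEM_GRENADE: return ICON_GRENADE;
    case ITEM_MEDKIT:  return ICON_MEDKIT;
    default:           return ICON_DEFAULT_X;  /* fall back to the default icon */
    }
}

int main(void)
{
    /* After actuation equips a grenade, the application requests the grenade icon. */
    printf("requested icon id: %d\n", (int)icon_for_button(ITEM_GRENADE));
    return 0;
}
```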
  • Turning now to FIG. 6, a flow diagram depicting an embodiment of a method 600 of operating a hand-held video game controller (e.g., controller 102) is shown. At 602, method 600 comprises receiving an input from an application (e.g. application 113) requesting display of a first icon on a first display device of a first pushable button of a plurality of pushable buttons and a second icon on a second display device of a second button of the plurality of pushable buttons. As mentioned above, icons displayed by each individually controllable display device may comprise fixed icons 604 and/or dynamic icons 606 in any combination and configuration.
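  • By way of non-limiting illustration only, the following C sketch shows one possible shape for such a request, with one entry per pushable button whose display is to change; the type names, field names, and the fixed/dynamic distinction encoded in icon_kind_t are assumptions introduced here for the example, not a protocol defined by the disclosure.

```c
#include <stdint.h>

typedef enum {
    ICON_KIND_FIXED,    /* index into a set of predefined fixed icons (604) */
    ICON_KIND_DYNAMIC   /* slot holding application-supplied image data (606) */
} icon_kind_t;

typedef struct {
    uint8_t     button_id;   /* which pushable button / display device */
    icon_kind_t kind;
    uint8_t     icon_index;  /* predefined icon index or dynamic slot */
} icon_request_entry_t;

typedef struct {
    uint8_t              count;
    icon_request_entry_t entries[8];
} icon_display_request_t;

int main(void)
{
    /* e.g., the application asks for fixed icon 3 on a first button and
     * dynamic slot 1 on a second button. */
    icon_display_request_t req = {
        .count = 2,
        .entries = { { 0, ICON_KIND_FIXED,   3 },
                     { 1, ICON_KIND_DYNAMIC, 1 } },
    };
    return req.count == 2 ? 0 : 1;
}
```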
  • In some scenarios (e.g., dynamic icon scenarios), it will be appreciated that the custom icon image data may be provided by the application and/or computing device in communication with the controller. As such, method 600 may further comprise, at 608, providing image data (e.g., pixel array or other suitable data structure) to each display specifying an image to be displayed. In some embodiments, said image data may be provided to the controller, and thus stored by the controller, upon initialization of the computing device and/or the application. In other embodiments, the image data may be dynamically provided.
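  • A minimal sketch of step 608 is shown below in C, assuming a small monochrome display and a fixed number of dynamic-icon slots on the controller; the 1 bit-per-pixel packing, the 32×32 geometry, and the slot model are assumptions made only to keep the example concrete.

```c
#include <stdint.h>
#include <string.h>

#define ICON_WIDTH    32
#define ICON_HEIGHT   32
#define ICON_BYTES    (ICON_WIDTH * ICON_HEIGHT / 8)   /* 1 bit per pixel */
#define DYNAMIC_SLOTS 4

/* Image data stored by the controller, e.g., upon initialization of the
 * computing device and/or application, or provided dynamically. */
static uint8_t dynamic_icon_store[DYNAMIC_SLOTS][ICON_BYTES];

/* Accept a pixel array from the application and keep it in a slot. */
static int store_dynamic_icon(uint8_t slot, const uint8_t *pixels, size_t len)
{
    if (slot >= DYNAMIC_SLOTS || len != ICON_BYTES)
        return -1;                                 /* reject malformed uploads */
    memcpy(dynamic_icon_store[slot], pixels, len);
    return 0;
}

int main(void)
{
    uint8_t pixels[ICON_BYTES] = { 0 };            /* e.g., an all-blank icon */
    return store_dynamic_icon(0, pixels, sizeof pixels) == 0 ? 0 : 1;
}
```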
  • At 610, method 600 further comprises, in response, displaying the first icon on the first display device and displaying the second icon on the second display device, thereby causing the first icon to appear on the first pushable button and the second icon to appear on the second pushable button. As mentioned above, in some embodiments, pushable buttons comprising an individually controllable display device may be configured to display one or more “default” icons (e.g., generic text, generic images, blank image, etc.). Default icons may be displayed, for example, when no fixed or dynamic icon has been requested for a given pushable button. As such, method 600 may further comprise, at 612, displaying a default icon on a third display device of a third pushable button, thereby causing the default icon to appear on the third pushable button.
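  • The following C sketch illustrates steps 610 and 612 together: requested icons are applied to their display devices, and any button for which no icon was requested falls back to a default; show_icon() is a stand-in for the actual display-driver call, and the button count and icon values are assumptions for illustration.

```c
#include <stdint.h>
#include <stdbool.h>
#include <stdio.h>

#define NUM_BUTTONS  4
#define ICON_DEFAULT 0

/* Stub standing in for the driver call that updates one display device. */
static void show_icon(uint8_t button_id, uint8_t icon_index)
{
    printf("button %u -> icon %u\n", (unsigned)button_id, (unsigned)icon_index);
}

/* Apply a request: requested[] marks which buttons have an icon to show. */
static void apply_icon_request(const uint8_t requested_icons[NUM_BUTTONS],
                               const bool requested[NUM_BUTTONS])
{
    for (uint8_t b = 0; b < NUM_BUTTONS; b++)
        show_icon(b, requested[b] ? requested_icons[b] : ICON_DEFAULT);
}

int main(void)
{
    uint8_t icons[NUM_BUTTONS]     = { 3, 5, 0, 0 };               /* first and second icons */
    bool    requested[NUM_BUTTONS] = { true, true, false, false }; /* others get the default */
    apply_icon_request(icons, requested);
    return 0;
}
```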
  • As mentioned above, the icons displayed on the individually controllable display devices may be updated upon user input, upon change in application state, upon change in computing device state, and/or according to any other suitable trigger or combination of triggers. Accordingly, at 614, method 600 may further comprise detecting user actuation of the third pushable button. Method 600 may further comprise, at 616, receiving an input from the first application requesting display of a first dynamic icon on the third display device of the third pushable button. At 618, method 600 may further comprise displaying the first dynamic icon on the third display device of the third pushable button.
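  • Steps 614 through 618 can be sketched as a short event sequence, shown below in C; the in-process application_on_actuation() callback is an assumption standing in for the round trip over the controller's communication link, and all identifiers and values are illustrative rather than part of the disclosure.

```c
#include <stdint.h>
#include <stdbool.h>
#include <stdio.h>

#define THIRD_BUTTON   2
#define ICON_DYNAMIC_1 7   /* hypothetical identifier for the first dynamic icon */

/* Stub standing in for updating the third display device. */
static void show_icon(uint8_t button_id, uint8_t icon_index)
{
    printf("display %u now shows icon %u\n", (unsigned)button_id, (unsigned)icon_index);
}

/* Stand-in for the application: on actuation of the third button it
 * requests the first dynamic icon for that button (step 616). */
static uint8_t application_on_actuation(uint8_t button_id)
{
    (void)button_id;
    return ICON_DYNAMIC_1;
}

int main(void)
{
    bool third_button_actuated = true;                              /* step 614 */
    if (third_button_actuated) {
        uint8_t icon = application_on_actuation(THIRD_BUTTON);      /* step 616 */
        show_icon(THIRD_BUTTON, icon);                              /* step 618 */
    }
    return 0;
}
```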
  • The above described methods and processes may be tied to a computing system including one or more computers. In particular, the methods and processes described herein may be implemented as a computer application, computer service, computer API, computer library, and/or other computer program product.
  • FIG. 7 schematically shows a nonlimiting computing system 700 that may perform one or more of the above described methods and processes. Controller 102 and computing device 110 are non-limiting examples of computing system 700. Computing system 700 is shown in simplified form. It is to be understood that virtually any computer architecture may be used without departing from the scope of this disclosure. In different embodiments, computing system 700 may take the form of a mainframe computer, server computer, desktop computer, laptop computer, tablet computer, home entertainment computer, network computing device, mobile computing device, mobile communication device, gaming device, user input device, etc.
  • Computing system 700 includes a logic subsystem 702 and a data-holding subsystem 704. Computing system 700 may optionally include a display subsystem 706, communication subsystem 708, and/or other components not shown in FIG. 7. Computing system 700 may also optionally include user input devices such as keyboards, mice, game controllers, cameras, microphones, and/or touch screens, for example.
  • Logic subsystem 702 may include one or more physical devices configured to execute one or more instructions. For example, the logic subsystem may be configured to execute one or more instructions that are part of one or more applications, services, programs, routines, libraries, objects, components, data structures, or other logical constructs. Such instructions may be implemented to perform a task, implement a data type, transform the state of one or more devices, or otherwise arrive at a desired result.
  • The logic subsystem may include one or more processors that are configured to execute software instructions. Additionally or alternatively, the logic subsystem may include one or more hardware or firmware logic machines configured to execute hardware or firmware instructions. Processors of the logic subsystem may be single core or multicore, and the programs executed thereon may be configured for parallel or distributed processing. The logic subsystem may optionally include individual components that are distributed throughout two or more devices, which may be remotely located and/or configured for coordinated processing. One or more aspects of the logic subsystem may be virtualized and executed by remotely accessible networked computing devices configured in a cloud computing configuration.
  • Data-holding subsystem 704 may include one or more physical, non-transitory, devices configured to hold data and/or instructions executable by the logic subsystem to implement the herein described methods and processes. When such methods and processes are implemented, the state of data-holding subsystem 704 may be transformed (e.g., to hold different data).
  • Data-holding subsystem 704 may include removable media and/or built-in devices. Data-holding subsystem 704 may include optical memory devices (e.g., CD, DVD, HD-DVD, Blu-Ray Disc, etc.), semiconductor memory devices (e.g., RAM, EPROM, EEPROM, etc.) and/or magnetic memory devices (e.g., hard disk drive, floppy disk drive, tape drive, MRAM, etc.), among others. Data-holding subsystem 704 may include devices with one or more of the following characteristics: volatile, nonvolatile, dynamic, static, read/write, read-only, random access, sequential access, location addressable, file addressable, and content addressable. In some embodiments, logic subsystem 702 and data-holding subsystem 704 may be integrated into one or more common devices, such as an application specific integrated circuit or a system on a chip.
  • FIG. 7 also shows an aspect of the data-holding subsystem in the form of removable computer-readable storage media 710 which may be used to store and/or transfer data and/or instructions executable to implement the herein described methods and processes. Removable computer-readable storage media 710 may take the form of CDs, DVDs, HD-DVDs, Blu-Ray Discs, EEPROMs, and/or floppy disks, among others.
  • It is to be appreciated that data-holding subsystem 704 includes one or more physical, non-transitory devices. In contrast, in some embodiments aspects of the instructions described herein may be propagated in a transitory fashion by a pure signal (e.g., an electromagnetic signal, an optical signal, etc.) that is not held by a physical device for at least a finite duration. Furthermore, data and/or other forms of information pertaining to the present disclosure may be propagated by a pure signal.
  • The terms “application,” “program,” and the like may be used to describe an aspect of computing system 700 that is implemented to perform one or more particular functions. In some cases, such an application and/or program may be instantiated via logic subsystem 702 executing instructions held by data-holding subsystem 704. It is to be understood that different applications and/or programs may be instantiated from the same service, code block, object, library, routine, API, function, etc. Likewise, the same application and/or programs may be instantiated by different services, code blocks, objects, routines, APIs, functions, etc. The terms application and/or program are meant to encompass individual or groups of executable files, data files, libraries, drivers, scripts, etc.
  • When included, display subsystem 706 may be used to present a visual representation of data held by data-holding subsystem 704. As the herein described methods and processes change the data held by the data-holding subsystem, and thus transform the state of the data-holding subsystem, the state of display subsystem 706 may likewise be transformed to visually represent changes in the underlying data. Display subsystem 706 may include one or more display devices utilizing virtually any type of technology. Such display devices may be combined with logic subsystem 702 and/or data-holding subsystem 704 in a shared enclosure, or such display devices may be peripheral display devices.
  • When included, communication subsystem 708 may be configured to communicatively couple computing system 700 with one or more other computing devices. Communication subsystem 708 may include wired and/or wireless communication devices compatible with one or more different communication protocols. As nonlimiting examples, the communication subsystem may be configured for communication via a wireless telephone network, a wireless local area network, a wired local area network, a wireless wide area network, a wired wide area network, etc. In some embodiments, the communication subsystem may allow computing system 700 to send and/or receive messages to and/or from other devices via a network such as the Internet.
  • It is to be understood that the configurations and/or approaches described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. The specific routines or methods described herein may represent one or more of any number of processing strategies. As such, various acts illustrated may be performed in the sequence illustrated, in other sequences, in parallel, or in some cases omitted. Likewise, the order of the above-described processes may be changed.
  • The subject matter of the present disclosure includes all novel and nonobvious combinations and subcombinations of the various processes, systems and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof.

Claims (20)

1. A hand-held video game controller, comprising:
a body;
a plurality of pushable buttons movably coupled to the body, each pushable button comprising a transparent cap and an individually controllable display device arranged behind the transparent cap such that the display device is viewable through the transparent cap; and
a controller configured to:
receive an input from an application requesting display of a first icon on a first display device of a first pushable button of the plurality of pushable buttons and a second icon on a second display device of a second button of the plurality of pushable buttons; and
in response, display the first icon on the first display device and display the second icon on the second display device.
2. The hand-held video game controller of claim 1, wherein each pushable button further comprises a support structure located behind the individually controllable display device to support the individually controllable display device against the transparent cap.
3. The hand-held video game controller of claim 2, wherein, for each pushable button, the support structure is retained within the pushable button by one or more retaining tabs.
4. The hand-held video game controller of claim 2, wherein, for each pushable button, the support structure is retained within the pushable button by an adhesive.
5. The hand-held video game controller of claim 2, wherein, for each pushable button, the support structure comprises one or more vias configured to accommodate one or more electrical connections to the individually controllable display.
6. The hand-held video game controller of claim 5, wherein the one or more vias are formed along one or more side portions of the support structure.
7. The hand-held video game controller of claim 1, wherein each pushable button of the plurality of pushable buttons comprises a single actuatable component.
8. The hand-held video game controller of claim 1, wherein each display device is configured to display two or more predefined fixed icons.
9. The hand-held video game controller of claim 8, wherein each display device comprises an electronic paper display device.
10. The hand-held video game controller of claim 1, wherein each display device comprises an active matrix display, and wherein the controller is configured to provide image data to each active matrix display specifying an image to be displayed.
11. A hand-held video game controller, comprising:
a body;
a plurality of pushable buttons movably coupled to the body, each pushable button comprising a transparent cap, an individually controllable display device arranged behind the transparent cap such that the display device is viewable through the transparent cap, and a support structure configured to support the individually controllable display device against the transparent cap;
a logic subsystem; and
memory comprising code representing a plurality of predefined icons displayable on the individually controllable display devices, the memory also comprising executable instructions that are executable by the logic subsystem to:
display on each display device icons selected from a fixed set of predefined icons;
receive an input from an application requesting display of a first predefined icon on a first display device of a first pushable button of the plurality of pushable buttons and a second predefined icon on a second display device of a second pushable button of the plurality of pushable buttons; and
in response, display the first predefined icon on the first display device and display the second predefined icon on the second display device, thereby causing the first predefined icon to appear on the first pushable button and the second predefined icon to appear on the second pushable button.
12. The hand-held video game controller of claim 11, wherein the code representing the predefined icons includes code representing a default icon for each pushable button, and wherein the executable instructions are further executable to display the default icon for each pushable button for which the application does not request display of one of the predefined icons.
13. The hand-held video game controller of claim 11, wherein each pushable button of the plurality of pushable buttons comprises a single actuatable component.
14. The hand-held video game controller of claim 11, wherein each display device comprises an electronic paper display device.
15. The hand-held video game controller of claim 11, wherein, for each pushable button, the support structure is retained within the pushable button by one or more retaining tabs.
16. The hand-held video game controller of claim 11, wherein, for each pushable button, the support structure is retained within the pushable button by an adhesive.
17. The hand-held video game controller of claim 11, wherein, for each pushable button, the support structure comprises one or more vias formed along one or more side portions of the support structure and configured to accommodate one or more electrical connections to the individually controllable display.
18. The hand-held video game controller of claim 11, further comprising a directional pad disposed on the body at a location spaced apart from the plurality of pushable buttons, the directional pad comprising a plurality of pushable directions and a corresponding plurality of actuatable components, and wherein the directional pad comprises a transparent directional pad cap and a directional pad display device located behind the transparent directional pad cap such that the directional pad display device is viewable through the transparent directional pad cap, and a directional pad support structure configured to support the directional pad display device against the transparent cap.
19. A method of operating a hand-held video game controller, the hand-held video game controller comprising a plurality of pushable buttons each having a transparent cap and an individually controllable display device arranged behind the transparent cap such that the display device is viewable through the transparent cap, a logic subsystem, and memory comprising code representing a plurality of predefined icons displayable on the individually controllable display devices, the method comprising:
receiving an input from a first application requesting display of a first predefined icon on a first display device of a first pushable button of the plurality of pushable buttons and a second predefined icon on a second display device of a second button of the plurality of pushable buttons;
in response, displaying the first predefined icon on the first display device, displaying the second predefined icon on the second display device, and displaying a default icon on a third display device of a third pushable button, thereby causing the first predefined icon to appear on the first pushable button, the second predefined icon to appear on the second pushable button, and the default icon to appear on the third pushable button.
20. The method of claim 19, further comprising:
detecting user actuation of the third pushable button;
receiving an input from the first application requesting display of a first dynamic icon on the third display device of the third pushable button; and
displaying the first dynamic icon on the third display device of the third pushable button.
US13/553,636 2012-07-19 2012-07-19 Changing icons on user input device Abandoned US20140024456A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US13/553,636 US20140024456A1 (en) 2012-07-19 2012-07-19 Changing icons on user input device
PCT/US2013/051169 WO2014015198A1 (en) 2012-07-19 2013-07-18 Changing icons on user input device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US13/553,636 US20140024456A1 (en) 2012-07-19 2012-07-19 Changing icons on user input device

Publications (1)

Publication Number Publication Date
US20140024456A1 true US20140024456A1 (en) 2014-01-23

Family

ID=48901192

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/553,636 Abandoned US20140024456A1 (en) 2012-07-19 2012-07-19 Changing icons on user input device

Country Status (2)

Country Link
US (1) US20140024456A1 (en)
WO (1) WO2014015198A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050110762A1 (en) * 2003-11-24 2005-05-26 Muyskens Neil H. Keyboard with changeable key display
US20090179854A1 (en) * 2008-01-11 2009-07-16 Apple Inc. Dynamic input graphic display
US20090231283A1 (en) * 2007-06-22 2009-09-17 Polak Robert D Colored Morphing Apparatus for an Electronic Device
US20110205161A1 (en) * 2010-02-22 2011-08-25 Stephen Myers Versatile keyboard input and output device
WO2013133686A1 (en) * 2012-03-06 2013-09-12 Rti Science & Technology Sdn Bhd Keyboard system with changeable key displays

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6454649B1 (en) * 1998-10-05 2002-09-24 International Game Technology Gaming device and method using programmable display switch
AU2003218071A1 (en) * 2002-03-11 2003-09-29 Tahl Salomon Systems and methods employing changeable touch-key
US20080092087A1 (en) * 2005-09-19 2008-04-17 Ronald Brown Method and Display Data Entry Unit
US20090073126A1 (en) * 2007-07-16 2009-03-19 Srivastava Aditya Narain Standardized method and systems for providing configurable keypads

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140378227A1 (en) * 2013-06-20 2014-12-25 Cheng Uei Precision Industry Co., Ltd. Button structure for game controller
US8961311B2 (en) * 2013-06-20 2015-02-24 Cheng Uei Precision Industry Co., Ltd. Button structure for game controller
USD741363S1 (en) * 2013-11-21 2015-10-20 Microsoft Corporation Display screen with icon
USD741365S1 (en) * 2013-11-21 2015-10-20 Microsoft Corporation Display screen with icon
USD741366S1 (en) * 2013-11-21 2015-10-20 Microsoft Corporation Display screen with icon
USD741364S1 (en) * 2013-11-21 2015-10-20 Microsoft Corporation Display screen with icon
EP3384969A1 (en) * 2017-04-06 2018-10-10 Playrapid Command input device for a joystick, with removable or detachable actuator
FR3064925A1 (en) * 2017-04-06 2018-10-12 Playrapid CONTROL INPUT DEVICE FOR A GAME LEVER, WITH REMOVABLE OR REMOVABLE ACTUATOR.
CN108721886A (en) * 2018-05-24 2018-11-02 京东方科技集团股份有限公司 A kind of game paddle and a kind of game machine
USD909413S1 (en) * 2018-11-01 2021-02-02 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
US20230084581A1 (en) * 2021-09-16 2023-03-16 Voyetra Turtle Beach Inc. Video game controller with a graphical user interface
EP4292683A1 (en) * 2022-06-14 2023-12-20 Heusinkveld Engineering B.V. Input device for a simulator environment, simulator provided therewith, and method for operating such device and simulator
NL2032166B1 (en) * 2022-06-14 2023-12-21 Heusinkveld Eng B V Input device for a simulator environment, simulator provided therewith, and method for operating such device and simulator

Also Published As

Publication number Publication date
WO2014015198A1 (en) 2014-01-23

Similar Documents

Publication Publication Date Title
US20140024456A1 (en) Changing icons on user input device
JP6387404B2 (en) User interface element selection via position signal
EP2597548A2 (en) Dynamic scaling of touch sensor
JP6006439B2 (en) Mechanical actuator device for touch screen
JP5518870B2 (en) Integration of tactile control device and touch-sensitive display
AU2013226573B2 (en) Systems and methods for presenting visual interface content
US20160004329A1 (en) Versatile keyboard input and output device
US20140049558A1 (en) Augmented reality overlay for control devices
CN102886140A (en) Game controller on touch-enabled mobile device
EP3066542A1 (en) Multitasking experiences with interactive picture-in-picture
US9808716B2 (en) Display grid for video game input on touchscreen display
US11650721B2 (en) Apparatus, method, and computer-readable storage medium for manipulating a user interface element
JP6216862B1 (en) GAME METHOD AND GAME PROGRAM
US20140282209A1 (en) Method for activating an application bar
JP2024511304A (en) State-based action buttons
EP2886173B1 (en) Augmented reality overlay for control devices
JP2018069040A (en) Gaming method and gaming program

Legal Events

Date Code Title Description
AS Assignment

Owner name: MICROSOFT CORPORATION, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ASHLEY, BRYON;MORRIS, QUINTIN;REEL/FRAME:028593/0717

Effective date: 20120718

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034544/0541

Effective date: 20141014