US20220292938A1 - Detachable accessory for enhanced effects, accessibility, spectating, and play-along - Google Patents


Info

Publication number
US20220292938A1
US20220292938A1 (application US 17/201,922)
Authority
US
United States
Prior art keywords
simulation
generator
speaker
light source
signals
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US17/201,922
Other versions
US11430309B1
Inventor
Steven Osman
Brielle Powell
Rizwana Rahman
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Interactive Entertainment Inc
Original Assignee
Sony Interactive Entertainment Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Interactive Entertainment Inc filed Critical Sony Interactive Entertainment Inc
Priority to US 17/201,922 (granted as US 11,430,309 B1)
Priority to PCT/US2022/020094 (published as WO 2022/197573 A1)
Application granted
Publication of US11430309B1
Publication of US20220292938A1
Legal status: Active
Adjusted expiration

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B7/00 - Signalling systems according to more than one of groups G08B3/00 - G08B6/00; Personal calling systems according to more than one of groups G08B3/00 - G08B6/00
    • G08B7/06 - Signalling systems according to more than one of groups G08B3/00 - G08B6/00; Personal calling systems according to more than one of groups G08B3/00 - G08B6/00 using electric transmission, e.g. involving audible and visible signalling through the use of sound and light sources
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016 - Input arrangements with force or tactile feedback as computer generated output to the user
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/25 - Output arrangements for video game devices
    • A63F13/28 - Output arrangements for video game devices responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
    • A63F13/285 - Generating tactile feedback signals via the game input device, e.g. force feedback
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B6/00 - Tactile signalling systems, e.g. personal calling systems

Definitions

  • the present application relates to technically inventive, non-routine solutions that are necessarily rooted in computer technology and that produce concrete technical improvements.
  • present principles are directed to a detachable accessory for enhanced effects, accessibility, spectating, and play-along that is used with either a handheld controller (for example, DualShock® or DualSense® made by Sony), or mobile device (e.g., a cell phone).
  • the accessory can have LEDs, speakers, haptic actuators, an accessory port, and other features.
  • the accessory enhances the experience by adding extra sound, light, and haptics, such as for indicating the direction of an enemy.
  • the accessory also can enhance accessibility by, e.g., replacing the game's sound with lights or haptics, replacing the game's visuals with sounds, or replacing the game's haptics with lights.
  • when detached, the accessory can be handed to a spectator or a second user. In this mode it can enhance the spectating experience with additional visuals, haptics, and sounds, as well as use light or sound to indicate what actions the player is taking (to help learn the game), e.g., what button they are pressing, what game they are launching, etc.
  • the light sources can indicate the player's actions which may be otherwise hard to see.
  • play-along through an accessory port is facilitated. This port can support buttons, joysticks, switches, etc., which can simulate button presses by the player, allowing the spectator to play along. These can be special accessibility buttons.
  • an assembly includes at least one computer control device such as a computer simulation controller, a mobile telephone, etc.
  • the assembly includes at least one accessory detachably engageable with the computer control device.
  • the accessory includes at least one haptic generator, and/or at least one light source, and/or at least one speaker, and at least one processor programmed with instructions to actuate the generator, or the light source, or the speaker, or any combination thereof in response to signals from a simulation being controlled by the computer control device.
  • the instructions can be executable by the processor to actuate the generator or the speaker or both the generator and the speaker responsive to a demanded image indicated by the signals from the simulation.
  • These embodiments may include at least one display device presenting the simulation, and the demanded image may not be presented by the display device such that the haptics or sound produced by the accessory replaces, as sensory input to the user, the demanded image that otherwise would be presented on the display device.
  • the instructions can be executable by the processor to actuate the generator or the light source or both the generator and the light source responsive to a demanded audio indicated by the signals from the simulation.
  • These embodiments may include at least one display device presenting the simulation, and the demanded audio may not be presented by the display device such that the light or haptics produced by the accessory replaces, as sensory input to the user, the demanded sound that otherwise would be presented on the display device.
  • the instructions can be executable by the processor to actuate the speaker or the light source or both the speaker and the light source responsive to a demanded haptic indicated by the signals from the simulation.
  • These embodiments may include at least one display device presenting the simulation, and the demanded haptic may not be presented by the display device such that the light or sound produced by the accessory replaces, as sensory input to the user, the demanded haptic that otherwise would be presented on the display device.
  • the instructions can be executable by the processor to, responsive to the accessory being disengaged from the control device, actuate at least one of the light source(s), speaker, or generator, or any combination thereof, to indicate quality of simulation control command input.
  • the accessory may include at least one element manipulable to generate the simulation control command input.
  • the assembly may include at least one display device presenting the simulation and separate from the accessory and the control device, and the instructions can be executable by the processor to activate the generator, or the light source, or the speaker, or any combination thereof in response to signals from the simulation.
  • the signals from the simulation include demanded audio and/or demanded images presented on the display device such that activation of the generator, or the light source, or the speaker, or any combination thereof augments the simulation displayed on the display device.
  • in another aspect, a device includes at least one computer storage that is not a transitory signal and that in turn includes instructions executable by at least one processor to execute any one or more of the below:
  • the signals from the simulation include demanded audio and/or demanded images presented on a display device such that activation of the generator, or the light source, or the speaker, or any combination thereof augments the simulation displayed on the display device;
  • in another aspect, a method includes detachably engaging, without fasteners, an accessory with a computer simulation controller, and using the computer simulation controller to control a computer simulation presented on a display. The method also includes using the accessory to produce haptics and/or light source activation and/or sound responsive to signals from the computer simulation. If desired, the method may include detaching the accessory from the computer simulation controller and using the accessory to input signals to the simulation.
  • FIG. 1 is a block diagram of an example system including an example in accordance with present principles;
  • FIG. 2 illustrates a first example embodiment of the accessory;
  • FIG. 3 illustrates a second example embodiment of the accessory;
  • FIG. 4 is a block diagram of example components of the accessory;
  • FIG. 5 illustrates example logic in example flow chart format for augmenting computer simulation (e.g., computer game) output signals;
  • FIG. 6 illustrates an example data structure in example table format consistent with FIG. 5;
  • FIG. 7 illustrates example logic in example flow chart format for replacing game sound with light source and/or haptics using the accessory;
  • FIG. 8 illustrates an example data structure in example table format consistent with FIG. 7;
  • FIG. 9 illustrates example logic in example flow chart format for replacing game images with sound and/or haptics using the accessory;
  • FIG. 10 illustrates an example data structure in example table format consistent with FIG. 9;
  • FIG. 11 illustrates example logic in example flow chart format for replacing game haptics with light source and/or sound using the accessory;
  • FIG. 12 illustrates an example data structure in example table format consistent with FIG. 11;
  • FIG. 13 illustrates an example user interface consistent with present principles; and
  • FIG. 14 illustrates example logic in example flow chart format for second person use of the accessory.
  • a system herein may include server and client components which may be connected over a network such that data may be exchanged between the client and server components.
  • the client components may include one or more computing devices including game consoles such as Sony PlayStation® or a game console made by Microsoft or Nintendo or other manufacturer, virtual reality (VR) headsets, augmented reality (AR) headsets, portable televisions (e.g., smart TVs, Internet-enabled TVs), portable computers such as laptops and tablet computers, and other mobile devices including smart phones and additional examples discussed below.
  • These client devices may operate with a variety of operating environments.
  • client computers may employ, as examples, Linux operating systems, operating systems from Microsoft, or a Unix operating system, or operating systems produced by Apple, Inc., or Google.
  • These operating environments may be used to execute one or more browsing programs, such as a browser made by Microsoft or Google or Mozilla or other browser program that can access websites hosted by the Internet servers discussed below.
  • an operating environment according to present principles may be used to execute one or more computer game programs.
  • Servers and/or gateways may include one or more processors executing instructions that configure the servers to receive and transmit data over a network such as the Internet. Or a client and server can be connected over a local intranet or a virtual private network.
  • a server or controller may be instantiated by a game console such as a Sony PlayStation®, a personal computer, etc.
  • servers and/or clients can include firewalls, load balancers, temporary storages, and proxies, and other network infrastructure for reliability and security.
  • servers may form an apparatus that implement methods of providing a secure community such as an online social website to network members.
  • a processor may be a single- or multi-chip processor that can execute logic by means of various lines such as address lines, data lines, and control lines and registers and shift registers.
  • a system having at least one of A, B, and C includes systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.
  • an example system 10 which may include one or more of the example devices mentioned above and described further below in accordance with present principles.
  • the first of the example devices included in the system 10 is a consumer electronics (CE) device such as an audio video device (AVD) 12 such as but not limited to an Internet-enabled TV with a TV tuner (equivalently, set top box controlling a TV).
  • the AVD 12 alternatively may also be a computerized Internet enabled (“smart”) telephone, a tablet computer, a notebook computer, an HMD, a wearable computerized device, a computerized Internet-enabled music player, computerized Internet-enabled headphones, a computerized Internet-enabled implantable device such as an implantable skin device, etc.
  • the AVD 12 is configured to undertake present principles (e.g., communicate with other CE devices to undertake present principles, execute the logic described herein, and perform any other functions and/or operations described herein).
  • the AVD 12 can be established by some or all of the components shown in FIG. 1 .
  • the AVD 12 can include one or more displays 14 that may be implemented by a high definition or ultra-high definition “4K” or higher flat screen and that may be touch-enabled for receiving user input signals via touches on the display.
  • the AVD 12 may include one or more speakers 16 for outputting audio in accordance with present principles, and at least one additional input device 18 such as an audio receiver/microphone for entering audible commands to the AVD 12 to control the AVD 12 .
  • the example AVD 12 may also include one or more network interfaces 20 for communication over at least one network 22 such as the Internet, a WAN, a LAN, etc.
  • the interface 20 may be, without limitation, a Wi-Fi transceiver, which is an example of a wireless computer network interface, such as but not limited to a mesh network transceiver. It is to be understood that the processor 24 controls the AVD 12 to undertake present principles, including the other elements of the AVD 12 described herein such as controlling the display 14 to present images thereon and receiving input therefrom.
  • the network interface 20 may be a wired or wireless modem or router, or other appropriate interface such as a wireless telephony transceiver, or Wi-Fi transceiver as mentioned above, etc.
  • the AVD 12 may also include one or more input ports 26 such as a high-definition multimedia interface (HDMI) port or a USB port to physically connect to another CE device and/or a headphone port to connect headphones to the AVD 12 for presentation of audio from the AVD 12 to a user through the headphones.
  • the input port 26 may be connected via wire or wirelessly to a cable or satellite source 26a of audio video content.
  • the source 26a may be a separate or integrated set top box, or a satellite receiver.
  • the source 26a may be a game console or disk player containing content.
  • the source 26a, when implemented as a game console, may include some or all of the components described below in relation to the CE device 48.
  • the AVD 12 may further include one or more computer memories 28 such as disk-based or solid-state storage that are not transitory signals, in some cases embodied in the chassis of the AVD as standalone devices or as a personal video recording device (PVR) or video disk player either internal or external to the chassis of the AVD for playing back AV programs or as removable memory media.
  • the AVD 12 can include a position or location receiver such as but not limited to a cellphone receiver, GPS receiver and/or altimeter 30 that is configured to receive geographic position information from a satellite or cellphone base station and provide the information to the processor 24 and/or determine an altitude at which the AVD 12 is disposed in conjunction with the processor 24 .
  • the component 30 may also be implemented by an inertial measurement unit (IMU) that typically includes a combination of accelerometers, gyroscopes, and magnetometers to determine the location and orientation of the AVD 12 in three dimensions.
  • the AVD 12 may include one or more cameras 32 that may be a thermal imaging camera, a digital camera such as a webcam, and/or a camera integrated into the AVD 12 and controllable by the processor 24 to gather pictures/images and/or video in accordance with present principles. Also included on the AVD 12 may be a Bluetooth transceiver 34 and other Near Field Communication (NFC) element 36 for communication with other devices using Bluetooth and/or NFC technology, respectively.
  • NFC element can be a radio frequency identification (RFID) element.
  • the AVD 12 may include one or more auxiliary sensors 38 (e.g., a motion sensor such as an accelerometer, gyroscope, cyclometer, or a magnetic sensor, an infrared (IR) sensor, an optical sensor, a speed and/or cadence sensor, a gesture sensor (e.g., for sensing gesture commands)) providing input to the processor 24.
  • the AVD 12 may include an over-the-air TV broadcast port 40 for receiving OTA TV broadcasts providing input to the processor 24 .
  • the AVD 12 may also include an infrared (IR) transmitter and/or IR receiver and/or IR transceiver 42 such as an IR data association (IRDA) device.
  • a battery (not shown) may be provided for powering the AVD 12 , as may be a kinetic energy harvester that may turn kinetic energy into power to charge the battery and/or power the AVD 12 .
  • a graphics processing unit (GPU) 44 and field programmable gate array 46 also may be included.
  • the system 10 may include one or more other CE device types.
  • a first CE device 48 may be a computer game console that can be used to send computer game audio and video to the AVD 12 via commands sent directly to the AVD 12 and/or through the below-described server while a second CE device 50 may include similar components as the first CE device 48 .
  • the second CE device 50 may be configured as a computer game controller manipulated by a player or a head-mounted display (HMD) worn by a player.
  • the below-described accessory may include appropriate components in FIG. 2 .
  • a device herein may implement some or all of the components shown for the AVD 12 . Any of the components shown in the following figures may incorporate some or all of the components shown in the case of the AVD 12 .
  • At least one server 52 includes at least one server processor 54 , at least one tangible computer readable storage medium 56 such as disk-based or solid-state storage, and at least one network interface 58 that, under control of the server processor 54 , allows for communication with the other devices of FIG. 1 over the network 22 , and indeed may facilitate communication between servers and client devices in accordance with present principles.
  • the network interface 58 may be, e.g., a wired or wireless modem or router, Wi-Fi transceiver, or other appropriate interface such as, e.g., a wireless telephony transceiver.
  • the server 52 may be an Internet server or an entire server “farm” and may include and perform “cloud” functions such that the devices of the system 10 may access a “cloud” environment via the server 52 in example embodiments for, e.g., network gaming applications.
  • the server 52 may be implemented by one or more game consoles or other computers in the same room as the other devices shown in FIG. 1 or nearby.
  • FIGS. 2-4 may include some or all components shown in FIG. 1 .
  • FIG. 2 illustrates an accessory 200 that may be detachably engaged with a computer simulation controller 202 such as but not limited to a PlayStation® controller or Xbox® controller.
  • the controller 202 controls presentation of a computer simulation such as a computer game on one or more display devices 204, which may receive game signals from one or more simulation consoles 206 and/or one or more streaming servers 208.
  • the accessory 200 may communicate via wired or wireless paths with any one or more of the components shown in FIG. 2 .
  • the contour of the accessory 200 substantially matches that of the controller 202 , and the accessory 200 may include raised walls 210 forming a partially open enclosure into which the controller 202 may be detachably received as in, for example, an interference fit or snapping engagement.
  • an accessory 300 may include a contour that may generally match the contour of a mobile communication device 302 such as, for example, a cell phone.
  • the accessory contemplated herein may have other shapes than those illustrated.
  • FIG. 4 shows that the accessory 200 or 300 may include relevant components from FIG. 1 and as potentially introduced in FIG. 4 , including one or more light sources 402 which may be implemented by light emitting diodes (LED).
  • the accessory may further include one or more audio speakers 404 and one or more haptics generators 406 .
  • These sensory output devices may be controlled by one or more processors 408 accessing one or more computer storages or memories 410 to execute logic herein.
  • the accessory may also include one or more accessory ports 412 which can be electrically engaged with one or more controls 414 such as buttons, joysticks, etc.
  • the controls 414 may include any or all of the controls on a PlayStation game controller, for instance.
  • the accessory may include one or more video displays 416 , one or more motion sensors 418 , and one or more cameras 420 for purposes to be shortly disclosed.
  • the accessory may include a processor, a wireless transceiver, and a power source such as a battery.
  • FIGS. 5 and 6 illustrate that in one use case the accessory contemplated herein may be used to augment sensory input to the player of a computer game.
  • Game signals may be received at block 500 , and the light source(s) 402 activated at block 502 responsive to the game signals to augment presentation of the game as sensed by the player.
  • the speaker(s) 404 may be activated at block 504 responsive to the game signals to augment presentation of the game as sensed by the player.
  • the haptics generator(s) 406 may be activated at block 506 responsive to the game signals to augment presentation of the game as sensed by the player.
  • the signals to activate the light sources/speakers/haptics on the accessory may come directly from the game, i.e., the game developer may program the activation signals into the metadata of the game with the accessory in mind.
  • the game metadata may contain explicit signals such as “blink LEDs on accessory at time t” or “play beep on accessory speaker at time t” or “activate haptics generator on accessory at time t”.
  • a data structure 600 may be accessed by the accessory processor 408 shown in FIG. 4 to correlate various in-game signals, shown in column 602 , to accessory actions, shown in column 604 .
  • if the game metadata indicates, for instance, an airplane flyover, the accessory can correlate that to actuating the haptics generator(s) 406 shown in FIG. 4, e.g., to buzz for a short period, and to play a jet sound effect on the speaker(s) 404 shown in FIG. 4.
  • if the game signal indicates a gunshot, this may be correlated to a single blink of the light source 402 shown in FIG. 4, playing of a popping sound effect on the speakers 404, and a single jolt from the haptics generator 406.
  • Other correlations may be developed as desired.
  • the correlation may be executed by any of the devices herein and sent to the accessory.
  • image recognition may be performed on images from the accessory camera 420 shown in FIG. 4 or other camera discussed herein to indicate the game signal in column 602 .
  • Audio recognition may also be performed on sounds played by the display device 204 for the same purpose.
  • FIGS. 7 and 8 illustrate replacing game audio with visual sensory input from the accessory and/or haptics input from the accessory.
  • FIGS. 9 and 10 illustrate replacing game image presentation with sound input from the accessory and/or haptics input from the accessory.
  • FIGS. 11 and 12 illustrate replacing game haptics with visual sensory input from the accessory and/or audio from the accessory. It should be noted that while these figures assume correlating game signals to replacement sensory input from the accessory, direct commands to the accessory to activate its various output devices may, as in the case above, be embedded directly in the game metadata, in which case the correlations discussed here are established a priori.
  • a signal representing a demanded sound from the simulation (e.g., computer game) being executed is received.
  • the signal is correlated at block 702 to a haptic and/or light source actuation, which is presented at block 704 by the accessory in a human-sensible way.
  • a signal may be sent, e.g., from the accessory to the simulation to not play or otherwise deactivate the demanded sound, which in effect is replaced by the visible or tactile output of the accessory.
  • FIG. 8 presents example correlations of sound to light source activation/haptic generation.
  • a demanded simulation sound for a car engine may be correlated to a buzzing signal generated by the haptic generator 406 of the accessory and/or a rolling sequence of light source activation using the light sources 402 of the accessory such that a sequence of light sources is activated one after another to present a sequence of illuminations.
  • a demanded gunshot signal from the simulation may be correlated to a single pop generated by the haptic generator 406 and/or a single bright flash from a light source 402 of the accessory.
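  • By way of non-limiting illustration, the sound-replacement flow just described might be sketched as follows; the table mirrors the FIG. 8 examples, while the stub functions, action names, and LED count are assumptions of this sketch rather than part of the patent.

```python
import time

NUM_LEDS = 8  # assumed; the patent does not specify a light source count

SOUND_REPLACEMENTS = {
    # demanded sound -> (haptic action, light action), per the FIG. 8 examples
    "car_engine": ("buzz", "rolling_sequence"),
    "gunshot": ("single_pop", "bright_flash"),
}

def drive_haptic(action):      # stand-ins for the accessory hardware drivers
    print(f"haptic 406: {action}")

def set_led(i, on):
    print(f"LED {i} {'on' if on else 'off'}")

def suppress_sound(sound):
    print(f"-> simulation: do not play '{sound}'")

def replace_sound(demanded_sound: str) -> None:
    """Replace a demanded simulation sound with haptics and/or light."""
    haptic, light = SOUND_REPLACEMENTS[demanded_sound]
    suppress_sound(demanded_sound)  # deactivate the demanded sound
    drive_haptic(haptic)
    if light == "rolling_sequence":
        # activate light sources 402 one after another: a rolling sequence
        for i in range(NUM_LEDS):
            set_led(i, on=True)
            time.sleep(0.05)
            set_led(i, on=False)
    else:
        set_led("all", on=True)     # single bright flash
        time.sleep(0.1)
        set_led("all", on=False)

replace_sound("car_engine")  # buzz plus rolling LED sequence
```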
  • a signal representing a demanded image from the simulation (e.g., computer game) being executed is received.
  • the image may be identified by metadata or image recognition performed on, e.g., images of the display device 204 from the camera 420 shown in FIG. 4 , in the event that the simulation does not include a direct call for an output device on the accessory.
  • the demanded image is correlated at block 904 to a haptic and/or speaker actuation, which is presented at block 906 by the accessory in a human-sensible way.
  • a signal may be sent, e.g., from the accessory to the simulation to not play or otherwise deactivate the demanded image, which in effect is replaced by the acoustic or tactile output of the accessory.
  • FIG. 10 presents example correlations of image to speaker activation/haptic generation.
  • An image of a plane, although demanded by the simulation, may not be presented on the display device 204; instead, a replacement jet sound may be played on the speakers 404 of the accessory.
  • An image of a bird, although demanded by the game, may not be presented on the display; instead, a chirp sound effect may be played on the speaker 404 of the accessory and/or a short mild buzz output by the haptic generator 406.
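  • A corresponding minimal sketch of the image-replacement flow of FIGS. 9 and 10, using the plane and bird examples above (function names again assumed):

```python
IMAGE_REPLACEMENTS = {
    # demanded image -> (replacement sound, optional haptic), per FIG. 10
    "plane": ("jet_sound", None),
    "bird": ("chirp", "short_mild_buzz"),
}

def play_sound(s): print(f"speaker 404: {s}")            # hardware stand-ins
def drive_haptic(h): print(f"haptic 406: {h}")
def suppress_image(i): print(f"-> simulation: do not render '{i}'")

def replace_image(demanded_image: str) -> None:
    """Replace a demanded simulation image with sound and/or haptics."""
    sound, haptic = IMAGE_REPLACEMENTS[demanded_image]
    suppress_image(demanded_image)  # the image is not shown on display 204
    play_sound(sound)
    if haptic is not None:
        drive_haptic(haptic)

replace_image("bird")  # chirp on the speaker plus a short mild buzz
```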
  • a signal representing a demanded haptic from the simulation (e.g., computer game) being executed is received.
  • the signal is correlated at block 1102 to a speaker and/or light source actuation, which is presented at block 1104 by the accessory in a human-sensible way.
  • a signal may be sent, e.g., from the accessory to the simulation to not play or otherwise deactivate the demanded haptic, which in effect is replaced by the visible or acoustic output of the accessory.
  • FIG. 12 presents example correlations of haptic to light source activation/sound generation.
  • a demanded long buzz from, e.g., the game controller during game play may be replaced by a steady illumination of a light source 402 on the accessory and/or by a steady buzzing sound effect played on the speaker 404 of the accessory.
  • a demanded pop from the game controller may be replaced by a quick on-off illumination of the light source 402 of the accessory.
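  • And a similar sketch of the haptic-replacement flow of FIGS. 11 and 12, under the same assumptions:

```python
HAPTIC_REPLACEMENTS = {
    # demanded controller haptic -> (light action, optional sound), per FIG. 12
    "long_buzz": ("steady_on", "steady_buzzing_sound"),
    "pop": ("quick_on_off", None),
}

def set_light(mode): print(f"light source 402: {mode}")  # hardware stand-ins
def play_sound(s): print(f"speaker 404: {s}")
def suppress_haptic(h): print(f"-> controller: do not play haptic '{h}'")

def replace_haptic(demanded_haptic: str) -> None:
    """Replace a demanded controller haptic with light and/or sound."""
    light, sound = HAPTIC_REPLACEMENTS[demanded_haptic]
    suppress_haptic(demanded_haptic)
    set_light(light)
    if sound is not None:
        play_sound(sound)

replace_haptic("long_buzz")  # steady illumination plus a steady buzzing sound
```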
  • FIG. 13 illustrates an example user interface (UI) 1300 that may be presented on a display 1302 such as any of the displays herein to enable selection of present features.
  • a selector 1304 may be selectable to enable the logic of FIG. 5 to augment game presentation with signals from the accessory.
  • a replace selector 1306 may be used to select, from a menu 1308, replacement of demanded sound, or images, or haptics from the display device 204/game controller with sensory output from the accessory consistent with FIGS. 7-12.
  • a selector 1310 may be selected to enable the logic of FIG. 14 to enhance the experience of a second person such as a spectator of a computer game or second player of the computer game.
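  • One hypothetical way the FIG. 13 selections could be modeled in software; the enum and field names are illustrative, not taken from the patent:

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class ReplaceTarget(Enum):  # menu 1308 choices
    SOUND = auto()
    IMAGES = auto()
    HAPTICS = auto()

@dataclass
class AccessorySettings:
    augment: bool = False          # selector 1304 enables the FIG. 5 logic
    replace: set = field(default_factory=set)  # selector 1306 + menu 1308
    second_person: bool = False    # selector 1310 enables the FIG. 14 logic

settings = AccessorySettings(augment=True, replace={ReplaceTarget.SOUND})
print(settings)
```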
  • an indication is received to use the accessory to enhance the gaming experience of a second person such as a second gamer or spectator of a game being played by a first player.
  • This indication may be detachment of the accessory 200/300 from a control device or attachment of the accessory 200/300 to a control device, as indicated by, e.g., images from the camera 420 on the accessory or signals from the motion sensor 418, or as indicated by selection of the selector 1310 in FIG. 13.
  • the accessory receives game signals from the computer simulation (e.g., game) being executed and controlled using the controller 202 shown in FIG. 2 .
  • the light source(s) 402 /speaker(s) 404 /haptic generator(s) 406 of the accessory are activated according to the control signals from the game. In this way, when, for example, a shot is fired in the game, a spectator holding the accessory may experience a pop sound from the accessory speaker 404 and/or a flash of the light source 402 and/or a jolt from the haptics generator 406 , indicating the event occurring in the game.
  • some of the visuals, sounds, and/or haptics presented by the accessory can be tied directly to the player's button presses. That way the spectator holding the accessory can feel/see button presses that are otherwise occluded by the player's fingers and hands. This could help, for instance, to learn the rhythm of the button presses.
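  • A brief sketch of tying accessory output to the player's button presses so a spectator can follow the rhythm of play; the button names and cue mapping are illustrative only:

```python
BUTTON_CUES = {
    # player's button -> (LED index, click sound) echoed on the accessory;
    # the mapping is illustrative, not specified by the patent
    "cross": (0, "click_low"),
    "circle": (1, "click_mid"),
    "square": (2, "click_high"),
    "triangle": (3, "click_top"),
}

def mirror_button_press(button: str) -> None:
    """Echo a (possibly occluded) button press as light and sound so a
    spectator holding the detached accessory can follow the rhythm of play."""
    cue = BUTTON_CUES.get(button)
    if cue is not None:
        led, sound = cue
        print(f"LED {led} flash; speaker 404: {sound}")

for b in ["cross", "cross", "circle"]:  # the player's press rhythm
    mirror_button_press(b)
```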
  • in some cases it may make sense to replace haptics with haptics. For example, some people may feel that high frequency haptics in a controller are uncomfortable. In that case, the controller haptics may be disabled and replaced with a different, more palatable type of haptic presented on the accessory.
  • Block 1406 indicates that, in addition to enhancing second person experiences, one or more of the controls 414 shown in FIG. 4 may be connected to the accessory port 412 and manipulated by the second person to input game control signals to the accessory processor 408. These signals may be sent from the accessory to the device executing the game at block 1408 to allow play-through of the game in accordance with the signals received at block 1406, i.e., to enable the second person holding the accessory to input game commands to control the game.
  • correctness of the input received at block 1408 may be determined at block 1410 .
  • metadata from the game or another source may indicate which game commands are better than others, e.g., based on a history of game play by experts, and from this data the correctness of the second person's input may be determined by measuring how much the input diverged from the “best” expert input, to indicate quality of simulation control command input.
  • Block 1412 indicates that the light source(s) 402 /speaker(s) 404 /haptics generator(s) 406 of the accessory may be activated consistent with the determination at block 1410 to provide sensory information as to the correctness of the second person's play, to train the second person, for instance. For example, a green light source 402 may be illuminated and a gentle chime played on the speaker 404 for a good input, while for a less good input a red-light source 402 may be illuminated, an unpleasant buzz played on the speaker 404 , and a jolt emitted by the haptic generator 406 .
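  • A sketch of the block 1406-1412 flow: forward the second person's input to the game, score it against an expert history, and signal correctness on the accessory. The scoring metric and data shapes are assumptions; the patent requires only that divergence from the "best" expert input be measured:

```python
def send_to_game(button: str) -> None:
    """Forward a play-along input to the device executing the game (block 1408)."""
    print(f"-> game: {button}")

def score_input(button: str, t: float,
                expert_log: list, window: float = 0.5) -> float:
    """Score input against an expert history of (timestamp, button) pairs,
    returning 1.0 for a perfect match and decaying to 0.0 as the timing
    divergence approaches `window` seconds (block 1410)."""
    if not expert_log:
        return 0.0
    te, expected = min(expert_log, key=lambda e: abs(e[0] - t))
    if expected != button:
        return 0.0
    return max(0.0, 1.0 - abs(te - t) / window)

def feedback(score: float) -> None:
    """Activate accessory outputs per the determination (block 1412)."""
    if score >= 0.7:
        print("green light 402 + gentle chime on speaker 404")
    else:
        print("red light 402 + unpleasant buzz on speaker 404 + jolt from 406")

EXPERT = [(1.0, "cross"), (2.0, "circle")]  # hypothetical expert history
send_to_game("cross")
feedback(score_input("cross", t=1.1, expert_log=EXPERT))   # good input
feedback(score_input("square", t=2.0, expert_log=EXPERT))  # poor input
```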
  • the first person can detach the accessory from his controller and the second person attach it to her controller.
  • the second person can then use her controller as input and output rather than being limited to the output devices connected to the accessory and input devices connected to the accessory's accessory port.
  • Uses of the accessory can be mixed and matched, e.g., disengage the device from the first person's controller and engage with the second person's phone, or disengage it from the first person's phone and engage the accessory with the second person's controller.
  • the accessory can also be used for a physical transfer/transmission of a gameplay session.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A detachable accessory for a computer game controller or a mobile device includes LEDs, speakers, haptic actuators, an accessory port, and other features. When attached to the phone or controller, the device can enhance the experience by adding extra sound, light, and haptics, or it can replace the game's sound with lights or haptics, replace game visuals with sounds, and replace game haptics with lights. When detached, the device can be handed to a second person to enhance the spectating experience with additional visuals, haptics, and sounds, to use light, sound, or haptics to indicate what actions the player is taking (to help learn the game), and to allow play-along through an accessory port that supports controls.

Description

    FIELD
  • The present application relates to technically inventive, non-routine solutions that are necessarily rooted in computer technology and that produce concrete technical improvements.
  • BACKGROUND
  • As understood herein, the growing variety and richness of computer games opens a window to further sensory output for multiple purposes.
  • SUMMARY
  • Accordingly, present principles are directed to a detachable accessory for enhanced effects, accessibility, spectating, and play-along that is used with either a handheld controller (for example, DualShock® or DualSense® made by Sony), or mobile device (e.g., a cell phone). The accessory can have LEDs, speakers, haptic actuators, an accessory port, and other features. When attached to the phone or controller, the accessory enhances the experience by adding extra sound, light, and haptics, such as for indicating the direction of an enemy.
  • The accessory also can enhance accessibility by, e.g., replacing the game's sound with lights or haptics, replacing the game's visuals with sounds, or replacing the game's haptics with lights.
  • When detached, the accessory can be handed to a spectator or a second user. In this mode it can enhance the spectating experience with additional visuals, haptics, and sounds, as well as use light or sound to indicate what actions the player is taking (to help learn the game), e.g., what button they are pressing, what game they are launching, etc. The light sources can indicate the player's actions which may be otherwise hard to see.
  • Play-along through an accessory port is facilitated. This port can support buttons, joysticks, switches, etc., which can simulate button presses by the player, allowing the spectator to play along. These can be special accessibility buttons.
  • Accordingly, an assembly includes at least one computer control device such as a computer simulation controller, a mobile telephone, etc. The assembly includes at least one accessory detachably engageable with the computer control device. The accessory includes at least one haptic generator, and/or at least one light source, and/or at least one speaker, and at least one processor programmed with instructions to actuate the generator, or the light source, or the speaker, or any combination thereof in response to signals from a simulation being controlled by the computer control device.
  • In some embodiments, the instructions can be executable by the processor to actuate the generator or the speaker or both the generator and the speaker responsive to a demanded image indicated by the signals from the simulation. These embodiments may include at least one display device presenting the simulation, and the demanded image may not be presented by the display device such that the haptics or sound produced by the accessory replaces, as sensory input to the user, the demanded image that otherwise would be presented on the display device.
  • In some embodiments, the instructions can be executable by the processor to actuate the generator or the light source or both the generator and the light source responsive to a demanded audio indicated by the signals from the simulation. These embodiments may include at least one display device presenting the simulation, and the demanded audio may not be presented by the display device such that the light or haptics produced by the accessory replaces, as sensory input to the user, the demanded sound that otherwise would be presented on the display device.
  • In some embodiments, the instructions can be executable by the processor to actuate the speaker or the light source or both the speaker and the light source responsive to a demanded haptic indicated by the signals from the simulation. These embodiments may include at least one display device presenting the simulation, and the demanded haptic may not be presented by the display device such that the light or sound produced by the accessory replaces, as sensory input to the user, the demanded haptic that otherwise would be presented on the display device.
  • In some embodiments, the instructions can be executable by the processor to, responsive to the accessory being disengaged from the control device, actuate at least one of the light source(s), speaker, or generator, or any combination thereof, to indicate quality of simulation control command input. The accessory may include at least one element manipulable to generate the simulation control command input.
  • The assembly may include at least one display device presenting the simulation and separate from the accessory and the control device, and the instructions can be executable by the processor to activate the generator, or the light source, or the speaker, or any combination thereof in response to signals from the simulation. The signals from the simulation include demanded audio and/or demanded images presented on the display device such that activation of the generator, or the light source, or the speaker, or any combination thereof augments the simulation displayed on the display device.
  • In another aspect, a device includes at least one computer storage that is not a transitory signal and that in turn includes instructions executable by at least one processor to execute any one or more of the below:
  • activate, on an accessory detachably engaged with a control device for a computer simulation, at least one haptics generator, or at least one light source, or at least one speaker, or any combination thereof in response to signals from the simulation. The signals from the simulation include demanded audio and/or demanded images presented on a display device such that activation of the generator, or the light source, or the speaker, or any combination thereof augments the simulation displayed on the display device;
  • actuate the generator or the speaker or both the generator and the speaker responsive to a demanded image indicated by the signals from the simulation;
  • actuate the generator or the light source or both the generator and the light source responsive to a demanded audio indicated by the signals from the simulation;
  • actuate the speaker or the light source or both the speaker and the light source responsive to a demanded haptic indicated by the signals from the simulation;
  • responsive to the accessory being disengaged from the control device, actuate at least one of the light source(s), speaker, or generator, or any combination thereof, to indicate quality of simulation control command input.
  • In another aspect, a method includes detachably engaging, without fasteners, an accessory with a computer simulation controller, and using the computer simulation controller to control a computer simulation presented on a display. The method also includes using the accessory to produce haptics and/or light source activation and/or sound responsive to signals from the computer simulation. If desired, the method may include detaching the accessory from the computer simulation controller and using the accessory to input signals to the simulation.
  • The details of the present application, both as to its structure and operation, can best be understood in reference to the accompanying drawings, in which like reference numerals refer to like parts, and in which:
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram of an example system including an example in accordance with present principles;
  • FIG. 2 illustrates a first example embodiment of the accessory;
  • FIG. 3 illustrates a second example embodiment of the accessory;
  • FIG. 4 is a block diagram of example components of the accessory;
  • FIG. 5 illustrates example logic in example flow chart format for augmenting computer simulation (e.g., computer game) output signals;
  • FIG. 6 illustrates an example data structure in example table format consistent with FIG. 5;
  • FIG. 7 illustrates example logic in example flow chart format for replacing game sound with light source and/or haptics using the accessory;
  • FIG. 8 illustrates an example data structure in example table format consistent with FIG. 7;
  • FIG. 9 illustrates example logic in example flow chart format for replacing game images with sound and/or haptics using the accessory;
  • FIG. 10 illustrates an example data structure in example table format consistent with FIG. 9;
  • FIG. 11 illustrates example logic in example flow chart format for replacing game haptics with light source and/or sound using the accessory;
  • FIG. 12 illustrates an example data structure in example table format consistent with FIG. 11;
  • FIG. 13 illustrates an example user interface consistent with present principles; and
  • FIG. 14 illustrates example logic in example flow chart format for second person use of the accessory.
  • DETAILED DESCRIPTION
  • This disclosure relates generally to computer ecosystems including aspects of consumer electronics (CE) device networks such as but not limited to computer game networks including drones used for gaming or non-gaming purposes. A system herein may include server and client components which may be connected over a network such that data may be exchanged between the client and server components. The client components may include one or more computing devices including game consoles such as Sony PlayStation® or a game console made by Microsoft or Nintendo or other manufacturer, virtual reality (VR) headsets, augmented reality (AR) headsets, portable televisions (e.g., smart TVs, Internet-enabled TVs), portable computers such as laptops and tablet computers, and other mobile devices including smart phones and additional examples discussed below. These client devices may operate with a variety of operating environments. For example, some of the client computers may employ, as examples, Linux operating systems, operating systems from Microsoft, or a Unix operating system, or operating systems produced by Apple, Inc., or Google. These operating environments may be used to execute one or more browsing programs, such as a browser made by Microsoft or Google or Mozilla or other browser program that can access websites hosted by the Internet servers discussed below. Also, an operating environment according to present principles may be used to execute one or more computer game programs.
  • Servers and/or gateways may include one or more processors executing instructions that configure the servers to receive and transmit data over a network such as the Internet. Or a client and server can be connected over a local intranet or a virtual private network. A server or controller may be instantiated by a game console such as a Sony PlayStation®, a personal computer, etc.
  • Information may be exchanged over a network between the clients and servers. To this end and for security, servers and/or clients can include firewalls, load balancers, temporary storages, and proxies, and other network infrastructure for reliability and security. One or more servers may form an apparatus that implement methods of providing a secure community such as an online social website to network members.
  • A processor may be a single- or multi-chip processor that can execute logic by means of various lines such as address lines, data lines, and control lines and registers and shift registers.
  • Components included in one embodiment can be used in other embodiments in any appropriate combination. For example, any of the various components described herein and/or depicted in the Figures may be combined, interchanged, or excluded from other embodiments.
  • “A system having at least one of A, B, and C” (likewise “a system having at least one of A, B, or C” and “a system having at least one of A, B, C”) includes systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.
  • Now specifically referring to FIG. 1, an example system 10 is shown, which may include one or more of the example devices mentioned above and described further below in accordance with present principles. The first of the example devices included in the system 10 is a consumer electronics (CE) device such as an audio video device (AVD) 12 such as but not limited to an Internet-enabled TV with a TV tuner (equivalently, set top box controlling a TV). The AVD 12 alternatively may also be a computerized Internet enabled (“smart”) telephone, a tablet computer, a notebook computer, an HMD, a wearable computerized device, a computerized Internet-enabled music player, computerized Internet-enabled headphones, a computerized Internet-enabled implantable device such as an implantable skin device, etc. Regardless, it is to be understood that the AVD 12 is configured to undertake present principles (e.g., communicate with other CE devices to undertake present principles, execute the logic described herein, and perform any other functions and/or operations described herein).
  • Accordingly, to undertake such principles the AVD 12 can be established by some or all of the components shown in FIG. 1. For example, the AVD 12 can include one or more displays 14 that may be implemented by a high definition or ultra-high definition “4K” or higher flat screen and that may be touch-enabled for receiving user input signals via touches on the display. The AVD 12 may include one or more speakers 16 for outputting audio in accordance with present principles, and at least one additional input device 18 such as an audio receiver/microphone for entering audible commands to the AVD 12 to control the AVD 12. The example AVD 12 may also include one or more network interfaces 20 for communication over at least one network 22 such as the Internet, a WAN, a LAN, etc., under control of one or more processors 24. A graphics processor may also be included. Thus, the interface 20 may be, without limitation, a Wi-Fi transceiver, which is an example of a wireless computer network interface, such as but not limited to a mesh network transceiver. It is to be understood that the processor 24 controls the AVD 12 to undertake present principles, including the other elements of the AVD 12 described herein such as controlling the display 14 to present images thereon and receiving input therefrom. Furthermore, note the network interface 20 may be a wired or wireless modem or router, or other appropriate interface such as a wireless telephony transceiver, or Wi-Fi transceiver as mentioned above, etc.
  • In addition to the foregoing, the AVD 12 may also include one or more input ports 26 such as a high-definition multimedia interface (HDMI) port or a USB port to physically connect to another CE device and/or a headphone port to connect headphones to the AVD 12 for presentation of audio from the AVD 12 to a user through the headphones. For example, the input port 26 may be connected via wire or wirelessly to a cable or satellite source 26a of audio video content. Thus, the source 26a may be a separate or integrated set top box, or a satellite receiver. Or the source 26a may be a game console or disk player containing content. The source 26a, when implemented as a game console, may include some or all of the components described below in relation to the CE device 48.
  • The AVD 12 may further include one or more computer memories 28 such as disk-based or solid-state storage that are not transitory signals, in some cases embodied in the chassis of the AVD as standalone devices or as a personal video recording device (PVR) or video disk player either internal or external to the chassis of the AVD for playing back AV programs or as removable memory media. Also, in some embodiments, the AVD 12 can include a position or location receiver such as but not limited to a cellphone receiver, GPS receiver and/or altimeter 30 that is configured to receive geographic position information from a satellite or cellphone base station and provide the information to the processor 24 and/or determine an altitude at which the AVD 12 is disposed in conjunction with the processor 24. The component 30 may also be implemented by an inertial measurement unit (IMU) that typically includes a combination of accelerometers, gyroscopes, and magnetometers to determine the location and orientation of the AVD 12 in three dimensions.
  • Continuing the description of the AVD 12, in some embodiments the AVD 12 may include one or more cameras 32 that may be a thermal imaging camera, a digital camera such as a webcam, and/or a camera integrated into the AVD 12 and controllable by the processor 24 to gather pictures/images and/or video in accordance with present principles. Also included on the AVD 12 may be a Bluetooth transceiver 34 and other Near Field Communication (NFC) element 36 for communication with other devices using Bluetooth and/or NFC technology, respectively. An example NFC element can be a radio frequency identification (RFID) element.
  • Further still, the AVD 12 may include one or more auxiliary sensors 38 (e.g., a motion sensor such as an accelerometer, gyroscope, cyclometer, or a magnetic sensor, an infrared (IR) sensor, an optical sensor, a speed and/or cadence sensor, a gesture sensor (e.g., for sensing gesture commands)) providing input to the processor 24. The AVD 12 may include an over-the-air TV broadcast port 40 for receiving OTA TV broadcasts providing input to the processor 24. In addition to the foregoing, it is noted that the AVD 12 may also include an infrared (IR) transmitter and/or IR receiver and/or IR transceiver 42 such as an IR data association (IRDA) device. A battery (not shown) may be provided for powering the AVD 12, as may be a kinetic energy harvester that may turn kinetic energy into power to charge the battery and/or power the AVD 12. A graphics processing unit (GPU) 44 and field programmable gate array 46 also may be included.
  • Still referring to FIG. 1, in addition to the AVD 12, the system 10 may include one or more other CE device types. In one example, a first CE device 48 may be a computer game console that can be used to send computer game audio and video to the AVD 12 via commands sent directly to the AVD 12 and/or through the below-described server while a second CE device 50 may include similar components as the first CE device 48. In the example shown, the second CE device 50 may be configured as a computer game controller manipulated by a player or a head-mounted display (HMD) worn by a player. Only two CE devices are shown in this example, it being understood that fewer or more devices may be used. For example, the below-described accessory may include appropriate components in FIG. 2. A device herein may implement some or all of the components shown for the AVD 12. Any of the components shown in the following figures may incorporate some or all of the components shown in the case of the AVD 12.
  • Now in reference to the afore-mentioned at least one server 52, it includes at least one server processor 54, at least one tangible computer readable storage medium 56 such as disk-based or solid-state storage, and at least one network interface 58 that, under control of the server processor 54, allows for communication with the other devices of FIG. 1 over the network 22, and indeed may facilitate communication between servers and client devices in accordance with present principles. Note that the network interface 58 may be, e.g., a wired or wireless modem or router, Wi-Fi transceiver, or other appropriate interface such as, e.g., a wireless telephony transceiver.
  • Accordingly, in some embodiments the server 52 may be an Internet server or an entire server “farm” and may include and perform “cloud” functions such that the devices of the system 10 may access a “cloud” environment via the server 52 in example embodiments for, e.g., network gaming applications. Or the server 52 may be implemented by one or more game consoles or other computers in the same room as the other devices shown in FIG. 1 or nearby.
  • The components shown in FIGS. 2-4 may include some or all components shown in FIG. 1.
  • FIG. 2 illustrates an accessory 200 that may be detachably engaged with a computer simulation controller 202 such as but not limited to a PlayStation® controller or Xbox® controller. The controller 202 controls presentation of a computer simulation such as a computer game on one or more display devices 204, which may receive game signals from one or more simulation consoles 206 and/or one or more streaming servers 208. It is to be understood that the accessory 200 may communicate via wired or wireless paths with any one or more of the components shown in FIG. 2.
  • In the example shown, the contour of the accessory 200 substantially matches that of the controller 202, and the accessory 200 may include raised walls 210 forming a partially open enclosure into which the controller 202 may be detachably received as in, for example, an interference fit or snapping engagement.
  • Alternatively, as shown in FIG. 3 an accessory 300 may include a contour that may generally match the contour of a mobile communication device 302 such as, for example, a cell phone. The accessory contemplated herein may have other shapes than those illustrated.
  • FIG. 4 shows that the accessory 200 or 300 may include relevant components from FIG. 1 and as potentially introduced in FIG. 4, including one or more light sources 402 which may be implemented by light emitting diodes (LEDs). The accessory may further include one or more audio speakers 404 and one or more haptics generators 406. These output devices may be controlled by one or more processors 408 accessing one or more computer storages or memories 410 to execute the logic described herein.
  • The accessory may also include one or more accessory ports 412 which can be electrically engaged with one or more controls 414 such as buttons, joysticks, etc. The controls 414 may include any or all of the controls on a PlayStation game controller, for instance. In some embodiments the accessory may include one or more video displays 416, one or more motion sensors 418, and one or more cameras 420 for purposes to be shortly disclosed. As mentioned elsewhere herein, the accessory may include a processor, a wireless transceiver, and a power source such as a battery.
  • FIGS. 5 and 6 illustrate that in one use case the accessory contemplated herein may be used to augment sensory input to the player of a computer game. Game signals may be received at block 500, and the light source(s) 402 activated at block 502 responsive to the game signals to augment presentation of the game as sensed by the player. Similarly, the speaker(s) 404 may be activated at block 504 responsive to the game signals to augment presentation of the game as sensed by the player. Likewise, the haptics generator(s) 406 may be activated at block 506 responsive to the game signals to augment presentation of the game as sensed by the player.
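  • To make the FIG. 5 flow concrete, a minimal sketch of an accessory-side dispatcher follows. The GameSignal fields and the Accessory interface (set_light, play_sound, pulse_haptic) are illustrative assumptions, not part of the disclosure.

```python
# Minimal sketch of the FIG. 5 augmentation flow (blocks 500-506).
# The GameSignal fields and Accessory methods are hypothetical.
from dataclasses import dataclass, field

@dataclass
class GameSignal:
    kind: str                        # e.g., "light", "sound", or "haptic"
    payload: dict = field(default_factory=dict)

class Accessory:
    def set_light(self, pattern: str) -> None: ...          # light source(s) 402
    def play_sound(self, effect: str) -> None: ...          # speaker(s) 404
    def pulse_haptic(self, duration_ms: int) -> None: ...   # haptics generator(s) 406

def augment(accessory: Accessory, signal: GameSignal) -> None:
    # Block 500: a game signal has been received; blocks 502-506 fan it
    # out to the accessory output device named in the signal.
    if signal.kind == "light":
        accessory.set_light(signal.payload.get("pattern", "blink"))
    elif signal.kind == "sound":
        accessory.play_sound(signal.payload.get("effect", "beep"))
    elif signal.kind == "haptic":
        accessory.pulse_haptic(signal.payload.get("duration_ms", 100))
```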
  • In some embodiments the signals to activate the light sources/speakers/haptics on the accessory may come directly from the game, i.e., the game developer may program the activation signals into the metadata of the game with the accessory in mind. Thus, the game metadata may contain explicit signals such as “blink LEDs on accessory at time t” or “play beep on accessory speaker at time t” or “activate haptics generator on accessory at time t”.
  • In other embodiments a data structure 600 may be accessed by the accessory processor 408 shown in FIG. 4 to correlate various in-game signals, shown in column 602, to accessory actions, shown in column 604. Thus, for instance, if the game metadata indicates an airplane flyover, the accessory can correlate that to actuating the haptics generator(s) 406 shown in FIG. 4, e.g., to buzz for a short period, and to play a jet sound effect on the speaker(s) 404 shown in FIG. 4. On the other hand, if the game signal indicates a gunshot, this may be correlated to a single blink of the light source 402 shown in FIG. 4, playing of a popping sound effect on the speakers 404, and a single jolt from the haptics generator 406. Other correlations may be developed as desired.
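  • The data structure 600 can be pictured as a lookup table keyed by the in-game signal of column 602. Below is a minimal sketch; the signal names, action tuples, and parameter values are illustrative assumptions drawn from the examples above.

```python
# Sketch of data structure 600: in-game signals (column 602) mapped to
# accessory actions (column 604). All names and values are illustrative.
CORRELATIONS = {
    "airplane_flyover": [
        ("haptic", {"pattern": "buzz", "duration_ms": 800}),   # buzz for a short period
        ("sound",  {"effect": "jet"}),                         # jet sound effect
    ],
    "gunshot": [
        ("light",  {"pattern": "single_blink"}),
        ("sound",  {"effect": "pop"}),
        ("haptic", {"pattern": "jolt", "duration_ms": 50}),
    ],
}

def actions_for(game_signal: str) -> list:
    # Return the accessory actions correlated with a game signal,
    # or no actions if the signal has no entry in the table.
    return CORRELATIONS.get(game_signal, [])
```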
  • In other embodiments the correlation may be executed by any of the devices herein and sent to the accessory.
  • In other embodiments, in lieu of game metadata indicating the game signal in column 602, image recognition may be performed on images from the accessory camera 420 shown in FIG. 4 or other camera discussed herein to indicate the game signal in column 602. Audio recognition may also be performed on sounds played by the display device 204 for the same purpose.
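  • Where the game signal must be inferred rather than read from metadata, the flow might resemble the sketch below; classify_frame is a placeholder standing in for whatever image-recognition model is used, and the confidence threshold is an assumption.

```python
# Sketch: inferring the column-602 game signal from images of the
# display device 204 when metadata carries no explicit signal.

def classify_frame(frame) -> tuple[str, float]:
    # Placeholder for a real image-recognition model;
    # returns a (label, confidence) pair.
    return "gunshot", 0.9

def infer_game_signal(capture_frame) -> str | None:
    frame = capture_frame()                   # e.g., from camera 420
    label, confidence = classify_frame(frame)
    # Only trust a confident classification as the game signal.
    return label if confidence >= 0.8 else None
```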
  • FIGS. 7 and 8 illustrate replacing game audio with visual and/or haptic input from the accessory, FIGS. 9 and 10 illustrate replacing game image presentation with sound and/or haptic input from the accessory, while FIGS. 11 and 12 illustrate replacing game haptics with visual and/or audio input from the accessory. It should be noted that while these figures assume correlating game signals to replacement sensory input from the accessory, as in the case above, direct commands to the accessory to activate its various output devices may be embedded directly in the game metadata, in which case the correlations discussed are done a priori.
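  • All three substitution flows share the same shape: receive a demanded output, correlate it to replacements in other modalities, actuate the accessory, and optionally suppress the original. A single hedged sketch, with a hypothetical REPLACEMENTS table and actuate/suppress callbacks, could cover them all:

```python
# One sketch covering the FIGS. 7-12 substitution flows. The table
# contents and callback signatures are assumptions for illustration.
REPLACEMENTS = {
    # demanded (modality, event)   -> replacement accessory actions
    ("sound",  "car_engine"): [("haptic", "buzz"), ("light", "rolling")],
    ("image",  "plane"):      [("sound", "jet")],
    ("haptic", "long_buzz"):  [("light", "steady"), ("sound", "buzz")],
}

def replace_output(modality: str, event: str, actuate, suppress) -> None:
    # Correlate the demanded output to replacement accessory actions
    # (e.g., blocks 702, 904, 1102) and present them (704, 906, 1104).
    for out_modality, pattern in REPLACEMENTS.get((modality, event), []):
        actuate(out_modality, pattern)
    # Optionally tell the simulation not to present the original
    # output (e.g., blocks 706, 908, 1106).
    suppress(modality, event)
```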
  • Considering FIGS. 7 and 8, at block 700 in FIG. 7 a signal representing a demanded sound from the simulation (e.g., computer game) being executed is received. The signal is correlated at block 702 to a haptic and/or light source actuation, which is presented at block 704 by the accessory in a human-sensible way. If desired, in one option at block 706 a signal may be sent, e.g., from the accessory to the simulation to not play or otherwise deactivate the demanded sound, which in effect is replaced by the visible or tactile output of the accessory.
  • FIG. 8 presents example correlations of sound to light source activation/haptic generation. A demanded simulation sound for a car engine may be correlated to a buzzing signal generated by the haptic generator 406 of the accessory and/or a rolling sequence of light source activation using the light sources 402 of the accessory such that a sequence of light sources is activated one after another to present a sequence of illuminations. A demanded gunshot signal from the simulation may be correlated to a single pop generated by the haptic generator 406 and/or a single bright flash from a light source 402 of the accessory.
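  • The rolling sequence of light source activation might be implemented along these lines; the led_on/led_off callbacks and the 50 ms step are assumptions.

```python
# Sketch of the rolling light sequence for, e.g., a car-engine sound:
# light sources 402 are activated one after another. Timing is assumed.
import time

def rolling_lights(num_leds: int, led_on, led_off, step_s: float = 0.05) -> None:
    for i in range(num_leds):
        led_on(i)               # illuminate the next light in the sequence
        time.sleep(step_s)      # hold it briefly
        led_off(i)              # then move on, producing a rolling effect
```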
  • Turning now to FIGS. 9 and 10, at block 900 in FIG. 9 a signal representing a demanded image from the simulation (e.g., computer game) being executed is received. At block 902 the image may be identified by metadata or image recognition performed on, e.g., images of the display device 204 from the camera 420 shown in FIG. 4, in the event that the simulation does not include a direct call for an output device on the accessory. The demanded image is correlated at block 904 to a haptic and/or speaker actuation, which is presented at block 906 by the accessory in a human-sensible way. If desired, in one option at block 908 a signal may be sent, e.g., from the accessory to the simulation to not play or otherwise deactivate the demanded image, which in effect is replaced by the acoustic or tactile output of the accessory.
  • FIG. 10 presents example correlations of image to speaker activation/haptic generation. An image of a plane, although demanded by the simulation, may not be presented on the display device 204 and instead a replacement jet sound may be played on the speakers 404 of the accessory. An image of a bird, although demanded by the game, may not be presented on the display and instead a chirp sound effect may be played on the speaker 404 of the accessory and/or a short mild buzz output by the haptic generator 406.
  • Considering FIGS. 11 and 12, at block 1100 in FIG. 11 a signal representing a demanded haptic from the simulation (e.g., computer game) being executed is received. The signal is correlated at block 1102 to a speaker and/or light source actuation, which is presented at block 1104 by the accessory in a human-sensible way. If desired, in one option at block 1106 a signal may be sent, e.g., from the accessory to the simulation to not play or otherwise deactivate the demanded haptic, which in effect is replaced by the visible or acoustic output of the accessory.
  • FIG. 12 presents example correlations of haptic to light source activation/sound generation. A demanded long buzz from, e.g., the game controller during game play may be replaced by a steady illumination of a light source 402 on the accessory and/or by a steady buzzing sound effect played on the speaker 404 of the accessory. A demanded pop from the game controller may be replaced by a quick on-off illumination of the light source 402 of the accessory.
  • FIG. 13 illustrates an example user interface (UI) 1300 that may be presented on a display 1302 such as any of the displays herein to enable selection of present features. A selector 1304 may be selectable to enable the logic of FIG. 5 to augment game presentation with signals from the accessory. A replace selector 1306 may be selectable to select from a menu 1308 to replace demanded sound, or images, or haptics from the display device 204/game controller with sensory output from the accessory consistent with FIGS. 7-12. A selector 1310 may be selected to enable the logic of FIG. 14 to enhance the experience of a second person such as a spectator of a computer game or second player of the computer game.
  • Accordingly, referring now to FIG. 14, at block 1400 an indication is received to use the accessory to enhance the gaming experience of a second person such as a second gamer or spectator of a game being played by a first player. This indication may be detachment of the accessory 200/300 from a control device or attachment of the accessory 200/300 to a control device, as indicated by, e.g., images from the camera 420 on the accessory or signals from the motion sensor 418, or as indicated by selection of the selector 1310 in FIG. 13.
  • Proceeding to block 1402, the accessory receives game signals from the computer simulation (e.g., game) being executed and controlled using the controller 202 shown in FIG. 2. Moving to block 1404, the light source(s) 402/speaker(s) 404/haptic generator(s) 406 of the accessory are activated according to the control signals from the game. In this way, when, for example, a shot is fired in the game, a spectator holding the accessory may experience a pop sound from the accessory speaker 404 and/or a flash of the light source 402 and/or a jolt from the haptics generator 406, indicating the event occurring in the game.
  • In spectator mode, some of the visuals, sounds, and/or haptics presented by the accessory can be tied directly to the player's button presses. That way, the spectator holding the accessory can feel/see button presses that are otherwise occluded by the player's fingers and hands. This could help the spectator, for instance, learn the rhythm of the button presses.
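  • A minimal sketch of this button-mirroring idea follows; the button-to-LED mapping and the flash_led/play_click callbacks are assumptions.

```python
# Sketch of spectator mode tied to the player's button presses: each
# press on the controller is mirrored as a flash and click on the
# accessory so the spectator can follow the rhythm of play.
BUTTON_TO_LED = {"cross": 0, "circle": 1, "square": 2, "triangle": 3}

def on_player_button(button: str, flash_led, play_click) -> None:
    led = BUTTON_TO_LED.get(button)
    if led is not None:
        flash_led(led)          # visible even when fingers occlude the pad
        play_click()            # audible cue with the same timing
```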
  • Note also that it may make sense to replace haptics with haptics. For example, some people may feel that high frequency haptics in a controller are uncomfortable. In that case, the controller haptics may be disabled and replaced with a different type of more palatable haptic presented on the accessory.
  • Block 1406 indicates that, in addition to enhancing second-person experiences, one or more of the controls 414 shown in FIG. 4 may be connected to the accessory port 412 and manipulated by the second person to input game control signals to the accessory processor 408. These signals may be sent from the accessory to the device executing the game at block 1408 to allow play-through of the game in accordance with the signals received at block 1406, i.e., to enable the second person holding the accessory to input game commands to control the game.
  • If desired, correctness of the input received at block 1408 may be determined at block 1410. For example, metadata from the game or other source may indicate which game commands are better than others, e.g., based on a history of game play by experts, and from this data the correctness of the second person's input may be determined by measuring how much the input diverged from the “best” expert input, thereby indicating the quality of simulation control command input.
  • Block 1412 indicates that the light source(s) 402/speaker(s) 404/haptics generator(s) 406 of the accessory may be activated consistent with the determination at block 1410 to provide sensory information as to the correctness of the second person's play, to train the second person, for instance. For example, a green light source 402 may be illuminated and a gentle chime played on the speaker 404 for a good input, while for a less good input a red light source 402 may be illuminated, an unpleasant buzz played on the speaker 404, and a jolt emitted by the haptic generator 406.
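  • Blocks 1410-1412 might be sketched as follows, reusing the hypothetical Accessory interface from the earlier sketch; the timing-divergence metric and the 0.2 threshold are illustrative assumptions, not the disclosed method.

```python
# Sketch of blocks 1410-1412: score the second person's input timing
# against an expert trace, then signal quality on the accessory.

def divergence(inputs: list[float], expert: list[float]) -> float:
    # Mean absolute timing difference (seconds) between paired presses.
    diffs = [abs(a - b) for a, b in zip(inputs, expert)]
    return sum(diffs) / len(diffs) if diffs else 0.0

def give_feedback(score: float, accessory) -> None:
    if score < 0.2:                    # close to the "best" expert input
        accessory.set_light("green")
        accessory.play_sound("chime")
    else:                              # diverged from expert play
        accessory.set_light("red")
        accessory.play_sound("buzz")
        accessory.pulse_haptic(100)    # jolt from the haptic generator
```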
  • Note that two people might “own” a single accessory described herein, or at least the cradle to which the accessory attaches. The first person can detach the accessory from his controller and the second person can attach it to her controller. The second person can then use her controller for input and output rather than being limited to the output devices connected to the accessory and the input devices connected to the accessory's accessory port. Uses of the accessory can be mixed and matched, e.g., disengage the device from the first person's controller and engage it with the second person's phone, or disengage it from the first person's phone and engage the accessory with the second person's controller. The accessory can also be used for a physical transfer/transmission of a gameplay session.
  • It will be appreciated that whilst present principles have been described with reference to some example embodiments, these are not intended to be limiting, and that various alternative arrangements may be used to implement the subject matter claimed herein.

Claims (19)

1. An assembly, comprising:
at least one computer control device;
at least one accessory detachably engageable with the computer control device and comprising:
at least one haptic generator;
at least one light source;
at least one speaker; and
at least one processor programmed with instructions to:
activate the generator, or the light source, or the speaker, or any combination thereof in response to signals from a computer simulation, the signals from the simulation comprising demanded audio and/or demanded images presented on at least one display device configured for presenting the simulation and separate from the accessory and separate or integral to the control device such that activation of the generator, or the light source, or the speaker, or any combination thereof augments the simulation displayed on the display device.
2. The assembly of claim 1, wherein the instructions are executable by the processor to:
actuate the generator or the speaker or both the generator and the speaker responsive to a demanded image indicated by the signals from the simulation.
3. An assembly, comprising:
at least one computer control device;
at least one accessory detachably engageable with the computer control device and comprising:
at least one haptic generator;
at least one light source;
at least one speaker; and
at least one processor programmed with instructions to:
actuate the generator, or the light source, or the speaker, or any combination thereof in response to signals from a simulation being controlled by the computer control device;
actuate the generator or the speaker or both the generator and the speaker responsive to a demanded image indicated by the signals from the simulation;
the assembly comprising at least one display device presenting the simulation, and the demanded image is not presented by the display device.
4. The assembly of claim 1, wherein the instructions are executable by the processor to:
actuate the generator or the light source or both the generator and the light source responsive to a demanded audio indicated by the signals from the simulation.
5. An assembly, comprising:
at least one computer control device;
at least one accessory detachably engageable with the computer control device and comprising:
at least one haptic generator;
at least one light source;
at least one speaker; and
at least one processor programmed with instructions to:
actuate the generator, or the light source, or the speaker, or any combination thereof in response to signals from a simulation being controlled by the computer control device;
actuate the generator or the light source or both the generator and the light source responsive to a demanded audio indicated by the signals from the simulation;
the assembly comprising at least one display device having at least one speaker and presenting the simulation, and the demanded audio is not presented by the display device.
6. An assembly, comprising:
at least one computer control device;
at least one accessory detachably engageable with the computer control device and comprising:
at least one haptic generator;
at least one light source;
at least one speaker; and
at least one processor programmed with instructions to:
actuate the generator, or the light source, or the speaker, or any combination thereof in response to signals from a simulation being controlled by the computer control device; and
actuate the speaker or the light source or both the speaker and the light source responsive to a demanded haptic indicated by the signals from the simulation.
7. The assembly of claim 6, comprising at least one display device having at least one display device haptic generator and presenting the simulation, and the demanded haptic is not presented by the display device.
8. An assembly, comprising:
at least one computer control device;
at least one accessory detachably engageable with the computer control device and comprising:
at least one haptic generator;
at least one light source;
at least one speaker; and
at least one processor programmed with instructions to:
actuate the generator, or the light source, or the speaker, or any combination thereof in response to signals from a simulation being controlled by the computer control device; and
responsive to the accessory being disengaged from the control device, actuate at least one of the light sources, speaker, or generator or any combination thereof to indicate a characteristic of simulation control command input.
9. The assembly of claim 8, wherein the accessory comprises at least one element manipulable to generate the simulation control command input.
10. The assembly of claim 1, wherein the computer control device comprises at least one wireless telephone.
11. The assembly of claim 1, wherein the computer control device comprises at least one computer game controller.
12. (canceled)
13. A device comprising:
at least one computer storage that is not a transitory signal and that comprises instructions executable by at least one processor to execute at least one of (a) or (b) or (c) or (d) or (e) or any combination thereof:
(a) activate, on an accessory detachably engaged with a control device for a computer simulation, at least one haptics generator, or at least one light source, or at least one speaker, or any combination thereof in response to signals from the simulation, the signals from the simulation comprising demanded audio and/or demanded images presented on a display device such that activation of the generator, or the light source, or the speaker, or any combination thereof augments the simulation displayed on the display device;
(b) actuate the generator or the speaker or both the generator and the speaker responsive to a demanded image indicated by the signals from the simulation;
(c) actuate the generator or the light source or both the generator and the light source responsive to a demanded audio indicated by the signals from the simulation;
(d) actuate the speaker or the light source or both the speaker and the light source responsive to a demanded haptic indicated by the signals from the simulation;
(e) responsive to the accessory being disengaged from the control device, actuate at least one of the light sources, speaker, or generator or any combination thereof to indicate quality of simulation control command input.
14. The device of claim 13, wherein the instructions are executable by the at least one processor to:
activate, on an accessory detachably engaged with a control device for a computer simulation, at least one haptics generator, or at least one light source, or at least one speaker, or any combination thereof in response to signals from the simulation, the signals from the simulation comprising demanded audio and/or demanded images presented on a display device such that activation of the generator, or the light source, or the speaker, or any combination thereof augments the simulation displayed on the display device.
15. The device of claim 13, wherein the instructions are executable by the at least one processor to:
actuate the generator or the speaker or both the generator and the speaker responsive to a demanded image indicated by the signals from the simulation.
16. The device of claim 13, wherein the instructions are executable by the at least one processor to:
actuate the generator or the light source or both the generator and the light source responsive to a demanded audio indicated by the signals from the simulation.
17. The device of claim 13, wherein the instructions are executable by the at least one processor to:
actuate the speaker or the light source or both the speaker and the light source responsive to a demanded haptic indicated by the signals from the simulation.
18. The device of claim 13, wherein the instructions are executable by the at least one processor to:
responsive to the accessory being disengaged from the control device, actuate at least one of the light sources, speaker, or generator or any combination thereof to indicate quality of simulation control command input.
19.-20. (canceled)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US17/201,922 US11430309B1 (en) 2021-03-15 2021-03-15 Detachable accessory for enhanced effects, accessibility, spectating, and play-along
PCT/US2022/020094 WO2022197573A1 (en) 2021-03-15 2022-03-13 Detachable accessory for enhanced effects, accessibility, spectating, and play-along

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US17/201,922 US11430309B1 (en) 2021-03-15 2021-03-15 Detachable accessory for enhanced effects, accessibility, spectating, and play-along

Publications (2)

Publication Number Publication Date
US11430309B1 (en) 2022-08-30
US20220292938A1 (en) 2022-09-15

Family

ID=83007867

Family Applications (1)

Application Number Title Priority Date Filing Date
US17/201,922 Active 2041-03-19 US11430309B1 (en) 2021-03-15 2021-03-15 Detachable accessory for enhanced effects, accessibility, spectating, and play-along

Country Status (2)

Country Link
US (1) US11430309B1 (en)
WO (1) WO2022197573A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8267786B2 (en) * 2005-08-24 2012-09-18 Nintendo Co., Ltd. Game controller and game system
US10019034B2 (en) * 2012-03-14 2018-07-10 Popsockets Llc Docking connector platform for mobile electronic devices
US20180178118A1 (en) * 2016-12-27 2018-06-28 Nintendo Co., Ltd. Storage medium having stored therein information processing program, information processing apparatus, information processing system, and information processing method
US20190155387A1 (en) * 2017-11-22 2019-05-23 Immersion Corporation Haptic Accessory Apparatus

Also Published As

Publication number Publication date
WO2022197573A1 (en) 2022-09-22
US11430309B1 (en) 2022-08-30

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCF Information on status: patent grant

Free format text: PATENTED CASE