WO2021032693A1 - Determining applicable light control commands for each of a plurality of lighting devices


Info

Publication number
WO2021032693A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
type
commands
output
video game
Application number
PCT/EP2020/073019
Other languages
English (en)
Inventor
Dzmitry Viktorovich Aliakseyeu
Jonathan David Mason
Original Assignee
Signify Holding B.V.
Application filed by Signify Holding B.V.
Publication of WO2021032693A1


Classifications

    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00 Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10 Controlling the light source
    • H05B47/155 Coordinated control of two or more light sources
    • H05B47/165 Controlling the light source following a pre-assigned programmed sequence; Logic control [LC]
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/24 Constructional details thereof, e.g. game controllers with detachable joystick handles
    • A63F13/25 Output arrangements for video game devices
    • A63F13/28 Output arrangements for video game devices responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/53 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/90 Constructional details or arrangements of video game devices not provided for in groups A63F13/20 or A63F13/25, e.g. housing, wiring, connections or cabinets
    • A63F13/98 Accessories, i.e. detachable arrangements optional for the use of the video game device, e.g. grip supports of game controllers

Definitions

  • the invention relates to a controller device for controlling a first system and a second system to render light effects according to light commands generated based on output of a video game.
  • the invention further relates to a method of controlling a first system and a second system to render light effects according to light commands generated based on output of a video game.
  • the invention also relates to a computer program product enabling a computer system to perform such a method.
  • Philips’ Hue Entertainment and Hue Sync are gaining popularity. Philips Hue Sync enables the rendering of light effects based on the content that is played on a computer, e.g. video games.
  • a dynamic lighting system can dramatically influence the experience and impression of audio-visual material, especially when the colors sent to the lights match what would be seen in the composed environment around the screen.
  • WO2019038307A1 discloses an example of a keyboard that comprises one or more lighting units.
  • WO2019038307A1 describes a system for generating moving light effects that comprises a control module designed to operate the one or more lighting units of the keyboard and lighting units of a display to generate a color pattern in one-sided or mutual dependence of one another.
  • US2017225069A1 discloses an immersive storytelling environment in which audiovisual content is played and in which multiple devices (e.g. illumination, blinds, HVAC) are present and controlled synchronously.
  • US2014104247A1 discloses parsing an incoming video file to separate a track including at least one ambient light effect associated with the video and generating commands to lighting devices to generate the ambient light effect included in the track.
  • Hue lights and peripherals will render different colors and dynamics as they usually have a different purpose - in most cases dedicated light sources (e.g. Hue lights) would render ambience and special effects, while peripherals (e.g. Razer) would render information with light such as highlighting control keys, displaying health as a color, or render prepared special effects that are triggered by the game events.
  • a controller device for controlling a first system and a second system to render light effects according to light commands generated based on output of a video game comprises at least one communication interface and at least one processor configured to identify (i.e. determine at least presence and/or availability of) a first system, said first system being configured to generate a first type of light commands based on first output of a video game and comprising a first lighting device, and identify (i.e. determine at least presence and/or availability of) a second system, said second system being configured to generate a second type of light commands based on second output of said video game and/or said first output of said video game and comprising a second lighting device.
  • the controller device is further configured to determine for each of said first lighting device and said second lighting device whether light control commands of said first type or light control commands of said second type should be used to control said respective lighting device and use said at least one communication interface to control said first system and/or said second system to render said light effects according to said determined type of light commands. This allows the controller device to identify moments when the use of multiple separate systems might feel unnatural. The controller device may then ensure a seamless experience by temporarily “taking over” one of the systems.
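The per-device decision described above can be sketched as follows. This is an illustrative sketch only; the class shape, the type labels "first"/"second", and the takeover parameter are assumptions, not part of the patent.

```python
from dataclasses import dataclass

@dataclass
class LightingDevice:
    # Hypothetical descriptor: each device belongs to a system that
    # natively generates one type of light commands.
    name: str
    native_type: str  # "first" (e.g. game-data driven) or "second" (e.g. video driven)

def choose_command_types(devices, takeover_type=None):
    """Return a mapping from device name to the command type used to control it.

    When one system temporarily "takes over" (takeover_type is set),
    every device is driven by that type; otherwise each device keeps
    its own system's native command type.
    """
    return {d.name: (takeover_type or d.native_type) for d in devices}
```

For example, during a cut scene the controller could call `choose_command_types(devices, takeover_type="second")` so the ambient system temporarily drives the keyboard lighting as well.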
  • Said first output and/or said second output of said video game may comprise game data, triggers, audio and/or video output by said video game.
  • Said at least one processor may be configured to control said first system and said second system to render light effects according to a same type of light commands during at least one period, said same type of light commands being one of said first type and said second type.
  • Said controller device may be part of said first system or said second system, for example.
  • said controller device may be, for example, independent of the first system and the second system, e.g. a personal computer.
  • Said first type of light commands may be generated by a first application running on said controller device and said second type of light commands may be generated by a second application running on said controller device, said first system comprising said first application and said second system comprising said second application.
  • Said first type of light commands may be generated based on game data and/or triggers output by said video game and said second type of light commands may be generated based on audio and/or video output by said video game.
  • Light effects for Hue lights are typically determined based on audio and/or video output.
  • Light effects for keyboards may be determined based on game data and/or triggers output by the video game in addition to or instead of being based on interaction with the keyboard itself.
  • Said at least one processor may be configured to detect a characteristic of said video game and determine based on said characteristic whether said light control commands of said first type or light control commands of said second type should be used to control said respective lighting device. Whether said light control commands of said first type or light control commands of said second type should be used to control said respective lighting device may be determined based on one or more characteristics. Examples of characteristics are described in the following paragraphs.
  • Said characteristic may represent a current or upcoming event in said video game or a non-interactive scene in said video game, for example. For instance, functional lighting (e.g. indicating a warning) may be provided in case of a (certain) current or upcoming event, but functional lighting is less useful if a non-interactive scene is being rendered.
  • Said characteristic may represent a detected mismatch between light effects generated by said first and second systems.
  • Said at least one processor may be configured to detect said mismatch upon detecting a high color contrast between said light effects generated by said first and second systems, for example. This mismatch detection is typically not dependent on how the first and second types of light commands are generated and may be beneficial, for example, if the controller device is able to obtain the light control commands, but not the output of the video game.
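One way to flag such a high-contrast mismatch is sketched below, under the assumption that each system's current light effect is available as a list of RGB tuples; the function names and the threshold value are illustrative, not taken from the patent.

```python
def _avg_color(colors):
    # Average RGB color over the light effects a system currently renders.
    n = len(colors)
    return tuple(sum(c[i] for c in colors) / n for i in range(3))

def detect_mismatch(first_colors, second_colors, threshold=200.0):
    """Flag a mismatch when the average colors rendered by the two systems
    are far apart in RGB space, i.e. the contrast between them is high."""
    a, b = _avg_color(first_colors), _avg_color(second_colors)
    distance = sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return distance > threshold
```

A perceptual color distance (e.g. in CIELAB space) would be a natural refinement; plain RGB distance keeps the sketch self-contained.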
  • Said characteristic may represent a same or similar color in all analysis areas used by said second system to generate light commands from video which is output by said video game. If the same or similar color is detected in all analysis areas, this color will be very prominent in the light effects rendered on the second system and will likely cause a mismatch between light effects generated by the first and second systems. It may then be beneficial to render the light effects on the first system also based on this color. This may be used, for example, if the information provided on the keyboard is not crucial for the player.
  • If the Razer Chroma API is built into the game and communicates the importance of the current light state of the keyboard, it may be beneficial to give this data a higher priority.
  • Said characteristic may represent a level of interaction between a user and a keyboard and said first type of light commands may include light commands generated based on said interaction with said keyboard.
  • the keyboard may take control over Hue lights when there is a lot of interaction with the keyboard (e.g. a user pressing the spacebar 20 times in a row), despite the video output remaining more or less constant.
  • the interaction with the keyboard may comprise one or more of intensity of strike, frequency, and periodicity, for example.
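The interaction level could, for instance, be derived from recent key-press timestamps. The window length and the presses-per-second metric below are illustrative assumptions:

```python
def interaction_level(key_timestamps, now, window=2.0):
    """Key presses per second over the most recent `window` seconds.

    A high value (e.g. the spacebar being hit many times in a row) may
    justify letting the keyboard system temporarily drive the Hue lights.
    """
    recent = [t for t in key_timestamps if now - t <= window]
    return len(recent) / window
```

Strike intensity and periodicity, also mentioned above, could be folded into the same metric as weights when the keyboard hardware reports them.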
  • If the video output shows explosions, for example, the light effects determined based on video analysis of these explosions may comprise a lot of white/yellowish light effects for an extended period of time.
  • A keyboard-based lighting system might at the same time be rendering effects on the keyboard, e.g. because the user hits the spacebar 5 times in a row (this effect may differ from the one rendered after the user hits the spacebar 10 times in a row). In that case, it would be more exciting if the Hue lights’ output is (partially) determined by the keyboard light effect.
  • Said at least one processor may be configured to determine said characteristic from said first output and/or said second output of said video game. For example, if the light effects are determined from the video, e.g. computer-generated images, that is output by the game, the same video output may be used to determine the characteristic, e.g. by extracting colors from the video output.
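Extracting such a characteristic from the video output can be as simple as averaging frame colors; the pixel format below is an assumption for illustration:

```python
def dominant_color(frame_pixels):
    """Crude characteristic extraction: the average color of a frame
    output by the game, with pixels given as (r, g, b) tuples."""
    n = len(frame_pixels)
    return tuple(round(sum(p[i] for p in frame_pixels) / n) for i in range(3))
```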
  • a method of controlling a first system and a second system to render light effects according to light commands generated based on output of a video game comprises identifying a first system, said first system being configured to generate a first type of light commands based on first output of a video game and comprising a first lighting device, identifying a second system, said second system being configured to generate a second type of light commands based on second output of said video game and/or said first output of said video game and comprising a second lighting device, determining for each of said first lighting device and said second lighting device whether light control commands of said first type or light control commands of said second type should be used to control said respective lighting device, and controlling said first system and/or said second system to render said light effects according to said determined type of light commands.
  • Said method may be performed by software running on a programmable device. This software may be provided as a computer program product.
  • a computer program for carrying out the methods described herein, as well as a non-transitory computer readable storage-medium storing the computer program are provided.
  • a computer program may, for example, be downloaded by or uploaded to an existing device or be stored upon manufacturing of these systems.
  • a non-transitory computer-readable storage medium stores at least one software code portion, the software code portion, when executed or processed by a computer, being configured to perform executable operations for controlling a first system and a second system to render light effects according to light commands generated based on output of a video game.
  • the executable operations comprise identifying a first system, said first system being configured to generate a first type of light commands based on first output of a video game and comprising a first lighting device, identifying a second system, said second system being configured to generate a second type of light commands based on second output of said video game and/or said first output of said video game and comprising a second lighting device, determining for each of said first lighting device and said second lighting device whether light control commands of said first type or light control commands of said second type should be used to control said respective lighting device, and controlling said first system and/or said second system to render said light effects according to said determined type of light commands.
  • aspects of the present invention may be embodied as a device, a method or a computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit", "module” or “system.” Functions described in this disclosure may be implemented as an algorithm executed by a processor/microprocessor of a computer. Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied, e.g., stored, thereon.
  • the computer readable medium may be a computer readable signal medium or a computer readable storage medium.
  • a computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
  • Examples of a computer readable storage medium include, but are not limited to, the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
  • a computer readable storage medium may be any tangible medium that can contain, or store, a program for use by or in connection with an instruction execution system, apparatus, or device.
  • a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof.
  • a computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber, cable, RF, etc., or any suitable combination of the foregoing.
  • Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java(TM), Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
  • These computer program instructions may be provided to a processor, in particular a microprocessor or a central processing unit (CPU), of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer, other programmable data processing apparatus, or other devices create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
  • the computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of devices, methods and computer program products according to various embodiments of the present invention.
  • each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
  • Fig. 1 is a block diagram of a first embodiment of the controller device
  • Fig. 2 is a block diagram of a second embodiment of the controller device
  • Fig. 3 is a flow diagram of a first embodiment of the method
  • Fig. 4 is a flow diagram of a second embodiment of the method
  • Fig. 5 is a block diagram of an exemplary data processing system for performing the method of the invention.
  • Fig. 1 shows a first embodiment of the controller device for controlling a first system and a second system to render light effects according to light commands generated based on output of a video game.
  • the controller device is a personal computer 1.
  • the first system comprises a (e.g. Razer) keyboard 11 with embedded lighting, e.g. one or more LEDs
  • the second system comprises a lighting system 10.
  • the lighting system 10 comprises a bridge 13 and a plurality of lighting devices 15-17.
  • the bridge 13 may be a Philips Hue bridge and the lighting devices 15-17 may be Philips Hue lights, for example.
  • the first system comprises a different peripheral with embedded lighting, e.g. a PC case, a mousepad with connected lights or a gaming chair.
  • PC cases are now often transparent and one or more of the hardware components (e.g. the GPU, the motherboard, or the cooling system, especially water cooling) may include controllable lights (e.g. controllable with the Razer Chroma SDK).
  • Alternatively, another device with user control as a primary function and LED lighting as a secondary function may be used.
  • the personal computer 1 and the bridge 13 are connected to a wireless LAN access point 12, e.g. via Ethernet and/or Wi-Fi.
  • the bridge 13 communicates with the lighting devices 15-17, e.g. using Zigbee technology.
  • the personal computer 1 is also connected to a display 14.
  • a video game is executed on the personal computer 1 and provides video output to the display 14.
  • the personal computer 1 comprises a receiver 3, a transmitter 4, a processor 5, and a memory 7.
  • the processor 5 is configured to identify the first system and the second system.
  • the first system is configured to generate a first type of light commands based on first output of a video game.
  • the second system is configured to generate a second type of light commands based on second output of the video game and/or the first output of the video game.
  • the first output and/or the second output of the video game may comprise game data, triggers, audio and/or video output by the video game, for example.
  • the first type of light commands may be generated based on game data and/or triggers output by the video game and the second type of light commands may be generated based on audio and/or video output by the video game, for example.
  • Some or all of the first type of light commands may be generated by a first application running on the personal computer 1 and some or all of the second type of light commands may be generated by a second application running on the personal computer 1.
  • the first application would then be part of the first system and the second application would then be part of the second system.
  • the systems may be identified locally.
  • the applications may run as services and may be identified in the corresponding manner.
  • the systems may be identified using wired or wireless communication protocols.
  • the keyboard 11 may be connected to the personal computer 1 via a USB port and the USB protocol may be used to identify the keyboard 11, for example.
  • the bridge 13 of the lighting system 10 may be identified by transmitting a UPnP query to the wireless access point 12, for example.
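The two identification paths (USB enumeration for the keyboard, a UPnP query for the bridge) might be tied together roughly as follows. The descriptor dictionaries are hypothetical stand-ins for what the actual USB and UPnP discovery responses would report:

```python
def identify_systems(discovered):
    """Split discovered device descriptors into the two systems:
    lighting-capable USB peripherals (first system) and
    UPnP-discovered bridges (second system)."""
    first = [d for d in discovered if d.get("bus") == "usb" and d.get("has_lighting")]
    second = [d for d in discovered if d.get("upnp_type") == "bridge"]
    return first, second
```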
  • the processor 5 is further configured to determine for each of the lighting devices whether light control commands of the first type or light control commands of the second type should be used to control the respective lighting device and use transmitter 4 to control the first system and/or the second system to render the light effects according to the determined type of light commands.
  • the keyboard 11 and the lighting system 10 may have an open API allowing the personal computer 1 to temporarily get control of them.
  • Razer software can be used to continuously control both peripherals and Hue lights, or Hue Sync can be used to continuously control both Hue lights and peripherals like Razer devices.
  • Alternatively, a mixed mode may be used in which control can temporarily be given to one system.
  • Such a mixed mode is implemented in the embodiment of Fig. 1. Since both systems have and use different information about the game, such a mixed mode, i.e. separating control and only temporarily giving full control to one of the systems, is preferable to one system continuously controlling all lights.
  • the personal computer 1 identifies moments when the use of two separate systems might feel unnatural. The personal computer 1 can then ensure a seamless experience by temporarily “taking over” one of the systems.
  • the personal computer 1 may receive light control commands from one of the systems and forward them to the other system or may instruct one system to send light control commands to the other system, for example.
  • the personal computer 1 could take over control temporarily and have the second system, which comprises lighting system 10, control the keyboard 11 temporarily to render colors appropriate for that moment and then release the control allowing the keyboard 11 to resume its own control.
  • the personal computer 1 could have the first system, which comprises the keyboard 11, control the lighting system 10 temporarily, e.g. if the personal computer 1 detects that color or dynamics mismatch could influence the ambience or a message that keyboard is conveying (e.g. low health).
  • the above-mentioned moments could be estimated based on content analysis (on-screen and audio) and/or based on predefined data output by the game, for example.
  • the following moments may be recognized:
  • A user is not actively controlling the game, e.g. when a cut scene, a short in-game animation or in-game dialogue is rendered. At these moments, keyboard interaction is irrelevant, so the light effects rendered by the main light system could temporarily also be rendered by the peripheral.
  • Critical information is displayed on the peripheral, such as low health or an approaching “boss” fight. This may trigger the light effects rendered by the peripheral to also be rendered by the main lighting system, i.e. the main lighting system is temporarily used to display the same critical message instead of rendering ambience lighting.
  • Peripheral lighting and main lighting may create a contrast in the colors of the light effects that could impact the experience at moments when the main lights render a more unified color and dynamic, e.g. when using a shield, entering a dark zone, or being engulfed by fire.
  • The peripherals may temporarily be assigned similar colors, so they too contribute to the overall effect/experience.
  • Peripheral lighting and main lighting will also create a contrast in the colors of the light effects that could impact the experience if all zones for ambiance detection have a similar color, for example.
  • In that case, the same color may be temporarily assigned to the peripherals.
  • Dynamic “moving” effects such as color wave and swirl are rendered, where including the peripherals would create a smoother and more consistent effect, while excluding them might create an undesired experience in which the effects are not even recognized due to differently colored peripherals. Since these effects are usually short, this will have minimal impact on the normal use of peripheral LED lighting, e.g. control keys will only briefly not be highlighted.
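The recognized moments listed above can be mapped to a temporary takeover decision along these lines; the flag names and the priority ordering are assumptions made for the sketch:

```python
def decide_takeover(cut_scene=False, critical_info=False,
                    unified_scene=False, moving_effect=False):
    """Return which system should temporarily drive all lights:
    "first" lets the peripheral mirror its critical message on the main
    lights, "second" extends the ambient effect onto the peripheral, and
    None leaves each system in control of its own lighting devices."""
    if critical_info:
        return "first"
    if cut_scene or unified_scene or moving_effect:
        return "second"
    return None
```

Giving critical information the highest priority matches the idea above that a low-health warning should not be drowned out by ambience.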
  • the personal computer 1 controls both the first system and the second system to render the light effects according to the determined type of light commands.
  • the personal computer 1 is only able to control one of the systems. In this situation, it may still be beneficial to control this system, as the personal computer 1 may then be able to ensure a seamless experience part of the time.
  • the controller device is an HDMI module, for example.
  • An HDMI module may capture what a personal computer sends to the display 14 by connecting in between and then sending the light commands either to the bridge 13 or directly to lighting devices 15-17.
  • The lights of the peripheral (e.g. game controller or keyboard) may not be controllable by the HDMI module, but an app can inform the HDMI module of the peripheral light state, so that the HDMI module can control the other system, e.g. the Hue lights, when necessary.
  • the personal computer 1 comprises one processor 5.
  • the personal computer 1 comprises multiple processors.
  • the processor 5 of the personal computer 1 may be a general-purpose processor, e.g. from Intel or AMD.
  • the processor 5 of the personal computer 1 may run a Windows, Mac OS X or Linux operating system for example.
  • the processor 5 may comprise one or more cores.
  • the memory 7 may comprise one or more memory units.
  • the memory 7 may comprise solid state memory, for example.
  • the receiver 3 and the transmitter 4 may use one or more wireless communication technologies such as Wi-Fi (IEEE 802.11) to communicate with the wireless LAN access point 12, for example.
  • multiple receivers and/or multiple transmitters are used instead of a single receiver and a single transmitter.
  • a separate receiver and a separate transmitter are used.
  • the receiver 3 and the transmitter 4 are combined into a transceiver.
  • the personal computer 1 may comprise other components typical for a personal computer such as a power supply and loudspeakers.
  • the invention may be implemented using a computer program running on one or more processors.
  • a bridge is used to control lighting devices 15-17.
  • lighting devices 15-17 are controlled without using a bridge.
  • the controller device is a personal computer.
  • the controller device is a different device, e.g. a (portable or non-portable) game console, a bridge, an HDMI module or a peripheral, or comprises a plurality of devices.
  • the controller device may even be a cloud server, e.g. in case of cloud gaming. In that case, the game runs in the cloud and the game video, audio and light commands may be streamed to the user’s device, which is essentially a terminal.
  • Fig. 2 shows a second embodiment of the controller device for controlling a first system and a second system to render light effects according to light commands generated based on output of a video game: a bridge 21, e.g. a Philips Hue bridge.
  • the first system comprises a (e.g. Razer) keyboard 11 with embedded lighting, e.g. one or more LEDs.
  • the second system comprises a lighting system 20.
  • the bridge 21 and the lighting devices 15-17 are part of the lighting system 20.
  • the controller device is part of the second system.
  • the controller device is part of the first system, e.g. the keyboard 11 may be adapted to be a controller device.
  • the bridge 21 comprises a receiver 23, a transmitter 24, a processor 25, and a memory 27.
  • the processor 25 is configured to identify the first system and the second system.
  • the first system is configured to generate a first type of light commands based on first output of a video game.
  • the second system is configured to generate a second type of light commands based on second output of the video game and/or the first output of the video game.
  • the processor 25 is further configured to determine for each of the lighting devices whether light control commands of the first type or light control commands of the second type should be used to control the respective lighting device and use transmitter 24 to control the first system and the second system to render the light effects according to the determined type of light commands. Since the bridge 21 is part of the second system, it may not be necessary to use the transmitter 24 to control the second system, but only to control the first system.
  • the bridge 21 comprises one processor 25.
  • the bridge 21 comprises multiple processors.
  • the processor 25 of the bridge 21 may be a general-purpose processor, e.g. ARM-based, or an application-specific processor.
  • the processor 25 of the bridge 21 may run a Unix-based operating system for example.
  • the memory 27 may comprise one or more memory units.
  • the memory 27 may comprise one or more hard disks and/or solid-state memory, for example.
  • the memory 27 may be used to store a table of connected lights, for example.
  • the receiver 23 and the transmitter 24 may use one or more wired or wireless communication technologies such as Ethernet to communicate with the wireless LAN access point 12, for example.
  • multiple receivers and/or multiple transmitters are used instead of a single receiver and a single transmitter.
  • a separate receiver and a separate transmitter are used.
  • the receiver 23 and the transmitter 24 are combined into a transceiver.
  • the bridge 21 may comprise other components typical for a network device such as a power connector.
  • the invention may be implemented using a computer program running on one or more processors.
  • a first embodiment of the method of controlling a first system and a second system to render light effects according to light commands generated based on output of a video game is shown in Fig. 3.
  • a step 101 comprises identifying a first system.
  • the first system is configured to generate a first type of light commands based on first output of a video game and comprises a first lighting device.
  • a step 103 comprises identifying a second system.
  • the second system is configured to generate a second type of light commands based on second output of the video game and/or the first output of the video game and comprises a second lighting device.
  • a step 105 comprises determining for each of the first lighting device and the second lighting device whether light control commands of the first type or light control commands of the second type should be used to control the respective lighting device.
  • a step 107 comprises controlling the first system and/or the second system to render the light effects according to the determined type of light commands.
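The four steps above can be sketched as follows. This is a minimal illustration only; the `System` class, the string labels for the two command types, and the per-device decision callback are assumptions made for the sketch, not part of the disclosed method.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the method of Fig. 3: identify two systems
# (steps 101/103), decide a command type per lighting device (step 105),
# and report which type each device should render (step 107).
@dataclass
class System:
    name: str
    command_type: str                       # "A" (first type) or "B" (second type)
    lighting_devices: list = field(default_factory=list)

def control_systems(first: System, second: System, choose_type) -> dict:
    """Step 105/107: map every lighting device to the command type ("A" or
    "B") that should be used to control it, as decided by choose_type."""
    decision = {}
    for device in first.lighting_devices + second.lighting_devices:
        decision[device] = choose_type(device)
    return decision

keyboard = System("keyboard", "A", ["kbd-leds"])
hue = System("hue", "B", ["lamp-15", "lamp-16", "lamp-17"])
# Trivial policy for illustration: every device keeps its own system's type.
native = {d: s.command_type for s in (keyboard, hue) for d in s.lighting_devices}
print(control_systems(keyboard, hue, lambda d: native[d]))
```

In a real controller device the `choose_type` callback would implement the characteristic-based determination of step 105.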
  • A second embodiment of the method of controlling a first system (X) and a second system (Y) to render light effects according to light commands generated based on output of a video game is shown in Fig. 4.
  • the first system (X) comprises a keyboard with embedded lighting and the second system (Y) comprises one or more Hue lights.
  • the light control commands of the first type (A) are optimized for the keyboard and the light control commands of the second type (B) are optimized for the Hue lights.
  • step 105 of Fig. 3 comprises sub-steps 111-145. Step 111 is performed after steps 101 and 103, which have been described in relation to Fig. 3.
  • Step 111 comprises detecting a characteristic of the video game. In steps 121-129, it is determined based on the characteristic whether the light control commands of the first type (A) or light control commands of the second type (B) should be used to control the respective lighting device.
  • In step 111, the following detections take place to determine the characteristics: (1) whether a non-interactive scene is active in the video game, (2) whether a certain event is current or upcoming in the video game, (3) the level of interaction between the user and the keyboard, (4) whether a same or similar color is present in all analysis areas used by the second system, and (5) whether there is a mismatch between the light effects generated by the first and second systems.
  • This level of interaction may comprise one or more of intensity of strike, frequency, and periodicity, for example.
  • Characteristics 1 and 2 are determined if the video game outputs data that allows these characteristics to be determined.
  • Characteristic 3 is determined if the first type of light commands includes light commands generated based on the interaction with the keyboard.
  • Characteristic 4 is determined if the second type of light commands includes light commands generated based on video which is output by the video game.
  • characteristic 5 is always determined in this embodiment.
  • a subset of these characteristics is never determined and/or one or more additional characteristics may be detected.
  • whether a characteristic is determined depends on different conditions.
  • One or more of the characteristics may be determined from the first output and/or the second output of the video game, i.e. from an output which is also used to generate light commands.
  • one or more of the characteristics may be determined from a different output of the video game.
  • In step 141, it is determined that light commands of the first type (A) should be used to control both the first lighting device (X) and the second lighting device (Y).
  • In step 143, it is determined that light commands of the first type (A) should be used to control the first lighting device (X) and light commands of the second type (B) should be used to control the second lighting device (Y).
  • In step 145, it is determined that light commands of the second type (B) should be used to control both the first lighting device (X) and the second lighting device (Y).
  • After step 141 or step 145, the first system (X) and the second system (Y) will be controlled to render light effects according to the same (first or second) type of light commands during the next period, i.e. control of one of the systems is “taken over”.
  • a control message transmitted to this system by a controller device may specify the duration of “takeover” and this system may limit the “takeover” to this duration or automatically regain control if control is not released by the controller device after a predefined timeout.
  • Step 107 comprises controlling the first system (X) and the second system (Y) to render the light effects (on the first lighting device and the second lighting device, respectively) according to the type of light commands determined in step 141, step 143 or 145, whichever was performed. If the method is performed by a PC, the first system and second system may be controlled by communicating with other software components running on the PC or with other devices, for example. Step 111 is repeated after step 107. If the same step 141, 143, or 145 was performed in the previous loop, then it may not be necessary to communicate with any software component or other device.
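The "takeover" mechanism with an explicit duration and automatic regain after a timeout, as described above, can be sketched as follows. The class and method names are hypothetical, and a plain numeric clock stands in for a real timer.

```python
# Hypothetical sketch of the takeover described above: a control message
# carries a takeover duration, and the receiving system automatically regains
# control of its own lights once that duration elapses, unless the controller
# device releases (or renews) the takeover earlier.
class TakenOverSystem:
    def __init__(self):
        self.taken_over_until = 0.0   # deadline on a monotonic clock; 0 = free

    def handle_takeover(self, duration_s: float, now: float) -> None:
        """Process a control message specifying the takeover duration."""
        self.taken_over_until = now + duration_s

    def release(self) -> None:
        """Controller device releases control before the timeout."""
        self.taken_over_until = 0.0

    def in_control_of_own_lights(self, now: float) -> bool:
        """Regain control automatically once the takeover duration elapses."""
        return now >= self.taken_over_until

hue = TakenOverSystem()
hue.handle_takeover(duration_s=5.0, now=100.0)
print(hue.in_control_of_own_lights(now=102.0))  # False: takeover still active
print(hue.in_control_of_own_lights(now=106.0))  # True: timeout, control regained
```

The same deadline check also covers the release case: releasing simply resets the deadline to the past.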
  • In step 121, characteristic 1 is assessed. If a non-interactive scene is detected in the video game, step 145 is performed next and steps 123-129 are skipped. If not, step 123 is performed next. If a non-interactive scene is detected in the video game, the second type of light commands (B), optimized for the Hue lights, will be rendered on both the Hue lights and the keyboard, as no interaction with the keyboard is expected and the user will likely not even be looking at the keyboard. Examples of non-interactive scenes are cut scenes, animations, dialogs, or any other scenes in which the user is not in control of the game.
  • In step 123, characteristic 2 is assessed. If a (certain) current or upcoming event is detected in the video game, step 141 is performed next and steps 125-129 are skipped. If not, step 125 is performed next. If a (certain) current or upcoming event is detected in the video game, the first type of light commands (A) will be rendered on both the Hue lights and the keyboard, as the first type of light commands (A) will be indicative of the event, and indicating the same critical message with all lights will help the user play the video game. Examples of events are a low-health indication, (imminent) danger such as a main boss approaching, or a key moment in the game that has been associated with a dedicated special effect.
  • In step 125, characteristic 3 is assessed. If the level of interaction between the user and the keyboard exceeds a certain threshold, step 141 is performed next and steps 127 and 129 are skipped. If not, step 127 is performed next. If the level of interaction between the user and the keyboard exceeds a certain threshold, the first type of light commands (A), optimized for the keyboard, will be rendered on both the Hue lights and the keyboard, as the light effects determined based on the interaction with the keyboard may be more exciting than the light effects determined based on the audio and/or video output by the video game.
  • Whether step 141 is performed next may also depend on the video output of the game. For example, step 141 may be performed next only if the level of interaction between the user and the keyboard exceeds a certain threshold and the video output remains more or less constant, e.g. when the user uses a shield, enters a dark zone or is engulfed by fire, or if the analysis of the video output results in the same light effects being rendered for an extended period of time, e.g. due to an explosion happening on screen.
  • In step 127, characteristic 4 is assessed. If a same or similar color is detected in all analysis areas used by the second system, step 145 is performed next and step 129 is skipped. If not, step 129 is performed next. If a same or similar color is detected in all analysis areas used by the second system, the second type of light commands (B), optimized for the Hue lights, will be rendered on both the Hue lights and the keyboard, so that the keyboard contributes to the overall effect/experience.
  • In step 129, characteristic 5 is assessed. The detection of a mismatch between the light effects generated by the first and second systems, e.g. by detecting a high color contrast between them, is typically not dependent on how the first and second types of light commands are generated.
  • If a mismatch is detected, step 145 is performed next. In that case, the second type of light commands (B), optimized for the Hue lights, will be rendered on both the Hue lights and the keyboard. If no mismatch is detected, then step 143 is performed. In that case, the light commands of the first type (A), optimized for the keyboard, will be rendered on the keyboard and the light commands of the second type (B), optimized for the Hue lights, will be rendered on the Hue lights.
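The contrast-based mismatch detection of step 129 might be sketched as follows. The Euclidean RGB distance metric and the threshold value are illustrative assumptions, since the text does not specify how the color contrast between the two systems' effects is computed.

```python
# Hypothetical sketch of characteristic 5 (step 129): flag a mismatch when the
# color contrast between the effect currently rendered by the first system and
# the effect currently rendered by the second system is high.
def color_distance(c1, c2):
    """Euclidean distance between two RGB colors (0-255 per channel)."""
    return sum((a - b) ** 2 for a, b in zip(c1, c2)) ** 0.5

def mismatch_detected(first_effect_rgb, second_effect_rgb, threshold=200.0):
    """True when the two systems' current light effects contrast strongly.
    The threshold is an arbitrary illustrative value."""
    return color_distance(first_effect_rgb, second_effect_rgb) > threshold

print(mismatch_detected((255, 0, 0), (0, 0, 255)))       # red vs blue: mismatch
print(mismatch_detected((250, 120, 0), (255, 140, 10)))  # similar oranges: no mismatch
```

A perceptual color-difference metric (e.g. a ΔE formula in a Lab color space) would be a more faithful measure of perceived contrast than raw RGB distance.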
  • characteristics 1-5 are all determined before it is determined whether to perform step 141, 143, or 145. In an alternative embodiment, the latter determination is made after each characteristic has been determined. As a result, it will typically not be necessary to determine all characteristics in each loop in this alternative embodiment.
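Putting steps 121-129 together, the decision cascade of Fig. 4 can be sketched as follows. The `GameState` fields and the interaction threshold are illustrative assumptions; the text leaves the concrete detection of each characteristic open.

```python
from dataclasses import dataclass

# Hypothetical sketch of the decision cascade of Fig. 4 (steps 121-129):
# each characteristic is checked in turn, and the first match decides whether
# to render type A everywhere (step 141), type B everywhere (step 145), or
# each system's own type (step 143).
@dataclass
class GameState:
    non_interactive_scene: bool   # characteristic 1 (e.g. cut scene, dialog)
    critical_event: bool          # characteristic 2 (e.g. low health, boss)
    keyboard_interaction: float   # characteristic 3 (0..1 interaction level)
    uniform_screen_color: bool    # characteristic 4 (same color in all areas)
    effect_mismatch: bool         # characteristic 5 (high color contrast)

def decide(state: GameState, interaction_threshold: float = 0.7) -> str:
    if state.non_interactive_scene:                         # step 121 -> 145
        return "B_everywhere"
    if state.critical_event:                                # step 123 -> 141
        return "A_everywhere"
    if state.keyboard_interaction > interaction_threshold:  # step 125 -> 141
        return "A_everywhere"
    if state.uniform_screen_color:                          # step 127 -> 145
        return "B_everywhere"
    if state.effect_mismatch:                               # step 129 -> 145
        return "B_everywhere"
    return "native_types"                                   # step 143

cutscene = GameState(True, False, 0.0, False, False)
print(decide(cutscene))  # B_everywhere: Hue-optimized commands on all lights
```

Because the checks short-circuit, this sketch corresponds to the alternative embodiment in which later characteristics need not be determined once an earlier one has decided the outcome.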
  • Fig. 5 depicts a block diagram illustrating an exemplary data processing system that may perform the method as described with reference to Figs. 3 and 4.
  • the data processing system 300 may include at least one processor 302 coupled to memory elements 304 through a system bus 306. As such, the data processing system may store program code within memory elements 304. Further, the processor 302 may execute the program code accessed from the memory elements 304 via a system bus 306. In one aspect, the data processing system may be implemented as a computer that is suitable for storing and/or executing program code. It should be appreciated, however, that the data processing system 300 may be implemented in the form of any system including a processor and a memory that is capable of performing the functions described within this specification.
  • the memory elements 304 may include one or more physical memory devices such as, for example, local memory 308 and one or more bulk storage devices 310.
  • the local memory may refer to random access memory or other non-persistent memory device(s) generally used during actual execution of the program code.
  • a bulk storage device may be implemented as a hard drive or other persistent data storage device.
  • the processing system 300 may also include one or more cache memories (not shown) that provide temporary storage of at least some program code in order to reduce the quantity of times program code must be retrieved from the bulk storage device 310 during execution.
  • the processing system 300 may also be able to use memory elements of another processing system, e.g. if the processing system 300 is part of a cloud-computing platform.
  • I/O devices depicted as an input device 312 and an output device 314 optionally can be coupled to the data processing system.
  • input devices may include, but are not limited to, a keyboard, a pointing device such as a mouse, a microphone (e.g. for voice and/or speech recognition), or the like.
  • output devices may include, but are not limited to, a monitor or a display, speakers, or the like. Input and/or output devices may be coupled to the data processing system either directly or through intervening I/O controllers.
  • the input and the output devices may be implemented as a combined input/output device (illustrated in Fig. 5 with a dashed line surrounding the input device 312 and the output device 314).
  • a combined device is a touch sensitive display, also sometimes referred to as a “touch screen display” or simply “touch screen”.
  • input to the device may be provided by a movement of a physical object, such as e.g. a stylus or a finger of a user, on or near the touch screen display.
  • a network adapter 316 may also be coupled to the data processing system to enable it to become coupled to other systems, computer systems, remote network devices, and/or remote storage devices through intervening private or public networks.
  • the network adapter may comprise a data receiver for receiving data that is transmitted by said systems, devices and/or networks to the data processing system 300, and a data transmitter for transmitting data from the data processing system 300 to said systems, devices and/or networks.
  • Modems, cable modems, and Ethernet cards are examples of different types of network adapter that may be used with the data processing system 300.
  • the memory elements 304 may store an application 318.
  • the application 318 may be stored in the local memory 308, the one or more bulk storage devices 310, or separate from the local memory and the bulk storage devices. It should be appreciated that the data processing system 300 may further execute an operating system (not shown in Fig. 5) that can facilitate execution of the application 318.
  • the application 318, being implemented in the form of executable program code, can be executed by the data processing system 300, e.g. by the processor 302. Responsive to executing the application, the data processing system 300 may be configured to perform one or more operations or method steps described herein.
  • Fig. 5 shows the input device 312 and the output device 314 as being separate from the network adapter 316.
  • input may be received via the network adapter 316 and output may be transmitted via the network adapter 316.
  • the data processing system 300 may be a cloud server.
  • the input may be received from and the output may be transmitted to a user device that acts as a terminal.
  • Various embodiments of the invention may be implemented as a program product for use with a computer system, where the program(s) of the program product define functions of the embodiments (including the methods described herein).
  • the program(s) can be contained on a variety of non-transitory computer-readable storage media, where, as used herein, the expression “non-transitory computer readable storage media” comprises all computer-readable media, with the sole exception being a transitory, propagating signal.
  • the program(s) can be contained on a variety of transitory computer-readable storage media.
  • Illustrative computer-readable storage media include, but are not limited to: (i) non-writable storage media (e.g., read-only memory devices within a computer such as CD-ROM disks readable by a CD-ROM drive, ROM chips or any type of solid-state non-volatile semiconductor memory) on which information is permanently stored; and (ii) writable storage media (e.g., flash memory, floppy disks within a diskette drive or hard-disk drive or any type of solid-state random-access semiconductor memory) on which alterable information is stored.
  • the computer program may be run on the processor 302 described herein.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Circuit Arrangement For Electric Light Sources In General (AREA)

Abstract

A method comprises identifying (101) a first system and identifying (103) a second system. The first system is configured to generate a first type of light commands based on a first output of a video game and comprises a first lighting device. The second system is configured to generate a second type of light commands based on a second output of the video game and/or the first output of the video game and comprises a second lighting device. The method further comprises determining (105), for each of the first lighting device and the second lighting device, whether light control commands of the first type or light control commands of the second type should be used to control the respective lighting device, and controlling (107) the first system and the second system to render the light effects according to the determined type of light commands.
PCT/EP2020/073019 2019-08-22 2020-08-17 Determining applicable light control commands for each of a plurality of lighting devices WO2021032693A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP19193029.6 2019-08-22
EP19193029 2019-08-22

Publications (1)

Publication Number Publication Date
WO2021032693A1 true WO2021032693A1 (fr) 2021-02-25

Family

ID=67734510

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2020/073019 WO2021032693A1 (fr) Determining applicable light control commands for each of a plurality of lighting devices

Country Status (1)

Country Link
WO (1) WO2021032693A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114040548A (zh) * 2021-10-29 2022-02-11 广西世纪创新显示电子有限公司 Light effect control method for an e-sports display and related device
WO2023274700A1 (fr) * 2021-06-29 2023-01-05 Signify Holding B.V. Control device for controlling a plurality of lighting devices based on media content and a method thereof

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140104247A1 (en) 2012-10-17 2014-04-17 Adam Li Devices and systems for rendering ambient light effects in video
US20170225069A1 (en) 2016-02-04 2017-08-10 Disney Enterprises, Inc. Incorporating and coordinating multiple home systems into a play experience
US20180368230A1 (en) * 2017-05-26 2018-12-20 Cooler Master Technology Inc. Light control system and method thereof
WO2019038307A1 (fr) 2017-08-22 2019-02-28 Roccat GmbH Device and method for generating moving light effects, and salesroom having such a system

Similar Documents

Publication Publication Date Title
CN109076678B (zh) Lighting for video games
CN106557038B (zh) Computing device, method and system for customizing an environment
JP5410596B2 (ja) Virtual port management method and system
US10512845B2 (en) Real-time manipulation of gameplay through synchronous signal consumption in non-interactive media
US11109108B2 (en) Information processing device
WO2021032693A1 (fr) Determining applicable light control commands for each of a plurality of lighting devices
EP3804471B1 (fr) Selecting one or more light effects in dependence on a delay variation
CN113170301B (zh) Temporarily adding a light device to an entertainment group
EP4272454A1 (fr) Adjusting light effects based on adjustments made by users of other systems
EP4018646B1 (fr) Selecting an image analysis area based on a comparison of dynamicity levels
WO2021160552A1 (fr) Associating another control action with a physical control if an entertainment mode is active
CN115868250A (zh) Allocating control of a lighting device in an entertainment mode
WO2022058282A1 (fr) Determining different light effects for screensaver content
US20240096300A1 (en) Determining light effects in dependence on whether an overlay is likely displayed on top of video content
US10525336B2 (en) Image processing system, image processing method, program, and information storage medium
WO2020144196A1 (fr) Determining a light effect based on a light effect parameter specified by a user for other content taking place at a similar location
CN113261057A (zh) Determining light effects based on a degree of speech in media content
US9805036B2 (en) Script-based multimedia presentation
WO2022157299A1 (fr) Selecting a set of lighting devices based on an identifier of an audio and/or video signal source
EP3445138A1 (fr) Storing a preference for a light state of a light source in dependence on an attention shift
EP4274387A1 (fr) Selecting entertainment lighting devices based on the dynamicity of video content
CN116543459A (zh) Intelligent landscape lighting method and system, and electronic device
WO2024022846A1 (fr) Selecting lighting devices based on an indicated light effect and distances between available lighting devices
CN117750142A (zh) Live-streaming interaction method, apparatus, device, storage medium and program product
CN117942571A (zh) Data communication method and apparatus, electronic device, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20753975

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20753975

Country of ref document: EP

Kind code of ref document: A1