WO2020074303A1 - Determining dynamicity for light effects based on movement in video content - Google Patents

Determining dynamicity for light effects based on movement in video content

Info

Publication number
WO2020074303A1
WO2020074303A1 (application PCT/EP2019/076422)
Authority
WO
WIPO (PCT)
Prior art keywords
chromaticity
video content
determining
brightness settings
frame
Prior art date
Application number
PCT/EP2019/076422
Other languages
French (fr)
Inventor
Tobias BORRA
Simon RYCROFT
Dragan Sekulovski
Dzmitry Viktorovich Aliakseyeu
Original Assignee
Signify Holding B.V.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Signify Holding B.V. filed Critical Signify Holding B.V.
Publication of WO2020074303A1 publication Critical patent/WO2020074303A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/254Analysis of motion involving subtraction of images
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B45/00Circuit arrangements for operating light-emitting diodes [LED]
    • H05B45/10Controlling the intensity of the light
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B45/00Circuit arrangements for operating light-emitting diodes [LED]
    • H05B45/20Controlling the colour of the light
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10Controlling the light source
    • H05B47/155Coordinated control of two or more light sources

Definitions

  • the invention relates to a system for determining a sequence of chromaticity and/or brightness settings.
  • the invention further relates to a method of determining a sequence of chromaticity and/or brightness settings.
  • the invention even further relates to a lighting system comprising said system.
  • the invention also relates to a computer program product enabling a computer system to perform such a method.
  • Philips Hue Sync and Philips Ambilight enable the rendering of light effects based on the content that is played on a television (Ambilight) or computer (Hue Sync). More dynamic light effects are especially suitable for more dynamic content on-screen, but also off-screen, as disclosed in US2010/0165000 Al.
  • US2010/0026734A1 discloses a system suitable for backlighting.
  • the system comprises a monitor unit configured to monitor an information signal, and to generate a first signal based at least partly on the information comprised in the information signal, a control unit configured to control the reaction time of at least one illumination area capable of emitting properties of light and comprised in the system, based on the first signal, wherein the first signal may comprise scene change information.
  • the current Philips Hue Sync app allows the user to control the dynamics of the light effects.
  • Slower dynamics result in smoother transitions, typically characterized by lower intensity peaks.
  • Slower dynamics are usually preferred for content that does not contain a lot of dynamics (e.g. dialogue scenes) whereas increased dynamics are usually preferred for content that contains faster movement (e.g. action movies).
  • a drawback of the current Philips Hue Sync app is that once the dynamics setting is set, it will not change unless the user changes it. This results in the dynamics setting no longer being a good fit with the content at certain times.
  • a system for determining a sequence of chromaticity and/or brightness settings comprises at least one processor configured to determine a quantity of pixels which have moved in a second frame of video content compared to a first frame of said video content, determine if said quantity of pixels exceeds a certain threshold, determine one or more properties of a dominant movement between said first frame and said second frame, determine a dynamicity for chromaticity and/or brightness settings based on said one or more properties in dependence on said quantity exceeding said threshold, determine a sequence of chromaticity and/or brightness settings based on at least part of said video content, said sequence having said dynamicity wherein said dynamicity for said chromaticity and/or brightness settings comprises a transition speed in said chromaticity and/or brightness settings, and determine a first transition speed for said chromaticity settings and a second transition speed for said brightness settings.
  • the first transition speed and the second transition speed are different in order to provide a better fit between the light effects and the video content.
  • the system may be a lighting system, may be part of a lighting system or may be used in a lighting system.
  • Said one or more properties may comprise a speed and/or a direction and/or a type of said dominant movement, for example.
  • Said movement type may be a camera pan or a camera zoom or a camera tilt or a camera forward movement or an object movement, for example.
  • Said at least one processor may be configured to determine said quantity by determining a quantity of all pixels or of key pixels which have moved in said second frame of video content compared to said first frame of said video content.
  • the determined dynamicity is typically used for chromaticity and/or brightness settings based on video content starting at or after the second frame.
  • the sequence is intended to be rendered on one or more lights simultaneously with the video content on which the sequence is based.
  • the quantity of pixels which have moved in the second frame of video content compared to the first frame of said video content may be determined by taking the pixel difference between the two frames, for example. This pixel difference could be in either lightness or chromaticity or both. However, the determination of the quantity of pixels which have moved in the second frame of video content compared to the first frame of said video content does not require that a movement is determined per pixel in the video frame (e.g. by determining the pixel difference).
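  • the pixel-difference approach described above can be sketched as follows. This is an illustrative example, not part of the patent: the frame data, the per-pixel difference threshold of 16, and the function name are all hypothetical.

```python
# Sketch: estimate the quantity of pixels that have "moved" between two
# frames by per-pixel differencing. Frames are 2D lists of grayscale
# values; the diff threshold (16) is a hypothetical choice.

def moved_pixel_fraction(frame_a, frame_b, diff_threshold=16):
    """Return the fraction of pixels whose value changed by more than
    diff_threshold between frame_a and frame_b (same dimensions)."""
    total = 0
    moved = 0
    for row_a, row_b in zip(frame_a, frame_b):
        for pa, pb in zip(row_a, row_b):
            total += 1
            if abs(pa - pb) > diff_threshold:
                moved += 1
    return moved / total

frame1 = [[10, 10, 10, 10],
          [10, 200, 200, 10],
          [10, 200, 200, 10],
          [10, 10, 10, 10]]
# The bright 2x2 block has shifted one pixel to the right.
frame2 = [[10, 10, 10, 10],
          [10, 10, 200, 200],
          [10, 10, 200, 200],
          [10, 10, 10, 10]]

fraction = moved_pixel_fraction(frame1, frame2)
print(fraction)  # 0.25: 4 of 16 pixels changed significantly
```

  • the same differencing could be applied per channel to capture chromaticity changes as well as lightness changes.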
  • the wording “a quantity of pixels which have moved in a second frame of video content compared to a first frame of said video content” may refer either to the case that all pixels that have moved are included, or to the case that only a part of the pixels that have moved are included, for example the pixels corresponding to an object present in the first and second frame.
  • An object may be detected using edge detection, for example. If the object is detected to have moved, the quantity of pixels that form the object, i.e. are located inside the edges of the object, may be considered to be a quantity of pixels that have moved.
  • the quantity of pixels which have moved and/or one or more properties of the dominant movement may be determined by computing the optical flow field.
  • the concept of optical flow is disclosed by David J. Fleet & Yair Weiss in the chapter titled “Optical Flow Estimation” of the “Handbook of Mathematical Models in Computer Vision”, Paragios et al. (eds.), Springer, ISBN 0-387-26371-3.
  • a movement direction and/or speed that the largest number of moving pixels have in common may be considered to be the dominant movement, for example. This does not require that all pixels move in the same direction. For example, in case of a camera zooming in or out, pixels may be considered to be part of the same movement, without there being one dominant movement per pixel. If the moving pixels are part of an object that has moved, then the pixels will normally have the same movement direction and speed.
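  • the "most common direction and speed" idea above can be sketched by binning per-pixel motion vectors (e.g. the output of an optical-flow step) and taking the largest bin. The binning granularity and all data below are assumptions for illustration; the patent does not prescribe them.

```python
# Sketch: pick a "dominant movement" as the (direction, speed) bin shared
# by the largest number of moving pixels. Bin sizes are hypothetical.

from collections import Counter
import math

def dominant_movement(flow_vectors, speed_bin=1.0, angle_bins=8):
    """flow_vectors: list of (dx, dy) motion vectors, one per moving pixel.
    Returns ((angle_bin, speed_bin_index), pixel_count) of the largest bin."""
    bins = Counter()
    for dx, dy in flow_vectors:
        speed = math.hypot(dx, dy)
        if speed == 0:
            continue  # static pixels do not vote
        angle = math.atan2(dy, dx)
        a_bin = int((angle + math.pi) / (2 * math.pi) * angle_bins) % angle_bins
        s_bin = round(speed / speed_bin)
        bins[(a_bin, s_bin)] += 1
    return bins.most_common(1)[0]

# Mostly rightward motion of ~2 px, plus a little noise:
vectors = [(2, 0)] * 50 + [(0, 2)] * 5 + [(2.1, 0.1)] * 10
(direction_bin, speed_bin_idx), count = dominant_movement(vectors)
print(count)  # 60 of 65 vectors fall in the dominant rightward bin
```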
  • Said dynamicity for said chromaticity and/or brightness settings may comprise a transition speed in said chromaticity and/or brightness settings, for example. Transition speed is typically the most important dynamicity parameter for light effects.
  • Said at least one processor is configured to determine a first transition speed for said chromaticity settings and a second transition speed for said brightness settings. This ensures that the dynamicity of the light effects fit with the content even better.
  • the transition speed in said chromaticity and/or brightness settings increases as the quantity of pixels exceeding said threshold increases.
  • if a larger object has moved, the transition speed of brightness and/or chromaticity is increased compared to the case in which a smaller object has moved, in order to provide a better fit between the light settings and the video content.
  • Said at least one processor may be configured to determine said second transition speed to be a certain normal speed and said first transition speed to be a speed which is at least a certain amount higher than said certain normal speed upon determining that said dominant movement is a camera pan.
  • Fast chromaticity transitions are a good fit for camera panning in video content, for example.
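  • the pan-specific rule above can be sketched as a small lookup: a normal second (brightness) transition speed, and a first (chromaticity) transition speed raised by at least a certain amount when the dominant movement is a camera pan. The numeric values are made up for illustration.

```python
# Sketch of the camera-pan rule: chromaticity speeds up, brightness stays
# at a normal speed. Baseline and boost values are hypothetical.

NORMAL_SPEED = 1.0  # hypothetical baseline transition speed
PAN_BOOST = 0.5     # hypothetical minimum amount added for pans

def transition_speeds(movement_type):
    """Return (chromaticity_speed, brightness_speed) for a movement type."""
    brightness_speed = NORMAL_SPEED
    if movement_type == "pan":
        chromaticity_speed = NORMAL_SPEED + PAN_BOOST
    else:
        chromaticity_speed = NORMAL_SPEED
    return chromaticity_speed, brightness_speed

print(transition_speeds("pan"))   # (1.5, 1.0)
print(transition_speeds("zoom"))  # (1.0, 1.0)
```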
  • Said at least one processor may be configured to determine said transition speed further based on a user-configured transition speed. This allows the light effects to be optimized for a particular user or group of users.
  • Said at least one processor may be configured to increase said transition speed upon determining an increase in a speed of said dominant movement.
  • Said at least one processor may be configured to control one or more light devices based on said sequence of chromaticity and/or brightness settings while said video content is being rendered.
  • the same system that determines the light effects may also control the one or more light devices to render these light effects.
  • Said at least one processor may be configured to control said one or more light devices while receiving said video content and while determining chromaticity and/or brightness settings of said sequence based on said video content. This allows the system to determine the sequence of chromaticity and/or brightness settings and control the one or more light devices based on this sequence in real-time.
  • Said at least one processor may be configured to determine different chromaticity and/or brightness settings for different light devices or light elements (e.g. light device left of the TV and light device right of the TV).
  • the sequence of chromaticity and/or brightness settings may indicate per setting where this setting should be applied.
  • Said at least one processor may be configured to store said sequence in a light script, each of said settings being associated in said light script with a time in said video content. This allows the system to determine light effects to be rendered by another system.
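  • a light script as described above could be stored, for instance, as a list of time-stamped settings. The JSON field names and layout below are assumptions for illustration; the patent does not define a script format.

```python
# Sketch: a "light script" associating each chromaticity/brightness
# setting with a time in the video content. Format is hypothetical.

import json

light_script = {
    "video": "example_movie",
    "settings": [
        # time in seconds, CIE xy chromaticity, brightness 0..1
        {"t": 0.00, "xy": [0.450, 0.410], "bri": 0.60},
        {"t": 0.04, "xy": [0.455, 0.405], "bri": 0.61},
        {"t": 0.08, "xy": [0.470, 0.395], "bri": 0.62},
    ],
}

serialized = json.dumps(light_script)   # persist for later rendering
restored = json.loads(serialized)       # another system replays it
print(len(restored["settings"]))  # 3
```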
  • a method of determining a sequence of chromaticity and/or brightness settings comprises determining a quantity of pixels which have moved in a second frame of video content compared to a first frame of said video content, determining if said quantity of pixels exceeds a certain threshold, determining one or more properties of a dominant movement between said first frame and said second frame, determining a dynamicity for chromaticity and/or brightness settings based on said one or more properties in dependence on said quantity exceeding said threshold, determining a sequence of chromaticity and/or brightness settings based on at least part of said video content, said sequence having said dynamicity wherein said dynamicity for said chromaticity and/or brightness settings comprises a transition speed in said chromaticity and/or brightness settings, and determining a first transition speed for said chromaticity settings and a second transition speed for said brightness settings. The first transition speed is different from the second transition speed.
  • Said method may be performed by software running on a programmable device. This software may be provided as a computer program product.
  • a computer program for carrying out the methods described herein, as well as a non-transitory computer readable storage-medium storing the computer program are provided.
  • a computer program may, for example, be downloaded by or uploaded to an existing device or be stored upon manufacturing of these systems.
  • a non-transitory computer-readable storage medium stores a software code portion, the software code portion, when executed or processed by a computer, being configured to perform executable operations comprising: determining a quantity of pixels which have moved in a second frame of video content compared to a first frame of said video content, determining if said quantity of pixels exceeds a certain threshold, determining one or more properties of a dominant movement between said first frame and said second frame, determining a dynamicity for chromaticity and/or brightness settings based on said one or more properties in dependence on said quantity exceeding said threshold, and determining a sequence of chromaticity and/or brightness settings based on at least part of said video content, said sequence having said dynamicity.
  • aspects of the present invention may be embodied as a device, a method or a computer program product.
  • aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit", "module" or "system".
  • Functions described in this disclosure may be implemented as an algorithm executed by a processor/microprocessor of a computer.
  • aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied, e.g., stored, thereon.
  • the computer readable medium may be a computer readable signal medium or a computer readable storage medium.
  • a computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
  • examples of a computer readable storage medium include, but are not limited to, the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
  • a computer readable storage medium may be any tangible medium that can contain, or store, a program for use by or in connection with an instruction execution system, apparatus, or device.
  • a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof.
  • a computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber, cable, RF, etc., or any suitable combination of the foregoing.
  • Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
  • These computer program instructions may be provided to a processor, in particular a microprocessor or a central processing unit (CPU), of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer, other programmable data processing apparatus, or other devices create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
  • the computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
  • the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
  • Fig. 1 is a block diagram of an embodiment of the system
  • Fig. 2 is a flow diagram of an embodiment of the method
  • Fig. 3 depicts a part of video content in which a camera zooms in
  • Fig. 4 depicts a part of video content in which an object moves
  • Fig. 5 depicts a part of video content in which a camera pans to the left
  • Fig. 6 depicts a part of video content in which a camera follows a car
  • Fig. 7 is a block diagram of an exemplary data processing system for performing the method of the invention. Corresponding elements in the drawings are denoted by the same reference numeral.
  • Fig. 1 shows an embodiment of the system of the invention: mobile device 1.
  • Mobile device 1 is connected to a wireless LAN access point 17.
  • a bridge 11 is also connected to the wireless LAN access point 17, e.g. via Ethernet.
  • Light devices 13, 14 and 15 communicate wirelessly with the bridge 11, e.g. using the Zigbee protocol, and can be controlled via the bridge 11, e.g. by the mobile device 1.
  • the bridge 11 may be a Philips Hue bridge and the light devices 13-15 may be Philips Hue lights, for example. In an alternative embodiment, light devices are controlled without a bridge.
  • the wireless LAN access point 17 is connected to the Internet 18.
  • An Internet server 19 is also connected to the Internet 18.
  • the mobile device 1 may be a mobile phone or a tablet, for example.
  • the mobile device 1 may run the Philips Hue Sync app, for example.
  • the mobile device 1 and the light devices 13, 14 and 15 together form a lighting system.
  • the mobile device 1 comprises a processor 5, a transceiver 3, a memory 7, and a display 9.
  • the processor 5 is configured to determine a quantity of pixels which have moved in a second frame of video content compared to a first frame of the video content and determine if the quantity of pixels exceeds a certain threshold.
  • the processor 5 is further configured to determine one or more properties of a dominant movement between the first frame and the second frame, determine a dynamicity for chromaticity and/or brightness settings based on the one or more properties in dependence on the quantity exceeding the threshold, and determine a sequence of chromaticity and/or brightness settings based on at least part of the video content, the sequence having the dynamicity.
  • the dynamicity setting (e.g. chromaticity transition speed) is adjusted in dependence on the quantity of movement in a scene; depending on the type of movement (e.g. fast camera panning vs. fast linear camera movement), a different adaptation is used.
  • the processor 5 is configured to control the light devices 13-15 based on the sequence of chromaticity and/or brightness settings while receiving the video content, while determining chromaticity and/or brightness settings of the sequence based on the video content, and while the video content is being rendered.
  • the processor 5 is configured to store the sequence in a light script, each of the settings being associated in the light script with a time in the video content.
  • the processor 5 may be configured to execute the light script at a later time.
  • the processor 5 is configured to control the light devices 13-15 based on the sequence of chromaticity and/or brightness settings while the video content is being rendered.
  • the mobile device 1 comprises one processor 5.
  • the mobile device 1 comprises multiple processors.
  • the processor 5 of the mobile device 1 may be a general-purpose processor, e.g. from Qualcomm or ARM-based, or an application-specific processor.
  • the processor 5 of the mobile device 1 may run an Android or iOS operating system for example.
  • the memory 7 may comprise one or more memory units.
  • the memory 7 may comprise solid-state memory, for example.
  • the memory 7 may be used to store an operating system, applications and application data, for example.
  • the transceiver 3 may use one or more wireless communication technologies such as Wi-Fi (IEEE 802.11) to communicate with the wireless LAN access point 17, for example.
  • multiple transceivers are used instead of a single transceiver.
  • a receiver and a transmitter have been combined into a transceiver 3.
  • one or more separate receiver components and one or more separate transmitter components are used.
  • the display 9 may comprise an LCD or OLED panel, for example.
  • the display 9 may be a touch screen.
  • the mobile device 1 may comprise other components typical for a mobile device such as a battery and a power connector.
  • the invention may be implemented using a computer program running on one or more processors.
  • the system of the invention is a mobile device.
  • the system of the invention is a different device, e.g. a PC.
  • a step 101 comprises determining a quantity of pixels which have moved in a frame of video content compared to a previous frame of the video content (also referred to as “second frame” and “first frame”, respectively).
  • the quantity of pixels which have moved in the second frame of video content compared to the first frame of said video content may be determined by taking the pixel difference between the two frames, for example. This pixel difference could be in either lightness or chromaticity or both.
  • the determination of the quantity of pixels which have moved in the second frame of video content compared to the first frame of said video content does not require that a movement is determined per pixel in the video frame (e.g. by determining the pixel difference).
  • An object may be detected using edge detection, for example. If the object is detected to have moved, the quantity of pixels that form the object, i.e. are located inside the edges of the object, may be considered to be a quantity of pixels that have moved.
  • the quantity of pixels is determined by computing the optical flow field with an algorithm appropriate for the application.
  • a fast algorithm may be used for real-time light effect generation and a high accuracy algorithm may be used for script creation.
  • a step 103 comprises determining if the quantity of pixels exceeds a certain threshold, e.g. 10% of all pixels. If the certain threshold is exceeded, step 105 is performed. If not, step 113 is performed. In the embodiment of Fig. 2, step 113 comprises determining a sequence of chromaticity and/or brightness settings based on at least part of the video content using a slow dynamicity.
  • a step 105 comprises determining one or more properties of a dominant movement between the first frame and the second frame.
  • the one or more properties may comprise a speed and/or a direction and/or a type of the dominant movement, for example.
  • the movement type may be a camera pan or a camera zoom or a camera tilt or a camera forward movement or an object movement, for example.
  • a movement direction and/or speed that the largest number of moving pixels have in common may be considered to be the dominant movement, for example. This does not require that all pixels move in the same direction. For example, in case of a camera zooming in or out, pixels may be considered to be part of the same movement, without there being one dominant movement per pixel. If the moving pixels are part of an object that has moved, then the pixels will normally have the same movement direction and speed.
  • a step 107 comprises determining a dynamicity for chromaticity and/or brightness settings based on the one or more properties in dependence on the quantity exceeding the threshold.
  • the dynamicity for the chromaticity and/or brightness settings comprises a transition speed in the chromaticity and/or brightness settings and step 107 comprises determining a first transition speed for the chromaticity settings and a second transition speed for the brightness settings.
  • in this implementation, the brightness transitions are much slower than the chromaticity transitions, i.e. the second transition speed (for the brightness settings) is lower than the first transition speed (for the chromaticity settings).
  • the transition speed may further be determined based on a user-configured transition speed.
  • the user may be able to select one of a limited number of static presets with slightly slower or faster transitions, for example. For instance, if a user selects the “moderate” setting, where the transition speed of brightness is a and of chromaticity b, then based on the detected movement and its properties, a change (which can be positive or negative) can be introduced to modify the current transition speeds, e.g. by Δa and Δb.
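  • the preset-plus-delta adjustment described above can be sketched as follows. The preset names, baseline speeds, and clamping floor are hypothetical values chosen for illustration.

```python
# Sketch: a user preset fixes baseline speeds a (brightness) and
# b (chromaticity); detected movement contributes signed deltas.
# All numeric values are hypothetical.

PRESETS = {"subtle": (0.5, 0.8), "moderate": (1.0, 1.5), "intense": (1.5, 2.5)}

def adjusted_speeds(preset, delta_a, delta_b, minimum=0.1):
    """Apply movement-based deltas to a preset, clamped to stay positive."""
    a, b = PRESETS[preset]
    return max(minimum, a + delta_a), max(minimum, b + delta_b)

# Fast movement detected: positive deltas speed both transitions up.
print(adjusted_speeds("moderate", 0.5, 0.5))    # (1.5, 2.0)
# Calm dialogue scene: negative deltas slow both down.
print(adjusted_speeds("moderate", -0.5, -0.5))  # (0.5, 1.0)
```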
  • the changes to the transition speed could also be presented as an option to the script creator, who might need to confirm or correct the adaptations of the transition speed.
  • a step 109 comprises determining a sequence of chromaticity and/or brightness settings based on at least part of the video content, the sequence having the dynamicity.
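  • the flow of steps 103 through 113 above can be summarized in one sketch. The threshold of 10% matches the example in step 103; the per-movement-type speeds are hypothetical stand-ins for whatever mapping an implementation chooses.

```python
# Sketch of steps 103-113: below the movement threshold, use slow
# dynamicity (step 113); above it, pick speeds from the dominant
# movement type (step 107). All speed values are hypothetical.

MOVEMENT_THRESHOLD = 0.10  # step 103: e.g. 10% of all pixels

SLOW = {"chromaticity_speed": 0.5, "brightness_speed": 0.5}  # step 113

def determine_dynamicity(moved_fraction, movement_type):
    """Return transition speeds given the fraction of moved pixels and,
    if the threshold is exceeded, the dominant movement type."""
    if moved_fraction <= MOVEMENT_THRESHOLD:
        return dict(SLOW)  # slow dynamicity, movement type ignored
    if movement_type == "pan":
        # fast chromaticity, normal brightness (cf. the camera-pan rule)
        return {"chromaticity_speed": 2.0, "brightness_speed": 1.0}
    if movement_type == "forward":
        # both sped up for a feeling of moving fast forward
        return {"chromaticity_speed": 2.0, "brightness_speed": 2.0}
    return {"chromaticity_speed": 1.0, "brightness_speed": 1.0}

print(determine_dynamicity(0.05, "pan"))      # slow: threshold not exceeded
print(determine_dynamicity(0.40, "pan"))      # fast chromaticity only
print(determine_dynamicity(0.40, "forward"))  # both sped up
```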
  • Figs. 5 and 6 illustrate an implementation in which step 105 comprises determining the type of the dominant movement.
  • the dynamicity determined in step 107 depends on this movement type and different light effects transition speeds are used for different types of movements.
  • Figs. 5 and 6 show different scenes with significant changes between the frames, caused by different types of movement.
  • Fig. 5 shows a camera panning movement and
  • Fig. 6 shows a chase (forward movement).
  • chromaticity transitions can be sped up to create a feeling of high side speed (e.g. colors traveling fast from the left then center and then right light).
  • increased transition speed of chromaticity and brightness (on all lights) might create a stronger feeling of moving very fast forward.
  • the transition speed may be increased upon determining an increase in a speed of the dominant movement.
  • steps 101 and 105 are separate steps and step 105 is only performed if the threshold is exceeded.
  • steps 101 and 105 are combined and if the threshold is exceeded, step 107 is performed instead of step 105.
  • steps 101, 105 and 107 are combined and if the threshold is exceeded, step 109 is performed instead of step 105.
  • Fig. 7 depicts a block diagram illustrating an exemplary data processing system that may perform the method as described with reference to Fig. 2.
  • the data processing system 300 may include at least one processor 302 coupled to memory elements 304 through a system bus 306. As such, the data processing system may store program code within memory elements 304. Further, the processor 302 may execute the program code accessed from the memory elements 304 via a system bus 306. In one aspect, the data processing system may be implemented as a computer that is suitable for storing and/or executing program code. It should be appreciated, however, that the data processing system 300 may be implemented in the form of any system including a processor and a memory that can perform the functions described within this specification.
  • the memory elements 304 may include one or more physical memory devices such as, for example, local memory 308 and one or more bulk storage devices 310.
  • the local memory may refer to random access memory or other non-persistent memory device(s) generally used during actual execution of the program code.
  • a bulk storage device may be implemented as a hard drive or other persistent data storage device.
  • the processing system 300 may also include one or more cache memories (not shown) that provide temporary storage of at least some program code in order to reduce the quantity of times program code must be retrieved from the bulk storage device 310 during execution.
  • the processing system 300 may also be able to use memory elements of another processing system, e.g. if the processing system 300 is part of a cloud-computing platform.
  • I/O devices depicted as an input device 312 and an output device 314 optionally can be coupled to the data processing system.
  • input devices may include, but are not limited to, a keyboard, a pointing device such as a mouse, a microphone (e.g. for voice and/or speech recognition), or the like.
  • output devices may include, but are not limited to, a monitor or a display, speakers, or the like. Input and/or output devices may be coupled to the data processing system either directly or through intervening I/O controllers.
  • the input and the output devices may be implemented as a combined input/output device (illustrated in Fig. 7 with a dashed line surrounding the input device 312 and the output device 314).
  • a combined device is a touch sensitive display, also sometimes referred to as a “touch screen display” or simply “touch screen”.
  • input to the device may be provided by a movement of a physical object, such as e.g. a stylus or a finger of a user, on or near the touch screen display.
  • a network adapter 316 may also be coupled to the data processing system to enable it to become coupled to other systems, computer systems, remote network devices, and/or remote storage devices through intervening private or public networks.
  • the network adapter may comprise a data receiver for receiving data that is transmitted by said systems, devices and/or networks to the data processing system 300, and a data transmitter for transmitting data from the data processing system 300 to said systems, devices and/or networks.
  • Modems, cable modems, and Ethernet cards are examples of different types of network adapter that may be used with the data processing system 300.
  • the memory elements 304 may store an application 318.
  • the application 318 may be stored in the local memory 308, the one or more bulk storage devices 310, or separate from the local memory and the bulk storage devices.
  • the data processing system 300 may further execute an operating system (not shown in Fig. 7) that can facilitate execution of the application 318.
  • the application 318 being implemented in the form of executable program code, can be executed by the data processing system 300, e.g., by the processor 302. Responsive to executing the application, the data processing system 300 may be configured to perform one or more operations or method steps described herein.
  • Various embodiments of the invention may be implemented as a program product for use with a computer system, where the program(s) of the program product define functions of the embodiments (including the methods described herein).
  • the program(s) can be contained on a variety of non-transitory computer-readable storage media, where, as used herein, the expression “non-transitory computer readable storage media” comprises all computer-readable media, with the sole exception being a transitory, propagating signal.
  • the program(s) can be contained on a variety of transitory computer-readable storage media.
  • Illustrative computer-readable storage media include, but are not limited to: (i) non-writable storage media (e.g., read-only memory devices within a computer such as CD-ROM disks readable by a CD-ROM drive, ROM chips or any type of solid-state non-volatile semiconductor memory) on which information is permanently stored; and (ii) writable storage media (e.g., flash memory, floppy disks within a diskette drive or hard-disk drive or any type of solid-state random-access semiconductor memory) on which alterable information is stored.
  • the computer program may be run on the processor 302 described herein.

Abstract

A method comprises determining (101) a quantity of pixels which have moved in a second frame of video content compared to a first frame of the video content, determining (103) if the quantity of pixels exceeds a certain threshold, determining (105) one or more properties of a dominant movement between the first frame and the second frame, determining (107) a dynamicity for chromaticity and/or brightness settings based on the one or more properties in dependence on the quantity exceeding the threshold, and determining (109) a sequence of chromaticity and/or brightness settings based on at least part of the video content, the sequence having the dynamicity. The sequence is intended to be rendered on one or more lights simultaneously with the video content on which the sequence is based.

Description

DETERMINING DYNAMICITY FOR LIGHT EFFECTS BASED ON MOVEMENT IN VIDEO CONTENT
FIELD OF THE INVENTION
The invention relates to a system for determining a sequence of chromaticity and/or brightness settings.
The invention further relates to a method of determining a sequence of chromaticity and/or brightness settings.
The invention even further relates to a lighting system comprising said system.
The invention also relates to a computer program product enabling a computer system to perform such a method.

BACKGROUND OF THE INVENTION
Technologies such as Philips Hue Sync and Philips Ambilight enable the rendering of light effects based on the content that is played on a television (Ambilight) or computer (Hue Sync). More dynamic light effects are especially suitable for more dynamic content on-screen, but also off-screen, as disclosed in US2010/0165000 Al.
US2010/0026734A1 discloses a system suitable for backlighting. The system comprises a monitor unit configured to monitor an information signal, and to generate a first signal based at least partly on the information comprised in the information signal, a control unit configured to control the reaction time of at least one illumination area capable of emitting properties of light and comprised in the system, based on the first signal, wherein the first signal may comprise scene change information.
To provide an optimal user experience, the current Philips Hue Sync app allows the user to control the dynamics of the light effects. Slower dynamics result in smoother transitions, typically characterized by lower intensity peaks. Slower dynamics are usually preferred for content that does not contain a lot of dynamics (e.g. dialogue scenes) whereas increased dynamics are usually preferred for content that contains faster movement (e.g. action movies). A drawback of the current Philips Hue Sync app is that once the dynamics setting is set, it will not change unless the user changes it. This results in the dynamics setting no longer being a good fit with the content at certain times.
SUMMARY OF THE INVENTION
It is a first object of the invention to provide a system, which is able to determine the dynamicity of light effects such that it is often a good fit with the content.
It is a second object of the invention to provide a method, which is able to determine the dynamicity of light effects such that it is often a good fit with the content.
It is a third object of the invention to provide a lighting system that is able to control the light sources such that the dynamicity of light effects is often a good fit with the content.
In a first aspect of the invention, a system for determining a sequence of chromaticity and/or brightness settings comprises at least one processor configured to determine a quantity of pixels which have moved in a second frame of video content compared to a first frame of said video content, determine if said quantity of pixels exceeds a certain threshold, determine one or more properties of a dominant movement between said first frame and said second frame, determine a dynamicity for chromaticity and/or brightness settings based on said one or more properties in dependence on said quantity exceeding said threshold, determine a sequence of chromaticity and/or brightness settings based on at least part of said video content, said sequence having said dynamicity wherein said dynamicity for said chromaticity and/or brightness settings comprises a transition speed in said chromaticity and/or brightness settings, and determine a first transition speed for said chromaticity settings and a second transition speed for said brightness settings. The first transition speed and the second transition speed are different in order to provide a better fit between the light effects and the video content.
The system may be a lighting system, may be part of a lighting system or may be used in a lighting system. Said one or more properties may comprise a speed and/or a direction and/or a type of said dominant movement, for example. Said movement type may be a camera pan or a camera zoom or a camera tilt or a camera forward movement or an object movement, for example. Said at least one processor may be configured to determine said quantity by determining a quantity of all pixels or of key pixels which have moved in said second frame of video content compared to said first frame of said video content. The determined dynamicity is typically used for chromaticity and/or brightness settings based on video content starting at or after the second frame. The sequence is intended to be rendered on one or more lights simultaneously with the video content on which the sequence is based.
By dynamically determining the dynamicity of the sequence of chromaticity and/or brightness settings based on the video content, it becomes possible to make the dynamicity of the light effects fit better with the content. This can be realized by using motion detection. By determining the dynamicity of the sequence of chromaticity and/or brightness settings based on one or more properties of a dominant movement if there is substantial movement in the content, the dynamicity of the light effects may be reduced for a slow scene with little movement in the middle of a fast-paced action movie. By determining the dynamicity differently for different types of dominant movement, the dynamicity of the light effects will normally be a good fit with the content.
The quantity of pixels which have moved in the second frame of video content compared to the first frame of said video content may be determined by taking the pixel difference between the two frames, for example. This pixel difference could be in either lightness or chromaticity or both. However, the determination of the quantity of pixels which have moved in the second frame of video content compared to the first frame of said video content does not require that a movement is determined per pixel in the video frame (e.g. by determining the pixel difference). The wording “a quantity of pixels which have moved in a second frame of video content compared to a first frame of said video content” may refer either to the case that all pixels that have moved are included or to the case that only a part of the pixels that have moved are included, for example the pixels corresponding to an object present in the first and second frame.
It is sometimes not possible to determine that a pixel of an object has moved, because it does not have the exact same brightness and/or chromaticity in the next frame. While the properties of a pixel might change, it is still possible to calculate that the pixel has moved, e.g. because it remains part of the same object and this object has moved. An object may be detected using edge detection, for example. If the object is detected to have moved, the quantity of pixels that form the object, i.e. are located inside the edges of the object, may be considered to be a quantity of pixels that have moved.
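The pixel-difference approach described above can be sketched as follows. This is only a minimal illustration of the idea, not the patent's mandated implementation; the difference threshold value is an assumed tuning parameter.

```python
import numpy as np

def count_moved_pixels(frame1, frame2, diff_threshold=0.1):
    """Count pixels whose value changed noticeably between two frames.

    frame1/frame2: float arrays in [0, 1], shape (H, W) for lightness
    or (H, W, 3) when the difference is taken in colour channels.
    The 0.1 threshold is an illustrative assumption.
    """
    diff = np.abs(frame2.astype(float) - frame1.astype(float))
    if diff.ndim == 3:            # reduce colour channels to one difference
        diff = diff.max(axis=2)
    return int(np.count_nonzero(diff > diff_threshold))

# Example: an 8x8 frame in which a 4x4 block changes brightness
a = np.zeros((8, 8))
b = a.copy()
b[2:6, 2:6] = 0.5
print(count_moved_pixels(a, b))   # 16 pixels counted as moved
```

In a real pipeline this per-pixel difference would be only one option; as the text notes, counting the pixels inside an edge-detected object that has moved is an alternative.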
The quantity of pixels which have moved and/or one or more properties of the dominant movement, e.g. the movement type, may be determined by computing the optical flow field. The concept of optical flow is disclosed by David J. Fleet & Yair Weiss in the chapter titled "Optical Flow Estimation" of the “Handbook of Mathematical Models in Computer Vision”, Paragios et al., Springer, ISBN 0-387-26371-3. A movement direction and/or speed that the largest number of moving pixels have in common may be considered to be the dominant movement, for example. This does not require that all pixels move in the same direction. For example, in case of a camera zooming in or out, pixels may be considered to be part of the same movement, without there being one dominant movement per pixel. If the moving pixels are part of an object that has moved, then the pixels will normally have the same movement direction and speed.
Said dynamicity for said chromaticity and/or brightness settings may comprise a transition speed in said chromaticity and/or brightness settings, for example. Transition speed is typically the most important dynamicity parameter for light effects.
Said at least one processor is configured to determine a first transition speed for said chromaticity settings and a second transition speed for said brightness settings. This ensures that the dynamicity of the light effects fits the content even better.
In an embodiment, the transition speed in said chromaticity and/or brightness settings increases at increasing quantity of pixels exceeding said threshold. If, for example, a larger object in a video frame has moved, the transition speed of brightness and/or chromaticity is increased compared to the case that a smaller object has moved, in order to provide a better fit between the light settings and the video content.
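One possible monotone mapping from the fraction of moved pixels to a transition speed is sketched below. The base speed, maximum speed and 10% threshold are illustrative assumptions; the embodiment only requires that the speed increases with the quantity of moved pixels above the threshold.

```python
def transition_speed(moved_fraction, threshold=0.10,
                     base_speed=1.0, max_speed=4.0):
    """Return a transition speed that grows with the moved-pixel fraction.

    Below the threshold the slow base speed is kept; above it the speed
    rises linearly towards max_speed at 100% moved pixels. All numeric
    values are assumed example parameters.
    """
    if moved_fraction <= threshold:
        return base_speed
    span = (moved_fraction - threshold) / (1.0 - threshold)
    return base_speed + span * (max_speed - base_speed)
```

A linear ramp is just one choice; a stepped or perceptually tuned curve would serve the same purpose.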
Said at least one processor may be configured to determine said second transition speed to be a certain normal speed and said first transition speed to be a speed which is at least a certain amount higher than said certain normal speed upon determining that said dominant movement is a camera pan. Fast chromaticity transitions are a good fit for camera panning in video content, for example.
Said at least one processor may be configured to determine said transition speed further based on a user-configured transition speed. This allows the light effects to be optimized for a particular user or group of users.
Said at least one processor may be configured to increase said transition speed upon determining an increase in a speed of said dominant movement. By matching the transition speed of the light effects with the movement speed in the video content, the resulting light effects will enhance the content particularly well.
Said at least one processor may be configured to control one or more light devices based on said sequence of chromaticity and/or brightness settings while said video content is being rendered. Thus, the same system that determines the light effects may also control the one or more light devices to render these light effects. Said at least one processor may be configured to control said one or more light devices while receiving said video content and while determining chromaticity and/or brightness settings of said sequence based on said video content. This allows the system to determine the sequence of chromaticity and/or brightness settings and control the one or more light devices based on this sequence in real-time.
Said at least one processor may be configured to determine different chromaticity and/or brightness settings for different light devices or light elements (e.g. light device left of the TV and light device right of the TV). The sequence of chromaticity and/or brightness settings may indicate per setting where this setting should be applied.
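A sketch of deriving different settings for a light left and a light right of the TV is given below. The half-frame split and the mean-colour extraction are assumptions for illustration only; the patent text does not prescribe how the per-device settings are computed.

```python
import numpy as np

def per_device_settings(frame):
    """Derive a chromaticity setting per light device from frame regions.

    frame: (H, W, 3) RGB array in [0, 1]. The left half of the frame
    drives the light left of the TV and the right half the light right
    of the TV (an assumed, illustrative mapping).
    """
    _, w, _ = frame.shape
    return {
        "left_of_tv": frame[:, : w // 2].mean(axis=(0, 1)),
        "right_of_tv": frame[:, w // 2 :].mean(axis=(0, 1)),
    }

# Example: left half red, right half blue
frame = np.zeros((4, 4, 3))
frame[:, :2, 0] = 1.0
frame[:, 2:, 2] = 1.0
s = per_device_settings(frame)
```

Each entry could then carry a tag in the sequence indicating which device the setting should be applied to, as the text suggests.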
Said at least one processor may be configured to store said sequence in a light script, each of said settings being associated in said light script with a time in said video content. This allows the system to determine light effects to be rendered by another system.
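A light script as described can be sketched as a list of settings, each associated with a time in the video content. The record layout below is an assumption; the patent does not define a script format.

```python
def build_light_script(settings):
    """Turn (time, chromaticity, brightness) tuples into script entries.

    settings: iterable of (time_seconds, (x, y) chromaticity, brightness)
    tuples. The field names are illustrative, not a standardized format.
    """
    return [
        {"time": t, "chromaticity": list(xy), "brightness": bri}
        for t, xy, bri in settings
    ]

# Example: two settings at 0.0 s and 2.5 s into the video content
script = build_light_script([
    (0.0, (0.31, 0.33), 120),
    (2.5, (0.50, 0.41), 200),
])
```

The resulting structure could be serialized (e.g. to JSON) so that another system can render the effects in sync with the content later.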
In a second aspect of the invention, a method of determining a sequence of chromaticity and/or brightness settings comprises determining a quantity of pixels which have moved in a second frame of video content compared to a first frame of said video content, determining if said quantity of pixels exceeds a certain threshold, determining one or more properties of a dominant movement between said first frame and said second frame, determining a dynamicity for chromaticity and/or brightness settings based on said one or more properties in dependence on said quantity exceeding said threshold, determining a sequence of chromaticity and/or brightness settings based on at least part of said video content, said sequence having said dynamicity wherein said dynamicity for said chromaticity and/or brightness settings comprises a transition speed in said chromaticity and/or brightness settings, and determining a first transition speed for said chromaticity settings and a second transition speed for said brightness settings. The first transition speed is different from the second transition speed. Said method may be performed by software running on a programmable device. This software may be provided as a computer program product.
Moreover, a computer program for carrying out the methods described herein, as well as a non-transitory computer readable storage-medium storing the computer program are provided. A computer program may, for example, be downloaded by or uploaded to an existing device or be stored upon manufacturing of these systems.
A non-transitory computer-readable storage medium stores a software code portion, the software code portion, when executed or processed by a computer, being configured to perform executable operations comprising: determining a quantity of pixels which have moved in a second frame of video content compared to a first frame of said video content, determining if said quantity of pixels exceeds a certain threshold, determining one or more properties of a dominant movement between said first frame and said second frame, determining a dynamicity for chromaticity and/or brightness settings based on said one or more properties in dependence on said quantity exceeding said threshold, and determining a sequence of chromaticity and/or brightness settings based on at least part of said video content, said sequence having said dynamicity.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a device, a method or a computer program product.
Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit", "module" or "system." Functions described in this disclosure may be implemented as an algorithm executed by a processor/microprocessor of a computer. Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied, e.g., stored, thereon.
Any combination of one or more computer readable medium(s) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a computer readable storage medium may include, but are not limited to, the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of the present invention, a computer readable storage medium may be any tangible medium that can contain, or store, a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber, cable, RF, etc., or any suitable combination of the foregoing. Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java(TM), Smalltalk, C++ or the like, conventional procedural programming languages, such as the "C" programming language or similar programming languages, and functional programming languages such as Scala, Haskell or the like. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
Aspects of the present invention are described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the present invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor, in particular a microprocessor or a central processing unit (CPU), of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer, other programmable data processing apparatus, or other devices create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of devices, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
BRIEF DESCRIPTION OF THE DRAWINGS
These and other aspects of the invention are apparent from and will be further elucidated, by way of example, with reference to the drawings, in which:
Fig. 1 is a block diagram of an embodiment of the system;
Fig. 2 is a flow diagram of an embodiment of the method;
Fig. 3 depicts a part of video content in which a camera zooms in;
Fig. 4 depicts a part of video content in which an object moves;
Fig. 5 depicts a part of video content in which a camera pans to the left;
Fig. 6 depicts a part of video content in which a camera follows a car; and
Fig. 7 is a block diagram of an exemplary data processing system for performing the method of the invention.

Corresponding elements in the drawings are denoted by the same reference numeral.
DETAILED DESCRIPTION OF THE EMBODIMENTS
Fig. 1 shows an embodiment of the system of the invention: mobile device 1. Mobile device 1 is connected to a wireless LAN access point 17. A bridge 11 is also connected to the wireless LAN access point 17, e.g. via Ethernet. Light devices 13, 14 and 15 communicate wirelessly with the bridge 11, e.g. using the Zigbee protocol, and can be controlled via the bridge 11, e.g. by the mobile device 1. The bridge 11 may be a Philips Hue bridge and the light devices 13-15 may be Philips Hue lights, for example. In an alternative embodiment, light devices are controlled without a bridge. The wireless LAN access point 17 is connected to the Internet 18. An Internet server 19 is also connected to the Internet 18. The mobile device 1 may be a mobile phone or a tablet, for example. The mobile device 1 may run the Philips Hue Sync app, for example. The mobile device 1 and the light devices 13, 14 and 15 together form a lighting system.
The mobile device 1 comprises a processor 5, a transceiver 3, a memory 7, and a display 9. The processor 5 is configured to determine a quantity of pixels which have moved in a second frame of video content compared to a first frame of the video content and determine if the quantity of pixels exceeds a certain threshold. The processor 5 is further configured to determine one or more properties of a dominant movement between the first frame and the second frame, determine a dynamicity for chromaticity and/or brightness settings based on the one or more properties in dependence on the quantity exceeding the threshold, and determine a sequence of chromaticity and/or brightness settings based on at least part of the video content, the sequence having the dynamicity.
Thus, the dynamicity setting (e.g. chromaticity transition speed) is adjusted in dependence on the quantity of movement in a scene, and a different adaptation is used depending on the type of movement, e.g. fast camera panning versus fast linear camera movement.
The proposed approach can be applied for real-time light effects generation as well as for script creation. In the former case, the processor 5 is configured to control the light devices 13-15 based on the sequence of chromaticity and/or brightness settings while receiving the video content, while determining chromaticity and/or brightness settings of the sequence based on the video content, and while the video content is being rendered. In the latter case, the processor 5 is configured to store the sequence in a light script, each of the settings being associated in the light script with a time in the video content. The processor 5 may be configured to execute the light script at a later time. In this case, the processor 5 is configured to control the light devices 13-15 based on the sequence of chromaticity and/or brightness settings while the video content is being rendered. In the embodiment of the mobile device 1 shown in Fig. 1, the mobile device 1 comprises one processor 5. In an alternative embodiment, the mobile device 1 comprises multiple processors. The processor 5 of the mobile device 1 may be a general-purpose processor, e.g. from Qualcomm or ARM-based, or an application-specific processor. The processor 5 of the mobile device 1 may run an Android or iOS operating system for example. The memory 7 may comprise one or more memory units. The memory 7 may comprise solid-state memory, for example. The memory 7 may be used to store an operating system, applications and application data, for example.
The transceiver 3 may use one or more wireless communication technologies such as Wi-Fi (IEEE 802.11) to communicate with the wireless LAN access point 17, for example. In an alternative embodiment, multiple transceivers are used instead of a single transceiver. In the embodiment shown in Fig. 1, a receiver and a transmitter have been combined into a transceiver 3. In an alternative embodiment, one or more separate receiver components and one or more separate transmitter components are used. The display 9 may comprise an LCD or OLED panel, for example. The display 9 may be a touch screen. The mobile device 1 may comprise other components typical for a mobile device such as a battery and a power connector. The invention may be implemented using a computer program running on one or more processors.
In the embodiment of Fig. 1, the system of the invention is a mobile device. In an alternative embodiment, the system of the invention is a different device, e.g. a PC.
A first embodiment of the method of the invention is shown in Fig. 2. A step 101 comprises determining a quantity of pixels which have moved in a frame of video content compared to a previous frame of the video content (also referred to as “second frame” and “first frame”, respectively). The quantity of pixels which have moved in the second frame of video content compared to the first frame of said video content may be determined by taking the pixel difference between the two frames, for example. This pixel difference could be in either lightness or chromaticity or both. However, the determination of the quantity of pixels which have moved in the second frame of video content compared to the first frame of said video content does not require that a movement is determined per pixel in the video frame (e.g. by determining the pixel difference).
It is sometimes not possible to determine that a pixel of an object has moved, because it does not have the exact same brightness and/or chromaticity in the next frame. While the properties of a pixel might change, it is still possible to calculate that the pixel has moved, e.g. because it remains part of the same object and this object has moved. An object may be detected using edge detection, for example. If the object is detected to have moved, the quantity of pixels that form the object, i.e. are located inside the edges of the object, may be considered to be a quantity of pixels that have moved.
In the embodiment of Fig. 2, the quantity of pixels is determined by computing the optical flow field with an algorithm appropriate for the application. For example, a fast algorithm may be used for real-time light effect generation and a high accuracy algorithm may be used for script creation.
The concept of optical flow, as disclosed by David J. Fleet & Yair Weiss in the chapter titled "Optical Flow Estimation" of the “Handbook of Mathematical Models in Computer Vision”, Paragios et al., Springer, ISBN 0-387-26371-3, is used as a step in many video processing algorithms from compression to motion estimation. As such, there is a large body of research describing algorithms with varying computational complexity and accuracy. The output of the optical flow calculation is the estimated movement of all the pixels or key pixels from one frame to another. Optical flow has also been used in classifications of type of movement. Examples of this are shown in Figs. 3 and 4. Fig. 3 depicts a part of video content in which a camera zooms and Fig. 4 depicts a part of video content in which an object moves.
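Once an optical flow field has been computed (by whichever estimator suits the application, as discussed above), the dominant movement can be extracted from it. The sketch below bins flow vectors into eight compass directions and takes the most populated bin; the bin count and the minimum-speed cutoff are illustrative assumptions, not part of the patent's text.

```python
import numpy as np

def dominant_movement(flow, min_speed=0.5):
    """Estimate the dominant movement from a per-pixel flow field.

    flow: (H, W, 2) array of (dx, dy) displacements per pixel. Returns
    None when no pixel moves faster than min_speed, otherwise the
    direction and mean speed shared by the most moving pixels.
    """
    dx, dy = flow[..., 0].ravel(), flow[..., 1].ravel()
    speed = np.hypot(dx, dy)
    moving = speed > min_speed
    if not moving.any():
        return None
    angle = np.arctan2(dy[moving], dx[moving])
    bins = np.round(angle / (np.pi / 4)).astype(int) % 8   # 8 compass bins
    dom_bin = np.bincount(bins, minlength=8).argmax()
    mask = bins == dom_bin
    return {"direction_rad": dom_bin * np.pi / 4,
            "speed": float(speed[moving][mask].mean()),
            "pixel_count": int(mask.sum())}

# Example: every pixel moves two pixels to the right (e.g. a camera pan)
flow = np.zeros((4, 4, 2))
flow[..., 0] = 2.0
res = dominant_movement(flow)
```

Note that this direction-binning heuristic handles pans and object motion; a zoom, where vectors point radially outward, would need a different classification as the text's Fig. 3 example suggests.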
A step 103 comprises determining if the quantity of pixels exceeds a certain threshold, e.g. 10% of all pixels. If the certain threshold is exceeded, step 105 is performed. If not, step 113 is performed. In the embodiment of Fig. 2, step 113 comprises determining a sequence of chromaticity and/or brightness settings based on at least part of the video content using a slow dynamicity.
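The branch of step 103 can be expressed as a one-line decision. The function name, return labels and the 10% default are illustrative, not taken from the patent.

```python
def choose_path(moved_fraction, threshold=0.10):
    """Decision of step 103: if the moved-pixel fraction exceeds the
    threshold (e.g. 10% of all pixels), continue with movement analysis
    (step 105); otherwise fall back to a slow dynamicity (step 113)."""
    return "analyse_movement" if moved_fraction > threshold else "slow_dynamicity"
```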
A step 105 comprises determining one or more properties of a dominant movement between the first frame and the second frame. The one or more properties may comprise a speed and/or a direction and/or a type of the dominant movement, for example. The movement type may be a camera pan or a camera zoom or a camera tilt or a camera forward movement or an object movement, for example.
A movement direction and/or speed that the largest amount of moving pixels have in common may be considered to be the dominant movement, for example. This does not require that all pixels move in the same direction. For example, in case of a camera zooming in or out, pixels may be considered to be part of the same movement, without there being one dominant movement per pixel. If the moving pixels are part of an object that has moved, then the pixels will normally have the same movement direction and speed.
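One way to find the direction and speed that the largest amount of moving pixels have in common is to quantise the motion vectors into direction sectors and take the modal sector. This is a minimal sketch of that idea, assuming motion vectors like those produced by an optical-flow step; the 8-sector quantisation is an arbitrary choice.

```python
from collections import Counter
import math

def dominant_movement(vectors, bins=8):
    """Find the movement shared by the most vectors.

    `vectors` is a list of (dy, dx) motion vectors. Directions are
    quantised into `bins` angular sectors; the sector containing the
    most moving vectors is taken as the dominant movement, and its
    members' mean magnitude as the dominant speed."""
    moving = [(dy, dx) for dy, dx in vectors if (dy, dx) != (0, 0)]
    if not moving:
        return None

    def sector(dy, dx):
        angle = math.atan2(dy, dx) % (2 * math.pi)
        return int(angle / (2 * math.pi / bins)) % bins

    counts = Counter(sector(dy, dx) for dy, dx in moving)
    best_sector, _ = counts.most_common(1)[0]
    members = [(dy, dx) for dy, dx in moving if sector(dy, dx) == best_sector]
    speed = sum(math.hypot(dy, dx) for dy, dx in members) / len(members)
    return {"sector": best_sector, "speed": speed, "support": len(members)}
```

As the text notes, this simple per-vector view does not cover a camera zoom, where pixels radiate outward in all directions; detecting that pattern needs a classification of the flow field as a whole.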
A step 107 comprises determining a dynamicity for chromaticity and/or brightness settings based on the one or more properties in dependence on the quantity exceeding the threshold. In the embodiment of Fig. 2, the dynamicity for the chromaticity and/or brightness settings comprises a transition speed in the chromaticity and/or brightness settings and step 107 comprises determining a first transition speed for the chromaticity settings and a second transition speed for the brightness settings.
Perception tests have shown that the brightness transitions are preferably much slower than the chromaticity transitions. For example, if the dominant movement is a camera pan, the second transition speed (in brightness settings) may be a certain normal speed and the first transition speed (in chromaticity settings) may be a speed which is at least a certain amount, e.g. twice, higher than the certain normal speed.
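The mapping from movement type to the two transition speeds can be sketched as a small lookup. The factor 2 for a camera pan follows the "at least twice as high" example above; the factors for the other movement types are made-up placeholders, not values from the patent.

```python
def transition_speeds(movement_type, base_brightness_speed=1.0):
    """Illustrative step 107: derive a brightness transition speed and a
    faster chromaticity transition speed from the dominant movement type.

    Brightness keeps the 'certain normal speed'; chromaticity is scaled
    up, in line with perception tests preferring slower brightness
    transitions."""
    chroma_factor = {"pan": 2.0, "zoom": 1.5, "object": 1.2}.get(movement_type, 1.0)
    return {
        "brightness": base_brightness_speed,
        "chromaticity": base_brightness_speed * chroma_factor,
    }
```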
If the light effects are created in real time, i.e. while the video content is being received and rendered, the transition speed may further be determined based on a user-configured transition speed. The user may be able to select one of a limited number of static presets with slightly slower or faster transitions, for example. For instance, if a user selects a "moderate" setting, where the transition speed of brightness is a and that of chromaticity is b, then based on the detected movement and its properties, a change (which can be positive or negative) can be introduced to modify the current transition speeds, e.g. Δa and Δb.
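Applying the movement-driven deltas to a user-selected preset can be sketched as follows. The preset names and their (a, b) values are hypothetical; only the idea of adding a signed Δa / Δb to the preset speeds comes from the text.

```python
def adjusted_speeds(preset, delta_brightness=0.0, delta_chroma=0.0):
    """Combine a user-configured preset with movement-driven adjustments.

    `delta_brightness` and `delta_chroma` play the role of the Δa / Δb
    of the text and may be positive or negative; speeds are clamped so
    they cannot go below zero."""
    presets = {
        "calm": (0.5, 1.0),       # (a: brightness speed, b: chromaticity speed)
        "moderate": (1.0, 2.0),
        "intense": (2.0, 4.0),
    }
    a, b = presets[preset]
    return max(0.0, a + delta_brightness), max(0.0, b + delta_chroma)
```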
If the light effects are being stored in a light script, it is possible to calculate more precisely the moments when movements start and stop, as well as the quantity of pixels and/or the one or more properties of the dominant movement. Furthermore, in case of scripting, the changes to the transition speed could also be presented as an option to the script creator, who may then confirm or correct the adaptations of the transition speed.
A step 109 comprises determining a sequence of chromaticity and/or brightness settings based on at least part of the video content, the sequence having the dynamicity.
Figs. 5 and 6 illustrate an implementation in which step 105 comprises determining the type of the dominant movement. The dynamicity determined in step 107 depends on this movement type and different light effect transition speeds are used for different types of movements. Figs. 5 and 6 show different scenes with significant changes between the frames, caused by different types of movement. Fig. 5 shows a camera panning movement and Fig. 6 shows a chase (forward movement). In the example of Fig. 5, chromaticity transitions can be sped up to create a feeling of high sideways speed (e.g. colors traveling fast from the left to the center and then to the right light). In the example of Fig. 6, an increased transition speed of chromaticity and brightness (on all lights) might create a stronger feeling of moving forward very fast. In an extension of this embodiment, the transition speed may be increased upon determining an increase in a speed of the dominant movement.
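The panning effect of Fig. 5, with a color traveling from the left light to the center and then the right light, can be sketched as a small schedule generator. The light names, the tuple format and the default delay are illustrative assumptions, not from the patent; a faster detected pan would use a smaller `step_ms`.

```python
def pan_light_schedule(color, lights=("left", "center", "right"), step_ms=200):
    """Sketch of the Fig. 5 panning effect: emit (time_ms, light, color)
    commands so the same color sweeps across the lights with a short
    delay between them. Shrinking `step_ms` as the pan speeds up
    strengthens the impression of fast sideways movement."""
    return [(i * step_ms, light, color) for i, light in enumerate(lights)]
```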
In the embodiment of Fig. 2, steps 101 and 105 are separate steps and step 105 is only performed if the threshold is exceeded. In an alternative embodiment, steps 101 and 105 are combined and if the threshold is exceeded, step 107 is performed instead of step 105. In a different embodiment, steps 101, 105 and 107 are combined and if the threshold is exceeded, step 109 is performed instead of step 105.
Fig. 7 depicts a block diagram illustrating an exemplary data processing system that may perform the method as described with reference to Fig. 2.
As shown in Fig. 7, the data processing system 300 may include at least one processor 302 coupled to memory elements 304 through a system bus 306. As such, the data processing system may store program code within memory elements 304. Further, the processor 302 may execute the program code accessed from the memory elements 304 via a system bus 306. In one aspect, the data processing system may be implemented as a computer that is suitable for storing and/or executing program code. It should be appreciated, however, that the data processing system 300 may be implemented in the form of any system including a processor and a memory that can perform the functions described within this specification.
The memory elements 304 may include one or more physical memory devices such as, for example, local memory 308 and one or more bulk storage devices 310. The local memory may refer to random access memory or other non-persistent memory device(s) generally used during actual execution of the program code. A bulk storage device may be implemented as a hard drive or other persistent data storage device. The processing system 300 may also include one or more cache memories (not shown) that provide temporary storage of at least some program code in order to reduce the number of times program code must be retrieved from the bulk storage device 310 during execution. The processing system 300 may also be able to use memory elements of another processing system, e.g. if the processing system 300 is part of a cloud-computing platform.
Input/output (I/O) devices depicted as an input device 312 and an output device 314 optionally can be coupled to the data processing system. Examples of input devices may include, but are not limited to, a keyboard, a pointing device such as a mouse, a microphone (e.g. for voice and/or speech recognition), or the like. Examples of output devices may include, but are not limited to, a monitor or a display, speakers, or the like. Input and/or output devices may be coupled to the data processing system either directly or through intervening I/O controllers.
In an embodiment, the input and the output devices may be implemented as a combined input/output device (illustrated in Fig. 7 with a dashed line surrounding the input device 312 and the output device 314). An example of such a combined device is a touch sensitive display, also sometimes referred to as a "touch screen display" or simply "touch screen". In such an embodiment, input to the device may be provided by a movement of a physical object, such as e.g. a stylus or a finger of a user, on or near the touch screen display.
A network adapter 316 may also be coupled to the data processing system to enable it to become coupled to other systems, computer systems, remote network devices, and/or remote storage devices through intervening private or public networks. The network adapter may comprise a data receiver for receiving data that is transmitted by said systems, devices and/or networks to the data processing system 300, and a data transmitter for transmitting data from the data processing system 300 to said systems, devices and/or networks. Modems, cable modems, and Ethernet cards are examples of different types of network adapter that may be used with the data processing system 300.
As pictured in Fig. 7, the memory elements 304 may store an application 318. In various embodiments, the application 318 may be stored in the local memory 308, the one or more bulk storage devices 310, or separate from the local memory and the bulk storage devices. It should be appreciated that the data processing system 300 may further execute an operating system (not shown in Fig. 7) that can facilitate execution of the application 318. The application 318, being implemented in the form of executable program code, can be executed by the data processing system 300, e.g., by the processor 302. Responsive to executing the application, the data processing system 300 may be configured to perform one or more operations or method steps described herein.
Various embodiments of the invention may be implemented as a program product for use with a computer system, where the program(s) of the program product define functions of the embodiments (including the methods described herein). In one embodiment, the program(s) can be contained on a variety of non-transitory computer-readable storage media, where, as used herein, the expression "non-transitory computer readable storage media" comprises all computer-readable media, with the sole exception being a transitory, propagating signal. In another embodiment, the program(s) can be contained on a variety of transitory computer-readable storage media. Illustrative computer-readable storage media include, but are not limited to: (i) non-writable storage media (e.g., read-only memory devices within a computer such as CD-ROM disks readable by a CD-ROM drive, ROM chips or any type of solid-state non-volatile semiconductor memory) on which information is permanently stored; and (ii) writable storage media (e.g., flash memory, floppy disks within a diskette drive or hard-disk drive or any type of solid-state random-access semiconductor memory) on which alterable information is stored. The computer program may be run on the processor 302 described herein.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in the claims below are intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of embodiments of the present invention has been presented for purposes of illustration, but is not intended to be exhaustive or limited to the implementations in the form disclosed. The embodiments were chosen and described in order to best explain the principles and some practical applications of the present invention, and to enable others of ordinary skill in the art to understand the present invention for various embodiments with various modifications as are suited to the particular use contemplated.

CLAIMS:
1. A system (1) for determining a sequence of chromaticity and/or brightness settings, said system comprising at least one processor (5) configured to:
- determine a quantity of pixels which have moved in a second frame of video content compared to a first frame of said video content,
- determine if said quantity of pixels exceeds a certain threshold,
- determine one or more properties of a dominant movement between said first frame and said second frame,
- determine a dynamicity for chromaticity and/or brightness settings based on said one or more properties in dependence on said quantity exceeding said threshold,
- determine a sequence of chromaticity and/or brightness settings based on at least part of said video content, said sequence having said dynamicity, wherein said dynamicity for said chromaticity and/or brightness settings comprises a transition speed in said chromaticity and/or brightness settings, and
- determine a first transition speed for said chromaticity settings and a second transition speed for said brightness settings.
2. A system (1) as claimed in claim 1, wherein the transition speed in said chromaticity and/or brightness settings increases as the quantity of pixels exceeding said threshold increases.
3. A system (1) as claimed in claim 2, wherein the first transition speed is higher than the second transition speed.
4. A system (1) as claimed in claim 1, wherein said at least one processor (5) is configured to determine said second transition speed to be a certain normal speed and said first transition speed to be a speed which is at least a certain amount higher than said certain normal speed upon determining that said dominant movement is a camera pan.
5. A system (1) as claimed in claim 1, wherein said at least one processor (5) is configured to determine said transition speed further based on a user-configured transition speed.
6. A system (1) as claimed in claim 1, wherein said at least one processor (5) is configured to increase said transition speed upon determining an increase in a speed of said dominant movement.
7. A system (1) as claimed in claim 1 or 2, wherein said at least one processor (5) is configured to control one or more light devices based on said sequence of chromaticity and/or brightness settings while said video content is being rendered.
8. A system (1) as claimed in claim 7, wherein said at least one processor (5) is configured to control said one or more light devices while receiving said video content and while determining chromaticity and/or brightness settings of said sequence based on said video content.
9. A system (1) as claimed in claim 1 or 2, wherein said at least one processor (5) is configured to store said sequence in a light script, each of said settings being associated in said light script with a time in said video content.
10. A system (1) as claimed in claim 1 or 2, wherein said one or more properties comprise a speed and/or a direction and/or a type of said dominant movement.
11. A system (1) as claimed in claim 10, wherein said movement type is a camera pan or a camera zoom or a camera tilt or a camera forward movement or an object movement.
12. A system (1) as claimed in claim 1 or 2, wherein said at least one processor (5) is configured to determine said quantity by determining a quantity of all pixels which have moved in said second frame of video content compared to said first frame of said video content.
13. A lighting system, comprising a system (1) according to any of the preceding claims and at least one light source (13, 14, 15), wherein the at least one light source is controlled using the sequence of chromaticity and/or brightness settings.
14. A method of determining a sequence of chromaticity and/or brightness settings, said method comprising:
- determining (101) a quantity of pixels which have moved in a second frame of video content compared to a first frame of said video content;
- determining (103) if said quantity of pixels exceeds a certain threshold;
- determining (105) one or more properties of a dominant movement between said first frame and said second frame;
- determining (107) a dynamicity for chromaticity and/or brightness settings based on said one or more properties in dependence on said quantity exceeding said threshold;
- determining (109) a sequence of chromaticity and/or brightness settings based on at least part of said video content, said sequence having said dynamicity, wherein said dynamicity for said chromaticity and/or brightness settings comprises a transition speed in said chromaticity and/or brightness settings; and
- determining a first transition speed for said chromaticity settings and a second transition speed for said brightness settings.
15. A computer program or suite of computer programs comprising at least one software code portion or a computer program product storing at least one software code portion, the software code portion, when run on a computer system, causing the computer system to execute the steps of the method of claim 14.
PCT/EP2019/076422, filed 2019-09-30, priority date 2018-10-09: Determining dynamicity for light effects based on movement in video content (WO2020074303A1).

Applications Claiming Priority (1)

EP18199233.0, filed 2018-10-09.

Publications (1)

WO2020074303A1, published 2020-04-16.

Family

ID=63921494


Cited By (1)

CN113259745A (Beijing Baidu Netcom Science and Technology Co., Ltd.), published 2021-08-13: Video playing page processing method and device, electronic equipment and storage medium.

Citations

Patent Citations (2)

US20100026734A1 (Koninklijke Philips Electronics N.V.), 2010-02-04: System, method and computer-readable medium for displaying light radiation.
US20100165000A1 (Koninklijke Philips Electronics N.V.), 2010-07-01: Visualizing objects of a video signal.

Non-Patent Citations (1)

David J. Fleet & Yair Weiss, "Optical Flow Estimation", in: Handbook of Mathematical Models in Computer Vision, Paragios et al. (eds.), Springer, ISBN 0-387-26371-3.


Legal Events

NENP: Non-entry into the national phase (ref country code: DE)
122: PCT application non-entry in European phase (ref document number: 19779005; country: EP; kind code: A1)