CN218920468U - Background back lighting device and movable green screen - Google Patents

Background back lighting device and movable green screen

Info

Publication number
CN218920468U
Authority
CN
China
Prior art keywords
light
light emitting devices
background
green screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202223215154.1U
Other languages
Chinese (zh)
Inventor
张力
黄长运
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sub2r Co
Yuyao Lishuai Film & Television Equipment Co ltd
Original Assignee
Sub2r Co
Yuyao Lishuai Film & Television Equipment Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sub2r Co, Yuyao Lishuai Film & Television Equipment Co ltd filed Critical Sub2r Co
Priority to CN202223215154.1U priority Critical patent/CN218920468U/en
Application granted granted Critical
Publication of CN218920468U publication Critical patent/CN218920468U/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Circuit Arrangement For Electric Light Sources In General (AREA)

Abstract

The utility model provides a background back lighting device and a movable green screen, wherein the movable green screen comprises a green screen, a diffusing material, and the background back lighting device. The background back lighting device includes a light assembly and a light controller. The light assembly includes one or more light emitting devices, the light emitted by each first passing through the diffusing material and then through the green screen. The light controller comprises at least one light driver for controlling at least one operating parameter of the one or more light emitting devices, and at least one user interface or a light remote control module, each for driving the at least one light driver. The utility model overcomes defects of chroma key compositing in the prior art, such as shadow areas, dark spots, hot spots, and reflections in the generated monochromatic background.

Description

Background back lighting device and movable green screen
Technical Field
The utility model relates to the technical field of chroma key compositing, and in particular to a background back lighting device and a movable green screen.
Background
Chroma key compositing is a technique that involves placing a subject in front of a monochromatic background, capturing an image or video of the subject, and removing or replacing the captured monochromatic background so that the subject can be combined with other still or video images. The composite image or video shows the subject in an environment (e.g., a location) different from its actual environment in front of the monochromatic background.
There are various techniques currently available to implement a monochromatic background. For example, a background wall or cloth having a selected green color (commonly referred to as a green screen) is placed in the background and illuminated by lights positioned in front of and to the sides of it to create a monochromatic green background. The background and three-dimensional objects may be painted green, which allows the subject to position himself relative to imagery added later. This is referred to as a passive system, because the purpose of the background is merely to reflect green light whose intensity is controlled by the external light sources. Alternatively, a monochrome screen is used to generate a monochromatic green background. It should be noted that chroma key compositing is not limited to a monochromatic green background: any other suitable monochromatic background, such as a blue background, may be used instead.
As is well known, defects in chroma key compositing include shadow areas, dark spots, hot spots, reflections in the resulting monochromatic background, and the like. Shadow areas may be caused by various factors, including shadows cast by the subject, the distance of the subject from the background, shadows cast by external light sources, shadows cast by the environment in which the image or video is captured, and so on. Uneven lighting may create dark spots where too little light reaches the background, and hot spots where light is concentrated on it. Reflections may occur when the surface of the background produces concentrated reflected light. Reflected green light may also partially illuminate the subject, interfering with the chroma keying process. Furthermore, if a monochromatic green screen is used, imperfections in the green color produced by the screen may also cause non-uniform illumination and color characteristics. In addition, if a wall or cloth is used with an associated bright lighting system, imperfections in the wall (or cloth) and/or in the associated lighting system may also produce shadow areas.
To avoid or compensate for unwanted shadow areas in the generated monochromatic background, various solutions have been implemented. However, these prior art solutions require a lighting technician to identify shadow areas and correct them manually by repositioning or recalibrating the lights in front of the green screen.
Thus, there is a need for a new background backlighting device and for a movable green screen incorporating such a device.
Disclosure of Invention
First, the technical problem to be solved
The technical problem to be solved by the utility model is to provide a background back lighting device and a movable green screen that overcome the above problems of the prior art.
(II) technical scheme
The solution adopted by the utility model to solve the technical problem is a background back lighting device, comprising:
a light assembly comprising one or more light emitting devices, the light emitted by each first passing through a diffusing material and then through a green screen;
a light controller comprising at least one light driver for controlling at least one operating parameter of the one or more light emitting devices; and at least one user interface or a light remote control module, each for driving the at least one light driver.
Further, wherein each light emitting device emits one of a single color light, a white light, or a broad spectrum light having red, green, and blue components.
Further, wherein the at least one operating parameter of the one or more light emitting devices comprises at least one of an intensity of light emitted by each light emitting device and a color of light emitted by each light emitting device.
Further, wherein the at least one user interface comprises at least one of the following for allowing a user to adjust at least one operating parameter of the one or more light emitting devices: knob, slider and digital input.
Further, wherein the light remote control module includes a communication interface for receiving light control commands from a remote computing device; the light remote control module further comprises a processing unit for processing the light control commands to generate control signals for driving the at least one light driver to control the at least one operating parameter of the one or more light emitting devices.
Further, wherein the light assembly comprises a plurality of light emitting devices, and each light control command comprises light control data for controlling the at least one operating parameter of one of: all of the plurality of light emitting devices, a selected group of the plurality of light emitting devices, or a single light emitting device of the plurality of light emitting devices.
The solution adopted by the utility model to solve the technical problem is a movable green screen comprising a green screen, a diffusing material, and the background back lighting device described above, wherein the background back lighting device comprises a light assembly comprising one or more light emitting devices, the light emitted by each light emitting device first passing through the diffusing material and then through the green screen; a light controller comprising at least one light driver for controlling at least one operating parameter of the one or more light emitting devices; and at least one user interface or a light remote control module, each for driving the at least one light driver.
Further, wherein the green screen is made of fabric.
Further, wherein the diffusing material comprises one or more layers of material that is partially translucent and diffuses light.
Further, wherein the one or more light emitting devices comprise at least one of: incandescent lamps or devices using light emitting diodes.
Further, wherein each light emitting device emits one of: monochromatic light, white light or broad spectrum light with red, green and blue components.
Further, wherein the at least one operating parameter of the one or more light emitting devices comprises at least one of: the intensity of the light emitted by each light emitting device and the color of the light emitted by each light emitting device.
Further, wherein the at least one user interface comprises at least one of the following for allowing a user to adjust at least one operating parameter of the one or more light emitting devices: knob, slider and digital input.
Further, wherein the light remote control module includes a communication interface for receiving light control commands from a remote computing device; the light remote control module further comprises a processing unit for processing the light control commands to generate control signals for driving the at least one light driver to control the at least one operating parameter of the one or more light emitting devices.
Further, wherein the light assembly comprises a plurality of light emitting devices, and each light control command comprises light control data for controlling the at least one operating parameter of one of: all of the plurality of light emitting devices, a selected group of the plurality of light emitting devices, or a single light emitting device of the plurality of light emitting devices.
(III) beneficial effects
Compared with the prior art, the background back lighting device and movable green screen of the utility model overcome defects of chroma key compositing in the prior art, such as shadow areas, dark spots, hot spots, and reflections in the generated monochromatic background.
Drawings
FIG. 1 is a schematic view of a conventional front-lit green screen (illustrating known defects);
FIG. 2A is a schematic view of an active green screen incorporating a background backlight;
FIG. 2B is a schematic diagram depicting illumination characteristics between various components of the active green screen of FIG. 2A;
FIG. 2C is a schematic illustration of another representation of the active green screen of FIG. 2A;
FIGS. 2D and 2E are schematic views of the movable green screen of FIG. 2C in vertical and horizontal orientations, respectively;
FIGS. 3A and 3B are schematic illustrations of components for the background backlight of FIG. 2A and a diffusing material;
FIG. 4A is a schematic view of the light assembly of FIGS. 3A and 3B including a plurality of light emitting devices;
FIG. 4B is a schematic illustration of various light patterns of a light emitting device supported by the light assembly of FIGS. 3A and 3B;
FIGS. 5A, 5B and 5C are schematic component views of the background backlight assembly of FIGS. 2A and 3A;
FIG. 6 is a schematic diagram of components of the computing device of FIG. 2A;
FIG. 7 is a schematic diagram of components of the camera of FIG. 2A;
FIGS. 8A, 8B, 8C, and 8D are remote control Graphical User Interfaces (GUIs) displayed by the computing device of FIGS. 2A and 6;
FIG. 9 illustrates a method implemented by the computing device of FIGS. 2A and 6 for performing user-based dynamic background backlighting;
FIGS. 10A, 10B, and 10C illustrate different implementations of a method for performing algorithm-based dynamic background backlighting, implemented by elements of the system shown in FIG. 2A;
FIG. 11 illustrates a set of independently controlled light emitting devices supported by the background backlight of FIGS. 2A and 4A and corresponding areas in an image captured by the camera of FIG. 2A;
FIGS. 12A and 12B illustrate different embodiments of methods for performing algorithm-based remote control of the front light and camera represented in FIG. 2C;
FIG. 13 illustrates a neural network for implementing steps of the methods shown in FIGS. 10A, 10B, and 10C.
Reference numerals illustrate: 10. a subject; 20. front-lit green screen; 100. a movable green screen; 110. a green screen; 120. a diffusion material; 130. a reflective backing material; 200. a background back lighting device; 210. a light assembly; 211. a light emitting device; 220. a light controller; 221. an optical driver; 222. a user interface; 223. an optical remote control module; 223A, communication interface; 223B, a processing unit; 223C, memory; 230. a power module; 300. a camera; 310. a processing unit; 320. a memory; 330. a communication interface; 340. an image capturing module; 400. a computing device; 410. a processing unit; 420. a memory; 430. a communication interface; 440. a display; 450. a user interface; 50. a front light source.
Detailed Description
The following describes in further detail the embodiments of the present utility model with reference to the drawings and examples. The following examples are illustrative of the utility model but are not intended to limit the scope of the utility model.
In the description of the present utility model, it should be noted that, unless explicitly specified and limited otherwise, the terms "mounted," "connected," and "coupled" are to be construed broadly; for example, a connection may be fixed, detachable, or integral; it may be mechanical or electrical; and it may be direct, indirect through an intermediate medium, or an internal communication between two elements. The specific meaning of the above terms in the present utility model will be understood in specific cases by those of ordinary skill in the art.
Aspects of the present disclosure generally address one or more problems associated with chroma key compositing, avoiding well-known drawbacks such as shadow areas, dark spots, hot spots, reflections, green color reflected onto objects in the background of an image or video captured by a camera, and color non-uniformity due to material characteristics or illumination non-uniformity. The background backlighting apparatus and active green screen avoid these well-known drawbacks by dynamically backlighting the green screen. In the remainder of this disclosure, the term "image" includes still images (e.g., photographs) and images belonging to a video.
Fig. 1 illustrates a conventional front-lit green screen 20 illuminated by one or more light sources 50 (e.g., two in fig. 1) positioned in front of the front-lit green screen 20, as is well known in the art. The subject 10 is placed in front of a conventional front-lit green screen 20, and the camera 300 images the subject 10. Some of the above-mentioned well-known drawbacks are illustrated in fig. 1.
It should be noted that although the following embodiments only demonstrate the application of the green screen 110 (i.e., the output of the green screen 110 is a uniformly colored background for the subject 10), screens of other colors are also within the scope of the utility model.
Examples:
regarding the active green screen:
Reference is now made to FIGS. 2A-E, 3A-B, 4A-B, and 5A-5C. FIGS. 2A, 2B, and 2C represent an environment for performing chroma key compositing with an active green screen 100 comprising three components. FIGS. 2A and 2B provide schematic diagrams of the components of this environment, while FIG. 2C provides a more realistic representation of the components, illustrating the differences from the prior art environment of FIG. 1.
The active green screen includes a green screen, a diffusing material, and a background backlight.
The diffusing material 120 is positioned behind the green screen 110. The background backlight 200 is located behind the diffusion material 120. Alternatively, the background backlighting 200 is at least partially integrated into the diffusion material 120.
As shown in FIG. 2B, the background backlighting apparatus 200 emits background illumination. The backlight used to implement the present dynamic background backlighting may be monochromatic (e.g., monochromatic green, monochromatic blue, etc.) or broad spectrum (e.g., white, or a combination of red, green, and blue color components). Thus, although the terms active green screen 100 and green screen 110 are used in keeping with terms commonly used in the art, the resulting background illumination need not be green.
The background light first passes through the diffusing material 120 and then through the green screen 110. The output of the green screen 110 is a uniformly colored background for the subject 10. The camera 300 captures an image combining the uniformly colored background and the subject 10.
The green screen 110 is typically made of fabric (e.g., cloth) having a rigid structure for supporting the fabric. For example, the fabric is embedded in a rectangular frame made of a rigid material (e.g., wood, plastic, metal, etc.).
The diffusion material 120 is used to diffuse (scatter) the light generated by the background backlighting apparatus 200 to produce uniform illumination of the front of the green screen 110. The diffusing material 120 includes one or more layers (e.g., 1, 2, or 3 layers) of a material having the following characteristics: the material is partially translucent and is capable of diffusing light. In a first configuration, each layer of diffusing material 120 has the same color (e.g., white, a color that matches a desired monochromatic screen color, etc.). In another configuration, each layer of diffusing material 120 has a different color (a combination of white and a color that matches the desired monochromatic screen color). The layers of the diffusing material 120 may be made of the same material or may be made of different complementary materials. The layers of the diffusion material 120 may have the same diffusion characteristics or complementary diffusion characteristics.
In a first embodiment, the layers of diffusing material 120 are applied near the light sources of the background backlighting apparatus 200. In a second embodiment, the layers of diffusing material 120 are applied near the green screen 110. In a third embodiment, some of the layers of diffusing material 120 are applied near the light sources of the background backlighting apparatus 200 and some are applied near the green screen 110 (as shown in FIG. 3B). In another alternative or additional embodiment, the background backlighting apparatus 200 includes other means for performing diffusion, such as lenses, diffraction gratings, frosted optics, Fresnel lenses, and the like.
Optionally, as shown in FIG. 2C, the active green screen 100 further includes a reflective backing material 130 positioned behind the background backlighting apparatus 200. The reflective backing material 130 includes a reflective surface capable of scattering the light generated by the background backlighting apparatus 200, improving the diffusion of light through the diffusing material 120.
Although shown as separate components in fig. 2C, the green screen 110, the diffusing material 120, the background backlighting apparatus 200, and the optional reflective backing material 130 may be assembled together to form the active green screen 100. For example, the active green screen 100 may take the form of a substantially cube assembly, with the green screen 110 and reflective backing material 130 forming two opposite sides of the cube. The other four sides of the cube (not shown in fig. 2C) extend between the green screen 110 and the reflective backing material 130. The four other sides of the cube may be made of or include (at least partially) internally reflective surfaces that are also capable of scattering light generated by the background backlighting apparatus 200 and improving the diffusion of light through the diffusing material 120.
The orientation of the active green screen 100 (or at least of the green screen 110) may be vertical, horizontal, or at an angle relative to the horizontal or vertical. The orientation is chosen to provide the most suitable background for the subject 10 positioned in front of the green screen 110. FIG. 2D is a simplified representation of the active green screen of FIG. 2C, illustrating a vertical orientation of the green screen 110 or active green screen 100. FIG. 2E is a simplified representation of the active green screen of FIG. 2C, illustrating a horizontal orientation of the green screen 110 or active green screen 100.
Examples of support structures for supporting the active green screen 100 include tripods, standing brackets, stand-alone brackets (e.g., adjustable feet), suspension brackets (e.g., one or more hooks), mechanical attachments, and the like.
With respect to the background backlight:
background the backlighting apparatus 200 may consider several embodiments. An exemplary embodiment is shown in fig. 3A, a background backlighting apparatus 200 includes a light assembly 210 and a light controller 220. As previously mentioned, the position of the layer of diffusing material 120 relative to the light assembly 210 may vary. Fig. 3B shows an exemplary structure in which the diffusion material 120 includes three light source diffusion layers (adjacent to the light source 211 of the lamp assembly 210) and one screen diffusion layer (adjacent to the green screen 110).
The light assembly 210 includes one or more light emitting devices (also referred to as light sources 211 shown in fig. 3B) and a structure (not shown) supporting the one or more light emitting devices. The design of the light assembly 210 may vary and depends on implementation. The support structure may be made of metal, wood, plastic, etc. as known in the art. Furthermore, the geometry of the support structure may also vary. The physical integration (e.g., connection) of the light assembly 210 with other components of the active green screen 100 may be accomplished in any manner suitable for use within the industry.
The light emitting devices may be realized by any means capable of emitting photons in the visible and near-visible spectrum. For example, a light emitting device is an incandescent lamp. Alternatively, a light emitting device uses a Light Emitting Diode (LED). As previously described, a light emitting device emits monochromatic light (e.g., green or blue) or broad spectrum light (e.g., white or RGB light components). A combination of various technologies (e.g., incandescent lamps and LEDs) may be used to implement the light emitting devices of the light assembly 210.
The distribution of the plurality of light emitting devices over the support structure of the light assembly 210 may vary. For example, a plurality of N×M light emitting devices is supported by the support structure, organized into N rows and M columns. FIG. 4A shows an exemplary light assembly 210 comprising 4 rows and 6 columns of light emitting devices 211, for a total of 24 light emitting devices 211. In an exemplary embodiment, each light emitting device 211 uses an LED.
However, the light emitting devices 211 may be arranged according to various patterns, thereby generating corresponding light emitting patterns. FIG. 4B illustrates several examples of light emitting patterns, including a single light emitting device, light emitting devices arranged in rows, light emitting devices arranged in columns, light emitting devices arranged in circles, and light emitting devices arranged randomly. Those skilled in the art will readily understand that other patterns may be used.
The combination of light generated by the one or more light emitting devices 211 results in the background illumination represented in fig. 2B. In the case of the plurality of light emitting devices 211, each of the plurality of light emitting devices 211 may have the same light characteristic. Alternatively, the plurality of light emitting devices 211 may include at least two groups of light emitting devices 211, each group of light emitting devices 211 having its own light characteristics (as shown in fig. 11, having four independent light emitting devices 211).
The light controller 220 controls the operating parameters of the one or more light emitting devices 211 supported by the light assembly 210. The controlled operating parameters include at least one of: the intensity (also referred to as brightness) of the emitted light, the color of the emitted light, and combinations thereof. In the case of white light, the color is defined on the kelvin scale, which is based on the color of light emitted by a heated black-body radiator, ranging from warm yellow to bright blue. In the case of non-white light, the color is defined by a combination of color components, typically RGB components.
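For illustration only, the following minimal Python sketch (not part of the utility model; the class name, field names, and ranges are assumptions) gathers these controlled operating parameters into a single structure:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

# Hypothetical container for the controlled operating parameters described
# above: white light is characterized by a color temperature in kelvin,
# non-white light by its RGB color components.
@dataclass
class LightParameters:
    intensity_percent: float                     # brightness, 0..100 %
    kelvin: Optional[int] = None                 # e.g. 2700 (warm) to 6500 (cool) for white light
    rgb: Optional[Tuple[int, int, int]] = None   # 0..255 per component for non-white light
```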
For non-digital light sources, the intensity of the light emitting device 211 may be controlled by a rheostat. For digital light sources such as LEDs, the intensity of the light emitting device 211 is typically controlled by Pulse Width Modulation (PWM): because LEDs preferably operate at a fixed power, the target intensity is achieved by varying the proportion of each period during which the power is switched on.
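A minimal sketch of such PWM dimming, assuming a 1 kHz period (the patent does not specify a period or an implementation):

```python
# Hypothetical PWM dimming for a fixed-power LED: perceived intensity is set
# by the fraction of each PWM period during which the LED is switched on.
PWM_PERIOD_US = 1000  # one PWM period in microseconds (assumed 1 kHz)

def duty_cycle_for_intensity(intensity_percent: float) -> tuple:
    """Return (on_time_us, off_time_us) achieving the target intensity."""
    intensity_percent = max(0.0, min(100.0, intensity_percent))
    on_time = round(PWM_PERIOD_US * intensity_percent / 100.0)
    return on_time, PWM_PERIOD_US - on_time

# Example: 75% intensity -> the LED is powered 750 us on, 250 us off per period.
print(duty_cycle_for_intensity(75.0))  # (750, 250)
```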
In the first embodiment, the light controller 220 simultaneously controls a plurality of light emitting devices 211 (the operation parameters controlled by the light controller 220 are the same for each light emitting device 211). In the second embodiment, the light controller 220 can individually control each light emitting device 211. In the third embodiment, the light controller 220 can individually control groups or blocks of the light emitting devices 211 (as shown in fig. 11, having four independent light emitting devices 211). The operating parameters controlled by the light controller 220 are the same among one set of light emitting devices 211 and may be different from the operating parameters of another set.
Local control of the backlight device:
In a first embodiment shown in FIG. 5A, the light controller 220 includes a light driver 221 and at least one user interface 222. Although a single user interface 222 is shown in FIG. 5A, multiple user interfaces 222 may be used.
Each user interface 222 allows a user to adjust the operating parameters of the light emitting devices 211. Examples of the user interface 222 include a knob, a slider, or a digital input. The light driver 221 is an electrical or electronic circuit that controls the operating parameters of the light emitting devices 211 of the light assembly 210. User interaction with the user interface 222 causes the light driver 221 to act so as to control the operating parameters of the light emitting devices 211 of the light assembly 210.
For example, the light driver 221 controls the intensity of light emitted by the light emitting devices 211 through an intensity knob 222. Turning the intensity knob 222 in one direction increases the intensity, while turning it in the other direction decreases the intensity. As described above, the control of the intensity is performed by the light driver 221 through analog control or through PWM control of the light emitting devices 211.
With this embodiment, the intensity of the light emitted by all the light emitting devices 211 is controlled simultaneously by one intensity knob 222. It is impractical to have a dedicated intensity knob for each light emitting device 211 (unless the number of light emitting devices 211 is small). However, the light emitting devices 211 may also be organized into a small number of groups (as described previously), with a dedicated intensity knob for controlling each group.
Alternatively or additionally, the light driver 221 controls the color of the light emitted by the light emitting device 211 through a single color temperature knob 222 or through three color component knobs 222 corresponding to the RGB components of the light emitting device, respectively.
The foregoing examples apply equally when sliders or digital inputs are used instead of knobs. Further, any combination of knobs, sliders, and digital inputs may be used to implement the user interface 222.
The same light driver 221 is typically used for controlling all operating parameters of the light emitting devices 211. Alternatively, additional light drivers (not shown in FIG. 5A for simplicity) may be used.
The background backlighting apparatus 200 also includes a power module 230. The power provided by the power module 230 is used by the light driver 221 to power the light emitting devices 211 and, if applicable, other components of the background backlighting apparatus 200. For example, the power module 230 is a power supply connectable to mains power through an Alternating Current (AC) power line. In another example, the power module 230 is a non-rechargeable battery. In yet another example, the power module 230 is a rechargeable battery that may be charged through an AC power cord, a Universal Serial Bus (USB) cable, or wireless power transmission.
Remote control of the backlight:
In a second embodiment shown in FIG. 5B, the user interface 222 of the light controller 220 is replaced by a light remote control module 223 that interacts, through a wired (e.g., Ethernet) or wireless (e.g., Wi-Fi, Bluetooth, Bluetooth Low Energy (BLE), infrared) communication protocol, with a remote controller (e.g., the computing device 400 and/or the camera 300 of FIG. 2A). The light remote control module 223 receives from the remote controller light control commands for adjusting the operating parameters (e.g., intensity or color) of the light emitted by the light emitting devices 211; under the control of the light controller 220, the commands are executed by the light driver 221. Examples of remote controllers include computers (desktop or notebook), smartphones, tablets, cameras, servers, and the like.
The light remote control module 223 drives the light driver 221 by generating a control signal (e.g., an electrical control signal) that is transmitted to the light driver 221. The control signal is based on the received light control command. As previously described with reference to FIG. 5A, the control signals drive the light driver 221 to control the operating parameters of the light emitting devices 211 of the light assembly 210.
FIG. 5C shows the components of the light remote control module 223 of FIG. 5B; the power module 230 of FIG. 5B is omitted for simplicity. The light remote control module 223 includes a communication interface 223A for interacting with the computing device 400 and/or the camera 300. The light control commands are received from the computing device 400 through the communication interface 223A. The communication protocol supported by the communication interface 223A is one of: infrared, Ethernet, Wi-Fi, Bluetooth or Bluetooth Low Energy, cellular, free-space optics, etc. In another embodiment, the light control commands are received by the communication interface 223A from the camera 300. The communication interface 223A typically includes a combination of hardware and software executed by the hardware for implementing its communication functions. Although a single communication interface 223A is shown in FIG. 5C, the light remote control module 223 may include several communication interfaces, each supporting a given communication technology for exchanging data with other devices that also support that technology.
The light remote control module 223 includes a processing unit 223B capable of executing instructions of a computer program to perform the functions of the light remote control module 223 (e.g., processing light control commands received through the communication interface 223A and generating corresponding control signals transmitted to the light driver 221). The processing unit 223B may be implemented by one or more processors, one or more Field Programmable Gate Arrays (FPGAs), one or more Application Specific Integrated Circuits (ASICs), combinations thereof, or the like.
The light remote control module 223 includes a memory 223C, represented in FIG. 5C as a separate component, although it may also be integrated into the processing unit 223B. The memory 223C stores at least one of: instructions of a computer program executed by the processing unit 223B, data generated by execution of the computer program, data received through the communication interface 223A, and the like. The light remote control module 223 may include several types of memory, including volatile memory (e.g., volatile Random Access Memory (RAM)) and non-volatile memory (e.g., Electrically Erasable Programmable Read-Only Memory (EEPROM), flash memory, etc.).
The light control command received from the computing device 400 (or the camera 300) includes light control data for controlling the operating parameters of the light emitting devices 211 of the light assembly 210. Examples of the light control data include data for controlling the intensity of the light emitted by the light emitting devices 211 and data for controlling the color of the light emitted by the light emitting devices 211. As described above, each light control command may apply to all the light emitting devices 211. Alternatively, if the functionality is supported by the light driver 221, each light control command applies to a selected light emitting device among the plurality of light emitting devices 211 or to a selected group of light emitting devices. In this case, an identifier of the selected light emitting device or of the selected group is included in the light control command. The identifier is converted by the processing unit 223B into information that allows the light driver 221 to address the selected light emitting device 211 or the selected group of light emitting devices 211.
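For illustration, one plausible encoding of such a light control command is sketched below; the patent defines the command semantics but no wire format, so the JSON layout and all field names are assumptions:

```python
import json

# Hypothetical JSON encoding of a light control command. "target" selects
# all devices, a group, or a single device by identifier; "intensity" and
# "rgb" carry the controlled operating parameters.
command = {
    "target": {"scope": "group", "id": "quadrant-2"},  # or {"scope": "all"}
                                                       # or {"scope": "device", "id": 7}
    "intensity": 80,     # brightness, 0..100 %
    "rgb": [0, 200, 0],  # color components, 0..255 each
}

def parse_light_control_command(raw: bytes) -> dict:
    """Decode and validate a received command (a sketch of what processing
    unit 223B might do before driving the light driver 221)."""
    cmd = json.loads(raw)
    if cmd["target"]["scope"] not in ("all", "group", "device"):
        raise ValueError("unknown target scope")
    if not 0 <= cmd["intensity"] <= 100:
        raise ValueError("intensity out of range")
    if not all(0 <= c <= 255 for c in cmd["rgb"]):
        raise ValueError("color component out of range")
    return cmd

print(parse_light_control_command(json.dumps(command).encode()))
```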
In another embodiment not shown in the figures, the background backlighting apparatus 200 comprises a combination of a user interface 222 (shown in FIG. 5A) and a light remote control module 223 (shown in FIGS. 5B and 5C), both controlling the light emitting devices 211 by means of the light driver 221.
Regarding computing devices:
referring now to FIG. 6, components of a computing device 400 are shown.
Computing device 400 includes a processing unit 410. The processing unit 410 includes one or more processors (not shown in fig. 6 for simplicity) capable of executing instructions of a computer program to perform the functions of the computing device 400 (e.g., receiving data, processing received data, generating data, transmitting generated data, etc.). Each processor may also have one or more cores.
Computing device 400 includes memory 420. The memory 420 stores instructions of a computer program executed by the processing unit 410, data generated by the execution of the computer program, data received through the communication interface 430, and the like. Computing device 400 may include several types of memory, including volatile memory (e.g., volatile Random Access Memory (RAM), etc.) and nonvolatile memory (e.g., hard disk drive, electrically erasable programmable read-only memory (EEPROM), flash memory, etc.).
Computing device 400 includes at least one communication interface 430. The one or more communication interfaces 430 allow the computing device 400 to exchange data with other devices, such as the camera 300 and the background backlighting apparatus 200. Each communication interface 430 supports one of the following communication technologies: Ethernet, Universal Serial Bus (USB), cellular (e.g., a 4G or 5G cellular network), Wi-Fi, Bluetooth, Bluetooth Low Energy (BLE), etc. Each communication interface 430 supporting a given communication technology is capable of exchanging data with other devices that also support that technology. For example, the computing device 400 communicates with the camera 300 over a Wi-Fi network and with the background backlighting apparatus 200 over the same Wi-Fi network or over a Bluetooth (Low Energy) network. Each communication interface 430 typically includes a combination of hardware and software executed by the hardware for implementing its communication functions.
Optionally, the computing device 400 also includes a display 440 (e.g., standard screen, touch screen, etc.) and/or a user interface 450 (e.g., keyboard, mouse, touch screen, etc.).
Regarding the camera:
referring now to fig. 7, components of a camera 300 are shown. The representation of camera 300 provided in fig. 7 is simplified because a detailed description of the function and operation of the camera is beyond the scope of the present disclosure and is well known in the art. The camera 300 is capable of capturing still images (e.g., photographs) and/or image streams belonging to video.
The camera 300 includes an image capture module 340. Because the function of the image capture module is well known in the art, implementation details of the image capture module 340 will not be provided. The image capture module 340 includes optical and electronic components for capturing and converting optical images into digital images.
The camera 300 includes a processing unit 310. The processing unit 310 includes one or more processors (not shown in FIG. 7 for simplicity) capable of executing instructions of a computer program to implement the functions of the camera 300 (e.g., processing images captured by the image capture module 340, transmitting images captured by the image capture module 340, controlling the operation of the camera 300, processing data received through the communication interface 330, etc.). Each processor may also have one or more cores.
The camera 300 includes a memory 320. The memory 320 stores instructions of a computer program executed by the processing unit 310, images captured by the image capturing module 340, data generated by executing the computer program, data received through the communication interface 330, and the like. The camera 300 may include several types of memory, including volatile memory (e.g., volatile Random Access Memory (RAM), etc.) and non-volatile memory (e.g., electrically erasable programmable read-only memory (EEPROM), flash memory, etc.).
The camera 300 includes at least one communication interface 330. The one or more communication interfaces 330 allow the camera 300 to exchange data with other devices, such as the computing device 400 and optionally the background backlighting apparatus 200. Each communication interface 330 supports one of the following communication technologies: Ethernet, Universal Serial Bus (USB), Wi-Fi, Bluetooth, Bluetooth Low Energy (BLE), etc. Each communication interface 330 supporting a given communication technology is capable of exchanging data with other devices that also support that technology. For example, the camera 300 communicates with the computing device 400 over a Wi-Fi network and optionally with the background backlighting apparatus 200 over the same Wi-Fi network or over a Bluetooth (Low Energy) network. Each communication interface 330 typically includes a combination of hardware and software executed by the hardware for implementing its communication functions.
The camera 300 also typically includes a display (e.g., standard screen, touch screen, etc.) and a user interface (e.g., control buttons, touch screen, etc.), which are not shown in FIG. 7 for simplicity.
Regarding user-based remote control of the active green screen (and optionally of the front light and/or camera):
Reference is now made concurrently to FIGS. 2A, 2C, 5B, 5C, 6, 7, 8A-D, and 9, wherein FIGS. 8A-D represent exemplary remote control Graphical User Interfaces (GUIs) and FIG. 9 represents a method 500 for performing user-based dynamic background backlighting.
For purposes of illustration, some steps of method 500 are implemented by computing device 400 represented in fig. 6. For example, the computing device 400 is a smartphone held by the subject 10 shown in fig. 2C. However, one skilled in the art will readily adapt the method 500 such that these steps are implemented by the camera 300 represented in fig. 7 rather than the computing device 400.
The processing unit 410 of the computing device 400 executes user-based remote control software, which implements some steps of the method 500. The user-based remote control software displays a remote control GUI on the display 440 of the computing device 400. A user interacts with the remote control GUI through the user interface 450 of the computing device 400. User interactions with the remote control GUI generate user interaction data. The user-based remote control software processes the user interaction data and generates light control commands based on it. The light control commands are transmitted through the communication interface 430 of the computing device 400. As previously described, the light control commands are transmitted to the background backlighting apparatus 200 shown in FIG. 5B. The processing of the light control commands by the background backlighting apparatus 200 (more particularly, by the light remote control module 223 shown in FIG. 5C) has been described in detail above.
The light control command includes light control data (shown in fig. 5B) for controlling the operating parameters of the light emitting device 211 of the light assembly 210. As previously described, the light control data includes at least one of: data for controlling the intensity of light emitted by the light emitting device 211 and data for controlling the color of light emitted by the light emitting device 211. As described above, each light control command is applicable to all the light emitting devices 211, a selected light emitting device group of the plurality of light emitting devices 211, or a selected light emitting device of the plurality of light emitting devices 211.
Referring to FIG. 8A, an exemplary remote control GUI for controlling the intensity and color of the light emitted by one or more light emitting devices 211 is illustrated. The remote control GUI includes a slider for controlling the light intensity, expressed as a percentage from 0 to 100%. The remote control GUI further comprises three knobs for controlling the values (e.g., 0 to 255) of the RGB color components of the light, respectively.
Referring to FIGS. 8B-D, exemplary remote control GUIs for selecting one or more light emitting devices 211 among the plurality of light emitting devices 211 of the light assembly 210 are illustrated. These remote control GUIs allow the lighting to be matched more accurately to the environment. FIG. 8B illustrates a remote control GUI in which the light emitting devices 211 are grouped by quadrant. FIG. 8C illustrates a remote control GUI in which the light emitting devices 211 are grouped by sector. FIG. 8D illustrates a remote control GUI in which the light emitting devices 211 are individually selectable (however, individually controlling the light emitting devices 211 may be too complex for a human to manage, so an algorithm may be required to simplify the process, as explained later in the description).
In the case where all the light emitting devices 211 are controlled at the same time, the remote control GUI shown in fig. 8A is used to set the intensity and/or color of light of all the light emitting devices 211.
In the case where the light emitting devices 211 are controlled in groups, the remote control GUI shown in fig. 8B or 8C is used to select a group of light emitting devices 211. Then, the remote control GUI shown in fig. 8A is used to set the intensity and/or color of light of all the light emitting devices 211 of the selected group.
In the case where the light emitting devices 211 are individually controlled, the remote control GUI shown in fig. 8D is used to select a given light emitting device 211. Then, the remote control GUI shown in fig. 8A is used to set the intensity and/or color of light of the selected light emitting device 211.
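Continuing the hypothetical command format sketched earlier, the user interaction data gathered from these GUIs could be combined into a single light control command as follows (a sketch only; the field names remain assumptions):

```python
# Hypothetical mapping from remote-control GUI state to a light control
# command: the selection GUIs (FIGS. 8B-8D) provide the target, the
# parameter GUI (FIG. 8A) provides the intensity and RGB values.
def gui_to_light_control_command(selection: dict, intensity_percent: int, rgb) -> dict:
    return {
        "target": selection,             # e.g. {"scope": "group", "id": "quadrant-2"}
        "intensity": intensity_percent,  # slider value, 0..100 %
        "rgb": list(rgb),                # three knob values, 0..255 each
    }

# Example: set all light emitting devices to 60 % intensity, pure green.
print(gui_to_light_control_command({"scope": "all"}, 60, (0, 255, 0)))
```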
The user-based remote control software may be used to control other components of the environment through additional remote control GUIs. In a first example, the operating parameters of one or more front light sources 50 shown in FIG. 2C are controlled by a user of the computing device 400. In the exemplary configuration shown in FIG. 2C, two front light sources 50 provide primary and secondary light, respectively (as is known in the art), for front-illuminating the subject 10. The user interacts with one or more remote control GUIs dedicated to controlling the front light sources 50, the processing unit 410 generates corresponding front light control commands for controlling the operating parameters of the front light sources 50, and the front light control commands are transmitted to the front light sources 50 through the communication interface 430. Each front light source 50 may be remotely controlled through a communication interface for receiving the front light control commands and a processing unit for executing the received commands. The design of the front light sources 50 is at least partially similar to the design of the background backlighting apparatus 200 shown in FIGS. 5B and 5C.
In a second example, the operating parameters of the camera 300 are controlled by a user of the computing device 400. The user interacts with one or more remote control GUIs dedicated to controlling the camera 300, the processing unit 410 generates corresponding camera control commands for controlling the operating parameters of the camera 300, and the camera control commands are transmitted to the camera 300 through the communication interface 430. As shown in FIG. 7, the camera 300 may be remotely controlled through the communication interface 330 for receiving the camera control commands and the processing unit 310 for executing the received camera control commands. Examples of operating parameters that a user sets through the computing device 400 and transmits to the camera 300 include exposure, gamma, and higher-level settings such as particular color saturation or substitution.
To assist the user of the computing device 400, feedback may be provided to the user (e.g., displayed by the computing device 400 on the display 440), suggesting how to balance the system. For example, the images captured by the camera 300 (or at least a subset of the images) are displayed by the computing device 400 on the display 440, allowing the user to evaluate the impact of the adjustments performed through the remote control GUIs described above. The images are captured by the camera 300, transmitted to the computing device 400, received via the communication interface 430, and displayed by the processing unit 410 on the display 440.
Referring more particularly to fig. 9, the steps of method 500 will now be described. The method 500 is implemented by the computing device 400 and the background backlighting apparatus 200.
The method 500 includes an optional step 505 of displaying an image captured by the camera 300 on the display 440 of the computing device 400. Step 505 is performed by the processing unit 410 of the computing device 400. The image is received through the communication interface 430 of the computing device 400. As previously described, other types of feedback may be provided alternatively or in addition to displaying the images captured by the camera 300.
The method 500 includes the step 510 of displaying one or more remote control GUIs on the display 440 of the computing device 400. Step 510 is performed by processing unit 410 of computing device 400.
The method 500 includes a step 515 of receiving user interaction data resulting from user interactions with the one or more remote control GUIs (displayed at step 510). Step 515 is performed by the processing unit 410 of the computing device 400. The user interactions are performed through the user interface 450 of the computing device 400.
The method 500 includes a step 520 of processing the user interaction data (received at step 515) to generate a light control command based on the user interaction data. Step 520 is performed by processing unit 410 of computing device 400.
The method 500 includes step 525 of transmitting the light control command (generated at step 520) to the background backlight device 200. Step 525 is performed by the processing unit 410 of the computing device 400. The transmission is performed through the communication interface 430 of the computing device 400.
The method 500 includes a step 530 of receiving the light control command (transmitted at step 525) at the background backlighting apparatus 200. Step 530 is performed by the processing unit 223B of the light remote control module 223 of the background backlighting apparatus 200. The light control command is received through the communication interface 223A of the light remote control module 223 of the background backlighting apparatus 200.
The method 500 includes a step 535 of applying the light control command (received at step 530). Step 535 is performed by the processing unit 223B of the light remote control module 223 of the background backlighting apparatus 200. Step 535 has been described previously and includes generating a control signal (e.g., an electrical control signal), based on the received light control command, for transmission to the light driver 221.
The method 500 includes an optional step 540 of processing the user interaction data (received at step 515) to generate a front light control command based on the user interaction data. Step 540 further includes transmitting the front light control command to the front light sources 50 through the communication interface 430 of the computing device 400. Step 540 is performed by the processing unit 410 of the computing device 400.
The method 500 includes an optional step 545 of processing the user interaction data (received at step 515) to generate camera control commands based on the user interaction data. Step 545 also includes sending camera control commands to the camera 300 through the communication interface 430 of the computing device 400. Step 545 is performed by processing unit 410 of computing device 400.
Regarding algorithm-based remote control of the active green screen:
Referring now concurrently to FIGS. 2A, 2C, 5B, 5C, 6, 7, 10A, and 11, a method 600A for performing algorithm-based dynamic background backlighting is shown in FIG. 10A. The method 600A is implemented by the computing device 400 (shown in FIG. 6), with some of its steps performed by the camera 300 (shown in FIG. 7) and by the background backlighting apparatus 200 (shown in FIGS. 5B and 5C).
The method 600A includes a step 605 of capturing an image. Step 605 is performed by the camera 300. More specifically, this step involves capturing an optical image by the image capture module 340, converting the optical image into a digital image by the image capture module 340, and optionally processing the digital image by the processing unit 310. Step 605 is well known in the art and will not be described in detail. In the remainder of the description, the term "image" refers to a digital image generated by the camera 300. Each pixel of the image has three intensity values (e.g., three integers between 0 and 255) respectively representing the three RGB color components. An exemplary digital representation of an image with M columns and N rows is a three-dimensional array of pixel intensity values having N rows and M columns (M and N being integers) and a third dimension of size 3 for the RGB color components.
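For concreteness, a minimal sketch of this digital representation, using NumPy purely for illustration (the patent does not mandate any particular library or resolution):

```python
import numpy as np

# An image with N rows and M columns as a three-dimensional array: the
# third dimension of size 3 holds the 0..255 RGB intensity values.
N, M = 480, 640  # assumed example resolution
image = np.zeros((N, M, 3), dtype=np.uint8)

# Example: one pixel set to the nominal green used later in the text
# (red = 0, green = 200, blue = 0).
image[100, 200] = (0, 200, 0)
print(image.shape)  # (480, 640, 3)
```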
Method 600A includes step 610 of transmitting the image (captured at step 605) to computing device 400. Step 610 is performed by the processing unit 310 of the camera 300. The transmission is performed through the communication interface 330 of the camera 300.
Method 600A includes a step 615 of receiving an image at computing device 400 (transmitted at step 610). Step 615 is performed by the processing unit 410 of the computing device 400, receiving via the communication interface 430 of the computing device 400.
The background of the image has a background color generated by the active green screen 100 shown in FIGS. 2A, 2B, and 2C. The goal of the active green screen 100 is to produce a nominal background color. One function of the method 600A is to detect that the background color of the image does not match the nominal background color and to activate the background backlighting apparatus 200 to correct such a mismatch. This functionality is implemented by the following steps 620, 625, 630, and 635 of the method 600A.
The nominal background color is a configuration parameter (e.g., nominal value for each RGB color component) stored in memory 420 of computing device 400. The nominal background color may be updated (e.g., via an update command received via the user interface 450 or via the communication interface 430).
Method 600A includes a step 620 of extracting background image data from the image (received at step 615). Step 620 is performed by processing unit 410 of computing device 400. This step is well known in the art of color bond synthesis.
An exemplary implementation of step 620 is as follows. Considering a given image, the color of each pixel of the image is compared with the nominal background color and, based on the comparison, it is determined whether the pixel belongs to the background.
For example, the nominal background color is green, the nominal red intensity value is 0, the nominal green intensity value is 200, and the nominal blue intensity value is 0. Pixels with red intensity values between 0 and 5, green intensity values between 195 and 215, and blue intensity values between 0 and 5 are considered to be part of the background.
An exemplary implementation of the background image data is a three-dimensional array of pixel intensity values with M columns and N rows, and a third dimension of size 3 for the RGB color components. The RGB color components of the pixels that do not belong to the background are set to 0, while the RGB color components of the pixels that belong to the background keep the values of the original image (received at step 615).
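A minimal sketch of this implementation of step 620, using the tolerances from the example above (the exact bounds are illustrative):

```python
import numpy as np

# A pixel is classified as background when each RGB component lies within
# the example tolerances (R in 0..5, G in 195..215, B in 0..5); the RGB
# components of non-background pixels are then set to 0.
LOW = np.array([0, 195, 0], dtype=np.uint8)   # per-component lower bounds
HIGH = np.array([5, 215, 5], dtype=np.uint8)  # per-component upper bounds

def extract_background(image: np.ndarray):
    """Return (background image data, boolean mask of background pixels)."""
    mask = np.all((image >= LOW) & (image <= HIGH), axis=-1)
    background = np.where(mask[..., None], image, 0).astype(image.dtype)
    return background, mask
```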
Method 600A includes a step 625 of analyzing the background image data (extracted at step 620). Step 625 is performed by the processing unit 410 of the computing device 400.
Method 600A includes a step 630 of determining that the background color of the image is incorrect based on an analysis of the background image data (performed at step 625). Step 630 is performed by processing unit 410 of computing device 400.
Steps 625 and 630 are closely related and are typically implemented by a background analysis algorithm executed by the processing unit 410. Those skilled in the art will readily appreciate that the background analysis algorithm may be implemented in a variety of ways.
For example, the background analysis algorithm uses the background image data to calculate a Mean Squared Error (MSE). The calculation of the MSE considers the RGB intensity values of each pixel belonging to the background and compares them to the nominal RGB intensity values of the nominal background color. One output of the algorithm is a determination that the background color is incorrect if the calculated MSE is greater than (or equal to) a threshold value, and that the background color is correct otherwise.
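A minimal sketch of such a background analysis algorithm; the threshold is an assumed tuning parameter, not a value specified by the patent:

```python
import numpy as np

NOMINAL = np.array([0.0, 200.0, 0.0])  # nominal RGB background color
MSE_THRESHOLD = 25.0                   # assumed threshold value

def background_color_is_incorrect(image: np.ndarray, mask: np.ndarray) -> bool:
    """Steps 625/630: MSE of background pixels against the nominal color."""
    pixels = image[mask].astype(np.float64)  # shape (num_background_pixels, 3)
    mse = np.mean((pixels - NOMINAL) ** 2)   # averaged over pixels and components
    return mse >= MSE_THRESHOLD
```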
The method 600A includes a step 635 of generating a light control command for adjusting the background color. Step 635 is performed by processing unit 410 of computing device 400.
As previously described, the light control command includes light control data (shown in fig. 5B) for controlling the operating parameters of the light emitting device 211 of the light assembly 210. As previously described, the light control data includes at least one of: data for controlling the intensity of light emitted by the light emitting device 211 and data for controlling the color of light emitted by the light emitting device 211. As described above, each light control command is applicable to all the light emitting devices 211, a selected light emitting device group among the plurality of light emitting devices 211, or a selected light emitting device among the plurality of light emitting devices 211.
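For illustration, a light control command as described above might be represented in memory as follows; the field names and value ranges are assumptions, not a definitive format:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class LightControlCommand:
    """Assumed in-memory form of a light control command (cf. fig. 5B):
    optional intensity and color data, plus an optional target so that a
    command can address all light emitting devices 211, a selected group,
    or a single device."""
    intensity: Optional[int] = None                   # e.g. 0-255
    rgb_color: Optional[Tuple[int, int, int]] = None  # e.g. (0, 200, 0)
    target_group: Optional[int] = None                # None = all devices

# Example: set group 3 to full-intensity nominal green.
command = LightControlCommand(intensity=255, rgb_color=(0, 200, 0), target_group=3)
```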
Step 635 may be implemented in a variety of ways. For example, the output of the background analysis algorithm (performed at steps 625 and 630) includes at least one metric, such as the aforementioned MSE. The light control data is generated based on the value of the at least one metric. For example, a correspondence data structure (e.g., a correspondence table) is stored in the memory 420 of the computing device 400, providing a correspondence between values of the at least one metric and corresponding values of the light control data. Alternatively, an algorithm is used to calculate the light control data based on the at least one metric. In one exemplary implementation, the at least one metric includes an MSE value for each of the RGB color components, and corresponding light control data is generated for each of the RGB color components of the light emitting devices 211.
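A minimal sketch of such an algorithm is given below. Since the MSE alone does not indicate the direction of the deviation, this sketch uses the signed per-channel mean error instead; this substitution, the gain and the clipping range are assumptions:

```python
import numpy as np

GAIN = 0.5  # assumed proportional gain for the correction

def generate_light_control_data(background_data: np.ndarray,
                                nominal_rgb: np.ndarray) -> dict:
    """Step 635: derive one light-control value per RGB color component
    from the measured background deviation."""
    mask = np.any(background_data > 0, axis=-1)
    pixels = background_data[mask].astype(float)
    mean_error = nominal_rgb - pixels.mean(axis=0)  # per-channel deviation

    # Intensity adjustments for the R, G and B emitters of the
    # light emitting devices 211, clipped to a safe range.
    adjustment = np.clip(GAIN * mean_error, -50, 50)
    return {"red": adjustment[0], "green": adjustment[1], "blue": adjustment[2]}
```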
Alternatively, instead of considering all the background image data extracted in step 620, steps 625, 630 and 635 are performed using only one sample of the background image data. For example, referring to the previous exemplary implementation of a background analysis algorithm, consider only a sample of pixels (e.g., one of ten background pixels) that are identified as part of the image background.
The method 600A includes step 640 of transmitting the light control command (generated in step 635) to the background backlighting apparatus 200. Step 640 is performed by processing unit 410 of computing device 400. The transmission is performed through the communication interface 430 of the computing device 400.
The method 600A includes a step 645 of receiving the light control command (transmitted at step 640) at the background backlight device 200. Step 645 is performed by the processing unit 223B of the optical remote control module 223 of the background backlight device 200. The reception is performed through the communication interface 223A of the optical remote control module 223 of the background backlight device 200.
The method 600A includes a step 650 of applying the light control command (received at step 645). Step 650 is performed by the processing unit 223B of the optical remote control module 223 of the background backlight device 200. Step 650 has been previously described, including generating a control signal (e.g., an electrical control signal) for transmission to the optical drive 221 based on the received light control command.
Method 600A includes an optional step 655 of processing the image (received at step 615). Step 655 is performed by the processing unit 410 of the computing device 400. This step is independent of the dynamic background backlighting function of method 600A (and is also performed when steps 635-650 are not performed).
For example, step 655 includes displaying the (original) image (received at step 615) on display 440 of computing device 400. In this case, the displayed (original) image is used as feedback to monitor the performance of the dynamic background backlighting function.
In another example, step 655 includes processing the image (received at step 615) to replace the background of the image with a replacement background image (e.g., a seaside landscape, a mountain landscape, a city skyline image, etc.). This process is well known in the art of chroma key synthesis. Each pixel of the background of the image (e.g., the monochromatic green background) is replaced by the corresponding pixel of the replacement background image.
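A minimal sketch of this background replacement, reusing the acceptance bands assumed earlier, could look as follows:

```python
import numpy as np

def replace_background(image: np.ndarray,
                       replacement: np.ndarray,
                       lower=np.array([0, 195, 0]),
                       upper=np.array([5, 215, 5])) -> np.ndarray:
    """Optional step 655: substitute every background pixel of `image`
    with the corresponding pixel of `replacement` (both M x N x 3)."""
    is_background = np.all((image >= lower) & (image <= upper), axis=-1)
    composite = image.copy()
    composite[is_background] = replacement[is_background]
    return composite
```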
Step 655 may further comprise at least one of: the processed image is stored in memory 420 of computing device 400, displayed on display 440 of computing device 400, transmitted through communication interface 430 of computing device 400 (e.g., to a cloud storage server or to multiple devices for real-time sessions), and so forth.
The first adjustment to the implementation of method 600A is as follows. The dynamic background backlighting function is not applied to all images received at step 615. For example, the function (steps 620 to 640) is applied only to the image sample received at step 615 (e.g., one image of the N received images). In another example, the functionality (steps 620 through 640) applies only when a trigger is received (e.g., a trigger received from a user through the user interface 450 of the computing device 400 or a trigger received through the communication interface 430 of the computing device 400).
A second adjustment to the implementation of method 600A is as follows. The light driver 221 can control several groups of light emitting devices 211 independently of each other. For example, referring to fig. 11, the light driver 221 is capable of controlling four groups of light emitting devices 211 (identified as group 1, group 2, group 3 and group 4 in fig. 11) independently of each other. More specifically, the light driver 221 sets the operating parameters of all light emitting devices 211 belonging to a given group to the same value (e.g., the same light intensity and/or the same light color), which may differ from the operating parameters set for the other groups. Each group comprises one or more light emitting devices 211 (a single independently controlled light emitting device is considered a group comprising one light emitting device). The image received at step 615 is divided into regions corresponding to the respective groups of light emitting devices 211 (identified in fig. 11 as region 1 corresponding to group 1, region 2 corresponding to group 2, region 3 corresponding to group 3, and region 4 corresponding to group 4). Steps 620, 625, 630 and 635 are applied to each region independently. The light control command generated at step 635 for a given region is applicable to the group of light emitting devices 211 corresponding to that region (for example, the result of processing region 3 of the image is a light control command for group 3 of the light emitting devices 211). In addition to the aforementioned light control data, each light control command also includes an identifier of the group of light emitting devices 211 to which the light control command should be applied (when the group includes a single light emitting device, the identifier identifies that single light emitting device). The identifier is converted by the processing unit 223B of the optical remote control module 223 into information that allows the light driver 221 to target the appropriate group of light emitting devices (or individual light emitting device, if applicable).
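As an illustration, and assuming the helper functions defined in the previous sketches are in scope, the per-region processing might look as follows; the quadrant layout and its mapping onto groups 1 to 4 are assumptions (fig. 11 may define a different geometry):

```python
import numpy as np

def per_group_light_control_commands(image: np.ndarray,
                                     nominal_rgb: np.ndarray) -> list:
    """Divide the image into four regions mapped onto groups 1 to 4 of
    light emitting devices 211 and generate one light control command
    per region whose background color is found to be incorrect."""
    rows, cols = image.shape[0] // 2, image.shape[1] // 2
    regions = {
        1: image[:rows, :cols], 2: image[:rows, cols:],
        3: image[rows:, :cols], 4: image[rows:, cols:],
    }
    commands = []
    for group_id, region in regions.items():
        data = extract_background_image_data(region)           # step 620 per region
        if background_color_is_incorrect(data, nominal_rgb):   # steps 625/630
            commands.append({
                "group": group_id,  # identifier resolved by processing unit 223B
                "control_data": generate_light_control_data(data, nominal_rgb),
            })
    return commands
```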
A third adjustment to the implementation of the method 600A is as follows. Method 600A includes an additional step (not shown in fig. 10A) of determining parameters associated with one or more of the front lights 50 shown in fig. 2C. This additional step is performed by the processing unit 410 of the computing device 400. Examples of parameters associated with the front light 50 include the intensity of the light emitted by the front light 50, one or more positional parameters of the front light 50 (e.g., the distance of the front light 50 relative to the subject 10, the vertical height of the front light 50, the direction of the front light 50, combinations thereof, and the like), and the like. A given parameter is determined in one of the following ways: it is configured in the memory 420 of the computing device 400; it is calculated by the processing unit 410 of the computing device 400 (e.g., based on information collected by internal sensors of the computing device 400, not shown in the figures for simplicity); it is received through the communication interface 430 of the computing device 400 (e.g., from external sensors, not shown in the figures for simplicity); etc. Steps 630 and 635 of method 600A are adjusted to also consider the one or more parameters associated with the front light 50 (in addition to the background image data) when generating the light control commands. Alternatively, only the one or more parameters associated with the front light 50 are considered when generating the light control commands (in which case steps 620, 625 and 630 are not performed). Those skilled in the art will readily understand that other parameters (in addition to or instead of the background image data) may be determined and considered to generate the light control commands.
Referring now to fig. 2A, 2C, 5B, 5C, 6, 7, 10A, 10B and 11 concurrently, a method 600B for performing algorithm-based dynamic background backlighting is shown in fig. 10B. Method 600B is a modification of method 600A represented in fig. 10A, in which some steps of method 600A performed by computing device 400 are now performed by camera 300 in method 600B.
More specifically, steps 620, 625, 630, 635, and 640 are performed by camera 300 in place of computing device 400. Steps 620 to 640 are performed by the processing unit 310 of the camera 300. The transmission of step 640 is performed through the communication interface 330 of the camera 300. In step 645, a light control command is received from the camera 300.
All details of the implementation of steps 620, 625, 630, 635, and 640 previously described with respect to method 600A apply to method 600B. For example, the adjustments described above for the embodiments of method 600A also apply to method 600B.
Referring now to fig. 2A, 2C, 5B, 5C, 6, 7, 10A, 10B, 10C, and 11 concurrently, a method 600C for performing algorithm-based dynamic background backlighting is shown in fig. 10C. Method 600C is an improvement of method 600A represented in fig. 10A, wherein some steps of method 600A performed by computing device 400 are now performed by camera 300 and background backlight 200 in method 600C.
Step 620 is performed by camera 300 instead of computing device 400. Step 620 is performed by the processing unit 310 of the camera 300.
A new step 621 is performed by the camera 300. Method 600C includes a step 621 of transmitting the background image data (extracted at step 620) to the background backlighting apparatus 200. Step 621 is performed by the processing unit 310 of the camera 300; the transmission is performed through the communication interface 330 of the camera 300.
The background backlight 200 performs a new step 622. Method 600C includes a step 622 of receiving background image data (transmitted at step 621) at background back-lighting apparatus 200. Step 622 is performed by the processing unit 223B of the optical remote control module 223 of the background backlighting apparatus 200. The receiving is performed through the communication interface 223A of the optical remote control module 223 of the background backlight device 200.
Steps 625, 630 and 635 are performed by the background backlighting apparatus 200 in place of the computing device 400. Steps 625 to 635 are performed by the processing unit 223B of the optical remote control module 223 of the background backlight device 200.
Steps 640 and 645 of method 600A are not performed in method 600C.
All details of the implementation of steps 620, 625, 630 and 635 previously described with respect to method 600A apply to method 600C. For example, the adjustments described above for the embodiment of method 600A also apply to method 600C.
Regarding algorithm-based remote control of the front light and/or the camera:
referring now to fig. 2A, 2C, 6 and 12A simultaneously, a method 700A for performing algorithm-based remote control of a front light source 50 (shown in fig. 2C) and a camera 300 (shown in fig. 2A, 2C and 7) is shown in fig. 12A. Method 700A is implemented by computing device 400 (shown in fig. 2A and 6).
The method 700A includes a step 705 of collecting operational data related to the operating conditions of the front light 50 shown in fig. 2C. Step 705 is performed by the processing unit 410 of the computing device 400. The operational data may be collected as follows: the relevant data is transmitted to the computing device 400 by the front light 50, by one or more sensors (not shown in the figures), by the camera 300, etc.
Method 700A includes a step 710 of processing the operational data (collected at step 705) to generate one or more front light control commands based on the operational data. Step 710 is performed by the processing unit 410 of the computing device 400. The one or more front light control commands control an operating parameter (e.g., light intensity or light color) of the front light 50.
Method 700A includes a step 715 of transmitting the front light control command (generated at step 710) to the front light 50. Step 715 is performed by the processing unit 410 of the computing device 400. The transmission is performed through the communication interface 430 of the computing device 400. Although not shown in fig. 12A for simplicity, the front light control command is applied by the front light 50 when received.
Method 700A includes a step 720 of collecting operational data related to the operating conditions of the camera 300 shown in fig. 2A, 2C and 7. Step 720 is performed by the processing unit 410 of the computing device 400. The operational data may be collected as follows: the relevant data is transmitted to the computing device 400 by the camera 300, by one or more sensors (not shown in the figures), by the front light 50, etc.
Method 700A includes step 725 of processing the operational data (collected at step 720) to generate camera control commands based on the operational data. Step 725 is performed by the processing unit 410 of the computing device 400. The one or more camera control commands include camera control data for controlling operational parameters of the camera 300 (e.g., exposure, gamma, higher level settings, such as particular color saturation or substitution, etc.).
Method 700A includes a step 730 of transmitting the camera control command (generated at step 725) to the camera 300. Step 730 is performed by the processing unit 410 of the computing device 400. The transmission is performed through the communication interface 430 of the computing device 400. Although not shown in fig. 12A for simplicity, the camera control command is applied by the camera 300 when received.
Steps 705 to 715 correspond to steps 510-515 and 540 of the method 500 shown in fig. 9, wherein a remote control algorithm is used instead of a remote control GUI. Likewise, steps 720 through 730 correspond to steps 510-515 and 545 of method 500 shown in FIG. 9, where a remote control algorithm is used in place of the remote control GUI.
Alternatively, only steps 705 to 715 or only steps 720 to 730 are performed, controlling only the front light 50 or only the camera 300, respectively. In addition, if multiple front light sources 50 are used (as shown in fig. 2C, with a key front light source 50 and a fill front light source 50), steps 705 to 715 are repeated for each front light source 50.
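For illustration, one cycle of method 700A might be organized as follows; all helper functions are hypothetical stand-ins (the actual collection, generation and transmission mechanisms are those described above):

```python
def collect_operational_data(device: dict) -> dict:
    # Hypothetical stand-in for steps 705/720: in practice the relevant
    # data is transmitted by the device itself or by external sensors.
    return {"name": device["name"], "intensity": device.get("intensity", 100)}

def generate_control_command(data: dict) -> dict:
    # Hypothetical stand-in for steps 710/725: derive an operating
    # parameter (e.g. light intensity, camera exposure) from the data.
    return {"target": data["name"], "intensity": min(data["intensity"] + 10, 255)}

def transmit_command(command: dict) -> None:
    # Hypothetical stand-in for steps 715/730: in a real deployment the
    # command is sent through the communication interface 430.
    print("transmitting:", command)

def remote_control_cycle(front_lights: list, camera: dict) -> None:
    """One cycle of method 700A: steps 705 to 715 are repeated for each
    front light 50, then steps 720 to 730 are performed for the camera 300."""
    for light in front_lights:
        transmit_command(generate_control_command(collect_operational_data(light)))
    transmit_command(generate_control_command(collect_operational_data(camera)))

# Example usage with a key light, a fill light and one camera.
remote_control_cycle([{"name": "key light"}, {"name": "fill light"}],
                     {"name": "camera 300"})
```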
Referring now to fig. 2A, 2C, 6, 12A, and 12B concurrently, a method 700B for algorithm-based remote control of a front light 50 (shown in fig. 2C) and a camera 300 (shown in fig. 2A, 2C, and 7) is shown in fig. 12B. Method 700B is an adjustment of method 700A represented in fig. 12A, where some steps of method 700A performed by computing device 400 are now performed by camera 300 in method 700B.
Steps 705 through 725 are performed by camera 300 in place of computing device 400. Steps 705 to 725 are performed by the processing unit 310 of the camera 300. The transmission of step 715 is performed through the communication interface 330 of the camera 300.
A new step 731 is performed instead of step 730: method 700B includes a step 731 of applying the camera control command (generated at step 725) at the camera 300. Step 731 is performed by the processing unit 310 of the camera 300.
All implementation details previously described with respect to steps 705 to 725 in method 700A apply to method 700B.
Those skilled in the art will readily appreciate that the method 700A represented in fig. 12A may also be adapted to be performed by the background backlighting apparatus 200 instead of the computing device 400. In this case, steps 705 to 730 are performed by the processing unit 223B of the optical remote control module 223 of the background back lighting device 200 shown in fig. 5B and 5C.
Furthermore, the methods for adjusting the operating parameters of the background backlighting apparatus 200 (methods 600A or 600B shown in fig. 10A and 10B, respectively) may be combined with the methods for adjusting the operating parameters of the front light 50 and the camera 300 (methods 700A or 700B shown in fig. 12A and 12B, respectively). For example, a single algorithm may adjust the operating parameters of the background backlight 200 and the front light 50 simultaneously. Alternatively, a single algorithm may be used to adjust the operating parameters of the background backlight 200 and the camera 300 simultaneously. Alternatively, a single algorithm may be used to adjust the operating parameters of the background backlight 200, the front light 50 and the camera 300 simultaneously.
In addition, a combination of the foregoing remote control GUIs and algorithms may be used to adjust the operating parameters of the background backlight 200, the front light 50 and the camera 300. For example, steps 510-515 and 540-545 of method 500 shown in fig. 9 (use of a GUI) may be used in combination with the steps of method 600A shown in fig. 10A (use of an algorithm).
Regarding the neural network:
referring now to fig. 10A, 10B, 10C and 13 simultaneously, fig. 13 illustrates a neural network for implementing some of the steps of the methods 600A, 600B and 600C illustrated in fig. 10A, 10B and 10C, respectively. More specifically, the neural network is used to perform steps 625, 630 and 635.
Neural networks are well known in the art. The following is a brief description of how a neural network operates. The neural network includes an input layer, followed by one or more intermediate hidden layers, followed by an output layer, wherein the hidden layers are fully connected. The input layer includes neurons for receiving the input data. The output layer includes neurons for outputting the output data. The output data is generated from the input data using weights assigned to the neurons of the neural network. A layer L being fully connected means that each neuron of layer L receives an input from each neuron of layer L-1 and applies a respective weight to each received input. By default, the output layer is fully connected to the last hidden layer.
The weights associated with the neurons of a neural network, as well as other parameters (e.g., the number of layers, the number of neurons per layer, etc.), are referred to as the predictive model. The predictive model is generated during a training phase, in which training data (input data sets and corresponding output data sets) are used to train the neural network. The result of the training phase is the predictive model. During the operational phase, when the neural network is presented with a given input data set, the predictive model (weights, etc.) is used to infer the output data.
In the context of the aforementioned methods 600A, 600B, and 600C, the input data for the input layer of the neural network includes background image data and nominal background color. As previously described, only one sample of background image data can be used to limit the number of inputs to the neural network. The output data generated by the output layer of the neural network includes light control data (applicable to the background backlight device 200 shown in fig. 2A, 2C, 5B, and 5C). The input data for the input layer of the neural network may be adapted, for example by taking into account additional data, by replacing background image data and/or nominal background color with other data, etc.
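A minimal sketch of the operational phase follows, assuming a single fully connected hidden layer, 100 sampled background pixels as input and three light-control outputs; the dimensions and the random stand-in weights are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed dimensions: 100 sampled background pixels x 3 RGB values,
# plus the 3 nominal RGB values, flattened into a single input vector;
# 3 outputs, one light-control value per RGB color component.
N_IN, N_HIDDEN, N_OUT = 100 * 3 + 3, 64, 3

# In a real system these weights are the predictive model produced by
# the training phase; random values stand in for them here.
W1 = rng.normal(0.0, 0.1, (N_IN, N_HIDDEN))
b1 = np.zeros(N_HIDDEN)
W2 = rng.normal(0.0, 0.1, (N_HIDDEN, N_OUT))
b2 = np.zeros(N_OUT)

def infer_light_control_data(sampled_pixels: np.ndarray,
                             nominal_rgb: np.ndarray) -> np.ndarray:
    """Forward pass: the input layer receives the sampled background
    pixels and the nominal background color; the output layer produces
    the light control data."""
    x = np.concatenate([sampled_pixels.ravel(), nominal_rgb]) / 255.0
    h = np.maximum(0.0, x @ W1 + b1)   # fully connected hidden layer (ReLU)
    return h @ W2 + b2                 # light control data (R, G, B)

# Example: 100 sampled background pixels and a nominal green background.
pixels = rng.integers(0, 256, (100, 3))
print(infer_light_control_data(pixels, np.array([0, 200, 0])))
```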
During the training phase, experimental background image data, corresponding nominal background colors, and corresponding light control data are collected for training the neural network and generating a predictive model.
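For illustration, the training phase could be sketched as follows, here with scikit-learn's MLPRegressor as one possible tool and randomly generated stand-in data; a real system would use the experimental data described above:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical training set: each row pairs flattened experimental
# background image data (100 sampled pixels x 3) plus the nominal RGB
# color with the light control data recorded for that situation.
X = rng.random((500, 100 * 3 + 3))
y = rng.random((500, 3))

# Training phase: fitting the network produces the predictive model.
model = MLPRegressor(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
model.fit(X, y)

# Operational phase: the predictive model infers light control data.
light_control_data = model.predict(X[:1])
print(light_control_data)
```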
Although not shown in fig. 13, the neural network may also include one or more convolutional layers, optionally followed by a corresponding one or more pooling layers, followed by a flattening layer (before the hidden layers). The use of convolutional layers is applicable to neural networks in which the input data takes the form of a (multi-dimensional) data array; in the present case, the background image data consists of a three-dimensional array of pixel intensity values with M columns, N rows and a third dimension of size 3 for the RGB color components. Furthermore, several three-dimensional arrays of pixel intensity values corresponding to consecutive images captured at step 605 may be used simultaneously as input data for the neural network.
The neural network may also be used to implement step 710 of methods 700A and 700B shown in fig. 12A and 12B, respectively. The output of the neural network then includes front light control data (for controlling the operating parameters of the front light 50 shown in fig. 2C). Similarly, a neural network may be used to implement step 725 of methods 700A and 700B shown in fig. 12A and 12B, respectively. The output of the neural network then includes camera control data (for controlling the operating parameters of the camera 300 shown in fig. 2A, 2C and 7).
A single neural network may also be used to generate an output that contains any combination of the aforementioned light control data (applicable to the background backlighting apparatus 200 shown in fig. 2A, 2C, 5B and 5C), front light control data (applicable to the front light source 50 shown in fig. 2C) and camera control data (applicable to the camera 300 shown in fig. 2A, 2C and 7).
The term "subject" has been used throughout the above disclosure with reference to the subject 10 shown in fig. 2A-C. A subject should be construed broadly as including one or more humans, one or more animals, one or more objects, combinations thereof, and the like.
The foregoing is merely a preferred embodiment of the present utility model. It should be noted that those skilled in the art can make modifications and variations without departing from the technical principles of the present utility model, and these modifications and variations should also be regarded as falling within the scope of the utility model.

Claims (15)

1. A background backlight device, characterized in that it comprises:
a light assembly (210) comprising one or more light emitting devices (211), the light emitted by each light emitting device (211) first passing through a diffusing material (120) and then through a green screen (110); and
a light controller (220) comprising at least one light driver (221) for controlling at least one operating parameter of the one or more light emitting devices (211), and at least one user interface (222) or one optical remote control module (223) for driving the at least one light driver (221), respectively.
2. The background backlight device as claimed in claim 1, wherein each light emitting device (211) emits one of monochromatic light, white light, or broad-spectrum light having red, green and blue components.
3. The background backlight device as claimed in claim 1, wherein the at least one operating parameter of the one or more light emitting devices (211) comprises at least one of: the intensity of the light emitted by each light emitting device (211) and the color of the light emitted by each light emitting device (211).
4. The background backlight device as claimed in claim 1, wherein the at least one user interface (222) comprises at least one of the following for allowing a user to adjust the at least one operating parameter of the one or more light emitting devices (211): a knob, a slider and a digital input.
5. The background backlight device as claimed in claim 1, wherein the optical remote control module (223) comprises a communication interface (223A) for receiving light control commands from a remote computing device (400), and a processing unit (223B) for processing the light control commands to generate control signals for driving the at least one light driver (221) to control the at least one operating parameter of the one or more light emitting devices (211).
6. The background backlight device as claimed in claim 5, wherein the light assembly (210) comprises a plurality of light emitting devices (211), and each light control command comprises light control data for controlling the at least one operating parameter of one of: all light emitting devices (211) of the plurality of light emitting devices (211), a selected group of light emitting devices (211) among the plurality of light emitting devices (211), or a single light emitting device (211) of the plurality of light emitting devices (211).
7. An active green screen, characterized in that it comprises a green screen (110), a diffusing material (120), and a background backlight device (200) as claimed in any one of claims 1-6, the background backlight device (200) comprising: a light assembly (210), the light assembly (210) comprising one or more light emitting devices (211), the light emitted by each light emitting device (211) first passing through the diffusing material (120) and then through the green screen (110); and a light controller (220), the light controller (220) comprising at least one light driver (221) for controlling at least one operating parameter of the one or more light emitting devices (211), and at least one user interface (222) or one optical remote control module (223) for driving the at least one light driver (221), respectively.
8. The active green screen of claim 7, wherein the green screen (110) is made of fabric.
9. The active green screen of claim 7, wherein the diffusing material (120) comprises one or more layers of a partially translucent, light-diffusing material.
10. The active green screen of claim 7, wherein the one or more light emitting devices (211) comprise at least one of incandescent lamps or devices using light emitting diodes.
11. The active green screen of claim 7, wherein each light emitting device (211) emits one of monochromatic light, white light, or broad-spectrum light having red, green and blue components.
12. The active green screen of claim 7, wherein the at least one operating parameter of the one or more light emitting devices (211) comprises at least one of: the intensity of the light emitted by each light emitting device (211) and the color of the light emitted by each light emitting device (211).
13. The active green screen of claim 7, wherein the at least one user interface (222) comprises at least one of the following for allowing a user to adjust the at least one operating parameter of the one or more light emitting devices (211): a knob, a slider and a digital input.
14. The active green screen of claim 7, wherein the optical remote control module (223) comprises a communication interface (223A) for receiving light control commands from a remote computing device (400), and a processing unit (223B) for processing the light control commands to generate control signals for driving the at least one light driver (221) to control the at least one operating parameter of the one or more light emitting devices (211).
15. The active green screen of claim 14, wherein the light assembly (210) comprises a plurality of light emitting devices (211), and each light control command comprises light control data for controlling the at least one operating parameter of one of: all light emitting devices (211) of the plurality of light emitting devices (211), a selected group of light emitting devices (211) among the plurality of light emitting devices (211), or a single light emitting device (211) of the plurality of light emitting devices (211).