CN113709949B - Lighting device control method and device, electronic device and storage medium - Google Patents


Info

Publication number
CN113709949B
CN113709949B (application CN202110963094.2A)
Authority
CN
China
Prior art keywords: image data, target, lighting, display area, color
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110963094.2A
Other languages
Chinese (zh)
Other versions
CN113709949A (en)
Inventor
周杰
吴文龙
Current Assignee
Shenzhen Zhiyan Technology Co Ltd
Original Assignee
Shenzhen Zhiyan Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Zhiyan Technology Co Ltd
Priority to CN202110963094.2A
Publication of CN113709949A
Application granted
Publication of CN113709949B
Legal status: Active
Anticipated expiration

Classifications

    • H05B47/105 Controlling the light source in response to determined parameters
    • H05B47/155 Coordinated control of two or more light sources
    • H05B45/00 Circuit arrangements for operating light-emitting diodes [LED]
    • H05B45/20 Controlling the colour of the light
    • G06T5/90 Dynamic range modification of images or parts thereof
    • G06T7/0004 Industrial image inspection
    • G06T7/90 Determination of colour characteristics
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/10024 Color image
    • G06T2207/20132 Image cropping
    • Y02B20/30 Semiconductor lamps, e.g. solid state lamps [SSL], light emitting diodes [LED] or organic LED [OLED]
    • Y02B20/40 Control techniques providing energy savings, e.g. smart controller or presence detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Quality & Reliability (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The application discloses a control method and apparatus for a lighting device, an electronic device, and a storage medium. The method comprises: acquiring a picture image, the picture image being captured facing a display screen while the display screen is displaying a picture; extracting, from the picture image, target area image data corresponding to a target display area; correcting the target area image data according to reference image data corresponding to the target display area to obtain target image data, where the reference image data is determined from the image data corresponding to the target display area in a comparison image captured facing the display screen while the display screen is not displaying a picture; determining, according to the target image data, the lighting color of the lighting assembly corresponding to the target display area; and controlling the corresponding lighting assembly to light in that color. The scheme keeps the lighting color of each lighting assembly consistent with the color of the picture displayed in its corresponding target display area.

Description

Lighting device control method and device, electronic device and storage medium
Technical Field
The application relates to the technical field of intelligent home, in particular to a control method and device of lighting equipment, electronic equipment and a storage medium.
Background
To improve the television viewing experience, in the related art, lighting devices are arranged around the television as background lighting, and the lighting color is controlled according to the color of the picture displayed by the television, so that the displayed picture and the lighting color stay consistent and an immersive viewing experience is created. However, in the prior art the lighting color of the lighting device is not controlled accurately, so the lighting color often fails to match the color of the picture presented by the television.
Disclosure of Invention
In view of the above problems, embodiments of the present application provide a control method, an apparatus, an electronic device, and a storage medium for a lighting device, to address the prior-art problem that inaccurate control of the lighting color causes the lighting color of the lighting device to be inconsistent with the color of the picture presented by the television.
According to an aspect of an embodiment of the present application, there is provided a control method of a lighting apparatus including at least one lighting assembly; one of the illumination assemblies corresponds to one target display area in the display screen; the method comprises the following steps: acquiring a picture image, wherein the picture image is acquired facing the display screen in the process of displaying pictures on the display screen; extracting target area image data corresponding to the target display area from the picture image; correcting the image data of the target area according to the reference image data corresponding to the target display area to obtain target image data; the reference image data is determined according to the image data corresponding to the target display area in a comparison image, and the comparison image is collected towards the display screen in the process that the display screen does not display a picture; determining the lighting color of the lighting component corresponding to the target display area according to the target image data; and controlling the corresponding lighting assembly to light according to the lighting color.
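Read as data flow, the claim's five steps reduce to: capture, crop, correct, average, control. A minimal Python sketch of one control cycle follows; the subtraction-based correction and the mean-RGB color choice are illustrative assumptions, since the claim does not fix the exact formulas:

```python
def correct_region(frame_pixels, reference_pixels):
    """Correct target area image data with the reference image data.
    Here the no-picture reference is subtracted channel-wise (clamped at 0),
    removing the ambient contribution recorded in the comparison image."""
    return [tuple(max(0, f - r) for f, r in zip(fp, rp))
            for fp, rp in zip(frame_pixels, reference_pixels)]

def lighting_color(target_pixels):
    """Determine the lighting color for one assembly.
    A simple choice is the mean RGB of the corrected region."""
    n = len(target_pixels)
    return tuple(sum(p[c] for p in target_pixels) // n for c in range(3))

def control_step(frame_regions, reference_regions):
    """One control cycle: one lighting color per target display area."""
    return [lighting_color(correct_region(f, r))
            for f, r in zip(frame_regions, reference_regions)]
```

The returned colors would then be sent to the corresponding lighting assemblies by whatever transport the lighting device uses.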
According to an aspect of an embodiment of the present application, there is provided a control apparatus of a lighting device including at least one lighting assembly; one of the illumination assemblies corresponds to one target display area in the display screen; the device comprises: the picture image acquisition module is used for acquiring picture images which are acquired facing the display screen in the process of displaying pictures on the display screen; the extraction module is used for extracting target area image data corresponding to the target display area from the picture image; the correction module is used for correcting the image data of the target area according to the reference image data corresponding to the target display area to obtain target image data; the reference image data is determined according to the image data corresponding to the target display area in a comparison image, and the comparison image is collected towards the display screen in the process that the display screen does not display a picture; the lighting color determining module is used for determining the lighting color of the lighting component corresponding to the target display area according to the target image data; and the control module is used for controlling the corresponding lighting assembly to light according to the lighting color.
According to an aspect of an embodiment of the present application, there is provided an electronic apparatus including: a processor; a memory having stored thereon computer readable instructions which, when executed by the processor, implement a method of controlling a lighting device as described above.
According to an aspect of an embodiment of the present application, there is provided a computer-readable storage medium having stored thereon computer-readable instructions, which when executed by a processor, implement a method of controlling a lighting device as described above.
In this scheme, the picture image captured while the display screen is displaying a picture is actually the superposition of the picture displayed on the screen and the imaging of objects in the environment around the display device reflected on the screen; in other words, the picture image is not purely an image of the picture displayed on the screen. Therefore, the target area image data is corrected using the reference image data corresponding to the target display area, so that the resulting target image data is closer to the image of the picture actually displayed by the target display area. Because the reference image records only the environmental contribution, the correction (for example, darkening the affected colors by removing that contribution) reduces the influence of objects in the environment around the display device on the imaging of the display screen.
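To make the superposition argument concrete, a toy camera model helps: if each captured pixel is approximately the displayed pixel plus the ambient reflection, and the comparison image (screen off) records only the ambient term, then subtracting the reference recovers an estimate of the displayed value. This additive model is an illustrative assumption, not a formula stated in the patent:

```python
def capture(displayed, ambient):
    """Simplified camera model: the captured value is the displayed picture
    plus the ambient reflection, saturating at 255 per channel."""
    return tuple(min(255, d + a) for d, a in zip(displayed, ambient))

def corrected(captured_px, reference_px):
    """The reference image (screen off) records only the ambient term,
    so subtracting it approximately recovers the displayed value."""
    return tuple(max(0, c - r) for c, r in zip(captured_px, reference_px))
```

The approximation breaks down where the sum saturates a channel at 255, which is one reason the recovered value is only an estimate.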
On this basis, determining the lighting color of the lighting assembly corresponding to the target display area from the corrected target image data improves the accuracy of the lighting color determination. Controlling the corresponding lighting assembly to light according to the determined color then keeps the lighting color consistent with the color of the picture currently presented in the corresponding target display area, improving the user's viewing experience.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application. It is evident that the drawings in the following description are only some embodiments of the present application and that other drawings may be obtained from these drawings without inventive effort for a person of ordinary skill in the art.
FIG. 1 is a schematic diagram of a control system according to an embodiment of the present application.
Fig. 2 is a schematic view illustrating an installation of a lighting device on a display device according to an embodiment of the present application.
Fig. 3 is a flowchart illustrating a control method of a lighting device according to an embodiment of the present application.
Fig. 4A-4B are interactive diagrams illustrating region assignment according to an embodiment of the present application.
Fig. 5 is a flow chart illustrating steps prior to step 230 according to an embodiment of the present application.
FIG. 6 is a flow chart illustrating step 240 according to an embodiment of the present application.
Fig. 7 is a flowchart illustrating a control method of a lighting device according to an embodiment of the present application.
Fig. 8 is a block diagram of a control device of the lighting apparatus shown according to an embodiment.
Fig. 9 shows a schematic diagram of a computer system suitable for use in implementing an embodiment of the application.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the application. One skilled in the relevant art will recognize, however, that the application may be practiced without one or more of the specific details, or with other methods, components, devices, steps, etc. In other instances, well-known methods, devices, implementations, or operations are not shown or described in detail to avoid obscuring aspects of the application.
The block diagrams depicted in the figures are merely functional entities and do not necessarily correspond to physically separate entities. That is, the functional entities may be implemented in software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
The flow diagrams depicted in the figures are exemplary only, and do not necessarily include all of the elements and operations/steps, nor must they be performed in the order described. For example, some operations/steps may be decomposed, and some operations/steps may be combined or partially combined, so that the order of actual execution may be changed according to actual situations.
It should be noted that: references herein to "a plurality" means two or more. "and/or" describes an association relationship of an association object, meaning that there may be three relationships, e.g., a and/or B may represent: a exists alone, A and B exist together, and B exists alone. The character "/" generally indicates that the context-dependent object is an "or" relationship.
FIG. 1 is a schematic diagram of a control system according to an embodiment of the present application. As shown in fig. 1, the control system includes a display device 110, an image capturing apparatus 120, a terminal device 130, and an illumination device (not shown in the figure) provided on the display device 110. In some embodiments of the present application, the solution of the present application may also be interactively performed by devices in the control system shown in fig. 1. The display device 110 includes a display screen for performing screen display, and the display device 110 may be a device having a display screen such as a television.
At least one lighting assembly may be included in the lighting device, where a lighting assembly is an individually controllable lighting module in the lighting device. The lighting device may be fixedly arranged on the display device as a backlight for the display screen of the display device 110. For example, the lighting device may be a light strip disposed around the display device 110, e.g., around the edge of the display screen or around the back edge of the display screen. In this way, while a picture is displayed on the screen, the lighting assemblies can light following the colors of the displayed picture, so that their lighting colors are the same as or similar to the picture colors, creating an immersive playback environment and improving user experience.
Fig. 2 is a schematic view illustrating the installation of the lighting device on the display device according to an embodiment of the present application. In fig. 2 the lighting device is disposed on the back of the display screen, arranged in a ring; in other embodiments the lighting device may be disposed along only one or more edges of the display device, which is not specifically limited here. In fig. 2 the lighting device comprises 7 lighting assemblies, disposed in areas A to G respectively. Also in fig. 2, a camera serves as the image capturing device 120 and is fixedly installed at the middle of the upper edge of the display device so as to capture images facing the display screen; in other embodiments the camera may be disposed at another edge of the display device, and its position is not limited to the middle of an edge but may be set according to actual needs.
In other embodiments, the lighting device may also be a lamppost, which may be set up vertically on either side of the display screen. Where only one lighting assembly is included in the lighting device, multiple lighting devices may be arranged on the display device so that together they present a rich light effect.
The image capturing device 120 is configured to capture images facing the display screen of the display device 110, and may be a camera or another device integrating an image capture function. In some embodiments the image capturing device 120 may be fixedly disposed on the display device 110; for example, in fig. 1 it is fixed at the middle of the upper edge of the display device 110. In other embodiments the image capturing device 120 may be set up separately from the display device 110, as long as it can capture images of the display screen.
The terminal device 130 may be a smartphone, tablet computer, portable computer, desktop computer, wearable device, etc., and is not specifically limited here. The terminal device 130 may establish communication connections with the image capturing device 120 and the lighting device, so that images captured by the image capturing device 120 can be sent to the terminal device 130. The user can then perform region designation on the image displayed by the terminal device 130 to obtain designated region information, which facilitates subsequently cropping images captured by the image capturing device 120 to the target pixel region where the display screen is located.
In some embodiments of the present application, based on the communication connection between the terminal device 130 and the image capturing apparatus 120, and between the terminal device 130 and the lighting device, after the image capturing apparatus 120 transmits the captured picture image and the reference image to the terminal device 130, the lighting assembly in the lighting device is controlled by the terminal device 130 according to the scheme of the present application. Alternatively, the user may flexibly control the lighting assembly in the lighting device according to actual needs. Of course, in other implementations, the terminal device 130 may also establish a communication connection with the display device 110, so that the user may also control the display device 110 through the terminal device 130.
In some embodiments of the present application, the image capturing device 120 may also establish a communication connection with the lighting device, so that, after the image capturing device 120 captures a frame image facing the display screen, the lighting device may be controlled according to the scheme provided by the present application.
In some embodiments of the present application, based on the communication connection between the image capturing device 120 and the lighting apparatus, the image capturing device 120 may further transmit the captured picture image to the lighting apparatus, and the lighting apparatus determines the lighting color of each lighting assembly in itself according to the scheme of the present application, and lights according to the determined lighting color.
The implementation details of the technical scheme of the embodiment of the application are described in detail below:
Fig. 3 shows a flowchart of a method of controlling a lighting device according to an embodiment of the application. The method may be performed by an electronic device with processing capability, such as the terminal device, the image capturing device, or the lighting device shown in fig. 1, or performed interactively by several devices in the control system of fig. 1; this is not specifically limited. The lighting device includes at least one lighting assembly, and each lighting assembly corresponds to one target display area in the display screen. Specifically, the lighting device may be disposed around the edge of the plane where the display screen of the display device is located, or disposed on the back of the display screen, either in a ring or along one or more edges of the display device. A lighting assembly is a lighting module in the lighting device that can be controlled individually; for example, if the lighting device contains multiple parallel LED lamps, each individually controllable, then each LED lamp can be regarded as a lighting assembly.
For another example, the lighting device may be a light strip comprising a plurality of light segments, where each segment is provided with its own control IC and lamp beads and can be controlled independently, so each segment can be regarded as a lighting assembly. A control instruction for the light strip contains the segment instructions of all segments in the strip. On receiving the control instruction, the first segment extracts the segment instruction addressed to itself and forwards the remaining segment instructions to the next segment; the next segment likewise extracts its own segment instruction, and so on, until every segment in the strip has extracted its corresponding segment instruction.
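The forwarding behaviour described above can be simulated with a simple linked chain of segments, each taking the head of the instruction list and passing the tail downstream (an illustrative model; in a real strip this logic lives in the per-segment control ICs):

```python
class LightSegment:
    """One independently controllable segment of the light strip."""

    def __init__(self, next_segment=None):
        self.next_segment = next_segment
        self.color = None

    def receive(self, instructions):
        # Extract the segment instruction addressed to this segment...
        self.color = instructions[0]
        # ...and forward the remaining instructions down the strip.
        if self.next_segment is not None and len(instructions) > 1:
            self.next_segment.receive(instructions[1:])
```

Sending a three-instruction list to the first segment of a three-segment chain leaves each segment holding exactly its own color.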
The target display area in the display screen does not refer specifically to a certain display area, but refers broadly to a display area in the display screen corresponding to the lighting assembly, or refers to a display area in the display screen as a basis for determining the lighting color of the lighting assembly. The target display area may be all or part of the display area in the display screen. It can be understood that the display screen may include one target display area or multiple target display areas, and for each target display area, the lighting color of the lighting assembly corresponding to the target display area may be controlled according to the scheme of the present application.
In some embodiments of the present application, the display area of the display screen may include a plurality of target display areas, one target display area corresponding to each lighting assembly. In some embodiments of the present application, the correspondence between the target display area and the illumination assembly may be set according to a position of the illumination assembly relative to the display screen, for example, if the illumination assembly is disposed on the back of the display screen, a projection area of the illumination assembly disposed at the position on the display screen may be taken as a center, and a display area with a set size may be taken as the target display area associated with the illumination assembly; for example, if the illumination device is provided at the edge of the display screen, a region of a predetermined size extending from the position of the illumination device at the edge of the display screen to the inside of the display region of the display screen may be used as the target display region corresponding to the illumination device; for another example, if the lighting device is a lamppost, a region of a set size closest to the lamp bead in the display screen may be used as the target display region corresponding to the lamppost.
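For assemblies mounted along one screen edge, one straightforward mapping, shown here as an illustrative assumption since the patent leaves region sizes configurable, splits that edge into equal-width strips of a set depth:

```python
def top_edge_regions(screen_w, screen_h, n_assemblies, depth):
    """Return one (x0, y0, x1, y1) target display area per assembly
    along the top edge, each extending `depth` pixels into the screen."""
    width = screen_w // n_assemblies
    return [(i * width, 0, (i + 1) * width, min(depth, screen_h))
            for i in range(n_assemblies)]
```

Analogous helpers for the other edges, or projection-centered regions for back-mounted assemblies, differ only in how the rectangles are placed.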
It can be appreciated that the size of the target display area corresponding to the different illumination assemblies may be the same or different, and may be specifically set according to actual needs.
With continued reference to fig. 3, the control of the lighting device includes at least steps 310 to 350, which are described in detail as follows:
step 310, acquiring a picture image, wherein the picture image is acquired facing the display screen in the process of displaying the picture on the display screen.
The display screen may be used to display video such as television programs, video-on-demand, live video, game pictures, and advertising video, without specific limitation. In the solution of the present application, the acquired picture image at least includes an image of a picture displayed in a target display area in the display screen, and in some embodiments, the acquired picture image includes a panoramic image of the display screen, so that the image of the picture displayed in the display screen can be presented according to the picture image.
And 320, extracting target area image data corresponding to the target display area from the picture image.
As described above, the target display area may be only part of the display area of the display screen. Therefore, in step 320, the image of the picture displayed by the target display area is extracted from the acquired picture image; the extracted target area image data is thus the portion of the picture image that shows the picture displayed in the target display area.
In some embodiments of the present application, although the image capturing device captures images facing the display screen, the captured picture image may include, in addition to the display screen itself, images of the surrounding environment outside the screen. In this case, to prevent those outside areas from affecting the determination of the lighting color, the image regions outside the display screen may first be cropped away; image extraction is then performed on the cropped picture image according to the position information of the target display area in the display screen.
In this case, step 320 includes: cropping the picture image according to the set designated region information to obtain the display screen pixel region where the display screen is located in the picture image; and extracting, from the display screen pixel region, the pixel region corresponding to the target display area according to the position information of the target display area in the display screen, obtaining the target area image data corresponding to the target display area.
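The two extraction steps above amount to nested crops of the pixel grid. The sketch below assumes axis-aligned rectangles for simplicity; in practice the screen region selected by the user may first need a perspective correction:

```python
def crop(image, region):
    """Crop a row-major pixel grid to the rectangle (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = region
    return [row[x0:x1] for row in image[y0:y1]]

def extract_target_area(picture_image, screen_region, target_region):
    """First cut the display-screen pixel area out of the picture image,
    then extract the target display area's pixels from it. The target
    region is given in the cropped screen's own coordinates."""
    screen_pixels = crop(picture_image, screen_region)
    return crop(screen_pixels, target_region)
```

Note that because the second crop operates on the already-cropped screen image, the target display area's position information must be expressed relative to the screen, not to the full captured frame.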
The designated region information indicates the image region to be retained in the picture image. The region indicated by the designated region information at least covers the images of all target display areas in the display screen. Since the position of the image capturing device facing the display screen is generally fixed and its capture parameters (e.g., focal length and angle) remain essentially unchanged, the capture viewing angle is the same across images; the designated region information can therefore be obtained by designating a pixel region once, in a base image captured by the image capturing device.
In some embodiments of the present application, the pixel region designation may be performed on a terminal, such as a smartphone, communicatively connected to the image capturing device. Fig. 4A-4B are interactive diagrams illustrating region assignment according to an embodiment of the present application. In this embodiment the user can designate the target display area in the display screen based on the interface shown in fig. 4A. Example area 411 in fig. 4A schematically shows label paper attached at a corner of the display screen; the area covered by the label paper is determined to be the target display area. The first user interface 410 shown in fig. 4A also displays a prompt: "1. Turn on the lights in the room and turn off the television; 2. Attach label paper at the corners of the television."
After attaching label paper to the television display screen, the user may trigger the "ready" control 412 in the first user interface 410 to jump to the second user interface 420 shown in fig. 4B. As shown in fig. 4B, the picture display area 421 in the second user interface 420 may display the image acquired by the image acquisition device facing the display screen. Frame selection indication points 422, and a frame selection frame 423 formed by connecting the frame selection indication points end to end, are also provided in the picture display area 421.
Fig. 4B shows, by way of example, five frame selection indication points 422. The user may move each frame selection indication point 422 to change the area that the frame selection frame 423 encloses in the displayed image; by moving the indication points 422 to the corners and edges of the display screen in the image, the user makes the area enclosed by the frame selection frame 423 coincide with the display screen pixel area where the display screen is located in the image.
Thereafter, the user may trigger the "submit" control 424 in the second user interface 420 to send the determined position information of the frame selection frame 423 as the specified area information, or to send the image with the frame selection frame 423 attached, to the image acquisition device, so that the image acquisition device can crop acquired images according to the specified area information. Of course, triggering the "submit" control 424 may also store the specified area information at the client, so that the client can subsequently crop images according to it. The second user interface 420 also provides a "refresh" control 425, which the user can trigger to update the positions of the frame selection indication points 422 and the frame selection frame 423 in the picture display area 421.
After the label paper is attached to the display screen, an image acquisition device provided on the television may acquire an image facing the display screen, obtaining a first image of the display screen with the label paper attached. The first image may be displayed in the picture display area 421 shown in fig. 4B, so that the user moves the frame selection indication points 422 based on it.
Then, the terminal or the image acquisition device may identify the label paper pixel area of each piece of label paper in the first image, determine the position information of each label paper pixel area in the first image, and determine the position information of the display screen pixel area in the first image according to the position information of the frame selection frame 423. The relative position information of a label paper pixel area with respect to the display screen pixel area can then be determined from these two pieces of position information, and the position information of the area covered by the label paper in the display screen can further be determined. In a specific embodiment, the relative position information of the label paper pixel area with respect to the display screen pixel area and the position information of the label paper pixel area in the first image are stored, so that the target area image data corresponding to the target display area can conveniently be extracted from the picture image in subsequent processing.
The position information of the target display area in the display screen indicates the relative position of the target display area with respect to the display screen. In a specific embodiment, this relative position can be represented by the relative position of the label paper pixel area with respect to the display screen pixel area, so the relative position information of the label paper pixel area with respect to the display screen pixel area can be used as the position information of the target display area in the display screen. The pixel area at that relative position is then extracted from the cropped display screen pixel area; the extracted pixel area, where the image corresponding to the target display area is located, yields the target area image data corresponding to the target display area.
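As an illustration of the extraction described above, the following sketch crops the target display area out of an already-cropped display screen pixel area using relative position information expressed as fractional coordinates. The function name, the nested-list image representation, and the (left, top, right, bottom) fractional convention are assumptions for illustration, not taken from the patent.

```python
# Hypothetical sketch: extract the target-area pixels from the cropped
# display-screen pixel area using relative position information.
def extract_target_area(screen_pixels, rel_box):
    """screen_pixels: 2D list (rows of (R, G, B) tuples) for the display
    screen pixel area already cut out of the picture image.
    rel_box: (left, top, right, bottom) as fractions of the screen size,
    i.e. the target display area's position relative to the screen."""
    h = len(screen_pixels)
    w = len(screen_pixels[0])
    l, t, r, b = rel_box
    x0, y0 = int(l * w), int(t * h)
    x1, y1 = int(r * w), int(b * h)
    # Slice out the rows and columns covered by the relative box.
    return [row[x0:x1] for row in screen_pixels[y0:y1]]
```

The same fractional box can be reused across frames because the camera position, and hence the screen's pixel area, stays fixed.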
In some embodiments of the present application, the position information of the display screen pixel area where the display screen is located in the picture image may be obtained through image recognition, and then the display screen pixel area is extracted from the picture image according to the determined position information, so that the user is not required to specify the position of the display screen pixel area in the image by moving the frame selection indication point in the second user interface.
The target area image data indicates at least the color information of each pixel point in the image of the picture displayed in the target display area of the display screen. For example, the target area image data may indicate the RGB values corresponding to each pixel point, that is, the red (R) component value, green (G) component value, and blue (B) component value of the pixel point. As another example, the target area image data may indicate the YUV values corresponding to each pixel point; the YUV values can be converted into RGB values, from which the red, green, and blue component values of each pixel point are obtained.
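Where the image data arrives as YUV values, one common conversion to RGB is the BT.601 full-range approximation sketched below; the coefficients are the standard ones, while the function name and the clamping behavior are our choices, not details given in the text.

```python
# Minimal per-pixel YUV -> RGB conversion (BT.601, full range, with
# U and V centered on 128), as one common way to obtain the red, green,
# and blue component values mentioned above.
def yuv_to_rgb(y, u, v):
    r = y + 1.402 * (v - 128)
    g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
    b = y + 1.772 * (u - 128)
    clamp = lambda c: max(0, min(255, int(round(c))))  # keep in 0..255
    return clamp(r), clamp(g), clamp(b)
```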
Referring to fig. 3, step 330 corrects the target area image data according to the reference image data corresponding to the target display area to obtain the target image data. The reference image data is determined from the image data corresponding to the target display area in a control image, and the control image is acquired facing the display screen while the display screen is not displaying a picture.
In practice, things in the surroundings of the display device may be reflected (imaged) in the display screen. While the display screen displays a picture, the picture image acquired by the image acquisition device is therefore actually a superposition of the picture displayed in the display screen and the reflections of surrounding things in the display screen. When the brightness of a surrounding thing is high, a pixel area in the acquired picture image may show not the picture displayed in the corresponding display area but the reflection of that thing in the display screen; in this case, the color of the reflection may mask the color of the picture displayed in the display screen.
For a lighting device used as the background of the display screen, the lighting color needs to be controlled to be substantially consistent with the color of the picture displayed in the display screen, or with the color of the picture displayed in a certain area of it. If the reflections of surrounding things in the display screen completely or partially mask the color of the displayed picture, then directly determining the lighting color of the corresponding lighting assembly from the color of the pixel area where the target display area is located in the picture image makes the color actually displayed in the target display area inconsistent with the lighting color of the lighting assembly, which affects the user's look and feel and degrades the user experience.
The inventors of the present application recognized that the reflections of things in the surroundings of the display device are superposed in the acquired picture image, and that when the colors of these reflections contribute a large proportion of the colors in the picture image, directly taking the colors presented by the picture image as the colors of the lighting assemblies makes the determined lighting colors inconsistent with the colors of the pictures displayed in the corresponding display areas of the display screen, so that the lighting color control accuracy of the lighting assemblies is low. The solution of the present application is proposed based on this recognized cause of the inconsistency between the lighting color of a lighting assembly and the color of the picture displayed in the corresponding display area.
Therefore, in the solution of the present application, to ensure consistency between the determined lighting color of a lighting assembly and the actual color of the picture presented by the corresponding target display area, the reference image data corresponding to the target display area is used to correct the target area image data corresponding to that target display area. This reduces the interference, caused by the reflections of surrounding things in the display screen, with the lighting color of the lighting assembly, improves the accuracy of the determined lighting color, and ensures that the lighting color of the lighting assembly is consistent with the color of the picture displayed in the target display area.
In some embodiments of the present application, step 330 includes: differencing the target area image data with the reference image data corresponding to the target display area to obtain the target image data.
In some implementations of the present application, the target area image data may be differenced with the reference image data pixel by pixel: each color component value of each pixel in the target area image data is subtracted from the color component value of the corresponding pixel in the reference image data, and the absolute value of the difference is taken as that pixel's color component in the target image data. It is understood that the color components of each pixel include a red component, a green component, and a blue component, and the differencing is performed on each of them separately. In other implementations of the present application, a first color component average of the pixel area corresponding to the target display area may be calculated from the target area image data, a second color component average of that pixel area may be calculated from the reference image data, the two averages may be subtracted, and the absolute value of the difference may be used as the target image data.
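The per-pixel difference correction just described can be sketched as follows, assuming both images are flat lists of (R, G, B) tuples of equal length; the function and variable names are illustrative.

```python
# Sketch of the per-pixel difference correction: each color component of
# the target-area image data is differenced with the corresponding
# component of the reference image data, keeping the absolute value.
def correct_by_difference(target_pixels, reference_pixels):
    """Both arguments are flat lists of (R, G, B) tuples of equal length."""
    return [
        tuple(abs(t - r) for t, r in zip(tp, rp))
        for tp, rp in zip(target_pixels, reference_pixels)
    ]
```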
In some embodiments of the present application, step 330 comprises: multiplying a first color component value of each pixel indicated by the target area image data on a color component by a second color component value of a corresponding pixel indicated by the reference image data on the color component to obtain a third color component value of the corresponding pixel point; dividing the third color component value by a first coefficient to obtain a fourth color component value, wherein the first coefficient is a difference between 1 and a specified parameter, and the specified parameter is a value obtained by normalizing the second color component value; adding the fourth color component value to the first color component value to obtain a fifth color component value; and combining the fifth color component values of all the color components of each pixel point to obtain the target image data.
In the solution of the present embodiment, the target area image data indicates RGB values of each pixel in the first pixel area where the target display area is located in the picture image, that is, indicates a red component value of each pixel on the red component, a blue component value on the blue component, and a green component value on the green component. For ease of distinction, the color component values indicated by the target area image data are referred to as first color component values, which include a first red component value on the red component, a first blue component value on the blue component, a first green component value on the green component, respectively.
Similarly, the reference image data indicates RGB values of each pixel in the second pixel region where the target display region is located in the control image, i.e., red component values on the red component, blue component values on the blue component, green component values on the green component of each pixel. For ease of distinction, the color component values indicated by the reference image data are referred to as second color component values, which correspondingly include a second red component value on the red component, a second blue component value on the blue component, a second green component value on the green component.
Similarly, the third color component values include a third red component value on the red component, a third blue component value on the blue component, a third green component value on the green component; the fourth color component value includes a fourth red component value on the red component, a fourth blue component value on the blue component, a fourth green component value on the green component; the fifth color component value includes a fifth red component value on the red component, a fifth blue component value on the blue component, a fifth green component value on the green component.
In the scheme of the embodiment, according to the RGB values of the pixels indicated by the target area image data and the RGB values of the pixels indicated by the reference image data, the color component values of each pixel point on each color component are calculated according to the above process, so as to obtain the fifth color component value of each pixel point on each color component.
The above procedure is described by way of example. Assuming that the first red component value of pixel N on the red component indicated by the target area image data is R1, and the second red component value of pixel N on the red component indicated by the reference image data is R2, the third red component value R3 of pixel N on the red component is calculated according to Equation 1:
R3 = R1 × R2; (Equation 1)
Since each color component value ranges from 0 to 255, the quotient of the second color component value and 255 can be taken as the specified parameter. That is, the specified parameter n is:
n = R2 / 255; (Equation 2)
The first coefficient n1 is: n1 = 1 - n; (Equation 3)
The fourth red component value R4 of pixel N on the red component is:
R4 = R3 / n1; (Equation 4)
The fifth red component value R5 of pixel N on the red component is:
R5 = R1 + R4; (Equation 5)
Combining the above formulas 1 to 5 can obtain:
R5 = R1 + (R1 × R2) / (1 - R2 / 255); (Equation 6)
In this embodiment, since n ranges from 0 to 1, n1 likewise ranges from 0 to 1, so R4 is not less than R3 and is non-negative; the obtained R5 is therefore not less than R1. This embodiment is thus equivalent to deepening the color of each pixel point, so as to reduce the influence of the reflections of things in the surroundings of the display device in the display screen.
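Equations 1-6 can be written out per component as below. The clamp to 255 and the guard for R2 = 255 (where n1 = 0 and Equation 4 would divide by zero) are our added safeguards, not part of the stated formulas.

```python
# The enhancement of Equations 1-6, per color component:
# with c1 from the target-area image data and c2 from the reference
# image data, c5 = c1 + (c1 * c2) / (1 - c2 / 255).
def enhance_component(c1, c2):
    if c2 >= 255:          # n1 would be zero; saturate instead of dividing
        return 255
    n = c2 / 255.0         # specified parameter (Equation 2)
    n1 = 1.0 - n           # first coefficient   (Equation 3)
    c3 = c1 * c2           # Equation 1
    c4 = c3 / n1           # Equation 4
    c5 = c1 + c4           # Equation 5
    return min(255, int(round(c5)))  # result never below c1 (deepening)
```

Note how quickly the result saturates for bright reference components: the formula is a deepening step, so in practice it is the reference data, not the displayed picture, that keeps c2 small.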
In some embodiments of the present application, after the reference image data corresponding to the target display area is determined, it is stored. In subsequent processing, if the reference image data corresponding to the target display area is stored in the device (e.g., the image acquisition apparatus, the terminal, or the lighting assembly), step 330 is performed; otherwise, step 330 is not performed, the target area image data corresponding to the target display area is taken as the target image data, and the process skips to step 340.
Step 340, determining the lighting color of the lighting component corresponding to the target display area according to the target image data.

Step 350, controlling the corresponding lighting assembly to light according to the lighting color.
In this solution, the picture image acquired while the display screen displays a picture is actually the superposition of the picture displayed in the display screen and the reflections of things in the surroundings of the display device in the display screen; in other words, the picture image is not purely an image of the picture displayed in the display screen. Therefore, the corresponding target area image data is corrected by the reference image data corresponding to the target display area, so that the determined target image data is closer to the image of the picture displayed by the target display area in the display screen; alternatively, the target area image data is enhanced to obtain the target image data, reducing the influence of the reflections of surrounding things in the display screen by deepening the color.
On this basis, the lighting color of the lighting component corresponding to the target display area is determined from the corrected target image data, which improves the accuracy of lighting color determination. Controlling the corresponding lighting component to light according to the determined lighting color ensures consistency between the lighting color of the lighting component and the current picture color presented by the corresponding target display area in the display screen, improving the user's viewing experience.
In some embodiments of the present application, as depicted in fig. 5, prior to step 330, the method further comprises:
step 510, obtaining the control image.
Step 520, extracting image data corresponding to the target display area from the control image.
The process of extracting the image data corresponding to the target display area from the control image follows the process of extracting the image data corresponding to the target display area from the picture image in the above embodiment, and is not repeated here.
In step 530, a statistical analysis is performed on the color components of each pixel in the image data, so as to obtain a first statistical characteristic parameter of the color components in the image data.
In step 530, since the image data includes color components of each pixel on a plurality of color channels, in each color channel, a statistical analysis is performed on the color components of the pixel indicated in the image data, so as to obtain a first statistical characteristic parameter of the color component corresponding to each color channel. For example, if the RGB values of each pixel are included in the image data, statistical analysis is performed on the red component of the pixel in the image data on the red channel, the green component of the pixel in the image data on the green channel, and the blue component of the pixel in the image data on the blue channel.
The first statistical characteristic parameter may be an average value, a mode, a specified percentile, or the like. The percentile value may be set according to actual needs, for example, 75%, 85%, 90%, 95%, etc., which is not specifically limited herein. The first statistical characteristic parameter reflects the overall distribution of the color components in the pixel area of the control image where the target display area is located.
It will be appreciated that since the statistical analysis is performed for the color components, a set parameter range is assigned to each color channel, respectively. The range of the setting parameters designated for each color channel may be the same or different, and is not particularly limited, and may be specifically set according to actual needs.
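A minimal sketch of the per-channel statistics of step 530, supporting the average or a specified percentile as the first statistical characteristic parameter. The nearest-rank percentile method and the flat list-of-tuples input are our assumptions; the text does not fix either.

```python
# Illustrative per-channel statistics: compute a statistical
# characteristic parameter (mean, or a specified percentile) of one
# color channel across all pixels of an extracted pixel area.
def channel_stat(pixels, channel, percentile=None):
    """pixels: list of (R, G, B) tuples; channel: 0, 1, or 2."""
    values = sorted(p[channel] for p in pixels)
    if percentile is None:                      # default: average
        return sum(values) / len(values)
    # Nearest-rank percentile (one common convention).
    rank = max(0, int(round(percentile / 100.0 * len(values))) - 1)
    return values[rank]
```

Each channel is analyzed separately, matching the per-channel set parameter ranges described above.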
Step 540, detecting whether the first statistical characteristic parameter exceeds the set parameter range, if yes, executing step 550, and if no, executing step 560.
Step 550, taking the image data as the reference image data corresponding to the target display area.
Step 560, setting the reference image data corresponding to the target display area as a specified value.
If the first statistical characteristic parameter exceeds the set parameter range, the color presented by the target display area in the control image is pronounced, which indicates that the reflections of things in the surroundings of the display device have a large influence on the picture displayed in the display screen and, correspondingly, cause large interference in determining the lighting color of the lighting component.
If the first statistical characteristic parameter is within the set parameter range, the color presented by the target display area in the control image is light, indicating that the reflections of things in the surroundings of the display device have little influence on the picture displayed in the display screen and cause little interference in determining the lighting color of the lighting component. In that case, the reference image data corresponding to the target display area is set to a specified value.
In a specific embodiment, the set parameter range may be configured according to actual needs. In some embodiments of the present application, the set parameter range for each color component may be (0, 97).
In some embodiments of the present application, the specified value may be 0, or other value lower than the minimum value in the set parameter range, and may be specifically set according to actual needs.
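Steps 540-560 can be sketched as a simple gate, using the example (0, 97) range and 0 as the specified value. The per-channel-mean input shape and the any-channel-exceeds rule are our assumptions; the text does not say how multi-channel results are combined.

```python
# Sketch of steps 540-560: keep the control-image data as the reference
# data only when the first statistical characteristic parameter exceeds
# the set parameter range; otherwise use a specified value.
def select_reference(channel_means, upper=97, specified_value=0):
    """channel_means: (R_mean, G_mean, B_mean) from the control image.
    Returns the per-channel reference values."""
    if any(m > upper for m in channel_means):
        return tuple(channel_means)       # significant interference: keep data
    return (specified_value,) * 3         # negligible interference
```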
In some embodiments of the present application, the image data corresponding to the target display area in the control image may also be directly determined as the reference image data of the target display area, without detecting whether the first statistical characteristic parameter exceeds the set parameter range.
In some embodiments of the present application, as shown in fig. 6, step 240 includes:
Step 610, performing statistical feature analysis according to the color component information of each pixel indicated by the target image data to obtain a second statistical characteristic parameter of each color component.
Like the first statistical characteristic parameter, the second statistical characteristic parameter may be an average value, a mode, a specified percentile, or the like. The percentile value may be set according to actual needs, for example, 70%, 88%, 90%, 95%, etc., which is not specifically limited herein.
Step 620, determining the lighting color of the lighting component corresponding to the target display area according to the second statistical characteristic parameter of each color component.
In some embodiments of the present application, the color indicated by the combination of the second statistical characteristic parameters of all the color components may be directly determined as the lighting color of the lighting component corresponding to the target display area. For example, if the second statistical characteristic parameter is an average value, and statistics determine that the average of the red component is 199, the average of the green component is 213, and the average of the blue component is 178, then the color indicated by the RGB values (199, 213, 178) is the lighting color of the lighting component corresponding to the target display area.
In some embodiments of the present application, illumination factors or specular reflection in the environment where the display device is located may cause the picture colors collected by the image acquisition device to fade; the second statistical characteristic parameter of each color component may therefore be color-enhanced (also referred to as a color exaggeration algorithm). In this case, step 620 further includes: enhancing the statistical characteristic parameter value on each color component to obtain a target color component value corresponding to each color component; and combining the target color component values corresponding to all the color components to obtain the lighting color of the lighting component corresponding to the target display area.
In some embodiments of the present application, the statistical characteristic parameter value on each color component may be linearly enhanced according to a preset enhancement coefficient to obtain the target color component value on that color component. For example, if the statistical characteristic parameter value on the red component is T1, the enhancement coefficient is k, and the target color component value is denoted T2, linear enhancement may be performed according to T2 = k × T1, or according to T2 = k × T1 + b. The enhancement coefficient k and the constant b may be set according to actual needs, for example, k = 3 and b = 97, and are not specifically limited herein. It will be appreciated that, since each color component value ranges from 0 to 255, a target color component value obtained by enhancement does not exceed 255.
In some embodiments of the present application, the statistical characteristic parameter value on each color component may be exponentially enhanced according to a preset base t, where t is greater than 1. For example, if the statistical characteristic parameter on the red component is T1 and the base is t, the target color component value on the red component is T2 = t^T1. The base t may be set as needed, for example, t = 2, and is not specifically limited here.
In this embodiment, enhancing the statistical characteristic parameter value on each color component compensates for the fading of the picture colors collected by the image acquisition device due to illumination factors or specular reflection in the environment where the display device is located, so that the lighting color of the lighting component stays consistent with the color of the picture presented in the corresponding target area of the display screen.
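Both enhancement variants can be sketched as below, with the clamp to 255 that the text requires. The default values (k = 1.3, b = 0, t = 2) are illustrative choices drawn from the examples in this document, not mandated parameters.

```python
# Linear enhancement: T2 = k*T1 (or k*T1 + b), clamped to 0..255.
def enhance_linear(t1, k=1.3, b=0.0):
    return min(255, int(round(k * t1 + b)))

# Exponential enhancement: T2 = t**T1 with base t > 1, clamped to 255.
def enhance_exponential(t1, t=2):
    return min(255, t ** t1)
```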
Fig. 7 is a flowchart illustrating a control method of a lighting device according to an embodiment of the present application. In the embodiment shown in fig. 7, the lighting device is a television light strip comprising a plurality of LED lighting modules, each LED lighting module serving as a lighting assembly. The display device is a television, and the image acquisition device is a camera. The user's terminal may run an application program controlling the light strip, so the control method of the lighting device may be performed by a client running in the terminal. As shown in fig. 7, the method specifically includes steps 711-729, explained as follows:
In step 711, the client connects to the television light strip. The television light strip is provided with a communication module, which may be a Bluetooth module, a Wi-Fi module, a Zigbee module, or the like, so that the television light strip can be communicatively connected to the terminal where the client is located.
Step 712, displaying a first prompt message on the client. The first prompt message prompts the user to turn off the television. After the user turns off the television, the display screen of the television displays no picture, which makes it convenient for the camera to acquire the control image facing the display screen of the television.
In step 713, the camera is started for self-check. Starting the camera for self-check means starting the camera to acquire the control image while the display screen is not displaying a picture.
Step 714, a control image acquired by the camera is received.
Step 715, converting the control image from the YUV color space to the RGB color space. In this embodiment, the image data corresponding to the control image is in YUV format, so the control image is converted from the YUV color space to the RGB color space to obtain the RGB values of each pixel in the control image.
Step 716, judging whether the RGB average value in the pixel region where the target display region is located exceeds a set threshold value; if yes, go to step 717, if no, go to step 721. It will be appreciated that in step 716, pixel regions of each target display region in the control image need to be extracted from the control image, and thus, an RGB average is determined based on RGB values of each pixel in the extracted pixel regions of the target display region in the control image. The set threshold value may be 97, or may be set to another value according to actual needs.
In step 717, the lighting assembly corresponding to the target display area is controlled to be lighted.
Step 718, prompting the user whether a re-self-check is needed; if yes, returning to step 713; if not, performing step 719. In some embodiments, when it is determined that the RGB average in the pixel area where the target display area is located exceeds the set threshold, the client may further generate a second prompt message indicating that the RGB average of that pixel area in the control image exceeds the set threshold, making it convenient for the user to remove interfering objects from the television's environment according to the second prompt message. After removing the interfering objects, the user may determine that a re-self-check is needed, and the process returns to step 713 to perform the self-check again.
Step 719, storing the second RGB values corresponding to each target display area. A second RGB value is the RGB value indicated by the reference image data corresponding to a target display area; it is stored as interference information so that, in subsequent processing, the target area image data corresponding to the target display area in the picture image can be corrected with it. That is, after video mode is entered in step 721, the correction is performed based on the second RGB values.
Step 721, entering video mode. After the user turns on the television, the television displays pictures. Upon entering video mode, the camera is started to acquire video facing the display screen of the television while the television displays pictures, and the terminal receives the acquired video over its communication connection with the camera. Each video frame in the acquired video can be used as a picture image in the scheme of the present application.
Step 722, reading the image data of the target area corresponding to each target display area in one frame of image. In step 722, a video frame is selected from the video as a picture image, and target area image data corresponding to each target display area is extracted from the picture image according to the set target display area.
Step 723, converting the target area image data corresponding to the target display area from YUV space to RGB space to obtain the RGB value of each pixel.
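Step 723's color-space conversion can be sketched per pixel as follows; the BT.601 full-range coefficients are an assumption, since the patent does not specify which YUV variant the camera delivers:

```python
def yuv_to_rgb(y, u, v):
    """Convert one pixel from YUV to RGB (step 723).

    Assumes 8-bit full-range BT.601 YUV with chroma centered at 128;
    the patent does not name the exact YUV flavor.
    """
    def clamp(c):
        # keep the result in the 8-bit range
        return max(0, min(255, int(round(c))))

    r = y + 1.402 * (v - 128)
    g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
    b = y + 1.772 * (u - 128)
    return clamp(r), clamp(g), clamp(b)
```

A neutral gray round-trips unchanged, which is a quick sanity check for the chroma offsets.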
Step 724, judging whether the reference image data corresponding to the target display area is stored; if yes, go to step 725; if not, go to step 726.
Step 725, subtracting the corresponding second RGB value from the first RGB value corresponding to the target display area to obtain a third RGB value. The first RGB value is the RGB value indicated by the target area image data corresponding to the target display area; the third RGB value is the RGB value indicated by the target image data corresponding to the target display area.
In step 726, the first RGB value is determined as a third RGB value.
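A minimal sketch of the branch in steps 724-726, assuming 8-bit RGB tuples; clamping negative differences at 0 is also an assumption, since the patent does not state how they are handled:

```python
def correct_rgb(first, second=None):
    """Steps 724-726: subtract the stored reference (second) RGB value
    from the live (first) RGB value; pass through if no reference exists."""
    if second is None:
        # step 726: no reference image data stored for this area
        return first
    # step 725: difference, clamped at 0 (clamping is an assumption)
    return tuple(max(0, f - s) for f, s in zip(first, second))
```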
Step 727, multiplying the third RGB value by an enhancement coefficient of 1.3 to obtain a fourth RGB value. In other embodiments, the enhancement coefficient may take other values.
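Step 727 can be sketched as below; clamping at 255 is an assumption added to keep the fourth RGB value within the 8-bit range:

```python
def enhance_rgb(rgb, coefficient=1.3):
    """Step 727: multiply the corrected (third) RGB value by an
    enhancement coefficient to obtain the fourth RGB value."""
    # clamp at 255 (assumed) so enhancement cannot overflow 8-bit channels
    return tuple(min(255, int(round(c * coefficient))) for c in rgb)
```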
Step 728, determining the lighting color of the lighting component corresponding to the target display area according to the fourth RGB values of each pixel in the target display area.
Step 729, controlling the corresponding lighting assembly according to the determined lighting color. Thereafter, the process returns to step 722 to read the next frame and repeat steps 722-729.
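Putting steps 724-728 together for a single target display area, a minimal per-frame sketch might look as follows. The per-pixel averaging used to collapse the fourth RGB values into one lighting color in step 728 is an assumption (the patent only says the color is determined "according to" them), as is clamping at 0 and 255:

```python
def lighting_color_for_area(pixels_rgb, reference_rgb=None, coefficient=1.3):
    """Steps 724-728 for one target display area.

    pixels_rgb: list of (R, G, B) tuples already converted from YUV.
    reference_rgb: stored second RGB value, or None if no reference exists.
    """
    fourth_values = []
    for pixel in pixels_rgb:
        if reference_rgb is not None:
            # steps 724-725: subtract the reference, clamped at 0 (assumed)
            pixel = tuple(max(0, p - s) for p, s in zip(pixel, reference_rgb))
        # step 727: enhance, clamped to the 8-bit range (clamp is assumed)
        fourth_values.append(tuple(min(255, int(c * coefficient)) for c in pixel))
    # step 728: collapse the per-pixel values into one color (mean is assumed)
    n = len(fourth_values)
    return tuple(sum(px[i] for px in fourth_values) // n for i in range(3))
```

The returned tuple is the color handed to the lighting assembly in step 729.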
In the scheme of the present application, the comparison image acquired by the camera is used to detect whether interfering objects that would affect the determination of the lighting color exist around the television. The target area image data corresponding to the target display area in the picture image is corrected based on the reference image data corresponding to the target display area in the comparison image; the lighting color of the lighting component corresponding to the target display area is determined according to the corrected target image data; and the lighting component is controlled to light in the determined color. The color of the light presented by the lighting component is thus consistent with the color currently presented by the target display area, which improves the user's television-viewing experience and facilitates immersive viewing.
The following describes embodiments of the apparatus of the present application that may be used to perform the methods of the above-described embodiments of the present application. For details not disclosed in the embodiments of the apparatus of the present application, please refer to the above-described method embodiments of the present application.
FIG. 8 is a block diagram illustrating a control apparatus of a lighting device including at least one lighting assembly, according to one embodiment; each illumination assembly corresponds to one target display area in the display screen. As shown in fig. 8, the control device of the lighting apparatus includes: a picture image acquisition module 810, configured to acquire a picture image, where the picture image is acquired facing the display screen while the display screen displays a picture; an extraction module 820, configured to extract target area image data corresponding to the target display area from the picture image; a correction module 830, configured to correct the target area image data according to the reference image data corresponding to the target display area to obtain target image data, where the reference image data is determined according to the image data corresponding to the target display area in a comparison image, and the comparison image is acquired facing the display screen while the display screen does not display a picture; a lighting color determining module 840, configured to determine the lighting color of the lighting component corresponding to the target display area according to the target image data; and a control module 850, configured to control the corresponding lighting assembly to light according to the lighting color.
In some embodiments of the application, the control device of the lighting apparatus further comprises: a comparison image acquisition module, configured to acquire the comparison image; a second extraction module, configured to extract the image data corresponding to the target display area from the comparison image; a first statistical analysis module, configured to perform statistical analysis on the color components of the pixels in the image data to obtain first statistical characteristic parameters of the color components in the image data; and a reference image data determining module, configured to store the image data as the reference image data corresponding to the target display area if the first statistical characteristic parameter exceeds the set parameter range.
In some embodiments of the application, the control device of the lighting apparatus further comprises: a designating module, configured to set the reference image data corresponding to the target display area to a designated value if the first statistical characteristic parameter lies within the set parameter range.
In some embodiments of the application, the correction module 830 includes: a difference unit, configured to subtract the reference image data corresponding to the target display area from the target area image data to obtain the target image data.
In other embodiments of the present application, the correction module 830 includes: a first multiplication unit, configured to multiply the first color component value of each pixel indicated by the target area image data on a color component by the second color component value of the corresponding pixel indicated by the reference image data on that color component, to obtain a third color component value of the corresponding pixel; a dividing unit, configured to divide the third color component value by a first coefficient to obtain a fourth color component value, where the first coefficient is the difference between 1 and a specified parameter, and the specified parameter is the value obtained by normalizing the second color component value; an adding unit, configured to add the fourth color component value to the first color component value to obtain a fifth color component value; and a target image data determining unit, configured to combine the fifth color component values of all color components of each pixel to obtain the target image data.
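Written out for one color component, this alternative correction simplifies neatly. Assuming all component values are normalized to [0, 1] (an assumption; the patent does not fix the scale), the three operations compose to a single gain: fifth = first + first·second/(1 − second) = first/(1 − second). The helper name below is illustrative:

```python
def correct_component(first, second):
    """Alternative correction of module 830 for one color component.

    Assumes both values are normalized to [0, 1]; `second` must be < 1,
    since the first coefficient is 1 - second.
    """
    specified = second                  # second is already normalized here
    third = first * second              # first multiplication unit
    fourth = third / (1.0 - specified)  # dividing unit: divide by (1 - specified)
    fifth = first + fourth              # adding unit
    # Algebraically, fifth == first / (1 - second): the three steps amount
    # to scaling the live value up by the share of light the interferent adds.
    return fifth
```

In other words, the brighter the interference recorded in the reference image on a component, the more the live reading on that component is scaled up.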
In some embodiments of the present application, the lighting color determining module 840 includes: a second statistical characteristic parameter determining unit, configured to perform statistical characteristic analysis according to the color component information of each pixel indicated by the target image data to obtain a second statistical characteristic parameter of each color component; and a lighting color determining unit, configured to determine the lighting color of the lighting component corresponding to the target display area according to the second statistical characteristic parameter of each color component.
In some embodiments of the present application, a lighting color determining unit includes: the target color component value determining unit is used for enhancing the statistical characteristic parameter value on each color component to obtain a target color component value corresponding to each color component; and the combination determining unit is used for combining the target color component values corresponding to all the color components to obtain the lighting color of the lighting component corresponding to the target display area.
In some embodiments of the application, the extraction module 820 includes: a clipping unit, configured to clip the picture image according to the set designated area information to obtain the display screen pixel area where the display screen is located in the picture image; and a region extraction unit, configured to extract, from the display screen pixel area, the pixel region indicated by the position information of the target display area in the display screen, so as to obtain the target area image data corresponding to the target display area.
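The two-stage crop performed by the clipping unit and the region extraction unit can be sketched with plain list slicing. Representing the picture as a row-major list of pixel rows and each region as an (x, y, width, height) tuple is an assumption for illustration:

```python
def extract_target_area(picture, screen_box, target_box):
    """Extraction module 820: crop to the display-screen pixel area,
    then crop the target display area inside it.

    picture: row-major list of pixel rows.
    screen_box: (x, y, w, h) of the screen within the picture image.
    target_box: (x, y, w, h) of the target area, relative to the screen.
    """
    sx, sy, sw, sh = screen_box
    screen = [row[sx:sx + sw] for row in picture[sy:sy + sh]]
    tx, ty, tw, th = target_box
    return [row[tx:tx + tw] for row in screen[ty:ty + th]]
```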
Fig. 9 shows a schematic diagram of a computer system suitable for use in implementing an embodiment of the application.
It should be noted that, the computer system 900 of the electronic device shown in fig. 9 is only an example, and should not impose any limitation on the functions and the application scope of the embodiments of the present application.
As shown in fig. 9, the computer system 900 includes a Central Processing Unit (CPU) 901, which can perform various appropriate actions and processes, such as the methods in the above-described embodiments, according to a program stored in a Read-Only Memory (ROM) 902 or a program loaded from a storage section 908 into a Random Access Memory (RAM) 903. The RAM 903 also stores various programs and data required for system operation. The CPU 901, ROM 902, and RAM 903 are connected to each other through a bus 904. An Input/Output (I/O) interface 905 is also connected to the bus 904.
The following components are connected to the I/O interface 905: an input section 906 including a keyboard, a mouse, and the like; an output section 907 including a display such as a Cathode Ray Tube (CRT) or a Liquid Crystal Display (LCD), a speaker, and the like; a storage section 908 including a hard disk or the like; and a communication section 909 including a network interface card such as a LAN (Local Area Network) card, a modem, or the like. The communication section 909 performs communication processing via a network such as the Internet. A drive 910 is also connected to the I/O interface 905 as needed. A removable medium 911, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 910 as needed, so that a computer program read therefrom is installed into the storage section 908 as needed.
In particular, according to embodiments of the present application, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present application include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flowcharts. In such an embodiment, the computer program may be downloaded and installed from the network via the communication portion 909 and/or installed from the removable medium 911. When the computer program is executed by a Central Processing Unit (CPU) 901, various functions defined in the system of the present application are performed.
It should be noted that, the computer readable medium shown in the embodiments of the present application may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-Only Memory (ROM), an erasable programmable read-Only Memory (Erasable Programmable Read Only Memory, EPROM), flash Memory, an optical fiber, a portable compact disc read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present application, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wired, etc., or any suitable combination of the foregoing.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present application. Each block in the flowcharts or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustrations, and combinations of blocks in the block diagrams or flowchart illustrations, can be implemented by special-purpose hardware-based systems that perform the specified functions or acts, or by combinations of special-purpose hardware and computer instructions.
The units involved in the embodiments of the present application may be implemented by software or by hardware, and the described units may also be provided in a processor. In some cases, the names of the units do not constitute a limitation of the units themselves.
As another aspect, the present application also provides a computer-readable storage medium that may be contained in the electronic device described in the above embodiment; or may exist alone without being incorporated into the electronic device. The computer readable storage medium carries computer readable instructions which, when executed by a processor, implement the method of any of the above embodiments.
According to an aspect of the present application, there is also provided an electronic apparatus including: a processor; a memory having stored thereon computer readable instructions which, when executed by a processor, implement the method of any of the embodiments described above.
According to an aspect of embodiments of the present application, there is provided a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the method of any of the embodiments described above.
It should be noted that although in the above detailed description several modules or units of a device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functions of two or more modules or units described above may be embodied in one module or unit in accordance with embodiments of the application. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
From the above description of embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or in software in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present application may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (such as a CD-ROM, a USB flash drive, or a removable hard disk) or on a network, and includes several instructions to cause a computing device (which may be a personal computer, a server, a touch terminal, a network device, or the like) to perform the method according to the embodiments of the present application.
Other embodiments of the application will be apparent to those skilled in the art from consideration of the specification and practice of the embodiments disclosed herein. This application is intended to cover any variations, uses, or adaptations of the application following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the application pertains.
It is to be understood that the application is not limited to the precise arrangements and instrumentalities shown in the drawings, which have been described above, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (9)

1. A method of controlling a lighting device, characterized in that the lighting device comprises at least one lighting assembly; one of the illumination assemblies corresponds to one target display area in the display screen; the method comprises the following steps:
acquiring a picture image, wherein the picture image is acquired by an image acquisition device facing the display screen in the process of displaying the picture on the display screen;
extracting target area image data corresponding to the target display area from the picture image;
correcting the image data of the target area according to the reference image data corresponding to the target display area to obtain target image data;
if the first statistical characteristic parameter exceeds the set parameter range, storing image data corresponding to the target display area in the comparison image as reference image data corresponding to the target display area;
if the first statistical characteristic parameter is located in the set parameter range, setting the reference image data corresponding to the target display area as a designated value;
the first statistical characteristic parameter is obtained by performing statistical analysis according to color components of pixels in image data corresponding to the target display area in a comparison image; the reference image data is determined according to the image data corresponding to the target display area in a comparison image, and the comparison image is collected towards the display screen in the process that the display screen does not display a picture;
determining the lighting color of the lighting component corresponding to the target display area according to the target image data;
and controlling the corresponding lighting assembly to light according to the lighting color.
2. The method according to claim 1, wherein correcting the target area image data according to the reference image data corresponding to the target display area to obtain target image data includes:
and performing a difference operation on the target area image data and the reference image data corresponding to the target display area to obtain the target image data.
3. The method according to claim 1, wherein correcting the target area image data according to the reference image data corresponding to the target display area to obtain target image data includes:
multiplying a first color component value of each pixel indicated by the target area image data on a color component by a second color component value of a corresponding pixel indicated by the reference image data on the color component to obtain a third color component value of the corresponding pixel point;
dividing the third color component value by a first coefficient to obtain a fourth color component value, wherein the first coefficient is a difference between 1 and a specified parameter, and the specified parameter is a value obtained by normalizing the second color component value;
adding the fourth color component value to the first color component value to obtain a fifth color component value;
and combining the fifth color component values of all the color components of each pixel point to obtain the target image data.
4. The method of claim 1, wherein determining the lighting color of the lighting assembly corresponding to the target display area from the target image data comprises:
carrying out statistical feature analysis according to the color component information of each pixel indicated by the target image data to obtain a second statistical feature parameter of each color component;
and determining the lighting color of the lighting component corresponding to the target display area according to the second statistical characteristic parameters of each color component.
5. The method of claim 4, wherein determining the lighting color of the lighting assembly corresponding to the target display area according to the second statistical characteristic parameter of each color component comprises:
enhancing the statistical characteristic parameter value on each color component to obtain a target color component value corresponding to each color component;
and combining the target color component values corresponding to all the color components to obtain the lighting color of the lighting component corresponding to the target display area.
6. The method according to claim 1, wherein extracting target area image data corresponding to the target display area from the screen image includes:
cutting the picture image according to the set appointed area information to obtain a display screen pixel area where the display screen is located in the picture image;
and extracting pixel areas in the pixel areas of the display screen according to the position information of the target display area in the display screen, and obtaining target area image data corresponding to the target display area.
7. A control device of a lighting apparatus, characterized in that the lighting apparatus comprises at least one lighting assembly; one of the illumination assemblies corresponds to one target display area in the display screen; the device comprises:
the image acquisition module is used for acquiring an image which is acquired by an image acquisition device facing the display screen in the process of displaying the image on the display screen;
the extraction module is used for extracting target area image data corresponding to the target display area from the picture image;
the correction module is used for correcting the image data of the target area according to the reference image data corresponding to the target display area to obtain target image data; the reference image data is determined according to the image data corresponding to the target display area in a comparison image, and the comparison image is collected towards the display screen in the process that the display screen does not display a picture;
The first statistical analysis module is used for carrying out statistical analysis on the color components of all pixels in the image data to obtain first statistical characteristic parameters of the color components in the image data;
the reference image data determining module is used for storing the image data corresponding to the target display area in the comparison image as the reference image data corresponding to the target display area if the first statistical characteristic parameter exceeds the set parameter range;
the designating module is used for setting the reference image data corresponding to the target display area as a designated value if the first statistical characteristic parameter is in the set parameter range;
the lighting color determining module is used for determining the lighting color of the lighting component corresponding to the target display area according to the target image data;
and the control module is used for controlling the corresponding lighting assembly to light according to the lighting color.
8. An electronic device, comprising:
a processor;
a memory having stored thereon computer readable instructions which, when executed by the processor, implement the method of any of claims 1-6.
9. A computer readable storage medium having stored thereon computer readable instructions which, when executed by a processor, implement the method of any of claims 1-6.
CN202110963094.2A 2021-08-20 2021-08-20 Lighting device control method and device, electronic device and storage medium Active CN113709949B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110963094.2A CN113709949B (en) 2021-08-20 2021-08-20 Lighting device control method and device, electronic device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110963094.2A CN113709949B (en) 2021-08-20 2021-08-20 Lighting device control method and device, electronic device and storage medium

Publications (2)

Publication Number Publication Date
CN113709949A CN113709949A (en) 2021-11-26
CN113709949B true CN113709949B (en) 2023-11-03

Family

ID=78653740

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110963094.2A Active CN113709949B (en) 2021-08-20 2021-08-20 Lighting device control method and device, electronic device and storage medium

Country Status (1)

Country Link
CN (1) CN113709949B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114401574B (en) * 2021-12-02 2024-01-09 北京小米移动软件有限公司 Light emission control method and system, display device, and readable storage medium
CN113888657B (en) * 2021-12-08 2022-03-29 珠海视熙科技有限公司 Screen color acquisition method and device, camera equipment and storage medium

Citations (12)

Publication number Priority date Publication date Assignee Title
JP2006107905A (en) * 2004-10-05 2006-04-20 Sony Corp Illumination control device and illumination control method, recording medium, and program
JP2010256912A (en) * 2004-02-09 2010-11-11 Hitachi Ltd Lighting device, image display apparatus with the same, and image display method
JP2013218153A (en) * 2012-04-10 2013-10-24 Sharp Corp Illumination control device, control system, and illumination control method
CN103413526A (en) * 2013-08-19 2013-11-27 西安诺瓦电子科技有限公司 Correction method for LED lamp panel of LED display screen
CN103426403A (en) * 2013-08-16 2013-12-04 西安诺瓦电子科技有限公司 Image acquisition method for calibration, picture display method and calibration method for LED display screen
CN103971134A (en) * 2014-04-25 2014-08-06 华为技术有限公司 Image classifying, retrieving and correcting method and corresponding device
WO2016078598A1 (en) * 2014-11-19 2016-05-26 刘皓挺 Combined lighting apparatus and method based on image quality control
CN105933762A (en) * 2016-05-18 2016-09-07 海信集团有限公司 Linkage control method and device for smart home
JP2017135051A (en) * 2016-01-29 2017-08-03 株式会社Mass Illumination control method
CN208954611U (en) * 2018-10-11 2019-06-07 哈尔滨理工大学 A kind of outdoor LED display screen gamma correction system
CN110012572A (en) * 2019-03-25 2019-07-12 联想(北京)有限公司 A kind of brightness control method and device, equipment, storage medium
CN112512184A (en) * 2020-12-02 2021-03-16 深圳市智岩科技有限公司 Color-taking illumination control method, device, system and storage medium

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
HUP1000183D0 (en) * 2010-04-07 2010-06-28 Naturen Kft Controlling multicolor lighting based on image colors

Patent Citations (13)

Publication number Priority date Publication date Assignee Title
JP2010256912A (en) * 2004-02-09 2010-11-11 Hitachi Ltd Lighting device, image display apparatus with the same, and image display method
JP2006107905A (en) * 2004-10-05 2006-04-20 Sony Corp Illumination control device and illumination control method, recording medium, and program
JP2013218153A (en) * 2012-04-10 2013-10-24 Sharp Corp Illumination control device, control system, and illumination control method
CN103426403A (en) * 2013-08-16 2013-12-04 西安诺瓦电子科技有限公司 Image acquisition method for calibration, picture display method and calibration method for LED display screen
CN103413526A (en) * 2013-08-19 2013-11-27 西安诺瓦电子科技有限公司 Correction method for LED lamp panel of LED display screen
CN103971134A (en) * 2014-04-25 2014-08-06 华为技术有限公司 Image classifying, retrieving and correcting method and corresponding device
WO2016078598A1 (en) * 2014-11-19 2016-05-26 刘皓挺 Combined lighting apparatus and method based on image quality control
CN105682310A (en) * 2014-11-19 2016-06-15 刘皓强 Combined lighting device and method based on image quality control
JP2017135051A (en) * 2016-01-29 2017-08-03 株式会社Mass Illumination control method
CN105933762A (en) * 2016-05-18 2016-09-07 海信集团有限公司 Linkage control method and device for smart home
CN208954611U (en) * 2018-10-11 2019-06-07 哈尔滨理工大学 A kind of outdoor LED display screen gamma correction system
CN110012572A (en) * 2019-03-25 2019-07-12 联想(北京)有限公司 A kind of brightness control method and device, equipment, storage medium
CN112512184A (en) * 2020-12-02 2021-03-16 深圳市智岩科技有限公司 Color-taking illumination control method, device, system and storage medium

Non-Patent Citations (3)

Title
Feng Tian; Xiang Wu; Jingao Liu. Research and realization of innovative LED illumination system for DLP projector. 《2008 International Conference on Audio, Language and Image Processing》. 2008, full text. *
Android phone-based method for detecting projector brightness based on digital image principles; 丁基民, 商鸿发, 顾芸天, et al.; 《数字化用户》 (Digital User); Vol. 24, No. 9; full text *
Application of smart home in the lighting field; 刘海涛, 张奕, 高朋; 《人工智能》 (Artificial Intelligence); Vol. 18, No. 5; full text *

Also Published As

Publication number Publication date
CN113709949A (en) 2021-11-26

Similar Documents

Publication Publication Date Title
CN109783178B (en) Color adjusting method, device, equipment and medium for interface component
CN113709949B (en) Lighting device control method and device, electronic device and storage medium
JP6811321B2 (en) Display device and its control method
CN107995436A (en) A kind of light compensation method and device
CN112423083A (en) Dynamic video overlay
CN108717691B (en) Image fusion method and device, electronic equipment and medium
US20090009525A1 (en) Color Adjustment Device and Method
CN107591134B (en) MURA phenomenon compensation method, television and computer readable storage medium
CN110809120B (en) Light supplementing method for shot picture, smart television and computer readable storage medium
KR101985880B1 (en) Display device and control method thereof
CN106951891B (en) Light spot detection method and device
CN110536172B (en) Video image display adjusting method, terminal and readable storage medium
CN113795071B (en) Atmosphere lamp control method and device and storage medium
CN113597061A (en) Method, apparatus and computer readable storage medium for controlling a magic color light strip
US11889083B2 (en) Image display method and device, image recognition method and device, storage medium, electronic apparatus, and image system
CN111787671A (en) Control method based on movie and television picture synchronous light atmosphere
CN113301414B (en) Interface generation processing method and device, electronic equipment and computer storage medium
CN113781968A (en) Backlight adjusting method, device, equipment and storage medium
CN111311500A (en) Method and device for carrying out color restoration on image
CN112333399A (en) Shooting light supplementing method, intelligent terminal and readable storage medium
CN103959204A (en) Information processing device, information processing method, and recording medium
CN112435265B (en) Lamp effect synchronous processing method and device and computer readable storage medium
CN113556615A (en) Image processing method and device, and computer readable storage medium
CN113487497A (en) Image processing method and device and electronic equipment
CN113507572A (en) Video picture display method, device, terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant