CN117197261A - Atmosphere lamp equipment, color taking method thereof, corresponding device and medium - Google Patents


Info

Publication number: CN117197261A (application number CN202311472393.1A)
Authority: CN (China)
Prior art keywords: image, color, taking, target, outer side
Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Other versions: CN117197261B (en)
Inventors: 黄劲, 吴文龙
Current assignee: Shenzhen Zhiyan Technology Co Ltd; Shenzhen Qianyan Technology Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original assignee: Shenzhen Zhiyan Technology Co Ltd; Shenzhen Qianyan Technology Co Ltd
Events: application filed by Shenzhen Zhiyan Technology Co Ltd and Shenzhen Qianyan Technology Co Ltd; priority to CN202311472393.1A; publication of CN117197261A; application granted; publication of CN117197261B; current legal status: Active

Classifications

    • Y — General tagging of new technological developments; general tagging of cross-sectional technologies spanning over several sections of the IPC; technical subjects covered by former USPC cross-reference art collections [XRACs] and digests
    • Y02 — Technologies or applications for mitigation or adaptation against climate change
    • Y02B — Climate change mitigation technologies related to buildings, e.g. housing, house appliances or related end-user applications
    • Y02B 20/00 — Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B 20/40 — Control techniques providing energy savings, e.g. smart controller or presence detection

Abstract

The application relates to an atmosphere lamp device, a color-taking method thereof, a corresponding apparatus, and a medium. The method comprises the following steps: partitioning the target picture around its four sides, dividing each outer side to obtain a plurality of color-taking regions, and determining the region image corresponding to each color-taking region; detecting, in each region image, the black area belonging to the target outer side, and determining the edge black band of a target outer side according to the black areas of the plurality of region images belonging to that same outer side; determining, for each region image belonging to the same target outer side, the image content located outside the corresponding edge black band as the color-taking image of that region image; and setting, according to each color-taking image, the luminous color value of the light-emitting units in the unit frame of the atmosphere lamp device corresponding to that color-taking image. The application adapts to the limited computing conditions of atmosphere lamp devices, takes colors for the partitions of the target picture quickly, efficiently, and accurately, and improves the picture quality of the light effect corresponding to the target picture.

Description

Atmosphere lamp equipment, color taking method thereof, corresponding device and medium
Technical Field
The application relates to the field of illumination control, and in particular to an atmosphere lamp device, a color-taking method thereof, a corresponding apparatus, and a medium.
Background
As one kind of intelligent luminaire, an atmosphere lamp device can decorate an indoor space and display information. With rising living standards, atmosphere lamp devices are becoming increasingly popular. One of their functions is to generate a light effect corresponding to the lighting of a designated environment, thereby reinforcing the ambience of that environment. For example, a corresponding light effect may be generated according to the light of a display screen showing a game or video, or according to the light in a real-time image of a particular physical space. In short, the atmosphere lamp device may generate a corresponding light effect with an image as its reference.
In practice, complex lighting changes occur both on displays and in real-time images. On a display, edge black bands may appear because the video frame does not match the screen size; in a real-time image, an edge black band may appear at the top because the night sky is too dark. When the light-effect color is determined from such edge black bands, interference from the large number of black pixels reduces the vividness of the resulting light effect, making the light effect dark overall and failing to shape an effective atmosphere. This is clearly undesirable to consumers, so optimizing for this extreme case is necessary to strengthen the ability of the atmosphere lamp device to flexibly shape an appealing lighting atmosphere.
In addition, the control chip in an atmosphere lamp device is often an embedded chip with relatively limited computing power. When it takes colors from image content, an oversized image or an excessive number of pixels forces each processing step to occupy more computing power, making the device unstable and its operation sluggish. Therefore, while eliminating black edges, the concrete hardware conditions of the atmosphere lamp device must also be taken into account.
Disclosure of Invention
The application aims to provide an atmosphere lamp device, a color taking method thereof, a corresponding device and a medium.
According to an aspect of the present application, there is provided a color extracting method of an atmosphere lamp device, comprising:
partitioning the target picture around its four sides, dividing each outer side to obtain a plurality of color-taking regions, and determining the region image corresponding to each color-taking region;
detecting, in each region image, the black area belonging to the corresponding outer side of the target picture, and determining the edge black band of a target outer side according to the black areas of the plurality of region images belonging to that same outer side;
determining, for each region image belonging to the same target outer side, the image content located outside the corresponding edge black band as the color-taking image of that region image;
setting, according to each color-taking image, the luminous color value of the light-emitting units in the unit frame of the atmosphere lamp device corresponding to that color-taking image.
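As a concrete illustration of these four steps, the following is a minimal sketch in Python, processing only the top outer side and assuming the target picture is a small list of pixel rows of (r, g, b) tuples. The function names, the black threshold, and the choice of taking the minimum black depth as the fused edge black band are illustrative assumptions, not identifiers or values from the patent.

```python
# Hedged sketch of the four-step color-taking pipeline (top side only).
# BLACK_THRESHOLD is an assumed luminance cutoff, not a value from the patent.
BLACK_THRESHOLD = 16

def is_black(pixel):
    r, g, b = pixel
    return (r + g + b) / 3 < BLACK_THRESHOLD

def split_top_side(picture, n_regions, band_height):
    """Step 1: divide the top outer side into n_regions region images."""
    width = len(picture[0])
    region_w = width // n_regions
    return [
        [row[i * region_w:(i + 1) * region_w] for row in picture[:band_height]]
        for i in range(n_regions)
    ]

def detect_black_depth(region):
    """Step 2 (per region): count consecutive all-black rows from the outer edge."""
    depth = 0
    for row in region:
        if all(is_black(p) for p in row):
            depth += 1
        else:
            break
    return depth

def color_taking_images(picture, n_regions, band_height):
    """Steps 2-3: fuse per-region black depths into one edge black band, then
    keep only the image content outside that band in each region image."""
    regions = split_top_side(picture, n_regions, band_height)
    depths = [detect_black_depth(r) for r in regions]
    band = min(depths)  # conservative fusion across regions of the same side
    return [r[band:] for r in regions], band

def average_color(image):
    """Step 4: one luminous color value per color-taking image (mean tone)."""
    pixels = [p for row in image for p in row]
    n = len(pixels)
    return tuple(sum(c[i] for c in pixels) // n for i in range(3))
```

Fusing with the minimum depth is one conservative choice: it removes only the black band shared by every region image of that side, so no non-black content is discarded.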
According to another aspect of the present application, there is provided an atmosphere lamp device color extracting apparatus including:
a region segmentation module, configured to partition the target picture around its four sides into a plurality of color-taking regions and to determine the region image corresponding to each color-taking region;
a black band detection module, configured to detect, in each region image, the black area belonging to the corresponding outer side of the target picture, and to determine the edge black band of a target outer side according to the black areas of the plurality of region images belonging to that same outer side;
an image determining module, configured to determine, for each region image belonging to the same target outer side, the image content located outside the corresponding edge black band as the color-taking image of that region image;
and a color setting module, configured to set, according to each color-taking image, the luminous color value of the light-emitting units in the unit frame of the atmosphere lamp device corresponding to that color-taking image.
According to another aspect of the present application, there is provided an atmosphere lamp device comprising a central processor and a memory, the central processor being adapted to invoke and run a computer program stored in the memory so as to perform the steps of the color-taking method of the atmosphere lamp device.
According to another aspect of the application, there is provided a non-volatile readable storage medium storing, in the form of computer-readable instructions, a computer program implementing the color-taking method of the atmosphere lamp device, the computer program performing the steps comprised by the method when invoked and run by a computer.
The present application has many advantages over the prior art, including but not limited to:
Firstly, the application achieves accurate color taking by partition: a plurality of color-taking regions are determined for each outer side to obtain corresponding region images, and the black area in each region image is resolved; the black areas of the region images belonging to the same target outer side then jointly determine the edge black band of that outer side; according to the edge black band, the black areas of the region images of the corresponding outer side are removed to obtain each region image's color-taking image, and the luminous color values of the light effect are determined from the color-taking images. The control granularity for identifying edge black bands is thus refined to the level of individual region images, so black edges in the target picture are removed accurately and effectively. Taking colors from each color-taking image on this black-free basis improves the vividness of the determined luminous color values and the overall brightness of the light effect, yielding a better-quality light effect.
Secondly, the application can be implemented with low computational cost: the hardware conditions of the atmosphere lamp device are fully considered at every step of eliminating black edges during color taking. For example, when identifying edge black bands, the granularity is fixed at the region image, so black areas are identified on small images, and the edge black band of a target outer side is then determined quickly by fusing all the black areas belonging to that same outer side.
In addition, the application improves the picture quality of the light effect: it not only achieves accurate color taking by partition and runs efficiently on the atmosphere lamp device, but also refines the color-taking granularity down to each unit frame of the light effect, so that the light effect it plays has fine, smooth picture quality and vivid colors, with a marked improvement in overall quality.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required for describing the embodiments are briefly introduced below. The drawings described here are evidently only some embodiments of the present application, and a person skilled in the art may obtain other drawings from them without inventive effort.
Fig. 1 is a schematic diagram of an electrical structure of an atmosphere lamp device in an embodiment of the present application;
fig. 2 and 3 are schematic diagrams showing a display frame of an atmosphere lamp device according to an embodiment of the present application, wherein the atmosphere lamp of fig. 2 is arranged in a pattern of curtain lamps, and fig. 3 is arranged in a pattern of frame lamps;
fig. 4 is a flowchart of a color extraction method of an atmosphere lamp device according to an embodiment of the present application;
FIG. 5 is a schematic diagram of the target picture together with its edge black band and color-taking images;
FIG. 6 is a flow chart of determining an area image according to size information according to an embodiment of the present application;
FIG. 7 is a schematic flow chart of determining an edge black band according to an embodiment of the present application;
FIG. 8 is a flow chart of determining a luminescent color value according to an embodiment of the application;
FIG. 9 is a schematic flow chart of updating an edge black band according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of a color extracting device of an atmosphere lamp device according to an embodiment of the present application;
fig. 11 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
Referring to fig. 1, a schematic structural diagram of an atmosphere lamp device according to an embodiment of the present application, the atmosphere lamp device includes a controller 1, an atmosphere lamp 2, and an image acquisition interface. The atmosphere lamp 2 is electrically connected to the controller 1 so as to be controlled by a computer program running in the controller 1, the two cooperating to realize light-effect playback.
The controller 1 typically includes a control chip, a communication component, and a bus connector; in some embodiments, the controller 1 may additionally be configured with a power adapter, a control panel, a display screen, and the like, as needed.
The power adapter mainly converts mains power into direct current to supply the whole atmosphere lamp device. The control chip may be any of various embedded chips, such as a Bluetooth SoC (System on Chip), a WiFi SoC, an MCU (Microcontroller Unit), or a DSP (Digital Signal Processor), and generally includes a central processor and a memory, which respectively store and execute program instructions to implement the corresponding functions. Control chips of the above types may already integrate the communication component, which can otherwise be configured additionally as required. The communication component may be used to communicate with external devices, for example terminal devices such as personal computers or smartphones, so that after a user issues configuration instructions from a terminal device, the control chip of the controller 1 receives them through the communication component and completes the basic configuration needed to control the atmosphere lamp. In addition, the controller 1 can also acquire an interface image of the terminal device through the communication component, or acquire a real-time preview image captured by a camera. The bus connector mainly connects the atmosphere lamp 2 on the bus to the power supply and delivers light-effect playback instructions, and accordingly provides pins for the power bus and the signal bus; when an atmosphere lamp 2 is to be attached to the controller 1, its own connector is mated with the bus connector.
The control panel typically provides one or more keys for switching the controller 1 on and off, selecting among preset light-effect control modes, and so on. The display screen can show various control information, working together with the keys of the control panel to support human-machine interaction. In some embodiments, the control panel and the display screen may be integrated into a single touch display screen.
Referring to fig. 2, the atmosphere lamp in fig. 2 is configured as a curtain lamp. The atmosphere lamp 2 includes a plurality of light-emitting lamp strips 21 connected to the bus, each light-emitting lamp strip 21 comprising a plurality of serially connected lamp beads 210; the lamp beads 210 of each strip are generally equal in number and arranged at equal intervals. In use, the atmosphere lamp 2 serving as a curtain lamp is unfolded so that its light-emitting lamp strips 21 follow the layout shown in fig. 2 and all lamp beads of all strips form a matrix. When the beads emit light cooperatively, the whole matrix provides a picture effect, so the lamp bead matrix as a whole forms a display frame 4. A pattern effect can be formed within the display frame 4 when the light effect is played: a static light effect when a single pattern is displayed statically, and a dynamic light effect when patterns are switched in time sequence.
Each light-emitting lamp strip 21 is formed by connecting a plurality of lamp beads 210 in series, each lamp bead 210 being one light-emitting unit. The lamp beads 210 of the same strip draw their working current through the same group of cables connected to the bus, and in terms of electrical connection the beads of one strip may be wired in parallel. In one embodiment, the light-emitting lamp strips 21 of the same matrix may be arranged at equal intervals along the bus direction, with the lamp beads 210 of adjacent strips corresponding in number and position, so that when the light effect is viewed from a distance the whole display frame 4 acts much like a screen, forming a pattern that reads coherently to the human eye.
Similarly, referring to fig. 3, the atmosphere lamp in fig. 3 is laid out around the display of a terminal device to form a frame lamp, which may be composed of a single light-emitting lamp strip or several strips connected to the bus. The strips used by the frame lamp, and the lamp beads within them, have the same structure and communication mechanism as those of the curtain lamp. In the frame-lamp arrangement all lamp beads surround the display, so the display frame 4 formed on the basis of the lamp bead matrix is still treated as a whole, but no beads occupy its central part; beads are arranged only along the four sides, so that when the light effect is played, a light atmosphere is scattered inside and outside the extent of the display frame 4.
The controller 1 of the atmosphere lamp device realizes the operational control of the whole device and is responsible for its internal and external communication. The controller 1 also drives the image acquisition interface, through which an environment reference image is acquired frame by frame; the environment reference image may be an interface image of the terminal device or a live image of a physical space. A light-effect playback instruction for the corresponding frame is then generated according to each frame's environment reference image, and this instruction controls the curtain lamp to play the light effect of that frame.
Each lamp bead 210 of each light-emitting lamp strip 21 of the atmosphere lamp 2 is also provided with its own control chip, which may be of the types disclosed above or of another, more economical type. Its main function is to extract from the light-effect playback instruction the luminous color value corresponding to its lamp bead 210 and to control the light-emitting element in the bead to emit light of that color. The light-emitting element may be an LED.
The image acquisition interface may be either a hardware interface or a software interface implemented in the controller 1. As a hardware interface it may be a camera: when the camera is aimed at a target picture, for example the display desktop of a terminal device, or at a physical space environment, it captures images at a certain frame rate and interface images can thus be acquired. As a software interface, the image acquisition interface may be an image acquisition program implemented on the controller 1 side using the graphics infrastructure technology provided by the operating system of the terminal device; the controller 1 connects to the terminal device through a cable such as an HDMI or Type-C connection line, so that interface images of the terminal device can be obtained continuously with the support of that technology. Of course, if the controller 1 and the terminal device have pre-established a wireless screen-casting protocol, the controller 1 may also acquire the interface images of the terminal device wirelessly. The graphics infrastructure technology varies with the operating system; for example, the Windows operating system provides Microsoft DirectX Graphics Infrastructure (DXGI) to implement this function.
Thus, when the image acquisition interface acquires the environment reference image, the user can flexibly choose the specific environment from which images are taken. For example, when the image acquisition interface is a camera, the user may aim it at the graphical user interface of a computer so that the captured interface image serves as the target picture for light-effect playback, and the atmosphere lamp 2 generates the corresponding light effect from that interface image; the user may equally aim the camera at a physical space such as an outdoor environment and capture a live image as the environment reference image, so that the atmosphere lamp 2 plays a light effect corresponding to the real scene.
When the atmosphere lamp device is powered on, the control chip of the controller calls and executes the computer program from the memory, powers on and initializes the atmosphere lamp through the program's default initialization flow, and completes the driving configuration of the atmosphere lamp and the other hardware.
In one embodiment, when the atmosphere lamp is started, the controller may first send a self-check instruction to the atmosphere lamp, driving each lamp bead in each light-emitting lamp strip to return its position information within the strip. Each lamp bead is provided with a control chip for data communication with the control chip in the controller, so that its feature information can be chained in sequence with that of the other beads according to a serial communication protocol, thereby representing the bead's position. The serial communication protocol between the controller and the lamp beads may be any of IIC (Inter-Integrated Circuit bus), SPI (Serial Peripheral Interface), or UART (Universal Asynchronous Receiver-Transmitter). After the controller obtains the self-check result data of each bead from the bus and parses it, the position of each bead within the display frame 4 presented by the whole atmosphere lamp can be determined from the order of the beads' feature information in the result data. Each lamp bead can therefore serve as a light-emitting unit, understood as a basic pixel, and when the controller later constructs a light-effect playback instruction, it can set the luminous color value of each basic pixel as required according to the bead's position information.
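The position-recovery step described above can be illustrated with a small hedged sketch: assuming the beads report their feature information in bus order and every strip carries the same number of beads, the serial index alone recovers each bead's (strip, position-on-strip) coordinates. The names `result_ids` and `beads_per_strip` are illustrative, not identifiers from the patent.

```python
# Hedged sketch: mapping each bead's feature information, as ordered in the
# self-check result data, to a (strip_index, index_on_strip) position.
def bead_positions(result_ids, beads_per_strip):
    """Assign positions by serial order: bead i sits on strip i // beads_per_strip
    at offset i % beads_per_strip."""
    return {
        bead_id: divmod(i, beads_per_strip)
        for i, bead_id in enumerate(result_ids)
    }
```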
After initialization, the controller can continuously acquire environment reference images through the image acquisition interface as target pictures and take colors from each target picture to determine the luminous color value of every light-emitting unit in the display frame. To this end, the display frame of the atmosphere lamp may be divided into a plurality of unit frames and the target picture into a plurality of color-taking images, in one-to-one correspondence with the unit frames; a color value is determined according to the dominant tone of each color-taking image, and the luminous color values of all light-emitting units in the unit frame corresponding to that color-taking image are then generated from that color value.
Accordingly, in one embodiment, the atmosphere lamp device may preset frame configuration information for the atmosphere lamp, which specifies the granularity of subdivision across the plane of the whole display frame 4, so that the display frame 4 can be divided into the plurality of unit frames 40 under its constraints. Typically, as shown in fig. 2, each unit frame 40 obtained by the division spans several adjacent light-emitting lamp strips and covers several lamp beads on each of them. The division granularity in the frame configuration information may be set flexibly. For example, the frame configuration information may specify division into a nine-square or sixteen-square grid, expressed as a total number of unit frames such as 9 or 16, so that the unit frames 40 are divided over the whole display frame 4 from the total number of lamp beads according to the set value. Alternatively, it may specify the number of light-emitting lamp strips spanned by each unit frame (its column count) and the number of lamp beads spanned by each unit frame (its row count), whereupon the lamp beads 210 covered by each unit frame 40 are determined from those row and column counts. It is also possible to set only the total number of unit frames 40 to be generated in the frame configuration information.
The frame configuration information may be set through the human-machine interaction function realized on the controller, or set on a terminal device that has established a data communication connection with the controller and then transmitted to it; the controller stores the frame configuration information in the memory of its control chip and calls it as required.
Considering that the light-emitting lamp strips of the atmosphere lamp can be flexibly added or removed as required, in one embodiment the controller may first detect the number of strips, more specifically the total number of light-emitting lamp strips of the whole atmosphere lamp, and from it grasp the total number of basic pixels of the whole display frame. Suppose the frame configuration information sets the value 16, meaning the whole display frame of the atmosphere lamp is divided into a 4*4 matrix of unit frames: the controller computes the column count from the number of light-emitting lamp strips and the row count from the number of lamp beads per strip, determines how many rows and columns each unit frame occupies, establishes mapping data between each unit frame and the strips and beads corresponding to its rows and columns, and thereby completes the division into unit frames and obtains the data associated with each one. Dividing the display frame in this way, the numbers of strips and beads may change while the pixel density of each unit frame adjusts flexibly: the controller adaptively sets the pixel density of the unit frames as strips are added to or removed from the atmosphere lamp, the playback logic of the light effect stays unchanged, and the light effect can still be played normally even after strips are added or removed.
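The adaptive division just described can be sketched as follows, under stated assumptions: the configured total is a perfect square (e.g. 16 giving a 4*4 grid), strips map to grid rows and beads along a strip to grid columns, and leftover strips or beads are absorbed by the last row or column. The function name and signature are illustrative, not from the patent.

```python
# Hedged sketch of adaptive unit-frame division: assign every (strip, bead)
# pair to a unit-frame index given the detected strip/bead counts and the
# total unit-frame count from the frame configuration information.
import math

def divide_unit_frames(num_strips, beads_per_strip, total_frames):
    side = math.isqrt(total_frames)  # e.g. 16 -> a 4x4 grid
    assert side * side == total_frames, "expects a square grid such as 9 or 16"
    rows_per = num_strips // side       # strips covered by one frame row
    cols_per = beads_per_strip // side  # beads covered by one frame column
    mapping = {}
    for strip in range(num_strips):
        for bead in range(beads_per_strip):
            # clamp so leftover strips/beads fall into the last row/column
            frame = (min(strip // rows_per, side - 1) * side
                     + min(bead // cols_per, side - 1))
            mapping[(strip, bead)] = frame
    return mapping
```

Because the grid is derived from the detected counts, adding or removing strips only changes `rows_per`, leaving the playback logic untouched, which matches the adaptive behavior described above.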
In another embodiment, the frame configuration information may specify the number of light-emitting lamp strips and the number of lamp beads covered by each unit frame, for example 4*4 lamp beads forming one unit frame. When light-emitting lamp strips are added or removed, the next division of unit frames partitions the whole display frame according to this established specification, so the numbers of strips and beads covered by each unit frame stay unchanged while the number of unit frames grows or shrinks with the strips, raising or lowering the density of unit frames across the display frame; normal light-effect playback of the atmosphere lamp is thus preserved.
Since the atmosphere lamp is configured according to the number and layout of the assembled light-emitting lamp strips, fig. 2 and 3 show that the unit frames of a curtain lamp and of a frame lamp can be set by the same division method; however, considering that a frame lamp can be treated linearly as a single light-emitting lamp strip, its unit frames can be divided in a simpler way. For example, when the controller knows the total number of lamp beads and the frame configuration information, it can simply divide the total number of beads equally and quickly determine the unit frames in the two directions. One simple implementation divides the whole display frame into four unit frames, upper-left, upper-right, lower-left, and lower-right, with the environment reference image correspondingly split into four region images later on, which is fast and efficient.
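The simple four-way case mentioned above can be sketched directly: split the environment reference image, taken here as a list of pixel rows, into upper-left, upper-right, lower-left, and lower-right region images. The function name and the dictionary keys are illustrative assumptions.

```python
# Hedged sketch: equal division of a picture into four quadrant region images,
# mirroring the four-unit-frame division of the display frame.
def quadrants(picture):
    h, w = len(picture), len(picture[0])
    top, bottom = picture[:h // 2], picture[h // 2:]
    return {
        "upper_left":  [row[:w // 2] for row in top],
        "upper_right": [row[w // 2:] for row in top],
        "lower_left":  [row[:w // 2] for row in bottom],
        "lower_right": [row[w // 2:] for row in bottom],
    }
```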
Of course, the frame configuration information may also have other defining manners, for example, the frame configuration information may be set according to a rule of partitioning an environmental reference image serving as a target picture, in any case, the rule of partitioning the display frame of the atmosphere lamp is generally consistent with the rule of partitioning the target picture, so as to ensure that each color taking image determined after partitioning in the target picture has one unit frame corresponding to one unit frame in the atmosphere lamp, so that the projection of a color value corresponding to each color taking image to a corresponding unit frame in the atmosphere lamp is conveniently realized, and the overall light atmosphere effect of projecting the target picture on the whole display frame of the atmosphere lamp is realized.
According to the product architecture and working principle of the atmosphere lamp device described above, the color-taking method of the atmosphere lamp device can be implemented as a computer program product stored in a memory of the control chip in the controller of the atmosphere lamp device. The central processing unit of the control chip loads the program from the memory and runs it, and while running controls the atmosphere lamp to play the corresponding lighting effect according to the environment reference image acquired through the image acquisition interface.
Referring to fig. 4, in one embodiment, the color-taking method of the atmosphere lamp device of the present application is implemented mainly on the controller side of the atmosphere lamp device and executed by the control chip of the controller, and includes:
Step S5100, dividing the target picture around its four sides into a plurality of color-taking areas, and determining the corresponding area image of each color-taking area;
During operation, the controller can continuously obtain environment reference images through the image acquisition interface at fixed time intervals. An environment reference image may be an interface image or a live-action image, where an interface image is an image generated by the graphical user interface of a terminal device. The environment reference image is used to determine the lighting effect to be played by the atmosphere lamp, generate the corresponding lighting-effect playing instruction, and control the atmosphere lamp to play that effect. Once one environment reference image has been used as the target picture to generate and play its lighting effect, the next environment reference image can be acquired as a new target picture to generate and play the next frame's effect, so that the lighting effect played by the whole atmosphere lamp stays essentially synchronous with the changing image stream acquired through the image acquisition interface, achieving the effect of extending and projecting the light atmosphere of an external image into the atmosphere lamp.
In one embodiment, if the acquired environmental reference image is not bitmap data, it may be converted to bitmap data first for pixel-based operations.
In one embodiment, after the controller obtains an environment reference image as the target picture, if the target picture is oversized it can be compressed to a specified size specification to reduce the computation load on the controller's control chip. For example, when the environment reference image is an interface image, the display resolution of conventional terminal devices is generally 1920×1080 or above, while the pixel density formed by the light-emitting units of the atmosphere lamp is far lower than the size of the interface image, so the target picture can be compressed to a desired size, for example a resolution of 320×180; subsequent references to the target picture then use the compressed image.
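A minimal sketch of the compression step, assuming nearest-neighbor sampling over a bitmap held as a list of rows of RGB tuples; the disclosure does not specify the scaling algorithm, so this is only one plausible choice.

```python
# Nearest-neighbor downscale of a bitmap (list of rows of RGB tuples)
# to a fixed working size such as 320x180, as an illustrative sketch.

def downscale(image, dst_w, dst_h):
    src_h, src_w = len(image), len(image[0])
    return [
        [image[y * src_h // dst_h][x * src_w // dst_w] for x in range(dst_w)]
        for y in range(dst_h)
    ]

big = [[(x % 256, y % 256, 0) for x in range(640)] for y in range(360)]
small = downscale(big, 320, 180)   # later color-taking steps operate on `small`
```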
After the target picture is determined, it can be subjected to region segmentation, and the segmentation rule can be described in the frame configuration information for direct invocation. The frame configuration information describes the segmentation rule, whose purpose is to divide the display frame of the target picture and of the atmosphere lamp into a plurality of regions for color taking, i.e. color-taking regions; how many color-taking regions lie on each of the four sides of the target picture can therefore be defined in the frame configuration information through the description of the segmentation rule, usually specified per pair of opposite sides. According to the various cases disclosed above, such as the nine-square grid and the sixteen-square grid, all four sides of the target picture undergo region segmentation, so that a plurality of color-taking regions are cut on each outer side of the target picture. The image content covered by each color-taking region is its corresponding area image.
In the present application, when region segmentation is performed on the target picture, as shown in fig. 5, the focus is on the segmentation of the outer sides of the target picture, each of which requires a plurality of color-taking regions; generally each outer side is divided equally along its length. As for the central region enclosed by the color-taking regions of the outer sides, it may be unified into a single central region, or formed into a plurality of central unit regions following the same equal-division specification as the sides. Segmenting the target picture in this way concentrates on the four sides rather than on the central region; this targeting reduces the computation needed to determine the black bands of the target picture and saves computing overhead on the controller's control chip.
In general, to facilitate subsequent multithreaded concurrent detection of the black region of each color-taking area, the target picture is segmented by equal-division cutting, ensuring that every color-taking area obtained on the four sides has the same size specification, with identical length and width across color-taking areas, so that a standardized thread can determine the black region of each color-taking area.
Step S5200, detecting, in each area image, the black region belonging to a target outer side of the target picture, and determining the edge black band corresponding to that target outer side according to the black regions of the plurality of area images belonging to the same target outer side;
For each target outer side, its edge black band must be determined so that the pixels belonging to the edge black band can be excluded from each area image situated on that outer side; the remaining image content serves as the color-taking image that effectively represents the dominant hue.
An outer side for which an edge black band needs to be determined is regarded as a target outer side. Which outer sides are target outer sides can be set according to actual conditions. For example, when the environment reference image used as the target picture is an interface image of a terminal device, the frame of a video or game picture may differ from that of the graphical user interface: a 21:9 picture displayed in a 16:9 graphical user interface produces edge black bands at the top and bottom edges of the picture, as shown in fig. 5. In this case, the target outer sides can be set to the top and bottom sides of the target picture. Similarly, a single outer side, or all four outer sides, can serve as target outer sides, as actual needs dictate. For each target outer side, the corresponding edge black band is determined with exactly the same service logic in this step.
When determining the edge black band of a target outer side, the black region of each of its area images is determined first, and the edge black band is then determined from the black regions of all the area images on that target outer side.
When detecting the black region of each area image, recognition efficiency can be improved by setting a recognition starting point. Specifically, starting from one corner point on the outermost edge of the area image on the target outer side, each pixel is checked for whether its color value represents black; taking the set of pixels on the outermost edge containing that corner point as a base line, consecutive fully black lines are determined inward, yielding the black region of the area image. It will be appreciated that this black region is also the part of the area image closest to the corresponding target outer side.
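The base-line scan just described can be sketched as follows for an area image on the top target outer side; the strict equality test against pure black is a simplification (the tolerance rules discussed later would relax it), and the function name is illustrative.

```python
# Count consecutive fully black rows from the outer edge inward; row 0
# of `region` is the outermost base line of the area image.

def black_region_depth(region, is_black=lambda p: p == (0, 0, 0)):
    depth = 0
    for row in region:
        if all(is_black(p) for p in row):
            depth += 1
        else:
            break                     # picture content reached
    return depth

region = [[(0, 0, 0)] * 4,            # base line: fully black
          [(0, 0, 0)] * 4,            # second fully black line
          [(200, 30, 30)] * 4]        # picture content starts here
```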
When black regions are detected with the area image as the unit, the total number of pixels in a single area image is small, so identifying the black region of a whole area image consumes very little computing power and buffer space. This lowers the performance requirements on the control chip and saves implementation cost of the atmosphere lamp device.
In one embodiment, provided the controller's control chip supports multithreaded operation, the process of identifying the black region of a single area image is implemented in advance as the instruction set of a single thread. For each target outer side whose area images need their black regions determined, a corresponding thread is created for each area image and the threads run concurrently, significantly shortening the time to identify the black regions of all area images and speeding up the control chip's determination of the edge black band of each target outer side.
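The one-thread-per-area-image arrangement can be illustrated with Python's threading module; on a real control chip this would be an RTOS task or similar, so treat the sketch as a shape, not an implementation.

```python
import threading

# Run one black-region scan per area image concurrently and collect the
# results in order; `scan` is whatever per-image detection routine is used.

def scan_all(regions, scan):
    results = [None] * len(regions)

    def worker(i):
        results[i] = scan(regions[i])

    threads = [threading.Thread(target=worker, args=(i,)) for i in range(len(regions))]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results
```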
After the black region of each area image on a target outer side is determined, each black region can be represented in coordinate form, for example as a window (x0, y0, x1, y1) containing the coordinates of its upper left and lower right corners in the coordinate system of the target picture, where (x0, y0) denotes the upper left corner pixel and (x1, y1) the lower right corner pixel. For ease of understanding, the direction along the target outer side is defined as the length direction of the black region, and the direction perpendicular to it as the width direction. Accordingly, for one target outer side, the edge black band is determined by taking the smallest width among its black regions as the band's width and the length of the target outer side as the band's length. The edge black band can likewise be expressed in window form by the coordinates of its upper-left and lower-right corner pixels.
In one embodiment, the edge black bands of the target outer sides determined from one target picture may be stored as historical edge black bands for use with subsequent target pictures. When a condition is met, for example when the next target picture is detected not to match the historical edge black bands, new edge black bands can be generated from that picture. Alternatively, on a timed basis, the historical edge black bands determined from some target picture are used for a period of time; when the timer fires, new edge black bands are generated from the target picture being processed at that moment and stored as the new historical edge black bands. Both approaches further reduce the system overhead of the control chip and improve operating efficiency.
Step S5300, determining the image content outside the corresponding edge black band in each area image belonging to the same target outer side as the color-taking image corresponding to that area image;
Once the edge black band of any target outer side is determined, it can be used to cut away, from each area image on that side, the image content overlapping the band; the image content that does not overlap the band becomes the effective content and constitutes the color-taking image corresponding to the area image.
In one embodiment, each area image may be cropped according to the edge black band, so that the variable representing the area image no longer stores the color values of the pixels belonging to the band portion; the data volume of the variable shrinks, further reducing memory occupation.
It will be appreciated that the edge black band of each target outer side is essentially a rectangular black region spanning that entire side of the target picture. Even if a black pixel appears elsewhere in the target picture, outside this rectangle, it is not treated as an invalid pixel but remains an effective pixel, forming part of the image content of the corresponding color-taking image. Therefore, once the edge black bands are accurately determined, black pixels in the target picture can be correctly classified as effective pixels of a color-taking image or not, ensuring that the image content of each color-taking image is correctly understood and faithfully conveys the original picture.
Step S5400, setting the light-emitting color values of the light-emitting units in the unit frame of the atmosphere lamp device corresponding to each color-taking image according to that color-taking image.
After the color-taking image of each area image on each target outer side has been determined in the above manner, the light-emitting color value of each light-emitting unit in the unit frame corresponding to that color-taking image can be determined from the color-taking image.
When the light-emitting color values of the lamp beads in each unit frame are determined, they are generated with reference to the dominant hue of the color-taking image mapped to that unit frame, so that each unit frame broadly reflects the dominant hue formed by its color-taking image. This projects the light atmosphere of the whole environment reference image onto the whole display frame of the atmosphere lamp, and the lighting effect played by the atmosphere lamp effectively extends the light atmosphere of the environment reference image.
It should be noted that when the atmosphere lamp is a frame lamp, no lamp beads are disposed in the central portion of the display frame, so the correspondence between the unit frames of the central portion and area images need not be considered; alternatively, the area images of the central portion can be merged into the nearby side area images of the environment reference image and mapped, together with those side area images, to the unit frames corresponding to the sides.
After the color value of the dominant hue of a color-taking image is determined, the color value of each light-emitting unit, i.e. each lamp bead, covered by the corresponding unit frame can be determined further. As long as the light-emitting color value of every lamp bead in the unit frame remains related to the dominant hue's color value and stays within a reasonable variation range of it, the lamp beads' lighting effect projects the dominant hue of the area image, achieving the atmosphere-extension effect.
When determining the dominant hue of an area image, the dominant color constituting the area image may be taken as its dominant hue. In one embodiment, a hue analysis is performed on the area image first: a number of hue intervals are determined, the total number of pixels in each interval is counted, the pixels of the interval with the largest count are taken as the dominant pixels reflecting the dominant hue, and the dominant hue is determined from the color values of those pixels. In another embodiment, since highlights and dark portions of an image are generally not regarded as hues the image expresses, the pixels corresponding to dark portions and/or highlights in the area image may be filtered out in advance, and the dominant hue determined from the color values of the remaining pixels. Dominant hues determined in these ways pick out the significant color in a complex hue distribution and better reflect the dominant color the area image is meant to express, so the lighting effect reflects the area image more accurately. Given the relevant target pixels, the dominant hue can be computed as the mean of their color values, or by first discarding extreme values within a preset range and then taking the mean; the final mean serves as the light-emitting color value of the area image.
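One way to realize the second embodiment above, filtering dark and highlight pixels and averaging the rest, can be sketched as follows; the per-channel-mean thresholds of 30 and 225 are illustrative assumptions, not values given in the disclosure.

```python
# Estimate a dominant color: drop dark and highlight pixels by mean
# channel value, then average what remains (falling back to a plain
# mean if everything was filtered out).

def dominant_color(pixels, lo=30, hi=225):
    kept = [p for p in pixels if lo <= sum(p) / 3 <= hi]
    if not kept:
        kept = pixels
    n = len(kept)
    return tuple(sum(p[i] for p in kept) // n for i in range(3))

pixels = [(0, 0, 0), (255, 255, 255), (100, 50, 50), (120, 70, 70)]
```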
After the light-emitting color values of the lamp beads in each unit frame of the whole display frame have been determined in the above manner, default service logic generates, from each lamp bead's color value and position information, the lighting-effect playing instruction corresponding to the current frame's environment reference image and transmits it to each light-emitting strip of the atmosphere lamp. The control chip of each lamp bead in each strip parses and extracts, according to the serial communication protocol, the color value addressed to that lamp bead and drives the corresponding light-emitting element to emit light of that color, realizing coordinated playback of the current frame's lighting effect.
The above embodiments show that the present application has various advantages over the prior art, including but not limited to:
First, the application achieves accurate color taking by region: a plurality of color-taking areas are determined for each outer side to obtain corresponding area images; the black region of each area image is resolved; the black regions of the area images on the same target outer side jointly determine the edge black band of that side; the band is then removed from each area image on that side to obtain its color-taking image, and the light-emitting color values of the lighting effect are determined from the color-taking images. The control granularity for identifying edge black bands is thus refined to the area-image level, black edges in the target picture are removed accurately and effectively, and color is taken from each color-taking image with black eliminated, improving the vividness of the determined color values, raising the overall brightness of the lighting effect, and yielding better effect quality.
Second, the application can be implemented with low computing power: the hardware constraints of the atmosphere lamp device are considered at every black-edge-elimination step of the color-taking process; for example, edge black bands are identified at the granularity of area images, black regions are identified on small images, and the edge black band of a target outer side is then quickly determined by fusing all the black regions on that side.
In addition, the application improves the picture quality of the lighting effect: it not only takes color accurately by region and runs efficiently in the atmosphere lamp device, but also refines the color-taking granularity down to each unit frame of the lighting effect, so the played effect has fine picture quality and vivid color, and the overall quality is markedly improved.
On the basis of any method embodiment of the present application, referring to fig. 6, dividing the target picture around its four sides to obtain a plurality of color-taking areas on each outer side and determining the corresponding area image of each color-taking area includes:
Step S5110, obtaining size information of the target picture, the size information comprising a transverse size and a longitudinal size;
The size information of the target picture can be determined by counting its pixels in the length and width directions. Whether the controller obtains the target picture through the image acquisition interface or compresses it to a preset size specification, it can read off the corresponding transverse and longitudinal sizes from the length and width directions of the target picture's image content.
Step S5120, dividing the transverse size and the longitudinal size equally by preset partition counts, and correspondingly determining a transverse unit size and a longitudinal unit size to form region definition information;
The partition counts for the transverse and longitudinal directions of the target picture can be preset in the frame configuration information; dividing the transverse size and the longitudinal size by the corresponding counts yields the transverse unit size and the longitudinal unit size. Together these two define a color-taking frame of standardized specification, forming the region definition information.
Step S5130, performing region segmentation around the four sides of the target picture according to the transverse unit size and the longitudinal unit size in the region definition information, obtaining the corresponding color-taking areas and the remaining central region;
After the standardized color-taking frames are obtained, taking any corner point of the target picture as the starting point, the window-form coordinates of each color-taking area on each outer side are calculated, performing surrounding region segmentation along the four sides of the target picture to obtain each color-taking area and separate out the remaining central region. The central region generally consists of one or more areas of color-taking-frame size and can be treated as a single region.
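Steps S5110 to S5130 can be sketched together: divide the picture's sizes by the partition counts and emit the window (x0, y0, x1, y1) of every color-taking area along the four sides, leaving the central cells out. The function name and the 4×4 example are illustrative.

```python
# Equally divide a width x height picture into cols x rows cells and
# return the windows of the border cells only (the color-taking areas);
# interior cells form the central region and are skipped here.

def border_windows(width, height, cols, rows):
    uw, uh = width // cols, height // rows   # transverse / longitudinal unit size
    return [
        (c * uw, r * uh, (c + 1) * uw, (r + 1) * uh)
        for r in range(rows)
        for c in range(cols)
        if r in (0, rows - 1) or c in (0, cols - 1)
    ]

w = border_windows(320, 180, 4, 4)   # 4x4 grid: 12 border cells, 4 central cells
```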
Step S5140, taking the image content of each color-taking area as the corresponding area image for subsequent processing, and directly taking the image content of the central region as a color-taking image for subsequent processing.
For each color-taking area determined on each outer side, the image content within it is the corresponding area image. Each area image then undergoes the subsequent processing disclosed in the embodiments of the present application, for example steps S5200 to S5400, so that its color-taking image is determined by removing the corresponding edge black band and the light-emitting color values of the unit frame in the atmosphere lamp corresponding to that color-taking image are then determined.
The image content of the central region needs no black-edge processing and can be handled flexibly according to the form of the atmosphere lamp. In one embodiment where the atmosphere lamp is a frame lamp, the central region's content need not be processed at all; in another embodiment where the atmosphere lamp is a curtain lamp, the central region's content can be used directly as a color-taking image, proceeding straight to step S5400 to determine the light-emitting color values of the corresponding unit frame.
According to the above embodiment, region segmentation of the target picture can be completed rapidly from the partition counts set for the transverse and longitudinal directions, so each area image is obtained quickly with extremely little computation and high efficiency; identification of the image content in the central region is skipped along the way, saving the control chip's computing power and optimizing computational efficiency, an effect that is all the more pronounced on embedded chips with limited computing power.
On the basis of any method embodiment of the present application, referring to fig. 7, detecting in each area image the black region belonging to a target outer side of the target picture and determining the edge black band of that target outer side from the black regions of the plurality of area images belonging to the same target outer side includes:
Step S5210, determining two opposite outer sides of the target picture as target outer sides, the two outer sides being the top side and the bottom side of the display frame of the target picture;
In this embodiment, the key service setting is to take two opposite outer sides of the target picture, namely the top side and the bottom side, as the target outer sides requiring black-edge processing, simplifying the processing that removes black-edge interference from the target picture.
Step S5220, for each area image on a target outer side, performing pixel traversal from a pixel at the outermost corner of the area image, and identifying the black-pixel connected domain formed in the area image by extending inward in the width direction with the outermost edge as the base line;
When the black region of each area image on a target outer side needs to be determined, the whole area image can be traversed starting from a pixel at its outermost corner on that side, identifying the black-pixel connected domain formed by extending in the width direction toward the inner side opposite the target outer side, with the outermost pixel set containing that corner point serving as the base line.
Specifically, during traversal, each pixel's color value is checked for whether it represents a black phase: if so, the pixel counts as an effective pixel of the black-pixel connected domain; otherwise it falls outside the domain, and so on until the whole connected domain is determined. For a base line to count as a member of the effective black-pixel connected domain, the entire pixel set along the length direction of the area image on the target outer side must consist of black pixels; such base lines make up the inward extent of the connected domain. Whether a pixel is a black pixel, i.e. whether its color value represents a black phase, can be judged against the standard black color value RGB(0, 0, 0) with a certain tolerance allowed; for example, a pixel whose color value does not exceed RGB(10, 10, 10) may be recognized as black and thus fall within the connected domain.
In one embodiment, the tolerance range can be enlarged further: when the number of non-black pixels in a base line does not exceed a set proportion of the total number of pixels in that base line, the base line is still regarded as lying within the black-pixel connected domain. For example, with the proportion set to 0.1 and a base line of 20 pixels, the base line is regarded as inside the connected domain even if 2 of its pixels carry color values corresponding to red.
By setting tolerance ranges based on pixel color values and/or on the number of non-black pixels in a base line, the edge black band's tolerance of incidental color variation in individual pixels is widened, preventing pixels that properly belong to the edge black band from being mistaken for effective pixels of the color-taking image.
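The two tolerance rules above can be sketched directly; the per-channel threshold of 10 and the proportion of 0.1 are the example values from the text.

```python
# A pixel counts as black when no channel exceeds the tolerance; a base
# line still counts as black when at most `max_ratio` of its pixels are
# non-black.

def is_black_pixel(p, tol=10):
    return all(ch <= tol for ch in p)

def is_black_baseline(line, max_ratio=0.1):
    non_black = sum(1 for p in line if not is_black_pixel(p))
    return non_black <= max_ratio * len(line)

line = [(0, 0, 0)] * 18 + [(200, 0, 0)] * 2   # 20 pixels, 2 of them red
```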
Step S5230, taking the minimum width among the black-pixel connected domains of the area images belonging to the same target outer side as the width of that side's edge black band, taking the length of the target outer side as the band's length, and generating the coordinate information of the edge black band in the target picture.
After each area image on a target outer side has had its black-pixel connected domain, i.e. its black region, determined, it is easy to see that among the black regions on one target outer side, the narrowest generally represents the real edge black band. Accordingly, the minimum width among those black regions is taken as the width of the side's edge black band, the original length of the target outer side as its length, and the band's coordinate information is determined from this width and length; specifically, the band's upper-left and lower-right corner pixels relative to the target picture's coordinate system can be expressed in window form, defining a rectangular window.
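Step S5230 reduces to a one-line combination for, say, the top target outer side; the window convention follows the (x0, y0, x1, y1) form used above, and the helper name is illustrative.

```python
# Edge black band of the top side: width = minimum black-region depth
# among the side's area images, length = the full picture width.

def edge_black_band_top(black_depths, picture_width):
    return (0, 0, picture_width, min(black_depths))

band = edge_black_band_top([22, 20, 21, 20], picture_width=320)
```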
According to the above embodiment, when determining the edge black band on a target outer side, only the black pixel connected domain of each area image on that outer side needs to be determined, taking the area image as the unit, and the edge black band can then be determined by combining the minimum width of those black pixel connected domains with the length of the target outer side. The amount of computation is therefore extremely small and the computing efficiency extremely high, and a full-image traversal of the target picture to identify the edge black band is avoided. The method is thus particularly suitable for implementation in an embedded chip: even in a control chip with limited computing power it can still rapidly determine the edge black band, which is fundamental to the efficient operation of the whole atmosphere lamp device.
On the basis of any embodiment of the method of the present application, determining, as the color-taking image corresponding to each area image belonging to the same target outer side, the image content located outside the corresponding edge black band includes:
Step S5310, for each area image of any target outer side, determining whether each pixel belongs to the edge black band according to the coordinate information of that pixel in the image content of the area image and the coordinate information of the edge black band of that target outer side;
Step S5320, constructing all pixels in each area image that do not belong to the edge black band as the color-taking image corresponding to that area image.
On the basis of the coordinate information of the edge black band determined for each target outer side, for example in the window form shown above, the construction of the color-taking image corresponding to each area image can be accelerated.
Specifically, for each area image of a target outer side, the coordinate information of each pixel in the area image is compared with the coordinate information of the edge black band corresponding to that outer side, so as to determine whether the pixel belongs to the edge black band. For example, if a pixel in the area image is (x2, y2) and the coordinate information of the corresponding edge black band is (x0, y0, x1, y1), then when x2 > x1 it can be determined that the pixel is beyond the range of the edge black band, that is, an effective pixel belonging to the color-taking image. In this way the area image can be fully examined and all pixels in the image content of the color-taking image quickly determined, thereby constructing the corresponding color-taking image.
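The per-pixel coordinate comparison can be sketched as a simple rectangle test. Names are illustrative; the band is the (x0, y0, x1, y1) window discussed above, and a pixel is an effective color-taking pixel exactly when it falls outside that window.

```python
def in_band(x, y, band):
    """Return True if pixel (x, y) lies inside the edge-black-band window."""
    x0, y0, x1, y1 = band
    return x0 <= x <= x1 and y0 <= y <= y1

band = (0, 0, 1919, 9)          # assumed top-edge band, 10 pixels deep
print(in_band(100, 5, band))    # True  -> belongs to the edge black band
print(in_band(100, 20, band))   # False -> effective color-taking pixel
```

Filtering a region image then amounts to keeping every pixel for which `in_band` is False.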
According to the above embodiment, whether the pixels in the area image belong to the range of the color capturing image can be quickly identified based on the coordinate information of the edge black band, so that all the pixels corresponding to the color capturing image can be quickly identified, the color capturing image can be quickly constructed, and the processing efficiency can be further improved.
On the basis of any embodiment of the method of the present application, referring to fig. 8, setting, according to each color capturing image, a light emitting color value of a light emitting unit in a unit frame corresponding to the color capturing image in an atmosphere lamp device includes:
step S4100, distinguishing dark pixels and non-dark pixels in the color-taking image according to a preset dark threshold;
the controller may read a preset darkness threshold so as to distinguish dark pixels from non-dark pixels in each color-taking image. For example, in a scene where pixel color values are represented in RGB, the value of a solid black pixel may be represented as RGB(0, 0, 0), and the darkness threshold may be set to a neighboring value near that of the solid black pixel, for example RGB(10, 10, 10), so that pixels in the color-taking image below the darkness threshold are dark pixels and pixels above it are non-dark pixels, thereby implementing the distinction.
In one embodiment, the process of distinguishing individual pixels in a color-taking image may be implemented as follows, including the steps of:
first, traversing the light-emitting color values of all pixels of the color-taking image and, subject to a preset proportion threshold, marking the pixels whose light-emitting color values are below a preset darkness threshold as dark pixels;
The color-taking image is bitmap data, and thus each of its pixels can be traversed row by row and column by column according to its resolution, and the light-emitting color value of each traversed pixel compared with the darkness threshold, thereby determining whether each pixel is a dark pixel.
In order to avoid the situation where, in a color-taking image with a large range of dark colors, too many pixels are marked as dark pixels and the non-dark pixels finally obtained are too few to accurately calibrate the dominant hue, this embodiment adopts a preset proportion threshold to control the total number of dark pixels, ensuring that the number of non-dark pixels does not fall below a certain level.
For example, it may be specified that dark pixels occupy no more than the proportion threshold, for example 90%, of all pixels in the color-taking image. In this case, pixels with lower color values are preferentially marked as dark pixels; while the total number of dark pixels is kept from exceeding 90%, an individual pixel whose color value is below the darkness threshold but higher than the color values of the other dark pixels may thus be identified as a non-dark pixel. After a pixel is determined to be a dark pixel, a corresponding mark is given, thereby filtering the dark pixels out of the color-taking image.
Then, the remaining pixels in the color-taking image are marked as non-dark pixels: that is, the other pixels in the color-taking image that do not belong to the dark pixels can be directly marked as non-dark pixels.
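The split with the proportion cap can be sketched as below. This is a possible implementation, not the patented one: the brightness measure (sum of channels), the threshold values, and the rule of preferring the darkest candidates when the cap would be exceeded are all assumptions consistent with the description above.

```python
def split_dark(pixels, dark_threshold=30, ratio_threshold=0.9):
    """Split pixel indices into (dark, non_dark) lists.

    pixels: list of (r, g, b) tuples.
    A pixel is a dark candidate when its summed brightness is below
    dark_threshold * 3; at most ratio_threshold of all pixels may be
    marked dark, darkest candidates first.
    """
    limit = int(ratio_threshold * len(pixels))
    candidates = [i for i, p in enumerate(pixels) if sum(p) < dark_threshold * 3]
    # prefer the darkest pixels when the cap would be exceeded
    candidates.sort(key=lambda i: sum(pixels[i]))
    dark = set(candidates[:limit])
    non_dark = [i for i in range(len(pixels)) if i not in dark]
    return sorted(dark), non_dark

# ten near-black pixels: the cap keeps the least-dark one as non-dark
dark, non_dark = split_dark([(0, 0, 0)] * 9 + [(20, 20, 20)])
print(non_dark)  # [9]
```

The cap is what guarantees that even an almost entirely dark image still yields some non-dark pixels for dominant-hue calibration.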
In the above embodiment, the proportion threshold ensures that the share of dark pixels filtered out of the color-taking image does not exceed the predetermined proportion threshold, so that enough non-dark pixels remain for determining the dominant hue of the color-taking image; even for low-light image areas the dominant hue can still be accurately identified, so that the corresponding unit frames obtain effective light-emitting color values for the light effect.
Step S4200, determining a luminous color value corresponding to a dominant hue of the color-capturing image according to a color value of a non-darkness pixel of the color-capturing image;
the light-emitting color values stored for the non-dark pixels of the color-taking image can be used as the main basis for determining the dominant hue of the color-taking image, and various embodiments can thus be derived, for example:
in one embodiment, the light-emitting color values of all non-dark pixels in the color-taking image are averaged, and the average is used as the light-emitting color value corresponding to the dominant hue. The dominant hue determined for the color-taking image is thus a base hue that reflects, by averaging, the aggregate appearance of its non-dark pixels.
In another embodiment, the light-emitting color values of all non-dark pixels and of all dark pixels in the color-taking image are averaged separately, and the two averages are weighted according to preset weights to obtain the light-emitting color value corresponding to the dominant hue. This can avoid the influence of excessive filtering of dark areas on the dominant hue of the color-taking image. When setting the weights of the two averages, the weight of the non-dark pixels can of course be amplified and that of the dark pixels reduced; for example, the weights of the average light-emitting color values of the non-dark pixels and of the dark pixels may be set to 0.9:0.1, so that the dominant hue of the color-taking image is still determined mainly by the light-emitting color values of the non-dark pixels, while the contribution of the dark pixels is also taken into account, making the finally determined light-emitting color value of the dominant hue more balanced.
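The weighted-average variant can be sketched as follows, using the 0.9:0.1 weights from the example above; the function name and the channel-wise averaging are illustrative assumptions.

```python
def dominant_color(non_dark, dark, w_non_dark=0.9, w_dark=0.1):
    """Weighted dominant hue from separate non-dark and dark pixel averages.

    non_dark, dark: lists of (r, g, b) tuples.
    """
    def mean(px):
        n = max(len(px), 1)  # guard against an empty group
        return tuple(sum(c[i] for c in px) / n for i in range(3))

    m_nd, m_d = mean(non_dark), mean(dark)
    return tuple(round(w_non_dark * m_nd[i] + w_dark * m_d[i]) for i in range(3))

print(dominant_color([(200, 100, 50)], [(0, 0, 0)]))  # (180, 90, 45)
```

Setting `w_dark=0` recovers the first embodiment, which averages the non-dark pixels only.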
Step S4300, generating the light-emitting color value of each light-emitting unit in the unit frame corresponding to the color-taking image in the atmosphere lamp device according to the light-emitting color value corresponding to the dominant hue.
When the light-emitting color value corresponding to the dominant hue of a color-taking image has been determined, the light-emitting color value of each lamp bead covered by the unit frame corresponding to that color-taking image can be generated accordingly; the specific implementation can be set flexibly, for example:
In one embodiment, the light-emitting color value corresponding to the dominant color is set to the light-emitting color value of each lamp bead covered by the unit frame corresponding to the color-taking image. That is, the light emission color values of the respective beads corresponding to the unit frames corresponding to the color-taking image are set to coincide with the light emission color values of the main color tone of the color-taking image, and the respective beads emit light in accordance with the same light emission color values. When the corresponding light effect is played in the mode, the light effect presented by the whole display picture of the atmosphere lamp corresponds to each color-taking image and the color of the color-taking image is distinct.
In another embodiment, according to the dominant-hue light-emitting color value adopted by another unit frame adjacent to the current unit frame corresponding to the color-taking image, the light-emitting color values of the lamp beads covered by the current unit frame are adjusted gradually, so that the lamp beads in the current unit frame form a gradient of light-emitting color values along the direction toward the other unit frame. That is, the lamp beads of a unit frame keep the light-emitting color value corresponding to the dominant hue at the center of the unit frame; then, moving toward the adjacent unit frame and referring to its color value, the light-emitting color value of each lamp bead along that direction is adjusted by a certain gradient on the basis of the dominant-hue color value, so that when the corresponding light effect is played, a color transition is produced from the dominant hue of one unit frame to the dominant hue of the adjacent unit frame. Setting the light-emitting color values of the lamp beads covered by each unit frame in this way gives the lamp beads of each unit frame a relatively soft, gradual transition, making the light effect presented by the whole display picture of the atmosphere lamp softer and finer overall.
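The gradient variant can be sketched as linear interpolation between the two dominant hues. This is a simplified sketch under assumed names; the patent does not fix the gradient function, only that bead colors change by a gradient toward the neighboring unit frame.

```python
def gradient_beads(n_beads, own_color, neighbor_color):
    """Per-bead RGB values fading from own_color (bead 0) to neighbor_color
    (last bead) along the direction facing the adjacent unit frame."""
    out = []
    for k in range(n_beads):
        t = k / (n_beads - 1) if n_beads > 1 else 0.0
        out.append(tuple(
            round(own_color[i] + t * (neighbor_color[i] - own_color[i]))
            for i in range(3)))
    return out

print(gradient_beads(3, (255, 0, 0), (0, 0, 255)))
# [(255, 0, 0), (128, 0, 128), (0, 0, 255)]
```

A real device would likely start the fade only partway along the frame, keeping the dominant hue near the center as described above; the linear ramp here is the simplest case.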
According to the above embodiment, dark pixels and non-dark pixels in the color-taking image are distinguished, the light-emitting color value of the dominant hue of the color-taking image is determined mainly from the light-emitting color values of the non-dark pixels, and on this basis the light-emitting color value of each lamp bead in the unit frame corresponding to the color-taking image is determined, so that the light-emitting color values of the lamp beads always reflect the dominant hue of the color-taking image, realizing an effective projection of that dominant hue and ensuring that the light effect played by the atmosphere lamp effectively reproduces and extends the atmosphere of the environment captured by the camera.
On the basis of any embodiment of the method of the present application, referring to fig. 9, before the region segmentation is performed around four sides of the target picture, the method includes:
step S3100, continuously acquiring interface images in external terminal equipment according to a preset frame rate, and taking the currently acquired interface images as target pictures;
in this embodiment, the controller continuously collects interface images in external terminal devices through the image acquisition interface as environment reference images, and usually collects the interface images according to a preset frame rate, for example, 30 frames per second, so that a control chip of the controller actually needs to process an image stream formed by each frame of interface images, and generates a corresponding light effect for each frame of interface images in the image stream.
Step S3200, calculating the average color value of the pixels of the target picture at each historical edge black band according to the historical edge black band predetermined for the corresponding target outer side;
the controller can, while playing the light effect, process one interface image and use the edge black band determined by treating that interface image as the target picture as the historical edge black band for the interface images that follow it, so that those subsequent interface images need not execute all the steps of the present application relating to determining the edge black band, thereby saving system overhead.
However, the frame of the interface image may also change. For this case, the change needs to be identified quickly and with a small amount of computation, so as to adjust in time the edge black band used to determine the color-taking image, that is, to identify in time whether the historical edge black band currently in use is still suitable for the current and subsequent target pictures.
Accordingly, according to the historical edge black band predetermined for each target outer side, the average color value of the pixels of the current target picture falling within each historical edge black band can be calculated and used to measure whether the area of the historical edge black band has changed.
Step S3300, when the average color value belongs to a color value representing black, performing subsequent processing based on the historical edge black band to determine the light-emitting color value for the target picture;
when the average color value corresponding to the historical edge black band still belongs to a color value representing black, this indicates that the historical edge black band currently in use is still applicable; even if the edge black band of the current target picture has actually widened, the effect is negligible. In this case, the step of determining the edge black band need not be re-executed for the current target picture: steps S5300 and S5400 can be executed directly, the corresponding color-taking image determined according to the historical edge black band, and the corresponding light-emitting color values generated according to the color-taking image.
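The reuse check of steps S3200 and S3300 can be sketched as below. Names, the frame representation, and the blackness threshold are assumptions for illustration; the decision rule (average color inside the stored window still black, so reuse the band) follows the text.

```python
def band_still_black(frame, band, black_threshold=15):
    """Average the frame's pixels inside the historical band window and test
    whether the average still represents black.

    frame: 2D list of (r, g, b) rows; band: (x0, y0, x1, y1) inclusive window.
    """
    x0, y0, x1, y1 = band
    pixels = [frame[y][x] for y in range(y0, y1 + 1) for x in range(x0, x1 + 1)]
    avg = tuple(sum(p[i] for p in pixels) / len(pixels) for i in range(3))
    return all(c <= black_threshold for c in avg)

# a 4x4 all-black frame: the historical top-row band is still applicable
frame = [[(0, 0, 0)] * 4 for _ in range(4)]
print(band_still_black(frame, (0, 0, 3, 0)))  # True
```

When this returns False, the full detection (steps S5100 to S5400) is re-run and the result replaces the historical band, as described in the following steps.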
Step S3400, when the average color value does not belong to a color value representing black, performing subsequent processing based on the target picture to determine the latest edge black band, determining the light-emitting color value for the target picture according to the latest edge black band, and replacing the historical edge black band used for subsequently acquired interface images with the latest edge black band.
When the average color value does not belong to a color value representing black, this indicates that the edge black band of the target picture may have changed and some colored pixels now fall within the area of the original historical edge black band. In this case, the color-taking process of the present application needs to be executed in full for the target picture, for example the whole process from step S5100 to step S5400, so as to determine the edge black band corresponding to the target picture as the latest edge black band, determine the color-taking image according to the latest edge black band, generate the corresponding light-emitting color values according to the color-taking image, and so on.
Similarly, in order to preserve the optimization of operating efficiency, the latest edge black band determined for the current target picture further replaces the previously stored historical edge black band, so that it becomes the new historical edge black band serving the subsequent interface images.
According to this embodiment, while continuously collecting interface images and playing the light effect, the atmosphere lamp device can, without frequently recalculating the edge black band for every interface image, still promptly identify edge-black-band changes caused by changes in the frame of the interface image and update the edge black band in time, so that the color-taking images adapt to the frame of the interface image and the light effect played according to them restores the light atmosphere of the interface image with high quality.
On the basis of any embodiment of the method, before region segmentation is performed around the four sides of the target picture, the method includes:
step S2100, continuously acquiring interface images in external terminal equipment according to a preset frame rate, and taking the currently acquired interface images as target pictures;
in this embodiment, the controller continuously collects interface images in external terminal devices through the image acquisition interface as environment reference images, and usually collects the interface images according to a preset frame rate, for example, 30 frames per second, so that a control chip of the controller actually needs to process an image stream formed by each frame of interface images, and generates a corresponding light effect for each frame of interface images in the image stream.
Step S2200, when the timing arrival event is not triggered, performing subsequent processing based on the historical edge black band predetermined for the corresponding target outer side to determine the light-emitting color value for the target picture;
the controller presets a timer with a corresponding timing period. When the timing period elapses, a timing arrival event is triggered; this event drives the regeneration of the edge black band for the target picture at the current moment, and the result is stored, replacing the previous one, as the historical edge black band, so that within the next timing period the edge black band used for each subsequently processed interface image need not be recalculated.
Accordingly, when the timer has not triggered the timing arrival event, steps S5300 and S5400 can be performed on the current target picture based on the historical edge black band predetermined for the corresponding target outer side, that is, the edge black band determined for a previous target picture, to determine its color-taking image and then generate the corresponding light-emitting color values according to the color-taking image.
Step S2300, when the timing arrival event is triggered, performing subsequent processing based on the target picture to determine the latest edge black band, determining the light-emitting color value for the target picture according to the latest edge black band, and replacing the historical edge black band used for subsequently acquired interface images with the latest edge black band.
When the timing arrival event is triggered, the whole process from step S5100 to step S5400 is executed for the target picture in response to the event, so as to determine the edge black band corresponding to the target picture as the latest edge black band, determine the color-taking image according to the latest edge black band, generate the corresponding light-emitting color values according to the color-taking image, and so on.
Similarly, in order to preserve the optimization of operating efficiency, the latest edge black band determined for the current target picture further replaces the previously stored historical edge black band, so that it becomes the new historical edge black band serving the subsequent interface images.
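The timer-driven variant can be sketched as follows. This is a minimal sketch under assumptions: `detect_band` is a placeholder for the full detection of steps S5100 to S5200, and the period value is illustrative.

```python
import time

class BandScheduler:
    """Recompute the edge black band only when the timing period elapses;
    otherwise reuse the stored historical band."""

    def __init__(self, detect_band, period_s=5.0):
        self.detect_band = detect_band   # placeholder for steps S5100-S5200
        self.period_s = period_s
        self.history_band = None
        self.last_update = float("-inf")

    def band_for(self, frame):
        now = time.monotonic()
        if self.history_band is None or now - self.last_update >= self.period_s:
            self.history_band = self.detect_band(frame)  # latest edge black band
            self.last_update = now                       # replaces the history
        return self.history_band
```

Between timing events, every incoming interface image is served by the stored historical band, which is what keeps the per-frame cost low on a constrained control chip.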
Similarly, according to the above embodiment, while continuously collecting interface images and playing the light effect, the atmosphere lamp device of the present application can, without frequently recalculating the edge black band for every interface image, update the edge black band on a timed basis in response to edge-black-band changes caused by changes in the frame of the interface image, so that the color-taking images are adjusted in time and the light effect played according to them is restored with high quality.
Referring to fig. 10, another embodiment of the present application further provides a color extraction device for an atmosphere lamp device, which includes a region segmentation module 5100, a black band detection module 5200, an image determination module 5300, and a color setting module 5400, wherein the region segmentation module 5100 is configured to segment a target picture around four sides thereof, to obtain a plurality of color extraction regions on each outer side, and to determine a region image corresponding to each color extraction region; the black band detection module 5200 is configured to detect a black area belonging to a target outer side of the target picture in each area image, and determine an edge black band corresponding to the target outer side according to the black areas of the plurality of area images belonging to the same target outer side; the image determining module 5300 is configured to determine, as a color-taking image corresponding to each region image belonging to the same target outer edge, an image content located outside the corresponding edge black band; the color setting module 5400 is configured to set, according to each color taking image, a light emission color value of a light emitting unit in a unit frame corresponding to the color taking image in the atmosphere lamp device.
On the basis of any embodiment of the apparatus of the present application, the region segmentation module 5100 includes: a size acquisition unit configured to acquire size information of a target picture, the size information including a lateral size and a longitudinal size; the area definition unit is used for dividing the transverse size and the longitudinal size equally according to the preset number, and correspondingly determining the transverse unit size and the longitudinal unit size to form area definition information; the segmentation execution unit is arranged to carry out region segmentation around four sides of the target picture according to the transverse unit size and the longitudinal unit size in the region definition information, so as to obtain corresponding color taking regions and a central region for residual segmentation; and the classification scheduling unit is used for carrying out corresponding subsequent processing on the image content of each color taking area as a corresponding area image, and directly carrying out corresponding subsequent processing on the image content in the central area as the color taking image.
On the basis of any embodiment of the apparatus of the present application, the black band detection module 5200 includes: a side edge determining unit configured to determine two opposite outer sides of the target picture as target outer sides, the two outer sides being the top and bottom edges of the display frame of the target picture; a connectivity detection unit configured to perform a pixel traversal starting from the outermost corner pixel of each area image of a target outer side and identify the black pixel connected domain in that area image, the black pixel connected domain being formed by extending an extension width inward from the outermost side, taken as the baseline; and a black band fusion unit configured to take the minimum width among the black pixel connected domains of the area images belonging to the same target outer side as the width of the edge black band of that outer side, take the length of the target outer side as the length of the edge black band, and generate the coordinate information of the edge black band relative to the target picture.
On the basis of any embodiment of the apparatus of the present application, the image determining module 5300 includes: a pixel detection unit configured to determine, for each area image of any target outer side, whether each pixel belongs to the edge black band according to the coordinate information of that pixel in the image content of the area image and the coordinate information of the edge black band of that target outer side; and an image construction unit configured to construct all pixels in each area image that do not belong to the edge black band as the color-taking image corresponding to that area image.
On the basis of any embodiment of the apparatus of the present application, the color setting module 5400 includes: a shading unit configured to distinguish between dark pixels and non-dark pixels in the color-taken image according to a preset dark threshold; a tone determining unit configured to determine a light emission color value corresponding to a dominant tone of the color taking image from a color value of a non-darkness pixel of the color taking image; and a color value determining unit configured to generate a light emission color value of each light emitting unit in a unit frame corresponding to the color taking image in the atmosphere lamp device according to the light emission color value corresponding to the dominant color.
On the basis of any embodiment of the apparatus of the present application, prior to the operation of the region segmentation module 5100, the color-taking apparatus of the atmosphere lamp device of the present application includes: a desktop acquisition module configured to continuously acquire interface images in an external terminal device according to a preset frame rate and take the currently acquired interface image as the target picture; a history calling module configured to calculate the average color value of the pixels of the target picture at each historical edge black band according to the historical edge black band predetermined for the corresponding target outer side; an unchanged-edge scheduling module configured to perform subsequent processing based on the historical edge black band to determine the light-emitting color value for the target picture when the average color value belongs to a color value representing black; and a change scheduling module configured, when the average color value does not belong to a color value representing black, to determine the latest edge black band through subsequent processing based on the target picture, determine the light-emitting color value for the target picture according to the latest edge black band, and replace the historical edge black band used for subsequently acquired interface images with the latest edge black band.
On the basis of any embodiment of the apparatus of the present application, prior to the operation of the region segmentation module 5100, the color extracting apparatus of the atmosphere lamp device of the present application includes: the desktop acquisition module is used for continuously acquiring interface images in external terminal equipment according to a preset frame rate, and taking the currently acquired interface images as target pictures; the default scheduling module is set to execute subsequent processing based on a history edge black band predetermined by the corresponding target outer side when the trigger timing reaches an event so as to determine the luminous color value for the target picture; and the timely scheduling module is used for determining the latest edge black band based on the target picture when the trigger timing reaches an event, determining the luminous color value for the target picture according to the latest edge black band, and replacing the latest edge black band with a historical edge black band corresponding to the interface image acquired later.
On the basis of any embodiment of the present application, referring to fig. 11, another embodiment of the present application further provides a computer device, which may be used as a controller in an atmosphere lamp device, and as shown in fig. 11, an internal structure of the computer device is schematically shown. The computer device includes a processor, a computer readable storage medium, a memory, and a network interface connected by a system bus. The computer readable storage medium of the computer device stores an operating system, a database and computer readable instructions, the database can store a control information sequence, and when the computer readable instructions are executed by a processor, the processor can realize an atmosphere lamp device color-taking method. The processor of the computer device is used to provide computing and control capabilities, supporting the operation of the entire computer device. The memory of the computer device may store computer readable instructions that, when executed by the processor, cause the processor to perform the method of color extraction of an ambience light device of the present application. The network interface of the computer device is for communicating with a terminal connection. It will be appreciated by those skilled in the art that the structure shown in FIG. 11 is merely a block diagram of some of the structures associated with the present inventive arrangements and is not limiting of the computer device to which the present inventive arrangements may be applied, and that a particular computer device may include more or fewer components than shown, or may combine some of the components, or have a different arrangement of components.
The processor in this embodiment is configured to execute the specific functions of each module and its sub-modules in fig. 10, and the memory stores the program codes and various data required for executing those modules or sub-modules. The network interface is used for data transmission with the user terminal or the server. The memory in the present embodiment stores the program codes and data required for executing all modules/sub-modules in the color-taking apparatus of the atmosphere lamp device of the present application, and the server can call those program codes and data to execute the functions of all the sub-modules.
The present application also provides a storage medium storing computer readable instructions that, when executed by one or more processors, cause the one or more processors to perform the steps of the method for color extraction of an ambient light device according to any of the embodiments of the present application.
The application also provides a computer program product comprising computer programs/instructions which when executed by one or more processors implement the steps of the method for color extraction of an ambience light device according to any of the embodiments of the application.
Those skilled in the art will appreciate that all or part of the processes implementing the methods of the above embodiments of the present application may be implemented by a computer program instructing relevant hardware. The computer program may be stored on a computer readable storage medium, and the program, when executed, may include the processes of the embodiments of the methods described above. The storage medium may be a computer readable storage medium such as a magnetic disk, an optical disk, a Read-Only Memory (ROM), or a Random Access Memory (RAM).
The foregoing describes only some embodiments of the present application. It should be noted that those skilled in the art can make modifications and adaptations without departing from the principles of the present application, and such modifications and adaptations are intended to fall within the scope of the present application.
In summary, the present application can adapt to the limited computing power of atmosphere lamp equipment, quickly and efficiently partition the target picture and accurately take colors from it, and generate the luminous color value of the light emitting unit in each unit picture of the atmosphere lamp corresponding to the target picture. When the lamp effect corresponding to the target picture is played, the resulting picture is finer and smoother, the colors are vivid, and the overall picture quality is significantly improved.

Claims (10)

1. An atmosphere lamp device color extraction method, comprising:
performing region segmentation around the four sides of a target picture, dividing each outer side into a plurality of color taking regions, and determining a region image corresponding to each color taking region;
detecting black areas belonging to the outer side of the target picture in each area image, and determining an edge black band corresponding to the outer side of the target according to the black areas of a plurality of area images belonging to the same outer side of the target;
determining, for each area image belonging to the same target outer side, the image content located outside the corresponding edge black band as the color-taking image of that area image;
setting, according to each color-taking image, the light emission color value of the light emitting unit in the unit frame corresponding to that color-taking image in the atmosphere lamp device.
2. The method of claim 1, wherein performing region segmentation around the four sides of the target picture, dividing each outer side into a plurality of color taking areas, and determining the area image corresponding to each color taking area comprises:
acquiring size information of a target picture, wherein the size information comprises a transverse size and a longitudinal size;
dividing the transverse dimension and the longitudinal dimension equally according to a preset number, and correspondingly determining the transverse unit dimension and the longitudinal unit dimension to form region definition information;
according to the transverse unit size and the longitudinal unit size in the region definition information, performing region segmentation around the four sides of the target picture to obtain the corresponding color taking areas and the remaining central region after segmentation;
and taking the image content of each color taking area as a corresponding area image to carry out corresponding subsequent processing, and directly taking the image content in the central area as the color taking image to carry out corresponding subsequent processing.
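The equal-division segmentation of claim 2 can be sketched as follows. This is not part of the claim language; it is a minimal Python illustration assuming integer pixel sizes, rectangles expressed as (x, y, w, h), a border one unit deep, and corner areas assigned to the top and bottom edges. All names are illustrative.

```python
def segment_regions(width, height, n=8):
    """Equally divide the transverse and longitudinal dimensions by a
    preset number n, then segment a one-unit-deep border around the four
    sides of the picture. Rectangles are (x, y, w, h) in pixels."""
    uw, uh = width // n, height // n        # transverse / longitudinal unit sizes
    regions = []
    for i in range(n):                      # color taking areas on top and bottom edges
        regions.append((i * uw, 0, uw, uh))
        regions.append((i * uw, height - uh, uw, uh))
    for j in range(1, n - 1):               # left and right edges, corners already covered
        regions.append((0, j * uh, uw, uh))
        regions.append((width - uw, j * uh, uw, uh))
    center = (uw, uh, width - 2 * uw, height - 2 * uh)  # remaining central region
    return regions, center
```

For a 1920x1080 frame with n = 8 this yields 28 border regions of 240x135 pixels plus a 1440x810 central region, whose content is used directly as a color-taking image.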
3. The method of claim 1, wherein detecting black areas belonging to the target outer side of the target picture in each area image, determining an edge black band corresponding to the target outer side from black areas of a plurality of area images belonging to the same target outer side, comprises:
determining two opposite outer side edges in the target picture as target outer side edges, wherein the two outer side edges are the top edge and the bottom edge in the display picture of the target picture;
for each region image on the target outer side, performing pixel traversal starting from the outermost corner pixel of the region image, and identifying, in the region image, the black pixel connected domain that extends and expands in width inwards with the outermost side as a baseline;
and taking the minimum width in the black pixel connected domain of each area image belonging to the same target outer side as the width of the edge black band of the target outer side, taking the length of the target outer side as the length of the edge black band, and generating the coordinate information of the edge black band corresponding to the target picture.
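The black-band measurement of claim 3 can be illustrated as below. This is an assumed sketch, not the patented implementation: each region image is represented as rows of grayscale values with row 0 lying on the target outer side, and a pixel is treated as black when it falls below an assumed threshold.

```python
def black_band_width(region_rows, black_thresh=16):
    """Count consecutive all-black rows starting from row 0, which is
    assumed to lie on the target outer side (the picture edge)."""
    width = 0
    for row in region_rows:
        if all(px <= black_thresh for px in row):
            width += 1
        else:
            break
    return width

def edge_band_width(region_images):
    """Per claim 3, the edge black band of one outer side takes the
    minimum black-band width over all region images on that side."""
    return min(black_band_width(r) for r in region_images)
```

Taking the minimum guards against region images whose dark picture content happens to extend deeper inwards than the true letterbox bar.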
4. The method of color extraction of an ambient light device according to claim 1, wherein determining, for each area image belonging to the same target outer side, the image content outside the corresponding edge black band as the respective color extraction image comprises:
For each area image of any outer side of the target, judging whether the pixel belongs to an edge black band or not according to the coordinate information of each pixel in the image content of the area image and the coordinate information of the edge black band of the outer side of the target;
all pixels in each area image which do not belong to the edge black band are constructed as a color taking image corresponding to the area image.
5. The method of color extraction of an atmosphere lamp device according to claim 1, wherein setting the emission color value of the light emitting unit in the unit frame corresponding to each color extraction image in the atmosphere lamp device according to the color extraction image comprises:
distinguishing dark pixels and non-dark pixels in the color-taking image according to a preset dark threshold;
determining a luminous color value corresponding to a dominant hue of the color-taking image according to the color values of the non-dark pixels of the color-taking image;
and generating the luminous color value of each luminous unit in the unit frame corresponding to the color taking image in the atmosphere lamp equipment according to the luminous color value corresponding to the main tone.
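An illustrative reading of claim 5, not taken from the patent: the dominant hue is approximated here by a channel-wise mean over non-dark pixels, where a pixel counts as dark when every RGB channel falls below an assumed threshold.

```python
def dominant_color(pixels, dark_thresh=40):
    """Approximate the dominant hue of a color-taking image by the
    channel-wise mean over its non-dark pixels. A pixel is 'dark' when
    every RGB channel is below dark_thresh (the threshold is an assumption)."""
    bright = [p for p in pixels if max(p) >= dark_thresh]
    if not bright:
        return (0, 0, 0)          # an all-dark region leaves the light unit off
    n = len(bright)
    return tuple(sum(p[i] for p in bright) // n for i in range(3))
```

Excluding dark pixels keeps shadows and residual black-band pixels from washing the lamp color towards grey.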
6. The method of color extraction for an atmosphere lamp device according to any one of claims 1 to 5, further comprising, before performing region segmentation around the four sides of the target picture:
Continuously acquiring interface images in external terminal equipment according to a preset frame rate, and taking the currently acquired interface images as target pictures;
calculating, according to the historical edge black band predetermined for each target outer side, the average color value of the pixels of the target picture located in each historical edge black band;
when the average color value belongs to a color value representing black, performing subsequent processing based on the historical edge black band to determine the luminous color value for the target picture;
and when the average color value does not belong to a color value representing black, executing subsequent processing based on the target picture to determine a latest edge black band, determining the luminous color value for the target picture according to the latest edge black band, and using the latest edge black band as the historical edge black band for subsequently acquired interface images.
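The reuse test of claim 6 can be sketched as follows (illustrative only; the black threshold and the flat list-of-RGB-tuples pixel representation are assumptions):

```python
def band_still_black(band_pixels, black_thresh=16):
    """Reuse test for a cached (historical) edge black band: if the mean
    color over the band area is still near black, the cached band remains
    valid; otherwise the band must be re-detected on the current frame."""
    n = len(band_pixels)
    mean = tuple(sum(p[i] for p in band_pixels) // n for i in range(3))
    return all(c <= black_thresh for c in mean)
```

Checking only the cached band area is far cheaper than re-running full black-band detection on every captured frame.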
7. The method of color extraction for an atmosphere lamp device according to any one of claims 1 to 5, further comprising, before performing region segmentation around the four sides of the target picture:
continuously acquiring interface images in external terminal equipment according to a preset frame rate, and taking the currently acquired interface images as target pictures;
when a timed trigger event is not reached, performing subsequent processing based on the historical edge black band predetermined for the target outer side to determine the luminous color value for the target picture;
and when the timed trigger event is reached, executing subsequent processing based on the target picture to determine a latest edge black band, determining the luminous color value for the target picture according to the latest edge black band, and using the latest edge black band as the historical edge black band for subsequently acquired interface images.
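Claim 7's timed re-detection can be sketched as a simple frame counter; the class and parameter names are illustrative, not from the patent:

```python
class BandRefresher:
    """Re-detect the edge black band only when a frame counter reaches a
    preset period; between trigger events the cached band is reused."""

    def __init__(self, period, detect):
        self.period = period      # frames between re-detections
        self.detect = detect      # callable: frame -> edge black band
        self.count = 0
        self.band = None          # cached 'historical' band

    def band_for(self, frame):
        if self.band is None or self.count >= self.period:
            self.band = self.detect(frame)   # timed trigger event reached
            self.count = 0
        self.count += 1
        return self.band
```

With period = 3, for example, the full black-band detection runs only on every third captured frame, which suits the limited computing power of the lamp controller.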
8. An atmosphere lamp device color extraction apparatus, comprising:
the region segmentation module is configured to perform region segmentation around the four sides of a target picture, divide each outer side into a plurality of color taking areas, and determine a region image corresponding to each color taking area;
the black band detection module is used for detecting black areas belonging to the outer side of the target picture in each area image, and determining the edge black band corresponding to the outer side of the target according to the black areas of a plurality of area images belonging to the same outer side of the target;
an image determining module, configured to determine, as a color-taking image corresponding to each region image belonging to the same object outer side, an image content located outside the corresponding edge black band in the region image;
And the color setting module is used for setting the luminous color value of the luminous unit in the unit picture corresponding to the color taking image in the atmosphere lamp equipment according to each color taking image.
9. An atmosphere lamp device comprising a central processor and a memory, characterized in that the central processor is adapted to invoke a computer program stored in the memory for performing the steps of the method according to any of claims 1 to 7.
10. A non-transitory readable storage medium, characterized in that it stores a computer program in the form of computer readable instructions, which when invoked by a computer to run, performs the steps of the method according to any one of claims 1 to 7.
CN202311472393.1A 2023-11-07 2023-11-07 Atmosphere lamp equipment, color taking method thereof, corresponding device and medium Active CN117197261B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311472393.1A CN117197261B (en) 2023-11-07 2023-11-07 Atmosphere lamp equipment, color taking method thereof, corresponding device and medium


Publications (2)

Publication Number Publication Date
CN117197261A true CN117197261A (en) 2023-12-08
CN117197261B CN117197261B (en) 2024-02-27

Family

ID=88990982

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311472393.1A Active CN117197261B (en) 2023-11-07 2023-11-07 Atmosphere lamp equipment, color taking method thereof, corresponding device and medium

Country Status (1)

Country Link
CN (1) CN117197261B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117412452A (en) * 2023-12-13 2024-01-16 深圳市千岩科技有限公司 Atmosphere lamp equipment, color matching method thereof, corresponding device and medium
CN117412451A (en) * 2023-12-13 2024-01-16 深圳市千岩科技有限公司 Atmosphere lamp equipment, mapping color matching method thereof, corresponding device and medium
CN117412449A (en) * 2023-12-13 2024-01-16 深圳市千岩科技有限公司 Atmosphere lamp equipment, lamp effect playing control method thereof, and corresponding device and medium
CN117521179A (en) * 2024-01-04 2024-02-06 深圳市智岩科技有限公司 Atmosphere lamp equipment, luminous partition layout construction method and device and computer equipment
CN117794021A (en) * 2024-02-26 2024-03-29 深圳市智岩科技有限公司 Atmosphere lamp equipment, lamp effect projection playing method and device, medium and product

Citations (12)

Publication number Priority date Publication date Assignee Title
WO2007142621A1 (en) * 2006-06-02 2007-12-13 Fotonation Vision Limited Modification of post-viewing parameters for digital images using image region or feature information
US20090058876A1 (en) * 2007-08-27 2009-03-05 Au Optronics Corporation Dynamic color gamut of led backlight
JP2019028338A (en) * 2017-08-01 2019-02-21 キヤノン株式会社 Imaging apparatus, control method thereof and program
CN113053324A (en) * 2021-03-15 2021-06-29 生迪智慧科技有限公司 Backlight control method, device, equipment, system and storage medium
CN113630932A (en) * 2020-12-11 2021-11-09 萤火虫(深圳)灯光科技有限公司 Light control method, controller, module and storage medium based on boundary identification
CN113630655A (en) * 2021-08-13 2021-11-09 海信视像科技股份有限公司 Method for changing color of peripheral equipment along with picture color and display equipment
CN114040249A (en) * 2021-11-26 2022-02-11 康佳集团股份有限公司 Atmosphere lamp adjusting method and device based on picture, intelligent terminal and storage medium
US20220051634A1 (en) * 2019-08-06 2022-02-17 Shenzhen Skyworth-Rgb Electronic Co., Ltd. Method and device for adjusting mini led backlight television picture
CN114266838A (en) * 2021-12-09 2022-04-01 深圳市智岩科技有限公司 Image data processing method, image data processing device, electronic equipment and storage medium
WO2022121879A1 (en) * 2020-12-09 2022-06-16 华为技术有限公司 Tof apparatus and electronic device
CN116321627A (en) * 2023-04-07 2023-06-23 深圳市云海视讯技术有限公司 Screen atmosphere lamp synchronous control method, system and control equipment based on image pickup
CN116506677A (en) * 2023-02-10 2023-07-28 杭州当虹科技股份有限公司 Color atmosphere processing method and system


Cited By (10)

Publication number Priority date Publication date Assignee Title
CN117412452A (en) * 2023-12-13 2024-01-16 深圳市千岩科技有限公司 Atmosphere lamp equipment, color matching method thereof, corresponding device and medium
CN117412451A (en) * 2023-12-13 2024-01-16 深圳市千岩科技有限公司 Atmosphere lamp equipment, mapping color matching method thereof, corresponding device and medium
CN117412449A (en) * 2023-12-13 2024-01-16 深圳市千岩科技有限公司 Atmosphere lamp equipment, lamp effect playing control method thereof, and corresponding device and medium
CN117412449B (en) * 2023-12-13 2024-03-01 深圳市千岩科技有限公司 Atmosphere lamp equipment, lamp effect playing control method thereof, and corresponding device and medium
CN117412451B (en) * 2023-12-13 2024-03-15 深圳市千岩科技有限公司 Atmosphere lamp equipment, mapping color matching method thereof, corresponding device and medium
CN117412452B (en) * 2023-12-13 2024-04-02 深圳市千岩科技有限公司 Atmosphere lamp equipment, color matching method thereof, corresponding device and medium
CN117521179A (en) * 2024-01-04 2024-02-06 深圳市智岩科技有限公司 Atmosphere lamp equipment, luminous partition layout construction method and device and computer equipment
CN117521179B (en) * 2024-01-04 2024-04-19 深圳市智岩科技有限公司 Atmosphere lamp equipment, luminous partition layout construction method and device and computer equipment
CN117794021A (en) * 2024-02-26 2024-03-29 深圳市智岩科技有限公司 Atmosphere lamp equipment, lamp effect projection playing method and device, medium and product
CN117794021B (en) * 2024-02-26 2024-05-07 深圳市智岩科技有限公司 Atmosphere lamp equipment, lamp effect projection playing method and device, medium and product

Also Published As

Publication number Publication date
CN117197261B (en) 2024-02-27

Similar Documents

Publication Publication Date Title
CN117197261B (en) Atmosphere lamp equipment, color taking method thereof, corresponding device and medium
CN117202451B (en) Atmosphere lamp equipment, and light-emitting control method, device and medium thereof
CN113613363A (en) Light control method, device, controller, module and storage medium
WO2020093651A1 (en) Method and apparatus for automatically detecting and suppressing fringes, electronic device and computer-readable stroage medium
CN117202447B (en) Atmosphere lamp equipment, corner color taking method thereof, corresponding device and medium
CN107680556B (en) A kind of display power-economizing method, device and display
CN113299245B (en) Method and device for adjusting local backlight of display equipment, display equipment and storage medium
CN110113536A (en) A kind of screen light compensation method, device, storage medium and intelligent terminal
CN114040249A (en) Atmosphere lamp adjusting method and device based on picture, intelligent terminal and storage medium
CN104978186A (en) Interface skin rendering method and apparatus
JP2018018173A (en) Image processing device, image processing method, and computer program
CN114340102B (en) Lamp strip control method, device, display equipment, system and storage medium
CN111612839A (en) Colored lamp position identification method, system and device and storage medium
US11889083B2 (en) Image display method and device, image recognition method and device, storage medium, electronic apparatus, and image system
US9907143B2 (en) Lighting control device, lighting system, and program
CN117412449B (en) Atmosphere lamp equipment, lamp effect playing control method thereof, and corresponding device and medium
CN117521179B (en) Atmosphere lamp equipment, luminous partition layout construction method and device and computer equipment
WO2007036890A2 (en) Improving living lights with color coherency
CN111028772B (en) Lamplight display method, lamplight display device, controller and storage medium
CN113709949A (en) Control method and device of lighting equipment, electronic equipment and storage medium
CN207884957U (en) Lamp bar controller
KR101694824B1 (en) Energy saving signboard using dual smart camer and the method thereof
CN114599130A (en) Intelligent atmosphere lamp system based on STM32 master control
CN117412452B (en) Atmosphere lamp equipment, color matching method thereof, corresponding device and medium
CN203276790U (en) Led video cloth

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant