CN117334162B - LED backlight source control system and method thereof

LED backlight source control system and method thereof

Info

Publication number
CN117334162B
Authority
CN
China
Prior art keywords
time sequence
light intensity
environment light
training
feature
Prior art date
Legal status
Active
Application number
CN202311314104.5A
Other languages
Chinese (zh)
Other versions
CN117334162A (en)
Inventor
郑汉武
陈潮深
Current Assignee
Shenzhen Suijing Optoelectronics Co ltd
Original Assignee
Shenzhen Suijing Optoelectronics Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Suijing Optoelectronics Co ltd
Priority to CN202311314104.5A
Publication of CN117334162A
Application granted
Publication of CN117334162B
Legal status: Active

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G 3/20 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G 3/34 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters by control of light from an independent source
    • G09G 3/3406 Control of illumination source
    • G09G 2320/00 Control of display operating conditions
    • G09G 2320/06 Adjustment of display parameters
    • G09G 2320/0626 Adjustment of display parameters for control of overall brightness
    • G09G 2360/00 Aspects of the architecture of display systems
    • G09G 2360/14 Detecting light within display terminals, e.g. using a single or a plurality of photosensors
    • G09G 2360/144 Detecting light within display terminals, e.g. using a single or a plurality of photosensors the light being ambient light

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Circuit Arrangement For Electric Light Sources In General (AREA)

Abstract

An LED backlight control system and a method thereof are disclosed. The system comprises: an ambient light sensor for detecting the ambient light intensity; a user interface for setting backlight preference data of a user; an LED backlight source, which consists of a plurality of LEDs and is used for emitting light of different brightness; and a controller communicatively coupled to the ambient light sensor, the user interface, and the LED backlight source, the controller being used for controlling the brightness value of the LED backlight source. In this way, the backlight brightness can be adaptively adjusted according to the time sequence variation of the ambient light intensity, so as to provide better visual effect and energy consumption management.

Description

LED backlight source control system and method thereof
Technical Field
The application relates to the field of intelligent control, in particular to an LED backlight control system and a method thereof.
Background
The backlight is an important component of liquid crystal displays, televisions, and similar devices; it provides background light to enhance the brightness and contrast of the image. Brightness adjustment of the backlight is therefore important for the visual effect and the power consumption management of the display device.
However, conventional LED backlight control schemes are typically based on a fixed brightness setting or simple manual adjustment and cannot adaptively adjust the backlight brightness as the ambient light intensity changes. As a result, the display brightness may be too high or too low under different ambient light conditions, degrading the visual experience of the user. Moreover, because the backlight brightness is fixed in conventional schemes, the backlight still operates at high brightness even when the ambient light is dark, wasting energy; this not only increases energy cost but also places an unnecessary burden on the environment.
In addition, different users differ in their brightness preferences and ambient light conditions: some users may prefer a brighter display while others may prefer a darker one. Conventional LED backlight control schemes cannot meet the personalized needs of each user.
Accordingly, an optimized LED backlight control system is desired.
Disclosure of Invention
In view of this, the present application provides an LED backlight control system and method that can adaptively adjust the backlight brightness according to the time sequence variation of the ambient light intensity, thereby providing better visual effect and energy consumption management. This is significant for improving the user experience of display devices, saving energy, and adapting to different ambient light conditions.
According to an aspect of the present application, there is provided an LED backlight control system, including:
an ambient light sensor for detecting the ambient light intensity;
a user interface for setting backlight preference data of a user;
an LED backlight source, which consists of a plurality of LEDs and is used for emitting light of different brightness; and
a controller communicatively coupled to the ambient light sensor, the user interface, and the LED backlight, the controller being configured to control a brightness value of the LED backlight.
According to another aspect of the present application, there is provided an LED backlight control method, including:
detecting ambient illumination intensity by an ambient light sensor;
setting backlight preference data of a user through a user interface;
emitting light of different brightness through the LED backlight source; and
controlling the brightness value of the LED backlight by a controller communicatively connected to the ambient light sensor, the user interface, and the LED backlight.
According to an embodiment of the application, the system comprises: an ambient light sensor for detecting the ambient light intensity; a user interface for setting backlight preference data of a user; an LED backlight source, which consists of a plurality of LEDs and is used for emitting light of different brightness; and a controller communicatively coupled to the ambient light sensor, the user interface, and the LED backlight, the controller being used for controlling the brightness value of the LED backlight. In this way, the backlight brightness can be adaptively adjusted according to the time sequence variation of the ambient light intensity, so as to provide better visual effect and energy consumption management.
Other features and aspects of the present application will become apparent from the following detailed description of exemplary embodiments, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate exemplary embodiments, features and aspects of the application and together with the description, serve to explain the principles of the application.
Fig. 1 shows a block diagram of an LED backlight control system according to an embodiment of the application.
Fig. 2 shows a block diagram of the controller in the LED backlight control system according to an embodiment of the present application.
Fig. 3 shows a block diagram of the ambient light intensity local timing feature extraction module in an LED backlight control system according to an embodiment of the application.
FIG. 4 shows a block diagram of the ambient light intensity local timing feature transfer correlation encoding module in an LED backlight control system according to an embodiment of the application.
Fig. 5 shows a flowchart of an LED backlight control method according to an embodiment of the present application.
Fig. 6 shows an architecture diagram of substep S140 of the LED backlight control method according to an embodiment of the present application.
Fig. 7 shows an application scenario diagram of an LED backlight control system according to an embodiment of the present application.
Detailed Description
The embodiments of the present application will be described below clearly and completely with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the application. All other embodiments obtained by those skilled in the art based on the embodiments of the application without inventive effort also fall within the scope of the application.
As used in the specification and the claims, the terms "a," "an," and "the" do not denote the singular and may include the plural, unless the context clearly dictates otherwise. In general, the terms "comprises" and "comprising" merely indicate that the explicitly identified steps and elements are included; they do not constitute an exclusive list, and a method or apparatus may also include other steps or elements.
Various exemplary embodiments, features and aspects of the application will be described in detail below with reference to the drawings. In the drawings, like reference numbers indicate identical or functionally similar elements. Although various aspects of the embodiments are illustrated in the accompanying drawings, the drawings are not necessarily drawn to scale unless specifically indicated.
In addition, numerous specific details are set forth in the following description in order to provide a better illustration of the application. It will be understood by those skilled in the art that the present application may be practiced without some of these specific details. In some instances, well known methods, procedures, components, and circuits have not been described in detail so as not to obscure the present application.
Fig. 1 shows a block diagram schematic of an LED backlight control system according to an embodiment of the application. As shown in fig. 1, an LED backlight control system 100 according to an embodiment of the present application includes: an ambient light sensor 110 for detecting the intensity of ambient light; a user interface 120 for setting backlight preference data of a user; an LED backlight 130, wherein the LED backlight 130 is composed of a plurality of LEDs, and the LED backlight 130 is used for emitting light with different brightness; and a controller 140, the controller 140 being communicatively coupled to the ambient light sensor 110, the user interface 120, and the LED backlight 130, the controller 140 being configured to control the brightness value of the LED backlight 130.
To address the above technical problems, the technical concept of the application is to acquire the ambient light intensity value in real time through the ambient light sensor and to perform a time sequence analysis of the ambient light intensity with a back-end data processing and analysis algorithm, thereby determining the direction in which the brightness of the LED backlight should be adjusted.
Accordingly, as shown in fig. 2, the controller 140 includes: an ambient light intensity data acquisition module 141, configured to acquire, through the ambient light sensor, ambient light intensity values at a plurality of predetermined time points within a predetermined time period; a light intensity data time sequence arrangement module 142, configured to arrange the ambient light intensity values at the plurality of predetermined time points into an ambient light intensity time sequence input vector according to the time dimension; an ambient light intensity local time sequence feature extraction module 143, configured to perform local time sequence feature analysis on the ambient light intensity time sequence input vector to obtain a sequence of ambient light intensity local time sequence feature vectors; an ambient light intensity local time sequence feature transfer association coding module 144, configured to perform transfer association coding of adjacent local time sequence features on the ambient light intensity local time sequence feature vectors to obtain an ambient light intensity transfer time sequence feature; and a brightness control module 145, configured to determine, based on the ambient light intensity transfer time sequence feature, whether the brightness value of the LED backlight should be increased, maintained, or decreased.
Specifically, in the technical scheme of the present application, the ambient light intensity values at a plurality of predetermined time points within a predetermined time period are first acquired by the ambient light sensor. Because the ambient light intensity values are dynamic in the time dimension and also exhibit fluctuation and uncertainty, the ambient light intensity values at the plurality of predetermined time points are arranged into an ambient light intensity time sequence input vector according to the time dimension, so as to fully and effectively capture and characterize their time sequence variation characteristics and to integrate the time sequence distribution information of the ambient light intensity values.
In order to better capture the variation trend and pattern of the ambient light intensity, the ambient light intensity time sequence input vector is further subjected to vector segmentation to obtain a sequence of ambient light intensity local time sequence input vectors. By slicing the time sequence input vector into smaller local time sequence vectors, the short-term and long-term variations of the ambient light intensity can be better observed and analyzed, so that the time sequence analysis and feature capture of the ambient light intensity become more complete and accurate for adjusting the brightness of the LED backlight.
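Purely as an illustrative sketch (not part of the claimed embodiments), the following Python code shows one way such vector segmentation could be performed; the window length of 10 samples and the use of non-overlapping windows are assumptions of this sketch rather than parameters prescribed by the application.

```python
import numpy as np

def segment_time_series(intensity_vector: np.ndarray, window_size: int) -> list[np.ndarray]:
    """Split an ambient light intensity time sequence vector into local time sequence segments.

    Trailing samples that do not fill a complete window are discarded in this sketch.
    """
    num_windows = len(intensity_vector) // window_size
    return [intensity_vector[i * window_size:(i + 1) * window_size] for i in range(num_windows)]

# Example: 60 readings sampled at predetermined time points, split into windows of 10.
readings = np.random.uniform(low=50.0, high=800.0, size=60)   # synthetic lux values
local_vectors = segment_time_series(readings, window_size=10)
print(len(local_vectors), local_vectors[0].shape)              # 6 windows of shape (10,)
```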
Then, in order to capture the local time sequence variation of the ambient light intensity, the sequence of ambient light intensity local time sequence input vectors is passed through an ambient light intensity time sequence feature extractor based on a one-dimensional convolution layer, which extracts the local time sequence dynamic variation features of the ambient light intensity in the time dimension and yields a sequence of ambient light intensity local time sequence feature vectors.
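A minimal sketch of such a one-dimensional-convolution-based time sequence feature extractor is given below for illustration; the channel widths, kernel size, and pooling choice are assumptions of the sketch, not the parameters of the extractor disclosed here.

```python
import torch
import torch.nn as nn

class LightIntensityTimeSeriesExtractor(nn.Module):
    """Extracts a local time sequence feature vector from one local window of intensity values."""
    def __init__(self, feature_dim: int = 32, kernel_size: int = 3):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size, padding=kernel_size // 2),
            nn.ReLU(),
            nn.Conv1d(16, feature_dim, kernel_size, padding=kernel_size // 2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # collapse the time axis to one feature vector per window
        )

    def forward(self, local_window: torch.Tensor) -> torch.Tensor:
        # local_window: (batch, window_size) -> add a channel axis -> (batch, 1, window_size)
        features = self.conv(local_window.unsqueeze(1))
        return features.squeeze(-1)    # (batch, feature_dim)

# A sequence of 6 local windows of length 10 becomes a sequence of 6 feature vectors of length 32.
extractor = LightIntensityTimeSeriesExtractor()
windows = torch.rand(6, 10)
feature_sequence = extractor(windows)   # shape (6, 32)
```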
Further, the local time sequence features of the ambient light intensity in adjacent local time sequence segments are correlated; however, because of the fluctuation and uncertainty of the ambient light intensity in the time dimension, conventional time sequence feature extraction can hardly capture and characterize these variations effectively and sufficiently. Therefore, the transfer matrix between every two adjacent ambient light intensity local time sequence feature vectors in the sequence is further calculated, so as to describe the time sequence transfer correlation between the local time sequence variation features of the ambient light intensity in each pair of adjacent segments, yielding a sequence of ambient light intensity time sequence transfer feature matrices.
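The exact construction of the transfer matrix is not spelled out in this text; the sketch below assumes, for illustration only, an outer-product construction between each local time sequence feature vector and its successor.

```python
import torch

def adjacent_transfer_matrices(feature_sequence: torch.Tensor) -> torch.Tensor:
    """Build a transfer matrix for every pair of adjacent local time sequence feature vectors.

    feature_sequence: (num_windows, feature_dim)
    Returns: (num_windows - 1, feature_dim, feature_dim)
    The outer product of each vector with its successor is used here as the transfer matrix;
    this is an assumption of the sketch, not the construction prescribed by the application.
    """
    current = feature_sequence[:-1]     # v_1 ... v_{n-1}
    following = feature_sequence[1:]    # v_2 ... v_n
    return torch.einsum('nf,ng->nfg', following, current)

transfer = adjacent_transfer_matrices(torch.rand(6, 32))   # shape (5, 32, 32)
```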
Next, the local time sequence transfer features of adjacent segments contribute differently to the adaptive control of the brightness value of the actual LED backlight, because the ambient light intensity may change faster or slower over time, so the extracted feature information contains considerable redundancy. Based on this, the sequence of ambient light intensity time sequence transfer feature matrices is further passed through a time sequence enhancer based on a channel attention layer to obtain a time sequence enhanced ambient light intensity time sequence feature map. The channel attention layer weights each local time sequence transfer feature of the ambient light intensity so as to highlight important time sequence information and suppress noise and redundancy. In this way, the system can better understand the trend of the ambient light intensity and make corresponding brightness adjustment decisions, providing better visual effect and user experience.
Accordingly, as shown in fig. 3, the ambient light intensity local time sequence feature extraction module 143 includes: a light intensity time sequence vector segmentation unit 1431, configured to perform vector segmentation on the ambient light intensity time sequence input vector to obtain the sequence of ambient light intensity local time sequence input vectors; and an ambient light intensity local time sequence feature capturing unit 1432, configured to pass the sequence of ambient light intensity local time sequence input vectors through the ambient light intensity time sequence feature extractor based on the one-dimensional convolution layer to obtain the sequence of ambient light intensity local time sequence feature vectors. The segmentation unit 1431 decomposes a long time sequence input vector into several shorter local time sequence segments, so that local time sequence features can be captured more easily. The capturing unit 1432 then applies the one-dimensional convolution layer in a sliding-window manner over each local segment; the convolution operation can identify different local time sequence patterns and extract useful feature information. Together, the two units convert the ambient light intensity time sequence input vector into a sequence of local time sequence feature vectors for subsequent processing and analysis.
It is worth mentioning that a one-dimensional convolution layer is a layer type of convolutional neural networks (Convolutional Neural Network, CNN) commonly used in deep learning, mainly for processing data with a one-dimensional structure. Its input and output are both one-dimensional tensors. It performs a sliding-window convolution operation on the input data through learnable filters (also called convolution kernels) to extract features: each filter is multiplied element-wise with a local window of the input, and the results are summed to obtain one element of the convolution output. Similar to a two-dimensional convolution layer, a one-dimensional convolution layer has the following important parameters and characteristics: the filter size (kernel size), which specifies the length of the filter and thus the size of the input window involved in each convolution operation; the stride, which specifies the step size with which the filter slides over the input and determines the size of the output; and the padding, an optional parameter used to pad additional zero values on one or both sides of the input to control the size of the output and the nature of the feature extraction. By performing a sliding-window convolution over the input data, the one-dimensional convolution layer captures local patterns and features in the input data; it is an important component of convolutional neural networks and is widely used in a variety of tasks.
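For reference, the output length of a one-dimensional convolution follows directly from the filter size, stride, and padding described above; the small check below uses arbitrarily chosen values.

```python
import torch
import torch.nn as nn

def conv1d_output_length(length_in: int, kernel_size: int, stride: int = 1, padding: int = 0) -> int:
    """Output length of a 1D convolution: floor((L_in + 2*padding - kernel_size) / stride) + 1."""
    return (length_in + 2 * padding - kernel_size) // stride + 1

layer = nn.Conv1d(in_channels=1, out_channels=8, kernel_size=5, stride=2, padding=2)
x = torch.rand(1, 1, 60)   # one channel, 60 time steps
assert layer(x).shape[-1] == conv1d_output_length(60, kernel_size=5, stride=2, padding=2)  # 30
```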
Accordingly, as shown in fig. 4, the ambient light intensity local time sequence feature transfer association coding module 144 includes: an adjacent time sequence light intensity feature transfer association unit 1441, configured to calculate the transfer matrix between every two adjacent ambient light intensity local time sequence feature vectors in the sequence of ambient light intensity local time sequence feature vectors to obtain the sequence of ambient light intensity time sequence transfer feature matrices; and a channel attention time sequence enhancement unit 1442, configured to pass the sequence of ambient light intensity time sequence transfer feature matrices through a time sequence enhancer based on a channel attention layer to obtain a time sequence enhanced ambient light intensity time sequence feature map as the ambient light intensity transfer time sequence feature.
It should be appreciated that the ambient light intensity local time sequence feature transfer association coding module 144 includes two units: the adjacent time sequence light intensity feature transfer association unit 1441 and the channel attention time sequence enhancement unit 1442. The unit 1441 calculates the transfer matrix between every two adjacent ambient light intensity local time sequence feature vectors; the transfer matrix represents the correlation and conversion relationship between the two feature vectors and thus describes the time sequence transfer characteristics of the ambient light intensity. The unit 1442 passes the sequence of ambient light intensity time sequence transfer feature matrices through the time sequence enhancer based on the channel attention layer: the channel attention layer weights the transfer feature matrices, strengthening the channels (features) that carry important information and suppressing the unimportant channels, which improves the expressive power and discrimination of the ambient light intensity transfer time sequence feature. In combination, the two units extract the transfer time sequence features of the ambient light intensity while focusing attention on the important features.
It should be noted that the channel attention layer is an attention mechanism commonly used in deep learning for strengthening the model's attention to different channels of the input features. It can adaptively learn channel weights from the channel information of the input features, so that the feature representation capabilities of different channels are better exploited for a particular task. Its main goal is to weight the channel dimension of the input features, and it can be realized by the following steps. 1. Input features: typically a multi-channel feature map expressed as a four-dimensional tensor with dimensions [batch size, channel number, feature map height, feature map width]. 2. Feature conversion: to reduce the number of parameters and the amount of computation, the input features are usually reduced in dimension, for example by global average pooling, which reduces the height and width of the feature map to 1 and yields a feature vector of shape [batch size, channel number]. 3. Channel attention mechanism: the channel attention layer processes the feature vector with some learnable parameters, such as a fully connected layer or a convolution layer, commonly together with an activation function (e.g., ReLU) and a normalization operation (e.g., batch normalization) to increase nonlinearity and stability. 4. Channel weight calculation: from the processed feature vector, a weight vector with as many entries as there are channels is generated, indicating the importance of each channel; these weights can be regarded as attention scores that tell the model how much to focus on each channel. 5. Feature weighting: the channel weights are multiplied with the input features to obtain a weighted feature representation, so that in subsequent layers the model focuses more on channels with higher weights and suppresses channels with lower weights. By introducing the channel attention layer, the model can automatically learn the correlation and importance among different channels, improving the expressive power and discrimination of the features. Channel attention layers are widely applied in visual tasks such as image classification, object detection, and semantic segmentation, where they bring notable performance improvements.
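A minimal squeeze-and-excitation-style sketch of the steps above is shown below for illustration; the reduction ratio and the use of a sigmoid to produce the channel weights are assumptions of this sketch.

```python
import torch
import torch.nn as nn

class ChannelAttentionTimeSequenceEnhancer(nn.Module):
    """Weights the channels of a stacked feature map, strengthening informative channels."""
    def __init__(self, num_channels: int, reduction: int = 4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(num_channels, max(num_channels // reduction, 1)),
            nn.ReLU(),
            nn.Linear(max(num_channels // reduction, 1), num_channels),
            nn.Sigmoid(),                                       # per-channel attention scores in (0, 1)
        )

    def forward(self, feature_map: torch.Tensor) -> torch.Tensor:
        # feature_map: (batch, channels, height, width); channels index the transfer matrices here
        pooled = feature_map.mean(dim=(2, 3))                   # global average pooling -> (batch, channels)
        weights = self.fc(pooled).unsqueeze(-1).unsqueeze(-1)   # (batch, channels, 1, 1)
        return feature_map * weights                            # time sequence enhanced feature map

# Five 32x32 transfer matrices stacked as the channels of one sample.
enhancer = ChannelAttentionTimeSequenceEnhancer(num_channels=5)
enhanced_map = enhancer(torch.rand(1, 5, 32, 32))
```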
Then, the time sequence enhanced ambient light intensity time sequence feature map is passed through a classifier to obtain a classification result, wherein the classification result is used for indicating whether the brightness value of the LED backlight should be increased, maintained, or decreased. That is, the classification is performed on the time-sequence-enhanced transfer association features between the local time sequence features of the ambient light intensity, which determines the direction in which the brightness of the LED backlight should be adjusted. In this way, the backlight brightness can be adaptively adjusted according to the time sequence variation of the ambient light intensity, so as to provide better visual effect and energy consumption management.
Accordingly, the brightness control module 145 is configured to pass the time sequence enhanced ambient light intensity time sequence feature map through a classifier to obtain a classification result, wherein the classification result is used for indicating whether the brightness value of the LED backlight should be increased, maintained, or decreased.
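For illustration, a minimal three-class classifier head over the time sequence enhanced feature map might look as follows; the flatten-then-linear design and the hidden size are assumptions of the sketch.

```python
import torch
import torch.nn as nn

# Classes: 0 = increase brightness, 1 = maintain brightness, 2 = decrease brightness.
class BacklightAdjustmentClassifier(nn.Module):
    def __init__(self, num_channels: int, matrix_size: int, hidden: int = 64):
        super().__init__()
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(num_channels * matrix_size * matrix_size, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 3),
        )

    def forward(self, enhanced_map: torch.Tensor) -> torch.Tensor:
        return self.head(enhanced_map)           # raw logits; apply softmax for probabilities

classifier = BacklightAdjustmentClassifier(num_channels=5, matrix_size=32)
logits = classifier(torch.rand(1, 5, 32, 32))
decision = int(torch.argmax(logits, dim=-1))     # 0, 1, or 2
```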
Further, in the technical scheme of the application, the LED backlight control system further comprises a training module for training the ambient light intensity time sequence feature extractor based on the one-dimensional convolution layer, the time sequence enhancer based on the channel attention layer, and the classifier. The training module trains each of these components so that the system can effectively extract and classify the time sequence features of the ambient light intensity. Specifically: 1. Training the ambient light intensity time sequence feature extractor: given labeled training samples, the training module adjusts the parameters of the extractor so that it accurately extracts local time sequence features from the time sequence input of the ambient light intensity and learns a feature representation suited to the task, allowing the subsequent classifier to classify better. 2. Training the time sequence enhancer: given labeled training samples and the corresponding ambient light intensity time sequence features, the training module adjusts the parameters of the enhancer so that it adaptively strengthens the important transfer features according to the task requirements and learns channel attention weights suited to the task. 3. Training the classifier: given labeled training samples and the corresponding time sequence enhanced features, the training module adjusts the parameters of the classifier so that it learns decision boundaries and classification rules that accurately separate the different categories. Through this training process, the parameters of each component are optimized over a large amount of training data, so that the system adapts well to changes of the ambient light intensity in practical applications and provides accurate classification results.
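A highly simplified training-step sketch tying the components above together with a cross-entropy classification loss is shown below; the optimizer interface, batch layout, and the outer-product transfer matrices are assumptions of the sketch, and the probability density convergence optimization of feature scale constraint is indicated only by a placeholder comment because its coefficient formula is not reproduced here.

```python
import torch
import torch.nn as nn

def train_step(extractor, enhancer, classifier, batch_windows, labels, optimizer):
    """One gradient-descent step over a batch.

    batch_windows: (batch, num_windows, window_size) training ambient light intensity segments
    labels: (batch,) true adjustment direction (0 increase, 1 maintain, 2 decrease)
    """
    optimizer.zero_grad()
    feats = torch.stack([extractor(sample) for sample in batch_windows])      # (batch, n, feat)
    transfer = torch.einsum('bnf,bng->bnfg', feats[:, 1:], feats[:, :-1])     # adjacent transfer matrices
    enhanced = enhancer(transfer)                                             # channel-attention weighting
    # Placeholder: the probability density convergence optimization of feature scale constraint
    # would re-weight `enhanced` with w_1 and w_2k here before classification.
    logits = classifier(enhanced)
    loss = nn.functional.cross_entropy(logits, labels)
    loss.backward()
    optimizer.step()
    return float(loss)

# Example optimizer covering all three trainable components:
# optimizer = torch.optim.Adam(
#     list(extractor.parameters()) + list(enhancer.parameters()) + list(classifier.parameters()))
```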
More specifically, in one example, the training module includes: a training data acquisition unit, configured to acquire training data, wherein the training data includes training environment light intensity values at a plurality of predetermined time points within a predetermined time period, and the true value of whether the brightness value of the LED backlight should be increased, maintained, or decreased; a training data time sequence arrangement unit, configured to arrange the training environment light intensity values at the plurality of predetermined time points into a training environment light intensity time sequence input vector according to the time dimension; a training data time sequence vector segmentation unit, configured to perform vector segmentation on the training environment light intensity time sequence input vector to obtain a sequence of training environment light intensity local time sequence input vectors; a training environment light intensity local time sequence feature extraction unit, configured to pass the sequence of training environment light intensity local time sequence input vectors through the ambient light intensity time sequence feature extractor based on the one-dimensional convolution layer to obtain a sequence of training environment light intensity local time sequence feature vectors; a training environment light intensity local time sequence feature transfer association unit, configured to calculate the transfer matrix between every two adjacent training environment light intensity local time sequence feature vectors in the sequence to obtain a sequence of training environment light intensity time sequence transfer feature matrices; a training light intensity time sequence feature strengthening unit, configured to pass the sequence of training environment light intensity time sequence transfer feature matrices through the time sequence enhancer based on the channel attention layer to obtain a training time sequence enhanced environment light intensity time sequence feature map; a classification loss unit, configured to pass the training time sequence enhanced environment light intensity time sequence feature map through the classifier to obtain a classification loss function value; and a model training unit, configured to train the ambient light intensity time sequence feature extractor based on the one-dimensional convolution layer, the time sequence enhancer based on the channel attention layer, and the classifier based on the classification loss function value by propagation in the direction of gradient descent, wherein, at each iteration of the training, probability density convergence optimization of feature scale constraint is performed on each feature matrix of the training time sequence enhanced environment light intensity time sequence feature map.
Wherein, the model training unit includes: an optimization weighting factor calculation subunit, configured to perform probability density convergence optimization of feature scale constraint on each feature matrix of the training time sequence enhanced environment light intensity time sequence feature map to obtain a first weighting coefficient and a second weighting vector; and a feature optimization subunit, configured to weight the training time sequence enhanced environment light intensity time sequence feature map along the channel dimension with the first weighting coefficient, and to weight each feature matrix of the training time sequence enhanced environment light intensity time sequence feature map along the channel dimension with the corresponding position value of the second weighting vector as a weighting factor, so as to obtain an optimized training time sequence enhanced environment light intensity time sequence feature map.
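Because the coefficient calculation itself is described separately, the sketch below only illustrates how a first weighting coefficient w_1 and a second weighting vector w_2 would be applied to the training time sequence enhanced feature map once obtained; the numeric values are hypothetical and serve only to check shapes.

```python
import torch

def apply_scale_constraint_weights(enhanced_map: torch.Tensor,
                                   w1: torch.Tensor,
                                   w2: torch.Tensor) -> torch.Tensor:
    """Apply the first weighting coefficient and the second weighting vector.

    enhanced_map: (channels, height, width) training time sequence enhanced feature map
    w1: scalar first weighting coefficient, applied to the map as a whole along the channel dimension
    w2: (channels,) second weighting vector, one factor per feature matrix M_k
    """
    return enhanced_map * w1 * w2.view(-1, 1, 1)

# Hypothetical values purely for shape checking; they are not the patented coefficients.
optimized = apply_scale_constraint_weights(torch.rand(5, 32, 32),
                                           w1=torch.tensor(0.9),
                                           w2=torch.rand(5))
```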
In particular, in the technical scheme of the application, each training environment light intensity local time sequence feature vector in the sequence expresses the time sequence correlation features of the training environment light intensity values within a local time domain. By calculating the transfer matrix between every two adjacent training environment light intensity local time sequence feature vectors, the inter-domain transfer features of the local time sequence distribution of the training environment light intensity values are obtained, so that each training environment light intensity time sequence transfer feature matrix expresses local time-domain inter-domain transfer features under the global time domain. After the sequence passes through the time sequence enhancer based on the channel attention layer, the local time-domain transfer feature distributions are emphasized in the time sequence dimension, which improves the feature expression of the resulting training time sequence enhanced environment light intensity time sequence feature map. At the same time, however, taking the training time sequence enhanced environment light intensity time sequence feature map as a whole, the expression of the local time-domain inter-domain transfer time sequence distribution features of the training environment light intensity values under the global time domain is unbalanced. The applicant of the application further found that this imbalance is largely related to the feature expression scales, namely the local time-domain inter-domain transfer feature expression scale in the spatial dimension of each feature matrix and the global time-domain time sequence distribution scale in the channel dimension across the feature matrices, relative to which the overall distributions of the feature values in their respective domains become increasingly unbalanced. Therefore, when the training time sequence enhanced environment light intensity time sequence feature map passes through the classifier, the convergence of the class probability density distribution domain of the classifier is affected, which in turn affects the accuracy of the obtained classification result.
Therefore, preferably, probability density convergence optimization of feature scale constraint is performed on each feature matrix of the training time sequence enhanced environment light intensity time sequence feature map, denoted, for example, as M_k.
Accordingly, in one specific example, the optimization weighting factor calculation subunit is further configured to perform probability density convergence optimization of feature scale constraint on each feature matrix of the training time sequence enhanced environment light intensity time sequence feature map with a coefficient calculation formula to obtain the first weighting coefficient and the second weighting vector; wherein, in the coefficient calculation formula, L is the number of channels of the training time sequence enhanced environment light intensity time sequence feature map, V_k is the global feature mean of the k-th feature matrix M_k of the feature map, V is the feature vector composed of the V_k, ||V||_2^2 is the square of the two-norm of the feature vector V, S is the scale of the k-th feature matrix M_k (i.e., its width times its height), ||M_k||_F^2 is the square of the Frobenius norm of the k-th feature matrix M_k, m_{i,j} is the feature value at the (i,j)-th position of the k-th feature matrix M_k, w_1 is the first weighting coefficient, and w_2k is the value at the k-th position of the second weighting vector.
Here, the probability density convergence optimization of feature scale constraint applies, through a tail-distribution strengthening mechanism of a quasi-standard Cauchy distribution type, a correlation constraint with a multi-level distribution structure to the feature probability density distribution in the high-dimensional feature space on the basis of the feature scale, so that the probability density distributions of high-dimensional features at different scales are unfolded uniformly over the whole probability density space, compensating for the probability density convergence heterogeneity caused by feature scale deviation. In this way, during training, the training time sequence enhanced environment light intensity time sequence feature map is weighted along the channel dimension by the weight w_1, and each feature matrix M_k of the training time sequence enhanced environment light intensity time sequence feature map is weighted by the weight w_2k, which improves the convergence of the optimized time sequence enhanced environment light intensity time sequence feature map within the predetermined class probability density distribution domain of the classifier and thus the accuracy of the obtained classification result. Therefore, the backlight brightness can be adaptively adjusted according to the time sequence variation of the ambient light intensity, so as to provide better visual effect and energy consumption management, which is significant for improving the user experience of the display device, saving energy, and adapting to different ambient light conditions.
In summary, the LED backlight control system 100 according to the embodiment of the present application has been described. It can adaptively adjust the backlight brightness according to the time sequence variation of the ambient light intensity to provide better visual effect and energy consumption management, which is significant for improving the user experience of display devices, saving energy, and adapting to different ambient light conditions.
As described above, the LED backlight control system 100 according to the embodiment of the present application may be implemented in various terminal devices, for example, a server having an LED backlight control algorithm, etc. In one example, the LED backlight control system 100 may be integrated into the terminal device as a software module and/or hardware module. For example, the LED backlight control system 100 may be a software module in the operating system of the terminal device, or may be an application developed for the terminal device; of course, the LED backlight control system 100 can also be one of a number of hardware modules of the terminal device.
Alternatively, in another example, the LED backlight control system 100 and the terminal device may be separate devices, and the LED backlight control system 100 may be connected to the terminal device through a wired and/or wireless network and transmit the interactive information in an agreed data format.
Fig. 5 shows a flowchart of an LED backlight control method according to an embodiment of the present application. As shown in fig. 5, the LED backlight control method according to the embodiment of the present application includes: S110, detecting the ambient illumination intensity through an ambient light sensor; S120, setting backlight preference data of a user through a user interface; S130, emitting light of different brightness through the LED backlight source; and S140, controlling the brightness value of the LED backlight by a controller communicatively connected to the ambient light sensor, the user interface, and the LED backlight.
Fig. 6 shows a schematic diagram of the system architecture of substep S140 of the LED backlight control method according to an embodiment of the present application. In one possible implementation, as shown in fig. 6, controlling the brightness value of the LED backlight by the controller communicatively connected to the ambient light sensor, the user interface, and the LED backlight includes: collecting, by the ambient light sensor, ambient light intensity values at a plurality of predetermined time points within a predetermined time period; arranging the ambient light intensity values at the plurality of predetermined time points into an ambient light intensity time sequence input vector according to the time dimension; performing local time sequence feature analysis on the ambient light intensity time sequence input vector to obtain a sequence of ambient light intensity local time sequence feature vectors; performing transfer association coding of adjacent time sequence features on the ambient light intensity local time sequence feature vectors to obtain an ambient light intensity transfer time sequence feature; and determining, based on the ambient light intensity transfer time sequence feature, whether the brightness value of the LED backlight should be increased, maintained, or decreased.
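Putting the sub-steps of S140 together, an end-to-end control sketch might look as follows; the sampling window of 10 readings, the brightness step size, and the reuse of the modules from the earlier sketches are assumptions for illustration.

```python
import torch

def control_backlight(sensor_readings, extractor, enhancer, classifier, current_brightness, step=10):
    """Map a window of ambient light readings to an adjusted backlight brightness value.

    sensor_readings: 1-D tensor of ambient light intensity values at predetermined time points.
    """
    windows = sensor_readings.unfold(0, 10, 10)                   # non-overlapping windows of 10 samples
    feats = extractor(windows)                                    # (num_windows, feature_dim)
    transfer = torch.einsum('nf,ng->nfg', feats[1:], feats[:-1])  # adjacent transfer matrices
    enhanced = enhancer(transfer.unsqueeze(0))                    # add a batch dimension
    decision = int(torch.argmax(classifier(enhanced), dim=-1))    # 0 increase, 1 maintain, 2 decrease
    if decision == 0:
        return current_brightness + step
    if decision == 2:
        return current_brightness - step
    return current_brightness
```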
Here, it will be understood by those skilled in the art that the specific operations of the respective steps in the above-described LED backlight control method have been described in detail in the above description of the LED backlight control system with reference to fig. 1 to 4, and thus, repetitive descriptions thereof will be omitted.
Fig. 7 shows an application scenario diagram of an LED backlight control system according to an embodiment of the present application. As shown in fig. 7, in this application scenario, first, ambient light intensity values (e.g., D illustrated in fig. 7) at a plurality of predetermined time points within a predetermined period of time are acquired by an ambient light sensor (e.g., C illustrated in fig. 7), and then, the ambient light intensity values at the plurality of predetermined time points are input to a server (e.g., S illustrated in fig. 7) in which an LED backlight control algorithm is deployed, wherein the server is capable of processing the ambient light intensity values at the plurality of predetermined time points using the LED backlight control algorithm to obtain a classification result for indicating that the brightness value of the LED backlight should be increased, should be maintained, or should be decreased.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The foregoing description of embodiments of the application has been presented for purposes of illustration and description, and is not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the various embodiments described. The terminology used herein was chosen in order to best explain the principles of the embodiments, the practical application, or the improvement of technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (2)

1. An LED backlight control system, comprising:
an ambient light sensor for detecting ambient light intensity;
a user interface for setting backlight preference data of a user;
an LED backlight source, which consists of a plurality of LEDs and is used for emitting light of different brightness; and
a controller communicatively coupled to the ambient light sensor, the user interface, and the LED backlight, the controller for controlling a brightness value of the LED backlight;
Wherein, the controller includes:
The environment light intensity data acquisition module is used for acquiring environment light intensity values of a plurality of preset time points in a preset time period through the environment light sensor;
the light intensity data time sequence arrangement module is used for arranging the environment light intensity values of the plurality of preset time points into environment light intensity time sequence input vectors according to the time dimension;
The environment light intensity local time sequence feature extraction module is used for performing local time sequence feature analysis on the environment light intensity time sequence input vector to obtain a sequence of environment light intensity local time sequence feature vectors;
the environment light intensity local time sequence feature transfer association coding module is used for carrying out transfer association coding of adjacent environment light intensity time sequence features on each environment light intensity local time sequence feature vector in the sequence of the environment light intensity local time sequence feature vectors so as to obtain environment light intensity transfer time sequence features; and
The brightness control module is used for determining, based on the environment light intensity transfer time sequence features, whether the brightness value of the LED backlight source should be increased, maintained, or decreased;
the local time sequence feature extraction module of the environment light intensity comprises:
The light intensity time sequence vector segmentation unit is used for carrying out vector segmentation on the environment light intensity time sequence input vector so as to obtain a sequence of environment light intensity local time sequence input vector; and
The environment light intensity local time sequence feature capturing unit is used for enabling the sequence of the environment light intensity local time sequence input vectors to pass through an environment light intensity time sequence feature extractor based on a one-dimensional convolution layer to obtain the sequence of the environment light intensity local time sequence feature vectors;
The local time sequence characteristic transfer associated coding module of the environment light intensity comprises:
the adjacent time sequence light intensity characteristic transfer association unit is used for calculating a transfer matrix between every two adjacent environment light intensity local time sequence characteristic vectors in the sequence of the environment light intensity local time sequence characteristic vectors so as to obtain the sequence of the environment light intensity time sequence transfer characteristic matrix; and
The channel attention time sequence strengthening unit is used for enabling the sequence of the environment light intensity time sequence transfer characteristic matrix to pass through a time sequence strengthening device based on a channel attention layer to obtain a time sequence strengthening environment light intensity time sequence characteristic diagram as the environment light intensity transfer time sequence characteristic;
wherein the brightness control module is used for: passing the time sequence enhanced environment light intensity time sequence feature map through a classifier to obtain a classification result, wherein the classification result is used for indicating whether the brightness value of the LED backlight source should be increased, maintained, or decreased;
The system further comprises a training module for training the environment light intensity time sequence feature extractor based on the one-dimensional convolution layer, the time sequence enhancer based on the channel attention layer and the classifier;
Wherein, training module includes:
The training data acquisition unit is used for acquiring training data, wherein the training data includes training environment light intensity values at a plurality of predetermined time points within a predetermined time period, and the true value of whether the brightness value of the LED backlight source should be increased, maintained, or decreased;
The training data time sequence arrangement unit is used for arranging the training environment light intensity values of the plurality of preset time points into training environment light intensity time sequence input vectors according to the time dimension;
The training data time sequence vector segmentation unit is used for carrying out vector segmentation on the training environment light intensity time sequence input vector so as to obtain a sequence of training environment light intensity local time sequence input vector;
The training environment light intensity local time sequence feature extraction unit is used for enabling the sequence of the training environment light intensity local time sequence input vector to pass through the environment light intensity time sequence feature extractor based on the one-dimensional convolution layer to obtain the sequence of the training environment light intensity local time sequence feature vector;
The training environment light intensity local time sequence feature transfer association unit is used for calculating a transfer matrix between every two adjacent training environment light intensity local time sequence feature vectors in the training environment light intensity local time sequence feature vector sequence to obtain a training environment light intensity time sequence transfer feature matrix sequence;
The training light intensity time sequence characteristic strengthening unit is used for enabling the sequence of the training environment light intensity time sequence transfer characteristic matrix to pass through the time sequence strengthening device based on the channel attention layer so as to obtain a training time sequence strengthening environment light intensity time sequence characteristic diagram;
The classification loss unit is used for enabling the training time sequence enhanced environment light intensity time sequence characteristic diagram to pass through the classifier to obtain a classification loss function value; and
The model training unit is used for training the environment light intensity time sequence feature extractor based on the one-dimensional convolution layer, the time sequence enhancer based on the channel attention layer and the classifier based on the classification loss function value and through gradient descending direction propagation, wherein each feature matrix of the training time sequence enhanced environment light intensity time sequence feature graph is subjected to probability density convergence optimization of feature scale constraint during each iteration of the training;
wherein the model training unit comprises:
the optimization weighting factor calculation subunit is used for performing probability density convergence optimization under the feature scale constraint on each feature matrix of the training time sequence enhanced environment light intensity time sequence feature map to obtain a first weighting coefficient and a second weighting vector; and
the feature optimization subunit is used for performing channel-dimension weighting on the training time sequence enhanced environment light intensity time sequence feature map with the first weighting coefficient, and weighting each feature matrix of the training time sequence enhanced environment light intensity time sequence feature map along the channel dimension with the corresponding position value of the second weighting vector as a weighting factor, so as to obtain an optimized training time sequence enhanced environment light intensity time sequence feature map.
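For orientation only, here is a minimal training-step sketch in Python/PyTorch. Cross-entropy and a stock optimiser are assumptions (the claim only requires a classification loss function value and propagation along the gradient descent direction), and the closed form of the probability density convergence optimization under the feature scale constraint is not reproduced: a scalar statistic and a channel-wise softmax vector merely stand in for the first weighting coefficient and the second weighting vector.

import torch
import torch.nn as nn

def scale_constrained_weights(feature_map: torch.Tensor):
    # feature_map: (batch, C, D, D) training time sequence enhanced feature map.
    flat = feature_map.flatten(start_dim=2)
    w1 = 1.0 / (1.0 + flat.std())                     # stand-in for the first weighting coefficient (scalar)
    w2 = torch.softmax(flat.mean(dim=(0, 2)), dim=0)  # stand-in for the second weighting vector (one value per channel)
    return w1, w2

def train_step(extractor, enhancer, classifier, optimizer, light_series, labels):
    criterion = nn.CrossEntropyLoss()                  # classification loss
    feats = extractor(light_series)                    # sequence of local time sequence feature vectors
    # Adjacent outer-product transfer matrices, as in the inference-path sketch above.
    pairs = torch.einsum("bnd,bne->bnde", feats[:, 1:], feats[:, :-1])
    fmap = enhancer(pairs)                             # training time sequence enhanced feature map
    w1, w2 = scale_constrained_weights(fmap)
    fmap = w1 * fmap * w2[None, :, None, None]         # channel-dimension weighting of each feature matrix
    loss = criterion(classifier(fmap), labels)         # classification loss function value
    optimizer.zero_grad()
    loss.backward()                                    # propagate along the gradient descent direction
    optimizer.step()
    return loss.item()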
2. An LED backlight source control method using the LED backlight source control system according to claim 1, comprising:
detecting the ambient illumination intensity by the ambient light sensor;
setting backlight preference data of a user through the user interface;
emitting light of different brightness levels through the LED backlight source; and
controlling the brightness value of the LED backlight source by the controller communicatively connected to the ambient light sensor, the user interface and the LED backlight source.
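To show how the claimed method steps fit together at run time, here is a minimal controller-loop sketch in Python. The sensor, classifier and backlight-driver callbacks (read_ambient_lux, decide_action, set_backlight) and the multiplicative use of the user preference are hypothetical placeholders; the claim itself only requires that the controller, connected to the sensor, the user interface and the backlight source, adjusts the brightness value.

import time

def control_loop(read_ambient_lux, decide_action, set_backlight,
                 user_preference: float = 1.0, step: float = 0.05,
                 brightness: float = 0.5, period_s: float = 1.0):
    # read_ambient_lux(): ambient light sensor reading; decide_action(history): classifier decision;
    # set_backlight(level): LED backlight driver. All three are hypothetical callbacks.
    history = []
    while True:
        history.append(read_ambient_lux())           # detect ambient illumination intensity
        action = decide_action(history)               # "increase" / "keep" / "decrease"
        if action == "increase":
            brightness = min(1.0, brightness + step)
        elif action == "decrease":
            brightness = max(0.0, brightness - step)
        set_backlight(brightness * user_preference)   # apply the user's backlight preference
        time.sleep(period_s)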
CN202311314104.5A 2023-10-11 2023-10-11 LED backlight source control system and method thereof Active CN117334162B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311314104.5A CN117334162B (en) 2023-10-11 2023-10-11 LED backlight source control system and method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311314104.5A CN117334162B (en) 2023-10-11 2023-10-11 LED backlight source control system and method thereof

Publications (2)

Publication Number Publication Date
CN117334162A (en) 2024-01-02
CN117334162B (en) 2024-05-10

Family

ID=89289984

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311314104.5A Active CN117334162B (en) 2023-10-11 2023-10-11 LED backlight source control system and method thereof

Country Status (1)

Country Link
CN (1) CN117334162B (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1569194A1 (en) * 2004-02-13 2005-08-31 Sony Ericsson Mobile Communications AB Portable electronic device controlled according to ambient illumination
CN202363084U (en) * 2011-12-12 2012-08-01 上海全一通讯技术有限公司 Direct type LED (light emitting diode) backlight assembly and liquid crystal display using same
CN102946494A (en) * 2012-11-27 2013-02-27 广东欧珀移动通信有限公司 Mobile terminal and method for automatically adjusting backlight brightness
CN109951594A (en) * 2017-12-20 2019-06-28 广东欧珀移动通信有限公司 Intelligent adjusting method, device, storage medium and the mobile terminal of screen intensity
CN111476219A (en) * 2020-06-02 2020-07-31 苏州科技大学 Image target detection method in intelligent home environment
CN114127838A (en) * 2019-05-20 2022-03-01 美商新思科技有限公司 Classifying patterns in an electronic circuit layout using machine learning based coding
CN114397779A (en) * 2021-12-08 2022-04-26 深圳市穗晶光电股份有限公司 Structure for enabling mini LED backlight partition brightness to be uniform
CN115545168A (en) * 2022-10-31 2022-12-30 齐鲁工业大学 Dynamic QoS prediction method and system based on attention mechanism and recurrent neural network
CN115996504A (en) * 2022-11-21 2023-04-21 江苏东成工具科技有限公司 Brightness adjustment method, device and computer readable medium
CN116744511A (en) * 2023-05-22 2023-09-12 杭州行至云起科技有限公司 Intelligent dimming and toning lighting system and method thereof
CN116844494A (en) * 2023-07-06 2023-10-03 北斗星通智联科技有限责任公司 Screen backlight brightness adjusting method and device, computer equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130221855A1 (en) * 2012-02-24 2013-08-29 Research In Motion Limited Controlling backlight of a portable electronic device

Also Published As

Publication number Publication date
CN117334162A (en) 2024-01-02

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant