CN115578271A - Image processing method, projection apparatus, and storage medium - Google Patents

Image processing method, projection apparatus, and storage medium Download PDF

Info

Publication number
CN115578271A
CN115578271A (Application CN202211098336.7A)
Authority
CN
China
Prior art keywords
brightness
source image
light source
image
matrix
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211098336.7A
Other languages
Chinese (zh)
Inventor
全晓荣
张聪
胡震宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Huole Science and Technology Development Co Ltd
Original Assignee
Shenzhen Huole Science and Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Huole Science and Technology Development Co Ltd filed Critical Shenzhen Huole Science and Technology Development Co Ltd
Priority to CN202211098336.7A priority Critical patent/CN115578271A/en
Publication of CN115578271A publication Critical patent/CN115578271A/en
Pending legal-status Critical Current

Classifications

    • G06T5/90
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/90 Determination of colour characteristics
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/12 Picture reproducers
    • H04N9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179 Video signal processing therefor
    • H04N9/3182 Colour adjustment, e.g. white balance, shading or gamut

Abstract

The application relates to an image processing method, a projection device, and a storage medium. The method includes: acquiring a source image, and determining brightness parameters of the source image according to pixel information of the source image; determining, based on the brightness parameters, the target light source brightness required for displaying the source image, and causing a light source module to emit light at the target light source brightness; updating the brightness value of each pixel point in the source image based on the brightness parameters to obtain a target brightness matrix; and modulating the source image based on the target light source brightness and the target brightness matrix to obtain a modulated source image, so that the projection device projects and displays the modulated source image. With the method and the device, the dynamic contrast of the image can be guaranteed while power consumption is effectively reduced, and the overall projection viewing effect is improved.

Description

Image processing method, projection apparatus, and storage medium
Technical Field
The present application relates to the field of projection technologies, and in particular, to an image processing method, a projection device, and a storage medium.
Background
With the development of the projector market, users place ever higher requirements on the subjective visual experience of the projected picture. Contrast is critical to the visual effect: high contrast greatly improves the definition, detail expression, and gray-scale expression of a projected picture.
However, in practical applications, improving the dynamic contrast of an image usually requires increasing the output power of the projector. This results in high power consumption, so the projector readily heats up during operation and wastes energy, which is unfavorable for energy saving.
Disclosure of Invention
The application discloses an image processing method, a projection device, and a storage medium, which can improve the dynamic contrast of an image and the overall projection effect while effectively reducing power consumption.
In a first aspect, the present application relates to an image processing method, comprising: acquiring a source image, and determining brightness parameters of the source image according to pixel information of the source image; determining, based on the brightness parameters, the target light source brightness required for displaying the source image, and causing a light source module to emit light at the target light source brightness; updating the brightness value of each pixel point in the source image based on the brightness parameters to obtain a target brightness matrix; and modulating the source image based on the target light source brightness and the target brightness matrix to obtain a modulated source image, so that the projection device projects and displays the modulated source image.
Optionally, the step of determining the brightness parameters of the source image according to the pixel information of the source image includes: taking the maximum sub-pixel value of each pixel point in the source image as the brightness value of that pixel point, where each pixel point in the source image comprises a plurality of sub-pixels; determining a first brightness matrix of the source image based on the brightness value of each pixel point of the source image; and determining the brightness parameters of the source image according to the first brightness matrix.
Optionally, the step of determining the target light source brightness required for displaying the source image based on the brightness parameters includes: determining a light source debugging coefficient of the source image based on the brightness parameters; and determining, based on the light source debugging coefficient, the target number of light source modules to be turned on, where light at the target light source brightness can be emitted when the target number of light source modules are turned on.
Optionally, the method further includes: acquiring the real-time temperature of each primary color lamp in the light source modules, where each light source module comprises a plurality of primary color lamps; confirming the temperature compensation coefficient corresponding to each primary color lamp based on its real-time temperature; and correspondingly adjusting the brightness of each primary color lamp according to the temperature compensation coefficient.
Optionally, each light source module includes a red primary color lamp, a green primary color lamp, and a blue primary color lamp, and the step of correspondingly adjusting the brightness of each primary color lamp according to the temperature compensation coefficient includes: respectively adjusting the brightness of the red, green, and blue primary color lamps so that the second brightness ratio among the adjusted red, green, and blue primary color lamps is the same as the first brightness ratio, where the first brightness ratio is the brightness ratio among red light, green light, and blue light in the source image.
Optionally, the step of updating the brightness value of each pixel point in the source image based on the brightness parameters to obtain the target brightness matrix includes: determining a pixel compensation coefficient of the source image according to the brightness parameters; performing a mapping transformation on the first brightness matrix based on a preset mapping relationship to obtain a second brightness matrix; and obtaining the target brightness matrix according to the pixel compensation coefficient and the second brightness matrix.
Optionally, the step of performing mapping transformation on the first luminance matrix to obtain a second luminance matrix includes: and searching a corresponding mapping value for each brightness value in the first brightness matrix based on the mapping relation, and updating the corresponding brightness value according to the mapping value to obtain the second brightness matrix.
Optionally, the step of obtaining the target luminance matrix according to the pixel compensation coefficient and the second luminance matrix includes: let the target luminance matrix P = β × Lum ', where β represents a pixel compensation coefficient, and Lum' represents the second luminance matrix.
In a second aspect, the present application further provides a projection apparatus, comprising: one or more processors; a memory; and one or more application programs, one or more of which are stored in the memory and configured to be executed by the processor to implement the image processing method of the first aspect described above.
In a third aspect, the present application further provides a computer-readable storage medium, on which a computer program is stored, the computer program being loaded by a processor to perform the image processing method of the first aspect.
The application relates to an image processing method, a projection apparatus, and a storage medium. According to the pixel information and the overall brightness parameters of each frame of source image, the method dynamically determines the target light source brightness required by that frame, reducing overall power consumption while ensuring that the projected image is not distorted; and, for the change in output light source brightness, it applies corresponding pixel compensation to each frame of source image to improve the dynamic contrast of the projected image.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings required for describing the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application, and those skilled in the art can obtain other drawings based on these drawings without creative effort.
Fig. 1 is an application environment diagram of an image processing method according to an embodiment of the present application.
Fig. 2 is a first flowchart illustrating an image processing method according to an embodiment of the present disclosure.
Fig. 3 is an application schematic diagram of an image processing method provided in an embodiment of the present application.
Fig. 4 is a flowchart for determining the brightness of the target light source according to an embodiment of the present application.
Fig. 5 is a flowchart for obtaining a target luminance matrix according to an embodiment of the present application.
Fig. 6 is an exemplary diagram of a mapping relationship provided in an embodiment of the present application.
Fig. 7 is a schematic flowchart of a second image processing method according to an embodiment of the present application.
Fig. 8 is a schematic structural diagram of a projection apparatus provided in an embodiment of the present application.
Detailed Description
Embodiments of the present application will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present application are shown in the drawings, it should be understood that the present application may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but rather are provided for a more thorough and complete understanding of the present application. It should be understood that the drawings and embodiments of the present application are for illustration purposes only and are not intended to limit the scope of the present application.
It should be understood that the various steps recited in the method embodiments of the present application may be performed in a different order and/or performed in parallel. Moreover, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present application is not limited in this respect.
The term "include" and variations thereof as used herein are open-ended, i.e., "including but not limited to". The term "based on" is "based, at least in part, on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Relevant definitions for other terms will be given in the following description.
It should be noted that the terms "first", "second", and the like in the present application are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It should be noted that the modifiers "a", "an", and "the" in this application are illustrative rather than limiting, and should be understood by those skilled in the art as "one or more" unless the context clearly indicates otherwise.
The names of messages or information exchanged between devices in the embodiments of the present application are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
For example, fig. 1 is a diagram of an application environment of an image processing method according to an embodiment of the present application. When a projection device (for example, the projection device 101 in fig. 1) projects an original image to be projected (hereinafter referred to as a source image) onto a projection plane (for example, the plane 102 in fig. 1) for visual presentation, the brightness, contrast, and other attributes of the image displayed on the projection plane vary with the light source brightness of the projection device, which is in turn controlled by the light source controller of the projection device.
In the currently common projection display technology, the light source controller of the projection device usually outputs a fixed voltage and current and cannot be adjusted dynamically according to the image to be projected that is input at the front end. If the light source controller maintains a high current output, the power is high and causes high energy consumption; if it maintains a low current output, the brightness of the image displayed by the projection device is reduced and the user's visual experience suffers. Therefore, the current static dimming mode cannot simultaneously improve image contrast and achieve energy-saving, low-power operation.
The image processing method provided by the embodiments of the present application can be applied to determining the target light source brightness of the projector according to the pixel information of the source image and performing pixel compensation on the source image, so that the pixel-compensated source image is displayed at the target light source brightness. This reduces the power of the projector while enhancing the dynamic contrast of the image, providing the user with a better visual experience.
Fig. 2 is a schematic flow chart of an image processing method according to an embodiment of the present disclosure.
In this embodiment, the image processing method may be applied to a projection device (for example, the projection device shown in fig. 8). For a computer device that needs to perform image processing and storage, the image processing functions provided by the method of the present application may be directly integrated into the projection device, or may run on the projection device in the form of a Software Development Kit (SDK).
As shown in fig. 2, the image processing method specifically includes the following steps. The order of the steps in the flowchart may be changed and some steps may be omitted according to different requirements.
S1, acquiring a source image, and determining the brightness parameter of the source image according to the pixel information of the source image.
In an embodiment, the source image refers to an image to be projected by the projection device. For example, the projection device may treat each frame of a video or multi-frame image sequence to be projected that is input by an external electronic device as a frame of source image, or may treat each frame of an image to be projected that is pre-stored inside the projection device or received from a server as a source image. In subsequent steps, multiple frames of source images may be processed in parallel.
In one embodiment, the source image may be of various types. For example, the source image may be an 8-bit RGB (Red, Green, Blue) rectangular image in which each pixel includes three sub-pixels (R, G, B); the brightness values of the source image then lie in the range [0, 255], with a maximum gray-scale value of 255.
In one embodiment, the pixel information of the source image includes the number of pixel points included in the long side of the source image, the number of pixel points included in the short side of the source image, the brightness value of each pixel point, and the like. The position of each pixel point in the source image can be determined according to the number of the pixel points contained in the long edge of the source image and the number of the pixel points contained in the short edge of the source image.
For example, the position of any pixel point in the source image can be determined by taking the lower left corner of the source image as a coordinate origin, taking the long side of the source image as an abscissa axis, taking the short side of the source image as an ordinate axis, and taking the size of one pixel point as a unit length.
In one embodiment, determining the brightness parameter of the source image from the pixel information of the source image comprises:
s11, taking the maximum value of the sub-pixel of each pixel point in the source image as the brightness value of the pixel point; and each pixel point in the source image comprises a plurality of sub-pixels.
And S12, determining a first brightness matrix of the source image based on the brightness value of each pixel point of the source image.
S13, determining the brightness parameters of the source image according to the first brightness matrix, wherein the brightness parameters comprise a first brightness parameter and a second brightness parameter, the first brightness parameter comprising the maximum brightness and the second brightness parameter comprising the average brightness.
For example, the brightness value of each pixel point in the source image is determined, where each pixel point includes a plurality of sub-pixels R, G, and B (for example, a certain pixel point has the value (128, 64, 64), where the brightness value of sub-pixel R is 128, the brightness value of sub-pixel G is 64, and the brightness value of sub-pixel B is 64);
the maximum sub-pixel value of each pixel point of the source image is taken as the brightness value of that pixel point, i.e. max(LR(i, j), LG(i, j), LB(i, j)), where max() denotes the maximum-value function (e.g., max(128, 64, 64) = 128), (i, j) denotes the position of the pixel point in the source image, LR(i, j) denotes the brightness value of sub-pixel R of the pixel point at (i, j), LG(i, j) denotes the brightness value of sub-pixel G of the pixel point at (i, j), and LB(i, j) denotes the brightness value of sub-pixel B of the pixel point at (i, j);
a first brightness matrix of the source image is determined based on the brightness value of each pixel point of the source image, i.e. the elements of the first brightness matrix Lum are Lum(i, j) = max(LR(i, j), LG(i, j), LB(i, j));
the first brightness parameter of the source image is determined from the maximum of all brightness values in the first brightness matrix, i.e. the first brightness parameter Lmax = max(Lum(i, j)), the maximum of all brightness values in the first brightness matrix Lum;
the average of all brightness values is determined from the first brightness matrix, and the second brightness parameter of the source image is determined from this average, i.e. the second brightness parameter Lave equals the average of all brightness values in the first brightness matrix Lum.
In one embodiment, the first luminance matrix is formed based on the maximum luminance values of the sub-pixels of the pixel points of the source image, so that the color details of the source image can be better preserved.
In an embodiment, the maximum brightness value, the first brightness matrix Lum, the first brightness parameter Lmax, and the second brightness parameter Lave in the foregoing embodiments may also be computed in other ways, for example with different scaling factors, which is not limited in this application.
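As a non-limiting illustration of the calculation above, the following sketch assumes the source image is an 8-bit RGB array handled with NumPy; the function and variable names are illustrative and do not come from the application itself.

```python
import numpy as np

def brightness_parameters(source_image: np.ndarray):
    """Compute the first brightness matrix Lum and the parameters Lmax and Lave.

    source_image: H x W x 3 uint8 array holding the R, G, B sub-pixels.
    """
    # Lum(i, j) = max(LR(i, j), LG(i, j), LB(i, j))
    lum = source_image.max(axis=2).astype(np.float64)
    l_max = float(lum.max())   # maximum brightness Lmax
    l_ave = float(lum.mean())  # average brightness Lave
    return lum, l_max, l_ave
```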
In an embodiment, as shown in fig. 3, for an application schematic diagram of the image processing method provided in the embodiment of the present application, the front-end image processor shown in fig. 3 may be used to perform the preprocessing in step S1 on the source image.
Furthermore, the front-end image processor shown in fig. 3 may also be represented in fig. 8 as an image processing section of the projection apparatus, the light source controller shown in fig. 3 may also be represented in fig. 8 as a control section of the projection apparatus, and the debug display shown in fig. 3 may also be represented in fig. 8 as a light modulator of the projection apparatus.
S2, determining the brightness of a target light source required by the display of the source image based on the brightness parameter; and the light source module emits light with the target light source brightness.
In one embodiment, the brightness, color contrast, etc. of the image projected by the projection device onto the projection plane will vary with the brightness of the light source of the projection device.
The projection device may use various configurations, such as multi-unit light source modules, multi-lamp modules, or single-lamp modules, for illumination and projection. In the embodiments of the present application, the projection device uses multi-unit light source modules (for example, the plurality of light sources shown in fig. 3) for illumination.
The projection device may use a light source controller (for example, as shown in fig. 3) to adjust the number of turned-on light source modules (e.g., the number n of light sources shown in fig. 3) and the power of each light source module, so as to control the output light source brightness and, while effectively reducing power consumption, avoid image distortion caused by excessive loss of image brightness.
Specifically, the larger the number of light source modules used in projection and the higher the power of each light source module, the brighter the light source output by the projection device and the higher the power consumption. Each light source module may include a plurality of primary color lamps, and the colors of the light emitted by the primary color lamps used to illuminate the source image correspond one to one with the colors of the sub-pixels of the source image. For example, when an 8-bit RGB image is illuminated, three primary color lamps R, G, and B may be used (for example, as shown in fig. 3).
In one embodiment, the pixel information of the source image differs from frame to frame, and so does the light source brightness value (hereinafter referred to as the target light source brightness) required to preserve image detail while saving power during projection. The target light source brightness can be obtained by determining the number of light source modules and the brightness value emitted by each light source module.
Therefore, when the source image is projected, the number of light source modules required for illuminating the source image (hereinafter simply referred to as the target number) and the brightness value each light source module needs to emit (hereinafter simply referred to as the target brightness value) need to be determined according to the first brightness parameter and the second brightness parameter of the source image.
In one embodiment, determining the target light source brightness required to present the source image based on the brightness parameter includes steps S21-S23 in the flowchart of FIG. 4:
and S21, determining a light source debugging coefficient of the source image based on the brightness parameter.
In one embodiment, the light source debugging coefficient is obtained from the first brightness matrix Lum, the first brightness parameter Lmax, and the second brightness parameter Lave as follows: let L = Lave + (Ldiff + Ldiff²/255)/2, where Ldiff = Lmax - Lave, and let the light source debugging coefficient λ = L/255. Accordingly, the value range of the light source debugging coefficient λ is [0, 1].
It can be seen that the value range of L is the same as the value range of the brightness values of the pixels in the source image (i.e., [0, 255]); when the brightness values of all pixels in the source image are 0, L = 0; and when L ≠ 0, L is greater than or equal to the average brightness Lave.
S22, determining the target number of the light source modules to be started based on the light source debugging coefficient; when the light source modules of the target number are started, light with the brightness of the target light source can be emitted.
In one embodiment, determining the target number of light source modules to be turned on for displaying the source image includes: determining the target number according to the light source debugging coefficient and the total number of light source modules, i.e. letting the target number m = [λ × n], where n is a positive integer representing the total number of light source modules, and [λ × n] denotes rounding λ × n.
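For illustration only, the following sketch combines the light source debugging coefficient and the target number of light source modules as described above; it relies on the reconstructed formula L = Lave + (Ldiff + Ldiff²/255)/2 and on assumed names, so it is a sketch rather than the application's own implementation.

```python
def target_module_count(l_max: float, l_ave: float, total_modules: int) -> int:
    """Light source debugging coefficient lambda and target number m = [lambda * n]."""
    l_diff = l_max - l_ave
    # L = Lave + (Ldiff + Ldiff^2 / 255) / 2 keeps L in [0, 255], so lambda stays in [0, 1]
    l_value = l_ave + (l_diff + l_diff ** 2 / 255.0) / 2.0
    lam = l_value / 255.0
    return int(round(lam * total_modules))  # [lambda * n]: rounding lambda * n
```

For example, with l_max = 255, l_ave = 100 and n = 10 modules, λ is about 0.88, so nine of the ten modules would be turned on.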
In one embodiment, in order for the light source modules to emit light with a target light source brightness, a target brightness value of each primary color lamp in each of a target number of light source modules is also determined. Therefore, the method provided by the embodiment of the present application further includes:
step S101, acquiring real-time temperature of each primary color lamp in a light source module, wherein each light source module comprises a plurality of primary color lamps.
And S102, confirming the temperature compensation coefficient corresponding to each primary color lamp based on the real-time temperature of each primary color lamp.
And S103, correspondingly adjusting the brightness of each primary color lamp according to the temperature compensation coefficient.
In one embodiment, each light source module comprises a red primary color lamp, a green primary color lamp, and a blue primary color lamp; correspondingly adjusting the brightness of each primary color lamp according to the temperature compensation coefficient includes:
respectively adjusting the brightness of the red, green, and blue primary color lamps so that the second brightness ratio among the adjusted red, green, and blue primary color lamps is the same as the first brightness ratio, where the first brightness ratio is the brightness ratio among red light, green light, and blue light in the source image.
In one embodiment, adjusting the current of each primary color lamp according to the temperature compensation coefficient may further include:
respectively adjusting the current of each primary color lamp so that the adjusted current of each primary color lamp is the same as that lamp's current at the corresponding ambient temperature.
In one embodiment, a temperature detector (for example, as shown in fig. 3) may be used to measure the real-time temperature of each primary color lamp, and the temperature compensation coefficient corresponding to each primary color lamp at different temperatures is looked up in a preset ambient temperature compensation coefficient table. For example, if the real-time temperature of the three primary color lamps is 30 degrees, the corresponding temperature compensation coefficients (λR', λG', λB') are those that restore the lamps' behaviour at an ambient temperature of 24 degrees.
Specifically, the currents of the tricolor lamps with the real-time temperature of 30 degrees are respectively IR, IG and IB, and the currents of the tricolor lamps with the ambient temperature of 24 degrees stored in the corresponding ambient temperature compensation coefficient table are respectively IR ', IG ' and IB '. Therefore, the effects of the ambient temperature compensation coefficients (λ R ', λ G ', λ B ') in the ambient temperature compensation coefficient table include: so that IR '= λ R' × IR, IG '= λ G' × IG and IB '= λ B' × IB.
Furthermore, the effect of the temperature compensation coefficients (λ R ', λ G ', λ B ') includes: it is ensured that the adjusted second luminance ratio between the primary color lamps (e.g., red, green, and blue) is still consistent with the first luminance ratio of the primary color lights (e.g., red, green, and blue lights) in the source image.
Specifically, if the light source luminances of R, G, and B primary colors in the source image are KLR, KLG, and KLB, respectively, and the adjusted light source luminances of the red, green, and blue primary colors are KLR ', KLG ', and KLB ', respectively, then KLR: KLG: KLB = KLR': KLG': KLB'. Therefore, the lightened source image can be ensured to have the same tone as the original source image, and the color cast condition of the picture displayed by the projection equipment is avoided.
In one embodiment, the light source module may include primary color lamps of multiple colors (e.g., a red primary color lamp, a green primary color lamp, and a blue primary color lamp), or may include only primary color lamps of the same color (e.g., all of the blue primary color lamps). When the light source module only comprises primary color lamps of the same color, the current of the primary color lamps can be directly adjusted according to the temperature compensation coefficient.
As described above, the current of each primary color lamp can be adjusted according to the temperature compensation coefficient so that its luminous intensity matches the luminous intensity at the ambient temperature; in this way the light source brightness of each primary color lamp is adjusted to the target brightness value corresponding to the real-time ambient temperature, which reduces power consumption while preserving the same hue as the source image.
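As a purely illustrative sketch of the temperature compensation described above, the lamp currents could be scaled with a lookup table; the table contents, its granularity, and all names below are assumptions rather than values from the application.

```python
# Hypothetical ambient temperature compensation coefficient table:
# real-time lamp temperature (degrees C) -> (lambda_R', lambda_G', lambda_B')
TEMP_COMP_TABLE = {
    24: (1.00, 1.00, 1.00),
    30: (1.05, 1.04, 1.06),  # example values only
}

def compensate_currents(temp_c: int, i_r: float, i_g: float, i_b: float):
    """Scale the R/G/B primary lamp currents back to their ambient-temperature values."""
    lam_r, lam_g, lam_b = TEMP_COMP_TABLE[temp_c]
    # IR' = lambda_R' x IR, IG' = lambda_G' x IG, IB' = lambda_B' x IB
    return lam_r * i_r, lam_g * i_g, lam_b * i_b
```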
In one embodiment, after the target number of light source modules to be turned on for the source image and the target brightness value of the primary color lamps of each light source module have been determined, each primary color lamp in the target number of light source modules emits light at its target brightness value, and light at the target light source brightness can thus be emitted.
In one embodiment, as can be seen from the above relationships, the target light source brightness is determined by the average brightness Lave of the source image and by Ldiff.
Specifically: a) when the value of Lave is large and the source image contains many high-brightness pixels, the target light source brightness required for projection is high and the light source modules need to output a high current, which effectively guarantees the overall brightness of the source image and reduces distortion after projection; b) when Lave is in the middle range, for example between 100 and 150, the brightness of the source image is relatively uniform, and the target light source brightness during projection can be reduced to lower the power consumption of the projection device; c) when Lave is low and the maximum brightness of the source image is also low, the target light source brightness during projection can likewise be reduced to save power; however, when the maximum brightness value of the source image is large, the target light source brightness required for projection should remain high to prevent distortion of the projected image and a drop in overall brightness.
And S3, updating the brightness value of each pixel point in the source image based on the brightness parameter to obtain a target brightness matrix.
In one embodiment, pixel compensation is performed on the source image using a spatial-domain image enhancement technique based on the brightness parameters of the source image, so that the brightness value of each pixel point in the source image is updated; once the target brightness matrix is obtained, the pixel-compensated source image is obtained from it.
In one embodiment, the spatial domain image enhancement technology is an image enhancement technology based on a spatial domain, and can directly process each pixel point of an image, so that the image is imaged uniformly, the dynamic range of the image is expanded, the contrast of the image is expanded, and the like.
In the embodiment of the application, the pixel compensation is carried out on the source image based on the spatial domain enhancement technology, so that the overall brightness and color of the image displayed by the projection after the power consumption reduction and dimming are basically unchanged from the source image.
In an embodiment, the updating of the luminance value of each pixel point in the source image based on the luminance parameter of the source image to obtain the target luminance matrix includes steps S31 to S33 in the flowchart of obtaining the target luminance matrix shown in fig. 5.
And S31, determining a pixel compensation coefficient of the source image according to the brightness parameter of the source image.
In one embodiment, determining the pixel compensation coefficient of the source image based on the brightness parameters of the source image includes: determining the pixel compensation coefficient based on the first brightness parameter and the second brightness parameter.
Specifically, when L ≠ 0, let the pixel compensation coefficient β = 255/L, where the brightness parameter L = Lave + (Ldiff + Ldiff²/255)/2 and Ldiff = Lmax - Lave; when L = 0, let the pixel compensation coefficient β = 0.
And S32, mapping the first brightness matrix based on a preset mapping relation to obtain a second brightness matrix.
In one embodiment, updating the luminance values in the luminance matrix to obtain an updated luminance matrix includes: and searching a corresponding mapping value for each brightness value in the first brightness matrix based on a preset mapping relation, and updating the corresponding brightness value according to the mapping value to obtain a second brightness matrix.
Specifically, a preset mapping relationship table stores the mapping relationship, including a mapping value corresponding to each brightness value in the first brightness matrix; the mapping values may be set according to the specific image quality and effect to be achieved, provided that the trend of pixels in the image from dark to bright is not changed.
For example, fig. 6 illustrates an exemplary mapping relationship provided in the embodiment of the present application, wherein an S-shaped curve may be represented as a mapping relationship of an S-shaped conversion trend, and the first luminance matrix Lum may be updated according to the S-shaped curve to obtain the second luminance matrix Lum'.
Specifically, both coordinate axes of the rectangular coordinate system in fig. 6 represent luminance values, where x represents a luminance value in the first luminance matrix Lum (as shown by the horizontal axis), and y represents a luminance value of the second luminance matrix Lum' (as shown by the vertical axis).
The straight line y = x has three intersections with the S-shaped curve: the two end points and the middle point A(x1, y1). When a brightness value in the first brightness matrix Lum before mapping (hereinafter, the input value) lies on an intersection of the line y = x and the S-shaped curve, the value mapped onto the y-axis is unchanged, i.e. the brightness value at the same position in the second brightness matrix Lum' (hereinafter, the output value) is unchanged; when the input value is between 0 and x1, the output value after the S-curve mapping is smaller than the input value; when the input value is between x1 and 255, the output value after the S-curve mapping is larger than the input value.
Based on the above embodiment, whether a brightness value is increased or decreased can be controlled through the intersections of the S-shaped curve with y = x, the magnitude of the decrease can be controlled through y1, and the magnitude of the increase can be controlled through y2, where y1 may be the ordinate of the point at which the S-shaped curve falls furthest below the line y = x (the point of maximum decrease), and y2 may be the ordinate of the point at which it rises furthest above the line y = x (the point of maximum increase).
In addition, the S-shaped curve above is one possible implementation of the mapping relationship; any curve, straight line, or polyline with a similar trend may also be used, and the final mapping result may be stored in the form of a mapping relationship table. Looking up the corresponding mapping value for each brightness value in the brightness matrix by table lookup simplifies the hardware environment, reduces computation time, and allows dynamic control.
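As a rough, non-authoritative sketch of the table-lookup mapping described above, a 256-entry lookup table with an S-shaped trend could be built and applied as follows; the specific logistic curve and its parameters are assumptions, since the application only requires that the dark-to-bright ordering of pixels be preserved.

```python
import numpy as np

def build_s_curve_lut(strength: float = 5.0) -> np.ndarray:
    """256-entry lookup table following an S-shaped (logistic) trend on [0, 255]."""
    x = np.arange(256, dtype=np.float64)
    y = 255.0 / (1.0 + np.exp(-strength * (x - 127.5) / 127.5))
    # Rescale so 0 maps to 0 and 255 maps to 255 while keeping the monotone trend
    y = (y - y[0]) / (y[-1] - y[0]) * 255.0
    return np.round(y).astype(np.uint8)

def apply_mapping(lum: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """Second brightness matrix Lum' obtained by looking up each value of Lum."""
    return lut[lum.astype(np.uint8)]
```

With this curve, values below the middle intersection are pulled down and values above it are pushed up, matching the behaviour described for fig. 6.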
And step S33, obtaining a target brightness matrix according to the pixel compensation coefficient and the second brightness matrix.
In one embodiment, obtaining the target luminance matrix according to the pixel compensation coefficients and the second luminance matrix comprises:
let the target luminance matrix P = β × Lum ', where β represents the pixel compensation coefficient and Lum' represents the second luminance matrix.
In an embodiment, the values in the target brightness matrix P are used to update the brightness values of the corresponding pixel points (the pixel points at the same coordinate positions) in the source image, so as to obtain the pixel-compensated source image.
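Continuing the illustrative sketches above, the pixel compensation coefficient β and the target brightness matrix P can be combined to rescale the source image. One possible reading, not spelled out in the application, is to scale each pixel's sub-pixels by P(i, j)/Lum(i, j); the names below are assumptions.

```python
import numpy as np

def compensate_source_image(source_image: np.ndarray, lum: np.ndarray,
                            lum_mapped: np.ndarray, l_value: float) -> np.ndarray:
    """Apply P = beta * Lum' and rescale the source image accordingly (one possible reading)."""
    beta = 0.0 if l_value == 0 else 255.0 / l_value   # pixel compensation coefficient
    p = beta * lum_mapped.astype(np.float64)          # target brightness matrix P
    scale = np.divide(p, lum, out=np.zeros_like(p), where=lum > 0)
    compensated = source_image.astype(np.float64) * scale[..., np.newaxis]
    return np.clip(compensated, 0, 255).astype(np.uint8)
```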
And S4, modulating the source image based on the target light source brightness and the target brightness matrix, obtaining a modulated source image, and enabling the projection equipment to project and display the modulated source image.
In one embodiment, the source image is modulated by a predetermined display (e.g., the modulation display shown in fig. 3) based on the target light source brightness and the target brightness matrix, so that the projection device projects and displays the modulated source image.
In an embodiment, for example, as shown in fig. 7, a second flowchart of the image processing method according to the above-mentioned embodiment of the present application is shown.
The image processing method provided by the embodiments of the present application can determine, from the pixel information of each frame of source image, the corresponding pixel compensation coefficient and the target number of light source modules to turn on, and can adjust the luminous power of the primary color lamps based on their real-time temperature so that they reach their target brightness values. Each primary color lamp in the target number of light source modules then outputs the target light source brightness according to its target brightness value, the pixel-compensated source image obtained with the pixel compensation coefficient is illuminated and modulated, and the modulated image is presented.
The beneficial effects of the image processing method provided by the embodiments of the present application are as follows: dynamic light control is achieved using multiple groups of light source modules; dynamic temperature compensation of the light source modules is achieved, so that the output light does not shift in color; the light source brightness required by each frame of source image is dynamically adjusted and determined according to the pixel information and overall brightness parameters of that frame, reducing overall power consumption while ensuring that the projected image is not distorted; and, for the change in output light source brightness, corresponding pixel compensation is applied to each frame of source image to improve the dynamic contrast of the projected image.
It should be understood that although the various steps in the flowcharts of fig. 2, 4-5, 7 are shown in order as indicated by the arrows, the steps are not necessarily performed in order as indicated by the arrows. The steps are not performed in the exact order shown and described, and may be performed in other orders, unless explicitly stated otherwise. Moreover, at least some of the steps in fig. 2, 4-5, and 7 may include multiple steps or multiple stages, which are not necessarily performed at the same time, but may be performed at different times, which are not necessarily performed sequentially, but may be performed alternately or alternately with other steps or at least some of the other steps.
In some embodiments, the image processing method described above may be implemented in the form of a computer program that is executable on a projection device as shown in fig. 8. The architecture of a hardware device for implementing the image processing method is described below with reference to fig. 8.
It is to be understood that the embodiments are illustrative only and that the scope of the appended claims is not limited to the details of construction set forth herein.
Fig. 8 is a schematic structural diagram of a projection apparatus according to an embodiment of the present disclosure.
The projection apparatus 200 includes a projection section 210 and a driving section 220 that drives the projection section 210. The projection section 210 may form an optical image and project the optical image onto the imaging medium SC.
The projection unit 210 includes a light source unit 211, a light modulator 212, and an optical system 213. The driving section 220 includes a light source driving section 221 and a light modulator driving section 222.
The light source section 211 may include a solid-state light source such as a Light Emitting Diode (LED), a laser, or a pump lamp. The light source section 211 may also include optical elements such as lenses and polarizing plates for improving the optical characteristics of the projection light, and light adjusting elements for adjusting the light flux.
The light source driving part 221 may control the operation of the light source in the light source part 211, including turning on and off, according to an instruction of the control part 250.
The light modulator 212 includes a display panel 215, which may be a transmissive Liquid Crystal Display (LCD) panel, a reflective Liquid Crystal On Silicon (LCOS) panel, or a Digital Micro-mirror Device (DMD).
The light modulator 212 is driven by the light modulator driving unit 222, and the light modulator driving unit 222 is connected to the image processing unit 245.
The image processing unit 245 inputs image data to the light modulator driving unit 222. The light modulator driving section 222 converts the input image data into a data signal suitable for the operation of the display panel 215. The light modulator driving section 222 applies a voltage to each pixel of each display panel 215 based on the converted data signal, and draws an image on the display panel 215.
The optical system 213 includes a lens or a mirror or the like that images the incident image light PLA on the imaging medium SC. The optical system 213 may also include a zoom mechanism that enlarges or reduces the image projected onto the imaging medium SC, a focus adjustment mechanism that performs focus adjustment, and the like.
The projection apparatus 200 further includes an operation section 231, a signal receiving section 233, an input interface 235, a storage section 237, a data interface 241, an interface section 242, a frame memory 243, an image processing section 245, and a control section 250. The input interface 235, the storage unit 237, the data interface 241, the interface unit 242, the image processing unit 245, and the control unit 250 can mutually perform data communication via the internal bus 207.
The operation section 231 may generate a corresponding operation signal according to an operation of various buttons and switches acting on the surface of the housing of the projection apparatus 200, and output the signal to the input interface 235. The input interface 235 includes a circuit that outputs an operation signal input from the operation unit 231 to the control unit 250.
The signal receiving unit 233 receives a signal (e.g., an infrared signal or a bluetooth signal) transmitted from the control device 5 (e.g., a remote controller), and decodes the received signal to generate a corresponding operation signal. The signal receiving unit 233 outputs the generated operation signal to the input interface 235. The input interface 235 outputs the received operation signal to the control section 250.
The storage unit 237 may be a magnetic recording device such as a Hard Disk Drive (HDD) or a storage device using a semiconductor memory element such as a flash memory. The storage unit 237 stores a program executed by the control unit 250, data processed by the control unit 250, image data, and the like.
The data interface 241 includes a connector and an interface circuit, and can be connected to the other electronic devices 100 by wire. The data interface 241 may be a communication interface that performs communication with other electronic devices 100. The data interface 241 receives image data, sound data, and the like from the other electronic devices 100. In the present embodiment, the image data may be a content image.
The interface section 242 is a communication interface for communicating with other electronic devices 100 according to the Ethernet standard. The interface section 242 includes a connector and an interface circuit that processes the signals transmitted through the connector. The interface section 242 is an interface board including the connector and the interface circuit, and is connected to the main board of the control section 250, the main board being the board on which the processor 253 and other components are mounted. The connector and the interface circuit constituting the interface section 242 are mounted on the main board of the control section 250. The interface section 242 may receive setting information or instruction information transmitted from other electronic devices 100.
The control section 250 includes a memory 251 and a processor 253.
The memory 251 is a nonvolatile storage device that stores programs and data executed by the processor 253. The memory 251 is constituted by a magnetic storage device, a semiconductor memory element such as a flash memory or a Read-Only Memory (ROM), or another type of nonvolatile memory device. The memory 251 may also include a Random Access Memory (RAM) constituting a work area of the processor 253. The memory 251 stores data processed by the control section 250 and a control program executed by the processor 253.
The processor 253 may be constituted by a single processor, or may be constituted by combining a plurality of processing groups. The processor 253 executes a control program to control the respective portions of the projection apparatus 200. For example, the processor 253 executes corresponding image processing based on the operation signal generated by the operation unit 231, and outputs parameters used in the image processing (such as parameters for performing keystone correction on an image) to the image processing unit 245. In addition, the processor 253 can control the light source driving part 221 to turn on or off the light source in the light source part 211 or adjust the brightness.
The image processing section 245 and the frame memory 243 may be formed of integrated circuits. The Integrated Circuit includes a Large Scale Integration (LSI), an Application Specific Integrated Circuit (ASIC), and a Programmable Logic Device (PLD), wherein the PLD may include a Field-Programmable Gate Array (FPGA). The integrated circuit may also comprise a portion of an analog circuit, or a combination of a processor and an integrated circuit. The combination of a processor and an integrated circuit is called a Micro Controller Unit (MCU), a System on Chip (SoC), a System LSI, a chipset, or the like.
The image processing unit 245 may store the image data received from the data interface 241 in the frame memory 243. The frame memory 243 includes a plurality of banks, each of which includes a memory capacity in which image data of one frame can be written. The frame Memory 243 may be composed of a Synchronous Dynamic Random Access Memory (SDRAM) or a Dynamic Random Access Memory (DRAM).
The image processing section 245 can perform image processing including resolution conversion, size adjustment, distortion correction, shape correction, digital zoom, image tone adjustment, image brightness adjustment, and the like on the image data stored in the frame memory 243.
The image processing section 245 may also convert an input frame frequency of the vertical synchronization signal into a drawing frequency and generate a vertical synchronization signal having the drawing frequency, which is referred to as an output synchronization signal. The image processing unit 245 outputs the output synchronization signal to the light modulator driving unit 222.
In some embodiments of the present application, a computer-readable storage medium is provided in which a computer program is stored; the computer program is loaded by a processor so that the processor executes the steps of the image processing method, which may be the steps of the image processing method in each of the embodiments described above.
The foregoing description is only a description of the preferred embodiments of the present application and of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the application is not limited to embodiments formed by the particular combination of the above technical features, but also covers other embodiments formed by any combination of the above technical features or their equivalents without departing from the scope of the application, for example, embodiments formed by replacing the above features with technical features of similar function disclosed in (but not limited to) the present application.
Further, while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. Under certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limitations on the scope of the application. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims. With regard to the apparatus in the above embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be described in detail here.

Claims (10)

1. An image processing method applied to a projection device, the method comprising:
acquiring a source image, and determining the brightness parameter of the source image according to the pixel information of the source image;
determining the brightness of a target light source required for displaying the source image based on the brightness parameters, and enabling a light source module to emit light with the brightness of the target light source;
updating the brightness value of each pixel point in the source image based on the brightness parameter to obtain a target brightness matrix;
and modulating the source image based on the target light source brightness and the target brightness matrix to obtain a modulated source image, and enabling the projection equipment to project and display the modulated source image.
2. The image processing method of claim 1, wherein said determining the brightness parameter of the source image from the pixel information of the source image comprises:
taking the maximum value of the sub-pixel of each pixel point in the source image as the brightness value of the pixel point; each pixel point in the source image comprises a plurality of sub-pixels;
determining a first brightness matrix of the source image based on the brightness value of each pixel point of the source image;
determining the brightness parameters of the source image from the first brightness matrix.
3. The method of claim 1, wherein said determining a target light source brightness required to render the source image based on the brightness parameter comprises:
determining a light source debugging coefficient of the source image based on the brightness parameter;
determining the target number of the light source modules to be started based on the light source debugging coefficient; when the light source modules with the target number are started, light with the target light source brightness can be emitted.
4. The image processing method according to any one of claims 1 to 3, characterized in that the method further comprises:
acquiring real-time temperature of each primary color lamp in the light source module, wherein each light source module comprises a plurality of primary color lamps;
confirming temperature compensation coefficients corresponding to the primary color lamps based on the real-time temperatures of the primary color lamps;
and correspondingly adjusting the brightness of each primary color lamp according to the temperature compensation coefficient.
5. The method of claim 4, wherein each light source module comprises a red primary color lamp, a green primary color lamp, and a blue primary color lamp;
the correspondingly adjusting the brightness of each primary color lamp according to the temperature compensation coefficient comprises:
respectively adjusting the brightness of the red, green, and blue primary color lamps so that a second brightness ratio among the adjusted red, green, and blue primary color lamps is the same as a first brightness ratio, wherein the first brightness ratio is a brightness ratio among red light, green light, and blue light in the source image.
6. The image processing method according to claim 2, wherein the updating the brightness value of each pixel point in the source image based on the brightness parameter to obtain a target brightness matrix comprises:
determining a pixel compensation coefficient of the source image according to the brightness parameter;
based on a preset mapping relation, carrying out mapping transformation on the first brightness matrix to obtain a second brightness matrix;
and obtaining the target brightness matrix according to the pixel compensation coefficient and the second brightness matrix.
7. The image processing method according to claim 6, wherein the mapping the first luminance matrix to obtain a second luminance matrix comprises:
and searching a corresponding mapping value for each brightness value in the first brightness matrix based on the mapping relation, and updating the corresponding brightness value according to the mapping value to obtain the second brightness matrix.
8. The method according to claim 6, wherein the obtaining the target luminance matrix according to the pixel compensation coefficient and the second luminance matrix comprises:
let the target luminance matrix P = β × Lum ', where β represents the pixel compensation coefficient and Lum' represents the second luminance matrix.
9. A projection device, characterized in that the projection device comprises:
one or more processors;
a memory; and one or more application programs, wherein the one or more application programs are stored in the memory and configured to be executed by the processor to implement the image processing method of any one of claims 1 to 8.
10. A computer storage medium having stored thereon a computer program to be loaded by a processor for performing the image processing method of any one of claims 1 to 8.
CN202211098336.7A 2022-09-08 2022-09-08 Image processing method, projection apparatus, and storage medium Pending CN115578271A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211098336.7A CN115578271A (en) 2022-09-08 2022-09-08 Image processing method, projection apparatus, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211098336.7A CN115578271A (en) 2022-09-08 2022-09-08 Image processing method, projection apparatus, and storage medium

Publications (1)

Publication Number Publication Date
CN115578271A true CN115578271A (en) 2023-01-06

Family

ID=84580914

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211098336.7A Pending CN115578271A (en) 2022-09-08 2022-09-08 Image processing method, projection apparatus, and storage medium

Country Status (1)

Country Link
CN (1) CN115578271A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116343658A (en) * 2023-03-23 2023-06-27 深圳市陆百亿光电有限公司 Intelligent control method and device for LED lamp beads

Similar Documents

Publication Publication Date Title
JP4432818B2 (en) Image display device, image display method, and image display program
US7453475B2 (en) Optical display device, program for controlling the optical display device, and method of controlling the optical display device
TWI357044B (en) Display driving circuit
US20050195223A1 (en) Light modulating apparatus, optical display apparatus, light modulation control program, optical display apparatus control program, light modulation control method, and optical display apparatus control method
CN100490512C (en) Multi-projection display
JP2004341206A (en) Display apparatus
TW200417812A (en) Adaptive image display
US9588410B2 (en) Projection type display device and control method thereof
JP6331382B2 (en) Image display device and method for controlling image display device
JP6237020B2 (en) Image display device and method for controlling image display device
JP2000214827A (en) Color liquid crystal display device in field sequential drive system
JP2017003926A (en) Electro-optical device and control method for the same
JP2006284982A (en) Dimming information generation device, method thereof, program thereof, recording medium with program recorded therein, and image display device
JP2004163518A (en) Device and method for image display
CN115578271A (en) Image processing method, projection apparatus, and storage medium
JP7155697B2 (en) DISPLAY DEVICE AND CONTROL METHOD OF DISPLAY DEVICE
JP2006330177A (en) Display device and projector
JP2014059530A (en) Dimming control apparatus, image display device, dimming control method, and program
JP2010237633A (en) Projector
JP2006003586A (en) Image display apparatus, its method and program
JP2008026355A (en) Light source control device
JP2018045068A (en) Control apparatus
JP2005077638A (en) Video display device and projection type display device
WO2019239918A1 (en) Control device, display device, and control method
JP2008083653A (en) Image display device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination