WO2023000868A1 - Image processing method and apparatus, device, and storage medium - Google Patents

Image processing method and apparatus, device, and storage medium

Info

Publication number
WO2023000868A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
gamma curve
gamma
image area
visual feature
Prior art date
Application number
PCT/CN2022/098683
Other languages
English (en)
Chinese (zh)
Inventor
吴义孝
胡木
Original Assignee
Oppo广东移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司 filed Critical Oppo广东移动通信有限公司
Publication of WO2023000868A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00: Image enhancement or restoration
    • G06T 5/90: Dynamic range modification of images or parts thereof
    • G06T 5/94: Dynamic range modification of images or parts thereof based on local image properties, e.g. for local contrast enhancement
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/10: Segmentation; Edge detection
    • G06T 7/11: Region-based segmentation

Definitions

  • This application relates to image processing technology, and in particular, but not exclusively, to an image processing method, apparatus, device, and storage medium.
  • In the related art, gamma correction is usually performed on an image before it is displayed.
  • So-called gamma correction edits the gamma curve of the image: non-linear tone editing is applied to the image to identify the dark and light portions of the image signal and to increase the ratio between the two, thereby improving the contrast of the image.
  • In the field of computer graphics, the conversion curve between the screen output voltage and the corresponding brightness is customarily called the gamma curve. However, for some images, the image quality after gamma correction is poor.
  • The image processing method, apparatus, device, and storage medium provided in the embodiments of the present application can improve image quality and thus the user's visual experience.
  • The image processing method, apparatus, device, and storage medium provided in the embodiments of the present application are implemented as follows:
  • The image processing method provided in an embodiment of the present application includes: segmenting an image to be processed based on a first visual feature of the pixel units of the image to be processed to obtain a plurality of image areas, wherein the first visual feature includes at least one of brightness and color temperature; determining a first gamma curve corresponding to a first image area, wherein the first image area is one of the plurality of image areas; and performing gamma correction on the first image area based at least in part on the first gamma curve.
  • the image processing chip provided in the embodiment of the present application includes a processor unit configured to execute the image processing method described in the embodiment of the present application.
  • The image processing device provided in an embodiment of the present application includes: a segmentation module configured to segment the image to be processed based on the first visual feature of the pixel units of the image to be processed to obtain a plurality of image areas, wherein the first visual feature includes at least one of brightness and color temperature; a determination module configured to determine a first gamma curve corresponding to a first image area, wherein the first image area is one of the plurality of image areas;
  • and a first correction module configured to perform gamma correction on the first image area based at least in part on the first gamma curve.
  • the electronic device provided by the embodiment of the present application includes a memory and a processor, the memory stores a computer program that can run on the processor, and the processor implements the method described in the embodiment of the present application when executing the program.
  • the computer-readable storage medium provided by the embodiment of the present application has a computer program stored thereon, and when the computer program is executed by a processor, the method described in the embodiment of the present application is implemented.
  • In the embodiments of the present application, the image to be processed is segmented based on the first visual feature of its pixel units to obtain multiple image areas; a first gamma curve corresponding to the first image area is then determined, and gamma correction is performed on the first image area based at least in part on the first gamma curve. In this way, image areas with different visual characteristics are corrected with their corresponding gamma curves, so that the quality of each image area can be improved in a targeted manner, thereby improving the user's visual experience.
  • FIG. 1 is a schematic diagram of an implementation flow of an image processing method provided by the present application
  • Figure 2 shows the gamma curves of the human eye in different environments
  • FIG. 3 is a schematic diagram of an implementation flow of another image processing method provided by the present application.
  • FIG. 4 is a schematic diagram of a transitional image area provided by the present application.
  • FIG. 5 is a schematic diagram of an implementation flow of another image processing method provided by the present application.
  • FIG. 6 is a schematic diagram of the relationship between brightness in nature and the corresponding brightness value perceived by human eyes
  • Figure 7 is a comparison of image display effects before and after gamma correction
  • FIG. 8 is a schematic diagram of an implementation flow of another image processing method provided by the present application.
  • FIG. 9 is a schematic diagram of the brightness of picture a before gamma correction
  • FIG. 10 is a schematic diagram of gamma curves corresponding to scene a1 and scene b1 of the present application.
  • Fig. 11 is a schematic diagram of the transitional image area of the picture and the gamma correction of the image area in the present application;
  • FIG. 12 is a schematic diagram of the weight relationship between the first weight function and the second weight function of the present application at the same pixel position;
  • FIG. 13 is a schematic diagram of the overall brightness of picture a after gamma correction in the present application.
  • FIG. 14 is a schematic structural diagram of an image processing device of the present application.
  • FIG. 15 is a schematic structural diagram of an electronic device provided by the present application.
  • The terms "first", "second", and "third" in this application do not denote a specific ordering of objects. It should be understood that "first", "second", and "third" may be interchanged, where permitted, so that the embodiments described herein can be practiced in orders other than those illustrated or described herein.
  • An embodiment of the present application provides an image processing method applied to an electronic device. In implementation, the electronic device may be any of various types of devices with information processing capability; for example, it may be a mobile phone, tablet computer, laptop computer, projector, desktop computer, personal digital assistant, navigator, digital phone, video phone, television, or sensing device.
  • The functions realized by the method can be implemented by a processor in the electronic device calling program code, and the program code can of course be stored in a computer storage medium. It can be seen that the electronic device includes at least a processor and a storage medium.
  • Figure 1 is a schematic diagram of the implementation flow of the image processing method provided by the embodiment of the present application. As shown in Figure 1, the method may include the following steps 101 to 103:
  • Step 101: Segment the image to be processed based on the first visual feature of the pixel units of the image to be processed to obtain a plurality of image areas; wherein the first visual feature includes at least one of brightness and color temperature.
  • the images to be processed may be various types of images.
  • For example, the image to be processed is an image to be corrected at the image sensor end; for another example, the image to be processed is an image to be displayed at the image display end.
  • That is, the image processing method described in the present application can be applied to images generated at different stages.
  • In some embodiments, the image to be processed may be segmented in the following manner: pixel units whose first visual feature falls within the same specific interval are divided into one image area, so as to obtain multiple image areas.
  • For example, when the first visual feature is brightness, the pixel units whose brightness falls in a first brightness interval can be divided into one image area, the pixel units whose brightness falls in a second brightness interval can be divided into another image area, and so on.
  • When the first visual feature is color temperature, pixel units whose color temperature falls within a first color temperature interval can be divided into one image area, pixel units whose color temperature falls within a second color temperature interval can be divided into another image area, and so on.
  • When the first visual feature is brightness and color temperature, the pixel units whose brightness falls into a third brightness interval and whose color temperature falls into a third color temperature interval are divided into one image area, the pixel units whose brightness falls into a fourth brightness interval and whose color temperature falls into a fourth color temperature interval are divided into another image area, and so on.
  • A specific interval has at least one boundary. For example, if a specific interval is [m, +∞), it has only one boundary, m; for another example, if a specific interval is [A, B], it has two boundaries, namely A and B.
  • a pixel unit may include one pixel or multiple pixels.
  • When a pixel unit includes multiple pixels, the first visual feature of the pixel unit may be a representative value of the visual features of its pixels, such as an average value or a median value.
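  • As a minimal, illustrative sketch (not part of the original disclosure), the interval-based segmentation described above could look roughly like the following Python; the interval boundaries and the 2x2 pixel-unit size are assumed example values.

```python
import numpy as np

def segment_by_brightness(luma, bins=(0.0, 0.33, 0.66, 1.0), unit=2):
    """Assign each pixel unit to an image area by the brightness interval it falls into.

    luma : 2-D array of normalized brightness values in [0, 1]
    bins : interval boundaries (assumed example values)
    unit : side length of a pixel unit; its brightness is the mean of its pixels
    """
    h, w = luma.shape
    labels = np.zeros((h // unit, w // unit), dtype=np.int32)
    for i in range(h // unit):
        for j in range(w // unit):
            block_mean = luma[i*unit:(i+1)*unit, j*unit:(j+1)*unit].mean()
            # np.digitize returns the index of the interval the mean falls into
            labels[i, j] = np.digitize(block_mean, bins[1:-1])
    return labels  # pixel units sharing a label form one image area

# Example: a synthetic 8x8 image splits into dark / medium / bright areas.
luma = np.linspace(0.0, 1.0, 64).reshape(8, 8)
print(segment_by_brightness(luma))
```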
  • In some embodiments, the image to be processed may be segmented unconditionally; in other embodiments, the segmentation may be conditional. For example, if the overall second visual feature of the image to be processed satisfies a segmentation condition, steps 101 to 103 are executed; otherwise, if the second visual feature does not meet the segmentation condition, the image to be processed is not segmented; instead, a fourth gamma curve corresponding to the second visual feature is determined, and gamma correction is performed on the whole image according to the fourth gamma curve. In this way, on the premise that every image conforms to the human eye's perception of brightness, unnecessary segmentation and related processing are reduced, thereby saving power consumption and computing resources.
  • the electronic device may determine the overall second visual feature of the image to be processed based on the first visual feature of the pixel units of the image to be processed.
  • the second visual feature may be a value used to characterize changes in the first visual feature of the pixel units of the image to be processed.
  • the second visual feature includes the variance and/or standard deviation of brightness of each pixel unit of the image to be processed, and/or the variance and/or standard deviation of color temperature of each pixel unit of the image to be processed.
  • the segmentation condition is that the variance and/or the standard deviation is greater than a corresponding threshold.
  • the second visual feature may also be used to characterize whether the image to be processed includes a light source, the number of light sources and/or the position of the light source, and the like.
  • the segmentation condition is: the image to be processed includes a light source, the number of light sources is greater than a corresponding threshold, and/or the position of the light source is at a specific position, etc.
  • the specific location is an edge area or a central area of the image to be processed.
  • the segmentation condition may also be: the variance and/or the standard deviation are greater than a corresponding threshold, and the image to be processed includes light sources, the number of light sources is greater than the corresponding threshold, and/or the position of the light source is at a specific position.
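  • A hedged sketch of the variance-based segmentation condition described above is given below; the threshold value is an assumption chosen only for illustration, and checks on light-source count or position could be added in the same way.

```python
import numpy as np

def needs_segmentation(luma, var_threshold=0.02):
    """Decide whether the image to be processed should be split into regions.

    Returns True when the overall brightness variance (one form of the second
    visual feature) exceeds the threshold, i.e. the segmentation condition holds.
    var_threshold is an assumed example value, not a calibrated one.
    """
    return float(np.var(luma)) > var_threshold
```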
  • Step 102: Determine a first gamma curve corresponding to a first image area; wherein the first image area is one of the plurality of image areas.
  • In some embodiments, the first gamma curve is obtained by calibration at the image sensor end; on this basis, gamma correction is performed on the first image area at the image sensor end based on the first gamma curve.
  • the image sensor side also calibrates other different gamma curves, and different gamma curves are applicable to different image areas.
  • different gamma curves have a mapping relationship with a third visual feature characterizing the scene.
  • In implementation, the electronic device may determine the overall fourth visual feature of the first image area, then determine, from the different third visual features, a third visual feature that matches the fourth visual feature, and then obtain the gamma curve corresponding to the matched third visual feature (namely, the first gamma curve); wherein different third visual features correspond to different scenes, in other words, different scenes involve different light sources and light intensities.
  • During calibration, the electronic device may capture images of the same scene in advance under each candidate gamma curve, select from the captured images a target image whose image quality meets the requirements, and use the candidate gamma curve corresponding to the target image as the final gamma curve for that scene.
  • For example, the electronic device captures a scene under candidate gamma curve 1 to obtain image 1, captures the scene under candidate gamma curve 2 to obtain image 2, and captures the scene under candidate gamma curve 3 to obtain image 3.
  • A tester can then select, from image 1 to image 3, the image that best matches human perception, and the candidate gamma curve corresponding to that image is used as the final gamma curve of the scene; the electronic device then calibrates this final gamma curve for the scene.
  • In this way, the final gamma curves of different scenes are selected through testing, and the mapping relationship between the third visual feature of each scene and the final gamma curve of the corresponding scene is recorded.
  • When the image to be processed is an image to be displayed at the image display end, the manner of determining the corresponding first gamma curve is similar to that described above for the image to be corrected, and is not repeated here.
  • the fourth visual feature represents the overall feature of the first image region, for example, the overall feature is an average brightness value, an average color temperature value, a median color temperature value, and/or a median brightness value.
  • the multiple gamma curves calibrated by the image sensor end and/or the image display end correspond to different visual feature intervals.
  • For example, when the first visual feature is brightness, the mapping relationship is as shown in Table 1: gamma curve 10 corresponds to brightness interval 10, gamma curve 20 corresponds to brightness interval 20, ..., and gamma curve N0 corresponds to brightness interval N0; wherein gamma curves 10, 20, ..., N0 are different from each other, and brightness intervals 10 to N0 cover different brightness ranges.
  • The electronic device may determine, from Table 1, the gamma curve corresponding to the brightness interval into which the average brightness value or the median brightness value of the first image area falls, and use that curve as the first gamma curve of the first image area. For example, if the brightness mean or brightness median of the first image area falls into brightness interval 20, gamma curve 20 is used as the first gamma curve.
  • When the first visual feature is brightness and color temperature, the mapping may be: gamma curve 11 corresponds to brightness interval 11 and color temperature interval 11, gamma curve 21 corresponds to brightness interval 21 and color temperature interval 21, ..., and gamma curve N1 corresponds to brightness interval N1 and color temperature interval N1; wherein gamma curves 11, 21, ..., N1 are different from each other.
  • In this case, the electronic device may determine the corresponding first gamma curve according to the intervals into which the average brightness value and the average color temperature value of the first image area fall. For example, if the average brightness falls into brightness interval 11 and the average color temperature falls into color temperature interval 11, gamma curve 11 is used as the first gamma curve; for another example, if the average brightness falls into brightness interval 11 and the average color temperature falls into color temperature interval 12, either gamma curve 11 or gamma curve 21 can be used as the first gamma curve, or a fusion of gamma curve 11 and gamma curve 21 can be used as the first gamma curve.
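  • The interval-to-curve lookup of Table 1 might be sketched as follows; the interval boundaries and the gamma exponents are hypothetical calibration values invented for illustration, and in practice each entry would be a full calibrated curve or lookup table rather than a single exponent.

```python
import bisect

# Hypothetical calibration: brightness-interval upper bounds -> gamma exponent.
BRIGHTNESS_BOUNDS = [0.25, 0.50, 0.75, 1.01]
GAMMA_EXPONENTS   = [2.6,  2.2,  1.9,  1.6]

def first_gamma_curve(region_luma_mean):
    """Return the gamma exponent whose brightness interval contains the region mean."""
    idx = bisect.bisect_left(BRIGHTNESS_BOUNDS, region_luma_mean)
    return GAMMA_EXPONENTS[min(idx, len(GAMMA_EXPONENTS) - 1)]

# A region with mean brightness 0.6 falls into the third interval.
print(first_gamma_curve(0.6))  # -> 1.9
```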
  • Step 103: Perform gamma correction on the first image area based at least in part on the first gamma curve.
  • In the embodiments of the present application, the image to be processed is segmented based on the first visual feature of its pixel units to obtain multiple image areas; the first gamma curve corresponding to the first image area is then determined, and gamma correction is performed on the first image area based at least in part on the first gamma curve. In this way, image areas with different visual characteristics are corrected with their corresponding gamma curves, so that the quality of each image area can be improved in a targeted manner, thereby enhancing the user's visual experience.
  • In some embodiments, gamma correction may be performed on the first image area based on the first gamma curve alone; gamma correction may also be performed on the first image area based on both a third gamma curve and the first gamma curve, wherein the third gamma curve is obtained by calibration at the image display end and the first gamma curve is obtained by calibration at the image sensor end.
  • different third gamma curves correspond to different light characteristics
  • the electronic device may determine the corresponding third gamma curve according to the light characteristics of the current physical environment.
  • In different light environments, the gamma curves corresponding to the human eye are different.
  • Moreover, the gamma value corresponding to the human eye adapts to the overall brightness, compressing bright areas and brightening dark areas. Therefore, in the embodiments of the present application, when gamma correction is performed on the first image area, not only the influence of the visual characteristics of the first image area itself on the gamma curve of the human eye is considered, but also the influence of the light characteristics of the physical environment where the electronic device is currently located.
  • In this way, it helps to simulate the human eye's perception of light and shade in different light environments, thereby eliminating abrupt changes in the image and making the gamma-corrected first image area conform better to the human eye's perception of light and dark.
  • FIG. 3 is a schematic diagram of the implementation flow of another image processing method provided by the present application. As shown in FIG. 3 , the method may include the following steps 301 to 305:
  • Step 301: Segment the image to be processed based on the first visual feature of the pixel units of the image to be processed to obtain multiple image areas; wherein the first visual feature includes at least one of brightness and color temperature.
  • Step 302: Determine a first gamma curve corresponding to a first image area; wherein the first image area is one of the plurality of image areas.
  • Step 303: Perform gamma correction on the first image area based at least in part on the first gamma curve.
  • Step 304: Determine a second gamma curve corresponding to a second image area; wherein the second image area is another image area, adjacent to the first image area, among the plurality of image areas.
  • Step 305: Perform gamma correction on the second image area based at least in part on the second gamma curve.
  • In the embodiments of the present application, each image area of the image to be processed is handled in the manner described above for the first image area or the second image area. In this way, when the electronic device performs gamma correction on the image to be processed, instead of applying a single gamma curve to the whole image, the image is divided into multiple image areas so that a gamma curve matching each image area can be selected in a targeted manner and the corresponding image area corrected accordingly, which improves the gamma correction effect of the image to be processed and the image quality.
  • In some embodiments, the method further includes: performing gamma correction on a transition image area based at least in part on the first gamma curve and/or the second gamma curve, wherein the transition image area includes the boundary line between the first image area and the second image area.
  • In implementation, the boundary line between the first image area and the second image area may be determined first; then the area enclosed after offsetting the boundary line by a specific pixel distance towards the first image area and towards the second image area is used as the transition image area.
  • As shown in FIG. 4, 401 is the boundary line between the first image area 402 and the second image area 403; the boundary line 401 can be offset upward by a first pixel distance to obtain a border 404, and offset downward by a second pixel distance to obtain a border 405.
  • The area enclosed by the border 404 and the border 405 is the transition image area between the first image area 402 and the second image area 403.
  • The first pixel distance and the second pixel distance can be the same or different, and their sizes are not limited.
  • In some embodiments, the first pixel distance and the second pixel distance are set specific pixel distances, for example, a length of 3 to 5 pixels.
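  • A rough sketch of how such a transition area might be marked is shown below; it uses morphological dilation as a stand-in for offsetting the boundary line by p pixels, and the default p of 4 pixels is an assumed value within the 3 to 5 pixel range mentioned above.

```python
import numpy as np
from scipy.ndimage import binary_dilation

def transition_mask(labels, first_label, second_label, p=4):
    """Mark pixels of either area that lie within p pixels of the other area.

    labels : per-pixel area map produced by the segmentation step
    p      : offset distance in pixels (assumed, e.g. 3-5 px)
    """
    a = labels == first_label
    b = labels == second_label
    struct = np.ones((2 * p + 1, 2 * p + 1), dtype=bool)
    near_b = binary_dilation(b, structure=struct)
    near_a = binary_dilation(a, structure=struct)
    # Transition area: the band of width ~2p straddling the boundary line.
    return (a & near_b) | (b & near_a)
```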
  • In some embodiments, the electronic device may perform gamma correction on the transition image area in the following way: based on a first weight function and a second weight function, the weighted sum of the first gamma curve and the second gamma curve is used as a fifth gamma curve, and gamma correction is performed on the transition image area based at least in part on the fifth gamma curve. In this way, the brightness continuity between different areas of the gamma-corrected image can be further improved, so that the image looks more natural.
  • The first weight function is the coefficient of the first gamma curve, and the second weight function is the coefficient of the second gamma curve; for the same pixel unit, the sum of the value of the first weight function and the value of the second weight function is 1. The value of the first weight function is negatively correlated with the distance from the first pixel unit to the first boundary of the first image area, where the first pixel unit is a pixel unit of the transition image area that belongs to the first image area, and the first boundary is located in the first image area.
  • In some embodiments, the fifth gamma curve of the transition image area is given by formula (1):
  • weight1*fun1(light(x, y)) + weight2*fun2(light(x, y))   (1)
  • where light(x, y) represents the brightness of the pixel unit (x, y); fun1(light(x, y)) represents the first gamma curve of the first image area; weight1 refers to the first weight function, whose independent variable is the pixel distance from the pixel unit of the transition image area to the first boundary; weight2 refers to the second weight function; the sum of weight1 and weight2 is 1; and fun2(light(x, y)) represents the second gamma curve of the second image area.
  • The electronic device may determine the gamma value of the pixel unit (x, y) in the transition image area according to formula (1), and perform gamma correction on the brightness of the pixel unit according to this gamma value.
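  • A minimal sketch of the weighted blend in formula (1) follows. It assumes linear weight functions of the normalized distance to the first boundary and simple power-law curves standing in for fun1 and fun2; neither assumption is fixed by the text above.

```python
import numpy as np

def blended_correction(luma, dist_to_first_boundary, p, gamma1, gamma2):
    """Gamma-correct a transition-area pixel by blending the two area curves.

    dist_to_first_boundary : pixel distance from this pixel unit to the first boundary
    p                      : total width of the transition area in pixels
    gamma1, gamma2         : exponents standing in for fun1 / fun2 (assumed form)
    """
    # weight1 decreases linearly with distance to the first boundary; weight1 + weight2 == 1.
    weight1 = 1.0 - np.clip(dist_to_first_boundary / p, 0.0, 1.0)
    weight2 = 1.0 - weight1
    fun1 = luma ** (1.0 / gamma1)
    fun2 = luma ** (1.0 / gamma2)
    return weight1 * fun1 + weight2 * fun2
```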
  • FIG. 5 is a schematic diagram of the implementation flow of another image processing method provided by the present application. As shown in FIG. 5 , the method may include the following steps 501 to 505:
  • Step 501: Segment the image to be processed based on the first visual feature of the pixel units of the image to be processed to obtain multiple image areas; wherein the first visual feature includes at least one of brightness and color temperature.
  • Step 502: Determine the first gamma curve corresponding to the first image area; wherein the first image area is one of the plurality of image areas, and the first gamma curve is obtained by calibration at the image sensor end.
  • Step 503: Perform gamma correction on the first image area based at least in part on the first gamma curve.
  • Step 504: Determine a third gamma curve corresponding to the first image area; wherein the third gamma curve is obtained by calibration at the image display end.
  • In implementation, the electronic device may perform gamma correction on the first image area based on the first gamma curve at the image sensor end to obtain a corrected first image area, and then, at the image display end, perform gamma correction on the corrected first image area based on the third gamma curve to obtain a target image area.
  • Alternatively, the electronic device may first fuse the first gamma curve and the third gamma curve to obtain a fused gamma curve, and then perform gamma correction on the first image area based on the fused gamma curve to obtain the target image area.
  • this application does not limit how the electronic device corrects the first image region based on the first gamma curve and the third gamma curve.
  • In some embodiments, the electronic device may implement step 504 as follows: determine the light characteristics of the physical environment where the electronic device is located, and determine the corresponding third gamma curve according to the light characteristics.
  • In implementation, the electronic device can calibrate, at the image display end, the gamma curves corresponding to different light characteristics, so that before actually displaying the image to be processed, the electronic device can determine the corresponding third gamma curve according to the light characteristics of the current physical environment; in this way, the display effect of the first image area corrected based on the third gamma curve is more in line with the human eye's current perception of light and shade.
  • the light characteristics of the physical environment where the electronic device is located may be collected through a sensor on the electronic device. Further, in some embodiments, the light characteristics may include light intensity and/or color temperature.
  • Step 505: Perform gamma correction on the first image area according to the third gamma curve.
  • It should be noted that the image sensor end and the image display end may or may not use the same image segmentation result.
  • the electronic device can re-segment the image to be displayed on the image display end.
  • the segmentation method is the same as the solution described above, and will not be repeated here.
  • the processing method for the transitional image area of the adjacent image area of the image to be displayed is the same as the above solution, and will not be repeated here.
  • FIG. 6 shows the relationship between brightness in nature and the corresponding brightness value perceived by the human eye. It can be seen from the figure that the human eye is more sensitive to darker brightness values (closer to 0) and less sensitive to brighter brightness values (closer to 1). In other words, the human eye distinguishes changes in darker brightness values better, so when color values are stored, more of the darker portion should be preserved.
  • In implementation, the gamma curve is set according to the characteristics of the image sensor and lens, the lighting environment, and even user preference.
  • When the gamma value is large, the image will be very bright, but contrast and saturation decrease; as shown in Figure 7, it intuitively feels as if a layer of fog covers the picture, and noise in dark areas is amplified.
  • When the gamma value is small, brightness decreases, the overall contrast of the image increases, colors are more vivid, and noise in dark areas is smaller, but dark areas become darker and hard to see clearly.
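  • To make the effect of the gamma value concrete: the conventional power-law correction can be written as Vout = Vin^(1/gamma), and the short sketch below (the particular gamma values are only illustrative) shows how a larger gamma lifts mid-tones and brightens the image, while a smaller gamma darkens it and raises contrast.

```python
import numpy as np

v_in = np.array([0.05, 0.25, 0.50, 0.75])   # normalized input brightness

for gamma in (1.8, 2.2, 2.6):               # assumed example gamma values
    v_out = v_in ** (1.0 / gamma)
    print(f"gamma={gamma}: {np.round(v_out, 3)}")
# Larger gamma -> brighter output (lower contrast); smaller gamma -> darker, higher contrast.
```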
  • During shooting by a mobile phone, a gamma curve is used to perform gamma correction on the image to be processed.
  • However, the gamma curves corresponding to the actual human eye are not exactly the same across scenes with different brightness, especially between environments with obvious light-dark changes and environments where light-dark changes are not obvious.
  • In implementation, the gamma corresponding to the human eye adapts to the overall brightness: the higher the overall brightness, the steeper the gamma, compressing bright areas and brightening dark areas. That is, the human eye does not perceive an actually darker area as being that dark.
  • Likewise, for brighter areas the gamma of the human eye is automatically compressed, so the human eye does not perceive them as particularly bright.
  • If the gamma curve is not adapted accordingly, the result is that brighter areas become brighter, or even appear white, while darker areas become darker, or even appear black.
  • In view of this, the embodiments of the present application provide a scheme that adaptively adjusts the gamma curve parameters for different shooting scenes, so as to simulate as closely as possible the human eye's perception of lightness and darkness in different environments and output the corresponding image.
  • the overall process includes three parts:
  • The first part is the gamma parameter calibration stage, in which a set of gamma curves is divided and calibrated according to scene.
  • This part calibrates the gamma curves for different scenes, that is, the gamma curves corresponding to different visual feature intervals.
  • The second part is the shooting stage: according to the degree of brightness variation of the captured image (that is, an example of an image to be processed), it is determined whether the captured image needs to be segmented; if so, region segmentation is performed on the captured image to obtain image areas corresponding to different scenes, and then, based on the brightness of each image area, the corresponding gamma curve is looked up from the gamma curve set calibrated in the first part and used to perform gamma correction on that image area.
  • In addition, interpolation is performed on the adjacent parts of the multiple image areas (that is, the transition image areas), so as to ensure continuity between adjacent image areas after gamma correction.
  • The third part is the image display stage, in which the gamma correction is adjusted according to the display environment and the image is finally displayed.
  • the gamma curve obtained in the second part needs to be adjusted according to the actual physical environment.
  • For the gamma curve (that is, the gamma correction parameter), a reference method for setting its shape is to shoot a 24-patch color card in a light box environment and make the brightness data Y of the 6 color blocks at the bottom of the color card show a linear proportional relationship as much as possible.
  • In terms of implementation, piecewise linear interpolation is generally used, and SRAM can also be used to realize point-by-point mapping, so that the shape of the curve can be configured flexibly. After the first part is completed, the mapping relationship between multiple sets of gamma curves and scenes is obtained.
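  • A sketch of the piecewise-linear implementation mentioned above: a handful of calibrated control points are expanded into a full point-by-point lookup table. The control-point values below are assumptions chosen only to illustrate the idea.

```python
import numpy as np

# Assumed calibrated control points (input code -> output code), 8-bit domain.
knots_in  = np.array([0, 32, 64, 128, 192, 255], dtype=np.float64)
knots_out = np.array([0, 70, 105, 160, 210, 255], dtype=np.float64)

# Expand to a 256-entry LUT by piecewise linear interpolation (point-by-point mapping).
lut = np.interp(np.arange(256), knots_in, knots_out).round().astype(np.uint8)

def apply_gamma_lut(image_u8):
    """Apply the gamma curve to an 8-bit image via the lookup table."""
    return lut[image_u8]
```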
  • Second, scene recognition and scene segmentation: in actual shooting, because of scene complexity, the picture often contains environments that are not exactly the same. The acquired image is down-sampled at the front end to obtain a condensed RGB outline, the front-end segmentation module is used to segment the different image areas, the environment of each image area is judged, and the gamma curve for that environment is determined accordingly.
  • For example, the brightness of a certain picture a before gamma correction is shown in Figure 9, where x and y represent the pixel position and the z value represents the brightness at that position.
  • The image is divided into two image areas, as shown in FIG. 4: one image area 302 includes scene a1, and the other image area 303 includes scene b1.
  • The horizontal and vertical coordinates correspond to the positions of the pixels in picture a.
  • The curve represents the dividing boundary line 301, which divides the whole picture into scene a1 and scene b1.
  • The gamma curve corresponding to the pixels in scene a1 is gamma curve a1, and the gamma curve corresponding to the pixels in scene b1 is gamma curve b1.
  • The boundary line 301 is offset on both sides by a certain pixel distance p to obtain the transition image area c; the gamma curve for the pixels in the transition image area c is a weighted average of the former two curves, with the weights set according to the offset distance.
  • For scene a1 and scene b1, the measured values are compared against the calibration results of the first part to find the corresponding gamma curves.
  • x and y represent pixel positions respectively, and different gamma curves are used according to different scenes.
  • the curve of the transition image area is interpolated to ensure the continuity of the brightness change after mapping.
  • The method used here is linear interpolation; the gamma curve fun3 of the transition image area is obtained as shown in the following formula (2):
  • fun3(light(x, y)) = weight1*fun1(light(x, y)) + weight2*fun2(light(x, y))   (2)
  • where light(x, y) represents the brightness of the pixel unit (x, y); fun1(light(x, y)) represents the target gamma curve of the image area; weight1 refers to the first weight function, whose independent variable is the pixel distance from the pixel unit of the transition image area to the boundary of the transition image area within that image area; weight2 refers to the second weight function; the sum of weight1 and weight2 is 1; and fun2(light(x, y)) represents the target gamma curve of the image area adjacent to that image area.
  • FIG. 11 is a schematic diagram of the transitional image area of the above picture a and the gamma correction of the image area.
  • The interval [0, 0.5] represents the gamma correction result in image area a1, [0.5, 1.5] represents the gamma correction result in the transition image area, and [1.5, 2] represents the gamma correction result in image area b1.
  • The values 0.5, 1.5, 2, and so on have no practical meaning; they are only examples and do not limit the scope of the technical solution of the present application.
  • (x, y) represents the position of a pixel in the picture, and light represents the brightness of the pixel at position (x, y).
  • weight represents the weight. With the transition width being p, when the pixel is on the upper boundary (that is, the boundary of the transition image area within image area a1), weight1 is 1 and weight2 is 0; when the pixel is on the lower boundary (that is, the boundary of the transition image area within image area b1), weight1 is 0 and weight2 is 1; in between, the weights change as shown in Figure 12.
  • In Figure 12, the abscissa represents the proportion of the distance from the pixel in the transition image area to the upper boundary (the maximum value is 1, corresponding to a distance of 1*p from the upper boundary, i.e. the lower boundary).
  • the corresponding gamma curves in different areas of the picture are saved and bound to the picture ID, and stored locally.
  • When the picture is displayed, the light sensor of the mobile phone is used to obtain the light intensity and color temperature of the current external environment, which are combined with the current display ambient brightness.
  • The data from the second part is then subjected to the corresponding light and dark calibration processing, and the corresponding image is output.
  • Suppose that, from the gamma curve set of the first part, the gamma curve corresponding to the current display scene is fun2(), and the gamma curve of a given area of the shot picture from the second part is fun(); then the gamma curve used for the final display is: weight2*fun() + weight3*fun2().
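  • A rough sketch of this display-stage blend is shown below; the weight values and the power-law form of fun and fun2 are assumptions for illustration, since the text does not fix how the two curves are parameterized.

```python
def display_gamma(luma, fun_shoot, fun_display, weight2=0.6, weight3=0.4):
    """Blend the shooting-stage curve with the curve matched to the display environment.

    fun_shoot   : gamma curve selected for this image area at shooting time (fun())
    fun_display : gamma curve calibrated for the current display environment (fun2())
    The weights are assumed example values and should sum to 1.
    """
    return weight2 * fun_shoot(luma) + weight3 * fun_display(luma)

# Usage with simple power-law curves standing in for the calibrated ones.
corrected = display_gamma(0.4, lambda v: v ** (1 / 2.2), lambda v: v ** (1 / 1.9))
```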
  • As noted above, the gamma curves corresponding to the human eye are not completely consistent across different environments, especially between environments with obvious changes in brightness and environments where such changes are not obvious.
  • In the embodiments of the present application, different gamma curves are used according to the shooting scene and the display scene, and the gamma curve is adaptively adjusted for different shooting scenes, so as to simulate as closely as possible the human eye's perception of light and shade in different environments and output the corresponding image.
  • The front-end chip is used to divide the image into different regions according to color temperature and light intensity and to select the gamma curve for each region.
  • Interpolation is performed in the transition image areas between adjacent image areas to ensure the continuity of the brightness change.
  • When the image is actually displayed, the gamma curve is adjusted again according to the light intensity and color temperature of the current display environment: the gamma curve corresponding to the current display scene is obtained and weighted with the gamma curve corresponding to the original shooting environment to obtain the final gamma curve.
  • In another embodiment, gamma curve sets generated by human-eye calibration under scenes with different color temperatures and light intensities are calibrated in advance.
  • When shooting, the original image is first divided into image regions in the front-end chip according to the overall brightness differences of the picture.
  • The principle of image segmentation is to divide based on scene areas whose color temperature and light intensity differ obviously within the overall picture.
  • Multiple sets of thresholds (threshold ladders) can be set: areas whose color temperature or brightness differences fall within different threshold ranges are divided into different regions, and segmentation is performed accordingly.
  • Then, according to the pre-calibrated gamma curve set, different gamma curves are used to perform gamma correction on these areas. To ensure that the gamma parameters between adjacent image areas change continuously with pixel position over the whole image, two-dimensional cubic polynomial interpolation is applied; at the same time, the obtained information (including the color temperature distribution, the light intensity, and the gamma parameter distribution G1(x, y)) is saved with the image information.
  • When the picture is previewed and displayed, the light sensor of the mobile phone is used to obtain the light intensity and color temperature of the current external environment, which are compared with the scene information saved at shooting time, and the gamma curve is adjusted a second time according to the similarity between the current environment and the shooting environment saved with the original picture. For example, the corresponding overall gamma correction parameter G2 is found based on the environment information obtained when the picture is currently displayed (since the current display scene is fixed, this gamma parameter is a fixed value), and it is averaged with the previously interpolated gamma parameters G1(x, y) to obtain the final parameters.
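  • The secondary adjustment at display time might look like the following sketch, in which the saved per-pixel gamma map G1(x, y) is averaged with the single display-environment parameter G2; the equal-weight average follows the description above, and all numeric values are illustrative assumptions.

```python
import numpy as np

def adjust_for_display(G1, G2):
    """Average the saved per-pixel gamma map G1(x, y) with the display-scene parameter G2."""
    return (G1 + G2) / 2.0

# G1 saved with the picture at shooting time; G2 looked up from the current display environment.
G1 = np.full((4, 4), 2.2)   # illustrative per-pixel gamma parameters
G2 = 1.9                    # illustrative display-environment gamma parameter
G_final = adjust_for_display(G1, G2)
```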
  • Based on the foregoing embodiments, the present application provides an image processing device, which includes the modules described below and the units included in each module; the device can be implemented by a processor in an electronic device or, of course, by specific logic circuits.
  • the processor may be a central processing unit (CPU), a microprocessor (MPU), a digital signal processor (DSP), a field programmable gate array (FPGA), or an image signal processing chip (ISP).
  • FIG. 14 is a schematic structural diagram of the image processing device of the present application. As shown in FIG. 14, the image processing device 140 includes:
  • the segmentation module 141 is configured to segment the image to be processed based on the first visual feature of the pixel unit of the image to be processed to obtain a plurality of image regions; wherein the first visual feature includes at least one of the following: brightness, color temperature;
  • the determination module 142 is configured to determine a first gamma curve corresponding to a first image area; wherein the first image area is one of the plurality of image areas;
  • the first correction module 143 is configured to perform gamma correction on the first image region based at least in part on the first gamma curve.
  • The determination module 142 is further configured to determine a second gamma curve corresponding to a second image area, wherein the second image area is another image area, adjacent to the first image area, among the plurality of image areas, and the first gamma curve is different from the second gamma curve; the first correction module 143 is further configured to perform gamma correction on the second image area based on the second gamma curve.
  • the first correction module 143 is further configured to perform gamma correction on the transitional image region based at least in part on the first gamma curve and/or the second gamma curve, wherein the The transition image area includes a boundary between the first image area and the second image area.
  • In some embodiments, the determination module 142 is further configured to: determine the boundary line between the first image area and the second image area; and use the area enclosed after offsetting the boundary line by a specific pixel distance towards the first image area and towards the second image area as the transition image area.
  • In some embodiments, the first correction module 143 is configured to: use the weighted sum of the first gamma curve and the second gamma curve, based on the first weight function and the second weight function, as the fifth gamma curve; and perform gamma correction on the transition image area based at least in part on the fifth gamma curve; wherein the first weight function is the coefficient of the first gamma curve, the second weight function is the coefficient of the second gamma curve, and for the same pixel unit the sum of the value of the first weight function and the value of the second weight function is 1; the value of the first weight function is negatively correlated with the distance from the first pixel unit to the first boundary of the first image area, wherein the first pixel unit is a pixel unit of the transition image area that belongs to the first image area, and the first boundary is located in the first image area.
  • In some embodiments, the image processing device 140 further includes a second correction module; the determination module 142 is further configured to determine a third gamma curve corresponding to the first image area, wherein the third gamma curve is obtained by calibration at the image display end and the first gamma curve is obtained by calibration at the image sensor end; and the second correction module is configured to perform gamma correction on the first image area according to the third gamma curve.
  • the determining module 142 is configured to: determine light characteristics of the physical environment; and determine a corresponding third gamma curve according to the light characteristics.
  • the segmentation module 141 is configured to: segment the pixel units in which the first visual feature falls into the same specific interval into one image area, so as to obtain multiple image areas.
  • In some embodiments, the determination module 142 is further configured to: determine the overall second visual feature of the image to be processed based on the first visual feature of the pixel units of the image to be processed; if the second visual feature satisfies the segmentation condition, trigger the segmentation module 141 to segment the image to be processed; and if the second visual feature does not meet the segmentation condition, determine the fourth gamma curve corresponding to the second visual feature and trigger the first correction module 143 to perform gamma correction on the image to be processed according to the fourth gamma curve.
  • In some embodiments, the second visual feature includes at least one of the following: the variation of the first visual feature of the pixel units of the image to be processed; whether the image to be processed includes a light source; the number of light sources; and the position of the light source.
  • the segmentation condition includes at least one of the following: the parameter characterizing the change is greater than a corresponding threshold; the image to be processed includes a light source; the number of the light source is greater than a corresponding threshold; the position of the light source at a specific location.
  • In some embodiments, the determination module 142 is further configured to: acquire the mapping relationship between different gamma curves and third visual features characterizing scenes; determine the overall fourth visual feature of the first image area; determine, from the third visual features characterizing scenes, a third visual feature that matches the fourth visual feature; and determine the gamma curve corresponding to the matched third visual feature as the first gamma curve.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or physically exist separately, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units can be implemented in the form of hardware or in the form of software functional units. It can also be implemented in the form of a combination of software and hardware.
  • If the above-mentioned method is implemented in the form of a software function module and sold or used as an independent product, it can be stored in a computer-readable storage medium.
  • Such a computer software product is stored in a storage medium and includes several instructions for causing an electronic device to execute all or part of the methods described in the various embodiments of the present application.
  • The aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a magnetic disk, or an optical disk.
  • the present application is not limited to any specific combination of hardware and software.
  • the present application provides an image processing chip, including a processor unit configured to: execute the image processing method as described above.
  • FIG. 15 is a schematic diagram of the hardware entity of the electronic device of the present application.
  • As shown in FIG. 15, the electronic device 150 includes a memory 151 and a processor 152; the memory 151 stores a computer program that can run on the processor 152, and the processor 152 implements the steps of the methods provided in the above embodiments when executing the program.
  • The memory 151 is configured to store instructions and applications executable by the processor 152, and can also cache data to be processed or already processed by the processor 152 and the modules of the electronic device 150 (for example, image data, audio data, voice communication data, and video communication data); it can be implemented by flash memory (FLASH) or random access memory (RAM).
  • the present application provides a computer-readable storage medium on which a computer program is stored, and when the computer program is executed by a processor, the steps in the method provided in the above-mentioned embodiments are implemented.
  • the present application provides a computer program product containing instructions, which, when run on a computer, causes the computer to execute the steps in the methods provided by the above method embodiments.
  • the disclosed devices and methods may be implemented in other ways.
  • the above-described embodiments are only illustrative.
  • the division of the modules is only a logical function division.
  • The mutual coupling, direct coupling, or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between devices or modules may be electrical, mechanical, or in other forms.
  • The modules described above as separate components may or may not be physically separated, and the components displayed as modules may or may not be physical modules; they may be located in one place or distributed over multiple network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • In addition, the functional modules in the embodiments of the present application can be integrated into one processing unit, each module can be a separate unit, or two or more modules can be integrated into one unit; the above integrated modules can be implemented in the form of hardware, or in the form of hardware plus software functional units.
  • If the above integrated units of the present application are implemented in the form of software function modules and sold or used as independent products, they can be stored in a computer-readable storage medium.
  • Such a computer software product is stored in a storage medium and includes several instructions for causing an electronic device to execute all or part of the methods described in the various embodiments of the present application.
  • The aforementioned storage medium includes various media capable of storing program code, such as removable storage devices, ROMs, magnetic disks, or optical disks.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)
  • Facsimile Image Signal Circuits (AREA)
  • Picture Signal Circuits (AREA)

Abstract

Disclosed are an image processing method and apparatus, a device, and a storage medium. The method comprises: segmenting an image to be processed on the basis of a first visual feature of pixel units of the image to be processed, so as to obtain multiple image areas, the first visual feature comprising brightness and/or color temperature; determining a first gamma curve corresponding to a first image area, the first image area being one of the multiple image areas; and performing gamma correction on the first image area at least partly on the basis of the first gamma curve.
PCT/CN2022/098683 2021-07-23 2022-06-14 Procédé et appareil de traitement d'image, dispositif, et support de stockage WO2023000868A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110836393.XA CN115689903A (zh) 2021-07-23 2021-07-23 图像处理方法及装置、设备、存储介质
CN202110836393.X 2021-07-23

Publications (1)

Publication Number Publication Date
WO2023000868A1 true WO2023000868A1 (fr) 2023-01-26

Family

ID=84979802

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/098683 WO2023000868A1 (fr) 2021-07-23 2022-06-14 Procédé et appareil de traitement d'image, dispositif, et support de stockage

Country Status (2)

Country Link
CN (1) CN115689903A (fr)
WO (1) WO2023000868A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116072059A (zh) * 2023-02-27 2023-05-05 卡莱特云科技股份有限公司 一种图像显示方法、装置、电子设备及存储介质

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100026731A1 (en) * 2008-07-31 2010-02-04 Sony Corporation Image processing circuit and image display apparatus
US20150244915A1 (en) * 2014-02-21 2015-08-27 Taro Kikuchi Image processing apparatus, image capturing apparatus, image correction method, and storage medium
CN108337450A (zh) * 2017-01-19 2018-07-27 卡西欧计算机株式会社 图像处理装置、图像处理方法以及记录介质
CN110139020A (zh) * 2018-02-09 2019-08-16 杭州海康威视数字技术股份有限公司 一种图像处理方法及其装置
CN112714356A (zh) * 2020-12-18 2021-04-27 北京百度网讯科技有限公司 图像亮度校正方法、装置、设备以及存储介质

Also Published As

Publication number Publication date
CN115689903A (zh) 2023-02-03

Similar Documents

Publication Publication Date Title
JP5772991B2 (ja) 電子機器
US10013739B2 (en) Image enhancement methods and systems using the same
US8538147B2 (en) Methods and appartuses for restoring color and enhancing electronic images
CN107274351B (zh) 图像处理设备、图像处理系统和图像处理方法
JPWO2006059573A1 (ja) 色彩調整装置及び方法
JP3959909B2 (ja) ホワイトバランス調整方法及び調整装置
US20180232862A1 (en) Image processing device and image processing method
KR20070090224A (ko) 전자 색 이미지 채도 처리 방법
WO2021218603A1 (fr) Procédé de traitement d'images et système de projection
CN112351195B (zh) 图像处理方法、装置和电子系统
JP4595569B2 (ja) 撮像装置
WO2023000868A1 (fr) Procédé et appareil de traitement d'image, dispositif, et support de stockage
KR20190073516A (ko) 화상 처리 장치, 디지털 카메라, 화상 처리 프로그램, 및 기록 매체
CN109982012B (zh) 图像处理方法及装置、存储介质、终端
US10970822B2 (en) Image processing method and electronic device thereof
JP2012028937A (ja) 映像信号補正装置および映像信号補正プログラム
WO2023284528A1 (fr) Procédé et appareil d'amélioration d'image, dispositif informatique et support de stockage
JP4359662B2 (ja) カラー画像の露出補正方法
TWI438718B (zh) 適應性反雙曲線影像處理方法及其系統
WO2022067761A1 (fr) Procédé et appareil de traitement d'image, dispositif de capture, plateforme mobile, et support de stockage lisible par ordinateur
JP2002369004A (ja) 画像処理装置と画像出力装置と画像処理方法及び記憶媒体
JP5050141B2 (ja) カラー画像の露出評価方法
WO2013114802A1 (fr) Dispositif et procédé de traitement d'images, programme informatique et système de traitement d'images
JP2013115571A (ja) 情報処理装置
JP6403811B2 (ja) 画像処理装置及びプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22845035

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE