CN108769634B - Image processing method, image processing device and terminal equipment - Google Patents


Info

Publication number
CN108769634B
CN108769634B (application CN201810734863.XA)
Authority
CN
China
Prior art keywords
images
background
area
preset
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810734863.XA
Other languages
Chinese (zh)
Other versions
CN108769634A (en
Inventor
张弓
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Oppo Chongqing Intelligent Technology Co Ltd
Original Assignee
Oppo Chongqing Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo Chongqing Intelligent Technology Co Ltd filed Critical Oppo Chongqing Intelligent Technology Co Ltd
Priority to CN201810734863.XA priority Critical patent/CN108769634B/en
Publication of CN108769634A publication Critical patent/CN108769634A/en
Application granted granted Critical
Publication of CN108769634B publication Critical patent/CN108769634B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N23/84Camera processing pipelines; Components thereof for processing colour signals
    • H04N23/88Camera processing pipelines; Components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Processing Of Color Television Signals (AREA)
  • Color Television Image Signal Generators (AREA)

Abstract

The application provides an image processing method, an image processing device and a terminal device. The method comprises the following steps: acquiring a plurality of images and judging whether the plurality of images were shot in the same area; if so, determining an overlapping background area of the plurality of images; counting a color histogram of the overlapping background area; judging, according to the color histogram, whether the overlapping background area is a preset solid-color background; and if it is, selecting one image from the plurality of images and performing white balance processing on the selected image according to the overlapping background area. The method and the device solve the problem that, when an image contains a large solid-color background, a conventional white balance algorithm causes severe color cast in the processed image.

Description

Image processing method, image processing device and terminal equipment
Technical Field
The present application belongs to the field of image processing technologies, and in particular, to an image processing method, an image processing apparatus, a terminal device, and a computer-readable storage medium.
Background
At present, after a terminal device captures an image, white balance processing is performed on it. The purpose of white balance processing is to make the photographed object present its original colors, avoiding the greenish cast that appears under fluorescent light, the yellowish cast under tungsten light, and the bluish cast in sunlit shade.
A conventional white balance algorithm first estimates the color temperature of the current environment from the captured image and then performs white balance processing on the image according to the estimated color temperature. However, when the captured image contains a large solid-color background, such as a blue or yellow background, the conventional algorithm may fail. For example, when a large blue background is present, the algorithm may mistake it for a white background lit at a high color temperature, conclude that the current environment has a high color temperature, and correct the blue background toward white during white balance processing; other objects in the image then suffer severe color cast, for example a human face turning yellowish. Therefore, when an image contains a large solid-color background, the conventional white balance algorithm may cause severe overall color cast in the processed image, resulting in a poor user experience.
Disclosure of Invention
In view of the above, the present application provides an image processing method, an image processing apparatus, a terminal device and a computer-readable storage medium, which solve the technical problem that a conventional white balance algorithm causes severe color cast in the processed image when the image contains a large solid-color background.
A first aspect of the present application provides an image processing method, including:
acquiring a plurality of images, and judging whether the plurality of images are images shot in the same area;
if the plurality of images are images captured in the same area, then:
determining an overlapping background area of the plurality of images;
counting a color histogram of the overlapped background area;
judging whether the overlapped background area is a preset pure color background or not according to the color histogram;
and if the overlapping background area is a preset solid-color background, selecting one image from the plurality of images and performing white balance processing on the selected image according to the overlapping background area.
A second aspect of the present application provides an image processing apparatus comprising:
the image acquisition module is used for acquiring a plurality of images and judging whether the plurality of images are images shot in the same area;
a background region determining module configured to determine an overlapping background region of the plurality of images if the plurality of images are images captured in the same region;
a histogram statistic module for counting the color histogram of the overlapped background region;
the pure color background judgment module is used for judging whether the overlapped background area is a preset pure color background or not according to the color histogram;
and the white balance processing module is used for selecting one image from the plurality of images if the overlapping background area is a preset solid-color background, and performing white balance processing on the selected image according to the overlapping background area.
A third aspect of the present application provides a terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the method according to the first aspect when executing the computer program.
A fourth aspect of the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method of the first aspect as described above.
A fifth aspect of the present application provides a computer program product comprising a computer program which, when executed by one or more processors, performs the steps of the method of the first aspect as described above.
In view of the above, the present application provides an image processing method which first acquires a plurality of images and determines whether they were shot in the same area; secondly, if the plurality of images were shot in the same area, determines their overlapping background area; then counts a color histogram of the overlapping background area and judges from it whether the overlapping background area is a preset solid-color background, such as blue or yellow; and finally, if the overlapping background area is a preset solid-color background, selects one image from the plurality of images and performs white balance processing on the selected image according to the overlapping background area. Thus, in the technical solution provided by the present application, before white balance processing is performed on an image, it is first determined whether the overlapping background area is a preset solid-color background, such as a blue or yellow background; if it is, the white balance processing is assisted according to the overlapping background area. A conventional white balance algorithm does not check whether an image contains a solid-color background before white balance processing, and may therefore identify the solid-color background as a white background, producing a large deviation in the estimated ambient color temperature and severe color cast in the processed image.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and those skilled in the art can obtain other drawings based on them without creative effort.
Fig. 1 is a schematic flow chart illustrating an implementation of an image processing method according to an embodiment of the present application;
fig. 2 is a schematic diagram illustrating an acquiring method of an overlapping area according to an embodiment of the present application;
fig. 3 is a schematic flow chart illustrating an implementation of another image processing method according to the second embodiment of the present application;
FIG. 4 is a schematic diagram of another ambient color temperature estimation method provided in the second embodiment of the present application;
fig. 5 is a schematic structural diagram of an image processing apparatus according to a third embodiment of the present application;
fig. 6 is a schematic structural diagram of a terminal device according to a fourth embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
The image processing method provided by the embodiment of the application can be applied to terminal devices, and the terminal devices include, but are not limited to: smart phones, tablet computers, learning machines, intelligent wearable devices, and the like.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon" or "in response to a determination" or "in response to a detection". Similarly, the phrase "if it is determined" or "if a [ described condition or event ] is detected" may be interpreted contextually to mean "upon determining" or "in response to determining" or "upon detecting [ described condition or event ]" or "in response to detecting [ described condition or event ]".
In particular implementations, the terminal devices described in embodiments of the present application include, but are not limited to, other portable devices such as mobile phones, laptop computers, or tablet computers having touch sensitive surfaces (e.g., touch screen displays and/or touch pads). It should also be understood that in some embodiments, the devices described above are not portable communication devices, but rather are desktop computers having touch-sensitive surfaces (e.g., touch screen displays and/or touch pads).
In the discussion that follows, a terminal device that includes a display and a touch-sensitive surface is described. However, it should be understood that the terminal device may include one or more other physical user interface devices such as a physical keyboard, mouse, and/or joystick.
The terminal device supports various applications, such as one or more of the following: a drawing application, a presentation application, a word processing application, a website creation application, a disc burning application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an email application, an instant messaging application, an exercise support application, a photo management application, a digital camera application, a web browsing application, a digital music player application, and/or a digital video player application.
Various applications that may be executed on the terminal device may use at least one common physical user interface device, such as a touch-sensitive surface. One or more functions of the touch-sensitive surface and corresponding information displayed on the terminal can be adjusted and/or changed between applications and/or within respective applications. In this way, a common physical architecture (e.g., touch-sensitive surface) of the terminal can support various applications with user interfaces that are intuitive and transparent to the user.
In addition, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not intended to indicate or imply relative importance.
In order to explain the technical solution of the present application, the following description will be given by way of specific examples.
Example one
Referring to fig. 1, an image processing method according to a first embodiment of the present application is described below, where the image processing method includes:
in step S101, a plurality of images are acquired, and it is determined whether or not the plurality of images are images captured in the same area;
in the first embodiment of the present application, a plurality of images are first obtained, where the plurality of images may be a plurality of images automatically selected from an album of a terminal device; alternatively, the terminal device may be a plurality of images manually selected from an album by the user, or may be a plurality of images automatically photographed by the terminal device at the same time when receiving a photographing instruction from the user, and how to acquire the plurality of images is not limited herein.
After acquiring the multiple images, it is necessary to determine whether the multiple images are captured in the same area, and in a first embodiment of the present application, the determining whether the multiple images are captured in the same area may include:
firstly, acquiring a shooting time interval between two adjacent images in the plurality of images and an average moving speed of the terminal equipment when the plurality of images are shot;
specifically, after acquiring a plurality of images, the acquired images may be sorted in sequence according to the shooting time, and then the shooting time interval between two adjacent images and the average moving speed of the terminal device during shooting the plurality of images are acquired. For example, if the acquired images are image a, image B and image C, the shooting time of image a is 9:00 am of 2018-06-21, the shooting time of image B is 9:05 am of 2018-06-21 and the shooting time of image C is 9:02 am of 2018-06-21, the 3 images are sorted first, i.e. image a, image C and image B, then the shooting time interval between image a and image C, i.e. 2 minutes, and the shooting time interval between image C and image B, i.e. 3 minutes, are acquired, the average moving speed of the terminal device during shooting the images (i.e. 9:00-9:05 minutes) is calculated, and the instantaneous moving speed of the terminal device during shooting the images can be detected by an acceleration sensor or a gyroscope sensor on the terminal device, an average moving speed is then calculated from the detected instantaneous moving speed.
Secondly, if the shooting time interval between each two adjacent images is smaller than a preset shooting duration and the average moving speed is smaller than a preset moving speed, the plurality of images are determined to have been shot in the same area; otherwise, they were not shot in the same area;
in the embodiment of the present application, since the overlapping background areas of the multiple images need to be acquired in subsequent steps, the preset shooting duration and the preset moving speed are often relatively small, and in order to ensure that the multiple acquired images are shot in the same area as much as possible, the method for acquiring the multiple images may be generally selected such that when a shooting instruction of a user is received, the terminal device automatically shoots the multiple images at the same time.
In step S102, if the plurality of images are images captured in the same area, determining an overlapping background area of the plurality of images;
if the multiple acquired images are determined to be images captured in the same area in step S101, the overlapping background areas of the multiple images are determined, in this embodiment, the overlapping background area is a background area commonly owned by the multiple images, the overlapping areas of the multiple images (i.e., the commonly owned image areas) may be calculated first, and then the area where the foreground is located in the overlapping area is removed, so as to obtain the overlapping background area, specifically, the overlapping background areas of the multiple images may be obtained by the following method:
firstly, determining an overlapping area of the plurality of images according to an image matching algorithm;
as shown in fig. 2, it is assumed that the images 201, 202, and 203 are multiple images captured after the user clicks the capture button (because the user inevitably has a hand shake phenomenon during the capture process, the multiple images are not completely the same), in order to obtain an overlapping region of the images 201, 202, and 203 (i.e., an image region that the images 201, 202, and 203 commonly have), the images 201, 202, and 203 may be subjected to a blocking process, then, features of each block, such as color features, texture features, edge features, and/or brightness features, may be extracted, then, with reference to the image 201, a certain block in the image 201, such as the block a, may be subjected to feature similarity matching with all blocks in the images 202 and 203, respectively, and whether the images 202 and 203 include the block a is determined, the image 201 is traversed to obtain all the common patches of the image 201, the image 202 and the image 203, so as to obtain the overlapping area of the above-mentioned plurality of images. As shown in fig. 
2, the block a may be first subjected to similarity matching with each block in the image 202, and whether a block whose similarity to the block a is greater than a preset similarity threshold exists in the image 202 is determined, and if a block whose similarity to the block a is greater than the preset similarity threshold exists in the image 202, the block a is considered as an image area shared by the image 201 and the image 202; secondly, similarity matching is carried out on the block A and each block in the image 203, whether the block with the similarity larger than a preset similarity threshold exists in the image 203 or not is judged, and if the block with the similarity larger than the preset similarity threshold exists in the image 203, the block A is considered to be an image area shared by the image 201 and the image 203; finally, all blocks of the image 201 are traversed, and the overlapping regions of the image 201, the image 202, and the image 203 are acquired.
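The block-matching procedure above can be sketched as follows. As a deliberate simplification, each block is characterized only by its mean color (the description also mentions texture, edge and brightness features), and the similarity measure is an assumed distance-based score, not one mandated by the application:

```python
import numpy as np

def block_features(img, bs=8):
    """Divide an H×W×3 image into bs×bs blocks and use each block's
    mean colour as its feature vector."""
    h, w, _ = img.shape
    feats = {}
    for y in range(0, h - bs + 1, bs):
        for x in range(0, w - bs + 1, bs):
            feats[(y, x)] = img[y:y + bs, x:x + bs].reshape(-1, 3).mean(axis=0)
    return feats

def overlap_blocks(ref_img, other_imgs, bs=8, sim_thresh=0.9):
    """Return the block positions of the reference image whose best
    feature match in every other image exceeds sim_thresh."""
    ref = block_features(ref_img, bs)
    others = [block_features(im, bs) for im in other_imgs]

    def sim(a, b):  # distance-based similarity in (0, 1]; 1.0 = identical
        return 1.0 / (1.0 + float(np.linalg.norm(a - b)))

    return [pos for pos, f in ref.items()
            if all(max(sim(f, g) for g in feats.values()) > sim_thresh
                   for feats in others)]
```

Traversing the reference image's blocks and keeping only those matched in every other image corresponds to the traversal of image 201 against images 202 and 203.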
Secondly, detecting a preset target in the overlapping area;
in general, the overlapping area obtained in the above steps often includes foreground objects, such as a portrait, an animal, a plant, and the like, and therefore, after the overlapping area of the multiple images is obtained, the foreground objects in the overlapping area can be detected.
Finally, if the overlapping area contains a preset target, the area where the preset target is located is acquired, and the overlapping background area is determined from it: the overlapping background area consists of the overlapping area excluding the area where the preset target is located.
That is, if the overlapping area is detected to contain a preset target, the area where the target is located is removed from the overlapping area to obtain the overlapping background area; if no preset target is detected in the overlapping area, the overlapping area itself is taken as the overlapping background area.
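Expressed as boolean masks, the background extraction above is a simple set difference; the detector producing the foreground mask is left open (the description only requires that a preset target such as a portrait be detected):

```python
import numpy as np

def overlap_background_mask(overlap_mask, foreground_mask=None):
    """The overlapping background is the overlap region minus the area
    occupied by the detected preset target (e.g. a portrait); when no
    target is detected, the overlap region itself is the background.
    Both arguments are boolean H×W masks."""
    if foreground_mask is None:
        return overlap_mask.copy()
    return overlap_mask & ~foreground_mask
```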
In step S103, counting a color histogram of the overlapping background region;
in this embodiment of the application, after the overlapping background region is obtained, its color histogram is computed so that the color of the region can subsequently be judged from it. The color histogram comprises an R histogram, a G histogram and a B histogram, which respectively indicate, for each possible R, G or B value, the ratio of the number of pixels taking that value to the total number of pixels.
For example, if the R, G and B values of each pixel in the overlapping background region range from 0 to 255, the R histogram gives, for each R value from 0 to 255, the proportion of pixels with that R value among all pixels; the G and B histograms do the same for the G and B values.
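The R, G and B histograms described above can be computed directly with `np.bincount`; this is a straightforward sketch of the statistic, not an implementation from the application:

```python
import numpy as np

def rgb_histograms(pixels):
    """pixels: N×3 uint8 array holding the overlapping background
    region. Returns three length-256 arrays giving, for every possible
    R, G and B value, the fraction of pixels taking that value."""
    n = len(pixels)
    return [np.bincount(pixels[:, ch], minlength=256) / n for ch in range(3)]
```

Each returned histogram sums to 1.0, matching the "ratio to the total number of pixels" definition above.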
In step S104, determining whether the overlapped background region is a preset solid background according to the color histogram;
generally, when an image contains a large amount of blue background and yellow background, the image after white balance adjustment is severely color-shifted by using a conventional white balance algorithm. Therefore, in the embodiment of the present application, the preset solid background may be a blue solid background and/or a yellow solid background, so that when the image includes the blue solid background or the yellow solid background, the white balance adjustment may be assisted according to the color of the overlapped background region.
In this embodiment, the determining whether the overlapped background area is a preset solid background according to the color histogram may include:
firstly, acquiring a preset color histogram of the preset pure color background, for example, when the preset pure color background is a blue pure color background, acquiring the preset color histogram of the blue pure color background;
secondly, performing similarity matching on the color histogram and the preset color histogram, that is, performing similarity matching on the color histogram of the overlapping background region obtained in step S103 and the preset color histogram of the blue solid background, wherein a similarity matching algorithm may refer to the prior art and is not described herein again;
then, if the similarity between the color histogram and the preset color histogram is greater than a preset similarity threshold, the overlapping background region is considered to be the preset solid-color background; otherwise it is not. That is, if the similarity between the color histogram and the preset color histogram of the blue solid background exceeds the preset similarity threshold, the overlapping background region is considered to be a blue solid background.
In addition, if the preset solid-color backgrounds comprise both a blue solid background and a yellow solid background, the color histogram of the overlapping background region obtained in step S103 may be similarity-matched against the preset color histogram of each in turn: if its similarity to the blue histogram exceeds the preset threshold, the region is considered a blue solid background; if its similarity to the yellow histogram exceeds the preset threshold, it is considered a yellow solid background.
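Since the description defers the similarity algorithm to the prior art, the sketch below uses histogram intersection, one common prior-art choice; the threshold value and the dictionary of preset backgrounds are illustrative assumptions:

```python
import numpy as np

def histogram_similarity(h1, h2):
    """Histogram intersection: 1.0 for identical normalised histograms,
    0.0 for completely disjoint ones."""
    return float(np.minimum(h1, h2).sum())

def classify_background(bg_hists, preset_hists, sim_thresh=0.8):
    """bg_hists: [R, G, B] histograms of the overlapping background.
    preset_hists: e.g. {'blue': [R, G, B], 'yellow': [R, G, B]} for the
    preset solid-colour backgrounds. Returns the name of the first
    preset whose R, G and B histograms all match, or None."""
    for name, ref in preset_hists.items():
        if min(histogram_similarity(a, b)
               for a, b in zip(bg_hists, ref)) > sim_thresh:
            return name
    return None
```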
In step S105, if the overlapped background area is a preset solid background, selecting one image from the plurality of images, and performing white balance processing on the selected image according to the overlapped background area;
in the first embodiment of the present application, the selected image may be any one selected by the terminal device from multiple images, or may be one of the images selected according to an instruction of the user, which is not limited herein.
The performing white balance processing on the selected image according to the overlapped background region may include:
if the overlapping background area is a preset solid-color background, for example a blue solid background, the image area excluding the overlapping background area may be obtained from the plurality of images; the color temperature of the current environment is then estimated from that remaining image area; and finally, white balance processing is performed on the selected image according to the estimated color temperature.
A conventional white balance algorithm estimates the current ambient color temperature from the whole image, so the color temperature estimate deviates greatly when the image contains a large solid-color background; estimating from the image area excluding that background avoids this deviation. Estimating the ambient color temperature and performing white balance processing according to it belong to the prior art and are not described herein again.
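Because the estimator itself is left to the prior art, the sketch below stands in with gray-world balancing computed only on non-background pixels, which captures the key idea of excluding the solid-color background from the estimate; gray-world is an assumed substitute, not the application's method:

```python
import numpy as np

def white_balance_excluding_background(img, background_mask):
    """Estimate white balance gains only from pixels outside the
    solid-colour background, then apply them to the whole image.
    Gray-world is used here purely for illustration."""
    fg = img[~background_mask].astype(float)   # non-background pixels, N×3
    means = fg.mean(axis=0)                    # per-channel means
    gains = means.mean() / means               # gray-world channel gains
    return np.clip(img.astype(float) * gains, 0, 255).astype(np.uint8)
```

When the non-background region is already neutral, the gains are all 1 and the image is unchanged, while a large blue background no longer drags the estimate toward a high color temperature.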
In the technical solution provided in the first embodiment of the present application, before white balance processing is performed on an image, it is first determined whether the overlapping background area is a preset solid-color background, such as a blue or yellow background; if it is, the white balance processing is assisted according to the overlapping background area. A conventional white balance algorithm does not check whether an image contains a solid-color background before white balance processing, and may therefore identify the solid-color background as a white background, producing a large deviation in the estimated ambient color temperature and severe color cast in the processed image.
Example two
Referring to fig. 3, an image processing method provided in the second embodiment of the present application is described below, where the image processing method includes:
in step S301, a plurality of images are acquired, and it is determined whether the plurality of images are images captured in the same area;
in step S302, if the plurality of images are captured in the same area, an overlapping background area of the plurality of images is determined;
in step S303, counting a color histogram of the overlapping background region;
in step S304, determining whether the overlapped background region is a preset solid background according to the color histogram;
in the second embodiment of the present application, the steps S301 to S304 are the same as the steps S101 to S104 in the first embodiment, and specific reference may be made to the description of the first embodiment, which is not repeated herein.
In step S305, if the overlapping background region is a preset solid-color background, one image is selected from the plurality of images, and the color temperature of the current environment is estimated according to the color of the preset solid-color background corresponding to the overlapping background region;
in the second embodiment of the present application, another method for estimating an ambient color temperature is provided. In this embodiment of the present application, the preset pixel values of the solid background may be pre-stored at different color temperatures (since the preset solid background may be color-shifted at high color temperature and low color temperature, so that the preset solid background may present different colors at different color temperatures, that is, different pixel values are provided, and we may pre-record the pixel values of the preset solid background at different color temperatures, and may store the "color temperature-pixel value correspondence information" in the memory before the terminal device leaves the factory), and then, if the overlapped background region is determined to be the preset solid background in step S304, the color temperature value of the current environment may be estimated according to the "color temperature-pixel value correspondence information" that we preset and store (although we consider that the overlapped background region is the preset solid background, the color of the overlapped background region necessarily has a color with the color of the preset solid background to a certain extent And a little deviation, therefore, we can estimate the color temperature of the current environment according to the pixel values of the pixels in the overlapped background region and the preset "color temperature-pixel value correspondence information").
To describe the above technical solution more intuitively, the method for estimating the ambient color temperature is described below with reference to fig. 4. Assume that the preset solid background is the image 402; at a high color temperature it is color-shifted into the image 401, and at a low color temperature into the image 403. The pixel values of the preset solid background at different color temperatures are then given by the correspondence information 404 in fig. 4, which may be stored in the memory of the terminal device in advance. If it is determined in step S304 that the color histogram of the overlapped background area 405 of the multiple images acquired in step S301 is similar to the color histogram of the preset solid background 402 (some deviation is inevitable), the overlapped background area 405 is considered to be the preset solid background. In general, however, the color of the overlapped background area 405 deviates slightly from the color of the preset solid background, so the current color temperature can be estimated from the pixel value of each pixel point in the overlapped background area and the preset correspondence information 404. Specifically, the overlapped background area 405 may be partitioned, the average of the pixel values in each partition calculated, and the color temperature corresponding to each average looked up in the correspondence information 404, yielding a color temperature value for each partition; the per-partition color temperature values may then be averaged to obtain the current ambient color temperature value.
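The partition-and-average lookup described above can be sketched as follows. This is a minimal sketch in Python with NumPy; the correspondence table, the grid size, and all RGB values are illustrative assumptions, not data from the patent.

```python
import numpy as np

# Hypothetical "color temperature -> pixel value" correspondence table for a
# preset blue solid background (illustrative placeholder values).
COLOR_TEMP_TABLE = [
    (2500, np.array([70.0, 90.0, 200.0])),   # low color temperature (color-shifted)
    (5500, np.array([60.0, 120.0, 230.0])),  # nominal color temperature
    (8500, np.array([80.0, 150.0, 255.0])),  # high color temperature (color-shifted)
]

def estimate_color_temperature(region, grid=(2, 2)):
    """Partition the overlapped background region (H x W x 3 array), average
    the pixel values in each partition, look up the nearest color temperature
    for each partition in the correspondence table, and average the
    per-partition color temperatures."""
    h, w, _ = region.shape
    rows, cols = grid
    temps = []
    for i in range(rows):
        for j in range(cols):
            block = region[i * h // rows:(i + 1) * h // rows,
                           j * w // cols:(j + 1) * w // cols]
            mean_rgb = block.reshape(-1, 3).mean(axis=0)
            # nearest table entry by Euclidean distance in pixel-value space
            temp, _ = min(COLOR_TEMP_TABLE,
                          key=lambda entry: np.linalg.norm(mean_rgb - entry[1]))
            temps.append(temp)
    return sum(temps) / len(temps)
```

Averaging over partitions rather than over the whole region, as the embodiment suggests, damps the influence of any single noisy block on the final estimate.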
In step S306, white balance processing is performed on the selected image according to the color temperature;
In this embodiment of the present application, in addition to the "color temperature-pixel value correspondence information" described in step S305, the correction values that bring the color-shifted preset solid background back to its normal color need to be stored. Because the preset solid background is color-shifted at high and low color temperatures, it presents different colors, that is, different pixel values, at different color temperatures; the color-shifted background can be corrected to its normal color in advance, and the resulting "color temperature-correction value correspondence information" stored in the memory before the terminal device leaves the factory. The pixel value of each pixel point in the selected image is then corrected according to the color temperature of the current environment estimated in step S305 and the pre-stored "color temperature-correction value correspondence information".
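A minimal sketch of this correction step, assuming a hypothetical "color temperature-correction value" table of per-channel gains. The gain values and the table layout are illustrative, not from the patent.

```python
import numpy as np

# Hypothetical "color temperature -> correction value" table: per-channel
# gains that bring the color-shifted background back to its normal color
# (illustrative placeholder values).
CORRECTION_TABLE = {
    2500: np.array([0.85, 1.00, 1.15]),  # undo the low-color-temperature shift
    5500: np.array([1.00, 1.00, 1.00]),  # nominal: no correction needed
    8500: np.array([1.15, 1.00, 0.85]),  # undo the high-color-temperature shift
}

def white_balance(image, color_temp):
    """Correct the pixel value of each pixel point of the selected image
    using the correction gains stored for the nearest color temperature."""
    nearest = min(CORRECTION_TABLE, key=lambda t: abs(t - color_temp))
    corrected = image.astype(float) * CORRECTION_TABLE[nearest]
    return np.clip(corrected, 0, 255).astype(np.uint8)
```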
The second embodiment of the present application thus provides another method of estimating the color temperature of the current environment. Because this method estimates the color temperature only when the overlapped background region is a preset solid background, such as blue, the estimated color temperature is neither too high nor too low, and the adjustment applied to the selected image is not excessive. The problem that a traditional white balance algorithm identifies a solid background as white and therefore estimates an ultra-high or ultra-low color temperature can thereby be effectively avoided, as can the resulting severe overall color cast of the white-balanced image.
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an execution order; the execution order of each process should be determined by its function and internal logic, and constitutes no limitation on the implementation process of the embodiments of the present application.
Example three
A third embodiment of the present application provides an image processing apparatus. For convenience of description, only the parts related to the present application are shown. As shown in fig. 5, the image processing apparatus 500 includes:
an image obtaining module 501, configured to obtain multiple images and determine whether the multiple images are captured in the same area;
a background region determining module 502, configured to determine an overlapping background region of the plurality of images if the plurality of images are captured in the same region;
a histogram statistic module 503, configured to count a color histogram of the overlapping background region;
a pure color background judgment module 504, configured to judge whether the overlapped background area is a preset pure color background according to the color histogram;
a white balance processing module 505, configured to select one image from the multiple images if the overlapped background area is a preset pure color background, and perform white balance processing on the selected image according to the overlapped background area.
Optionally, the background area determining module 502 includes:
an overlapping area determining unit for determining an overlapping area of the plurality of images according to an image matching algorithm;
a target detection unit for detecting a preset target in the overlap region;
and a background area determining unit, configured to, if the overlap area contains a preset target, acquire the area where the preset target is located, and determine the overlapping background area according to that area, where the overlapping background area consists of the overlap area excluding the area where the preset target is located.
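One way the exclusion step of the background area determining unit could look, assuming detected targets are reported as rectangular boxes; that representation is a simplification for illustration, as the patent does not fix one.

```python
import numpy as np

def overlap_background_mask(overlap_mask, target_boxes):
    """Given a boolean mask of the overlapping area of the images and a list
    of (row0, row1, col0, col1) rectangles for the detected preset targets
    (a hypothetical representation), return the mask of the overlapped
    background region: the overlap area with every target area excluded."""
    background = overlap_mask.copy()
    for r0, r1, c0, c1 in target_boxes:
        background[r0:r1, c0:c1] = False  # exclude the target's area
    return background
```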
Optionally, the solid background determining module 504 includes:
a preset histogram obtaining unit, configured to obtain a preset color histogram corresponding to the preset pure color background;
a similarity matching unit, configured to perform similarity matching between the color histogram and the preset color histogram;
a first solid background judging unit, configured to consider the overlapped background area as the preset solid background if a similarity between the color histogram and the preset color histogram is greater than a preset similarity threshold;
and a second solid background judgment unit, configured to determine that the overlapped background area is not the preset solid background if the similarity between the color histogram and the preset color histogram is smaller than or equal to the similarity threshold.
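The matching performed by these units could be sketched as follows. The patent does not specify a similarity measure, so histogram intersection is used here as an illustrative stand-in; the bin count and threshold are also assumptions.

```python
import numpy as np

def color_histogram(image, bins=8):
    """Concatenated per-channel color histogram, normalized so that regions
    of different sizes are comparable."""
    hist = np.concatenate([
        np.histogram(image[..., c], bins=bins, range=(0, 256))[0]
        for c in range(3)
    ]).astype(float)
    return hist / hist.sum()

def is_preset_solid_background(region, preset_hist, threshold=0.9):
    """Compare the region's histogram with the preset solid background's
    histogram using histogram intersection; above the threshold, the
    overlapped background area is considered to be the preset solid
    background."""
    similarity = np.minimum(color_histogram(region), preset_hist).sum()
    return similarity > threshold
```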
Optionally, the image obtaining module 501 is specifically configured to:
when a user shooting instruction is received, shoot multiple images simultaneously, take the multiple shot images as the plurality of images, and judge whether the plurality of images are shot in the same area.
Optionally, the image obtaining module 501 includes:
an interval and speed acquisition unit configured to acquire a shooting time interval between two adjacent images of the plurality of images and an average moving speed of the terminal device when the plurality of images are shot;
a first determining unit, configured to determine that the multiple images are images captured in the same area if the capturing time interval between each pair of adjacent images is smaller than a preset capturing duration and the average moving speed is smaller than a preset moving speed;
a second determining unit, configured to determine that the multiple images are not images captured in the same area if the capturing time interval between any pair of adjacent images is greater than or equal to the preset capturing duration, or the average moving speed is greater than or equal to the preset moving speed.
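The logic of these two determining units can be sketched as a single predicate; the threshold values (seconds, meters per second) are illustrative placeholders, not values from the patent.

```python
def shot_in_same_area(timestamps, avg_speed, max_interval=0.5, max_speed=0.2):
    """Judge whether a burst of images was shot in the same area: every
    interval between adjacent shots must be below the preset capturing
    duration AND the device's average moving speed must be below the
    preset moving speed; otherwise the images are not treated as shot in
    the same area."""
    intervals = [later - earlier
                 for earlier, later in zip(timestamps, timestamps[1:])]
    return all(dt < max_interval for dt in intervals) and avg_speed < max_speed
```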
Optionally, the white balance processing module 505 includes:
the color temperature estimation unit is used for estimating the color temperature of the current environment according to the color of the preset pure color background corresponding to the overlapped background area;
and the white balance processing unit is used for carrying out white balance processing on the selected image according to the color temperature.
It should be noted that the information interaction and execution processes between the above devices/units, and their specific functions and technical effects, are based on the same concept as the method embodiments of the present application; for details, reference may be made to the method embodiments, which are not repeated here.
Example four
Fig. 6 is a schematic diagram of a terminal device according to a fourth embodiment of the present application. As shown in fig. 6, the terminal device 6 of this embodiment includes: a processor 60, a memory 61 and a computer program 62 stored in said memory 61 and executable on said processor 60. The processor 60, when executing the computer program 62, implements the steps of the various method embodiments described above, such as the steps S101 to S105 shown in fig. 1. Alternatively, the processor 60 implements the functions of the modules/units in the device embodiments, such as the functions of the modules 501 to 505 shown in fig. 5, when executing the computer program 62.
Illustratively, the computer program 62 may be divided into one or more modules/units, which are stored in the memory 61 and executed by the processor 60 to accomplish the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution process of the computer program 62 in the terminal device 6. For example, the computer program 62 may be divided into an image acquisition module, a background region determination module, a histogram statistics module, a pure background judgment module, and a white balance processing module, and the functions of the modules are as follows:
acquiring a plurality of images, and judging whether the plurality of images are images shot in the same area;
if the plurality of images are images captured in the same area, then:
determining an overlapping background area of the plurality of images;
counting a color histogram of the overlapped background area;
judging whether the overlapped background area is a preset pure color background or not according to the color histogram;
and if the overlapped background area is a preset pure background, selecting one image from the plurality of images, and carrying out white balance processing on the selected image according to the overlapped background area.
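How the five modules above chain together can be sketched as follows; each step is passed in as a callable so the sketch stays independent of any particular implementation, and all signatures are hypothetical stand-ins.

```python
def process_images(images, same_area, overlap_background,
                   is_preset_solid, white_balance):
    """Orchestrate the image acquisition, background region determination,
    solid background judgment, and white balance modules; returns None when
    the method's preconditions are not met."""
    if not same_area(images):
        return None                       # not shot in the same area
    region = overlap_background(images)   # overlap area minus preset targets
    if not is_preset_solid(region):
        return None                       # fall back to ordinary white balance
    selected = images[0]                  # select one image from the burst
    return white_balance(selected, region)
```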
The terminal device 6 may be a computing device such as a smart phone, a tablet computer, a learning machine, or an intelligent wearable device. The terminal device may include, but is not limited to, the processor 60 and the memory 61. It will be understood by those skilled in the art that fig. 6 is merely an example of the terminal device 6 and does not constitute a limitation of it; the terminal device may include more or fewer components than those shown, combine some components, or use different components. For example, it may further include input-output devices, network access devices, a bus, and the like.
The Processor 60 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic, discrete hardware components, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 61 may be an internal storage unit of the terminal device 6, such as a hard disk or a memory of the terminal device 6. The memory 61 may also be an external storage device of the terminal device 6, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash card provided on the terminal device 6. Further, the memory 61 may include both an internal storage unit and an external storage device of the terminal device 6. The memory 61 is used for storing the computer program and other programs and data required by the terminal device, and may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned functions may be distributed as different functional units and modules according to needs, that is, the internal structure of the apparatus may be divided into different functional units or modules to implement all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the above modules or units is only one logical function division, and there may be other division manners in actual implementation, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units described above, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the flow in the methods of the embodiments described above may be implemented by a computer program, which may be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the method embodiments described above. The computer program includes computer program code, which may be in source code form, object code form, an executable file, or some intermediate form. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunication signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased as required by legislation and patent practice in the jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunication signals.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (9)

1. An image processing method, comprising:
acquiring a plurality of images, and judging whether the plurality of images are images shot in the same area or not according to the shooting time interval between two adjacent images in the plurality of images and the average moving speed of the terminal equipment when the plurality of images are shot;
if the plurality of images are images shot in the same area, then:
determining overlapping background areas of the plurality of images;
counting a color histogram of the overlapped background area;
judging whether the overlapped background area is a preset pure color background or not according to the color histogram;
if the overlapped background area is a preset pure background, selecting an image from the multiple images, and carrying out white balance processing on the selected image according to the overlapped background area; wherein, the white balance processing on the selected image according to the overlapped background area specifically includes:
if the overlapped background area is a preset pure background, acquiring an image area except the overlapped background area from the plurality of images, and estimating the color temperature of the current environment according to the image area except the overlapped background area in the plurality of images; and performing white balance processing on the selected image according to the estimated color temperature.
2. The image processing method of claim 1, wherein the determining the overlapping background regions of the plurality of images comprises:
determining an overlapping area of the plurality of images according to an image matching algorithm;
detecting a preset target in the overlapping area;
if the overlapping area contains a preset target, acquiring an area where the preset target is located, and determining the overlapping background area according to the area where the preset target is located, wherein the overlapping background area is composed of the overlapping area except the area where the preset target is located.
3. The image processing method according to claim 1, wherein said determining whether the overlapped background region is a preset solid background according to the color histogram comprises:
acquiring a preset color histogram corresponding to the preset pure color background;
performing similarity matching on the color histogram and the preset color histogram;
if the similarity between the color histogram and the preset color histogram is greater than a preset similarity threshold, the overlapped background area is considered as the preset pure color background;
and if the similarity between the color histogram and the preset color histogram is less than or equal to the similarity threshold, the overlapped background area is not considered as the preset pure color background.
4. The image processing method of claim 1, wherein said acquiring a plurality of images comprises:
when a user shooting instruction is received, multiple images are shot simultaneously.
5. The image processing method according to claim 1, wherein the determining whether the plurality of images are images captured in the same area includes:
acquiring a shooting time interval between two adjacent images in the plurality of images and an average moving speed of the terminal equipment when the plurality of images are shot;
if the shooting time interval between the two adjacent images is smaller than the preset shooting duration and the average moving speed is smaller than the preset moving speed, determining that the multiple images are the images shot in the same area;
and if the shooting time interval between the two adjacent images is greater than or equal to the preset shooting time length, or the average moving speed is greater than or equal to the preset moving speed, determining that the multiple images are not the images shot in the same area.
6. An image processing apparatus characterized by comprising:
the image acquisition module is used for acquiring a plurality of images and judging whether the plurality of images are images shot in the same area or not according to the shooting time interval between two adjacent images in the plurality of images and the average moving speed of the terminal equipment when the plurality of images are shot;
a background region determining module, configured to determine an overlapping background region of the multiple images if the multiple images are images captured in the same region;
the histogram statistical module is used for counting the color histogram of the overlapped background area;
the pure color background judgment module is used for judging whether the overlapped background area is a preset pure color background or not according to the color histogram;
the white balance processing module is used for selecting one image from the multiple images if the overlapped background area is a preset pure background, and carrying out white balance processing on the selected image according to the overlapped background area; wherein, the white balance processing is performed on the selected image according to the overlapped background area, specifically: if the overlapped background area is a preset pure background, acquiring an image area except the overlapped background area from the plurality of images, and estimating the color temperature of the current environment according to the image area except the overlapped background area in the plurality of images; and performing white balance processing on the selected image according to the estimated color temperature.
7. The image processing apparatus of claim 6, wherein the background region determination module comprises:
an overlapping area determining unit for determining an overlapping area of the plurality of images according to an image matching algorithm;
a target detection unit for detecting a preset target in the overlap region;
and the background area determining unit is used for acquiring an area where the preset target is located if the overlapping area contains the preset target, and determining the overlapping background area according to the area where the preset target is located, wherein the overlapping background area is formed by removing the area where the preset target is located from the overlapping area.
8. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any of claims 1 to 5 when executing the computer program.
9. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 5.
CN201810734863.XA 2018-07-06 2018-07-06 Image processing method, image processing device and terminal equipment Active CN108769634B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810734863.XA CN108769634B (en) 2018-07-06 2018-07-06 Image processing method, image processing device and terminal equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810734863.XA CN108769634B (en) 2018-07-06 2018-07-06 Image processing method, image processing device and terminal equipment

Publications (2)

Publication Number Publication Date
CN108769634A CN108769634A (en) 2018-11-06
CN108769634B true CN108769634B (en) 2020-03-17

Family

ID=63972494

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810734863.XA Active CN108769634B (en) 2018-07-06 2018-07-06 Image processing method, image processing device and terminal equipment

Country Status (1)

Country Link
CN (1) CN108769634B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109600606B (en) * 2018-11-30 2019-12-27 深圳市华星光电半导体显示技术有限公司 Method for identifying single tone image
CN111182217B (en) * 2020-01-07 2021-08-10 上海海鸥数码照相机有限公司 Image white balance processing method and device
WO2022032564A1 (en) * 2020-08-13 2022-02-17 华为技术有限公司 White balance processing method and device
CN112348905B (en) * 2020-10-30 2023-12-19 深圳市优必选科技股份有限公司 Color recognition method and device, terminal equipment and storage medium
CN112362068B (en) * 2020-12-04 2022-09-23 浙江煤炭测绘院有限公司 Unmanned aerial vehicle surveying and mapping method, device and system
CN112600991A (en) * 2020-12-09 2021-04-02 深圳市焦点数字科技有限公司 Method for solving high-magnification monochromatic color cast of variable-magnification movement
CN112822476A (en) * 2021-02-26 2021-05-18 广东以诺通讯有限公司 Automatic white balance method, system and terminal for color cast of large number of monochrome scenes
CN113052836A (en) * 2021-04-21 2021-06-29 深圳壹账通智能科技有限公司 Electronic identity photo detection method and device, electronic equipment and storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20040067086A (en) * 2003-01-21 2004-07-30 엘지전자 주식회사 Automatic white balance apparatus and method thereof
TWI360353B (en) * 2008-06-11 2012-03-11 Vatics Inc Method for auto-white-balance control
CN102446347B (en) * 2010-10-09 2014-10-01 株式会社理光 White balance method and device for image
CN104243819B (en) * 2014-08-29 2018-02-23 小米科技有限责任公司 Photo acquisition methods and device
CN104317932B (en) * 2014-10-31 2018-04-27 小米科技有限责任公司 Method for picture sharing and device
CN107483906B (en) * 2017-07-25 2019-03-19 Oppo广东移动通信有限公司 White balancing treatment method, device and the terminal device of image

Also Published As

Publication number Publication date
CN108769634A (en) 2018-11-06


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant