CN111131709A - Preview mode preview image generation method, electronic device and storage medium - Google Patents

Preview mode preview image generation method, electronic device and storage medium

Info

Publication number
CN111131709A
CN111131709A (application CN201911411921.6A)
Authority
CN
China
Prior art keywords
target
brightness
determining
exposure value
brightness interval
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911411921.6A
Other languages
Chinese (zh)
Other versions
CN111131709B (en)
Inventor
Li Zhenhua (李振华)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201911411921.6A priority Critical patent/CN111131709B/en
Publication of CN111131709A publication Critical patent/CN111131709A/en
Application granted granted Critical
Publication of CN111131709B publication Critical patent/CN111131709B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules
    • H04N23/62 - Control of parameters via user interfaces
    • H04N23/63 - Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631 - Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/632 - Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • H04N25/00 - Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/50 - Control of the SSIS exposure
    • H04N25/53 - Control of the integration time

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)

Abstract

The embodiment of the application provides a preview mode preview image generation method, an electronic device and a storage medium, wherein the method comprises the following steps: if the camera application is in a preview mode, obtaining a target exposure value for the current photographing scene, wherein the target exposure value is an exposure value determined for the captured image based on at least three different exposure values; obtaining a preview image based on the target exposure value; and displaying the preview image.

Description

Preview mode preview image generation method, electronic device and storage medium
Technical Field
The present invention relates to image processing technologies, and in particular, to a preview mode preview image generating method, an electronic device, and a storage medium.
Background
At present, when a user starts the camera to shoot, there is no prior statistical data, so the first frame can only be captured with a preset exposure. As a result, depending on the environment, the previewed object may be overexposed or underexposed when the camera is opened, which seriously affects user experience. In the related scheme, a preset fixed exposure value is adopted, and when the user shoots with the camera, the exposure value is adjusted frame by frame according to an exposure algorithm until it is appropriate. However, because the initial exposure value is fixed, an appropriate exposure value can only be reached by adjusting the image frame by frame, and the resulting exposure value is easily affected by brightness changes in the current environment.
Disclosure of Invention
In view of this, embodiments of the present invention provide a preview mode preview image generating method, an electronic device and a storage medium to solve the problems in the prior art.
The technical scheme of the embodiment of the application is realized as follows:
the embodiment of the application provides a preview mode preview image generation method, which comprises the following steps:
if the camera application is in a preview mode, obtaining a target exposure value aiming at the current photographing scene; wherein the target exposure value is an exposure value determined for the captured image based on at least three different exposure values;
obtaining a preview image based on the target exposure value;
and displaying the preview image.
An embodiment of the present application provides an electronic device, which includes:
a camera module;
and the processor is used for controlling the camera module to realize the preview mode preview image generation method.
Correspondingly, an embodiment of the application provides a computer-readable storage medium in which computer-executable instructions are stored, the computer-executable instructions being configured to execute the preview mode preview image generating method provided above.
The embodiments of the application provide a preview mode preview image generation method, an electronic device and a storage medium. When the camera is in the preview mode, a plurality of different exposure values are used to determine the target exposure value for capturing an image, so that the determined target exposure value matches the current shooting scene. In this way, the preview image is collected with a target exposure value that matches the current shooting scene, and the image quality of the obtained preview image is better.
Drawings
Fig. 1 is a schematic flow chart illustrating an implementation of a preview mode preview image generation method according to an embodiment of the present application;
FIG. 2 is a schematic flow chart of still another implementation of a preview mode preview image generation method according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a target histogram according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a stitched map according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a sliding window over the stitched map according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of an electronic device for generating a preview image in a preview mode according to an embodiment of the present application.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
In the following description, suffixes such as "module", "component", or "unit" used to denote elements are used only for the convenience of description of the present application, and have no specific meaning by themselves. Thus, "module", "component" or "unit" may be used mixedly.
The electronic device may be implemented in various forms. For example, the electronic devices described in the present application may include mobile electronic devices such as a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a Personal Digital Assistant (PDA), a Portable Media Player (PMP), a navigation device, a wearable electronic device, a smart band, a pedometer, and the like, and fixed electronic devices such as a Digital TV, a desktop computer, and the like.
While the following description takes a mobile electronic device as an example, those skilled in the art will appreciate that, apart from elements specifically intended for mobile use, the configuration according to the embodiments of the present application can also be applied to fixed electronic devices.
An embodiment of the present application provides a preview mode preview image generating method, fig. 1 is a schematic view of an implementation flow of the preview mode preview image generating method according to the embodiment of the present application, and as shown in fig. 1, the preview mode preview image generating method includes the following steps:
step S101: if the camera application is in the preview mode, a target exposure value for the current photo scene is obtained.
Here, the target exposure value is an exposure value determined for the captured image based on at least three different exposure values. When the camera application is opened to shoot images, it first enters a preview mode. In the preview mode, multiple frames can be collected for the current shooting scene with different exposure values, and a target exposure value suitable for the current shooting scene can be obtained based on the exposure values of these frames. For example, the target exposure value is determined based on captured images at 5 different exposure values, so that the target exposure value matches the current shooting scene well.
Step S102: based on the target exposure value, a preview image is obtained.
Here, after the target exposure value is acquired, image capturing is performed using the target exposure value to obtain the preview image.
Step S103: and displaying the preview image.
Here, the preview image is displayed to the user, for example, in the interface of the camera application.
In the embodiment of the application, when the camera is in the preview mode, a plurality of different exposure values are used to determine the target exposure value for capturing an image, so that the determined target exposure value matches the current shooting scene. In this way, the preview image is collected with a target exposure value that matches the current shooting scene, and the image quality of the obtained preview image is better.
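For orientation only, the following sketch restates the S101-S103 flow in code form. It is a minimal illustration under stated assumptions, not part of the original disclosure: the capture, display and pick_target_exposure callables are hypothetical stand-ins for the camera module, the display path and the exposure-selection logic described in the later embodiments.

```python
from typing import Callable, List, Sequence

import numpy as np

def generate_preview(
    capture: Callable[[float], np.ndarray],     # hypothetical: returns a frame captured at a given exposure value
    display: Callable[[np.ndarray], None],      # hypothetical: shows a frame in the camera application
    probe_exposures: Sequence[float],
    pick_target_exposure: Callable[[List[np.ndarray], Sequence[float]], float],
) -> np.ndarray:
    """S101-S103: probe several exposures, pick a target exposure, capture and show the preview."""
    # S101: one probe frame per exposure value; these frames are never shown to the user
    probe_frames = [capture(ev) for ev in probe_exposures]
    target_ev = pick_target_exposure(probe_frames, probe_exposures)

    # S102: acquire the preview image with the scene-matched exposure value
    preview = capture(target_ev)

    # S103: display the preview image
    display(preview)
    return preview
```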
An embodiment of the present application provides a preview mode preview image generating method, and fig. 2 is a schematic flow chart of another implementation of the preview mode preview image generating method according to the embodiment of the present application, as shown in fig. 2, the preview mode preview image generating method includes the following steps:
step S201: and if the camera application is in a preview mode, respectively acquiring a first acquired image, a second acquired image and a third acquired image aiming at the current photographing scene according to the first exposure value, the second exposure value and the third exposure value.
Here, the first exposure value is smaller than the second exposure value, which is smaller than the third exposure value. When the camera application needs to be used, it is opened and enters the preview mode. At this point, the camera application exposes the current photographing scene with three preset exposure modes, namely low exposure, medium exposure and high exposure, to obtain the first, second and third captured images. The exposure is controlled through the sensitivity and the exposure time, and the exposure data are arranged from low exposure to high exposure to form an exposure sequence. In a specific example, the data at the 1/4, 2/4 and 3/4 positions of the exposure sequence are taken as the low, medium and high exposure values, respectively. For the current photographing environment, images are acquired with these three exposure values, yielding three frames; these three frames are not displayed on the display screen of the camera application, i.e., they do not need to be presented to the user.
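The selection of the three probe exposures from the exposure sequence, as described above, could look like the following sketch. It assumes only that the exposure sequence is an ordered list of exposure values; the 1/4, 2/4 and 3/4 positions follow the specific example in the text, while the concrete numeric values in the usage lines are arbitrary and purely illustrative.

```python
import numpy as np

def pick_probe_exposures(exposure_sequence):
    """Return low/medium/high probe exposures at the 1/4, 2/4 and 3/4 positions of the sequence."""
    seq = np.sort(np.asarray(exposure_sequence, dtype=float))   # ordered from low to high exposure
    n = len(seq)
    return seq[n // 4], seq[n // 2], seq[3 * n // 4]

# Illustrative 16-step exposure sequence (values are arbitrary, for demonstration only)
sequence = np.linspace(0.5, 8.0, 16)
low_ev, mid_ev, high_ev = pick_probe_exposures(sequence)
```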
Step S202: and determining a histogram of each acquired image to obtain a histogram set.
Here, after each captured image is obtained, a histogram corresponding to each captured image is computed. The histogram graphically represents the number of pixels at each brightness level of an image and shows how the pixels are distributed. Its horizontal axis represents luminance values from 0 to 255, and its vertical axis represents the number of pixels at each luminance in the image. The histogram shows details in the shadows (the left portion of the histogram), the midtones (the middle) and the highlights (the right portion).
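A per-image luminance histogram of the kind described in step S202 can be computed, for example, as in the following sketch for 8-bit grayscale frames; the use of NumPy and the function name luminance_histogram are choices of this illustration, not of the disclosure.

```python
import numpy as np

def luminance_histogram(image):
    """256-bin histogram of an 8-bit grayscale image: pixel count per luminance value 0-255."""
    image = np.asarray(image, dtype=np.uint8)
    hist = np.bincount(image.ravel(), minlength=256)
    return hist   # hist[v] = number of pixels whose luminance equals v

# As in step S202: one histogram per captured probe frame
# histograms = [luminance_histogram(frame) for frame in probe_frames]
```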
Step S203: determining the number of pixels of different luminance bins based on a plurality of different luminance bins and the histogram set.
Here, the luminance intervals include at least three intervals covering different luminance values. In a specific example, the three luminance intervals are 0-50, 50-200 and 200-255; in other embodiments, the three luminance intervals may be set to other values. The histogram corresponding to each captured image is divided into three regions according to the three different brightness intervals, the number of pixels in each of the three regions is counted for every histogram, and the counts are accumulated over the histograms to obtain the number of pixels in each brightness interval.
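The accumulation of pixel counts per brightness interval in step S203 can be illustrated as follows. The interval boundaries 0-50, 50-200 and 200-255 are taken from the specific example above; treating them as half-open ranges so that each pixel is counted exactly once is an assumption of this sketch.

```python
import numpy as np

# Interval boundaries from the specific example, treated as half-open ranges
INTERVALS = [(0, 50), (50, 200), (200, 256)]

def pixels_per_interval(histograms):
    """Accumulate, over all probe histograms, the pixel count in each brightness interval."""
    totals = np.zeros(len(INTERVALS), dtype=np.int64)
    for hist in histograms:                          # one 256-bin histogram per captured frame
        for i, (lo, hi) in enumerate(INTERVALS):
            totals[i] += int(np.sum(hist[lo:hi]))    # pixels with luminance in [lo, hi)
    return totals   # totals[0] = dark, totals[1] = mid-tone, totals[2] = bright
```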
Step S204: and determining the brightness interval with the number of pixels larger than the number threshold value as a target brightness interval.
Here, the pixel counts of the three obtained brightness intervals are compared and sorted, and the brightness interval holding the largest number of pixels is taken as the target brightness interval.
In some embodiments, in order to determine an accurate target brightness interval, the step S204 may be further implemented by:
the method comprises the following steps: and if the number of the pixels in the first brightness interval is larger than the number threshold, determining that the target brightness interval is the first brightness interval.
Here, if the number of pixels in the first brightness interval is greater than or equal to the number threshold, for example 80% of the total number of pixels of the captured images, or, in a specific example, if the first brightness interval simply holds the largest number of pixels, the first brightness interval is taken as the target brightness interval.
Step two: and if the number of the pixels in the second brightness interval is greater than the number threshold, determining that the target brightness interval is the second brightness interval.
Here, if the number of pixels in the second brightness interval is greater than or equal to the number threshold, for example 80% of the total number of pixels of the captured images, or, in a specific example, if the second brightness interval simply holds the largest number of pixels, the second brightness interval is taken as the target brightness interval.
Step three: and if the number of the pixels in the third brightness interval is greater than the number threshold, determining that the target brightness interval is the third brightness interval.
Here, if the number of pixels in the third brightness interval is greater than or equal to the number threshold, for example 80% of the total number of pixels of the captured images, or, in a specific example, if the third brightness interval simply holds the largest number of pixels, the third brightness interval is taken as the target brightness interval. The brightness values contained in the first brightness interval are less than or equal to those contained in the second brightness interval, and the brightness values contained in the second brightness interval are less than or equal to those contained in the third brightness interval. For example, the first brightness interval is 0-50, the second brightness interval is 50-200, and the third brightness interval is 200-255.
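The selection of the target brightness interval in step S204 can be sketched as follows. The 80% figure comes from the specific example above; combining the threshold test with a fallback to the interval holding the most pixels is one reading of the two criteria described, and the fallback behaviour is an assumption of this illustration.

```python
import numpy as np

def select_target_interval(interval_counts, total_pixels, ratio_threshold=0.8):
    """Pick the target brightness interval: the interval whose accumulated pixel count
    reaches the number threshold, otherwise the interval holding the most pixels."""
    counts = np.asarray(interval_counts)
    threshold = ratio_threshold * total_pixels     # e.g. 80% of all pixels in the probe frames
    over = np.flatnonzero(counts >= threshold)
    if over.size > 0:
        return int(over[0])        # 0 = first (dark), 1 = second (mid), 2 = third (bright)
    return int(np.argmax(counts))  # fallback: interval with the largest pixel count
```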
Step S205: and determining the target exposure value based on the brightness value corresponding to the target brightness interval.
Here, after the target brightness interval is determined, the brightness of the current environment can be known, and then the target exposure value of the shot image can be determined according to the brightness of the current environment.
Step S206: based on the target exposure value, a preview image is obtained.
Step S207: and displaying the preview image.
In the embodiment of the application, the exposure value adapted to the current photographing scene can be determined from the exposure values of the three previewed frames and their corresponding histograms, which speeds up the determination of the target exposure value and improves the photographing experience.
In some embodiments, in order to obtain an accurate target exposure value, the step S205 may be further implemented by:
the method comprises the following steps: and if the target brightness interval is the first brightness interval, determining the target exposure value as high exposure.
Here, when the target brightness interval is the first brightness interval, most of the pixels of the acquired multi-frame captured images are concentrated in the region with brightness 0-50, which indicates that the current environment is relatively dark and the brightness of the acquired images is relatively low; in this case the image needs to be exposed with a high exposure value to obtain a better exposed image.
Step two: and if the target brightness interval is the third brightness interval, determining that the target exposure value is low exposure.
Here, when the target brightness interval is the third brightness interval, most of the pixels of the acquired multi-frame captured images are concentrated in the region with brightness 200-255, which indicates that the current environment is relatively bright and the brightness of the acquired images is relatively high; in this case the image needs to be exposed with a low exposure value to obtain a better exposed image.
Step three: and if the target brightness interval is a second brightness interval, sequencing the number of pixels of each histogram distributed in the second brightness interval to obtain a sequencing result.
Here, when the target brightness interval is the second brightness interval, most of the pixels of the acquired multi-frame captured images are concentrated in the region with brightness 50-200, which means the brightness of the current shooting environment is moderate. Two frames need to be selected from the three frames, and the target exposure value is determined according to the distribution of pixel counts in the histograms of these two frames. Therefore, for each of the three histograms, the number of pixels falling in the 50-200 brightness region is counted, and these counts are sorted from small to large to obtain the sorting result.
Step four: and determining two target histograms arranged at specific positions in the sorting result.
Here, from the sorting result, the two histograms whose pixel counts in the 50-200 interval are the greatest are selected as the target histograms, denoted target histogram one and target histogram two, where the exposure value of target histogram one is higher than that of target histogram two. Fig. 3 is a schematic diagram of the target histograms of the embodiment of the present application: fig. 3(a) is target histogram one and fig. 3(b) is target histogram two.
Step five: based on the two target histograms, a target exposure value is determined.
Here, the two target histograms are stitched into a widened histogram and an exposure-difference conversion is performed to determine the target exposure value.
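Steps one to four above can be sketched as follows. Interpreting "high exposure" and "low exposure" as the highest and lowest of the three probe exposure values is an assumption of this illustration, as is the return convention (a chosen exposure value, or the indices of the two target histograms when the mid-brightness branch applies); the stitching and exposure-difference conversion of step five are sketched separately further below.

```python
import numpy as np

MID_LO, MID_HI = 50, 200   # second brightness interval from the example above

def decide_exposure(target_interval, histograms, probe_exposures):
    """Steps one to four: map the target brightness interval to an exposure decision.

    Returns (exposure_value, None) for the dark/bright cases, or
    (None, (idx_one, idx_two)) with the indices of the two target histograms
    when the mid-brightness branch applies."""
    if target_interval == 0:                 # dark scene: use high exposure
        return max(probe_exposures), None
    if target_interval == 2:                 # bright scene: use low exposure
        return min(probe_exposures), None

    # Mid-brightness scene: rank the histograms by pixel count in the 50-200 interval
    mid_counts = [int(np.sum(h[MID_LO:MID_HI])) for h in histograms]
    order = np.argsort(mid_counts)                      # ascending
    idx_one, idx_two = int(order[-1]), int(order[-2])   # the two richest in mid-tone pixels
    # Target histogram one is the one captured with the higher exposure value
    if probe_exposures[idx_one] < probe_exposures[idx_two]:
        idx_one, idx_two = idx_two, idx_one
    return None, (idx_one, idx_two)
```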
In some possible embodiments, after obtaining the target histogram, the following process may be further performed:
First, a first region in which the two target histograms have the same shape and a second region in which they differ are determined.
Here, as shown in fig. 3(a) and 3(b), the region B in fig. 3(a) and the region C in fig. 3(b) are the same region, i.e., the first region. The second region comprises a set of at least two sub-regions with different luminance values. For example, the region A to the left of the straight line 301 in fig. 3(a) and the region D to the right of the straight line 302 in fig. 3(b) have different luminance values; these two sub-regions form the sub-region set that constitutes the second region. As can be seen from fig. 3, the number of pixels contained in region A of fig. 3(a) differs from that contained in region D of fig. 3(b).
In a specific example, the shapes of the two target histograms in fig. 3 are compared; the region where the two histograms have the same shape is denoted the first region, and the region where they differ is denoted the second region. The differing region of the two target histograms comprises two sub-regions, and the luminance values in the two sub-regions are different.
And secondly, splicing the two target histograms based on the first region and the sub-region set to obtain a spliced graph.
Here, each of the two target histograms is divided at the boundary between the first region and the sub-regions of the second region, yielding region histograms for the different regions. The two target histograms are then stitched according to the positions of these region histograms, giving a widened histogram, i.e., the stitched map.
In some embodiments, the mosaic may be obtained by:
first, in the two target histograms, a position relationship of the sub-region set and the first region is determined.
Here, the two target histograms are denoted as a first target histogram and a second target histogram, and the sub-regions included in the sub-region set are denoted as a first sub-region and a second sub-region. As shown in fig. 3(a), in the first target histogram, the first sub-region A belonging to the first target histogram is determined, and its positional relationship with the region B of the first region is determined, namely that the first sub-region A lies to the left of the first region B. As shown in fig. 3(b), in the second target histogram, the second sub-region D belonging to the second target histogram is determined, and its positional relationship with the first region C is determined, namely that the second sub-region D lies to the right of the region C of the first region.
And secondly, splicing the first region and the sub-region set according to the position relation to obtain a spliced graph.
Here, according to its original positional relationship with the first region, each sub-region in the sub-region set is stitched at the corresponding side of the first region. In a specific example, as shown in fig. 4, which is a schematic diagram of a stitched map of an embodiment of the present application, the regions B and C of fig. 3 are overlapped to obtain the first region 42, i.e., the region between the straight lines 401 and 402; the first sub-region A of the sub-region set in fig. 3(a) is stitched on the left side of the first region 42, i.e., the region 41 of the stitched map, to the left of the straight line 401; and the second sub-region D of the sub-region set in fig. 3(b) is stitched on the right side of the first region 42, i.e., the region 43 of the stitched map, to the right of the straight line 402, resulting in the stitched map 44.
And thirdly, determining a target exposure value based on the splicing map.
Here, an exposure value corresponding to each splicing area in the splicing map is determined, and a target exposure value can be determined by calculating an exposure difference value according to the exposure value corresponding to each splicing area.
In some possible implementations, the third step may be implemented by:
firstly, dividing the mosaic image into a plurality of sliding windows along the horizontal axis direction according to a sliding frame with a specific step size and a specific size to obtain a sliding window set.
For example, the specific size of the sliding frame is set to the width of the second brightness interval and the specific step size to one brightness level. The sliding frame slides from left to right along the horizontal axis of the stitched map, and during the sliding the stitched map is divided into a plurality of sliding windows, yielding the sliding window set.
Secondly, in the sliding window set, the number of pixels in each sliding window is determined, and a second pixel number set is obtained.
Here, during sliding of the sliding windows, the number of pixels in the histogram area corresponding to each sliding window is determined.
Again, in the second set of pixel numbers, a maximum value of the number of pixels is determined.
Here, the obtained plurality of second pixel numbers are sorted, and the maximum value of the pixel numbers is determined.
And thirdly, determining the target position of the sliding window corresponding to the maximum value of the pixel number in the spliced graph.
Here, in the stitched map, the position on the horizontal axis of the left edge of the sliding window corresponding to the maximum pixel count is determined as the target position.
Finally, a target exposure value is determined based on the target position.
Here, the distances from the target position to the left and right end points of the stitched map are determined, and the target exposure value is determined from these distances and the exposure values at the left and right end points.
In one specific example, determining the target exposure value based on the target position may be accomplished as follows. First, the two exposure values corresponding to the two target histograms are obtained: the exposure value of the target histogram whose region is stitched on the left of the common region in the stitched map is denoted F1, and the exposure value of the target histogram whose region is stitched on the right is denoted F2, where F1 > F2. Second, the first and second end points at the two ends of the stitched map are determined: the leftmost end point of the stitched map is the first end point and the rightmost end point is the second end point. As shown in fig. 5, a schematic diagram of the sliding window over the stitched map according to an embodiment of the present application, the sliding window is the window formed by A1B1C1, and a plurality of sliding windows are obtained as the window A1B1C1 slides from left to right along the horizontal axis of the stitched map 51; when the window reaches position 502, the pixel count in the window A2B2C2 is the maximum, so 502 is determined as the target position, 501 is the first end point, and 503 is the second end point. Next, the ratio of the distance between the target position and the first end point to the distance between the target position and the second end point is determined: the former distance is denoted S1, the latter S2, and their ratio is denoted M. Finally, the target exposure value is determined from the ratio and the two exposure values by an exposure-difference calculation, that is, F = F1 - (F1 - F2) × M.
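The stitching of the "secondly" step and the sliding-window interpolation just described can be sketched together as follows. The sketch simplifies the shape comparison: it assumes the common (first) region can be approximated by the 50-200 band of either target histogram, with sub-region A taken from target histogram one below 50 and sub-region D from target histogram two above 200; the window width of 150 corresponds to that example interval, and taking the window's left edge as the target position with M = S1/S2 follows the text, though the exact end-point convention is an assumption.

```python
import numpy as np

def stitch_histograms(hist_one, hist_two, left_edge=50, right_edge=200):
    """Build the widened ('stitched') histogram: sub-region A from target histogram one,
    the common region, and sub-region D from target histogram two, concatenated."""
    left = np.asarray(hist_one[:left_edge])               # sub-region A (left of the common region)
    common = np.asarray(hist_one[left_edge:right_edge])   # first region (same shape in both histograms)
    right = np.asarray(hist_two[right_edge:])             # sub-region D (right of the common region)
    return np.concatenate([left, common, right])

def target_exposure_from_stitched(stitched, f1, f2, window_width=150, step=1):
    """Slide a window over the stitched histogram, locate the window holding the most
    pixels, and interpolate between the two exposure values: F = F1 - (F1 - F2) * M."""
    counts = np.asarray(stitched, dtype=np.int64)
    n = len(counts)
    best_pos, best_count = 0, -1
    for pos in range(0, n - window_width + 1, step):   # window slides along the horizontal axis
        c = int(counts[pos:pos + window_width].sum())
        if c > best_count:
            best_pos, best_count = pos, c

    s1 = best_pos               # distance from the target position to the first (left) end point
    s2 = (n - 1) - best_pos     # distance from the target position to the second (right) end point
    m = s1 / s2 if s2 > 0 else 0.0
    return f1 - (f1 - f2) * m

# Illustrative use: hist_one/hist_two are the two target histograms, f1 the exposure of
# target histogram one (the higher exposure) and f2 that of target histogram two.
# stitched = stitch_histograms(hist_one, hist_two)
# target_ev = target_exposure_from_stitched(stitched, f1=high_ev, f2=mid_ev)
```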
An electronic device provided in an embodiment of the present application is provided, fig. 6 is a schematic structural diagram of an electronic device for generating a preview image in a preview mode in an embodiment of the present application, and as shown in fig. 6, the electronic device 600 includes: a camera module 601 and a processor 602;
the processor 602 is configured to control the camera module to obtain a target exposure value for a current photographing scene if the camera application is in a preview mode; wherein the target exposure value is an exposure value determined for the captured image based on at least three different exposure values; obtaining a preview image based on the target exposure value; and displaying the preview image.
In the above electronic device, the processor 602 is further configured to control the camera module to respectively acquire a first acquired image, a second acquired image and a third acquired image for the current photographing scene according to a first exposure value, a second exposure value and a third exposure value; wherein the first exposure value is less than the second exposure value, which is less than the third exposure value; determining a histogram of each acquired image to obtain a histogram set; determining the number of pixels of different brightness intervals based on a plurality of different brightness intervals and the histogram set; determining a brightness interval with the number of pixels larger than a number threshold value as a target brightness interval; and determining the target exposure value based on the brightness value corresponding to the target brightness interval.
In the above electronic device, the processor 602 is further configured to control the camera module to determine that the target brightness interval is the first brightness interval if the number of pixels in the first brightness interval is greater than the number threshold; if the number of the pixels in the second brightness interval is larger than the number threshold, determining that the target brightness interval is the second brightness interval; if the number of the pixels in the third brightness interval is larger than the number threshold, determining that the target brightness interval is the third brightness interval; the brightness value contained in the first brightness interval is less than or equal to the brightness value contained in the second brightness interval; the brightness value included in the second brightness interval is less than or equal to the brightness value included in the third brightness interval.
In the above electronic device, the processor 602 is further configured to control the camera module to determine that the target exposure value is high exposure if the target brightness interval is the first brightness interval; if the target brightness interval is the third brightness interval, determining that the target exposure value is low exposure; if the target brightness interval is the second brightness interval, sorting the number of pixels of each histogram distributed in the second brightness interval to obtain a sorting result; determining two target histograms arranged at specific positions in the sorting result; based on the two target histograms, the target exposure value is determined.
In the above electronic device, the processor 602 is further configured to control the camera module to determine a first region with the same shape and a second region with a different shape between the two target histograms; wherein the second region comprises at least two sets of sub-regions differing in luminance value; splicing the two target histograms based on the first region and the sub-region set to obtain a spliced graph; determining the target exposure value based on the mosaic.
In the above electronic device, the processor 602 is further configured to control the camera module to determine, in the two target histograms, the position relationship between the sub-region set and the first region, and to stitch the first region and the sub-region set according to the position relationship to obtain the stitched map.
In the above electronic device, the processor 602 is further configured to control the camera module to divide the stitched map into a plurality of sliding windows along the horizontal axis direction according to a sliding frame with a specific step size and a specific size, obtaining a sliding window set; determine, in the sliding window set, the number of pixels in each sliding window to obtain a second pixel number set; determine a maximum value of the number of pixels in the second pixel number set; determine a target position, in the stitched map, of the sliding window corresponding to the maximum value of the pixel number; and determine the target exposure value based on the target position.
In the above electronic device, the processor 602 is further configured to control the camera module to respectively obtain two exposure values corresponding to the two target histograms; determining a first endpoint and a second endpoint at two ends of the mosaic; determining a ratio of a distance between the target location and the first endpoint and a distance between the target location and the second endpoint; based on the ratio and the two exposure values, the target exposure value is determined.
It should be noted that the above description of the embodiment of the electronic device is similar to the description of the embodiment of the method, and has similar beneficial effects to the embodiment of the method. For technical details not disclosed in the embodiments of the electronic device of the present application, refer to the description of the embodiments of the method of the present application for understanding.
Correspondingly, an embodiment of the present application provides a computer storage medium, in which computer-executable instructions are stored, and the computer-executable instructions are configured to execute the preview-mode preview image generating method provided in other embodiments of the present application.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, etc.) to execute the method described in the embodiments of the present application.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, electronic devices (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing electronic device to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing electronic device, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing electronic devices to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing electronic device to cause a series of operational steps to be performed on the computer or other programmable electronic device to produce a computer implemented process such that the instructions which execute on the computer or other programmable electronic device provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above description is only a preferred embodiment of the present application, and not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings of the present application, or which are directly or indirectly applied to other related technical fields, are included in the scope of the present application.

Claims (10)

1. A preview mode preview image generation method, the method comprising:
if the camera application is in a preview mode, obtaining a target exposure value aiming at the current photographing scene; wherein the target exposure value is an exposure value determined for the captured image based on at least three different exposure values;
obtaining a preview image based on the target exposure value;
and displaying the preview image.
2. The method of claim 1, the obtaining a target exposure value for a current photo scene, comprising:
respectively acquiring a first acquired image, a second acquired image and a third acquired image aiming at the current photographing scene according to the first exposure value, the second exposure value and the third exposure value; wherein the first exposure value is less than the second exposure value, which is less than the third exposure value;
determining a histogram of each acquired image to obtain a histogram set;
determining the number of pixels of different brightness intervals based on a plurality of different brightness intervals and the histogram set;
determining a brightness interval with the number of pixels larger than a number threshold value as a target brightness interval;
and determining the target exposure value based on the brightness value corresponding to the target brightness interval.
3. The method of claim 2, wherein the brightness intervals comprise at least three brightness intervals, and the at least three brightness intervals comprise different brightness values,
the determining that the brightness interval with the number of pixels larger than the number threshold is the target brightness interval comprises:
if the number of the pixels in the first brightness interval is larger than the number threshold, determining that the target brightness interval is the first brightness interval;
if the number of the pixels in the second brightness interval is larger than the number threshold, determining that the target brightness interval is the second brightness interval;
if the number of the pixels in the third brightness interval is larger than the number threshold, determining that the target brightness interval is the third brightness interval; the brightness value contained in the first brightness interval is less than or equal to the brightness value contained in the second brightness interval; the brightness value included in the second brightness interval is less than or equal to the brightness value included in the third brightness interval.
4. The method of claim 3, wherein determining the target exposure value based on the brightness value corresponding to the target brightness interval comprises:
if the target brightness interval is the first brightness interval, determining that the target exposure value is high exposure;
if the target brightness interval is the third brightness interval, determining that the target exposure value is low exposure;
if the target brightness interval is the second brightness interval, sorting the number of pixels of each histogram distributed in the second brightness interval to obtain a sorting result;
determining two target histograms arranged at specific positions in the sorting result;
based on the two target histograms, the target exposure value is determined.
5. The method of claim 4, after the determining two target histograms arranged at a particular location in the ranking result, the method further comprising:
determining a first region with the same shape and a second region with different shapes between the two target histograms; wherein the second region comprises at least two sets of sub-regions differing in luminance value;
splicing the two target histograms based on the first region and the sub-region set to obtain a spliced graph;
determining the target exposure value based on the mosaic.
6. The method of claim 5, wherein the stitching the two target histograms based on the first region and the set of sub-regions to obtain a stitched map comprises:
determining the position relation between the sub-region set and the first region in the two target histograms;
and splicing the first region and the sub-region set according to the position relation to obtain the spliced graph.
7. The method of claim 5, the determining the target exposure value based on the mosaic, comprising:
dividing the splicing map into a plurality of sliding windows along the direction of a horizontal axis according to a sliding frame with a specific step length and a specific size to obtain a sliding window set;
in the sliding window set, determining the number of pixels in each sliding window to obtain a second pixel number set;
determining a maximum value of the number of pixels in the second set of numbers of pixels;
determining a target position of a sliding window corresponding to the maximum value of the pixel quantity in the splicing map;
based on the target position, the target exposure value is determined.
8. The method of claim 7, the determining the target exposure value based on the target position comprising:
respectively acquiring two exposure values corresponding to the two target histograms;
determining a first endpoint and a second endpoint at two ends of the mosaic;
determining a ratio of a distance between the target location and the first endpoint and a distance between the target location and the second endpoint;
based on the ratio and the two exposure values, the target exposure value is determined.
9. An electronic device, the electronic device comprising:
a camera module;
a processor for controlling the camera module to implement the preview mode preview image generating method of any one of the above claims 1 to 8.
10. A computer-readable storage medium having computer-executable instructions stored therein, the computer-executable instructions being configured to perform the preview mode preview image generating method provided by any of the above claims 1 to 8.
CN201911411921.6A 2019-12-31 2019-12-31 Preview mode preview image generation method, electronic device and storage medium Active CN111131709B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911411921.6A CN111131709B (en) 2019-12-31 2019-12-31 Preview mode preview image generation method, electronic device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911411921.6A CN111131709B (en) 2019-12-31 2019-12-31 Preview mode preview image generation method, electronic device and storage medium

Publications (2)

Publication Number Publication Date
CN111131709A true CN111131709A (en) 2020-05-08
CN111131709B CN111131709B (en) 2021-04-13

Family

ID=70506393

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911411921.6A Active CN111131709B (en) 2019-12-31 2019-12-31 Preview mode preview image generation method, electronic device and storage medium

Country Status (1)

Country Link
CN (1) CN111131709B (en)


Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7046307B1 (en) * 1999-11-11 2006-05-16 Stmicroelectronics Asia Pacific Pte Ltd. Video signal noise level estimator
CN101604449A (en) * 2009-07-02 2009-12-16 浙江大学 A kind of tracking image target method and device based on parallel particle filtering
CN102045546A (en) * 2010-12-15 2011-05-04 广州致远电子有限公司 Panoramic parking assist system
CN103248828A (en) * 2012-02-13 2013-08-14 宏达国际电子股份有限公司 Exposure value adjustment apparatus and method
CN106412447A (en) * 2015-07-31 2017-02-15 广达电脑股份有限公司 Exposure control system and method thereof
CN106534714A (en) * 2017-01-03 2017-03-22 南京地平线机器人技术有限公司 Exposure control method, device and electronic equipment
CN107613191A (en) * 2017-08-01 2018-01-19 努比亚技术有限公司 A kind of photographic method, equipment and computer-readable recording medium
CN109729275A (en) * 2019-03-14 2019-05-07 Oppo广东移动通信有限公司 Imaging method, device, terminal and storage medium

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113422908A (en) * 2021-07-01 2021-09-21 联想(北京)有限公司 Data processing method and device
CN113422908B (en) * 2021-07-01 2023-05-23 联想(北京)有限公司 Data processing method and device
CN114885096A (en) * 2022-03-29 2022-08-09 北京旷视科技有限公司 Shooting mode switching method, electronic equipment and storage medium
CN114885096B (en) * 2022-03-29 2024-03-15 北京旷视科技有限公司 Shooting mode switching method, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN111131709B (en) 2021-04-13

Similar Documents

Publication Publication Date Title
CN107409166B (en) Automatic generation of panning shots
WO2017016050A1 (en) Image preview method, apparatus and terminal
CN112135046B (en) Video shooting method, video shooting device and electronic equipment
US20080143841A1 (en) Image stabilization using multi-exposure pattern
CN107749944A (en) A kind of image pickup method and device
US7466356B2 (en) Method and apparatus for setting a marker on an object and tracking the position of the object
CN110349163B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN112714255B (en) Shooting method and device, electronic equipment and readable storage medium
KR20120099713A (en) Algorithms for estimating precise and relative object distances in a scene
CN111131709B (en) Preview mode preview image generation method, electronic device and storage medium
CN103797782A (en) Image processing device and program
JP2014127963A (en) Image processing system and image processing method
CN113794829B (en) Shooting method and device and electronic equipment
CN107645628B (en) Information processing method and device
CN101472076B (en) Device for filming image and filming control method thereof
CN110392211B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN111201773A (en) Photographing method and device, mobile terminal and computer readable storage medium
CN103795927B (en) Photographing method and system
CN113727001B (en) Shooting method and device and electronic equipment
CN113989387A (en) Camera shooting parameter adjusting method and device and electronic equipment
EP2200275B1 (en) Method and apparatus of displaying portrait on a display
CN111726526B (en) Image processing method and device, electronic equipment and storage medium
JP6116436B2 (en) Image processing apparatus and image processing method
JP2019176305A (en) Imaging apparatus, control method of the same, and program
US11972546B2 (en) Image processing apparatus and image processing method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant