CN111225158A - Image generation method and device, electronic equipment and computer readable storage medium


Info

Publication number
CN111225158A
CN111225158A
Authority
CN
China
Prior art keywords
image data
exposure
exposure parameters
exposure parameter
generate
Prior art date
Legal status
Granted
Application number
CN201811413079.5A
Other languages
Chinese (zh)
Other versions
CN111225158B (en)
Inventor
孙伟
Current Assignee
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN201811413079.5A priority Critical patent/CN111225158B/en
Publication of CN111225158A publication Critical patent/CN111225158A/en
Application granted granted Critical
Publication of CN111225158B publication Critical patent/CN111225158B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/71 Circuitry for evaluating the brightness variation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/741 Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)

Abstract

The disclosure relates to an image generation method and device, an electronic device, and a computer-readable storage medium. The method includes: acquiring an exposure parameter corresponding to a current subject and multiple sets of other exposure parameters similar to that exposure parameter; selecting, according to a received user selection instruction, the image data corresponding to the exposure parameters indicated by that instruction, where the user selection instruction indicates the exposure parameters the user selected from the multiple sets of other exposure parameters; and synthesizing the image data corresponding to the exposure parameter with the selected image data to generate a target image for the current subject.

Description

Image generation method and device, electronic equipment and computer readable storage medium
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image generation method and apparatus, an electronic device, and a computer-readable storage medium.
Background
With the development of terminal technology, users increasingly rely on terminal devices for photography. Taking a mobile phone as an example, its portability means people take pictures with it on many occasions. However, when a user shoots against backlight or in a scene with a large brightness range, some parts of the resulting picture are too dark or too bright to show their details, so the picture has low clarity and the user experience is poor.
Disclosure of Invention
The present disclosure provides an image generation method and apparatus, an electronic device, and a computer-readable storage medium to solve the deficiencies in the related art.
According to a first aspect of embodiments of the present disclosure, there is provided an image generation method, including:
acquiring an exposure parameter corresponding to a current shot object and a plurality of groups of other exposure parameters similar to the exposure parameter;
selecting image data corresponding to the exposure parameters indicated by the user selection instruction according to the received user selection instruction, wherein the user selection instruction is used for indicating the exposure parameters selected by the user from the multiple groups of other exposure parameters;
synthesizing the image data corresponding to the exposure parameter with the selected image data to generate a target image for the current subject.
Optionally, the obtaining of the exposure parameter corresponding to the current subject and a plurality of groups of exposure parameters similar to the exposure parameter includes:
acquiring photometric data corresponding to a current subject;
determining an exposure parameter corresponding to the photometric data according to a photometric data table;
determining other photometric data in the photometric data table whose difference from the photometric data falls within a preset range, and reading the exposure parameters recorded in the photometric data table that correspond to the other photometric data.
Optionally,
further comprising: shooting the current shot object according to the exposure parameters to generate a target photo, and shooting the current shot object according to each exposure parameter in the multiple groups of other exposure parameters to generate a photo;
the selecting the image data corresponding to the exposure parameter indicated by the user selection instruction according to the received user selection instruction comprises: selecting a photo corresponding to the exposure parameter indicated by the user selection instruction;
the synthesizing of the image data corresponding to the exposure parameter with the selected image data to generate a target image for the current subject includes: synthesizing the target photo and the selected photo to generate the target image for the current subject.
Optionally, the method further includes: shooting the current shot object according to the exposure parameters to generate a target picture, shooting the current shot object according to the multiple groups of other exposure parameters, and recording color channel values of pixel points obtained by shooting the current shot object under the exposure parameters;
the selecting the image data corresponding to the exposure parameter indicated by the user selection instruction according to the received user selection instruction comprises: selecting a color channel value of each pixel point corresponding to the exposure parameter indicated by the user selection instruction;
the synthesizing of the image data corresponding to the exposure parameter with the selected image data to generate a target image for the current subject includes: adjusting the target picture according to the selected color channel values to generate the target image for the current subject.
Optionally, the synthesizing the image data corresponding to the exposure parameter with the selected image data to generate the target image for the current subject includes:
identifying a dark region of overexposed image data and a bright region of underexposed image data in the selected image data;
adjusting the image data corresponding to the exposure parameters according to the dark area and the bright area to generate a target image for the current subject.
According to a second aspect of the embodiments of the present disclosure, there is provided an image generation apparatus including:
an acquisition unit that acquires an exposure parameter corresponding to a current subject and a plurality of groups of other exposure parameters that are close to the exposure parameter;
the selecting unit is used for selecting image data corresponding to the exposure parameter indicated by the user selecting instruction according to the received user selecting instruction, wherein the user selecting instruction is used for indicating the exposure parameter selected by the user from the plurality of groups of other exposure parameters;
a generation unit that synthesizes the image data corresponding to the exposure parameter with the selected image data to generate a target image for the current subject.
Optionally, the obtaining unit includes:
an acquisition subunit that acquires photometric data corresponding to a current subject;
a first determining subunit that determines an exposure parameter corresponding to the photometry data from a photometry data table;
and a second determining subunit that determines other photometric data in the photometric data table whose difference from the photometric data falls within a preset range, and reads the exposure parameters corresponding to the other photometric data recorded in the photometric data table.
Optionally,
further comprising: a first photo generation unit that shoots the current subject according to the exposure parameters to generate a target photo, and shoots the current subject according to each exposure parameter of the multiple sets of other exposure parameters to generate a photo;
the selecting unit comprises: the first selecting subunit selects a photo corresponding to the exposure parameter indicated by the user selecting instruction;
the generation unit includes: and the synthesis subunit synthesizes the target photo and the selected photo to generate the target image aiming at the current shot object.
Optionally,
further comprising: the second photo generation unit is used for shooting the current shot object according to the exposure parameters to generate a target photo, shooting the current shot object according to the multiple groups of other exposure parameters, and recording color channel values of all pixel points obtained by shooting the current shot object under all exposure parameters;
the selecting unit comprises: the second selection subunit selects the color channel value of each pixel point corresponding to the exposure parameter indicated by the user selection instruction;
the generation unit includes: and the first adjusting subunit adjusts the target picture according to the selected color channel value to generate the target image for the current shot object.
Optionally, the generating unit includes:
an identifying subunit that identifies a dark region of the overexposed image data and a bright region of the underexposed image data in the selected image data;
a second adjusting subunit that adjusts the image data corresponding to the exposure parameter according to the dark area and the bright area to generate a target image for the current subject.
According to a third aspect of the embodiments of the present disclosure, there is provided an electronic apparatus including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor implements the method as in any of the above embodiments by executing the executable instructions.
According to a fourth aspect of embodiments of the present disclosure, there is provided a computer-readable storage medium having stored thereon computer instructions which, when executed by a processor, implement the steps of the method as in any one of the above-mentioned embodiments.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
as can be seen from the foregoing embodiments, according to the present disclosure, by obtaining other exposure parameters similar to the exposure parameters (i.e., normal exposure parameters) adopted when the current subject is shot, and further obtaining image data (i.e., including overexposed and underexposed image data) obtained under the other exposure parameters for selection by a user, the user can flexibly select overexposed and/or underexposed image data according to the user's own needs to adjust the image data obtained under the normal exposure parameters, so that details of a finally generated image lost due to over-brightness or over-darkness can be effectively avoided.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a flow chart illustrating an image generation method according to an exemplary embodiment.
FIG. 2 is a flow chart illustrating another method of image generation according to an exemplary embodiment.
FIGS. 3A-3D are schematic diagrams illustrating the generation of a target image according to an exemplary embodiment.
FIG. 4 is a flow chart illustrating another method of image generation according to an exemplary embodiment.
Fig. 5 is a block diagram illustrating an image generation apparatus according to an exemplary embodiment.
Fig. 6 is a block diagram illustrating another image generation apparatus according to an exemplary embodiment.
Fig. 7 is a block diagram illustrating another image generation apparatus according to an exemplary embodiment.
Fig. 8 is a block diagram illustrating another image generation apparatus according to an exemplary embodiment.
Fig. 9 is a block diagram illustrating another image generation apparatus according to an exemplary embodiment.
Fig. 10 is a schematic diagram illustrating a structure for an image generation apparatus according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present application. The word "if" as used herein may be interpreted as "upon", "when", or "in response to a determination", depending on the context.
Fig. 1 is a flowchart illustrating an image generation method according to an exemplary embodiment. The method is applied to an electronic device and, as shown in Fig. 1, may include the following steps:
in step 102, an exposure parameter corresponding to a current subject and a plurality of sets of other exposure parameters close to the exposure parameter are acquired.
In the present embodiment, a photometry data table may be configured in advance to record the mapping from photometric data to exposure parameters (shutter, aperture, and sensitivity). When shooting the current subject, photometry may be performed first (e.g., via a light meter in the electronic device) to acquire photometric data corresponding to the current subject; the exposure parameter corresponding to that photometric data is then determined from the photometry data table, other photometric data in the table whose difference from the measured data falls within a preset range are determined, and the exposure parameters recorded for that other photometric data are read. In other words, after the exposure parameter to be used (hereinafter, the normal exposure parameter) is determined from the photometric data and the table, multiple sets of other exposure parameters similar to the normal exposure parameter can be further obtained from the table. These sets include exposure parameters that are overexposed relative to the normal exposure parameter and parameters that are underexposed relative to it. Based on the obtained sets, image data corresponding to each set can be collected for the user to select flexibly according to actual requirements. As an exemplary embodiment, the user may input a user selection instruction to the electronic device to instruct it to select the corresponding image data for synthesizing the target image.
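The table lookup described above can be sketched as follows. This is a minimal illustrative sketch, not code from the patent: apart from the 400 lx row and the four similar parameter sets cited in the example of Table 1 later in this document, the table entries, luminance keys, function names, and the 100 lx preset range are assumptions.

```python
# Hypothetical photometry-table lookup. Each entry maps a metered
# luminance (lx) to a (shutter_s, aperture_f, iso) tuple; the luminance
# keys other than 400 are illustrative assumptions.
PHOTOMETRY_TABLE = {
    300: (1/40, 11.0, 100),
    350: (1/35, 10.6, 90),
    400: (1/30, 10.0, 86),
    450: (1/28, 9.7, 77),
    500: (1/23, 9.1, 70),
}

def lookup_exposure(metered_lx, preset_range=100):
    """Return the normal exposure parameters for the metered luminance,
    plus the parameter sets whose luminance differs by at most
    `preset_range` lx (the "similar" sets offered to the user)."""
    normal = PHOTOMETRY_TABLE[metered_lx]
    similar = [params for lx, params in PHOTOMETRY_TABLE.items()
               if lx != metered_lx and abs(lx - metered_lx) <= preset_range]
    return normal, similar

normal, similar = lookup_exposure(400)
```

With the assumed table, metering 400 lx yields the normal parameters plus four similar sets, matching the worked example in the detailed description.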
In step 104, image data corresponding to the exposure parameter indicated by the user selection instruction is selected according to the received user selection instruction, where the user selection instruction is used to indicate the exposure parameter selected by the user from the plurality of other sets of exposure parameters.
In an embodiment, after acquiring the normal exposure parameter and the plurality of other exposure parameters, the current subject may be captured according to the exposure parameters to generate a target photo, and the current subject may be captured according to each exposure parameter of the plurality of other exposure parameters to generate a photo. After the photos are shot and generated based on all the acquired exposure parameters, the photos corresponding to the exposure parameters indicated by the user selection instruction can be selected according to the user selection instruction, and the target photos and the selected photos are synthesized to generate the target images for the current shot object. For example, the electronic device may show all generated photos and exposure parameters corresponding to the photos to the user, so that the user can flexibly select the photos according to actual requirements; it should be noted that, the user may select only a part of the exposure parameters in the plurality of other exposure parameters, or may select all of the exposure parameters in the plurality of other exposure parameters.
In another embodiment, after obtaining the normal exposure parameters and the plurality of groups of other exposure parameters, the current subject may be photographed according to the exposure parameters to generate a target picture, the current subject may be photographed according to the plurality of groups of other exposure parameters, and color channel values of each pixel point obtained by photographing the current subject under each exposure parameter are recorded (no picture is generated). Based on the generation of the target picture and the recording of the color channel values, the color channel values of the pixel points corresponding to the exposure parameters indicated by the user selection instruction can be selected, and the target picture is adjusted according to the selected color channel values so as to generate the target image for the current shot object. For example, the electronic device may present the target photo and the selected sets of other exposure parameters to the user for the user to flexibly select according to actual needs.
In step 106, the image data corresponding to the exposure parameter is synthesized with the selected image data to generate a target image for the current subject.
In the present embodiment, based on the above analysis, the plurality of other exposure parameters include an overexposed exposure parameter and an underexposed exposure parameter. Accordingly, a dark portion region of image data obtained using exposure parameters for overexposure (hereinafter simply referred to as overexposed image data) exhibits abundant details (as compared with a region corresponding to the dark portion region in image data obtained using normal exposure parameters), and a bright portion region of image data obtained using exposure parameters for underexposure (hereinafter simply referred to as underexposed image data) exhibits abundant details (as compared with a region corresponding to the bright portion region in image data obtained using normal exposure parameters). Therefore, when synthesizing the target image, the dark region of the overexposed image data and the bright region of the underexposed image data in the selected image data can be identified, and the image data corresponding to the exposure parameter is adjusted according to the dark region and the bright region to generate the target image for the current subject. By the method for synthesizing the target image, the details of each region in the target image can be richer, the image quality is higher, and the user experience is favorably improved.
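A minimal sketch of this synthesis idea follows, under stated assumptions: the images are simplified to single-channel 0-255 grids, the dark and bright regions are found by thresholding the normally exposed frame (a simplification of the patent's region identification), and all names and threshold values are illustrative, not from the patent.

```python
# Hypothetical per-pixel merge: dark regions take detail from the
# overexposed frame, bright regions from the underexposed frame.
DARK, BRIGHT = 64, 192  # assumed luminance thresholds (0-255)

def synthesize(normal, overexposed, underexposed):
    """Merge three equally sized grayscale images, given as lists of
    rows of 0-255 values, into one target image."""
    out = []
    for n_row, o_row, u_row in zip(normal, overexposed, underexposed):
        row = []
        for n, o, u in zip(n_row, o_row, u_row):
            if n < DARK:        # too dark normally: take overexposed detail
                row.append(o)
            elif n > BRIGHT:    # too bright normally: take underexposed detail
                row.append(u)
            else:
                row.append(n)
        out.append(row)
    return out
```

A production implementation would blend with smooth weights rather than hard thresholds to avoid visible seams; the hard switch here only makes the region logic explicit.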
As can be seen from the foregoing embodiments, the present disclosure obtains other exposure parameters similar to the exposure parameters adopted when shooting the current subject (i.e., the normal exposure parameters), and further obtains the image data captured under those other exposure parameters (i.e., including overexposed and underexposed image data) for selection by the user. The user can thus flexibly select overexposed and/or underexposed image data according to their own needs to adjust the image data obtained under the normal exposure parameters, so that loss of detail in the finally generated image due to over-brightness or over-darkness can be effectively avoided.
For ease of understanding, the image generation scheme of the present disclosure is described in detail below in conjunction with the scene and the figures.
Referring to fig. 2, fig. 2 is a flowchart illustrating another image generation method according to an exemplary embodiment, and as shown in fig. 2, the method applied to an electronic device may include the following steps:
in step 202, photometric data is acquired.
In step 204, an exposure parameter corresponding to photometric data is determined.
In step 206, similar sets of other exposure parameters are determined.
In the present embodiment, the electronic apparatus may store a photometry data table in advance to record the mapping from photometric data to exposure parameters, where the exposure parameters include shutter, aperture, and sensitivity (ISO). When the current subject is to be photographed, photometry may be performed first (for example, via a light meter in the electronic device) to acquire photometric data corresponding to the current subject; the exposure parameter corresponding to that photometric data is then determined from the photometry data table, along with the other photometric data in the table whose difference from the measured data falls within a preset range, whose corresponding exposure parameters are then read.
Taking the measurement of the brightness of the shooting environment as an example, the photometric data table can be shown in table 1:
[Table 1 was rendered as images in the original document and is not recoverable in full. It maps measured luminance (lx) to exposure parameter sets of shutter speed, aperture, and ISO, including the entries cited in the example below.]
TABLE 1
For example, if the luminance acquired by the electronic device when shooting the subject is 400 lx, the normal exposure parameters can be determined to be 1/30s, F10, ISO86. Further, assuming the preset range is a luminance difference of no more than 100 lx, the similar sets of other exposure parameters can be determined to be (1/40s, F11, ISO100), (1/35s, F10.6, ISO90), (1/28s, F9.7, ISO77), and (1/23s, F9.1, ISO70). Of these, (1/40s, F11, ISO100) and (1/35s, F10.6, ISO90) are overexposure parameters, while (1/28s, F9.7, ISO77) and (1/23s, F9.1, ISO70) are underexposure parameters.
In step 208, a picture is taken at each exposure parameter.
In the present embodiment, the subject is photographed with the determined respective exposure parameters to generate the corresponding photographs. The photo obtained by shooting the exposure parameters of overexposure is an overexposed photo, and the photo obtained by shooting the exposure parameters of underexposure is an underexposed photo.
In step 210, a user selection instruction is received.
In this embodiment, after a photo is taken and generated with each of the determined exposure parameters, the generated photos and their corresponding exposure parameters may be displayed to the user (for example, on a touch screen) so that the user may select photos to synthesize. The user can select some of the photos to synthesize, or all of them. In other words, the user can flexibly select photos for synthesizing the target image according to actual requirements.
In step 212, the corresponding photo is selected according to the user selection instruction.
In step 214, a target image is generated.
In this embodiment, the dark area of an overexposed photo exhibits more detail (compared with the corresponding area of a photo taken with normal exposure parameters), and the bright area of an underexposed photo likewise exhibits more detail (compared with the corresponding area of a photo taken with normal exposure parameters). Therefore, when synthesizing the target image, the dark areas of overexposed photos and the bright areas of underexposed photos among the user-selected photos can be identified, and the photo generated under normal exposure parameters (i.e., the target photo) can be adjusted according to those dark and bright areas to generate the target image. With this synthesis method, the details of each region of the target image are richer and the image quality is higher, which helps improve the user experience.
For example, assume the user selects the photos corresponding to the overexposure parameters (1/35s, F10.6, ISO90) and the underexposure parameters (1/28s, F9.7, ISO77). FIG. 3A shows a photograph generated with normal exposure parameters; because of backlit shooting, the mountain 31 is too bright and the person 32 too dark to show details, while the water flow 33 appears clearly. FIG. 3B shows a photograph taken with the overexposure parameters (1/35s, F10.6, ISO90); it is brighter than FIG. 3A, so the detail of the person 32 appears more clearly than in FIG. 3A. FIG. 3C shows a photograph taken with the underexposure parameters (1/28s, F9.7, ISO77); it is darker than FIG. 3A, so the detail of the mountain 31 appears more clearly than in FIG. 3A. In other words, the person 32 in FIG. 3B is a dark region of the overexposed image data, and the mountain 31 in FIG. 3C is a bright region of the underexposed image data. As shown in FIG. 3D, once the person 32 in FIG. 3B and the mountain 31 in FIG. 3C are identified, they can be used to adjust the person 32 and the mountain 31 in FIG. 3A (the adjusted FIG. 3A being the target image), improving their clarity so that both present more detail.
The above description takes as an example the user selecting both overexposed and underexposed image data. Of course, the user may select only overexposed or only underexposed image data; in other words, the user can flexibly select the required image data as needed.
As can be seen from the above embodiments, the present disclosure obtains other exposure parameters similar to the exposure parameters adopted when shooting the current subject (i.e., the normal exposure parameters), and further obtains the image data captured under those other exposure parameters (i.e., including overexposed and underexposed image data) for selection by the user. The user can thus flexibly select overexposed and/or underexposed image data according to their own needs to adjust the image data obtained under the normal exposure parameters, so that loss of detail in the finally generated image due to over-brightness or over-darkness can be effectively avoided. For example, when the electronic device performs face recognition by shooting a face in a backlit environment, the image generation method of the disclosure can effectively avoid recognition failure caused by loss of facial information when the face region of the image is too dark.
Referring to fig. 4, fig. 4 is a flowchart illustrating another image generation method according to an exemplary embodiment, and as shown in fig. 4, the method applied to an electronic device may include the following steps:
in step 402, photometric data is acquired.
In step 404, an exposure parameter corresponding to photometric data is determined.
In step 406, similar sets of other exposure parameters are determined.
In this embodiment, the specific implementation of steps 402-406 is similar to steps 202-206 and is not described here again.
In step 408, the current subject is captured in accordance with the exposure parameters corresponding to the photometric data and a target photograph is generated.
In step 410, the current subject is photographed according to each exposure parameter of the plurality of other exposure parameters, and color channel values of each pixel point obtained by photographing the subject under each exposure parameter are recorded.
In this embodiment, the electronic device only records the color channel values of each pixel obtained by shooting the subject under each of the multiple sets of other exposure parameters, without generating a picture. The color channel values may be recorded in RGB format, or in any other color coding format such as YUV (luma and chroma), which this disclosure does not limit. Recording only the color channel values for subsequently adjusting the target picture, rather than generating pictures, improves the efficiency with which the electronic device generates the target image and reduces its power consumption.
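The recording step might look like the following sketch. Everything here is an assumption for illustration — the function names, the stand-in capture routine, and the toy 1x2 readout are not from the patent; the point is only that raw per-pixel channel values are kept instead of encoded picture files.

```python
# Hypothetical recording of raw RGB readouts per exposure-parameter set,
# with no photo encoding step.

def record_channels(exposure_params, capture_fn):
    """Map each (shutter_s, aperture_f, iso) tuple to its per-pixel
    (R, G, B) readout, skipping photo generation entirely."""
    return {params: capture_fn(params) for params in exposure_params}

def fake_capture(params):
    """Stand-in for the sensor readout under a given exposure: returns a
    1x2 grid of (R, G, B) tuples whose toy values track the ISO."""
    shutter, aperture, iso = params
    base = int(iso)
    return [[(base, base, base), (base, base, base)]]

recorded = record_channels([(1/35, 10.6, 90), (1/28, 9.7, 77)], fake_capture)
```

The dictionary keyed by exposure parameters lets the later selection step (step 414) pull out exactly the channel values the user's instruction indicates.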
In step 412, a user selection instruction is received.
In this embodiment, the generated target photograph and the plurality of sets of other exposure parameters may be presented to the user, so that the user can adjust the bright and/or dark regions of the target photograph. Furthermore, when the plurality of sets of other exposure parameters are displayed, the overexposure parameters and underexposure parameters among them may be marked. The user can then select overexposure parameters and/or underexposure parameters according to their own needs, and the electronic device adjusts the target photograph according to the color channel values of the pixel points corresponding to the selected exposure parameters, thereby generating the target image for the subject.
In step 414, the color channel value of each corresponding pixel point is selected according to the exposure parameter indicated by the user selection instruction.
In this embodiment, after the target photograph has been generated and the color channel values recorded, the color channel values of the pixel points corresponding to the exposure parameter indicated by the user selection instruction may be selected, and the target photograph is adjusted according to the selected color channel values to generate the target image for the current subject.
In step 416, a target image is generated.
In this embodiment, when adjusting the target photograph according to the selected color channel values to generate the target image, the photograph taken under the normal exposure parameters (i.e., the target photograph) may be adjusted according to the dark region of the overexposed image data (here, the color channel values of the pixel points obtained under the overexposure parameters) and the bright region of the underexposed image data (here, the color channel values of the pixel points obtained under the underexposure parameters), so as to generate the target image. This process is similar to step 214 described above and will not be described further here.
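One plausible reading of this adjustment is a per-pixel fusion: dark pixels of the normal photograph borrow detail from the overexposed capture, bright pixels from the underexposed one. The sketch below is a minimal interpretation under stated assumptions; the thresholds, the Rec. 601 luma weights, and all names are assumptions, not the patent's actual algorithm.

```python
# Minimal sketch of the fusion described here: pixels of the normal
# photograph that are too dark take values from the overexposed capture,
# pixels that are too bright take values from the underexposed capture.
# Thresholds, the Rec. 601 luma weights, and all names are assumptions.

def luma(pixel):
    r, g, b = pixel
    return 0.299 * r + 0.587 * g + 0.114 * b  # Rec. 601 luma

def fuse(normal, over, under, dark_thr=40, bright_thr=220):
    """Per-pixel fusion of three equally sized RGB images (lists of rows)."""
    fused = []
    for row_n, row_o, row_u in zip(normal, over, under):
        fused_row = []
        for p_n, p_o, p_u in zip(row_n, row_o, row_u):
            y = luma(p_n)
            if y < dark_thr:        # dark region: recover overexposed detail
                fused_row.append(p_o)
            elif y > bright_thr:    # bright region: recover underexposed detail
                fused_row.append(p_u)
            else:                   # well-exposed: keep the normal pixel
                fused_row.append(p_n)
        fused.append(fused_row)
    return fused
```

A production implementation would blend smoothly rather than switch per pixel, but the hard threshold keeps the region-replacement idea visible.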
As can be seen from the foregoing embodiments, according to the present disclosure, by obtaining other exposure parameters similar to the exposure parameters adopted when the current subject is shot (i.e., the normal exposure parameters), and further obtaining the image data captured under those other exposure parameters (i.e., overexposed and underexposed image data) for selection by the user, the user can flexibly select overexposed and/or underexposed image data according to their own needs to adjust the image data obtained under the normal exposure parameters, so that loss of detail in the finally generated image due to excessive brightness or darkness can be effectively avoided.
Corresponding to the foregoing embodiments of the image generation method, the present disclosure also provides embodiments of an image generation apparatus.
Fig. 5 is a block diagram illustrating an image generation apparatus according to an exemplary embodiment. Referring to fig. 5, the apparatus includes:
an acquisition unit 51 configured to acquire an exposure parameter corresponding to a current subject and a plurality of sets of other exposure parameters close to the exposure parameter;
a selecting unit 52 configured to select, according to a received user selection instruction, the image data corresponding to the exposure parameter indicated by the user selection instruction, where the user selection instruction is used to indicate the exposure parameter selected by the user from the plurality of groups of other exposure parameters;
a generating unit 53 configured to synthesize the image data corresponding to the exposure parameter with the selected image data to generate a target image for the current subject.
As shown in fig. 6, fig. 6 is a block diagram of another image generation apparatus according to an exemplary embodiment, which is based on the foregoing embodiment shown in fig. 5, and the obtaining unit 51 may include:
an acquisition sub-unit 511 configured to acquire photometric data corresponding to a current subject;
a first determining subunit 512 configured to determine an exposure parameter corresponding to the photometry data from a photometry data table;
a second determining subunit 513 configured to determine other photometric data in the photometric data table whose difference from the photometric data is within a preset range, and read an exposure parameter recorded in the photometric data table corresponding to the other photometric data.
As shown in fig. 7, fig. 7 is a block diagram of another image generation apparatus according to an exemplary embodiment, which is based on the foregoing embodiment shown in fig. 5, and further includes:
a first photograph generation unit 54 configured to capture the current subject in accordance with the exposure parameters to generate a target photograph, and capture the current subject in accordance with each of the plurality of other exposure parameters and generate a photograph;
the selecting unit 52 includes: a first selecting subunit 521 configured to select a picture corresponding to the exposure parameter indicated by the user selection instruction;
the generation unit 53 includes: a synthesizing subunit 531 configured to synthesize the target photograph with the selected photograph to generate the target image for the current subject.
It should be noted that, the structures of the first photo generation unit 54, the first selection subunit 521 and the synthesis subunit 531 in the apparatus embodiment shown in fig. 7 may also be included in the apparatus embodiment of fig. 6, and the disclosure is not limited thereto.
As shown in fig. 8, fig. 8 is a block diagram of another image generation apparatus according to an exemplary embodiment, which is based on the foregoing embodiment shown in fig. 5, and further includes:
a second photo generation unit 55 configured to capture the current subject according to the exposure parameters to generate a target photo, capture the current subject according to the plurality of other exposure parameters, and record color channel values of each pixel point obtained by capturing the current subject under each exposure parameter;
the selecting unit 52 includes: a second selecting subunit 522 configured to select a color channel value of each pixel point corresponding to the exposure parameter indicated by the user selection instruction;
the generation unit 53 includes: a first adjusting subunit 532, configured to adjust the target photograph according to the selected color channel value, so as to generate the target image for the current subject.
It should be noted that, the structures of the second photo generation unit 55, the second selection subunit 522, and the first adjustment subunit 532 in the apparatus embodiment shown in fig. 8 may also be included in the apparatus embodiment shown in fig. 6, and the disclosure is not limited thereto.
As shown in fig. 9, fig. 9 is a block diagram of another image generation apparatus according to an exemplary embodiment, which is based on the foregoing embodiment shown in fig. 5, and the generation unit 53 may include:
an identifying subunit 533 configured to identify a dark portion region of the overexposed image data and a bright portion region of the underexposed image data in the selected image data;
a second adjusting subunit 534 configured to adjust the image data corresponding to the exposure parameter according to the dark area and the bright area to generate a target image for the current subject.
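What the identifying subunit 533 computes might be pictured as a pair of boolean region masks. The sketch below is a hedged illustration only; the thresholds, the per-pixel mean as a brightness measure, and all names are assumptions rather than the patent's definition.

```python
# Hedged sketch of the identifying subunit: boolean masks marking the dark
# region of overexposed image data and the bright region of underexposed
# image data. Thresholds, brightness measure, and names are assumptions.

def mean_channel(pixel):
    """Average of the R, G, B channels as a simple brightness measure."""
    return sum(pixel) / 3.0

def region_masks(overexposed, underexposed, dark_thr=40, bright_thr=220):
    """Return (dark_mask, bright_mask) for RGB images given as row lists."""
    dark = [[mean_channel(p) < dark_thr for p in row] for row in overexposed]
    bright = [[mean_channel(p) > bright_thr for p in row] for row in underexposed]
    return dark, bright
```

The second adjusting subunit 534 could then consult these masks to decide which pixels of the normally exposed image to replace.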
It should be noted that the structures of the identifying subunit 533 and the second adjusting subunit 534 in the apparatus embodiment shown in fig. 9 may also be included in the apparatus embodiments of figs. 6 to 8, and the disclosure is not limited thereto.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
For the apparatus embodiments, since they substantially correspond to the method embodiments, reference may be made to the corresponding descriptions of the method embodiments for relevant details. The apparatus embodiments described above are merely illustrative: the units described as separate parts may or may not be physically separate, and components displayed as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the disclosed solution. One of ordinary skill in the art can understand and implement this without inventive effort.
Correspondingly, the present disclosure also provides an image generating apparatus, including: a processor; and a memory for storing processor-executable instructions; wherein the processor is configured to implement the image generation method of any of the above embodiments. For example, the method may comprise: acquiring an exposure parameter corresponding to a current subject and a plurality of groups of other exposure parameters similar to the exposure parameter; selecting, according to a received user selection instruction, the image data corresponding to the exposure parameter indicated by the instruction, wherein the user selection instruction is used for indicating the exposure parameter selected by the user from the plurality of groups of other exposure parameters; and synthesizing the image data corresponding to the exposure parameter with the selected image data to generate a target image for the current subject.
Accordingly, the present disclosure also provides a terminal, which includes a memory and one or more programs, where the one or more programs are stored in the memory and configured to be executed by one or more processors, and the one or more programs include instructions for implementing the image generation method described in any of the above embodiments. For example, the method may include: acquiring an exposure parameter corresponding to a current subject and a plurality of groups of other exposure parameters similar to the exposure parameter; selecting, according to a received user selection instruction, the image data corresponding to the exposure parameter indicated by the instruction, wherein the user selection instruction is used for indicating the exposure parameter selected by the user from the plurality of groups of other exposure parameters; and synthesizing the image data corresponding to the exposure parameter with the selected image data to generate a target image for the current subject.
Fig. 10 is a block diagram illustrating an apparatus 1000 for image generation according to an exemplary embodiment. For example, the apparatus 1000 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 10, the apparatus 1000 may include one or more of the following components: processing component 1002, memory 1004, power component 1006, multimedia component 1008, audio component 1010, input/output (I/O) interface 1012, sensor component 1014, and communications component 1016.
The processing component 1002 generally controls the overall operation of the device 1000, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 1002 may include one or more processors 1020 to execute instructions to perform all or a portion of the steps of the methods described above. Further, processing component 1002 may include one or more modules that facilitate interaction between processing component 1002 and other components. For example, the processing component 1002 may include a multimedia module to facilitate interaction between the multimedia component 1008 and the processing component 1002.
The memory 1004 is configured to store various types of data to support operations at the apparatus 1000. Examples of such data include instructions for any application or method operating on device 1000, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 1004 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power supply component 1006 provides power to the various components of the device 1000. The power components 1006 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device 1000.
The multimedia component 1008 includes a screen that provides an output interface between the device 1000 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 1008 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the device 1000 is in an operating mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 1010 is configured to output and/or input audio signals. For example, audio component 1010 includes a Microphone (MIC) configured to receive external audio signals when apparatus 1000 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signal may further be stored in the memory 1004 or transmitted via the communication component 1016. In some embodiments, audio component 1010 also includes a speaker for outputting audio signals.
I/O interface 1012 provides an interface between processing component 1002 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 1014 includes one or more sensors for providing various aspects of status assessment for the device 1000. For example, sensor assembly 1014 may detect an open/closed state of device 1000, the relative positioning of components, such as a display and keypad of device 1000, the change in position of device 1000 or a component of device 1000, the presence or absence of user contact with device 1000, the orientation or acceleration/deceleration of device 1000, and the change in temperature of device 1000. The sensor assembly 1014 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 1014 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 1014 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 1016 is configured to facilitate communications between the apparatus 1000 and other devices in a wired or wireless manner. The device 1000 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 1016 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communications component 1016 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 1000 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer readable storage medium comprising instructions, such as the memory 1004 comprising instructions, executable by the processor 1020 of the device 1000 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (12)

1. An image generation method, comprising:
acquiring an exposure parameter corresponding to a current shot object and a plurality of groups of other exposure parameters similar to the exposure parameter;
selecting image data corresponding to the exposure parameters indicated by the user selection instruction according to the received user selection instruction, wherein the user selection instruction is used for indicating the exposure parameters selected by the user from the multiple groups of other exposure parameters;
synthesizing the image data corresponding to the exposure parameter with the selected image data to generate a target image for the current subject.
2. The method according to claim 1, wherein the acquiring of the exposure parameter corresponding to the current subject and the plurality of sets of other exposure parameters similar to the exposure parameter comprises:
acquiring photometric data corresponding to a current subject;
determining an exposure parameter corresponding to the photometric data according to a preconfigured photometric data table;
and determining other photometric data in the photometric data table whose difference from the photometric data is within a preset range, and reading the exposure parameters recorded in the photometric data table corresponding to the other photometric data.
3. The method of claim 1,
further comprising: shooting the current shot object according to the exposure parameters to generate a target photo, and shooting the current shot object according to each exposure parameter in the multiple groups of other exposure parameters to generate a photo;
the selecting the image data corresponding to the exposure parameter indicated by the user selection instruction according to the received user selection instruction comprises: selecting a photo corresponding to the exposure parameter indicated by the user selection instruction;
the synthesizing of the image data corresponding to the exposure parameter with the selected image data to generate a target image for the current subject includes: synthesizing the target photograph and the selected photograph to generate the target image for the current subject.
4. The method of claim 1,
further comprising: shooting the current shot object according to the exposure parameters to generate a target picture, shooting the current shot object according to the multiple groups of other exposure parameters, and recording color channel values of pixel points obtained by shooting the current shot object under the exposure parameters;
the selecting the image data corresponding to the exposure parameter indicated by the user selection instruction according to the received user selection instruction comprises: selecting a color channel value of each pixel point corresponding to the exposure parameter indicated by the user selection instruction;
the synthesizing of the image data corresponding to the exposure parameter with the selected image data to generate a target image for the current subject includes: adjusting the target photograph according to the selected color channel values to generate the target image for the current subject.
5. The method of claim 1, wherein the synthesizing of the image data corresponding to the exposure parameters with the selected image data to generate the target image for the current subject comprises:
identifying a dark region of overexposed image data and a bright region of underexposed image data in the selected image data;
and adjusting image data corresponding to the exposure parameters according to the dark part area and the bright part area to generate a target image aiming at the current shot object.
6. An image generation apparatus, comprising:
an acquisition unit that acquires an exposure parameter corresponding to a current subject and a plurality of groups of other exposure parameters that are close to the exposure parameter;
the selecting unit is used for selecting image data corresponding to the exposure parameter indicated by the user selecting instruction according to the received user selecting instruction, wherein the user selecting instruction is used for indicating the exposure parameter selected by the user from the plurality of groups of other exposure parameters;
a generation unit that synthesizes the image data corresponding to the exposure parameter with the selected image data to generate a target image for the current subject.
7. The apparatus of claim 6, wherein the obtaining unit comprises:
an acquisition subunit that acquires photometric data corresponding to a current subject;
a first determining subunit that determines an exposure parameter corresponding to the photometry data from a photometry data table;
and a second determining subunit that determines other photometric data in the photometric data table whose difference from the photometric data is within a preset range, and reads the exposure parameters corresponding to the other photometric data recorded in the photometric data table.
8. The apparatus of claim 6,
further comprising: a first photo generation unit which takes the current subject according to the exposure parameters to generate a target photo, and takes the current subject according to each exposure parameter of the plurality of other exposure parameters and generates a photo;
the selecting unit comprises: the first selecting subunit selects a photo corresponding to the exposure parameter indicated by the user selecting instruction;
the generation unit includes: and the synthesis subunit synthesizes the target photo and the selected photo to generate the target image aiming at the current shot object.
9. The apparatus of claim 6,
further comprising: the second photo generation unit is used for shooting the current shot object according to the exposure parameters to generate a target photo, shooting the current shot object according to the multiple groups of other exposure parameters, and recording color channel values of all pixel points obtained by shooting the current shot object under all exposure parameters;
the selecting unit comprises: the second selection subunit selects the color channel value of each pixel point corresponding to the exposure parameter indicated by the user selection instruction;
the generation unit includes: and the first adjusting subunit adjusts the target picture according to the selected color channel value to generate the target image for the current shot object.
10. The apparatus of claim 6, wherein the generating unit comprises:
an identifying subunit that identifies a dark region of the overexposed image data and a bright region of the underexposed image data in the selected image data;
a second adjusting subunit that adjusts the image data corresponding to the exposure parameter according to the dark area and the bright area to generate a target image for the current subject.
11. An electronic device, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor implements the method of any one of claims 1-5 by executing the executable instructions.
12. A computer-readable storage medium having stored thereon computer instructions, which when executed by a processor, perform the steps of the method according to any one of claims 1-5.
CN201811413079.5A 2018-11-23 2018-11-23 Image generation method and device, electronic equipment and computer readable storage medium Active CN111225158B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811413079.5A CN111225158B (en) 2018-11-23 2018-11-23 Image generation method and device, electronic equipment and computer readable storage medium


Publications (2)

Publication Number Publication Date
CN111225158A true CN111225158A (en) 2020-06-02
CN111225158B CN111225158B (en) 2021-10-22

Family

ID=70830477

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811413079.5A Active CN111225158B (en) 2018-11-23 2018-11-23 Image generation method and device, electronic equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN111225158B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102075687A (en) * 2009-11-25 2011-05-25 奥林巴斯映像株式会社 Imaging device and imaging device control method
CN103973958A (en) * 2013-01-30 2014-08-06 阿里巴巴集团控股有限公司 Image processing method and image processing equipment
CN106572311A (en) * 2016-11-11 2017-04-19 努比亚技术有限公司 Shooting apparatus and method thereof
CN106791380A (en) * 2016-12-06 2017-05-31 周民 The image pickup method and device of a kind of vivid photograph
CN106878624A (en) * 2015-12-10 2017-06-20 奥林巴斯株式会社 Camera head and image capture method
CN107277387A (en) * 2017-07-26 2017-10-20 维沃移动通信有限公司 High dynamic range images image pickup method, terminal and computer-readable recording medium
CN107392877A (en) * 2017-07-11 2017-11-24 中国科学院电子学研究所苏州研究院 A kind of single polarization diameter radar image puppet coloured silkization method
CN107566749A (en) * 2017-09-25 2018-01-09 维沃移动通信有限公司 Image pickup method and mobile terminal
CN107809579A (en) * 2016-09-09 2018-03-16 奥林巴斯株式会社 Camera device and image capture method
WO2018190649A1 (en) * 2017-04-12 2018-10-18 Samsung Electronics Co., Ltd. Method and apparatus for generating hdr images


Also Published As

Publication number Publication date
CN111225158B (en) 2021-10-22

Similar Documents

Publication Publication Date Title
CN108419016B (en) Shooting method and device and terminal
CN109345485B (en) Image enhancement method and device, electronic equipment and storage medium
CN106210496B (en) Photo shooting method and device
CN111586282B (en) Shooting method, shooting device, terminal and readable storage medium
JP6538079B2 (en) Imaging parameter setting method, apparatus, program, and recording medium
CN106131441B (en) Photographing method and device and electronic equipment
CN107463052B (en) Shooting exposure method and device
CN109922252B (en) Short video generation method and device and electronic equipment
CN108040204B (en) Image shooting method and device based on multiple cameras and storage medium
CN111385456A (en) Photographing preview method and device and storage medium
CN110677734A (en) Video synthesis method and device, electronic equipment and storage medium
CN111953904A (en) Shooting method, shooting device, electronic equipment and storage medium
CN111953903A (en) Shooting method, shooting device, electronic equipment and storage medium
CN111343386B (en) Image signal processing method and device, electronic device and storage medium
CN111586280B (en) Shooting method, shooting device, terminal and readable storage medium
CN108156381B (en) Photographing method and device
US11617023B2 (en) Method for brightness enhancement of preview image, apparatus, and medium
CN111461950A (en) Image processing method and device
CN111225158B (en) Image generation method and device, electronic equipment and computer readable storage medium
CN111277754B (en) Mobile terminal shooting method and device
CN111586296B (en) Image capturing method, image capturing apparatus, and storage medium
CN111698414B (en) Image signal processing method and device, electronic device and readable storage medium
CN111835977B (en) Image sensor, image generation method and device, electronic device, and storage medium
CN110874829B (en) Image processing method and device, electronic device and storage medium
CN112752010B (en) Shooting method, device and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant