CN111835941B - Image generation method and device, electronic equipment and computer readable storage medium - Google Patents


Info

Publication number
CN111835941B
CN111835941B (application CN201910313742.2A)
Authority
CN
China
Prior art keywords
pixel
image data
image
pixels
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910313742.2A
Other languages
Chinese (zh)
Other versions
CN111835941A (en)
Inventor
豆子飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN201910313742.2A priority Critical patent/CN111835941B/en
Publication of CN111835941A publication Critical patent/CN111835941A/en
Application granted granted Critical
Publication of CN111835941B publication Critical patent/CN111835941B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50: Constructional details
    • H04N23/54: Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M1/00: Substation equipment, e.g. for use by subscribers
    • H04M1/02: Constructional features of telephone sets
    • H04M1/0202: Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/026: Details of the structure or mounting of specific components
    • H04M1/0264: Details of the structure or mounting of specific components for a camera module assembly
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70: Circuitry for compensating brightness variation in the scene
    • H04N23/73: Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70: Circuitry for compensating brightness variation in the scene
    • H04N23/741: Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70: Circuitry for compensating brightness variation in the scene
    • H04N23/743: Bracketing, i.e. taking a series of images with varying exposure conditions

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Studio Devices (AREA)
  • Color Television Image Signal Generators (AREA)

Abstract

The disclosure relates to an image generation method and apparatus, an electronic device, and a computer-readable storage medium, applied to an electronic device in which the pixel array of an image sensor is divided into a plurality of pixel units, each pixel unit comprising a group of pixels. The method comprises: in response to a received shooting instruction, respectively acquiring multiple frames of first image data, generated by the image sensor based on a first pixel arrangement mode with different exposure durations, and at least one frame of second image data generated based on a second pixel arrangement mode, wherein the pixel data corresponding to the same pixel unit in the first image data belong to the same color component and the corresponding pixel data in the second image data belong to a plurality of color components; and fusing the first image data and the second image data to generate a final image.

Description

Image generation method and device, electronic equipment and computer readable storage medium
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image generation method and apparatus, an electronic device, and a computer-readable storage medium.
Background
At present, users have increasingly high requirements for the photographing quality of electronic devices such as mobile phones. In the related art, shooting quality is improved mainly by upgrading hardware parameters, for example by using an image sensor with more pixels and a higher-specification lens system, which enlarges the camera module and increases both power consumption and cost.
Disclosure of Invention
The present disclosure provides an image generation method and apparatus, an electronic device, and a computer-readable storage medium to solve the deficiencies in the related art.
According to a first aspect of the embodiments of the present disclosure, an image generating method is provided, which is applied to an electronic device, where a pixel array of an image sensor included in the electronic device is divided into a plurality of pixel units, and each pixel unit includes a group of pixels; the method comprises the following steps:
in response to a received shooting instruction, respectively acquiring multiple frames of first image data, generated by the image sensor based on a first pixel arrangement mode with different exposure durations, and at least one frame of second image data generated based on a second pixel arrangement mode; wherein the pixel data corresponding to the same pixel unit in the first image data belong to the same color component, and the corresponding pixel data in the second image data belong to a plurality of color components;
fusing the first image data and the second image data to generate a final image.
Optionally, the fusing the first image data and the second image data to generate a final image includes:
determining a target resolution;
adjusting the first image data and the second image data to the target resolution respectively, and then fusing them to generate the final image.
Optionally, the determining the target resolution includes:
determining the target resolution according to an illumination intensity, wherein the target resolution is positively correlated with the illumination intensity; or, alternatively,
determining a resolution set by the user as the target resolution.
Optionally, in the first pixel arrangement mode, the pixel units are arranged according to a standard bayer array.
Optionally, in the second pixel arrangement, the pixels included in the same pixel unit are arranged according to a standard bayer array.
Optionally, the pixels on the image sensor are arranged according to a Quad Bayer array.
According to a second aspect of the embodiments of the present disclosure, an image generating apparatus is provided, which is applied to an electronic device including a pixel array of an image sensor divided into a plurality of pixel units, each pixel unit including a group of pixels; the device comprises:
an acquisition unit, configured to, in response to a received shooting instruction, respectively acquire multiple frames of first image data, generated by the image sensor based on a first pixel arrangement mode with different exposure durations, and at least one frame of second image data generated based on a second pixel arrangement mode; wherein the pixel data corresponding to the same pixel unit in the first image data belong to the same color component, and the corresponding pixel data in the second image data belong to a plurality of color components;
a fusion unit, configured to fuse the first image data and the second image data to generate a final image.
Optionally, the fusion unit includes:
a determining subunit, configured to determine a target resolution;
a fusion subunit, configured to adjust the first image data and the second image data to the target resolution respectively and then fuse them to generate the final image.
Optionally, the determining subunit is specifically configured to:
determining the target resolution according to an illumination intensity, wherein the target resolution is positively correlated with the illumination intensity; or, alternatively,
determining a resolution set by the user as the target resolution.
Optionally, in the first pixel arrangement mode, the pixel units are arranged according to a standard bayer array.
Optionally, in the second pixel arrangement, the pixels included in the same pixel unit are arranged according to a standard bayer array.
Optionally, the pixels on the image sensor are arranged according to a Quad Bayer array.
According to a third aspect of the embodiments of the present disclosure, there is provided an electronic device including a pixel array of an image sensor divided into a number of pixel units, each pixel unit including a group of pixels; the electronic device further includes:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the method as described in any of the embodiments of the first aspect above.
According to a fourth aspect of embodiments of the present disclosure, there is provided a computer-readable storage medium having stored thereon computer instructions, characterized in that the instructions, when executed by a processor, implement the steps of the method as set forth in any of the embodiments of the first aspect described above.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
the method and the device enable the fused final image to have a high dynamic range by generating the multi-frame first image data with different exposure time lengths. Meanwhile, as the corresponding pixel data of the same pixel unit in the first image data belong to the same color component, when the pixel data of the same pixel unit is fused into a single combined pixel, the single combined pixel can be fused with the brightness information contained in each pixel data, so as to obtain relatively better dark light expression. Meanwhile, the corresponding pixel data in the second image data belong to a plurality of color components, so that the second image data contains relatively more detail information and has relatively higher image resolution. Therefore, by fusing the second image data with the multi-frame first image data, the obtained final image can have better dim light expression, high dynamic range and high resolution, and thus better image shooting quality can be achieved without increasing the number of pixels of the image sensor or upgrading the lens system.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a schematic diagram illustrating an array of pixels on an image sensor in accordance with an exemplary embodiment.
FIG. 2 is a flow chart illustrating a method of image generation according to an exemplary embodiment.
FIG. 3 is a schematic diagram illustrating one type of generation of image data according to an exemplary embodiment.
FIG. 4 is a flow chart illustrating another method of image generation according to an exemplary embodiment.
Fig. 5 is a block diagram illustrating an image generation apparatus according to an exemplary embodiment.
Fig. 6 is a block diagram illustrating another image generation apparatus according to an exemplary embodiment.
Fig. 7 is a schematic structural diagram illustrating an apparatus for image generation according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described below do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information and, similarly, second information may also be referred to as first information, without departing from the scope of the present application. The word "if" as used herein may, depending on the context, be interpreted as "upon" or "when" or "in response to determining".
FIG. 1 is a schematic diagram illustrating an array of pixels on an image sensor according to an exemplary embodiment. As shown in fig. 1, the image sensor may include 64 pixels, each represented by a square, arranged sequentially in an 8 × 8 pixel array; of course, the pixel array need not be square and may include other numbers of pixels, which this disclosure does not limit. A pixel labeled with the letter R corresponds to the red component, a pixel labeled G to the green component, and a pixel labeled B to the blue component; other color components may be present in other embodiments, and the disclosure is not limited in this regard.
The pixel array shown in fig. 1 is divided into a plurality of pixel units, each containing a group of pixels. For example, along the two pixel arrangement directions, the pixel array is divided by lines L1, L2, L3 and lines K1, K2, K3 respectively, forming the 16 pixel units shown in fig. 1, each pixel unit containing 4 pixels arranged in a 2 × 2 square. Of course, the pixel units may be divided in other ways; for example, a pixel unit need not be square and may contain other numbers of pixels, which this disclosure does not limit.
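As an illustrative sketch (in Python with NumPy, not part of the patent disclosure), the division of an 8 × 8 mosaic into 2 × 2 pixel units can be expressed as a reshape; the specific unit-level color layout below is an assumption:

```python
import numpy as np

def split_into_units(pixel_array, unit=2):
    # Divide an H x W pixel array into (H/unit) x (W/unit) pixel units of
    # unit x unit pixels each, mirroring the division by lines L1-L3 and K1-K3.
    h, w = pixel_array.shape
    assert h % unit == 0 and w % unit == 0
    return pixel_array.reshape(h // unit, unit, w // unit, unit).swapaxes(1, 2)

# A hypothetical 8 x 8 Quad Bayer mosaic: every 2 x 2 unit shares one color.
quad_bayer = np.array([list("GGRRGGRR"),
                       list("GGRRGGRR"),
                       list("BBGGBBGG"),
                       list("BBGGBBGG")] * 2)

units = split_into_units(quad_bayer)
print(units.shape)  # (4, 4, 2, 2): 16 units of 2 x 2 pixels each
```

Indexing `units[i, j]` then yields the 2 × 2 group of pixels for one pixel unit, all of which carry the same color component.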
From the embodiment shown in fig. 1 it can be seen that: by dividing the pixels on the image sensor into pixel units, image data with different characteristics can be acquired by switching between pixel arrangement modes, so that the final image generated by fusion can combine multiple characteristics, and image quality can be significantly optimized without replacing the image sensor or the lens system.
Fig. 2 is a flowchart illustrating an image generation method according to an exemplary embodiment, and as shown in fig. 2, the method is applied to an electronic device, the electronic device includes a pixel array of an image sensor that is divided into a plurality of pixel units, each pixel unit includes a group of pixels (the pixel array of the image sensor may be, for example, the embodiment shown in fig. 1), and the method may include the following steps:
In step 202, in response to a received shooting instruction, multiple frames of first image data, generated by the image sensor based on a first pixel arrangement mode with different exposure durations, and at least one frame of second image data, generated based on a second pixel arrangement mode, are respectively acquired; the pixel data corresponding to the same pixel unit in the first image data belong to the same color component, and the corresponding pixel data in the second image data belong to a plurality of color components.
In an embodiment, a user may issue a shooting instruction to the electronic device by starting a camera APP (application) on the device and then triggering a capture action in the APP, for example by tapping a virtual "shoot" key displayed on the phone screen, pressing a physical key (a single key or a combination of keys), or issuing a voice command.
In one embodiment, the electronic device may support multiple shooting modes, such as a normal mode and a high quality mode. In the normal mode, the electronic device may implement a photographing operation based on a technical solution in the related art; in the high-quality mode, the electronic device may acquire and combine the first image data and the second image data based on the technical solution of the present disclosure to obtain a high-quality final image (the quality of the final image captured in the high-quality mode is relatively higher than that of the image captured in the normal mode).
In one embodiment, the electronic device may provide the user with a corresponding mode option, so that the user may select to adopt a normal mode, a high quality mode, or another mode according to actual needs.
In an embodiment, the electronic device may be provided with an AI (artificial intelligence) system, so that the electronic device automatically determines a currently applicable shooting mode according to the detected ambient brightness, a shot scene, a type of a shot object, and the like, and further automatically switches to the mode or provides recommended information to the user.
In an embodiment, since the pixel data corresponding to the same pixel unit in the first image data belong to the same color component, the pixel data of the pixel unit can be fused into a single pixel (whose size matches that of the original pixel unit), so that the fused new pixel acquires the luminance information sensed by all the pixels contained in the original pixel unit, and thus achieves relatively better low-light performance than an original single pixel.
In the first pixel arrangement mode, the pixel units may be arranged according to a standard Bayer array, so that after the pixel data corresponding to each pixel unit are fused into a corresponding single pixel, the first image data as a whole can achieve relatively better low-light performance. For example, when the pixels on the image sensor are arranged as shown in fig. 1, the arrangement is in fact the Quad Bayer array of the related art, and the pixel units under the first pixel arrangement mode may be fused in the four-in-one (4-in-1) manner of the related art to obtain the first image data. For example, FIG. 3 is a schematic diagram illustrating one type of generation of image data according to an exemplary embodiment; as shown in fig. 3, the left side is the initial image data sensed by the pixel array arranged in the Quad Bayer array, and the color information of each pixel in the initial image data (a pixel in the image) is consistent with the color component of the corresponding pixel (a pixel on the image sensor) in the pixel array shown in fig. 1. The first image data 31 can then be obtained by fusing the pixels contained in each pixel unit in the four-in-one manner.
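A minimal sketch of the four-in-one fusion (Python with NumPy; summing the four readings is an assumption here, as a sensor may instead average them):

```python
import numpy as np

def four_in_one(raw):
    # Fuse each 2 x 2 pixel unit of a Quad Bayer frame into one combined pixel,
    # producing first image data at a quarter of the original resolution.
    h, w = raw.shape
    return raw.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))

raw = np.arange(16, dtype=np.float64).reshape(4, 4)
binned = four_in_one(raw)
print(binned.tolist())  # [[10.0, 18.0], [42.0, 50.0]]
```

Each output pixel carries the combined signal of four same-color photosites, which is the source of the improved low-light performance described above.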
In an embodiment, multiple frames of first image data with different exposure durations are respectively acquired, so that each frame obtains relatively better low-light performance, while fusing the multiple frames into the final image based on their different exposure durations gives the final image a relatively higher dynamic range. In other words, based on the multiple frames of first image data obtained in this manner, the final image can have both a high dynamic range and good low-light performance.
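The multi-frame idea can be sketched as follows (Python with NumPy). The disclosure does not prescribe a fusion algorithm, so the exposure normalisation and well-exposedness weighting below are purely illustrative assumptions:

```python
import numpy as np

def fuse_exposures(frames, exposures):
    # Merge frames of first image data captured with different exposure
    # durations: normalise each frame by its exposure duration, then average
    # with weights that favour mid-range (well-exposed) pixel values.
    acc = np.zeros_like(np.asarray(frames[0], dtype=np.float64))
    wsum = np.zeros_like(acc)
    for f, t in zip(frames, exposures):
        f = np.asarray(f, dtype=np.float64)
        w = 1.0 - 2.0 * np.abs(f / 255.0 - 0.5) + 1e-6  # well-exposedness weight
        acc += w * (f / t)                              # exposure-normalised value
        wsum += w
    return acc / wsum

long_exp = np.full((2, 2), 200.0)   # captured with exposure duration 2
short_exp = np.full((2, 2), 100.0)  # captured with exposure duration 1
hdr = fuse_exposures([long_exp, short_exp], [2.0, 1.0])
print(hdr[0, 0])  # ~100.0: both frames imply the same underlying radiance
```

A saturated pixel in the long exposure gets a small weight, so its value is recovered mainly from the short exposure, which is what extends the dynamic range.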
In an embodiment, because the corresponding pixel data in the second image data belong to a plurality of color components, the second image data can contain more detail than the first image data, i.e. it has a relatively higher image resolution. Therefore, by fusing the second image data with the first image data, the final image inherits the high dynamic range and good low-light performance of the first image data on one hand, and the high resolution of the second image data on the other, so that image quality is significantly improved at the same hardware level.
In an embodiment, in the second pixel arrangement mode, the pixels included in the same pixel unit may be arranged according to a standard Bayer array. Taking fig. 3 as an example, the left side is the initial image data described above, and the color information corresponding to each pixel in the initial image data can be adjusted by algorithmic processing, so that the pixels contained in each pixel unit are rearranged according to the standard Bayer array, instead of each pixel unit corresponding to a single color component.
With reference to the reference numerals in fig. 1, assume the coordinates of a pixel are (x, y), where x is the column index (left-right direction) and y is the row index (up-down direction); the same coordinates apply to the initial image data shown in fig. 3. When adjusting the color information of a pixel, a target pixel is first determined; for example, the target pixel has coordinates (3, 4), i.e. the blue-component pixel in the 3rd column and 4th row of the initial image data, circled in the figure. Next, the neighboring pixels of the target pixel are determined, where "neighboring" may cover the four directions up, down, left, and right, or additionally the four diagonal directions upper-left, lower-left, upper-right, and lower-right. When all eight directions are included, the neighboring pixels of the target pixel are: (3, 3) above it, blue component; (3, 5) below it, green component; (2, 4) on its left, green component; (4, 4) on its right, blue component; (2, 3) at its upper-left, green component; (2, 5) at its lower-left, red component; (4, 3) at its upper-right, blue component; and (4, 5) at its lower-right, green component. The target pixel is then adjusted based on the color information of these neighboring pixels. For example, fig. 3 shows the second image data 32 resulting from processing the initial image data. The process of generating the second image data from the initial image data can be implemented using remosaic technology or other techniques in the related art, which the disclosure does not limit.
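The neighbour-based adjustment described above can be sketched as follows (Python with NumPy). Plain averaging of same-color neighbours is an assumption; production remosaic uses more elaborate directional interpolation:

```python
import numpy as np

def remosaic_pixel(raw, colors, x, y, want):
    # Estimate the value of color component `want` at target pixel (x, y) by
    # averaging the 8-connected neighbours that carry that component.
    h, w = raw.shape
    vals = []
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dx == 0 and dy == 0:
                continue
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and colors[ny, nx] == want:
                vals.append(raw[ny, nx])
    return sum(vals) / len(vals) if vals else raw[y, x]

colors = np.array([list("GBG"), list("BBG"), list("RGG")])  # toy 3 x 3 mosaic
raw = np.arange(9, dtype=np.float64).reshape(3, 3)
print(remosaic_pixel(raw, colors, 1, 1, "G"))  # (0 + 2 + 5 + 7 + 8) / 5 = 4.4
```

Applying this to every pixel whose native color differs from the standard Bayer pattern at its position yields the rearranged second image data.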
In step 204, the first image data and the second image data are fused to generate a final image.
In an embodiment, when the embodiment shown in fig. 3 is adopted, assuming the image sensor originally has 40 million effective pixels, multiple frames of 10-megapixel first image data can be generated in the four-in-one manner, and 40-megapixel second image data can be generated by the remosaic technology, so that a final image of up to 40 megapixels can be output, combining high dynamic range, good low-light performance, and high resolution.
As can be seen from the embodiments shown in figs. 2 and 3: by arranging the pixels on the image sensor in a Quad Bayer array, the pixels contained in each pixel unit can be combined in the 4-in-1 manner, so that a single pixel of the first image data gathers relatively more luminance information at the cost of some image resolution, achieving better low-light performance. Meanwhile, when the second image data is generated from the Quad Bayer image sensor by remosaic processing, it can contain relatively more detail and therefore achieve better image resolution. Further, based on the different exposure durations, the multiple frames of first image data capture different dynamic ranges. Therefore, by fusing the multiple frames of first image data with different exposure durations and the second image data, the final image can simultaneously have good low-light performance, high dynamic range, and high resolution without increasing the pixel count of the image sensor or upgrading the lens system, thus achieving better image shooting quality.
FIG. 4 is a flow chart illustrating another method of image generation according to an exemplary embodiment. As shown in fig. 4, the method applied to the electronic device may include the following steps:
in step 402, first image data and second image data are acquired, respectively.
In an embodiment, the multiple frames of first image data with different exposure durations and the second image data may be obtained as in the embodiment shown in fig. 2, which is not repeated here.
In step 404, a target resolution is determined.
In an embodiment, the electronic device may determine the target resolution based on the illumination intensity. Generally, the greater the illumination intensity, the more light the image sensor gathers within the same period, so a higher target resolution can be used while maintaining the same brightness; i.e., the target resolution can be positively correlated with the illumination intensity. Thresholds may apply at both ends: above a certain intensity the target resolution no longer increases (e.g. it has reached the maximum resolution of the image sensor), and below a certain intensity it no longer decreases (e.g. it has reached a predefined minimum resolution). For example, a correspondence table between illumination intensity and resolution may be compiled in advance, so that the target resolution can be looked up quickly from the currently detected illumination intensity; see table 1 below.
Illumination intensity (lux)    Resolution
<100                            2560 × 1920
150                             2000 × 1000
750                             5000 × 3000
>800                            5120 × 3840

TABLE 1
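The lookup can be sketched as a simple threshold scan (Python). The handling of intensities between the listed rows is an assumption, since table 1 only gives sample points:

```python
def target_resolution(lux):
    # Threshold lookup mirroring table 1 as printed: return the resolution of
    # the first row whose illumination bound is not below the measured
    # intensity, clamping at the top to the sensor's full resolution.
    table = [(100, (2560, 1920)),
             (150, (2000, 1000)),
             (750, (5000, 3000))]
    for bound, resolution in table:
        if lux <= bound:
            return resolution
    return (5120, 3840)  # above the last bound: full resolution

print(target_resolution(50))    # (2560, 1920)
print(target_resolution(1000))  # (5120, 3840)
```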
In an embodiment, the resolution set by the user may be determined as the target resolution. For example, the electronic device may present a selection interface to the user, where the selection interface includes several alternative resolutions, so that the user may select the target resolution according to actual needs.
In an embodiment, the electronic device may determine a preferred resolution according to information such as the illumination intensity (in the manner described above for determining the target resolution from the illumination intensity) and present it to the user, so that the user can decide whether to adopt the preferred resolution or another resolution as the target resolution.
In step 406, resolution adjustment is performed on the first image data and the second image data.
In step 408, a final image is generated.
In an embodiment, a target resolution may be determined; the first image data and the second image data are then respectively adjusted to the target resolution and fused to generate the final image.
In an embodiment, the image sensor originally has 40 million effective pixels; multiple frames of 10-megapixel first image data can be generated in the four-in-one manner, and 40-megapixel second image data can be generated by the remosaic technology. When the target resolution is 20 megapixels, the resolution of the first image data can be increased to 20 megapixels and the resolution of the second image data reduced to 20 megapixels, yielding adjusted first image data and adjusted second image data; a 20-megapixel final image is then obtained by fusing the adjusted first image data and the adjusted second image data.
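The adjust-then-fuse step can be sketched as follows (Python with NumPy). Nearest-neighbour rescaling and plain averaging are illustrative assumptions; a real pipeline would use higher-quality scaling and a weighted fusion:

```python
import numpy as np

def resize_nearest(img, out_h, out_w):
    # Nearest-neighbour rescale, standing in for a production-quality scaler.
    h, w = img.shape
    rows = np.arange(out_h) * h // out_h
    cols = np.arange(out_w) * w // out_w
    return img[rows][:, cols]

def fuse_at_target(first, second, target_hw):
    # Bring the lower-resolution first image data and the higher-resolution
    # second image data to the common target resolution, then fuse by averaging.
    th, tw = target_hw
    a = resize_nearest(np.asarray(first, dtype=np.float64), th, tw)
    b = resize_nearest(np.asarray(second, dtype=np.float64), th, tw)
    return (a + b) / 2.0

first = np.ones((2, 2))        # low-resolution binned frame
second = np.full((4, 4), 3.0)  # full-resolution remosaiced frame
print(fuse_at_target(first, second, (2, 2)).tolist())  # [[2.0, 2.0], [2.0, 2.0]]
```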
As can be seen from the embodiment shown in fig. 4: by detecting the illumination intensity, a suitable target resolution can be selected accordingly, and the fused final image adopts that target resolution. Thus, when the brightness requirement is met, raising the image resolution gives the final image relatively better resolving power; when brightness is insufficient, lowering the image resolution optimizes the shooting effect in dark environments. Therefore, based on detecting the illumination intensity and reasonably selecting the target resolution, generating the first image data and the second image data as in the above embodiments gives the final image a high dynamic range while balancing low-light performance against high resolution.
Corresponding to the foregoing embodiments of the image generation method, the present disclosure also provides embodiments of an image generation apparatus.
FIG. 5 is a block diagram illustrating an image generation apparatus according to an exemplary embodiment. Referring to fig. 5, the apparatus is applied to an electronic device in which a pixel array of an image sensor is divided into a plurality of pixel units, each pixel unit including a group of pixels; the apparatus comprises:
an acquiring unit 51, configured to acquire, in response to a received shooting instruction, multiple frames of first image data generated by the image sensor based on a first pixel arrangement mode and having different exposure durations, and at least one frame of second image data generated based on a second pixel arrangement mode; wherein the pixel data corresponding to the same pixel unit in the first image data belong to the same color component, and the corresponding pixel data in the second image data belong to a plurality of color components;
a fusion unit 52 for fusing the first image data and the second image data to generate a final image.
Optionally, in the first pixel arrangement mode, the pixel units are arranged according to a standard bayer array.
Optionally, in the second pixel arrangement, the pixels included in the same pixel unit are arranged according to a standard bayer array.
Optionally, the pixels on the image sensor are arranged according to a Quad Bayer array.
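On a Quad Bayer sensor, the first pixel arrangement mode (same color component per pixel unit) corresponds to four-in-one binning: each 2×2 same-color unit collapses to one pixel, yielding a quarter-resolution standard Bayer mosaic. The digital averaging below is a toy model (real sensors typically bin in the analog domain), and the complementary second mode would instead remosaic the same raw data into a full-resolution standard Bayer pattern, which is not shown here.

```python
import numpy as np

def four_in_one_binning(quad_raw):
    """Average each 2x2 same-color pixel unit of a Quad Bayer raw frame,
    producing a quarter-resolution standard-Bayer mosaic (the 'first
    pixel arrangement mode'). Toy digital model of analog binning."""
    h, w = quad_raw.shape
    # Split into (row-block, row-in-block, col-block, col-in-block).
    units = quad_raw.reshape(h // 2, 2, w // 2, 2).astype(np.uint32)
    return units.mean(axis=(1, 3)).astype(quad_raw.dtype)

quad = np.arange(16, dtype=np.uint16).reshape(4, 4)
binned = four_in_one_binning(quad)
print(binned.shape)  # (2, 2)
print(binned.tolist())  # [[2, 4], [10, 12]]
```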
As shown in fig. 6, fig. 6 is a block diagram of another image generation apparatus according to an exemplary embodiment, which is based on the foregoing embodiment shown in fig. 5, and the fusion unit 52 includes:
a determination subunit 521 that determines a target resolution;
and a fusion subunit 522, configured to adjust the first image data and the second image data to the target resolution, respectively, and then fuse them to generate the final image.
Optionally, the determining subunit 521 is specifically configured to:
determining the target resolution according to an illumination intensity, wherein the target resolution is positively correlated with the illumination intensity; or,
determining a resolution set by a user as the target resolution.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the disclosed solution. One of ordinary skill in the art can understand and implement it without inventive effort.
Correspondingly, the present disclosure also provides an image generating apparatus, including: an image sensor, a pixel array of which is divided into a plurality of pixel units, each pixel unit including a group of pixels; a processor; and a memory for storing processor-executable instructions; wherein the processor is configured to implement the image generation method of any of the above embodiments. For example, the method may comprise: in response to a received shooting instruction, respectively acquiring multiple frames of first image data generated by the image sensor based on a first pixel arrangement mode and having different exposure durations, and at least one frame of second image data generated based on a second pixel arrangement mode, wherein the pixel data corresponding to the same pixel unit in the first image data belong to the same color component and the corresponding pixel data in the second image data belong to a plurality of color components; and fusing the first image data and the second image data to generate a final image.
Correspondingly, the present disclosure further provides a terminal, including: an image sensor, a pixel array of which is divided into a plurality of pixel units, each pixel unit including a group of pixels; a memory; and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by one or more processors, and include instructions for implementing the image generation method of any of the above embodiments. For example, the method may comprise: in response to a received shooting instruction, respectively acquiring multiple frames of first image data generated by the image sensor based on a first pixel arrangement mode and having different exposure durations, and at least one frame of second image data generated based on a second pixel arrangement mode, wherein the pixel data corresponding to the same pixel unit in the first image data belong to the same color component and the corresponding pixel data in the second image data belong to a plurality of color components; and fusing the first image data and the second image data to generate a final image.
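Acquiring multiple first-image frames with different exposure durations, as recited above, implies a high-dynamic-range merge before or during fusion. Exposure-normalized averaging below is an assumed scheme for illustration; the disclosure does not prescribe a particular merge formula.

```python
import numpy as np

def hdr_merge(frames, exposures_ms):
    """Combine frames captured with different exposure durations into one
    high-dynamic-range estimate by exposure-normalized averaging.
    A minimal stand-in for the merge the disclosure leaves unspecified."""
    estimates = [f.astype(np.float64) / t for f, t in zip(frames, exposures_ms)]
    return np.mean(estimates, axis=0)

# A short exposure preserves highlights; a long exposure lifts shadows.
short = np.full((2, 2), 40.0)    # captured at 10 ms
long_ = np.full((2, 2), 160.0)   # captured at 40 ms
merged = hdr_merge([short, long_], [10.0, 40.0])
print(merged)  # [[4. 4.]
               #  [4. 4.]]
```

Both frames agree on the same normalized brightness here, so the merge recovers it exactly; in practice saturated or noisy pixels would be down-weighted.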
Fig. 7 is a block diagram illustrating an apparatus 700 for image generation according to an example embodiment. For example, the apparatus 700 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 7, apparatus 700 may include one or more of the following components: a processing component 702, a memory 704, a power component 706, a multimedia component 708, an audio component 710, an input/output (I/O) interface 712, a sensor component 714, and a communication component 716.
The processing component 702 generally controls overall operation of the device 700, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 702 may include one or more processors 720 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 702 may include one or more modules that facilitate interaction between the processing component 702 and other components. For example, the processing component 702 may include a multimedia module to facilitate interaction between the multimedia component 708 and the processing component 702.
The memory 704 is configured to store various types of data to support operations at the apparatus 700. Examples of such data include instructions for any application or method operating on device 700, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 704 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power supply component 706 provides power to the various components of the device 700. The power components 706 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the apparatus 700.
The multimedia component 708 includes a screen that provides an output interface between the device 700 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from a user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 708 includes a front camera and/or a rear camera. The front camera and/or the rear camera may receive external multimedia data when the device 700 is in an operation mode, such as a photographing mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have focusing and optical zoom capabilities. The pixel array of the image sensor built into the front camera and/or the rear camera is divided into a plurality of pixel units, each pixel unit including a group of pixels, so as to support the image generation scheme of the present disclosure.
The audio component 710 is configured to output and/or input audio signals. For example, audio component 710 includes a Microphone (MIC) configured to receive external audio signals when apparatus 700 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signal may further be stored in the memory 704 or transmitted via the communication component 716. In some embodiments, audio component 710 also includes a speaker for outputting audio signals.
The I/O interface 712 provides an interface between the processing component 702 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 714 includes one or more sensors for providing status assessments of various aspects of the apparatus 700. For example, the sensor assembly 714 may detect an open/closed state of the device 700 and the relative positioning of components, such as the display and keypad of the device 700; the sensor assembly 714 may also detect a change in position of the device 700 or a component of the device 700, the presence or absence of user contact with the device 700, the orientation or acceleration/deceleration of the device 700, and a change in the temperature of the device 700. The sensor assembly 714 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 714 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 714 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 716 is configured to facilitate wired or wireless communication between the apparatus 700 and other devices. The apparatus 700 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, 4G LTE, 5G NR (New Radio), or a combination thereof. In an exemplary embodiment, the communication component 716 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 716 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 700 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer readable storage medium comprising instructions, such as the memory 704 comprising instructions, executable by the processor 720 of the device 700 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (10)

1. An image generation method, applied to an electronic device, wherein a pixel array of an image sensor included in the electronic device is divided into a plurality of pixel units, and each pixel unit comprises a group of pixels; the method comprising:
in response to a received shooting instruction, respectively acquiring multiple frames of first image data generated by the image sensor based on a first pixel arrangement mode and having different exposure durations, and at least one frame of second image data generated based on a second pixel arrangement mode; wherein the pixel data corresponding to the same pixel unit in the first image data belong to the same color component, and the corresponding pixel data in the second image data belong to a plurality of color components;
fusing the first image data and the second image data to generate a final image;
wherein the fusing the first image data and the second image data to generate a final image comprises:
determining a target resolution from the illumination intensity, wherein the target resolution is positively correlated with the illumination intensity;
and after the first image data and the second image data are respectively adjusted to the target resolution, fusing to generate the final image.
2. The method of claim 1, wherein in the first pixel arrangement, the pixel units are arranged according to a standard bayer array.
3. The method according to claim 1, wherein in the second pixel arrangement, pixels included in the same pixel unit are arranged in a standard bayer array.
4. The method of claim 1, wherein the pixels on the image sensor are arranged in a Quad Bayer array.
5. An image generation apparatus, applied to an electronic device, wherein a pixel array of an image sensor included in the electronic device is divided into a plurality of pixel units, and each pixel unit comprises a group of pixels; the apparatus comprising:
an acquiring unit, configured to acquire, in response to a received shooting instruction, multiple frames of first image data generated by the image sensor based on a first pixel arrangement mode and having different exposure durations, and at least one frame of second image data generated based on a second pixel arrangement mode; wherein the pixel data corresponding to the same pixel unit in the first image data belong to the same color component, and the corresponding pixel data in the second image data belong to a plurality of color components; and
a fusion unit, configured to fuse the first image data and the second image data to generate a final image;
the fusion unit includes:
a determining subunit, configured to determine a target resolution according to the illumination intensity, wherein the target resolution is positively correlated to the illumination intensity;
and the fusion subunit is used for respectively adjusting the first image data and the second image data to the target resolution and then fusing to generate the final image.
6. The apparatus of claim 5, wherein in the first pixel arrangement, the pixel units are arranged according to a standard Bayer array.
7. The apparatus according to claim 5, wherein in the second pixel arrangement, pixels included in the same pixel unit are arranged in a standard Bayer array.
8. The apparatus of claim 5, wherein the pixels on the image sensor are arranged in a Quad Bayer array.
9. An electronic device, wherein a pixel array of an image sensor included in the electronic device is divided into a plurality of pixel units, each pixel unit including a set of pixels; the electronic device further includes:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the method of any one of claims 1-4.
10. A computer-readable storage medium having stored thereon computer instructions, which, when executed by a processor, carry out the steps of the method according to any one of claims 1 to 4.
CN201910313742.2A 2019-04-18 2019-04-18 Image generation method and device, electronic equipment and computer readable storage medium Active CN111835941B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910313742.2A CN111835941B (en) 2019-04-18 2019-04-18 Image generation method and device, electronic equipment and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN111835941A CN111835941A (en) 2020-10-27
CN111835941B true CN111835941B (en) 2022-02-15

Family

ID=72915573

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910313742.2A Active CN111835941B (en) 2019-04-18 2019-04-18 Image generation method and device, electronic equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN111835941B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20220084578A (en) * 2020-12-14 2022-06-21 에스케이하이닉스 주식회사 Image sensing device
CN115565057B (en) * 2021-07-02 2024-05-24 北京小米移动软件有限公司 Map generation method, map generation device, foot robot and storage medium

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015033107A (en) * 2013-08-07 2015-02-16 ソニー株式会社 Image processing apparatus, image processing method, and electronic apparatus
KR102281162B1 (en) * 2014-11-20 2021-07-23 삼성전자주식회사 Image processing apparatus and method
CN105245775B (en) * 2015-09-25 2018-04-24 小米科技有限责任公司 camera imaging method, mobile terminal and device
CN105469774A (en) * 2015-12-11 2016-04-06 深圳一电航空技术有限公司 Display screen brightness adjusting method and apparatus
CN105391998B (en) * 2015-12-24 2017-05-24 无锡市星迪仪器有限公司 Automatic detection method and apparatus for resolution of low-light night vision device
CN106412407B (en) * 2016-11-29 2019-06-07 Oppo广东移动通信有限公司 Control method, control device and electronic device
CN108419022A (en) * 2018-03-06 2018-08-17 广东欧珀移动通信有限公司 Control method, control device, computer readable storage medium and computer equipment
CN108989700B (en) * 2018-08-13 2020-05-15 Oppo广东移动通信有限公司 Imaging control method, imaging control device, electronic device, and computer-readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant