CN115118889A - Image generation method, image generation device, electronic equipment and storage medium - Google Patents

Image generation method, image generation device, electronic equipment and storage medium Download PDF

Info

Publication number
CN115118889A
CN115118889A
Authority
CN
China
Prior art keywords
image
image data
photosensitive
reading
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210730378.1A
Other languages
Chinese (zh)
Inventor
陈泓至
胡璇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202210730378.1A priority Critical patent/CN115118889A/en
Publication of CN115118889A publication Critical patent/CN115118889A/en
Pending legal-status Critical Current

Landscapes

  • Studio Devices (AREA)

Abstract

The application discloses an image generation method, an image generation apparatus, an electronic device, and a storage medium, belonging to the technical field of image processing. The method comprises the following steps: determining, from the photosensitive area of an image sensor, N photosensitive areas in one-to-one correspondence with N image areas of a preview image, wherein the brightness parameters of the image areas differ from one another, the N photosensitive areas correspond one-to-one to N pixel value reading manners, and N is an integer greater than or equal to 2; collecting N sets of original image data through the N photosensitive areas, the N photosensitive areas corresponding one-to-one to the N sets of original image data; reading out the pixel values corresponding to the N sets of original image data using the N pixel value reading manners to obtain N sets of target image data; and obtaining a target image from the N sets of target image data.

Description

Image generation method, image generation device, electronic equipment and storage medium
Technical Field
The application belongs to the technical field of photography, and in particular relates to an image generation method, an image generation apparatus, an electronic device, and a storage medium.
Background
Currently, the definition of images captured by an electronic device can be increased by raising the number of photosensitive pixels in the device's image sensor.
However, because the size of the image sensor is fixed, increasing the number of photosensitive pixels shrinks the area of each individual photosensitive pixel. This weakens the light-sensing capability of the image sensor and reduces its dynamic range.
As a result, the definition of images shot by the electronic device remains low.
Disclosure of Invention
An object of the embodiments of the present application is to provide an image generation method, an image generation apparatus, an electronic device, and a storage medium that can improve the definition of images captured by the electronic device.
In a first aspect, an embodiment of the present application provides an image generation method, comprising: determining, from the photosensitive area of an image sensor, N photosensitive areas in one-to-one correspondence with N image areas of a preview image, wherein the brightness parameters of the image areas differ from one another, the N photosensitive areas correspond one-to-one to N pixel value reading manners, and N is an integer greater than or equal to 2; collecting N sets of original image data through the N photosensitive areas, the N photosensitive areas corresponding one-to-one to the N sets of original image data; reading out the pixel values corresponding to the N sets of original image data using the N pixel value reading manners to obtain N sets of target image data; and obtaining the target image from the N sets of target image data.
In a second aspect, an embodiment of the present application provides an image generation apparatus, comprising a determining module, a collecting module, a reading module, and an obtaining module. The determining module is configured to determine, from the photosensitive area of the image sensor, N photosensitive areas in one-to-one correspondence with N image areas of the preview image, wherein the brightness parameters of the image areas differ from one another, the N photosensitive areas correspond one-to-one to N pixel value reading manners, and N is an integer greater than or equal to 2. The collecting module is configured to collect N sets of original image data through the N photosensitive areas, the N photosensitive areas corresponding one-to-one to the N sets of original image data. The reading module is configured to read out the pixel values corresponding to the N sets of original image data using the N pixel value reading manners to obtain N sets of target image data. The obtaining module is configured to obtain the target image from the N sets of target image data.
In a third aspect, embodiments of the present application provide an electronic device, which includes a processor and a memory, where the memory stores a program or instructions executable on the processor, and the program or instructions, when executed by the processor, implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In a sixth aspect, embodiments of the present application provide a computer program product, stored on a storage medium, for execution by at least one processor to implement the method according to the first aspect.
In this embodiment, the electronic device may determine, from the photosensitive area of the image sensor, N photosensitive areas corresponding to N image areas of the preview image, collect N sets of original image data through the N photosensitive areas, and read the data out using a different pixel value reading manner for each photosensitive area to obtain N sets of target image data, from which a target image is obtained. After determining the N photosensitive areas, the electronic device reads out the original image data collected by different photosensitive areas (whose corresponding preview-image areas have different brightness parameters) using different pixel reading manners; in other words, the image sensor can adaptively control its readout according to the scene being shot. Compared with a single-frame interpolation scheme, this effectively increases the number of real photosensitive pixels; compared with a multi-frame fusion scheme, it avoids the image blur caused by movement of the user or of the subject during shooting. The resolution and definition of images shot by the electronic device are therefore improved.
Drawings
Fig. 1 is a flowchart of an image generation method provided in an embodiment of the present application;
Fig. 2 is a first schematic diagram of an example of an image generation method provided in an embodiment of the present application;
Fig. 3 is a second schematic diagram of an example of an image generation method according to an embodiment of the present application;
Fig. 4 is a third schematic diagram of an example of an image generation method according to an embodiment of the present application;
Fig. 5 is a fourth schematic diagram of an example of an image generation method provided in an embodiment of the present application;
Fig. 6 is a schematic structural diagram of an image generation apparatus according to an embodiment of the present application;
Fig. 7 is a first schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application;
Fig. 8 is a second schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, of the embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first", "second", and the like in the description and claims of the present application are used to distinguish between similar elements and do not necessarily describe a particular sequence or chronological order. It should be appreciated that data so described may be interchanged where appropriate, so that embodiments of the application can be practiced in sequences other than those illustrated or described herein. Moreover, "first", "second", and the like are generic labels that do not limit the number of objects; for example, the first object may be one object or more than one. In addition, "and/or" in the specification and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the objects before and after it.
Terms related to the embodiments of the present application will be described below.
1. Image Sensor (Sensor)
The image sensor is the core of a camera and its most critical technology. Sensors fall into two types: the widely used Charge-Coupled Device (CCD), and the Complementary Metal-Oxide-Semiconductor (CMOS) device. Whereas a conventional camera records information on film, the "film" of a digital camera is its photosensitive imaging element, which, unlike film, is not replaceable and is integral with the camera.
Currently, CMOS devices are mainly used; like CCDs, they are semiconductors that record changes in light in digital cameras. The CMOS imaging process senses optical signals with a large number of photodiodes, converts the sensed optical signals into electrical signals to form a digital signal matrix, processes the image with an image signal processor, and then compresses and stores it.
A CMOS Camera Module (CMOS Camera Module) is a Camera Module mainly used in a mobile phone at present, and mainly includes a Lens (Lens), a Voice Coil Motor (Voice Coil Motor), an infrared Filter (IR Filter), an image sensor (CMOS), a Digital Signal Processor (DSP), and a Flexible Printed Circuit (FPC).
The working process of the CMOS camera module is as follows: the voice coil motor drives the lens to the position of accurate focus; external light passes through the lens, is filtered by the infrared filter, and strikes the photosensitive diodes (pixels) of the image sensor; the photosensitive diodes convert the sensed optical signals into electrical signals, which are formed into a digital signal matrix (i.e., an image) through the amplifying circuit and the analog-to-digital conversion circuit; the matrix is then processed by the DSP and compressed for storage.
2. Camera lens (lens)
The camera lens is the most important component of a camera, because its quality directly affects the quality of the captured image. Lenses fall into two categories: zoom and fixed-focus. A zoom lens has a variable focal length and hence a variable angle of view, i.e., it can be pushed and pulled; a fixed-focus (prime) lens has only one focal length and thus only one angle of view.
4. RAW image
The RAW image is a RAW data image in which an optical signal captured by a CMOS or CCD image sensor is converted into a digital signal.
5. Quad Bayer
Quad Bayer, i.e., the 4-in-1 pixel technique, arranges 4 pixels of the same color together and can increase the pixel density fourfold, so that when high resolution is required an image with 4 times as many pixels can be output. Quad Bayer can adopt different output modes depending on the image sensor: for example, with a 50M sensor, Quad Bayer can output each pixel individually (hereinafter the quad first mode), and with a 48M sensor, Quad Bayer can output the pixels 4-in-1 (hereinafter the quad second mode).
6、binning
Binning is an image readout mode in which the charges induced on adjacent pixels are added together or averaged and read out as a single pixel.
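As a rough sketch of this readout mode, the following pure-Python function sums or averages each 2 x 2 block; the function name and the use of a single-color plane as input are illustrative choices, not taken from the patent:

```python
def bin_2x2(plane, mode="average"):
    """Combine each 2x2 block of same-color photosites into one value.

    `plane` is one color plane of a quad-Bayer mosaic, given as a list of
    rows, so the four neighbours in each 2x2 block share the same color.
    """
    out = []
    for r in range(0, len(plane), 2):
        row = []
        for c in range(0, len(plane[0]), 2):
            s = (plane[r][c] + plane[r][c + 1]
                 + plane[r + 1][c] + plane[r + 1][c + 1])
            row.append(s // 4 if mode == "average" else s)
        out.append(row)
    return out
```

For example, a 2 x 4 plane yields a 1 x 2 result, each output pixel carrying the combined charge of four photosites.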
7. Binarization method
Binarization sets the gray value of each pixel in the image to either 0 or 255, so that the whole image presents a black-and-white effect.
The image generation method provided by the embodiment of the present application is described in detail below with reference to the accompanying drawings by specific embodiments and application scenarios thereof.
At present, with the development of electronic devices, users' demands on their shooting capabilities, particularly on the definition of captured images, keep increasing; the number of photosensitive pixels in the image sensor of an electronic device may therefore be increased to improve the definition of the images it shoots.
However, the related approaches each have drawbacks. A single-frame interpolation algorithm, when interpolating and enlarging the image, examines the edge direction at each pixel position and computes interpolated pixels along that direction; since the number of real photosensitive pixels in the electronic device is not increased, and interpolation errors can occur in high-frequency scenes, its improvement of image definition is limited and the resulting images remain of low definition. A typical multi-frame algorithm is pixel shifting: the image sensor is shifted by one pixel in each direction in turn, an image is captured after each shift so as to increase the effective number of photosensitive pixels, and the frames are then fused to raise the image resolution; however, this is difficult to carry out in practice, mainly because precisely moving the sensor by exactly one pixel each time is hard to control. The definition of images shot by the electronic device is therefore poor.
In this embodiment, the electronic device may determine, from the photosensitive area of the image sensor, N photosensitive areas corresponding to N image areas of the preview image, collect N sets of original image data through the N photosensitive areas, and read the data out using a different pixel value reading manner for each photosensitive area to obtain N sets of target image data, from which a target image is obtained. After the N photosensitive areas are determined, the image sensor reads out the original image data collected by different photosensitive areas (whose corresponding preview-image areas have different brightness parameters) using different pixel reading manners; in other words, the image sensor can adaptively control its readout according to the scene being shot.
An embodiment of the present application provides an image generation method, and fig. 1 shows a flowchart of an image generation method provided in an embodiment of the present application. As shown in fig. 1, the image generation method provided in the embodiment of the present application may include steps 201 to 204 described below.
In step 201, the electronic device determines, from the photosensitive area of the image sensor, N photosensitive areas in one-to-one correspondence with the N image areas of the preview image.
In this embodiment, the brightness parameters of the N image areas differ from one another, the N photosensitive areas correspond one-to-one to the N pixel value reading manners, and N is an integer greater than or equal to 2.
In this embodiment, the electronic device may divide the preview image according to the brightness parameter to obtain N image areas, so as to obtain N photosensitive areas corresponding to the N image areas.
In the embodiment of the present application, the image sensor may be an all-in-one image sensor (e.g., a 4-in-1 image sensor, a 9-in-1 image sensor, a 16-in-1 image sensor, etc.).
For example, taking a 4-in-1 sensor as shown in fig. 2, the physical ordering of the photosensitive pixels in the image sensor places 4 R photosensitive pixels together, 4 Gr and 4 Gb photosensitive pixels together, and 4 B photosensitive pixels together. Such a four-in-one image sensor therefore has two output modes: a Quad Bayer output mode (the quad first mode) and a binning output mode. The Quad Bayer output mode outputs every pixel individually, finally producing a 4 x 4 Quad Bayer image; the binning output mode adds or averages the 4 pixels of the same color and outputs the result, finally producing a 2 x 2 Bayer image.
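The two output modes of a four-in-one tile can be sketched as follows; the function and mode names are invented for clarity, and summing (rather than averaging) is assumed for the binning branch:

```python
def read_out(mosaic, mode):
    """Two readout modes for a four-in-one (quad-Bayer) sensor tile.

    'per_pixel' (the quad first mode) returns every photosite value,
    giving a 4x4 quad-Bayer image; 'binning' combines each 2x2
    same-color group into a single pixel, giving a 2x2 Bayer image.
    """
    if mode == "per_pixel":
        return [row[:] for row in mosaic]
    binned = []
    for r in range(0, len(mosaic), 2):
        binned.append([sum(mosaic[r][c:c + 2]) + sum(mosaic[r + 1][c:c + 2])
                       for c in range(0, len(mosaic[0]), 2)])
    return binned
```

A 4 x 4 mosaic with R, Gr, Gb, and B groups thus comes out either unchanged (per-pixel) or as a 2 x 2 Bayer image (binned).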
Optionally, in an embodiment of the present application, the brightness parameter includes at least one of: brightness value, contrast value, color temperature value and color saturation.
Optionally, in this embodiment of the application, in a case that the electronic device displays an identifier (for example, an application icon) of at least one application, the electronic device may display a shooting preview interface according to a click input of a user on the identifier of a target application in the at least one application, and capture a preview image through the image sensor, so that the electronic device may determine the N photosensitive areas from the photosensitive areas of the image sensor.
Wherein the target application may be any one of the following: chat-type applications, photography-type applications, image-processing-type applications, web-page-type applications, and the like.
Optionally, in this embodiment of the application, for each of the N image regions, the electronic device may obtain position information of all pixel points in one image region, and then determine, according to the position information, a photosensitive region corresponding to each image region, so as to determine the N photosensitive regions.
Step 202, the electronic device collects N sets of original image data through the N photosensitive areas.
In the embodiment of the present application, the N photosensitive regions correspond to N sets of original image data one to one.
Optionally, in this embodiment of the application, the raw image data may include at least one of the following: pixel luminance value, pixel location, pixel saturation, and pixel color value (e.g., RGB).
Optionally, in the embodiment of the present application, the electronic device may collect the N sets of original image data through the N photosensitive areas in response to a click input by the user on a shooting control in the shooting preview interface; alternatively, the electronic device may collect the N sets of original image data through the N photosensitive areas directly after determining them.
Optionally, in this embodiment of the present application, the electronic device may control the N photosensitive areas to perform exposure, so that the N photosensitive areas output the N sets of original image data for collection.
Optionally, in this embodiment of the application, the electronic device may collect the N sets of original image data corresponding to the N photosensitive areas simultaneously; alternatively, it may collect the original image data of one photosensitive area, then that of the next, and so on until all N sets are collected, i.e., until the collection of all original image data corresponding to the preview image is complete.
In step 203, the electronic device reads out the pixel values corresponding to the N sets of original image data using the N pixel value reading manners to obtain N sets of target image data.
In an embodiment of the present application, each of the N pixel value reading manners is any one of the following: summing the pixel values of adjacent same-color pixel units in the image sensor and then reading out the sum; averaging the pixel values of adjacent same-color pixel units and then reading out the average; reading out only the maximum pixel value among adjacent same-color pixel units; and reading out the pixel value of every pixel unit in the image sensor one by one.
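The four candidate reading manners listed above, applied to one group of adjacent same-color pixel units, can be sketched like this (an illustrative dispatch; the manner names are invented, not taken from the patent):

```python
def read_group(pixels, manner):
    """Apply one of the four pixel value reading manners to a group of
    adjacent same-color pixel units, returning the value(s) read out."""
    if manner == "sum":        # add the values, then read out one value
        return [sum(pixels)]
    if manner == "average":    # average the values, then read out one value
        return [sum(pixels) // len(pixels)]
    if manner == "max":        # read out only the largest value
        return [max(pixels)]
    if manner == "per_pixel":  # read out every pixel unit one by one
        return list(pixels)
    raise ValueError(f"unknown reading manner: {manner}")
```

Note that the first three manners collapse the group to a single output value, while the per-pixel manner preserves the full resolution of the group.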
In this embodiment, the electronic device may output the pixels of the original image data to an Image Signal Processor (ISP) in different pixel output manners, according to the different brightness parameters corresponding to the N sets of original image data, so as to obtain the N sets of target image data.
Illustratively, when taking a picture, the camera switches to a hybrid mode for imaging and reads out the pixel values corresponding to the N sets of original image data to obtain N sets of target image data. The hybrid mode is a mixed readout mode. If the image sensor is a 50M sensor, the original image data (e.g., pixel values) at the positions corresponding to bright areas may be set to the first manner (reading out the pixel value of each pixel unit in the bright areas one by one, i.e., the quad first mode), so that the electronic device outputs that data to the ISP in the first manner; and the original image data (e.g., pixel values) at the positions corresponding to dark areas may be set to the second manner (adding the pixel values of adjacent same-color pixel units and then reading out the sum, i.e., the binning mode), so that the electronic device outputs that data to the ISP in the second manner, thereby obtaining the N sets of target image data.
Optionally, in this embodiment of the application, when the preview image is in the hybrid mode, the electronic device may encode the target image data for transmission.
Specifically, the electronic device may binary-encode the target image data and expand it to a preset bit width before transmission.
Illustratively, taking 10-bit target image data as an example: the maximum value of the target image data acquired by the electronic device is 1023, whose binary value is 1111111111; in the hybrid mode, the electronic device needs to expand the target image data to 12 bits, i.e., the binary value 001111111111.
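The bit expansion in this example is a plain zero-extension to a fixed transport width; a minimal sketch (the function name and the default width are illustrative):

```python
def expand_to_bits(value, bits=12):
    """Zero-extend a sample to a fixed transport width, rendered here as a
    binary string so the padding is visible (e.g. 10-bit 1023 -> 12 bits)."""
    return format(value, f"0{bits}b")
```

Expanding the 10-bit maximum 1023 this way reproduces the 001111111111 pattern from the example above.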
Alternatively, in this embodiment of the application, the step 203 may be specifically implemented by the step 203a described below.
In step 203a, the electronic device adds a different identification to each set of target image data.
In the embodiment of the present application, the identification is used to indicate the pixel value reading manner corresponding to each of the N sets of target image data.
In the embodiment of the application, the electronic device may add a different identification to each set of target image data, so that the N sets of target image data corresponding to different pixel value reading manners can be distinguished and then processed differently.
Illustratively, the electronic device acquires target image data with a maximum value of 1023, binary 1111111111, which must be expanded to 12 bits for transmission, i.e., adjusted to 001111111111. The 12th bit of the transmitted data is then used to distinguish Quad Bayer data from binning data: for example, the bit is set to 1 to mark Quad Bayer data (i.e., 101111111111) and left at 0 for binning data (i.e., 001111111111). When performing image reconstruction, the electronic device can thus tell Quad Bayer data from binning data and process their pixel values differently.
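One way to realize this marking is to reserve the top bit of the 12-bit word as the identifier; the sketch below assumes that convention (the names and the exact bit position are inferred from the example above, not stated normatively in the patent):

```python
QUAD_FLAG = 1 << 11  # 12th bit marks Quad Bayer samples (assumed layout)

def tag(value, is_quad):
    """Attach the readout-manner identification to a 10-bit sample."""
    return value | QUAD_FLAG if is_quad else value

def untag(word):
    """Split a 12-bit word back into (is_quad, 10-bit sample)."""
    return bool(word & QUAD_FLAG), word & (QUAD_FLAG - 1)
```

Tagging the 10-bit maximum 1023 as Quad Bayer data yields the 101111111111 pattern from the example, and untagging recovers both the manner and the original sample.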
In the embodiment of the application, because the electronic device adds a different identification to each set of target image data, the N sets of target image data corresponding to different pixel value reading manners can be distinguished, which improves the flexibility with which the electronic device processes images.
In step 204, the electronic device obtains a target image from the N sets of target image data.
In this embodiment of the application, after obtaining the N sets of target image data, the electronic device may perform image reconstruction processing on them to obtain the target image.
Specifically, the electronic device may process the N sets of target image data corresponding to the different pixel value reading manners differently to obtain the target image.
The above embodiments are described below by way of specific embodiments.
Scheme 1: for example, as shown in fig. 3, taking a four-in-one image sensor as an example, after the ISP receives the N sets of target image data, the electronic device may add up the pixels corresponding to the binning data in the original image to obtain binning1 (for example, if the pixels corresponding to the binning data are R1, R2, R3, and R4, then binning1 is the 12-bit value R1+R2+R3+R4), and output each pixel corresponding to the Quad Bayer data in the original image individually (for example, if the pixels corresponding to the Quad Bayer data are R1, R2, R3, and R4, the output data are the 12-bit values R1, R2, R3, and R4). That is, the binning data are output 4-in-1, while the Quad Bayer output is the pixel value of each individual pixel. After obtaining the 12-bit binning1 and the 12-bit Quad Bayer data, the electronic device may copy binning1 four times and fill the copies into the pixel positions corresponding to the binning data, so that the binning data and the Quad Bayer data match 1-to-1, preventing the problem of pixel loss.
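The "copy binning1 four times" step amounts to replicating each binned value over the block it came from, so the binned plane lines up 1-to-1 with the per-pixel Quad Bayer plane. A minimal sketch, with illustrative names and a 2 x 2 replication factor matching the four-in-one case:

```python
def replicate(binned, factor=2):
    """Copy each binned value into a factor x factor block, restoring the
    per-pixel grid size so binned data matches per-pixel data 1-to-1."""
    out = []
    for row in binned:
        expanded = []
        for v in row:
            expanded.extend([v] * factor)
        out.extend([expanded[:] for _ in range(factor)])
    return out
```

A single binned value thus becomes four identical pixels, which is exactly the 1-to-1 matching the scheme requires.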
Scheme 2: when the image sensor is nine-in-one, the image reconstruction process is the same as in the scheme above and, to avoid repetition, is not described again here.
Scheme 3: for example, as shown in fig. 4, taking a sixteen-in-one image sensor, specifically a 48M image sensor, as an example of the image reconstruction process in the present application: after the ISP receives the N sets of target image data, the electronic device may add up the pixels corresponding to the binning data in the original image to obtain binning1 (for example, if the pixels corresponding to the binning data are R1, R2, ..., R16, binning1 is the 12-bit value R1+R2+...+R16); output the pixels corresponding to the Quad Bayer data 4-in-1 (for example, if the pixels corresponding to the Quad Bayer data are Gr1, Gr2, ..., Gr16, the outputs are the 12-bit values quad1, quad2, quad3, and quad4, where quad1 is Gr1+Gr2+Gr3+Gr4, and so on up to quad4, which is Gr13+Gr14+Gr15+Gr16); and output each pixel corresponding to the 16bayer data individually (for example, if the pixels corresponding to the 16bayer data are R1, R2, ..., R16, the output data are the 12-bit values 16bayer1, 16bayer2, ..., 16bayer16). That is, the binning data are output 16-in-1, the Quad Bayer data are output 4-in-1, and the 16bayer output is the pixel value of each individual pixel. After obtaining the 12-bit binning, Quad Bayer, and 16bayer data, the electronic device may copy binning1 sixteen times and fill the copies into the pixel positions corresponding to the binning data, and copy quad1, quad2, quad3, and quad4 four times each and fill the copies into the pixel positions corresponding to the Quad Bayer data, so that the binning data, Quad Bayer data, and 16bayer data match 1-to-1, preventing the problem of pixel loss.
Optionally, in this embodiment of the application, after the image reconstruction is completed, if the resolution of the reconstructed image is greater than the target resolution, the electronic device may adjust the reconstructed image through the first algorithm.
Optionally, in this embodiment of the application, the target resolution may be determined by the electronic device or determined by a user.
Optionally, in this embodiment of the present application, the first algorithm may be a down scale algorithm or a remosaic algorithm.
Illustratively, assuming the resolution of the reconstructed image is 4800 and the target resolution is 1200, the electronic device may reduce the resolution of the reconstructed image to 1200 through the down scale algorithm.
The down scale algorithm reduces the resolution of the reconstructed image but does not affect its definition. Compared with the remosaic algorithm, it offers better low-light shooting capability and transmits relatively less data, so the power consumption, performance, and frame rate of the image sensor are all better.
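The patent names a down scale algorithm without specifying it; simple box-filter (mean-pooling) down-scaling is one common choice, sketched here under that assumption with illustrative names:

```python
def downscale(image, factor):
    """Reduce resolution by averaging each factor x factor block into one
    pixel (a box filter), a common down-scaling choice; the patent does
    not specify which down scale algorithm it uses."""
    out = []
    for r in range(0, len(image), factor):
        row = []
        for c in range(0, len(image[0]), factor):
            block = [image[r + i][c + j]
                     for i in range(factor) for j in range(factor)]
            row.append(sum(block) // len(block))
        out.append(row)
    return out
```

Averaging rather than dropping pixels is what lets the output keep the detail gathered by all input pixels even as the resolution falls.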
The embodiment of the application provides an image generation method in which the electronic device may determine, from the photosensitive area of the image sensor, N photosensitive areas corresponding to N image areas of the preview image, collect N sets of original image data through the N photosensitive areas, and read the data out using a different pixel value reading manner for each photosensitive area to obtain N sets of target image data, from which a target image is obtained. In this scheme, after the N photosensitive areas are determined, the image sensor reads out the original image data collected by different photosensitive areas (whose corresponding preview-image areas have different brightness parameters) using different pixel reading manners; in other words, the image sensor can adaptively control its readout according to the scene being shot.
Optionally, in this embodiment of the application, before "determining N photosensitive areas corresponding to N image areas of the preview image one to one" in step 201, the image generation method provided in this embodiment of the application further includes steps 301 and 302 described below.
Step 301, the electronic device obtains a brightness parameter of the preview image.
In the embodiment of the application, the electronic device may acquire the brightness parameters of X pixels in the preview image, where X is a positive integer.
Optionally, in this embodiment of the application, the X pixels may be a part of pixels or all pixels of the preview image.
Optionally, in this embodiment of the application, the electronic device may perform brightness calculation on the preview image to obtain brightness parameters of X pixels.
Optionally, in this embodiment of the application, the electronic device may divide X pixels into N pixel groups according to the M luminance ranges.
In this embodiment of the application, each of the N pixel groups includes at least one of the X pixels, the luminance parameters of the pixels in each pixel group fall within one luminance range, and M is a positive integer.
Optionally, in this embodiment of the application, the M brightness ranges may be brightness ranges pre-stored in the electronic device.
Optionally, in this embodiment of the application, the electronic device may determine N luminance ranges from M luminance ranges according to the luminance ranges of X pixels, and then perform grouping processing on the X pixels according to the N luminance ranges to obtain N pixel groups, where the luminance ranges of the X pixels are located in the N luminance ranges.
Specifically, the electronic device may perform binarization processing on the preview image to distinguish the different brightness parameters in the preview image, and may then determine each image area formed by adjacent pixels having the same pixel value (i.e., connected-domain processing) as one pixel group, thereby obtaining the N pixel groups.
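The binarization and connected-domain steps above can be sketched as follows. This is a minimal illustration: the function names and the use of 4-connectivity are assumptions, since the patent does not fix these details.

```python
def binarize(luma, threshold):
    """Binarize a luminance map of the preview image: 1 = bright, 0 = dark."""
    return [[1 if v >= threshold else 0 for v in row] for row in luma]

def connected_regions(mask):
    """Group adjacent pixels with the same binary value into regions
    (4-connectivity flood fill), mirroring the connected-domain step."""
    h, w = len(mask), len(mask[0])
    labels = [[-1] * w for _ in range(h)]
    regions = []
    for y in range(h):
        for x in range(w):
            if labels[y][x] != -1:
                continue
            value = mask[y][x]
            labels[y][x] = len(regions)
            stack, pixels = [(y, x)], []
            while stack:
                cy, cx = stack.pop()
                pixels.append((cy, cx))
                for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                               (cy, cx - 1), (cy, cx + 1)):
                    if (0 <= ny < h and 0 <= nx < w
                            and labels[ny][nx] == -1
                            and mask[ny][nx] == value):
                        labels[ny][nx] = len(regions)
                        stack.append((ny, nx))
            regions.append(pixels)
    return regions

mask = binarize([[60, 61], [40, 42]], 50)  # -> [[1, 1], [0, 0]]
regions = connected_regions(mask)          # bright row and dark row
print(len(regions))  # 2
```

Each returned region is one pixel group; the N groups then map to the N image areas of step 302.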
Step 302, the electronic device determines N image areas based on the brightness parameter.
In the embodiment of the present application, the luminance range corresponding to each of the N image regions is different.
In this embodiment, the electronic device may determine, as the N image areas, the image areas where the N pixel groups are located by using a connected component algorithm.
Optionally, in this embodiment of the application, the N image regions may include one or more continuous sub-regions or one or more discontinuous sub-regions.
Optionally, in this embodiment of the application, after the electronic device determines the N image areas, the electronic device may label the N image areas in a target labeling manner.
Optionally, in an embodiment of the present application, the target labeling manner includes at least one of: colors, borders (e.g., solid or dashed lines), fill patterns, or fill lines.
For example, as shown in fig. 5, after performing binarization processing on the preview image, the electronic device may acquire the luminance parameters of the preview image. The electronic device may then determine, as a first region (the region indicated by oblique lines in fig. 5), the image region composed of adjacent pixels having the same pixel value whose luminance parameters are greater than or equal to a preset threshold (i.e., the connected-domain processing described above), and determine, as a second region (the blank region in fig. 5), the image region composed of adjacent pixels having the same pixel value whose luminance parameters are smaller than the preset threshold. That is, the electronic device may divide the preview image into two pixel groups through binarization processing, and then determine two photosensitive regions according to the regions corresponding to the two pixel groups.
As a further example, assuming that the luminance values acquired from the preview image are 49, 61, and 42 and the preset threshold is 50, the electronic device may determine the region corresponding to the luminance value 61 as a first region (its luminance value is greater than the preset threshold, so the first region may also be referred to as a bright region), and determine the regions corresponding to the luminance values 49 and 42 as a second region (their luminance values are less than the preset threshold, so the second region may also be referred to as a dark region). That is, the electronic device divides the preview image into two regions according to the brightness differences within the preview image.
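The numeric example above amounts to a simple threshold comparison. The sketch below uses the example's values; the region names are hypothetical labels for illustration only.

```python
# Luminance values from the example: 49, 61, 42; preset threshold: 50.
threshold = 50
region_luminance = {"region_1": 49, "region_2": 61, "region_3": 42}

# Bright region(s): luminance >= threshold; dark region(s): luminance < threshold.
bright = [name for name, lum in region_luminance.items() if lum >= threshold]
dark = [name for name, lum in region_luminance.items() if lum < threshold]
print(bright)  # ['region_2']
print(dark)    # ['region_1', 'region_3']
```

Only the region with luminance 61 lands in the bright group, matching the division into a bright region and a dark region described above.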
In the embodiment of the application, the electronic device can determine the N image areas by acquiring the brightness parameters of the preview image, so that the electronic device can obtain target image data through different pixel output modes, and the flexibility of processing images by the electronic device is improved.
Optionally, in this embodiment of the present application, before step 301 described above, the image generation method provided in this embodiment of the present application further includes steps 401 to 403 described below.
Step 401, the electronic device acquires original preview image data through an image sensor.
In the embodiment of the application, the electronic device can acquire the original preview image data (i.e., image data converted from an electrical signal into a digital signal, namely a RAW image) through the pixel points in the image sensor.
Optionally, in an embodiment of the present application, the original preview image data includes at least one of: pixel brightness value, pixel distribution position, number of pixels, and pixel exposure value.
Step 402, the electronic device reads out the pixel value of the original preview image data by using a first pixel value reading mode to obtain the preview image data.
In an embodiment of the present application, the first pixel value readout manner is: summing the pixel values of adjacent pixel units with the same color in the image sensor, and then reading out the sum.
In this embodiment, after the electronic device obtains the original preview image data, the electronic device may read out the pixel values of the original preview image data in a binning mode (i.e., the first pixel value readout manner), so as to obtain the preview image data.
Step 403, the electronic device obtains a preview image according to the preview image data.
In the embodiment of the application, according to the first pixel value readout manner, the electronic device can obtain a bayer-format preview image in which every four same-color pixels are combined into one (four-in-one binning).

In the embodiment of the application, the electronic device can acquire original preview image data and then obtain, through the first pixel value readout manner, the four-in-one bayer-format preview image, which improves the definition of images shot by the electronic device.
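The binning readout of steps 401 to 403 (summing four adjacent same-color pixel units into one output pixel) can be sketched as follows. The sketch assumes a quad-bayer-style layout in which each 2×2 block of photosites shares one color; that layout is an assumption for illustration, not stated by the patent.

```python
def binning_readout(raw):
    """First pixel value readout manner: sum each 2x2 block of adjacent
    same-color photosites and read out one value per block, giving an
    image with one quarter the pixel count."""
    h, w = len(raw), len(raw[0])
    return [
        [raw[2 * i][2 * j] + raw[2 * i][2 * j + 1]
         + raw[2 * i + 1][2 * j] + raw[2 * i + 1][2 * j + 1]
         for j in range(w // 2)]
        for i in range(h // 2)
    ]

# Four same-color photosites (10, 12, 11, 13) are read out as one pixel of 46.
raw = [[10, 12], [11, 13]]
print(binning_readout(raw))  # [[46]]
```

Summing four photosites boosts the signal per output pixel, which is why this readout manner suits the darker preview and the dim-light case discussed earlier.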
Optionally, in this embodiment of the present application, the N photosensitive regions include a target photosensitive region, where the target photosensitive region corresponds to a second pixel value reading manner of the N pixel value reading manners, and the second pixel value reading manner is to read out pixel values of each pixel unit in the image sensor one by one; the above step 202 includes:
the electronic device adjusts the exposure gain corresponding to the target photosensitive region, so that its brightness parameters are the same as those of the other photosensitive regions.
In an embodiment of the application, the other photosensitive regions are photosensitive regions of the N photosensitive regions, where a brightness range satisfies a preset threshold.
In the embodiment of the application, the electronic device may perform brightness compensation on the pixels corresponding to the bright region through an analog amplifier (analog gain), so that their brightness is consistent with that of the other photosensitive regions.
It should be noted that the exposure parameters used for the one-by-one readout (the second pixel value readout manner) are the same as those used for the binning-mode output. The number of pixels output one by one is therefore increased to 4 times the binned number, but the brightness of each pixel is also reduced to one quarter of the binned brightness. In the embodiment of the present application, the exposure gain is therefore adjusted through the analog amplifier to compensate the brightness of these pixels, that is, their brightness is raised to 4 times the original, which ensures that the image brightness can be restored normally.
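The 4x brightness compensation described above can be illustrated numerically as follows. The clipping ceiling (a 10-bit sensor range) and the function name are assumptions for the sketch, not taken from the patent, and real hardware applies the gain in the analog domain rather than on digital values.

```python
BINNING_FACTOR = 4  # one binned pixel sums four photosites

def compensate_one_by_one_readout(pixels, max_value=1023):
    """Pixels read out one by one keep the binning exposure, so each is
    about 1/4 as bright; multiply by the binning factor (done via analog
    gain in hardware) so brightness matches the binned regions, clipping
    to an assumed 10-bit range."""
    return [
        [min(max_value, value * BINNING_FACTOR) for value in row]
        for row in pixels
    ]

print(compensate_one_by_one_readout([[50, 300]]))  # [[200, 1023]]
```

A value of 50 is restored to 200 (4x), while 300 would exceed the assumed 10-bit ceiling and saturates at 1023, showing why the compensation is bounded by the sensor's dynamic range.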
In the embodiment of the application, the electronic device can compensate the brightness of the bright region, so that the overall brightness of the image shot by the electronic device can be restored normally, which improves the accuracy with which the electronic device processes images.
In the image generation method provided in the embodiment of the present application, the execution subject may be an image generation apparatus. The image generation apparatus provided in the embodiment of the present application is described below by taking, as an example, the image generation apparatus executing the image generation method.
Fig. 6 shows a schematic diagram of a possible structure of the image generation apparatus according to the embodiment of the present application. As shown in fig. 6, the image generation apparatus 70 may include: a determining module 71, a collecting module 72, a readout module 73, and an obtaining module 74.
The determining module 71 is configured to determine, from the photosensitive areas of an image sensor of an electronic device, N photosensitive areas in one-to-one correspondence with N image areas of a preview image, where the brightness parameters of each image area are different, the N photosensitive areas are in one-to-one correspondence with N pixel value readout manners, and N is an integer greater than or equal to 2. The collecting module 72 is configured to collect N groups of original image data through the N photosensitive areas, where the N photosensitive areas correspond to the N groups of original image data one to one. The readout module 73 is configured to read out the pixel values corresponding to the N groups of original image data in the N pixel value readout manners to obtain N groups of target image data. The obtaining module 74 is configured to obtain a target image according to the N groups of target image data.
In a possible implementation manner, the obtaining module 74 is further configured to obtain the brightness parameter of the preview image before the determining module 71 determines the N photosensitive areas corresponding to the N image areas of the preview image in a one-to-one manner. The determining module 71 is further configured to determine N image regions based on the brightness parameter; wherein, the brightness range corresponding to each image area in the N image areas is different.
In a possible implementation, the acquiring module 72 is further configured to acquire raw preview image data through an image sensor before the acquiring module 74 acquires the brightness parameter of the preview image. The reading module 73 is further configured to read out the pixel value of the original preview image data by using a first pixel value reading manner, so as to obtain the preview image data. The obtaining module 74 is further configured to obtain a preview image according to the preview image data; wherein the first pixel value readout mode is: and summing pixel values corresponding to adjacent pixel units with the same color in the image sensor, and then reading out the sum.
In one possible implementation, each of the N pixel value readout manners is any one of the following: summing the pixel values of adjacent same-color pixel units in the image sensor and then reading out the sum; averaging the pixel values of adjacent same-color pixel units in the image sensor and then reading out the average; reading out the maximum pixel value among adjacent same-color pixel units in the image sensor; and reading out the pixel value of each pixel unit in the image sensor one by one.
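The four readout manners listed above can be sketched as operations on one group of adjacent same-color pixel values. This is illustrative Python; the function names are ours, and a real sensor performs these operations in hardware during readout.

```python
def read_sum(block):
    """Sum adjacent same-color pixel values, then read out (binning)."""
    return sum(block)

def read_average(block):
    """Average adjacent same-color pixel values, then read out."""
    return sum(block) / len(block)

def read_max(block):
    """Read out the maximum among adjacent same-color pixel values."""
    return max(block)

def read_one_by_one(block):
    """Read out every pixel unit's value individually (full resolution)."""
    return list(block)

READOUT_MANNERS = {
    "sum": read_sum,
    "average": read_average,
    "max": read_max,
    "one_by_one": read_one_by_one,
}

block = [40, 44, 42, 46]  # four adjacent same-color pixel values
print(read_sum(block), read_average(block), read_max(block))  # 172 43.0 46
```

Mapping each photosensitive region to one entry of such a table is one way the one-to-one correspondence between the N regions and the N readout manners could be realized.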
In a possible implementation manner, the N photosensitive regions include a target photosensitive region, and the target photosensitive region corresponds to a second pixel value readout manner of the N pixel value readout manners, where the second pixel value readout manner is to read out the pixel value of each pixel unit in the image sensor one by one. The collecting module 72 is further configured to adjust the exposure gain corresponding to the target photosensitive region so that its brightness parameters are the same as those of the other photosensitive regions; the other photosensitive regions are the photosensitive regions, among the N photosensitive regions, whose brightness ranges satisfy a preset threshold.
In a possible implementation manner, the readout module 73 is specifically configured to add a different identifier to each group of target image data, where the identifiers are used to indicate the pixel value readout manners corresponding to the N groups of target image data.
The embodiment of the application provides an image generation apparatus. After the image generation apparatus determines the N photosensitive areas corresponding to the preview image, the image sensor can read out the original image data collected by different photosensitive areas in different pixel readout manners (the brightness parameters of the image areas of the preview image corresponding to the different photosensitive areas are different); that is, the image sensor can adaptively control the readout manner according to the scene shot by the user.
The image generation apparatus in the embodiment of the present application may be a device, or may be a component, an integrated circuit, or a chip in an electronic device. The device may be a mobile electronic device or a non-mobile electronic device. The mobile electronic device may be, for example, a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a Mobile Internet Device (MID), an Augmented Reality (AR)/Virtual Reality (VR) device, a robot, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a Personal Digital Assistant (PDA); the non-mobile electronic device may be a server, a Network Attached Storage (NAS), a personal computer (PC), a television (TV), a teller machine, a self-service machine, and the like, which is not specifically limited in the embodiments of the present application.
The image generation apparatus in the embodiment of the present application may be an apparatus having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiment of the present application.
The image generation device provided in the embodiment of the present application can implement each process implemented by the method embodiments of fig. 1 to fig. 3, achieve the same technical effect, and is not described herein again to avoid repetition.
Optionally, as shown in fig. 7, an electronic device 90 provided in an embodiment of the present application further includes a processor 91 and a memory 92, where the memory 92 stores a program or an instruction that can be executed on the processor 91, and when the program or the instruction is executed by the processor 91, the steps of the embodiment of the image generation method are implemented, and the same technical effect can be achieved, and are not described again here to avoid repetition.
It should be noted that the electronic device in the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 8 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 100 includes, but is not limited to: a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, and a processor 110.
Those skilled in the art will appreciate that the electronic device 100 may further comprise a power supply (e.g., a battery) for supplying power to the various components; the power supply may be logically connected to the processor 110 via a power management system, so as to implement functions such as managing charging, discharging, and power consumption via the power management system. The electronic device structure shown in fig. 8 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than those shown, combine some components, or arrange components differently, and the details are omitted here.
The processor 110 is configured to determine, from photosensitive regions of the image sensor, N photosensitive regions that are in one-to-one correspondence with N image regions of a preview image, where luminance parameters of each image region are different, the N photosensitive regions are in one-to-one correspondence with N pixel value reading manners, and N is an integer greater than or equal to 2; acquiring N groups of original image data through N photosensitive areas, wherein the N photosensitive areas correspond to the N groups of original image data one by one; reading out pixel values corresponding to the N groups of original image data by adopting N pixel value reading-out modes to obtain N groups of target image data; and obtaining a target image according to the N groups of target image data.
The embodiment of the application provides an electronic device. After the electronic device determines the N photosensitive areas corresponding to the preview image, the image sensor can read out the original image data acquired by different photosensitive areas in different pixel readout manners (the brightness parameters of the image areas of the preview image corresponding to the different photosensitive areas are different); that is, the image sensor can adaptively control the readout manner according to the scene shot by the user. Compared with a single-frame interpolation scheme, this can effectively increase the number of real photosensitive pixels; compared with a multi-frame fusion scheme, it can avoid the image blurring caused by movement of the user or of the shooting object during shooting. In this way, the resolution and definition of images shot by the electronic device are improved.
Optionally, in this embodiment of the application, the processor 110 is further configured to obtain a brightness parameter of the preview image before determining N photosensitive areas corresponding to N image areas of the preview image in a one-to-one manner; determining N image areas based on the brightness parameters; wherein, the brightness range corresponding to each image area in the N image areas is different.
Optionally, in this embodiment of the application, the processor 110 is further configured to acquire original preview image data through the image sensor before acquiring a brightness parameter of a preview image; reading out the pixel value of the original preview image data by adopting a first pixel value reading mode to obtain the preview image data; obtaining a preview image according to the preview image data; wherein the first pixel value reading mode is as follows: and summing pixel values corresponding to adjacent pixel units with the same color in the image sensor, and reading the sum.
Optionally, in this embodiment of the present application, the N photosensitive regions include a target photosensitive region, where the target photosensitive region corresponds to a second pixel value readout manner of the N pixel value readout manners, and the second pixel value readout manner is to read out the pixel value of each pixel unit in the image sensor one by one. The processor 110 is further configured to adjust the exposure gain corresponding to the target photosensitive region to obtain brightness parameters the same as those of the other photosensitive regions; the other photosensitive regions are the photosensitive regions of the N photosensitive regions other than the target photosensitive region.
Optionally, in this embodiment of the application, the processor 110 is specifically configured to, in a case that pixels corresponding to N groups of original image data are read by using N pixel value reading manners, add different identifiers to each group of original image data, where the identifiers are used to indicate the pixel value reading manners corresponding to the N groups of original image data.
The electronic device provided by the embodiment of the application can realize each process realized by the method embodiment, and can achieve the same technical effect, and for avoiding repetition, the details are not repeated here.
The beneficial effects of the various implementation manners in this embodiment may specifically refer to the beneficial effects of the corresponding implementation manners in the above method embodiments, and are not described herein again to avoid repetition.
It should be understood that, in the embodiment of the present application, the input Unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042, and the Graphics Processing Unit 1041 processes image data of a still picture or a video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 107 includes at least one of a touch panel 1071 and other input devices 1072. The touch panel 1071 is also referred to as a touch screen. The touch panel 1071 may include two parts of a touch detection device and a touch controller. Other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a first storage area storing programs or instructions and a second storage area storing data, where the first storage area may store an operating system, and application programs or instructions required for at least one function (such as a sound playing function and an image playing function), and the like. Further, the memory 109 may comprise volatile memory or non-volatile memory, or both. The non-volatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash memory. The volatile memory may be a Random Access Memory (RAM), a Static RAM (SRAM), a Dynamic RAM (DRAM), a Synchronous DRAM (SDRAM), a Double Data Rate SDRAM (DDR SDRAM), an Enhanced SDRAM (ESDRAM), a SyncLink DRAM (SLDRAM), or a Direct Rambus RAM (DRRAM). The memory 109 in the embodiments of the present application includes, but is not limited to, these and any other suitable types of memory.
Processor 110 may include one or more processing units. Optionally, the processor 110 integrates an application processor and a modem processor, where the application processor mainly handles operations involving the operating system, user interface, applications, and the like, and the modem processor mainly handles wireless communication signals, such as a baseband processor. It can be appreciated that the modem processor may alternatively not be integrated into the processor 110.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements the processes of the foregoing method embodiments, and can achieve the same technical effects, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a computer read only memory ROM, a random access memory RAM, a magnetic or optical disk, and the like.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of the foregoing method embodiment, and the same technical effect can be achieved.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
Embodiments of the present application provide a computer program product, where the program product is stored in a storage medium, and the program product is executed by at least one processor to implement the processes of the foregoing image generation method embodiments, and can achieve the same technical effects, and in order to avoid repetition, details are not repeated here.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatuses in the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions recited, e.g., the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. Additionally, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a computer software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (14)

1. An image generation method applied to an electronic device including an image sensor, the method comprising:
determining N photosensitive areas which correspond to N image areas of a preview image one by one from the photosensitive areas of the image sensor, wherein the brightness parameters of each image area are different, the N photosensitive areas correspond to N pixel value reading modes one by one, and N is an integer which is greater than or equal to 2;
acquiring N groups of original image data through the N photosensitive areas, wherein the N photosensitive areas correspond to the N groups of original image data one by one;
reading out pixel values corresponding to the N groups of original image data by adopting the N pixel value reading-out modes to obtain N groups of target image data;
and obtaining a target image according to the N groups of target image data.
2. The method of claim 1, wherein prior to determining the N photosensitive regions that correspond one-to-one to the N image regions of the preview image, the method further comprises:
acquiring a brightness parameter of a preview image;
determining N image areas based on the brightness parameter;
wherein, the brightness range corresponding to each image area in the N image areas is different.
3. The method of claim 2, wherein before the obtaining the brightness parameter of the preview image, the method further comprises:
acquiring original preview image data through the image sensor;
reading out the pixel value of the original preview image data by adopting a first pixel value reading mode to obtain preview image data;
obtaining the preview image according to the preview image data;
wherein the first pixel value readout mode is: and summing pixel values corresponding to adjacent pixel units with the same color in the image sensor, and reading out the sum.
4. The method according to claim 1, wherein each of the N pixel value readout modes comprises any one of the following: summing the pixel values of adjacent pixel units with the same color in the image sensor and then reading out the sum; averaging the pixel values of adjacent pixel units with the same color in the image sensor and then reading out the average; reading out the maximum pixel value among adjacent pixel units with the same color in the image sensor; and reading out the pixel value of each pixel unit in the image sensor one by one.
5. The method according to claim 4, wherein the N photosensitive regions comprise a target photosensitive region, and the target photosensitive region corresponds to a second pixel value reading mode of the N pixel value reading modes, wherein the second pixel value reading mode is to read out the pixel value of each pixel unit in the image sensor one by one;
the acquiring N groups of original image data through the N photosensitive areas comprises:
adjusting exposure gain corresponding to the target photosensitive area to obtain brightness parameters same as those of other photosensitive areas; and the other photosensitive areas are photosensitive areas of which the brightness ranges meet a preset threshold value in the N photosensitive areas.
6. The method according to claim 1, wherein the reading out the pixel values corresponding to the N sets of original image data by using the N pixel value readout modes to obtain N sets of target image data comprises:
adding a different identifier to each set of target image data, wherein the identifiers indicate the pixel value readout modes corresponding to the N sets of target image data.
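The tagging step of claim 6 might look like the sketch below. The identifier scheme (a string naming the readout mode) is an assumption for illustration; the patent only requires that each set of target image data carry a distinct mark indicating its readout mode.

```python
# Hypothetical sketch of claim 6's tagging: each set of target image data is
# paired with an identifier naming the readout mode used, so a downstream
# merge step can interpret the differing value scales correctly.

def tag_groups(groups, modes):
    # groups: N sets of target image data; modes: their readout mode names.
    return [{"mode": m, "data": g} for m, g in zip(modes, groups)]

tagged = tag_groups([[460], [130]], ["sum", "max"])
print(tagged[0]["mode"])  # sum
print(tagged[1]["data"])  # [130]
```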
7. An image generation apparatus, characterized in that the image generation apparatus comprises a determining module, an acquisition module, a readout module and an obtaining module;
the determining module is configured to determine, from the photosensitive regions of the image sensor, N photosensitive regions in one-to-one correspondence with N image regions of a preview image, wherein the brightness parameters of the image regions differ from one another, the N photosensitive regions are in one-to-one correspondence with N pixel value readout modes, and N is an integer greater than or equal to 2;
the acquisition module is configured to acquire N sets of original image data through the N photosensitive regions, wherein the N photosensitive regions are in one-to-one correspondence with the N sets of original image data;
the readout module is configured to read out the pixel values corresponding to the N sets of original image data by using the N pixel value readout modes to obtain N sets of target image data;
and the obtaining module is configured to obtain a target image according to the N sets of target image data.
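The module pipeline of claim 7 can be summarized end to end in a minimal sketch. All names are illustrative assumptions: each photosensitive region's raw pixel groups are read with that region's own mode, and the per-region results are then combined into one target image (here by simple concatenation; the patent does not fix the merge operation).

```python
# Minimal sketch of the claimed pipeline: per-region readout followed by a
# combining step. Region splitting and merge strategy are assumptions.

def generate_image(regions):
    # regions: list of (raw_groups, readout_fn) pairs, one per photosensitive
    # region, mirroring the one-to-one correspondences in claim 7.
    target_sets = []
    for raw_groups, readout in regions:
        target_sets.append([readout(g) for g in raw_groups])
    # "Obtaining" step: here simply flatten the per-region target data.
    return [value for group in target_sets for value in group]

dark = ([[10, 12, 11, 13]], sum)        # dark region: sum readout
bright = ([[200, 210, 205, 215]], max)  # bright region: max readout
print(generate_image([dark, bright]))   # [46, 215]
```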
8. The apparatus according to claim 7, wherein the obtaining module is further configured to obtain the brightness parameter of the preview image before the determining module determines the N photosensitive regions in one-to-one correspondence with the N image regions of the preview image;
the determining module is further configured to determine the N image regions based on the brightness parameter;
wherein each of the N image regions corresponds to a different brightness range.
9. The apparatus according to claim 8, wherein the acquisition module is further configured to acquire original preview image data through the image sensor before the obtaining module obtains the brightness parameter of the preview image;
the readout module is further configured to read out the pixel values of the original preview image data in a first pixel value readout mode to obtain preview image data;
the obtaining module is further configured to obtain the preview image according to the preview image data;
wherein the first pixel value readout mode is: summing the pixel values corresponding to adjacent pixel units of the same color in the image sensor and reading out the sum.
10. The apparatus according to claim 7, wherein each of the N pixel value readout modes is any one of: summing the pixel values corresponding to adjacent pixel units of the same color in the image sensor and then reading out the sum; averaging the pixel values corresponding to adjacent pixel units of the same color in the image sensor and then reading out the average; reading out the maximum pixel value among adjacent pixel units of the same color in the image sensor; and reading out the pixel value of each pixel unit in the image sensor one by one.
11. The apparatus according to claim 10, wherein the N photosensitive regions comprise a target photosensitive region, the target photosensitive region corresponding to a second pixel value readout mode of the N pixel value readout modes, the second pixel value readout mode being reading out the pixel value of each pixel unit in the image sensor one by one;
the acquisition module is further configured to adjust the exposure gain corresponding to the target photosensitive region to obtain the same brightness parameter as the other photosensitive regions, wherein the other photosensitive regions are the photosensitive regions, among the N photosensitive regions, whose brightness ranges meet a preset threshold.
12. The apparatus according to claim 7, wherein the readout module is specifically configured to add a different identifier to each set of target image data, wherein the identifiers indicate the pixel value readout modes corresponding to the N sets of target image data.
13. An electronic device comprising a processor, a memory, and a program or instructions stored on the memory and executable on the processor, the program or instructions when executed by the processor implementing the steps of the image generation method of any of claims 1 to 6.
14. A readable storage medium, characterized in that it stores thereon a program or instructions which, when executed by a processor, implement the steps of the image generation method according to any one of claims 1 to 6.
CN202210730378.1A 2022-06-24 2022-06-24 Image generation method, image generation device, electronic equipment and storage medium Pending CN115118889A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210730378.1A CN115118889A (en) 2022-06-24 2022-06-24 Image generation method, image generation device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210730378.1A CN115118889A (en) 2022-06-24 2022-06-24 Image generation method, image generation device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115118889A true CN115118889A (en) 2022-09-27

Family

ID=83329810

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210730378.1A Pending CN115118889A (en) 2022-06-24 2022-06-24 Image generation method, image generation device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115118889A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015026192A (en) * 2013-07-25 2015-02-05 カシオ計算機株式会社 Image processor, image processing method and program
CN107172363A (en) * 2017-04-07 2017-09-15 深圳市金立通信设备有限公司 A kind of image processing method and terminal
CN109005361A (en) * 2018-08-06 2018-12-14 Oppo广东移动通信有限公司 Control method, device, imaging device, electronic equipment and readable storage medium storing program for executing
CN111614908A (en) * 2020-05-29 2020-09-01 Oppo广东移动通信有限公司 Image processing method, image processing device, electronic equipment and storage medium
CN111885312A (en) * 2020-07-27 2020-11-03 展讯通信(上海)有限公司 HDR image imaging method, system, electronic device and storage medium
CN112437237A (en) * 2020-12-16 2021-03-02 维沃移动通信有限公司 Shooting method and device


Similar Documents

Publication Publication Date Title
JP4720859B2 (en) Image processing apparatus, image processing method, and program
JP4898761B2 (en) Apparatus and method for correcting image blur of digital image using object tracking
CN103843033B (en) Image processing equipment and method and program
US9538085B2 (en) Method of providing panoramic image and imaging device thereof
US20100238325A1 (en) Image processor and recording medium
JP6308748B2 (en) Image processing apparatus, imaging apparatus, and image processing method
CN101083773B (en) Apparatus and method of gamma correction in digital image processing device
CN105282429A (en) Imaging device, and control method for imaging device
CN104704807A (en) Image-processing device, image-capturing device, image-processing method, and recording medium
CN105432068A (en) Imaging device, imaging method, and image processing device
JP4992698B2 (en) Chromatic aberration correction apparatus, imaging apparatus, chromatic aberration calculation method, and chromatic aberration calculation program
JPH11112956A (en) Image composite and communication equipment
CN113014803A (en) Filter adding method and device and electronic equipment
CN105453540A (en) Image processing device, imaging device, image processing method, and program
CN112565603B (en) Image processing method and device and electronic equipment
KR20140106221A (en) Photographing method and apparatus using multiple image sensors
JPH10336494A (en) Digital camera with zoom display function
CN105122787A (en) Image pickup device, calibration system, calibration method, and program
Lukac Single-sensor imaging in consumer digital cameras: a survey of recent advances and future directions
CN115118889A (en) Image generation method, image generation device, electronic equipment and storage medium
CN109218602A (en) Image capture unit, image treatment method and electronic device
CN114298889A (en) Image processing circuit and image processing method
KR100810344B1 (en) Apparatus for digital photographing and method for smear appearance correction and detecting using the same
JP2005277618A (en) Photography taking apparatus and device and method for correcting shading
JP4687619B2 (en) Image processing apparatus, image processing method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination