CN110874829B - Image processing method and device, electronic device and storage medium - Google Patents

Image processing method and device, electronic device and storage medium

Info

Publication number
CN110874829B
Authority
CN
China
Prior art keywords
image
resolution
pixel arrangement
arrangement mode
preset
Prior art date
Legal status
Active
Application number
CN201811012444.1A
Other languages
Chinese (zh)
Other versions
CN110874829A (en)
Inventor
豆子飞
Current Assignee
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN201811012444.1A priority Critical patent/CN110874829B/en
Publication of CN110874829A publication Critical patent/CN110874829A/en
Application granted granted Critical
Publication of CN110874829B publication Critical patent/CN110874829B/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 - Image enhancement or restoration
    • G06T5/50 - Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/90 - Determination of colour characteristics

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)

Abstract

The present disclosure relates to an image processing method and apparatus, an electronic device, and a storage medium. The method includes: when a photographing instruction is received, simultaneously outputting, by a photosensitive element, a first image with a first pixel arrangement mode and a second image with a second pixel arrangement mode, wherein pixels of the same color component in the first pixel arrangement mode are distributed in a square array and the second pixel arrangement mode is a standard Bayer arrangement mode; and fusing the first image and the second image to obtain an image to be displayed. With this technical solution, the resolution and the light sensitivity of the resulting image to be displayed can be balanced, enhancing image quality.

Description

Image processing method and device, electronic device and storage medium
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image processing method and apparatus, an electronic device, and a storage medium.
Background
In the related art, the resolution of an image captured by a mobile phone's front camera varies with the ambient light. By adjusting the resolution, high resolution can be achieved in a bright environment and high image quality in a low-light environment. However, in a bright environment image quality is sacrificed to keep the resolution high, and in a low-light environment resolution is sacrificed to keep the image quality high, so the photographing methods in the related art cannot give a photograph both good image quality and high resolution.
Disclosure of Invention
In order to overcome the problems in the related art, embodiments of the present disclosure provide an image processing method and apparatus, an electronic device, and a storage medium that can balance image resolution and sensitivity and thereby enhance image quality.
According to a first aspect of embodiments of the present disclosure, there is provided an image processing method, including:
when a photographing instruction is received, simultaneously outputting a first image with a first pixel arrangement mode and a second image with a second pixel arrangement mode by using a photosensitive element;
the pixels with the same color component in the first pixel arrangement mode are distributed in a square array mode, and the second pixel arrangement mode is a standard Bayer arrangement mode;
and fusing the first image and the second image to obtain an image to be displayed.
According to a second aspect of the embodiments of the present disclosure, there is provided an image processing apparatus including:
the image output module is used for outputting a first image with a first pixel arrangement mode and a second image with a second pixel arrangement mode simultaneously by using the photosensitive element when receiving a photographing instruction;
the pixels with the same color component in the first pixel arrangement mode are distributed in a square array mode, and the second pixel arrangement mode is a standard Bayer arrangement mode;
and the image fusion module is used for fusing the first image and the second image output by the image output module to obtain an image to be displayed.
According to a third aspect of the embodiments of the present disclosure, there is provided an electronic apparatus including:
the photosensitive element is used for simultaneously outputting a first image with a first pixel arrangement mode and a second image with a second pixel arrangement mode when a photographing instruction is received; the pixels with the same color component in the first pixel arrangement mode are distributed in a square array mode, and the second pixel arrangement mode is a standard Bayer arrangement mode;
a memory for storing processor-executable instructions;
and the processor is used for fusing the first image and the second image to obtain an image to be displayed.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
one pixel on the first image is obtained from four pixels of the same color component on the photosensitive element, which effectively increases the photosensitive area per pixel, so the first image has low resolution and high light sensitivity. The second image is obtained by rearranging the photosensitive element's native pixels, which preserves the element's native resolution, so the second image has high resolution and low sensitivity. By fusing the first image and the second image, the resolution and the sensitivity of the resulting image to be displayed can be balanced, enhancing image quality.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
FIG. 1A is a flow diagram illustrating an image processing method according to an exemplary embodiment.
FIG. 1B is a schematic diagram of the arrangement of pixels on the photosensitive element in the embodiment shown in FIG. 1A.
FIG. 1C is a schematic diagram of a first pixel arrangement of the photosensitive element in the embodiment shown in FIG. 1A.
FIG. 1D is a diagram illustrating a second pixel arrangement of the photosensitive element in the embodiment shown in FIG. 1A.
Fig. 2 is a flowchart illustrating an image processing method according to another exemplary embodiment.
Fig. 3 is a flow chart of how to determine the preset resolution for the exemplary embodiment shown in fig. 2.
Fig. 4A is a flowchart of how to determine a preset resolution for another exemplary embodiment shown in fig. 2.
Fig. 4B is a diagram of setting options in the exemplary embodiment shown in fig. 4A.
Fig. 5 is a block diagram illustrating an image processing apparatus according to an exemplary embodiment.
Fig. 6 is a block diagram illustrating an image processing apparatus according to another exemplary embodiment.
Fig. 7 is a block diagram illustrating an image processing apparatus according to still another exemplary embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. The following description refers to the accompanying drawings, in which the same numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present invention. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the invention, as detailed in the appended claims.
FIG. 1A is a flow chart illustrating an image processing method according to an exemplary embodiment; FIG. 1B is a schematic diagram of the arrangement of pixels on the photosensitive element in the embodiment shown in FIG. 1A; FIG. 1C is a schematic diagram of the first pixel arrangement of the photosensitive element in the embodiment shown in FIG. 1A; and FIG. 1D is a schematic diagram of the second pixel arrangement of the photosensitive element in the embodiment shown in FIG. 1A. The image processing method can be applied to electronic devices (for example, smart phones, tablet computers, and digital cameras). As shown in FIG. 1A, the image processing method includes the following steps S101 and S102:
in step S101, when a photographing instruction is received, a first image having a first pixel arrangement and a second image having a second pixel arrangement are simultaneously output by using a photosensitive element; the pixels of the same color component are distributed in a square array in the first pixel arrangement mode, and the second pixel arrangement mode is a standard Bayer arrangement mode.
In one embodiment, the first image is acquired based on the first pixel arrangement of the photosensitive element; the second image is acquired, based on the second pixel arrangement, from the same frame of raw image data as the first image; and the first image and the second image are output simultaneously.
In one embodiment, the square array may be an N × N array, where N is greater than or equal to 2. As shown in FIG. 1B, taking a 2 × 2 square array as an example, the pixel arrangement mode inherent to the photosensitive element is a four-in-one (4-in-1) arrangement; that is, when each 2 × 2 array of the same color component is treated as one large pixel, the resolution of the first image is one quarter of the resolution of the photosensitive element. Based on the color components of the 2 × 2 arrays shown in FIG. 1B, the color component of the large pixel corresponding to each 2 × 2 array can be determined, yielding the first pixel arrangement shown in FIG. 1C. In one embodiment, a frame of raw image data as shown in FIG. 1B is collected by the photosensitive element and converted into the first pixel arrangement shown in FIG. 1C to obtain the first image; a minimal sketch of this binning step is given below.
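The patent does not specify how the 2 × 2 blocks are combined; the sketch below is a minimal software approximation, assuming averaging (on-sensor charge binning would sum instead). The function name and the use of numpy are illustrative assumptions.

```python
import numpy as np

def bin_quad_bayer(raw: np.ndarray) -> np.ndarray:
    """Collapse each 2x2 block of same-colour pixels into one large pixel.

    `raw` is a single-channel Quad Bayer frame whose height and width are
    divisible by 2; the result has a quarter of the pixel count and the
    Bayer-like layout of the first pixel arrangement (Fig. 1C).
    """
    h, w = raw.shape
    blocks = raw.reshape(h // 2, 2, w // 2, 2).astype(np.float32)
    # Averaging is an assumption; on-sensor binning may sum the charges.
    return blocks.mean(axis=(1, 3))  # first image: low resolution, high sensitivity
```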
In one embodiment, the standard Bayer arrangement pattern is shown in FIG. 1D, and the second image is obtained by rearranging (remosaicing) the pixels of the raw image data shown in FIG. 1B according to the second pixel arrangement shown in FIG. 1D; a toy remosaic sketch is given below. The first image and the second image may be acquired by the sensor and output simultaneously to the processor, which then performs the following step S102.
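The remosaic algorithm itself is not described in the patent. The sketch below is a naive nearest-sample remosaic, assuming the Quad Bayer tile of Fig. 1B has its 2 × 2 R block at the top left (R, G over G, B), that the target is an RGGB standard Bayer tile, and that the frame size is divisible by 4; the mapping table is an illustrative assumption, not the sensor's actual method.

```python
import numpy as np

# For each position in a 4x4 Quad Bayer tile (assumed layout: 2x2 R | 2x2 G
# over 2x2 G | 2x2 B), copy the nearest sample of the colour that a standard
# RGGB Bayer tile expects at that position. Positions not listed already hold
# the correct colour and are copied unchanged.
_REMOSAIC_MAP = {
    (0, 1): (0, 2), (0, 2): (0, 1),
    (1, 0): (2, 0), (1, 1): (2, 2), (1, 3): (2, 3),
    (2, 0): (1, 0), (2, 2): (1, 1), (2, 3): (1, 3),
    (3, 1): (3, 2), (3, 2): (3, 1),
}

def remosaic_quad_to_bayer(raw: np.ndarray) -> np.ndarray:
    """Rearrange a Quad Bayer frame into a standard Bayer frame at full resolution."""
    out = raw.copy()  # assumes height and width divisible by 4
    for (tr, tc), (sr, sc) in _REMOSAIC_MAP.items():
        out[tr::4, tc::4] = raw[sr::4, sc::4]
    return out        # second image: native resolution, standard Bayer layout
```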
In step S102, the first image and the second image are fused to obtain an image to be displayed.
In an embodiment, a first RGB image may be obtained by performing an interpolation (demosaicing) operation on the first image, and a second RGB image may be obtained by performing an interpolation operation on the second image; the first RGB image and the second RGB image are then fused to obtain the image to be displayed on the user interface of the camera application. For the specific interpolation algorithm, reference may be made to the related art; it is not detailed in this disclosure. A minimal demosaicing sketch is given below.
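Since the interpolation algorithm is left to the related art, the sketch below simply uses OpenCV's bilinear Bayer-to-RGB conversion as a placeholder. The specific COLOR_Bayer* code, the 8-bit input, and the use of OpenCV at all are assumptions that would need to match the actual sensor output.

```python
import cv2
import numpy as np

def demosaic(bayer: np.ndarray) -> np.ndarray:
    """Bilinear interpolation of a single-channel Bayer frame to an RGB image."""
    # COLOR_BayerBG2RGB is used here only as a placeholder; the correct code
    # depends on the sensor's Bayer phase. Assumes 8-bit raw data (16-bit
    # frames also work with cvtColor).
    return cv2.cvtColor(bayer.astype(np.uint8), cv2.COLOR_BayerBG2RGB)

# Hypothetical usage with the two outputs of the photosensitive element:
# first_rgb = demosaic(first_image)    # quarter resolution, high sensitivity
# second_rgb = demosaic(second_image)  # native resolution, standard Bayer
```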
In this embodiment, one pixel on the first image is obtained from four pixels of the same color component on the photosensitive element, which effectively increases the photosensitive area per pixel, so the first image has low resolution and high sensitivity. The second image is obtained by rearranging the photosensitive element's native pixels, which preserves the element's native resolution, so the second image has high resolution and low sensitivity. By fusing the first image and the second image, the resolution and the sensitivity of the resulting image to be displayed can be balanced, enhancing image quality.
The technical solutions provided by the embodiments of the present disclosure are described below with specific embodiments.
FIG. 2 is a flow diagram illustrating an image processing method according to another exemplary embodiment. This embodiment builds on the method provided above and describes, by way of example, how to generate the image to be displayed from the first image and the second image. As shown in FIG. 2, the method includes the following steps:
in step S201, when a photographing instruction is received, a first image having a first pixel arrangement and a second image having a second pixel arrangement are output simultaneously by using a photosensitive element; the pixels of the same color component are distributed in a square array in the first pixel arrangement mode, and the second pixel arrangement mode is a standard Bayer arrangement mode.
The description of step S201 can refer to the description of step S101 in the embodiment shown in fig. 1A, and is not described in detail here.
In step S202, the first image is enlarged from the first resolution to a preset resolution to obtain a third image.
In one embodiment, the first resolution is one quarter of the intrinsic resolution of the photosensitive element; for example, as shown in FIG. 1B, if the intrinsic resolution of the photosensitive element is 16 × 16, the first resolution is 8 × 8.
How the preset resolution is determined is described in the embodiments of FIG. 3 and FIG. 4A below and is not detailed in this embodiment. In an embodiment, for the algorithm used to enlarge the first image, reference may be made to the related art; this embodiment does not limit the enlargement algorithm.
In step S203, the second image is reduced from the second resolution to a preset resolution to obtain a fourth image.
In an embodiment, for the algorithm used to reduce the second image, reference may be made to the related art; this embodiment does not limit the reduction algorithm.
In step S204, the third image and the fourth image are fused to obtain an image to be displayed.
In an embodiment, an interpolation operation may be performed on the third image to obtain a fifth RGB image corresponding to the third image; similarly, an interpolation operation is performed on the fourth image to obtain a sixth RGB image corresponding to the fourth image; finally, the fifth RGB image and the sixth RGB image are fused to obtain the image to be displayed. A sketch of this enlarge, reduce, and fuse pipeline is given below.
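A minimal sketch of steps S202 to S204. For simplicity it scales already-demosaiced RGB images, whereas the description scales the Bayer images first and interpolates the third and fourth images afterwards; the interpolation modes, the global 50/50 weighted average, and the use of OpenCV are all assumptions, since the patent leaves the scaling and fusion algorithms to the related art.

```python
import cv2
import numpy as np
from typing import Tuple

def fuse_at_preset_resolution(first_rgb: np.ndarray, second_rgb: np.ndarray,
                              preset_wh: Tuple[int, int],
                              weight_first: float = 0.5) -> np.ndarray:
    """Enlarge the first image, reduce the second image, and blend them.

    `preset_wh` is the preset resolution as (width, height); `weight_first`
    controls how strongly the high-sensitivity image dominates the result.
    """
    third = cv2.resize(first_rgb, preset_wh, interpolation=cv2.INTER_CUBIC)   # enlarge (S202)
    fourth = cv2.resize(second_rgb, preset_wh, interpolation=cv2.INTER_AREA)  # reduce (S203)
    # Global weighted fusion (S204); a real pipeline might instead fuse per
    # region or per frequency band.
    return cv2.addWeighted(third, weight_first, fourth, 1.0 - weight_first, 0)
```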
In an embodiment, the image to be displayed may be subjected to edge sharpening, image enhancement, and the like, so as to further improve the visual experience of the user.
In this embodiment, the first image and the second image are adjusted to a third image and a fourth image of the same resolution, and the image to be displayed is obtained by fusing these two same-resolution images.
FIG. 3 is a flow chart of how the preset resolution is determined for the exemplary embodiment shown in FIG. 2. This embodiment describes, by way of example, how the preset resolution may be determined using the method provided above. As shown in FIG. 3, the method includes the following steps:
in step S301, the illumination intensity of the ambient light is detected.
In one embodiment, the brightness of the entire image frame may be determined from the raw image captured by the image sensor, and the illumination intensity of the ambient light determined from that brightness; a rough sketch of such an estimate is given below. In another embodiment, the illumination intensity of the ambient light may be detected by a light sensor disposed on the electronic device in which the image sensor is located.
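The patent does not give a formula for deriving illuminance from frame brightness. The sketch below is an illustrative assumption only: it normalizes the mean raw level and divides by exposure time and gain, with a per-device calibration factor standing in for aperture and sensor response.

```python
import numpy as np

def estimate_lux(raw: np.ndarray, exposure_s: float, gain: float,
                 calibration: float = 1.0) -> float:
    """Roughly estimate ambient illuminance from mean raw frame brightness.

    `raw` is assumed to be an integer-typed sensor frame; `calibration`
    would be measured per device. The whole formula is an illustrative
    assumption, not the patent's method.
    """
    mean_level = float(raw.mean()) / float(np.iinfo(raw.dtype).max)
    return calibration * mean_level / (exposure_s * gain)
```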
In step S302, a preset resolution is determined based on the illumination intensity, the first resolution, and the second resolution.
In an embodiment, a lookup table mapping illumination intensity to a specific value of the preset resolution may be compiled in advance. When the illumination intensity is greater than or equal to a first threshold, the preset resolution is set equal to the second resolution; when the illumination intensity is less than or equal to a second threshold, the preset resolution is set equal to the first resolution; and when the illumination intensity lies between the two thresholds, the preset resolution corresponding to the illumination intensity is looked up in the table. The first threshold is greater than the second threshold; for example, the first threshold is 800 lux (Lux) and the second threshold is 100 Lux. An example lookup table is given in Table 1 below, and a sketch of this selection logic follows the table.
Illumination intensity    Preset resolution
< 100 Lux                 2560 × 1920
150 Lux                   2000 × 1000
750 Lux                   5000 × 3000
> 800 Lux                 5120 × 3840
TABLE 1
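A sketch of the threshold-and-lookup selection described above, using the example thresholds (800 Lux and 100 Lux) and the resolutions of Table 1. Choosing the table entry nearest to the measured value between the thresholds is an assumption; the description only requires that the value be found in a correspondence table.

```python
from typing import Tuple

def preset_resolution(lux: float,
                      first_res: Tuple[int, int] = (2560, 1920),
                      second_res: Tuple[int, int] = (5120, 3840),
                      high_thresh: float = 800.0,
                      low_thresh: float = 100.0,
                      table=((150.0, (2000, 1000)), (750.0, (5000, 3000)))):
    """Pick the preset resolution from the ambient illuminance (Table 1)."""
    if lux >= high_thresh:
        return second_res  # bright scene: keep the sensor's native resolution
    if lux <= low_thresh:
        return first_res   # dim scene: favour sensitivity
    # Between the thresholds, use the table entry closest to the measured lux.
    return min(table, key=lambda entry: abs(entry[0] - lux))[1]
```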
The preset resolution is determined from the illumination intensity, its relationship to the first and second thresholds, and the first and second resolutions. This ensures that the resolution of the image to be displayed follows the illumination intensity while the image retains good image quality and a suitably high resolution.
This embodiment may be executed before step S203 of the embodiment shown in FIG. 2, or while any of steps S201 to S202 is executed; the present disclosure does not limit when the preset resolution is obtained, as long as a reasonable preset resolution is available when step S203 is executed.
In this embodiment, the preset resolution is determined from the illumination intensity, the first resolution, and the second resolution, so different illumination intensities yield different preset resolutions, and the resolution of the image to be displayed, obtained by fusing the third image and the fourth image, changes accordingly. When the illumination intensity is high, the image to be displayed can have a higher resolution; when the illumination intensity is low, it can have higher clarity and a higher signal-to-noise ratio. The image to be displayed therefore keeps a good balance of resolution and signal-to-noise ratio across different illumination conditions.
FIG. 4A is a flowchart of how the preset resolution is determined for another exemplary embodiment shown in FIG. 2, and FIG. 4B is a schematic diagram of the setting option in the exemplary embodiment shown in FIG. 4A. This embodiment describes, by way of example, how the preset resolution may be determined using the method provided above. As shown in FIG. 4A, the method includes the following steps:
In step S401, a prompt box for setting the preset resolution is displayed on the currently displayed image preview interface.
In step S402, the resolution input by the user in the prompt box is detected, and the input resolution is determined as the preset resolution.
As shown in fig. 4B, on the currently displayed image preview interface, a prompt box 41 may be displayed, and the content in the prompt box 41 is used to prompt the user of the resolution of the photo generated after the photographing is completed. After the user sets the specific value of the preset resolution through the prompt box, the user triggers the photographing key, and the resolution of the generated photo is the preset resolution.
In this embodiment, the preset resolution is set through the user interface, so the user can select a suitable photo resolution according to their specific needs, which improves the photographing experience.
The embodiments shown in FIG. 3 and FIG. 4A may be executed before step S203 of the embodiment shown in FIG. 2, or while any of steps S201 to S202 is executed; the present disclosure does not limit when the preset resolution is obtained, as long as a reasonable preset resolution is available when step S203 is executed.
Fig. 5 is a block diagram illustrating an image processing apparatus according to an exemplary embodiment, as illustrated in fig. 5, the image processing apparatus including:
an image output module 51, configured to output a first image with a first pixel arrangement and a second image with a second pixel arrangement simultaneously by using a photosensitive element when receiving a photographing instruction;
the pixels with the same color component are distributed in a square array in the first pixel arrangement mode, and the second pixel arrangement mode is a standard Bayer arrangement mode;
and an image fusion module 52, configured to fuse the first image and the second image output by the image output module 51 to obtain an image to be displayed.
Fig. 6 is a block diagram of an image processing apparatus according to another exemplary embodiment, and as shown in fig. 6, on the basis of the above-mentioned embodiment shown in fig. 5, the image fusion module 52 may include:
the amplifying unit 521 is configured to amplify the first image from the first resolution to a preset resolution to obtain a third image;
a reducing unit 522, configured to reduce the second image from the second resolution to a preset resolution to obtain a fourth image;
a fusion unit 523, configured to fuse the third image obtained by the enlarging unit 521 and the fourth image obtained by the reducing unit 522 to obtain an image to be displayed.
In an embodiment, the apparatus further comprises:
a first detection module 53, configured to detect an illumination intensity of the ambient light;
a determining module 54, configured to determine a preset resolution based on the illumination intensity detected by the first detecting module 53, the first resolution, and the second resolution.
In an embodiment, the apparatus further comprises:
a display module 55, configured to display a prompt box for setting a preset resolution on a preview interface for displaying an image;
a second detecting module 56, configured to detect a resolution input in the prompt box displayed by the display module 55, and determine the input resolution as a preset resolution.
In one embodiment, the image output module 51 may include:
a first acquisition unit 511 configured to acquire a first image based on a first pixel arrangement of the photosensitive elements;
a second acquisition unit 512 configured to acquire a second image based on a second pixel arrangement within the same frame of native image data as the first image acquired by the first acquisition unit 511;
an output unit 513 is configured to output the first image acquired by the first acquisition unit and the second image acquired by the second acquisition unit at the same time.
In one embodiment, the first pixel arrangement is a Quad Bayer arrangement pattern.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
Fig. 7 is a block diagram illustrating an image processing apparatus according to yet another exemplary embodiment. For example, the apparatus 700 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 7, apparatus 700 may include one or more of the following components: processing component 702, memory 704, power component 706, multimedia component 708, audio component 710, input/output (I/O) interface 712, sensor component 714, and communications component 716.
The processing component 702 generally controls overall operation of the device 700, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing element 702 may include one or more processors 720 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 702 may include one or more modules that facilitate interaction between the processing component 702 and other components. For example, the processing component 702 may include a multimedia module to facilitate interaction between the multimedia component 708 and the processing component 702.
The memory 704 is configured to store various types of data to support operation at the device 700. Examples of such data include instructions for any application or method operating on device 700, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 704 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power component 706 provides power to the various components of the device 700. The power components 706 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the apparatus 700.
The multimedia component 708 includes a screen that provides an output interface between the device 700 and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 708 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the device 700 is in an operating mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 710 is configured to output and/or input audio signals. For example, audio component 710 includes a Microphone (MIC) configured to receive external audio signals when apparatus 700 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signal may further be stored in the memory 704 or transmitted via the communication component 716. In some embodiments, audio component 710 also includes a speaker for outputting audio signals.
The I/O interface 712 provides an interface between the processing component 702 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 714 includes one or more sensors for providing status assessments of various aspects of the apparatus 700. For example, the sensor assembly 714 may detect an open/closed state of the apparatus 700 and the relative positioning of components, such as the display and keypad of the apparatus 700; it may also detect a change in position of the apparatus 700 or of a component of the apparatus 700, the presence or absence of user contact with the apparatus 700, the orientation or acceleration/deceleration of the apparatus 700, and a change in temperature of the apparatus 700. The sensor assembly 714 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 714 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 714 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 716 is configured to facilitate wired or wireless communication between the apparatus 700 and other devices. The apparatus 700 may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In an exemplary embodiment, the communication component 716 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 716 further includes a Near Field Communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 700 may be implemented by one or more Application Specific Integrated Circuits (ASICs), digital Signal Processors (DSPs), digital Signal Processing Devices (DSPDs), programmable Logic Devices (PLDs), field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, the sensor assembly 714 may further include a photosensitive element for simultaneously outputting a first image with a first pixel arrangement and a second image with a second pixel arrangement when a photographing instruction is received; pixels of the same color component in the first pixel arrangement are distributed in a square array, and the second pixel arrangement is a standard Bayer arrangement. The processor 720 fuses the first image and the second image to obtain the image to be displayed.
In an exemplary embodiment, a non-transitory computer-readable storage medium including instructions, such as the memory 704 including computer instructions, is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements that have been described above and shown in the drawings, and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (13)

1. An image processing method, characterized in that the method comprises:
when a photographing instruction is received, simultaneously outputting a first image with a first pixel arrangement mode and a second image with a second pixel arrangement mode by using a photosensitive element;
the pixels with the same color component in the first pixel arrangement mode are distributed in a square array mode, and the second pixel arrangement mode is a standard Bayer arrangement mode;
adjusting a first resolution of the first image and a second resolution of the second image to preset resolutions respectively; wherein the preset resolution is determined based on the illumination intensity of the ambient light, the first resolution, and the second resolution;
and fusing the first image with the adjusted resolution and the second image with the adjusted resolution to obtain an image to be displayed.
2. The method of claim 1, wherein the adjusting the first resolution of the first image and the second resolution of the second image to preset resolutions respectively comprises:
amplifying the first image from a first resolution to a preset resolution;
and reducing the second image from the second resolution to the preset resolution.
3. The method of claim 1, further comprising:
and detecting the illumination intensity of the ambient light to obtain the illumination intensity.
4. The method of claim 2, further comprising:
displaying a prompt box for setting the preset resolution on a preview interface for displaying an image;
and detecting the resolution input in the prompt box, and determining the input resolution as the preset resolution.
5. The method of claim 1, wherein outputting a first image having a first pixel arrangement and a second image having a second pixel arrangement simultaneously using a photosensitive element comprises:
acquiring a first image based on a first pixel arrangement mode of a photosensitive element;
acquiring a second image based on the second pixel arrangement mode in the same frame of original image data as the first image;
outputting the first image and the second image simultaneously.
6. The method of any of claims 1-5, wherein the first pixel arrangement is a Quad Bayer arrangement pattern.
7. An image processing apparatus, characterized in that the apparatus comprises:
the image output module is used for outputting a first image with a first pixel arrangement mode and a second image with a second pixel arrangement mode simultaneously by using the photosensitive element when receiving a photographing instruction;
the first pixel arrangement mode is a Quad Bayer arrangement mode, and the second pixel arrangement mode is a standard Bayer arrangement mode;
the image fusion module is used for adjusting the first resolution of the first image and the second resolution of the second image to preset resolutions respectively; wherein the preset resolution is determined based on the illumination intensity of the ambient light, the first resolution, and the second resolution; and fusing the first image with the adjusted resolution and the second image with the adjusted resolution to obtain an image to be displayed.
8. The apparatus of claim 7, wherein the image fusion module comprises:
the amplifying unit is used for amplifying the first image from a first resolution to a preset resolution;
a reducing unit, configured to reduce the second image from a second resolution to the preset resolution.
9. The apparatus of claim 8, further comprising:
the first detection module is used for detecting the illumination intensity of the ambient light to obtain the illumination intensity.
10. The apparatus of claim 8, further comprising:
the display module is used for displaying a prompt box for setting the preset resolution on a preview interface for displaying an image;
and the second detection module is used for detecting the resolution input in the prompt box displayed by the display module and determining the input resolution as the preset resolution.
11. The apparatus of claim 7, wherein the image output module comprises:
a first acquisition unit configured to acquire a first image based on a first pixel arrangement of the photosensitive element;
a second acquisition unit configured to acquire a second image based on the second pixel arrangement in the same frame of raw image data as the first image acquired by the first acquisition unit;
and the output unit is used for simultaneously outputting the first image acquired by the first acquisition unit and the second image acquired by the second acquisition unit.
12. The device according to any one of claims 7-11, wherein the first pixel arrangement is a Quad Bayer arrangement pattern.
13. An electronic device, characterized in that the electronic device comprises:
the photosensitive element is used for simultaneously outputting a first image with a first pixel arrangement mode and a second image with a second pixel arrangement mode when receiving a photographing instruction; pixels of the same color component in the first pixel arrangement mode are distributed in a square array mode, and the second pixel arrangement mode is a standard Bayer arrangement mode;
a memory for storing processor-executable instructions;
the processor is used for respectively adjusting the first resolution of the first image and the second resolution of the second image to preset resolutions; wherein the preset resolution is determined based on the illumination intensity of the ambient light, the first resolution, and the second resolution;
and fusing the first image with the adjusted resolution and the second image with the adjusted resolution to obtain an image to be displayed.
CN201811012444.1A 2018-08-31 2018-08-31 Image processing method and device, electronic device and storage medium Active CN110874829B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811012444.1A CN110874829B (en) 2018-08-31 2018-08-31 Image processing method and device, electronic device and storage medium

Publications (2)

Publication Number Publication Date
CN110874829A CN110874829A (en) 2020-03-10
CN110874829B (en) 2022-10-14

Family

ID=69715515

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811012444.1A Active CN110874829B (en) 2018-08-31 2018-08-31 Image processing method and device, electronic device and storage medium

Country Status (1)

Country Link
CN (1) CN110874829B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114125237A (en) * 2021-11-30 2022-03-01 维沃移动通信有限公司 Image sensor, camera module, electronic device, image acquisition method, and medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102948141A (en) * 2010-05-28 2013-02-27 索尼公司 Imaging device and imaging method
CN104737527A (en) * 2012-09-19 2015-06-24 富士胶片株式会社 Image processing device, imaging device, image processing method, and image processing program
CN105611185A (en) * 2015-12-18 2016-05-25 广东欧珀移动通信有限公司 Image generation method and device and terminal device
CN106412407A (en) * 2016-11-29 2017-02-15 广东欧珀移动通信有限公司 Control method, control device and electronic device
WO2018137267A1 (en) * 2017-01-25 2018-08-02 华为技术有限公司 Image processing method and terminal apparatus
CN108377369A (en) * 2018-02-05 2018-08-07 中国科学院长春光学精密机械与物理研究所 The Binning methods of Bayer format coloured image

Also Published As

Publication number Publication date
CN110874829A (en) 2020-03-10

Similar Documents

Publication Publication Date Title
CN108419016B (en) Shooting method and device and terminal
CN107038037B (en) Display mode switching method and device
RU2630167C1 (en) Method and device for switching colour range mode
CN106657780B (en) Image preview method and device
CN108040204B (en) Image shooting method and device based on multiple cameras and storage medium
CN114500821B (en) Photographing method and device, terminal and storage medium
CN110876014B (en) Image processing method and device, electronic device and storage medium
CN112188096A (en) Photographing method and device, terminal and storage medium
CN110874829B (en) Image processing method and device, electronic device and storage medium
CN111343386B (en) Image signal processing method and device, electronic device and storage medium
CN111698414B (en) Image signal processing method and device, electronic device and readable storage medium
CN107682623B (en) Photographing method and device
CN110876013B (en) Method and device for determining image resolution, electronic equipment and storage medium
CN110876015B (en) Method and device for determining image resolution, electronic equipment and storage medium
CN107707819B (en) Image shooting method, device and storage medium
CN111343375A (en) Image signal processing method and device, electronic device and storage medium
CN114390211B (en) Exposure convergence method, device, electronic equipment and storage medium
CN114339017B (en) Distant view focusing method, device and storage medium
CN117082335A (en) Image display method, device and storage medium
CN118057824A (en) Image processing method, device and storage medium
CN117764895A (en) Image processing method, device and storage medium
CN118175420A (en) Image acquisition method, device, equipment and storage medium
CN118118789A (en) Image generation method, device and storage medium
CN116402695A (en) Video data processing method, device and storage medium
CN116419075A (en) Image data processing method, device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant