CN115134564A - Photographing method, photographing apparatus, electronic device, and readable storage medium - Google Patents

Photographing method, photographing apparatus, electronic device, and readable storage medium

Info

Publication number
CN115134564A
CN115134564A
Authority
CN
China
Prior art keywords
image
camera
green
red
blue
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210736855.5A
Other languages
Chinese (zh)
Inventor
陈洁茹
刘宏基
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202210736855.5A priority Critical patent/CN115134564A/en
Publication of CN115134564A publication Critical patent/CN115134564A/en
Pending legal-status Critical Current

Landscapes

  • Color Television Image Signal Generators (AREA)

Abstract

The application discloses a shooting method, a shooting device, an electronic device, and a readable storage medium. The shooting device includes a first camera, a second camera, a third camera, and a fourth camera, wherein the first camera includes a Bayer array sensor, the second camera includes a red filter, the third camera includes a green filter, and the fourth camera includes a blue filter. The shooting method includes: controlling the first camera to acquire an RGB image, the second camera to acquire a red image, the third camera to acquire a green image, and the fourth camera to acquire a blue image; and obtaining a target image based on the RGB image, the red image, the green image, and the blue image.

Description

Photographing method, photographing apparatus, electronic device, and readable storage medium
Technical Field
The application belongs to the technical field of camera shooting, and particularly relates to a shooting method, a shooting device, electronic equipment and a readable storage medium.
Background
When a camera converts a real scene into image data, the red, green, and blue components at each pixel need to be acquired, and the three color components are then synthesized into a color image.
The image sensor acquires the light that passes through an optical filter. The filter currently used in mobile phones is a Color Filter Array (CFA), in which small red, green, and blue filters alternate. An image acquired by the image sensor through the color filter array must undergo interpolation (demosaicing) processing, and the processed image suffers from ragged image edges and partial loss of color information.
Disclosure of Invention
The embodiments of the present application aim to provide a shooting method, a shooting device, an electronic device, and a readable storage medium, which can address the problems of ragged edges and partial loss of color information in captured images.
In a first aspect, an embodiment of the present application provides a shooting method, which is applied to a shooting device, where the shooting device includes a first camera, a second camera, a third camera, and a fourth camera, the first camera includes a Bayer array sensor, the second camera includes a red filter, the third camera includes a green filter, and the fourth camera includes a blue filter;
the shooting method comprises the following steps:
controlling a first camera to acquire an RGB image, a second camera to acquire a red image, a third camera to acquire a green image and a fourth camera to acquire a blue image;
and obtaining a target image based on the RGB image, the red image, the green image and the blue image.
In a second aspect, an embodiment of the present application provides a shooting device, including a first camera, a second camera, a third camera, and a fourth camera, where the first camera includes a Bayer array sensor, the second camera includes a red filter, the third camera includes a green filter, and the fourth camera includes a blue filter;
the photographing apparatus further includes:
a control module, configured to control the first camera to acquire an RGB image, the second camera to acquire a red image, the third camera to acquire a green image, and the fourth camera to acquire a blue image;
and a fusion module, configured to obtain a target image based on the RGB image, the red image, the green image, and the blue image.
In a third aspect, embodiments of the present application provide an electronic device, which includes a processor and a memory, where the memory stores a program or instructions executable on the processor, and the program or instructions, when executed by the processor, implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium on which a program or instructions are stored, which when executed by a processor, implement the steps of the method according to the first aspect.
In a fifth aspect, embodiments of the present application provide a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In a sixth aspect, embodiments of the present application provide a computer program product, stored on a storage medium, for execution by at least one processor to implement a method as in the first aspect.
In the embodiments of the present application, when a shooting input is received, the first camera, the second camera, the third camera, and the fourth camera shoot simultaneously. Thus, one RGB image, one red image, one green image, and one blue image can be obtained from a single shot. The RGB image, the red image, the green image, and the blue image are then fused to obtain the final target image.
Fusing the RGB image with the red, green, and blue images yields a new color image. Compared with an image obtained through interpolation processing in the related art, the red, green, and blue images carry richer monochrome information and preserve details more completely. After the RGB image is fused with the red, green, and blue images, the resulting color image has richer color information and more complete edge texture information, improving the shooting effect.
Drawings
FIG. 1 is a schematic structural diagram of an electronic device in an embodiment of the present application;
FIG. 2 is a schematic distribution diagram of cameras in an embodiment of the present application;
FIG. 3 is a schematic flowchart of a shooting method in an embodiment of the present application;
FIG. 4 is a schematic diagram of an RGB image in an embodiment of the present application;
FIG. 5 is a schematic diagram of a captured image containing redundant information in an embodiment of the present application;
FIG. 6 is a schematic diagram of a captured image containing a distorted image in an embodiment of the present application;
FIG. 7 is a first schematic diagram of a display interface of an electronic device in an embodiment of the present application;
FIG. 8 is a second schematic diagram of a display interface of an electronic device in an embodiment of the present application;
FIG. 9 is a third schematic diagram of a display interface of an electronic device in an embodiment of the present application;
FIG. 10 is a fourth schematic diagram of a display interface of an electronic device in an embodiment of the present application;
FIG. 11 is a schematic block diagram of a shooting device in an embodiment of the present application;
FIG. 12 is a schematic block diagram of an electronic device in an embodiment of the present application;
FIG. 13 is a schematic diagram of a hardware structure of an electronic device in an embodiment of the present application.
Reference numerals are as follows:
110: first camera; 120: second camera; 130: third camera; 140: fourth camera.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments that can be derived by one of ordinary skill in the art from the embodiments given herein are intended to be within the scope of the present disclosure.
The terms "first", "second", and the like in the specification and claims of the present application are used to distinguish between similar objects and are not necessarily used to describe a particular order or sequence. It should be understood that terms used in this way are interchangeable under appropriate circumstances, so that the embodiments of the present application can be implemented in orders other than those illustrated or described here. Moreover, "first", "second", and the like generally denote objects of one class and do not limit the number of objects; for example, there may be one or more first objects. In addition, "and/or" in the specification and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the objects before and after it.
The shooting method, the shooting device, the electronic device and the readable storage medium provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings by specific embodiments and application scenarios thereof.
In an embodiment of the present application, a shooting method is provided and applied to a shooting device. As shown in FIG. 1 and FIG. 2, the shooting device includes a first camera 110, a second camera 120, a third camera 130, and a fourth camera 140, where the first camera 110 includes a Bayer array sensor, the second camera 120 includes a red filter, the third camera 130 includes a green filter, and the fourth camera 140 includes a blue filter.
As shown in fig. 3, the photographing method includes:
Step 202: controlling the first camera to acquire an RGB image, the second camera to acquire a red image, the third camera to acquire a green image, and the fourth camera to acquire a blue image;
Step 204: obtaining a target image based on the RGB image, the red image, the green image, and the blue image.
The first camera includes a Bayer array sensor, so an RGB image can be acquired when the first camera shoots. Because the Bayer array sensor is covered by small red, green, and blue filters arranged alternately, R-channel, G-channel, and B-channel image data are sampled alternately, and an RGB image is then obtained through demosaic (demosaicing) processing.
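For illustration only, the following is a minimal sketch of bilinear demosaicing for an RGGB Bayer mosaic. The embodiment does not specify which demosaicing algorithm the first camera uses, so the assumed RGGB layout, the interpolation kernels, and the function name demosaic_rggb are assumptions introduced here for clarity.

```python
# Minimal bilinear demosaicing sketch (illustrative only; the patent does not specify
# the demosaicing algorithm). Assumes an RGGB pattern: R at even rows/cols,
# B at odd rows/cols, G elsewhere.
import numpy as np
from scipy.ndimage import convolve

def demosaic_rggb(raw: np.ndarray) -> np.ndarray:
    """Interpolate a single-channel RGGB Bayer mosaic into an H x W x 3 RGB image."""
    h, w = raw.shape
    r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1.0
    b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1.0
    g_mask = 1.0 - r_mask - b_mask

    # Bilinear kernels: each missing sample is the average of its sampled neighbours.
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], dtype=float) / 4.0
    k_g = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]], dtype=float) / 4.0

    r = convolve(raw * r_mask, k_rb, mode="mirror")
    g = convolve(raw * g_mask, k_g, mode="mirror")
    b = convolve(raw * b_mask, k_rb, mode="mirror")
    return np.stack([r, g, b], axis=-1)
```

Each missing sample is estimated from its sampled neighbours; this interpolation step is exactly what can blur edges and lose color detail, which motivates the monochrome cameras described below.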
The second camera includes a red filter, so a red image can be acquired when the second camera shoots. In the second camera, the red filter is used in place of the Bayer filter array; after the red filter filters the light, a complete red image can be obtained.
The third camera includes a green filter, so a green image can be acquired when the third camera shoots. In the third camera, the green filter is used in place of the Bayer filter array; after the green filter filters the light, a complete green image can be obtained.
The fourth camera includes a blue filter, so a blue image can be acquired when the fourth camera shoots. In the fourth camera, the blue filter is used in place of the Bayer filter array; after the blue filter filters the light, a complete blue image can be obtained.
When a shooting input is received, the first camera, the second camera, the third camera, and the fourth camera shoot simultaneously. Thus, one RGB image, one red image, one green image, and one blue image can be obtained from a single shot. The RGB image, the red image, the green image, and the blue image are then fused to obtain the final target image.
Fusing the RGB image with the red, green, and blue images yields a new color image. Compared with an image obtained through interpolation processing in the related art, the red, green, and blue images carry richer monochrome information and preserve details more completely. After the RGB image is fused with the red, green, and blue images, the resulting color image has richer color information and more complete edge texture information, improving the shooting effect.
In one possible embodiment, deriving the target image based on the RGB image, the red image, the green image, and the blue image comprises: based on the RGB image, obtaining R channel image data, G channel image data and B channel image data; fusing R channel image data and a red image to obtain a red fused image, fusing G channel image data and a green image to obtain a green fused image, and fusing B channel image data and a blue image to obtain a blue fused image; and fusing the red fused image, the green fused image and the blue fused image to obtain a target image.
When the red image, the green image, and the blue image are captured, demosaic processing is not required, because the pixels in these images are already complete.
The red image is fused with the R-channel image data decomposed from the RGB image to obtain a red fused image. The green image is fused with the G-channel image data decomposed from the RGB image to obtain a green fused image. The blue image is fused with the B-channel image data decomposed from the RGB image to obtain a blue fused image. The three fused images are then fused to obtain the target image. Because the target image results from this fusion processing, its colors are accurate, richer, and clearer, which addresses the problems of inaccurate color and loss of edge information in the related art.
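A minimal sketch of this per-channel fusion step is given below. The patent does not specify the fusion operator, so a simple weighted average is assumed, the inputs are assumed to be already aligned, and the function name fuse_channels and the parameter alpha are introduced here only for illustration.

```python
# Sketch of the per-channel fusion described above (weighted-average operator assumed;
# the patent does not state how the two sources are combined).
import numpy as np

def fuse_channels(rgb: np.ndarray, red: np.ndarray, green: np.ndarray,
                  blue: np.ndarray, alpha: float = 0.5) -> np.ndarray:
    """rgb: H x W x 3 image from the first camera; red/green/blue: aligned H x W
    monochrome images from the second, third and fourth cameras."""
    r_ch, g_ch, b_ch = rgb[..., 0], rgb[..., 1], rgb[..., 2]  # decompose the RGB image

    # Fuse each decomposed channel with the corresponding monochrome image.
    red_fused = alpha * r_ch + (1.0 - alpha) * red
    green_fused = alpha * g_ch + (1.0 - alpha) * green
    blue_fused = alpha * b_ch + (1.0 - alpha) * blue

    # Recombine the three fused planes into the target image.
    return np.stack([red_fused, green_fused, blue_fused], axis=-1)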
In a possible embodiment, after the R-channel image data, the G-channel image data, and the B-channel image data are obtained based on the RGB image, the method further includes: aligning the R-channel image data with the red image, aligning the G-channel image data with the green image, and aligning the B-channel image data with the blue image.
Because the first, second, third, and fourth cameras are positioned differently, the images they capture may also differ. Therefore, after the RGB image, the red image, the green image, and the blue image are captured, the red, green, and blue images need to be compared with the RGB image to determine whether they contain redundant information or image distortion.
The first camera may be used as the main camera and then images captured by other cameras may be compared with the image captured by the first camera.
Specifically, after the R-channel image data is aligned with the red image, it can be determined whether there is unnecessary information in the red image or whether the image is distorted. For example, if a single person image is captured in the RGB image and an animal image appears in the red image, the animal image in the red image is an unnecessary image. In the case where the unnecessary information is determined, the red image needs to be clipped so as to delete the unnecessary information in the red image. The cut parts in the red image are not fused with the RGB image, and the uncut parts in the red image are aligned with the RGB image and fused.
As shown in FIG. 4, the first region 610 shows a person image. Compared with FIG. 4, the second region 620 in FIG. 5 is a redundant image: it does not appear in FIG. 4, so the image within the dashed box needs to be removed.
If the red image contains distorted content, the image proportions in the red image are adjusted based on the proportions of the RGB image.
As shown in FIG. 6, the head in the third region 630 is distorted relative to the head of the person in FIG. 4, so the head of the person needs to be corrected.
The alignment process of the green image and the blue image with the RGB image is the same.
By aligning the red, green, and blue images with the RGB image, redundant information can be removed and distortion corrected, so that the red, green, and blue images are fused with the RGB image more accurately, which improves the fusion result and, in turn, the shooting effect.
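As an illustration of this alignment step, the sketch below registers a monochrome image to its reference channel using OpenCV's ECC algorithm under an affine motion model. The patent only describes comparing the images, cropping redundant content, and adjusting proportions; the choice of ECC, the function name align_to_reference, and its parameters are assumptions.

```python
# Illustrative alignment sketch using ECC image registration (an assumption; the patent
# does not name a specific registration algorithm).
import cv2
import numpy as np

def align_to_reference(channel_img: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Warp a monochrome image (e.g. the red image) onto the reference channel
    (e.g. the R-channel data decomposed from the RGB image)."""
    warp = np.eye(2, 3, dtype=np.float32)  # affine model: handles shift, scale, shear
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 100, 1e-6)
    _, warp = cv2.findTransformECC(reference.astype(np.float32),
                                   channel_img.astype(np.float32),
                                   warp, cv2.MOTION_AFFINE, criteria, None, 5)
    h, w = reference.shape
    # Content that falls outside the reference field of view (redundant information)
    # is simply cropped away by the warp.
    return cv2.warpAffine(channel_img, warp, (w, h), flags=cv2.INTER_LINEAR)
```

In practice, any registration method (feature matching, optical flow, or calibration-based rectification) could serve the same purpose.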
In a possible embodiment, the shooting method further includes: displaying a pixel weight setting interface, where the pixel weight setting interface is used to set the image fusion proportion.
The user can adjust the weights of the red, green, and blue pixels so that their proportions in the target image better match the user's needs.
Specifically, after entering the pixel weight setting interface, the user can set the weights of the red, green, and blue pixels. As shown in FIG. 10, the user may click the "R" control; an input box for the weight value, with a range from 0 to 1, then appears on the display interface of the electronic device, and the user can enter the weight of the red pixel in the box. Similarly, the "G" control may be clicked to enter the weight of the green pixel, and the "B" control may be clicked to enter the weight of the blue pixel.
For the red, green, and blue pixels, let the weights set by the user be wr, wg, and wb, respectively, let the image captured by the first camera be x, and let the images captured by the second, third, and fourth cameras be R, G, and B, respectively. The final picture Y obtained in the pixel weight adjustment mode can then be expressed as: Y = x + wr × R + wg × G + wb × B.
Y is the picture finally output to the user. For example, if the user prefers a somewhat redder picture style, the weight of the red pixels can be increased to obtain the final picture. Because the images shot by the second, third, and fourth cameras fully preserve detail and carry richer monochrome information, the picture finally obtained by the user is of higher quality, and adjusting the pixel weights better matches the user's preferences.
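A direct sketch of this weighted combination is shown below. How the monochrome terms are added to the color image x is not spelled out in the patent; this sketch adds each weighted monochrome image to its matching color plane, and the 8-bit clipping at the end is likewise an assumption.

```python
# Sketch of Y = x + wr*R + wg*G + wb*B, with each monochrome image added to its
# matching plane of the RGB image (per-plane addition and 8-bit output are assumptions).
import numpy as np

def weighted_fusion(x: np.ndarray, r: np.ndarray, g: np.ndarray, b: np.ndarray,
                    wr: float, wg: float, wb: float) -> np.ndarray:
    """x: H x W x 3 RGB image from the first camera; r, g, b: aligned H x W monochrome
    images; wr, wg, wb: weights in [0, 1] chosen on the pixel weight setting interface."""
    y = x.astype(np.float32).copy()
    y[..., 0] += wr * r   # strengthen the red plane
    y[..., 1] += wg * g   # strengthen the green plane
    y[..., 2] += wb * b   # strengthen the blue plane
    return np.clip(y, 0, 255).astype(np.uint8)
```

For a user who prefers a redder style, wr would simply be set higher than wg and wb before calling weighted_fusion.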
In a possible embodiment, before the first camera acquires the RGB image, the method further includes: receiving a first input of a user; and in response to the first input, entering a target shooting mode, where the target shooting mode instructs the first camera, the second camera, the third camera, and the fourth camera to acquire images simultaneously.
The target shooting mode may be a special shooting mode. The user enters the special shooting mode by clicking a special shooting control on the electronic device; in this mode, when the user shoots, the RGB image, the red image, the green image, and the blue image are fused to obtain a high-quality target image.
As shown in FIG. 7, the user may click the "special shooting" control in the display interface 640, and the electronic device enters the target shooting mode.
When the target shooting mode is not entered, the electronic device is in a normal shooting mode, in which an image is obtained by shooting with the first camera only. Providing different shooting modes meets different shooting requirements of users.
As shown in FIG. 8, the display interface 660 contains two controls, "high-quality imaging" and "special fusion". As shown in FIG. 9, in the display interface 670, the user may click the "high-quality imaging" control, after which the user can capture a high-quality image with the electronic device. If the "special fusion" control is clicked, the process proceeds to the display interface 691 shown in FIG. 10, that is, the pixel weight setting interface, where the weights of the red, green, and blue pixels can be set.
In a possible embodiment, there are two third cameras, which respectively acquire two green images; fusing the G-channel image data with the green image to obtain a green fused image then includes: fusing the G-channel image data with the two green images to obtain the green fused image.
With two third cameras, two green images can be obtained during high-quality shooting. Because the human eye is more sensitive to green, collecting two green images improves the accuracy of green in the target image and thus further improves the quality of the target image.
In other embodiments, more than two third cameras may be provided, and at least one second camera and at least one fourth camera may likewise be provided.
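For completeness, a sketch of the two-green-image variant is given below, again assuming a simple equal-weight average since the patent does not define the operator; the function name fuse_green is introduced only for illustration.

```python
# Sketch of fusing the G-channel data with the two green images captured by the two
# third cameras (equal-weight average assumed; the patent does not give the operator).
import numpy as np

def fuse_green(g_channel: np.ndarray, green_1: np.ndarray, green_2: np.ndarray) -> np.ndarray:
    """All inputs are aligned H x W arrays; returns the green fused image."""
    return (g_channel.astype(np.float32)
            + green_1.astype(np.float32)
            + green_2.astype(np.float32)) / 3.0
```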
The shooting method provided in the embodiments of the present application may be executed by a shooting device. Taking a shooting device executing the shooting method as an example, the shooting device provided in the embodiments of the present application is described below.
As shown in FIG. 11, some embodiments of the present application provide a shooting device 300. The shooting device 300 includes a first camera, a second camera, a third camera, and a fourth camera, where the first camera includes a Bayer array sensor, the second camera includes a red filter, the third camera includes a green filter, and the fourth camera includes a blue filter.
The shooting device 300 further includes:
a control module 310, configured to control the first camera to acquire an RGB image, the second camera to acquire a red image, the third camera to acquire a green image, and the fourth camera to acquire a blue image;
and a fusion module 320, configured to obtain a target image based on the RGB image, the red image, the green image, and the blue image.
When a shooting input is received, the first camera, the second camera, the third camera, and the fourth camera shoot simultaneously. Thus, one RGB image, one red image, one green image, and one blue image can be obtained from a single shot. The RGB image, the red image, the green image, and the blue image are then fused to obtain the final target image.
Fusing the RGB image with the red, green, and blue images yields a new color image. Compared with an image obtained through interpolation processing in the related art, the red, green, and blue images carry richer monochrome information and preserve details more completely. After the RGB image is fused with the red, green, and blue images, the resulting color image has richer color information and more complete edge texture information, improving the shooting effect.
In a possible embodiment, the fusion module 320 is specifically configured to: based on the RGB image, obtaining R channel image data, G channel image data and B channel image data; fusing R channel image data and a red image to obtain a red fused image, fusing G channel image data and a green image to obtain a green fused image, and fusing B channel image data and a blue image to obtain a blue fused image; and fusing the red fused image, the green fused image and the blue fused image to obtain a target image.
In a possible embodiment, the shooting device 300 further includes an alignment module.
After the R-channel image data, the G-channel image data, and the B-channel image data are obtained based on the RGB image, the alignment module is configured to align the R-channel image data with the red image, align the G-channel image data with the green image, and align the B-channel image data with the blue image.
In a possible embodiment, the shooting device 300 further includes a display module.
The display module is configured to display a pixel weight setting interface, where the pixel weight setting interface is used to set the image fusion proportion.
In a possible embodiment, the shooting device 300 further includes a receiving module. Before the first camera acquires the RGB image, the receiving module is configured to receive a first input of a user, and the control module is further configured to, in response to the first input, enter a target shooting mode, where the target shooting mode instructs the first camera, the second camera, the third camera, and the fourth camera to acquire images simultaneously.
In a possible embodiment, the number of the third cameras is two, and two green images are acquired by the two third cameras respectively.
The fusion module is further configured to fuse the G-channel image data with the two green images to obtain the green fused image.
The shooting device in the embodiments of the present application may be an electronic device, or may be a component in an electronic device, such as an integrated circuit or a chip. The electronic device may be a terminal, or may be a device other than a terminal. For example, the electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a mobile internet device (MID), an augmented reality (AR)/virtual reality (VR) device, a robot, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA), or may be a server, a network attached storage (NAS), a personal computer (PC), a television (TV), a teller machine, or a self-service machine; the embodiments of the present application are not specifically limited thereto.
The shooting device in the embodiments of the present application may be a device having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system; the embodiments of the present application are not specifically limited thereto.
The shooting device provided in the embodiments of the present application can implement each process of the foregoing method embodiments and achieve the same technical effects; details are not repeated here to avoid repetition.
Optionally, as shown in FIG. 12, an embodiment of the present application further provides an electronic device 400, which includes a processor 410 and a memory 420. The memory 420 stores a program or instructions executable on the processor 410; when executed by the processor 410, the program or instructions implement the steps of the foregoing shooting method embodiments and achieve the same technical effects, which are not repeated here to avoid repetition.
It should be noted that the electronic devices in the embodiments of the present application include the mobile electronic device and the non-mobile electronic device described above.
Fig. 13 is a schematic hardware configuration diagram of an electronic device 500 for implementing an embodiment of the present application.
The electronic device 500 includes, but is not limited to: radio frequency unit 501, network module 502, audio output unit 503, input unit 504, sensor 505, display unit 506, user input unit 507, interface unit 508, memory 509, processor 510, and the like.
Those skilled in the art will appreciate that the electronic device 500 may further include a power supply (e.g., a battery) for supplying power to the components; the power supply may be logically connected to the processor 510 through a power management system, so that functions such as charging, discharging, and power-consumption management are implemented through the power management system. The structure of the electronic device shown in FIG. 13 does not constitute a limitation on the electronic device; the electronic device may include more or fewer components than shown, combine some components, or use a different arrangement of components, which is not described here again.
The electronic device 500 further includes a first camera, a second camera, a third camera, and a fourth camera, where the first camera includes a Bayer array sensor, the second camera includes a red filter, the third camera includes a green filter, and the fourth camera includes a blue filter.
Processor 510 is configured to: controlling a first camera to acquire an RGB image, a second camera to acquire a red image, a third camera to acquire a green image and a fourth camera to acquire a blue image; and obtaining a target image based on the RGB image, the red image, the green image and the blue image.
Optionally, processor 510 is specifically configured to: based on the RGB image, obtaining R channel image data, G channel image data and B channel image data; fusing R channel image data and a red image to obtain a red fused image, fusing G channel image data and a green image to obtain a green fused image, and fusing B channel image data and a blue image to obtain a blue fused image; and fusing the red fused image, the green fused image and the blue fused image to obtain a target image.
Optionally, after obtaining R-channel image data, G-channel image data and B-channel image data based on the RGB image, processor 510 is further configured to: and carrying out alignment processing on the R-channel image data and the red image, carrying out alignment processing on the G-channel image data and the green image, and carrying out alignment processing on the B-channel image data and the blue image.
Optionally, the display unit 506 is configured to display a pixel weight setting interface, where the pixel weight setting interface is used to set the image fusion proportion.
Optionally, before the first camera acquires the RGB image, the user input unit 507 is configured to receive a first input of a user, and the processor 510 is further configured to, in response to the first input, enter a target shooting mode, where the target shooting mode instructs the first camera, the second camera, the third camera, and the fourth camera to acquire images simultaneously.
Optionally, there are two third cameras, which respectively acquire two green images. The processor 510 is specifically configured to fuse the G-channel image data with the two green images to obtain the green fused image.
In the embodiments of the present application, a new color image is obtained by fusing the RGB image with the red, green, and blue images. Compared with an image obtained through interpolation processing in the related art, the red, green, and blue images carry richer monochrome information and preserve details more completely. After the RGB image is fused with the red, green, and blue images, the resulting color image has richer color information and more complete edge texture information, improving the shooting effect.
It should be understood that, in this embodiment of the present application, the input unit 504 may include a graphics processing unit (GPU) 5041 and a microphone 5042, and the graphics processor 5041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode. The display unit 506 may include a display panel 5061, which may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 507 includes at least one of a touch panel 5071 and another input device 5072. The touch panel 5071 is also referred to as a touch screen and may include two parts: a touch detection device and a touch controller. The other input device 5072 may include, but is not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not described in detail here.
The memory 509 may be used to store software programs and various data. The memory 509 may mainly include a first storage area for storing programs or instructions and a second storage area for storing data, where the first storage area may store an operating system, an application program or instructions required for at least one function (such as a sound playing function or an image playing function), and the like. In addition, the memory 509 may include a volatile memory or a non-volatile memory, or both. The non-volatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), a static RAM (SRAM), a dynamic RAM (DRAM), a synchronous DRAM (SDRAM), a double data rate SDRAM (DDR SDRAM), an enhanced SDRAM (ESDRAM), a synchlink DRAM (SLDRAM), or a direct Rambus RAM (DRRAM). The memory 509 in the embodiments of the present application includes, but is not limited to, these and any other suitable types of memory.
The processor 510 may include one or more processing units. Optionally, the processor 510 integrates an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, applications, and the like, and the modem processor, such as a baseband processor, mainly handles wireless communication signals. It can be understood that the modem processor may alternatively not be integrated into the processor 510.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements the processes of the foregoing shooting method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device in the above embodiment. Readable storage media include computer readable storage media such as computer read only memory ROM, random access memory RAM, magnetic or optical disks, and the like.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of the foregoing shooting method embodiment, and can achieve the same technical effect, and in order to avoid repetition, the details are not repeated here.
It should be understood that the chip mentioned in the embodiments of the present application may also be referred to as a system-level chip, a system chip, a chip system, or a system-on-chip.
Embodiments of the present application provide a computer program product, where the program product is stored in a storage medium, and the program product is executed by at least one processor to implement the processes of the foregoing shooting method embodiments, and achieve the same technical effects, and in order to avoid repetition, details are not repeated here.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or apparatus that includes a list of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or apparatus. Without further limitation, an element preceded by "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that includes the element. Furthermore, it should be noted that the scope of the methods and apparatuses in the embodiments of the present application is not limited to performing the functions in the order shown or discussed, and may also include performing the functions in a substantially simultaneous manner or in a reverse order depending on the functions involved; for example, the described methods may be performed in an order different from that described, and steps may be added, omitted, or combined. In addition, features described with reference to some examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a computer software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. A shooting method, applied to a shooting device, characterized in that the shooting device comprises: a first camera, a second camera, a third camera, and a fourth camera, wherein the first camera comprises a Bayer array sensor, the second camera comprises a red filter, the third camera comprises a green filter, and the fourth camera comprises a blue filter;
the shooting method comprises the following steps:
controlling the first camera to acquire an RGB image, the second camera to acquire a red image, the third camera to acquire a green image and the fourth camera to acquire a blue image;
and obtaining a target image based on the RGB image, the red image, the green image and the blue image.
2. The photographing method according to claim 1, wherein the obtaining a target image based on the RGB image, the red image, the green image, and the blue image includes:
obtaining R channel image data, G channel image data and B channel image data based on the RGB image;
fusing the R channel image data with the red image to obtain a red fused image, fusing the G channel image data with the green image to obtain a green fused image, and fusing the B channel image data with the blue image to obtain a blue fused image;
and fusing the red fusion image, the green fusion image and the blue fusion image to obtain a target image.
3. The photographing method according to claim 2, wherein after obtaining R-channel image data, G-channel image data, and B-channel image data based on the RGB image, further comprising:
and carrying out alignment processing on the R channel image data and the red image, carrying out alignment processing on the G channel image data and the green image, and carrying out alignment processing on the B channel image data and the blue image.
4. The photographing method according to claim 1 or 2, wherein the photographing method further comprises:
and displaying a pixel weight setting interface, wherein the pixel weight setting interface is used for setting the image fusion proportion.
5. The shooting method according to claim 1 or 2, wherein before the first camera acquires the RGB image, the method further comprises:
receiving a first input of a user;
and responding to the first input, and entering a target shooting mode, wherein the target shooting mode is used for indicating and controlling the first camera, the second camera, the third camera and the fourth camera to simultaneously acquire images.
6. The shooting method according to claim 2, wherein the number of the third cameras is two, and two green images are acquired by the two third cameras respectively;
the fusing the G channel image data with the green image to obtain a green fused image comprises:
and fusing the G channel image data with the two green images to obtain the green fused image.
7. A shooting device, characterized in that the shooting device comprises: a first camera, a second camera, a third camera, and a fourth camera, wherein the first camera comprises a Bayer array sensor, the second camera comprises a red filter, the third camera comprises a green filter, and the fourth camera comprises a blue filter;
the shooting device further comprises:
a control module, configured to control the first camera to acquire an RGB image, the second camera to acquire a red image, the third camera to acquire a green image, and the fourth camera to acquire a blue image;
and the fusion module is used for obtaining a target image based on the RGB image, the red image, the green image and the blue image.
8. The shooting device according to claim 7, wherein the fusion module is specifically configured to: obtain R-channel image data, G-channel image data, and B-channel image data based on the RGB image;
fuse the R-channel image data with the red image to obtain a red fused image, fuse the G-channel image data with the green image to obtain a green fused image, and fuse the B-channel image data with the blue image to obtain a blue fused image;
and fuse the red fused image, the green fused image, and the blue fused image to obtain a target image.
9. An electronic device, characterized by comprising a processor and a memory, said memory storing a program or instructions executable on said processor, said program or instructions, when executed by said processor, implementing the steps of the shooting method according to any one of claims 1 to 6.
10. A readable storage medium, characterized in that it stores thereon a program or instructions which, when executed by a processor, implement the steps of the shooting method according to any one of claims 1 to 6.
CN202210736855.5A 2022-06-27 2022-06-27 Photographing method, photographing apparatus, electronic device, and readable storage medium Pending CN115134564A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210736855.5A CN115134564A (en) 2022-06-27 2022-06-27 Photographing method, photographing apparatus, electronic device, and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210736855.5A CN115134564A (en) 2022-06-27 2022-06-27 Photographing method, photographing apparatus, electronic device, and readable storage medium

Publications (1)

Publication Number Publication Date
CN115134564A true CN115134564A (en) 2022-09-30

Family

ID=83379050

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210736855.5A Pending CN115134564A (en) 2022-06-27 2022-06-27 Photographing method, photographing apparatus, electronic device, and readable storage medium

Country Status (1)

Country Link
CN (1) CN115134564A (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105704465A (en) * 2016-01-20 2016-06-22 海信电子科技(深圳)有限公司 Image processing method and terminal
CN208537024U (en) * 2018-05-11 2019-02-22 Oppo广东移动通信有限公司 A kind of colour filter array, imaging sensor and camera module
CN108881701A (en) * 2018-09-30 2018-11-23 华勤通讯技术有限公司 Image pickup method, camera, terminal device and computer readable storage medium
CN109729253A (en) * 2019-02-22 2019-05-07 王熙 A kind of algorithm and four color cameras based on the overlapping enhancing of colored and independent RGB optical imagery
CN109963080A (en) * 2019-03-26 2019-07-02 Oppo广东移动通信有限公司 Image-pickup method, device, electronic equipment and computer storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination