CN115706869A - Terminal image processing method and device and terminal equipment
- Publication number: CN115706869A
- Application number: CN202110923173.0A
- Authority: CN (China)
- Prior art keywords: color, image, terminal device, black, YUV
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
- Classifications: Studio Devices (AREA)
Abstract
The embodiment of the application provides an image processing method and device for a terminal, and terminal equipment. The terminal includes a black-and-white camera and a color camera, and the method includes: after the photographing function of the terminal is started, acquiring a color RAW image through the color camera and a black-and-white RAW image through the black-and-white camera in the preview process; acquiring a shooting instruction, and controlling the color camera to shoot the current shooting scene to acquire a color RAW image of the shooting scene; performing RAW domain processing on the color RAW image to obtain a processed color RAW image; processing the processed color RAW image to obtain a color YUV image; processing the black-and-white RAW image acquired in the preview process through a second Bayer domain processing algorithm link to obtain a black-and-white YUV image; and fusing the color YUV image and the black-and-white YUV image to obtain a target color image, thereby retaining more image detail and improving the color fidelity of the target color image.
Description
Technical Field
The embodiment of the application relates to the technical field of intelligent terminals, in particular to a terminal image processing method and device and terminal equipment.
Background
Nowadays, photographing has become a standard function of mobile phones. With the popularization of smart phones, cameras are used ever more widely and the photographing function keeps improving. Excellent photographing performance has become a major selling point of smart phones: technical innovations around the photographing function emerge one after another, and apertures and sensors keep growing in size. These technology and hardware upgrades have greatly improved the photographing experience of smart phones, both picture quality and the overall shooting experience. While picture quality is pursued, making the body lighter and thinner is also a major trend in smart phone development; most existing smart phones therefore adopt at least two camera modules, which can improve picture quality while meeting the requirement of a light and thin body.
However, taking two camera modules as an example, the smart phone directly obtains two different images from the two cameras, so the two images need to be fused by an image fusion algorithm to obtain the final dual-camera enhanced image.
Existing image fusion methods lose a considerable amount of image detail, so the color fidelity of the final image is poor; in particular, when shooting landscapes and/or still life in backlit scenes, some details, especially high-frequency details, cannot be fully recovered because of the limited capability of a single imaging device and the limited detail-enhancement capability of the algorithm.
Disclosure of Invention
The embodiment of the application provides an image processing method and device of a terminal, and terminal equipment, and further provides a computer-readable storage medium, so that the terminal device performs RAW domain processing on the color RAW image captured by the color camera, converts it into a color YUV image, and then fuses the color YUV image with a black-and-white YUV image; more image detail is thereby retained and the color fidelity of the final image is improved.
In a first aspect, the present application provides an image processing method for a terminal, where the terminal includes a black-and-white camera and a color camera, and the method includes: after the photographing function of the terminal is started, acquiring a color RAW image through the color camera and a black-and-white RAW image through the black-and-white camera in the preview process; after the terminal device obtains a shooting instruction, obtaining, in response to the shooting instruction, the color RAW image acquired in the preview process, and performing RAW domain processing on it to obtain a color YUV image; and processing the black-and-white RAW image acquired in the preview process through a second Bayer domain processing algorithm link to obtain a black-and-white YUV image. Finally, the terminal device fuses the color YUV image and the black-and-white YUV image to obtain a target color image.
In one possible implementation manner, obtaining the color RAW image acquired in the preview process in response to the shooting instruction may be: the terminal device, in response to the shooting instruction, obtains the color RAW image acquired in the preview process from the cache.
In one possible implementation manner, before the terminal device performs RAW domain processing on the color RAW image acquired in the preview process to obtain the color YUV image, linearization and dead pixel correction may be performed on that image through the first Bayer domain processing algorithm link.
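Although the patent does not specify the algorithms behind this link, linearization and dead-pixel correction are commonly realized as a black-level normalization followed by a neighborhood-median outlier repair. The following numpy sketch illustrates that idea under assumed 10-bit sensor levels and an arbitrary detection threshold; it is not the patent's actual Bayer domain link.

```python
import numpy as np

def linearize(raw, black_level=64, white_level=1023):
    """Map sensor codes to linear [0, 1] (10-bit example values assumed)."""
    return np.clip((raw.astype(np.float64) - black_level)
                   / (white_level - black_level), 0.0, 1.0)

def correct_dead_pixels(raw, thresh=0.3):
    """Replace pixels that deviate strongly from their 3x3 neighborhood median."""
    p = np.pad(raw, 1, mode="reflect")
    # Gather the 3x3 neighborhood of every pixel, then take the median.
    stack = np.stack([p[i:i + raw.shape[0], j:j + raw.shape[1]]
                      for i in range(3) for j in range(3)], axis=0)
    med = np.median(stack, axis=0)
    dead = np.abs(raw - med) > thresh
    return np.where(dead, med, raw)

raw = np.full((6, 6), 512)
raw[2, 3] = 0                      # a stuck-low (dead) pixel
fixed = correct_dead_pixels(linearize(raw))
print(fixed[2, 3])                 # restored to the neighborhood value (~0.467)
```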
In one possible implementation manner, before the terminal device obtains the shooting instruction, it may also determine that the current shooting scene is a non-high dynamic scene according to the color RAW image acquired in the preview process.
In one possible implementation manner, the color RAW image acquired by the terminal device in the preview process includes at least two frames of color RAW images, and performing RAW domain processing on the color RAW image acquired in the preview process to obtain the color YUV image includes: preprocessing the at least two frames of color RAW images; performing noise reduction on the preprocessed at least two frames of color RAW images so as to fuse them into one frame of color RAW image; converting the fused color RAW image into a color RGB image; adjusting the color and brightness of the color RGB image; and converting the adjusted color RGB image into a color YUV image.
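As an illustration of this multi-frame pipeline, the numpy sketch below merges pre-aligned RGGB frames by temporal averaging (a stand-in for the noise-reduction fusion), demosaics each 2x2 quad into one RGB pixel (a deliberately crude half-resolution demosaic), applies a global gain as a placeholder for the color/brightness adjustment, and converts to YUV with the BT.601 matrix. Every concrete choice here is an assumption for illustration, not the patent's algorithm.

```python
import numpy as np

def merge_raw_frames(frames):
    """Temporal mean of pre-aligned RGGB Bayer frames: simple noise reduction."""
    return np.mean(np.stack(frames, axis=0), axis=0)

def demosaic_half_res(raw):
    """Half-resolution demosaic: each RGGB 2x2 quad becomes one RGB pixel."""
    r  = raw[0::2, 0::2]
    g1 = raw[0::2, 1::2]
    g2 = raw[1::2, 0::2]
    b  = raw[1::2, 1::2]
    return np.stack([r, (g1 + g2) / 2.0, b], axis=-1)

def adjust(rgb, gain=1.1):
    """Placeholder color/brightness adjustment: a global gain, clipped to [0, 1]."""
    return np.clip(rgb * gain, 0.0, 1.0)

def rgb_to_yuv(rgb):
    """Full-range BT.601 RGB -> YUV conversion."""
    m = np.array([[ 0.299,  0.587,  0.114],
                  [-0.169, -0.331,  0.500],
                  [ 0.500, -0.419, -0.081]])
    return rgb @ m.T

# Example: four noisy preview frames of a 4x4 Bayer mosaic.
rng = np.random.default_rng(0)
frames = [np.clip(0.5 + 0.05 * rng.standard_normal((4, 4)), 0, 1) for _ in range(4)]
yuv = rgb_to_yuv(adjust(demosaic_half_res(merge_raw_frames(frames))))
print(yuv.shape)  # (2, 2, 3)
```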
In one possible implementation manner, the preprocessing performed by the terminal device on the at least two frames of color RAW images includes: aligning the sizes of the at least two frames of color RAW images; mapping pixels between every two frames of the at least two frames of color RAW images to obtain correspondences between pixels; and performing error detection on the correspondences and correcting erroneous correspondences.
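One simple way to realize such a pixel mapping with built-in error detection, under the strong simplifying assumption that the frames differ only by a global translation, is phase correlation: the shift appears as a peak in the inverse FFT of the normalized cross-power spectrum, and a weak peak can be treated as a detected error and the correspondence rejected. A minimal numpy sketch:

```python
import numpy as np

def phase_correlate(ref, mov):
    """Estimate the integer (dy, dx) translation of `mov` relative to `ref`,
    plus a peak-strength score usable for error detection."""
    F_ref, F_mov = np.fft.fft2(ref), np.fft.fft2(mov)
    cross = F_mov * np.conj(F_ref)
    cross /= np.abs(cross) + 1e-12              # normalized cross-power spectrum
    corr = np.fft.ifft2(cross).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    strength = corr[peak] / (corr.std() + 1e-12)
    # Shifts beyond half the frame wrap around; map them to negative offsets.
    dy = peak[0] if peak[0] <= ref.shape[0] // 2 else peak[0] - ref.shape[0]
    dx = peak[1] if peak[1] <= ref.shape[1] // 2 else peak[1] - ref.shape[1]
    return (dy, dx), strength

rng = np.random.default_rng(1)
ref = rng.random((64, 64))
mov = np.roll(ref, (3, -2), axis=(0, 1))        # simulate a (3, -2) camera shake
shift, strength = phase_correlate(ref, mov)
if strength < 5.0:                              # illustrative error-detection rule
    shift = (0, 0)                              # reject an unreliable correspondence
print(shift)                                    # (3, -2)
```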
In one possible implementation manner, the terminal device fusing the color YUV image and the black-and-white YUV image to obtain the target color image may be: aligning the sizes of the color YUV image and the black-and-white YUV image; determining, from the size-aligned color YUV image and the size-aligned black-and-white YUV image, the region to be fused in the color YUV image and the region to be fused in the black-and-white YUV image; then mapping the pixels in the region to be fused of the color YUV image to the pixels in the region to be fused of the black-and-white YUV image to obtain correspondences between pixels; then performing error detection on the correspondences and correcting erroneous correspondences; and finally, fusing the sharp pixels in the regions to be fused of the color YUV image and of the black-and-white YUV image onto the corresponding blurred pixels.
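The final detail-transfer step can be pictured with the following numpy sketch: local variance over a small box serves as a sharpness measure on the aligned luma planes, and wherever the black-and-white luma is locally sharper, it replaces the blurrier color luma while the chroma stays with the color image. The box size and the variance criterion are illustrative assumptions, not the patent's fusion rule.

```python
import numpy as np

def local_variance(img, k=5):
    """Per-pixel variance over a k x k box, via integral-image box sums."""
    pad = k // 2
    p = np.pad(img, pad, mode="reflect")
    def box_sum(a):
        c = np.cumsum(np.cumsum(a, axis=0), axis=1)
        c = np.pad(c, ((1, 0), (1, 0)))
        return c[k:, k:] - c[:-k, k:] - c[k:, :-k] + c[:-k, :-k]
    n = k * k
    mean = box_sum(p) / n
    return box_sum(p * p) / n - mean ** 2

def fuse_luma(y_color, y_mono, k=5):
    """Copy black-and-white luma onto color luma where the mono image is sharper."""
    sharper = local_variance(y_mono, k) > local_variance(y_color, k)
    return np.where(sharper, y_mono, y_color)

rng = np.random.default_rng(2)
y_mono = rng.random((32, 32))
y_color = y_mono * 0.9            # a slightly flatter (blurrier) color luma
fused = fuse_luma(y_color, y_mono)
print(fused.shape)                # (32, 32); chroma (U, V) stays with the color image
```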
In one possible implementation manner, in the preview process, after the current shooting scene is determined to be a non-high dynamic scene, the color RAW images acquired by the terminal device through the color camera include a short-exposure color RAW image and a normally exposed color RAW image; the method may further include: fusing the short-exposure color RAW image and the normally exposed color RAW image using a staggered-exposure high-dynamic-range (Stagger HDR) technique; and displaying the fused color image in the preview process.
In one possible implementation manner, in the previewing process, after the terminal device collects the color RAW image through the color camera, the normally exposed color RAW image may be cached.
In the image processing method of the terminal provided by the embodiment of the application, the terminal device first performs RAW domain processing on the color RAW image, converts it into a color YUV image, and then fuses it with the black-and-white YUV image, so that more image detail can be retained and the color fidelity of the target color image is improved.
In a second aspect, an embodiment of the present application provides an image processing apparatus for a terminal, where the apparatus is included in a terminal device, and the apparatus has a function of implementing a behavior of the terminal device in the first aspect and possible implementations of the first aspect. The functions may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules or units corresponding to the above-described functions. Such as a receiving module or unit, a processing module or unit, a transmitting module or unit, etc.
In a third aspect, an embodiment of the present application provides a terminal device, including: a black-and-white camera and a color camera; one or more processors; a memory; a plurality of application programs; and one or more computer programs, wherein the one or more computer programs are stored in the memory, and the one or more computer programs comprise instructions which, when executed by the terminal device, cause the terminal device to perform the steps of: after the photographing function of the terminal device is started, acquiring a color RAW image through the color camera and a black-and-white RAW image through the black-and-white camera in the preview process; after a shooting instruction is obtained, obtaining, in response to the shooting instruction, the color RAW image acquired in the preview process, and performing RAW domain processing on it to obtain a color YUV image; processing the black-and-white RAW image acquired in the preview process through a second Bayer domain processing algorithm link to obtain a black-and-white YUV image; and fusing the color YUV image and the black-and-white YUV image to obtain a target color image.
In one possible implementation manner, when the instruction is executed by the terminal device, the step of causing the terminal device to execute the step of acquiring the color RAW image acquired in the preview process in response to the shooting instruction includes: and responding to the shooting instruction, and acquiring the color RAW image acquired in the preview process from the cache.
In one possible implementation manner, when the instruction is executed by the terminal device, before the terminal device performs the step of performing RAW domain processing on the color RAW image acquired in the preview process to obtain the color YUV image, the following step is further executed: performing linearization and dead pixel correction processing on the color RAW image acquired in the preview process through the first Bayer domain processing algorithm link.
In one possible implementation manner, when the instruction is executed by the terminal device, before the terminal device executes the step of obtaining the shooting instruction, the following steps are further executed: and determining the current shooting scene as a non-high dynamic scene according to the color RAW image acquired in the previewing process.
In one possible implementation manner, the color RAW image collected in the preview process includes at least two frames of color RAW images; when the instruction is executed by the terminal device, the step of performing RAW domain processing on the color RAW image acquired in the preview process to obtain the color YUV image may include: preprocessing the at least two frames of color RAW images; performing noise reduction on the preprocessed at least two frames of color RAW images so as to fuse them into one frame of color RAW image; converting the fused color RAW image into a color RGB image; adjusting the color and brightness of the color RGB image; and converting the adjusted color RGB image into a color YUV image.
In one possible implementation manner, when the instruction is executed by the terminal device, the step of causing the terminal device to preprocess the at least two frames of color RAW images includes: aligning the sizes of the at least two frames of color RAW images; mapping pixels between every two frames of the at least two frames of color RAW images to obtain correspondences between pixels; and performing error detection on the correspondences and correcting erroneous correspondences.
In one possible implementation manner, when the instruction is executed by the terminal device, the step of causing the terminal device to fuse the color YUV image and the black-and-white YUV image to obtain the target color image may be: aligning the sizes of the color YUV image and the black-and-white YUV image; determining, from the size-aligned color YUV image and the size-aligned black-and-white YUV image, the region to be fused in the color YUV image and the region to be fused in the black-and-white YUV image; mapping the pixels in the region to be fused of the color YUV image to the pixels in the region to be fused of the black-and-white YUV image to obtain correspondences between pixels; performing error detection on the correspondences and correcting erroneous correspondences; and fusing the sharp pixels in the regions to be fused of the color YUV image and of the black-and-white YUV image onto the corresponding blurred pixels.
In one possible implementation manner, in the preview process, after the current shooting scene is determined to be a non-high dynamic scene, the color RAW images acquired by the terminal device through the color camera include a short-exposure color RAW image and a normally exposed color RAW image; when the instruction is executed by the terminal device, the terminal device further performs the following steps: fusing the short-exposure color RAW image and the normally exposed color RAW image using a staggered-exposure high-dynamic-range (Stagger HDR) technique; and displaying the fused color image in the preview process.
In one possible implementation manner, when the instruction is executed by the terminal device, the terminal device executes the following steps after the step of acquiring the color RAW image by the color camera in the preview process: the normally exposed color RAW image is buffered.
It should be understood that the second aspect and the third aspect of the embodiment of the present application are consistent with the technical solution of the first aspect of the embodiment of the present application, and beneficial effects achieved by various aspects and corresponding possible implementation manners are similar and will not be described again.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium, in which a computer program is stored, and when the computer program runs on a computer, the computer program causes the computer to execute the method provided in the first aspect.
In a fifth aspect, the present application provides a computer program, which is used to execute the method provided in the first aspect when the computer program is executed by a computer.
In a possible design, the program in the fifth aspect may be stored in whole or in part on a storage medium packaged with the processor, or in part or in whole on a memory not packaged with the processor.
Drawings
Fig. 1 is a schematic structural diagram of a terminal device according to an embodiment of the present application;
Fig. 2 (a) is a schematic view of a display interface of a terminal device provided in an embodiment of the present application;
Fig. 2 (b) is a schematic diagram of a display interface of a terminal device provided in another embodiment of the present application;
Fig. 3 is a schematic diagram of an image processing method of a terminal according to an embodiment of the present application;
Fig. 4 is a schematic diagram of an image processing method of a terminal according to another embodiment of the present application;
Fig. 5 is a flow chart of color and black-and-white image fusion provided by an embodiment of the present application;
Fig. 6 is a schematic structural diagram of a terminal device according to another embodiment of the present application.
Detailed Description
The terminology used in the description of the embodiments section of the present application is for the purpose of describing particular embodiments of the present application only and is not intended to be limiting of the present application.
In an image fusion scheme provided by the prior art, the original (RAW) images acquired by a color camera and a black-and-white camera are respectively converted into a color YUV image and a black-and-white YUV image, and the two YUV images are then fused. This scheme loses a considerable amount of image detail, so the color fidelity of the final image is poor.
Particularly in backlit landscape and/or still-life scenes, some details, especially high-frequency details, cannot be fully recovered because of the limited capability of a single imaging device and the limited detail-enhancement capability of the algorithm.
Based on the above problems, the embodiment of the present application provides an image processing method for a terminal, in which the terminal device performs RAW domain processing on the color RAW image acquired by the color camera, converts it into a color YUV image, and fuses the color YUV image with a black-and-white YUV image, thereby retaining more image detail and improving the color fidelity of the final image.
The image processing method of the terminal provided by the embodiment of the application can be applied to a terminal device, where the terminal device may be a smart phone, a tablet computer, a wearable device, a vehicle-mounted device, an augmented reality (AR)/virtual reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), or the like; the embodiment of the application does not limit the specific type of the terminal device.
Exemplarily, fig. 1 is a schematic structural diagram of a terminal device according to an embodiment of the present application. As shown in fig. 1, the terminal device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identity module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation to the terminal device 100. In other embodiments of the present application, terminal device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bidirectional synchronous serial bus including a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through an I2C bus interface to implement a touch function of the terminal device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 through an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through the I2S interface, so as to implement a function of receiving a call through a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, audio module 170 and wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 and the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a bluetooth headset.
The MIPI interface may be used to connect the processor 110 with peripheral devices such as the display screen 194, the camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture function of terminal device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the terminal device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the terminal device 100, and may also be used to transmit data between the terminal device 100 and a peripheral device. It can also be used to connect an earphone and play audio through the earphone. The interface may also be used to connect other electronic devices, such as AR devices.
It should be understood that the interface connection relationship between the modules illustrated in the embodiment of the present application is only an exemplary illustration, and does not constitute a limitation on the structure of the terminal device 100. In other embodiments of the present application, the terminal device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the terminal device 100. The charging management module 140 may also supply power to the terminal device 100 through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the terminal device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in terminal device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied on the terminal device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the terminal device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), global Navigation Satellite System (GNSS), frequency Modulation (FM), near Field Communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, the antenna 1 of the terminal device 100 is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160, so that the terminal device 100 can communicate with a network and other devices through wireless communication technology. The wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
The terminal device 100 implements a display function by the GPU, the display screen 194, and the application processor, etc. The GPU is a microprocessor for image processing, connected to the display screen 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the terminal device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The terminal device 100 can implement a photographing function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, and the application processor, etc.
The ISP is used to process the data fed back by the camera 193. For example, when a user takes a picture, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, an optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and converting into an image visible to the naked eye. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV and other formats. In some embodiments, the terminal device 100 may include N cameras 193, N being a positive integer greater than 1. In the embodiment of the present application, the N cameras 193 include a black-and-white camera and a color camera.
Specifically, during shooting, the user opens the camera, and light is transmitted through the lens to the photosensitive element (which may correspond to the camera 193 described earlier); in other words, after the lens projects the ambient light signal onto the photosensitive area of the photosensitive element, the photosensitive element performs photoelectric conversion so that the light signal can be converted into an image visible to the naked eye. The photosensitive element transmits the internal original image (Bayer format) to the ISP module, and after algorithm processing the ISP module outputs an image in the RGB spatial domain to a back-end acquisition unit, which displays it in the image preview area or on the display screen of the terminal device 100. In this process, the processor controls the lens, the photosensitive element and the ISP module through a firmware program running on the processor, thereby completing the image preview or shooting function.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the terminal device 100 selects a frequency point, the digital signal processor is used to perform fourier transform or the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The terminal device 100 may support one or more video codecs. In this way, the terminal device 100 can play or record video in a plurality of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor, which processes input information quickly by referring to a biological neural network structure, for example, by referring to a transfer mode between neurons of a human brain, and can also learn by itself continuously. The NPU can implement applications such as intelligent recognition of the terminal device 100, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the terminal device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (such as audio data, a phonebook, etc.) created during use of the terminal device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like. The processor 110 executes various functional applications of the terminal device 100 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The terminal device 100 may implement an audio function through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The terminal device 100 can listen to music through the speaker 170A, or listen to a handsfree call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into a sound signal. When the terminal device 100 answers a call or voice information, it is possible to answer a voice by bringing the receiver 170B close to the human ear.
The microphone 170C, also called a "mike", is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can input a voice signal to the microphone 170C by speaking near the microphone 170C. The terminal device 100 may be provided with at least one microphone 170C. In other embodiments, the terminal device 100 may be provided with two microphones 170C, which may implement a noise reduction function in addition to collecting sound signals. In other embodiments, the terminal device 100 may further include three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement a directional recording function.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used for sensing a pressure signal, and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A can be of a variety of types, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes. The terminal device 100 determines the intensity of the pressure from the change in the capacitance. When a touch operation is applied to the display screen 194, the terminal device 100 detects the intensity of the touch operation based on the pressure sensor 180A. The terminal device 100 may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, the touch operations that are applied to the same touch position but different touch operation intensities may correspond to different operation instructions. For example: and when the touch operation with the touch operation intensity smaller than the first pressure threshold value acts on the short message application icon, executing an instruction for viewing the short message. And when the touch operation with the touch operation intensity larger than or equal to the first pressure threshold value acts on the short message application icon, executing an instruction of newly building the short message.
The gyro sensor 180B may be used to determine the motion attitude of the terminal device 100. In some embodiments, the angular velocity of terminal device 100 about three axes (i.e., x, y, and z axes) may be determined by gyroscope sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. Illustratively, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the terminal device 100, calculates the distance to be compensated for the lens module according to the shake angle, and allows the lens to counteract the shake of the terminal device 100 through a reverse movement, thereby achieving anti-shake. The gyroscope sensor 180B may also be used for navigation, somatosensory gaming scenes.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, the terminal device 100 calculates an altitude from the barometric pressure measured by the barometric pressure sensor 180C, and assists in positioning and navigation.
The magnetic sensor 180D includes a Hall sensor. The terminal device 100 may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the terminal device 100 is a flip phone, the terminal device 100 may detect the opening and closing of the flip cover according to the magnetic sensor 180D. Features such as automatic unlocking upon flipping open can then be set according to the detected opening or closing state of the holster or flip cover.
The acceleration sensor 180E can detect the magnitude of acceleration of the terminal device 100 in various directions (generally, three axes). The magnitude and direction of gravity may be detected when the terminal device 100 is stationary. It can also be used to identify the posture of the electronic device, and is applied to landscape/portrait switching, pedometers, and the like.
A distance sensor 180F is used for measuring distance. The terminal device 100 may measure the distance by infrared or laser. In some embodiments, in a shooting scene, the terminal device 100 may use the distance sensor 180F to measure distance to achieve fast focusing.
The proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The terminal device 100 emits infrared light to the outside through the light emitting diode. The terminal device 100 detects infrared reflected light from a nearby object using the photodiode. When sufficient reflected light is detected, it can be determined that there is an object near the terminal device 100. When insufficient reflected light is detected, the terminal device 100 can determine that there is no object nearby. The terminal device 100 can utilize the proximity light sensor 180G to detect that the user holds the terminal device 100 close to the ear for talking, so as to automatically turn off the screen to save power. The proximity light sensor 180G may also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense ambient light brightness. The terminal device 100 may adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the terminal device 100 is in a pocket, in order to prevent accidental touches.
The fingerprint sensor 180H is used to collect a fingerprint. The terminal device 100 can utilize the collected fingerprint characteristics to realize fingerprint unlocking, access to an application lock, fingerprint photographing, fingerprint incoming call answering and the like.
The temperature sensor 180J is used to detect temperature. In some embodiments, the terminal device 100 executes a temperature processing policy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds the threshold, the terminal device 100 performs a reduction in performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the terminal device 100 heats the battery 142 when the temperature is below another threshold to avoid the terminal device 100 being abnormally shut down due to low temperature. In other embodiments, when the temperature is lower than a further threshold, the terminal device 100 performs boosting on the output voltage of the battery 142 to avoid abnormal shutdown due to low temperature.
The touch sensor 180K is also called a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided via the display screen 194. In other embodiments, the touch sensor 180K may be disposed on the surface of the terminal device 100, different from the position of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of the human vocal part vibrating the bone mass. The bone conduction sensor 180M may also contact the human pulse to receive the blood pressure pulsation signal. In some embodiments, bone conduction sensor 180M may also be provided in a headset, integrated into a bone conduction headset. The audio module 170 may analyze a voice signal based on the vibration signal of the bone mass vibrated by the sound part acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor can analyze heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, so that the heart rate detection function is realized.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The terminal device 100 may receive a key input, and generate a key signal input related to user settings and function control of the terminal device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration prompts as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also respond to different vibration feedback effects for touch operations applied to different areas of the display screen 194. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be brought into and out of contact with the terminal device 100 by being inserted into the SIM card interface 195 or being pulled out of the SIM card interface 195. The terminal device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, etc. The same SIM card interface 195 can be inserted with multiple cards at the same time. The types of the plurality of cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards. The SIM card interface 195 may also be compatible with external memory cards. The terminal device 100 interacts with the network through the SIM card to implement functions such as communication and data communication. In some embodiments, the terminal device 100 employs eSIM, namely: an embedded SIM card. The eSIM card may be embedded in the terminal device 100 and cannot be separated from the terminal device 100.
For ease of understanding, the bayer domain and the RAW domain mentioned in the following examples are explained first.
1. Bayer domain: each pixel of a digital camera sensor has a light-sensitive element that measures only the brightness of light; to obtain a full-color image, three separate measurements would in principle be needed for the red, green and blue primaries. To reduce cost and volume, manufacturers usually adopt a single CCD or CMOS image sensor covered with a color filter array. The original image output by such a CMOS image sensor is in the Bayer-domain RGB format, in which each pixel contains only one color value; to obtain the gray value of the image, the missing color information of each pixel must first be interpolated, and the gray value of each pixel can then be calculated. That is, the Bayer domain refers to the raw picture format inside the digital camera.
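To make the interpolate-then-compute-gray order concrete, the toy numpy sketch below builds an RGGB Bayer mosaic from a full RGB image (so each pixel keeps a single color value), fills in the missing channels with a crude per-quad interpolation, and only then computes the gray value. It is purely illustrative, not a production demosaicing algorithm.

```python
import numpy as np

def to_bayer(rgb):
    """Sample an RGGB mosaic: each pixel keeps only one of its three color values."""
    h, w, _ = rgb.shape
    bayer = np.zeros((h, w))
    bayer[0::2, 0::2] = rgb[0::2, 0::2, 0]   # R
    bayer[0::2, 1::2] = rgb[0::2, 1::2, 1]   # G
    bayer[1::2, 0::2] = rgb[1::2, 0::2, 1]   # G
    bayer[1::2, 1::2] = rgb[1::2, 1::2, 2]   # B
    return bayer

def demosaic_toy(bayer):
    """Toy interpolation: fill each 2x2 quad with its own R, mean of its Gs, and B."""
    r = np.repeat(np.repeat(bayer[0::2, 0::2], 2, axis=0), 2, axis=1)
    g = (bayer[0::2, 1::2] + bayer[1::2, 0::2]) / 2.0
    g = np.repeat(np.repeat(g, 2, axis=0), 2, axis=1)
    b = np.repeat(np.repeat(bayer[1::2, 1::2], 2, axis=0), 2, axis=1)
    return np.stack([r, g, b], axis=-1)

rng = np.random.default_rng(3)
rgb = rng.random((4, 4, 3))
gray = demosaic_toy(to_bayer(rgb)) @ np.array([0.299, 0.587, 0.114])
print(gray.shape)  # (4, 4): gray values computed only after color interpolation
```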
2. RAW field: RAW domain images, i.e., RAW images, contain data processed from an image sensor of a digital camera, scanner, or motion picture film scanner. This is so named because the RAW domain image has not been processed, printed or used for editing. The RAW domain image contains the most original information of the image and is not subjected to the nonlinear processing in the ISP process.
The following embodiments of the present application will specifically describe, by taking a terminal device having a structure shown in fig. 1 as an example, an image processing method of a terminal provided in the embodiments of the present application with reference to the accompanying drawings and application scenarios.
Specifically, after a user taps the "camera" icon in the display interface of the terminal device, the terminal device runs the photographing function in response to the user's operation. In the preview process, the terminal device acquires a color RAW image through the color camera and a black-and-white RAW image through the black-and-white camera; the terminal device then determines, according to the color RAW image acquired in the preview process, that the current shooting scene is a non-high dynamic scene, that is, a scene with sufficient illumination and a small brightness span, for example, buildings or landscapes. After determining that the current shooting scene is a non-high dynamic scene, the terminal device may indicate in the current display interface that the current shooting scene is a non-high dynamic scene, as shown in fig. 2 (a); fig. 2 (a) is a schematic view of a display interface of the terminal device provided in an embodiment of the present application, taking a building as an example of the non-high dynamic scene.
Specifically, the terminal device may determine that the current shooting scene is a non-high dynamic scene according to the color RAW image acquired in the preview process as follows: the terminal device obtains the maximum luminance and the minimum luminance in the color RAW image acquired in the preview process and calculates the ratio of the maximum luminance to the minimum luminance; if the ratio is smaller than a preset ratio threshold, the current shooting scene can be determined to be a non-high dynamic scene. The preset ratio threshold may be set according to system performance and/or implementation requirements, and its size is not limited in this embodiment.
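A minimal sketch of this decision rule, assuming linear RAW luminance values and an arbitrary example threshold (the patent deliberately leaves the threshold to system performance and implementation requirements):

```python
import numpy as np

def is_non_hdr_scene(raw, ratio_threshold=16.0, floor=1e-3):
    """Classify as non-high-dynamic if max/min preview luminance stays below threshold."""
    lum_max = float(raw.max())
    lum_min = max(float(raw.min()), floor)   # guard against division by zero
    return lum_max / lum_min < ratio_threshold

rng = np.random.default_rng(4)
evenly_lit = 0.4 + 0.1 * rng.random((8, 8))                 # small brightness span
backlit = np.where(rng.random((8, 8)) > 0.5, 0.95, 0.001)   # huge brightness span
print(is_non_hdr_scene(evenly_lit))  # True  -> non-high dynamic scene
print(is_non_hdr_scene(backlit))     # False -> high dynamic scene
```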
In addition, it should be noted that the current shooting scene may be determined by the terminal device in the above manner, or may be selected by the user in the interface of the shooting function.
In addition, if in the preview process, after the terminal device detects that the black-and-white camera in the N cameras 193 is blocked, a prompt message of "the black-and-white camera is blocked" may be displayed in the current display interface, as shown in fig. 2 (b), to remind the user that the black-and-white camera is blocked, so that the user may remove the blocking object of the black-and-white camera as soon as possible, and the quality of the captured image is ensured. Fig. 2 (b) is a schematic diagram of a display interface of a terminal device provided in another embodiment of the present application.
Further, referring to fig. 3, fig. 3 is a schematic diagram of an image processing method of a terminal according to an embodiment of the present application, and as can be seen from fig. 3, after it is determined that a current shooting scene is a non-high dynamic scene (35), a terminal device obtains a shooting instruction, and in response to the shooting instruction, the terminal device obtains a color RAW image acquired in a preview process. Specifically, the terminal device responds to the shooting instruction, and can acquire a color RAW image acquired in a preview process from a cache; that is to say, in the preview process, after the terminal device collects the color RAW image through the color camera, the collected color RAW image may be stored in the cache, and after the shooting instruction is obtained, the color RAW image collected in the preview process is obtained from the cache.
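This zero-shutter-lag-style cache can be sketched as a bounded ring buffer: preview frames are appended continuously, old frames fall out, and the shutter simply reads back the most recent frames. The capacity, the frame count, and the string stand-ins for RAW frames are illustrative assumptions.

```python
from collections import deque

class PreviewCache:
    """Ring buffer holding the most recent normally exposed color RAW frames."""
    def __init__(self, capacity=8):
        self.frames = deque(maxlen=capacity)   # old frames fall out automatically

    def on_preview_frame(self, raw_frame):
        self.frames.append(raw_frame)          # called for every preview frame

    def on_shutter(self, n=4):
        """On a shooting instruction, hand the latest n cached frames to the
        RAW-domain pipeline instead of waiting for a fresh exposure."""
        return list(self.frames)[-n:]

cache = PreviewCache()
for i in range(10):
    cache.on_preview_frame(f"raw_frame_{i}")
print(cache.on_shutter())   # ['raw_frame_6', ..., 'raw_frame_9']
```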
Then, the terminal device performs linearization and dead pixel correction on the color RAW image acquired during preview through the Bayer domain processing algorithm link 31. The Bayer domain processing algorithm link 31 is the first Bayer domain processing algorithm link and is configured to process the color RAW image.
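A rough sketch of these two Bayer-domain steps is shown below; the black level, white level, and outlier threshold are assumed calibration values for illustration, whereas a real ISP would use per-channel calibrated data:

```python
import numpy as np

BLACK_LEVEL, WHITE_LEVEL = 64, 1023  # assumed 10-bit sensor levels

def linearize(raw: np.ndarray) -> np.ndarray:
    # Subtract the black level and rescale the signal to [0, 1].
    return np.clip((raw.astype(np.float32) - BLACK_LEVEL)
                   / (WHITE_LEVEL - BLACK_LEVEL), 0.0, 1.0)

def correct_dead_pixels(lin: np.ndarray, thresh: float = 0.25) -> np.ndarray:
    # Compare each pixel with the mean of its same-color Bayer neighbors
    # (2 pixels away) and replace strong outliers; expects input in [0, 1].
    padded = np.pad(lin, 2, mode="reflect")
    neigh = (padded[:-4, 2:-2] + padded[4:, 2:-2]
             + padded[2:-2, :-4] + padded[2:-2, 4:]) / 4.0
    bad = np.abs(lin - neigh) > thresh
    return np.where(bad, neigh, lin)
```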
Further, the terminal device performs RAW domain processing 32 on the color RAW image acquired in the preview process to obtain a color YUV image.
For the black-and-white RAW image acquired by the black-and-white camera during the preview process, the terminal device, in response to the shooting instruction, obtains the black-and-white RAW image acquired during preview and processes it through the Bayer domain processing algorithm link 33 to obtain a black-and-white YUV image. The Bayer domain processing algorithm link 33 is the second Bayer domain processing algorithm link and is configured to process the black-and-white RAW image.
Finally, the terminal device fuses (34) the color YUV image and the black-and-white YUV image to obtain a target color image, and displays the target color image (namely, the photographed image in fig. 3).
In addition, referring to fig. 3, during the preview process, after it is determined that the current shooting scene is a non-high dynamic scene, the color RAW images acquired by the terminal device through the color camera may include a short-time exposed color RAW image and a normally exposed color RAW image. In this way, the terminal device may use a staggered-exposure high dynamic range (Stagger HDR) technique to fuse (36) the short-time exposed color RAW image and the normally exposed color RAW image, and then display the fused color image (37) during preview.
In a specific implementation, the terminal device may process the normally exposed color RAW image through the image processing front end 0 and the short-time exposed color RAW image through the image processing front end 1, then fuse (36) the two processed color RAW images using Stagger HDR, and preview and display (37) the fused color image. When fusing the color RAW images processed by the image processing front end 0 and the image processing front end 1 with Stagger HDR, the fusion ratio of the two may be determined according to the illumination intensity of the current shooting scene. For example, if the current shooting scene is sufficiently illuminated and front-lit, the fusion ratios of the color RAW images processed by the image processing front end 0 and the image processing front end 1 may be set to 95% and 5%, respectively, so that the effect of the fused color image is close to that of the unfused image.
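The illumination-dependent blend can be sketched as below; the weight table is an assumption built around the 95%/5% example above, and a production pipeline would fuse per pixel rather than per frame:

```python
import numpy as np

def stagger_hdr_fuse(normal: np.ndarray, short: np.ndarray,
                     scene_lux: float) -> np.ndarray:
    # Bright, front-lit scenes lean almost entirely on the normal exposure;
    # darker or higher-contrast scenes take more of the short exposure to
    # protect highlights. The 500 lux cut-off is illustrative.
    w_normal = 0.95 if scene_lux > 500.0 else 0.70
    return w_normal * normal + (1.0 - w_normal) * short
```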
In addition, for the black-and-white RAW images acquired by the black-and-white camera during the preview process, one path is sent to the Bayer domain processing algorithm link 33 for processing, and the other path is sent to the image processing front end 2 for processing; the black-and-white RAW image processed by the image processing front end 2 is then sent to the 3A statistics/semantic understanding module 38, which implements automatic color, brightness and/or focusing functions mainly according to the black-and-white RAW image, and identifies the semantic content of the image for use in decision making.
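As a loose illustration, module 38 might derive statistics of the following kind from the mono frame; the names and measures are assumptions, and the semantic-understanding part is omitted:

```python
import numpy as np

def compute_3a_stats(raw: np.ndarray) -> dict:
    mean_luma = float(raw.mean())                      # auto-exposure input
    gy, gx = np.gradient(raw.astype(np.float32))       # local gradients
    focus_measure = float((gx ** 2 + gy ** 2).mean())  # auto-focus contrast
    return {"mean_luma": mean_luma, "focus_measure": focus_measure}
```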
The following describes in detail an image processing method of a terminal according to an embodiment of the present application with reference to fig. 4, and fig. 4 is a schematic diagram of an image processing method of a terminal according to another embodiment of the present application.
In this embodiment, during the preview process the terminal device acquires a color RAW image through the color camera and determines, according to the color RAW image acquired during preview, that the current shooting scene is a non-high dynamic scene. Thereafter, the images acquired by the terminal device through the color camera include a normally exposed color RAW image and a short-time exposed color RAW image, and the normally exposed color RAW image is cached in the color image cache.
For the black-and-white RAW image acquired through the black-and-white camera, the terminal device caches it in the black-and-white image cache; this black-and-white RAW image is a normally exposed black-and-white RAW image.
Next, the terminal device acquires a shooting instruction, and acquires 8 frames of normally exposed color RAW images from the color image buffer in response to the shooting instruction.
Then, the terminal device performs linearization and dead pixel correction on the 8 frames of color RAW images acquired during preview through the Bayer domain processing algorithm link 41. In this embodiment, the Bayer domain processing algorithm link 41 in fig. 4 corresponds to the Bayer domain processing algorithm link 31 in fig. 3 and is used to process the color RAW images.
Next, the terminal device caches the 8 frames of color RAW images after linearization and dead pixel correction in the image cache pool 42, then performs RAW domain processing (43) on the 8 frames of color RAW images in the image cache pool 42 to obtain a color YUV image, and caches the color YUV image in the image buffer 44. As can be seen from fig. 4, the color YUV image buffered in the image buffer 44 has a bit depth of 10 bits.
Specifically, performing the RAW domain processing (43) on the 8 frames of color RAW images in the image cache pool 42 to obtain the color YUV image may include:

Step 1, preprocessing the 8 frames of color RAW images; the preprocessing may include: aligning the sizes of the 8 frames of color RAW images, mapping the pixels between every two frames to obtain correspondences between pixels, performing error detection on the correspondences, and correcting erroneous correspondences.

Step 2, performing noise reduction on the preprocessed 8 frames of color RAW images so as to fuse them into one frame of color RAW image.

Step 3, converting the fused color RAW image into a color RGB image.

Step 4, adjusting the color and brightness of the color RGB image; specifically, this may include performing brightness and color processing flows on the color RGB image such as automatic white balance (AWB), lens shading correction (shading), and gamma correction (Gamma).

Step 5, converting the color- and brightness-adjusted color RGB image into a color YUV image.
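A condensed sketch of steps 1 to 5 follows, assuming an RGGB Bayer layout and frames already normalized to [0, 1]. Step 1's correspondence mapping is reduced to an identity alignment and the demosaic is a simple 2x2 binning; both are simplifying assumptions, not this embodiment's method:

```python
import numpy as np

def fuse_frames(frames: list[np.ndarray]) -> np.ndarray:
    # Step 2: multi-frame noise reduction by averaging the aligned frames.
    return np.mean(np.stack(frames), axis=0)

def demosaic_rggb(raw: np.ndarray) -> np.ndarray:
    # Step 3: collapse each 2x2 RGGB quad into one RGB pixel.
    r = raw[0::2, 0::2]
    g = (raw[0::2, 1::2] + raw[1::2, 0::2]) / 2.0
    b = raw[1::2, 1::2]
    return np.stack([r, g, b], axis=-1)

def adjust_color_brightness(rgb: np.ndarray) -> np.ndarray:
    # Step 4: gray-world white balance followed by gamma correction.
    gains = rgb.mean() / (rgb.reshape(-1, 3).mean(axis=0) + 1e-6)
    balanced = np.clip(rgb * gains, 0.0, 1.0)
    return balanced ** (1.0 / 2.2)

def rgb_to_yuv(rgb: np.ndarray) -> np.ndarray:
    # Step 5: BT.601 RGB -> YUV conversion.
    m = np.array([[ 0.299,  0.587,  0.114],
                  [-0.169, -0.331,  0.500],
                  [ 0.500, -0.419, -0.081]])
    return rgb @ m.T

def raw_domain_processing(frames: list[np.ndarray]) -> np.ndarray:
    rgb = demosaic_rggb(fuse_frames(frames))
    return rgb_to_yuv(adjust_color_brightness(rgb))
```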
In addition, with reference to fig. 4, for the black-and-white RAW image collected by the black-and-white camera during the preview process, after the terminal device obtains the shooting instruction, it obtains 1 frame of black-and-white RAW image from the black-and-white image cache in response to the instruction, and sequentially performs linearization, dead pixel correction, Bayer processing, global tone mapping (GTM) and gamma correction (Gamma) on that frame through the Bayer domain processing algorithm link 45 to obtain a black-and-white YUV image, which is cached in the image buffer 46. The black-and-white YUV image buffered in the image buffer 46 has a bit depth of 10 bits. In this embodiment, the Bayer domain processing algorithm link 45 corresponds to the Bayer domain processing algorithm link 33 in fig. 3 and is configured to process the black-and-white RAW image.
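The mono link 45 can be sketched as below; the log-shaped global tone mapping curve and the sensor levels are assumptions for illustration, and dead pixel correction (as sketched earlier) is omitted for brevity:

```python
import numpy as np

BLACK, WHITE = 64, 1023  # assumed 10-bit sensor levels

def mono_bayer_link(raw: np.ndarray) -> np.ndarray:
    # Linearization: subtract the black level, rescale to [0, 1].
    lin = np.clip((raw.astype(np.float32) - BLACK) / (WHITE - BLACK), 0.0, 1.0)
    # Global tone mapping (GTM): a log curve that compresses highlights.
    gtm = np.log1p(8.0 * lin) / np.log1p(8.0)
    y = gtm ** (1.0 / 2.2)  # gamma correction
    # A monochrome YUV image: the luma plane plus neutral (zero) chroma.
    return np.stack([y, np.zeros_like(y), np.zeros_like(y)], axis=-1)
```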
Finally, the terminal device fuses (47) the color YUV image in the image buffer 44 and the black-and-white YUV image in the image buffer 46 to obtain a target color image, and displays the target color image.
Specifically, fig. 5 is a flowchart of color and black-and-white image fusion according to an embodiment of the present application. As shown in fig. 5, fusing the color YUV image and the black-and-white YUV image to obtain the target color image may include:

Step 501, aligning the sizes of the color YUV image and the black-and-white YUV image.

Step 502, acquiring, from the size-aligned color YUV image and the size-aligned black-and-white YUV image, the region to be fused in the color YUV image and the region to be fused in the black-and-white YUV image.

Step 503, mapping the pixels in the region to be fused in the color YUV image to the pixels in the region to be fused in the black-and-white YUV image, to obtain correspondences between the pixels.

Step 504, performing error detection on the correspondences, and correcting erroneous correspondences.

Step 505, fusing the sharp pixels in the regions to be fused of the color YUV image and the black-and-white YUV image onto the corresponding blurred pixels. For example, suppose pixel A in the region to be fused of the color YUV image corresponds to pixel A' in the region to be fused of the black-and-white YUV image. In one case, if pixel A is the sharp one of the two and pixel A' is blurred, the color of pixel A may be fused onto pixel A'; in the other case, if pixel A is blurred and pixel A' is sharp, the details of pixel A' may be fused onto pixel A.
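Step 505 can be sketched as follows for two size-aligned YUV images; the Laplacian sharpness measure is an illustrative choice, and chroma is always taken from the color image:

```python
import numpy as np

def local_sharpness(y: np.ndarray) -> np.ndarray:
    # Absolute Laplacian response as a per-pixel sharpness measure.
    return np.abs(4 * y
                  - np.roll(y, 1, 0) - np.roll(y, -1, 0)
                  - np.roll(y, 1, 1) - np.roll(y, -1, 1))

def fuse_yuv(color_yuv: np.ndarray, mono_yuv: np.ndarray) -> np.ndarray:
    yc, ym = color_yuv[..., 0], mono_yuv[..., 0]
    mono_sharper = local_sharpness(ym) > local_sharpness(yc)
    fused = color_yuv.copy()
    # Where the mono pixel is the sharp one, take its luma detail;
    # the chroma planes (U, V) keep the color image's color.
    fused[..., 0] = np.where(mono_sharper, ym, yc)
    return fused
```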
In the image processing method of the terminal provided by the embodiment of the present application, the terminal device first performs RAW domain processing on the color RAW image to convert it into a color YUV image, and then fuses the color YUV image with the black-and-white YUV image. In this way more image details can be retained; in particular, when shooting landscapes and/or still objects in a backlit scene, image details, especially high-frequency details, can be better recovered, improving the color reproduction of the target color image.
In addition, during the preview process, after the current shooting scene is determined to be a non-high dynamic scene, the color RAW images acquired by the terminal device through the color camera include a short-time exposed color RAW image and a normally exposed color RAW image. After the two are fused, the fused image is displayed in preview, which reduces the difference between the captured image and the preview image and improves the user experience.
It is to be understood that some or all of the steps or operations in the above-described embodiments are merely examples, and other operations or variations of the various operations may be performed in the embodiments of the present application. Further, the various steps may be performed in a different order than presented in the above-described embodiments, and it is possible that not all of the operations in the above-described embodiments are performed.
It will be appreciated that the terminal device, in order to implement the above-described functions, comprises corresponding hardware and/or software modules for performing the respective functions. The exemplary algorithm steps described in connection with the embodiments disclosed herein may be embodied in hardware or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends upon the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In this embodiment, the terminal device may be divided into function modules according to the method embodiment, for example, each function module may be divided corresponding to each function, or two or more functions may be integrated into one module. The integrated module can be implemented in the form of hardware. It should be noted that the division of the modules in this embodiment is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
Fig. 6 is a schematic structural diagram of a terminal device according to another embodiment of the present application, in which functional modules are divided according to their functions. Fig. 6 shows a possible composition of the terminal device 600 involved in the foregoing embodiments. As shown in fig. 6, the terminal device 600 may include: a receiving unit 601, a processing unit 602, and a sending unit 603;
the processing unit 602 may be configured to support the terminal device 600 to implement the technical solutions described in the embodiments shown in fig. 2 (a) to fig. 5 in this application.
It should be noted that all relevant contents of each step related to the above method embodiment may be referred to the functional description of the corresponding functional module, and are not described herein again.
The present embodiment provides a terminal device 600 for executing the image processing method of the terminal described above, and therefore the same effects as the above method can be achieved.
It should be understood that the terminal device 600 may correspond to the terminal device 100 shown in fig. 1. The functions of the receiving unit 601 and the sending unit 603 may be implemented by the processor 110, the antenna 1 and the mobile communication module 60 in the terminal device 100 shown in fig. 1, and/or by the processor 110, the antenna 2 and the wireless communication module 160; the functions of the processing unit 602 may be implemented by the processor 110 in the terminal device 100 shown in fig. 1.
In case of employing an integrated unit, the terminal device 600 may include a processing module, a storage module, and a communication module.
The processing module may be configured to control and manage the operation of the terminal device 600, and for example, may be configured to support the terminal device 600 to execute the steps executed by the receiving unit 601, the processing unit 602, and the sending unit 603. The memory module may be used to support the terminal device 600 in storing program codes and data, etc. A communication module, which may be used to support communication between the terminal device 600 and other devices.
The processing module may be a processor or a controller, and may implement or execute the various illustrative logical blocks, modules, and circuits described in connection with the present disclosure. The processor may also be a combination that implements computing functions, for example, a combination comprising one or more microprocessors, or a combination of a digital signal processor (DSP) and a microprocessor. The storage module may be a memory. The communication module may specifically be a radio frequency circuit, a Bluetooth chip and/or a Wi-Fi chip, or the like, that interacts with other electronic devices.
In an embodiment, when the processing module is a processor and the storage module is a memory, the terminal device 600 according to this embodiment may be a device having the structure shown in fig. 1.
An embodiment of the present application further provides a computer-readable storage medium, in which a computer program is stored, and when the computer program runs on a computer, the computer is caused to execute the method provided by the embodiment shown in fig. 2 (a) to fig. 5 of the present application.
Embodiments of the present application also provide a computer program product, which includes a computer program and when the computer program runs on a computer, the computer executes the method provided in the embodiments shown in fig. 2 (a) to fig. 5 of the present application.
In the embodiments of the present application, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes an association between associated objects and indicates three possible relationships; for example, "A and/or B" may mean that A exists alone, that both A and B exist, or that B exists alone, where A and B may be singular or plural. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship. "At least one of the following" and similar expressions refer to any combination of the listed items, including any combination of single or plural items. For example, "at least one of a, b, and c" may represent: a; b; c; a and b; a and c; b and c; or a, b and c; where a, b and c may each be singular or plural.
Those of ordinary skill in the art will appreciate that the various elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of electronic hardware and computer software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the technical solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, any function, if implemented in the form of a software functional unit and sold or used as a separate product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, or the part thereof contributing to the prior art, may be embodied in the form of a software product stored in a storage medium, including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a portable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or other media capable of storing program code.
The above description is only for the specific embodiments of the present application, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present disclosure, and all the changes or substitutions should be covered by the protection scope of the present application. The protection scope of the present application shall be subject to the protection scope of the claims.
Claims (20)
1. An image processing method of a terminal, wherein the terminal comprises a black-and-white camera and a color camera, the method comprising:
after the photographing function of the terminal is operated, acquiring a color RAW image through the color camera and acquiring a black and white RAW image through the black and white camera in the previewing process;
after a shooting instruction is obtained, responding to the shooting instruction to obtain a color RAW image collected in a previewing process, and performing RAW domain processing on the color RAW image collected in the previewing process to obtain a color YUV image;
processing the black and white RAW image acquired in the preview process through a second Bayer domain processing algorithm link to obtain a black and white YUV image;
and fusing the color YUV image and the black-and-white YUV image to obtain a target color image.
2. The method according to claim 1, wherein the acquiring a color RAW image acquired during a preview process in response to the photographing instruction includes:
and responding to the shooting instruction, and acquiring the color RAW image acquired in the preview process from a cache.
3. The method according to claim 1, wherein before performing RAW domain processing on the color RAW image acquired during the preview process to obtain a color YUV image, the method further comprises:
and carrying out linearization and dead pixel correction processing on the color RAW image acquired in the preview process through a first Bayer domain processing algorithm link.
4. The method of claim 1, wherein before the obtaining the shooting instruction, further comprising:
and determining the current shooting scene as a non-high dynamic scene according to the color RAW image acquired in the previewing process.
5. The method according to claim 1, wherein the color RAW image acquired during the preview process includes at least two frames of color RAW images, and the RAW domain processing the color RAW image acquired during the preview process to obtain color YUV images includes:
preprocessing the at least two frames of color RAW images;
performing noise reduction processing on the at least two frames of preprocessed color RAW images to fuse the at least two frames of color RAW images into one frame of color RAW image;
converting the color RAW image obtained by fusion into a color RGB image;
adjusting the color and brightness of the color RGB image;
and converting the color RGB image after the color and the brightness are adjusted into a color YUV image.
6. The method of claim 5, wherein the pre-processing the at least two frames of color RAW images comprises:
aligning the sizes of the at least two frames of color RAW images;
mapping pixels in every two frames of images in the at least two frames of color RAW images to obtain a corresponding relation between the pixels;
and carrying out error detection on the corresponding relation, and correcting the corresponding relation with the error.
7. The method according to claim 1, wherein said fusing the color YUV images and the black-and-white YUV images to obtain a target color image comprises:
aligning the sizes of the color YUV image and the black-and-white YUV image;
acquiring a region to be fused in the color YUV image and a region to be fused in the black-and-white YUV image from the size-aligned color YUV image and the size-aligned black-and-white YUV image;
mapping the pixels in the region to be fused in the color YUV image to the pixels in the region to be fused in the black-and-white YUV image to obtain correspondences between the pixels;
performing error detection on the correspondences, and correcting erroneous correspondences;
and fusing the sharp pixels in the regions to be fused of the color YUV image and the black-and-white YUV image onto the corresponding blurred pixels.
8. The method according to claim 4, wherein in the previewing process, after the current shooting scene is determined to be a non-high dynamic scene, the color RAW images acquired by the terminal device through the color camera include a short-time exposure color RAW image and a normal exposure color RAW image;
the method further comprises the following steps:
fusing the short-time exposed color RAW image and the normally exposed color RAW image by using a staggered-exposure high dynamic range technique;
and displaying a color image obtained by fusion in the previewing process.
9. The method of claim 8, wherein the previewing process further comprises, after acquiring a color RAW image by the color camera:
buffering the normally exposed color RAW image.
10. An image processing apparatus of a terminal, characterized by being configured to perform the method of any of claims 1 to 9.
11. A terminal device is characterized by comprising a black-and-white camera and a color camera; one or more processors; a memory; a plurality of application programs; and one or more computer programs, wherein the one or more computer programs are stored in the memory, the one or more computer programs comprising instructions which, when executed by the terminal device, cause the terminal device to perform the steps of:
after the photographing function of the terminal is operated, acquiring a color RAW image through the color camera and acquiring a black-and-white RAW image through the black-and-white camera in the previewing process;
after a shooting instruction is acquired, responding to the shooting instruction to acquire a color RAW image acquired in a previewing process, and performing RAW domain processing on the color RAW image acquired in the previewing process to acquire a color YUV image;
processing the black and white RAW image acquired in the preview process through a second Bayer domain processing algorithm link to obtain a black and white YUV image;
and fusing the color YUV image and the black-and-white YUV image to obtain a target color image.
12. The terminal device according to claim 11, wherein the instructions, when executed by the terminal device, cause the terminal device to perform the step of acquiring a color RAW image acquired during a preview process in response to the photographing instruction includes:
and responding to the shooting instruction, and acquiring the color RAW image acquired in the preview process from a cache.
13. The terminal device according to claim 11, wherein the instructions, when executed by the terminal device, cause the terminal device to perform RAW domain processing on a color RAW image acquired during a preview process, and further perform the following steps before the step of obtaining a color YUV image:
and carrying out linearization and dead pixel correction processing on the color RAW image acquired in the preview process through a first Bayer domain processing algorithm link.
14. The terminal device of claim 11, wherein the instructions, when executed by the terminal device, cause the terminal device to perform the step of obtaining the shooting instructions before performing the step of obtaining the shooting instructions, further perform the steps of:
and determining the current shooting scene as a non-high dynamic scene according to the color RAW image acquired in the previewing process.
15. The terminal device according to claim 11, wherein the color RAW image acquired in the preview process includes at least two frames of color RAW images; when the instruction is executed by the terminal device, the terminal device executes RAW domain processing on the color RAW image acquired in the preview process, and the step of obtaining the color YUV image comprises the following steps:
preprocessing the at least two frames of color RAW images;
performing noise reduction processing on the at least two frames of color RAW images after preprocessing so as to fuse the at least two frames of color RAW images into one frame of color RAW image;
converting the color RAW image obtained by fusion into a color RGB image;
adjusting the color and brightness of the color RGB image;
and converting the color RGB image after the color and the brightness are adjusted into a color YUV image.
16. The terminal device of claim 15, wherein the instructions, when executed by the terminal device, cause the terminal device to perform the preprocessing of the at least two frames of color RAW images comprises:
aligning the sizes of the at least two frames of color RAW images;
mapping pixels in every two frames of images in the at least two frames of color RAW images to obtain a corresponding relation between the pixels;
and carrying out error detection on the corresponding relation, and correcting the corresponding relation with the error.
17. The terminal device of claim 11, wherein the instructions, when executed by the terminal device, cause the terminal device to perform the fusing the color YUV image and the black and white YUV image to obtain a target color image comprises:
aligning the sizes of the color YUV image and the black-and-white YUV image;
acquiring a region to be fused in the color YUV image and a region to be fused in the black-and-white YUV image from the size-aligned color YUV image and the size-aligned black-and-white YUV image;
mapping the pixels in the region to be fused in the color YUV image to the pixels in the region to be fused in the black-and-white YUV image to obtain correspondences between the pixels;
performing error detection on the correspondences, and correcting erroneous correspondences;
and fusing the sharp pixels in the regions to be fused of the color YUV image and the black-and-white YUV image onto the corresponding blurred pixels.
18. The terminal device according to claim 14, wherein in the preview process, after determining that the current shooting scene is a non-high dynamic scene, the color RAW image collected by the terminal device through the color camera includes a short-time exposed color RAW image and a normal exposed color RAW image;
the instructions, when executed by the terminal device, cause the terminal device to further perform the steps of:
fusing the short-time exposed color RAW image and the normally exposed color RAW image by using a staggered-exposure high dynamic range technique;
and displaying the color image obtained by fusion in the previewing process.
19. The terminal device of claim 18, wherein the instructions, when executed by the terminal device, cause the terminal device to perform the following steps after the step of acquiring a color RAW image by the color camera is performed during the preview process:
buffering the normally exposed color RAW image.
20. A computer-readable storage medium, in which a computer program is stored which, when run on a computer, causes the computer to carry out the method according to any one of claims 1-9.
Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
---|---|---|---
CN202110923173.0A (CN115706869A) | 2021-08-12 | 2021-08-12 | Terminal image processing method and device and terminal equipment
Publications (1)

Publication Number | Publication Date
---|---
CN115706869A | 2023-02-17
Family

ID=85180809

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
---|---|---|---
CN202110923173.0A | Terminal image processing method and device and terminal equipment | 2021-08-12 | 2021-08-12

Country Status (1)

Country | Link
---|---
CN | CN115706869A (en)
Cited By (1)

Publication number | Priority date | Publication date | Assignee | Title
---|---|---|---|---
CN116156334A (en) | 2023-02-28 | 2023-05-23 | 维沃移动通信有限公司 | Shooting method, shooting device, electronic equipment and readable storage medium
Similar Documents

Publication | Title
---|---
CN113810600B (en) | Terminal image processing method and device and terminal equipment
CN113810601B (en) | Terminal image processing method and device and terminal equipment
CN112532892B (en) | Image processing method and electronic device
CN111770282B (en) | Image processing method and device, computer readable medium and terminal equipment
CN114489533A (en) | Screen projection method and device, electronic equipment and computer readable storage medium
CN111552451A (en) | Display control method and device, computer readable medium and terminal equipment
CN113810603B (en) | Point light source image detection method and electronic equipment
CN113542580B (en) | Method and device for removing light spots of glasses and electronic equipment
WO2022156555A1 (en) | Screen brightness adjustment method, apparatus, and terminal device
CN113467735A (en) | Image adjusting method, electronic device and storage medium
CN113542613A (en) | Device and method for photographing
CN113436576A (en) | OLED display screen dimming method and device applied to two-dimensional code scanning
CN115631250B (en) | Image processing method and electronic equipment
CN112188094B (en) | Image processing method and device, computer readable medium and terminal equipment
CN115412678B (en) | Exposure processing method and device and electronic equipment
CN115706869A (en) | Terminal image processing method and device and terminal equipment
EP4156168A1 (en) | Image processing method and electronic device
CN113674258B (en) | Image processing method and related equipment
CN117593236A (en) | Image display method and device and terminal equipment
CN115696067B (en) | Image processing method for terminal, terminal device and computer readable storage medium
CN117119314B (en) | Image processing method and related electronic equipment
CN116708317B (en) | Data packet MTU adjustment method and device and terminal equipment
CN115297269B (en) | Exposure parameter determination method and electronic equipment
CN113364067B (en) | Charging precision calibration method and electronic equipment
CN114520870B (en) | Display method and terminal
Legal Events

Code | Title
---|---
PB01 | Publication
SE01 | Entry into force of request for substantive examination