CN113810601B - Terminal image processing method and device and terminal equipment

Info

Publication number
CN113810601B
CN113810601B (application CN202110923315.3A)
Authority
CN
China
Prior art keywords
color
image
black
white
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110923315.3A
Other languages
Chinese (zh)
Other versions
CN113810601A (en)
Inventor
李光源
廖川
许集润
邵涛
周茂森
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority to CN202110923315.3A
Publication of CN113810601A
Application granted
Publication of CN113810601B
Legal status: Active (current)
Anticipated expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95 Computational photography systems, e.g. light-field imaging systems
    • H04N23/951 Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/62 Control of parameters via user interfaces
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/632 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • Computing Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)

Abstract

The embodiments of the present application provide an image processing method and apparatus for a terminal, and a terminal device. The terminal includes a black-and-white camera and a color camera, and the method includes: after the photographing function of the terminal is started, acquiring a color RAW image through the color camera and a black-and-white RAW image through the black-and-white camera during preview; after a shooting instruction is obtained, controlling the color camera and the black-and-white camera to restart, instructing the color camera to image in pixel merging (binning) mode and the black-and-white camera to image at full size; performing RAW-domain processing on the color RAW image acquired by the color camera to obtain a color YUV image; processing the black-and-white RAW image acquired during preview through a first Bayer-domain processing algorithm link to obtain a black-and-white YUV image; and fusing the color YUV image and the black-and-white YUV image to obtain a target color image, thereby retaining more image detail and improving the color reproduction of the target color image.

Description

Terminal image processing method and device and terminal equipment
Technical Field
The embodiment of the application relates to the technical field of intelligent terminals, in particular to a terminal image processing method and device and terminal equipment.
Background
Nowadays, photographing has become a common function of mobile phones. With the popularization of smartphones, cameras are used ever more widely and photographing functions keep improving. Excellent photographing performance has become a major selling point of smartphones: the photographing function is continuously innovated, apertures and sensors keep growing in size, and these upgrades in technology and hardware greatly improve the photographing experience, such as picture quality and the overall shooting experience. At the same time as photo quality is pursued, a light and thin phone body remains a major trend in smartphone development. Therefore, most existing smartphones adopt at least two camera modules, which can improve photo quality while meeting the requirement of a light and thin body.
However, taking two camera modules as an example, the smartphone directly obtains two different images with the two cameras; the two images therefore need to be fused by an image fusion algorithm to obtain the final dual-camera enhanced image.
Existing image fusion methods lose considerable image detail, so the color reproduction of the final image is poor; in particular, when shooting a front-lit, brightly lit static scene, some details, especially high-frequency details, cannot be fully recovered because of the limited capability of a single camera device and the limited detail-enhancement capability of the algorithm.
Disclosure of Invention
The embodiments of the present application provide an image processing method and apparatus for a terminal, a server, and a terminal device, and further provide a computer-readable storage medium, so as to retain more image detail and improve the color reproduction of the final image.
In a first aspect, the present application provides an image processing method for a terminal, where the terminal includes a black-and-white camera and a color camera. The method includes: after the photographing function of the terminal is started, acquiring a color RAW image through the color camera and a black-and-white RAW image through the black-and-white camera during preview; determining, according to the color RAW image acquired during preview, that the current shooting scene is a non-high-dynamic and non-moving scene; after the terminal device obtains a shooting instruction, controlling the color camera and the black-and-white camera to restart in response to the shooting instruction, instructing the color camera to output the acquired color RAW image in pixel merging mode, and instructing the black-and-white camera to output the acquired black-and-white RAW image at full size; then acquiring the color RAW image collected by the color camera and the black-and-white RAW image collected by the black-and-white camera; performing RAW-domain processing on the color RAW image collected by the color camera to obtain a color YUV image, and processing the black-and-white RAW image collected by the black-and-white camera through a first Bayer-domain processing algorithm link to obtain a black-and-white YUV image; and finally, fusing the color YUV image and the black-and-white YUV image to obtain the target color image.
In one possible implementation manner, the acquiring, by the terminal device, the color RAW image collected by the color camera, and the acquiring the black-and-white RAW image collected by the black-and-white camera may be: and acquiring the color RAW image acquired by the color camera from the color image cache, and acquiring the black-and-white RAW image acquired by the black-and-white camera from the black-and-white image cache.
In one possible implementation manner, before the terminal device performs RAW-domain processing on the color RAW image acquired by the color camera to obtain the color YUV image, linearization and dead-pixel correction may be performed on the color RAW image acquired by the color camera through a second Bayer-domain processing algorithm link.
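For illustration only, the following Python sketch shows one way such a pre-processing step could look: a simple linearization (black-level subtraction and normalization) followed by replacement of isolated defective pixels with the median of their neighbors. The black level, white level, and detection threshold are placeholder values, and a real Bayer-domain correction would compare same-color neighbors of the mosaic rather than all eight neighbors; this is not the second Bayer-domain processing algorithm link itself.

```python
import numpy as np

def linearize(raw: np.ndarray, black_level: float = 64.0, white_level: float = 1023.0) -> np.ndarray:
    """Subtract the black level and scale the RAW data to [0, 1] (example levels)."""
    return np.clip((raw.astype(np.float32) - black_level) / (white_level - black_level), 0.0, 1.0)

def correct_dead_pixels(raw: np.ndarray, threshold: float = 0.2) -> np.ndarray:
    """Replace pixels that differ strongly from the median of their 8 neighbors."""
    padded = np.pad(raw, 1, mode="reflect")
    neighbors = np.stack([padded[dy:dy + raw.shape[0], dx:dx + raw.shape[1]]
                          for dy in range(3) for dx in range(3) if not (dy == 1 and dx == 1)])
    med = np.median(neighbors, axis=0)
    out = raw.copy()
    bad = np.abs(raw - med) > threshold
    out[bad] = med[bad]
    return out

raw = np.full((8, 8), 500.0, dtype=np.float32)   # flat mid-gray RAW patch
raw[3, 4] = 0.0                                  # simulated dead (stuck-low) pixel
clean = correct_dead_pixels(linearize(raw))
print(round(float(clean[3, 4]), 3))              # ~0.455, restored to the neighborhood value
```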
In one possible implementation manner, after the terminal device obtains the shooting instruction, the color RAW image collected by the color camera includes at least two frames of color RAW images. In this case, performing RAW-domain processing on the color RAW images collected by the color camera to obtain the color YUV image includes: preprocessing the at least two frames of color RAW images; performing noise reduction on the at least two preprocessed frames so as to fuse them into one frame of color RAW image; converting the fused color RAW image into a color RGB image; adjusting the color and brightness of the color RGB image; and converting the adjusted color RGB image into a color YUV image.
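A very condensed sketch of these steps is shown below, assuming an RGGB mosaic and frames that are already registered. It uses plain frame averaging as a stand-in for multi-frame noise reduction, a half-resolution demosaic that groups each RGGB quad into one RGB pixel as a stand-in for the RAW-to-RGB conversion, example white-balance and brightness gains for the color and brightness adjustment, and the BT.601 matrix for the RGB-to-YUV conversion. All of these choices are simplifications for illustration, not the algorithms actually claimed.

```python
import numpy as np

BT601 = np.array([[ 0.299,  0.587,  0.114],    # Y
                  [-0.169, -0.331,  0.500],    # U (Cb)
                  [ 0.500, -0.419, -0.081]])   # V (Cr)

def fuse_frames(frames):
    """Multi-frame noise reduction, reduced here to a simple temporal average."""
    return np.mean(np.stack(frames, axis=0), axis=0)

def demosaic_rggb_half(raw):
    """Half-resolution demosaic of an RGGB Bayer mosaic (illustrative only)."""
    r = raw[0::2, 0::2]
    g = (raw[0::2, 1::2] + raw[1::2, 0::2]) / 2.0
    b = raw[1::2, 1::2]
    return np.stack([r, g, b], axis=-1)

def adjust_color_brightness(rgb, wb_gains=(1.8, 1.0, 1.6), gain=1.1):
    """Simple per-channel white-balance gains and a global brightness gain (example values)."""
    return np.clip(rgb * np.array(wb_gains) * gain, 0.0, 1.0)

def rgb_to_yuv(rgb):
    yuv = rgb @ BT601.T
    yuv[..., 1:] += 0.5           # center the chroma channels
    return yuv

# Example with two synthetic, already-registered RAW frames in [0, 1].
frames = [np.random.rand(8, 8) for _ in range(2)]
color_yuv = rgb_to_yuv(adjust_color_brightness(demosaic_rggb_half(fuse_frames(frames))))
print(color_yuv.shape)            # (4, 4, 3)
```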
In one possible implementation manner, the preprocessing of the at least two frames of color RAW images by the terminal device includes: aligning the sizes of the at least two frames of color RAW images; mapping pixels between every two of the at least two frames of color RAW images to obtain correspondences between pixels; and performing error detection on the correspondences and correcting the erroneous correspondences.
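One common way to realize the error detection and correction part of this registration is a forward-backward consistency check on the per-pixel correspondences. The sketch below assumes the correspondences are represented as dense displacement fields and that inconsistent vectors are simply replaced with zero displacement; both assumptions are illustrative and are not taken from the patent.

```python
import numpy as np

def check_and_fix_correspondences(fwd, bwd, tol=1.0):
    """fwd, bwd: (H, W, 2) displacement fields frame0->frame1 and frame1->frame0.
    A correspondence is kept only if following it forward and then backward
    returns (approximately) to the starting pixel; otherwise it is zeroed out."""
    h, w, _ = fwd.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Pixel positions reached in frame1 by the forward correspondences.
    x1 = np.clip(np.round(xs + fwd[..., 0]).astype(int), 0, w - 1)
    y1 = np.clip(np.round(ys + fwd[..., 1]).astype(int), 0, h - 1)
    # Round trip: forward displacement plus the backward displacement at the target.
    round_trip = fwd + bwd[y1, x1]
    bad = np.linalg.norm(round_trip, axis=-1) > tol
    fixed = fwd.copy()
    fixed[bad] = 0.0                      # simple correction: fall back to no motion
    return fixed, bad

# Synthetic example: a uniform 2-pixel shift with one corrupted vector.
h, w = 6, 6
fwd = np.tile(np.array([2.0, 0.0]), (h, w, 1))
bwd = np.tile(np.array([-2.0, 0.0]), (h, w, 1))
fwd[3, 3] = [7.0, -5.0]                   # erroneous correspondence
fixed, bad = check_and_fix_correspondences(fwd, bwd)
print(bad.sum(), fixed[3, 3])             # 1 [0. 0.]
```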
In one possible implementation manner, the fusing, by the terminal device, of the color YUV image and the black-and-white YUV image to obtain the target color image may be: aligning the sizes of the color YUV image and the black-and-white YUV image; obtaining, from the size-aligned color YUV image and the size-aligned black-and-white YUV image, the region to be fused in the color YUV image and the region to be fused in the black-and-white YUV image; mapping pixels in the region to be fused of the color YUV image to pixels in the region to be fused of the black-and-white YUV image to obtain correspondences between pixels; performing error detection on the correspondences and correcting the erroneous correspondences; and finally, within the regions to be fused of the color YUV image and the black-and-white YUV image, fusing the sharper pixels onto the corresponding blurred pixels.
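The last step, fusing the sharper pixels onto the corresponding blurred pixels, can be illustrated with a local-contrast comparison between the two luma planes: wherever the black-and-white luma is locally sharper, its detail is carried over into the color luma while the chroma planes are kept. The images are assumed already size-aligned and registered, the whole frame is treated as the region to be fused, and the 4-neighbor high-pass measure and hard selection rule are illustrative choices only.

```python
import numpy as np

def local_contrast(y):
    """Mean absolute difference from the 4-neighborhood as a simple sharpness measure."""
    p = np.pad(y, 1, mode="reflect")
    c = p[1:-1, 1:-1]
    return (np.abs(c - p[:-2, 1:-1]) + np.abs(c - p[2:, 1:-1]) +
            np.abs(c - p[1:-1, :-2]) + np.abs(c - p[1:-1, 2:])) / 4.0

def fuse_yuv(color_yuv, mono_y):
    """Take the luma from whichever image is locally sharper; keep the color chroma."""
    fused = color_yuv.copy()
    take_mono = local_contrast(mono_y) > local_contrast(color_yuv[..., 0])
    fused[..., 0] = np.where(take_mono, mono_y, color_yuv[..., 0])
    return fused

# Synthetic example: the mono luma carries fine detail the color luma lacks.
color_yuv = np.dstack([np.full((8, 8), 0.5), np.full((8, 8), 0.5), np.full((8, 8), 0.5)])
mono_y = np.full((8, 8), 0.5)
mono_y[::2, ::2] = 0.8                               # high-frequency detail only in the mono image
fused = fuse_yuv(color_yuv, mono_y)
print(float(fused[0, 0, 0]), float(fused[0, 1, 0]))  # 0.8 0.5  (detail transferred)
```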
In one possible implementation manner, the size alignment of the color YUV image and the black-and-white YUV image by the terminal device may be: down-sampling the black-and-white YUV image so that its size is reduced to that of the color YUV image; and aligning the down-sampled black-and-white YUV image with the color YUV image.
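A simple way to realize the down-sampling step, under the assumption (made only for this sketch) that the full-size black-and-white luma plane is exactly twice the binned color image in each dimension, is 2x2 averaging:

```python
import numpy as np

def downsample_2x2(mono_y: np.ndarray) -> np.ndarray:
    """Average each 2x2 block of the luma plane (assumes even height and width)."""
    h, w = mono_y.shape
    return mono_y.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# Example: shrink an 8x8 full-size mono luma plane to 4x4,
# matching a binned color image of the same size.
mono_y = np.arange(64, dtype=np.float32).reshape(8, 8)
print(downsample_2x2(mono_y).shape)  # (4, 4)
```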
In the image processing method of the terminal provided by the embodiments of the present application, after determining that the current shooting scene is a non-high-dynamic and non-moving scene, the terminal device controls the color camera and the black-and-white camera to restart, instructs the color camera to output the acquired color RAW image in pixel merging mode, and instructs the black-and-white camera to output the acquired black-and-white RAW image at full size; the terminal device then performs RAW-domain processing on the color RAW image, converts it into a color YUV image, and fuses the color YUV image with the black-and-white YUV image, so that more image detail is retained and the color reproduction of the target color image is improved. In addition, in this embodiment, because the black-and-white camera outputs the acquired black-and-white RAW image at full size, more image detail can also be retained, which improves the image quality of the finally obtained target color image.
In a second aspect, an embodiment of the present application provides an image processing apparatus for a terminal, where the apparatus is included in a terminal device, and the apparatus has a function of implementing a behavior of the terminal device in the first aspect and possible implementations of the first aspect. The functions may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules or units corresponding to the above-described functions. Such as a receiving module or unit, a processing module or unit, a transmitting module or unit, etc.
In a third aspect, an embodiment of the present application provides a terminal device, including: the system comprises a black-and-white camera and a color camera; one or more processors; a memory; a plurality of application programs; and one or more computer programs, wherein the one or more computer programs are stored in the memory, the one or more computer programs comprising instructions which, when executed by the terminal device, cause the terminal device to perform the steps of: after the photographing function of the terminal equipment is operated, acquiring a color RAW image through a color camera and acquiring a black and white RAW image through a black and white camera in the previewing process; determining the current shooting scene as a non-high dynamic and non-motion scene according to the color RAW image acquired in the previewing process; after a shooting instruction is acquired, controlling the color camera and the black-and-white camera to restart in response to the shooting instruction, instructing the color camera to output the acquired color RAW image in a pixel merging mode, and instructing the black-and-white camera to output the acquired black-and-white RAW image in a full-size mode; acquiring a color RAW image acquired by a color camera, and acquiring a black and white RAW image acquired by a black and white camera; carrying out RAW domain processing on a color RAW image acquired by a color camera to obtain a color YUV image; processing the black and white RAW image acquired by the black and white camera through a first Bayer domain processing algorithm link to obtain a black and white YUV image; and fusing the color YUV image and the black-and-white YUV image to obtain a target color image.
In one possible implementation manner, when the instruction is executed by the terminal device, the step of causing the terminal device to execute the steps of acquiring the color RAW image acquired by the color camera and acquiring the black and white RAW image acquired by the black and white camera includes: and acquiring the color RAW image acquired by the color camera from the color image cache, and acquiring the black-and-white RAW image acquired by the black-and-white camera from the black-and-white image cache.
In one possible implementation manner, when the instruction is executed by the terminal device, before the step of performing RAW-domain processing on the color RAW image acquired by the color camera to obtain the color YUV image, the terminal device further performs the following step: performing linearization and dead-pixel correction on the color RAW image acquired by the color camera through a second Bayer-domain processing algorithm link.
In one possible implementation manner, after the shooting instruction is acquired, the color RAW image acquired in the preview process includes at least two frames of color RAW images; when the instruction is executed by the terminal device, the terminal device executes RAW domain processing on the color RAW image acquired by the color camera, and the step of obtaining the color YUV image may include: preprocessing the at least two frames of color RAW images; performing noise reduction processing on the at least two frames of color RAW images after preprocessing so as to fuse the at least two frames of color RAW images into one frame of color RAW image; converting the color RAW image obtained by fusion into a color RGB image; adjusting the color and brightness of the color RGB image; and converting the color RGB image after the color and the brightness are adjusted into a color YUV image.
In one possible implementation manner, when the instruction is executed by the terminal device, the step of causing the terminal device to perform preprocessing on the at least two frames of color RAW images includes: aligning the sizes of the at least two frames of color RAW images; mapping pixels in every two frames of images in the at least two frames of color RAW images to obtain a corresponding relation between the pixels; and carrying out error detection on the corresponding relation, and correcting the corresponding relation with the error.
In one possible implementation manner, when the instruction is executed by the terminal device, the step of causing the terminal device to fuse the color YUV image and the black-and-white YUV image to obtain the target color image may be: aligning the sizes of the color YUV image and the black-and-white YUV image; obtaining, from the size-aligned color YUV image and the size-aligned black-and-white YUV image, the region to be fused in the color YUV image and the region to be fused in the black-and-white YUV image; mapping pixels in the region to be fused of the color YUV image to pixels in the region to be fused of the black-and-white YUV image to obtain correspondences between pixels; performing error detection on the correspondences and correcting the erroneous correspondences; and within the regions to be fused of the color YUV image and the black-and-white YUV image, fusing the sharper pixels onto the corresponding blurred pixels.
In one possible implementation manner, when the above instructions are executed by the terminal device, the step of causing the terminal device to perform size alignment of the color YUV image and the black and white YUV image includes: down-sampling the black-and-white YUV image to reduce the size of the black-and-white YUV image to be the same as that of the color YUV image; and aligning the size of the black-white YUV image and the color YUV image after down sampling.
It should be understood that the second aspect and the third aspect of the embodiment of the present application are consistent with the technical solution of the first aspect of the embodiment of the present application, and beneficial effects achieved by various aspects and corresponding possible implementation manners are similar and will not be described again.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, in which a computer program is stored, and when the computer program runs on a computer, the computer is caused to execute the method provided in the first aspect.
In a fifth aspect, the present application provides a computer program, which is used to execute the method provided in the first aspect when the computer program is executed by a computer.
In a possible design, the program of the fifth aspect may be stored in whole or in part on a storage medium packaged with the processor, or in part or in whole on a memory not packaged with the processor.
Drawings
Fig. 1 is a schematic structural diagram of a terminal device according to an embodiment of the present application;
fig. 2 (a) is a schematic view of a display interface of a terminal device provided in an embodiment of the present application;
fig. 2 (b) is a schematic diagram of a display interface of a terminal device provided in another embodiment of the present application;
fig. 2 (c) is a schematic view of a display interface of a terminal device provided in yet another embodiment of the present application;
fig. 3 is a schematic diagram of an image processing method of a terminal according to an embodiment of the present application;
fig. 4 is a schematic diagram of an image processing method of a terminal according to another embodiment of the present application;
FIG. 5 is a flow chart of color and black-and-white image fusion provided by one embodiment of the present application;
fig. 6 is a schematic structural diagram of a terminal device according to another embodiment of the present application.
Detailed Description
The terminology used in the description of the embodiments section of the present application is for the purpose of describing particular embodiments of the present application only and is not intended to be limiting of the present application.
In an image fusion scheme provided by the prior art, original (RAW) images acquired by a color camera and a black and white camera are respectively converted into a color YUV image and a black and white YUV image, and then the color YUV image and the black and white YUV image are fused.
In particular, when shooting a front-lit, brightly lit static scene, some details, especially high-frequency details, cannot be fully recovered because of the limited capability of a single camera device and the limited detail-enhancement capability of the algorithm.
To address the above problems, the embodiments of the present application provide an image processing method for a terminal, in which the terminal device performs RAW-domain processing on the color RAW image acquired by the color camera, converts it into a color YUV image, and then fuses the color YUV image with the black-and-white YUV image, thereby retaining more image detail and improving the color reproduction of the final image.
The image processing method of the terminal provided by the embodiment of the application can be applied to terminal equipment, wherein the terminal equipment can be equipment such as a smart phone, a tablet computer, wearable equipment, vehicle-mounted equipment, augmented Reality (AR)/Virtual Reality (VR) equipment, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA); the embodiment of the present application does not set any limit to the specific type of the terminal device.
For example, fig. 1 is a schematic structural diagram of a terminal device according to an embodiment of the present disclosure, and as shown in fig. 1, the terminal device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation to the terminal device 100. In other embodiments of the present application, terminal device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. Wherein, the different processing units may be independent devices or may be integrated in one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose-input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bidirectional synchronous serial bus including a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces respectively. For example, the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through the I2C bus interface to implement the touch function of the terminal device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 through an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the I2S interface, so as to implement a function of answering a call through a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, audio module 170 and wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 and the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to implement the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, the processor 110 and the camera 193 communicate through a CSI interface to implement the photographing function of the terminal device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the terminal device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the terminal device 100, and may also be used to transmit data between the terminal device 100 and a peripheral device. And the earphone can also be used for connecting an earphone and playing audio through the earphone. The interface may also be used to connect other electronic devices, such as AR devices and the like.
It should be understood that the interface connection relationship between the modules illustrated in the embodiment of the present application is only an exemplary illustration, and does not constitute a limitation on the structure of the terminal device 100. In other embodiments of the present application, the terminal device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the terminal device 100. The charging management module 140 may also supply power to the terminal device 100 through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the terminal device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in terminal device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied on the terminal device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the terminal device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), global Navigation Satellite System (GNSS), frequency Modulation (FM), near Field Communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, the antenna 1 of the terminal device 100 is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160 so that the terminal device 100 can communicate with the network and other devices through wireless communication technology. The wireless communication technology may include global system for mobile communications (GSM), general Packet Radio Service (GPRS), code Division Multiple Access (CDMA), wideband Code Division Multiple Access (WCDMA), time division code division multiple access (time-division multiple access, TD-SCDMA), long Term Evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a beidou satellite navigation system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The terminal device 100 implements a display function by the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, connected to the display screen 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini LED, a Micro LED, a Micro OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the terminal device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The terminal device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the terminal device 100 may include N cameras 193, N being a positive integer greater than 1. In the embodiment of the present application, the N cameras 193 include a black-and-white camera and a color camera.
Specifically, during shooting, the user turns on the camera, and light is transmitted through the lens to the photosensitive element (which may correspond to the camera 193 described earlier); in other words, after the lens projects the ambient light signal onto the photosensitive area of the photosensitive element, the photosensitive element performs photoelectric conversion and converts the light signal into an image visible to the naked eye. The photosensitive element transmits the internal original image (in Bayer format) to the ISP module, and after algorithm processing the ISP module outputs an image in the RGB spatial domain to a back-end acquisition unit, which displays the image in the image preview area or on the display screen of the terminal device 100. In this process, the processor controls the lens, the photosensitive element and the ISP module through a firmware program running on the processor, thereby completing the image preview or shooting function.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the terminal device 100 selects a frequency point, the digital signal processor is used to perform fourier transform or the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The terminal device 100 may support one or more video codecs. In this way, the terminal device 100 can play or record video in a plurality of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. The NPU can implement applications such as intelligent recognition of the terminal device 100, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the terminal device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (such as audio data, a phonebook, etc.) created during use of the terminal device 100, and the like. In addition, the internal memory 121 may include a high speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a Universal Flash Storage (UFS), and the like. The processor 110 executes various functional applications of the terminal device 100 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The terminal device 100 may implement an audio function through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into analog audio signals for output, and also used to convert analog audio inputs into digital audio signals. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The terminal device 100 can listen to music through the speaker 170A, or listen to a handsfree call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the terminal device 100 answers a call or voice information, it is possible to answer a voice by bringing the receiver 170B close to the human ear.
The microphone 170C, also referred to as a "microphone," is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can input a voice signal to the microphone 170C by speaking near the microphone 170C through the mouth. The terminal device 100 may be provided with at least one microphone 170C. In other embodiments, the terminal device 100 may be provided with two microphones 170C, which may implement a noise reduction function in addition to collecting sound signals. In other embodiments, the terminal device 100 may further include three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used for sensing a pressure signal, and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes. The terminal device 100 determines the intensity of the pressure from the change in the capacitance. When a touch operation is applied to the display screen 194, the terminal device 100 detects the intensity of the touch operation based on the pressure sensor 180A. The terminal device 100 may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, the touch operations that are applied to the same touch position but different touch operation intensities may correspond to different operation instructions. For example: and when the touch operation with the touch operation intensity smaller than the first pressure threshold value acts on the short message application icon, executing an instruction for viewing the short message. And when the touch operation with the touch operation intensity larger than or equal to the first pressure threshold value acts on the short message application icon, executing an instruction of newly building the short message.
The gyro sensor 180B may be used to determine the motion attitude of the terminal device 100. In some embodiments, the angular velocity of terminal device 100 about three axes (i.e., x, y, and z axes) may be determined by gyroscope sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. Illustratively, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the terminal device 100, calculates the distance to be compensated for by the lens module according to the shake angle, and allows the lens to counteract the shake of the terminal device 100 through a reverse movement, thereby achieving anti-shake. The gyroscope sensor 180B may also be used for navigation, somatosensory gaming scenes.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, the terminal device 100 calculates an altitude from the barometric pressure measured by the barometric pressure sensor 180C, and assists in positioning and navigation.
The magnetic sensor 180D includes a hall sensor. The terminal device 100 may detect the opening and closing of the flip holster using the magnetic sensor 180D. In some embodiments, when the terminal device 100 is a flip, the terminal device 100 may detect the opening and closing of the flip according to the magnetic sensor 180D. And then according to the opening and closing state of the leather sheath or the opening and closing state of the flip cover, the automatic unlocking of the flip cover is set.
The acceleration sensor 180E can detect the magnitude of acceleration of the terminal device 100 in various directions (generally, three axes). The magnitude and direction of gravity can be detected when the terminal device 100 is stationary. The method can also be used for identifying the posture of the electronic equipment, and is applied to horizontal and vertical screen switching, pedometers and the like.
A distance sensor 180F for measuring a distance. The terminal device 100 may measure the distance by infrared or laser. In some embodiments, shooting a scene, the terminal device 100 may range using the distance sensor 180F to achieve fast focus.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector, such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The terminal device 100 emits infrared light outward through the light-emitting diode and uses the photodiode to detect infrared light reflected from a nearby object. When sufficient reflected light is detected, it can be determined that there is an object near the terminal device 100; when insufficient reflected light is detected, the terminal device 100 can determine that there is no object nearby. The terminal device 100 can use the proximity light sensor 180G to detect that the user is holding the terminal device 100 close to the ear for a call, so as to automatically turn off the screen to save power. The proximity light sensor 180G may also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light level. The terminal device 100 may adaptively adjust the brightness of the display screen 194 according to the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the terminal device 100 is in a pocket, in order to prevent accidental touches.
The fingerprint sensor 180H is used to collect a fingerprint. The terminal device 100 can utilize the collected fingerprint characteristics to realize fingerprint unlocking, access to an application lock, fingerprint photographing, fingerprint incoming call answering and the like.
The temperature sensor 180J is used to detect temperature. In some embodiments, the terminal device 100 executes a temperature processing policy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds the threshold, the terminal device 100 performs a reduction in performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the terminal device 100 heats the battery 142 when the temperature is below another threshold to avoid the terminal device 100 being abnormally shut down due to low temperature. In other embodiments, when the temperature is lower than a further threshold, the terminal device 100 performs boosting on the output voltage of the battery 142 to avoid abnormal shutdown due to low temperature.
The touch sensor 180K is also called a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided via the display screen 194. In other embodiments, the touch sensor 180K may be disposed on the surface of the terminal device 100, different from the position of the display screen 194.
The bone conduction sensor 180M can acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of the human vocal part vibrating the bone mass. The bone conduction sensor 180M may also contact the human pulse to receive the blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset, integrated into a bone conduction headset. The audio module 170 may analyze a voice signal based on the vibration signal of the bone mass vibrated by the sound part acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor can analyze heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, so as to realize the heart rate detection function.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys. Or may be touch keys. The terminal device 100 may receive a key input, and generate a key signal input related to user setting and function control of the terminal device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues, as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also respond to different vibration feedback effects for touch operations applied to different areas of the display screen 194. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be brought into and out of contact with the terminal device 100 by being inserted into the SIM card interface 195 or being pulled out of the SIM card interface 195. The terminal device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, etc. Multiple cards can be inserted into the same SIM card interface 195 at the same time, and the types of the cards may be the same or different. The SIM card interface 195 is also compatible with different types of SIM cards and with external memory cards. The terminal device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the terminal device 100 employs an eSIM, namely an embedded SIM card. The eSIM card may be embedded in the terminal device 100 and cannot be separated from the terminal device 100.
For ease of understanding, the bayer domain and the RAW domain mentioned in the following examples will be explained first.
1. Bayer domain: each lens of a digital camera has an optical sensor for measuring the brightness of light, but to obtain a full-color image, three sensors would ordinarily be needed to capture the red, green and blue primary-color information separately. To reduce the cost and volume of the digital camera, manufacturers generally adopt a single CCD or CMOS image sensor. The original image output by the CMOS image sensor is typically in the bayer-domain RGB format, in which a single pixel contains only one color value; to obtain the gray value of the image, the complete color information of each pixel must first be interpolated, and the gray value of each pixel is then calculated. That is, the bayer domain refers to the raw picture format inside the digital camera.
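Purely for illustration (this sketch is not part of the original disclosure), the following Python code shows the idea described above: in a bayer-domain image every pixel holds a single color sample, so the missing channels are first interpolated (demosaiced) and only then can a per-pixel gray value be computed. The RGGB layout, the bilinear interpolation and the BT.601 luma weights are assumptions made for the sketch.

import numpy as np
from scipy.signal import convolve2d

def bayer_to_gray(raw):
    # Minimal bilinear demosaic of an RGGB Bayer image (layout assumed),
    # followed by a per-pixel gray value; real ISPs use far more elaborate steps.
    h, w = raw.shape
    r_mask = np.zeros((h, w), dtype=bool); r_mask[0::2, 0::2] = True
    b_mask = np.zeros((h, w), dtype=bool); b_mask[1::2, 1::2] = True
    g_mask = ~(r_mask | b_mask)
    kernel = np.ones((3, 3), dtype=np.float32)
    rgb = np.empty((h, w, 3), dtype=np.float32)
    for c, mask in enumerate((r_mask, g_mask, b_mask)):
        samples = np.where(mask, raw, 0).astype(np.float32)
        counts = convolve2d(mask.astype(np.float32), kernel, mode="same")
        interp = convolve2d(samples, kernel, mode="same") / np.maximum(counts, 1e-6)
        # Keep the measured samples, interpolate only the missing positions.
        rgb[..., c] = np.where(mask, raw, interp)
    # Gray value of each full-color pixel (BT.601 luma weights).
    return 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]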
2. RAW domain: a RAW domain image, i.e., a RAW image, contains the data produced by the image sensor of a digital camera, scanner, or motion picture film scanner. It is so named because the RAW domain image has not yet been processed, printed, or edited. The RAW domain image contains the most original information of the image and has not undergone the nonlinear processing of the ISP pipeline.
The following embodiments of the present application will specifically describe, by taking a terminal device having a structure shown in fig. 1 as an example, an image processing method of a terminal provided in the embodiments of the present application with reference to the accompanying drawings and application scenarios.
Specifically, after a user clicks the camera icon in the display interface of the terminal device, the terminal device runs the photographing function in response to the user's operation. During preview, the terminal device acquires a color RAW image through the color camera and a black-and-white RAW image through the black-and-white camera, and then determines, from the color RAW images acquired during preview, that the current shooting scene is a non-high-dynamic and non-moving scene. A non-high-dynamic scene is a scene with sufficient illumination and a small brightness span; a non-moving scene is one in which the objects in the current shooting scene show no displacement, i.e., they are stationary. Examples of non-high-dynamic and non-moving scenes are buildings or landscapes shot under sufficient illumination.
After determining that the current shooting scene is a non-high-dynamic and non-moving scene, the terminal device may indicate this in the current display interface, as shown in fig. 2 (a). Fig. 2 (a) is a schematic view of the display interface of the terminal device provided in an embodiment of the present application, illustrated with a building as the non-high-dynamic, non-moving scene.
Specifically, the terminal device may determine from the color RAW image acquired during preview that the current shooting scene is a non-high-dynamic scene as follows: the terminal device obtains the maximum brightness and the minimum brightness in the color RAW image acquired during preview and calculates the ratio of the maximum brightness to the minimum brightness; if the ratio is smaller than a predetermined ratio threshold, the current shooting scene can be determined to be a non-high-dynamic scene. The predetermined ratio threshold may be set according to system performance and/or implementation requirements, and its size is not limited in this embodiment. In addition, the terminal device may obtain the displacement of an object in the current shooting scene from at least two frames of color RAW images acquired during preview; if the displacement is less than or equal to a predetermined displacement threshold, the current shooting scene can be determined to be a non-moving scene.
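As a non-limiting sketch of the scene decision described above, the brightness-ratio test and a simple global-displacement estimate could look as follows; the threshold values are arbitrary placeholders, and the phase-correlation shift estimate merely stands in for whatever motion estimation the embodiment actually uses.

import numpy as np

def is_non_high_dynamic(color_raw, ratio_threshold=8.0):
    # Ratio of maximum to minimum brightness in the preview RAW frame; a ratio
    # below the threshold is treated as a non-high-dynamic scene. The threshold
    # value here is a placeholder, not one taken from this disclosure.
    max_lum = float(color_raw.max())
    min_lum = max(float(color_raw.min()), 1.0)  # guard against division by zero
    return (max_lum / min_lum) < ratio_threshold

def estimate_global_shift(prev_frame, curr_frame):
    # Phase correlation: the peak of the cross-power spectrum gives the global shift.
    f = np.fft.fft2(prev_frame) * np.conj(np.fft.fft2(curr_frame))
    corr = np.fft.ifft2(f / np.maximum(np.abs(f), 1e-9))
    peak = np.unravel_index(np.argmax(np.abs(corr)), corr.shape)
    # Shifts larger than half the image size wrap around to negative values.
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

def is_non_moving(prev_frame, curr_frame, displacement_threshold=2.0):
    dy, dx = estimate_global_shift(prev_frame, curr_frame)
    return float(np.hypot(dy, dx)) <= displacement_threshold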
In addition, it should be noted that the current shooting scene may be determined by the terminal device in the above manner, or may be selected by the user in the interface of the shooting function.
In addition, if the terminal device detects during preview that the black-and-white camera among the N cameras 193 is blocked, a prompt message such as "the black-and-white camera is blocked" may be displayed in the current display interface, as shown in fig. 2 (b), to remind the user that the black-and-white camera is blocked, so that the user can remove the obstruction as soon as possible and the quality of the captured image is ensured. Fig. 2 (b) is a schematic diagram of the display interface of the terminal device provided in another embodiment of the present application.
Further, referring to fig. 3, fig. 3 is a schematic diagram of an image processing method of a terminal according to an embodiment of the present application. As shown in fig. 3, after determining that the current shooting scene is a non-high-dynamic and non-moving scene (35), the terminal device obtains a shooting instruction. In response to the shooting instruction, the terminal device controls the color camera and the black-and-white camera to restart (36), instructs the color camera to output the acquired color RAW image in a pixel-binning manner, and instructs the black-and-white camera to output the acquired black-and-white RAW image at full size. Obtaining the shooting instruction may include the terminal device detecting that the shooting button is pressed, detecting an operation of clicking the shutter icon, or the like.
Specifically, after the shooting instruction is acquired, the terminal device issues instructions to the color camera and the black-and-white camera respectively, and after receiving the instructions, the color camera and the black-and-white camera restart (36). Then the color camera acquires a color RAW image and outputs it in a pixel-binning manner, and the color RAW image output by the color camera is cached in the color image cache; similarly, after the black-and-white camera restarts, it acquires a black-and-white RAW image and outputs it at full size, and the black-and-white RAW image output by the black-and-white camera is cached in the black-and-white image cache. In this embodiment, the black-and-white camera outputs the black-and-white RAW image at full size, so the black-and-white RAW image has a higher resolution, retains more image detail, and has better image quality. Higher-resolution output, however, lowers the achievable frame rate, and a low frame rate is unsuitable for a moving scene but well suited to a non-moving scene.
Then, the terminal device can acquire the color RAW image acquired by the color camera and the black-and-white RAW image acquired by the black-and-white camera. As described above, in a specific implementation, the terminal device may obtain the color RAW image collected by the color camera from the color image buffer, and obtain the black-and-white RAW image collected by the black-and-white camera from the black-and-white image buffer.
Then, the terminal device performs linearization and dead pixel correction processing on the color RAW image acquired by the color camera through the bayer domain processing algorithm link 31. The bayer domain processing algorithm link 31 is a second bayer domain processing algorithm link, and is configured to process the color RAW image.
Furthermore, the terminal device can perform RAW domain processing (32) on the color RAW image collected by the color camera to obtain a color YUV image.
For the black and white RAW image obtained by the terminal device from the black and white image cache, the terminal device processes the black and white RAW image collected by the black and white camera through the bayer domain processing algorithm link 33 to obtain a black and white YUV image. The bayer domain processing algorithm link 33 is a first bayer domain processing algorithm link, and is configured to process a black-and-white RAW image.
Finally, the terminal device fuses (34) the color YUV image and the black-and-white YUV image to obtain a target color image.
In addition, it should be noted that, after determining that the current shooting scene is a non-high-dynamic and non-moving scene and obtaining a shooting instruction, the terminal device controls the color camera and the black-and-white camera to restart (36) in response to the shooting instruction. The restart of the color camera causes the preview stream to freeze, and the freeze is expected to last 500-1000 ms. During this time the terminal device may display a prompt message such as "the preview is frozen, please wait" in the current display interface, as shown in fig. 2 (c). Fig. 2 (c) is a schematic diagram of the display interface of the terminal device provided in another embodiment of the present application.
It should be further noted that, after the terminal device runs the photographing function and before the current shooting scene is determined to be a non-high-dynamic and non-moving scene, the color RAW images acquired by the terminal device through the color camera during preview may include short-exposure color RAW images and normal-exposure color RAW images. In this case, the terminal device may further adopt a staggered high dynamic range (Stagger HDR) technique to fuse the short-exposure and normal-exposure color RAW images, and then display the fused color image during the preview.
The Stagger HDR technique occupies two image processing front ends. Therefore, after the current shooting scene is determined to be a non-high-dynamic and non-moving scene and the terminal device obtains a shooting instruction, the terminal device not only controls the color camera and the black-and-white camera to restart but also pauses Stagger HDR, and the image processing front end released by Stagger HDR is then allocated to the black-and-white camera.
Of course, if the terminal device does not use the Stagger HDR technique during the preview after running the photographing function and before determining the current shooting scene to be a non-high-dynamic and non-moving scene, Stagger HDR does not need to be paused, and only the color camera and the black-and-white camera need to be controlled to restart.
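For illustration only, a very simplified exposure fusion in the spirit of combining a short-exposure frame with a normally exposed frame might look as follows; the weighting scheme, the saturation level and the handling of the exposure ratio are assumptions for this sketch and do not describe the Stagger HDR algorithm itself.

import numpy as np

def fuse_exposures(short_exp, normal_exp, exposure_ratio, sat_level=1023.0):
    # Bring the short exposure up to the brightness of the normal exposure, then
    # prefer short-exposure data wherever the normal frame approaches saturation.
    short_scaled = short_exp.astype(np.float32) * exposure_ratio
    normal = normal_exp.astype(np.float32)
    # Weight ramps from 0 to 1 over the top 10% of the assumed sensor value range.
    weight = np.clip((normal - 0.9 * sat_level) / (0.1 * sat_level), 0.0, 1.0)
    return (1.0 - weight) * normal + weight * short_scaled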
The following describes in detail an image processing method of a terminal according to an embodiment of the present application with reference to fig. 4, and fig. 4 is a schematic diagram of an image processing method of a terminal according to another embodiment of the present application.
In this embodiment, in the preview process, the terminal device acquires a color RAW image through the color camera, and acquires a black and white RAW image through the black and white camera. And then, the terminal equipment determines that the current shooting scene is a non-high dynamic scene and a non-moving scene according to the color RAW image acquired in the previewing process.
Next, after detecting that the photographing icon is clicked, the terminal device obtains a shooting instruction. In response to the shooting instruction, it controls the color camera and the black-and-white camera to restart, sends pixel-binning configuration information (binning setting) to the color camera, and instructs the color camera to acquire a color RAW image at the resolution given in the configuration information and to output the acquired color RAW image in the binning mode. In addition, after controlling the black-and-white camera to restart, the terminal device instructs the black-and-white camera to output the acquired black-and-white RAW image at full size. Thus, the black-and-white RAW image output by the black-and-white camera is full-size, whereas the color RAW image output with binning is 1/4 of full size, because binning reads out every 4 neighboring pixels as one.
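The size relationship stated above can be illustrated with the following sketch of a 2x2 binning readout; plain averaging of neighboring samples is used here only to show why the binned image has 1/4 of the full-size pixel count, whereas a real sensor bins same-color pixels within the Bayer pattern.

import numpy as np

def bin_2x2(full_size_raw):
    # Combine every 2x2 block of neighboring samples into a single value, so the
    # output has half the width and half the height, i.e. 1/4 of the full-size
    # pixel count.
    h, w = full_size_raw.shape
    blocks = full_size_raw[:h - h % 2, :w - w % 2].astype(np.float32)
    blocks = blocks.reshape(h // 2, 2, w // 2, 2)
    return blocks.mean(axis=(1, 3))

# For example, a hypothetical 48M full-size readout (8000 x 6000) would yield a
# 12M binned image (4000 x 3000).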
In this embodiment, the color RAW image output by the color camera in the binning mode is buffered in the color image buffer, and the black-and-white RAW image output by the black-and-white camera at full size is buffered in the black-and-white image buffer. In a specific implementation, the size of the binned color RAW image is not directly related to the size of the full-size black-and-white RAW image; for example, the color RAW image may be 12M and the black-and-white RAW image 64M. In that case, when image fusion is performed, the 64M black-and-white image and the 12M color image are fused, and their sizes are aligned within the fusion algorithm.
The terminal device can then acquire 6 frames of color RAW images from the color image buffer and 1 frame of black-and-white RAW image from the black-and-white image buffer. It should be noted that the number of color RAW frames obtained from the color image buffer is merely an example and does not limit the embodiments of the present application: the terminal device may obtain N color RAW images from the color image buffer, where N may be set according to system performance and/or implementation requirements and is not limited in this embodiment. For example, N may range over [4, 8], and this embodiment is described with N equal to 6.
Then, the terminal device performs linearization and dead pixel correction processing on the 6 frames of color RAW images acquired by the color camera through the bayer domain processing algorithm link 41. In this embodiment, the bayer domain processing algorithm link 41 in fig. 4 corresponds to the bayer domain processing algorithm link 31 in fig. 3, and is used for processing the color RAW image.
Next, the terminal device caches the 6 frames of color RAW images after linearization and dead pixel correction in the image cache pool 42, performs RAW domain processing (43) on the 6 frames of color RAW images in the image cache pool 42 to obtain a color YUV image, and caches the color YUV image in the image buffer 44. As can be seen from fig. 4, the color YUV image buffered in the image buffer 44 has a bit depth of 10.
Specifically, performing RAW domain processing (43) on the 6 frames of color RAW images in the image buffer pool 42 to obtain a color YUV image may include the following steps (a brief illustrative sketch follows step 5):
Step 1: preprocess the 6 frames of color RAW images. The preprocessing may include: aligning the sizes of the 6 frames of color RAW images, mapping the pixels between every two of the 6 frames to obtain the correspondence between pixels, performing error detection on the correspondence, and correcting erroneous correspondences.
Step 2: perform noise reduction on the preprocessed 6 frames of color RAW images to fuse them into one frame of color RAW image.
Step 3: convert the fused color RAW image into a color RGB image.
Step 4: adjust the color and brightness of the color RGB image. Specifically, this may include performing brightness and color processing flows on the color RGB image, such as automatic white balance (AWB), vignetting correction (shading), and color correction (Gamma).
Step 5: convert the color RGB image with adjusted color and brightness into a color YUV image.
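The five steps above can be summarized, for illustration only, by the following sketch; the temporal average standing in for multi-frame noise reduction, the naive RGGB demosaic, the white-balance gains, the gamma value and the BT.601 conversion matrix are all assumptions made for the sketch rather than the processing actually used in the embodiment.

import numpy as np

def raw_domain_processing(color_raw_frames, white_balance=(2.0, 1.0, 1.6), gamma=2.2):
    # Step 1: pre-processing (size alignment, pixel mapping and correction) is
    # assumed to have been done already; the frames are stacked as-is here.
    stack = np.stack([f.astype(np.float32) for f in color_raw_frames])
    # Step 2: multi-frame noise reduction - a plain temporal average stands in
    # for the device's real noise-reduction algorithm.
    fused_raw = stack.mean(axis=0)
    # Step 3: convert the fused RAW image to RGB; a naive stand-in that treats
    # each 2x2 Bayer cell (RGGB assumed) as one RGB pixel.
    r = fused_raw[0::2, 0::2]
    g = (fused_raw[0::2, 1::2] + fused_raw[1::2, 0::2]) / 2.0
    b = fused_raw[1::2, 1::2]
    rgb = np.stack([r, g, b], axis=-1) / max(float(fused_raw.max()), 1.0)
    # Step 4: adjust color and brightness - placeholder white balance and gamma.
    rgb = np.clip(rgb * np.asarray(white_balance, np.float32), 0.0, 1.0) ** (1.0 / gamma)
    # Step 5: convert the adjusted RGB image to YUV (BT.601, one possible choice).
    m = np.array([[ 0.299,  0.587,  0.114],
                  [-0.169, -0.331,  0.500],
                  [ 0.500, -0.419, -0.081]], dtype=np.float32)
    yuv = rgb @ m.T
    yuv[..., 1:] += 0.5
    return yuv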
In addition, with reference to fig. 4, for the 1 frame of black-and-white RAW image obtained by the terminal device from the black-and-white image cache, the terminal device sequentially performs linearization, dead pixel correction, bayer processing, Graph Transformation Matching (GTM) and color correction (Gamma) on it through the bayer domain processing algorithm link 45 to obtain a black-and-white YUV image, which is cached in the image buffer 46. In this embodiment, the bayer domain processing algorithm link 45 corresponds to the bayer domain processing algorithm link 33 in fig. 3 and is configured to process the black-and-white RAW image.
Finally, the terminal device fuses (47) the color YUV image in the image buffer 44 and the black-and-white YUV image in the image buffer 46 to obtain a target color image.
Specifically, fig. 5 is a flowchart of color and black-and-white image fusion according to an embodiment of the present application. As shown in fig. 5, fusing the color YUV image and the black-and-white YUV image to obtain the target color image may include the following steps (a brief illustrative sketch follows step 505):
step 501, the sizes of the color YUV image and the black YUV image are aligned. Specifically, in this embodiment, the size of the 1 frame black-and-white RAW image obtained by the terminal device from the black-and-white image cache may be 64M, so the size of the black-and-white YUV image obtained by converting the black-and-white RAW image is also 64M, and the size of the color YUV image is 12M, so when the sizes of the color YUV image and the black-and-white YUV image are aligned, the black-and-white YUV image needs to be downsampled first, and the size of the black-and-white YUV image is reduced to be the same as the size of the color YUV image; and aligning the size of the black-white YUV image and the color YUV image after down sampling.
And 502, acquiring a region to be fused in the color YUV image and a region to be fused in the black-and-white YUV image from the color YUV image after the size alignment and the black-and-white YUV image after the size alignment.
Step 503, mapping the pixels in the area where the color YUV images need to be fused with the pixels in the area where the black and white YUV images need to be fused to obtain the corresponding relationship between the pixels.
Step 504, performing error detection on the corresponding relationship, and correcting the corresponding relationship with the error.
And 505, fusing the clear pixels in the area where the color YUV image needs to be fused and the area where the black-and-white YUV image needs to be fused to the corresponding fuzzy pixels. For example, pixel a in the region where the color YUV image needs to be fused corresponds to pixel a' in the region where the black-and-white YUV image needs to be fused; in one case, assuming that pixel a is a sharp pixel in pixel a and pixel a ', and pixel a ' is a blurred pixel in both, the color of pixel a may be fused to pixel a '; in another case, assuming that pixel a is a blurred pixel in pixel a and pixel a ', and pixel a ' is a sharp pixel in both, the details of pixel a ' may be fused to pixel a.
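A hedged sketch of steps 501 to 505 follows: the full-size black-and-white luma is downsampled to the color image's size and, block by block, the luma of the sharper source replaces the blurrier one. The stride-based downsampling, the local-variance sharpness measure and the omission of the pixel-mapping and error-correction steps are simplifications made for this sketch, not the fusion algorithm actually used.

import numpy as np

def fuse_color_and_mono(color_yuv, mono_y, block=8):
    # Step 501: size alignment - downsample the full-size mono image to the
    # color image's size by simple striding (a stand-in for proper resampling).
    ch, cw = color_yuv.shape[:2]
    sh, sw = mono_y.shape[0] // ch, mono_y.shape[1] // cw
    mono_small = mono_y[::sh, ::sw][:ch, :cw].astype(np.float32)
    # Steps 502-504 (region selection, pixel mapping, error correction) are
    # omitted; the two images are assumed to be registered already.
    out = color_yuv.astype(np.float32).copy()
    # Step 505: block by block, keep the luma of whichever source is sharper;
    # local variance is used here as a stand-in sharpness measure.
    for i in range(0, ch - block + 1, block):
        for j in range(0, cw - block + 1, block):
            color_block = out[i:i + block, j:j + block, 0]
            mono_block = mono_small[i:i + block, j:j + block]
            if mono_block.var() > color_block.var():
                out[i:i + block, j:j + block, 0] = mono_block
    return out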
In the image processing method of the terminal provided by the embodiments of the present application, after determining that the current shooting scene is a non-high-dynamic and non-moving scene, the terminal device controls the color camera and the black-and-white camera to restart, instructs the color camera to output the acquired color RAW image in a pixel-binning manner, and instructs the black-and-white camera to output the acquired black-and-white RAW image at full size. The terminal device then performs RAW domain processing on the color RAW image to convert it into a color YUV image, and fuses the color YUV image with the black-and-white YUV image. In this way more image detail can be retained; in particular, when shooting a direct-light, highlight, static scene, image details, especially high-frequency details, can be better recovered, and the color fidelity of the target color image is improved. In addition, in this embodiment, the black-and-white camera outputs the acquired black-and-white RAW image at full size, which also retains more image detail and improves the image quality of the finally obtained target color image.
It is to be understood that some or all of the steps or operations in the above-described embodiments are merely examples, and the embodiments of the present application may perform other operations or variations of the various operations. Further, the steps may be performed in an order different from that presented in the above-described embodiments, and it is possible that not all of the operations in the above-described embodiments need to be performed.
It will be appreciated that the terminal device, in order to implement the above-described functions, comprises corresponding hardware and/or software modules for performing the respective functions. The exemplary algorithm steps described in connection with the embodiments disclosed herein may be embodied in hardware or a combination of hardware and computer software. Whether a function is performed as hardware or as computer software driving hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In this embodiment, the terminal device may be divided into functional modules according to the method embodiments, for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one module. The integrated module may be implemented in the form of hardware. It should be noted that the division of the modules in this embodiment is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
Fig. 6 is a schematic structural diagram of a terminal device according to another embodiment of the present application, in which the functional modules are divided according to function. Fig. 6 illustrates a possible composition of the terminal device 600 involved in the foregoing embodiments. As shown in fig. 6, the terminal device 600 may include: a receiving unit 601, a processing unit 602, and a transmitting unit 603.
The processing unit 602 may be configured to support the terminal device 600 in implementing the technical solutions described in the embodiments shown in fig. 2 (a) to fig. 5 of this application.
It should be noted that all relevant contents of each step related to the above method embodiment may be referred to the functional description of the corresponding functional module, and are not described herein again.
The present embodiment provides the terminal apparatus 600 for executing the image processing method of the terminal described above, and therefore can achieve the same effects as the method described above.
It should be understood that the terminal device 600 may correspond to the terminal device 100 shown in fig. 1. Wherein, the functions of the receiving unit 601 and the transmitting unit 603 may be implemented by the processor 110, the antenna 1 and the mobile communication module 150 in the terminal device 100 shown in fig. 1, and/or by the processor 110, the antenna 2 and the wireless communication module 160; the functions of the processing unit 602 may be implemented by the processor 110 in the terminal device 100 shown in fig. 1.
In case of employing an integrated unit, the terminal device 600 may include a processing module, a storage module and a communication module.
The processing module may be configured to control and manage the actions of the terminal device 600, and for example, may be configured to support the terminal device 600 to execute the steps executed by the receiving unit 601, the processing unit 602, and the sending unit 603. The memory module can be used to support the terminal device 600 in storing program codes and data, etc. A communication module, which may be used to support communication between the terminal device 600 and other devices.
The processing module may be a processor or controller that may implement or execute the various illustrative logical blocks, modules, and circuits described in connection with the present disclosure. The processor may also be a combination implementing computing functions, e.g., a combination of one or more microprocessors, or a combination of a digital signal processor (DSP) and a microprocessor, or the like. The storage module may be a memory. The communication module may specifically be a radio frequency circuit, a Bluetooth chip, and/or a Wi-Fi chip, or the like, that interacts with other electronic devices.
In one embodiment, when the processing module is a processor and the storage module is a memory, the terminal device 600 according to this embodiment may be a device having the structure shown in fig. 1.
An embodiment of the present application further provides a computer-readable storage medium, in which a computer program is stored, and when the computer program runs on a computer, the computer is caused to execute the method provided by the embodiment shown in fig. 2 (a) to fig. 5 of the present application.
Embodiments of the present application also provide a computer program product, which includes a computer program and when the computer program runs on a computer, the computer executes the method provided in the embodiments shown in fig. 2 (a) to fig. 5 of the present application.
In the embodiments of the present application, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes the association relationship of associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean that A exists alone, A and B exist simultaneously, or B exists alone, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the preceding and following associated objects. "At least one of the following" and similar expressions refer to any combination of these items, including any combination of single or plural items. For example, at least one of a, b, and c may represent: a; b; c; a and b; a and c; b and c; or a, b and c, where a, b and c may each be single or plural.
Those of ordinary skill in the art will appreciate that the various elements and algorithm steps described in connection with the embodiments disclosed herein can be implemented as electronic hardware, computer software, or combinations of electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, any function, if implemented in the form of a software functional unit and sold or used as a separate product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a portable hard disk, a read-only memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, or other various media capable of storing program codes.
The above description is only for the specific embodiments of the present application, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present disclosure, and all the changes or substitutions should be covered by the protection scope of the present application. The protection scope of the present application shall be subject to the protection scope of the claims.

Claims (13)

1. An image processing method of a terminal, wherein the terminal comprises a black-and-white camera and a color camera, the method comprising:
after the photographing function of the terminal is operated, acquiring a color RAW image through the color camera and acquiring a black and white RAW image through the black and white camera in the previewing process;
determining the current shooting scene as a non-high dynamic and non-motion scene according to the color RAW image acquired in the previewing process;
after a shooting instruction is acquired, controlling the color camera and the black-and-white camera to restart in response to the shooting instruction, and instructing the color camera to output the acquired color RAW image in a pixel merging mode and instructing the black-and-white camera to output the acquired black-and-white RAW image in a full size;
acquiring a color RAW image acquired by the color camera, and acquiring a black and white RAW image acquired by the black and white camera;
carrying out RAW domain processing on a color RAW image acquired by the color camera to obtain a color YUV image; processing the black and white RAW image acquired by the black and white camera through a first Bayer domain processing algorithm link to obtain a black and white YUV image;
fusing the color YUV image and the black-and-white YUV image to obtain a target color image;
after the shooting instruction is obtained, the color RAW image collected by the color camera comprises at least two frames of color RAW images, and the RAW domain processing of the color RAW image collected by the color camera to obtain a color YUV image comprises:
preprocessing the at least two frames of color RAW images;
carrying out noise reduction processing on at least two frames of color RAW images after preprocessing;
fusing at least two frames of color RAW images subjected to noise reduction processing into one frame of color RAW image;
converting the color RAW image obtained by fusion into a color RGB image;
adjusting the color and brightness of the color RGB image;
and converting the color RGB image after the adjustment of the color and the brightness into a color YUV image.
2. The method of claim 1, wherein the acquiring the color RAW image collected by the color camera and the acquiring the black and white RAW image collected by the black and white camera comprise:
and acquiring the color RAW image collected by the color camera from a color image cache, and acquiring the black and white RAW image collected by the black and white camera from a black and white image cache.
3. The method according to claim 1, wherein before performing RAW domain processing on the color RAW image collected by the color camera to obtain a color YUV image, the method further comprises:
and carrying out linearization and dead pixel correction processing on the color RAW image acquired by the color camera through a second Bayer domain processing algorithm link.
4. The method of claim 1, wherein the pre-processing the at least two frames of color RAW images comprises:
aligning the sizes of the at least two frames of color RAW images;
mapping pixels in every two frames of images in the at least two frames of color RAW images to obtain a corresponding relation between the pixels;
and carrying out error detection on the corresponding relation, and correcting the corresponding relation with the error.
5. The method of claim 1, wherein fusing the color YUV image and the black-and-white YUV image to obtain a target color image comprises:
aligning the sizes of the color YUV image and the black and white YUV image;
acquiring a region to be fused in the color YUV image and a region to be fused in the black-white YUV image from the color YUV image after the size alignment and the black-white YUV image after the size alignment;
mapping the pixels in the area where the color YUV images need to be fused with the pixels in the area where the black-white YUV images need to be fused to obtain the corresponding relation between the pixels;
carrying out error detection on the corresponding relation, and correcting the corresponding relation with errors;
and fusing the clear pixels in the area where the color YUV image needs to be fused and the area where the black-white YUV image needs to be fused to the corresponding fuzzy pixels.
6. The method of claim 5, wherein the aligning the color YUV image and the black and white YUV image in size comprises:
downsampling the black-and-white YUV image to reduce the size of the black-and-white YUV image to be the same as the size of the color YUV image;
and aligning the size of the black-white YUV image after down sampling with the size of the color YUV image.
7. A terminal device is characterized by comprising a black-and-white camera and a color camera; one or more processors; a memory; a plurality of application programs; and one or more computer programs, wherein the one or more computer programs are stored in the memory, the one or more computer programs comprising instructions which, when executed by the terminal device, cause the terminal device to perform the steps of:
after the photographing function of the terminal equipment is operated, acquiring a color RAW image through the color camera and acquiring a black and white RAW image through the black and white camera in the previewing process;
determining the current shooting scene as a non-high dynamic and non-motion scene according to the color RAW image acquired in the previewing process;
after a shooting instruction is acquired, controlling the color camera and the black-and-white camera to restart in response to the shooting instruction, and instructing the color camera to output the acquired color RAW image in a pixel merging mode and instructing the black-and-white camera to output the acquired black-and-white RAW image in a full size;
acquiring a color RAW image acquired by the color camera, and acquiring a black and white RAW image acquired by the black and white camera;
carrying out RAW domain processing on the color RAW image acquired by the color camera to obtain a color YUV image; processing the black and white RAW image acquired by the black and white camera through a first Bayer domain processing algorithm link to obtain a black and white YUV image;
fusing the color YUV image and the black-and-white YUV image to obtain a target color image;
after the shooting instruction is obtained, the color RAW image collected by the color camera comprises at least two frames of color RAW images; when the instruction is executed by the terminal device, the terminal device executes the RAW domain processing on the color RAW image collected by the color camera, and the step of obtaining the color YUV image comprises the following steps:
preprocessing the at least two frames of color RAW images;
carrying out noise reduction processing on at least two frames of color RAW images after preprocessing;
fusing at least two frames of color RAW images subjected to noise reduction processing into one frame of color RAW image;
converting the color RAW image obtained by fusion into a color RGB image;
adjusting the color and brightness of the color RGB image;
and converting the color RGB image after the adjustment of the color and the brightness into a color YUV image.
8. The terminal device according to claim 7, wherein the instructions, when executed by the terminal device, cause the terminal device to perform the steps of acquiring the color RAW image acquired by the color camera and acquiring the black and white RAW image acquired by the black and white camera comprise:
and acquiring the color RAW image collected by the color camera from a color image cache, and acquiring the black and white RAW image collected by the black and white camera from a black and white image cache.
9. The terminal device according to claim 7, wherein the instructions, when executed by the terminal device, cause the terminal device to perform RAW domain processing on a color RAW image acquired by the color camera, and further perform the following steps before the step of obtaining a color YUV image:
and carrying out linearization and dead pixel correction processing on the color RAW image acquired by the color camera through a second Bayer domain processing algorithm link.
10. The terminal device of claim 7, wherein the instructions, when executed by the terminal device, cause the terminal device to perform the preprocessing of the at least two frames of color RAW images comprises:
aligning the sizes of the at least two frames of color RAW images;
mapping pixels in every two frames of images in the at least two frames of color RAW images to obtain a corresponding relation between the pixels;
and carrying out error detection on the corresponding relation, and correcting the corresponding relation with the error.
11. The terminal device of claim 7, wherein the instructions, when executed by the terminal device, cause the terminal device to perform the fusing the color YUV image and the black and white YUV image to obtain a target color image comprises:
aligning the sizes of the color YUV image and the black and white YUV image;
acquiring a region to be fused in the color YUV image and a region to be fused in the black-white YUV image from the color YUV image after the size alignment and the black-white YUV image after the size alignment;
mapping the pixels in the area where the color YUV images need to be fused with the pixels in the area where the black-white YUV images need to be fused to obtain the corresponding relation between the pixels;
carrying out error detection on the corresponding relation, and correcting the corresponding relation with errors;
and fusing the clear pixels in the area where the color YUV image needs to be fused and the area where the black-white YUV image needs to be fused to the corresponding fuzzy pixels.
12. The terminal device of claim 11, wherein the instructions, when executed by the terminal device, cause the terminal device to perform the step of aligning the sizes of the color YUV image and the black and white YUV image comprises:
downsampling the black-and-white YUV image to reduce the size of the black-and-white YUV image to be the same as the size of the color YUV image;
and aligning the size of the black-white YUV image and the color YUV image after down sampling.
13. A computer-readable storage medium, in which a computer program is stored which, when run on a computer, causes the computer to carry out the method according to any one of claims 1 to 6.
CN202110923315.3A 2021-08-12 2021-08-12 Terminal image processing method and device and terminal equipment Active CN113810601B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110923315.3A CN113810601B (en) 2021-08-12 2021-08-12 Terminal image processing method and device and terminal equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110923315.3A CN113810601B (en) 2021-08-12 2021-08-12 Terminal image processing method and device and terminal equipment

Publications (2)

Publication Number Publication Date
CN113810601A CN113810601A (en) 2021-12-17
CN113810601B (en) 2022-12-20

Family

ID=78942864

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110923315.3A Active CN113810601B (en) 2021-08-12 2021-08-12 Terminal image processing method and device and terminal equipment

Country Status (1)

Country Link
CN (1) CN113810601B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115550575B (en) * 2022-04-21 2023-07-07 荣耀终端有限公司 Image processing method and related device
CN116048323B (en) * 2022-05-27 2023-11-24 荣耀终端有限公司 Image processing method and electronic equipment
CN116095497B (en) * 2022-06-29 2023-10-20 荣耀终端有限公司 Exposure control method, device and terminal equipment
CN116091392B (en) * 2022-08-16 2023-10-20 荣耀终端有限公司 Image processing method, system and storage medium
CN116051435B (en) * 2022-08-23 2023-11-07 荣耀终端有限公司 Image fusion method and electronic equipment
CN115696063A (en) * 2022-09-13 2023-02-03 荣耀终端有限公司 Photographing method and electronic equipment
CN115619628B (en) * 2022-12-05 2023-05-23 荣耀终端有限公司 Image processing method and terminal equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005229472A (en) * 2004-02-16 2005-08-25 Ricoh Co Ltd Image input/output device
CN103281491A (en) * 2013-05-30 2013-09-04 中国科学院长春光学精密机械与物理研究所 Image fusion device based on DSP
CN106506962A (en) * 2016-11-29 2017-03-15 维沃移动通信有限公司 A kind of image processing method and mobile terminal
CN107395898A (en) * 2017-08-24 2017-11-24 维沃移动通信有限公司 A kind of image pickup method and mobile terminal
CN108605099A (en) * 2016-10-17 2018-09-28 华为技术有限公司 The method and terminal taken pictures for terminal

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106534677B (en) * 2016-10-27 2019-12-17 成都西纬科技有限公司 Image overexposure optimization method and device
JP7024782B2 (en) * 2017-03-27 2022-02-24 ソニーグループ株式会社 Image processing device and image processing method and image pickup device
JP7052811B2 (en) * 2018-02-07 2022-04-12 ソニーグループ株式会社 Image processing device, image processing method and image processing system
CN109005343A (en) * 2018-08-06 2018-12-14 Oppo广东移动通信有限公司 Control method, device, imaging device, electronic equipment and readable storage medium storing program for executing
CN111064899B (en) * 2019-12-06 2021-06-08 成都华为技术有限公司 Exposure parameter adjusting method and device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005229472A (en) * 2004-02-16 2005-08-25 Ricoh Co Ltd Image input/output device
CN103281491A (en) * 2013-05-30 2013-09-04 中国科学院长春光学精密机械与物理研究所 Image fusion device based on DSP
CN108605099A (en) * 2016-10-17 2018-09-28 华为技术有限公司 The method and terminal taken pictures for terminal
CN106506962A (en) * 2016-11-29 2017-03-15 维沃移动通信有限公司 A kind of image processing method and mobile terminal
CN107395898A (en) * 2017-08-24 2017-11-24 维沃移动通信有限公司 A kind of image pickup method and mobile terminal

Also Published As

Publication number Publication date
CN113810601A (en) 2021-12-17

Similar Documents

Publication Publication Date Title
CN112333380B (en) Shooting method and equipment
CN113810601B (en) Terminal image processing method and device and terminal equipment
CN113810600B (en) Terminal image processing method and device and terminal equipment
CN111770282B (en) Image processing method and device, computer readable medium and terminal equipment
CN113542580B (en) Method and device for removing light spots of glasses and electronic equipment
CN110602403A (en) Method for taking pictures under dark light and electronic equipment
WO2020015144A1 (en) Photographing method and electronic device
CN114489533A (en) Screen projection method and device, electronic equipment and computer readable storage medium
WO2022156555A1 (en) Screen brightness adjustment method, apparatus, and terminal device
CN113572948B (en) Video processing method and video processing device
CN114880251A (en) Access method and access device of storage unit and terminal equipment
CN112188094B (en) Image processing method and device, computer readable medium and terminal equipment
CN113467735A (en) Image adjusting method, electronic device and storage medium
CN115412678B (en) Exposure processing method and device and electronic equipment
CN115706869A (en) Terminal image processing method and device and terminal equipment
CN113674258B (en) Image processing method and related equipment
CN115631250B (en) Image processing method and electronic equipment
CN115696067B (en) Image processing method for terminal, terminal device and computer readable storage medium
CN114520870B (en) Display method and terminal
CN117119314B (en) Image processing method and related electronic equipment
CN115297269B (en) Exposure parameter determination method and electronic equipment
CN114125144B (en) Method, terminal and storage medium for preventing false touch
CN113364067B (en) Charging precision calibration method and electronic equipment
CN116708317B (en) Data packet MTU adjustment method and device and terminal equipment
CN115705663B (en) Image processing method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant