CN116095517A - Blurring method and blurring device - Google Patents

Blurring method and blurring device

Info

Publication number
CN116095517A
Authority
CN
China
Prior art keywords
image
exposure time
terminal device
processing
image data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202211064862.1A
Other languages
Chinese (zh)
Other versions
CN116095517B (en)
Inventor
邵涛
徐荣跃
魏芅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority to CN202211064862.1A
Publication of CN116095517A
Application granted
Publication of CN116095517B
Legal status: Active (granted)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/50: Depth or shape recovery
    • G06T 7/55: Depth or shape recovery from multiple images

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

The embodiments of the present application provide a blurring method and apparatus, relating to the field of terminal devices. The method includes the following steps: the terminal device acquires a first image and a second image; the terminal device processes the second image according to a target exposure time, or processes the first image and the second image according to the target exposure time, where the processed first image and the processed second image are images exposed according to the target exposure time; the terminal device performs blurring processing on the first image based on the processed second image, or based on the processed first image and the processed second image. In this way, the terminal device can align the first image and the second image at the target exposure time, which reduces the influence of different exposure times on the phase difference between the first image and the second image, improves the accuracy of depth calculation, and further improves the accuracy of blurring processing.

Description

Blurring method and blurring device
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a blurring method and apparatus.
Background
With the popularization and development of the internet, users' functional demands on terminal devices are becoming more diverse. For example, during photographing or video recording, the terminal device can perform blurring processing on a captured photo, a recorded video, or a preview picture, so as to obtain a stronger sense of depth and a more cinematic look.
In general, the terminal device may obtain main road image data and auxiliary road image data based on a binocular camera (including a main road camera and an auxiliary road camera), calculate a depth image using the main road image data and the auxiliary road image data, and perform blurring processing on the main road image data using the depth image, so that an image located in a background in the main road image data shows a blurring state.
However, abnormal blurring of the main road image data often occurs on the terminal device, for example, the background in the main road image data is not blurred, or the foreground in the main road image data is blurred.
Disclosure of Invention
The embodiment of the application provides a blurring method and device, wherein a terminal device can determine target exposure time based on exposure time of a first image and exposure time of a second image, align the first image and the second image at the target exposure time, reduce influence of different exposure time on phase difference between the first image and the second image, improve accuracy of depth calculation, and further improve accuracy of blurring processing.
In a first aspect, an embodiment of the present application provides a blurring method, where a terminal device includes a first camera and a second camera, and the method includes: the terminal equipment acquires a first image and a second image; the first image is obtained based on the first camera, and the second image is obtained based on the second camera; the terminal device processes the second image according to the target exposure time, or processes the first image and the second image according to the target exposure time, wherein the processed first image is an image exposed according to the target exposure time, and the processed second image is an image exposed according to the target exposure time; the target exposure time is either one of the exposure time of the first image or the exposure time of the second image; the terminal device performs blurring processing on the first image based on the processed second image, or based on the processed first image and the processed second image. In this way, the terminal device can determine the target exposure time based on the exposure time of the first image and the exposure time of the second image, align the first image and the second image at the target exposure time, reduce the influence of different exposure times on the phase difference between the first image and the second image, improve the accuracy of depth calculation, and further improve the accuracy of blurring processing.
The first camera may be a main path camera described in the embodiment of the present application, and the first image may be main path image data described in the embodiment of the present application; the second camera may be an auxiliary road camera described in the embodiment of the present application, and the second image may be auxiliary road image data described in the embodiment of the present application; the first transformation matrix may be a first warp matrix described in the embodiments of the present application; the second transformation matrix may be a second warp matrix described in embodiments of the present application.
In one possible implementation, the terminal device processes the second image according to the target exposure time, or processes the first image and the second image according to the target exposure time, including: the terminal device acquires a first transformation matrix and a second transformation matrix; the first transformation matrix is used for aligning any row of pixel points in the first image to a position corresponding to the target exposure time; the second transformation matrix is used for aligning any row of pixel points in the second image to a position corresponding to the target exposure time; the terminal device processes the first image using the first transformation matrix and processes the second image using the second transformation matrix. In this way, because the row-by-row exposure timing of the first image is inconsistent with the row-by-row exposure timing of the second image, the phase difference between the two paths of image data changes significantly; the terminal device can therefore correct the first image with the first transformation matrix corresponding to the target exposure time and correct the second image with the second transformation matrix corresponding to the target exposure time, reducing the phase-difference influence caused by the different exposure times.
In one possible implementation manner, the terminal device performs blurring processing on the first image based on the processed second image, or based on the processed first image and the processed second image, including: the terminal equipment determines a first depth image based on the processed first image and the processed second image; and the terminal equipment performs blurring processing on the first image based on the first depth image. In this way, the terminal device can obtain relatively accurate depth information through the processed first image and the processed second image, thereby improving the blurring effect.
In one possible implementation manner, the terminal device performs blurring processing on the first image based on the first depth image, including: the terminal equipment corrects the first depth image based on an inverse matrix corresponding to the first transformation matrix to obtain a second depth image; and the terminal equipment performs blurring processing on the first image based on the second depth image. In this way, since the terminal device performs blurring processing based on the image corresponding to the first camera under normal conditions, the first depth image can be corrected to be a depth image related to the first camera, so that blurring processing is performed on the image corresponding to the first camera.
In one possible implementation, the terminal device processes the second image according to the target exposure time, or processes the first image and the second image according to the target exposure time, including: the terminal device acquires a first transformation matrix and a second transformation matrix; the first transformation matrix is used for aligning any row of pixel points in the first image to a position corresponding to the target exposure time; the second transformation matrix is used for aligning any row of pixel points in the second image to a position corresponding to the target exposure time; the terminal device processes the second image according to the inverse matrix corresponding to the first transformation matrix and the second transformation matrix; wherein the exposure time of the processed second image is the same as the exposure time of the first image. In this way, the terminal device can use the inverse matrix corresponding to the first transformation matrix together with the second transformation matrix to simulate an image whose exposure time is the same as that of the first image, reducing the influence of the exposure-time difference on the phase difference between the first image and the processed second image. This also simplifies the steps of correcting the image data and determining the depth image with the first transformation matrix and the second transformation matrix, which saves memory and increases the speed of calculating the depth image and generating the blurred image.
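As an illustration only, the sketch below shows this single-warp variant with numpy/OpenCV, under the simplifying assumption that each warp matrix is one 3x3 homography for the whole frame (the application describes per-row alignment for row-by-row exposure); warp1 and warp2 stand for the first and second transformation matrices, and nothing here is the device's actual implementation.

```python
import numpy as np
import cv2

def simulate_second_at_first_exposure(second_img, warp1, warp2):
    """Warp the second image so it appears exposed at the first image's exposure time.

    Simplifying assumption: warp1/warp2 are single 3x3 homographies per frame,
    whereas the application describes per-row alignment of pixel rows.
    """
    h, w = second_img.shape[:2]
    # Compose: map the second image to the target exposure time (warp2),
    # then undo the first image's alignment (inverse of warp1).
    m = np.linalg.inv(warp1) @ warp2
    return cv2.warpPerspective(second_img, m, (w, h))
```

With this composition, only the second image needs to be warped before depth calculation, which matches the memory and speed benefit described above.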
In one possible implementation manner, the terminal device performs blurring processing on the first image based on the processed second image, or based on the processed first image and the processed second image, including: the terminal equipment determines a third depth image based on the first image and the processed second image; and the terminal equipment performs blurring processing on the first image based on the third depth image. Therefore, the terminal equipment can accurately calculate the depth data based on the first image with smaller phase difference change and the processed second image, and further improve the blurring effect.
In one possible implementation, the method further includes: the terminal equipment respectively carries out image preprocessing on the first image and the second image to obtain a first image after image preprocessing and a second image after image preprocessing; the terminal device processes the second image according to the target exposure time, or processes the first image and the second image according to the target exposure time, including: the terminal device processes the second image after the image preprocessing according to the target exposure time, or processes the first image after the image preprocessing and the second image after the image preprocessing according to the target exposure time. In this way, the terminal equipment can convert the RAW format image into the YUV format image through image preprocessing, so that the memory occupation in the process of blurring is saved.
In a second aspect, an embodiment of the present application provides a blurring apparatus, including: an acquiring unit configured to acquire a first image and a second image, where the first image is obtained based on the first camera and the second image is obtained based on the second camera; and a processing unit configured to process the second image according to the target exposure time, or process the first image and the second image according to the target exposure time, where the processed first image is an image exposed according to the target exposure time, the processed second image is an image exposed according to the target exposure time, and the target exposure time is any one of the exposure time of the first image or the exposure time of the second image; the processing unit is further configured to perform blurring processing on the first image based on the processed second image, or based on the processed first image and the processed second image.
In one possible implementation manner, the acquiring unit is specifically configured to acquire the first transformation matrix and the second transformation matrix; the first transformation matrix is used for aligning any row of pixel points in the first image to a position corresponding to the target exposure time; the second transformation matrix is used for aligning any row of pixel points in the second image to a position corresponding to the target exposure time; the processing unit is specifically configured to process the first image by using the first transformation matrix and process the second image by using the second transformation matrix.
In a possible implementation, the processing unit is specifically configured to: determining a first depth image based on the processed first image and the processed second image; and blurring the first image based on the first depth image.
In one possible implementation, the processing unit is specifically configured to: correcting the first depth image based on an inverse matrix corresponding to the first transformation matrix to obtain a second depth image; and blurring the first image based on the second depth image.
In one possible implementation manner, the acquiring unit is specifically configured to acquire the first transformation matrix and the second transformation matrix; the first transformation matrix is used for aligning any row of pixel points in the first image to a position corresponding to the target exposure time; the second transformation matrix is used for aligning any row of pixel points in the second image to a position corresponding to the target exposure time; the processing unit is specifically used for processing the second image according to the inverse matrix corresponding to the first transformation matrix and the second transformation matrix; wherein the exposure time of the processed second image is the same as the exposure time of the first image.
In one possible implementation, the processing unit is specifically configured to: determining a third depth image based on the first image and the processed second image; and blurring the first image based on the third depth image.
In a possible implementation, the processing unit is further configured to: respectively carrying out image preprocessing on the first image and the second image to obtain a first image after image preprocessing and a second image after image preprocessing; the second image after the image pre-processing is processed according to the target exposure time, or the first image after the image pre-processing and the second image after the image pre-processing are processed according to the target exposure time.
In a third aspect, embodiments of the present application provide a terminal device, including a processor and a memory, where the memory is configured to store code instructions; the processor is configured to execute code instructions to cause the terminal device to perform the blurring method as described in the first aspect or any implementation of the first aspect.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium storing instructions that, when executed, cause a computer to perform a blurring method as described in the first aspect or any implementation of the first aspect.
In a fifth aspect, an embodiment of the present application provides a computer program product including a computer program which, when run, causes a computer to perform the blurring method as described in the first aspect or any implementation of the first aspect.
It should be understood that, the second aspect to the fifth aspect of the present application correspond to the technical solutions of the first aspect of the present application, and the beneficial effects obtained by each aspect and the corresponding possible embodiments are similar, and are not repeated.
Drawings
Fig. 1 is a schematic view of a scenario provided in an embodiment of the present application;
fig. 2 is a schematic hardware structure of a terminal device according to an embodiment of the present application;
fig. 3 is a schematic software structure of a terminal device according to an embodiment of the present application;
fig. 4 is a schematic flow chart of a blurring method according to an embodiment of the present application;
fig. 5 is a schematic diagram of an interface for acquiring image data according to an embodiment of the present application;
fig. 6 is a schematic diagram of generating image data according to an embodiment of the present application;
fig. 7 is a schematic diagram of image correction according to an embodiment of the present application;
fig. 8 is a schematic flow chart of another blurring method according to an embodiment of the present application;
fig. 9 is a schematic flow chart of image preprocessing according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of a blurring apparatus according to an embodiment of the present application;
fig. 11 is a schematic hardware structure of another terminal device according to an embodiment of the present application.
Detailed Description
The terms used in the embodiments of the present application are explained below. It should be understood that these explanations are intended to describe the embodiments of the present application more clearly and should not necessarily be construed as limiting the embodiments of the present application.
In order to clearly describe the technical solutions of the embodiments of the present application, words such as "first" and "second" are used in the embodiments of the present application to distinguish between identical or similar items having substantially the same function and effect. For example, the first value and the second value are merely used to distinguish different values, and their order is not limited. Those skilled in the art will appreciate that words such as "first" and "second" do not limit the quantity or the order of execution, and do not necessarily indicate that the items they modify are different.
In this application, the terms "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
In the present application, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may indicate: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects. "At least one of" the following items or a similar expression means any combination of these items, including any combination of a single item or plural items. For example, at least one of a, b, or c may represent: a; b; c; a and b; a and c; b and c; or a, b and c, where a, b, and c may be singular or plural.
Exemplary, fig. 1 is a schematic view of a scenario provided in an embodiment of the present application. As shown in fig. 1, the scene may include a terminal device 101, for example, the terminal device 101 may be a mobile phone, etc., and a screen 102 captured by the terminal device 101 may include a user 103 located in the foreground and a user 104 located in the background in the screen 102.
Typically, when the terminal device receives an operation of turning on the photographing function or the video recording function from the user, the terminal device may collect image data including the screen 102 based on the cameras. For example, the terminal device may acquire main path image data corresponding to the screen 102 using the main path camera, and acquire auxiliary path image data corresponding to the screen 102 using the auxiliary path camera; obtain depth image data by performing depth calculation on the main path image data and the auxiliary path image data; and then perform blurring processing on the main path image data using the depth image data to obtain a blurring result. For example, the blurring result may be the screen 102 in fig. 1, in which the user 103 in the foreground is displayed without blurring and the user 104 in the background is displayed in a blurred state; the final blurring result is not limited in the embodiments of the present application.
However, the exposure times of the main road image data and the auxiliary road image data are affected by factors such as inconsistent frame-output times of the main road image data and the auxiliary road image data and inconsistent setting parameters of the main road camera and the auxiliary road camera. As a result, the phase difference between the main road image data and the auxiliary road image data changes, which affects the depth information obtained from the two paths of image data and causes abnormal blurring. The frame-output time can also be understood as the exposure start time.
In view of this, an embodiment of the present application provides a blurring method, where a terminal device includes a first camera and a second camera, and the method includes: the terminal equipment acquires a first image and a second image; the first image is obtained based on the first camera, and the second image is obtained based on the second camera; the terminal device processes the second image according to the target exposure time, or processes the first image and the second image according to the target exposure time, wherein the processed first image is an image exposed according to the target exposure time, and the processed second image is an image exposed according to the target exposure time; the target exposure time is either one of the exposure time of the first image or the exposure time of the second image; the terminal device performs blurring processing on the first image based on the processed second image, or based on the processed first image and the processed second image. The terminal equipment can determine the target exposure time based on the exposure time of the first image and the exposure time of the second image, align the first image and the second image at the target exposure time, reduce the influence of different exposure times on the phase difference between the first image and the second image under the same exposure time, improve the accuracy of depth calculation and further improve the accuracy of blurring processing.
The first camera may be a main path camera described in the embodiment of the present application, and the first image may be main path image data described in the embodiment of the present application; the second camera may be an auxiliary road camera described in the embodiment of the present application, and the second image may be auxiliary road image data described in the embodiment of the present application; the first transformation matrix may be a first warp matrix described in the embodiments of the present application; the second transformation matrix may be a second warp matrix described in embodiments of the present application.
It is understood that the above terminal device may also be referred to as a terminal (terminal), a User Equipment (UE), a Mobile Station (MS), a Mobile Terminal (MT), etc. The terminal device may be a mobile phone (mobile phone) with a binocular camera, a smart television, a wearable device, a tablet (Pad), a computer with wireless transceiving function, a Virtual Reality (VR) terminal device, an augmented reality (augmented reality, AR) terminal device, a wireless terminal in industrial control (industrial control), a wireless terminal in self-driving (self-driving), a wireless terminal in teleoperation (remote medical surgery), a wireless terminal in smart grid (smart grid), a wireless terminal in transportation security (transportation safety), a wireless terminal in smart city (smart city), a wireless terminal in smart home (smart home), and so on. The embodiment of the application does not limit the specific technology and the specific equipment form adopted by the terminal equipment.
Therefore, in order to better understand the embodiments of the present application, the structure of the terminal device of the embodiments of the present application is described below. Fig. 2 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
The terminal device may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, an indicator 192, a camera 193, a display 194, and the like.
It is to be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the terminal device. In other embodiments of the present application, the terminal device may include more or less components than illustrated, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units. Wherein the different processing units may be separate devices or may be integrated in one or more processors. A memory may also be provided in the processor 110 for storing instructions and data.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge a terminal device, or may be used to transfer data between the terminal device and a peripheral device. And can also be used for connecting with a headset, and playing audio through the headset. The interface may also be used to connect other terminal devices, such as AR devices, etc.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. The power management module 141 is used for connecting the charge management module 140 and the processor 110.
The wireless communication function of the terminal device may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Antennas in the terminal device may be used to cover single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G or the like applied on a terminal device. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, Wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), etc. as applied on a terminal device.
The terminal device implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. In some embodiments, the terminal device may include 1 or N display screens 194, N being a positive integer greater than 1.
The terminal device may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing, so that the electrical signal is converted into an image visible to naked eyes. ISP can also perform algorithm optimization on noise and brightness of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a format of a standard Red Green Blue (RGB), YUV (or understood as a gray value, color, and saturation), or the like.
The digital signal processor is used for processing digital signals, and can process other digital signals in addition to digital image signals. For example, when the terminal device selects a frequency bin, the digital signal processor is used to perform a Fourier transform on the frequency bin energy, etc.
Video codecs are used to compress or decompress digital video. The terminal device may support one or more video codecs. In this way, the terminal device may play or record video in multiple encoding formats, for example: dynamic picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
The camera 193 is used to capture still images or video. In some embodiments, the terminal device may include 1 or N cameras 193, N being a positive integer greater than 1.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to realize expansion of the memory capability of the terminal device. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer-executable program code that includes instructions. The internal memory 121 may include a storage program area and a storage data area.
The terminal device may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The terminal device can listen to music through the speaker 170A or listen to hands-free calls. A receiver 170B, also referred to as a "earpiece", is used to convert the audio electrical signal into a sound signal. When the terminal device picks up a call or voice message, the voice can be picked up by placing the receiver 170B close to the human ear. The earphone interface 170D is used to connect a wired earphone.
Microphone 170C, also referred to as a "microphone" or "microphone", is used to convert sound signals into electrical signals.
The sensor module 180 may include a gyro sensor 180A. The gyro sensor 180A may be used to determine the motion posture of the terminal device. In some embodiments, the angular acceleration of the terminal device about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180A. The gyro sensor 180A may be used for photographing anti-shake. Illustratively, when the shutter is pressed, the gyro sensor 180A detects the shake angle of the terminal device, calculates the distance that the lens module needs to compensate for according to the angle, and makes the lens counteract the shake of the terminal device through reverse motion, thereby realizing anti-shake.
In a possible implementation, the sensor module 180 may further include one or more of the following sensors, for example: a pressure sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, or a bone conduction sensor, etc. (not shown in fig. 2). The acceleration sensor is used for detecting the acceleration of the terminal device in all directions (generally three axes), thereby further identifying the posture of the terminal device.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The terminal device may receive key inputs, generating key signal inputs related to user settings of the terminal device and function control. The indicator 192 may be an indicator light, may be used to indicate a state of charge, a change in charge, a message indicating a missed call, a notification, etc.
The software system of the terminal device may adopt a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, a cloud architecture, or the like, which will not be described herein.
Fig. 3 is a schematic software structure of a terminal device according to an embodiment of the present application. As shown in fig. 3, the layered architecture divides the software into several layers, each with its own role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers, which are, from top to bottom, an application layer, an application framework layer, Android runtime (Android runtime) and system libraries, and a kernel layer.
The application layer may include a series of application packages.
As shown in fig. 3, the application package may include applications such as camera, calendar, phone, map, music, settings, mailbox, video, and social applications. In the embodiment of the present application, the blurring method can be implemented in the camera application.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 3, the application framework layer may include a window manager, a content provider, a resource manager, a view system, a notification manager, and the like.
The window manager is used for managing window programs. The window manager may obtain the size of the display screen, determine whether there is a status bar, lock the screen, touch the screen, drag the screen, capture screenshots, and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows an application to display notification information in the status bar, and can be used to convey notification-type messages that automatically disappear after a short stay without requiring user interaction. For example, the notification manager is used to notify that a download is complete, to provide message reminders, and the like. The notification manager may also present notifications in the form of a chart or scroll-bar text in the system top status bar, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is emitted, the terminal device vibrates, or an indicator light blinks.
The Android runtime includes core libraries and a virtual machine. The Android runtime is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part is a function which needs to be called by java language, and the other part is a core library of android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
The following describes the technical solutions of the present application and how the technical solutions of the present application solve the above technical problems in detail with specific embodiments. The following embodiments may be implemented independently or combined with each other, and the same or similar concepts or processes may not be described in detail in some embodiments.
Fig. 4 is a schematic flow chart of an blurring method according to an embodiment of the present application. In the embodiment corresponding to fig. 4, the binocular camera may include a main path camera and an auxiliary path camera.
In a possible implementation manner, the binocular camera may include: a main camera (or understood as a main path camera) supporting 1x to 3.5x and an ultra-wide angle camera (or understood as an auxiliary path camera); alternatively, the binocular camera may include: a telephoto camera (or understood as a main path camera) supporting more than 3.5x and a main camera (or understood as an auxiliary path camera).
In a possible implementation manner, the terminal device may also include a plurality of cameras, for example, including 3 cameras, and at least 2 cameras of the 3 cameras may be used to implement the functions of the binocular camera, which is not limited in the embodiment of the present application.
As shown in fig. 4, the blurring method may include the steps of:
s401, when the terminal equipment receives the operation of opening the video recording function or the shooting function, the terminal equipment acquires main road image data by using the main road camera and acquires auxiliary road image data by using the auxiliary road camera.
The format of the main path image data and the format of the auxiliary path image data may be a native (RAW) format.
A RAW-format image may also be referred to as RAW image data, which is the raw data obtained when the image sensor converts the captured light signal into a digital signal. The RAW file records the RAW information from the camera, together with metadata generated by the shooting, such as the setting of image sensitivity, shutter speed, aperture value, white balance value, and the like. The RAW format is an unprocessed, uncompressed format.
Fig. 5 is a schematic diagram of an interface for acquiring image data according to an embodiment of the present application. In the embodiment corresponding to fig. 5, a terminal device is taken as an example for a mobile phone to be described as an example, which does not limit the embodiments of the present application.
When the terminal device receives the operation of opening the video recording function in the camera application by the user, the terminal device can acquire main path image data by using the main path camera and auxiliary path image data by using the auxiliary path camera, and obtain the blurring processing result based on the embodiment corresponding to fig. 4, so that the terminal device displays an interface shown as a in fig. 5, and the interface can be an interface corresponding to the video recording function, thereby realizing blurring processing on the preview picture.
In a possible implementation manner, when receiving the triggering operation of the control 501 for starting video recording from the interface shown in a of fig. 5, the terminal device performs blurring processing on the main path image data acquired based on the main path camera and the auxiliary path image data acquired based on the auxiliary path camera based on the embodiment corresponding to fig. 4, so as to obtain a blurring processing result, and further, the terminal device displays the interface shown in b of fig. 5, so as to implement blurring processing on the preview picture.
In a possible implementation manner, when receiving the user's triggering operation of the control 505 for ending recording in the interface shown in b in fig. 5, the terminal device may perform, based on the embodiment corresponding to fig. 4, blurring processing on the main path image data acquired based on the main path camera and the auxiliary path image data acquired based on the auxiliary path camera to obtain a blurring processing result, store the blurring processing result in the terminal device, and generate a video file including the blurring processing result. In this scenario, the terminal device may not perform blurring processing on the preview picture, so that the blurring processing result is not displayed in the interface shown in a (and/or b) in fig. 5; alternatively, blurring processing may also be performed on the preview picture, so that the blurring processing result is displayed in the interface shown in a (and/or b) in fig. 5, which is not limited in the embodiments of the present application.
An interface as shown in a in fig. 5 may include one or more function controls in the primary menu of the camera application, for example: an aperture control, a night scene control, a portrait control, a video control (or likewise a control for turning on the video recording function), a short video control, or a control for opening more functions in the camera application, etc. The interface may also include one or more of the following, for example: a picture acquired by the camera in real time, such as the preview picture 503, a control 501 for starting video recording, a control for opening the gallery, a control for switching cameras, a setting control for setting the camera application, a control for adjusting the shooting magnification, a flash control for turning the flash on or off, and the like. The preview picture 503 may include: a user 502 in the foreground and a user 504 in the background, where the user 504 in the background may be in a blurred state.
It will be appreciated that, as shown in the interface a in fig. 5, the embodiments of the present application may represent the blurred state of an object by a dotted line and the non-blurred state of an object by a solid line; the non-blurred state may also be referred to as a normal display state.
In a possible implementation, when the terminal device receives the user's trigger operation on the control 501 for starting video recording, the terminal device may display an interface as shown in b in fig. 5. The interface may include: the preview picture 503 obtained by blurring processing, information indicating the video recording time, a control for pausing recording, a control for capturing the current picture, a control 505 for ending recording, a control for adjusting the shooting magnification, and the like.
In a possible implementation manner, when the terminal device receives an operation of turning on the photographing function, the terminal device performs, based on the embodiment corresponding to fig. 4, blurring processing on the main road image data acquired based on the main road camera and the auxiliary road image data acquired based on the auxiliary road camera to obtain a blurring processing result, and displays the blurring processing result in the preview picture and/or stores the blurring processing result in an image file based on the user's triggering operation on the photographing control. The triggering manner of the blurring method is similar to the embodiment corresponding to fig. 5 and is not repeated herein.
S402, the terminal equipment respectively carries out image preprocessing on the main road image data and the auxiliary road image data to obtain main road image data after the image preprocessing and auxiliary road image data after the image preprocessing.
The image preprocessing is used for processing the image data in the RAW format into the image data in the YUV format.
By way of example, the image preprocessing may include one or more of the following: dead-pixel removal and correction processing, RAW-domain noise reduction processing, black level correction processing, optical shading correction processing, automatic white balance processing, color interpolation processing, tone mapping processing, color correction processing, Gamma correction processing, image conversion processing, and the like. The specific process of image preprocessing is not limited in the embodiments of the present application.
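As an illustration of the final conversion step of such a chain, an RGB frame can be converted to YUV with the standard BT.601 coefficients. The following minimal numpy sketch is illustrative only; it assumes floating-point RGB values in [0, 1] and is not the device's ISP implementation.

```python
import numpy as np

def rgb_to_yuv_bt601(rgb):
    """Convert an HxWx3 RGB image (float values in [0, 1]) to YUV using BT.601 coefficients."""
    m = np.array([[ 0.299,     0.587,     0.114  ],   # Y (luma)
                  [-0.14713,  -0.28886,   0.436  ],   # U (blue projection)
                  [ 0.615,    -0.51499,  -0.10001]])  # V (red projection)
    return rgb @ m.T
```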
S403, the terminal equipment determines a target exposure time by using the exposure time of the main path image data or the exposure time of the auxiliary path image data.
Wherein the target exposure time may be any one of the exposure time of the first image or the exposure time of the second image. For example, the target exposure time is the intermediate time between the start exposure time of the first row of pixels of the main path image data and the start exposure time of the last row of pixels of the main path image data; alternatively, the target exposure time is the intermediate time between the start exposure time of the first row of pixels of the auxiliary path image data and the start exposure time of the last row of pixels of the auxiliary path image data. Alternatively, the target exposure time may be any time between the start exposure time of the first row of pixels of the main path image data (or the auxiliary path image data) and the start exposure time of the last row of pixels of the main path image data (or the auxiliary path image data), which is not limited in the embodiments of the present application.
Fig. 6 is a schematic diagram of generating image data according to an embodiment of the present application.
The terminal device may start exposing the main path image data at t1 according to the sine function shown in b in fig. 6, and complete exposing the main path image data at t4, thereby generating the main path image data. As shown in a of fig. 6, the main path image data exposes the first line of pixels in the main path image data at t1 according to a line-by-line exposure principle, and completes exposure of the first line of pixels at t2, exposes the last line of pixels in the main path image data at t3, and completes exposure of the last line of pixels at t4, so as to generate the main path image data. Wherein the difference between t1 and t2 is the same as the difference between t3 and t4, or it can be understood that the exposure time of the terminal device per line is the same.
The terminal device may start exposing the auxiliary image data at t6 according to the sine function shown as d in fig. 6, and complete exposing the auxiliary image data at t9, thereby generating the auxiliary image data. As shown in c in fig. 6, the auxiliary image data is exposed according to a line-by-line exposure principle, the first line of pixels in the auxiliary image data is exposed at t6, the exposure of the first line of pixels is completed at t8, the line-by-line exposure is performed, the last line of pixels in the auxiliary image data is finally exposed at t7, the exposure of the last line of pixels is completed at t9, and the auxiliary image data is generated. Wherein, the difference between t6 and t8 is the same as the difference between t7 and t 9. t1 may be earlier than t6, or it is understood that the main path image data may be exposed earlier than the auxiliary path image data.
It can be understood that, due to the influence of the frame-output time and the sensor settings, the initial exposure times of the two paths of image data and the time intervals during progressive (row-by-row) exposure are different (or understood as the initial exposure time of each row during progressive exposure being different), and this difference in exposure time affects the phase difference between the two paths of image data, so that the depth information calculated based on the two paths of image data is inaccurate, thereby affecting the blurring effect.
Thus, as shown in fig. 6, the terminal device may determine a target exposure time, which may be a time at which two paths of image data are aligned, which may cause the two paths of image data to start exposure at the same point in time. For example, the target exposure time may be any exposure time of the main path image data or the auxiliary path image data, for example, the target exposure time may be t5, and the t5 may be an intermediate value between t1 and t 3; alternatively, the target exposure time may be an intermediate value between t6 and t7, which is not limited in the embodiment of the present application.
Based on the above, the terminal device can utilize the target exposure time to enable the two paths of image data to start exposure at the same time point, so that the influence of different exposure starting times on the phase difference is avoided.
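As a simple numerical illustration of this choice (the timing values below are assumed purely for the example), the target exposure time can be taken as the midpoint of the start exposure times of the first and last rows:

```python
def target_exposure_time(t_first_row_start, t_last_row_start):
    # One possible choice of target exposure time: the midpoint of the
    # start-exposure times of the first and last rows (e.g., t5 = (t1 + t3) / 2).
    return (t_first_row_start + t_last_row_start) / 2.0

# Assumed example: first row starts exposing at 0.0 ms, last row at 10.0 ms -> t5 = 5.0 ms.
t5 = target_exposure_time(0.0, 10.0)
```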
S404, the terminal equipment corrects the main road image data after the image preprocessing by using a first warp matrix corresponding to the main road image data to obtain corrected main road image data, and corrects the auxiliary road image data after the image preprocessing by using a second warp matrix corresponding to the auxiliary road image data to obtain corrected auxiliary road image data.
In this embodiment of the present application, a first warp matrix (or called a first transform matrix) is used to align any row of pixels in a first image to a position corresponding to a target exposure time, and a second warp matrix (or called a second transform matrix) is used to align any row of pixels in a second image to a position corresponding to a target exposure time. Wherein each value in the first transformation matrix (or the second transformation matrix) can be used to determine a distance that needs to be moved when each row of pixel points is aligned to a position corresponding to the target exposure time point.
For example, the terminal device may obtain angular acceleration data through the optical image stabilizer (optical image stabilizer, OIS), and obtain, through electronic image stabilization (electric image stabilization, EIS) processing, the first warp matrix corresponding to the main path image data at the target exposure time and the second warp matrix corresponding to the auxiliary path image data at the target exposure time; further, the terminal device may perform position correction on the main path image after the image preprocessing by using the first warp matrix to obtain the corrected main path image data, and perform position correction on the auxiliary path image after the image preprocessing by using the second warp matrix to obtain the corrected auxiliary path image data.
It can be understood that, because the row-by-row exposure start times of the main path image data are inconsistent with the row-by-row exposure start times of the auxiliary path image data, the phase difference between the two paths of image data changes significantly; the terminal device can therefore correct each path of image data by acquiring the corresponding warp matrix that aligns it to the target exposure time, thereby reducing the phase-difference influence caused by the different exposure times.
Fig. 7 is a schematic diagram of image correction according to an embodiment of the present application.
As shown in a of fig. 7, the terminal device may perform image position correction on the main image data through the first warp matrix to obtain corrected main image data shown in b of fig. 7, where the exposure start time of the corrected main image data may be t5, and the exposure end time of the corrected main image data may be t5+(t2-t1).
The terminal device may correct the image position of the auxiliary road image data shown in c of fig. 7 through the second warp matrix, to obtain corrected auxiliary road image data as shown in d of fig. 7, where the exposure start time of the corrected auxiliary road image data may be t5, and the exposure end time of the corrected auxiliary road image data may be t5+(t8-t6).
It will be appreciated that the warp matrix-based image rectification process described in the step shown in S404 may be one step in the terminal device image post-processing process. In a possible implementation manner, after S404, the terminal device may perform other image post-processing on the corrected main road image data and the corrected auxiliary road image data. For example, the image post-processing may include one or more of the following: noise processing, brightness and color processing, image scaling processing, and the like, and the specific manner of post-processing of the image in the embodiments of the present application is not limited.
For noise processing, it can be used to remove the noise effect in the current image. The terminal device may remove noise in the current image through a low-pass filter, a bilateral filter, or the like.
Brightness and color processing can be used to compensate for the influence of lighting conditions and the like on the brightness and color of the photographed subject. Color processing methods may include color-correction-matrix-based methods and the like. Brightness processing methods may include local tone mapping and the like.
For image scaling processing, it may be used to convert the current image from one resolution to another. The image scaling processing method may include: nearest neighbor interpolation, linear interpolation, regional interpolation, or cubic spline interpolation.
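As a brief illustration (one common API; the embodiment does not mandate a particular library or method), the scaling step might look like this:

```python
import cv2
import numpy as np

frame = np.zeros((1080, 1920, 3), dtype=np.uint8)   # assumed input image
# Downscale with linear interpolation, upscale with cubic interpolation.
preview = cv2.resize(frame, (1280, 720), interpolation=cv2.INTER_LINEAR)
enlarged = cv2.resize(frame, (3840, 2160), interpolation=cv2.INTER_CUBIC)
```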
It can be appreciated that the image correction and adjustment processing method, the noise processing method, the brightness and color processing method, and the image scaling processing method may include other contents according to the actual scene, which is not limited in the embodiment of the present application.
And S405, the terminal equipment determines depth image data based on the corrected main road image data and the corrected auxiliary road image data.
It can be understood that the terminal device may determine the phase difference corresponding to any pair of pixels in the corrected main road image data and the corrected auxiliary road image data, and determine the depth corresponding to any pair of pixels by using the phase difference and the distance between the two cameras, so as to obtain depth image data.
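The relation between phase difference and depth implied here is the standard stereo triangulation formula; the following sketch (illustrative names, not from the patent) converts a disparity map into a depth map using the baseline between the two cameras and the focal length.

```python
import numpy as np

def disparity_to_depth(disparity_px: np.ndarray,
                       baseline_m: float,
                       focal_px: float) -> np.ndarray:
    """Depth = baseline * focal_length / disparity, guarding against
    division by zero for pixels with no measurable phase difference."""
    disparity = np.maximum(disparity_px, 1e-6)
    return baseline_m * focal_px / disparity
```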
S406, the terminal equipment processes the depth image data by using an inverse matrix corresponding to the first warp matrix to obtain corrected depth image data.
It can be understood that the terminal device generally performs blurring based on the image data corresponding to the main camera, so the depth image data can be corrected into depth image data associated with the main camera, yielding the corrected depth image data.
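A minimal sketch of this correction, assuming the first warp matrix can be represented as a 3x3 homography (the patent does not state its exact form):

```python
import cv2
import numpy as np

def warp_depth_to_main(depth: np.ndarray, first_warp: np.ndarray) -> np.ndarray:
    """Map the depth image back into the main camera's original geometry by
    applying the inverse of the first warp matrix."""
    h, w = depth.shape[:2]
    inv_warp = np.linalg.inv(first_warp)
    # Nearest-neighbour sampling avoids blending depth values across edges.
    return cv2.warpPerspective(depth, inv_warp, (w, h), flags=cv2.INTER_NEAREST)
```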
S407, the terminal equipment performs blurring processing on the main path image data after image preprocessing by using the corrected depth image data to obtain blurring processing results.
The terminal device may perform the blurring process by using a gaussian blur process, a neural network model, and the like, and the blurring process method is not limited in the embodiment of the present application.
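As one possible realization of the Gaussian-blur option mentioned above (a simplified sketch, not the patent's exact method), the blurring step could blend a blurred copy of the image into regions whose depth differs from the in-focus depth:

```python
import cv2
import numpy as np

def simple_bokeh(image: np.ndarray, depth: np.ndarray,
                 focus_depth: float, tolerance: float) -> np.ndarray:
    """Blur pixels whose depth lies outside [focus_depth - tolerance,
    focus_depth + tolerance]; keep in-focus pixels sharp."""
    blurred = cv2.GaussianBlur(image, (21, 21), 0)
    mask = (np.abs(depth - focus_depth) > tolerance).astype(np.float32)
    if image.ndim == 3:
        mask = mask[..., None]
    return (mask * blurred + (1.0 - mask) * image).astype(image.dtype)
```

A production pipeline would typically vary the blur radius continuously with depth instead of using a single binary mask.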
S408, the terminal equipment sends the blurring processing result to a display and is used as a preview picture, and/or the terminal equipment stores the blurring processing result and is used as an image or video.
Based on the method, the terminal equipment can determine the target exposure time based on the exposure time of the main road image data and the exposure time of the auxiliary road image data, align the main road image data and the auxiliary road image data at the target exposure time, reduce the influence of different exposure times on the phase difference between the main road image data and the auxiliary road image data, improve the accuracy of depth calculation, and further improve the accuracy of blurring processing.
Fig. 8 is a schematic flow chart of another blurring method according to an embodiment of the present application.
As shown in fig. 8, the blurring method may include the steps of:
S801, when the terminal equipment receives an operation of opening a video recording function or a shooting function, the terminal equipment acquires main road image data by using a main road camera and acquires auxiliary road image data by using an auxiliary road camera.
S802, the terminal equipment respectively carries out image preprocessing on the main road image data and the auxiliary road image data to obtain main road image data after the image preprocessing and auxiliary road image data after the image preprocessing.
S803, the terminal device determines a target exposure time using the exposure time of the main path image data or the exposure time of the auxiliary path image data.
The descriptions of the steps shown in S801 to S803 may be referred to the descriptions of the steps shown in S401 to S403, and are not described herein.
S804, the terminal equipment acquires a first warp matrix corresponding to the main path image data under the target exposure time and a second warp matrix corresponding to the auxiliary path image data under the target exposure time.
The process of obtaining the first warp matrix and the second warp matrix may refer to the description in the step shown in S404, which is not described herein.
S805, the terminal device obtains the inverse matrix corresponding to the first warp matrix, and computes the convolution of this inverse matrix with the second warp matrix to obtain a target matrix.
S806, the terminal equipment corrects the auxiliary road image data after the image preprocessing by using the target matrix to obtain auxiliary road image data after the position correction, and determines corrected depth image data based on the auxiliary road image data after the position correction and the main road image data after the image preprocessing.
It can be understood that the terminal device can correct the auxiliary road image data after the image preprocessing through the target matrix, so that the exposure time of the auxiliary road image data after the position correction can be the same as that of the main road image data after the image preprocessing, and further the depth image obtained by calculating based on the auxiliary road image data after the position correction and the main road image data after the image preprocessing is more accurate.
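A sketch of this combined correction, assuming both warp matrices are 3x3 homographies (under that assumption the combination reduces to a single matrix product, so only one warp of the auxiliary image is needed):

```python
import cv2
import numpy as np

def correct_aux_with_target_matrix(aux_image: np.ndarray,
                                   first_warp: np.ndarray,
                                   second_warp: np.ndarray) -> np.ndarray:
    """Combine inverse(first_warp) with second_warp into one target matrix
    and apply it to the auxiliary image, bringing it into the geometry of
    the preprocessed main image."""
    h, w = aux_image.shape[:2]
    target = np.linalg.inv(first_warp) @ second_warp
    return cv2.warpPerspective(aux_image, target, (w, h))
```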
In a possible implementation manner, the terminal device may also perform one or more of the image post-processing procedures described in S404 on the auxiliary road image data after the position correction, which is not limited in this embodiment of the present application.
S808, the terminal equipment performs blurring processing on the main path image data after image preprocessing by using the corrected depth image data to obtain blurring processing results.
S809, the terminal device sends the blurring processing result to a display and is used as a preview picture, and/or the terminal device stores the blurring processing result and is used as an image or video.
Based on this, compared with the embodiment corresponding to fig. 4, the terminal device can use the calculation with the target matrix to simplify the steps of correcting the image data with the first warp matrix and the second warp matrix and then determining the depth image, thereby saving memory and increasing the speed of computing the depth image and generating the blurred image.
It will be understood that the order between the steps in the blurring processing method in the embodiments corresponding to fig. 4 and fig. 8 may not be limited to the description in the embodiments corresponding to fig. 4 or fig. 8, which is not limited in this embodiment of the present application.
On the basis of the embodiment corresponding to fig. 4 or fig. 8, in one implementation, the terminal device may send the blurring result to the display and store it in real time. In this way, the terminal device can display the blurring result in real time in the preview interface, and can also encode the stored blurring results into blurred video content or image content when receiving the user's operation of ending shooting or recording. In this scenario, the terminal device not only displays the blurring result in the preview interface, but also displays the blurred video content or image content when receiving the user's operation of playing back the video content or viewing the image content.
It can be understood that the processing method in which the terminal device sends the blurring result to the display and stores it in real time can also be applied in live streaming, video calls, and other scenarios that have both preview and recording requirements, which is not limited in the embodiments of the present application.
In another implementation, the terminal device may store the blurring processing result in real time, and encode the stored blurring processing result into blurring processing content when receiving the operation of ending shooting or recording by the user; alternatively, the terminal device may store the main road image data and the auxiliary road image data acquired in S401 (or S801) in real time, and when receiving the operation of ending shooting or recording by the user, perform the steps shown in S402-S407 (or S802-S807) on the stored main road image data and auxiliary road image data to obtain the blurring result, and encode the blurring result as the content of the blurring process. In this scenario, the blurring processing result may not be displayed in the preview interface of the terminal device, but the blurring processed video content or image content is displayed upon receiving an operation of the user to play back the video content or view the image content.
It can be understood that the method for storing the blurring processing result in real time by the terminal device can realize the video recording requirement and the image shooting requirement of the user on the terminal device.
It is understood that the subsequent processing flow of the blurring processing result in the embodiment of the present application is not specifically limited.
On the basis of the embodiment corresponding to fig. 4 (or fig. 8), in a possible implementation, the terminal device may perform the steps shown in S401-S407 (or S801-S807) on the device itself. Alternatively, part of the blurring method may be performed in a server: for example, after the terminal device acquires the main path image data and the auxiliary path image data in S401 (or S801), it may send the image data to the server, the server may perform the steps shown in S402-S407 (or S802-S807) to obtain the blurring result, and the server may then send the blurring result back to the terminal device so that the terminal device can perform subsequent storage or display processing based on it.
It can be appreciated that the apparatus for performing the blurring method in the embodiments of the present application is not specifically limited.
In a possible implementation manner, based on the embodiment corresponding to fig. 4 or fig. 8, the image preprocessing procedure in the step shown in S402 or S802 may refer to the embodiment corresponding to fig. 9.
Fig. 9 is a schematic flow chart of image preprocessing according to an embodiment of the present application.
As shown in fig. 9, the image preprocessing may include one or more of the following: dead pixel removal correction, RAW domain noise reduction, black level correction, optical shading correction, automatic white balance, color interpolation, tone mapping, color correction, Gamma correction, image format conversion, and the like. The specific process of the image preprocessing is not limited in the embodiments of the present application.
For the dead pixel removal correction process, a dead pixel is a pixel whose brightness or color differs greatly from that of the surrounding pixels. For example, the terminal device may identify dead pixels by detecting bright or colored points in a completely dark environment and detecting dark or colored points in a highlight environment. In the dead pixel removal correction process, the terminal device may remove a dead pixel by replacing it with the average value of the surrounding pixels in the luminance domain.
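A minimal sketch of the neighbor-averaging idea described above (the detection threshold is an assumed tuning value):

```python
import numpy as np

def remove_dead_pixels(plane: np.ndarray, threshold: float = 60.0) -> np.ndarray:
    """Replace any pixel that deviates strongly from the mean of its eight
    neighbors with that mean (single-channel luminance plane)."""
    img = plane.astype(np.float32)
    padded = np.pad(img, 1, mode='edge')
    neighbor_sum = np.zeros_like(img)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dy == 0 and dx == 0:
                continue
            neighbor_sum += padded[1 + dy:1 + dy + img.shape[0],
                                   1 + dx:1 + dx + img.shape[1]]
    neighbor_mean = neighbor_sum / 8.0
    out = np.where(np.abs(img - neighbor_mean) > threshold, neighbor_mean, img)
    return out.astype(plane.dtype)
```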
For RAW domain noise reduction processing, noise may appear on an image as isolated pixels or blocks of pixels that cause a stronger visual effect. In the RAW domain noise reduction process, the terminal device may remove noise in the RAW domain by a Low Pass Filter (LPF), a bilateral filter (bilateral filtering), or the like.
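For illustration, one of the two filter types mentioned above applied to a single RAW channel plane (parameter values are illustrative, not tuned):

```python
import cv2
import numpy as np

def raw_denoise(raw_plane: np.ndarray) -> np.ndarray:
    """Edge-preserving noise reduction of one RAW color plane with a
    bilateral filter."""
    return cv2.bilateralFilter(raw_plane.astype(np.float32),
                               d=5, sigmaColor=25, sigmaSpace=5)
```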
For the black level correction process: when a camera is being calibrated, it is placed in a closed, light-tight box; the captured picture should then be black, but it is not black enough, because dark current causes the image data output by the sensor to deviate from the desired black balance. In the black level correction process, the terminal device finds a correction value and subtracts it from the pixels of all areas so that the picture appears pure black.
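A sketch of the subtraction step (the black level value here is an assumed placeholder; in practice it is measured per sensor or read from calibration data):

```python
import numpy as np

def black_level_correct(raw: np.ndarray, black_level: int = 64) -> np.ndarray:
    """Subtract the measured black level from every pixel and clip at zero
    so that a covered sensor produces pure black output."""
    return np.clip(raw.astype(np.int32) - black_level, 0, None).astype(raw.dtype)
```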
For the optical shading correction process: because of the physical properties of the lens, the brightness at the periphery of the picture gradually decreases relative to the center, and the large angle of incidence at the edges also causes crosstalk between adjacent pixels and color cast in the corners. In the optical shading correction process, the terminal device may calculate a brightness correction value for each pixel according to a given correction method to compensate for the peripheral fall-off. The correction method may be quadratic correction, quartic correction, and so on.
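An illustrative quadratic correction of the kind mentioned above (the gain coefficient `k` is an assumed tuning value; real pipelines typically use per-channel calibration tables):

```python
import numpy as np

def lens_shading_correct(image: np.ndarray, k: float = 0.4) -> np.ndarray:
    """Apply a gain that grows with squared distance from the image center
    to compensate for peripheral brightness fall-off."""
    h, w = image.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    r2 = ((ys - cy) ** 2 + (xs - cx) ** 2) / (cy ** 2 + cx ** 2)
    gain = 1.0 + k * r2
    if image.ndim == 3:
        gain = gain[..., None]
    return np.clip(image.astype(np.float32) * gain, 0, 255).astype(image.dtype)
```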
For the automatic white balance process: because of the color temperature of the illumination, a sheet of white paper appears yellowish under a low color temperature and bluish under a high color temperature. Automatic white balance makes a white object appear white at any color temperature, avoiding color cast. Automatic white balance methods may include the gray-world method, the perfect-reflection method, and the like.
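A minimal gray-world sketch (8-bit RGB input assumed):

```python
import numpy as np

def gray_world_awb(rgb: np.ndarray) -> np.ndarray:
    """Scale each channel so that the three channel means become equal,
    forcing the global average toward neutral gray."""
    img = rgb.astype(np.float32)
    means = img.reshape(-1, 3).mean(axis=0)
    gains = means.mean() / np.maximum(means, 1e-6)
    return np.clip(img * gains, 0, 255).astype(rgb.dtype)
```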
With respect to the color interpolation process: because each sensor pixel perceives only one color component, color interpolation is used to reconstruct all three RGB components at every pixel, and the color interpolation process can therefore convert image data in RAW format into image data in RGB format.
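For illustration, demosaicing with a library routine (the Bayer-pattern flag is an assumption and must match the actual sensor layout):

```python
import cv2
import numpy as np

raw8 = np.zeros((1080, 1920), dtype=np.uint8)     # assumed 8-bit RAW plane
rgb = cv2.cvtColor(raw8, cv2.COLOR_BayerBG2BGR)   # RAW mosaic -> 3-channel image
```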
For the tone mapping process, the overall brightness of the image is adjusted so that the brightness-adjusted picture can be closer to the brightness presented in the real world.
With respect to the color correction process: because the spectral response of the human eye to visible light differs from that of the semiconductor sensor, and because of the influence of the lens and so on, the RGB values deviate in color. The terminal device therefore needs to perform color correction; for example, the terminal device may perform color correction using a 3x3 color correction matrix.
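A sketch of applying a 3x3 color correction matrix (the coefficients below are placeholders, not calibrated values):

```python
import numpy as np

CCM = np.array([[ 1.6, -0.4, -0.2],
                [-0.3,  1.5, -0.2],
                [-0.1, -0.5,  1.6]], dtype=np.float32)

def apply_ccm(rgb: np.ndarray) -> np.ndarray:
    """Multiply every RGB pixel by the color correction matrix."""
    flat = rgb.reshape(-1, 3).astype(np.float32)
    corrected = flat @ CCM.T
    return np.clip(corrected, 0, 255).reshape(rgb.shape).astype(rgb.dtype)
```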
For the Gamma correction process: Gamma correction applies a nonlinear operation to the gray values of the input image so that the gray value of the output image is related to the gray value of the input image by a power law.
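A lookup-table sketch for 8-bit images (the gamma value is illustrative):

```python
import numpy as np

def gamma_correct(image: np.ndarray, gamma: float = 2.2) -> np.ndarray:
    """Map input gray values through a power-law lookup table."""
    lut = (255.0 * (np.arange(256) / 255.0) ** (1.0 / gamma)).astype(np.uint8)
    return lut[image]
```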
For the image conversion processing, it can be used to convert the image data in red, green and blue RGB format into the image data in YUV format.
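A sketch of the conversion using BT.601 full-range coefficients (the exact matrix used on a given device is not specified by this embodiment):

```python
import numpy as np

RGB_TO_YUV = np.array([[ 0.299,  0.587,  0.114],
                       [-0.169, -0.331,  0.500],
                       [ 0.500, -0.419, -0.081]], dtype=np.float32)

def rgb_to_yuv(rgb: np.ndarray) -> np.ndarray:
    """Convert an 8-bit RGB image to YUV with a chroma offset of 128."""
    yuv = rgb.astype(np.float32).reshape(-1, 3) @ RGB_TO_YUV.T
    yuv[:, 1:] += 128.0
    return np.clip(yuv, 0, 255).reshape(rgb.shape).astype(np.uint8)
```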
It is to be understood that the interface provided by the embodiments of the present application is provided as an example only and is not intended to limit the embodiments of the present application further.
The method provided by the embodiment of the present application is described above with reference to fig. 4 to 9, and the device for performing the method provided by the embodiment of the present application is described below. As shown in fig. 10, fig. 10 is a schematic structural diagram of a blurring apparatus provided in an embodiment of the present application, where the blurring apparatus may be a terminal device in an embodiment of the present application, or may be a chip or a chip system in a terminal device.
As shown in fig. 10, the blurring apparatus 1000 may be used in a communication device, a circuit, a hardware component, or a chip, and the blurring apparatus 1000 includes: acquisition unit 1001 and processing unit 1002. Wherein the acquiring unit 1001 is configured to support a step of data acquisition performed by the blurring apparatus 1000; the processing unit 1002 is configured to support the blurring apparatus 1000 to perform steps of information processing.
Specifically, an embodiment of the present application provides a blurring apparatus 1000, applied to a terminal device that includes a first camera and a second camera, and the apparatus includes: an acquisition unit 1001 configured to acquire a first image and a second image, where the first image is obtained based on the first camera and the second image is obtained based on the second camera; and a processing unit 1002 configured to process the second image according to the target exposure time, or process the first image and the second image according to the target exposure time, where the processed first image is an image exposed according to the target exposure time, the processed second image is an image exposed according to the target exposure time, and the target exposure time is either the exposure time of the first image or the exposure time of the second image. The processing unit 1002 is further configured to perform blurring processing on the first image based on the processed second image, or based on the processed first image and the processed second image.
In a possible implementation, the blurring apparatus 1000 may further include: a communication unit 1003, configured to support the blurring apparatus 1000 in performing steps such as sending and receiving data. The communication unit 1003 may be an input or output interface, a pin, a circuit, or the like.
In a possible embodiment, the blurring apparatus 1000 may further include: a storage unit 1004. The processing unit 1002 and the storage unit 1004 are connected by a line. The storage unit 1004 may include one or more memories, which may be one or more devices or circuits for storing programs or data. The storage unit 1004 may exist independently and be connected to the processing unit 1002 of the blurring apparatus 1000 through a communication line, or may be integrated with the processing unit 1002.
The storage unit 1004 may store computer-executed instructions of the method in the terminal device to cause the processing unit 1002 to execute the method in the above-described embodiment. The storage unit 1004 may be a register, a cache, a RAM, or the like, and the storage unit 1004 may be integrated with the processing unit 1002. The storage unit 1004 may be a read-only memory (ROM) or other type of static storage device that may store static information and instructions, and the storage unit 1004 may be independent of the processing unit 1002.
Fig. 11 is a schematic hardware structure of another terminal device according to an embodiment of the present application, as shown in fig. 11, where the terminal device includes a processor 1101, a communication line 1104, and at least one communication interface (illustrated in fig. 11 by taking a communication interface 1103 as an example).
The processor 1101 may be a general purpose central processing unit (central processing unit, CPU), microprocessor, application Specific Integrated Circuit (ASIC), or one or more integrated circuits for controlling the execution of the programs of the present application.
Communication line 1104 may include circuitry for communicating information between the components described above.
Communication interface 1103 uses any transceiver-like device for communicating with other devices or communication networks, such as ethernet, wireless local area network (wireless local area networks, WLAN), etc.
Possibly, the terminal device may also comprise a memory 1102.
The memory 1102 may be, but is not limited to, a read-only memory (ROM) or other type of static storage device that can store static information and instructions, a random access memory (RAM) or other type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage (including compact discs, laser discs, optical discs, digital versatile discs, Blu-ray discs, etc.), a magnetic disk storage medium or other magnetic storage device, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory may be separate and coupled to the processor via the communication line 1104, or may be integrated with the processor.
The memory 1102 is used for storing computer-executable instructions for executing the embodiments of the present application, and the processor 1101 controls the execution. The processor 1101 is configured to execute computer-executable instructions stored in the memory 1102, thereby implementing the methods provided by the embodiments of the present application.
Possibly, the computer-executed instructions in the embodiments of the present application may also be referred to as application program code, which is not specifically limited in the embodiments of the present application.
In a particular implementation, the processor 1101 may include one or more CPUs, such as CPU0 and CPU1 of FIG. 11, as an embodiment.
In a specific implementation, as an embodiment, the terminal device may include multiple processors, such as processor 1101 and processor 1105 in fig. 11. Each of these processors may be a single-core (single-CPU) processor or may be a multi-core (multi-CPU) processor. A processor herein may refer to one or more devices, circuits, and/or processing cores for processing data (e.g., computer program instructions).
The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or a wireless manner (e.g., infrared, radio, microwave). The computer-readable storage medium may be a magnetic medium, an optical medium, a semiconductor medium (e.g., a solid state disk (SSD)), or the like.
Embodiments of the present application also provide a computer-readable storage medium. The methods described in the above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. Computer readable media can include computer storage media and communication media and can include any medium that can transfer a computer program from one place to another. The storage media may be any target media that is accessible by a computer.
As one possible design, the computer-readable medium may include a compact disc read-only memory (CD-ROM), RAM, ROM, EEPROM, or other optical disc storage; the computer-readable medium may also include magnetic disk storage or other magnetic storage devices. Moreover, any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers.
Combinations of the above should also be included within the scope of computer-readable media. The foregoing is merely a specific embodiment of the present invention, but the protection scope of the present invention is not limited thereto. Any variation or substitution that a person skilled in the art can readily conceive within the technical scope disclosed by the present invention shall be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

1. A blurring method, applied to a terminal device, the terminal device including a first camera and a second camera, the method comprising:
the terminal equipment acquires a first image and a second image; wherein the first image is obtained based on the first camera and the second image is obtained based on the second camera;
the terminal device processes the second image according to the target exposure time, or processes the first image and the second image according to the target exposure time, wherein the processed first image is an image exposed according to the target exposure time, and the processed second image is an image exposed according to the target exposure time; the target exposure time is either one of the exposure time of the first image or the exposure time of the second image;
The terminal device performs blurring processing on the first image based on the processed second image or based on the processed first image and the processed second image.
2. The method of claim 1, wherein the terminal device processing the second image according to a target exposure time or processing the first image and the second image according to the target exposure time comprises:
the terminal equipment acquires a first transformation matrix and a second transformation matrix; the first transformation matrix is used for aligning any row of pixel points in the first image to a position corresponding to the target exposure time; the second transformation matrix is used for aligning any row of pixel points in the second image to a position corresponding to the target exposure time;
the terminal device processes the first image using the first transformation matrix and processes the second image using the second transformation matrix.
3. The method according to claim 2, wherein the terminal device performs blurring processing on the first image based on the second image after processing or based on the first image after processing and the second image after processing, including:
The terminal equipment determines a first depth image based on the processed first image and the processed second image;
and the terminal equipment performs blurring processing on the first image based on the first depth image.
4. A method according to claim 3, wherein the terminal device performs blurring processing on the first image based on the first depth image, including:
the terminal equipment corrects the first depth image based on an inverse matrix corresponding to the first transformation matrix to obtain a second depth image;
and the terminal equipment performs blurring processing on the first image based on the second depth image.
5. The method of claim 1, wherein the terminal device processing the second image according to a target exposure time or processing the first image and the second image according to the target exposure time comprises:
the terminal equipment acquires a first transformation matrix and a second transformation matrix; the first transformation matrix is used for aligning any row of pixel points in the first image to a position corresponding to the target exposure time; the second transformation matrix is used for aligning any row of pixel points in the second image to a position corresponding to the target exposure time;
The terminal equipment processes the second image according to the inverse matrix corresponding to the first transformation matrix and the second transformation matrix; wherein the exposure time of the second image after processing is the same as the exposure time of the first image.
6. The method according to claim 5, wherein the terminal device performs blurring processing on the first image based on the second image after processing or based on the first image after processing and the second image after processing, including:
the terminal equipment determines a third depth image based on the first image and the processed second image;
and the terminal equipment performs blurring processing on the first image based on the third depth image.
7. The method according to any one of claims 1-6, further comprising:
the terminal equipment respectively carries out image preprocessing on the first image and the second image to obtain the first image after the image preprocessing and the second image after the image preprocessing;
the terminal device processes the second image according to a target exposure time, or processes the first image and the second image according to the target exposure time, including: the terminal device processes the second image after the image preprocessing according to the target exposure time, or processes the first image after the image preprocessing and the second image after the image preprocessing according to the target exposure time.
8. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the computer program, causes the terminal device to perform the method according to any of claims 1 to 7.
9. A computer readable storage medium storing a computer program, which when executed by a processor causes a computer to perform the method of any one of claims 1 to 7.
10. A computer program product comprising a computer program which, when run, causes a computer to perform the method of any of claims 1 to 7.
CN202211064862.1A 2022-08-31 2022-08-31 Blurring method, terminal device and readable storage medium Active CN116095517B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211064862.1A CN116095517B (en) 2022-08-31 2022-08-31 Blurring method, terminal device and readable storage medium

Publications (2)

Publication Number Publication Date
CN116095517A true CN116095517A (en) 2023-05-09
CN116095517B CN116095517B (en) 2024-04-09

Family

ID=86205185

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211064862.1A Active CN116095517B (en) 2022-08-31 2022-08-31 Blurring method, terminal device and readable storage medium

Country Status (1)

Country Link
CN (1) CN116095517B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104717480A (en) * 2014-01-28 2015-06-17 杭州海康威视数字技术股份有限公司 Binocular camera pixel-level synchronous image acquisition device and method thereof
CN107948519A (en) * 2017-11-30 2018-04-20 广东欧珀移动通信有限公司 Image processing method, device and equipment
CN108492245A (en) * 2018-02-06 2018-09-04 浙江大学 Low light images based on wavelet decomposition and bilateral filtering are to fusion method
CN109863742A (en) * 2017-01-25 2019-06-07 华为技术有限公司 Image processing method and terminal device
CN110312056A (en) * 2019-06-10 2019-10-08 青岛小鸟看看科技有限公司 A kind of synchronous exposure method and image capture device
CN111418201A (en) * 2018-03-27 2020-07-14 华为技术有限公司 Shooting method and equipment
WO2021136078A1 (en) * 2019-12-31 2021-07-08 RealMe重庆移动通信有限公司 Image processing method, image processing system, computer readable medium, and electronic apparatus
CN113450391A (en) * 2020-03-26 2021-09-28 华为技术有限公司 Method and equipment for generating depth map
CN113592922A (en) * 2021-06-09 2021-11-02 维沃移动通信(杭州)有限公司 Image registration processing method and device
CN113658065A (en) * 2021-08-09 2021-11-16 Oppo广东移动通信有限公司 Image noise reduction method and device, computer readable medium and electronic equipment

Also Published As

Publication number Publication date
CN116095517B (en) 2024-04-09

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant