CN110971822A - Picture processing method and device, terminal equipment and computer readable storage medium


Info

Publication number: CN110971822A
Application number: CN201911200429.4A
Authority: CN (China)
Prior art keywords: picture, light source, exposure, preview interface, exposure parameter
Legal status: Pending
Other languages: Chinese (zh)
Inventor: 张海裕
Current Assignee: Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee: Guangdong Oppo Mobile Telecommunications Corp Ltd
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd; priority to CN201911200429.4A

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/64 Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/76 Circuitry for compensating brightness variation in the scene by influencing the image signals
    • H04N23/80 Camera processing pipelines; Components thereof

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The application belongs to the technical field of images and provides a picture processing method, a picture processing apparatus, a terminal device and a computer-readable storage medium. The method comprises the following steps: if the terminal device enters a preview interface, detecting whether a light source exists in the picture of the preview interface; if a light source exists in the picture of the preview interface, shooting a first picture with a first exposure parameter and a second picture with a non-first exposure parameter; detecting the positions of the light source and the diffraction spots in the first picture to obtain the positions to be replaced; acquiring the pixel values at the corresponding positions in the second picture; and replacing the pixel values at the positions to be replaced in the first picture with the acquired pixel values to obtain a processed picture. This method effectively reduces the probability that the processed picture exhibits diffraction artifacts.

Description

Picture processing method and device, terminal equipment and computer readable storage medium
Technical Field
The present application belongs to the field of image processing technologies, and in particular, to a method and an apparatus for processing an image, a terminal device, and a computer-readable storage medium.
Background
A transmissive organic light-emitting device (TOLED) behaves like an amplitude-type transmissive two-dimensional grating: when light passes through it, diffraction occurs, which typically produces ghosting in the displayed image.
A conventional TOLED mitigates diffraction by rearranging the pixels to widen the gaps between thin-film transistors (TFTs), that is, between pixels, thereby lengthening the grating period. However, this scheme cannot effectively reduce the diffraction effect: rearranging the pixels increases the distribution period of the pixels and TFTs and lowers the pixel density (Pixels Per Inch, PPI) of the display screen, which degrades the display effect.
Therefore, a new method is needed to solve the above technical problems.
Disclosure of Invention
The embodiments of the application provide a picture processing method, a picture processing apparatus, a terminal device and a computer-readable storage medium, so as to reduce the probability of diffraction artifacts appearing in a picture.
In a first aspect, an embodiment of the present application provides an image processing method, where the image processing method is applied to a terminal device, and the image processing method includes:
if the terminal equipment enters a preview interface, detecting whether a light source exists in a picture of the preview interface;
if the light source exists in the picture of the preview interface, shooting a first picture by adopting a first exposure parameter, and shooting a second picture by adopting a non-first exposure parameter, wherein the exposure degree of the second picture is lower than that of the first picture;
detecting the positions of the light source and the diffraction light spot in the first picture to obtain positions to be replaced;
acquiring a pixel value of a position corresponding to the position to be replaced in the second picture;
and replacing the pixel value of the position to be replaced of the first picture with the acquired pixel value to obtain a processed picture.
In a second aspect, an embodiment of the present application provides a picture processing apparatus, where the picture processing apparatus is applied to a terminal device, and the picture processing apparatus includes:
the light source detection unit is used for detecting whether a light source exists in a picture of a preview interface or not if the terminal equipment enters the preview interface;
the image shooting unit is used for shooting a first image by adopting a first exposure parameter and shooting a second image by adopting a non-first exposure parameter if the light source exists in the image of the preview interface, wherein the exposure degree of the second image is lower than that of the first image;
the to-be-replaced position detection unit is used for detecting the positions of the light source and the diffraction light spot in the first picture to obtain a to-be-replaced position;
a pixel value obtaining unit, configured to obtain a pixel value of a position in the second picture corresponding to the position to be replaced;
and the processed picture acquisition unit is used for replacing the acquired pixel value with the pixel value of the position to be replaced of the first picture to obtain a processed picture.
In a third aspect, an embodiment of the present application provides a terminal device, which includes a camera, a memory, a processor, and a computer program stored in the memory and executable on the processor, and the processor implements the method according to the first aspect when executing the computer program.
In a fourth aspect, the present application provides a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a processor, the computer program implements the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a computer program product, which, when running on a terminal device, causes the terminal device to execute the image processing method according to the first aspect.
Compared with the prior art, the embodiment of the application has the advantages that:
the first picture and the second picture are respectively pictures shot by adopting the first exposure parameters and the non-first exposure parameters, and the exposure of the second picture is smaller than that of the first picture, so that the pixel values of the positions of the light source and the light spot in the second picture are inevitably lower than those of the positions of the light source and the light spot in the first picture, the lower pixel values can reduce the probability of generating the diffraction phenomenon, the pixel values of the positions of the light source and the light spot in the second picture are replaced with the pixel values of the positions of the light source and the light spot in the first picture, and the probability of the diffraction phenomenon of the obtained processed image can be effectively reduced.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings used in the embodiments or the description of the prior art will be briefly described below.
Fig. 1 is a schematic structural diagram of a mobile phone to which an image processing method provided in an embodiment of the present application is applied;
fig. 2 is a schematic flowchart illustrating a picture processing method according to an embodiment of the present application;
fig. 3A is a schematic diagram of a first picture needing to be processed in one scene according to an embodiment of the present application;
fig. 3B is a schematic diagram of a second picture in the same scene as fig. 3A according to an embodiment of the present application;
fig. 3C is a schematic diagram of a processed picture obtained by replacing pixel values of a first picture according to an embodiment of the present application;
fig. 4A is a Fraunhofer diffraction pattern corresponding to the first picture provided in the present application;
fig. 4B is a Fraunhofer diffraction pattern corresponding to the second picture provided in the present application;
fig. 5 is a schematic structural diagram of a picture processing apparatus according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon", "in response to determining" or "in response to detecting". Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted contextually to mean "upon determining", "in response to determining", "upon detecting [the described condition or event]" or "in response to detecting [the described condition or event]".
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
The image processing method provided by the embodiment of the application can be applied to terminal devices such as a mobile phone, a tablet personal computer, a wearable device, a vehicle-mounted device, an Augmented Reality (AR)/Virtual Reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a Personal Digital Assistant (PDA), and the like, and the embodiment of the application does not limit the specific type of the terminal device at all.
For example, the terminal device may be a Station (ST) in a WLAN, a cellular phone, a cordless phone, a Session Initiation Protocol (SIP) phone, a Wireless Local Loop (WLL) station, a Personal Digital Assistant (PDA) device, a handheld device with wireless communication capability, a computing device or other processing device connected to a wireless modem, a vehicle-mounted device, a vehicle-mounted networking terminal, a computer, a laptop, a handheld communication device, a handheld computing device, a satellite wireless device, a wireless modem card, a television set-top box (STB), Customer Premises Equipment (CPE), and/or another device for communicating over a wireless system or a next-generation communication system, e.g., a mobile terminal in a 5G network or in a future evolved Public Land Mobile Network (PLMN), etc.
By way of example and not limitation, when the terminal device is a wearable device, the wearable device may be a generic term for devices created by applying wearable technology to the intelligent design of everyday wear, such as glasses, gloves, watches, clothing and shoes. A wearable device is a portable device that is worn directly on the body or integrated into the user's clothing or accessories. A wearable device is not only a piece of hardware; it also realizes powerful functions through software support, data interaction and cloud interaction. In a broad sense, wearable intelligent devices include full-featured, large-sized devices that can realize complete or partial functions without relying on a smartphone, such as smart watches or smart glasses, as well as devices that focus on a single class of application function and must be used together with another device such as a smartphone, for example various smart bracelets and smart jewelry for monitoring physical signs.
Take the terminal device as a mobile phone as an example. Fig. 1 is a block diagram illustrating a partial structure of a mobile phone according to an embodiment of the present disclosure. Referring to fig. 1, the cellular phone includes: a Radio Frequency (RF) circuit 110, a memory 120, an input unit 130, a display unit 140, a sensor 150, an audio circuit 160, a Wireless fidelity (WiFi) module 170, a processor 180, and a power supply 190. Those skilled in the art will appreciate that the handset configuration shown in fig. 1 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile phone in detail with reference to fig. 1:
the RF circuit 110 may be used for receiving and transmitting signals during information transmission and reception or during a call, and in particular, receives downlink information of a base station and then processes the received downlink information to the processor 180; in addition, the data for designing uplink is transmitted to the base station. Typically, the RF circuitry includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the RF circuitry 110 may also communicate with networks and other devices via wireless communications. The wireless communication may use any communication standard or protocol, including but not limited to global system for Mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Messaging Service (SMS), etc.
The memory 120 may be used to store software programs and modules, and the processor 180 executes various functional applications and data processing of the mobile phone by running the software programs and modules stored in the memory 120. The memory 120 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data (such as audio data and a phonebook) created according to the use of the cellular phone, and the like. Further, the memory 120 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
The input unit 130 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the cellular phone 100. Specifically, the input unit 130 may include a touch panel 131 and other input devices 132. The touch panel 131, also referred to as a touch screen, may collect touch operations of a user on or near it (e.g., operations performed on or near the touch panel 131 with a finger, stylus or any other suitable object or accessory) and drive the corresponding connection device according to a preset program. Alternatively, the touch panel 131 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, and sends them to the processor 180, and it can also receive and execute commands sent by the processor 180. In addition, the touch panel 131 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 131, the input unit 130 may include other input devices 132, which may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, a joystick, and the like.
The display unit 140 may be used to display information input by a user or information provided to the user and various menus of the mobile phone. The display unit 140 may include a display panel 141, and optionally, the display panel 141 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. Further, the touch panel 131 can cover the display panel 141, and when the touch panel 131 detects a touch operation on or near the touch panel 131, the touch operation is transmitted to the processor 180 to determine the type of the touch event, and then the processor 180 provides a corresponding visual output on the display panel 141 according to the type of the touch event. Although the touch panel 131 and the display panel 141 are shown as two separate components in fig. 1 to implement the input and output functions of the mobile phone, in some embodiments, the touch panel 131 and the display panel 141 may be integrated to implement the input and output functions of the mobile phone.
The handset 100 may also include at least one sensor 150, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor that adjusts the brightness of the display panel 141 according to the brightness of ambient light, and a proximity sensor that turns off the display panel 141 and/or the backlight when the mobile phone is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
The audio circuit 160, speaker 161, and microphone 162 may provide an audio interface between the user and the handset. The audio circuit 160 may transmit the electrical signal converted from the received audio data to the speaker 161, which converts it into a sound signal for output; conversely, the microphone 162 converts a collected sound signal into an electrical signal, which the audio circuit 160 receives and converts into audio data. The audio data is then output to the processor 180 for processing and subsequently transmitted to, for example, another mobile phone via the RF circuit 110, or output to the memory 120 for further processing.
WiFi is a short-range wireless transmission technology. Through the WiFi module 170, the mobile phone can help the user receive and send e-mails, browse web pages, access streaming media and the like, providing wireless broadband Internet access. Although fig. 1 shows the WiFi module 170, it is understood that it is not an essential component of the handset 100 and may be omitted as needed without changing the essence of the invention.
The processor 180 is the control center of the mobile phone. It connects the various parts of the entire phone using various interfaces and lines, and performs the phone's functions and processes data by running or executing software programs and/or modules stored in the memory 120 and calling data stored in the memory 120, thereby monitoring the phone as a whole. Optionally, the processor 180 may include one or more processing units; preferably, the processor 180 may integrate an application processor, which mainly handles the operating system, user interfaces and application programs, and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor may also not be integrated into the processor 180.
The handset 100 also includes a power supply 190 (e.g., a battery) for powering the various components, which may preferably be logically connected to the processor 180 via a power management system, such that the power management system may be used to manage charging, discharging, and power consumption.
Although not shown, the handset 100 also includes a camera. Optionally, the position of the camera on the mobile phone 100 may be front-located or rear-located, which is not limited in this embodiment of the application.
Optionally, the mobile phone 100 may include a single camera, a dual camera, or a triple camera, which is not limited in this embodiment.
For example, the cell phone 100 may include three cameras: a main camera, a wide-angle camera, and a telephoto camera.
Optionally, when the mobile phone 100 includes a plurality of cameras, the plurality of cameras may be all front-mounted, all rear-mounted, or a part of the cameras front-mounted and another part of the cameras rear-mounted, which is not limited in this embodiment of the present application.
In addition, although not shown, the mobile phone 100 may further include a bluetooth module or the like, which is not described herein.
Fig. 2 shows a flowchart of a picture processing method provided in an embodiment of the present application. The picture processing method is applied to a terminal device (which may be a mobile phone). In this embodiment, a user starts a photographing application installed on the terminal device and enters a preview interface, where the preview interface may be a preview interface for taking a picture or for recording a video. The terminal device then detects whether a light source exists in the picture of the preview interface. If so, pictures are shot with different exposure parameters, and the pixel values at the light source positions in the picture shot with the high exposure parameter are replaced with the pixel values at the corresponding positions in the picture shot with the low exposure parameter, so as to obtain the processed picture. The details are as follows:
step S21, if the terminal equipment enters a preview interface, detecting whether a light source exists in a picture of the preview interface;
specifically, since the pixel value of the position where the light source is located is greatly different from the pixel value of the position where the non-light source is located, whether the light source is located in the screen can be detected according to the size of the pixel value. For example, if all the pixel values in a certain region are greater than or equal to the preset light source threshold, the region is regarded as the region where the light source is located. In some embodiments, since the gray-level value of the edge of the light source region and the non-light source region will be greatly reduced when the light source is present, in order to more accurately detect whether the light source is present on the picture, the judgment can be made by analyzing the reduction amplitude of the gray-level value.
Step S22, if a light source is detected in the picture of the preview interface, a first picture is shot by adopting a first exposure parameter, and a second picture is shot by adopting a non-first exposure parameter, wherein the exposure of the second picture is lower than that of the first picture;
in this embodiment, the first exposure parameter includes: the first photosensitive value, the first exposure duration and the first aperture value, and correspondingly, the non-first exposure parameters include: the non-first photosensitive value, the non-first exposure time and the non-first aperture value. Specifically, the smaller the photosensitive value and the exposure duration are, the lower the exposure degree of the picture obtained by shooting the photosensitive value and the exposure duration are, and conversely, the higher the exposure degree is; and the smaller the aperture value, the higher the exposure degree of the picture taken by the camera. That is, the exposure level of the second picture taken according to the non-first exposure parameters is lower than the exposure level of the first picture by adjusting the non-first exposure parameters.
In this embodiment, the exposure level of the second picture is smaller than that of the first picture, for example, if the exposure level of the first picture is "overexposure", the exposure level of the second picture may be "underexposure".
Of course, if it is detected that no light source exists in the picture of the preview interface, a first picture is taken by using the first exposure parameter, and the first picture is taken as a processed picture.
Step S23, detecting the positions of the light source and the diffraction light spot in the first picture to obtain the position to be replaced;
wherein the position to be replaced is described by coordinate information.
The method for detecting the positions of the light source and the diffraction spot is the same as the method for detecting whether the light source exists in the picture in step S21, and is not described herein again.
Step S24, acquiring a pixel value of a position in the second picture corresponding to the position to be replaced;
Because the first picture and the second picture are shot on the same terminal device, they have the same size. The coordinate information determined in the first picture as the position to be replaced can therefore be used directly to locate the corresponding coordinates in the second picture, from which the corresponding pixel values are then obtained. Of course, to keep the difference between the scenes of the two pictures minimal, the difference between their shooting times is set to be less than a preset time difference, for example 0.5 second.
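A sketch of the paired capture under this time constraint; `camera.shoot()` is a hypothetical interface, and 0.5 s is the example threshold from the text:

```python
import time

MAX_CAPTURE_GAP_S = 0.5  # preset time difference from the embodiment

def capture_pair(camera, first_params, non_first_params):
    """Shoot the high- and low-exposure pictures back to back and verify
    that the gap stays below the preset time difference."""
    t0 = time.monotonic()
    first_pic = camera.shoot(first_params)
    second_pic = camera.shoot(non_first_params)
    if time.monotonic() - t0 >= MAX_CAPTURE_GAP_S:
        raise RuntimeError("shots too far apart; the scene may have changed")
    return first_pic, second_pic
```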
And step S25, replacing the pixel value of the position to be replaced of the first picture with the acquired pixel value to obtain a processed picture.
Specifically, the obtained pixel value is directly used as the pixel value of the position to be replaced in the first picture, so that the processed picture is obtained. For example, a pixel value 1 obtained from a position 1 in the second picture is taken as a pixel value of a position x in the first picture corresponding to the position 1 in the second picture, a pixel value 2 obtained from a position 2 in the second picture is taken as a pixel value of a position y in the first picture corresponding to the position 2 in the second picture, and the like.
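Expressed as an array operation (a sketch only; the text does not prescribe an implementation), the replacement of step S25 is a masked copy:

```python
import numpy as np

def replace_at_positions(first_pic: np.ndarray,
                         second_pic: np.ndarray,
                         mask: np.ndarray) -> np.ndarray:
    """Copy the low-exposure pixel values into the high-exposure picture
    at the positions to be replaced.

    mask: boolean (H, W) array marking the light source and diffraction
    spot positions found in the first picture (step S23); since both
    pictures have the same size, one mask indexes both directly.
    """
    out = first_pic.copy()
    out[mask] = second_pic[mask]
    return out
```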
Referring to fig. 3A to 3C: fig. 3A is a schematic diagram of a first picture that needs to be processed in one scene, i.e., a picture shot with the first exposure parameter; fig. 3B is a schematic diagram of a second picture of the same scene, shot with the non-first exposure parameter; and fig. 3C is the resulting processed picture obtained by replacing pixel values of the first picture. The Fraunhofer diffraction pattern corresponding to the picture shot with the first exposure parameter is shown in fig. 4A, and the pattern corresponding to the picture shot with the non-first exposure parameter is shown in fig. 4B. In fig. 4A and 4B, the central, highest-intensity order marks the position of the light source, and the two secondary orders mark the positions of the diffraction spots. As can be seen, under the first exposure parameter the energy at the light source position is very high, possibly even overexposed, and the energy at the diffraction spot positions is also high, whereas under the non-first exposure parameter the energies at the light source and diffraction spot positions are comparatively low.
In fig. 3A to 3C, a darker color indicates a larger gray value at that position. The processed picture in fig. 3C retains the pixel information at non-light-source positions while reducing the diffraction at the light source position.
In the embodiment of the application, the first picture and the second picture are shot with the first exposure parameter and the non-first exposure parameter respectively, and the exposure of the second picture is lower than that of the first picture. The pixel values at the positions of the light source and the light spots in the second picture are therefore necessarily lower than those at the same positions in the first picture, and lower pixel values reduce the probability of a visible diffraction phenomenon. By replacing the pixel values at the light source and light spot positions in the first picture with those from the second picture, the probability that the resulting processed image exhibits diffraction is effectively reduced.
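Putting steps S21 to S25 together, and reusing the helper sketches above (with `camera.shoot()` and `locate_light_and_diffraction()` as hypothetical stand-ins), the whole method can be summarised as:

```python
def process_picture(camera, preview_gray_frame, first_params, non_first_params):
    """End-to-end sketch of the method of fig. 2 (steps S21 to S25)."""
    if not detect_light_source(preview_gray_frame):              # S21
        return camera.shoot(first_params)                        # no light source
    first_pic, second_pic = capture_pair(camera, first_params,   # S22
                                         non_first_params)
    mask = locate_light_and_diffraction(first_pic)               # S23
    return replace_at_positions(first_pic, second_pic, mask)     # S24 + S25
```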
In some embodiments, whether a light source exists in the picture is determined by detecting the falling amplitude of the gray values at the edge between a light source region and a non-light source region. In this case, detecting whether a light source exists in the picture of the preview interface includes:
a1, detecting whether an area with the gray value falling amplitude larger than or equal to a preset falling amplitude threshold exists in the picture of the preview interface, wherein the falling amplitude of the gray value is determined according to the difference of the gray values of adjacent areas;
specifically, the gray values of the adjacent regions are compared, and if the difference between the gray values of the adjacent regions is greater than or equal to a preset descending amplitude threshold, it is indicated that the adjacent region is a region whose descending amplitude is greater than or equal to the preset descending amplitude threshold. In the specific implementation, the gray value of each position in one region may be compared with the gray value of the adjacent region of each position in the adjacent region, or the mean value of the gray values of each position in one region is firstly counted, the mean value of the gray values of each position in the adjacent region adjacent to the region is counted, and then the two counted mean values are compared.
A2, if an area with the gray value falling amplitude larger than or equal to a preset falling amplitude threshold exists, judging that a light source exists in the picture of the preview interface, otherwise, judging that no light source exists in the picture of the preview interface.
Building on A1 and A2, to further improve the accuracy of light source detection, after a region whose gray value falling amplitude is greater than or equal to the preset falling amplitude threshold has been found, it is further checked whether that region contains gray values greater than or equal to a preset light source gray value, and whether the number of such gray values is greater than or equal to a preset number threshold. Only if both conditions hold is it determined that a light source exists in the picture of the preview interface.
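As referenced under A1, a minimal sketch of the block-mean comparison; the 16-px block size and the falling amplitude threshold of 80 gray levels are illustrative assumptions:

```python
import numpy as np

def falling_amplitude_blocks(gray: np.ndarray, block: int = 16,
                             drop_threshold: int = 80) -> list[tuple[int, int]]:
    """Return (row, col) block indices whose mean gray value differs from a
    horizontally adjacent block's mean by at least drop_threshold."""
    gh, gw = gray.shape[0] // block, gray.shape[1] // block
    means = (gray[:gh * block, :gw * block]
             .reshape(gh, block, gw, block)
             .mean(axis=(1, 3)))
    return [(i, j) for i in range(gh) for j in range(gw - 1)
            if abs(means[i, j] - means[i, j + 1]) >= drop_threshold]
```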
In some embodiments, since a picture taken from a picture with a light source may also meet a user requirement, and a certain time is consumed for processing the picture, in order to quickly obtain a picture meeting the user requirement, if it is detected that the light source exists in the picture of the preview interface, taking a first picture with a first exposure parameter and taking a second picture with a non-first exposure parameter includes:
if the light source exists in the picture of the preview interface, whether an overexposed area exists in the picture of the preview interface is analyzed by combining with the first exposure parameter, if the overexposed area exists, the first picture is shot by adopting the first exposure parameter, and the second picture is shot by adopting the non-first exposure parameter.
In this embodiment, when a picture appears in the preview interface, the first exposure parameter is determined according to the brightness, color and other properties of the scene. Shooting with the first exposure parameter ensures that the overall brightness and color saturation of the resulting picture meet the user requirements, but it cannot rule out overexposure at the position of the light source. Therefore, if it is determined that the current frame shot with the first exposure parameter would contain an overexposed area, the first picture and the second picture are shot with different exposure parameters. Of course, if there is no overexposed area, only the first picture is shot with the first exposure parameter and is taken as the processed picture, which is then displayed.
In some embodiments, if it is detected that there are light sources in the preview interface frame and the number of the light sources is greater than 1, the taking the first picture using the first exposure parameter and the taking the second picture using the non-first exposure parameter includes:
shooting a first picture by adopting first exposure parameters, and shooting at least one second picture by respectively adopting at least one non-first exposure parameter, wherein one light source corresponds to one non-first exposure parameter;
correspondingly, the step S24 specifically includes:
and respectively acquiring pixel values of positions corresponding to the positions to be replaced in the at least one second picture, wherein the positions corresponding to the positions to be replaced refer to positions where the light sources corresponding to the non-first exposure parameters are located.
In this embodiment, if multiple light sources exist in the preview interface, the diffraction intensities they produce usually differ. To make the processed picture better fit the user requirement, a corresponding non-first exposure parameter is determined for each light source, a corresponding second picture is shot with each of these non-first exposure parameters, and the pixel values at the positions of the corresponding light sources are obtained from the different second pictures. For example, if two light sources, light source 1 and light source 2, exist in the picture of the preview interface, a second exposure parameter is determined from light source 1 and a third exposure parameter from light source 2; a second picture m is shot with the second exposure parameter and a second picture n with the third exposure parameter; finally, the pixel values at the position of light source 1 are obtained from picture m, and those at the position of light source 2 from picture n.
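A sketch of this multi-light-source variant; `estimate_non_first_params()`, `camera.shoot()` and `region_of()` are hypothetical helpers standing in for the per-source exposure estimation, capture and region lookup described above:

```python
import numpy as np

def replace_per_light_source(first_pic: np.ndarray, light_sources, camera):
    """One low-exposure shot per light source, each with its own
    non-first exposure parameters."""
    out = first_pic.copy()
    for source in light_sources:
        params = estimate_non_first_params(source)  # per-source exposure
        second_pic = camera.shoot(params)
        mask = region_of(source, first_pic)         # that source and its spots
        out[mask] = second_pic[mask]
    return out
```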
In some embodiments, in order to obtain a more natural picture, a certain processing is performed on the replaced picture after the pixel value replacement, in this case, the step S25 includes:
b1, replacing the pixel value of the position to be replaced of the first picture with the acquired pixel value;
b2, executing a preset processing action on the picture after the pixel value replacement to obtain a processed picture, wherein the preset processing action includes: smoothing filtering and/or noise reduction.
In this embodiment, since the pixel values at the positions to be replaced are obtained from another picture, they may differ considerably from the surrounding pixel values of the first picture. To ensure that the pixel values of the processed picture transition naturally, smoothing filtering and/or noise reduction is applied to the picture after replacement. Of course, the preset processing action may also be another processing action, such as sharpening, color saturation adjustment or contrast adjustment.
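A sketch of B2 using OpenCV, assuming an 8-bit 3-channel picture; a Gaussian blur is one common smoothing filter and non-local-means one common denoiser, chosen here as illustrative stand-ins since the text does not prescribe specific algorithms:

```python
import cv2
import numpy as np

def post_process(replaced_pic: np.ndarray) -> np.ndarray:
    """Apply the preset processing actions of B2 (smoothing and noise
    reduction) to the picture after pixel value replacement."""
    smoothed = cv2.GaussianBlur(replaced_pic, (5, 5), 0)
    return cv2.fastNlMeansDenoisingColored(smoothed, None, 3, 3, 7, 21)
```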
In some embodiments, the first exposure parameter described above is determined by:
adjusting the exposure parameters of the terminal equipment, analyzing whether the quality of the picture in the preview interface meets the requirements or not by combining the adjusted exposure parameters, if so, taking the adjusted exposure parameters as first exposure parameters, and if not, returning to the step of adjusting the exposure parameters of the terminal equipment, wherein the quality of the picture comprises at least one of the following: brightness, noise.
In the embodiment of the application, after a picture appears on the preview interface, the terminal device automatically adjusts the exposure parameters against preset picture quality requirements (for example, whether the brightness meets the requirement and whether the noise meets the requirement). For example, an exposure parameter is determined first, and the picture quality under that parameter is analyzed; if it does not meet the preset picture quality requirements, the adjustment continues to obtain another exposure parameter, whose picture quality is analyzed in turn. The exposure parameter under which the picture quality meets the requirements is taken as the first exposure parameter. It is noted that no photographing action is performed until the first exposure parameter has been determined.
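A sketch of this search loop; `preview.render_with()`, `brightness_ok()`, `noise_ok()` and `adjust()` are hypothetical stand-ins for the device's metering and quality checks:

```python
def find_first_exposure(preview, initial_params):
    """Adjust exposure parameters until the preview picture quality meets
    the preset requirements; no photo is taken during the search."""
    params = initial_params
    while True:
        frame = preview.render_with(params)
        if brightness_ok(frame) and noise_ok(frame):
            return params          # becomes the first exposure parameter
        params = adjust(params)    # try another candidate
```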
In some embodiments, the non-first exposure parameter described above is determined by:
reducing a first photosensitive value and a first exposure duration in the first exposure parameter to obtain an intermediate photosensitive value and an intermediate exposure duration, and increasing a first aperture value in the first exposure parameter to obtain an intermediate aperture value; and analyzing whether the picture in the preview interface has no overexposure phenomenon or not by combining the intermediate photosensitive value, the intermediate exposure time length and the intermediate aperture value, and if the picture in the preview interface has no overexposure phenomenon, taking the intermediate photosensitive value, the intermediate exposure time length and the intermediate aperture value as non-first exposure parameters.
In this embodiment, the non-first exposure parameter serves mainly to obtain a picture in which the light source produces no diffraction. When the picture in the preview interface is overexposed, the probability of diffraction at the picture position corresponding to the light source is higher; therefore, taking an exposure parameter under which no overexposure occurs as the non-first exposure parameter helps ensure that the processed picture is free of diffraction.
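A sketch of this derivation; the halving/scaling step factors and `has_overexposed_region()` are illustrative assumptions:

```python
def derive_non_first_exposure(first_params, preview, max_iters=8):
    """Lower the photosensitive value and exposure duration, raise the
    aperture value, and stop once the preview shows no overexposure."""
    iso, t, f = first_params.iso, first_params.time_s, first_params.f_number
    for _ in range(max_iters):
        iso, t, f = iso * 0.5, t * 0.5, f * 1.4   # intermediate values
        frame = preview.render_with(iso=iso, time_s=t, f_number=f)
        if not has_overexposed_region(frame):
            return iso, t, f                      # non-first exposure parameters
    return iso, t, f
```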
Fig. 5 shows a block diagram of a picture processing apparatus provided in an embodiment of the present application, which corresponds to the picture processing method described in the above embodiment, and only shows portions related to the embodiment of the present application for convenience of description.
Referring to fig. 5, the picture processing apparatus 5 is applied to a terminal device, and includes: a light source detection unit 51, a picture taking unit 52, a to-be-replaced position detection unit 53, a pixel value acquisition unit 54, and a processed picture acquisition unit 55. Wherein:
a light source detecting unit 51, configured to detect whether a light source exists in a picture of a preview interface if the terminal device enters the preview interface;
a picture taking unit 52, configured to take a first picture by using a first exposure parameter and a second picture by using a non-first exposure parameter if it is detected that a light source exists in a picture of the preview interface, where an exposure level of the second picture is lower than an exposure level of the first picture;
of course, the picture processing apparatus 5 may further include:
and the direct shooting unit is used for shooting a first picture by adopting the first exposure parameter if the fact that no light source exists in the picture of the preview interface is detected, and taking the first picture as a processed picture.
A to-be-replaced position detection unit 53, configured to detect positions of the light source and the diffraction spot in the first picture, so as to obtain a to-be-replaced position;
wherein the position to be replaced is described by coordinate information.
A pixel value obtaining unit 54, configured to obtain a pixel value of a position in the second picture corresponding to the position to be replaced;
and the size of the first picture is the same as that of the second picture.
And the processed picture acquiring unit 55 is configured to replace the acquired pixel value with the pixel value of the position to be replaced of the first picture, so as to obtain a processed picture.
In the embodiment of the application, the first picture and the second picture are shot with the first exposure parameter and the non-first exposure parameter respectively, and the exposure of the second picture is lower than that of the first picture. The pixel values at the positions of the light source and the light spots in the second picture are therefore necessarily lower than those at the same positions in the first picture, and lower pixel values reduce the probability of a visible diffraction phenomenon. By replacing the pixel values at the light source and light spot positions in the first picture with those from the second picture, the probability that the resulting processed image exhibits diffraction is effectively reduced.
In some embodiments, it is determined whether a light source exists in the image by detecting a descending amplitude of the gray scale values of the edges of the two regions, and at this time, when detecting whether a light source exists in the image of the preview interface in the light source detecting unit 51, the method is specifically configured to:
detecting whether an area with the gray value falling amplitude larger than or equal to a preset falling amplitude threshold exists in a picture of the preview interface, wherein the falling amplitude of the gray value is determined according to the difference of the gray values of adjacent areas; and if the area with the gray value falling amplitude larger than or equal to the preset falling amplitude threshold exists, judging that the light source exists in the picture of the preview interface, otherwise, judging that the light source does not exist in the picture of the preview interface.
Specifically, the gray value of each position in one region may be compared with the gray value of the adjacent region of each position in the adjacent region, or the mean value of the gray values of each position in one region is counted first, the mean value of the gray values of each position in the adjacent region adjacent to the region is counted, and then the two counted mean values are compared.
To further improve the accuracy of light source detection, after a region whose gray value falling amplitude is greater than or equal to the preset falling amplitude threshold has been found, it is further checked whether that region contains gray values greater than or equal to a preset light source gray value, and whether the number of such gray values is greater than or equal to a preset number threshold. If both conditions hold, it is determined that a light source exists in the picture of the preview interface; otherwise, it is determined that no light source exists in the picture of the preview interface.
In some embodiments, since a picture taken from a picture with a light source may also satisfy a user requirement, and it will take a certain time to process the picture, in order to quickly obtain a picture satisfying the user requirement, the picture taking unit 52 is specifically configured to:
if the light source exists in the picture of the preview interface, whether an overexposed area exists in the picture of the preview interface is analyzed by combining with the first exposure parameter, if the overexposed area exists, the first picture is shot by adopting the first exposure parameter, and the second picture is shot by adopting the non-first exposure parameter.
In some embodiments, if it is detected that a light source exists in the preview interface screen and the number of light sources is greater than 1, the picture taking unit 52 is specifically configured to:
shooting a first picture by adopting first exposure parameters, and shooting at least one second picture by respectively adopting at least one non-first exposure parameter, wherein one light source corresponds to one non-first exposure parameter;
correspondingly, the pixel value obtaining unit 54 is specifically configured to:
and respectively acquiring pixel values of positions corresponding to the positions to be replaced in the at least one second picture, wherein the positions corresponding to the positions to be replaced refer to positions where the light sources corresponding to the non-first exposure parameters are shot.
In some embodiments, in order to obtain a more natural picture, a certain processing is performed on the replaced picture after the pixel value replacement, in this case, the processed picture obtaining unit 55 includes:
the pixel value replacing module is used for replacing the pixel value of the position to be replaced of the first picture with the acquired pixel value;
the optimization processing module is configured to execute a preset processing action on the picture after the pixel value replacement to obtain a processed picture, where the preset processing action includes: smoothing filtering and/or noise reduction.
The preset processing action may also be other processing actions, such as sharpening processing, color saturation processing, contrast processing, and the like.
In some embodiments, the first exposure parameter is determined by:
adjusting the exposure parameters of the terminal equipment, analyzing whether the quality of the picture in the preview interface meets the requirements or not by combining the adjusted exposure parameters, if so, taking the adjusted exposure parameters as first exposure parameters, and if not, returning to the step of adjusting the exposure parameters of the terminal equipment, wherein the quality of the picture comprises at least one of the following: brightness, noise.
In some embodiments, the non-first exposure parameter is determined by:
reducing a first photosensitive value and a first exposure duration in the first exposure parameter to obtain an intermediate photosensitive value and an intermediate exposure duration, and increasing a first aperture value in the first exposure parameter to obtain an intermediate aperture value;
and analyzing whether the picture in the preview interface has no overexposure phenomenon or not by combining the intermediate photosensitive value, the intermediate exposure time length and the intermediate aperture value, and if the picture in the preview interface has no overexposure phenomenon, taking the intermediate photosensitive value, the intermediate exposure time length and the intermediate aperture value as non-first exposure parameters.
It should be noted that, for the information interaction, execution process, and other contents between the above-mentioned devices/units, the specific functions and technical effects thereof are based on the same concept as those of the embodiment of the method of the present application, and specific reference may be made to the part of the embodiment of the method, which is not described herein again.
Fig. 6 is a schematic structural diagram of a terminal device according to an embodiment of the present application. As shown in fig. 6, the terminal device 6 of this embodiment includes: at least one processor 60 (only one is shown in fig. 6), a memory 61, a computer program 62 stored in the memory 61 and executable on the at least one processor 60, and a camera 63. The processor 60, when executing the computer program 62, implements the steps in any of the method embodiments described above:
if the terminal equipment enters a preview interface, detecting whether a light source exists in a picture of the preview interface;
if the light source exists in the picture of the preview interface, shooting a first picture by adopting a first exposure parameter, and shooting a second picture by adopting a non-first exposure parameter, wherein the exposure degree of the second picture is lower than that of the first picture;
detecting the positions of the light source and the diffraction light spot in the first picture to obtain positions to be replaced;
acquiring a pixel value of a position corresponding to the position to be replaced in the second picture;
and replacing the pixel value of the position to be replaced of the first picture with the acquired pixel value to obtain a processed picture.
The terminal device 6 may be a desktop computer, a notebook, a palm computer, a cloud server, or other computing devices. The terminal device may include, but is not limited to, a processor 60, a memory 61. Those skilled in the art will appreciate that fig. 6 is only an example of the terminal device 6, and does not constitute a limitation to the terminal device 6, and may include more or less components than those shown, or combine some components, or different components, such as an input/output device, a network access device, and the like.
The Processor 60 may be a Central Processing Unit (CPU), and the Processor 60 may be other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic device, discrete hardware component, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 61 may in some embodiments be an internal storage unit of the terminal device 6, such as a hard disk or a memory of the terminal device 6. The memory 61 may also be an external storage device of the terminal device 6 in other embodiments, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are equipped on the terminal device 6. Further, the memory 61 may also include both an internal storage unit and an external storage device of the terminal device 6. The memory 61 is used for storing an operating system, an application program, a BootLoader (BootLoader), data, and other programs, such as program codes of the computer program. The memory 61 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above division of the functional units and modules is illustrated; in practical applications, the above functions may be assigned to different functional units and modules as needed, that is, the internal structure of the apparatus may be divided into different functional units or modules to perform all or part of the functions described above. The functional units and modules in the embodiments may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware or in the form of a software functional unit. In addition, the specific names of the functional units and modules are only for ease of distinguishing them from each other and are not used to limit the protection scope of the present application. For the specific working processes of the units and modules in the above system, reference may be made to the corresponding processes in the foregoing method embodiments, which are not repeated here.
An embodiment of the present application further provides a network device, including: at least one processor, a memory, and a computer program stored in the memory and executable on the at least one processor, wherein the processor implements the steps of any of the above method embodiments when executing the computer program.
An embodiment of the present application further provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the above method embodiments.
An embodiment of the present application further provides a computer program product which, when run on a mobile terminal, causes the mobile terminal to implement the steps of the above method embodiments.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, all or part of the processes in the methods of the above embodiments may be implemented by a computer program; the computer program may be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the above method embodiments. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include at least: any entity or device capable of carrying the computer program code to the photographing apparatus/terminal device, a recording medium, a computer memory, a Read-Only Memory (ROM), a Random-Access Memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, for example a USB flash drive, a removable hard disk, a magnetic disk, or an optical disk. In certain jurisdictions, in accordance with legislation and patent practice, computer-readable media may not include electrical carrier signals or telecommunications signals.
In the above embodiments, the description of each embodiment has its own emphasis; for parts not described or detailed in one embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It should be understood that, in the embodiments provided in the present application, the disclosed apparatus/network device and method may be implemented in other ways. For example, the apparatus/network device embodiments described above are merely illustrative: the division into modules or units is only one kind of logical division, and other divisions are possible in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not implemented. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
The units described as separate parts may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solutions of the embodiments.
The above embodiments are merely intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced; such modifications and substitutions do not depart from the spirit and scope of the embodiments of the present application and shall be included within the protection scope of the present application.

Claims (10)

1. A picture processing method, applied to a terminal device, characterized by comprising:
if the terminal device enters a preview interface, detecting whether a light source exists in the picture of the preview interface;
if a light source exists in the picture of the preview interface, shooting a first picture using a first exposure parameter and shooting a second picture using a non-first exposure parameter, wherein the exposure degree of the second picture is lower than that of the first picture;
detecting the positions of the light source and the diffraction light spot in the first picture to obtain the positions to be replaced;
acquiring the pixel values of the positions in the second picture corresponding to the positions to be replaced;
and replacing the pixel values at the positions to be replaced in the first picture with the acquired pixel values to obtain a processed picture.
2. The picture processing method according to claim 1, wherein the detecting whether a light source exists in the picture of the preview interface comprises:
detecting whether the picture of the preview interface contains an area whose gray-value drop is greater than or equal to a preset drop threshold, wherein the gray-value drop is determined from the difference between the gray values of adjacent areas;
if such an area exists, determining that a light source exists in the picture of the preview interface; otherwise, determining that no light source exists in the picture of the preview interface.
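A minimal NumPy sketch of the adjacent-area gray-value drop test in claim 2 above; the block size and drop threshold are assumed values, not taken from the patent:

    import numpy as np

    def has_light_source(gray, block=16, drop_threshold=120):
        # Mean gray value of each block-sized area of the preview picture.
        h, w = gray.shape
        means = gray[:h - h % block, :w - w % block].reshape(
            h // block, block, w // block, block).mean(axis=(1, 3))
        # Gray-value drop between horizontally and vertically adjacent areas.
        dx = np.abs(np.diff(means, axis=1))
        dy = np.abs(np.diff(means, axis=0))
        return bool((dx >= drop_threshold).any() or (dy >= drop_threshold).any())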
3. The method according to claim 1, wherein, if a light source is detected in the picture of the preview interface, the shooting a first picture using a first exposure parameter and shooting a second picture using a non-first exposure parameter comprises:
if a light source exists in the picture of the preview interface, analyzing, in combination with the first exposure parameter, whether an overexposed area exists in the picture of the preview interface; if an overexposed area exists, shooting the first picture using the first exposure parameter and shooting the second picture using the non-first exposure parameter.
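One possible reading of the overexposure check in claim 3, as a sketch: under the first exposure parameter, count near-saturated pixels in the preview picture. The gray level of 250 and the 2% area ratio are illustrative assumptions:

    import numpy as np

    def has_overexposed_area(preview_gray, level=250, ratio=0.02):
        # Overexposed if enough pixels sit at (or near) full scale.
        return (preview_gray >= level).mean() >= ratio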
4. The method according to claim 3, wherein, if light sources are detected in the picture of the preview interface and the number of light sources is greater than 1, the shooting a first picture using a first exposure parameter and shooting a second picture using a non-first exposure parameter comprises:
shooting a first picture using the first exposure parameter, and shooting at least one second picture using at least one non-first exposure parameter respectively, wherein each light source corresponds to one non-first exposure parameter;
correspondingly, the acquiring the pixel values of the positions in the second picture corresponding to the positions to be replaced specifically comprises:
acquiring, from each of the at least one second picture, the pixel values of the positions corresponding to the positions to be replaced, wherein the positions corresponding to the positions to be replaced refer to the positions of the light source corresponding to that picture's non-first exposure parameter.
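A sketch of the per-light-source replacement in claim 4; the masks and low-exposure frames are assumed inputs, with second_frames[i] shot at the non-first exposure parameter chosen for light source i, and source_masks[i] a boolean array marking that source's position (and its diffraction spot) in the first picture:

    def merge_multi_source(first, second_frames, source_masks):
        processed = first.copy()
        for second, mask in zip(second_frames, source_masks):
            # Take each light source's region only from its own darker frame.
            processed[mask] = second[mask]
        return processed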
5. The picture processing method according to any one of claims 1 to 4, wherein the replacing the pixel values at the positions to be replaced in the first picture with the acquired pixel values to obtain a processed picture comprises:
replacing the pixel values at the positions to be replaced in the first picture with the acquired pixel values;
performing a preset processing action on the picture after pixel-value replacement to obtain the processed picture, wherein the preset processing action comprises: smoothing filtering and/or noise reduction.
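A sketch of the preset processing action in claim 5 using OpenCV; the kernel size and denoising strength are assumed values:

    import cv2

    def postprocess(replaced_bgr):
        smoothed = cv2.GaussianBlur(replaced_bgr, (5, 5), 0)  # smoothing filtering
        # Non-local-means denoising as the noise-reduction step.
        return cv2.fastNlMeansDenoisingColored(smoothed, None, 3, 3)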
6. The picture processing method according to any one of claims 1 to 4, wherein the first exposure parameter is determined by:
adjusting the exposure parameter of the terminal device, and analyzing, in combination with the adjusted exposure parameter, whether the quality of the picture in the preview interface meets requirements; if so, taking the adjusted exposure parameter as the first exposure parameter; if not, returning to the step of adjusting the exposure parameter of the terminal device, wherein the picture quality comprises at least one of the following: brightness and noise.
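A sketch of the adjust-and-check loop in claim 6; propose_exposure() and preview_quality_ok() are hypothetical hooks for the device's exposure adjustment and its brightness/noise check, and the round limit is an added safeguard not stated in the claim:

    def determine_first_exposure(propose_exposure, preview_quality_ok,
                                 max_rounds=10):
        exposure = propose_exposure(None)           # initial adjustment
        for _ in range(max_rounds):
            if preview_quality_ok(exposure):        # brightness/noise acceptable?
                return exposure                     # use as first exposure parameter
            exposure = propose_exposure(exposure)   # adjust and check again
        return exposure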
7. The picture processing method according to claim 6, wherein the non-first exposure parameter is determined by:
reducing the first photosensitivity value and the first exposure duration in the first exposure parameter to obtain an intermediate photosensitivity value and an intermediate exposure duration, and increasing the first aperture value in the first exposure parameter to obtain an intermediate aperture value;
analyzing, in combination with the intermediate photosensitivity value, the intermediate exposure duration, and the intermediate aperture value, whether the picture in the preview interface is free of overexposure; if so, taking the intermediate photosensitivity value, the intermediate exposure duration, and the intermediate aperture value as the non-first exposure parameter.
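A sketch of the derivation in claim 7; the scaling factors, the retry loop, and the preview_overexposed() check are assumptions (raising the aperture value, i.e. the f-number, admits less light, consistent with the claim's aim of lowering exposure):

    def determine_non_first_exposure(first_iso, first_duration, first_f_number,
                                     preview_overexposed, max_rounds=10):
        iso, duration = 0.5 * first_iso, 0.5 * first_duration  # intermediate values
        f_number = 2.0 * first_f_number                        # intermediate aperture value
        for _ in range(max_rounds):
            if not preview_overexposed(iso, duration, f_number):
                break                                          # no overexposure: accept
            iso, duration = 0.8 * iso, 0.8 * duration          # keep reducing exposure
        return iso, duration, f_number                         # non-first exposure parameter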
8. A picture processing apparatus, applied to a terminal device, the picture processing apparatus comprising:
a light source detection unit, configured to detect, if the terminal device enters a preview interface, whether a light source exists in the picture of the preview interface;
a picture shooting unit, configured to shoot, if a light source exists in the picture of the preview interface, a first picture using a first exposure parameter and a second picture using a non-first exposure parameter, wherein the exposure degree of the second picture is lower than that of the first picture;
a to-be-replaced-position detection unit, configured to detect the positions of the light source and the diffraction light spot in the first picture to obtain the positions to be replaced;
a pixel value acquisition unit, configured to acquire the pixel values of the positions in the second picture corresponding to the positions to be replaced;
and a processed picture acquisition unit, configured to replace the pixel values at the positions to be replaced in the first picture with the acquired pixel values to obtain a processed picture.
9. A terminal device comprising a camera, a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the method according to any one of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 7.
CN201911200429.4A 2019-11-29 2019-11-29 Picture processing method and device, terminal equipment and computer readable storage medium Pending CN110971822A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911200429.4A CN110971822A (en) 2019-11-29 2019-11-29 Picture processing method and device, terminal equipment and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN110971822A (en) 2020-04-07

Family

ID=70032121

Country Status (1)

Country Link
CN (1) CN110971822A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111741214A (en) * 2020-05-13 2020-10-02 北京迈格威科技有限公司 Image processing method and device and electronic equipment
CN115334228A (en) * 2021-04-26 2022-11-11 华为技术有限公司 Video processing method and related device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103259976A (en) * 2012-02-17 2013-08-21 佳能株式会社 Image processing apparatus, image pickup apparatus, image processing method
CN104735347A (en) * 2013-12-24 2015-06-24 三星泰科威株式会社 Autofocus adjusting method and apparatus
US20150189139A1 (en) * 2013-12-26 2015-07-02 Seiko Epson Corporation Camera and image processing method
CN107770438A (en) * 2017-09-27 2018-03-06 维沃移动通信有限公司 A kind of photographic method and mobile terminal
CN108234880A (en) * 2018-02-02 2018-06-29 成都西纬科技有限公司 A kind of image enchancing method and device

Similar Documents

Publication Publication Date Title
US11330194B2 (en) Photographing using night shot mode processing and user interface
CN108093134B (en) Anti-interference method of electronic equipment and related product
CN107038681B (en) Image blurring method and device, computer readable storage medium and computer device
CN110636375B (en) Video stream processing method and device, terminal equipment and computer readable storage medium
CN108494974B (en) Display brightness adjusting method, mobile terminal and storage medium
CN110852951B (en) Image processing method, device, terminal equipment and computer readable storage medium
CN106993136B (en) Mobile terminal and multi-camera-based image noise reduction method and device thereof
CN108200421B (en) White balance processing method, terminal and computer readable storage medium
CN111294625B (en) Method, device, terminal equipment and storage medium for combining equipment service capability
CN106851119B (en) Picture generation method and equipment and mobile terminal
CN110851350A (en) Method and device for monitoring white screen of web page interface
CN107330867B (en) Image synthesis method, image synthesis device, computer-readable storage medium and computer equipment
WO2022267506A1 (en) Image fusion method, electronic device, storage medium, and computer program product
CN110363702B (en) Image processing method and related product
CN110971822A (en) Picture processing method and device, terminal equipment and computer readable storage medium
CN109639981B (en) Image shooting method and mobile terminal
CN110933293A (en) Shooting method, terminal and computer readable storage medium
CN107292833B (en) Image processing method and device and mobile terminal
CN111028192B (en) Image synthesis method and electronic equipment
CN110536067B (en) Image processing method, image processing device, terminal equipment and computer readable storage medium
CN110660032A (en) Object shielding method, object shielding device and electronic equipment
CN107835336B (en) Dual-camera frame synchronization method and device, user terminal and storage medium
CN107613284B (en) A kind of image processing method, terminal and computer readable storage medium
CN109729280A (en) A kind of image processing method and mobile terminal
CN109242768B (en) Image processing method and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200407