CN113518171A - Image processing method, device, terminal equipment and medium - Google Patents

Image processing method, device, terminal equipment and medium

Info

Publication number
CN113518171A
Authority
CN
China
Prior art keywords
image
glare
area
target image
aperture
Prior art date
Legal status
Granted
Application number
CN202010225502.XA
Other languages
Chinese (zh)
Other versions
CN113518171B (en)
Inventor
武小宇
王津男
秦超
张运超
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN202010225502.XA
Publication of CN113518171A
Application granted
Publication of CN113518171B
Legal status: Active

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/95 Computational photography systems, e.g. light-field imaging systems
    • H04N 23/951 Computational photography systems, e.g. light-field imaging systems, by using two or more images to influence resolution, frame rate or aspect ratio
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/67 Focus control based on electronic image sensor signals
    • H04N 23/70 Circuitry for compensating brightness variation in the scene
    • H04N 23/741 Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • H04N 23/80 Camera processing pipelines; Components thereof
    • H04N 23/81 Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

The application is applicable to the technical field of image processing, and provides an image processing method, an image processing device, terminal equipment and a medium, wherein the method comprises the following steps: controlling a first camera of a terminal device to acquire a first image according to a first parameter, and controlling a second camera of the terminal device to acquire a second image according to a second parameter, wherein the first parameter comprises a first focal length and a first aperture, the second parameter comprises a second focal length and a second aperture, the first focal length is greater than the second focal length, and the first aperture is smaller than the second aperture; identifying a first glare area in the first image and a second glare area in the second image; respectively removing the first glare area and the second glare area to obtain a first target image and a second target image; and generating a non-glare image according to the first target image and the second target image. By adopting the method, the pixel information loss caused by the glare can be effectively compensated, and the purpose of removing the glare is achieved.

Description

Image processing method, device, terminal equipment and medium
Technical Field
The present application belongs to the field of image processing technologies, and in particular, to an image processing method, an image processing apparatus, a terminal device, and a medium.
Background
Glare (dazzle) refers to a visual condition in which an unfavorable luminance distribution, or an extreme luminance contrast in space or time, exists in the field of view, causing visual discomfort and reducing the visibility of objects. In general, glare is formed when a high-intensity light source causes multiple reflecting surfaces to interfere with each other; the essence of the glare problem is the superposition of light rays caused by their refraction within the lens.
At present, when a terminal such as a mobile phone is used to take a picture, glare is a major difficulty affecting imaging quality. For mobile phone manufacturers pursuing a higher screen-to-body ratio, an attractive solution is to hide the front camera of the mobile phone below the screen for imaging, thereby reducing the visual impact of the front camera. Compared with an ordinary mobile phone, such an under-screen imaging scheme is more prone to glare because of refraction by the screen.
In order to reduce the influence of glare on imaging quality, in the prior art a single camera can shoot multiple times to obtain multiple images with different glare forms, which are then fused to remove the glare. Although the glare form changes between shots, glare always remains near the light source, information is easily lost, and the recovery effect is not ideal; moreover, because multiple shots are required, fusing the multiple captured images is time-consuming. Some mobile phone cameras, covered with a micro-lens array, can obtain multiple images with different glare forms in a single shot and then fuse them to remove the glare, which reduces the time consumed by fusion. However, the aperture and focal length of a micro-lens-array camera are generally fixed, the glare forms in the captured images are relatively similar, and the missing information cannot be filled in satisfactorily by fusion.
Disclosure of Invention
The embodiment of the application provides an image processing method, an image processing device, terminal equipment and a medium, and can solve the problem that in the prior art, camera imaging quality of terminal equipment such as a mobile phone is poor due to glare.
In a first aspect, an embodiment of the present application provides an image processing method, which is applied to a terminal device, and the method includes:
controlling a first camera of the terminal device to acquire a first image according to a first parameter, and controlling a second camera of the terminal device to acquire a second image according to a second parameter, wherein the first parameter comprises a first focal length and a first aperture, the second parameter comprises a second focal length and a second aperture, the first focal length is greater than the second focal length, and the first aperture is smaller than the second aperture; identifying a first glare area in the first image and a second glare area in the second image; removing the first glare area and the second glare area respectively to obtain a first target image and a second target image; and generating a non-glare image according to the first target image and the second target image.
In a possible implementation manner of the first aspect, the first parameter further includes a first exposure time and the second parameter further includes a second exposure time, the first exposure time being shorter than the second exposure time; controlling the exposure time reduces the area of the glare region and preserves more pixel information near the glare.
In a possible implementation manner of the first aspect, the first image and the second image may be detected by using a preset detection model, and the detection model may be obtained by training a plurality of training images marked with glare areas in a supervised learning manner; by receiving the detection result output by the detection model, a first glare area in the first image and a second glare area in the second image can be detected.
In a possible implementation manner of the first aspect, since the first image and the second image respectively include a plurality of pixel points, and each pixel point has a corresponding luminance value, the pixel points in the first image and the second image may be clustered according to the luminance values, so as to obtain a plurality of first image region classes of the first image and a plurality of second image region classes of the second image; a first glare area in the first image and a second glare area in the second image are then identified based on the first image area class and the second image area class.
In a possible implementation manner of the first aspect, each first image area class and each second image area class respectively have a corresponding cluster center, and when a first glare area in the first image and a second glare area in the second image are identified according to the first image area class and the second image area class, an area corresponding to an image area class of which the cluster center has a brightness value greater than a preset threshold value in the first image area class can be identified as the first glare area; and identifying the area corresponding to the image area class of which the brightness value of the clustering center is greater than the preset threshold value in the second image area class as a second glare area.
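Expressed in notation introduced here purely for illustration (it does not appear in the embodiment itself), with C_k denoting the k-th image region class, c_k its cluster center, L(.) the brightness value of a cluster center and tau the preset threshold, the identified glare areas are:
\[
R_{\mathrm{glare}}^{(1)} \;=\; \bigcup_{k:\, L(c_k^{(1)}) > \tau} C_k^{(1)},
\qquad
R_{\mathrm{glare}}^{(2)} \;=\; \bigcup_{k:\, L(c_k^{(2)}) > \tau} C_k^{(2)}.
\]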
In a possible implementation manner of the first aspect, the first glare area and the second glare area are removed by setting the pixel values of the pixel points in the first glare area and in the second glare area to zero respectively, so as to obtain the first target image and the second target image.
In a possible implementation manner of the first aspect, when registering and fusing the first target image and the second target image, a first corresponding relationship between pixel points of the first target image and pixel points of the second target image may be determined first; and then respectively fusing all pixel points with the first corresponding relation in the first target image and the second target image to generate a non-glare image.
In a possible implementation manner of the first aspect, after the first target image and the second target image are registered and fused, the non-glare image may be repaired to obtain the non-glare target image, so as to further improve the image imaging quality.
In a possible implementation manner of the first aspect, the repairing of the non-glare image may be performed based on the non-glare image and the first target image, and similar to the registering and fusing of the first target image and the second target image, the second corresponding relationship between each pixel point of the first target image and each pixel point of the non-glare image may be determined first; and then respectively fusing all the pixel points with the second corresponding relation in the first target image and the non-glare image to generate the non-glare target image.
In a second aspect, an embodiment of the present application provides an image processing apparatus, which is applied to a terminal device, and the apparatus includes:
a first image acquisition module for controlling a first camera of the terminal device to acquire a first image with a first parameter,
the second image acquisition module is used for controlling a second camera of the terminal equipment to acquire a second image according to second parameters, wherein the first parameters comprise a first focal length and a first aperture, the second parameters comprise a second focal length and a second aperture, the first focal length is greater than the second focal length, and the first aperture is smaller than the second aperture;
a glare area identification module for identifying a first glare area in the first image and a second glare area in the second image;
the glare area processing module is used for respectively removing the first glare area and the second glare area to obtain a first target image and a second target image;
and the non-glare image generating module is used for generating a non-glare image according to the first target image and the second target image.
In a third aspect, an embodiment of the present application provides a terminal device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the image processing method according to any one of the above first aspects when executing the computer program.
In a fourth aspect, the present application provides a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a processor of a terminal device, the computer program implements the image processing method according to any one of the first aspect.
In a fifth aspect, the present application provides a computer program product, which when run on a terminal device, causes the terminal device to execute the image processing method according to any one of the above first aspects.
Compared with the prior art, the embodiment of the application has the following beneficial effects:
according to the embodiment of the application, the at least two cameras are adopted to shoot respectively, so that the first image and the second image which contain different glare influence areas can be obtained. After detecting and dividing the glare area in the first image and the second image, the divided glare area can be removed from the images, and a first target image and a second target image which do not include the glare area are obtained. On the basis, the first target image and the second target image are registered and fused, so that the non-glare image with obvious texture can be output. According to the embodiment of the application, the pixel information loss caused by the glare is compensated through at least two images, so that the purpose of effectively removing the glare is achieved, the photo imaging quality of terminal equipment such as a mobile phone is improved, and the photographing level of the terminal is improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings used in the embodiments or the description of the prior art will be briefly described below. It is obvious that the drawings in the following description are only some embodiments of the application, and that for a person skilled in the art, other drawings can be derived from them without inventive effort.
Fig. 1 is a schematic overall flow chart of an image processing method according to an embodiment of the present application;
fig. 2 is a schematic diagram of a system architecture to which an image processing method according to an embodiment of the present disclosure is applied;
fig. 3 is a schematic diagram of a hardware structure of a mobile phone to which an image processing method according to an embodiment of the present application is applied;
fig. 4 is a schematic diagram of a software structure of a mobile phone to which an image processing method according to an embodiment of the present application is applied;
FIG. 5 is a flowchart illustrating exemplary steps of an image processing method according to an embodiment of the present application;
FIG. 6 is a flow chart of exemplary steps of an image processing method provided by another embodiment of the present application;
FIG. 7 is an exemplary effect diagram of the de-glare process performed according to the image processing method shown in FIG. 6;
FIG. 8 is a flow chart of illustrative steps of a method of image processing provided by yet another embodiment of the present application;
FIG. 9 is an exemplary effect diagram of the de-glare process performed according to the image processing method shown in FIG. 8;
fig. 10 is a block diagram of an image processing apparatus according to an embodiment of the present application;
fig. 11 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. However, it will be apparent to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
Generally, light rays reflect along different paths under different apertures or focal lengths, so the glare they produce takes different forms; in addition, lenses at different positions cause the glare to affect different areas. By segmenting the glare areas in different images and then registering and fusing images taken with different apertures, exposure times and lenses, the loss of pixel information caused by glare can be compensated and the purpose of glare removal achieved.
Fig. 1 is a schematic overall flow chart of an image processing method according to an embodiment of the present application. According to the flow shown in fig. 1, at least two cameras can be used to capture images simultaneously. For example, some mobile phones are equipped with a main camera and a wide-angle camera. When shooting, the main camera and the wide-angle camera can shoot at the same time with different shooting parameters, yielding two different images: the main camera may shoot with a long focal length and a small aperture to obtain a first image, and the wide-angle camera may shoot with a short focal length and a large aperture to obtain a second image. Then, the glare areas in the two images are detected and segmented. For detection, a supervised deep learning network model, such as a Convolutional Neural Network (CNN), can be adopted to identify the glare areas in the first image and the second image; alternatively, the pixel points in the first image and the second image may be clustered and the glare areas of the two images identified from the clusters. The detected glare areas are removed to obtain a first target image and a second target image with the glare removed. After the first target image and the second target image are registered and fused, an image without a glare area is obtained. After repair, this image is output as a non-glare image with clear texture.
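As a minimal sketch only, the overall flow of fig. 1 can be outlined in Python as follows, assuming NumPy images; the helper functions (detect_glare_region, remove_glare, register_and_fuse, repair_with_reference) are illustrative names defined in later examples and are not part of the patented method.

import numpy as np

def deglare_pipeline(first_image: np.ndarray, second_image: np.ndarray) -> np.ndarray:
    """first_image: long-focus/small-aperture shot; second_image: short-focus/large-aperture shot."""
    # 1. Detect and segment the glare area in each image (CNN model or clustering).
    first_glare_mask = detect_glare_region(first_image)
    second_glare_mask = detect_glare_region(second_image)

    # 2. Remove the glare areas to obtain the first and second target images.
    first_target = remove_glare(first_image, first_glare_mask)
    second_target = remove_glare(second_image, second_glare_mask)

    # 3. Register and fuse the two target images into a non-glare image.
    non_glare = register_and_fuse(first_target, second_target)

    # 4. Optionally repair the non-glare image with the first target image.
    return repair_with_reference(non_glare, first_target)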
Fig. 2 is a schematic diagram of a system architecture to which an image processing method according to an embodiment of the present application is applied. In the architecture shown in fig. 2, the cameras are disposed below the screen of the terminal device and include a main camera and a wide-angle camera. The main camera is configured with a long focal length and a small aperture, and the wide-angle camera with a short focal length and a large aperture, each within an aperture range of F1-N whose value of N can be determined according to actual needs. When shooting, glare is generated by refraction at the screen, so the first image and the second image each contain a glare area of a certain extent. To remove the glare, the glare areas in the first image and the second image are detected and segmented respectively, the glare areas are removed from the respective images, and the two glare-removed images are then registered and fused to output a non-glare image.
The image processing method of the present application will be described below with reference to specific embodiments.
The terminology used in the following examples is for the purpose of describing particular embodiments only and is not intended to limit the application. As used in the specification of this application and the appended claims, the singular forms "a", "an" and "the" are intended to include the plural forms as well, such as "one or more", unless the context clearly indicates otherwise. It should also be understood that in the embodiments of the present application, "one or more" means one, two, or more than two; "and/or" describes the association relationship of the associated objects and indicates that three relationships may exist; for example, A and/or B may represent: A alone, both A and B, and B alone, where A and B may be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
The image processing method provided by the embodiment of the application can be applied to terminal devices such as a mobile phone, a tablet personal computer, a wearable device, a vehicle-mounted device, an Augmented Reality (AR)/Virtual Reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a Personal Digital Assistant (PDA), and the like, and the embodiment of the application does not limit the specific type of the terminal device at all.
Take the terminal device as a mobile phone as an example. Fig. 3 is a block diagram illustrating a partial structure of a mobile phone according to an embodiment of the present disclosure. Referring to fig. 3, the cellular phone includes: radio Frequency (RF) circuit 310, memory 320, input unit 330, display unit 340, sensor 350, audio circuit 360, wireless fidelity (Wi-Fi) module 370, processor 380, and power supply 390. Those skilled in the art will appreciate that the handset configuration shown in fig. 3 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile phone in detail with reference to fig. 3:
the RF circuit 310 may be used for receiving and transmitting signals during information transmission and reception or during a call; in particular, it receives downlink information from a base station and passes it to the processor 380 for processing, and transmits uplink data to the base station. Typically, the RF circuitry includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the RF circuit 310 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), e-mail, Short Messaging Service (SMS), and the like.
The memory 320 may be used to store software programs and modules, and the processor 380 executes various functional applications and data processing of the mobile phone by running the software programs and modules stored in the memory 320. The memory 320 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like, and the data storage area may store data created according to the use of the mobile phone (such as audio data, a phonebook, etc.), and the like. Further, the memory 320 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
The input unit 330 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile phone 300. Specifically, the input unit 330 may include a touch panel 331 and other input devices 332. The touch panel 331, also referred to as a touch screen, can collect touch operations of a user on or near it (for example, operations performed on or near the touch panel 331 with a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connection device according to a preset program. Optionally, the touch panel 331 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 380, and can receive and execute commands sent by the processor 380. In addition, the touch panel 331 may be implemented as a resistive, capacitive, infrared, or surface acoustic wave type. The input unit 330 may include other input devices 332 in addition to the touch panel 331. In particular, other input devices 332 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 340 may be used to display information input by the user or information provided to the user and various menus of the mobile phone. The display unit 340 may include a display panel 341; optionally, the display panel 341 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. Further, the touch panel 331 can cover the display panel 341; when the touch panel 331 detects a touch operation on or near it, the touch operation is transmitted to the processor 380 to determine the type of the touch event, and the processor 380 then provides a corresponding visual output on the display panel 341 according to the type of the touch event. Although in fig. 3 the touch panel 331 and the display panel 341 are two independent components implementing the input and output functions of the mobile phone, in some embodiments the touch panel 331 and the display panel 341 may be integrated to implement the input and output functions of the mobile phone.
The handset 300 may also include at least one sensor 350, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor that adjusts the brightness of the display panel 341 according to the brightness of ambient light, and a proximity sensor that turns off the display panel 341 and/or the backlight when the mobile phone is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
The audio circuit 360, speaker 361 and microphone 362 may provide an audio interface between the user and the mobile phone. The audio circuit 360 may transmit the electrical signal converted from the received audio data to the speaker 361, which converts it into a sound signal for output; conversely, the microphone 362 converts the collected sound signal into an electrical signal, which the audio circuit 360 receives and converts into audio data; the audio data is then processed by the processor 380 and either transmitted via the RF circuit 310 to, for example, another mobile phone, or output to the memory 320 for further processing.
Wi-Fi belongs to short-distance wireless transmission technology, and the mobile phone can help a user to receive and send e-mails, browse webpages, access streaming media and the like through the Wi-Fi module 370, and provides wireless broadband internet access for the user. Although fig. 3 shows the Wi-Fi module 370, it is understood that it does not belong to the essential constitution of the handset 300, and may be omitted entirely as needed within the scope not changing the essence of the invention.
The processor 380 is a control center of the mobile phone, connects various parts of the whole mobile phone by using various interfaces and lines, and performs various functions of the mobile phone and processes data by operating or executing software programs and/or modules stored in the memory 320 and calling data stored in the memory 320, thereby performing overall monitoring of the mobile phone. Optionally, processor 380 may include one or more processing units; preferably, the processor 380 may integrate an application processor, which primarily handles operating systems, user interfaces, applications, etc., and a modem processor, which primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 380.
The handset 300 also includes a camera 390. Optionally, the position of the camera on the mobile phone 300 may be front-located or rear-located, which is not limited in this embodiment of the application.
Optionally, the mobile phone 300 may include two cameras or three cameras, and the like, which is not limited in this embodiment.
For example, the cell phone 300 may include three cameras, one being a main camera, one being a wide camera, and one being a tele camera.
Optionally, when the mobile phone 300 includes a plurality of cameras, the plurality of cameras may be all front-mounted, all rear-mounted, or a part of the cameras front-mounted and another part of the cameras rear-mounted, which is not limited in this embodiment of the present application.
Although not shown, the handset 300 may also include a power supply (e.g., a battery) for powering the various components, and preferably, the power supply may be logically connected to the processor 380 via a power management system, such that the functions of managing charging, discharging, and power consumption are performed via the power management system.
In addition, although not shown, the mobile phone 300 may further include a bluetooth module, etc., which will not be described herein.
Fig. 4 is a schematic diagram of a software structure of a mobile phone 300 according to an embodiment of the present application. Taking the operating system of the mobile phone 300 as an Android system as an example, in some embodiments, the Android system is divided into four layers, which are an application layer, an application Framework (FWK) layer, a system layer and a hardware abstraction layer, and the layers communicate with each other through a software interface.
As shown in fig. 4, the application layer may include a series of application packages, which may include short message, calendar, camera, video, navigation, gallery, call, and other applications.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer may include some predefined functions, such as functions for receiving events sent by the application framework layer.
As shown in fig. 4, the application framework layer may include a window manager, a resource manager, and a notification manager, among others.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like. The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables the application to display notification information in the status bar, can be used to convey notification-type messages, can disappear automatically after a short dwell, and does not require user interaction. Such as a notification manager used to inform download completion, message alerts, etc. The notification manager may also be a notification that appears in the form of a chart or scroll bar text at the top status bar of the system, such as a notification of a background running application, or a notification that appears on the screen in the form of a dialog window. For example, prompting text information in the status bar, sounding a prompt tone, vibrating the electronic device, flashing an indicator light, etc.
The application framework layer may further include:
a viewing system that includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide the communication functions of the handset 300. Such as management of call status (including on, off, etc.).
The system layer may include a plurality of functional modules. For example: a sensor service module, a physical state identification module, a three-dimensional graphics processing library (such as OpenGL ES), and the like.
The sensor service module is used for monitoring sensor data uploaded by various sensors in a hardware layer and determining the physical state of the mobile phone 300;
the physical state recognition module is used for analyzing and recognizing user gestures, human faces and the like;
the three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The system layer may further include:
the surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, and the like.
The hardware abstraction layer is a layer between hardware and software. The hardware abstraction layer may include a display driver, a camera driver, a sensor driver, etc. for driving the relevant hardware of the hardware layer, such as a display screen, a camera, a sensor, etc.
The following embodiments may be implemented on the cellular phone 300 having the above-described hardware structure/software structure. The following embodiment will take the mobile phone 300 as an example to explain the image processing method provided in the embodiment of the present application.
Referring to fig. 5, a schematic step flow chart of an image processing method provided in an embodiment of the present application is shown, and by way of example and not limitation, the method may be applied to the mobile phone 300, and the method may specifically include the following steps:
s501, controlling a first camera of a terminal device to acquire a first image according to a first parameter, and controlling a second camera of the terminal device to acquire a second image according to a second parameter, wherein the first parameter comprises a first focal length and a first aperture, the second parameter comprises a second focal length and a second aperture, the first focal length is greater than the second focal length, and the first aperture is smaller than the second aperture;
it should be noted that the method can be applied to a terminal device having at least two cameras, such as a mobile phone having two cameras or three cameras.
In the embodiment of the present application, the first camera and the second camera should be cameras located on the same side of the mobile phone. For example, the first camera and the second camera are both front-facing cameras of the mobile phone, or the first camera and the second camera are both rear-facing cameras of the mobile phone, which is not limited in this embodiment.
Generally, when a user uses a mobile phone to shoot, the shooting parameters can be manually adjusted, or the mobile phone uses an automatic mode to shoot by adopting default parameters.
In the embodiment of the application, the mobile phone can use the first camera to shoot according to the first parameter to obtain a first image; and simultaneously shooting by using a second camera according to a second parameter to obtain a second image.
It should be noted that the shooting actions of the first camera and the second camera can be triggered simultaneously by one instruction of the user. For example, the user may simultaneously instruct the first camera and the second camera to take a shot by clicking a virtual shooting control in the screen.
In the embodiment of the present application, the first camera may be a main camera, and the second camera may be a wide-angle camera. Thus, the first parameter used when the first camera takes a shot may be a longer focal length, smaller aperture, and the second parameter used when the second camera takes a shot may be a shorter focal length, larger aperture. That is, the first image captured by the main camera may be a first image captured with a long-focus small aperture, and the second image captured by the wide camera may be a second image captured with a short-focus large aperture.
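As a rough illustration only (the embodiment does not prescribe any programming interface), the Python/OpenCV sketch below shows two cameras being configured with different focus and aperture settings and read together. On a phone the vendor camera API (for example Android Camera2) would be used instead; whether CAP_PROP_FOCUS and CAP_PROP_IRIS are honoured depends entirely on the camera and backend, and every numeric value is a placeholder with device-specific semantics.

import cv2

def capture_pair(main_index: int = 0, wide_index: int = 1):
    # "First camera" (main): long focus, small aperture.
    # "Second camera" (wide-angle): short focus, large aperture.
    main_cam = cv2.VideoCapture(main_index)
    wide_cam = cv2.VideoCapture(wide_index)

    for cam in (main_cam, wide_cam):
        cam.set(cv2.CAP_PROP_AUTOFOCUS, 0)   # take manual control where supported

    main_cam.set(cv2.CAP_PROP_FOCUS, 200)    # first focal length > second focal length
    wide_cam.set(cv2.CAP_PROP_FOCUS, 50)     # (placeholder values)
    main_cam.set(cv2.CAP_PROP_IRIS, 2)       # first aperture < second aperture
    wide_cam.set(cv2.CAP_PROP_IRIS, 8)       # (iris semantics are backend-specific)

    ok1, first_image = main_cam.read()       # one user instruction triggers both reads
    ok2, second_image = wide_cam.read()
    main_cam.release()
    wide_cam.release()
    if not (ok1 and ok2):
        raise RuntimeError("capture failed")
    return first_image, second_image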
S502, identifying a first glare area in the first image and a second glare area in the second image;
because of the existence of the light source, glare regions can be included in the images obtained by the first camera and the second camera. The presence of glare regions tends to cause visual discomfort and reduced object visibility, affecting imaging quality.
In order to reduce the influence of the glare area on the imaging quality, the embodiment of the application can perform glare removing processing on the shot image. Therefore, it is necessary to first detect and recognize a first glare area in the first image and a second glare area in the second image.
In a specific implementation, the glare region in an image can be identified based on a deep learning network model such as CNN, which is trained based on a supervised learning process. Alternatively, the glare area may be identified by an unsupervised clustering algorithm, which is not limited in this embodiment.
S503, removing the first glare area and the second glare area respectively to obtain a first target image and a second target image;
the detected glare area can be divided from other areas, and the divided glare area is removed to obtain a first target image and a second target image. The first target image is an image obtained by removing a first glare area in the first image, and the second target image is an image obtained by removing a second glare area in the second image.
S504, generating a non-glare image according to the first target image and the second target image.
Because the light reflection paths of different apertures and focal lengths are different, the forms of the generated glare are different; different positions of the lens can cause different influence areas of the glare. After removing the glare areas in the first image and the second image, the obtained first target image and the second target image may be registered and fused, so as to output a glare-free image.
In the embodiment of the application, the first image and the second image containing different glare influence areas can be obtained by respectively shooting through at least two cameras. After detecting and dividing the glare area in the first image and the second image, the divided glare area can be removed from the images, and a first target image and a second target image which do not include the glare area are obtained. On the basis, the first target image and the second target image are registered and fused, so that the non-glare image with obvious texture can be output. According to the embodiment of the application, the pixel information loss caused by the glare is compensated through at least two images, so that the purpose of effectively removing the glare is achieved, the photo imaging quality of terminal equipment such as a mobile phone is improved, and the photographing level of the terminal is improved.
Referring to fig. 6, a flowchart illustrating schematic steps of an image processing method according to another embodiment of the present application is shown, where the method may specifically include the following steps:
s601, controlling a first camera of a terminal device to acquire a first image according to a first parameter, and controlling a second camera of the terminal device to acquire a second image according to a second parameter, wherein the first parameter comprises a first focal length and a first aperture, the second parameter comprises a second focal length and a second aperture, the first focal length is greater than the second focal length, and the first aperture is smaller than the second aperture;
take the terminal device as the mobile phone 300 as an example. When a user uses the mobile phone to shoot, different cameras can be respectively configured to different parameters, so that the maximum effect of the multiple cameras can be obtained.
For example, a first camera may capture according to a first parameter and a second camera may capture according to a second parameter. Wherein the first parameter and the second parameter may be different.
In the embodiment of the present application, the shooting parameters of the camera may include a focal length, an aperture, and the like manually set by a user or automatically set by a mobile phone. That is, the first parameter may include a first focal length and a first aperture, and the second parameter may include a second focal length and a second aperture.
In a specific implementation, in order to achieve the best photographing effect, the first focal length may be configured to be larger than the second focal length, and the first aperture may be configured to be smaller than the second aperture.
Thus, when the first camera is taking according to the first parameters, the first image obtained is a long focus small aperture image, and when the second camera is taking according to the second parameters, the second image obtained is a short focus large aperture image.
It should be noted that the specific sizes of the focal length and the aperture are relative concepts. That is, the focal length of one camera is compared with the focal length of the other camera, and the aperture of one camera is compared with the aperture of the other camera; the specific values of the focal length and the aperture are not limited in this embodiment.
S602, detecting the first image and the second image respectively by adopting a preset detection model, wherein the detection model is obtained by training a plurality of training images marked with glare areas;
in this embodiment of the application, the first image and the second image may be detected by using a detection model preset in the mobile phone, and a glare area in each image is identified. The detection model may be a CNN model obtained by training a plurality of training images marked with glare regions.
In specific implementation, a plurality of different images can be collected in advance, a glare area in the images is marked, and then model training is performed by adopting a deep learning algorithm, so that a detection model for identifying the glare area is obtained.
S603, receiving a detection result output by the detection model, wherein the detection result comprises identification information of a first glare area in the first image and a second glare area in the second image;
after the first image and the second image are detected, the detection model may output a corresponding detection result, where the result includes the identification information of the glare area in the image.
S604, setting the pixel value of each pixel point in the first glare area and the second glare area to be zero respectively to obtain a first target image and a second target image;
in the embodiment of the present application, after detecting the glare area in the first image and the second image, the area may be first segmented, and then the segmented image area may be removed.
The imaging effect of different areas in the image is realized based on the pixels of the pixel points in the area. Therefore, in a specific implementation, when the segmented glare area is removed, the pixel value of each pixel point in the area may be set to zero, so as to obtain the first target image and the second target image respectively.
S605, generating a non-glare image according to the first target image and the second target image.
After removing the glare areas in the first image and the second image, the obtained first target image and the second target image may be registered and fused, so as to output a glare-free image.
As shown in fig. 7, an example effect diagram of the glare removal processing performed according to the image processing method shown in fig. 6 is shown. After a first image is shot by a first camera, namely a main camera, and a second image is shot by a second camera, the first image and the second image can be respectively subjected to glare area detection and segmentation, so that a first target image and a second target image are obtained, and after the first target image and the second target image are subjected to registration and fusion, a glare-free image as shown in the figure can be output.
According to the method and the device, the detection model capable of detecting the glare area is trained in advance by adopting a supervised deep learning method, and after the images are shot by using at least two cameras, the model can be directly adopted to detect and partition the glare area, so that the image processing efficiency is improved; on the basis, the first target image and the second target image without the glare area are registered and fused, so that the non-glare image with obvious texture can be output, the purpose of effectively removing glare is achieved, and the photo imaging quality of terminal equipment such as a mobile phone is improved.
Referring to fig. 8, a flowchart illustrating schematic steps of an image processing method according to another embodiment of the present application is shown, where the method may specifically include the following steps:
s801, controlling a first camera of a terminal device to acquire a first image according to a first parameter, and controlling a second camera of the terminal device to acquire a second image according to a second parameter, wherein the first parameter comprises a first focal length, a first aperture and first exposure time, the second parameter comprises a second focal length, a second aperture and second exposure time, the first focal length is greater than the second focal length, the first aperture is smaller than the second aperture, and the first exposure time is shorter than the second exposure time;
in addition, on the basis of the foregoing two embodiments, this embodiment further pursues the purpose of glare removal by also controlling the exposure time of the cameras.
Therefore, in the embodiment of the present application, the parameters used when the camera shoots may include exposure time in addition to the focal length and the aperture.
In a specific implementation, the first exposure time of the first camera may be set to be less than the second exposure time of the second camera. Namely, the first camera adopts long focus, small aperture and short exposure time to shoot; and the second camera adopts short focus, large aperture and long exposure time to shoot. The area of a glare area is reduced by controlling the exposure time, and more pixel information near the glare is obtained.
It should be noted that, in this embodiment, the first camera and the second camera may be located below a screen of a terminal device such as a mobile phone.
S802, clustering the pixel points in the first image and the second image respectively according to the brightness values to obtain a plurality of first image area classes of the first image and a plurality of second image area classes of the second image;
because each image comprises a plurality of pixel points, and each pixel point has a corresponding brightness value, in the embodiment of the application, the glare area in the image can be identified in a clustering manner.
In specific implementation, the pixels in each image can be clustered according to the brightness value of each pixel to obtain a clustering result containing a plurality of image region classes.
S803, identifying a first glare area in the first image according to the first image area class; identifying a second glare area in the second image according to the second image area class;
in the embodiment of the application, after the first image and the second image are clustered, each image yields a clustering result containing a plurality of image area classes, and each image area class has a corresponding cluster center. That is, after clustering the pixel points in the first image, a plurality of first image region classes are obtained; after clustering the pixel points in the second image, a plurality of second image region classes are obtained.
Therefore, when detecting the glare area in each image according to the clustering result, it can be done by the brightness value of the cluster center of each image area class.
In a specific implementation, for a first image, a region corresponding to an image region class of which the brightness value of a clustering center is greater than a preset threshold in the first image region class may be identified as a first glare region; similarly, for the second image, the area corresponding to the image area class of which the brightness value of the cluster center in the second image area class is greater than the preset threshold may also be identified as the second glare area.
S804, setting the pixel value of each pixel point in the first glare area and the second glare area to be zero respectively, and obtaining a first target image and a second target image;
in this embodiment of the application, for the segmented glare area, the glare area may be removed by setting a pixel value of each pixel point in the area to zero, so as to obtain a first target image and a second target image.
S805, determining a first corresponding relation between pixel points of the first target image and the second target image; respectively fusing all pixel points with the first corresponding relation in the first target image and the second target image to generate a non-glare image;
in the embodiment of the present application, for at least two different glare-removed images, the non-glare image can be output by registering and fusing the images.
In a specific implementation, a corresponding relationship between the first target image and the second target image may be determined first, and then, each pixel point having the corresponding relationship is registered and fused to generate a non-glare image.
S806, repairing the non-glare image to obtain a non-glare target image.
In the embodiment of the application, in order to ensure that the glare-free image generated by fusion has better imaging quality, the glare-free image can be repaired on the basis, so that the possibility that part of pixel points have no pixel value due to operation when a glare area is removed is reduced.
In a specific implementation, the repairing of the non-glare area may be performed based on a first target image, that is, a long-focus small-aperture image, and the process is similar to the registering and fusing of the first target area and a second target area, and it is required to first determine a second corresponding relationship between each pixel point of the first target image and each pixel point of the non-glare image, and then fuse each pixel point having the second corresponding relationship in the first target image and each pixel point of the non-glare image, so as to generate the non-glare target image.
Of course, the second target image may also be used to repair the non-glare image according to actual needs, which is not limited in this embodiment.
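As an illustrative sketch of S806 only, and assuming the non-glare image is already in the coordinates of the first target image (as in the previous example), remaining empty pixels are filled from the first target image and any pixels still empty afterwards are inpainted; the inpainting fallback is an addition made here for completeness, not part of the embodiment.

import cv2
import numpy as np

def repair_with_reference(non_glare: np.ndarray, first_target: np.ndarray) -> np.ndarray:
    repaired = non_glare.copy()
    holes = np.all(repaired == 0, axis=2)     # pixel points left without a pixel value
    repaired[holes] = first_target[holes]     # second corresponding relation: reuse the long-focus image

    still_empty = np.all(repaired == 0, axis=2).astype(np.uint8)
    if still_empty.any():
        repaired = cv2.inpaint(repaired, still_empty, 3, cv2.INPAINT_TELEA)
    return repaired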
As shown in fig. 9, an example effect diagram of the glare removal processing performed according to the image processing method shown in fig. 8 is shown. After a first image is shot by a first camera, namely a main camera, and a second image is shot by a second camera, the first image and the second image are subjected to glare area detection, segmentation, registration and fusion, and then a glare-free image as shown in the figure can be output.
In the embodiment of the application, by clustering each pixel point, the glare area in the image can be directly identified according to the clustering result without training a detection model in advance; in addition, after the non-glare image is generated, the image can be repaired, and the imaging quality is favorably improved.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Fig. 10 shows a block diagram of an image processing apparatus according to an embodiment of the present application, which corresponds to the image processing method described in the above embodiment, and only shows portions related to the embodiment of the present application for convenience of description.
Referring to fig. 10, the apparatus may be applied to a terminal device, and specifically may include the following modules:
a first image collecting module 1001, configured to control a first camera of the terminal device to collect a first image according to a first parameter, and,
the second image acquisition module 1002 is configured to control a second camera of the terminal device to acquire a second image according to a second parameter, where the first parameter includes a first focal length and a first aperture, the second parameter includes a second focal length and a second aperture, the first focal length is greater than the second focal length, and the first aperture is smaller than the second aperture;
a glare area identification module 1003, configured to identify a first glare area in the first image and a second glare area in the second image;
a glare area processing module 1004, configured to remove the first glare area and the second glare area respectively to obtain a first target image and a second target image;
a non-glare image generating module 1005, configured to generate a non-glare image according to the first target image and the second target image.
In this embodiment, the first parameter may further include a first exposure time, and the second parameter may further include a second exposure time, where the first exposure time is less than the second exposure time.
In this embodiment, the glare area identification module 1003 may specifically include the following sub-modules:
the model detection submodule is used for respectively detecting the first image and the second image by adopting a preset detection model, and the detection model can be obtained by training on a plurality of training images marked with glare areas;
and the detection result receiving submodule is used for receiving the detection result output by the detection model, and the detection result comprises identification information of a first glare area in the first image and a second glare area in the second image.
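Purely as an illustration of this detection interface, the sketch below assumes the pre-trained detection model is available as a callable that returns a per-pixel glare probability map; the model call, the 0.5 cut-off and the output format are assumptions of the example, not a description of any specific detection network.

```python
import numpy as np

def detect_glare_areas(model, image):
    """Run a pre-trained glare detection model on one image and return
    identification information for the detected glare area.
    `model` is assumed to be a callable giving per-pixel glare probabilities."""
    prob = model(image)                      # assumed H x W probability map
    mask = prob > 0.5                        # binary glare mask (example cut-off)
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return {"mask": mask, "bbox": None}  # no glare area detected
    bbox = (xs.min(), ys.min(), xs.max(), ys.max())
    return {"mask": mask, "bbox": bbox}

# first_result = detect_glare_areas(model, first_image)
# second_result = detect_glare_areas(model, second_image)
```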
In this embodiment of the present application, the first image and the second image respectively include a plurality of pixel points, and each pixel point has a corresponding brightness value;
the glare area identification module 1003 may further include the following sub-modules:
the clustering submodule is used for respectively clustering all pixel points in the first image and the second image according to the brightness value to obtain a plurality of first image area classes of the first image and a plurality of second image area classes of the second image;
the first glare area identification submodule is used for identifying a first glare area in the first image according to the first image area class;
and the second glare area identification submodule is used for identifying a second glare area in the second image according to the second image area class.
In an embodiment of the application, each first image region class and each second image region class has a respective cluster center,
the first glare area identification sub-module may specifically include the following units:
a first glare area identification unit, configured to identify, as the first glare area, an area corresponding to an image area class, among the first image area classes, whose cluster center has a brightness value greater than a preset threshold;
the second glare area identification sub-module may specifically include the following units:
and a second glare area identification unit, configured to identify, as the second glare area, an area corresponding to an image area class, among the second image area classes, whose cluster center has a brightness value greater than the preset threshold.
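As a minimal sketch of this clustering route, the example below takes the Y channel as the brightness value and uses k-means as the clustering algorithm; the number of image area classes and the brightness threshold are illustrative values, not values prescribed by this application.

```python
import cv2
import numpy as np

def glare_mask_by_clustering(image, num_classes=4, brightness_threshold=200):
    """Cluster pixel points by brightness and flag every image area class
    whose cluster center exceeds the brightness threshold as a glare area."""
    # Brightness value of each pixel point (Y channel of YCrCb).
    brightness = cv2.cvtColor(image, cv2.COLOR_BGR2YCrCb)[:, :, 0]
    samples = brightness.reshape(-1, 1).astype(np.float32)

    # K-means clustering of the brightness values.
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
    _, labels, centers = cv2.kmeans(samples, num_classes, None,
                                    criteria, 3, cv2.KMEANS_PP_CENTERS)

    # Image area classes whose cluster center is brighter than the threshold
    # are identified as glare areas.
    glare_classes = np.flatnonzero(centers.ravel() > brightness_threshold)
    glare_mask = np.isin(labels.ravel(), glare_classes)
    return glare_mask.reshape(brightness.shape)
```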
In this embodiment, the glare area processing module 1004 may specifically include the following sub-modules:
and the pixel value processing submodule is used for setting the pixel value of each pixel point in the first glare area and the second glare area to be zero respectively to obtain a first target image and a second target image.
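For completeness, a small sketch of this zeroing operation is given below, assuming the glare area is provided as a boolean mask such as the one produced by the clustering sketch above.

```python
import numpy as np

def remove_glare_area(image, glare_mask):
    """Set the pixel value of every pixel point inside the glare area to
    zero, producing the corresponding target image."""
    target = image.copy()
    target[glare_mask] = 0
    return target

# first_target = remove_glare_area(first_image, first_glare_mask)
# second_target = remove_glare_area(second_image, second_glare_mask)
```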
In this embodiment of the application, the non-glare image generating module 1005 may specifically include the following sub-modules:
a first corresponding relation determining submodule, configured to determine a first corresponding relation between pixel points of the first target image and the second target image;
and the non-glare image generation submodule is used for fusing the pixel points with the first corresponding relation in the first target image and the second target image respectively to generate a non-glare image.
In this embodiment, the non-glare image generating module 1005 may further include the following sub-modules:
and the non-glare image restoration submodule is used for restoring the non-glare image to obtain a non-glare target image.
In this embodiment of the present application, the non-glare image restoration sub-module may specifically include the following units:
a second corresponding relation determining unit, configured to determine a second corresponding relation between the first target image and each pixel point of the non-glare image;
and the non-glare target image generating unit is used for fusing the pixel points with the second corresponding relation in the first target image and the non-glare image respectively to generate a non-glare target image.
For the apparatus embodiment, since it is substantially similar to the method embodiment, it is described relatively simply, and reference may be made to the description of the method embodiment section for relevant points.
Referring to fig. 11, a schematic diagram of a terminal device according to an embodiment of the present application is shown. As shown in fig. 11, the terminal device 1100 of this embodiment includes: a processor 1110, a memory 1120, and a computer program 1121 stored in the memory 1120 and executable on the processor 1110. The processor 1110, when executing the computer program 1121, implements the steps of the various embodiments of the image processing method described above, such as the steps S501 to S504 shown in fig. 5. Alternatively, the processor 1110, when executing the computer program 1121, implements the functions of each module/unit in each device embodiment described above, for example, the functions of the modules 1001 to 1005 shown in fig. 10.
Illustratively, the computer program 1121 can be divided into one or more modules/units, which are stored in the memory 1120 and executed by the processor 1110 to implement the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, and the instruction segments are used to describe the execution process of the computer program 1121 in the terminal device 1100. For example, the computer program 1121 may be divided into a first image acquisition module, a second image acquisition module, a glare area recognition module, a glare area processing module, and a non-glare image generation module, where the specific functions of the modules are as follows:
a first image acquisition module for controlling a first camera of the terminal device to acquire a first image according to a first parameter, and
the second image acquisition module is used for controlling a second camera of the terminal equipment to acquire a second image according to second parameters, wherein the first parameters comprise a first focal length and a first aperture, the second parameters comprise a second focal length and a second aperture, the first focal length is greater than the second focal length, and the first aperture is smaller than the second aperture;
a glare area identification module for identifying a first glare area in the first image and a second glare area in the second image;
the glare area processing module is used for respectively removing the first glare area and the second glare area to obtain a first target image and a second target image;
and the non-glare image generating module is used for generating a non-glare image according to the first target image and the second target image.
The terminal device 1100 may be a mobile phone, a tablet computer, a palm computer, or other computing devices. The terminal device 1100 may include, but is not limited to, a processor 1110 and a memory 1120. Those skilled in the art will appreciate that fig. 11 is only one example of a terminal device 1100 and does not constitute a limitation of terminal device 1100, and may include more or fewer components than shown, or some components in combination, or different components, e.g., terminal device 1100 may also include input-output devices, network access devices, buses, etc.
The Processor 1110 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 1120 may be an internal storage unit of the terminal device 1100, such as a hard disk or a memory of the terminal device 1100. The memory 1120 may also be an external storage device of the terminal device 1100, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash memory card (Flash Card) equipped on the terminal device 1100. Further, the memory 1120 may also include both an internal storage unit and an external storage device of the terminal device 1100. The memory 1120 is used for storing the computer program 1121 and other programs and data required by the terminal device 1100. The memory 1120 may also be used to temporarily store data that has been output or is to be output.
The embodiment of the application also discloses a computer readable storage medium, which stores a computer program, and the computer program can realize the image processing method of the foregoing embodiment when being executed by a processor.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed image processing method, apparatus and terminal device may be implemented in other ways. For example, the division of the modules or units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the processes in the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, can implement the steps of the embodiments of the methods described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include at least: any entity or apparatus capable of carrying the computer program code to the image processing apparatus or terminal device, a recording medium, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, for example a USB flash disk, a removable hard disk, a magnetic disk, or an optical disk. In certain jurisdictions, computer-readable media may not be an electrical carrier signal or a telecommunications signal in accordance with legislative and patent practice.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same. Although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (12)

1. An image processing method is applied to a terminal device, and the method comprises the following steps:
controlling a first camera of a terminal device to acquire a first image according to a first parameter, and controlling a second camera of the terminal device to acquire a second image according to a second parameter, wherein the first parameter comprises a first focal length and a first aperture, the second parameter comprises a second focal length and a second aperture, the first focal length is greater than the second focal length, and the first aperture is smaller than the second aperture;
identifying a first glare area in the first image and a second glare area in the second image;
respectively removing the first glare area and the second glare area to obtain a first target image and a second target image;
and generating a non-glare image according to the first target image and the second target image.
2. The method of claim 1, wherein the first parameter further comprises a first exposure time and the second parameter further comprises a second exposure time, and wherein the first exposure time is less than the second exposure time.
3. The method of claim 2, wherein the identifying a first glare area in the first image and a second glare area in the second image comprises:
respectively detecting the first image and the second image by adopting a preset detection model, wherein the detection model is obtained by training on a plurality of training images marked with glare areas;
receiving a detection result output by the detection model, wherein the detection result comprises identification information of a first glare area in the first image and a second glare area in the second image.
4. The method according to claim 1 or 2, wherein the first image and the second image respectively comprise a plurality of pixels, and each pixel has a corresponding brightness value;
the identifying a first glare area in the first image and a second glare area in the second image comprises:
according to the brightness values, clustering is respectively carried out on all pixel points in the first image and the second image, and a plurality of first image area classes of the first image and a plurality of second image area classes of the second image are obtained;
identifying a first glare area in the first image according to the first image area class;
and identifying a second glare area in the second image according to the second image area class.
5. The method according to claim 4, wherein each first image region class and each second image region class has a respective cluster center,
the identifying a first glare area in the first image from the first image area class comprises:
identifying an area corresponding to an image area class of which the brightness value of the clustering center is greater than a preset threshold value in the first image area class as a first glare area;
identifying a second glare region in the second image from the second image region class, comprising:
and identifying the area corresponding to the image area class of which the brightness value of the clustering center is greater than the preset threshold value in the second image area class as a second glare area.
6. The method of claim 1, 2, 3 or 5, wherein the removing the first glare area and the second glare area, respectively, to obtain a first target image and a second target image comprises:
and setting the pixel value of each pixel point in the first glare area and the second glare area to be zero respectively to obtain a first target image and a second target image.
7. The method of claim 6, wherein generating a non-glare image from the first target image and the second target image comprises:
determining a first corresponding relation between pixel points of the first target image and the second target image;
and respectively fusing all pixel points with the first corresponding relation in the first target image and the second target image to generate a non-glare image.
8. The method according to claim 7, wherein after fusing the pixel points having the first corresponding relationship in the first target image and the second target image respectively to generate a non-glare image, the method further comprises:
and repairing the non-glare image to obtain a non-glare target image.
9. The method according to claim 8, wherein the repairing the non-glare image to obtain a non-glare target image comprises:
determining a second corresponding relation between the pixel points of the first target image and the pixel points of the non-glare image;
and respectively fusing all the pixel points with the second corresponding relation in the first target image and the non-glare image to generate the non-glare target image.
10. An image processing apparatus, applied to a terminal device, the apparatus comprising:
a first image acquisition module for controlling a first camera of the terminal device to acquire a first image according to a first parameter, and
the second image acquisition module is used for controlling a second camera of the terminal equipment to acquire a second image according to second parameters, wherein the first parameters comprise a first focal length and a first aperture, the second parameters comprise a second focal length and a second aperture, the first focal length is greater than the second focal length, and the first aperture is smaller than the second aperture;
a glare area identification module for identifying a first glare area in the first image and a second glare area in the second image;
the glare area processing module is used for respectively removing the first glare area and the second glare area to obtain a first target image and a second target image;
and the non-glare image generating module is used for generating a non-glare image according to the first target image and the second target image.
11. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the image processing method according to any one of claims 1 to 9 when executing the computer program.
12. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the image processing method according to any one of claims 1 to 9.
CN202010225502.XA 2020-03-26 2020-03-26 Image processing method, device, terminal equipment and medium Active CN113518171B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010225502.XA CN113518171B (en) 2020-03-26 2020-03-26 Image processing method, device, terminal equipment and medium

Publications (2)

Publication Number Publication Date
CN113518171A true CN113518171A (en) 2021-10-19
CN113518171B CN113518171B (en) 2022-11-18

Family

ID=78060211

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010225502.XA Active CN113518171B (en) 2020-03-26 2020-03-26 Image processing method, device, terminal equipment and medium

Country Status (1)

Country Link
CN (1) CN113518171B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117135438A (en) * 2023-03-10 2023-11-28 荣耀终端有限公司 Image processing method and electronic equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120268626A1 (en) * 2011-04-25 2012-10-25 Avermedia Information, Inc. Apparatus and method for eliminating glare
US20170142309A1 (en) * 2015-07-31 2017-05-18 Olympus Corporation Imaging apparatus and imaging method
CN106780370A (en) * 2016-11-25 2017-05-31 阿依瓦(北京)技术有限公司 A kind of image de-jittering device and method thereof
CN106791376A (en) * 2016-11-29 2017-05-31 广东欧珀移动通信有限公司 Imaging device, control method, control device and electronic installation
CN110557575A (en) * 2019-08-28 2019-12-10 维沃移动通信有限公司 method for eliminating glare and electronic equipment

Also Published As

Publication number Publication date
CN113518171B (en) 2022-11-18

Similar Documents

Publication Publication Date Title
US10497097B2 (en) Image processing method and device, computer readable storage medium and electronic device
CN113132618B (en) Auxiliary photographing method and device, terminal equipment and storage medium
WO2019091486A1 (en) Photographing processing method and device, terminal, and storage medium
EP4047549A1 (en) Method and device for image detection, and electronic device
CN110839128B (en) Photographing behavior detection method and device and storage medium
CN110990341A (en) Method, device, electronic equipment and medium for clearing data
JP6862564B2 (en) Methods, devices and non-volatile computer-readable media for image composition
CN113888452A (en) Image fusion method, electronic device, storage medium, and computer program product
CN110807769B (en) Image display control method and device
CN109104573B (en) Method for determining focusing point and terminal equipment
CN108984075B (en) Display mode switching method and device and terminal
CN111857793A (en) Network model training method, device, equipment and storage medium
CN105513098B (en) Image processing method and device
CN113518171B (en) Image processing method, device, terminal equipment and medium
CN111275607B (en) Interface display method and device, computer equipment and storage medium
CN106454078A (en) Focusing mode control method and terminal device
CN108984677B (en) Image splicing method and terminal
CN112446849A (en) Method and device for processing picture
CN111611414A (en) Vehicle retrieval method, device and storage medium
CN115499577A (en) Image processing method and terminal equipment
CN111064886B (en) Shooting method of terminal equipment, terminal equipment and storage medium
CN113408809A (en) Automobile design scheme evaluation method and device and computer storage medium
CN111738282A (en) Image recognition method based on artificial intelligence and related equipment
CN111860030A (en) Behavior detection method, behavior detection device, behavior detection equipment and storage medium
CN110458289B (en) Multimedia classification model construction method, multimedia classification method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant