CN110956063A - Image processing method, device, equipment and storage medium


Info

Publication number
CN110956063A
Authority
CN
China
Prior art keywords
image
sky
target
shooting
sky material
Legal status: Pending
Application number
CN201811133926.2A
Other languages
Chinese (zh)
Inventor
Wang Qian (王倩)
Du Junzeng (杜俊增)
Zhang Yuejiao (张月娇)
Current Assignee
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN201811133926.2A priority Critical patent/CN110956063A/en
Publication of CN110956063A publication Critical patent/CN110956063A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/35 Categorising the entire scene, e.g. birthday party or wedding scene
    • G06V 20/38 Outdoor scenes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00 2D [Two Dimensional] image generation
    • G06T 11/40 Filling a planar surface by adding surface attributes, e.g. colour or texture

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Studio Devices (AREA)

Abstract

The application discloses an image processing method, an image processing device, image processing equipment, and a storage medium, and relates to the field of image processing. The method comprises the following steps: identifying a sky area in the image through a neural network model, screening a target sky material from candidate sky materials according to the read shooting attribute parameters of the image, and replacing the sky area in the image with the target sky material to obtain a target image. This solves the problem in the related art that manually replacing the sky area takes at least seven steps; it achieves the purpose of automatically screening a suitable target sky material through the shooting attribute parameters to replace the original sky area in the image, requires no later retouching by the user, reduces the user's manual operation steps, and improves human-computer interaction efficiency. Even a user with no retouching skills can obtain a sky photo comparable to one shot with a professional single-lens reflex (SLR) camera.

Description

Image processing method, device, equipment and storage medium
Technical Field
The present disclosure relates to the field of image processing, and in particular, to an image processing method, an image processing apparatus, an image processing device, and a storage medium.
Background
Generally, a user uses a terminal to photograph an object of interest. When the photographed image includes a sky scene, the quality of the image is affected by ambient light. For example, the user may encounter poor weather or poor light when shooting, so the photographed scenery may fall short of the effect the user has in mind.
In the related art, a user uses retouching software to replace the sky area in a shot image with an ideal sky material. The steps are as follows: 1. after shooting an image with a terminal, the user uploads the image to a computer; 2. provided that retouching software is installed on the computer (if not, it must be installed first), the user opens the retouching software; 3. the image and the ideal sky material are opened in the retouching software; 4. the sky material is set as the background and the image as layer 1; 5. the non-sky area in the image is selected with a matting tool, copied, and set as layer 2; 6. layer 1 is deleted and a feathering operation is performed on layer 2; 7. the modified image is saved.
The user thus needs at least seven steps to obtain an image with the sky area replaced; the operation is cumbersome, the steps are complex, and the human-computer interaction efficiency is low.
Disclosure of Invention
The embodiments of the application provide an image processing method, device, equipment, and storage medium, which can solve the problems that replacing the sky area in a shot image with a sky material requires many manual operations, the steps are complex, and the human-computer interaction efficiency is low. The technical scheme is as follows:
according to a first aspect of the present application, there is provided an image processing method, the method comprising:
identifying a sky region in the image through a neural network model;
reading shooting attribute parameters of an image;
screening a target sky material from the candidate sky materials according to the shooting attribute parameters;
and replacing the sky area in the image with a target sky material to obtain a target image.
In some embodiments, screening the target sky material from the candidate sky materials according to the shooting attribute parameters includes:
extracting space-time parameters from the shooting attribute parameters, wherein the space-time parameters comprise shooting time and/or shooting place;
and screening out a target sky material from the candidate sky materials according to the space-time parameters.
In some embodiments, the spatiotemporal parameters include a shooting time;
screening out a target sky material from the candidate sky materials according to the space-time parameters, and the method comprises the following steps:
and screening out target sky materials corresponding to the time periods from the candidate sky materials according to the time periods to which the shooting time belongs.
In some embodiments, the spatiotemporal parameters include a shooting location;
screening out a target sky material from the candidate sky materials according to the space-time parameters, and the method comprises the following steps:
and screening out target sky materials corresponding to the geographical area from the candidate sky materials according to the geographical area to which the shooting place belongs.
In some embodiments, the spatiotemporal parameters include a shooting time and/or a shooting location;
screening out target sky materials from candidate sky materials according to space-time parameters, and the method comprises the following steps:
determining a corresponding weather type according to the shooting time and the shooting place;
and screening out a target sky material corresponding to the weather type from the candidate sky materials.
In some embodiments, the method further comprises:
and processing the target image by adopting a filter, wherein the filter is used for enhancing the color tone uniformity and/or texture uniformity of the target sky material and other image parts in the target image.
In some embodiments, processing the target image with a filter includes:
determining a filter corresponding to the target image according to the shooting time and the weather type;
and processing the target image by adopting a filter corresponding to the target image.
In some embodiments, reading the shooting attribute parameters of the image includes:
EXIF (Exchangeable Image File format) information of the image is read as the shooting attribute parameters.
According to a second aspect of the present application, there is provided an image processing apparatus comprising:
an identification module configured to identify a sky region in an image through a neural network model;
a reading module configured to read a shooting attribute parameter of an image;
the screening module is configured to screen out a target sky material from the candidate sky materials according to the shooting attribute parameters;
the replacing module is configured to replace the sky area in the image with a target sky material to obtain a target image.
In some embodiments, a screening module, comprising:
the extraction submodule is configured to extract space-time parameters from the shooting attribute parameters, and the space-time parameters comprise shooting time and/or shooting places;
and the screening submodule is configured to screen out a target sky material from the candidate sky materials according to the space-time parameters.
In some embodiments, the spatiotemporal parameters include a shooting time;
and the screening submodule is configured to screen out a target sky material corresponding to the time period from the candidate sky materials according to the time period to which the shooting time belongs.
In some embodiments, the spatiotemporal parameters include a shooting location;
and the screening submodule is configured to screen out a target sky material corresponding to the geographical area from the candidate sky material according to the geographical area to which the shooting place belongs.
In some embodiments, the spatiotemporal parameters include a shooting time and/or a shooting location;
a screening module comprising:
the determining submodule is configured to determine a corresponding weather type according to the shooting time and the shooting place;
and the screening submodule is configured to screen out a target sky material corresponding to the weather type from the candidate sky materials.
In some embodiments, the apparatus further comprises:
and the processing module is configured to process the target image by adopting a filter, and the filter is used for enhancing the color tone uniformity and/or texture uniformity of the target sky material and other image parts in the target image.
In some embodiments, a processing module, comprising:
the second determining submodule is configured to determine a filter corresponding to the target image according to the shooting time and the weather type;
and the processing sub-module is configured to process the target image by adopting the filter corresponding to the target image.
In some embodiments, the reading module is configured to read the EXIF information of the image as the shooting attribute parameters.
According to a third aspect of the present application, there is provided a terminal comprising a processor and a memory, the memory having stored therein at least one instruction, at least one program, set of codes, or set of instructions, which is loaded and executed by the processor to implement the image processing method according to any of the first aspects above.
According to a fourth aspect of the present application, there is provided a computer-readable storage medium having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by a processor to implement the image processing method of any of the first aspects above.
The beneficial effects brought by the technical scheme provided by the embodiment of the application at least comprise:
identifying a sky area in the image through a neural network model, screening a target sky material from candidate sky materials according to the read shooting attribute parameters of the image, and replacing the sky area in the image with the target sky material to obtain a target image; this solves the problem in the related art that obtaining a target image with the sky area replaced by a target sky material takes at least seven steps, achieves the purpose of automatically screening a suitable target sky material through the shooting attribute parameters to replace the original sky area in the image, requires no later retouching by the user, reduces the user's manual operation steps, and improves human-computer interaction efficiency. Even a user with no retouching skills can obtain a sky photo comparable to one shot with a professional single-lens reflex (SLR) camera.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present application, and that those skilled in the art can obtain other drawings based on them without creative effort.
Fig. 1 is a block diagram of a terminal according to an exemplary embodiment of the present application;
fig. 2 is a block diagram of a terminal according to another exemplary embodiment of the present application;
FIG. 3 is a flow chart of an image processing method provided by an exemplary embodiment of the present application;
FIG. 4 is an interface schematic diagram of an image processing method provided by an exemplary embodiment of the present application;
FIG. 5 is an interface schematic diagram of an image processing method provided by another exemplary embodiment of the present application;
FIG. 6 is a flow chart of an image processing method provided by another exemplary embodiment of the present application;
FIG. 7 is a flow chart of an image processing method provided by another exemplary embodiment of the present application;
FIG. 8 is a flow chart of an image processing method provided by another exemplary embodiment of the present application;
FIG. 9 is a flow chart of an image processing method provided by another exemplary embodiment of the present application;
FIG. 10 is a flow chart of an image processing method provided by another exemplary embodiment of the present application;
FIG. 11 is a flow chart of an image processing method provided by another exemplary embodiment of the present application;
FIG. 12 is a flow chart of an image processing method provided by another exemplary embodiment of the present application;
FIG. 13 is a flow chart of an image processing method provided by another exemplary embodiment of the present application;
fig. 14 is a block diagram of an image processing apparatus provided in an exemplary embodiment of the present application;
FIG. 15 is a block diagram of an image processing apparatus according to another exemplary embodiment of the present application;
Fig. 16 is a block diagram of an image processing apparatus according to another exemplary embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
First, several terms related to the embodiments of the present application are explained:
a neural network model: an artificial neural network formed by connecting n neurons, where n is a positive integer. In this application, the neural network model is an artificial neural network model capable of identifying the sky region in an image. The model can be divided into an input layer, hidden layers, and an output layer: the terminal feeds the image into the input layer, the hidden layers down-sample the input image, i.e., perform convolution calculations on the pixels in the image, and the output layer finally outputs the recognition result. Neural network models include the CNN (Convolutional Neural Network) model, the FCN (Fully Convolutional Network) model, the DNN (Deep Neural Network) model, the RNN (Recurrent Neural Network) model, the embedding model, the GBDT (Gradient Boosting Decision Tree) model, the LR (Logistic Regression) model, and the like.
CNN model: a deep feedforward artificial neural network. A CNN includes, but is not limited to, the following three parts: an input layer, combinations of n convolutional layers and pooling layers, and a fully connected multilayer perceptron, where n is a positive integer. The CNN includes a feature extractor consisting of the convolutional and pooling layers. The feature extractor performs feature extraction on the samples fed through the input layer to obtain model parameters, and final model training is completed in the perceptron according to the model parameters. In recent years, CNN models have been widely applied to speech recognition, general object recognition, face recognition, image recognition, motion analysis, natural language processing, brain wave analysis, and the like. This application takes the application of a CNN model to image recognition as an example; specifically, the CNN model is used to recognize sky regions in images.
FCN model: a deep feedforward artificial neural network. The difference from the CNN model described above is that its output layer is a convolutional layer, whereas the output layer of the CNN model is a fully connected layer. An FCN model goes through convolution, pooling, and deconvolution processes, and finally outputs the identified image.
DNN model: a deep learning framework. The DNN model includes an input layer, at least one hidden layer (or intermediate layer), and an output layer. Optionally, the input layer, the at least one hidden layer (or intermediate layer), and the output layer each include at least one neuron, and each neuron processes the data it receives. The number of neurons in different layers may be the same or different.
RNN model: a neural network with a feedback structure. In the RNN model, the output of a neuron can be fed directly back to itself at the next time step; i.e., the input of an i-th layer neuron at time m includes its own output at time (m-1) in addition to the output of the (i-1)-th layer neurons at time m.
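Written as a formula (an illustrative rendering; the notation is not from the original disclosure), with y_i(m) the output of the i-th layer at time m, f an activation function, and W and U assumed weight matrices:

$$ y_i(m) = f\big( W \, y_{i-1}(m) + U \, y_i(m-1) \big) $$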
Embedding model: based on distributed vector representations of entities and relations, the relation in each triple instance is treated as a translation from the head entity to the tail entity. A triple instance comprises a subject, a relation, and an object, and can be expressed as (subject, relation, object); the subject is the head entity and the object is the tail entity. For example, if Xiao Ming's dad is Da Ming, this is represented by the triple instance (Xiao Ming, dad, Da Ming).
GBDT model: an iterative decision tree algorithm consisting of multiple decision trees, where the accumulated results of all the trees form the final result. Each node of a decision tree yields a predicted value; taking age as an example, the predicted value is the average age of all the people belonging to that node.
LR model: a model established by applying a logistic function on the basis of linear regression.
Sky region identification model: the model is established according to at least one of a CNN model, an FCN model, a DNN model, an RNN model, an embedding model, a GBDT model and an LR model, and is used for identifying the sky area in the image.
Viewfinder image: image data collected by the photosensitive device, used to display a preview in the shooting preview interface. If a shutter signal triggered by the user is received, the viewfinder image can be processed and stored as a target image.
Shot image: an image obtained by storing the viewfinder image in response to the shutter signal.
Shooting attribute parameters: after an image is shot, an EXIF-format file of the image is obtained, and the file includes header information. The header information includes the aperture, shutter, white balance, sensitivity, focal length, date, time, and location at the time the image was captured, and is used as the shooting attribute parameters.
Candidate sky materials: a plurality of sky materials pre-stored in a memory, including sky materials for different times, and/or different geographical areas, and/or different weather types. The candidate sky materials can be high-quality sky materials selected manually, or sky materials shot by a high-performance single-lens reflex camera.
Target sky material: the sky material is determined from a plurality of candidate sky materials and corresponds to the shooting attribute parameters of the image.
Target image: the image obtained after the sky area in the image is replaced with the target sky material.
Referring to fig. 1, a block diagram of a terminal according to an exemplary embodiment of the present application is shown. The terminal includes: a light receiving device 101, an ISP (Image Signal Processing) module 102, a processor 103, and a memory 104.
The light sensing device 101 is configured to sense the shooting environment to obtain a viewfinder image. The light sensing device 101 may be a CCD (Charge-Coupled Device) image sensor or a CMOS (Complementary Metal-Oxide-Semiconductor) image sensor.
The ISP module 102 is electrically connected to the light sensing device 101. Optionally, the ISP module 102 and the photosensitive device 101 are connected by a bus, or the ISP module 102 and the photosensitive device 101 are integrated into the same electrical package or chip. The photosensitive device 101 transmits the acquired image data to the ISP module 102 for processing.
The ISP module 102 is configured to acquire the viewfinder image collected by the light sensing device 101 and, when receiving a shutter signal, to capture a shot image from the viewfinder image. In some embodiments, the ISP module 102 is also configured to perform auto-exposure, auto-focus, automatic white balance adjustment, and the like.
The processor 103 is electrically connected to the ISP module 102. Optionally, the processor 103 and the ISP module 102 are connected by a bus, or the processor 103 and the ISP module 102 are integrated into the same electrical package or chip. The processor 103 may include one or more processing cores for transmitting a shutter signal to the ISP module 102 and for acquiring and storing a photographed image photographed by the ISP module 102. Optionally, the processor 103 includes a neural network model; the processor 103 identifies the sky region in the image through the neural network model, and then acquires the sky material stored in the memory 104 to process the sky region in the image.
The memory 104 is electrically connected to the processor 103. Optionally, the memory 104 is connected to the processor 103 via a bus. The memory 104 may include a RAM (Random Access Memory) and a ROM (Read-Only Memory). The memory 104 is used for storing preset sky materials and storing images processed by the processor 103.
In some embodiments, the terminal further comprises: an AI (Artificial Intelligence) chip 105. Referring to fig. 2, the AI chip 105 is electrically connected to the ISP module 102. Optionally, the AI chip 105 is connected to the ISP module 102 via a bus. The AI chip comprises a neural network model; in some embodiments, the AI chip identifies sky regions in the image through a neural network identification model.
The AI chip 105 is also electrically connected to the memory 104. Optionally, the AI chip 105 is connected to the memory 104 via a bus. The AI chip is further configured to acquire a sky material stored in the memory 104, and replace a sky area in the image with the acquired sky material.
Referring to fig. 3, a flowchart of an image processing method provided in an exemplary embodiment of the present application is shown, where the present embodiment is illustrated by applying the method to the terminal shown in fig. 1 or fig. 2, and the method includes:
step 201, identifying a sky area in an image through a neural network model.
Optionally, after the terminal captures the image, the image is opened in the album, and the sky area in the image is identified through the neural network model.
Optionally, the neural network model includes, but is not limited to, at least one of an FCN model, a CNN model, a DNN model, an RNN model, an embedding model, a GBDT model, and an LR model.
Optionally, the sky of the sky area may be a sky shot at different times, and/or different locations, and/or different weather types; for example, the images may be taken early morning, at noon, at sunrise, at sunset, in the city, on prairie, at sea, on sunny days, cloudy, etc.
Step 202, reading shooting attribute parameters of the image.
The image has many shooting parameters in the shooting process, and the shooting parameters are stored as shooting attribute parameters.
In the embodiment of the application, the shooting attribute parameter can be used as reference data for acquiring a target sky material by a terminal. Optionally, the shooting attribute parameter may also be used as reference data for the terminal to acquire the filter.
Optionally, the terminal reads the EXIF information of the image as the shooting attribute parameters. In most shooting scenarios, an EXIF-format file of the image is obtained after the image is shot and saved. The header information of the EXIF-format file is the EXIF information; the terminal reads the EXIF information of the EXIF-format file corresponding to the image and uses it as the shooting attribute parameters.
The shooting attribute parameters comprise at least one of aperture, shutter, white balance, sensitivity, focal length, date, time and place when the image is shot.
And step 203, screening out a target sky material from the candidate sky materials according to the shooting attribute parameters.
Optionally, the terminal extracts a space-time parameter from the shooting attribute parameter, wherein the space-time parameter comprises shooting time and/or shooting place; and screening out a target sky material from the candidate sky materials according to the space-time parameters.
In an alternative embodiment, the present step comprises: the space-time parameters comprise shooting time;
and screening out a target sky material corresponding to the time period from the candidate sky materials according to the time period to which the shooting time belongs.
In an alternative embodiment, the present step comprises: the space-time parameters comprise shooting places;
and screening out target sky materials corresponding to the geographical area from the candidate sky materials according to the geographical area to which the shooting place belongs.
In an alternative embodiment, the present step comprises: the space-time parameters comprise shooting time and/or shooting place;
determining a corresponding weather type according to the shooting time and the shooting place; and screening out a target sky material corresponding to the weather type from the candidate sky materials.
Alternatively, the candidate sky material can be a sky image shot under different time, and/or different place, and/or different weather types; for example, the images may be taken early morning, at noon, at sunrise, at sunset, in the city, on prairie, at sea, on sunny days, cloudy, etc.
And step 204, replacing the sky area in the image with a target sky material to obtain a target image.
The terminal replaces the sky area in the image with the target sky material to obtain the target image. Optionally, the terminal removes the sky region identified by the sky region identification model from the image to obtain a partial image of the non-sky region, and combines this partial image with the target sky material, so that the sky region in the image is replaced with the target sky material and the target image is finally obtained.
Optionally, a channel matting method is used to obtain a partial image of a non-sky region in the image, the partial image of the non-sky region is combined with a target sky material, the sky region in the image is replaced with the target sky material, and a target image is finally obtained.
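For illustration only, a minimal sketch of this mask-based replacement follows, in Python with NumPy; the soft sky mask, the pre-resized sky material, and the array shapes are assumptions rather than details of the disclosed embodiments:

```python
import numpy as np

def replace_sky(image: np.ndarray, sky_mask: np.ndarray,
                sky_material: np.ndarray) -> np.ndarray:
    """Composite the target sky material into the sky region.

    image:        H x W x 3 uint8 array, the shot image.
    sky_mask:     H x W float array in [0, 1]; 1.0 where the neural
                  network model identified sky. Soft values near the
                  boundary give a feathering-like blend.
    sky_material: H x W x 3 uint8 array, the target sky material,
                  already resized to the image dimensions.
    """
    mask = sky_mask[..., np.newaxis]                  # broadcast over RGB
    blended = image * (1.0 - mask) + sky_material * mask
    return blended.astype(np.uint8)
```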
Schematically, referring to fig. 4, in an album on the terminal, the user selects and opens an image 31 including a sky area, i.e., the portion of the image 31 circled with a dotted line. The user triggers the image processing function in the album for replacing the sky area in the image; illustratively, the user clicks the control button 32. As shown in the lower diagram of fig. 4, the terminal identifies the sky area in the image 31 through a neural network, determines a target sky material from the candidate sky materials according to the shooting attribute parameters of the image 31, and automatically replaces the sky area in the image 31 with the target sky material to obtain a target image 33; comparing the two, the sky area in image 33 is clearly different from that in image 31. Optionally, the terminal creates an image file and automatically saves the image 33.
In some embodiments, the image processing function for replacing the sky area in an image can be triggered in the album by at least one of a long-press operation, a pressure touch operation, a two-finger press operation, a knuckle double-tap operation, and a multi-click operation.
In other embodiments, a button control is provided on the display interface of the image for triggering the replacement of the sky area in the image, such as the control button 32 in fig. 4.
Optionally, the user may customize the target sky material. Schematically, referring to fig. 5, the terminal displays an image 35 in an album, and further displays an image 36 corresponding to the candidate sky material 1 and an image 37 corresponding to the candidate sky material 2 below the image 35; a user selects a candidate sky material 2 in an image 37 as a target sky material, and a terminal replaces a sky area of the image 35 with the candidate sky material 2 to obtain an image 38, as shown in fig. 5, the sky area in the image 35 is obviously different from the sky area in the image 38; the user triggers the save control, the terminal creates an image file and saves the image 38.
In summary, in the image processing method provided by the embodiments of the application, the sky region in the image is identified through a neural network model, a target sky material is screened out from candidate sky materials according to the read shooting attribute parameters of the image, and the sky region in the image is replaced with the target sky material to obtain a target image. This solves the problem in the related art that obtaining a target image with the sky area replaced by a target sky material takes at least seven steps, achieves the purpose of automatically screening a suitable target sky material through the shooting attribute parameters to replace the original sky area in the image, requires no later retouching by the user, reduces the user's manual operation steps, and improves human-computer interaction efficiency. Even a user with no retouching skills can obtain a sky photo comparable to one shot with a professional single-lens reflex (SLR) camera.
In addition, with the image processing method provided by the embodiments of the application, the user can also obtain sky photos with different sky materials by customizing the target sky material, which improves the user experience.
Before the terminal invokes the neural network model, a neural network model capable of identifying the sky region in an image, i.e., a sky region identification model, needs to be obtained through training on images.
Schematically, fig. 6 shows a flowchart of a method for training a sky region identification model according to an exemplary embodiment. The following training process takes an FCN model as an example of the neural network model and includes the following steps:
and step 11, extracting a feature segmentation graph from the image samples for at least one group of historical image samples.
At least one group of historical image samples is an image sample including a sky region in an image, and the sky region includes different sky regions divided by time and/or place.
And the terminal performs down-sampling on at least one group of historical image samples to obtain a feature segmentation map of the image samples. Optionally, the historical image sample is used as an input, and the terminal performs segmentation mapping through the stacked convolution layers to output a low-resolution feature segmentation map.
Optionally, the terminal performs segmentation mapping on each pixel in the image using the Argmax function, and creates output channels using One-Hot coding or a Label Encoder. In mathematics, the Argmax function maps a function to the argument at which the function value is maximal; here it is used to segment and map each pixel in the image.
Illustratively, a terminal performs segmentation and mapping on an image sample including a sky region to obtain a feature segmentation map, and the feature segmentation map divides the sky region and a non-sky region.
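As an illustrative sketch only (the array shapes and the two-class {non-sky, sky} ordering are assumptions, not from the original disclosure), the per-pixel Argmax and one-hot mapping could look as follows in Python:

```python
import numpy as np

def segmentation_map(logits: np.ndarray) -> np.ndarray:
    """Per-pixel Argmax followed by one-hot output channels.

    logits: H x W x C array of per-pixel class scores, e.g. C = 2 for
    {non-sky, sky}; shapes and class order are illustrative assumptions.
    """
    labels = np.argmax(logits, axis=-1)         # H x W map of class indices
    one_hot = np.eye(logits.shape[-1])[labels]  # H x W x C binary channels
    return one_hot
```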
And step 12, inputting the feature segmentation map into a decoder unit to obtain a training result.
The neural network model comprises a decoder unit, which deconvolves each feature value in the feature segmentation map and restores the feature segmentation map to a segmentation map at the original resolution.
The terminal up-samples the feature segmentation map in the decoder unit to obtain a segmentation map at the original resolution. Optionally, the terminal up-samples by transposed convolution to obtain the segmentation map at the original resolution, and creates output channels through one-hot coding or label encoding to output the segmentation map at the original resolution.
Illustratively, the terminal inputs the feature segmentation map into a decoder unit for up-sampling to obtain a segmentation map of the sky region with the original resolution.
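By way of a hedged sketch (the channel counts, kernel sizes, and the 8x upsampling factor are assumptions, not values from the disclosure), such a transposed-convolution decoder could look as follows in Python with PyTorch:

```python
import torch
import torch.nn as nn

# Upsample a low-resolution feature segmentation map back to the
# original resolution via stacked transposed convolutions.
decoder = nn.Sequential(
    nn.ConvTranspose2d(256, 64, kernel_size=4, stride=2, padding=1),  # 2x
    nn.ReLU(inplace=True),
    nn.ConvTranspose2d(64, 16, kernel_size=4, stride=2, padding=1),   # 4x
    nn.ReLU(inplace=True),
    nn.ConvTranspose2d(16, 2, kernel_size=4, stride=2, padding=1),    # 8x
)

features = torch.randn(1, 256, 28, 28)   # low-resolution feature map
segmentation = decoder(features)         # 1 x 2 x 224 x 224 segmentation map
```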
And step 13, comparing the training result with the standard result to obtain a calculation loss, wherein the calculation loss is used for indicating the error between the training result and the standard result.
The standard result is used to indicate a sky area contained in the image.
Optionally, the calculation loss is expressed by cross-entropy; optionally, the terminal calculates the loss H(p, q) by the following formula:
H(p, q) = -\sum_{x} p(x) \log q(x)
wherein p (x) and q (x) are discrete distribution vectors of equal length, and p (x) represents the training result; q (x) represents an output parameter; x is a vector in the training results or output parameters.
And 14, training by adopting an error back propagation algorithm according to the respective corresponding calculation loss of at least one group of historical image samples to obtain a sky region identification model.
Optionally, the terminal determines a gradient direction of the sky region identification model according to the calculation loss through a back propagation algorithm, and updates the model parameters in the sky region identification model layer by layer from an output layer of the sky region identification model.
Optionally, before each error back-propagation, the terminal detects whether the current iteration meets a training end condition, where the training end condition may be that the calculation loss is less than an error threshold, or that the number of iterations reaches a preset number (for example, 20000). When the training end condition is not met, proceed to step 14; when the training end condition is met, the trained neural network model is obtained.
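For illustration, a skeleton of this iterate-until-converged procedure; next_training_batch and train_step are hypothetical placeholders for the data pipeline and the forward/back-propagation step, and the error threshold value is assumed:

```python
ERROR_THRESHOLD = 0.01     # assumed value for illustration
MAX_ITERATIONS = 20000     # preset iteration cap from the text

iteration = 0
loss = float("inf")
while loss >= ERROR_THRESHOLD and iteration < MAX_ITERATIONS:
    samples, labels = next_training_batch()   # hypothetical data source
    loss = train_step(samples, labels)        # forward pass + back-propagation
    iteration += 1
# On exit, the trained sky region identification model is obtained.
```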
The model training method provided in the present embodiment is only an exemplary description, and does not limit the training method of the neural network model.
In an alternative embodiment based on fig. 3, step 203 is implemented by replacing step 2031 and step 2032, as shown in fig. 7:
step 2031, extracting space-time parameters from the shooting attribute parameters.
The space-time parameters are used for acquiring target sky materials. Optionally, the terminal obtains the time-space parameters in the shooting attribute parameters by calling a function, where the time-space parameters include shooting time and/or shooting location.
Step 2032, screening out target sky materials from the candidate sky materials according to the space-time parameters.
And the terminal screens out the target sky material from the candidate sky material according to the shooting time and/or shooting place included by the space-time parameters.
In summary, the present embodiment further determines the target sky material according to the space-time parameter, so that the target sky material is more appropriate to the sky background required by the user.
It should be noted that, in the exemplary embodiment shown in fig. 7, the terminal screens out the target sky material from the candidate sky materials according to the shooting time and/or the shooting location, and the method can be divided into at least the following three cases:
1. the terminal determines a target sky material according to the shooting time;
2. the terminal determines a target sky material according to a shooting place;
3. and the terminal determines a target sky material according to the shooting time and the shooting place.
In the following exemplary embodiment, fig. 8 is an explanatory explanation of an image processing method in the first case described above; fig. 9 is an explanatory view of the image processing method in the second case described above; fig. 10 is an explanatory view of an image processing method in the third case described above.
Referring to fig. 8, which shows a flowchart of an image processing method provided in another exemplary embodiment of the present application, in an alternative embodiment based on fig. 7, step 2032 is replaced with step 21, and the first case is explained, the method includes the following working procedures:
and 21, screening out a target sky material corresponding to the time period from the candidate sky materials according to the time period to which the shooting time belongs.
After obtaining the shooting time of the image from the shooting attribute parameters, the terminal determines the time period to which the shooting time belongs, and looks up the target sky material in the terminal's preset correspondence table between time periods and candidate sky materials.
Illustratively, referring to table 1, each time segment corresponds to a candidate sky material, for example, if the image capturing time is in the time segment 04:00-06:00, and corresponds to the target sky material 1, the target sky material 1 is determined to be the target sky material; the image shooting time is in a time period of 06:00-10:00 and corresponds to the target sky material 2, and the target sky material 2 is determined to be the target sky material; the image shooting time is within the time period of 10:00-14:00 and corresponds to the target sky material 3, and the target sky material 3 is determined to be the target sky material; the image shooting time is within a time period of 14:00-16:00 and corresponds to the target sky material 4, and the target sky material 4 is determined to be the target sky material; and if the image shooting time is within the time period of 16:00-18:00 and corresponds to the target sky material 5, determining that the target sky material 5 is the target sky material.
TABLE 1
Time period | Candidate sky material
04:00-06:00 | Target sky material 1
06:00-10:00 | Target sky material 2
10:00-14:00 | Target sky material 3
14:00-16:00 | Target sky material 4
16:00-18:00 | Target sky material 5
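For illustration, Table 1 amounts to an interval lookup over the shooting time; a sketch in Python (the data types and function name are assumptions):

```python
from bisect import bisect_right
from datetime import time

PERIOD_STARTS = [time(4), time(6), time(10), time(14), time(16)]
MATERIALS = ["Target sky material 1", "Target sky material 2",
             "Target sky material 3", "Target sky material 4",
             "Target sky material 5"]

def material_for(shooting_time: time):
    if shooting_time < time(4) or shooting_time >= time(18):
        return None                      # outside 04:00-18:00: no table entry
    i = bisect_right(PERIOD_STARTS, shooting_time) - 1
    return MATERIALS[i]

print(material_for(time(12, 30)))        # -> Target sky material 3
```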
In summary, in this embodiment the terminal obtains the target sky material corresponding to the time period of the shooting time, so that the target sky material more closely matches the non-sky part of the image shot by the user.
Referring to fig. 9, which shows a flowchart of an image processing method provided in another exemplary embodiment of the present application, in an alternative embodiment based on fig. 7, step 2032 is replaced with step 22, and the second case is described, the method includes the following working procedures:
and step 22, screening out target sky materials corresponding to the geographical area from the candidate sky materials according to the geographical area to which the shooting place belongs.
After obtaining the shooting location of the image from the shooting attribute parameters, the terminal determines the geographical area to which the shooting location belongs, and looks up the target sky material in the terminal's preset correspondence table between geographical areas and candidate sky materials.
Optionally, the geographical area is an area divided according to humanity, and/or terrain, and/or climate.
Optionally, different shooting locations that correspond to the same geographical area correspond to the same target sky material.
Schematically, referring to Table 2, each location corresponds to a geographical area, and each geographical area corresponds to a candidate sky material. For example, if the shooting location is Shanghai, the corresponding geographical area is an urban area and the corresponding target sky material is target sky material 6; if the shooting location is the Hulunbuir grassland, the corresponding geographical area is a grassland area and the corresponding target sky material is target sky material 7; if the shooting location is the Qinghai-Tibet Plateau, the corresponding geographical area is a plateau area and the corresponding target sky material is target sky material 8; if the shooting location is Yan'an, the corresponding geographical area is an urban area and the corresponding target sky material is target sky material 6. Together with the Shanghai example, this shows that when the shooting locations differ but the geographical area is the same, the same target sky material is determined.
TABLE 2
Shooting location | Geographical area | Candidate sky material
Shanghai | Urban area | Target sky material 6
Hulunbuir grassland | Grassland area | Target sky material 7
Qinghai-Tibet Plateau | Plateau area | Target sky material 8
Yan'an | Urban area | Target sky material 6
Optionally, the terminal may determine the corresponding target sky material according to both the shooting time and the shooting location. Schematically, referring to Table 3, when the geographical areas of the shooting locations are the same and the time periods of the shooting times are the same, the corresponding target sky materials are the same. For example, with the shooting time in the period 06:00-08:00, when the shooting location is Nanchang the corresponding geographical area is an urban area and the corresponding target sky material is target sky material 9; when the shooting location is Wuhan the corresponding geographical area is also an urban area, so the corresponding target sky material is also target sky material 9.
Optionally, when the shooting locations belong to the same geographical area but the shooting times belong to different time periods, different target sky materials correspond. For example, with Wuhan as the shooting location (urban area), when the shooting time is in the period 06:00-08:00 the corresponding target sky material is target sky material 9; when the shooting time is in the period 08:00-10:00 the corresponding target sky material is target sky material 10.
Optionally, when the shooting locations belong to different geographical areas but the shooting times belong to the same time period, different target sky materials correspond. For example, with the shooting time in the period 08:00-10:00, when the shooting location is Wuhan (urban area) the corresponding target sky material is target sky material 10; when the shooting location is the Ili grassland (grassland area) the corresponding target sky material is target sky material 11.
Optionally, when the shooting locations belong to different geographical areas and the shooting times belong to different time periods, different target sky materials correspond. For example, if the shooting location is Nanchang (urban area) and the shooting time is within 06:00-08:00, the corresponding target sky material is target sky material 9; if the shooting location is the Ili grassland (grassland area) and the shooting time is within 08:00-10:00, the corresponding target sky material is target sky material 11.
TABLE 3
Shooting time | Shooting location | Geographical area | Candidate sky material
06:00-08:00 | Nanchang | Urban area | Target sky material 9
06:00-08:00 | Wuhan | Urban area | Target sky material 9
08:00-10:00 | Wuhan | Urban area | Target sky material 10
08:00-10:00 | Ili grassland | Grassland area | Target sky material 11
In summary, in this embodiment the terminal obtains the target sky material corresponding to the geographical area of the shooting location, so that the target sky material more closely matches the non-sky part of the image shot by the user.
Referring to fig. 10, which shows a flowchart of an image processing method provided in another exemplary embodiment of the present application, in an alternative embodiment based on fig. 7, step 2032 is replaced with step 23, and the third case is described, where the method includes the following working procedures:
step 23, determining a corresponding weather type according to the shooting time and the shooting place; and screening out a target sky material corresponding to the weather type from the candidate sky materials.
The shooting attribute parameters of the image further include the shooting date of the image. Optionally, the terminal determines the weather type at the time of shooting according to the shooting date, the shooting time, and the shooting location, and determines the corresponding target sky material according to the weather type.
Schematically, referring to Table 4, different weather types correspond to different target sky materials. For example, if the weather type is sunny, the corresponding target sky material is target sky material 12; if the weather type is cloudy, target sky material 13; if the weather type is overcast, target sky material 14; if the weather type is light rain, target sky material 15.
TABLE 4
Weather type | Candidate sky material
Sunny | Target sky material 12
Cloudy | Target sky material 13
Overcast | Target sky material 14
Light rain | Target sky material 15
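A sketch of how Table 4 could be consulted; query_weather is a hypothetical helper, since the disclosure does not specify how the weather type is obtained from the shooting date, time, and place:

```python
from datetime import datetime

WEATHER_TO_MATERIAL = {
    "sunny":      "Target sky material 12",
    "cloudy":     "Target sky material 13",
    "overcast":   "Target sky material 14",
    "light rain": "Target sky material 15",
}

def material_for_weather(shot_at: datetime, location: str):
    weather = query_weather(shot_at, location)  # hypothetical weather lookup
    return WEATHER_TO_MATERIAL.get(weather)
```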
Optionally, the terminal determines the corresponding target sky material according to the shooting location and the weather type. Schematically, referring to Table 5, if the geographical areas of the shooting locations are the same but the weather types differ, the corresponding target sky materials differ. For example, if the shooting location is Beidaihe, the corresponding geographical area is a coastal area; a sunny weather type corresponds to target sky material 16, while a cloudy weather type corresponds to target sky material 17.
Optionally, if the shooting locations belong to different geographical areas and the weather types are the same, the corresponding target sky materials differ. For example, with a sunny weather type, when the shooting location is Beidaihe (coastal area) the corresponding target sky material is target sky material 16; when the shooting location is the Tarim Basin (basin area) the corresponding target sky material is target sky material 18.
Optionally, if the shooting locations belong to different geographical areas and the weather types differ, the corresponding target sky materials differ. For example, when the shooting location is Beidaihe (coastal area) and the weather type is cloudy, the corresponding target sky material is target sky material 17; when the shooting location is the Tarim Basin (basin area) and the weather type is sunny, the corresponding target sky material is target sky material 18.
TABLE 5
Shooting location | Geographical area | Weather type | Candidate sky material
Beidaihe | Coastal area | Sunny | Target sky material 16
Beidaihe | Coastal area | Cloudy | Target sky material 17
Tarim Basin | Basin area | Sunny | Target sky material 18
Optionally, the terminal determines the corresponding target sky material according to the shooting time, the shooting location, and the weather type. Schematically, referring to Table 6, when the geographical areas of the shooting locations are the same, the time periods of the shooting times are the same, and the weather types differ, different target sky materials correspond. For example, if the shooting location is Shijiazhuang (urban area) and the shooting time is in the period 12:00-14:00, a cloudy weather type determines target sky material 19, while a sunny weather type determines target sky material 20.
Optionally, when the shooting locations belong to the same geographical area and the weather types are the same but the shooting times belong to different time periods, different target sky materials correspond. For example, if the shooting location is Beidaihe (coastal area) and the weather type is sunny, a shooting time in the period 12:00-14:00 determines target sky material 21, while a shooting time in the period 14:00-16:00 determines target sky material 23.
Optionally, when the shooting locations belong to the same geographical area but the shooting times belong to different time periods and the weather types differ, different target sky materials correspond. For example, if the shooting location is Suzhou (urban area), the shooting time is in the period 12:00-14:00, and the weather type is cloudy, target sky material 19 is determined; if the shooting location is Beijing (urban area), the shooting time is in the period 16:00-18:00, and the weather type is sunny, target sky material 24 is determined.
Optionally, when the time periods of the shooting times are the same and the weather types are the same but the geographical areas of the shooting locations differ, different target sky materials correspond. For example, with the shooting time in the period 12:00-14:00 and a sunny weather type, a shooting location of Shijiazhuang (urban area) determines target sky material 20, while a shooting location of Beidaihe (coastal area) determines target sky material 21.
Optionally, when the time periods of the shooting times are the same but the weather types differ and the geographical areas of the shooting locations differ, different target sky materials correspond. For example, when the shooting location is Suzhou (urban area) and the weather type is cloudy, target sky material 19 is determined; when the shooting location is Beidaihe (coastal area) and the weather type is light rain, target sky material 22 is determined.
Optionally, when the weather types are the same but the time periods of the shooting times differ and the geographical areas of the shooting locations differ, different target sky materials correspond. For example, with a sunny weather type, a shooting location of Beidaihe (coastal area) with a shooting time in the period 12:00-14:00 determines target sky material 21, while a shooting location of Beijing (urban area) with a shooting time in the period 16:00-18:00 determines target sky material 24.
Optionally, when the shooting locations belong to different geographical areas, the shooting times belong to different time periods, and the weather types differ, different target sky materials correspond. For example, a shooting location of Beidaihe (coastal area), a shooting time in the period 12:00-14:00, and a light-rain weather type determine target sky material 22; a shooting location of Beijing (urban area), a shooting time in the period 16:00-18:00, and a sunny weather type determine target sky material 24.
TABLE 6
Shooting location | Geographical area | Time period | Weather type | Candidate sky material
Shijiazhuang | Urban area | 12:00-14:00 | Cloudy | Target sky material 19
Shijiazhuang | Urban area | 12:00-14:00 | Sunny | Target sky material 20
Beidaihe | Coastal area | 12:00-14:00 | Sunny | Target sky material 21
Beidaihe | Coastal area | 12:00-14:00 | Light rain | Target sky material 22
Beidaihe | Coastal area | 14:00-16:00 | Sunny | Target sky material 23
Suzhou | Urban area | 12:00-14:00 | Cloudy | Target sky material 19
Beijing | Urban area | 16:00-18:00 | Sunny | Target sky material 24
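For illustration, Table 6 amounts to a dictionary lookup keyed by geographical area, time period, and weather type; a sketch in Python (the key encoding is an assumption):

```python
SKY_MATERIALS = {
    ("urban",   "12:00-14:00", "cloudy"):     "Target sky material 19",
    ("urban",   "12:00-14:00", "sunny"):      "Target sky material 20",
    ("coastal", "12:00-14:00", "sunny"):      "Target sky material 21",
    ("coastal", "12:00-14:00", "light rain"): "Target sky material 22",
    ("coastal", "14:00-16:00", "sunny"):      "Target sky material 23",
    ("urban",   "16:00-18:00", "sunny"):      "Target sky material 24",
}

def target_material(region: str, period: str, weather: str):
    return SKY_MATERIALS.get((region, period, weather))

print(target_material("coastal", "12:00-14:00", "sunny"))
# -> Target sky material 21
```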
In summary, in this embodiment the terminal obtains the target sky material corresponding to the weather type, so that the target sky material more closely matches the non-sky part of the image shot by the user.
Referring to fig. 11, which shows a flowchart of an image processing method provided by another exemplary embodiment of the present application; in an alternative embodiment based on fig. 7, the terminal uses EXIF information as the shooting attribute parameters, and step 202 is replaced with step 2021. The method includes the following steps:
in step 2021, EXIF information in the image is read as a shooting attribute parameter.
The terminal reads the EXIF information of the image as the shooting attribute parameters. After an image is shot, an EXIF-format file of the image is obtained; the header information of the EXIF-format file is the EXIF information. The terminal reads the EXIF information of the EXIF-format file corresponding to the image and uses it as the shooting attribute parameters, which include at least one of the aperture, shutter, white balance, sensitivity, focal length, date, time, and location at the time of image shooting.
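A minimal sketch of this step in Python using Pillow (9.4 or later for ExifTags.IFD); the file path is illustrative, and which tags a given photo actually carries depends on the camera:

```python
from PIL import Image, ExifTags

def read_shooting_attributes(path: str) -> dict:
    """Read EXIF header information as the shooting attribute parameters."""
    exif = Image.open(path).getexif()
    attrs = {ExifTags.TAGS.get(tag_id, tag_id): value
             for tag_id, value in exif.items()}
    # The shooting place lives in the nested GPS IFD.
    gps = exif.get_ifd(ExifTags.IFD.GPSInfo)
    attrs["GPSInfo"] = {ExifTags.GPSTAGS.get(t, t): v for t, v in gps.items()}
    return attrs

attrs = read_shooting_attributes("photo.jpg")   # path is illustrative
print(attrs.get("DateTime"))                    # shooting date/time
```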
In summary, in this embodiment, the terminal further obtains space-time parameters from the EXIF information and obtains the target sky material according to the space-time parameters, so that the target sky material more closely matches the non-sky portion of the image captured by the user.
Referring to fig. 12, which shows a flowchart of an image processing method provided by another exemplary embodiment of the present application. In an alternative embodiment based on fig. 7, after obtaining the target image the terminal further processes the image with a filter, so step 205 is added after step 204. The method includes the following working process:
In step 205, the target image is processed by using a filter.
Optionally, the filter includes at least one of an inner threshold filter, an inner filter, and an outer filter.
In order to enhance the color tone uniformity and/or texture uniformity between the target sky material and the other image parts in the target image, the terminal processes the target image with a filter.
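The patent does not specify the filter's internals; one plausible stand-in, shown purely as a sketch (OpenCV and NumPy assumed), is to nudge the pasted sky's color statistics in Lab space toward those of the non-sky region so the composite reads as one photograph:

```python
import cv2
import numpy as np

def harmonize(target_img: np.ndarray, sky_mask: np.ndarray) -> np.ndarray:
    """Shift the sky region's per-channel Lab mean/std partway toward the
    non-sky region's statistics. A rough, assumed stand-in for the patent's
    tone-uniformity filter, not its actual algorithm."""
    lab = cv2.cvtColor(target_img, cv2.COLOR_BGR2Lab).astype(np.float32)
    sky, fg = sky_mask > 0, sky_mask == 0
    for c in range(3):
        s_mean, s_std = lab[..., c][sky].mean(), lab[..., c][sky].std() + 1e-6
        f_mean, f_std = lab[..., c][fg].mean(), lab[..., c][fg].std() + 1e-6
        # Blend 30% toward the foreground statistics to keep the sky plausible.
        lab[..., c][sky] = ((lab[..., c][sky] - s_mean) / s_std
                            * (0.7 * s_std + 0.3 * f_std)
                            + (0.7 * s_mean + 0.3 * f_mean))
    return cv2.cvtColor(np.clip(lab, 0, 255).astype(np.uint8), cv2.COLOR_Lab2BGR)
```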
In summary, in this embodiment, after obtaining the target image the terminal further processes it with a filter, so that the color tone uniformity and/or texture uniformity between the target sky material and the other image portions of the target image is enhanced; for example, after filter processing the light direction is consistent throughout the target image, and/or the color tone of the sky area is uniform with that of the portions other than the sky area.
Referring to fig. 13, which shows a flowchart of an image processing method provided by another exemplary embodiment of the present application. In an alternative embodiment based on fig. 12, step 205 is replaced with step 2051 and step 2052. The method includes the following working process:
In step 2051, a filter corresponding to the target image is determined according to the shooting time and/or the weather type.
Optionally, the terminal determines a filter corresponding to the target image according to the shooting time.
Illustratively, referring to table 7, each time period corresponds to a group of filters, and a group of filters includes at least one filter. For example, if the image shooting time is within the time period 04:00-10:00, which corresponds to the group 1 filters, the filter corresponding to the target image is determined to be the group 1 filters; if the shooting time is within the time period 10:00-16:00, which corresponds to the group 2 filters, the filter corresponding to the target image is determined to be the group 2 filters; if the shooting time is within the time period 16:00-18:00, which corresponds to the group 3 filters, the filter corresponding to the target image is determined to be the group 3 filters.
TABLE 7
Time period    Filter
04:00-10:00    Group 1 filters
10:00-16:00    Group 2 filters
16:00-18:00    Group 3 filters
Optionally, the shooting attribute parameters further include a shooting location. The terminal looks up the weather type at the time of image shooting according to the shooting time and the shooting place of the image, and determines the filter corresponding to the target image according to the weather type.
Illustratively, referring to table 8, each weather type corresponds to a group of filters, and a group of filters includes at least one filter. For example, if the weather type at the time of image shooting is sunny, which corresponds to the group 4 filters, the filter corresponding to the target image is determined to be the group 4 filters; if the weather type at the time of image shooting is cloudy, which corresponds to the group 5 filters, the filter corresponding to the target image is determined to be the group 5 filters.
TABLE 8
Weather type    Filter
Sunny           Group 4 filters
Cloudy          Group 5 filters
In step 2052, the target image is processed by using the filter corresponding to the target image.
The terminal processes the target image with the filter determined for it, so that the color tone uniformity and/or texture uniformity between the target sky material and the other image portions of the target image is enhanced.
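Steps 2051-2052 amount to a small dispatch over tables 7 and 8 followed by applying the chosen group. The sketch below is an assumption-laden illustration: giving weather precedence over the time period is a guess, and the group names are just labels taken from the tables.

```python
FILTERS_BY_WEATHER = {"sunny": "group 4 filters", "cloudy": "group 5 filters"}
FILTERS_BY_PERIOD = [((4, 10), "group 1 filters"),
                     ((10, 16), "group 2 filters"),
                     ((16, 18), "group 3 filters")]

def choose_filter(hour: int | None = None, weather: str | None = None):
    """Pick a filter group from table 8 (weather type) or table 7 (time
    period). Preferring weather when both are known is an assumption,
    not something the patent states."""
    if weather in FILTERS_BY_WEATHER:
        return FILTERS_BY_WEATHER[weather]
    if hour is not None:
        for (start, end), group in FILTERS_BY_PERIOD:
            if start <= hour < end:
                return group
    return None

# e.g. choose_filter(hour=17) -> "group 3 filters";
#      choose_filter(hour=13, weather="cloudy") -> "group 5 filters"
```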
In summary, in this embodiment, the terminal determines the filter corresponding to the target image according to the shooting time and/or the weather type, selecting a filter that fits the color tone and/or texture of the shot image, so that the color tone uniformity and/or texture uniformity between the target sky material and the other image portions of the target image is enhanced; for example, after filter processing the light direction is consistent throughout the target image, and/or the color tone of the sky area is uniform with that of the portions other than the sky area.
In addition, in this embodiment the terminal determines the filter according to the shooting time and/or the weather type of the target image, so the result stays closer to the influence of factors such as the ambient color temperature and light intensity at the time of shooting, and the shooting scene is reproduced with higher fidelity.
Referring to fig. 14, which shows a block diagram of an image processing apparatus provided by an exemplary embodiment of the present application. The apparatus may be implemented as all or part of a terminal by software, hardware, or a combination of the two, and includes:
an identification module 301 configured to identify a sky region in an image through a neural network model;
a reading module 302 configured to read a shooting attribute parameter of an image;
the screening module 303 is configured to screen a target sky material from the candidate sky materials according to the shooting attribute parameters;
a replacing module 304, configured to replace the sky area in the image with a target sky material, resulting in a target image.
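As an informal illustration of how the four modules compose (none of these class, function, or parameter names come from the patent, and the segmenter is assumed to return a binary sky mask of the same height and width as the image):

```python
import numpy as np

class ImageProcessingApparatus:
    """Schematic composition of the identification, reading, screening and
    replacing modules; purely an illustrative sketch."""

    def __init__(self, identify, read_attrs, screen, candidates):
        self.identify = identify      # identification module: image -> sky mask
        self.read_attrs = read_attrs  # reading module: image path -> attributes
        self.screen = screen          # screening module: (candidates, attrs) -> material
        self.candidates = candidates  # candidate sky materials (HxWx3 arrays)

    def process(self, image: np.ndarray, path: str) -> np.ndarray:
        mask = self.identify(image)                # binary sky mask
        attrs = self.read_attrs(path)              # shooting attribute parameters
        material = self.screen(self.candidates, attrs)
        target = image.copy()
        target[mask > 0] = material[mask > 0]      # replace the sky pixels
        return target
```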
In summary, the image processing apparatus provided by the embodiment of the present application identifies the sky region in an image through a neural network model, screens out a target sky material from the candidate sky materials according to the read shooting attribute parameters of the image, and replaces the sky region in the image with the target sky material to obtain a target image. This solves the problem that in the related art at least 7 manual steps are needed to obtain a target image whose sky area is replaced with a target sky material; it achieves the purpose of automatically screening a suitable target sky material according to the shooting attribute parameters to replace the original sky area in the image, requires no retouching by the user afterwards, reduces the user's manual operation steps, and improves human-computer interaction efficiency. Even a user with no retouching skills can obtain a sky photograph comparable to one shot with a professional single-lens reflex camera.
Referring to fig. 15, which shows a block diagram of an image processing apparatus provided by another exemplary embodiment of the present application. The apparatus may be implemented as all or part of a terminal by software, hardware, or a combination of the two, and includes:
an identification module 301 configured to identify a sky region in an image through a neural network model;
a reading module 302 configured to read a shooting attribute parameter of an image;
the screening module 303 is configured to screen a target sky material from the candidate sky materials according to the shooting attribute parameters;
a replacing module 304, configured to replace the sky area in the image with a target sky material, resulting in a target image.
In some embodiments, the screening module 303 comprises:
an extraction submodule 41 configured to extract spatio-temporal parameters including a shooting time and/or a shooting place from the shooting attribute parameters;
and the screening submodule 42 is configured to screen out the target sky material from the candidate sky materials according to the space-time parameters.
In some embodiments, the spatiotemporal parameters include a capture time;
and the screening submodule 42 is configured to screen out a target sky material corresponding to the time period from the candidate sky materials according to the time period to which the shooting time belongs.
In some embodiments, the spatiotemporal parameters include a shooting location;
and the screening submodule 42 is configured to screen out a target sky material corresponding to the geographical area from the candidate sky materials according to the geographical area to which the shooting place belongs.
In some embodiments, the spatiotemporal parameters include a shooting time and/or a shooting location;
the screening module 303 comprises:
a first determination submodule 43 configured to determine a corresponding weather type according to the photographing time and the photographing place;
and the screening submodule 42 is configured to screen out a target sky material corresponding to the weather type from the candidate sky materials.
In some embodiments, the apparatus further comprises:
a processing module 305 configured to process the target image by using a filter, wherein the filter is used for enhancing the color tone uniformity and/or texture uniformity of the target sky material and other image parts in the target image.
In some embodiments, the processing module 305 comprises:
a second determining sub-module 44 configured to determine a filter corresponding to the target image according to the shooting time and the weather type;
and the processing sub-module 45 is configured to process the target image by using the filter corresponding to the target image.
In some embodiments, the reading module is configured to read the EXIF information of the image as the shooting attribute parameter.
In summary, the image processing apparatus provided by the embodiment of the present application identifies the sky region in an image through a neural network model, screens out a target sky material from the candidate sky materials according to the read shooting attribute parameters of the image, and replaces the sky region in the image with the target sky material to obtain a target image. This solves the problem that in the related art at least 7 manual steps are needed to obtain a target image whose sky area is replaced with a target sky material; it achieves the purpose of automatically screening a suitable target sky material according to the shooting attribute parameters to replace the original sky area in the image, requires no retouching by the user afterwards, reduces the user's manual operation steps, and improves human-computer interaction efficiency. Even a user with no retouching skills can obtain a sky photograph comparable to one shot with a professional single-lens reflex camera.
In this embodiment, the apparatus further obtains space-time parameters from the EXIF information and obtains the target sky material according to the space-time parameters, so that the target sky material more closely matches the non-sky portion of the image captured by the user.
In this embodiment, after obtaining the target image the apparatus further processes it with a filter, so that the color tone uniformity and/or texture uniformity between the target sky material and the other image portions of the target image is enhanced; for example, after filter processing the light direction is consistent throughout the target image, and/or the color tone of the sky area is uniform with that of the portions other than the sky area.
Fig. 16 is a block diagram of an image processing apparatus 400 according to an exemplary embodiment of the present application. For example, the apparatus 400 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 16, the apparatus 400 may include one or more of the following components: processing components 402, memory 404, power components 406, multimedia components 408, audio components 410, input/output (I/O) interfaces 412, sensor components 414, and communication components 416.
The processing component 402 generally controls overall operation of the apparatus 400, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 402 may include one or more processors 418 to execute instructions to perform all or a portion of the steps of the image processing method in the above-described method embodiments. Further, the processing component 402 can include one or more modules that facilitate interaction between the processing component 402 and other components. For example, the processing component 402 can include a multimedia module to facilitate interaction between the multimedia component 408 and the processing component 402.
The memory 404 is configured to store various types of data to support operations at the apparatus 400. Examples of such data include instructions for any application or method operating on the device 400, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 404 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
Power supply components 406 provide power to the various components of device 400. Power components 406 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for device 400.
The multimedia component 408 includes a screen that provides an output interface between the device 400 and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 408 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the apparatus 400 is in an operation mode, such as a photographing mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 410 is configured to output and/or input audio signals. For example, audio component 410 includes a Microphone (MIC) configured to receive external audio signals when apparatus 400 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 404 or transmitted via the communication component 416. In some embodiments, audio component 410 also includes a speaker for outputting audio signals.
The I/O interface 412 provides an interface between the processing component 402 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor component 414 includes one or more sensors for providing various aspects of status assessment for the apparatus 400. For example, the sensor assembly 414 may detect an open/closed state of the apparatus 400 and the relative positioning of components such as the display and keypad of the apparatus 400; the sensor assembly 414 may also detect a change in the position of the apparatus 400 or of a component of the apparatus 400, the presence or absence of user contact with the apparatus 400, the orientation or acceleration/deceleration of the apparatus 400, and a change in the temperature of the apparatus 400. The sensor assembly 414 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 414 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 414 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 416 is configured to facilitate wired or wireless communication between the apparatus 400 and other devices. The apparatus 400 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 416 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel.
In an exemplary embodiment, the apparatus 400 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the image processing methods in the above method embodiments.
In an exemplary embodiment, a non-transitory computer-readable storage medium comprising instructions, such as the memory 404 comprising instructions, executable by the processor 418 of the apparatus 400 to perform the image processing method of the above-described method embodiments is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
In an exemplary embodiment, a computer-readable storage medium is also provided. The computer-readable storage medium is a non-volatile computer-readable storage medium in which a computer program is stored; when executed by a processing component, the stored computer program implements the image processing method provided by the above embodiments of the present disclosure.
The disclosed embodiments also provide a computer program product having instructions stored therein, which when run on a computer, enable the computer to perform the image processing method provided by the disclosed embodiments.
The embodiment of the present disclosure also provides a chip, which includes a programmable logic circuit and/or a program instruction, and when the chip runs, the chip can execute the image processing method provided by the embodiment of the present disclosure.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (18)

1. An image processing method, characterized in that the method comprises:
identifying a sky region in the image through a neural network model;
reading shooting attribute parameters of the image;
screening out a target sky material from the candidate sky materials according to the shooting attribute parameters;
and replacing the sky area in the image with the target sky material to obtain a target image.
2. The method according to claim 1, wherein the screening out a target sky material from candidate sky materials according to the shooting attribute parameters comprises:
extracting space-time parameters from the shooting attribute parameters, wherein the space-time parameters comprise shooting time and/or shooting place;
and screening the target sky material from the candidate sky material according to the space-time parameters.
3. The method according to claim 2, wherein the space-time parameters include the shooting time;
the screening out the target sky material from the candidate sky material according to the space-time parameters comprises:
and screening out target sky materials corresponding to the time period from the candidate sky materials according to the time period to which the shooting time belongs.
4. The method according to claim 2, wherein the space-time parameters include the shooting place;
the screening out the target sky material from the candidate sky material according to the space-time parameters comprises:
and screening the target sky material corresponding to the geographical area from the candidate sky material according to the geographical area to which the shooting place belongs.
5. The method according to claim 2, wherein the space-time parameters comprise the shooting time and/or the shooting place;
the screening out the target sky material from the candidate sky material according to the space-time parameters comprises:
determining a corresponding weather type according to the shooting time and the shooting place;
and screening the target sky material corresponding to the weather type from the candidate sky material.
6. The method of any of claims 1 to 5, further comprising:
and processing the target image by adopting a filter, wherein the filter is used for enhancing the color tone uniformity and/or texture uniformity of the target sky material and other image parts in the target image.
7. The method of claim 6, wherein the processing the target image with the filter comprises:
determining a filter corresponding to the target image according to the shooting time and/or the weather type;
and processing the target image by adopting a filter corresponding to the target image.
8. The method according to any one of claims 1 to 5, wherein the reading of the shooting attribute parameters of the image comprises:
EXIF information of the image is read as the photographing attribute parameter.
9. An image processing apparatus, characterized in that the apparatus comprises:
an identification module configured to identify a sky region in an image through a neural network model;
a reading module configured to read a shooting attribute parameter of the image;
the screening module is configured to screen out a target sky material from the candidate sky materials according to the shooting attribute parameters;
a replacing module configured to replace the sky area in the image with the target sky material to obtain a target image.
10. The apparatus of claim 9, wherein the screening module comprises:
an extraction submodule configured to extract spatio-temporal parameters from the shooting attribute parameters, the spatio-temporal parameters including shooting time and/or shooting place;
a screening submodule configured to screen the target sky material from the candidate sky materials according to the space-time parameters.
11. The apparatus of claim 10, wherein the spatio-temporal parameters comprise the shooting time;
the screening submodule is configured to screen out a target sky material corresponding to the time period from the candidate sky material according to the time period to which the shooting time belongs.
12. The apparatus of claim 10, wherein the spatiotemporal parameters comprise the shooting location;
the screening submodule is configured to screen the target sky material corresponding to the geographical area from the candidate sky material according to the geographical area to which the shooting place belongs.
13. The apparatus according to claim 10, wherein the spatiotemporal parameters comprise the shooting time and/or the shooting location;
the screening module includes:
a first determining sub-module configured to determine a corresponding weather type according to the photographing time and the photographing place;
the screening submodule is configured to screen the target sky material corresponding to the weather type from the candidate sky material.
14. The apparatus of any of claims 9 to 13, further comprising:
a processing module configured to process the target image with a filter for enhancing color tone uniformity and/or texture uniformity of the target sky material and other image portions in the target image.
15. The apparatus of claim 14, wherein the processing module comprises:
the second determining submodule is configured to determine a filter corresponding to the target image according to the shooting time and the weather type;
and the processing sub-module is configured to process the target image by adopting a filter corresponding to the target image.
16. The apparatus according to any one of claims 9 to 13, wherein the reading module is configured to read EXIF information of the image as the shooting attribute parameter.
17. A terminal, characterized in that it comprises a processor and a memory in which at least one instruction, at least one program, set of codes or set of instructions is stored, which is loaded and executed by the processor to implement the image processing method according to any one of claims 1 to 8.
18. A computer readable storage medium having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by a processor to implement the image processing method according to any one of claims 1 to 8.
CN201811133926.2A 2018-09-27 2018-09-27 Image processing method, device, equipment and storage medium Pending CN110956063A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811133926.2A CN110956063A (en) 2018-09-27 2018-09-27 Image processing method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN110956063A true CN110956063A (en) 2020-04-03

Family

ID=69975236

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811133926.2A Pending CN110956063A (en) 2018-09-27 2018-09-27 Image processing method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110956063A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112562056A (en) * 2020-12-03 2021-03-26 广州博冠信息科技有限公司 Control method, device, medium and equipment for virtual light in virtual studio
CN112672209A (en) * 2020-12-14 2021-04-16 北京达佳互联信息技术有限公司 Video editing method and video editing device
CN114286005A (en) * 2021-12-29 2022-04-05 合众新能源汽车有限公司 Image display method and device for vehicle skylight
CN114840124A (en) * 2022-03-30 2022-08-02 阿里巴巴(中国)有限公司 Display control method, display control apparatus, electronic device, display control medium, and program product

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103927720A (en) * 2014-04-09 2014-07-16 厦门美图之家科技有限公司 Identification and optimization method for sky image
US20170294000A1 (en) * 2016-04-08 2017-10-12 Adobe Systems Incorporated Sky editing based on image composition
CN107529096A (en) * 2017-09-11 2017-12-29 广东欧珀移动通信有限公司 Image processing method and device
CN108121957A (en) * 2017-12-19 2018-06-05 北京麒麟合盛网络技术有限公司 The method for pushing and device of U.S. face material
CN108401112A (en) * 2018-04-23 2018-08-14 Oppo广东移动通信有限公司 Image processing method, device, terminal and storage medium

Similar Documents

Publication Publication Date Title
CN108764370B (en) Image processing method, image processing device, computer-readable storage medium and computer equipment
CN109379572B (en) Image conversion method, image conversion device, electronic equipment and storage medium
WO2022110638A1 (en) Human image restoration method and apparatus, electronic device, storage medium and program product
CN110956063A (en) Image processing method, device, equipment and storage medium
CN112258380A (en) Image processing method, device, equipment and storage medium
CN108810418A (en) Image processing method, device, mobile terminal and computer readable storage medium
US11070717B2 (en) Context-aware image filtering
US20220383508A1 (en) Image processing method and device, electronic device, and storage medium
US11551465B2 (en) Method and apparatus for detecting finger occlusion image, and storage medium
CN113411498B (en) Image shooting method, mobile terminal and storage medium
CN114096994A (en) Image alignment method and device, electronic equipment and storage medium
CN112634160A (en) Photographing method and device, terminal and storage medium
Yang et al. Personalized exposure control using adaptive metering and reinforcement learning
CN112085768A (en) Optical flow information prediction method, optical flow information prediction device, electronic device, and storage medium
CN112887610A (en) Shooting method, shooting device, electronic equipment and storage medium
CN114897916A (en) Image processing method and device, nonvolatile readable storage medium and electronic equipment
CN108259767B (en) Image processing method, image processing device, storage medium and electronic equipment
CN113038002B (en) Image processing method and device, electronic equipment and readable storage medium
CN112750081A (en) Image processing method, device and storage medium
CN112200817A (en) Sky region segmentation and special effect processing method, device and equipment based on image
WO2023071933A1 (en) Camera photographing parameter adjustment method and apparatus and electronic device
CN108495038B (en) Image processing method, image processing device, storage medium and electronic equipment
CN114255177B (en) Exposure control method, device, equipment and storage medium in imaging
CN111461950A (en) Image processing method and device
CN110956576B (en) Image processing method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination