US20210343070A1 - Method, apparatus and electronic device for processing image


Info

Publication number
US20210343070A1
Authority
US
United States
Prior art keywords
image
rendering style
rendering
style
gradient
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US17/378,518
Inventor
Lei Zhang
Wen ZHENG
Wenbo Zhang
Qiang Li
Hongmin Xu
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Dajia Internet Information Technology Co Ltd
Original Assignee
Beijing Dajia Internet Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Dajia Internet Information Technology Co Ltd filed Critical Beijing Dajia Internet Information Technology Co Ltd
Priority to US17/378,518 priority Critical patent/US20210343070A1/en
Publication of US20210343070A1 publication Critical patent/US20210343070A1/en
Assigned to Beijing Dajia Internet Information Technology Co., Ltd. reassignment Beijing Dajia Internet Information Technology Co., Ltd. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: LI, QIANG, XU, Hongmin, ZHANG, LEI, ZHANG, WENBO, ZHENG, Wen
Abandoned legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering
    • G06T 15/10 Geometric effects
    • G06T 15/20 Perspective computation
    • G06T 15/205 Image-based rendering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 13/00 Animation
    • G06T 13/80 2D [Two Dimensional] animation, e.g. using sprites
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/60 Image enhancement or restoration using machine learning, e.g. neural networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/90 Dynamic range modification of images or parts thereof
    • G06T 5/94 Dynamic range modification of images or parts thereof based on local image properties, e.g. for local contrast enhancement
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/90 Determination of colour characteristics
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20081 Training; Learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20084 Artificial neural networks [ANN]

Definitions

  • This application relates to the field of computer technologies, in particular to a method, an apparatus, and an electronic device for processing an image.
  • according to a first aspect, a method for processing an image includes: acquiring an image with a first rendering style; acquiring an image with a second rendering style based on the image with the first rendering style and a first preset processing model; generating at least one first intermediate gradient image based on the image with the first rendering style, the image with the second rendering style, and a second preset processing model; and generating a first gradient video based on the image with the first rendering style, the at least one first intermediate gradient image, and the image with the second rendering style, where the at least one first intermediate gradient image comprises an image in a gradient process from the image with the first rendering style to the image with the second rendering style.
  • according to a second aspect, an electronic device is provided, where the electronic device includes a processor configured to perform the method described in the first aspect, and the at least one first intermediate gradient image comprises an image in a gradient process from the image with the first rendering style to the image with the second rendering style.
  • a non-transitory computer-readable storage medium is provided, where in response to an instruction in the storage medium being executed by a processor of an electronic device, the electronic device is enabled to perform the method for processing the image described in the first aspect.
  • a computer program product is provided, where in response to an instruction in the computer program product being executed by a processor of an electronic device, the electronic device is enabled to perform the method for processing the image described in the first aspect.
  • FIG. 1 illustrates a flowchart of a method for processing an image according to this application.
  • FIG. 2 illustrates a block diagram of an apparatus for processing an image according to this application.
  • FIG. 3 illustrates a block diagram of an electronic device according to this application.
  • FIG. 4 illustrates a block diagram of an electronic device according to this application.
  • FIG. 1 illustrates a flowchart of a method for processing an image according to this application. As shown in FIG. 1 , the method is applied to an electronic device, and the method includes the following steps.
  • when a user is interested in scenery at a location, the user may want to watch a gradient video including the scenery, similar to time-lapse photography, for example, a gradient video of the scenery from day to night or from dawn to morning.
  • in this application, the user does not need to continuously shoot the scenery with a camera for several hours; instead, the user may take one image including the scenery by using an electronic device, namely, the image with the first rendering style.
  • Each image has its own rendering style, for example, a green rendering style, a blue rendering style, or a red rendering style.
  • the user may use a filter when capturing the image including the scenery by using the electronic device, so that the captured image is an image with a rendering style.
  • the user does not use a filter when capturing the image including the scenery by using the electronic device.
  • the captured image is close to reality, and a style of the captured image is a no-rendering style.
  • the no-rendering style is a special rendering style.
  • the image with the first rendering style may be taken on site by using the electronic device, acquired from a pre-stored image library, downloaded from a network, or acquired in another manner.
  • a specific manner of acquiring the image with the first rendering style is not limited in this application.
  • an image with a second rendering style is acquired based on the image with the first rendering style and a first preset processing model.
  • image content of the image with the second rendering style may be the same as image content of the image with the first rendering style.
  • This step may be implemented in the following three manners.
  • One manner includes the following.
  • the gradient video that the user wants to acquire is a video in which images gradually change from the first rendering style to another rendering style. Therefore, after acquiring the first rendering style, the electronic device also needs to acquire the other rendering style.
  • the user may specify the second rendering style in the electronic device. For example, after capturing the image with the first rendering style by using the electronic device, the user may input a request for acquiring a gradient video in the electronic device. After receiving the request, the electronic device may display a preset plurality of rendering styles on a screen of the electronic device for the user to select. After viewing the plurality of rendering styles on the screen of the electronic device, the user may select one rendering style, and the electronic device receives the rendering style selected by the user, and uses the rendering style as the second rendering style.
  • transforming an image with one rendering style into an image with another rendering style is essentially transforming the colors of pixels in the image into colors corresponding to the other rendering style. Therefore, to acquire the image with the second rendering style based on the image with the first rendering style, the relationship table of color transformation for transforming the first rendering style into the second rendering style needs to be determined.
  • a plurality of rendering styles are preset, for example, a green rendering style, a blue rendering style, and a red rendering style. For any two rendering styles, a relationship table of color transformation for transforming one rendering style into the other needs to be preset; the one rendering style, the other rendering style, and the preset relationship table then form corresponding entries and are stored in a correspondence among an original rendering style, a target rendering style, and a relationship table of color transformation, with the one rendering style as the original rendering style and the other rendering style as the target rendering style.
  • likewise, a relationship table of color transformation for transforming the other rendering style into the one rendering style needs to be preset and stored in the correspondence, with the other rendering style as the original rendering style and the one rendering style as the target rendering style.
  • the foregoing operations are performed for every two other preset rendering styles.
  • the relationship table of color transformation corresponding to the first rendering style and the second rendering style may be searched for in the correspondence among an original rendering style, a target rendering style, and a relationship table of color transformation by using the first rendering style as an original rendering style and the second rendering style as a target rendering style.
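The correspondence described above can be sketched as a dictionary keyed by (original style, target style) pairs. This is a minimal, hypothetical illustration: the style names and the toy tables below are assumptions, not values from the patent.

```python
# Hypothetical sketch: the correspondence among an original rendering style,
# a target rendering style, and a relationship table of color transformation,
# modeled as a dict keyed by (original, target) pairs. Each relationship
# table maps first color information -> second color information.
color_transform_tables = {
    ("green", "blue"): {(0, 128, 0): (0, 0, 128)},
    ("blue", "green"): {(0, 0, 128): (0, 128, 0)},
}

def find_color_transform_table(first_style, second_style):
    """Search the correspondence with the first rendering style as the
    original style and the second rendering style as the target style."""
    return color_transform_tables.get((first_style, second_style))
```

A lookup such as `find_color_transform_table("green", "blue")` returns the stored table, and a pair that was never preset returns `None`.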
  • Color information of a pixel may be identified by using values of the pixel in a red channel, a green channel, and a blue channel, or may be identified in another manner. This is not limited in this application.
  • the relationship table of color transformation includes two columns.
  • a first column stores each piece of first color information corresponding to the first rendering style
  • a second column stores each piece of second color information corresponding to the second rendering style
  • each row stores one piece of first color information corresponding to the first rendering style and one piece of second color information corresponding to the second rendering style.
  • first color information of the pixel may be searched for in the first column of the relationship table of color transformation; and then, second color information corresponding to the first color information of the pixel is searched for in the second column, that is, the second color information in a same row as the first color information of the pixel is searched for.
  • the foregoing operations are performed for each of the other pixels.
  • Pixels in the image with the first rendering style have respective locations in the image with the first rendering style.
  • a blank image of the same resolution as the image with the first rendering style may be generated; the locations of the pixels in the blank image are determined based on their locations in the image with the first rendering style, and each location in the blank image is filled with the second color information corresponding to the first color information of the pixel at that location, to acquire the image with the second rendering style.
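The per-pixel lookup and blank-image filling described in this manner can be sketched as follows. This is an illustrative assumption about the data layout: images are plain nested lists of RGB tuples, and the sample table is made up.

```python
# Hypothetical sketch of the first manner: each pixel's first color
# information is looked up in the relationship table, and a blank image of
# the same resolution is filled with the corresponding second color
# information at the same location.

def transform_image(image, table):
    height, width = len(image), len(image[0])
    # Generate a blank image of the same resolution as the input image.
    result = [[None] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            # Fill the location with the second color information that the
            # table associates with this pixel's first color information.
            result[y][x] = table[image[y][x]]
    return result

table = {(10, 20, 30): (200, 100, 50), (0, 0, 0): (5, 5, 5)}
src = [[(10, 20, 30), (0, 0, 0)]]
dst = transform_image(src, table)  # [[(200, 100, 50), (5, 5, 5)]]
```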
  • Another manner includes the following.
  • as in the foregoing manner, after acquiring the first rendering style, the electronic device also needs to acquire the second rendering style, to acquire the gradient video in which the images gradually change from the first rendering style to the second rendering style.
  • the user may specify the second rendering style in the electronic device, for example, by selecting one of a plurality of preset rendering styles that the electronic device displays on its screen after receiving the user's request for acquiring a gradient video.
  • transforming an image with one rendering style into an image with another rendering style is essentially transforming the colors of pixels in the image into colors corresponding to the other rendering style.
  • therefore, the neural network model for acquiring the image with the second rendering style needs to be determined.
  • the neural network model is used to process an input image, and output an image with the specified rendering style.
  • a plurality of rendering styles are preset, for example, a green rendering style, a blue rendering style, or a red rendering style.
  • for each preset rendering style, a neural network model for acquiring an image with the rendering style needs to be pre-trained.
  • a preset neural network may be trained by using annotation images with the rendering style until all parameters in the preset neural network converge, to acquire the neural network model for acquiring the image with the rendering style.
  • the rendering style and the trained neural network model for acquiring the image with the rendering style form corresponding entries, and are stored in a correspondence between a rendering style and a neural network model for acquiring an image with the rendering style. The foregoing operations are performed for each of the other rendering styles.
  • the neural network model corresponding to the second rendering style may be searched for in the correspondence between a rendering style and a neural network model for acquiring an image with the rendering style.
  • the image with the first rendering style is input to the acquired neural network model, to acquire the image with the second rendering style output by the acquired neural network model.
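The second manner can be sketched as a correspondence between a rendering style and a model callable. The "model" below is a stand-in lambda, an assumption for illustration only; a real implementation would store trained style-transfer networks.

```python
# Hypothetical sketch of the second manner: a correspondence between a
# rendering style and a pre-trained neural network model for acquiring an
# image with that style. The stand-in "model" just shifts the blue channel.

style_models = {
    "blue": lambda image: [[(r, g, min(255, b + 50)) for (r, g, b) in row]
                           for row in image],
}

def apply_second_style(image, second_style):
    # Search for the model corresponding to the second rendering style,
    # then input the first-style image to the acquired model.
    model = style_models[second_style]
    return model(image)

out = apply_second_style([[(1, 2, 3)]], "blue")  # [[(1, 2, 53)]]
```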
  • Still another manner includes the following.
  • as in the foregoing manners, after acquiring the first rendering style, the electronic device also needs to acquire the second rendering style, to acquire the gradient video in which the images gradually change from the first rendering style to the second rendering style.
  • the user may input a request for acquiring a gradient video in the electronic device.
  • the electronic device may display a preset plurality of rendering styles on a screen of the electronic device for the user to select.
  • the user may select one rendering style, and the electronic device receives the rendering style selected by the user, uses it as the second rendering style, and then selects the preset image with the second rendering style from the preset images with the plurality of rendering styles.
  • An image may be randomly generated as the reference image, for example, a plain white image or an all black image is generated.
  • a first image feature of the image with the first rendering style, a second image feature of the image with the second rendering style, and a third image feature of the reference image may be acquired.
  • a difference between the first image feature and the third image feature is acquired and is used as a first difference between the image content of the reference image and the image content of the image with the first rendering style.
  • a difference between the second image feature and the third image feature is acquired and is used as a second difference between the rendering style of the reference image and the second rendering style.
  • color information of pixels in the reference image is adjusted based on the first difference and the second difference according to a preset rule, to acquire the reference image on which the optimization iteration has been performed.
  • a fourth image feature of the reference image on which the optimization iteration has been performed is then acquired.
  • a difference between the first image feature and the fourth image feature is acquired and used as a third difference, between the image content of the iterated reference image and the image content of the image with the first rendering style.
  • a difference between the second image feature and the fourth image feature is acquired and used as a fourth difference, between the rendering style of the iterated reference image and the second rendering style.
  • when the fourth difference is less than a first preset threshold and the third difference is less than a second preset threshold, the iterated reference image is determined as the image with the second rendering style; otherwise, the optimization iteration continues to be performed on the iterated reference image based on the above steps, and a reference image is determined as the image with the second rendering style once the difference between its rendering style and the second rendering style is less than the first preset threshold and the difference between its image content and the image content of the image with the first rendering style is less than the second preset threshold.
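The iterative optimization described above can be sketched with a deliberately simplified "image feature". Real systems typically use deep features (as in Gatys-style optimization); here the feature is just the per-channel mean so the loop is self-contained, and the step size, weights, and iteration count are illustrative assumptions.

```python
# Simplified, hypothetical sketch of the third manner: a reference image is
# generated and repeatedly adjusted so that its feature moves toward both the
# content image's feature (first difference) and the style image's feature
# (second difference).

def feature(image):
    # Toy stand-in for an image feature: the per-channel mean color.
    pixels = [p for row in image for p in row]
    n = len(pixels)
    return tuple(sum(p[c] for p in pixels) / n for c in range(3))

def optimize_reference(content_img, style_img, steps=200, lr=0.1):
    # Start from a generated reference image (here: all black).
    ref = [[(0.0, 0.0, 0.0) for _ in row] for row in content_img]
    f_content, f_style = feature(content_img), feature(style_img)
    for _ in range(steps):
        f_ref = feature(ref)
        # The first and second differences drive the per-pixel adjustment.
        delta = tuple(0.5 * (f_content[c] - f_ref[c]) +
                      0.5 * (f_style[c] - f_ref[c]) for c in range(3))
        ref = [[tuple(p[c] + lr * delta[c] for c in range(3)) for p in row]
               for row in ref]
    return ref
```

With a one-pixel red content image and a one-pixel green style image, the loop converges toward an image whose feature splits the difference between the two targets.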
  • At least one first intermediate gradient image is generated based on the image with the first rendering style, the image with the second rendering style, and a second preset processing model; and the at least one first intermediate gradient image includes an image in a gradient process from the image with the first rendering style to the image with the second rendering style.
  • this step may be implemented by using the following process, including:
  • resolution of the image with the first rendering style is the same as resolution of the image with the second rendering style. Therefore, for a location of any pixel in the image with the first rendering style, there is a pixel in the location in the image with the second rendering style.
  • the first color information of the pixel in the location in the image with the first rendering style and the second color information of the pixel in the location in the image with the second rendering style may be acquired, and then the at least one piece of target color information of the pixel in the location may be determined based on the first color information, the second color information, the preset first rendering style coefficient, and the preset second rendering style coefficient.
  • the foregoing operations are performed for each of the other same locations in the image with the first rendering style and the image with the second rendering style.
  • a first product of the first color information and the preset first rendering style coefficient may be calculated, a second product of the second color information and the preset second rendering style coefficient may be calculated, and a first sum of the first product and the second product may be calculated. Then, a second sum of the preset first rendering style coefficient and the preset second rendering style coefficient is calculated. A ratio of the first sum to the second sum is then calculated and used as the target color information; that is, the target color information is the coefficient-weighted sum of the two pieces of color information divided by the sum of the two coefficients.
  • Color information of a pixel may be identified by using values of the pixel in a red channel, a green channel, and a blue channel. Certainly, color information of a pixel may be identified in another manner. This is not limited in this application.
  • a blank image of the same resolution as the image with the first rendering style (or as the image with the second rendering style) may be generated; the location of each pixel in the blank image is determined based on its location in the image with the first rendering style or the image with the second rendering style, and each location in the blank image is filled with the target color information of the corresponding pixel, to acquire the candidate image.
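The target-color computation and candidate-image generation can be sketched as below. The normalization by the coefficient sum is an assumption consistent with coefficients that sum to the preset value 1; the sample colors are illustrative.

```python
# Hypothetical sketch of computing a candidate (intermediate gradient) image
# as a weighted blend of same-location pixels from the first- and
# second-style images, normalized by the sum of the two coefficients.

def target_color(c1, c2, k1, k2):
    # (k1 * first color + k2 * second color) / (k1 + k2), per channel.
    return tuple((k1 * a + k2 * b) / (k1 + k2) for a, b in zip(c1, c2))

def candidate_image(img1, img2, k1, k2):
    # Fill a blank image of the same resolution with the target color
    # computed for each pair of same-location pixels.
    return [[target_color(p1, p2, k1, k2) for p1, p2 in zip(r1, r2)]
            for r1, r2 in zip(img1, img2)]

mid = candidate_image([[(0, 0, 0)]], [[(100, 200, 50)]], 0.5, 0.5)
# mid == [[(50.0, 100.0, 25.0)]]
```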
  • the at least one candidate image may be directly determined as at least one intermediate gradient image.
  • in some cases, local processing needs to be performed on the intermediate gradient image. For example, lights in a building are usually off during the daytime but on at night.
  • a processing manner is to identify the windows of the building and change their color information, to reflect whether the lights in the building are on or off.
  • in other cases, for example, when a blue rendering style is transformed into a red rendering style or a purple rendering style, only the color information of pixels is transformed, and there is no need to perform local processing on the image.
  • a local processing manner therefore needs to be acquired based on the first rendering style and the second rendering style, and the at least one candidate image is then processed according to the acquired local processing manner, to acquire the at least one intermediate gradient image.
  • for any two preset rendering styles, a local processing manner for performing local processing on an intermediate gradient image in the gradient process needs to be preset; the one rendering style, the other rendering style, and the preset local processing manner then form corresponding entries and are stored in a correspondence among an original rendering style, a target rendering style, and a local processing manner, with the one rendering style as the original rendering style and the other rendering style as the target rendering style.
  • likewise, a local processing manner for the reverse gradient process needs to be preset and stored in the correspondence, with the other rendering style as the original rendering style and the one rendering style as the target rendering style.
  • the foregoing operations are performed for every two other preset rendering styles.
  • the local processing manner corresponding to the first rendering style and the second rendering style may be searched for in the correspondence among an original rendering style, a target rendering style, and a local processing manner by using the first rendering style as an original rendering style and the second rendering style as a target rendering style.
  • to highlight a gradient effect in the gradient process from the image with the first rendering style to the image with the second rendering style, there are usually at least two first intermediate gradient images, and the actual quantity may be determined based on the rendering style difference between the two images. Details are not described in this application.
  • the preset first rendering style coefficient is a difference between a preset value and the preset second rendering style coefficient.
  • the preset value may be 1 or the like.
  • the preset second rendering style coefficient may be repeatedly increased by a particular step, the preset first rendering style coefficient is decreased accordingly each time, and steps 41) to 43) are performed again after each increase, until the preset second rendering style coefficient reaches the preset value.
  • At least two intermediate gradient images may thus be acquired; in the sequence of acquiring them, the rendering style of an intermediate gradient image acquired earlier is closer to the first rendering style, and that of one acquired later is closer to the second rendering style.
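The coefficient schedule above can be sketched as a short loop. The preset value of 1 and the step of 0.25 are illustrative assumptions.

```python
# Hypothetical sketch of the coefficient schedule: the second rendering style
# coefficient is increased by a fixed step, and the first coefficient is the
# preset value minus the second, so intermediates acquired earlier stay
# closer to the first rendering style.

def coefficient_schedule(preset_value=1.0, step=0.25):
    pairs = []
    k2 = step
    while k2 < preset_value:
        k1 = preset_value - k2   # first coefficient decreases as k2 grows
        pairs.append((k1, k2))
        k2 += step
    return pairs

coefficient_schedule()  # [(0.75, 0.25), (0.5, 0.5), (0.25, 0.75)]
```

Each (k1, k2) pair yields one candidate image, which is why a smaller step produces a smoother gradient with more intermediate frames.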
  • a first gradient video is generated based on the image with the first rendering style, the at least one first intermediate gradient image, and the image with the second rendering style.
  • the image with the first rendering style may be used as an image in a first frame
  • the image with the second rendering style may be used as an image in a last frame
  • the at least one first intermediate gradient image may be used as an image between the image with the first rendering style and the image with the second rendering style, to form the first gradient video.
  • a sequence of the at least two first intermediate gradient images in the first gradient video is the same as a sequence of acquiring the first intermediate gradient images.
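The frame ordering described above can be sketched directly. Here a "video" is just an ordered list of frames; actual video encoding is out of scope, and the frame labels are placeholders.

```python
# Hypothetical sketch of assembling the first gradient video: the first-style
# image is the first frame, the intermediates follow in the order they were
# acquired, and the second-style image is the last frame.

def make_gradient_video(first_image, intermediates, second_image):
    return [first_image] + list(intermediates) + [second_image]

frames = make_gradient_video("styleA", ["mid1", "mid2"], "styleB")
# frames == ["styleA", "mid1", "mid2", "styleB"]
```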
  • the image with the first rendering style is acquired; the image with the second rendering style is acquired based on the image with the first rendering style and the first preset processing model; the at least one first intermediate gradient image is generated based on the image with the first rendering style, the image with the second rendering style, and the second preset processing model, where the at least one first intermediate gradient image includes the image in the gradient process from the image with the first rendering style to the image with the second rendering style; and the first gradient video is generated based on the image with the first rendering style, the at least one first intermediate gradient image, and the image with the second rendering style.
  • when a user wants to acquire a gradient video, the user does not need to spend a long time continuously shooting it with a camera, and can acquire the gradient video in the manner in this application by capturing only one image, thereby improving the efficiency of acquiring a gradient video and improving user experience.
  • the user may want a plurality of continuous gradient effects for a better gradient experience, for example, a gradient process from the first rendering style to the second rendering style and then from the second rendering style to a third rendering style.
  • an image with the third rendering style may be acquired based on the image with the second rendering style and the first preset processing model, where image content of the image with the third rendering style may be the same as the image content of the image with the second rendering style.
  • at least one second intermediate gradient image is generated based on the image with the second rendering style, the image with the third rendering style, and the second preset processing model, where the at least one second intermediate gradient image includes an image in a gradient process from the image with the second rendering style to the image with the third rendering style.
  • a second gradient video is generated based on the image with the second rendering style, the at least one second intermediate gradient image, and the image with the third rendering style.
  • the first gradient video and the second gradient video are combined into a third gradient video.
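Combining the two gradient videos can be sketched as concatenation. Dropping the duplicated boundary frame (the second-style image ends the first video and begins the second) is an assumption for illustration, not something the patent specifies.

```python
# Hypothetical sketch of combining the first and second gradient videos into
# a third gradient video. If the second video starts with the same frame
# that ends the first video, that frame is dropped to avoid a stutter.

def combine_videos(first_video, second_video):
    if first_video and second_video and first_video[-1] == second_video[0]:
        return first_video + second_video[1:]
    return first_video + second_video

combine_videos(["a", "ab", "b"], ["b", "bc", "c"])
# ["a", "ab", "b", "bc", "c"]
```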
  • the user can experience the gradient process from the first rendering style to the second rendering style and the gradient process from the second rendering style to the third rendering style, so as to have better gradient experience, and further improve user experience.
  • FIG. 2 illustrates a block diagram of an apparatus for processing an image according to this application.
  • the apparatus includes: a first acquiring module 11 , configured to acquire an image with a first rendering style; a second acquiring module 12 , configured to acquire an image with a second rendering style based on the image with the first rendering style and a first preset processing model; a first generating module 13 , configured to generate at least one first intermediate gradient image based on the image with the first rendering style, the image with the second rendering style, and a second preset processing model; where the at least one first intermediate gradient image includes an image in a gradient process from the image with the first rendering style to the image with the second rendering style; and a second generating module 14 , configured to generate a first gradient video based on the image with the first rendering style, the at least one first intermediate gradient image, and the image with the second rendering style.
  • the second acquiring module 12 includes: a first acquiring unit, configured to acquire first color information of each pixel in the image with the first rendering style; a searching unit, configured to search for second color information corresponding to the first color information of each pixel in a relationship table of color transformation for transforming the first rendering style into the second rendering style; and a first generating unit, configured to generate the image with the second rendering style based on the second color information corresponding to the first color information of each pixel.
  • the first determining unit is specifically configured to: search for the relationship table of color transformation corresponding to the first rendering style and the second rendering style in a correspondence among an original rendering style, a target rendering style, and a relationship table of color transformation with the first rendering style as the original rendering style and the second rendering style as the target rendering style.
  • the first preset processing model is a neural network model for acquiring the image with the second rendering style
  • the second acquiring module 12 includes: a processing unit, configured to acquire the image with the second rendering style by processing the image with the first rendering style by using the neural network model.
  • the second determining unit is specifically configured to: search for the neural network model corresponding to the second rendering style in a correspondence between a rendering style and a neural network model for acquiring an image with the rendering style.
  • the second acquiring module 12 includes: a second acquiring unit, configured to acquire a preset image with the second rendering style; a second generating unit, configured to generate a reference image; and an iteration unit, configured to perform at least one round of optimal iteration on the reference image by using an iterative optimization algorithm, based on the image with the first rendering style and the preset image with the second rendering style; and determine a reference image on which the optimal iteration has been performed as the image with the second rendering style, when a difference between a rendering style of the reference image and the second rendering style is less than a first preset threshold and a difference between image content of the reference image and image content of the image with the first rendering style is less than a second preset threshold.
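The optimal iteration above can be sketched in heavily simplified form. In this sketch (an illustrative assumption, not the claimed implementation), an image is a flat list of grey values and the rendering style is reduced to a single mean-intensity statistic; a real implementation would compare richer style and content features. Each round nudges the generated reference image toward the content of the first-style image and toward the style statistic of the preset second-style image, stopping once both differences fall below the preset thresholds:

```python
def optimize_reference(content, style_mean, style_thresh, content_thresh,
                       lr=0.5, max_rounds=200):
    """content: grey values of the image with the first rendering style.
    style_mean: mean intensity standing in for the second rendering style.
    Iterates until the reference's style statistic is within style_thresh
    of the target and its content is within content_thresh of `content`."""
    ref = [128.0] * len(content)                 # generated reference image
    for _ in range(max_rounds):
        mean = sum(ref) / len(ref)
        style_diff = abs(mean - style_mean)
        content_diff = max(abs(r - c) for r, c in zip(ref, content))
        if style_diff < style_thresh and content_diff < content_thresh:
            break                                # both differences small enough
        # nudge every pixel toward the content and the style statistic
        ref = [r + lr * ((c - r) + (style_mean - mean))
               for r, c in zip(ref, content)]
    return ref

ref = optimize_reference([100.0, 120.0], style_mean=112.0,
                         style_thresh=5.0, content_thresh=8.0)
```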
  • the first generating module 13 includes: a third acquiring unit, configured to acquire first color information of a pixel in the location in the image with the first rendering style and second color information of a pixel in the location in the image with the second rendering style, for each same location in the image with the first rendering style and the image with the second rendering style; a third determining unit, configured to determine at least one piece of target color information of a pixel in the location based on the first color information, the second color information, a preset first rendering style coefficient, and a preset second rendering style coefficient; a third generating unit, configured to generate at least one candidate image based on the at least one piece of target color information of a pixel in each location; and a fourth determining unit, configured to determine the at least one first intermediate gradient image based on the at least one candidate image.
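The third determining unit's computation can be illustrated with a simple per-pixel linear blend, assuming (one plausible choice; the application does not fix the exact formula) that the first and second rendering style coefficients sum to one and that the second-style coefficient grows from candidate image to candidate image:

```python
def blend_pixel(c1, c2, w1, w2):
    """Target color information: weighted blend of the first color
    information c1 and the second color information c2 (RGB tuples)."""
    return tuple(round(w1 * a + w2 * b) for a, b in zip(c1, c2))

def candidate_images(img1, img2, steps):
    """img1/img2: same-resolution 2-D lists of RGB tuples with the first
    and the second rendering styles. Returns `steps` candidate images in
    which the second rendering style coefficient grows frame by frame."""
    frames = []
    for k in range(1, steps + 1):
        w2 = k / (steps + 1)   # preset second rendering style coefficient
        w1 = 1.0 - w2          # preset first rendering style coefficient
        frames.append([[blend_pixel(p, q, w1, w2) for p, q in zip(r1, r2)]
                       for r1, r2 in zip(img1, img2)])
    return frames

day = [[(200, 200, 200)]]      # 1x1 image with the first rendering style
night = [[(0, 0, 0)]]          # 1x1 image with the second rendering style
frames = candidate_images(day, night, 3)
```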
  • the fourth determining unit includes: a determining subunit, configured to determine the at least one candidate image as the at least one first intermediate gradient image; or an acquiring subunit, configured to acquire a local processing manner based on the first rendering style and the second rendering style, and a processing subunit, configured to acquire the at least one first intermediate gradient image by processing the at least one candidate image according to the local processing manner.
  • the acquiring subunit is specifically configured to search for the local processing manner corresponding to the first rendering style and the second rendering style in a correspondence among an original rendering style, a target rendering style, and a local processing manner, with the first rendering style as the original rendering style and the second rendering style as the target rendering style.
  • the apparatus further includes: a fourth acquiring module, configured to acquire an image with a third rendering style based on the image with the second rendering style and the first preset processing model; a fourth generating module, configured to generate at least one second intermediate gradient image based on the image with the second rendering style, the image with the third rendering style, and the second preset processing model, where the at least one second intermediate gradient image includes an image in a gradient process from the image with the second rendering style to the image with the third rendering style; a fifth generating module, configured to generate a second gradient video based on the image with the second rendering style, the at least one second intermediate gradient image, and the image with the third rendering style; and a combination module, configured to combine the first gradient video and the second gradient video into a third gradient video.
  • the image with the first rendering style is acquired; the image with the second rendering style is acquired based on the image with the first rendering style and the first preset processing model; the at least one first intermediate gradient image is generated based on the image with the first rendering style, the image with the second rendering style, and the second preset processing model; where the at least one first intermediate gradient image includes the image in the gradient process from the image with the first rendering style to the image with the second rendering style; and the first gradient video is generated based on the image with the first rendering style, the at least one first intermediate gradient image, and the image with the second rendering style.
  • when a user wants to acquire a gradient video, the user does not need to spend a long time continuously shooting the gradient video with a camera, and can acquire the gradient video in the manner of this application by capturing only one image, thereby improving the efficiency of acquiring a gradient video and improving user experience.
  • FIG. 3 illustrates a block diagram of an electronic device 800 according to this application.
  • the electronic device 800 may be a mobile phone, a computer, a digital broadcasting terminal, a message transceiver device, a game console, a tablet device, a medical device, fitness equipment, a personal digital assistant, or the like.
  • the electronic device 800 may include one or more of the following components: a processing component 802 , a memory 804 , a power supply component 806 , a multimedia component 808 , an audio component 810 , an input/output (I/O) interface 812 , a sensor component 814 , and a communication component 816 .
  • the processing component 802 usually controls an overall operation of the electronic device 800 , for example, operations associated with displaying, calling, data communication, camera operations, and recording operations.
  • the processing component 802 may include one or more processors 820 configured to execute instructions, to complete all or some steps of the foregoing method.
  • the processing component 802 may include one or more modules, facilitating interaction between the processing component 802 and the other components.
  • the processing component 802 may include a multimedia module, facilitating interaction between the multimedia component 808 and the processing component 802 .
  • the memory 804 is configured to store various types of data, to support operations of the device 800 .
  • Examples of the data include instructions of any application or method operated on the electronic device 800, contact data, address book data, messages, images, videos, and the like.
  • the memory 804 may be implemented by any type of volatile or non-volatile storage devices or a combination thereof, for example, a static random access memory (SRAM), an electrically erasable programmable read-only memory (EEPROM), an erasable programmable read-only memory (EPROM), a programmable read-only memory (PROM), a read-only memory (ROM), a magnetic memory, a flash memory, a magnetic disk, or an optical disc.
  • the power supply component 806 supplies power to various components of the electronic device 800 .
  • the power supply component 806 may include a power supply management system, one or more power supplies, and other components associated with generating, managing, and supplying power for the electronic device 800.
  • the multimedia component 808 includes a screen providing an output interface between the electronic device 800 and a user.
  • the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes the touch panel, the screen may be implemented as a touch screen, to receive an input signal from the user.
  • the touch panel includes one or more touch sensors, to sense touching, sliding, and a gesture on the touch panel. The touch sensor not only may sense a boundary of a touch or sliding operation, but also may detect duration and pressure related to the touch or sliding operation.
  • the multimedia component 808 includes a front-facing camera and/or rear-facing camera.
  • the front-facing camera and/or the rear-facing camera may receive external multimedia data.
  • Each of the front-facing camera and the rear-facing camera may be a fixed optical lens system or have focal length and optical zooming capabilities.
  • the audio component 810 is configured to output and/or input an audio signal.
  • the audio component 810 includes a microphone (MIC).
  • the microphone is configured to receive an external audio signal.
  • the received audio signal may be further stored in the memory 804 or sent by using the communication component 816 .
  • the audio component 810 further includes a loudspeaker, configured to output the audio signal.
  • the I/O interface 812 provides an interface between the processing component 802 and a peripheral interface module, and the peripheral interface module may be a keyboard, a click wheel, buttons, or the like.
  • the buttons include, but are not limited to, a home page button, a volume button, a start button, and a lock button.
  • the sensor component 814 includes one or more sensors, and is configured to provide status estimation in various aspects for the electronic device 800 .
  • the sensor component 814 may detect an on/off state of the device 800 and relative positioning of a component.
  • for example, the components are the display and the keypad of the electronic device 800.
  • the sensor component 814 may further detect a location change of the electronic device 800 or a component of the electronic device 800, detect whether there is contact between the user and the electronic device 800, and detect an orientation or acceleration/deceleration of the electronic device 800 and a temperature change of the electronic device 800.
  • the sensor component 814 may include a proximity sensor, configured to detect existence of a nearby object when there is no physical contact.
  • the sensor component 814 may further include an optical sensor, such as a CMOS or CCD image sensor, used in an imaging application.
  • the sensor component 814 may further include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
  • the communication component 816 is configured to facilitate wired or wireless communication between the electronic device 800 and other devices.
  • the electronic device 800 may access a wireless network based on a communication standard, such as Wi-Fi, an operator network (such as 2G, 3G, 4G, or 5G), or a combination thereof.
  • the communication component 816 receives, through a broadcast channel, a broadcast signal or broadcast-related information from an external broadcasting management system.
  • the communication component 816 further includes a near field communication (NFC) module, to facilitate short-range communication.
  • the NFC module may be implemented based on a radio frequency identification (RFID) technology, an infrared data association (IrDA) technology, an ultra-wideband (UWB) technology, a Bluetooth (BT) technology, and other technologies.
  • the electronic device 800 may be implemented by using one or more application-specific integrated circuits (ASIC), one or more digital signal processors (DSP), one or more digital signal processing devices (DSPD), one or more programmable logic devices (PLD), one or more field programmable gate arrays (FPGA), one or more controllers, one or more microcontrollers, one or more microprocessors, or one or more other electronic elements, to perform the foregoing method.
  • a non-transitory computer readable storage medium including an instruction is further provided, such as the memory 804 including the instruction.
  • the instruction may be executed by the processor 820 of the electronic device 800 to complete the foregoing method.
  • the non-transitory computer readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, or an optical data storage device.
  • a computer program product is further provided.
  • the computer program product may be stored in the memory 804 , and when an instruction in the computer program product is executed by the processor 820 of the electronic device 800 , the electronic device 800 is enabled to perform the image processing method.
  • FIG. 4 illustrates a block diagram of an electronic device 1900 according to this application.
  • the electronic device 1900 may be provided as a server.
  • the electronic device 1900 includes a processing component 1922 that further includes one or more processors, and a memory resource represented by a memory 1932, configured to store an instruction that can be executed by the processing component 1922, such as an application.
  • the application stored in the memory 1932 may include one or more modules, where each module corresponds to one group of instructions.
  • the processing component 1922 is configured to execute the instruction, to perform the foregoing method.
  • the electronic device 1900 may further include a power supply component 1926 , configured to execute power supply management for the electronic device 1900 ; a wired or wireless network interface 1950 , configured to connect the electronic device 1900 to a network; and an input/output (I/O) interface 1958 .
  • the electronic device 1900 may be operated based on an operating system stored in the memory 1932, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or a similar operating system.


Abstract

The application discloses a method, an apparatus and an electronic device for processing an image. The method includes: acquiring an image with a first rendering style and an image with a second rendering style; generating at least one first intermediate gradient image based on the image with the first rendering style and the image with the second rendering style; wherein the at least one first intermediate gradient image comprises an image in a gradient process from the image with the first rendering style to the image with the second rendering style; and generating a first gradient video based on the image with the first rendering style, the at least one first intermediate gradient image, and the image with the second rendering style.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of U.S. patent application Ser. No. 16/972,475, filed on Dec. 4, 2020, which is a continuation of International Application No. PCT/CN2019/098011, filed on Jul. 26, 2019, which is based on and claims priority under 35 U.S.C. 119 to Chinese Patent Application No. 201811142166.1, filed with the China National Intellectual Property Administration on Sep. 28, 2018 and entitled “METHOD, APPARATUS, ELECTRONIC DEVICE, AND STORAGE MEDIUM FOR PROCESSING IMAGE”, which is incorporated herein by reference in its entirety.
  • FIELD
  • This application relates to the field of computer technologies, in particular to a method, an apparatus, and an electronic device for processing an image.
  • BACKGROUND
  • With the development of technologies, watching videos has become people's main entertainment manner. To improve experience of watching videos, more and more users like to watch gradient videos shot through time-lapse photography, for example, a gradient video from day to night, or a gradient video from dawn to morning.
  • However, in a related technology, if a user wants to acquire a gradient video, the user needs to spend a long time continuously shooting a video of a location with a camera, leading to low efficiency in acquiring a gradient video.
  • SUMMARY
  • According to a first aspect of this application, a method for processing an image is provided, where the method includes:
  • acquiring an image with a first rendering style and an image with a second rendering style;
  • generating at least one first intermediate gradient image based on the image with the first rendering style and the image with the second rendering style; wherein the at least one first intermediate gradient image comprises an image in a gradient process from the image with the first rendering style to the image with the second rendering style; and
  • generating a first gradient video based on the image with the first rendering style, the at least one first intermediate gradient image, and the image with the second rendering style.
  • According to a second aspect of this application, an electronic device is provided, where the electronic device includes:
  • a processor; and a memory configured to store an instruction executed by the processor; where in response to the instruction being executed, the processor is configured to:
  • acquire an image with a first rendering style and an image with a second rendering style;
  • generate at least one first intermediate gradient image based on the image with the first rendering style and the image with the second rendering style; wherein the at least one first intermediate gradient image comprises an image in a gradient process from the image with the first rendering style to the image with the second rendering style; and
  • generate a first gradient video based on the image with the first rendering style, the at least one first intermediate gradient image, and the image with the second rendering style.
  • According to a third aspect of this application, a non-transitory computer readable storage medium is provided, where in response to an instruction in the storage medium being executed by a processor of an electronic device, the electronic device is enabled to perform the method for processing the image described in the first aspect.
  • According to a fourth aspect of this application, a computer program product is provided, where in response to an instruction in the computer program product being executed by a processor of an electronic device, the electronic device is enabled to perform the method for processing the image described in the first aspect.
  • It should be understood that the foregoing general descriptions and the following detailed descriptions are merely used as an example and used for explanation, and cannot limit this application.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The accompanying drawings herein are incorporated into this specification and form one part of this specification, show embodiments conforming to this application, and are used, together with this specification, to explain the principle of this application.
  • FIG. 1 illustrates a flowchart of a method for processing an image according to this application.
  • FIG. 2 illustrates a block diagram of an apparatus for processing an image according to this application.
  • FIG. 3 illustrates a block diagram of an electronic device according to this application.
  • FIG. 4 illustrates a block diagram of an electronic device according to this application.
  • DETAILED DESCRIPTION OF THE EMBODIMENTS
  • The example embodiments are described in detail herein, and the examples are represented in the accompanying drawings. When the accompanying drawings are described below, unless otherwise specified, same numbers in different accompanying drawings represent same or similar elements. The implementations described in the following example embodiments do not represent all implementations consistent with this application. On the contrary, they are merely examples of apparatuses and methods that are described in detail in the claims and consistent with some aspects of this application.
  • FIG. 1 illustrates a flowchart of a method for processing an image according to this application. As shown in FIG. 1, the method is applied to an electronic device, and the method includes the following steps.
  • In S101, an image with a first rendering style is acquired.
  • In this application, when a user is interested in scenery in a location, the user may want to watch a gradient video including the scenery and similar to time-lapse photography, for example, a gradient video of the scenery from day to night or a gradient video of the scenery from dawn to morning. In this case, the user does not need to continuously shoot a several-hour gradient video of the scenery by using a camera, but may capture one image including the scenery by using an electronic device, namely, the image with the first rendering style.
  • Each image has its own rendering style, for example, a green rendering style, a blue rendering style, or a red rendering style. The user may use a filter when capturing the image including the scenery by using the electronic device, so that the captured image is an image with a rendering style. Alternatively, the user does not use a filter when capturing the image including the scenery by using the electronic device. In this case, the captured image is close to reality, and a style of the captured image is a no-rendering style. In this application, the no-rendering style is a special rendering style.
  • In some embodiments, besides being taken on site by using the electronic device, the image with the first rendering style may also be acquired from a pre-stored image library, downloaded from a network, or acquired in another manner. A specific manner of acquiring the image with the first rendering style is not limited in this application.
  • In S102, an image with a second rendering style is acquired based on the image with the first rendering style and a first preset processing model.
  • In this application, image content of the image with the second rendering style may be the same as image content of the image with the first rendering style.
  • This step may be implemented in the following three manners. One manner includes the following.
  • 11). Receiving the specified second rendering style.
  • In this application, the gradient video that the user wants to acquire is a gradient video in which images gradually change from the first rendering style to another rendering style. Therefore, after acquiring the first rendering style, the electronic device also needs to acquire another rendering style, to acquire the gradient video in which the images gradually change from the first rendering style to that rendering style.
  • The user may specify the second rendering style in the electronic device. For example, after capturing the image with the first rendering style by using the electronic device, the user may input a request for acquiring a gradient video into the electronic device. After receiving the request, the electronic device may display a plurality of preset rendering styles on a screen of the electronic device for the user to select from. After viewing the plurality of rendering styles on the screen of the electronic device, the user may select one rendering style, and the electronic device receives the rendering style selected by the user and uses it as the second rendering style.
  • 12). Determining a relationship table of color transformation for transforming the first rendering style into the second rendering style.
  • In this application, different rendering styles correspond to different pieces of color information, and transforming an image with one rendering style into an image with another rendering style is essentially transforming the colors of the pixels in the image into the colors corresponding to the other rendering style. Therefore, to acquire the image with the second rendering style based on the image with the first rendering style, the relationship table of color transformation for transforming the first rendering style into the second rendering style needs to be determined.
  • In this application, a plurality of rendering styles are preset, for example, a green rendering style, a blue rendering style, or a red rendering style. Therefore, for any two rendering styles, a relationship table of color transformation for transforming one rendering style into the other needs to be preset; then, the one rendering style, the other rendering style, and the preset relationship table of color transformation form corresponding entries and are stored in a correspondence among an original rendering style, a target rendering style, and a relationship table of color transformation, with the one rendering style as the original rendering style and the other rendering style as the target rendering style. In addition, a relationship table of color transformation for transforming the other rendering style into the one rendering style needs to be preset; then, the other rendering style, the one rendering style, and that preset relationship table of color transformation form corresponding entries and are stored in the same correspondence, with the other rendering style as the original rendering style and the one rendering style as the target rendering style. The foregoing operations are performed for every other pair of preset rendering styles.
  • In this way, in this step, the relationship table of color transformation corresponding to the first rendering style and the second rendering style may be searched for in the correspondence among an original rendering style, a target rendering style, and a relationship table of color transformation by using the first rendering style as an original rendering style and the second rendering style as a target rendering style.
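The search described above amounts to a lookup keyed by the (original, target) pair. A minimal sketch, assuming hypothetical style names and tiny one-entry tables (a real relationship table of color transformation would cover the full color space):

```python
# Hypothetical correspondence among an original rendering style, a target
# rendering style, and a relationship table of color transformation.
correspondence = {
    ("day", "night"): {(250, 250, 250): (20, 20, 60)},
    ("night", "day"): {(20, 20, 60): (250, 250, 250)},
}

def find_color_table(first_style, second_style):
    """Search with the first rendering style as the original rendering
    style and the second rendering style as the target rendering style."""
    return correspondence[(first_style, second_style)]

table = find_color_table("day", "night")
```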
  • 13). Acquiring first color information of each pixel in the image with the first rendering style.
  • Color information of a pixel may be identified by using values of the pixel in a red channel, a green channel, and a blue channel, or may be identified in another manner. This is not limited in this application.
  • 14). Searching for second color information corresponding to the first color information of each pixel in the relationship table of color transformation.
  • The relationship table of color transformation includes two columns. A first column stores each piece of first color information corresponding to the first rendering style, a second column stores each piece of second color information corresponding to the second rendering style, and each row stores one piece of first color information corresponding to the first rendering style and one piece of second color information corresponding to the second rendering style.
  • For any pixel, first color information of the pixel may be searched for in the first column of the relationship table of color transformation; and then, second color information corresponding to the first color information of the pixel is searched for in the second column, that is, the second color information in a same row as the first color information of the pixel is searched for. The foregoing operations are performed for each of the other pixels.
  • 15). Generating the image with the second rendering style based on the second color information corresponding to the first color information of each pixel.
  • Pixels in the image with the first rendering style have respective locations in the image with the first rendering style. After the second color information corresponding to the first color information of each pixel is acquired, a blank image of same resolution as the image with the first rendering style may be generated, locations of the pixels in the blank image are determined based on the locations of the pixels in the image with the first rendering style, and the respective locations of the pixels in the blank image are filled with the second color information corresponding to the first color information of the pixels, to acquire the image with the second rendering style.
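Steps 13) to 15) amount to a per-pixel table lookup. A minimal sketch, assuming a hypothetical table that covers only two colors (a real relationship table of color transformation would cover every first-style color) and images represented as 2-D lists of RGB tuples:

```python
# Hypothetical relationship table of color transformation: first color
# information (first rendering style) -> second color information
# (second rendering style).
color_table = {
    (135, 206, 235): (25, 25, 112),   # daytime sky  -> night sky
    (255, 255, 255): (105, 105, 105), # white cloud  -> dim grey cloud
}

def transform_image(image, table):
    """For each pixel, search the table for the second color information
    corresponding to its first color information, and fill a blank image
    of the same resolution at the same location (unknown colors kept)."""
    return [[table.get(pixel, pixel) for pixel in row] for row in image]

day = [[(135, 206, 235), (255, 255, 255)]]  # image with the first style
night = transform_image(day, color_table)   # image with the second style
```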
  • Another manner includes the following.
  • 21). Receiving the specified second rendering style.
  • In this application, the gradient video that the user wants to acquire is a gradient video in which images gradually change from the first rendering style to another rendering style. Therefore, after acquiring the first rendering style, the electronic device also needs to acquire another rendering style, to acquire the gradient video in which the images gradually change from the first rendering style to that rendering style.
  • The user may specify the second rendering style in the electronic device. For example, after capturing the image with the first rendering style by using the electronic device, the user may input a request for acquiring a gradient video into the electronic device. After receiving the request, the electronic device may display a plurality of preset rendering styles on a screen of the electronic device for the user to select from. After viewing the plurality of rendering styles on the screen of the electronic device, the user may select one rendering style, and the electronic device receives the rendering style selected by the user and uses it as the second rendering style.
  • 22). Determining a neural network model for acquiring the image with the second rendering style, namely, the above first preset processing model.
  • In this application, different rendering styles correspond to different pieces of color information, and transforming an image with a rendering style into an image with another rendering style is essentially transforming colors of pixels in the image with a rendering style into colors corresponding to another rendering style.
  • Therefore, to acquire the image with the second rendering style based on the image with the first rendering style, the neural network model for acquiring the image with the second rendering style needs to be determined. The neural network model is used to process an input image and output an image with the specified rendering style.
  • In this application, a plurality of rendering styles are preset, for example, a green rendering style, a blue rendering style, or a red rendering style.
  • Therefore, for any rendering style, a neural network model for acquiring an image with the rendering style needs to be pre-trained. For example, a preset neural network may be trained by using annotation images with the rendering style until all parameters in the preset neural network converge, to acquire the neural network model for acquiring the image with the rendering style. Then, the rendering style and the trained neural network model form corresponding entries, and are stored in a correspondence between a rendering style and a neural network model for acquiring an image with the rendering style. The foregoing operations are performed for each of the other rendering styles.
  • In this way, in this step, the neural network model corresponding to the second rendering style may be searched for in the correspondence between a rendering style and a neural network model for acquiring an image with the rendering style.
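The correspondence described above is essentially a lookup table keyed by rendering style. A minimal sketch, in which the registry name and the string stand-ins for trained models are assumptions of the sketch:

```python
# Sketch of step 22): store one pre-trained model per rendering style,
# then look up the model for the specified second rendering style.

MODEL_REGISTRY = {}

def register_model(style, model):
    """Form a (style, model) entry in the correspondence."""
    MODEL_REGISTRY[style] = model

def model_for_style(style):
    """Search for the neural network model corresponding to `style`."""
    if style not in MODEL_REGISTRY:
        raise KeyError(f"no model trained for style {style!r}")
    return MODEL_REGISTRY[style]

register_model("night", "night-style-model")  # stand-in for a trained network
model = model_for_style("night")
```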
  • 23). Processing the image with the first rendering style by using the neural network model, to acquire the image with the second rendering style.
  • For example, the image with the first rendering style is input to the acquired neural network model, to acquire the image with the second rendering style output by the acquired neural network model.
  • Still another manner includes the following.
  • 31). Acquiring a preset image with the second rendering style.
  • In this application, the gradient video that the user wants to acquire is one in which images gradually change from the first rendering style to another rendering style. Therefore, after acquiring the first rendering style, the electronic device also needs to acquire the other rendering style, to acquire the gradient video in which the images gradually change from the first rendering style to that other rendering style.
  • For example, after capturing the image with the first rendering style by using the electronic device, the user may input a request for acquiring a gradient video in the electronic device. After receiving the request, the electronic device may display a preset plurality of rendering styles on a screen of the electronic device for the user to select. After viewing the plurality of rendering styles on the screen of the electronic device, the user may select one rendering style, and the electronic device receives the rendering style selected by the user, uses the rendering style as the second rendering style, and then selects the preset image with the second rendering style from the preset images with the plurality of rendering styles.
  • 32). Generating a reference image.
  • An image may be randomly generated as the reference image; for example, an all-white image or an all-black image is generated.
  • 33). Performing at least one round of optimal iteration on the reference image by using an iterative optimization algorithm, based on the image with the first rendering style and the preset image with the second rendering style; and determining a reference image on which the optimal iteration has been performed as the image with the second rendering style, when a difference between a rendering style of the reference image and the second rendering style is less than a first preset threshold and a difference between image content of the reference image and image content of the image with the first rendering style is less than a second preset threshold.
  • A first image feature of the image with the first rendering style, a second image feature of the image with the second rendering style, and a third image feature of the reference image may be acquired.
  • A difference between the first image feature and the third image feature is acquired and is used as a first difference between the image content of the reference image and the image content of the image with the first rendering style. A difference between the second image feature and the third image feature is acquired and is used as a second difference between the rendering style of the reference image and the second rendering style. Color information of pixels in the reference image is adjusted based on the first difference and the second difference and according to a preset rule, to acquire the reference image on which the optimal iteration has been performed.
  • Then, a fourth image feature of the reference image on which the optimal iteration has been performed is acquired. A difference between the first image feature and the fourth image feature is acquired and is used as a third difference between the image content of the reference image on which the optimal iteration has been performed and the image content of the image with the first rendering style. A difference between the second image feature and the fourth image feature is acquired and is used as a fourth difference between the rendering style of the reference image on which the optimal iteration has been performed and the second rendering style. If the fourth difference is less than the first preset threshold and the third difference is less than the second preset threshold, the reference image on which the optimal iteration has been performed is determined as the image with the second rendering style; otherwise, optimal iteration continues to be performed on the reference image on which the optimal iteration has been performed based on the above steps; and a reference image on which the optimal iteration has been performed is determined as the image with the second rendering style when a difference between a rendering style of the reference image and the second rendering style is less than the first preset threshold, and a difference between image content of the reference image and the image content of the image with the first rendering style is less than the second preset threshold.
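The optimal iteration in 31)-33) can be illustrated with a heavily simplified toy. Here an "image" is a list of numbers, the content feature is each pixel's deviation from the image mean, and the style feature is the image mean; these feature choices are assumptions of the sketch only, since a real implementation would extract image features with, for example, a neural network:

```python
# Toy sketch of step 33): repeatedly adjust the reference "image" until
# its content feature is close to the first image's and its style
# feature is close to the preset second-style image's, each difference
# below its own preset threshold.

def mean(xs):
    return sum(xs) / len(xs)

def optimize_reference(first_img, style_img, ref,
                       style_threshold=1e-6, content_threshold=1e-6,
                       max_rounds=1000):
    style_target = mean(style_img)                # second-style feature
    m1 = mean(first_img)
    content_target = [f - m1 for f in first_img]  # first-image content feature
    for _ in range(max_rounds):
        # one round of optimal iteration: move each pixel halfway toward
        # the value that matches both the content and style targets
        ref = [0.5 * r + 0.5 * (c + style_target)
               for r, c in zip(ref, content_target)]
        m = mean(ref)
        style_diff = abs(m - style_target)
        content_diff = max(abs((r - m) - c)
                           for r, c in zip(ref, content_target))
        if style_diff < style_threshold and content_diff < content_threshold:
            return ref  # both differences are below their thresholds
    return ref

result = optimize_reference([0.0, 1.0, 2.0], [10.0, 10.0, 10.0],
                            [0.0, 0.0, 0.0])
```

With these toy features, the reference converges to the first image's structure shifted to the second style's mean brightness.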
  • In S103, at least one first intermediate gradient image is generated based on the image with the first rendering style, the image with the second rendering style, and a second preset processing model; and the at least one first intermediate gradient image includes an image in a gradient process from the image with the first rendering style to the image with the second rendering style.
  • In this application, this step may be implemented by using the following process, including:
  • 41). For each same location in the image with the first rendering style and the image with the second rendering style, acquiring first color information of a pixel in the location in the image with the first rendering style and second color information of a pixel in the location in the image with the second rendering style; and determining at least one piece of target color information of a pixel in the location based on the first color information, the second color information, a preset first rendering style coefficient, and a preset second rendering style coefficient.
  • In this application, the resolution of the image with the first rendering style is the same as that of the image with the second rendering style. Therefore, for the location of any pixel in the image with the first rendering style, there is a pixel in the same location in the image with the second rendering style.
  • For any same location in the image with the first rendering style and the image with the second rendering style, the first color information of the pixel in the location in the image with the first rendering style and the second color information of the pixel in the location in the image with the second rendering style may be acquired, and then the at least one piece of target color information of the pixel in the location may be determined based on the first color information, the second color information, the preset first rendering style coefficient, and the preset second rendering style coefficient. The foregoing operations are performed for each of the other same locations in the image with the first rendering style and the image with the second rendering style.
  • When the at least one piece of target color information of the pixel in the location is determined based on the first color information, the second color information, the preset first rendering style coefficient, and the preset second rendering style coefficient, a first product of the first color information and the preset first rendering style coefficient is calculated, a second product of the second color information and the preset second rendering style coefficient is calculated, and a first sum of the first product and the second product is calculated. A second sum of the first color information and the second color information is then calculated, and the ratio of the first sum to the second sum is used as the target color information.
  • Color information of a pixel may be identified by using values of the pixel in a red channel, a green channel, and a blue channel. Certainly, color information of a pixel may be identified in another manner. This is not limited in this application.
  • 42). Generating at least one candidate image based on the at least one piece of target color information of a pixel in each location.
  • In this step, a blank image whose resolution is the same as that of the image with the first rendering style (or of the image with the second rendering style) may be generated. The location of each pixel in the blank image is then determined based on the pixel's location in the image with the first rendering style or the image with the second rendering style, and that location in the blank image is filled with the target color information of the pixel, to acquire the candidate image.
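Steps 41) and 42) can be sketched as follows, implementing the arithmetic exactly as enumerated above, applied per color channel (one reading of "color information"); the epsilon guard against a zero denominator for pixels that are black in both images is an added assumption, not from the text:

```python
# Sketch of steps 41)-42): for each pixel location, combine the first
# and second color information with the preset style coefficients as
# (c1 * k1 + c2 * k2) / (c1 + c2), per channel, then fill a
# same-resolution image with the target colors to form the candidate.

def blend_pixel(c1, c2, k1, k2, eps=1e-9):
    """c1, c2: color tuples; k1, k2: preset style coefficients."""
    return tuple((a * k1 + b * k2) / (a + b + eps)
                 for a, b in zip(c1, c2))

def candidate_image(img1, img2, k1, k2):
    """img1, img2: 2D lists of color tuples with the same resolution.
    Returns the candidate image holding each location's target color."""
    return [
        [blend_pixel(p1, p2, k1, k2) for p1, p2 in zip(row1, row2)]
        for row1, row2 in zip(img1, img2)
    ]
```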
  • 43). Determining the at least one first intermediate gradient image based on the at least one candidate image.
  • In some embodiments, the at least one candidate image may be directly determined as the at least one first intermediate gradient image.
  • In some embodiments, to improve the realism of the gradient process from the image with the first rendering style to the image with the second rendering style, local processing generally needs to be performed on the intermediate gradient image. For example, lights in a building usually do not need to be on during the day, but need to be on at night.
  • When an image is transformed into images with different rendering styles, local processing is generally performed on the intermediate gradient images in different ways depending on the transformation. For example, in a gradient process from day to night, the lights in a building need to be turned on gradually, and the number of lights that are on gradually increases. In a gradient process from dawn to morning, the lights in the building need to be turned off gradually, and the number of lights that are on gradually decreases. Therefore, local processing needs to be performed on the intermediate gradient image to improve the realism of the gradient process. In this case, one processing manner is to identify the windows of the building and change the color information of the windows, to reflect whether the lights in the building are on or off.
  • For another example, when a blue rendering style is transformed into a red rendering style or a purple rendering style, generally, color information of pixels is transformed, and there is no need to perform local processing on an image.
  • Therefore, whether to perform local processing and how to perform local processing are usually determined based on an original rendering style and a target rendering style.
  • Therefore, a local processing manner needs to be acquired based on the first rendering style and the second rendering style, and the at least one candidate image is then processed according to the acquired local processing manner, to acquire the at least one intermediate gradient image.
  • For any two rendering styles, in the process of transforming an image with one of the two rendering styles into an image with the other rendering style, a local processing manner for the intermediate gradient images in that process needs to be preset. The one rendering style, the other rendering style, and the specified local processing manner then form corresponding entries and are stored in a correspondence among an original rendering style, a target rendering style, and a local processing manner, with the one rendering style as the original rendering style and the other rendering style as the target rendering style. Likewise, for the reverse transformation from the image with the other rendering style into the image with the one rendering style, a local processing manner for the intermediate gradient images needs to be preset, and the other rendering style, the one rendering style, and that local processing manner form corresponding entries and are stored in the same correspondence, with the other rendering style as the original rendering style and the one rendering style as the target rendering style. The foregoing operations are performed for every other pair of preset rendering styles.
  • In this way, when the local processing manner is acquired based on the first rendering style and the second rendering style, the local processing manner corresponding to the first rendering style and the second rendering style may be searched for in the correspondence among an original rendering style, a target rendering style, and a local processing manner by using the first rendering style as an original rendering style and the second rendering style as a target rendering style.
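The correspondence among an original rendering style, a target rendering style, and a local processing manner can be sketched as a table keyed by style pairs. All style names and the placeholder processing functions here are illustrative assumptions:

```python
# Sketch: look up the local processing manner for a (original style,
# target style) pair; an absent entry means no local processing is
# needed for that transformation (e.g. blue -> red).

def turn_lights_on(image):
    return image  # placeholder: would brighten identified window regions

def turn_lights_off(image):
    return image  # placeholder: would darken identified window regions

LOCAL_PROCESSING = {
    ("day", "night"): turn_lights_on,
    ("dawn", "morning"): turn_lights_off,
}

def local_processing_for(first_style, second_style):
    """Search with the first rendering style as the original style and
    the second rendering style as the target style."""
    return LOCAL_PROCESSING.get((first_style, second_style))
```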
  • In some embodiments, to highlight a gradient effect in the gradient process from the image with the first rendering style to the image with the second rendering style, usually, there are at least two first intermediate gradient images in the gradient process from the image with the first rendering style to the image with the second rendering style, and an actual quantity may be specifically determined based on a rendering style difference between the image with the first rendering style and the image with the second rendering style. Details are not described in this application.
  • The preset first rendering style coefficient is the difference between a preset value and the preset second rendering style coefficient. The preset value may be 1 or the like. In this way, the preset second rendering style coefficient may be increased step by step by a particular increment, with the preset first rendering style coefficient decreased correspondingly each time, and 41) to 43) are performed again after each increase until the preset second rendering style coefficient reaches the preset value.
  • According to the method, at least two intermediate gradient images may be acquired. In the order in which the intermediate gradient images are acquired, the rendering style of an intermediate gradient image acquired earlier is closer to the first rendering style, and that of an intermediate gradient image acquired later is closer to the second rendering style.
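The coefficient sweep described above can be sketched as follows; the number of steps and the uniform increment are assumptions of this sketch, since the application leaves the actual quantity to the style difference:

```python
# Sketch: sweep the second-style coefficient up from 0 toward the
# preset value while the first-style coefficient decreases so the two
# always sum to the preset value. Each (k1, k2) pair drives one
# candidate/intermediate image; earlier pairs weight the first
# rendering style more heavily, later pairs the second.

def coefficient_schedule(steps, preset_value=1.0):
    """Returns the (k1, k2) pairs for `steps` intermediate images,
    excluding the endpoints (k2 == 0 and k2 == preset_value)."""
    pairs = []
    for i in range(1, steps + 1):
        k2 = preset_value * i / (steps + 1)  # increased step by step
        k1 = preset_value - k2               # decreased accordingly
        pairs.append((k1, k2))
    return pairs

# Three intermediates between the two styles:
schedule = coefficient_schedule(3)
# schedule: [(0.75, 0.25), (0.5, 0.5), (0.25, 0.75)]
```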
  • In S104, a first gradient video is generated based on the image with the first rendering style, the at least one first intermediate gradient image, and the image with the second rendering style.
  • In this application, the image with the first rendering style may be used as an image in a first frame, the image with the second rendering style may be used as an image in a last frame, and the at least one first intermediate gradient image may be used as an image between the image with the first rendering style and the image with the second rendering style, to form the first gradient video.
  • If there are at least two first intermediate gradient images, their sequence in the first gradient video is the same as the order in which they were acquired.
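The frame ordering in S104 can be sketched as:

```python
# Sketch of S104: the first-style image is the first frame, the
# intermediate gradient images follow in the order they were acquired,
# and the second-style image is the last frame.

def build_gradient_video(first_image, intermediates, second_image):
    """Each argument is an image (any representation); returns the
    frame sequence of the first gradient video."""
    return [first_image, *intermediates, second_image]

frames = build_gradient_video("styleA", ["mid1", "mid2"], "styleB")
# frames: ["styleA", "mid1", "mid2", "styleB"]
```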
  • In this application, the image with the first rendering style is acquired; the image with the second rendering style is acquired based on the image with the first rendering style and the first preset processing model; the at least one first intermediate gradient image is generated based on the image with the first rendering style, the image with the second rendering style, and the second preset processing model, where the at least one first intermediate gradient image includes the image in the gradient process from the image with the first rendering style to the image with the second rendering style; and the first gradient video is generated based on the image with the first rendering style, the at least one first intermediate gradient image, and the image with the second rendering style. According to this application, when a user wants to acquire a gradient video, the user does not need to spend a long time continuously shooting the gradient video with a camera, and can acquire the gradient video in the manner of this application by capturing only one image, thereby improving the efficiency of acquiring a gradient video and improving user experience.
  • Further, the user may sometimes want a plurality of continuous gradient effects for a better gradient experience, for example, a gradient process from the first rendering style to the second rendering style and then from the second rendering style to a third rendering style.
  • Therefore, to provide a better gradient experience, in some embodiments, an image with the third rendering style may be acquired based on the image with the second rendering style and the first preset processing model, where the image content of the image with the third rendering style may be the same as the image content of the image with the second rendering style. At least one second intermediate gradient image is then generated based on the image with the second rendering style, the image with the third rendering style, and the second preset processing model, where the at least one second intermediate gradient image includes an image in a gradient process from the image with the second rendering style to the image with the third rendering style. A second gradient video is then generated based on the image with the second rendering style, the at least one second intermediate gradient image, and the image with the third rendering style, and the first gradient video and the second gradient video are combined into a third gradient video. In this way, when the user watches the third gradient video, the user experiences the gradient process from the first rendering style to the second rendering style and then from the second rendering style to the third rendering style, for a better gradient experience and further improved user experience. For specific implementations of the steps in the embodiments of this application, refer to the foregoing embodiments. Details are not described herein again.
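Combining the first and second gradient videos can be sketched as a simple frame-list concatenation; whether the duplicated second-style image (the last frame of the first video and the first frame of the second) is dropped is an assumption of this sketch, since the text only says the videos are combined:

```python
# Sketch: combine the first and second gradient videos into the third
# gradient video by concatenating their frame sequences, dropping the
# shared second-style frame at the seam if present.

def combine_gradient_videos(first_video, second_video):
    if first_video and second_video and first_video[-1] == second_video[0]:
        return first_video + second_video[1:]
    return first_video + second_video

third = combine_gradient_videos(["A", "m1", "B"], ["B", "m2", "C"])
# third: ["A", "m1", "B", "m2", "C"]
```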
  • FIG. 2 illustrates a block diagram of an apparatus for processing an image according to this application. Referring to FIG. 2, the apparatus includes: a first acquiring module 11, configured to acquire an image with a first rendering style; a second acquiring module 12, configured to acquire an image with a second rendering style based on the image with the first rendering style and a first preset processing model; a first generating module 13, configured to generate at least one first intermediate gradient image based on the image with the first rendering style, the image with the second rendering style, and a second preset processing model; where the at least one first intermediate gradient image includes an image in a gradient process from the image with the first rendering style to the image with the second rendering style; and a second generating module 14, configured to generate a first gradient video based on the image with the first rendering style, the at least one first intermediate gradient image, and the image with the second rendering style.
  • In some embodiments, the second acquiring module 12 includes: a first acquiring unit, configured to acquire first color information of each pixel in the image with the first rendering style; a searching unit, configured to search for second color information corresponding to the first color information of each pixel in a relationship table of color transformation for transforming the first rendering style into the second rendering style; and a first generating unit, configured to generate the image with the second rendering style based on the second color information corresponding to the first color information of each pixel.
  • In some embodiments, the first determining unit is specifically configured to: search for the relationship table of color transformation corresponding to the first rendering style and the second rendering style in a correspondence among an original rendering style, a target rendering style, and a relationship table of color transformation with the first rendering style as the original rendering style and the second rendering style as the target rendering style.
  • In some embodiments, the first preset processing model is a neural network model for acquiring the image with the second rendering style; and the second acquiring module 12 includes: a processing unit, configured to acquire the image with the second rendering style by processing the image with the first rendering style by using the neural network model.
  • In some embodiments, the second determining unit is specifically configured to: search for the neural network model corresponding to the second rendering style in a correspondence between a rendering style and a neural network model for acquiring an image with the rendering style.
  • In some embodiments, the second acquiring module 12 includes: a second acquiring unit, configured to acquire a preset image with the second rendering style; a second generating unit, configured to generate a reference image; and an iteration unit, configured to perform at least one round of optimal iteration on the reference image by using an iterative optimization algorithm, based on the image with the first rendering style and the preset image with the second rendering style; and determine a reference image on which the optimal iteration has been performed as the image with the second rendering style, when a difference between a rendering style of the reference image and the second rendering style is less than a first preset threshold and a difference between image content of the reference image and image content of the image with the first rendering style is less than a second preset threshold.
  • In some embodiments, the first generating module 13 includes: a third acquiring unit, configured to acquire first color information of a pixel in the location in the image with the first rendering style and second color information of a pixel in the location in the image with the second rendering style, for each same location in the image with the first rendering style and the image with the second rendering style; a third determining unit, configured to determine at least one piece of target color information of a pixel in the location based on the first color information, the second color information, a preset first rendering style coefficient, and a preset second rendering style coefficient; a third generating unit, configured to generate at least one candidate image based on the at least one piece of target color information of a pixel in each location; and a fourth determining unit, configured to determine the at least one first intermediate gradient image based on the at least one candidate image.
  • In some embodiments, the fourth determining unit includes: a determining subunit, configured to determine the at least one candidate image as the at least one first intermediate gradient image; or an acquiring subunit, configured to acquire a local processing manner based on the first rendering style and the second rendering style, and a processing subunit, configured to acquire the at least one first intermediate gradient image by processing the at least one candidate image according to the local processing manner.
  • In some embodiments, the acquiring subunit is specifically configured to search for the local processing manner corresponding to the first rendering style and the second rendering style in a correspondence among an original rendering style, a target rendering style, and a local processing manner, with the first rendering style as the original rendering style and the second rendering style as the target rendering style.
  • In some embodiments, the apparatus further includes: a fourth acquiring module, configured to acquire an image with a third rendering style based on the image with the second rendering style and the first preset processing model; a fourth generating module, configured to generate at least one second intermediate gradient image based on the image with the second rendering style, the image with the third rendering style, and the second preset processing model, where the at least one second intermediate gradient image includes an image in a gradient process from the image with the second rendering style to the image with the third rendering style; a fifth generating module, configured to generate a second gradient video based on the image with the second rendering style, the at least one second intermediate gradient image, and the image with the third rendering style; and a combination module, configured to combine the first gradient video and the second gradient video into a third gradient video.
  • In this application, the image with the first rendering style is acquired; the image with the second rendering style is acquired based on the image with the first rendering style and the first preset processing model; the at least one first intermediate gradient image is generated based on the image with the first rendering style, the image with the second rendering style, and the second preset processing model, where the at least one first intermediate gradient image includes the image in the gradient process from the image with the first rendering style to the image with the second rendering style; and the first gradient video is generated based on the image with the first rendering style, the at least one first intermediate gradient image, and the image with the second rendering style. According to this application, when a user wants to acquire a gradient video, the user does not need to spend a long time continuously shooting the gradient video with a camera, and can acquire the gradient video in the manner of this application by capturing only one image, thereby improving the efficiency of acquiring a gradient video and improving user experience.
  • For the apparatus in the foregoing embodiments, specific manners of performing operations by the modules have been described in detail in the embodiments related to the method, and details are not described herein again.
  • FIG. 3 illustrates a block diagram of an electronic device 800 according to this application. For example, the electronic device 800 may be a mobile phone, a computer, a digital broadcasting terminal, a message transceiver device, a game console, a tablet device, a medical device, fitness equipment, a personal digital assistant, or the like.
  • Referring to FIG. 3, the electronic device 800 may include one or more of the following components: a processing component 802, a memory 804, a power supply component 806, a multimedia component 808, an audio component 810, an input/output (I/O) interface 812, a sensor component 814, and a communication component 816.
  • The processing component 802 usually controls the overall operation of the electronic device 800, for example, operations associated with displaying, calling, data communication, camera operations, and recording operations. The processing component 802 may include one or more processors 820 to execute instructions, to complete all or some steps of the method. In addition, the processing component 802 may include one or more modules to facilitate interaction between the processing component 802 and the other components. For example, the processing component 802 may include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
  • The memory 804 is configured to store various types of data to support operations of the device 800. Examples of such data include instructions of any application or method operated in the electronic device 800, contact data, address book data, messages, images, videos, and the like. The memory 804 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, for example, a static random access memory (SRAM), an electrically erasable programmable read-only memory (EEPROM), an erasable programmable read-only memory (EPROM), a programmable read-only memory (PROM), a read-only memory (ROM), a magnetic memory, a flash memory, a magnetic disk, or an optical disc.
  • The power supply component 806 supplies power to the various components of the electronic device 800. The power supply component 806 may include a power supply management system, one or more power supplies, and other components associated with generating, managing, and supplying power to the electronic device 800.
  • The multimedia component 808 includes a screen providing an output interface between the electronic device 800 and a user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes the touch panel, the screen may be implemented as a touch screen, to receive an input signal from the user. The touch panel includes one or more touch sensors, to sense touching, sliding, and gestures on the touch panel. The touch sensor not only may sense a boundary of a touch or sliding operation, but also may detect the duration and pressure related to the touch or sliding operation. In some embodiments, the multimedia component 808 includes a front-facing camera and/or a rear-facing camera. When the device 800 is in an operation mode, such as a photographing mode or a video mode, the front-facing camera and/or the rear-facing camera may receive external multimedia data. Each of the front-facing camera and the rear-facing camera may be a fixed optical lens system or have focusing and optical zooming capabilities.
  • The audio component 810 is configured to output and/or input an audio signal. For example, the audio component 810 includes a microphone (MIC). When the electronic device 800 is in an operation mode, such as a calling mode, a recording mode, or a speech recognition mode, the microphone is configured to receive an external audio signal. The received audio signal may be further stored in the memory 804 or sent by using the communication component 816. In some embodiments, the audio component 810 further includes a loudspeaker configured to output the audio signal.
  • The I/O interface 812 provides an interface between the processing component 802 and a peripheral interface module, and the peripheral interface module may be a keyboard, a click wheel, buttons, or the like. The buttons include, but are not limited to, a home page button, a volume button, a start button, and a lock button.
  • The sensor component 814 includes one or more sensors, and is configured to provide status assessments of various aspects of the electronic device 800. For example, the sensor component 814 may detect an on/off state of the device 800 and the relative positioning of components, such as the display and the keypad of the electronic device 800. The sensor component 814 may further detect a location change of the electronic device 800 or a component of the electronic device 800, detect whether there is contact between the user and the electronic device 800, and detect the orientation or acceleration/deceleration of the electronic device 800 and temperature changes of the electronic device 800. The sensor component 814 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor component 814 may further include an optical sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 814 may further include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
  • The communication component 816 is configured to facilitate wired or wireless communication between the electronic device 800 and other devices. The electronic device 800 may access a wireless network based on a communication standard, such as Wi-Fi, an operator network (such as 2G, 3G, 4G, or 5G), or a combination thereof. In some embodiments, the communication component 816 receives, through a broadcast channel, a broadcast signal or broadcast-related information from an external broadcasting management system. In some embodiments, the communication component 816 further includes a near field communication (NFC) module, to facilitate short-range communication. For example, the NFC module may be implemented based on a radio frequency identification (RFID) technology, an infrared data association (IrDA) technology, an ultra-wideband (UWB) technology, a Bluetooth (BT) technology, and other technologies.
  • In some embodiments, the electronic device 800 may be implemented by using one or more application-specific integrated circuits (ASIC), one or more digital signal processors (DSP), one or more digital signal processing devices (DSPD), one or more programmable logic devices (PLD), one or more field programmable gate arrays (FPGA), one or more controllers, one or more microcontrollers, one or more microprocessors, or one or more other electronic elements, to perform the foregoing method.
  • In some embodiments, a non-transitory computer readable storage medium including an instruction is further provided, such as the memory 804 including the instruction. The instruction may be executed by the processor 820 of the electronic device 800 to complete the foregoing method. For example, the non-transitory computer readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, or an optical data storage device.
  • In some embodiments, a computer program product is further provided. The computer program product may be stored in the memory 804, and when an instruction in the computer program product is executed by the processor 820 of the electronic device 800, the electronic device 800 is enabled to perform the image processing method.
  • FIG. 4 illustrates a block diagram of an electronic device 1900 according to this application. For example, the electronic device 1900 may be provided as a server.
  • Referring to FIG. 4, the electronic device 1900 includes a processing component 1922, which further includes one or more processors, and a memory resource represented by a memory 1932, configured to store instructions executable by the processing component 1922, such as an application. The application stored in the memory 1932 may include one or more modules, where each module corresponds to one group of instructions. In addition, the processing component 1922 is configured to execute the instructions to perform the foregoing method.
  • The electronic device 1900 may further include a power supply component 1926, configured to execute power supply management for the electronic device 1900; a wired or wireless network interface 1950, configured to connect the electronic device 1900 to a network; and an input/output (I/O) interface 1958. The electronic device 1900 may be operated based on an operating system stored in the memory 1932, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or a similar operating system.

Claims (19)

We claim:
1. A method for processing an image, comprising:
acquiring an image with a first rendering style and an image with a second rendering style;
generating at least one first intermediate gradient image based on the image with the first rendering style and the image with the second rendering style; wherein the at least one first intermediate gradient image comprises an image in a gradient process from the image with the first rendering style to the image with the second rendering style; and
generating a first gradient video based on the image with the first rendering style, the at least one first intermediate gradient image, and the image with the second rendering style.
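Claim 1's pipeline — the first-style image, a series of intermediate gradient images, then the second-style image — can be sketched as follows. This is a minimal illustration that assumes the intermediate frames are produced by linear blending (one possible realization; the claim does not prescribe the blend), and the function name and frame count are illustrative, not from the patent.

```python
import numpy as np

def make_gradient_frames(first, second, num_intermediate):
    # Frame sequence of claim 1: the image with the first rendering
    # style, intermediate gradient images, then the image with the
    # second rendering style.
    frames = [first.astype(np.float32)]
    for i in range(1, num_intermediate + 1):
        t = i / (num_intermediate + 1)  # blend weight in (0, 1)
        frames.append((1.0 - t) * first + t * second)
    frames.append(second.astype(np.float32))
    # Clip and convert back to 8-bit so the frames are video-ready.
    return [np.clip(f, 0, 255).astype(np.uint8) for f in frames]
```

The resulting frame list could then be encoded into the first gradient video with any video writer (for example, OpenCV's `VideoWriter`), though the patent does not prescribe a container or codec.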
2. The method according to claim 1, wherein said acquiring the image with the second rendering style comprises:
acquiring first color information of each pixel in the image with the first rendering style;
searching for second color information corresponding to the first color information of each pixel in a relationship table of color transformation for transforming the first rendering style into the second rendering style; and
generating the image with the second rendering style based on the second color information corresponding to the first color information of each pixel.
3. The method according to claim 2, wherein said generating the image with the second rendering style comprises:
generating a blank image of the same resolution as the image with the first rendering style;
determining a location of each pixel in the blank image based on a location of each pixel in the image with the first rendering style; and
generating the image with the second rendering style by filling respective locations of the pixels in the blank image with the second color information corresponding to the first color information of the pixels.
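Claims 2–3 describe a lookup-based transform: each pixel's first color information indexes the relationship table of color transformation, and the resulting second color information is written into a blank image of the same resolution at the same location. A sketch, assuming the table is a hypothetical 256-entry per-channel intensity map (the patent leaves the table's exact layout unspecified):

```python
import numpy as np

def apply_color_lut(image, lut):
    # Blank image of the same resolution as the first-style image.
    out = np.zeros_like(image)
    h, w = image.shape[:2]
    for y in range(h):
        for x in range(w):
            # Look up the second color information corresponding to
            # this pixel's first color information, and fill the same
            # location in the blank image.
            out[y, x] = lut[image[y, x]]
    return out
```

In practice the per-pixel loop would be vectorized as `lut[image]`; the explicit loop is kept here to mirror the claim's per-location wording.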
4. The method according to claim 2, further comprising: determining the relationship table of color transformation for transforming the first rendering style into the second rendering style;
wherein said determining the relationship table of color transformation comprises:
searching for the relationship table of color transformation corresponding to the first rendering style and the second rendering style in a correspondence among an original rendering style, a target rendering style, and a relationship table of color transformation, with the first rendering style as the original rendering style and the second rendering style as the target rendering style.
5. The method according to claim 1, wherein said acquiring the image with the second rendering style comprises:
acquiring a preset image with the second rendering style;
generating a reference image;
performing at least one round of optimal iteration on the reference image through an iterative optimization algorithm, based on the image with the first rendering style and the preset image with the second rendering style; and
determining a reference image on which the optimal iteration has been performed as the image with the second rendering style, in response to a difference between a rendering style of the reference image and the second rendering style being less than a first threshold and a difference between image content of the reference image and image content of the image with the first rendering style being less than a second threshold.
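Claim 5's loop resembles iterative style-transfer optimization: a reference image is repeatedly adjusted until its rendering style is close enough to the second rendering style (first threshold) and its content close enough to the first-style image (second threshold). The sketch below is a toy stand-in under stated assumptions — the mean-intensity "style" measure, the pixel-wise "content" measure, the step size, and the round limit are all invented for illustration, since the patent does not specify the losses or the optimizer.

```python
import numpy as np

def optimize_reference(content_img, style_img, style_thresh, content_thresh,
                       max_rounds=100, step=0.5):
    # Start from a randomly generated reference image (claim 5's
    # "generating a reference image").
    ref = np.random.default_rng(0).uniform(0, 255, content_img.shape)
    for _ in range(max_rounds):
        # Toy stand-ins for the unspecified style/content differences,
        # compared against the first and second thresholds.
        style_diff = abs(ref.mean() - style_img.mean())
        content_diff = np.abs(ref - content_img).mean()
        if style_diff < style_thresh and content_diff < content_thresh:
            break
        # One round of "optimal iteration": move toward the content of
        # the first-style image and toward the style image's statistics.
        ref += step * (content_img - ref)
        ref += step * (style_img.mean() - ref.mean())
    return ref
```

A real implementation would replace these measures with proper style and content losses (for example, feature-based losses) and a gradient-based optimizer; the loop structure and the two-threshold stopping condition are what the claim itself specifies.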
6. The method according to claim 1, wherein said generating the at least one first intermediate gradient image comprises:
for each same location in the image with the first rendering style and the image with the second rendering style, acquiring first color information of a pixel in the location in the image with the first rendering style and second color information of a pixel in the location in the image with the second rendering style, and determining at least one piece of target color information of a pixel in the location based on the first color information, the second color information, a first rendering style coefficient, and a second rendering style coefficient;
generating at least one candidate image based on the at least one piece of target color information of a pixel in each location; and
determining the at least one first intermediate gradient image based on the at least one candidate image.
7. The method according to claim 6, wherein said determining the at least one first intermediate gradient image comprises:
determining the at least one candidate image as the at least one first intermediate gradient image;
or,
determining the at least one first intermediate gradient image by acquiring a local processing manner based on the first rendering style and the second rendering style and processing the at least one candidate image based on the local processing manner.
8. The method according to claim 7, wherein said acquiring the local processing manner comprises:
searching for the local processing manner corresponding to the first rendering style and the second rendering style in a correspondence among an original rendering style, a target rendering style, and a local processing manner, with the first rendering style as the original rendering style and the second rendering style as the target rendering style.
9. The method according to claim 1, further comprising:
acquiring an image with a third rendering style based on the image with the second rendering style;
generating at least one second intermediate gradient image based on the image with the second rendering style and the image with the third rendering style; wherein the at least one second intermediate gradient image comprises an image in a gradient process from the image with the second rendering style to the image with the third rendering style;
generating a second gradient video based on the image with the second rendering style, the at least one second intermediate gradient image, and the image with the third rendering style; and
generating a third gradient video by combining the first gradient video and the second gradient video.
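Claim 9's combination step can be modeled with gradient videos as frame lists: the third gradient video is the first gradient video followed by the second. Dropping the duplicated middle frame (the second-style image both ends the first video and opens the second) is an assumption made here for a seamless join, not something the claim requires.

```python
import numpy as np

def combine_gradient_videos(first_video, second_video):
    # Each video is modeled as a list of frames (NumPy arrays). If the
    # last frame of the first video equals the first frame of the
    # second (both being the second-style image), keep only one copy.
    if (first_video and second_video
            and np.array_equal(first_video[-1], second_video[0])):
        return first_video + second_video[1:]
    return first_video + second_video
```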
10. An electronic device, comprising:
a processor; and
a memory configured to store an instruction executed by the processor; wherein
in response to the instruction being executed, the processor is configured to:
acquire an image with a first rendering style and an image with a second rendering style;
generate at least one first intermediate gradient image based on the image with the first rendering style and the image with the second rendering style; wherein the at least one first intermediate gradient image comprises an image in a gradient process from the image with the first rendering style to the image with the second rendering style; and
generate a first gradient video based on the image with the first rendering style, the at least one first intermediate gradient image, and the image with the second rendering style.
11. The electronic device according to claim 10, wherein the processor is configured to:
acquire first color information of each pixel in the image with the first rendering style;
search for second color information corresponding to the first color information of each pixel in a relationship table of color transformation for transforming the first rendering style into the second rendering style; and
generate the image with the second rendering style based on the second color information corresponding to the first color information of each pixel.
12. The electronic device according to claim 11, wherein the processor is configured to:
generate a blank image of the same resolution as the image with the first rendering style;
determine a location of each pixel in the blank image based on a location of each pixel in the image with the first rendering style; and
generate the image with the second rendering style by filling respective locations of the pixels in the blank image with the second color information corresponding to the first color information of the pixels.
13. The electronic device according to claim 11, wherein the processor is configured to:
determine the relationship table of color transformation for transforming the first rendering style into the second rendering style;
wherein the processor is further configured to:
search for the relationship table of color transformation corresponding to the first rendering style and the second rendering style in a correspondence among an original rendering style, a target rendering style, and a relationship table of color transformation, with the first rendering style as the original rendering style and the second rendering style as the target rendering style.
14. The electronic device according to claim 10, wherein the processor is configured to:
acquire a preset image with the second rendering style;
generate a reference image;
perform at least one round of optimal iteration on the reference image through an iterative optimization algorithm, based on the image with the first rendering style and the preset image with the second rendering style; and
determine a reference image on which the optimal iteration has been performed as the image with the second rendering style, in response to a difference between a rendering style of the reference image and the second rendering style being less than a first threshold and a difference between image content of the reference image and image content of the image with the first rendering style being less than a second threshold.
15. The electronic device according to claim 10, wherein the processor is configured to:
for each same location in the image with the first rendering style and the image with the second rendering style, acquire first color information of a pixel in the location in the image with the first rendering style and second color information of a pixel in the location in the image with the second rendering style, and determine at least one piece of target color information of a pixel in the location based on the first color information, the second color information, a first rendering style coefficient, and a second rendering style coefficient;
generate at least one candidate image based on the at least one piece of target color information of a pixel in each location; and
determine the at least one first intermediate gradient image based on the at least one candidate image.
16. The electronic device according to claim 15, wherein the processor is configured to:
determine the at least one candidate image as the at least one first intermediate gradient image;
or,
determine the at least one first intermediate gradient image by acquiring a local processing manner based on the first rendering style and the second rendering style and processing the at least one candidate image based on the local processing manner.
17. The electronic device according to claim 16, wherein the processor is configured to:
search for the local processing manner corresponding to the first rendering style and the second rendering style in a correspondence among an original rendering style, a target rendering style, and a local processing manner, with the first rendering style as the original rendering style and the second rendering style as the target rendering style.
18. The electronic device according to claim 10, wherein the processor is configured to:
acquire an image with a third rendering style based on the image with the second rendering style;
generate at least one second intermediate gradient image based on the image with the second rendering style and the image with the third rendering style; wherein the at least one second intermediate gradient image comprises an image in a gradient process from the image with the second rendering style to the image with the third rendering style;
generate a second gradient video based on the image with the second rendering style, the at least one second intermediate gradient image, and the image with the third rendering style; and
generate a third gradient video by combining the first gradient video and the second gradient video.
19. A non-transitory computer readable storage medium, wherein in response to an instruction in the storage medium being executed by a processor of an electronic device, the electronic device is enabled to:
acquire an image with a first rendering style and an image with a second rendering style;
generate at least one first intermediate gradient image based on the image with the first rendering style and the image with the second rendering style; wherein the at least one first intermediate gradient image comprises an image in a gradient process from the image with the first rendering style to the image with the second rendering style; and
generate a first gradient video based on the image with the first rendering style, the at least one first intermediate gradient image, and the image with the second rendering style.
US17/378,518 2018-09-28 2021-07-16 Method, apparatus and electronic device for processing image Abandoned US20210343070A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/378,518 US20210343070A1 (en) 2018-09-28 2021-07-16 Method, apparatus and electronic device for processing image

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
CN201811142166.1 2018-09-28
CN201811142166.1A CN109360261B (en) 2018-09-28 2018-09-28 Image processing method, image processing device, electronic equipment and storage medium
PCT/CN2019/098011 WO2020063084A1 (en) 2018-09-28 2019-07-26 Image processing method and apparatus, electronic device, and storage medium
US202016972475A 2020-12-04 2020-12-04
US17/378,518 US20210343070A1 (en) 2018-09-28 2021-07-16 Method, apparatus and electronic device for processing image

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
US16/972,475 Continuation US11094110B2 (en) 2018-09-28 2019-07-26 Method, apparatus and electronic device for processing image
PCT/CN2019/098011 Continuation WO2020063084A1 (en) 2018-09-28 2019-07-26 Image processing method and apparatus, electronic device, and storage medium

Publications (1)

Publication Number Publication Date
US20210343070A1 true US20210343070A1 (en) 2021-11-04

Family

ID=65348256

Family Applications (2)

Application Number Title Priority Date Filing Date
US16/972,475 Active US11094110B2 (en) 2018-09-28 2019-07-26 Method, apparatus and electronic device for processing image
US17/378,518 Abandoned US20210343070A1 (en) 2018-09-28 2021-07-16 Method, apparatus and electronic device for processing image

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US16/972,475 Active US11094110B2 (en) 2018-09-28 2019-07-26 Method, apparatus and electronic device for processing image

Country Status (3)

Country Link
US (2) US11094110B2 (en)
CN (1) CN109360261B (en)
WO (1) WO2020063084A1 (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109360261B (en) * 2018-09-28 2019-12-31 北京达佳互联信息技术有限公司 Image processing method, image processing device, electronic equipment and storage medium
CN110062157B (en) * 2019-04-04 2021-09-17 北京字节跳动网络技术有限公司 Method and device for rendering image, electronic equipment and computer readable storage medium
CN110062176B (en) * 2019-04-12 2020-10-30 北京字节跳动网络技术有限公司 Method and device for generating video, electronic equipment and computer readable storage medium
JP7316829B2 (en) * 2019-04-15 2023-07-28 キヤノン株式会社 Information processing device, information processing method, and program
CN110399924B (en) 2019-07-26 2021-09-07 北京小米移动软件有限公司 Image processing method, device and medium
CN112541512B (en) * 2019-09-20 2023-06-02 杭州海康威视数字技术股份有限公司 Image set generation method and device
CN111292393A (en) * 2020-01-22 2020-06-16 北京达佳互联信息技术有限公司 Image processing method, image processing device, electronic equipment and storage medium
CN114066715A (en) * 2020-07-30 2022-02-18 北京达佳互联信息技术有限公司 Image style migration method and device, electronic equipment and storage medium
CN111932640B (en) * 2020-09-25 2021-02-09 北京尽微至广信息技术有限公司 Method and device for adjusting color gradient effect and storage medium
CN114615421B (en) * 2020-12-07 2023-06-30 华为技术有限公司 Image processing method and electronic equipment
CN112619160A (en) * 2020-12-29 2021-04-09 网易(杭州)网络有限公司 Image processing method, image processing apparatus, non-volatile storage medium, and electronic apparatus

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11094110B2 (en) * 2018-09-28 2021-08-17 Beijing Dajia Internet Information Technology Co., Ltd. Method, apparatus and electronic device for processing image

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7289127B1 (en) * 2005-04-25 2007-10-30 Apple, Inc. Multi-conic gradient generation
US8019182B1 (en) * 2007-08-20 2011-09-13 Adobe Systems Incorporated Digital image modification using pyramid vignettes
CN102129705A (en) * 2010-01-18 2011-07-20 腾讯科技(深圳)有限公司 Animation production method and device, and animation playing method and device
JP5617730B2 (en) * 2011-03-29 2014-11-05 カシオ計算機株式会社 Display control apparatus and program
CN102737369A (en) * 2011-03-31 2012-10-17 卡西欧计算机株式会社 Image processing apparatus, image processing method, and storage medium
CN104217452A (en) * 2014-09-10 2014-12-17 珠海市君天电子科技有限公司 Method and device for implementing gradual change of colors
US9762846B2 (en) * 2015-05-08 2017-09-12 Microsoft Technology Licensing, Llc Real-time hyper-lapse video creation via frame selection
CN107464273B (en) * 2016-06-02 2020-09-04 北京大学 Method and device for realizing image style brush
CN109426858B (en) * 2017-08-29 2021-04-06 京东方科技集团股份有限公司 Neural network, training method, image processing method, and image processing apparatus
CN108492348A (en) * 2018-03-30 2018-09-04 北京金山安全软件有限公司 Image processing method, image processing device, electronic equipment and storage medium


Also Published As

Publication number Publication date
CN109360261A (en) 2019-02-19
US20210166469A1 (en) 2021-06-03
US11094110B2 (en) 2021-08-17
WO2020063084A1 (en) 2020-04-02
CN109360261B (en) 2019-12-31

Similar Documents

Publication Publication Date Title
US20210343070A1 (en) Method, apparatus and electronic device for processing image
US9674395B2 (en) Methods and apparatuses for generating photograph
US20170032725A1 (en) Method, device, and computer-readable medium for setting color gamut mode
US11770497B2 (en) Method and device for processing video, and storage medium
CN104869308A (en) Picture taking method and device
CN104933071A (en) Information retrieval method and corresponding device
EP3905660A1 (en) Method and device for shooting image, and storage medium
CN104850643B (en) Picture comparison method and device
CN112784081A (en) Image display method and device and electronic equipment
CN110086998B (en) Shooting method and terminal
CN114025092A (en) Shooting control display method and device, electronic equipment and medium
US20150130960A1 (en) Recommendation apparatus, method, and program
CN107292901B (en) Edge detection method and device
CN113596574A (en) Video processing method, video processing apparatus, electronic device, and readable storage medium
EP4161054A1 (en) Anchor point information processing method, apparatus and device and storage medium
CN108027821B (en) Method and device for processing picture
EP3799415A2 (en) Method and device for processing videos, and medium
CN111832455A (en) Method, device, storage medium and electronic equipment for acquiring content image
CN114285988B (en) Display method, display device, electronic equipment and storage medium
CN113157178B (en) Information processing method and device
WO2021169810A1 (en) Video processing method, and server
CN114745505A (en) Shooting method, shooting device, electronic equipment and readable storage medium
CN116939351A (en) Shooting method, shooting device, electronic equipment and readable storage medium
CN112887620A (en) Video shooting method and device and electronic equipment
CN111724398A (en) Image display method and device

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

AS Assignment

Owner name: BEIJING DAJIA INTERNET INFORMATION TECHNOLOGY CO., LTD., CHINA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ZHANG, LEI;ZHENG, WEN;ZHANG, WENBO;AND OTHERS;REEL/FRAME:058425/0042

Effective date: 20200827

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION