CN105338403B - Filter processing method and device and electronic equipment - Google Patents


Info

Publication number: CN105338403B
Application number: CN201410384375.2A
Authority: CN (China)
Other languages: Chinese (zh)
Other versions: CN105338403A (application publication)
Prior art keywords: value, video, guide, pixel point, filter
Inventors: 高晓宇, 黄本华
Original and current assignee: Tencent Technology Shenzhen Co Ltd
Application filed by Tencent Technology Shenzhen Co Ltd
Legal status: Active (application granted)

Abstract

The invention discloses a filter processing method, a filter processing apparatus, and an electronic device, belonging to the field of image processing. The method comprises the following steps: acquiring a guide video corresponding to an original video, wherein a filter guide value is stored in each pixel point of each image frame in the guide video; for a current pixel point in a current image frame of the original video, acquiring the filter guide value stored in the corresponding pixel point of the corresponding image frame in the guide video; and performing filter processing on the current pixel point according to the filter guide value. Because each frame image of the original video is filter-processed according to the corresponding frame image of the guide video, the region on which the filter acts can change dynamically from moment to moment, and different regions of the same frame image can be filter-processed differently.

Description

Filter processing method and device and electronic equipment
Technical Field
The present invention relates to the field of image processing, and in particular, to a filter processing method and apparatus, and an electronic device.
Background
Filters are commonly used to achieve various special effects of video.
When each pixel point in each frame image of the original video is represented by RGB (Red, Green, Blue) values, a filter replaces the RGB value of each pixel point with a new RGB value to perform color replacement. That is, the working principle of a filter is as follows: for each frame image in the original video, the RGB value (r, g, b) of each pixel point in that frame is replaced with (r′, g′, b′) according to a preset correspondence.
In the process of implementing the invention, the inventors found that the prior art has at least the following problem: an existing filter can only apply a uniform filter process to the entire picture of a video; it cannot dynamically change the region on which the filter acts at different moments.
Disclosure of Invention
To solve the problem that an existing filter can only apply a uniform filter process to the entire picture of a video and cannot dynamically change the region on which the filter acts at different moments, embodiments of the present invention provide a filter processing method, a filter processing apparatus, and an electronic device. The technical scheme is as follows:
according to a first aspect of embodiments of the present invention, there is provided a filter processing method, the method including:
acquiring a guide video corresponding to an original video, wherein a filter guide value is stored in a pixel point of an image frame in the guide video;
for a current pixel point in a current image frame in the original video, acquiring the filter guide value stored by a corresponding pixel point in a corresponding image frame in the guide video;
and carrying out filter processing on the current pixel point according to the filter guide value.
According to a second aspect of embodiments of the present invention, there is provided a filter processing apparatus, the apparatus including:
a guide video acquisition module, configured to acquire a guide video corresponding to an original video, wherein a filter guide value is stored in a pixel point of an image frame in the guide video;
a filter guide value obtaining module, configured to obtain, for a current pixel point in a current image frame in the original video, the filter guide value stored by a corresponding pixel point in a corresponding image frame in the guide video;
and the filter processing module is used for carrying out filter processing on the current pixel point according to the filter guide value.
According to a third aspect of embodiments of the present invention, there is provided an electronic apparatus including the filter processing device according to the second aspect.
The technical scheme provided by the embodiment of the invention has the following beneficial effects:
by acquiring a guide video corresponding to an original video, wherein a filter guide value is stored in each pixel point of each image frame in the guide video; for a current pixel point in a current image frame of the original video, acquiring the filter guide value stored in the corresponding pixel point of the corresponding image frame in the guide video; and performing filter processing on the current pixel point according to the filter guide value; each frame image of the original video can be filter-processed according to the corresponding frame image of the guide video, so that the region on which the filter acts can change dynamically from moment to moment, and different regions of the same frame image can be filter-processed differently.
Drawings
To illustrate the technical solutions in the embodiments of the present invention more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention; those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic diagram illustrating an implementation of a pixel point filter processing method according to a filter processing method provided in various embodiments of the present invention;
FIG. 2A is a flow chart of a method of filter processing according to an embodiment of the present invention;
FIG. 2B is a schematic diagram of an implementation of a filter processing method according to an embodiment of the invention;
FIG. 2C is a schematic diagram of an implementation of a filter processing method according to an embodiment of the invention;
FIG. 3 is a schematic diagram of a method for storing filter guide values in a guide image according to another embodiment of the present invention;
FIG. 4A is a flow chart of a method of filter processing according to another embodiment of the present invention;
FIG. 4B is a schematic interface diagram of a filter processing method according to another embodiment of the invention;
FIG. 5 is a schematic diagram of a method for storing filter guide values in a guide image according to a filter processing method provided in another embodiment of the present invention;
FIG. 6 is a flow chart of a method of filter processing according to yet another embodiment of the present invention;
fig. 7 is a block diagram showing the structure of a filter processing apparatus according to an embodiment of the present invention;
fig. 8 is a block diagram showing the structure of a filter processing apparatus according to another embodiment of the present invention;
fig. 9 is a block diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention will be described in detail with reference to the accompanying drawings.
The filter processing method provided by the embodiments of the present invention can be implemented by an application program or an electronic device having video processing capability. The electronic device may be a smartphone, a smart television, a tablet computer, a laptop computer, a camera, a camcorder, and the like.
For simplicity of description, the filter processing method is only exemplified below as being performed by an electronic device having a video processing capability, but is not limited thereto.
Since the embodiments of the present invention involve a step of replacing the RGB value (r, g, b) of a pixel point in an image frame of the original video with (r′, g′, b′) according to a preset correspondence, this step is described in detail below, taking a color-mixing filter as an example for ease of understanding. The color-mixing filter is one kind of filter, used to perform color replacement on pixel points in the original video.
In the following, an exemplary description is given only by performing color-mixing filter processing on a certain pixel point in a certain frame of image in the original video, and the steps of performing color-mixing filter processing on other image frames in the original video are similar to those described above.
Referring to fig. 1, the preset correspondence is stored in a pixel array 11 of 256 × 1 pixels. The pixel array 11 contains 256 pixel points 12 in total, numbered 0 to 255 from left to right; the number of a pixel point 12 corresponds to an RGB channel value of a pixel point in the original image, and the RGB value stored in each pixel point 12 of the pixel array 11 corresponds to the RGB value after filter processing. The color-mixing filter performs the following processing:
firstly, a pixel point of a certain frame of image in an original video is obtained, and the RGB value of the pixel point is (r, g, b).
Referring to fig. 1, a pixel 13 of a frame of image in an original video is obtained, and the RGB value of the pixel 13 is (r, g, b).
Secondly, according to r in the RGB values of the pixel 13, a pixel 12 with a corresponding number r is found in the pixel array 11, and the RGB values of the pixel 12 are (Rr, Gr, Br).
And thirdly, obtaining Rr in the RGB value of the pixel point 12, and replacing r in the RGB value of the pixel point 13 in a certain frame of image in the original video with Rr.
Fourthly, according to g in the RGB values of the pixel 13, finding the pixel 12 with the corresponding serial number of g in the pixel array 11, wherein the RGB value of the pixel 12 is (Rg, Gg, Bg).
And fifthly, acquiring the Gg in the RGB value of the pixel point 12, and replacing the g in the RGB value of the pixel point 13 in a certain frame of image in the original video with the Gg.
Sixthly, according to b in the RGB values of the pixel point 13, finding the pixel point 12 with the corresponding number b in the pixel array 11, wherein the RGB value of the pixel point 12 is (Rb, Gb, Bb).
And seventhly, acquiring Bb in the RGB value of the pixel point 12, and replacing b in the RGB value of the pixel point 13 in a certain frame of image in the original video with Bb.
Following the above steps, the RGB value of pixel point 13 in the frame image of the original video becomes (Rr, Gg, Bb) after filter processing.
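The seven steps above amount to three independent per-channel table lookups. A minimal sketch in Python (the list `lut` standing in for pixel array 11, and the function name `mix_color`, are illustrative assumptions):

```python
# Color-mixing (lookup-table) filter for one pixel, as described above.
# lut is a list of 256 (R, G, B) tuples: entry i holds the replacement
# values for an input channel value of i (pixel array 11 in fig. 1).

def mix_color(pixel, lut):
    r, g, b = pixel
    rr = lut[r][0]   # steps 2-3: R channel of the entry numbered r
    gg = lut[g][1]   # steps 4-5: G channel of the entry numbered g
    bb = lut[b][2]   # steps 6-7: B channel of the entry numbered b
    return (rr, gg, bb)

# An identity table leaves pixels unchanged; an inverting table negates them.
identity = [(i, i, i) for i in range(256)]
invert = [(255 - i, 255 - i, 255 - i) for i in range(256)]
```

With the identity table, `mix_color((10, 20, 30), identity)` returns the pixel unchanged; with the inverting table it returns `(245, 235, 225)`.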
However, to apply different degrees of filter processing to the frame images of an original video in different regions and/or at different playing times, guided by the frame images of a guide video, so that the region on which the filter acts can change dynamically over time, refer to the following embodiments:
referring to fig. 2A, a method flowchart of a filter processing method according to an embodiment of the invention is shown. The method comprises the following steps:
step 202, acquiring a guide video corresponding to an original video, wherein a filter guide value is stored in a pixel point of an image frame in the guide video;
the guide video is a separately set video for guiding a filtering process on the original video. And a filter guide value is stored in each pixel point in each image frame in the guide video.
When all the filter guide values are the same, the whole picture of all the images in the original video will be subjected to filter processing according to the same processing level.
When different filter guide values are stored in different areas of a certain frame of image in the guide video, the whole picture of the corresponding frame of image in the original video is respectively subjected to filter processing with different processing levels according to different areas.
When the images at different playing times in the guide video store different filter guide values, the corresponding frame images at different playing times in the original video are filter-processed at different processing levels, according to the filter guide values stored in the pixel points of the guide-video image at each playing time.
Step 204, for a current pixel point in a current image frame in an original video, obtaining a filter guide value stored by a corresponding pixel point in a corresponding image frame in a guide video;
and step 206, performing filter processing on the current pixel point according to the filter guide value.
In summary, the filter processing method provided in this embodiment acquires a guide video corresponding to an original video, wherein a filter guide value is stored in each pixel point of each image frame in the guide video; for a current pixel point in a current image frame of the original video, it acquires the filter guide value stored in the corresponding pixel point of the corresponding image frame in the guide video; and it performs filter processing on the current pixel point according to the filter guide value. Each frame image of the original video is thus filter-processed according to the corresponding frame image of the guide video, so that the region on which the filter acts can change dynamically from moment to moment, and different regions of the same frame image can be filter-processed differently.
As a possible implementation manner, the implementation procedure of step 206 may be:
letting the original RGB value of the current pixel point be (r, g, b), acquiring the filter-processed reference RGB value (r′, g′, b′) according to the preset correspondence;
and taking the filter guide value a as a weight, performing filter processing on the RGB value of the current pixel point according to the reference RGB value (r′, g′, b′).
For example, the filter processing of the RGB value of the current pixel point is:
(r″, g″, b″) = (r, g, b) × (n − a)/n + (r′, g′, b′) × a/n, where n is the maximum value of the range of the filter guide value a.
Obviously, different filter guide values produce different filter effects on the pixel points of the original image, and the regions on which the filter acts may differ from moment to moment. In an illustrative example, by setting filter regions for different playing moments of the guide video, the regions on which the filter acts in the original video change dynamically across playing moments.
Referring to fig. 2B, the image frames of the original video correspond one-to-one with those of the guide video. The guide video specifies, for each moment, the filter regions and the filter guide values of the pixel points, so that different regions of the original video, and the image frames at different moments, are each filter-processed accordingly.
Referring to fig. 2C, taking time t as an example: image frame 21 is the frame of the original video at time t; at time t in the guide video 22, the filter guide value of the bottom filter region is k and that of the remaining regions is 0; after filter processing, image frame 21 becomes image frame 23.
In a specific implementation, the guide video may be a black-and-white video or a color video, and the filter guide value may be stored in the gray value of a pixel point or in its R value, G value, or B value. The possible cases include the following:
A first possible way: when the guide video is a black-and-white video, the filter guide value is stored in the gray value.
A second possible way: when the guide video is a color video, the filter guide value is stored in the R value, G value, or B value.
The above possibilities will be explained in detail below using specific embodiments.
For the first possible way: when the guide video is a black-and-white video, every pixel point in every image frame of the guide video carries a gray value a, so the filter guide value can be stored in the gray value a of the pixel points, guiding the filter processing of the original video.
Fig. 3 illustrates one image frame in a guide video. Each pixel point 32 in the image frame 31 stores a gray value a, where 0 ≤ a ≤ 255; this gray value stores the filter guide value.
Referring to fig. 4A, a flowchart of a filter processing method according to another embodiment of the invention is shown. This embodiment is described with the filter processing method applied to an electronic device, with the image frames in the guide video being those shown in fig. 3, and with the filter being the color-mixing filter shown in fig. 1. The method includes:
step 401, a guide video is obtained.
There are two methods for obtaining the guide video:
First: receiving a selection signal for one of n preset guide videos, and acquiring the selected guide video.
The user can select one of the n preset guide videos for the original video; each preset guide video corresponds to a different filter effect, and the user selects according to the effect desired for the original video. After receiving the user's selection signal, the electronic device acquires the selected guide video.
Second: importing a custom guide video.
The user can import a custom guide video into the electronic device, select it, and perform filter processing.
The electronic device acquires the guide video corresponding to the original video from a preset guide video library. The correspondence may be between the effect the original video is expected to present and the guide video. For example, if the original video is expected to show color appearing wherever a brush sweeps, a guide video presenting that effect is acquired from the guide video library.
It should be noted that:
when the number of frames of the guide video is less than the number of frames of the original video, the guide video may be played again from the beginning after the end of the play of the guide video until the end of the play of the original video. Or, according to an interpolation algorithm, uniformly adding a plurality of frames in the guide video, so that the guide video with the added frame number has the same frame number as the original video.
And when the frame number of the guide video is greater than that of the original video, stopping the playing of the guide video after the playing of the original video is finished. Or, according to an interpolation algorithm, uniformly deleting a plurality of frames in the guide video, so that the guide video with the deleted frame number has the same frame number as the original video.
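Either strategy for a frame-count mismatch reduces to mapping an original-video frame index to a guide-video frame index. A sketch of the non-interpolating variant (the helper name `guide_frame_for` is an illustrative assumption):

```python
def guide_frame_for(i, n_guide, n_original):
    """Map original-video frame index i to a guide-video frame index.

    If the guide video is shorter, it loops from the beginning once it
    ends; if it is longer, its trailing frames are simply never used,
    since playback stops with the original video.
    """
    if not 0 <= i < n_original:
        raise IndexError("frame index outside the original video")
    return i % n_guide
```

The interpolation alternative mentioned above would instead resample the guide video to exactly `n_original` frames, so the mapping becomes the identity.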
Step 402, it is detected whether the guide video and the original video have the same pixel size.
The electronic device acquires the length and width (in pixels) of the original video and the guide video, and when the length of the original video coincides with the length of the guide video and the width of the original video coincides with the width of the guide video, the original video and the guide video have the same pixel size.
In step 403, if the original video and the guide video do not have the same pixel size, the pixel size of each frame image in the original video is scaled to the pixel size of the guide video by an interpolation algorithm, or the pixel size of each frame image in the guide video is scaled to the pixel size of the original video by an interpolation algorithm.
For example, if the pixel size of the guide video is 640 × 320 pixels and that of the original video is 540 × 360 pixels, the length of the original video can be enlarged to 640 pixels and its width reduced to 320 pixels to match the guide video; or the length of the guide video can be reduced to 540 pixels and its width enlarged to 360 pixels to match the original video.
It should be noted that, in order to ensure that the pixel size of the original video after being processed by the filter does not change, it is preferable to scale the pixel size of each frame image in the guide video to the pixel size of the original video by an interpolation algorithm.
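The scaling of step 403 can be sketched with the simplest interpolation, nearest-neighbor (the patent only requires "an interpolation algorithm"; the function name `resize_nearest` and the frame-as-nested-list representation are illustrative assumptions):

```python
def resize_nearest(frame, new_w, new_h):
    """Nearest-neighbor scaling of a frame given as rows of pixel values.

    Nearest-neighbor is the simplest interpolation and stands in here
    for bilinear/bicubic; each output pixel copies the source pixel
    whose coordinates scale to it.
    """
    old_h, old_w = len(frame), len(frame[0])
    return [
        [frame[y * old_h // new_h][x * old_w // new_w] for x in range(new_w)]
        for y in range(new_h)
    ]
```

Scaling the guide video rather than the original (as the text recommends) keeps the filter-processed output at the original video's pixel size.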
Step 404, for a current pixel point in a current image frame in the original video, obtaining a filter guide value stored by a corresponding pixel point in a corresponding image frame in the guide video.
In general, when the playing time of the original video and the guidance video are synchronized, the current image frame in the original video and the corresponding image frame in the guidance video have the same playing time, and the current pixel point in the current image frame and the corresponding pixel point in the guidance video have the same position.
However, the user can set a start time for the guide video, so that the original video begins filter processing under the guide video's guidance only from a certain playing time. In this case, this step may include the following sub-steps:
firstly, starting the guide video when the playing time of the original video reaches t;
secondly, letting the playing time of the current image frame be t′ (t′ ≥ t), and acquiring the image frame with playing time t′ − t in the guide video as the corresponding image frame of the current image frame;
for example, the user may set the activation time of the guide video to be 5s, which means that the guide video is activated when the original video is played for 5 s. It should be noted that, in order to ensure the filter effect, the set start time of the guide video needs to be shorter than the time length of the original video. And if the playing time of the current image frame is 10s and the starting time of the guide video is 5s, acquiring the image frame with the playing time of 5s in the guide video as the corresponding image frame of the current image frame.
Thirdly, acquiring pixel points with the same positions as the current pixel points in the corresponding image frame as corresponding pixel points;
because the original video and the guide video have the same pixel size, the pixel points with the same coordinate position in the corresponding image frame in the guide video are obtained as the corresponding pixel points according to the coordinate position of the pixel point in the current image frame in the original video. For example, for a pixel point (x, y) in an image frame with a playing time t in an original video, a pixel point with the same coordinate position (x, y) in a corresponding image frame in the guide video is obtained as a corresponding pixel point.
And fourthly, acquiring the filter guide value stored by the corresponding pixel point.
Because the filter guide value is stored in the gray value of the pixel points, the gray value of the corresponding pixel point is acquired.
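The four sub-steps above can be sketched as a single lookup (the `guide` mapping, indexed by playing-time offset, and the function name `guide_value` are illustrative assumptions):

```python
def guide_value(guide, t_start, t_now, x, y):
    """Return the filter guide value for original-video pixel (x, y)
    at playing time t_now.

    guide maps a playing time to a guide-video frame (a 2-D list of
    gray values); the guide video was started at original-video time
    t_start, with t_now >= t_start.
    """
    frame = guide[t_now - t_start]   # frame with playing time t' - t
    return frame[y][x]               # same coordinates, since sizes match
```

The coordinate pass-through on the last line relies on steps 402 and 403 having made the two videos the same pixel size.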
Step 405, obtaining the reference RGB values after filter processing as (r ', g ', b ') according to the preset corresponding relationship.
Taking a filter as a color mixing filter as an example, the reference RGB values after the color mixing filter processing are obtained as (r ', g ', b ') according to the preset corresponding relationship.
The preset corresponding relationship may be stored by using pixel arrays numbered sequentially from 0 to 255, and the pixel arrays may be stored separately in a picture in RGB format, as shown in fig. 1.
When the number of each pixel point in the pixel array corresponds to an original RGB channel value, and the RGB value stored in each pixel point is the filter-processed reference RGB value:
for the R value in the original RGB value of the current pixel point, reading the R value stored in the pixel point with the number of R in the pixel array, and taking the R value stored in the pixel point with the number of R as the reference R value of the current pixel point after filter processing;
for the G value in the original RGB value of the current pixel point, reading the G value stored in the pixel point with the number of G in the pixel array, and taking the G value stored in the pixel point with the number of G as the reference G value after the filter processing of the current pixel point;
and for the B value in the original RGB value of the current pixel point, reading the B value stored in the pixel point with the number of B in the pixel array, and taking the B value stored in the pixel point with the number of B as the reference B value of the current pixel point after filter processing.
Step 406: taking the filter guide value a as a weight, performing filter processing on the RGB value of the current pixel point according to the reference RGB value (r′, g′, b′).
As a possible implementation manner, the color-mixing filter processing is performed on the RGB value of the current pixel point as follows:
(r ", g", b ") (r, g, b) × (n-a)/n + (r ', g ', b ') × a/n, where n is the maximum value of the range of the filter guide value a.
The electronic device performs color-mixing filter processing on the RGB value of each pixel point in the original video, according to the filter guide value of the corresponding pixel point in the guide video and the reference RGB value of the pixel point. Obviously:
when a = 0, the pixel point is not color-mixing-filter processed and can be skipped directly;
when 0 < a < 255, the RGB value of the pixel point after color-mixing filter processing is: (r″, g″, b″) = (r, g, b) × (255 − a)/255 + (r′, g′, b′) × a/255;
and when a = 255, the RGB value of the pixel point becomes the reference RGB value after color-mixing filter processing.
And repeating the steps 404 to 406 until all the pixel points in the original video are processed by the color-mixing filter.
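Repeating steps 404 to 406 over every pixel of a frame can be sketched as one pass (all names are illustrative; the lookup table `lut` plays the role of pixel array 11 in fig. 1, and gray values serve as filter guide values):

```python
def apply_guided_filter(frame, guide_frame, lut, n=255):
    """Apply the color-mixing filter to one frame under a gray-value
    guide frame of the same pixel size.

    lut is a list of 256 (R, G, B) tuples giving the reference values.
    """
    out = []
    for row, guide_row in zip(frame, guide_frame):
        out_row = []
        for (r, g, b), a in zip(row, guide_row):
            if a == 0:                                  # guide value 0: skip
                out_row.append((r, g, b))
                continue
            ref = (lut[r][0], lut[g][1], lut[b][2])     # step 405: lookup
            out_row.append(tuple(                        # step 406: blend
                round(o * (n - a) / n + p * a / n)
                for o, p in zip((r, g, b), ref)
            ))
        out.append(out_row)
    return out
```

Running this over every frame of the original video, each paired with its corresponding guide frame, yields the filter-processed video.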
In summary, the filter processing method provided in this embodiment acquires a guide video corresponding to an original video, wherein a filter guide value is stored in each pixel point of each image frame in the guide video; for a current pixel point in a current image frame of the original video, it acquires the filter guide value stored in the corresponding pixel point of the corresponding image frame in the guide video; and it performs filter processing on the current pixel point according to the filter guide value. Each frame image of the original video is thus filter-processed according to the corresponding frame image of the guide video, so that the region on which the filter acts can change dynamically from moment to moment, and different regions of the same frame image can be filter-processed differently.
In this embodiment, the filter guide value is stored in the gray value of the pixel points of the image frames in the guide video. The original video is filter-processed accordingly, the region on which the filter acts can change dynamically from moment to moment, and the region and processing level of the filter can be seen intuitively from the guide video itself.
Referring to fig. 4B, in an illustrative example, a short-video sharing application is installed on the user's electronic device. The user obtains an original video 41 in the application by shooting or downloading and selects a desired effect 43 in the filter effect selection 42; the electronic device then acquires the guide video corresponding to effect 43. The original video is filter-processed according to the guide video to generate the filter-processed video 44.
For the second possible way: when the guide video is a color video, every pixel point in every image frame of the guide video carries an R value, a G value, and a B value, so the filter guide value can be stored in the R, G, or B value of the pixel points, guiding the filter processing of the original video.
Fig. 5 illustrates one image frame in a guide video. Each pixel point 52 in the image frame 51 stores an R value a, where 0 ≤ a ≤ 255; this R value stores the filter guide value.
The following will describe in detail using one embodiment.
Referring to fig. 6, a flowchart of a filter processing method according to yet another embodiment of the invention is shown. This embodiment is described with the filter processing method applied to an electronic device, with the image frames in the guide video being those shown in fig. 5, and with the filter being the color-mixing filter shown in fig. 1. The method includes:
step 601, acquiring a guide video.
There are two methods for obtaining the guide video:
and thirdly, receiving selection signals of 1 guide video in the preset n guide videos to obtain the selected guide video.
The user can select one guide video as the original video from the preset n guide videos, each preset guide video corresponds to different filter effects, and the user can select the guide video according to the effect of the original video. The electronic equipment obtains the guide video selected by the user after receiving the selection signal of the user.
And fourthly, importing the user-defined guide video.
The user can guide the self-defined guide video into the electronic equipment, select the guide video and perform filter processing.
It should be noted that:
When the guide video has fewer frames than the original video, the guide video may be replayed from the beginning after it ends, until playback of the original video finishes. Alternatively, a number of frames may be uniformly added to the guide video according to an interpolation algorithm, so that the guide video has the same number of frames as the original video.
When the guide video has more frames than the original video, playback of the guide video stops when the original video finishes. Alternatively, a number of frames may be uniformly deleted from the guide video according to an interpolation algorithm, so that the guide video has the same number of frames as the original video.
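Not part of the patent, but the two frame-count strategies above can be sketched in a few lines of Python; the function names and the nearest-frame resampling (standing in for the unspecified interpolation algorithm) are illustrative assumptions:

```python
def guide_frame_index(orig_idx, n_guide):
    """Loop the guide video from the beginning whenever it runs out
    before the original video does (the replay strategy)."""
    return orig_idx % n_guide


def uniform_resample_indices(n_guide, n_orig):
    """Uniformly repeat (or drop) guide-frame indices so that the guide
    video ends up with exactly as many frames as the original video
    (a nearest-frame stand-in for the interpolation strategy)."""
    if n_orig <= 1:
        return [0] * n_orig
    # Map original frame i onto the guide's 0..n_guide-1 index range.
    return [i * (n_guide - 1) // (n_orig - 1) for i in range(n_orig)]
```

For a 3-frame guide video and a 5-frame original video, `uniform_resample_indices(3, 5)` repeats guide frames as `[0, 0, 1, 1, 2]`; for a 5-frame guide and 3-frame original it drops frames, giving `[0, 2, 4]`.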
Step 602, it is detected whether the guide video and the original video have the same pixel size.
The electronic device acquires the length and width (in pixels) of the original video and the guide video; when the length of the original video equals the length of the guide video and the width of the original video equals the width of the guide video, the two videos have the same pixel size.
Step 603, if the original video and the guide video do not have the same pixel size, the pixel size of each frame image in the original video is scaled to the pixel size of the guide video through an interpolation algorithm, or the pixel size of each frame image in the guide video is scaled to the pixel size of the original video through the interpolation algorithm.
For example, if the pixel size of the guide video is 640 × 320 pixels and the pixel size of the original video is 540 × 360 pixels, the length of the original video can be enlarged to 640 pixels and its width reduced to 320 pixels to match the guide video; alternatively, the length of the guide video can be reduced to 540 pixels and its width enlarged to 360 pixels to match the original video.
It should be noted that, in order to ensure that the pixel size of the original video after the filter processing remains unchanged, it is preferable to scale the pixel size of each frame image in the guide video to the pixel size of the original video through an interpolation algorithm.
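As an illustrative sketch only (the patent does not name a particular interpolation algorithm), a nearest-neighbor resize of a frame held as a list of pixel rows might look like this; per the preference stated above, it would be applied to the guide video's frames to match the original video's size:

```python
def nearest_neighbor_resize(frame, new_w, new_h):
    """Scale a frame (a list of rows of pixel values) to new_w x new_h
    using nearest-neighbor interpolation -- a minimal stand-in for the
    'interpolation algorithm' mentioned in step 603."""
    old_h, old_w = len(frame), len(frame[0])
    return [[frame[y * old_h // new_h][x * old_w // new_w]
             for x in range(new_w)]
            for y in range(new_h)]
```

Enlarging a 2 × 2 frame to 4 × 4 simply duplicates each pixel into a 2 × 2 block; shrinking keeps every second pixel.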
Step 604, for a current pixel point in a current image frame in the original video, obtaining a filter guide value stored by a corresponding pixel point in a corresponding image frame in the guide video.
In general, when the playing times of the original video and the guide video are synchronized, the current image frame in the original video and the corresponding image frame in the guide video have the same playing time, and the current pixel point in the current image frame and the corresponding pixel point in the guide video are at the same position.
However, the user can also set a start time for the guide video, so that the original video begins to be filter-processed under the guidance of the guide video only from a certain playing time. In this case, this step may include the following substeps:
First, the guide video is started when the playing time of the original video is t.
Second, with the playing time of the current image frame denoted t' (t' ≥ t), the image frame whose playing time in the guide video is t' - t is acquired as the corresponding image frame of the current image frame.
For example, the user may set the start time of the guide video to 5 s, meaning that the guide video is started when the original video has played for 5 s. It should be noted that, to ensure the filter effect, the set start time of the guide video needs to be shorter than the duration of the original video. If the playing time of the current image frame is 10 s and the start time of the guide video is 5 s, the image frame whose playing time in the guide video is 5 s is acquired as the corresponding image frame of the current image frame.
Third, the pixel point at the same position as the current pixel point is acquired from the corresponding image frame as the corresponding pixel point.
Because the original video and the guide video have the same pixel size, the pixel point with the same coordinate position in the corresponding image frame of the guide video is acquired according to the coordinate position of the current pixel point in the original video. For example, for the pixel point (x, y) in the current image frame of the original video, the pixel point at the same coordinate position (x, y) in the corresponding image frame of the guide video is acquired as the corresponding pixel point.
Fourth, the filter guide value stored in the corresponding pixel point is acquired.
Since the filter guide value is stored using the R value, the G value, or the B value of the pixel point, the R value, the G value, or the B value of the corresponding pixel point is obtained.
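The four substeps above might be sketched as follows. This is a hypothetical Python illustration, not taken from the patent; the `fps` parameter and frame indexing by `int((t' - t) * fps)` are assumptions, as is holding the guide video in memory as `guide_frames[frame][row][col]`:

```python
def corresponding_guide_value(guide_frames, t_start, t_current, x, y, fps):
    """Fetch the filter guide value for original-video pixel (x, y) at
    playing time t_current, given that the guide video was started at
    original-video time t_start (t_current >= t_start).

    Times are in seconds; guide_frames[i][y][x] holds the guide value
    stored in that pixel point (e.g. the R channel of a colour guide
    video).
    """
    frame_idx = int((t_current - t_start) * fps)  # frame at time t' - t
    return guide_frames[frame_idx][y][x]          # same (x, y) position
```

For a 1 fps guide video started at t = 5 s, the frame at playing time 6 s of the original video maps to guide frame index 1.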
Step 605, the reference RGB value after filter processing, (r', g', b'), is obtained according to the preset correspondence.
Taking the color-mixing filter as an example, the reference RGB value after color-mixing filter processing, (r', g', b'), is obtained according to the preset correspondence.
The preset correspondence may be stored using a pixel array numbered sequentially from 0 to 255, and the pixel array may be stored separately in a picture in RGB format, as shown in fig. 1.
In this case, the number of each pixel point in the pixel array corresponds to an original RGB level, and the RGB value stored in each pixel point is the reference RGB value after filter processing. Then:
for the R value in the original RGB value of the current pixel point, reading the R value stored in the pixel point with the number of R in the pixel array, and taking the R value stored in the pixel point with the number of R as the reference R value of the current pixel point after filter processing;
for the G value in the original RGB value of the current pixel point, reading the G value stored in the pixel point with the number of G in the pixel array, and taking the G value stored in the pixel point with the number of G as the reference G value after the filter processing of the current pixel point;
and for the B value in the original RGB value of the current pixel point, reading the B value stored in the pixel point with the number of B in the pixel array, and taking the B value stored in the pixel point with the number of B as the reference B value of the current pixel point after filter processing.
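A minimal sketch of the per-channel lookup described in the three steps above, assuming the pixel array has already been decoded into a Python list `lut` where entry k holds the reference (R, G, B) for input level k (the names are illustrative, not from the patent):

```python
def lut_reference_rgb(lut, r, g, b):
    """Look up the filter's reference RGB value for an original pixel
    (r, g, b) in a 256-entry pixel array: entry number k stores the
    filtered (R, G, B) for input level k. Each channel is looked up
    independently, as described for the R, G, and B values."""
    return (lut[r][0],   # R value stored in the pixel numbered r
            lut[g][1],   # G value stored in the pixel numbered g
            lut[b][2])   # B value stored in the pixel numbered b
```

For example, with a lookup table built as `lut = [(255 - i, i, i // 2) for i in range(256)]`, the pixel (0, 255, 100) maps to the reference value (255, 255, 50).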
Step 606, taking the filter guide value a as a weight, the RGB value of the current pixel point is filter-processed according to the reference RGB value (r', g', b').
The RGB value of the current pixel point is processed by the color-mixing filter as:
(r", g", b") = (r, g, b) × (n-a)/n + (r', g', b') × a/n, where n is the maximum value of the range of the filter guide value a.
The electronic device performs color-mixing filter processing on the RGB value of each pixel point in the original video according to the filter guide value of the corresponding pixel point in the guide video and the reference RGB value of that pixel point. Specifically:
when a = 0, the pixel point is not subjected to color-mixing filter processing and can be skipped directly;
when 0 < a < 255, the RGB value of the pixel point is processed by the color-mixing filter as: (r", g", b") = (r, g, b) × (255-a)/255 + (r', g', b') × a/255;
when a = 255, the RGB value of the pixel point after color-mixing filter processing is the reference RGB value.
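The weighted blend of step 606, including the a = 0 and a = 255 shortcuts above, can be sketched as follows (an illustrative Python sketch with integer arithmetic; the function name is an assumption, not from the patent):

```python
def apply_tint_filter(rgb, ref_rgb, a, n=255):
    """Blend the original RGB value with the filtered reference value,
    weighted by the filter guide value a in [0, n], per
    (r'', g'', b'') = (r, g, b) * (n - a)/n + (r', g', b') * a/n."""
    if a == 0:          # guide value 0: no filtering, skip the pixel
        return rgb
    if a == n:          # guide value n: result is the reference value
        return ref_rgb
    return tuple((c * (n - a) + ref * a) // n
                 for c, ref in zip(rgb, ref_rgb))
```

With a = 51 out of 255 (a 20 % weight), the pixel (100, 100, 100) blended toward the reference (200, 0, 50) becomes (120, 80, 90).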
Steps 604 to 606 are repeated until all pixel points in the original video have been processed by the color-mixing filter.
In summary, in the filter processing method provided in this embodiment, a guide video corresponding to the original video is obtained, with a filter guide value stored in the pixel points of its image frames; for a current pixel point in a current image frame of the original video, the filter guide value stored in the corresponding pixel point of the corresponding image frame in the guide video is obtained; and the current pixel point is filter-processed according to that filter guide value. This achieves the effect that each frame of the original video can be filter-processed according to the corresponding frame of the guide video, so that the region where the filter acts on the original video changes dynamically over time, and different regions of the same frame of the original video can each be filter-processed separately.
In this embodiment, the R value, the G value, or the B value of the pixel points in the image frames of the guide video is used to store the filter guide value, and the original video is filter-processed accordingly. As a result, the region where the filter acts on the original video can change dynamically over time, and the region and level of filter processing can be seen visually from the guide video.
Referring to fig. 7, a block diagram of a filter processing apparatus according to an embodiment of the present invention is shown. The filter processing means may be implemented in software, hardware or a combination of both as all or part of an electronic device having video processing capabilities. This filter processing apparatus includes:
a guide video obtaining module 720, configured to obtain a guide video corresponding to an original video, where a filter guide value is stored in a pixel point of an image frame in the guide video;
a filter guide value obtaining module 740, configured to obtain, for a current pixel point in a current image frame in the original video, the filter guide value stored by a corresponding pixel point in a corresponding image frame in the guide video;
and a filter processing module 760, configured to perform filter processing on the current pixel point according to the filter guide value.
In summary, in the filter processing apparatus provided in this embodiment, a guide video corresponding to the original video is obtained, with a filter guide value stored in the pixel points of its image frames; for a current pixel point in a current image frame of the original video, the filter guide value stored in the corresponding pixel point of the corresponding image frame in the guide video is obtained; and the current pixel point is filter-processed according to that filter guide value. This achieves the effect that each frame of the original video can be filter-processed according to the corresponding frame of the guide video, so that the region where the filter acts on the original video changes dynamically over time, and different regions of the same frame of the original video can each be filter-processed separately.
Referring to fig. 8, a block diagram of a filter processing apparatus according to another embodiment of the present invention is shown. The filter processing means may be implemented in software, hardware or a combination of both as all or part of an electronic device having video processing capabilities. This filter processing apparatus includes:
a guide video obtaining module 720, configured to obtain a guide video corresponding to an original video, where a filter guide value is stored in a pixel point of an image frame in the guide video;
a filter guide value obtaining module 740, configured to obtain, for a current pixel point in a current image frame in the original video, the filter guide value stored by a corresponding pixel point in a corresponding image frame in the guide video;
and a filter processing module 760, configured to perform filter processing on the current pixel point according to the filter guide value.
Optionally, the filter processing module 760 includes:
a reference RGB value obtaining unit 762, configured to set an original RGB value of the pixel point to (r, g, b), and obtain a reference RGB value after filter processing according to a preset correspondence as (r ', g ', b ');
the filter processing unit 764 is configured to perform filter processing on the RGB values of the current pixel point according to the reference RGB values (r ', g ', b ') by using the filter guide value a as a weight.
Optionally, the filter processing unit 764 comprises:
a filter processing subunit 764a, configured to perform filter processing on the RGB values of the current pixel point by:
(r ", g", b ") (r, g, b) × (n-a)/n + (r ', g ', b ') × a/n, where n is the maximum value of the range of the filter guide value a.
Optionally, the reference RGB value obtaining unit 762 includes:
a first reference value obtaining subunit 762a, configured to, when the preset correspondence relationship is stored using a pixel array sequentially numbered from 0 to 255, and a number value of a pixel point in the pixel array corresponds to an original RGB value, and an RGB value stored in the pixel point in the pixel array is a reference RGB value after filter processing, read, for an R value in the original RGB value of the current pixel point, an R value stored in a pixel point numbered R in the pixel array, and use the R value stored in the pixel point numbered R as the reference R value after filter processing of the current pixel point;
a second reference value obtaining subunit 762b, configured to, when the preset correspondence relationship is stored using a pixel array sequentially numbered from 0 to 255, and a number value of a pixel point in the pixel array corresponds to an original RGB value, and an RGB value stored in the pixel point in the pixel array is a reference RGB value after filter processing, read, for a G value in the original RGB value of the current pixel point, a G value stored in a pixel point numbered G in the pixel array, and use the G value stored in the pixel point numbered G as the reference G value after filter processing of the current pixel point;
a third reference value obtaining subunit 762c, configured to, when the preset correspondence relationship is stored using a pixel array sequentially numbered from 0 to 255, and a number value of a pixel point in the pixel array corresponds to an original RGB value, and an RGB value stored in the pixel point in the pixel array is a reference RGB value after filter processing, read, for a B value in the original RGB value of the current pixel point, a B value stored in a pixel point numbered as B in the pixel array, and use the B value stored in the pixel point numbered as B as the reference B value after filter processing of the current pixel point.
Optionally, the guiding video obtaining module 720 includes:
a guide video acquisition unit 722 for acquiring the guide video;
a video size detection unit 724 for detecting whether the guide video and the original video have the same pixel size;
a video size adjusting unit 726, configured to, if the guide video and the original video do not have the same pixel size, scale the pixel size of the image frames in the original video to the pixel size of the guide video through an interpolation algorithm, or scale the pixel size of the image frames in the guide video to the pixel size of the original video through an interpolation algorithm.
Optionally, the guiding video obtaining module 720 includes:
a first guide video acquiring unit 723, configured to receive a selection signal for one of n preset guide videos, and acquire the selected guide video;
or,
a second guide video obtaining unit 725, configured to import a user-defined guide video.
Optionally, the filter guide value obtaining module 740 includes:
a video enabling unit 742, configured to start enabling the guide video when a playing time of the original video is t;
an image frame acquiring unit 744, configured to set the playing time of the current image frame as t', where t' is equal to or greater than t, and acquire the image frame with the playing time of t' - t in the guide video as the corresponding image frame of the current image frame;
a pixel point obtaining unit 746, configured to obtain a pixel point in the corresponding image frame that has the same position as the current pixel point as the corresponding pixel point;
a filter guiding value obtaining unit 748, configured to obtain the filter guiding value stored in the corresponding pixel point.
In summary, in the filter processing apparatus provided in this embodiment, a guide video corresponding to the original video is obtained, with a filter guide value stored in the pixel points of its image frames; for a current pixel point in a current image frame of the original video, the filter guide value stored in the corresponding pixel point of the corresponding image frame in the guide video is obtained; and the current pixel point is filter-processed according to that filter guide value. This achieves the effect that each frame of the original video can be filter-processed according to the corresponding frame of the guide video, so that the region where the filter acts on the original video changes dynamically over time, and different regions of the same frame of the original video can each be filter-processed separately.
It should be noted that: in the filter processing apparatus provided in the foregoing embodiment, only the division of the functional modules is illustrated in the filter processing, and in practical applications, the functions may be distributed by different functional modules according to needs, that is, the internal structure of the electronic device having a video processing function is divided into different functional modules to complete all or part of the functions described above. In addition, the filter processing apparatus and the filter processing method provided by the above embodiments belong to the same concept, and specific implementation processes thereof are described in the method embodiments and are not described herein again.
Referring to fig. 9, a schematic structural diagram of an electronic device according to an embodiment of the invention is shown. The electronic device is configured to implement the filter processing method provided in the above embodiments. Specifically:
electronic device 900 may include RF (Radio Frequency) circuitry 910, memory 920 including one or more computer-readable storage media, input unit 930, display unit 940, sensor 950, audio circuitry 960, WiFi (wireless fidelity) module 970, processor 980 including one or more processing cores, and power supply 990, among other components. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 9 does not constitute a limitation of the electronic device and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components. Wherein:
the RF circuit 910 may be used for receiving and transmitting signals during a message transmission or call, and in particular, for receiving downlink information from a base station and then processing the received downlink information by the one or more processors 980; in addition, data relating to uplink is transmitted to the base station. In general, RF circuit 910 includes, but is not limited to, an antenna, at least one Amplifier, a tuner, one or more oscillators, a Subscriber Identity Module (SIM) card, a transceiver, a coupler, an LNA (Low Noise Amplifier), a duplexer, and the like. In addition, the RF circuit 910 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA (Code Division Multiple Access), WCDMA (Wideband Code Division Multiple Access), LTE (Long Term Evolution), e-mail, SMS (short messaging Service), etc.
The memory 920 may be used to store software programs and modules, and the processor 980 performs various functional applications and data processing by operating the software programs and modules stored in the memory 920. The memory 920 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the electronic apparatus 900, and the like. Further, the memory 920 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device. Accordingly, the memory 920 may also include a memory controller to provide the processor 980 and the input unit 930 with access to the memory 920.
The input unit 930 may be used to receive input numeric or character information and generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control. Specifically, the input unit 930 may include an image input device 931 and other input devices 932. The image input device 931 may be a camera or a photoelectric scanning device. The input unit 930 may include other input devices 932 in addition to the image input device 931. In particular, other input devices 932 may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 940 may be used to display information input by or provided to the user and various graphical user interfaces of the electronic device 900, which may be made up of graphics, text, icons, video, and any combination thereof. The Display unit 940 may include a Display panel 941, and optionally, the Display panel 941 may be configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like.
The electronic device 900 may also include at least one sensor 950, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor that may adjust the brightness of the display panel 941 according to the brightness of ambient light, and a proximity sensor that may turn off the display panel 941 and/or a backlight when the electronic device 900 is moved to the ear. As one of the motion sensors, the gravity acceleration sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when the mobile phone is stationary, and can be used for applications of recognizing the posture of the mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which may be further configured to the electronic device 900, detailed descriptions thereof are omitted.
The audio circuitry 960, speaker 961, microphone 962 may provide an audio interface between a user and the electronic device 900. The audio circuit 960 may transmit the electrical signal converted from the received audio data to the speaker 961, and convert the electrical signal into a sound signal for output by the speaker 961; microphone 962, on the other hand, converts collected sound signals into electrical signals, which are received by audio circuit 960 and converted into audio data, which are processed by audio data output processor 980, either through RF circuit 910 for transmission to another electronic device, for example, or output to memory 920 for further processing. The audio circuit 960 may also include an earbud jack to provide communication of a peripheral headset with the electronic device 900.
WiFi belongs to short-range wireless transmission technology, and the electronic device 900 can help the user send and receive e-mails, browse webpages, access streaming media and the like through the WiFi module 970, which provides the user with wireless broadband internet access. Although fig. 9 shows the WiFi module 970, it is understood that it does not belong to the essential constitution of the electronic device 900 and may be omitted entirely as needed within the scope not changing the essence of the invention.
The processor 980 is a control center of the electronic device 900, connects various parts of the entire cellular phone using various interfaces and lines, and performs various functions of the electronic device 900 and processes data by operating or executing software programs and/or modules stored in the memory 920 and calling data stored in the memory 920, thereby integrally monitoring the cellular phone. Optionally, processor 980 may include one or more processing cores; preferably, the processor 980 may integrate an application processor, which primarily handles operating systems, user interfaces, applications, etc., and a modem processor, which primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 980.
The electronic device 900 also includes a power supply 990 (e.g., a battery) for supplying power to the various components, which may be logically connected to the processor 980 via a power management system, for managing charging, discharging, and power consumption via the power management system. Power supply 990 may also include any component of one or more dc or ac power sources, recharging systems, power failure detection circuits, power converters or inverters, power status indicators, and the like.
Although not shown, the electronic device 900 may further include a bluetooth module or the like, which is not described herein.
In particular, in this embodiment, the electronic device 900 also includes a memory and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the one or more processors. The above-mentioned one or more programs are used for executing the filter processing method provided by the above-mentioned embodiments.
It should be understood that, as used herein, the singular forms "a," "an," "the" are intended to include the plural forms as well, unless the context clearly supports the exception. It should also be understood that "and/or" as used herein is meant to include any and all possible combinations of one or more of the associated listed items.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (17)

1. A method of filter processing, the method comprising:
acquiring a guide video corresponding to an original video, wherein a filter guide value is stored in a pixel point of an image frame in the guide video, and the guide video is a video which is independently set and is used for guiding a filter process of the original video;
for a current pixel point in a current image frame in the original video, acquiring the filter guide value stored by a corresponding pixel point in a corresponding image frame in the guide video;
performing filter processing on the current pixel point according to the filter guide value; wherein:
for one frame of image in the guide video, the same or different filter guide values are stored in the pixel points in different areas; and/or storing the same or different filter guide values for the images at different playing moments in the guide video by the pixel points at the same positions in different images.
2. The method of claim 1, wherein the filter processing the current pixel point according to the filter guide value comprises:
setting the original RGB values of the pixel points as (r, g, b), and acquiring the reference RGB values after filter processing as (r ', g ', b ') according to the preset corresponding relation;
and taking the filter guide value a as weight, and carrying out filter processing on the RGB value of the current pixel point according to the reference RGB value (r ', g ', b ').
3. The method according to claim 2, wherein the filtering the RGB values of the current pixel point according to the reference RGB values (r ', g ', b ') with the filter guide value a as a weight comprises:
and filtering the RGB value of the current pixel point to obtain:
(r ", g", b ") (r, g, b) × (n-a)/n + (r ', g ', b ') × a/n, where n is the maximum value of the range of the filter guide value a.
4. The method according to claim 2, wherein the step of obtaining the filtered reference RGB values as (r ', g ', b ') according to the predetermined corresponding relationship by setting the original RGB values of the pixels as (r, g, b) comprises:
when the preset corresponding relation is stored by adopting a pixel array which is sequentially numbered from 0 to 255, the number value of a pixel point in the pixel array corresponds to an original RGB value, and the RGB value stored in the pixel point in the pixel array is a reference RGB value after filter processing,
for the R value in the original RGB value of the current pixel point, reading the R value stored in the pixel point with the number of R in the pixel array, and taking the R value stored in the pixel point with the number of R as the reference R value of the current pixel point after filter processing;
for the G value in the original RGB value of the current pixel point, reading the G value stored in the pixel point with the number of G in the pixel array, and taking the G value stored in the pixel point with the number of G as the reference G value of the current pixel point after filter processing;
and for the B value in the original RGB value of the current pixel point, reading the B value stored in the pixel point with the number of B in the pixel array, and taking the B value stored in the pixel point with the number of B as the reference B value of the current pixel point after filter processing.
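The per-channel lookup in claim 4 amounts to a 256-entry lookup table. A minimal sketch, assuming a hypothetical tone curve as the stored filter (the patent does not specify the curve; `curve` and `filter_reference` are illustrative names):

```python
def curve(i):
    # Hypothetical tone curve standing in for a real filter's mapping.
    return min(255, round((i / 255) ** 0.8 * 255))

# 256-entry pixel array: the index is the original channel level, and the
# RGB value stored at that index is the filter-processed reference value.
lut = [(curve(i), curve(i), curve(i)) for i in range(256)]

def filter_reference(r, g, b):
    # Per claim 4: the R channel reads the R value stored at index r,
    # the G channel reads index g, and the B channel reads index b.
    return lut[r][0], lut[g][1], lut[b][2]
```

Because each channel is looked up independently, one 256-entry array suffices for any point-wise color filter, regardless of image size.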
5. The method of claim 3,
when the guide video is a black-and-white video, the filter guide value is stored as a gray value; or,
when the guide video is a color video, the filter guide value is stored as an R value, a G value, or a B value.
6. The method according to any one of claims 1 to 5, wherein the obtaining of the guide video corresponding to the original video comprises:
acquiring the guide video;
detecting whether the guide video and the original video have the same pixel size;
if they do not have the same pixel size, scaling the pixel size of each image frame in the original video to the pixel size of the guide video through an interpolation algorithm, or scaling the pixel size of each image frame in the guide video to the pixel size of the original video through the interpolation algorithm.
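Claim 6 leaves the interpolation algorithm open; the simplest choice is nearest-neighbour scaling. A minimal sketch (the function name and the list-of-rows frame representation are assumptions for illustration):

```python
def resize_nearest(frame, dst_w, dst_h):
    """Scale a frame (a list of rows of pixel values) to dst_w x dst_h with
    nearest-neighbour interpolation, so the guide video and original video
    can be matched to the same pixel size before per-pixel lookup."""
    src_h, src_w = len(frame), len(frame[0])
    return [[frame[y * src_h // dst_h][x * src_w // dst_w]
             for x in range(dst_w)]
            for y in range(dst_h)]
```

Bilinear or bicubic interpolation would also satisfy the claim; nearest-neighbour is shown only because it keeps the sketch self-contained.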
7. The method according to any one of claims 1 to 5, wherein the obtaining of the guide video corresponding to the original video comprises:
receiving a selection signal for one of n preset guide videos, and acquiring the selected guide video;
or,
importing a user-defined guide video.
8. The method according to any one of claims 1 to 5, wherein said obtaining the stored filter guide values for corresponding pixel points in corresponding image frames in the guide video comprises:
starting playback of the guide video when the playing time of the original video reaches t;
denoting the playing time of the current image frame as t', where t' ≥ t, and acquiring the image frame whose playing time in the guide video is t' - t as the image frame corresponding to the current image frame;
acquiring the pixel point in the corresponding image frame at the same position as the current pixel point, as the corresponding pixel point;
and acquiring the filter guide value stored in the corresponding pixel point.
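The time and position alignment of claim 8 can be sketched as a frame-index lookup; a minimal sketch, assuming the guide video is available as a list of 2D arrays of guide values at a known frame rate (`guide_value` and its parameters are illustrative names):

```python
def guide_value(guide_frames, fps, t_start, t_now, x, y):
    """Claim 8's lookup: the guide video starts when the original video's
    playback reaches t_start, so the original frame at time t_now
    (t_now >= t_start) corresponds to the guide frame at t_now - t_start;
    the filter guide value is read from the same (x, y) position there."""
    idx = int((t_now - t_start) * fps)
    return guide_frames[idx][y][x]
```

Because the mapping is per-frame and per-position, the region a filter acts on can change dynamically as the guide video plays.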
9. A filter processing apparatus, comprising:
a guide video acquisition module, configured to acquire a guide video corresponding to an original video, wherein a filter guide value is stored in each pixel point of the image frames in the guide video, and the guide video is an independently set video used to guide the filter processing of the original video;
a filter guide value obtaining module, configured to obtain, for a current pixel point in a current image frame in the original video, the filter guide value stored by a corresponding pixel point in a corresponding image frame in the guide video;
the filter processing module is used for carrying out filter processing on the current pixel point according to the filter guide value; wherein:
within one image frame of the guide video, pixel points in different regions store the same or different filter guide values; and/or, across images at different playing times in the guide video, pixel points at the same position store the same or different filter guide values.
10. The apparatus of claim 9, wherein the filter processing module comprises:
a reference RGB value acquisition unit, configured to denote the original RGB value of the pixel point as (r, g, b) and acquire, according to a preset correspondence, the filter-processed reference RGB value as (r', g', b');
and a filter processing unit, configured to perform filter processing on the RGB value of the current pixel point according to the reference RGB value (r', g', b') with the filter guide value a as a weight.
11. The apparatus of claim 10, wherein the filter processing unit comprises:
a filter processing subunit, configured to perform filter processing on the RGB value of the current pixel point as:
(r'', g'', b'') = (r, g, b) × (n - a)/n + (r', g', b') × a/n, where n is the maximum value of the value range of the filter guide value a.
12. The apparatus according to claim 10, wherein the reference RGB value obtaining unit includes:
a first reference value acquisition subunit, configured to, when the preset correspondence is stored as a pixel array numbered sequentially from 0 to 255, in which the number of a pixel point corresponds to an original RGB value and the RGB value stored at that pixel point is the filter-processed reference RGB value, read, for the R value in the original RGB value of the current pixel point, the R value stored at the pixel point numbered r in the array, and use it as the filter-processed reference R value of the current pixel point;
a second reference value acquisition subunit, configured to, under the same storage of the preset correspondence, read, for the G value in the original RGB value of the current pixel point, the G value stored at the pixel point numbered g in the array, and use it as the filter-processed reference G value of the current pixel point;
and a third reference value acquisition subunit, configured to, under the same storage of the preset correspondence, read, for the B value in the original RGB value of the current pixel point, the B value stored at the pixel point numbered b in the array, and use it as the filter-processed reference B value of the current pixel point.
13. The apparatus of claim 12,
when the guide video is a black-and-white video, the filter guide value is stored as a gray value; or,
when the guide video is a color video, the filter guide value is stored as an R value, a G value, or a B value.
14. The apparatus according to any one of claims 9 to 13, wherein the guide video acquisition module comprises:
a guide video acquisition unit for acquiring the guide video;
a video size detection unit for detecting whether the guide video and the original video have the same pixel size;
and a video size adjustment unit, configured to, if the two videos do not have the same pixel size, scale the pixel size of the image frames in the original video to that of the guide video through an interpolation algorithm, or scale the pixel size of the image frames in the guide video to that of the original video through the interpolation algorithm.
15. The apparatus according to any one of claims 9 to 13, wherein the guide video acquisition module comprises:
a first guide video acquisition unit, configured to receive a selection signal for one of n preset guide videos and acquire the selected guide video;
or,
a second guide video acquisition unit, configured to import a user-defined guide video.
16. The apparatus according to any one of claims 9 to 13, wherein the filter guide value acquisition module includes:
a video starting unit, configured to start playback of the guide video when the playing time of the original video reaches t;
an image acquisition unit, configured to, when the playing time of the current image frame is t', where t' ≥ t, acquire the image frame whose playing time in the guide video is t' - t as the image frame corresponding to the current image frame;
a pixel point obtaining unit, configured to obtain a pixel point in the corresponding image frame, where the pixel point has a same position as the current pixel point, as the corresponding pixel point;
and the filter guide value acquisition unit is used for acquiring the filter guide value stored by the corresponding pixel point.
17. An electronic device, comprising the filter processing apparatus according to any one of claims 9 to 16.
CN201410384375.2A 2014-08-06 2014-08-06 Filter processing method and device and electronic equipment Active CN105338403B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410384375.2A CN105338403B (en) 2014-08-06 2014-08-06 Filter processing method and device and electronic equipment

Publications (2)

Publication Number Publication Date
CN105338403A CN105338403A (en) 2016-02-17
CN105338403B true CN105338403B (en) 2020-03-03

Family

ID=55288626

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410384375.2A Active CN105338403B (en) 2014-08-06 2014-08-06 Filter processing method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN105338403B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106204455A (en) * 2016-07-13 2016-12-07 广州市久邦数码科技有限公司 A kind of image processing method with multiple filtering effects and system thereof
CN107948733B (en) * 2017-12-04 2020-07-10 腾讯科技(深圳)有限公司 Video image processing method and device and electronic equipment
CN109451360B (en) * 2018-11-02 2021-03-05 北京亿幕信息技术有限公司 Video transition special effect method and engine
CN114598902B (en) * 2022-03-09 2023-12-22 安徽文香科技股份有限公司 Video frame processing method and device and electronic equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08194805A (en) * 1995-01-18 1996-07-30 Mitsubishi Electric Corp Picture processor and picture processing system
GB0519976D0 (en) * 2005-09-30 2005-11-09 Sony Uk Ltd Image processing
CN103533241A (en) * 2013-10-14 2014-01-22 厦门美图网科技有限公司 Photographing method of intelligent filter lens
CN103716712A (en) * 2013-12-31 2014-04-09 上海艾麒信息科技有限公司 Video processing method based on mobile terminal
CN103777852A (en) * 2012-10-18 2014-05-07 腾讯科技(深圳)有限公司 Image obtaining method and device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI493977B (en) * 2011-09-20 2015-07-21 Hannstar Display Corp Image searching module and method thereof

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant