CN109034068A - Video processing method and apparatus, electronic device and storage medium - Google Patents
Video processing method and apparatus, electronic device and storage medium
- Publication number: CN109034068A
- Application number: CN201810845512.6A
- Authority: CN (China)
- Legal status: Granted (the status is an assumption by Google, not a legal conclusion)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/40—Scenes; Scene-specific elements in video content
- G06V20/46—Extracting features or characteristics from the video content, e.g. video fingerprints, representative shots or key frames
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
Abstract
The present disclosure relates to a video processing method and apparatus, an electronic device, and a storage medium. The method comprises: performing edge detection on a video frame to be processed of a video to determine an edge region in the video frame to be processed; determining a highlight region in the video frame to be processed; determining a flash region in the video frame to be processed according to the edge region and the highlight region; and superimposing a flash layer on the video frame to be processed according to the position of the flash region. The present disclosure can produce a flash effect in a video at relatively low computational complexity, satisfying the real-time requirements of video processing.
Description
Technical field
The present disclosure relates to the technical field of computer vision, and in particular to a video processing method and apparatus, an electronic device, and a storage medium.
Background
In video processing, users often want to add specified special effects to a video to enhance its visual appeal. Such processing typically requires operating on a large number of video frames, and the computational complexity of the related art is relatively high, so it cannot satisfy the real-time requirements of video processing.
Summary of the invention
The present disclosure proposes a technical solution for video processing.
According to an aspect of the present disclosure, there is provided a video processing method, comprising:
performing edge detection on a video frame to be processed of a video to determine an edge region in the video frame to be processed;
determining a highlight region in the video frame to be processed;
determining a flash region in the video frame to be processed according to the edge region and the highlight region; and
superimposing a flash layer on the video frame to be processed according to the position of the flash region.
In one possible implementation, determining the highlight region in the video frame to be processed comprises: determining the highlight region according to regions of the video frame to be processed whose gray values are greater than a gray threshold.
In one possible implementation, determining the highlight region according to the regions whose gray values are greater than the gray threshold comprises: determining the regions whose gray values are greater than the gray threshold as the highlight region; or determining, among the regions whose gray values are greater than the gray threshold, those whose areas are greater than a first area threshold as the highlight region.
In one possible implementation, determining the flash region according to the edge region and the highlight region comprises: determining the flash region according to the intersection of the edge region and the highlight region.
In one possible implementation, determining the flash region according to the intersection of the edge region and the highlight region comprises: determining the intersection of the edge region and the highlight region as the flash region; or determining, within that intersection, the regions whose areas are greater than a second area threshold as the flash region.
In one possible implementation, superimposing the flash layer on the video frame to be processed according to the position of the flash region comprises:
scanning the video frame to be processed with scan lines;
if a first scan line passes through a first flash region, obtaining the positions of the two intersection points of the first scan line and the boundary of the first flash region; and
superimposing a flash layer on the first flash region according to the positions of the two intersection points.
In one possible implementation, superimposing a flash layer on the first flash region according to the positions of the two intersection points comprises: determining the size of the flash layer corresponding to the first flash region according to the distance between the two intersection points; and superimposing the flash layer on the first flash region with the midpoint of the two intersection points as its geometric center and with the determined size.
In one possible implementation, superimposing the flash layer according to the position of the flash region comprises: superimposing the flash layer on the video frame to be processed in a screen ("colour filter") blend mode.
In one possible implementation, superimposing the flash layer according to the position of the flash region comprises: determining the flash layer corresponding to the video frame to be processed; and superimposing that flash layer on the video frame to be processed according to the position of the flash region.
In one possible implementation, determining the flash layer corresponding to the video frame to be processed comprises: determining the flash layer from a flash layer sequence according to a flash period, the number of flash layers in the flash layer sequence, and the frame number of the video frame to be processed in the video.
According to an aspect of the present disclosure, there is provided a video processing apparatus, comprising:
an edge detection module, configured to perform edge detection on a video frame to be processed of a video to determine an edge region in the video frame to be processed;
a first determining module, configured to determine a highlight region in the video frame to be processed;
a second determining module, configured to determine a flash region in the video frame to be processed according to the edge region and the highlight region; and
a superimposing module, configured to superimpose a flash layer on the video frame to be processed according to the position of the flash region.
In one possible implementation, the first determining module is configured to determine the highlight region according to regions of the video frame to be processed whose gray values are greater than a gray threshold.
In one possible implementation, the first determining module is configured to: determine the regions whose gray values are greater than the gray threshold as the highlight region; or determine, among those regions, the ones whose areas are greater than a first area threshold as the highlight region.
In one possible implementation, the second determining module is configured to determine the flash region according to the intersection of the edge region and the highlight region.
In one possible implementation, the second determining module is configured to: determine the intersection of the edge region and the highlight region as the flash region; or determine, within that intersection, the regions whose areas are greater than a second area threshold as the flash region.
In one possible implementation, the superimposing module comprises:
a scanning submodule, configured to scan the video frame to be processed with scan lines;
an obtaining submodule, configured to obtain, if a first scan line passes through a first flash region, the positions of the two intersection points of the first scan line and the boundary of the first flash region; and
a first superimposing submodule, configured to superimpose a flash layer on the first flash region according to the positions of the two intersection points.
In one possible implementation, the first superimposing submodule comprises:
a determining unit, configured to determine the size of the flash layer corresponding to the first flash region according to the distance between the two intersection points; and
a superimposing unit, configured to superimpose the flash layer on the first flash region with the midpoint of the two intersection points as its geometric center and with the determined size.
In one possible implementation, the superimposing module is configured to superimpose the flash layer on the video frame to be processed in a screen ("colour filter") blend mode according to the position of the flash region.
In one possible implementation, the superimposing module comprises:
a determining submodule, configured to determine the flash layer corresponding to the video frame to be processed; and
a second superimposing submodule, configured to superimpose that flash layer on the video frame to be processed according to the position of the flash region.
In one possible implementation, the determining submodule is configured to determine the flash layer from a flash layer sequence according to a flash period, the number of flash layers in the flash layer sequence, and the frame number of the video frame to be processed in the video.
According to an aspect of the present disclosure, there is provided an electronic device, comprising: a processor; and a memory for storing processor-executable instructions, wherein the processor is configured to execute the above video processing method.
According to an aspect of the present disclosure, there is provided a computer-readable storage medium on which computer program instructions are stored, the computer program instructions implementing the above video processing method when executed by a processor.
In the embodiments of the present disclosure, edge detection is performed on a video frame to be processed of a video to determine an edge region in the video frame, a highlight region in the video frame is determined, a flash region is determined according to the edge region and the highlight region, and a flash layer is superimposed on the video frame according to the position of the flash region. A flash effect can thereby be produced in a video at relatively low computational complexity, satisfying the real-time requirements of video processing.
Other features and aspects of the present disclosure will become clear from the following detailed description of exemplary embodiments with reference to the accompanying drawings.
Brief description of the drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate exemplary embodiments, features, and aspects of the present disclosure and, together with the specification, serve to explain the principles of the present disclosure.
Fig. 1 shows a flowchart of a video processing method according to an embodiment of the present disclosure.
Fig. 2 shows a schematic diagram of an edge region in the video processing method according to an embodiment of the present disclosure.
Fig. 3 shows a schematic diagram of a highlight region in the video processing method according to an embodiment of the present disclosure.
Fig. 4 shows an exemplary flowchart of step S14 of the video processing method according to an embodiment of the present disclosure.
Fig. 5 shows a schematic diagram of a scan line passing through a flash region in the video processing method according to an embodiment of the present disclosure.
Fig. 6 shows an exemplary flowchart of step S143 of the video processing method according to an embodiment of the present disclosure.
Fig. 7 shows a schematic diagram of a video frame to be processed after a flash layer is superimposed, in the video processing method according to an embodiment of the present disclosure.
Fig. 8 shows an exemplary flowchart of step S14 of the video processing method according to an embodiment of the present disclosure.
Fig. 9 shows a block diagram of a video processing apparatus according to an embodiment of the present disclosure.
Fig. 10 shows an exemplary block diagram of the video processing apparatus according to an embodiment of the present disclosure.
Fig. 11 is a block diagram of an electronic device 800 according to an exemplary embodiment.
Fig. 12 is a block diagram of an electronic device 1900 according to an exemplary embodiment.
Detailed description of embodiments
Various exemplary embodiments, features, and aspects of the present disclosure are described in detail below with reference to the accompanying drawings. Identical reference signs in the drawings denote elements with identical or similar functions. Although various aspects of the embodiments are shown in the drawings, the drawings are not necessarily drawn to scale unless specifically noted.
The word "exemplary" here means "serving as an example, embodiment, or illustration". Any embodiment described herein as "exemplary" should not be construed as preferred over or advantageous to other embodiments.
In addition, numerous specific details are given in the following detailed description in order to better illustrate the present disclosure. Those skilled in the art will understand that the present disclosure can likewise be implemented without certain of these details. In some instances, methods, means, elements, and circuits well known to those skilled in the art are not described in detail, in order to highlight the gist of the present disclosure.
Fig. 1 shows a flowchart of the video processing method according to an embodiment of the present disclosure. As shown in Fig. 1, the method comprises steps S11 to S14.
In step S11, edge detection is performed on a video frame to be processed of a video to determine an edge region in the video frame to be processed.
The video of the embodiments of the present disclosure can be any video to be processed. For example, the video can be captured in real time or captured in advance. A video frame to be processed is a video frame in the video on which processing is to be performed. In the embodiments of the present disclosure, some or all of the video frames of the video can serve as video frames to be processed. For a pre-recorded video, whether each video frame is a video frame to be processed can be determined in advance according to its frame number in the video; for a video captured in real time, this determination can be made in real time according to each frame's frame number in the video.
Here, the frame number of a video frame to be processed indicates which frame it is in the video. For example, if the frame number of a video frame to be processed is i, it is the i-th frame of the video. For both pre-recorded and real-time videos, frame numbers can be counted starting from the first frame of the video.
In the embodiments of the present disclosure, edge detection can be performed on the video frame to be processed using methods of the related art to determine the edge region. For example, the Sobel edge detection method can be used; as another example, the Canny edge detection method can be used. Here, the edge region in the video frame to be processed is the region where edges are located. Fig. 2 shows a schematic diagram of an edge region in the video processing method according to an embodiment of the present disclosure.
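As an illustration of this step, the Sobel method named above can be sketched in a few lines of NumPy. This is not code from the patent; the gradient-magnitude threshold of 100 and the toy 8x8 frame are assumptions chosen for the example.

```python
import numpy as np

def sobel_edge_mask(gray, thresh=100):
    """Return a boolean edge mask via Sobel gradient magnitude.

    `gray` is a 2-D uint8 array; `thresh` is an illustrative
    gradient-magnitude cutoff, not a value from the patent."""
    g = gray.astype(np.float32)
    # Pad by one pixel so the output matches the input size.
    p = np.pad(g, 1, mode="edge")
    # Horizontal and vertical Sobel responses built from shifted sums.
    gx = (p[:-2, 2:] + 2 * p[1:-1, 2:] + p[2:, 2:]
          - p[:-2, :-2] - 2 * p[1:-1, :-2] - p[2:, :-2])
    gy = (p[2:, :-2] + 2 * p[2:, 1:-1] + p[2:, 2:]
          - p[:-2, :-2] - 2 * p[:-2, 1:-1] - p[:-2, 2:])
    mag = np.hypot(gx, gy)
    return mag > thresh

frame = np.zeros((8, 8), dtype=np.uint8)
frame[:, 4:] = 200                         # vertical step edge at column 4
mask = sobel_edge_mask(frame)
print(mask[4, 3], mask[4, 4], mask[4, 0])  # True True False
```

The columns on either side of the brightness step are flagged as edge pixels, while flat areas are not; a Canny detector would differ only in the post-processing of the gradient.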
In step S12, a highlight region in the video frame to be processed is determined.
In one possible implementation, determining the highlight region comprises: determining the highlight region according to regions of the video frame to be processed whose gray values are greater than a gray threshold.
In the embodiments of the present disclosure, if the video frame to be processed is a grayscale image, the gray value of each pixel can be obtained directly, and the highlight region can be determined from the regions whose gray values are greater than the gray threshold.
In one possible implementation, if the video frame to be processed is an RGB image, it can be converted to a grayscale image, and the highlight region can be determined according to the regions of the corresponding grayscale image whose gray values are greater than the gray threshold.
In another possible implementation, if the video frame to be processed is an RGB image, the gray value of each pixel can be determined from the channel values of one or more of its R, G, and B channels, and the highlight region determined from the regions whose gray values are greater than the gray threshold.
As one example of this implementation, the gray value of a pixel equals the weighted sum of its channel values. For example, with a weight of 0.3 for the R channel, 0.59 for the G channel, and 0.11 for the B channel, Gray = 0.3R + 0.59G + 0.11B, where Gray denotes the gray value and R, G, and B denote the R, G, and B channel values. As another example, with equal weights of 1/3 for all three channels, Gray = (R + G + B)/3.
As another example of this implementation, the gray value of a pixel equals its G channel value.
It should be noted that, although ways of determining the gray value of each pixel in the video frame to be processed have been described above, those skilled in the art will understand that the present disclosure is not limited thereto. Those skilled in the art can flexibly set the way of determining the gray value according to the demands of the actual application scenario and/or personal preference.
In one possible implementation, determining the highlight region according to the regions whose gray values are greater than the gray threshold comprises: determining all regions whose gray values are greater than the gray threshold as the highlight region. Fig. 3 shows a schematic diagram of a highlight region in the video processing method according to an embodiment of the present disclosure; in Fig. 3, the white region is the highlight region and the black region is the non-highlight region.
In one possible implementation, determining the highlight region comprises: determining, among the regions whose gray values are greater than the gray threshold, only those whose areas are greater than a first area threshold as the highlight region. By keeping only the sufficiently large regions, this implementation helps to reduce the computation of subsequent processing and further improves the efficiency of video processing.
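The gray-weighting and thresholding just described can be sketched as follows, using the patent's example weights of 0.3/0.59/0.11. The gray threshold of 180 and the toy frame are illustrative assumptions, not values from the patent.

```python
import numpy as np

def highlight_mask(rgb, gray_thresh=180):
    """Boolean mask of pixels whose weighted gray value exceeds the
    gray threshold. Weights follow the patent's example formula
    Gray = 0.3R + 0.59G + 0.11B; `gray_thresh` is illustrative."""
    rgb = rgb.astype(np.float32)
    gray = 0.3 * rgb[..., 0] + 0.59 * rgb[..., 1] + 0.11 * rgb[..., 2]
    return gray > gray_thresh

frame = np.zeros((4, 4, 3), dtype=np.uint8)
frame[1, 1] = (255, 255, 255)  # bright specular pixel: gray = 255
frame[2, 2] = (255, 0, 0)      # saturated red: gray = 76.5, below threshold
mask = highlight_mask(frame)
print(mask[1, 1], mask[2, 2])  # True False
```

Note that a saturated but dark-weighted color (pure red here) falls below the threshold, which is the point of weighting the channels rather than thresholding any single one.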
In step S13, a flash region in the video frame to be processed is determined according to the edge region and the highlight region.
In one possible implementation, determining the flash region comprises: determining the flash region according to the intersection of the edge region and the highlight region.
As one example of this implementation, the intersection of the edge region and the highlight region is determined as the flash region; that is, each intersection region of the edge region and the highlight region can be determined as a flash region in the video frame to be processed.
As another example, only the regions of the intersection whose areas are greater than a second area threshold are determined as the flash region. This helps to reduce the computation of subsequent processing and further improves the efficiency of video processing.
In another possible implementation, determining the flash region comprises: determining the region of the video frame to be processed outside the edge region as a non-edge region, and determining the flash region according to the intersection of the non-edge region and the highlight region.
As one example of this implementation, the intersection of the non-edge region and the highlight region is determined as the flash region; each such intersection region can be determined as a flash region in the video frame to be processed.
As another example, only the regions of that intersection whose areas are greater than a third area threshold are determined as the flash region. This likewise helps to reduce the computation of subsequent processing and further improves the efficiency of video processing.
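The intersection-plus-area-filter logic of this step can be sketched as below. The 4-connected component labelling and the `min_area` value of 3 are implementation choices made for the example; the patent only requires comparing each intersection region's area against an area threshold.

```python
import numpy as np
from collections import deque

def flash_regions(edge, highlight, min_area=3):
    """Keep the connected components of edge AND highlight whose pixel
    count exceeds `min_area` (the patent's 'second area threshold';
    the value 3 here is illustrative)."""
    inter = edge & highlight
    keep = np.zeros_like(inter)
    seen = np.zeros_like(inter)
    h, w = inter.shape
    for sy in range(h):
        for sx in range(w):
            if inter[sy, sx] and not seen[sy, sx]:
                # Flood-fill one 4-connected component.
                comp, q = [], deque([(sy, sx)])
                seen[sy, sx] = True
                while q:
                    y, x = q.popleft()
                    comp.append((y, x))
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w and inter[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            q.append((ny, nx))
                if len(comp) > min_area:
                    for y, x in comp:
                        keep[y, x] = True
    return keep

edge = np.ones((6, 6), dtype=bool)
highlight = np.zeros((6, 6), dtype=bool)
highlight[1:3, 1:4] = True   # 6-pixel blob: kept
highlight[5, 5] = True       # single stray pixel: discarded
flash = flash_regions(edge, highlight)
print(flash.sum(), flash[5, 5])  # 6 False
```

The non-edge-region variant differs only in intersecting `~edge` with the highlight mask instead of `edge`.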
In step S14, a flash layer is superimposed on the video frame to be processed according to the position of the flash region.
After the flash layer is superimposed according to the position of the flash region, the flash region brightens and the visual effect is improved.
In one possible implementation, superimposing the flash layer according to the position of the flash region comprises: superimposing the flash layer on the video frame to be processed in a screen ("colour filter") blend mode. For example, the pixel value at coordinate (x, y) of the video frame to be processed after superimposing the flash layer is P(x,y) = 255 - (255 - a(x,y))(255 - b(x,y))/255, where a(x,y) denotes the pixel value at coordinate (x, y) in the video frame to be processed and b(x,y) denotes the pixel value at coordinate (x, y) in the superimposed flash layer.
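A sketch of this blend is given below. Reading the blend as the standard screen ("colour filter") formula P = 255 - (255 - a)(255 - b)/255 is an assumption here, since the mode name points at the screen blend; the example frame and layer values are illustrative.

```python
import numpy as np

def screen_blend(frame, layer):
    """Screen ('colour filter') blend of a flash layer onto a frame:
    P = 255 - (255 - a)(255 - b) / 255, computed per pixel.
    Treating the patent's formula as this standard screen mode is an
    assumption of this sketch."""
    a = frame.astype(np.float32)
    b = layer.astype(np.float32)
    out = 255.0 - (255.0 - a) * (255.0 - b) / 255.0
    return out.astype(np.uint8)

frame = np.full((2, 2), 100, dtype=np.uint8)
layer = np.array([[0, 255], [128, 51]], dtype=np.uint8)
blended = screen_blend(frame, layer)
print(blended.tolist())  # [[100, 255], [177, 131]]
```

The mode only ever brightens: a black layer pixel leaves the frame unchanged (100 stays 100), a white layer pixel saturates to 255, and intermediate layer values brighten proportionally, which matches the brightening effect described for the flash region.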
In the embodiments of the present disclosure, edge detection is performed on a video frame to be processed of a video to determine an edge region, a highlight region in the video frame is determined, a flash region is determined according to the edge region and the highlight region, and a flash layer is superimposed according to the position of the flash region, so that a flash effect can be produced in a video at relatively low computational complexity, satisfying the real-time requirements of video processing.
Fig. 4 shows an exemplary flowchart of step S14 of the video processing method according to an embodiment of the present disclosure. As shown in Fig. 4, step S14 may comprise steps S141 to S143.
In step S141, the video frame to be processed is scanned with scan lines.
In one possible implementation, the distance between adjacent scan lines is a fixed value; for example, the value ranges from 20 to 50.
In the embodiments of the present disclosure, the number of scan lines can be determined according to the number of flash layers to be added. If more flash layers are to be added, the scan lines are denser, i.e., the distance between adjacent scan lines is smaller; if fewer flash layers are to be added, the scan lines are sparser, i.e., the distance between adjacent scan lines is larger.
In step S142, if a first scan line passes through a first flash region, the positions of the two intersection points of the first scan line and the boundary of the first flash region are obtained.
Fig. 5 shows a schematic diagram of a scan line passing through a flash region in the video processing method according to an embodiment of the present disclosure. As shown in Fig. 5, the scan line on which points A and B lie passes through a flash region.
In step S143, a flash layer is superimposed on the first flash region according to the positions of the two intersection points. For example, the flash layer can be superimposed between the two intersection points; as another example, the distance between the two intersection points can be used as the side length of the flash layer.
Fig. 6 shows an exemplary flowchart of step S143 of the video processing method according to an embodiment of the present disclosure. As shown in Fig. 6, step S143 may comprise steps S1431 and S1432.
In step S1431, the size of the flash layer corresponding to the first flash region is determined according to the distance between the two intersection points.
Here, the correspondence between the distance between the two intersection points and the size of the flash layer can be determined according to the demands of the actual application scenario, and is not limited here. For example, if the distance between the two intersection points is less than a distance threshold, the size of the flash layer corresponding to the first flash region is S1; if the distance is greater than or equal to the distance threshold, the size is S2. For example, the distance threshold is 50 pixels.
In step S1432, the flash layer is superimposed on the first flash region with the midpoint of the two intersection points as the geometric center of the flash layer and with the size determined for the first flash region.
As shown in Fig. 5, a flash region may intersect multiple scan lines. In that case, a flash layer can be superimposed on the flash region for each scan line, according to that scan line's intersection points with the boundary of the flash region.
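Steps S141 to S1432 can be sketched as follows: scan every few rows, treat each run of flash-region pixels on a scanned row as the pair of boundary intersections, and pick a layer size from the run length. The `spacing`, `dist_thresh`, `s1`, and `s2` values are illustrative stand-ins for the patent's fixed scan-line spacing, distance threshold, and sizes S1/S2.

```python
import numpy as np

def sparkle_anchors(flash_mask, spacing=2, dist_thresh=3, s1=5, s2=9):
    """For every `spacing`-th row, find each run of flash-region pixels
    (its endpoints play the role of the two boundary intersections) and
    return (row, centre-column, layer-size) triples."""
    anchors = []
    for y in range(0, flash_mask.shape[0], spacing):
        row = flash_mask[y]
        # +1 marks a run start; -1 marks one past its end.
        diff = np.diff(np.concatenate(([0], row.astype(np.int8), [0])))
        starts = np.flatnonzero(diff == 1)
        ends = np.flatnonzero(diff == -1) - 1
        for x0, x1 in zip(starts, ends):
            # Small intersection distance -> size S1, otherwise S2;
            # the layer is centred on the midpoint of the two points.
            size = s1 if (x1 - x0) < dist_thresh else s2
            anchors.append((y, int((x0 + x1) // 2), size))
    return anchors

mask = np.zeros((4, 10), dtype=bool)
mask[0, 2:8] = True   # wide run on scanned row 0 -> size s2
mask[2, 4:6] = True   # narrow run on scanned row 2 -> size s1
mask[1, 0:9] = True   # row 1 is skipped by the 2-row spacing
print(sparkle_anchors(mask))  # [(0, 4, 9), (2, 4, 5)]
```

A region crossed by several scanned rows yields one anchor per row, matching the multiple-scan-line case described above.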
Fig. 7 shows a schematic diagram of a video frame to be processed after a flash layer is superimposed, in the video processing method according to an embodiment of the present disclosure.
Fig. 8 shows an exemplary flowchart of step S14 of the video processing method according to an embodiment of the present disclosure. As shown in Fig. 8, step S14 may comprise steps S41 and S42.
In step S41, the flash layer corresponding to the video frame to be processed is determined.
In one possible implementation, determining the flash layer corresponding to the video frame to be processed comprises: determining the flash layer from a flash layer sequence according to a flash period, the number of flash layers in the flash layer sequence, and the frame number of the video frame to be processed in the video.
In this implementation, the brightness of each flash of light figure layer in flashlight view sequence of layer is different.For example, flashlight view sequence
The quantity for figure layer of glistening in column is K, and the brightness of k-th of flash of light figure layer is less than kth+1 flash of light figure layer in flashlight view sequence of layer
Brightness, wherein 1≤k≤K-1;Alternatively, the brightness of k-th of flash of light figure layer is greater than+1 flash of light figure layer of kth in flashlight view sequence of layer
Brightness.
For example, the flash layer f corresponding to the video frame to be processed may be determined from the flash layer sequence using the formula f = (i % T) × K / (T + K), where 1 ≤ f ≤ K, i denotes the frame number of the video frame to be processed in the video, T denotes the flash period, K denotes the number of flash layers in the flash layer sequence, and i % T denotes the remainder of i divided by T. Here, f = (i % T) × K / (T + K) may be an integer operation.
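A minimal sketch of this periodic layer selection, assuming truncating integer arithmetic and clamping the result into the stated range 1 ≤ f ≤ K (the clamp is my addition, since the raw integer formula can yield 0; the function name is hypothetical):

```python
def select_flash_layer(i, T, K):
    """Pick flash layer index f (1..K) for frame number i, given flash
    period T and K layers in the sequence, following
    f = (i % T) * K / (T + K) as an integer (truncating) operation."""
    f = ((i % T) * K) // (T + K)  # integer operation, as in the text
    return max(1, min(f, K))      # clamp into 1..K, as 1 <= f <= K requires
```

Because i enters only through i % T, the selected layer index repeats every T frames, which is what produces the periodic flash effect described below.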
In this implementation, the flash layer corresponding to the video frame to be processed is determined from the flash layer sequence according to the flash period, the number of flash layers in the flash layer sequence, and the frame number of the video frame to be processed in the video, so that a periodically varying flash effect can be achieved.
In step S42, the flash layer corresponding to the video frame to be processed is superimposed in the video frame to be processed according to the position of the flash region.
In one possible implementation, the video frames to be processed may be processed concurrently, to further improve the efficiency of the video processing.
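One way to sketch this concurrent per-frame processing, purely as an illustration: `process_frame` here stands for the whole per-frame pipeline (edge detection, highlight detection, flash overlay) and is a hypothetical placeholder, as are the function name and worker count.

```python
from concurrent.futures import ThreadPoolExecutor

def process_video(frames, process_frame):
    """Apply the per-frame pipeline to each video frame to be processed
    concurrently, while preserving the output frame order."""
    with ThreadPoolExecutor(max_workers=4) as pool:
        return list(pool.map(process_frame, frames))  # map keeps input order
```

`Executor.map` returns results in submission order, so frames stay in sequence even though they are processed in parallel.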
In one possible implementation, after the flash layer is superimposed in the video frame to be processed, the result may be output to a screen and displayed in real time.
Since the video processing method provided by the embodiments of the present disclosure can achieve a flash effect in a video with low computational complexity, the embodiments of the present disclosure can be applied to application scenarios that require real-time video processing, such as video recording and live-streaming platforms.
It can be understood that the method embodiments mentioned above in the present disclosure may be combined with one another to form combined embodiments without departing from principle and logic; for brevity, details are not repeated here.
In addition, the present disclosure further provides a video processing apparatus, an electronic device, a computer-readable storage medium, and a program, all of which can be used to implement any video processing method provided by the present disclosure; for the corresponding technical solutions and descriptions, refer to the corresponding records in the method section, which are not repeated here.
Fig. 9 shows a block diagram of the video processing apparatus according to an embodiment of the present disclosure. As shown in Fig. 9, the apparatus includes: an edge detection module 91, configured to perform edge detection on a video frame to be processed of a video and determine the edge region in the video frame to be processed; a first determining module 92, configured to determine the highlight region in the video frame to be processed; a second determining module 93, configured to determine the flash region in the video frame to be processed according to the edge region and the highlight region; and a superimposing module 94, configured to superimpose a flash layer in the video frame to be processed according to the position of the flash region.
In one possible implementation, the first determining module 92 is configured to determine the highlight region in the video frame to be processed according to the region in the video frame to be processed whose gray value is greater than a gray threshold.
In one possible implementation, the first determining module 92 is configured to: determine the region in the video frame to be processed whose gray value is greater than the gray threshold as the highlight region in the video frame to be processed; or determine, among the regions in the video frame to be processed whose gray value is greater than the gray threshold, those regions whose area is greater than a first area threshold as the highlight region in the video frame to be processed.
In one possible implementation, the second determining module 93 is configured to determine the flash region in the video frame to be processed according to the intersection of the edge region and the highlight region.
In one possible implementation, the second determining module 93 is configured to: determine the intersection of the edge region and the highlight region as the flash region in the video frame to be processed; or determine, within the intersection of the edge region and the highlight region, those regions whose area is greater than a second area threshold as the flash region in the video frame to be processed.
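The detection side of this pipeline can be sketched as follows. This is an illustration under stated assumptions, not the patented method: a simple gradient-magnitude threshold stands in for the edge detector (the disclosure does not fix a particular one), and both thresholds are made-up example values.

```python
import numpy as np

def detect_flash_regions(gray, gray_thresh=200, grad_thresh=40):
    """Sketch of flash-region detection on a grayscale frame:
    edge region (gradient magnitude above a threshold, standing in
    for a real edge detector), highlight region (gray value above a
    gray threshold), and their intersection as the flash-region mask.
    Both thresholds are illustrative assumptions."""
    gy, gx = np.gradient(gray.astype(np.float32))
    edge = np.hypot(gx, gy) > grad_thresh   # edge region mask
    highlight = gray > gray_thresh          # highlight region mask
    return edge & highlight                 # flash region = intersection
```

The intersection keeps only bright pixels that lie on edges, i.e. the places where a glint would plausibly appear; the optional area-threshold filtering described above could be added on top of this mask with a connected-component pass.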
Fig. 10 shows an exemplary block diagram of the video processing apparatus according to an embodiment of the present disclosure. As shown in Fig. 10:
In one possible implementation, the superimposing module 94 includes: a scanning submodule 941, configured to scan the video frame to be processed using scan lines; an obtaining submodule 942, configured to obtain, if a first scan line passes through a first flash region, the positions of the two intersection points of the first scan line and the boundary of the first flash region; and a first superimposing submodule 943, configured to superimpose a flash layer for the first flash region according to the positions of the two intersection points.
In one possible implementation, the first superimposing submodule 943 includes: a determining unit, configured to determine the size of the flash layer corresponding to the first flash region according to the distance between the two intersection points; and a superimposing unit, configured to take the midpoint of the two intersection points as the geometric center of the flash layer and superimpose the flash layer for the first flash region according to the size of the flash layer corresponding to the first flash region.
In one possible implementation, the superimposing module 94 is configured to superimpose the flash layer in the video frame to be processed in a screen ("colour filter") blending manner, according to the position of the flash region.
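Screen ("colour filter") blending is conventionally computed as result = 255 − (255 − base) × (255 − layer) / 255 per channel; assuming that standard formula is what is meant here, a minimal sketch:

```python
import numpy as np

def screen_blend(base, layer):
    """Screen ("colour filter") blend of a flash layer onto a frame:
    result = 255 - (255 - base) * (255 - layer) / 255.
    Brightens the frame wherever the layer is bright; black layer
    pixels leave the frame unchanged."""
    base = base.astype(np.float32)
    layer = layer.astype(np.float32)
    out = 255.0 - (255.0 - base) * (255.0 - layer) / 255.0
    return out.astype(np.uint8)
```

This mode is a natural fit for a flash effect: it can only brighten, never darken, so the dark surround of the flash layer does not dim the underlying frame.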
In one possible implementation, the superimposing module 94 includes: a determining submodule 944, configured to determine the flash layer corresponding to the video frame to be processed; and a second superimposing submodule 945, configured to superimpose, according to the position of the flash region, the flash layer corresponding to the video frame to be processed in the video frame to be processed.
In one possible implementation, the determining submodule 944 is configured to determine the flash layer corresponding to the video frame to be processed from the flash layer sequence according to the flash period, the number of flash layers in the flash layer sequence, and the frame number of the video frame to be processed in the video.
In the embodiments of the present disclosure, edge detection is performed on a video frame to be processed of a video to determine the edge region in the video frame to be processed; the highlight region in the video frame to be processed is determined; the flash region in the video frame to be processed is determined according to the edge region and the highlight region; and a flash layer is superimposed in the video frame to be processed according to the position of the flash region. A flash effect can thus be achieved in the video with low computational complexity, meeting the real-time requirements of video processing.
An embodiment of the present disclosure further provides a computer-readable storage medium having computer program instructions stored thereon, where the computer program instructions, when executed by a processor, implement the above method. The computer-readable storage medium may be a non-volatile computer-readable storage medium.
An embodiment of the present disclosure further provides an electronic device, including: a processor; and a memory for storing processor-executable instructions; wherein the processor is configured to perform the above method.
The electronic device may be provided as a terminal, a server, or a device in another form.
Fig. 11 is a block diagram of an electronic device 800 according to an exemplary embodiment. For example, the electronic device 800 may be a terminal such as a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, a fitness device, or a personal digital assistant.
Referring to Fig. 11, the electronic device 800 may include one or more of the following components: a processing component 802, a memory 804, a power component 806, a multimedia component 808, an audio component 810, an input/output (I/O) interface 812, a sensor component 814, and a communication component 816.
The processing component 802 typically controls the overall operation of the electronic device 800, such as operations associated with display, telephone calls, data communication, camera operation, and recording. The processing component 802 may include one or more processors 820 to execute instructions to perform all or part of the steps of the methods described above. In addition, the processing component 802 may include one or more modules to facilitate interaction between the processing component 802 and other components. For example, the processing component 802 may include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operation on the electronic device 800. Examples of such data include instructions for any application or method operated on the electronic device 800, contact data, phone book data, messages, pictures, videos, and the like. The memory 804 may be implemented by any type of volatile or non-volatile storage device, or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk.
The power component 806 supplies power to the various components of the electronic device 800. The power component 806 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the electronic device 800.
The multimedia component 808 includes a screen that provides an output interface between the electronic device 800 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front camera and/or a rear camera. When the electronic device 800 is in an operation mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each of the front camera and the rear camera may be a fixed optical lens system or have focusing and optical zoom capabilities.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a microphone (MIC), which is configured to receive external audio signals when the electronic device 800 is in an operation mode, such as a call mode, a recording mode, or a voice recognition mode. The received audio signal may be further stored in the memory 804 or transmitted via the communication component 816. In some embodiments, the audio component 810 further includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be a keyboard, a click wheel, buttons, or the like. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor component 814 includes one or more sensors for providing state assessments of various aspects of the electronic device 800. For example, the sensor component 814 may detect the open/closed state of the electronic device 800 and the relative positioning of components (for example, the display and keypad of the electronic device 800); the sensor component 814 may also detect a change in position of the electronic device 800 or of a component of the electronic device 800, the presence or absence of user contact with the electronic device 800, the orientation or acceleration/deceleration of the electronic device 800, and a change in temperature of the electronic device 800. The sensor component 814 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate wired or wireless communication between the electronic device 800 and other devices. The electronic device 800 can access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In one exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 816 further includes a near-field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the electronic device 800 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the above method.
In an exemplary embodiment, a non-volatile computer-readable storage medium is further provided, for example, the memory 804 including computer program instructions, which can be executed by the processor 820 of the electronic device 800 to complete the above method.
Fig. 12 is a block diagram of an electronic device 1900 according to an exemplary embodiment. For example, the electronic device 1900 may be provided as a server. Referring to Fig. 12, the electronic device 1900 includes a processing component 1922, which further includes one or more processors, and memory resources represented by a memory 1932 for storing instructions executable by the processing component 1922, such as application programs. The application programs stored in the memory 1932 may include one or more modules, each of which corresponds to a set of instructions. In addition, the processing component 1922 is configured to execute the instructions to perform the above method.
The electronic device 1900 may also include a power component 1926 configured to perform power management of the electronic device 1900, a wired or wireless network interface 1950 configured to connect the electronic device 1900 to a network, and an input/output (I/O) interface 1958. The electronic device 1900 may operate based on an operating system stored in the memory 1932, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.
In an exemplary embodiment, a non-volatile computer-readable storage medium is further provided, for example, the memory 1932 including computer program instructions, which can be executed by the processing component 1922 of the electronic device 1900 to complete the above method.
The present disclosure may be a system, a method, and/or a computer program product. The computer program product may include a computer-readable storage medium carrying computer-readable program instructions for causing a processor to implement aspects of the present disclosure.
The computer-readable storage medium may be a tangible device that can retain and store instructions for use by an instruction execution device. The computer-readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. More specific examples of the computer-readable storage medium (a non-exhaustive list) include: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer-readable storage medium, as used herein, is not to be construed as being a transitory signal per se, such as a radio wave or other freely propagating electromagnetic wave, an electromagnetic wave propagating through a waveguide or other transmission medium (for example, a light pulse passing through a fiber-optic cable), or an electrical signal transmitted through a wire.
Computer-readable program instructions described herein can be downloaded to respective computing/processing devices from a computer-readable storage medium, or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network, and/or a wireless network. The network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers, and/or edge servers. A network adapter or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium within the respective computing/processing device.
Computer program instructions for carrying out operations of the present disclosure may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or source code or object code written in any combination of one or more programming languages, including object-oriented programming languages such as Smalltalk and C++, and conventional procedural programming languages such as the "C" language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In scenarios involving a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, electronic circuitry, for example, programmable logic circuitry, field-programmable gate arrays (FPGAs), or programmable logic arrays (PLAs), may be personalized by utilizing state information of the computer-readable program instructions, and the electronic circuitry may execute the computer-readable program instructions in order to implement aspects of the present disclosure.
Aspects of the present disclosure are described herein with reference to flowcharts and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the disclosure. It should be understood that each block of the flowcharts and/or block diagrams, and combinations of blocks in the flowcharts and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, when executed by the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
The computer-readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus, or other devices to produce a computer-implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other devices implement the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowcharts or block diagrams may represent a module, a program segment, or a portion of instructions, which comprises one or more executable instructions for implementing the specified logical functions. In some alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It should also be noted that each block of the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by special-purpose hardware-based systems that perform the specified functions or acts, or by combinations of special-purpose hardware and computer instructions.
The embodiments of the present disclosure have been described above. The foregoing description is exemplary rather than exhaustive, and is not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application or technical improvement over technologies in the market, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.
Claims (10)
1. A video processing method, characterized by comprising:
performing edge detection on a video frame to be processed of a video, and determining an edge region in the video frame to be processed;
determining a highlight region in the video frame to be processed;
determining a flash region in the video frame to be processed according to the edge region and the highlight region; and
superimposing a flash layer in the video frame to be processed according to the position of the flash region.
2. The method according to claim 1, characterized in that determining the flash region in the video frame to be processed according to the edge region and the highlight region comprises:
determining the flash region in the video frame to be processed according to the intersection of the edge region and the highlight region.
3. The method according to claim 1 or 2, characterized in that superimposing the flash layer in the video frame to be processed according to the position of the flash region comprises:
scanning the video frame to be processed using scan lines;
if a first scan line passes through a first flash region, obtaining the positions of the two intersection points of the first scan line and the boundary of the first flash region; and
superimposing the flash layer for the first flash region according to the positions of the two intersection points.
4. The method according to any one of claims 1 to 3, characterized in that superimposing the flash layer in the video frame to be processed according to the position of the flash region comprises:
determining the flash layer corresponding to the video frame to be processed; and
superimposing, according to the position of the flash region, the flash layer corresponding to the video frame to be processed in the video frame to be processed.
5. A video processing apparatus, characterized by comprising:
an edge detection module, configured to perform edge detection on a video frame to be processed of a video and determine an edge region in the video frame to be processed;
a first determining module, configured to determine a highlight region in the video frame to be processed;
a second determining module, configured to determine a flash region in the video frame to be processed according to the edge region and the highlight region; and
a superimposing module, configured to superimpose a flash layer in the video frame to be processed according to the position of the flash region.
6. The apparatus according to claim 5, characterized in that the second determining module is configured to:
determine the flash region in the video frame to be processed according to the intersection of the edge region and the highlight region.
7. The apparatus according to claim 5 or 6, characterized in that the superimposing module comprises:
a scanning submodule, configured to scan the video frame to be processed using scan lines;
an obtaining submodule, configured to obtain, if a first scan line passes through a first flash region, the positions of the two intersection points of the first scan line and the boundary of the first flash region; and
a first superimposing submodule, configured to superimpose the flash layer for the first flash region according to the positions of the two intersection points.
8. The apparatus according to any one of claims 5 to 7, characterized in that the superimposing module comprises:
a determining submodule, configured to determine the flash layer corresponding to the video frame to be processed; and
a second superimposing submodule, configured to superimpose, according to the position of the flash region, the flash layer corresponding to the video frame to be processed in the video frame to be processed.
9. An electronic device, characterized by comprising:
a processor; and
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the method according to any one of claims 1 to 4.
10. A computer-readable storage medium having computer program instructions stored thereon, characterized in that the computer program instructions, when executed by a processor, implement the method according to any one of claims 1 to 4.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810845512.6A CN109034068B (en) | 2018-07-27 | 2018-07-27 | Video processing method and device, electronic equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109034068A true CN109034068A (en) | 2018-12-18 |
CN109034068B CN109034068B (en) | 2020-11-03 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7526417B1 (en) * | 2001-12-06 | 2009-04-28 | Adobe Systems Incorporated | Vector-based representation of a lens flare |
CN105976365A (en) * | 2016-04-28 | 2016-09-28 | 天津大学 | Nocturnal fire disaster video detection method |
CN107705352A (en) * | 2017-11-01 | 2018-02-16 | 深圳市灵动飞扬科技有限公司 | Rendering intent, device, installed video equipment and the automobile of car model image |
Non-Patent Citations (2)
Title |
---|
ADOBE Certified Expert Textbook Editorial Board (ed.): "After Effects 5.5 Professional Certification Standard Tutorial (Adobe Certified Expert)", 30 April 2003 *
Lu Shide: "3D Studio MAX R4: Animation", 31 January 2002 *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106651955B (en) | Method and device for positioning target object in picture | |
TWI747325B (en) | Target object matching method, target object matching device, electronic equipment and computer readable storage medium | |
CN104918107B (en) | The identification processing method and device of video file | |
WO2020155711A1 (en) | Image generating method and apparatus, electronic device, and storage medium | |
WO2017031901A1 (en) | Human-face recognition method and apparatus, and terminal | |
CN107944447B (en) | Image classification method and device | |
CN110503023A (en) | Living body detection method and device, electronic equipment and storage medium | |
CN109697734A (en) | Position and orientation estimation method and device, electronic equipment and storage medium | |
WO2020007241A1 (en) | Image processing method and apparatus, electronic device, and computer-readable storage medium | |
CN109522910A (en) | Key point detection method and device, electronic equipment and storage medium | |
CN106557759B (en) | Signpost information acquisition method and device | |
CN109344832A (en) | Image processing method and device, electronic equipment and storage medium | |
CN109246332A (en) | Video stream noise reduction method and device, electronic equipment and storage medium | |
CN109948494A (en) | Image processing method and device, electronic equipment and storage medium | |
CN112219224B (en) | Image processing method and device, electronic equipment and storage medium | |
CN109978891A (en) | Image processing method and device, electronic equipment and storage medium | |
JP7074877B2 (en) | Network optimization methods and equipment, image processing methods and equipment, storage media and computer programs | |
CN109034150B (en) | Image processing method and device | |
CN107038428B (en) | Living body identification method and apparatus | |
CN107025441B (en) | Skin color detection method and device | |
CN108900903A (en) | Video processing method and device, electronic equipment and storage medium | |
CN109635920A (en) | Neural network optimization method and device, electronic equipment and storage medium | |
CN110111281A (en) | Image processing method and device, electronic equipment and storage medium | |
CN110378312A (en) | Image processing method and device, electronic equipment and storage medium | |
US20220222831A1 (en) | Method for processing images and electronic device therefor |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
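The abstract's pipeline (edge detection on a video frame, highlight detection, intersecting the two to locate flash regions, then overlaying a flash layer at those positions) can be sketched roughly as below. This is a minimal illustration on a grayscale NumPy frame using a simple gradient-magnitude edge detector; the function names and threshold values are hypothetical assumptions, not the patented implementation.

```python
import numpy as np

def find_flash_regions(frame, edge_thresh=0.3, bright_thresh=0.8):
    """Locate candidate flash pixels: bright pixels that also lie on edges.

    frame: HxW float array with values in [0, 1] (grayscale for simplicity).
    Returns a boolean mask of flash candidates.
    """
    # Cheap edge detector: finite-difference gradient magnitude.
    gy, gx = np.gradient(frame)
    edges = np.hypot(gx, gy) > edge_thresh
    # Highlight region: pixels above a brightness threshold.
    highlights = frame > bright_thresh
    # Flash region = intersection of edge and highlight regions.
    return edges & highlights

def overlay_flash(frame, mask, strength=0.5):
    """Additively brighten flash pixels, clipping to [0, 1]."""
    out = frame.copy()
    out[mask] = np.clip(out[mask] + strength, 0.0, 1.0)
    return out

# Tiny synthetic frame: a bright square on a dark background.
frame = np.zeros((8, 8))
frame[2:6, 2:6] = 0.9
mask = find_flash_regions(frame)      # true only on the square's rim
result = overlay_flash(frame, mask)   # rim pixels saturate to 1.0
```

Because only thresholding and a local gradient are involved, the per-frame cost stays linear in the pixel count, which is consistent with the low-complexity, real-time goal stated in the abstract.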