CN110545393A - Video stream de-interlacing method, terminal equipment and storage medium - Google Patents

Video stream de-interlacing method, terminal equipment and storage medium

Info

Publication number
CN110545393A
CN110545393A
Authority
CN
China
Prior art keywords
interlacing
video stream
video
frame
output
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN201910914592.0A
Other languages
Chinese (zh)
Inventor
卓福州
许学泽
任赋
林雅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ming Ming (xiamen) Technology Co Ltd
Original Assignee
Ming Ming (xiamen) Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ming Ming (xiamen) Technology Co Ltd filed Critical Ming Ming (xiamen) Technology Co Ltd
Priority to CN201910914592.0A priority Critical patent/CN110545393A/en
Publication of CN110545393A publication Critical patent/CN110545393A/en
Withdrawn legal-status Critical Current


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/01Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
    • H04N7/0117Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level involving conversion of the spatial resolution of the incoming video signal
    • H04N7/012Conversion between an interlaced and a progressive signal

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Television Systems (AREA)

Abstract

The invention relates to a video stream de-interlacing method, a terminal device and a storage medium. The method comprises the following steps: S1, storing and analyzing an input digital interlaced video stream and extracting timing information; S2, generating an output video stream timing according to the timing information obtained in S1; S3, performing singular value related de-interlacing, entropy value related de-interlacing and intra-frame interpolation de-interlacing calculations; S4, calculating an entropy value related de-interlacing operation factor and a singular value related de-interlacing operation factor; S5, calculating normalization factors according to the entropy value related and singular value related de-interlacing operation factors obtained in S4; S6, performing a normalized summation of the output results of S3 according to the normalization factors obtained in S5, and storing the result; and S7, outputting progressive-scan video by reading the corresponding data stored in S6 according to the output video stream timing obtained in S2 and packaging it into a standard frame format that conforms to the video stream output.

Description

Video stream de-interlacing method, terminal equipment and storage medium
Technical Field
The invention belongs to the technical field of video processing, and particularly relates to a video stream de-interlacing method, a terminal device and a storage medium.
Background
With the advancement of video display technology, CRT displays have gradually been replaced by LCD and LED displays. The LCD and LED displays on the market are digitally driven and accept only progressive-scan video input. Interlaced cameras are still widely used because of the cost of replacing them and because interlaced video requires less transmission bandwidth. However, video streams in an interlaced format, such as those produced by interlaced cameras, cannot be displayed directly on LCD or LED displays.
Disclosure of Invention
The present invention is directed to a video stream de-interlacing method, a terminal device and a storage medium that solve the above problems. To this end, the invention adopts the following technical scheme:
according to an aspect of the invention, there is provided a video stream de-interlacing method comprising the steps of:
S1, storing and analyzing an input digital interlaced video stream, and extracting timing information;
S2, calculating the timing requirements of the video stream to be output according to the timing information obtained in S1, and performing output video clock management according to those requirements to generate an output video stream timing;
S3, reading the data of video frames Fn, Fn-1 and Fn-2 stored in S1, and performing singular value related de-interlacing, entropy value related de-interlacing and intra-frame interpolation de-interlacing calculations, wherein the video frames Fn, Fn-1 and Fn-2 are the current frame, the previous frame and the frame before the previous frame, respectively;
S4, calculating an entropy value related de-interlacing operation factor and a singular value related de-interlacing operation factor;
S5, calculating normalization factors according to the entropy value related de-interlacing operation factor and the singular value related de-interlacing operation factor obtained in S4;
S6, performing a normalized summation of the output results of S3 according to the normalization factors obtained in S5, and storing the result;
and S7, outputting progressive-scan video: reading the corresponding data from the data stored in S6 according to the output video stream timing obtained in S2, and packaging it into a standard frame format that conforms to the video stream output.
Further, the video stream de-interlacing method includes, before S1, a step of converting an analog video stream into a digital video stream.
Further, S1 specifically comprises: converting the current video stream data into RGB signals according to standard ITU 601-1 and storing them in a corresponding memory for subsequent reading and calculation; and extracting the resolution and frame rate information of the input video stream.
Further, the specific process of the singular value related de-interlacing calculation in S3 is as follows: represent the temporal and spatial variation characteristics of the images of input video frames Fn, Fn-1 and Fn-2 with the arrays below, where each element of an array is the pixel value of the corresponding pixel point, R denotes a row and L denotes a column; perform scalar maximum singular value dimension analysis on the singular value vector; and then perform a multidimensional asymmetric interpolation operation on the image.
Further, the specific process of the entropy value related de-interlacing in S3 is as follows: calculate the entropy between the input video frames Fn and Fn-2 using a formula, where Pn(i, j) and Pn-2(i, j) are the pixel values of the input video frames Fn and Fn-2, R denotes a row and L denotes a column; determine an interpolation scale factor from the entropy; and calculate the interpolated value Pn-1(i, j) by interpolation according to the interpolation scale factor.
Further, the interpolation may include bilinear interpolation, bicubic interpolation, windowed-function interpolation, or the like.
Further, the formula of S6 is: Pn-1(R, L) = γ1·Pn-1(R, L)1 + γ2·Pn-1(R, L)2 + γ3·Pn-1(R, L)3, where Pn-1(R, L) is the pixel value of point (R, L) of frame n-1 of the final output video stream; Pn-1(R, L)1, Pn-1(R, L)2 and Pn-1(R, L)3 are the calculation results of intra-frame de-interlacing, entropy value related de-interlacing and singular value related de-interlacing, respectively; and γ1, γ2 and γ3 are normalization factors with γ1 + γ2 + γ3 = 1.
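The normalized summation of S6 can be sketched in a few lines of Python. This is an illustrative sketch only; the function name, pixel values and factors are invented for the example, not taken from the patent.

```python
def normalized_sum(p_intra, p_entropy, p_singular, g1, g2, g3):
    """Sketch of the S6 normalized summation
        Pn-1(R, L) = g1*Pn-1(R, L)1 + g2*Pn-1(R, L)2 + g3*Pn-1(R, L)3.
    Because g1 + g2 + g3 = 1, a weighted average of in-range pixel
    values stays in range, so the final output cannot overflow.
    """
    assert abs(g1 + g2 + g3 - 1.0) < 1e-9  # normalization constraint
    return g1 * p_intra + g2 * p_entropy + g3 * p_singular

# Three de-interlacing results for one pixel, all within [0, 255]:
p = normalized_sum(100, 120, 110, 0.5, 0.4, 0.1)
assert abs(p - 109.0) < 1e-9
assert 0 <= p <= 255            # result stays within 8-bit range
```

The overflow guarantee is exactly why the factors must sum to 1: each term is bounded by its weight times the maximum pixel value.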
According to another aspect of the present invention, a video stream de-interlacing terminal device is provided, comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the method described above when executing the computer program.
According to a further aspect of the invention, a computer-readable storage medium is provided, which stores a computer program, wherein the computer program, when executed by a processor, implements the steps of the method described above.
By adopting the above technical scheme, the invention has the following beneficial effects: it enables interlaced video streams to be displayed on LCD and LED displays, with a good de-interlaced picture, high real-time performance and wide applicability.
drawings
To further illustrate the various embodiments, the invention provides the accompanying drawings. The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the embodiments. Those skilled in the art will appreciate still other possible embodiments and advantages of the present invention with reference to these figures. Elements in the figures are not drawn to scale and like reference numerals are generally used to indicate like elements.
FIG. 1 is a schematic diagram of the video stream de-interlacing method of the present invention;
FIG. 2 is a general flow diagram of the video stream de-interlacing method of the present invention;
FIG. 3 is a detailed flow chart of the video stream de-interlacing method of the present invention;
FIG. 4 is a graph of entropy affecting interpolation scale factors.
Detailed Description
The invention will now be further described with reference to the accompanying drawings and detailed description.
Fig. 1 shows the principle of the video stream de-interlacing method of the present invention. The function implementation module is the de-interlacing processing module; its input signal sources are the input video frames Fn-2, Fn-1 and Fn, and its output is Pn-1(R, L). "Input video frame Fn-2", "input video frame Fn-1" and "input video frame Fn" are three consecutive images from an interlaced video source. The video lines missing from "input video frame Fn-2" are the same lines that are missing from "input video frame Fn", and those missing lines are exactly the lines that are present in "input video frame Fn-1".
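The complementary field structure described above can be sketched as follows. This is a hypothetical illustration; the parity convention (which field carries even lines) is an assumption, since the patent only requires that Fn and Fn-2 share line parity and Fn-1 is complementary.

```python
def field_lines(frame_index, height):
    """Row indices carried by interlaced field number frame_index.

    Even-indexed fields carry even rows and odd-indexed fields carry
    odd rows in this sketch (an assumed convention).
    """
    parity = frame_index % 2
    return [r for r in range(height) if r % 2 == parity]

height, n = 6, 4          # a 6-line image, current frame index n
lines_fn = field_lines(n, height)
lines_fn1 = field_lines(n - 1, height)
lines_fn2 = field_lines(n - 2, height)

assert lines_fn == lines_fn2                                # Fn and Fn-2: same lines
assert sorted(lines_fn + lines_fn1) == list(range(height))  # Fn-1 fills the gaps
```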
The following describes the specific flow of the video stream de-interlacing method of the present invention with reference to figs. 2 and 3. The video stream de-interlacing method comprises the following steps:
S1, storing and analyzing an input digital interlaced video stream, and extracting timing information. Specifically, this comprises the following steps:
1. Interlaced video input:
The interlaced video source is input to the system. Since the present invention operates on digital signals, an analog interlaced video source should be converted to a digital signal in this step.
2. Input video stream parsing:
The digitized interlaced video stream is parsed: the video stream data is converted into RGB signals according to standard ITU 601-1, and information such as the resolution and frame rate of the input video stream is extracted.
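The "convert to RGB per ITU 601" step might look like the sketch below. The patent does not spell out the conversion matrix, so the standard BT.601 limited-range (studio) coefficients are assumed here, and `ycbcr_to_rgb` is an invented helper name.

```python
def ycbcr_to_rgb(y, cb, cr):
    """Convert one studio-range YCbCr pixel to 8-bit RGB.

    Assumed: standard BT.601 limited-range coefficients
    (Y in [16, 235], Cb/Cr centered on 128).
    """
    r = 1.164 * (y - 16) + 1.596 * (cr - 128)
    g = 1.164 * (y - 16) - 0.813 * (cr - 128) - 0.391 * (cb - 128)
    b = 1.164 * (y - 16) + 2.018 * (cb - 128)
    clamp = lambda v: max(0, min(255, int(round(v))))
    return clamp(r), clamp(g), clamp(b)

# Studio black (Y=16) and white (Y=235) with neutral chroma:
assert ycbcr_to_rgb(16, 128, 128) == (0, 0, 0)
assert ycbcr_to_rgb(235, 128, 128) == (255, 255, 255)
```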
5. Input video stream storage:
The current video stream data is stored in a corresponding memory for subsequent read calculations, such as intra-frame interpolation de-interlacing (step 9), singular value related de-interlacing (step 10) and entropy value related de-interlacing (step 11).
S2, calculating the timing requirements of the video stream to be output according to the timing information obtained in S1, and performing output video clock management according to those requirements to generate an output video stream timing. Specifically, this comprises the following steps:
3. Video output timing information reading:
Calculate the resolution and frame rate requirements of the output video stream according to the timing information extracted in step 2, and then read the corresponding memory to obtain the information related to the output timing.
4. Video output timing generation:
Perform output video clock management according to the output timing requirements to generate the video output timing.
S3, reading the data of video frames Fn, Fn-1 and Fn-2 stored in S1, and performing singular value related de-interlacing, entropy value related de-interlacing and intra-frame interpolation de-interlacing calculations, wherein the video frames Fn, Fn-1 and Fn-2 are the current frame, the previous frame and the frame before the previous frame, respectively. Specifically, this comprises the following steps:
6. Reading a video stream buffer frame Fn-1:
The video stream data Fn-1 of the previous frame is read from the memory to participate in the subsequent operations, such as intra-frame interpolation de-interlacing (step 9), singular value related de-interlacing (step 10) and entropy value related de-interlacing (step 11).
7. Reading a video stream buffer frame Fn-2:
The video stream data Fn-2 of the frame before the previous frame is read from the memory to participate in the subsequent operations, such as intra-frame interpolation de-interlacing (step 9), singular value related de-interlacing (step 10) and entropy value related de-interlacing (step 11).
8. Video stream image small area caching:
The current-frame video image data of the small region is stored in a cache that can be accessed quickly, so that it can participate in the subsequent operations: calculating the maximum singular value of a pixel point (step 12) and calculating the entropy value of the small-region image (step 13). The image region size is shown in fig. 1.
9. Intra-frame interpolation de-interlacing:
Interpolation is performed on the input video frame Fn-1. Many interpolation algorithms are possible, such as bilinear interpolation, bicubic interpolation or windowed-function interpolation.
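A minimal sketch of intra-frame interpolation de-interlacing using the simplest (vertical bilinear) option mentioned above, assuming the missing lines are the rows absent from the field. The function name and data layout are illustrative, not from the patent.

```python
def intra_deinterlace(field_rows, height):
    """Fill the missing lines of one field by vertical averaging.

    field_rows maps a row index to its pixel list; absent rows are the
    missing interlace lines.  Interior missing rows take the average of
    the lines above and below; edge rows replicate their single
    neighbour.
    """
    out = {}
    for r in range(height):
        if r in field_rows:
            out[r] = list(field_rows[r])
        else:
            above = field_rows.get(r - 1)
            below = field_rows.get(r + 1)
            if above is not None and below is not None:
                out[r] = [(a + b) / 2 for a, b in zip(above, below)]
            else:
                out[r] = list(above if above is not None else below)
    return out

full = intra_deinterlace({0: [10, 10], 2: [30, 30]}, 4)
assert full[1] == [20.0, 20.0]   # average of rows 0 and 2
assert full[3] == [30, 30]       # edge row replicates row 2
```

Bicubic or windowed-function interpolation would use more neighbouring lines with different weights, but follows the same fill-the-missing-rows pattern.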
10. Singular value related de-interlacing calculation:
Image motion is time-dependent, and the image singular values reflect the motion characteristics of the image to a certain degree. The three images Fn, Fn-1 and Fn-2 can be regarded as three planes in space with the value A as their common origin, and the following array reflects the temporal and spatial variation characteristics of the three images Fn, Fn-1 and Fn-2.
Each element in the array is the pixel value of the corresponding pixel point, R denotes a row and L denotes a column. The pixel values may be represented as RGB, gray scale or color difference, corresponding to different color acquisition modes: RGB corresponds to the R, G and B components, color difference corresponds to Y, Cb and Cr, and the gray value corresponds to the Y component alone. In a typical analysis only the gray value Y is analyzed; if RGB or color difference is used, then R, G and B, or Y, Cb and Cr, should each be analyzed separately, and the dimension with the maximum singular value is taken as the basis for the analysis and for calculating the operation factor. Pn-1(R-1, L-1) - A and Pn-1(R+1, L+1) - A; Pn-1(R, L-1) - A and Pn-1(R, L+1) - A; Pn-1(R+1, L-1) - A and Pn-1(R-1, L+1) - A; Pn-2(R-1, L) - Pn(R-1, L) and Pn-2(R+1, L) - Pn(R+1, L); and Pn-2(R, L-2) - Pn(R, L-2) and Pn-2(R, L+2) - Pn(R, L+2) are independent dimensions. The singular value vector is analyzed, in particular the scalar maximum singular value dimension, and the image then undergoes a multidimensional asymmetric interpolation operation that supplements intra-frame de-interlacing and entropy value related de-interlacing, further improving the de-interlacing result. Introducing singular value related de-interlacing markedly improves the de-interlacing of video images with large entropy.
Specifically, if RGB or Y, Cb, Cr is used for the analysis, each image component should be analyzed independently along the dimensions above. Singular value related de-interlacing is similar to motion compensation; the specific analysis and calculation process is complex and largely depends on the application scenario. An example: consider the dimensions Pn-2(R, L-2) - Pn(R, L-2) and Pn-2(R, L+2) - Pn(R, L+2), which represent the temporal variation of the image at pixel points (R, L-2) and (R, L+2), respectively. Comparing the two, a larger value means a larger temporal change at that point. If linear interpolation is used, then Pn-1(R, L) = … + C1·Pn-2(R, L-2) + C2·Pn(R, L-2) + C3·Pn-2(R, L+2) + C4·Pn(R, L+2) + …,
where C1, C2, C3 and C4 are interpolation scale factors. Suppose the value of Pn-2(R, L+2) - Pn(R, L+2) is larger. For a landscape or surveillance application, C3 > C4 should hold; conversely, for shooting high-speed sports, C4 > C3 should hold. The values of C1 and C2 for high-speed sports should be smaller than those for landscape or surveillance applications, and C1 and C2 should be smaller than C3 and C4.
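The scene-dependent choice of C1..C4 described above might be sketched as follows. Only the ordering constraints come from the text; the concrete factor values and the function name are invented for illustration.

```python
def choose_scale_factors(scene):
    """Hypothetical interpolation scale factors C1..C4 for
        Pn-1(R, L) = ... + C1*Pn-2(R, L-2) + C2*Pn(R, L-2)
                         + C3*Pn-2(R, L+2) + C4*Pn(R, L+2) + ...
    assuming the right-hand dimension Pn-2(R, L+2) - Pn(R, L+2)
    changed more, as in the example above.  Constraints follow the
    text: landscape/surveillance favours the older frame (C3 > C4),
    high-speed sports favours the newest frame (C4 > C3, smaller
    C1, C2).  The numbers themselves are invented.
    """
    if scene in ("landscape", "surveillance"):
        c1, c2, c3, c4 = 0.15, 0.15, 0.40, 0.30
    else:  # high-speed sports
        c1, c2, c3, c4 = 0.10, 0.10, 0.30, 0.50
    return c1, c2, c3, c4

c1, c2, c3, c4 = choose_scale_factors("surveillance")
assert c3 > c4 and c1 < c3 and c2 < c4
```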
11. Entropy value related de-interlacing calculation:
The entropy reflects the magnitude of the difference between two images; using a formula,
the entropy between frame n and frame n-2 can be found. If the entropy value is close to 0, the image is static and Pn-1(i, j) ≈ Pn-2(i, j) ≈ Pn(i, j).
A larger entropy value represents a larger motion change between frame n and frame n-2. Since frame n-1 lies between frame n and frame n-2 on the time axis, the actual values of Pn-1(i, j) should be related to the pixel values of frame n and frame n-2. If the entropy value is very large, the change between frame n and frame n-2 is severe; a rapidly changing image carries more uncertainty along the time axis, i.e. the coherence among frames n-1, n and n-2 becomes small, so the interpolated value Pn-1(i, j) should depend more on the current frame image n-1. The correlation of the interpolated values Pn-1(i, j) with the pixel values of frame n and frame n-2 determines the scale factors applied to the inter-frame data in the calculation. The approximate relationship between the interpolation scale factor and the entropy value is shown in fig. 4. The interpolation scale factors differ between applications; summarizing a reasonable curve from experiments in different application scenarios yields a better interpolation result.
Because entropy affects the interpolation scale factor, a good de-interlacing interpolation should combine intra-frame de-interlacing interpolation with entropy-related inter-frame de-interlacing interpolation. The interpolation is multidimensional and many algorithms are possible, such as bilinear interpolation, bicubic interpolation or even windowed-function interpolation.
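Since the patent's entropy formula is not reproduced in this text, the sketch below uses the mean absolute pixel difference as an illustrative stand-in with the same qualitative behaviour: near 0 for static content, growing with motion between frame n and frame n-2.

```python
def frame_difference_measure(fn, fn2):
    """Illustrative stand-in for the entropy between frames Fn and Fn-2
    (the patent's exact formula is not reproduced here): the mean
    absolute pixel difference over a region.  Near 0 for a static
    image, larger for stronger motion.
    """
    total, count = 0, 0
    for row_n, row_n2 in zip(fn, fn2):
        for p_n, p_n2 in zip(row_n, row_n2):
            total += abs(p_n - p_n2)
            count += 1
    return total / count

static = frame_difference_measure([[10, 10], [20, 20]], [[10, 10], [20, 20]])
moving = frame_difference_measure([[10, 10], [20, 20]], [[200, 10], [20, 20]])
assert static == 0 and moving > static
```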
12. Calculating the maximum singular value of a pixel point:
Image singular value calculation is performed on the input video frames Fn, Fn-1, Fn-2 according to the size of the image area shown in fig. 1, and the inter-frame maximum singular value is extracted. The specific calculation process is shown in step 10.
13. Calculating entropy of small region image:
Entropy calculation between the input video frames Fn, Fn-1 and Fn-2 is performed according to the image region size shown in fig. 1; the specific calculation adopts the formula of step 11.
S4, calculating an entropy value related de-interlacing operation factor and a singular value related de-interlacing operation factor; specifically, the method comprises the following steps:
14. Calculating an entropy value related de-interlacing operation factor:
The operation factor suitable for entropy value related de-interlacing (step 11) is calculated from the result of step 13 and may be selected according to fig. 4.
15. Singular value related de-interlacing operation factor calculation:
Calculate the operation factor suitable for singular value related de-interlacing according to the calculation result of step 12; for the specific calculation process, refer to step 10.
S5 and S6, calculating the normalization factors, carrying out the normalized summation of the output results of S3 according to those factors, and storing the result. Specifically, this comprises the following steps:
16. Calculating a normalization parameter:
The normalization parameters (factors) are calculated from the results of steps 14 and 15. Specifically, consistent with step 11: when the entropy value is small, the inter-frame result is reliable and the normalization factor of inter-frame de-interlacing is larger; when the entropy value is large, intra-frame de-interlacing dominates and the inter-frame factor is smaller. The interpolation based on singular value de-interlacing further supplements inter-frame and intra-frame de-interlacing to improve the de-interlaced picture; since singular-value-based de-interlacing is an asymmetric interpolation mode, its parameter share in the final normalization is small in all cases.
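A hypothetical mapping from the entropy measure to the normalization factors γ1, γ2, γ3, following the qualitative rules of steps 11 and 16. The linear ramp and the 0.1 singular-value share are invented for illustration; a real system would use a lookup table or a scene-specific formula.

```python
def normalization_factors(entropy, entropy_max, sv_share=0.1):
    """Map a region's entropy measure to the normalization factors
    (g1: intra-frame, g2: inter-frame / entropy related, g3: singular
    value related).  Small entropy (static image) lets the inter-frame
    result dominate; large entropy shifts weight to the intra-frame
    result; the singular value share stays small throughout.
    """
    t = min(max(entropy / entropy_max, 0.0), 1.0)  # motion level in [0, 1]
    g3 = sv_share                # always a small, fixed share
    g1 = (1.0 - g3) * t          # intra-frame weight grows with motion
    g2 = (1.0 - g3) * (1.0 - t)  # inter-frame weight shrinks with motion
    return g1, g2, g3

g1, g2, g3 = normalization_factors(entropy=5.0, entropy_max=100.0)
assert abs(g1 + g2 + g3 - 1.0) < 1e-9   # factors always sum to 1
assert g2 > g1                           # nearly static: inter-frame dominates
```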
17. Normalization of the pixel operation values:
Perform the normalized summation of the output results of steps 9, 10 and 11 according to the output of step 16. The video stream de-interlacing method provided by the invention is a comprehensive application of intra-frame de-interlacing, entropy value related de-interlacing and singular value related de-interlacing. The operation parameters can be obtained from a lookup table or from a calculation formula related to the application scenario. Since the final output Pn-1(R, L) is the pixel value of point (R, L) of frame n-1, the three de-interlacing results Pn-1(R, L)1, Pn-1(R, L)2 and Pn-1(R, L)3 (from intra-frame de-interlacing, entropy value related de-interlacing and singular value related de-interlacing, respectively) should be summed with normalization to ensure that the final output does not overflow. The normalized summation is formulated as
Pn-1(R, L) = γ1·Pn-1(R, L)1 + γ2·Pn-1(R, L)2 + γ3·Pn-1(R, L)3,
where the normalization factors satisfy γ1 + γ2 + γ3 = 1. The values of γ1, γ2 and γ3 may be obtained from a lookup table or from a calculation formula related to the application scenario.
S7, progressive-scan video output: read the corresponding data from the data stored in S6 according to the output video stream timing obtained in S2, and package it into a standard frame format that conforms to the video stream output. Specifically, this comprises the following steps:
18. Data caching:
Cache the final output video data Pn-1(R, L) obtained in step 17 in a memory.
19. Progressive scan video output:
Read the final output video data Pn-1(R, L) from the memory according to the output video timing generated in step 4, and package it into a standard frame format that conforms to video stream output for output.
In an embodiment of the present invention, there is also provided a video stream de-interlacing terminal device, comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the method described above when executing the computer program.
Further, the terminal device may be a desktop computer, a notebook, a handheld computer, a cloud server or another computing device, and may include, but is not limited to, a processor and a memory. Those skilled in the art will understand that the above structure is only an example of a video stream de-interlacing terminal device and does not limit it; the device may include more or fewer components than listed, combine some components, or use different components. For example, it may further include input/output devices, network access devices, buses and the like, which the embodiments of the present invention do not limit.
Further, the processor may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor or any conventional processor. The processor is the control center of the video stream de-interlacing terminal device and connects the various parts of the entire device through various interfaces and lines.
The memory may be configured to store the computer program and/or modules, and the processor implements the various functions of the video stream de-interlacing terminal device by running the computer program and/or modules stored in the memory and calling the data stored in the memory. The memory may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required for at least one function, and the like. In addition, the memory may include high-speed random access memory, and may also include non-volatile memory such as a hard disk, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a flash memory card, at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program implements the steps of the above method according to the embodiment of the present invention.
The modules/units integrated in the video stream de-interlacing terminal device may be stored in a computer-readable storage medium if implemented in the form of software functional units and sold or used as separate products. Based on this understanding, all or part of the flow of the method according to the embodiments of the present invention may also be implemented by a computer program, which may be stored in a computer-readable storage medium; when executed by a processor, the computer program implements the steps of the method embodiments. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content of the computer-readable medium may be increased or decreased as required by legislation and patent practice in a jurisdiction; for example, in some jurisdictions, computer-readable media exclude electrical carrier signals and telecommunications signals.
While the invention has been particularly shown and described with reference to a preferred embodiment, it will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (9)

1. A method for de-interlacing a video stream, comprising the steps of:
S1, storing and analyzing an input digital interlaced video stream, and extracting timing information;
S2, calculating the timing requirements of the video stream to be output according to the timing information obtained in S1, and performing output video clock management according to those requirements to generate an output video stream timing;
S3, reading the data of video frames Fn, Fn-1 and Fn-2 stored in S1, and performing singular value related de-interlacing, entropy value related de-interlacing and intra-frame interpolation de-interlacing calculations, wherein the video frames Fn, Fn-1 and Fn-2 are the current frame, the previous frame and the frame before the previous frame, respectively;
S4, calculating an entropy value related de-interlacing operation factor and a singular value related de-interlacing operation factor;
S5, calculating normalization factors according to the entropy value related de-interlacing operation factor and the singular value related de-interlacing operation factor obtained in S4;
S6, performing a normalized summation of the output results of S3 according to the normalization factors obtained in S5, and storing the result;
and S7, outputting progressive-scan video: reading the corresponding data from the data stored in S6 according to the output video stream timing obtained in S2, and packaging it into a standard frame format that conforms to the video stream output.
2. The video stream de-interlacing method according to claim 1, further comprising, before S1, a step of converting an analog video stream into a digital video stream.
3. The video stream de-interlacing method according to claim 1, wherein S1 specifically comprises: converting the current video stream data into RGB signals according to standard ITU 601-1 and storing them in a corresponding memory for subsequent reading and calculation; and extracting the resolution and frame rate information of the input video stream.
4. the video stream de-interlacing method according to claim 1, wherein the singular value dependent de-interlacing in S3 is performed by: the time and space variation characteristics of the input video frame Fn, Fn-1 and Fn-2 images are expressed by the following arrays
Each element in the array is a pixel value of a corresponding pixel point, R represents a row, and L represents a column; carrying out scalar maximum singular value dimension analysis on the singular value vector; and then carrying out multidimensional asymmetric interpolation operation on the image.
5. The video stream de-interlacing method according to claim 1, wherein the entropy-dependent de-interlacing in S3 is performed by: calculating the entropy between the input video frames Fn and Fn-2 using a formula, wherein Pn(i, j) and Pn-2(i, j) are the pixel values of the input video frames Fn and Fn-2, R denotes a row, and L denotes a column; determining an interpolation scale factor from the entropy; and calculating the interpolated value Pn-1(i, j) by interpolation according to the interpolation scale factor.
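Claim 5's exact entropy formula appears only in a figure not reproduced here. As a hypothetical stand-in, the Shannon entropy of the Fn/Fn-2 difference histogram captures the same idea: low entropy suggests a static scene (favoring temporal interpolation), high entropy suggests motion (favoring intra-frame interpolation).

```python
import numpy as np

def frame_difference_entropy(fn, fn_2):
    """Shannon entropy of the |Fn - Fn-2| histogram -- a stand-in for
    the claim-5 entropy, NOT the patent's actual formula."""
    diff = np.abs(fn.astype(np.int32) - fn_2.astype(np.int32))
    hist, _ = np.histogram(diff, bins=256, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]  # drop empty bins: 0 * log(0) is taken as 0
    return float(-(p * np.log2(p)).sum())
```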
6. The video stream de-interlacing method according to claim 4 or 5, wherein the interpolation comprises bilinear interpolation, bicubic interpolation or windowing function interpolation.
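In the de-interlacing setting of claims 4–6, bilinear interpolation of a missing scan line reduces to a vertical average of the two adjacent lines of the field, since only the lines directly above and below are available; a minimal sketch:

```python
def interpolate_missing_line(above, below):
    """Bilinear interpolation of a missing scan line: with only the
    lines directly above and below available, it reduces to the
    per-pixel vertical average of the two."""
    return [(a + b) / 2.0 for a, b in zip(above, below)]
```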
7. The video stream de-interlacing method according to claim 1, wherein the formula of S6 is: Pn-1(R, L) = γ1·Pn-1(R, L)1 + γ2·Pn-1(R, L)2 + γ3·Pn-1(R, L)3, wherein Pn-1(R, L) is the pixel value at point (R, L) of frame n-1 of the final output video stream; Pn-1(R, L)1, Pn-1(R, L)2 and Pn-1(R, L)3 are the calculation results of intra-frame de-interlacing, entropy-dependent de-interlacing and singular-value-dependent de-interlacing, respectively; γ1, γ2 and γ3 are normalization factors; and γ1 + γ2 + γ3 = 1.
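The claim-7 fusion is direct to implement once the three branch results and the normalization factors are available; a per-pixel sketch:

```python
def fuse_deinterlace_results(p_intra, p_entropy, p_svd, g1, g2, g3):
    """Claim-7 normalized summation for one pixel:
    Pn-1(R,L) = g1*P1 + g2*P2 + g3*P3, with g1 + g2 + g3 = 1 so the
    fused value stays within the pixel range of its inputs."""
    assert abs(g1 + g2 + g3 - 1.0) < 1e-9, "normalization factors must sum to 1"
    return g1 * p_intra + g2 * p_entropy + g3 * p_svd
```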
8. A video stream de-interlacing terminal device comprising a memory, a processor and a computer program stored in said memory and executable on said processor, characterized in that said processor implements the steps of the method according to any one of claims 1 to 7 when executing said computer program.
9. A computer-readable storage medium storing a computer program which, when executed by a processor, carries out the steps of the method according to any one of claims 1 to 7.
CN201910914592.0A 2019-09-26 2019-09-26 Video stream de-interlacing method, terminal equipment and storage medium Withdrawn CN110545393A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910914592.0A CN110545393A (en) 2019-09-26 2019-09-26 Video stream de-interlacing method, terminal equipment and storage medium

Publications (1)

Publication Number Publication Date
CN110545393A true CN110545393A (en) 2019-12-06

Family

ID=68714531

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910914592.0A Withdrawn CN110545393A (en) 2019-09-26 2019-09-26 Video stream de-interlacing method, terminal equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110545393A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111885338A (en) * 2020-07-31 2020-11-03 青岛信芯微电子科技股份有限公司 Video de-interlacing processing method and device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040070686A1 (en) * 2002-07-25 2004-04-15 Samsung Electronics Co., Ltd. Deinterlacing apparatus and method
CN101197995A (en) * 2006-12-07 2008-06-11 深圳艾科创新微电子有限公司 Edge self-adapting de-interlacing interpolation method
CN103077509A (en) * 2013-01-23 2013-05-01 天津大学 Method for synthesizing continuous and smooth panoramic video in real time by using discrete cubic panoramas

Similar Documents

Publication Publication Date Title
US20220038700A1 (en) Motion compensation and motion estimation leveraging a continuous coordinate system
US8072511B2 (en) Noise reduction processing apparatus, noise reduction processing method, and image sensing apparatus
US7411628B2 (en) Method and system for scaling, filtering, scan conversion, panoramic scaling, YC adjustment, and color conversion in a display controller
JP3046079B2 (en) A histogram equalizer for improving the contrast of moving images
US7791662B2 (en) Image processing device, image processing method, recording medium, and program
CN100505880C (en) Method for motion estimation based on hybrid block matching and apparatus for converting frame rate using the method
US8861846B2 (en) Image processing apparatus, image processing method, and program for performing superimposition on raw image or full color image
US8749699B2 (en) Method and device for video processing using a neighboring frame to calculate motion information
JP2010239636A (en) Image generation apparatus, image generation method and program
JPS60229594A (en) Method and device for motion interpolation of motion picture signal
US8289420B2 (en) Image processing device, camera device, image processing method, and program
CN110889809B9 (en) Image processing method and device, electronic equipment and storage medium
JP2002542741A (en) De-interlacing of video signal
US8723969B2 (en) Compensating for undesirable camera shakes during video capture
US8922712B1 (en) Method and apparatus for buffering anchor frames in motion compensation systems
CN110545393A (en) Video stream de-interlacing method, terminal equipment and storage medium
JP7137185B2 (en) Tone mapping processing method by edge strength maximization and HDR video conversion device
US7421150B2 (en) Coordinate conversion apparatus and method
KR20120020821A (en) Method and apparatus for correcting distorted image
JPWO2017203941A1 (en) IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND PROGRAM
KR100719988B1 (en) Image signal processing device
CN114298889A (en) Image processing circuit and image processing method
JP2007088910A (en) Motion vector detecting device and imaging apparatus
KR20060023150A (en) Spatial signal conversion
JP2006303693A (en) Electronic camera provided with function of generating reduced picture

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20191206