WO2009140916A1 - Deinterlacing method, deinterlacing device and video processing system for video data - Google Patents

Deinterlacing method, deinterlacing device and video processing system for video data

Info

Publication number
WO2009140916A1
Authority
WO
WIPO (PCT)
Prior art keywords
pixel
current frame
field
motion state
spatial position
Prior art date
Application number
PCT/CN2009/071867
Other languages
English (en)
Chinese (zh)
Inventor
谢清鹏
高新波
周芳
邓勤耕
张晓森
熊联欢
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 filed Critical 华为技术有限公司
Publication of WO2009140916A1 publication Critical patent/WO2009140916A1/fr

Links

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/14Picture signal circuitry for video frequency region
    • H04N5/144Movement detection
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/01Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
    • H04N7/0117Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level involving conversion of the spatial resolution of the incoming video signal
    • H04N7/012Conversion between an interlaced and a progressive signal

Definitions

  • The present invention relates to the field of video processing, and in particular to a deinterlacing method, a deinterlacing device and a video processing system for video data. Background of the invention
  • Interlaced scanning is widely used in the video field. On high-resolution display devices, however, consumers find that interlaced scanning suffers from line crawling, edge feathering and interline flicker, so it can no longer satisfy consumers' high demands on multimedia quality.
  • De-interlacing takes the top field of the original interlaced scan (in which the pixels of the odd scan lines have pixel values and the pixels of the even lines do not) or the bottom field (in which the pixels of the even scan lines have pixel values and the pixels of the odd lines do not) and interpolates the lines that have no pixel values, so that a complete frame image is generated from a single field.
  • In the prior art, the line averaging method and the inter-field copy method are applied to the top field or the bottom field of the interlaced scan.
  • Taking the top field as an example, a detection point is first selected, and whether the detection point is moving is judged from the pixel at the same spatial position in the adjacent top field; the motion state of the interpolation points vertically adjacent to the detection point is then estimated from this judgment.
  • If an adjacent interpolation point is considered to be moving, the line averaging method is applied to it, that is, its value is taken as the average of the pixels vertically above and below the interpolation point.
  • If it is not moving, the inter-field copy method is used, that is, the pixel at the same spatial position in the bottom field corresponding to the interpolation point of the top field is copied (a sketch of these two rules follows).
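As an illustration of these two prior-art rules only (not of the invention), the following minimal sketch shows line averaging and inter-field copying for one missing pixel; the array layout (fields stored as full-frame numpy arrays with zeros on their missing lines) and the function names are assumptions introduced here.

```python
import numpy as np

def line_average(field, r, c):
    # Intra-field rule: average of the pixels directly above and below
    # the interpolation point (r, c) in the same field.
    return (int(field[r - 1, c]) + int(field[r + 1, c])) // 2

def field_copy(other_field, r, c):
    # Inter-field rule: copy the pixel at the same spatial position
    # from the complementary field of the same frame.
    return int(other_field[r, c])
```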
  • However, the prior art only detects pixel variation between the preceding and following top fields and cannot detect pixel variation in the bottom field that lies between those top fields. This causes missed detection and therefore incorrect pixel values at the interpolation points after interpolation; missing the changes of the bottom field ultimately degrades the sharpness of the image.
  • Moreover, whether the pixels of an interpolation line in the top field are moving is only estimated from whether a detection point in the top field is moving. The accuracy of this judgment is not high and misjudgment easily occurs, which also makes the image unclear.
  • In addition, line averaging for moving pixels uses only the pixel values in the vertical direction, which increases the pixel value error of the points to be interpolated and makes the image unclear.
  • The embodiments of the invention provide a deinterlacing method, a deinterlacing device and a video processing system that prevent pixels of the current frame from being missed in detection, improve the accuracy of judging the motion state, and reduce the pixel value error of the lines to be interpolated, thereby improving the sharpness of the image.
  • An embodiment of the present invention provides a deinterlacing method for video data, including:
  • detecting the motion state of the pixels of the top field of the current frame in the video data relative to the pixels at the same spatial position in the top field of the adjacent frame, and detecting the motion state of the pixels of the bottom field of the current frame relative to the pixels at the same spatial position in the bottom field of the adjacent frame;
  • determining the motion state of a pixel of the current frame relative to the pixel at the same spatial position in the adjacent frame by determining whether the motion state of at least one of the pixels of the top field and the bottom field of the current frame is motion; and
  • performing interpolation processing on the pixels of the lines to be interpolated according to the motion state of the pixels of the current frame relative to the pixels at the same spatial position in the adjacent frame. A sketch of this scheme is given after this list.
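The following is a minimal sketch of how such a scheme could be put together, not the patented implementation: frames are assumed to be 2-D numpy arrays, fields are stored at full frame height with zeros on their missing lines, the even rows of a 0-based array carry the top field (the odd scan lines in the 1-based numbering of the text), the threshold value 5 is taken from the detailed embodiment below, and the helper names `motion_flags` and `deinterlace_top_field` are introduced here for illustration.

```python
import numpy as np

def motion_flags(field_cur, field_adj, threshold=5):
    # 1 where the absolute difference between the current and adjacent
    # frame's co-sited field pixels exceeds the threshold, else 0.
    diff = np.abs(field_cur.astype(np.int32) - field_adj.astype(np.int32))
    return (diff > threshold).astype(np.uint8)

def deinterlace_top_field(cur_top, cur_bot, adj_top, adj_bot, threshold=5):
    """Fill the missing lines of the current frame's top field.

    Because the fields are full-frame-size arrays whose missing lines are
    zero, the OR of the two flag maps carries top-field motion on the lines
    the top field owns and bottom-field motion on the lines to be
    interpolated -- a change visible only in the bottom field is not missed."""
    motion = motion_flags(cur_top, adj_top, threshold) | motion_flags(cur_bot, adj_bot, threshold)
    out = cur_top.copy()
    rows, cols = out.shape
    for r in range(1, rows - 1, 2):        # 0-based odd rows: missing lines of the top field
        for c in range(cols):              # (interior lines only; borders omitted for brevity)
            if motion[r, c]:               # moving: intra-field interpolation (line average here)
                out[r, c] = (int(cur_top[r - 1, c]) + int(cur_top[r + 1, c])) // 2
            else:                          # still: inter-field interpolation (copy from bottom field)
                out[r, c] = cur_bot[r, c]
    return out
```

The line-average branch stands in for the intra-field interpolation (edge-adaptive filtering or edge line averaging in the text), and the copy branch for the inter-field pixel-copy interpolation.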
  • An embodiment of the invention further provides a deinterlacing device for video data, including:
  • a detecting module configured to detect the motion state of the pixels of the top field of the current frame in the video data relative to the pixels at the same spatial position in the top field of an adjacent frame, and to detect the motion state of the pixels of the bottom field of the current frame relative to the pixels at the same spatial position in the bottom field of the adjacent frame;
  • a determining module configured to determine the motion state of a pixel of the current frame relative to the pixel at the same spatial position in the adjacent frame by determining whether the motion state of at least one of the pixels of the top field and the bottom field of the current frame is motion; and
  • an interpolation module configured to perform interpolation processing on the pixels of the lines to be interpolated according to the motion state, determined by the determining module, of the pixels of the current frame relative to the pixels at the same spatial position in the adjacent frame.
  • An embodiment of the invention further provides a video processing system for video data, including:
  • a data receiving device configured to receive video data; and
  • a deinterlacing device configured to: detect the motion state of the pixels of the top field of the current frame in the video data relative to the pixels at the same spatial position in the top field of an adjacent frame, and detect the motion state of the pixels of the bottom field of the current frame relative to the pixels at the same spatial position in the bottom field of the adjacent frame; determine the motion state of a pixel of the current frame relative to the pixel at the same spatial position in the adjacent frame by determining whether the motion state of at least one of the pixels of the top field and the bottom field of the current frame is motion; and perform interpolation processing on the pixels of the lines to be interpolated according to the motion state of the pixels of the current frame relative to the pixels at the same spatial position in the adjacent frame.
  • In the embodiments of the invention, the motion state of the pixels of the top field of the current frame in the video data relative to the pixels at the same spatial position in the top field of the adjacent frame is detected, and the motion state of the pixels of the bottom field of the current frame relative to the pixels at the same spatial position in the bottom field of the adjacent frame is detected; the motion state of a pixel of the current frame relative to the pixel at the same spatial position in the adjacent frame is determined by determining whether the motion state of at least one of the pixels of the top field and the bottom field of the current frame is motion; and the pixels of the lines to be interpolated are interpolated according to the motion state of the pixels of the current frame relative to the pixels at the same spatial position in the adjacent frame.
  • FIG. 1 is a schematic structural diagram of a video processing system according to Embodiment 1 of the present invention.
  • FIG. 2 is a general flowchart of a deinterlacing method according to Embodiment 2 of the present invention.
  • FIG. 3 is a detailed flowchart of a deinterlacing method according to Embodiment 3 of the present invention.
  • FIG. 4 is a top field diagram of a current frame of a deinterlacing method according to Embodiment 3 of the present invention.
  • FIG. 5 is a bottom field diagram of a current frame of a deinterlacing method according to Embodiment 3 of the present invention.
  • FIG. 6 is a top field diagram of an adjacent frame of a deinterlacing method according to Embodiment 3 of the present invention.
  • FIG. 7 is a bottom field diagram of an adjacent frame of a deinterlacing method according to Embodiment 3 of the present invention.
  • FIG. 8 is a schematic diagram of the pixel absolute differences of the top field of a current frame in a deinterlacing method according to Embodiment 3 of the present invention.
  • FIG. 9 is a schematic diagram of the pixel absolute differences of the bottom field of a current frame in a deinterlacing method according to Embodiment 3 of the present invention.
  • FIG. 10 is a schematic diagram showing the identification of the pixel absolute differences of the top field of the current frame in a deinterlacing method according to Embodiment 3 of the present invention.
  • FIG. 11 is a schematic diagram showing the identification of the pixel absolute differences of the bottom field of the current frame in a deinterlacing method according to Embodiment 3 of the present invention.
  • FIG. 12 is a schematic diagram showing the expansion of the identifiers of the pixel absolute differences of the top field of the current frame in a deinterlacing method according to Embodiment 3 of the present invention.
  • FIG. 13 is a schematic diagram showing the expansion of the identifiers of the pixel absolute differences of the bottom field of the current frame in a deinterlacing method according to Embodiment 3 of the present invention.
  • FIG. 14 is a schematic diagram of a first operation result of the current frame in a deinterlacing method according to Embodiment 3 of the present invention.
  • FIG. 15 is a schematic diagram of a second operation result of the current frame in a deinterlacing method according to Embodiment 3 of the present invention.
  • FIG. 16 is a detailed flowchart of a deinterlacing method according to Embodiment 4 of the present invention.
  • FIG. 17 is a schematic diagram of blocking of a top field of a current frame of a deinterlacing method according to Embodiment 4 of the present invention.
  • FIG. 18 is a schematic diagram of blocking of a top field of an adjacent frame in a deinterlacing method according to Embodiment 4 of the present invention.
  • FIG. 19 is a schematic diagram of blocking of a bottom field of a current frame of a deinterlacing method according to Embodiment 4 of the present invention.
  • FIG. 20 is a schematic diagram of blocking of a bottom field of an adjacent frame in a deinterlacing method according to Embodiment 4 of the present invention.
  • FIG. 21 is a schematic diagram of the pixel absolute differences of the blocks of the top field of a current frame in a deinterlacing method according to Embodiment 4 of the present invention.
  • FIG. 22 is a schematic diagram of the pixel absolute differences of the blocks of the bottom field of a current frame in a deinterlacing method according to Embodiment 4 of the present invention.
  • FIG. 23 is a schematic diagram showing the identification of the pixel absolute differences of the blocks of the top field of the current frame in a deinterlacing method according to Embodiment 4 of the present invention.
  • FIG. 24 is a schematic diagram showing the identification of the pixel absolute differences of the blocks of the bottom field of the current frame in a deinterlacing method according to Embodiment 4 of the present invention.
  • FIG. 25 is a general flowchart of a deinterlacing method according to Embodiment 5 of the present invention.
  • FIG. 26 is a detailed flowchart of a deinterlacing method according to Embodiment 6 of the present invention.
  • Figure 27 is a detailed flow chart of the deinterlacing method provided in Embodiment 7 of the present invention.
  • FIG. 1 is a schematic structural diagram of a video processing system 10 according to Embodiment 1 of the present invention.
  • The video processing system 10 includes a data receiving device 12 and a deinterlacing device 11, wherein:
  • the data receiving device 12 is configured to receive video data; and
  • the deinterlacing device 11 is configured to perform deinterlacing processing on the received video data, and specifically to: detect the motion state of the pixels of the top field of the current frame in the video data relative to the pixels at the same spatial position in the top field of an adjacent frame; detect the motion state of the pixels of the bottom field of the current frame relative to the pixels at the same spatial position in the bottom field of the adjacent frame; determine the motion state of a pixel of the current frame relative to the pixel at the same spatial position in the adjacent frame by determining whether the motion state of at least one of the pixels at the same spatial position in the top field and the bottom field of the current frame is motion; and interpolate the pixels of the lines to be interpolated according to the motion state of the pixels of the current frame relative to the pixels at the same spatial position in the adjacent frame.
  • The pixels to be interpolated include the pixels to be interpolated in the current frame and the pixels to be interpolated in the adjacent frame.
  • The deinterlacing device 11 includes a detecting module 110, a determining module 130 and an interpolation module 140, wherein:
  • the detecting module 110 is configured to detect the motion state of the pixels of the top field of the current frame in the video data relative to the pixels at the same spatial position in the top field of an adjacent frame, and to detect the motion state of the pixels of the bottom field of the current frame relative to the pixels at the same spatial position in the bottom field of the adjacent frame, where the pixel includes a pixel point or a pixel block.
  • Each frame can be divided into a top field and a bottom field: in the top field the pixels of the odd rows have pixel values and the pixels of the even rows have none, while in the bottom field the pixels of the even rows have pixel values and the pixels of the odd rows have none.
  • The determining module 130 is configured to determine the motion state of a pixel of the current frame relative to the pixel at the same spatial position in the adjacent frame by determining whether the motion state of at least one of the pixels at the same spatial position in the top field and the bottom field of the current frame is motion. The interpolation module 140 is configured to perform interpolation operations on the pixels of the lines to be interpolated according to the motion state, determined by the determining module 130, of the pixels of the current frame relative to the pixels at the same spatial position in the adjacent frame.
  • The determining module 130 may further be configured to: determine that the motion state of a pixel of the current frame relative to the pixel at the same spatial position in the adjacent frame is motion when the motion state of at least one of the pixels at the same spatial position in the top field and the bottom field of the current frame is motion; and determine that the motion state of a pixel of the current frame relative to the pixel at the same spatial position in the adjacent frame is still when neither of the pixels at the same spatial position in the top field and the bottom field of the current frame is moving.
  • The interpolation module 140 described above may include an intra-field interpolation module 141 and an inter-field interpolation module 142, wherein:
  • the intra-field interpolation module 141 is configured to perform an intra-field interpolation operation on the pixels of the lines to be interpolated whose motion state is motion, and the inter-field interpolation module 142 is configured to perform an inter-field interpolation operation on the pixels of the lines to be interpolated whose motion state is still.
  • The pixel may be a pixel point.
  • In this case the detecting module 110 may further be configured to detect the motion state of a pixel point of the top field of the current frame relative to the pixel point at the same spatial position in the top field of the adjacent frame by determining whether the pixel absolute difference of those two pixel points is greater than a first threshold value, and to detect the motion state of a pixel point of the bottom field of the current frame relative to the pixel point at the same spatial position in the bottom field of the adjacent frame by determining whether the pixel absolute difference of those two pixel points is greater than the first threshold value.
  • The detecting module 110 is further configured to divide the top field and the bottom field of the current frame and the top field and the bottom field of the adjacent frame into multiple pixel blocks. For example, if the image size of a frame is N*M, the detecting module 110 may divide the top field and the bottom field of the frame into pixel blocks of size N*(M/2); blocking may also be performed in other manners. The detecting module 110 then determines whether the pixel absolute differences of the pixel points of a pixel block of the top field of the current frame and the pixel points of the pixel block at the same spatial position in the top field of the adjacent frame are greater than a second threshold value.
  • Each pixel block of the top field of the current frame forms a one-to-one correspondence in spatial position with a pixel block of the top field of the adjacent frame, and likewise for the bottom fields.
  • The deinterlacing device 11 may further include an operation module 120, where the operation module 120 is configured to identify the motion state of the pixel points of each pixel block of the top field and the bottom field of the current frame and to count the number of moving pixel points in each pixel block of the top field and the bottom field of the current frame.
  • The operation module 120 may mark moving pixel points with 1 and still pixel points with 0; other values may also be used as identifiers.
  • The detecting module 110 is further configured to detect the motion state of each pixel block of the top field by determining whether the number of moving pixel points in that pixel block is greater than a third threshold value, and to detect the motion state of each pixel block of the bottom field by determining whether the number of moving pixel points in that pixel block is greater than the third threshold value.
  • The operation module 120 is further configured to identify the motion state of the pixels of the top field and the bottom field of the current frame, and to OR the identifier of the motion state of each top-field pixel with the identifier of the motion state of the pixel at the same spatial position in the bottom field. In Embodiment 1, the operation module 120 performs an OR operation on the identifiers of the top-field pixel points of the current frame and the identifiers of the bottom-field pixel points.
  • The determining module 130 is further configured to determine, from the motion identifiers in the result of the OR operation, whether the motion state of at least one of the pixels at the same spatial position in the top field and the bottom field of the current frame is motion.
  • If so, the intra-field interpolation module 141 is notified to perform intra-field interpolation on the pixels of the lines to be interpolated of the current frame. In an actual implementation, the intra-field interpolation operation can be implemented by an interpolation method of the prior art, such as edge-adaptive filtering or edge line average interpolation.
  • Otherwise, the inter-field interpolation module 142 is notified to perform inter-field interpolation on the pixels of the lines to be interpolated of the current frame; for example, a pixel copy algorithm may be employed.
  • The determining module 130 is further configured to determine whether the current frame is in global motion relative to the adjacent frame by determining whether the proportion of motion identifiers in the OR result is greater than a fourth threshold value; if the proportion of motion identifiers is greater than the fourth threshold value, the current frame is determined to be in global motion relative to the adjacent frame.
  • In that case the intra-field interpolation module 141 is notified to perform intra-field interpolation on the pixels of the current frame, for example using an adaptive filtering algorithm.
  • If the determining module 130 determines that the current frame is not in global motion, it determines the motion state of the pixels of the current frame by determining, from the motion identifiers in the OR result, whether the motion state of at least one of the pixels at the same spatial position in the top field and the bottom field of the current frame is motion.
  • With the deinterlacing device and the video processing system provided in Embodiment 1 of the present invention, it is ensured that the pixels of the current frame are not missed in detection, the accuracy of determining the motion state of the pixels of the current frame is improved, and the pixel value error of the points to be interpolated is reduced, which improves the sharpness of the image.
  • Embodiment 2 of the present invention provides a deinterlacing method.
  • FIG. 2 is a general flowchart of the deinterlacing method provided by Embodiment 2 of the present invention, which includes:
  • Step S200: receive video data.
  • Step S202: detect the motion state of the pixels of the top field of the current frame in the video data relative to the pixels at the same spatial position in the top field of the adjacent frame, and detect the motion state of the pixels of the bottom field of the current frame relative to the pixels at the same spatial position in the bottom field of the adjacent frame.
  • Step S204: determine the motion state of a pixel of the current frame relative to the pixel at the same spatial position in the adjacent frame by determining whether the motion state of at least one of the pixels at the same spatial position in the top field and the bottom field of the current frame is motion.
  • Step S204 further includes the following steps. Step A: identify the motion state of the pixels of the top field and the bottom field of the current frame. Step B: OR the identifier of the motion state of each top-field pixel with the identifier of the motion state of the pixel at the same spatial position in the bottom field. Step C: determine, from the motion identifiers in the result of the OR operation, whether the motion state of at least one of the pixels at the same spatial position in the top field and the bottom field of the current frame is motion.
  • Step S206: perform an intra-field interpolation operation on the pixels of the lines to be interpolated of the current frame whose motion state is motion, then perform step S210 to output the processed current frame, and end.
  • Step S208: perform an inter-field interpolation operation on the pixels of the lines to be interpolated of the current frame whose motion state is still, then perform step S210, and end.
  • The pixels to be interpolated include the pixels to be interpolated in the current frame and the pixels to be interpolated in the adjacent frame.
  • Embodiment 3 of the present invention provides another deinterlacing method.
  • FIG. 3 is a detailed flowchart of another deinterlacing method provided by Embodiment 3 of the present invention, and the figure includes:
  • Step S300: receive video data, where the video data is interlaced video data and each frame of the video data includes the pixels of a top field and the pixels of a bottom field.
  • Step S302: calculate the pixel absolute differences of the pixel points at the same spatial positions in the top field of the current frame and the top field of the adjacent frame, and calculate the pixel absolute differences of the pixel points at the same spatial positions in the bottom field of the current frame and the bottom field of the adjacent frame.
  • Take a current frame of size 8 x 6 as an example: the current frame is divided into the top field of the current frame of size 8 x 6 (FIG. 4) and the bottom field of the current frame of size 8 x 6 (FIG. 5). Since the pixels of the odd rows of the top field have pixel values and the pixels of the even rows have none, the even rows are used as interpolation lines, so the pixel values of the pixels of the even rows in the top field are set to 0 and those pixels are used as interpolation points; the bottom field is the opposite of the top field.
  • In the bottom field the pixels of the even rows have pixel values and the pixels of the odd rows have none, so the odd rows are used as interpolation lines. Similarly, the adjacent frame is divided into the top field of the adjacent frame of size 8 x 6 (FIG. 6) and the bottom field of the adjacent frame of size 8 x 6 (FIG. 7). A sketch of this field splitting is given below.
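As an illustration of this splitting step only, the sketch below separates an interlaced frame into full-size top and bottom fields whose missing lines are zeroed, as in FIGs. 4 and 5; the 0-based row convention (row 0 corresponds to the first odd scan line of the text), the reading of 8 x 6 as width x height, and the function name are assumptions introduced here.

```python
import numpy as np

def split_fields(frame):
    """Split an interlaced frame (H x W array) into full-size top and
    bottom fields, zeroing the lines each field does not own."""
    top = frame.copy()
    bottom = frame.copy()
    top[1::2, :] = 0      # interpolation lines of the top field
    bottom[0::2, :] = 0   # interpolation lines of the bottom field
    return top, bottom

# Example with an 8 x 6 frame (6 rows of 8 pixels), as in FIGs. 4-5.
frame = np.arange(48, dtype=np.uint8).reshape(6, 8)
top, bottom = split_fields(frame)
```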
  • Pixel absolute difference calculation is performed on the pixels of the top field of the current frame and the pixels at the same spatial positions in the top field of the adjacent frame, giving FIG. 8; pixel absolute difference calculation is performed on the pixels of the bottom field of the current frame and the pixels at the same spatial positions in the bottom field of the adjacent frame, giving FIG. 9.
  • Step S304: detect the motion state of each pixel point of the top field of the current frame relative to the pixel point at the same spatial position in the top field of the adjacent frame, and detect the motion state of each pixel point of the bottom field of the current frame relative to the pixel point at the same spatial position in the bottom field of the adjacent frame.
  • The motion state of a pixel point of the top field of the current frame relative to the pixel point at the same spatial position in the top field of the adjacent frame is detected by determining whether the pixel absolute difference of those two pixel points is greater than a first threshold value, and the motion state of a pixel point of the bottom field of the current frame relative to the pixel point at the same spatial position in the bottom field of the adjacent frame is detected by determining whether the pixel absolute difference of those two pixel points is greater than the first threshold value.
  • The first threshold value may be 5 or another value.
  • If a pixel point is detected as still, the process proceeds to step S306 and the motion state of the still pixel point is identified by 0 (other values may also be used). For example, if the pixel absolute difference of a pixel point in FIG. 8 is less than 5, that is, the pixel point is still, its motion state is identified by 0; similarly, if the pixel absolute difference of a pixel point in FIG. 9 is less than 5, its motion state is identified by 0.
  • If a pixel point of the top field of the current frame is detected as moving relative to the pixel point at the same spatial position in the top field of the adjacent frame, or a pixel point of the bottom field of the current frame is detected as moving relative to the pixel point at the same spatial position in the bottom field of the adjacent frame, the process proceeds to step S308 and the motion state of the moving pixel point is identified by 1 (other values may also be used). For example, each value of the top field in FIG. 8 is compared with the first threshold value, and if the pixel absolute difference of a pixel point is greater than 5, that is, the pixel point is moving, its motion state is identified by 1; similarly, if the pixel absolute difference of a pixel point in FIG. 9 is greater than 5, its motion state is identified by 1. After step S306 or step S308 is performed, FIG. 10 for the top field and FIG. 11 for the bottom field are obtained.
  • Step S310: OR the identifier of the motion state of each top-field pixel point with the identifier of the motion state of the pixel point at the same spatial position in the bottom field.
  • Before the OR operation, the identifiers of the top field and the identifiers of the bottom field need to be expanded. Two points about the operation are as follows.
  • The result of the OR operation is still an identifier: after one identifier is ORed with another identifier, the result is still an identifier, and it still identifies the motion state of a certain pixel, where 1 indicates that the motion state of a pixel of the current frame is motion and 0 indicates that the motion state of a pixel of the current frame is still.
  • For the expansion, the identifiers of the even rows of the top field are set to 0 (FIG. 12) and the identifiers of the odd rows of the bottom field are set to 0 (FIG. 13). Since the pixel values of the even rows in the top field are already 0, the identifiers of the top field after expansion are unchanged from those before expansion; similarly, the identifiers of the bottom field after expansion are unchanged from those before expansion. Finally, the identifiers of the top field and the bottom field are ORed, and the result shown in FIG. 15 is obtained. A small numerical illustration of this expansion and OR step follows.
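The following toy example (not the data of FIGs. 10-15, which are not reproduced here) illustrates how the zero-filled flag maps combine: the OR keeps the top-field flags on the lines the top field owns and the bottom-field flags on the interpolation lines, so a change seen only in the bottom field survives into the combined map. The array shape and values are made up purely for illustration.

```python
import numpy as np

# Motion-identifier maps after steps S306/S308 for a tiny 4 x 4 frame.
# Top-field flags live on 0-based even rows, bottom-field flags on odd rows;
# the remaining rows are "expanded" with zeros, as described in the text.
top_flags = np.array([[0, 1, 0, 0],
                      [0, 0, 0, 0],     # expansion: interpolation line of the top field
                      [0, 0, 1, 0],
                      [0, 0, 0, 0]], dtype=np.uint8)

bottom_flags = np.array([[0, 0, 0, 0],  # expansion: interpolation line of the bottom field
                         [1, 0, 0, 0],
                         [0, 0, 0, 0],
                         [0, 0, 0, 1]], dtype=np.uint8)

combined = top_flags | bottom_flags     # step S310: OR of the two identifier maps
# Rows 1 and 3 of `combined` now carry the bottom-field motion flags, which is
# what is read when judging the interpolation lines of the top field.
```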
  • Step S312: judge the motion state of each pixel point of the current frame relative to the pixel point at the same spatial position in the adjacent frame from the motion identifiers in the result of the OR operation.
  • If the bottom-field interpolation operation is performed on the current frame, the motion state of the pixel points of the interpolation lines of the bottom field is judged; if the top-field interpolation operation is performed on the current frame, the motion state of the pixel points of the interpolation lines of the top field is judged.
  • If it is determined that the motion state of a pixel point of the current frame relative to the pixel point at the same spatial position in the adjacent frame is motion, step S314 is performed and an intra-field interpolation operation is performed on that pixel point (i.e., the interpolation point), whichever field is being interpolated.
  • If it is determined that the motion state of a pixel point of the current frame relative to the pixel point at the same spatial position in the adjacent frame is still, the process proceeds to step S316 and an inter-field interpolation operation is performed on that pixel point (i.e., the interpolation point), whichever field is being interpolated.
  • In this embodiment the intra-field interpolation operation is an edge-adaptive filtering algorithm and the inter-field interpolation operation is a pixel copy algorithm. Since intra-field interpolation and inter-field interpolation are well-known techniques, they are not described in detail here; a sketch of one such pair of operations follows.
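Although the text leaves these operations to the prior art, a common form of edge-adaptive intra-field interpolation is edge-based line averaging, which averages along whichever of three candidate directions has the smallest luminance difference; the sketch below shows that variant together with the pixel-copy rule, purely as an illustration and not as the specific algorithms intended by the patent.

```python
def edge_line_average(field, r, c):
    # Edge-based line average: pick the direction (left-diagonal, vertical
    # or right-diagonal) whose two pixels differ least, then average them.
    up, down = field[r - 1], field[r + 1]
    width = len(up)
    best = None
    for d in (-1, 0, 1):
        if 0 <= c + d < width and 0 <= c - d < width:
            a, b = int(up[c + d]), int(down[c - d])
            cost = abs(a - b)
            if best is None or cost < best[0]:
                best = (cost, (a + b) // 2)
    return best[1]

def pixel_copy(other_field, r, c):
    # Inter-field interpolation: copy the co-sited pixel of the other field.
    return int(other_field[r, c])
```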
  • Then step S318 is executed to output the current frame after the operation.
  • By implementing the technical solution of Embodiment 3 of the present invention, the accuracy of determining the motion state of the pixels of the current frame can be improved, so that the pixel points of the current frame do not suffer from missed detection and the pixel value error of the points to be interpolated is reduced, thereby improving the sharpness of the image.
  • Embodiment 4 of the present invention provides another deinterlacing method.
  • FIG. 16 is a detailed flowchart of another deinterlacing method according to Embodiment 4 of the present invention, which includes: Step S400: receive a video data stream.
  • Step S402: divide the top field and the bottom field of the current frame and of the adjacent frame into multiple pixel blocks.
  • The detecting module 110 divides the top field and the bottom field of a frame into pixel blocks of size N*(M/2); blocking may also be performed in other manners.
  • In this embodiment the image size of the frame is assumed to be 8 x 12, and the top field and the bottom field of the frame are each divided into two pixel blocks of size 8 x 6, where FIG. 17 is the top field of the current frame, FIG. 18 is the bottom field of the current frame, FIG. 19 is the top field of the adjacent frame, and FIG. 20 is the bottom field of the adjacent frame.
  • Step S404: calculate the pixel absolute differences of the pixel points at the same spatial positions in the top field of the current frame and the top field of the adjacent frame, and calculate the pixel absolute differences of the pixel points at the same spatial positions in the bottom field of the current frame and the bottom field of the adjacent frame.
  • The pixel absolute differences of the top field are obtained by this calculation, as shown in FIG. 21; similarly, the pixel absolute differences of the bottom field are obtained, as shown in FIG. 22.
  • Step S406: detect the motion state of the pixel points of each pixel block of the top field of the current frame relative to the pixel points of the pixel block at the same spatial position in the top field of the adjacent frame, and detect the motion state of the pixel points of each pixel block of the bottom field of the current frame relative to the pixel points of the pixel block at the same spatial position in the bottom field of the adjacent frame.
  • The detection compares each pixel absolute difference with a second threshold value; the second threshold value may be 2 or another value.
  • If a pixel point of a pixel block of the top field of the current frame is detected as still relative to the pixel point of the pixel block at the same spatial position in the top field of the adjacent frame, or a pixel point of a pixel block of the bottom field of the current frame is detected as still relative to the pixel point of the pixel block at the same spatial position in the bottom field of the adjacent frame, the process proceeds to step S408 and the motion state of that pixel point is identified by 0 (other values may also be used).
  • If a pixel point of a pixel block of the top field of the current frame is detected as moving relative to the pixel point of the pixel block at the same spatial position in the top field of the adjacent frame, or a pixel point of a pixel block of the bottom field of the current frame is detected as moving relative to the pixel point of the pixel block at the same spatial position in the bottom field of the adjacent frame, the process proceeds to step S410, the motion state of that pixel point is identified by 1 (other values may also be used), and the number of moving pixel points in each pixel block of the top field and the bottom field is counted.
  • The identification map of the top field is thus obtained, as shown in FIG. 23; similarly, the identification map of the bottom field is obtained, as shown in FIG. 24.
  • Step S412: detect the motion state of each pixel block of the top field and the bottom field.
  • The motion state of each pixel block of the top field is detected by determining whether the number of moving pixel points in that pixel block is greater than a third threshold value, and the motion state of each pixel block of the bottom field is detected by determining whether the number of moving pixel points in that pixel block is greater than the third threshold value.
  • In this embodiment the third threshold value is 11; other values may also be used.
  • If it is determined that the motion state of a pixel block of the top field is motion, or that the motion state of a pixel block of the bottom field is motion, the process proceeds to step S416 and the motion state of that block is identified by 1 (other values may also be used). If it is determined that a pixel block of the top field is not a motion block, or that a pixel block of the bottom field is not a motion block, that is, it is a still block, the process proceeds to step S414 and the block is identified by 0 (other values may also be used).
  • In FIG. 23 the number of identifiers 1 in the left pixel block of the top field is 12 and the number of identifiers 1 in the right pixel block is 2; in FIG. 24 the number of identifiers 1 in the left pixel block of the bottom field is 14 and the number of identifiers 1 in the right pixel block is 15.
  • Therefore the motion state of the left pixel block of the top field is identified by 1, the motion state of the right pixel block of the top field is identified by 0, the motion state of the left pixel block of the bottom field is identified by 1, and the motion state of the right pixel block of the bottom field is identified by 1. A sketch of this block-level detection is given below.
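For illustration only, the sketch below reproduces this block-level logic under assumed conventions (full-size zero-filled fields as before, non-overlapping blocks of a given height and width, and the hypothetical helper name `block_motion_flags`); the values 2 (second threshold) and 11 (third threshold) are taken from the text, and everything else is an assumption.

```python
import numpy as np

def block_motion_flags(field_cur, field_adj, block_h, block_w,
                       second_threshold=2, third_threshold=11):
    """Block-level motion map for one field (steps S404-S416, as a sketch).

    A pixel point is marked moving (1) when its absolute difference against
    the co-sited pixel of the adjacent frame exceeds the second threshold;
    a block is marked moving (1) when it contains more than third_threshold
    moving pixel points."""
    diff = np.abs(field_cur.astype(np.int32) - field_adj.astype(np.int32))
    pixel_flags = (diff > second_threshold).astype(np.uint8)

    rows, cols = pixel_flags.shape
    block_flags = np.zeros((rows // block_h, cols // block_w), dtype=np.uint8)
    for br in range(block_flags.shape[0]):
        for bc in range(block_flags.shape[1]):
            block = pixel_flags[br * block_h:(br + 1) * block_h,
                                bc * block_w:(bc + 1) * block_w]
            if int(block.sum()) > third_threshold:
                block_flags[br, bc] = 1
    return block_flags

# The block maps of the top and bottom fields are then combined with OR
# (step S418) before deciding how each block of the current frame is interpolated.
```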
  • Step S418: OR the identifier of the motion state of each pixel block of the top field with the identifier of the motion state of the pixel block at the same spatial position in the bottom field.
  • The identifier of the motion state of the left pixel block of the top field is ORed with the identifier of the motion state of the left pixel block of the bottom field, and the result is 1; the identifier of the motion state of the right pixel block of the top field is ORed with the identifier of the motion state of the right pixel block of the bottom field, and the result is 1.
  • Step S420: determine the motion state of each pixel block of the current frame relative to the pixel block at the same spatial position in the adjacent frame.
  • The motion state of each pixel block of the current frame relative to the pixel block at the same spatial position in the adjacent frame is judged from the motion identifiers in the OR result of step S418.
  • If it is determined that the motion state of a pixel block of the current frame is motion, the process proceeds to step S422 and an intra-field interpolation operation is performed on that block of the current frame.
  • If the bottom field of the current frame is being interpolated, an intra-field interpolation operation is performed on all the interpolation lines in that block of the bottom field; if the top field of the current frame is being interpolated, an intra-field interpolation operation is performed on all the interpolation lines in that block of the top field.
  • If it is determined that the motion state of a pixel block of the current frame is still, the process proceeds to step S424 and an inter-field interpolation operation is performed on that block of the current frame.
  • If the bottom field of the current frame is being interpolated, inter-field interpolation is performed on all the interpolation lines in that block of the bottom field; if the top field of the current frame is being interpolated, inter-field interpolation is used for all the interpolation lines in that block of the top field.
  • Then step S426 is executed to output the processed current frame.
  • In order to further improve the clarity of images, Embodiment 5 of the present invention provides another deinterlacing method. FIG. 25 is a general flowchart of the deinterlacing method provided by Embodiment 5 of the present invention, which includes:
  • Step S500: receive video data.
  • Step S502: detect the motion state of the pixels of the top field of the current frame in the video data relative to the pixels at the same spatial position in the top field of the adjacent frame, and detect the motion state of the pixels of the bottom field of the current frame relative to the pixels at the same spatial position in the bottom field of the adjacent frame.
  • Step S504: determine whether the current frame is in global motion relative to the adjacent frame.
  • Step S504 further includes the following steps. Step C: identify the motion state of the pixels of the top field and the bottom field of the current frame. Step D: OR the identifier of the motion state of each top-field pixel with the identifier of the motion state of the pixel at the same spatial position in the bottom field. Step F: determine whether the current frame is in global motion relative to the adjacent frame by determining whether the proportion of motion identifiers in the OR result is greater than a fourth threshold value.
  • If it is determined that the current frame is in global motion, the process proceeds to step S508.
  • If it is determined that the current frame is not in global motion, the process proceeds to step S506, and it is further determined whether the motion state of at least one of the pixels at the same spatial position in the top field and the bottom field of the current frame is motion, so as to determine the motion state of each pixel of the current frame relative to the pixel at the same spatial position in the adjacent frame.
  • If it is determined that a pixel of the current frame is moving relative to the pixel of the adjacent frame, the process proceeds to step S508 and the intra-field interpolation operation is performed on the current frame; then step S512 is performed to output the processed current frame, and the process ends.
  • If it is determined that a pixel of the current frame is still relative to the pixel of the adjacent frame, the process proceeds to step S510 and the inter-field interpolation operation is performed on the current frame; then step S512 is performed, and the process ends. The global-motion test of step S504 is sketched below.
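A minimal sketch of this global-motion test follows, assuming the combined motion-identifier map from the OR step is available as a numpy array of 0/1 values and taking the fourth threshold value of 50% mentioned later in the text; the function name is introduced here for illustration only.

```python
import numpy as np

def is_global_motion(combined_flags, fourth_threshold=0.5):
    # Step S504 / step F: the current frame is treated as being in global
    # motion when the proportion of motion identifiers (1s) in the combined
    # OR result exceeds the fourth threshold (e.g. 50%).
    proportion = float(np.count_nonzero(combined_flags)) / combined_flags.size
    return proportion > fourth_threshold

# When this returns True, the whole frame is interpolated intra-field
# (step S508); otherwise the per-pixel (or per-block) judgment continues.
```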
  • Embodiment 6 of the present invention provides another deinterlacing method.
  • FIG. 26 is a detailed flowchart of another deinterlacing method according to Embodiment 6 of the present invention. The figure includes:
  • Steps S600 to S610 are the same as steps S300 to S310 in Embodiment 3 and are not described here again.
  • Step S612: determine whether the current frame is in global motion relative to the adjacent frame, by determining whether the proportion of motion identifiers in the OR result is greater than the fourth threshold value.
  • In this embodiment the fourth threshold value is 50%; other values may also be used, but at least 50%.
  • If it is determined that the current frame is in global motion, the process proceeds to step S614.
  • If it is determined that the current frame is not in global motion, the process proceeds to step S616, where the motion state of each pixel of the current frame relative to the pixel at the same spatial position in the adjacent frame continues to be determined.
  • If it is determined that the motion state of a pixel point of the current frame relative to the pixel point at the same spatial position in the adjacent frame is motion, step S614 is performed and an intra-field interpolation operation is performed on the pixel point.
  • When the bottom field of the current frame is interpolated, if a pixel point of an interpolation line of the bottom field is determined to be moving, an intra-field interpolation operation is performed on that pixel point (i.e., the interpolation point); when the top field of the current frame is interpolated, if a pixel point of an interpolation line of the top field is determined to be moving, an intra-field interpolation operation is performed on that pixel point (i.e., the interpolation point).
  • If it is determined that the motion state of a pixel point of the current frame relative to the pixel point at the same spatial position in the adjacent frame is still, the process proceeds to step S618 and an inter-field interpolation operation is performed on the pixel point.
  • When the bottom field of the current frame is interpolated, if a pixel point of an interpolation line of the bottom field is determined to be still, inter-field interpolation is used for that pixel point (i.e., the interpolation point); when the top field of the current frame is interpolated, if a pixel point of an interpolation line of the top field is determined not to be moving, an inter-field interpolation operation is performed on that pixel point (i.e., the interpolation point).
  • Then step S620 is performed to output the current frame after the operation.
  • Embodiment 7 of the present invention provides another deinterlacing method.
  • FIG. 27 is a detailed flowchart of another deinterlacing method according to Embodiment 7 of the present invention. The figure includes:
  • Steps S700 to S718 are the same as steps S400 to S418 in Embodiment 4 and are not described here again.
  • Step S720: determine whether the current frame is in global motion relative to the adjacent frame.
  • In the seventh embodiment, whether the current frame is in global motion relative to the adjacent frame is determined by determining whether the proportion of motion identifiers of the pixel blocks in the OR result is greater than the fourth threshold value.
  • In this embodiment the fourth threshold value is 50%; other values may also be used, but at least 50%.
  • If it is determined that the current frame is in global motion, the process proceeds to step S722.
  • If it is determined that the current frame is not in global motion, the process proceeds to step S724, where the motion state of each pixel block of the current frame relative to the pixel block at the same spatial position in the adjacent frame is determined from the motion identifiers in the OR result of the pixel blocks.
  • If the motion state of a pixel block is motion, step S722 is performed and an intra-field interpolation operation is performed on the block.
  • If the bottom field of the current frame is being interpolated, an intra-field interpolation operation is performed on all the interpolation lines in that block of the bottom field; if the top field of the current frame is being interpolated, an intra-field interpolation operation is performed on all the interpolation lines in that block of the top field.
  • If the motion state of a pixel block is still, the process proceeds to step S726, where an inter-field interpolation operation is employed.
  • If the bottom field of the current frame is being interpolated, inter-field interpolation is performed on all the interpolation lines in that block of the bottom field; if the top field of the current frame is being interpolated, inter-field interpolation is used for all the interpolation lines in that block of the top field.
  • Then step S728 is performed to output the current frame after the operation.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Graphics (AREA)
  • Television Systems (AREA)

Abstract

The embodiments of the present invention relate to a deinterlacing method, a deinterlacing device and a video processing system for video data, which specifically include: first, detecting the motion state of a pixel in the top field of the current frame in the video data relative to the pixel at the same spatial position in the top field of the adjacent frame, and detecting the motion state of a pixel in the bottom field of the current frame relative to the pixel at the same spatial position in the bottom field of the adjacent frame; second, determining the motion state of a pixel in the current frame relative to the pixel at the same spatial position in the adjacent frame by determining whether the motion state of at least one of the pixels at the same spatial position in the top field and the bottom field of the current frame is "moving"; third, performing an interpolation process on the pixels of the line to be interpolated according to the motion state of a pixel in the current frame relative to the pixel at the same spatial position in the adjacent frame. By implementing the above technical means, it is ensured that pixels in the current frame are not missed during detection, the accuracy of determining the motion state of the pixels in the current frame is improved, and the pixel error of the points to be interpolated is reduced, so that the sharpness of the images can be improved.
PCT/CN2009/071867 2008-05-20 2009-05-20 Procédé de désentrelacement, dispositif de désentrelacement et système de traitement vidéo pour des données vidéo WO2009140916A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN200810067381.X 2008-05-20
CN200810067381XA CN101588444B (zh) 2008-05-20 2008-05-20 视频数据的去隔行方法、去隔行装置及视频处理系统

Publications (1)

Publication Number Publication Date
WO2009140916A1 true WO2009140916A1 (fr) 2009-11-26

Family

ID=41339784

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2009/071867 WO2009140916A1 (fr) 2008-05-20 2009-05-20 Procédé de désentrelacement, dispositif de désentrelacement et système de traitement vidéo pour des données vidéo

Country Status (2)

Country Link
CN (1) CN101588444B (fr)
WO (1) WO2009140916A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102868870A (zh) * 2012-09-28 2013-01-09 许丹 去隔行处理方法

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104702877A (zh) * 2014-12-02 2015-06-10 深圳市云宙多媒体技术有限公司 一种视频去隔行方法和装置
CN105898179A (zh) * 2015-12-14 2016-08-24 乐视云计算有限公司 一种隔行视频的去隔行方法及装置
CN107306346B (zh) * 2016-04-18 2019-11-22 深圳市中兴微电子技术有限公司 图像数据处理方法及装置、播放器、电子设备
CN107666560B (zh) * 2016-07-28 2020-11-17 北京数码视讯科技股份有限公司 一种视频去隔行方法及装置
CN108810601B (zh) * 2017-05-04 2020-10-27 福州瑞芯微电子股份有限公司 运动字幕解交织方法、系统、移动终端及可读存储介质
CN113822866A (zh) * 2021-09-23 2021-12-21 深圳爱莫科技有限公司 一种广适应的车轴数量识别方法、系统、设备及存储介质

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1173776A (zh) * 1996-01-27 1998-02-18 三星电子株式会社 使用运动和空间相关的隔行向顺序转换装置和方法
US6188437B1 (en) * 1998-12-23 2001-02-13 Ati International Srl Deinterlacing technique
CN1477869A (zh) * 2002-07-26 2004-02-25 三星电子株式会社 去隔行装置及其方法
CN1512771A (zh) * 2002-12-30 2004-07-14 三星电子株式会社 视频信号去隔行的方法及装置

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100464580C (zh) * 2006-06-12 2009-02-25 北京中星微电子有限公司 一种用于数字视频处理的运动检测和解交错方法及其装置

Also Published As

Publication number Publication date
CN101588444B (zh) 2011-07-20
CN101588444A (zh) 2009-11-25

Similar Documents

Publication Publication Date Title
WO2009140916A1 (fr) Procédé de désentrelacement, dispositif de désentrelacement et système de traitement vidéo pour des données vidéo
US8199252B2 (en) Image-processing method and device
US6999128B2 (en) Stillness judging device and scanning line interpolating device having it
US9049336B2 (en) Auto-detect method for detecting image format and playback method applying the same
JP4869045B2 (ja) 補間フレーム作成方法および補間フレーム作成装置
CN100433791C (zh) 在静止区域的影片模式纠正
JP2005318621A (ja) ビデオシーケンスでのティッカー処理
US20050225671A1 (en) Method of Processing Fields of Images and Related Device for Data Lines Similarity Detection
JP3893227B2 (ja) 走査線補間装置、及び走査線補間方法
WO2004017634A1 (fr) Dispositif et procede de traitement d'image, afficheur video et dispositif de reproduction d'information enregistree
JP2005045803A (ja) 2:2プルダウンシーケンスの判別装置およびその方法
JP4510874B2 (ja) 合成映像検出装置
US8059920B2 (en) Method and apparatus for pixel interpolation
US20050163355A1 (en) Method and unit for estimating a motion vector of a group of pixels
US20120274845A1 (en) Image processing device and method, and program
US8254682B2 (en) Pattern detecting method and related image processing apparatus
US20060033839A1 (en) De-interlacing method
JP5241632B2 (ja) 画像処理回路および画像処理方法
JP2007129400A (ja) フィルムモード検出装置及び映像表示装置
JP2003179886A (ja) 画像処理装置および方法、記録媒体、並びにプログラム
US20050219411A1 (en) Method of Processing Video Fields and Related Fields Similarity Detection Apparatus
Tai et al. A motion and edge adaptive deinterlacing algorithm
TWI489873B (zh) 影像偵測裝置與方法
JP4929963B2 (ja) プルダウンシーケンス検出プログラムおよびプルダウンシーケンス検出装置
US10015513B2 (en) Image processing apparatus and image processing method thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09749459

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09749459

Country of ref document: EP

Kind code of ref document: A1