CN113365075A - Wired sending and receiving method and device of ultra-high-definition video applying light compression algorithm - Google Patents


Info

Publication number: CN113365075A
Application number: CN202110633993.6A
Authority: CN (China)
Prior art keywords: interface, definition video, integrated circuit, data packet, ultra
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Inventor: 高炳海
Current Assignee: Shenzhen Lenkeng Technology Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original Assignee: Shenzhen Lenkeng Technology Co Ltd
Priority date: (the priority date is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed)
Application filed by: Shenzhen Lenkeng Technology Co Ltd
Priority to: CN202110633993.6A
Publication of: CN113365075A

Classifications

    • H ELECTRICITY; H04 ELECTRIC COMMUNICATION TECHNIQUE; H04N PICTORIAL COMMUNICATION, e.g. TELEVISION; H04N 19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N 19/42: characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation
    • H04N 19/124: Quantisation (adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding)
    • H04N 19/182: adaptive coding characterised by the coding unit, the unit being a pixel
    • H04N 19/625: transform coding using discrete cosine transform [DCT]
    • H04N 19/63: transform coding using sub-band based transform, e.g. wavelets
    • H04N 19/91: Entropy coding, e.g. variable length coding [VLC] or arithmetic coding
    • H04N 7/00 Television systems; H04N 7/0125: Conversion of standards, one of the standards being a high definition standard
    • H04N 7/015: High-definition television systems
    • H04N 7/22: Adaptations for optical transmission

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Discrete Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

The application discloses a wired transmission and reception method and device for ultra-high-definition video applying a light compression algorithm. The wired transmission method comprises the following steps: the sending device acquires the ultra-high-definition video based on an input interface; the sending device encodes the ultra-high-definition video based on a light compression encoding algorithm to obtain code stream data; the sending device encapsulates the code stream data based on a communication protocol to obtain a data packet; the sending device sends the data packet to a first communication module whose transmission rate is not lower than a first threshold; the first communication module is used to send the data packet to the receiving device; the first communication module comprises an optical module or an electrical module, the electrical module comprising: a PHY chip and an RJ-45 interface. The sending device losslessly compresses the ultra-high-definition video with the light compression algorithm and then transmits it to the receiving device over an optical fiber or a network cable, so that a display device coupled to the receiving device can display the ultra-high-definition video in real time with ultra-low delay and no loss of image quality.

Description

Wired sending and receiving method and device of ultra-high-definition video applying light compression algorithm
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a method and an apparatus for wired transmission and reception of an ultra high definition video using a light compression algorithm.
Background
A large amount of redundant data exists in each frame of image data in ultra-high-definition video, which has driven the development of image data compression technology. Among transmission-oriented image compression algorithms, one with a good compression effect is the H.264 compression algorithm: it can compress images to a very small size, but it introduces high delay and is therefore not suitable for transmitting ultra-high-definition video over long distances through a cable.
Disclosure of Invention
In view of the existing problems and the defects of the prior art, the application provides a wired transmission and reception method and device for ultra-high-definition video applying a light compression algorithm; the ultra-high-definition video is losslessly compressed with a light compression algorithm and then transmitted over a network cable or an optical fiber, so that a display device coupled to the receiving device can display the ultra-high-definition video in real time with ultra-low delay and no loss of image quality.
In a first aspect, the present application provides a wired transmission method for ultra high definition video applying a light compression algorithm, where the wired transmission method includes:
the sending equipment acquires the ultra-high-definition video based on the input interface;
the sending equipment encodes the ultrahigh-definition video based on a light compression encoding algorithm to obtain code stream data;
the sending equipment encapsulates the code stream data based on a communication protocol to obtain a data packet;
the sending equipment sends the data packet to a first communication module of which the transmission rate is not lower than a first threshold value; the first communication module is used for sending the data packet; the first communication module includes: an optical or electrical module, the electrical module comprising: a PHY chip and an RJ-45 interface.
In a second aspect, the present application provides a cable receiving method for ultra high definition video applying a light compression algorithm, the cable receiving method comprising:
the receiving equipment receives the data packet through a second communication module with the transmission rate not lower than a second threshold value;
the receiving equipment de-encapsulates the data packet based on a communication protocol to obtain code stream data;
the receiving equipment decodes the code stream data based on a light compression decoding algorithm to obtain an ultra-high definition video; the second communication module includes: an optical or electrical module, the electrical module comprising: a PHY chip and an RJ-45 interface.
In a third aspect, the present application provides an ultra high definition video transmission device applying a light compression algorithm, including: a first memory and a first processor coupled to the first memory, the first memory being configured to store first application program instructions, the first processor being configured to invoke the first application program instructions to perform the wired transmission method of ultra high definition video applying the light compression algorithm according to the first aspect.
In a fourth aspect, the present application provides an ultra high definition video receiving device applying a light compression algorithm, the receiving device comprising: a second memory and a second processor coupled to the second memory, the second memory being configured to store second application program instructions, the second processor being configured to call the second application program instructions to perform the wired receiving method of ultra high definition video applying the light compression algorithm according to the second aspect.
The application provides a wired transmitting and receiving method and equipment of an ultra-high-definition video applying a light compression algorithm. The wired transmission method comprises the following steps: the sending equipment acquires the ultra-high-definition video based on the input interface; the sending equipment encodes the ultrahigh-definition video based on a light compression encoding algorithm to obtain code stream data; the sending equipment encapsulates the code stream data based on a communication protocol to obtain a data packet; the sending equipment sends the data packet to a first communication module of which the transmission rate is not lower than a first threshold value; the first communication module is used for sending the data packet; the first communication module includes: an optical or electrical module, the electrical module comprising: a PHY chip and an RJ-45 interface.
Compared with the prior art, the beneficial effects of the embodiment of the application are that: after the ultrahigh-definition video is subjected to video lossless compression by adopting a light compression algorithm, the ultrahigh-definition video is transmitted through an optical fiber or a network cable, and the ultrahigh-definition video can be displayed in real time with ultralow time delay and lossless image quality by a display device coupled with a receiving device.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a schematic flow chart of a wired transmission method of ultra high definition video applying a light compression algorithm according to the present application;
fig. 2 is a schematic diagram of a quantization process of wavelet transform coefficients provided in the present application;
FIG. 3 is a schematic diagram illustrating a process of zigzag scanning quantized data provided in the present application;
fig. 4 is a schematic flowchart of a wired receiving method of ultra high definition video applying a light compression algorithm according to the present application;
fig. 5 is a schematic structural diagram of an ultra high definition video transmitting device applying a light compression algorithm according to the present application;
fig. 6 is a schematic structural diagram of an ultra high definition video receiving device applying a light compression algorithm according to the present application.
Detailed Description
The technical solutions in the present application will be described clearly and completely with reference to the accompanying drawings in the present application, and it is obvious that the described embodiments are some, not all embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Referring to fig. 1, which is a schematic flow chart of a wired transmission method of ultra high definition video applying a light compression algorithm according to the present application, as shown in fig. 1,
s101, the sending equipment acquires the ultra-high-definition video based on the input interface.
In the embodiment of the present application, acquiring, by a sending device, an ultra high definition video based on an input interface includes:
the sending device acquires the ultra-high-definition video from a video source device (such as a DVD player, a set-top box, or a camera) based on an input interface; the input interface may include, but is not limited to: an HDMI (High-Definition Multimedia Interface) interface, a Type-C interface, a DP (DisplayPort) interface, a USB (Universal Serial Bus) interface, a MIPI (Mobile Industry Processor Interface) interface, a DVI (Digital Visual Interface) interface, or a VGA (Video Graphics Array) interface;
among others, the ultra-high-definition video may include, but is not limited to: ultra-high-definition video in YUV format or ultra-high-definition video in RGB format; the ultra-high-definition video may further have, but is not limited to, the following characteristics: a resolution of 1080P, 4K, or 8K; a frame rate of 30 FPS, 60 FPS, 100 FPS, or 120 FPS; and high dynamic range (HDR, High Dynamic Range imaging).
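To put the transmission-rate thresholds discussed later into perspective, the raw bit rate of the formats listed above can be estimated with a few lines of arithmetic. The sketch below assumes uncompressed frames with 3 color channels at 8 bits per component (RGB or YUV 4:4:4) and the common 1920 × 1080, 3840 × 2160, and 7680 × 4320 pixel counts; these are illustrative assumptions, not a statement about the exact pixel format used by the patent.

```python
# Rough raw bit-rate estimate for uncompressed ultra-high-definition video.
# Assumes 3 color channels at 8 bits each (RGB or YUV 4:4:4); real formats vary.
def raw_bitrate_gbps(width: int, height: int, fps: int, bits_per_pixel: int = 24) -> float:
    return width * height * fps * bits_per_pixel / 1e9

for name, (w, h) in {"1080P": (1920, 1080), "4K": (3840, 2160), "8K": (7680, 4320)}.items():
    for fps in (30, 60):
        print(f"{name}@{fps}FPS: {raw_bitrate_gbps(w, h, fps):.2f} Gbps")

# 4K@60FPS is roughly 11.9 Gbps uncompressed, far above a 1 Gbps or 2.5 Gbps link,
# which is why the code stream must be compressed before it reaches the
# first communication module.
```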
S102, the sending equipment encodes the ultra-high-definition video based on a light compression encoding algorithm to obtain code stream data.
In the embodiment of the present application, the sending device encodes the ultra-high-definition video based on the light compression encoding algorithm to obtain code stream data, which may include, but is not limited to, the following modes:
mode 1: the sending equipment encodes the ultra-high-definition video based on a wavelet transform coding algorithm through the first integrated circuit to obtain code stream data. In particular, the method comprises the following steps of,
step 1: the method comprises the steps that a sending device carries out wavelet transformation on an ultra-high-definition video through a first integrated circuit to obtain a wavelet transformation coefficient;
specifically, the sending device may, through the first integrated circuit, perform a discrete wavelet transform with 1-5 levels of horizontal decomposition and 2-3 levels of vertical decomposition on the pixel values of two lines of pixels of each frame of image in the ultra-high-definition video in RGB format or YUV format, based on one of the following wavelets, to obtain the wavelet transform coefficients. The wavelet may include, but is not limited to: Haar wavelets, Daubechies (dbN) wavelets, Mexican-hat (mexihat) wavelets, Morlet wavelets, or Meyer wavelets.
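As a minimal sketch of step 1, the snippet below performs a single-level 2-D Haar wavelet transform on a small block with NumPy. It illustrates the general technique only; the line-based processing and the exact numbers of horizontal and vertical decomposition levels described above are not reproduced, and the sample pixel values are hypothetical.

```python
import numpy as np

def haar_dwt2_level1(block: np.ndarray):
    """Single-level 2-D Haar DWT: returns (LL, LH, HL, HH) sub-bands."""
    x = block.astype(np.float64)
    # Transform rows: pairwise averages (low-pass) and differences (high-pass).
    lo = (x[:, 0::2] + x[:, 1::2]) / 2.0
    hi = (x[:, 0::2] - x[:, 1::2]) / 2.0
    # Transform columns of each intermediate result.
    ll = (lo[0::2, :] + lo[1::2, :]) / 2.0
    lh = (lo[0::2, :] - lo[1::2, :]) / 2.0
    hl = (hi[0::2, :] + hi[1::2, :]) / 2.0
    hh = (hi[0::2, :] - hi[1::2, :]) / 2.0
    return ll, lh, hl, hh

pixels = np.array([[52, 55, 61, 66],
                   [70, 61, 64, 73],
                   [63, 59, 55, 90],
                   [67, 61, 68, 104]])
ll, lh, hl, hh = haar_dwt2_level1(pixels)
print(ll)  # low-frequency approximation; lh/hl/hh carry the detail coefficients
```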
Step 2: the sending equipment quantizes the wavelet transform coefficients through the first integrated circuit to obtain quantized data;
the transmitting device quantizes the wavelet transform coefficients based on a target quantization step size by the first integrated circuit to obtain quantized data, wherein the transmitting device obtains the quantization step size according to a quantization formula.
The quantization process of the wavelet transform coefficients is briefly described below with reference to fig. 2.
As shown in fig. 2, the sending device quantizes the wavelet transform coefficients (e.g., the data in the left table of fig. 2) with a quantization step (here, a quantization step of 28) to obtain the quantized data (e.g., the data in the right table of fig. 2).
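A minimal sketch of step 2, assuming simple uniform quantization by division and rounding; the patent's quantization formula is not reproduced here, the step size of 28 is taken from the example above, and the sample coefficients are hypothetical.

```python
import numpy as np

def quantize(coeffs: np.ndarray, step: int = 28) -> np.ndarray:
    """Uniform quantization: each coefficient is divided by the step and rounded."""
    return np.rint(coeffs / step).astype(np.int32)

def dequantize(qdata: np.ndarray, step: int = 28) -> np.ndarray:
    """Inverse operation used on the receiving side to recover approximate coefficients."""
    return qdata * step

coeffs = np.array([[260, -15, 3, 0],
                   [-30, 8, -2, 1],
                   [5, -1, 0, 0],
                   [2, 0, 0, 0]])
q = quantize(coeffs)
print(q)              # small integers, many of them zero, easy to entropy-code
print(dequantize(q))  # approximation recovered on the receiving side
```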
Step 3: the sending device performs entropy coding on the quantized data through the first integrated circuit to obtain the code stream data.
Specifically, the sending device performs zigzag scanning on quantized data through the first integrated circuit to obtain a series of numbers, so that the quantized data is reduced from two dimensions to one dimension; then, the transmitting device entropy encodes the series of numbers through the first integrated circuit, and finally code stream data can be obtained.
In the embodiment of the present application, the first integrated circuit may include, but is not limited to: an FPGA chip, an ASIC chip, or an eASIC chip.
The process of zigzag scanning the quantized data is briefly described below with reference to fig. 3.
As shown in fig. 3, the sending device can scan the quantized data (e.g., the data in the left table in fig. 3) in zigzag scanning order into a character string consisting of a series of numbers.
It should be noted that when the character string obtained after zigzag scanning is: 9, 0, 0, 0, 0, -1, -1, 0, 0, 0, 0, 0, 0, 0, 0, 0, the character string output after entropy encoding can be: 0000101110000000000000001101010; expressed in hexadecimal, this is 0xB8000D followed by the trailing bits 010.
It should be noted that the space occupied by the encoded data is: 3 bytes (the leading zeros do not need to occupy space) + 3 bits = 3 × 8 + 3 = 27 bits.
It should be noted that, in conjunction with figs. 2-3, the data before encoding is a 4 × 4 pixel block; if each pixel occupies one byte of space, that is 16 bytes = 16 × 8 = 128 bits, whereas the encoded data occupies 3 bytes (the leading zeros do not need to occupy space) + 3 bits = 27 bits. Thus, the compression ratio is approximately 27/128 ≈ 0.21, i.e., about 21%.
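The zigzag scan can be sketched as follows. The visiting order below is the conventional zigzag order for a 4 × 4 block, and the input values are chosen so that the scan reproduces the character string used in the example above; they are not the actual data of fig. 3, which is not reproduced here.

```python
import numpy as np

# Conventional zigzag visiting order for a 4x4 block, as (row, column) pairs.
ZIGZAG_4X4 = [(0, 0), (0, 1), (1, 0), (2, 0),
              (1, 1), (0, 2), (0, 3), (1, 2),
              (2, 1), (3, 0), (3, 1), (2, 2),
              (1, 3), (2, 3), (3, 2), (3, 3)]

def zigzag_scan(block: np.ndarray) -> list:
    """Reduce a 4x4 block of quantized data from two dimensions to one."""
    return [int(block[r, c]) for r, c in ZIGZAG_4X4]

q = np.array([[9, 0, -1, -1],
              [0, 0,  0,  0],
              [0, 0,  0,  0],
              [0, 0,  0,  0]])
print(zigzag_scan(q))
# -> [9, 0, 0, 0, 0, -1, -1, 0, 0, 0, 0, 0, 0, 0, 0, 0]
# Low-frequency coefficients come first, so the long run of trailing zeros
# can be coded very compactly by the entropy coder.
```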
It should be noted that, the sending device entropy-encodes the quantized data by the first integrated circuit to obtain the code stream data, which may include, but is not limited to, the following approaches:
Route 1: the sending device encodes the quantized data through the first integrated circuit based on a run-length encoding algorithm to obtain the code stream data (a minimal sketch is given after this list);
route 2: the transmitting equipment encodes the quantized data through a first integrated circuit based on a Huffman coding algorithm to obtain code stream data;
Route 3: the sending device encodes the quantized data through the first integrated circuit based on a constant block coding algorithm for binary images to obtain the code stream data;
route 4: and the sending equipment encodes the quantized data through the first integrated circuit based on a quad-tree coding algorithm to obtain code stream data.
Route 5: and the sending equipment encodes the quantized data through a context-based adaptive variable length coding algorithm of the first integrated circuit to obtain code stream data.
Route 6: and the sending equipment encodes the quantized data through a context-based adaptive binary arithmetic coding algorithm of the first integrated circuit to obtain code stream data.
Specifically, the method for encoding the ultra-high-definition video by the sending device through the first integrated circuit based on the wavelet transform coding algorithm to obtain the code stream data may further include:
and the transmitting equipment encodes the ultra-high-definition video through the first integrated circuit based on a JPEG-XS encoding algorithm to obtain code stream data. In particular, the method comprises the following steps of,
step 1: the sending equipment performs upsampling on each frame of picture in the ultra-high-definition video through the first integrated circuit to obtain upsampled data.
Step 2: if each frame of picture in the ultra-high-definition video is in an RGB format, the sending equipment can convert the ultra-high-definition video in the RGB format into the ultra-high-definition video in the YUV format by adopting MCT conversion;
Step 3: the sending device performs discrete wavelet transform on the ultra-high-definition video in YUV format to obtain discrete wavelet transform coefficients;
Step 4: after the sending device performs pre-quantization processing on the discrete wavelet transform coefficients, it divides the pre-quantized discrete wavelet coefficients into a plurality of coding groups; the value range of the discrete wavelet coefficients of each coding group is given by formulas that appear only as images in the original filing, where g denotes the g-th coding group, M_g denotes the number of (non-zero) bit planes of the coding group, and x_i denotes the i-th coefficient in the coding group (a sketch of computing M_g is given after these steps);
Step 5: the sending device performs entropy coding (such as significance coding, MSB position coding, absolute-value coding, or sign-bit coding) on the quantized wavelet coefficients processed in step 4 to obtain the code stream data.
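Building on the definition above of M_g as the number of non-zero bit planes of a coding group, the sketch below shows one way such a count could be computed for coding groups of four coefficients. The group size of 4 and the bit-length rule are illustrative assumptions, not the normative definition of the patent or of the JPEG-XS standard.

```python
def bitplane_count(coeffs):
    """Number of bit planes needed for the largest magnitude in a coding group:
    M_g = 0 if all coefficients are zero, otherwise the bit length of max |x_i|."""
    m = max(abs(x) for x in coeffs)
    return m.bit_length()

def group_bitplanes(wavelet_coeffs, group_size=4):
    """Split coefficients into coding groups and compute M_g for each group."""
    groups = [wavelet_coeffs[i:i + group_size]
              for i in range(0, len(wavelet_coeffs), group_size)]
    return [(g, bitplane_count(g)) for g in groups]

coeffs = [5, -3, 0, 1, 0, 0, 0, 0, 12, -7, 2, 0]
for g, mg in group_bitplanes(coeffs):
    print(g, "-> M_g =", mg)
# A group of all-zero coefficients gets M_g = 0 and can be signalled very cheaply,
# which is what makes this grouping attractive for light compression.
```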
In particular, the method for encoding the ultra-high-definition video by the sending device based on the wavelet transform coding algorithm to obtain the code stream data may further include:
and the transmitting equipment encodes the ultra-high-definition video through the first integrated circuit based on a JPEG-LS encoding algorithm to obtain code stream data. In particular, the method comprises the following steps of,
step 1: the sending equipment acquires context parameters (such as gradient of a current pixel and nearby pixels) of the current pixel in each frame of picture in the ultra-high definition video through the first integrated circuit;
step 2: the sending device predicts according to the adjacent pixel value in the context template (the adjacent pixel of the current pixel) through the first integrated circuit to obtain the predicted value of the current pixel, and corrects the predicted value of the current pixel through the context parameter in the step 1;
Step 3: obtain a prediction error from the predicted value and the original pixel, and correct and encode the prediction error;
Step 4: update the relevant context parameters;
Step 5: perform Golomb coding on the prediction residual to obtain the code stream data.
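A hedged sketch of the prediction-and-residual-coding idea behind steps 2-5: the median edge detector (MED) predictor commonly associated with JPEG-LS, followed by a simple Golomb-Rice code of the mapped residual. The parameter k, the residual mapping, and the sample pixel values below are illustrative choices rather than the patent's exact coding rules.

```python
def med_predict(a: int, b: int, c: int) -> int:
    """MED predictor: a = left pixel, b = above pixel, c = above-left pixel."""
    if c >= max(a, b):
        return min(a, b)
    if c <= min(a, b):
        return max(a, b)
    return a + b - c

def golomb_rice(value: int, k: int = 2) -> str:
    """Golomb-Rice code of a non-negative integer: unary quotient + k-bit remainder."""
    q, r = value >> k, value & ((1 << k) - 1)
    return "1" * q + "0" + format(r, f"0{k}b")

# Encode one pixel: predict from neighbours, map the signed residual to a
# non-negative integer, then Golomb-Rice code it.
a, b, c, current = 100, 104, 101, 103
pred = med_predict(a, b, c)
residual = current - pred
mapped = 2 * residual if residual >= 0 else -2 * residual - 1
print(pred, residual, golomb_rice(mapped))
```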
Mode 2: the sending equipment encodes the ultra-high-definition video based on a short-time Fourier transform encoding algorithm through the first integrated circuit to obtain code stream data.
Mode 3: the sending equipment encodes the ultra-high-definition video based on a Fourier transform encoding algorithm through the first integrated circuit to obtain code stream data.
Mode 4: the sending device encodes the ultra-high-definition video based on a Discrete Cosine Transform (DCT) encoding algorithm through the first integrated circuit to obtain code stream data.
Specifically, the method for encoding the ultra high definition video by the first integrated circuit of the sending device based on the discrete cosine transform coding algorithm to obtain the code stream data may include:
and the first integrated circuit of the sending equipment encodes the ultra-high-definition video based on a VDC-M encoding algorithm to obtain code stream data. In particular, the method comprises the following steps of,
the VDC-M encoding algorithm may include, but is not limited to, the following encoding processes:
step 1: the sending equipment detects the flatness of the ultra-high definition video;
step 2: performing discrete cosine transform on the ultra-high definition video, and determining a prediction mode;
Step 3: entropy-code the transform coefficients to obtain the code stream data.
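A minimal sketch of the discrete cosine transform used in Mode 4 and in the VDC-M-style flow above, implemented as an orthonormal DCT-II on a small block with NumPy; the 4 × 4 block size and the sample values are assumptions made for illustration.

```python
import numpy as np

def dct_matrix(n: int) -> np.ndarray:
    """Orthonormal DCT-II basis matrix C, so Y = C @ X @ C.T is the 2-D DCT."""
    c = np.zeros((n, n))
    for k in range(n):
        for i in range(n):
            c[k, i] = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * i + 1) * k / (2 * n))
    c[0, :] /= np.sqrt(2.0)
    return c

block = np.array([[52, 55, 61, 66],
                  [70, 61, 64, 73],
                  [63, 59, 55, 90],
                  [67, 61, 68, 104]], dtype=np.float64)
C = dct_matrix(4)
coeffs = C @ block @ C.T            # energy concentrates in the top-left (DC) corner
recovered = C.T @ coeffs @ C        # inverse DCT, used on the receiving side
print(np.round(coeffs, 1))
print(np.allclose(recovered, block))  # True: the transform itself is lossless
```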
S103, the sending equipment encapsulates the code stream data based on a communication protocol to obtain a data packet.
In this embodiment of the present application, the sending device encapsulates the code stream data based on the communication protocol to obtain the data packet, which may include but is not limited to the following manners:
mode 1: the sending equipment encapsulates the code stream data based on a User Datagram Protocol (UDP) communication Protocol through a first integrated circuit to obtain a UDP data packet; in particular, the method comprises the following steps of,
the sending device, through the first integrated circuit and based on the UDP protocol, adds a UDP packet header before the code stream data and a UDP packet trailer after it, to obtain a UDP data packet comprising the code stream data, the UDP protocol header, and the UDP protocol trailer. The UDP header or the UDP trailer may include control information such as a destination address, a source address, a port number, and flag bits.
It should be noted that the sending device may further encapsulate, by using the first integrated circuit, the code stream data and the acquired control instruction based on the UDP protocol, to obtain a UDP data packet.
It should be noted that the sending device may obtain the control instruction from the control device through an IR receiving head, an RS232 interface, a USB interface, or a UART interface. Among them, the USB interface may include but is not limited to: USB3.0, USB2.0, USB3.1 or Type-C interface.
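A hedged sketch of the framing described in Mode 1: a header carrying a destination address, a source address, a port number, and flag bits is packed around the code stream data, with the matching de-encapsulation used on the receiving side. The field widths, byte order, and trailer layout are illustrative assumptions rather than the patent's actual packet format, and in the device itself the first integrated circuit performs this packaging in hardware.

```python
import struct

# Assumed header layout: 4-byte destination address, 4-byte source address,
# 2-byte port number, 2-byte flags, 4-byte payload length (big-endian).
HEADER_FMT = ">4s4sHHI"
TRAILER_FMT = ">I"  # assumed trailer: a simple 4-byte checksum

def encapsulate(stream: bytes, dst: bytes, src: bytes, port: int, flags: int) -> bytes:
    header = struct.pack(HEADER_FMT, dst, src, port, flags, len(stream))
    trailer = struct.pack(TRAILER_FMT, sum(stream) & 0xFFFFFFFF)
    return header + stream + trailer

def decapsulate(packet: bytes) -> bytes:
    hdr_size = struct.calcsize(HEADER_FMT)
    dst, src, port, flags, length = struct.unpack(HEADER_FMT, packet[:hdr_size])
    stream = packet[hdr_size:hdr_size + length]
    (checksum,) = struct.unpack(TRAILER_FMT, packet[hdr_size + length:])
    assert checksum == sum(stream) & 0xFFFFFFFF, "corrupted packet"
    return stream

code_stream = b"\xb8\x00\x0d\x40"  # the 27-bit example above, padded to whole bytes
pkt = encapsulate(code_stream, b"\xc0\xa8\x01\x02", b"\xc0\xa8\x01\x01", 5004, 0)
assert decapsulate(pkt) == code_stream
```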
Mode 2: the sending equipment encapsulates the code stream data based on a TCP (Transmission Control Protocol) communication Protocol through a first integrated circuit to obtain a TCP data packet;
it should be noted that the sending device may further encapsulate, by using the first integrated circuit, the code stream data and the acquired control instruction based on the TCP protocol, to obtain a TCP data packet.
Mode 3: the sending device encapsulates the code stream data through the first integrated circuit based on a custom communication protocol to obtain a custom data packet.
Wherein the custom protocol is a simple protocol designed to keep data encoding in the sending device and data decoding in the receiving device synchronized.
It should be noted that the sending device may further encapsulate, through the first integrated circuit, the code stream data and the acquired control instruction based on the custom protocol, to obtain a custom data packet.
S104, the sending equipment sends the data packet to a first communication module with the transmission rate not lower than a first threshold value.
In the embodiment of the present application, the first threshold may include, but is not limited to: 100Mbps, 1000Mbps, 1Gbps, 2.5 Gbps.
In this embodiment, the sending device sending the data packet to the first communication module whose transmission rate is not lower than the first threshold may include:
mode 1:
when the first communication module includes: when the light module is used as the light module,
after the optical module of the sending device receives the data packet from the communication timing interface of the MAC unit in the first integrated circuit, it converts the data packet into an optical signal and sends the optical signal to the receiving device, or,
after the optical module of the sending device receives the data packet from the communication timing interface of the MAC unit in the first integrated circuit, it converts the data packet into an optical signal and sends the optical signal over an optical fiber to a switch, and the switch forwards the optical signal to the receiving device;
the data packet in the embodiment of the present application includes: UDP packets, TCP packets, or custom packets; the communication time sequence interface in the embodiment of the application comprises: an XFI interface, an MII interface, a GMII interface, an SGMII interface, an RGMII interface, an XGMII interface, a Serdes interface, an XAUI interface, or an RXAUI interface.
The above-mentioned optical module may include, but is not limited to: a single-fiber bidirectional optical module (which may specifically include a single-mode optical module for long-distance transmission or a multi-mode optical module for short-distance transmission).
It should be noted that, when the receiving apparatus includes: a first receiving device and a second receiving device;
the method comprises the steps that after receiving a data packet from a communication time sequence interface of an MAC unit in a first integrated circuit, a sending device converts the data packet into an optical signal and sends the optical signal to a first receiving device and a second receiving device; or,
the sending device converts the data packet into an optical signal after receiving the data packet from a communication time sequence interface of the MAC unit in the first integrated circuit, and sends the optical signal to the switch based on the optical fiber, and the switch is used for forwarding the optical signal to the first receiving device and the second receiving device.
Mode 2:
when the first communication module includes: an electrical module, the electrical module comprising: PHY chip and RJ-45 interface;
after outputting the data packet to the PHY chip (ethernet physical interface transceiver) through the communication timing interface of the MAC unit in the first integrated circuit, the transmitting device modulates the data packet by the PHY chip and outputs the modulated data packet to the RJ-45 interface, transmits the modulated data packet to the receiving device through the RJ-45 interface and based on the network cable, or,
after the transmitting device outputs the data packet to the PHY chip through the communication time sequence interface of the MAC unit in the first integrated circuit, the data packet is output to the RJ-45 interface after being modulated through the PHY chip, and the data packet is transmitted to the switch through the RJ-45 interface and based on the network cable, and the switch is used for forwarding the data packet to the receiving device.
Such network cables may include, but are not limited to: CAT5, CAT5E, CAT6, CAT6E, CAT7, and the like.
It should be noted that, when the receiving apparatus includes: a first receiving device and a second receiving device;
after outputting the data packet to the PHY chip through the communication timing interface of the MAC unit in the first integrated circuit, the transmitting device modulates the data packet and outputs the modulated data packet to the RJ-45 interface, and transmits the modulated data packet to the first receiving device and the second receiving device through the RJ-45 interface, or,
after the transmitting device outputs the data packet to the PHY chip through a communication timing interface of the MAC unit in the first integrated circuit, the PHY chip modulates the data packet and outputs the modulated data packet to the RJ-45 interface, and the modulated data packet is transmitted to the switch through the RJ-45 interface and based on a network cable, and the switch is configured to forward the data packet to the first receiving device and the second receiving device.
It should be noted that the sending device may further receive, through the first communication module and based on a network cable or an optical fiber, a preset data packet sent by the receiving device, and obtain a preset control instruction after decapsulating the preset data packet, where the preset control instruction is used to control a video source device connected to the sending device (for example, to turn the video source device on or off); the sending device may send the preset control instruction to the video source device coupled to the sending device through an infrared transmitting head.
It should be noted that fig. 2-3 are only used for explaining the embodiments of the present application and should not limit the present application.
The embodiment of the application provides a wired transmission method of an ultra-high-definition video by applying a light compression algorithm, wherein a transmission device performs video lossless compression on the ultra-high-definition video by adopting the light compression algorithm and then transmits the ultra-high-definition video to a receiving device through a network cable or an optical fiber, so that the ultra-low-delay and image-quality lossless real-time display of the ultra-high-definition video by a display device coupled with the receiving device can be realized.
Referring to fig. 4, which is a schematic flowchart of a cable receiving method for ultra high definition video applying a light compression algorithm according to the present application, as shown in fig. 4, the cable receiving method may include at least the following steps:
s401, the receiving device receives the data packet through the second communication module with the transmission rate not lower than the second threshold value.
In the embodiment of the present application, the second threshold may include, but is not limited to: 100Mbps, 1000Mbps, 1Gbps, 2.5 Gbps.
It should be noted that the second threshold in the present application may be equal to the first threshold.
In this embodiment of the application, the receiving device receives the data packet through the second communication module whose transmission rate is not lower than the second threshold, which may include but is not limited to the following manners:
mode 1:
when the second communication module is an optical module and the transmission rate of the optical module is not lower than a second threshold value;
the receiving equipment receives the optical signal sent by the sending equipment through the optical module and the optical fiber, and converts the optical signal into a data packet, and the data packet is output to the second integrated circuit through a communication time sequence interface of an MAC unit in the second integrated circuit of the receiving equipment; the second integrated circuit is used for performing processing operations such as decapsulation and decoding on the data packet; the second integrated circuit in the embodiment of the present application may include, but is not limited to: an FPGA chip, an ASIC chip, or an eASIC chip. Or,
the receiving device receives the optical signal forwarded by the switch through the optical module and by combining with the optical fiber, and converts the optical signal into a data packet, and the data packet is output to the second integrated circuit through a communication time sequence interface of the MAC unit in the second integrated circuit.
The data packet in the embodiment of the present application includes: UDP packets, TCP packets, or custom packets; the optical module in the embodiment of the present application may include: a single-fiber bidirectional optical module (which may specifically include a single-mode optical module for long-distance transmission or a multi-mode optical module for short-distance transmission).
Mode 2:
when the second communication module is an electrical module, the transmission rate of the electrical module is not lower than a second threshold value; an electrical module comprising: a PHY chip and an RJ-45 interface;
after receiving the data packet sent by the sending device through the network cable and the RJ-45 interface, the receiving device outputs the data packet to a second integrated circuit through a PHY chip and by combining a communication time sequence interface of an MAC unit in the second integrated circuit, and the second integrated circuit is used for processing the data packet; or,
and after the receiving equipment receives the data packet forwarded by the switch through the network cable and the RJ-45 interface, the data packet is output to the second integrated circuit through the PHY chip and the communication time sequence interface of the MAC unit in the second integrated circuit.
Wherein, the communication timing interface includes: an XFI interface, a GMII interface, an SGMII interface, an RGMII interface, an XGMII interface, a Serdes interface, an XAUI interface, or an RXAUI interface.
S402, the receiving device de-encapsulates the data packet based on a communication protocol to obtain code stream data.
In the embodiment of the application, the receiving device may obtain the data packet from the second communication module through the communication timing interface of the MAC unit in the second integrated circuit;
the receiving device decapsulates the data packet based on the communication protocol to obtain code stream data, which may include but is not limited to the following manners:
mode 1: the receiving equipment decapsulates the UDP data packet based on the UDP communication protocol through the second integrated circuit to obtain code stream data; in particular, the method comprises the following steps of,
the receiving device can remove the UDP data packet head and the UDP data packet tail from the UDP data packet respectively through the second integrated circuit based on the UDP protocol to obtain the code stream data.
It should be noted that, the receiving device decapsulates the UDP data packet based on the UDP protocol by using the second integrated circuit, and may obtain a control instruction in addition to the code stream data, where the control instruction is used to control a display device connected to the receiving device.
Mode 2: the receiving equipment de-encapsulates the TCP data packet through a second integrated circuit based on a TCP communication protocol to obtain code stream data;
it should be noted that, the receiving device may decapsulate the TCP data packet based on the TCP protocol by using the second integrated circuit to obtain the code stream data, and may further obtain a control instruction, where the control instruction is used to control a display device connected to the receiving device.
Mode 3: and the receiving equipment de-encapsulates the custom data packet through the second integrated circuit based on the custom communication protocol to obtain code stream data.
It should be noted that the receiving device decapsulates the custom data packet based on the custom communication protocol through the second integrated circuit to obtain the code stream data, and may also obtain a control instruction, where the control instruction is used to control a display device connected to the receiving device.
And S403, decoding the code stream data by the receiving equipment based on a light compression decoding algorithm to obtain the ultra-high definition video.
In the embodiment of the present application, the receiving device decodes the code stream data based on the light compression decoding algorithm to obtain the ultra high definition video, which may include but is not limited to the following modes:
mode 1: the receiving equipment decodes the code stream data based on a decoding algorithm of wavelet inverse transformation through a second integrated circuit to obtain an ultra-high definition video; in particular, the method comprises the following steps of,
the receiving device can perform entropy decoding (such as variable length entropy decoding and binary arithmetic decoding) on the code stream data through the second integrated circuit, perform inverse quantization operation on the entropy-decoded data to recover wavelet transform coefficients, perform inverse wavelet transform on the recovered wavelet transform coefficients, and further recover the ultra-high definition video. More specifically, the present invention is to provide a novel,
the receiving equipment decodes the code stream data through a second integrated circuit based on a JPEG-XS decoding algorithm to obtain an ultra-high definition video; wherein, the ultra-high definition video comprises: ultra high definition video in YUV format, or ultra high definition video in RGB format. Or,
and the receiving equipment decodes the code stream data through the second integrated circuit based on a JPEG-LS decoding algorithm to obtain the ultra-high definition video.
Mode 2: the receiving equipment decodes the code stream data based on a decoding algorithm of short-time inverse Fourier transform through a second integrated circuit to obtain an ultra-high definition video; in particular, the method comprises the following steps of,
the receiving device can perform entropy decoding on the code stream data through the second integrated circuit, then perform an inverse quantization operation on the entropy-decoded data to recover the short-time Fourier transform coefficients, perform an inverse short-time Fourier transform on the recovered coefficients, and thereby recover the ultra-high-definition video.
Mode 3: the receiving device decodes the code stream data through the second integrated circuit based on an inverse Fourier transform decoding algorithm to obtain the ultra-high-definition video.
Mode 4: the receiving device decodes the code stream data through the second integrated circuit based on an inverse discrete cosine transform decoding algorithm to obtain the ultra-high-definition video. More specifically,
the receiving device decodes the code stream data through the second integrated circuit based on a VDC-M decoding algorithm to obtain the ultra-high-definition video.
It should be noted that the receiving device may further obtain a preset control instruction from the control device through an IR receiving head, an RS232 interface, a USB interface, or a UART interface integrated inside, encapsulate the preset control instruction into a preset data packet, and send the preset data packet to the sending device or the switch through the second communication module in the receiving device, where the preset control instruction may be used to control a video source device connected to the sending device (e.g., start or shut down the video source device).
The application provides a sending device of ultra-high definition video applying a light compression algorithm, which can be used for realizing the wired sending method described in the embodiment of fig. 1. The sending device shown in fig. 5 may be used to execute the description in the embodiment of fig. 1.
As shown in fig. 5, the transmitting device 50 may include, but is not limited to: a first memory 501, a first processor 502 coupled to the first memory 501, and a first communication module 503 coupled to the first processor 502.
A first memory 501, operable to: store first application program instructions;
a first processor 502 operable to: the first application program instruction stored in the first memory 501 is called to implement the wired transmission method of the ultra high definition video applying the light compression algorithm described in fig. 1.
The first communication module 503 may be configured to transmit a data packet in the wired transmission method of the ultra high definition video applying the light compression algorithm, which is described in fig. 1.
A first processor 502 operable to:
acquiring an ultra-high-definition video based on an input interface;
coding the ultra-high-definition video based on a light compression coding algorithm to obtain code stream data;
packaging the code stream data based on a communication protocol to obtain a data packet;
the data packet is sent to the first communication module 503 whose transmission rate is not lower than the first threshold.
The first processor 502 may be specifically configured to:
acquiring the ultra-high-definition video from video source equipment based on an input interface; the input interface includes: an HDMI interface, a Type-C interface, a DP interface, a USB interface, an MIPI interface, a DVI interface or a VGA interface;
the first processor 502 may be specifically configured to:
mode 1: and coding the ultra-high-definition video through a first integrated circuit based on a wavelet transform coding algorithm to obtain code stream data. The first integrated circuit in the embodiments of the present application may include, but is not limited to: an FPGA chip, an ASIC chip, or an eASIC chip. In particular, the method comprises the following steps of,
performing wavelet transformation on the ultra-high definition video through a first integrated circuit to obtain a wavelet transformation coefficient;
quantizing the wavelet transform coefficients through a first integrated circuit to obtain quantized data;
and entropy coding the quantized data through the first integrated circuit to obtain code stream data.
More specifically, the present invention is to provide a novel,
coding the ultra-high-definition video through a first integrated circuit based on a JPEG-XS coding algorithm to obtain code stream data; the ultra-high definition video comprises: ultra high definition video in YUV format, or ultra high definition video in RGB format.
And coding the ultra-high-definition video through the first integrated circuit based on a JPEG-LS coding algorithm to obtain code stream data.
Mode 2: and coding the ultra-high-definition video through a first integrated circuit based on a short-time Fourier transform coding algorithm to obtain code stream data.
Mode 3: and coding the ultra-high-definition video through a first integrated circuit based on a coding algorithm of Fourier transform to obtain code stream data.
Mode 4: and coding the ultra-high-definition video through a first integrated circuit based on a discrete cosine transform coding algorithm to obtain code stream data.
And coding the ultra-high definition video based on a VDC-M coding algorithm to obtain code stream data.
The first processor 502 may be further specifically configured to:
packaging the code stream data based on a UDP communication protocol through a first integrated circuit to obtain a UDP data packet;
packaging the code stream data based on a TCP communication protocol through a first integrated circuit to obtain a TCP data packet; or,
and packaging the code stream data through the first integrated circuit based on a user-defined communication protocol to obtain a user-defined data packet.
When the first communication module 503 is an optical module, the transmission rate of the optical module is not lower than the first threshold;
an optical module operable to:
after receiving the data packet from the communication timing interface of the MAC unit, the data packet is converted into an optical signal and the optical signal is transmitted to the receiving device, or,
after receiving the data packet from the communication time sequence interface of the MAC unit, converting the data packet into an optical signal and sending the optical signal to a switch, wherein the switch is used for forwarding the optical signal to a receiving device;
wherein the data packet includes: UDP packets, TCP packets, or custom packets; the communication timing interface comprises: an XFI interface, an MII interface, a GMII interface, an SGMII interface, an RGMII interface, an XGMII interface, a Serdes interface, an XAUI interface, or an RXAUI interface.
It should be noted that, when the receiving apparatus includes: a first receiving device and a second receiving device;
the light module may also be used to:
after receiving the data packet from the communication timing interface of the MAC unit, the MAC unit converts the data packet into an optical signal and transmits the optical signal to the first receiving device and the second receiving device, or,
and after receiving the data packet from the communication time sequence interface of the MAC unit, converting the data packet into an optical signal and sending the optical signal to the switch, wherein the switch is used for forwarding the optical signal to the first receiving device and the second receiving device.
When the first communication module 503 includes: when the electric module is to be electrically operated,
the transmission rate of the electrical module is not lower than the first threshold, the electrical module comprising: a PHY chip and an RJ-45 interface;
the electrical module is for:
after the data packet is output to the PHY chip through the communication timing interface of the MAC unit, the data packet is output to the RJ-45 interface through the PHY chip, transmitted to the receiving device through the RJ-45 interface, or,
after the data packet is output to the PHY chip through the communication time sequence interface of the MAC unit, the data packet is output to the RJ-45 interface through the PHY chip and is sent to the switch through the RJ-45 interface, and the switch is used for forwarding the data packet to the receiving equipment.
When the reception apparatus includes: a first receiving device and a second receiving device;
the electrical module may also be used to:
after outputting the data packet to the PHY chip through the communication timing interface of the MAC unit, the data packet is output to the RJ-45 interface through the PHY chip, transmitted to the first receiving device and the second receiving device through the RJ-45 interface, or,
after the data packet is output to the PHY chip through the communication time sequence interface of the MAC unit, the data packet is output to the RJ-45 interface through the PHY chip and is sent to the switch through the RJ-45 interface, and the switch is used for forwarding the data packet to the first receiving device and the second receiving device.
The first communication module 503 is further configured to receive a preset data packet sent by the receiving device; the first processor 502 may be further configured to decapsulate the preset data packet to obtain a preset control instruction. In addition to the first memory 501, the first processor 502, and the first communication module 503, the sending device 50 may further include: an infrared transmitting head, configured to send the preset control instruction to the video source device coupled to the sending device 50 so as to control the video source device.
It should be understood that the sending device 50 is only one example provided by the embodiments of the present application, and that the sending device 50 may have more or less components than those shown, may combine two or more components, or may have a different configuration implementation of the components.
It can be understood that, regarding the specific implementation of the functional components included in the sending device 50 of fig. 5, reference may be made to the embodiment of fig. 1, and details are not repeated here.
The application provides an ultra-high-definition video receiving device applying a light compression algorithm, which can be used to implement the wired receiving method described in the embodiment of fig. 4. The receiving device shown in fig. 6 may be used to execute the description in the embodiment of fig. 4.
As shown in fig. 6, receiving device 60 may include, but is not limited to: a second memory 601, a second processor 602 coupled to the second memory 601, and a second communication module 603 coupled to the second processor 602.
A second memory 601, operable to: store second application program instructions;
a second processor 602 operable to: and calling a second application program instruction stored in the second memory 601 to implement the wired receiving method of the ultra-high definition video applying the light compression algorithm described in fig. 4.
The second communication module 603 may be configured to receive a data packet in the wired receiving method for the ultra-high definition video applying the light compression algorithm, described in fig. 4.
When the second communication module 603 is an optical module; the transmission rate of the optical module is not lower than a second threshold;
an optical module operable to:
receiving the optical signal transmitted by the transmitting device and converting the optical signal into a data packet, the data packet being output to the second integrated circuit through the communication timing interface of the MAC unit in the second integrated circuit of the receiving device 60; the second integrated circuit is used for processing the data packet; the second integrated circuit in the embodiments of the present application may include, but is not limited to: an FPGA chip, an ASIC chip, or an eASIC chip. Or,
receiving the optical signal forwarded by the switch and converting the optical signal into a data packet, the data packet being output to the second integrated circuit through the communication timing interface of the MAC unit in the second integrated circuit of the receiving device 60;
wherein, the data packet includes: UDP packets, TCP packets, or custom packets; the communication timing interface comprises: an XFI interface, a GMII interface, an SGMII interface, an RGMII interface, an XGMII interface, a Serdes interface, an XAUI interface, or an RXAUI interface.
When the second communication module 603 is an electrical module, the transmission rate of the electrical module is not lower than a second threshold; an electrical module comprising: a PHY chip and an RJ-45 interface;
the electrical module may be used to:
after receiving the data packet sent by the sending device through the RJ-45 interface, the data packet is output to the second integrated circuit through the PHY chip in combination with the communication timing interface of the MAC unit in the second integrated circuit of the receiving device 60, the second integrated circuit being configured to process the data packet; or,
after receiving the data packet forwarded by the switch through the RJ-45 interface, the data packet is output to the second integrated circuit through the PHY chip in conjunction with the communication timing interface of the MAC unit in the second integrated circuit.
The second processor 602 may be specifically configured to:
decapsulating the UDP data packet through the second integrated circuit based on the UDP communication protocol to obtain code stream data; or,
decapsulating the TCP data packet through the second integrated circuit based on the TCP communication protocol to obtain the code stream data; or,
decapsulating the custom data packet through the second integrated circuit based on the custom communication protocol to obtain the code stream data.
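To make the de-encapsulation step concrete, the sketch below shows how a receiver might strip a small custom header from each UDP datagram and concatenate the payloads into code stream data. The header layout (a 4-byte frame index plus a 2-byte slice index), the port, and the packet count are illustrative assumptions; the application does not specify its custom protocol here.

```python
import socket
import struct

# Hypothetical header layout: 4-byte frame index + 2-byte slice index (big-endian).
# The actual custom protocol of this application is not specified in the text.
HEADER = struct.Struct(">IH")

def receive_code_stream(port: int = 5004, packets: int = 16) -> bytes:
    """Receive UDP datagrams and de-encapsulate them into code stream data."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    stream = bytearray()
    for _ in range(packets):
        datagram, _addr = sock.recvfrom(65535)
        frame_idx, slice_idx = HEADER.unpack_from(datagram, 0)
        payload = datagram[HEADER.size:]          # code stream fragment
        stream.extend(payload)
    sock.close()
    return bytes(stream)
```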
The second processor 602 may be specifically configured to:
Mode 1: decoding the code stream data through the second integrated circuit based on an inverse wavelet transform decoding algorithm to obtain the ultra-high-definition video; in particular,
decoding the code stream data through the second integrated circuit based on a JPEG-XS decoding algorithm to obtain the ultra-high-definition video, wherein the ultra-high-definition video comprises: ultra-high-definition video in YUV format, or ultra-high-definition video in RGB format; or,
decoding the code stream data through the second integrated circuit based on a JPEG-LS decoding algorithm to obtain the ultra-high-definition video.
Mode 2: decoding the code stream data through the second integrated circuit based on a short-time inverse Fourier transform decoding algorithm to obtain the ultra-high-definition video; or,
Mode 3: decoding the code stream data through the second integrated circuit based on an inverse Fourier transform decoding algorithm to obtain the ultra-high-definition video, wherein the ultra-high-definition video comprises: ultra-high-definition video in YUV format, or ultra-high-definition video in RGB format.
Mode 4: decoding the code stream data through the second integrated circuit based on an inverse discrete cosine transform decoding algorithm to obtain the ultra-high-definition video; more specifically,
decoding the code stream data through the second integrated circuit based on a VDC-M decoding algorithm to obtain the ultra-high-definition video.
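As a simplified picture of Mode 1, the sketch below reconstructs an image tile from one level of dequantized Haar subbands. It is a toy stand-in only: JPEG-XS and JPEG-LS use different filter banks, multi-level decompositions, and dedicated entropy coders, and the quantization step shown is an assumption.

```python
import numpy as np

def dequantize(q: np.ndarray, step: float = 8.0) -> np.ndarray:
    """Undo the sender's uniform quantization (the step size is assumed)."""
    return q.astype(np.float64) * step

def inverse_haar_2d(ll, lh, hl, hh):
    """One-level 2-D inverse Haar synthesis: a toy stand-in for the
    inverse wavelet transform of Mode 1, not the JPEG-XS codec itself."""
    h, w = ll.shape
    tile = np.empty((2 * h, 2 * w), dtype=np.float64)
    tile[0::2, 0::2] = (ll + lh + hl + hh) / 2.0   # top-left pixels
    tile[0::2, 1::2] = (ll - lh + hl - hh) / 2.0   # top-right pixels
    tile[1::2, 0::2] = (ll + lh - hl - hh) / 2.0   # bottom-left pixels
    tile[1::2, 1::2] = (ll - lh - hl + hh) / 2.0   # bottom-right pixels
    return tile

# Example: rebuild an 8x8 luma tile from four 4x4 quantized subbands.
subbands = [np.zeros((4, 4), dtype=np.int32) for _ in range(4)]
tile = inverse_haar_2d(*(dequantize(b) for b in subbands))
```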
It should be noted that the receiving device 60 may also obtain the preset control instruction from the control device through an integrated IR receiver, an RS232 interface, a USB interface, or a UART interface.
The second processor 602 is further configured to encapsulate the preset control instruction into a preset data packet;
the second communication module 603 may be further configured to send the preset data packet to the sending device or the switch.
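By way of illustration only, the sketch below packs a preset control instruction (for example, an IR key code captured by the receiver) into a small datagram and sends it toward the sending device or switch. The 2-byte opcode layout and the port number are assumptions, not details given in this application.

```python
import socket
import struct

def send_control_instruction(opcode: int, host: str, port: int = 5005) -> None:
    """Encapsulate a preset control instruction into a preset data packet
    and send it to the sending device or switch (layout and port assumed)."""
    packet = struct.pack(">H", opcode)        # hypothetical 2-byte opcode
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(packet, (host, port))

# Example: forward IR key code 0x0041 back to a sender at 192.168.1.10.
send_control_instruction(0x0041, "192.168.1.10")
```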
It should be understood that the receiving device 60 is only one example provided by the embodiments of the present application, and the receiving device 60 may have more or fewer components than those shown, may combine two or more components, or may have a different configuration or arrangement of the components.
It can be understood that, regarding the specific implementation of the functional components included in the receiving device 60 of fig. 6, reference may be made to the embodiment of fig. 4, which is not described herein again.
Those of ordinary skill in the art will appreciate that the various illustrative modules and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or a combination of both. To clearly illustrate this interchangeability of hardware and software, the various illustrative components and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described apparatuses, devices or modules may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus, device or method may be implemented in other ways.
The above-described embodiments of the apparatus and device are merely illustrative, and for example, the division of the modules is only one logical division, and other divisions may be realized in practice, for example, a plurality of modules or components may be combined or integrated into another apparatus, or some features may be omitted or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices, apparatuses or modules, and may also be an electrical, mechanical or other form of connection.
The modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical modules, may be located in one place, or may be distributed on a plurality of network modules. Some or all of the modules can be selected according to actual needs to achieve the purpose of the solution of the embodiments of the present application.
In addition, functional modules in the embodiments of the present application may be integrated into one processing module, or each of the modules may exist alone physically, or two or more modules are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode.
The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the part of the technical solution of the present application that essentially contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
While the invention has been described with reference to specific embodiments, the scope of the invention is not limited thereto, and those skilled in the art can easily conceive various equivalent modifications or substitutions within the technical scope of the invention. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (21)

1. A wired transmission method of ultra high definition video applying a light compression algorithm is characterized by comprising the following steps:
the sending equipment acquires the ultra-high-definition video based on the input interface;
the sending equipment encodes the ultrahigh-definition video based on a light compression encoding algorithm to obtain code stream data;
the sending equipment encapsulates the code stream data based on a communication protocol to obtain a data packet;
the sending equipment sends the data packet to a first communication module of which the transmission rate is not lower than a first threshold value; the first communication module is used for sending the data packet; the first communication module includes: an optical or electrical module, the electrical module comprising: a PHY chip and an RJ-45 interface.
2. The wired transmission method of ultra high definition video using a light compression algorithm according to claim 1,
when the first communication module is an optical module, the transmission rate of the optical module is not lower than the first threshold;
the optical module is used for:
after receiving a data packet from a communication timing interface of the MAC unit, the data packet is converted into an optical signal and the optical signal is transmitted to a receiving device, or,
after receiving the data packet from the communication timing interface of the MAC unit, converting the data packet into an optical signal, and sending the optical signal to a switch, where the switch is configured to forward the optical signal to the receiving device;
wherein the data packet includes: UDP packets, TCP packets, or custom packets; the communication timing interface comprises: an XFI interface, an MII interface, a GMII interface, an SGMII interface, an RGMII interface, an XGMII interface, a Serdes interface, an XAUI interface, or an RXAUI interface.
3. The wired transmission method of ultra high definition video using a light compression algorithm according to claim 1,
wherein when the first communication module includes the electrical module,
the transmission rate of the electrical module is not lower than the first threshold, and the electrical module comprises: a PHY chip and an RJ-45 interface;
the electrical module is configured to:
after outputting the data packet to the PHY chip through a communication timing interface of the MAC unit, outputting the data packet to the RJ-45 interface through the PHY chip, and transmitting the data packet to a receiving device through the RJ-45 interface, or,
after the data packet is output to the PHY chip through a communication timing interface of the MAC unit, the data packet is output to the RJ-45 interface through the PHY chip, and is sent to a switch through the RJ-45 interface, where the switch is configured to forward the data packet to the receiving device;
wherein the communication timing interface comprises: an XFI interface, an MII interface, a GMII interface, an SGMII interface, an RGMII interface, an XGMII interface, a Serdes interface, an XAUI interface or an RXAUI interface; and the data packet includes: UDP packets, TCP packets, or custom packets.
4. The wired transmission method of ultra high definition video applying a light compression algorithm according to claim 2,
the receiving apparatus includes: a first receiving device and a second receiving device;
the optical module is used for:
converting a data packet into an optical signal after receiving the data packet from a communication timing interface of the MAC unit, and transmitting the optical signal to the first receiving device and the second receiving device, or,
and after receiving a data packet from a communication timing interface of the MAC unit, converting the data packet into an optical signal, and sending the optical signal to a switch, where the switch is configured to forward the optical signal to the first receiving device and the second receiving device.
5. The wired transmission method of ultra high definition video using a light compression algorithm according to claim 3,
the receiving apparatus includes: a first receiving device and a second receiving device;
the electrical module is configured to:
after outputting the data packet to the PHY chip through a communication timing interface of the MAC unit, outputting the data packet to the RJ-45 interface through the PHY chip, sending the data packet to the first receiving device and the second receiving device through the RJ-45 interface, or,
after the data packet is output to the PHY chip through the communication timing interface of the MAC unit, the data packet is output to the RJ-45 interface through the PHY chip, and is sent to a switch through the RJ-45 interface, where the switch is configured to forward the data packet to the first receiving device and the second receiving device.
6. The wired transmission method of ultra high definition video using a light compression algorithm according to claim 1,
the method for acquiring the ultra-high-definition video by the sending equipment based on the input interface comprises the following steps:
the sending equipment acquires the ultra-high-definition video from video source equipment based on an input interface; the input interface includes: an HDMI interface, a Type-C interface, a DP interface, a USB interface, an MIPI interface, a DVI interface or a VGA interface;
wherein the ultra high definition video comprises: ultra high definition video in YUV format, or ultra high definition video in RGB format.
7. The wired transmission method of ultra high definition video using a light compression algorithm according to claim 1,
the method for encoding the ultra-high-definition video by the sending device based on the light compression encoding algorithm to obtain code stream data comprises the following steps:
the sending equipment encodes the ultra-high-definition video based on a wavelet transform coding algorithm through a first integrated circuit to obtain code stream data; or,
the sending equipment encodes the ultra-high-definition video based on a short-time Fourier transform encoding algorithm through a first integrated circuit to obtain code stream data; or,
the sending equipment encodes the ultra-high-definition video based on a discrete cosine transform coding algorithm through the first integrated circuit to obtain the code stream data;
wherein the first integrated circuit comprises: an FPGA chip or an ASIC chip; the ultra-high definition video comprises: ultra high definition video in YUV format, or ultra high definition video in RGB format.
8. The wired transmission method of ultra high definition video applying the light compression algorithm according to claim 7,
wherein the method of the sending device encoding the ultra-high-definition video through the first integrated circuit based on the wavelet transform coding algorithm to obtain the code stream data comprises the following steps:
the sending device encodes the ultra-high-definition video through the first integrated circuit based on a JPEG-XS encoding algorithm to obtain the code stream data; or,
the sending device encodes the ultra-high-definition video through the first integrated circuit based on a JPEG-LS encoding algorithm to obtain the code stream data; or,
the method of the sending device encoding the ultra-high-definition video through the first integrated circuit based on the discrete cosine transform coding algorithm to obtain the code stream data comprises the following steps:
the sending device encodes the ultra-high-definition video through the first integrated circuit based on a VDC-M encoding algorithm to obtain the code stream data.
9. The wired transmission method of ultra high definition video applying the light compression algorithm according to claim 7,
wherein the method of the sending device encoding the ultra-high-definition video through the first integrated circuit based on the wavelet transform coding algorithm to obtain the code stream data comprises the following steps:
the sending equipment performs wavelet transformation on the ultrahigh-definition video through a first integrated circuit to obtain a wavelet transformation coefficient;
the sending equipment quantizes the wavelet transform coefficients through a first integrated circuit to obtain quantized data;
and the sending equipment carries out entropy coding on the quantized data through a first integrated circuit to obtain code stream data.
10. The wired transmission method of ultra high definition video applying the light compression algorithm according to claim 7,
wherein the method of the sending device encoding the ultra-high-definition video through the first integrated circuit based on the short-time Fourier transform coding algorithm to obtain the code stream data comprises the following steps:
the sending equipment carries out short-time Fourier transform on the ultra-high-definition video through a first integrated circuit to obtain a short-time discrete Fourier coefficient;
the sending equipment quantizes the short-time discrete Fourier coefficients through a first integrated circuit to obtain quantized data;
and the sending equipment carries out entropy coding on the quantized data through a first integrated circuit to obtain code stream data.
11. The wired transmission method of ultra high definition video applying a light compression algorithm according to claim 9,
wherein the method of the sending device performing wavelet transform on the ultra-high-definition video through the first integrated circuit to obtain the wavelet transform coefficient comprises the following steps:
and the sending equipment performs wavelet transformation of horizontal 1-5 layer decomposition and vertical 2-3 layer decomposition on the ultra-high definition video through a first integrated circuit to obtain a wavelet transformation coefficient.
12. The wired transmission method of ultra high definition video applying a light compression algorithm according to claim 9,
wherein the method of the sending device entropy-encoding the quantized data through the first integrated circuit to obtain the code stream data comprises the following steps:
the sending equipment encodes the quantized data through the first integrated circuit based on a run length encoding algorithm to obtain the code stream data; or,
the sending equipment encodes the quantized data through the first integrated circuit based on a Huffman coding algorithm to obtain the code stream data; or,
the sending equipment encodes the quantized data through the first integrated circuit based on a constant block encoding algorithm of a binary image to obtain code stream data; or,
and the sending equipment encodes the quantized data through the first integrated circuit based on a quadtree coding algorithm to obtain the code stream data.
13. The wired transmission method of ultra high definition video using a light compression algorithm according to claim 1,
the method for encapsulating the code stream data by the sending equipment based on a communication protocol to obtain a data packet comprises the following steps:
the sending equipment encapsulates the code stream data based on a UDP communication protocol through a first integrated circuit to obtain a UDP data packet; or,
the sending equipment encapsulates the code stream data based on a TCP communication protocol through the first integrated circuit to obtain a TCP data packet; or,
and the sending equipment encapsulates the code stream data based on a user-defined communication protocol through the first integrated circuit to obtain a user-defined data packet.
14. A wired receiving method of ultra high definition video applying a light compression algorithm is characterized by comprising the following steps:
the receiving equipment receives the data packet through a second communication module with the transmission rate not lower than a second threshold value;
the receiving equipment de-encapsulates the data packet based on a communication protocol to obtain code stream data;
the receiving equipment decodes the code stream data based on a light compression decoding algorithm to obtain an ultra-high definition video; wherein the second communication module comprises: an optical or electrical module, the electrical module comprising: a PHY chip and an RJ-45 interface.
15. The wired receiving method of ultra high definition video applying a light compression algorithm as claimed in claim 14,
wherein, when the second communication module is an optical module, the transmission rate of the optical module is not lower than the second threshold;
the optical module is used for:
receiving an optical signal transmitted by a transmitting device, and converting the optical signal into a data packet, wherein the data packet is output to a second integrated circuit through a communication time sequence interface of an MAC unit in the second integrated circuit; the second integrated circuit is used for processing the data packet; or,
receiving an optical signal forwarded by a switch, and converting the optical signal into a data packet, wherein the data packet is output to the second integrated circuit through a communication time sequence interface of an MAC unit in the second integrated circuit;
wherein the data packet includes: UDP packets, TCP packets, or custom packets; and the communication timing interface comprises: an XFI interface, a GMII interface, an SGMII interface, an RGMII interface, an XGMII interface, a Serdes interface, an XAUI interface, or an RXAUI interface.
16. The wired receiving method of ultra high definition video applying a light compression algorithm as claimed in claim 14,
when the second communication module is an electrical module, the transmission rate of the electrical module is not lower than the second threshold; the electrical module includes: a PHY chip and an RJ-45 interface;
the electrical module is configured to:
after receiving a data packet sent by a sending device through the RJ-45 interface, outputting the data packet to the second integrated circuit through the PHY chip and combining with a communication timing interface of an MAC unit in the second integrated circuit, wherein the second integrated circuit is used for processing the data packet; or,
after receiving the data packet forwarded by the switch through the RJ-45 interface, outputting the data packet to the second integrated circuit through the PHY chip in combination with a communication timing interface of an MAC unit in the second integrated circuit;
wherein the data packet includes: UDP packets, TCP packets, or custom packets; and the communication timing interface comprises: an XFI interface, a GMII interface, an SGMII interface, an RGMII interface, an XGMII interface, a Serdes interface, an XAUI interface, or an RXAUI interface.
17. The wired receiving method of ultra high definition video applying a light compression algorithm as claimed in claim 14,
the receiving device decapsulates the data packet based on a communication protocol to obtain code stream data, including:
the receiving equipment decapsulates the UDP data packet based on the UDP communication protocol through the second integrated circuit to obtain code stream data;
or,
the receiving equipment decapsulates the TCP data packet based on a TCP communication protocol through the second integrated circuit to obtain the code stream data;
or,
and the receiving equipment decapsulates the custom data packet based on a custom communication protocol through the second integrated circuit to obtain the code stream data.
18. The wired receiving method of ultra high definition video applying a light compression algorithm as claimed in claim 14,
the receiving device decodes the code stream data based on a light compression decoding algorithm to obtain the ultra-high definition video, and the method comprises the following steps:
the receiving equipment decodes the code stream data based on a decoding algorithm of wavelet inverse transformation through a second integrated circuit to obtain the ultra-high definition video; or,
the receiving equipment decodes the code stream data through the second integrated circuit based on a decoding algorithm of short-time inverse Fourier transform to obtain the ultra-high-definition video; or,
the receiving equipment decodes the code stream data based on a decoding algorithm of inverse discrete cosine transform through the second integrated circuit to obtain the ultra-high definition video;
wherein the ultra high definition video comprises: ultra high definition video in YUV format, or ultra high definition video in RGB format.
19. The wired receiving method of ultra high definition video applying a light compression algorithm according to claim 18,
the receiving device decodes the code stream data based on a decoding algorithm of wavelet inverse transformation through a second integrated circuit to obtain the ultra-high definition video, and the method comprises the following steps:
the receiving equipment decodes the code stream data based on a JPEG-XS decoding algorithm through the second integrated circuit to obtain an ultra-high definition video; or,
the receiving equipment decodes the code stream data based on a JPEG-LS decoding algorithm through the second integrated circuit to obtain the ultra-high definition video; or,
the receiving device decodes the code stream data based on a decoding algorithm of inverse discrete cosine transform through the second integrated circuit to obtain the ultra-high definition video, and the method comprises the following steps:
the receiving equipment decodes the code stream data based on a VDC-M decoding algorithm through the second integrated circuit to obtain the ultra-high definition video;
wherein the ultra high definition video comprises: ultra high definition video in YUV format, or ultra high definition video in RGB format.
20. An apparatus for transmitting ultra high definition video using a light compression algorithm, comprising:
a first memory for storing first application program instructions and a first processor coupled to the first memory, the first processor being configured to invoke the first application program instructions to perform the wired transmission method of ultra high definition video applying a light compression algorithm according to any one of claims 1 to 13.
21. An ultra high definition video receiving apparatus applying a light compression algorithm, comprising:
a second memory for storing second application program instructions and a second processor coupled to the second memory, the second processor being configured to invoke the second application program instructions to perform the wired receiving method of ultra high definition video applying a light compression algorithm according to any one of claims 14 to 19.
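For a concrete picture of the encoding pipeline recited in claims 9 and 12 — wavelet transform, then quantization, then entropy coding — the following sketch chains a one-level Haar analysis, uniform quantization, and a simple run-length coder. The filter, the quantization step, and the run-length format are illustrative assumptions; the claims themselves cover JPEG-XS, JPEG-LS and VDC-M style codecs and richer entropy coders (Huffman, constant-block, quadtree) rather than this toy.

```python
import numpy as np

def haar_2d(tile: np.ndarray):
    """One-level 2-D Haar analysis (toy stand-in for the claimed wavelet transform)."""
    a = tile[0::2, 0::2].astype(np.float64)
    b = tile[0::2, 1::2].astype(np.float64)
    c = tile[1::2, 0::2].astype(np.float64)
    d = tile[1::2, 1::2].astype(np.float64)
    ll = (a + b + c + d) / 2.0   # approximation subband
    lh = (a - b + c - d) / 2.0   # horizontal detail
    hl = (a + b - c - d) / 2.0   # vertical detail
    hh = (a - b - c + d) / 2.0   # diagonal detail
    return ll, lh, hl, hh

def quantize(coeffs: np.ndarray, step: float = 8.0) -> np.ndarray:
    """Uniform quantization of wavelet coefficients (step size is assumed)."""
    return np.round(coeffs / step).astype(np.int32)

def run_length_encode(symbols: np.ndarray):
    """Minimal run-length entropy coder producing (value, run) pairs."""
    flat = symbols.ravel()
    out, run, prev = [], 1, int(flat[0])
    for s in map(int, flat[1:]):
        if s == prev:
            run += 1
        else:
            out.append((prev, run))
            prev, run = s, 1
    out.append((prev, run))
    return out

# Example: encode one 8x8 tile of a luma plane into a toy code stream.
tile = np.arange(64, dtype=np.uint8).reshape(8, 8)
ll, lh, hl, hh = haar_2d(tile)
code_stream = [run_length_encode(quantize(band)) for band in (ll, lh, hl, hh)]
```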
CN202110633993.6A 2021-06-04 2021-06-04 Wired sending and receiving method and device of ultra-high-definition video applying light compression algorithm Pending CN113365075A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110633993.6A CN113365075A (en) 2021-06-04 2021-06-04 Wired sending and receiving method and device of ultra-high-definition video applying light compression algorithm

Publications (1)

Publication Number Publication Date
CN113365075A true CN113365075A (en) 2021-09-07

Family

ID=77533027

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110633993.6A Pending CN113365075A (en) 2021-06-04 2021-06-04 Wired sending and receiving method and device of ultra-high-definition video applying light compression algorithm

Country Status (1)

Country Link
CN (1) CN113365075A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111669588A (en) * 2020-05-27 2020-09-15 西安电子科技大学 Ultra-high definition video compression coding and decoding method with ultra-low time delay
CN112565823A (en) * 2020-12-09 2021-03-26 深圳市朗强科技有限公司 Method and equipment for sending and receiving high-definition video data
CN112887305A (en) * 2021-01-25 2021-06-01 深圳市朗强科技有限公司 Audio data sending and receiving method and equipment

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113742003A (en) * 2021-09-15 2021-12-03 深圳市朗强科技有限公司 Program code execution method and device based on FPGA chip
CN113784140A (en) * 2021-09-15 2021-12-10 深圳市朗强科技有限公司 Mathematical lossless coding method and device
CN113742003B (en) * 2021-09-15 2023-08-22 深圳市朗强科技有限公司 Program code execution method and device based on FPGA chip
CN113784140B (en) * 2021-09-15 2023-11-07 深圳市朗强科技有限公司 Mathematical lossless coding method and device
CN116132712A (en) * 2023-02-08 2023-05-16 北京镁伽机器人科技有限公司 Data transmission method, data sending device and data receiving device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20210907