CN111263216A - Video transmission method, device, storage medium and terminal - Google Patents

Video transmission method, device, storage medium and terminal

Info

Publication number
CN111263216A
Authority
CN
China
Prior art keywords
transmitted
video frame
video
source
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010095189.2A
Other languages
Chinese (zh)
Other versions
CN111263216B (en)
Inventor
黄树伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Wanrou Automotive Electronics Co ltd
Original Assignee
Tcl Mobile Communication Technology Ningbo Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tcl Mobile Communication Technology Ningbo Co Ltd filed Critical Tcl Mobile Communication Technology Ningbo Co Ltd
Priority to CN202010095189.2A priority Critical patent/CN111263216B/en
Publication of CN111263216A publication Critical patent/CN111263216A/en
Application granted granted Critical
Publication of CN111263216B publication Critical patent/CN111263216B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/4363Adapting the video or multiplex stream to a specific local network, e.g. a IEEE 1394 or Bluetooth® network
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/4402Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/440263Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by altering the spatial resolution, e.g. for displaying on a connected PDA
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/45Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N21/458Scheduling content for creating a personalised stream, e.g. by combining a locally stored advertisement with an incoming stream; Updating operations, e.g. for OS modules ; time-related management operations
    • H04N21/4586Content update operation triggered locally, e.g. by comparing the version of software modules in a DVB carousel to the version stored locally

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Databases & Information Systems (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The embodiment of the application discloses a video transmission method, a video transmission device, a storage medium and a terminal. The video transmission method comprises the following steps: receiving a video transmission instruction, and acquiring a video frame to be transmitted according to the video transmission instruction; determining a source video frame corresponding to the video frame to be transmitted from a source video file; acquiring source image data of the source video frame; updating the video frame to be transmitted based on the source image data, and taking the updated video frame to be transmitted as a target video frame; and sending the target video frame to the opposite terminal. In the embodiment of the application, each video frame of the video file is captured and its data collected, and the video frame data lost by the video file is supplemented, so that the transmitted video picture becomes clearer; at the same time, the storage space occupied by the video on the mobile terminal is reduced, and the video transmission efficiency can be effectively improved.

Description

Video transmission method, device, storage medium and terminal
Technical Field
The application relates to the field of mobile terminal application, in particular to a video transmission method, a video transmission device, a storage medium and a terminal.
Background
With the continuous development of terminal technology and image transmission technology, people can use various intelligent terminals to watch video data more and more conveniently, and the requirements on video and image quality keep rising. In the related art, an intelligent mobile terminal simplifies the transmitted video picture data, so the video picture is not clear and the user experience is greatly reduced; if, instead, the video pictures are transmitted at full clarity, the required storage increases significantly and the transmission speed is greatly affected.
Disclosure of Invention
The embodiment of the application provides a video transmission method, a video transmission device, a storage medium and a terminal, which can effectively improve the efficiency of video transmission.
The embodiment of the application provides a video transmission method, which comprises the following steps:
receiving a video transmission instruction, and acquiring a video frame to be transmitted according to the video transmission instruction;
determining a source video frame corresponding to the video frame to be transmitted from a source video file;
acquiring source image data of the source video frame;
updating the video frame to be transmitted based on the source image data, and taking the updated video frame to be transmitted as a target video frame;
and sending the target video frame to the opposite terminal.
Correspondingly, an embodiment of the present application further provides a video transmission apparatus, including:
the receiving unit is used for receiving a video transmission instruction and acquiring a video frame to be transmitted according to the video transmission instruction;
the first determining unit is used for determining a source video frame corresponding to the video frame to be transmitted from a source video file;
the first acquisition unit is used for acquiring source image data of the source video frame;
the updating unit is used for updating the video frame to be transmitted based on the source image data and taking the updated video frame to be transmitted as a target video frame;
and the sending unit is used for sending the target video frame to the opposite terminal.
Accordingly, the present application further provides a storage medium, where the storage medium stores a plurality of instructions, and the instructions are suitable for being loaded by a processor to perform the steps in the video transmission method as described above.
Correspondingly, an embodiment of the present application further provides a terminal, which includes a processor and a memory, where the memory stores a plurality of instructions, and the processor loads the instructions to execute the steps in the video transmission method described above.
In the embodiment of the application, each video frame of the video file is captured and its data collected, and the video frame data lost by the video file is supplemented, so that the transmitted video picture becomes clearer; at the same time, the storage space occupied by the video on the mobile terminal is reduced, and the video transmission efficiency can be effectively improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a flowchart illustrating a first video transmission method according to an embodiment of the present application.
Fig. 2 is a flowchart illustrating a second video transmission method according to an embodiment of the present application.
Fig. 3 is a block diagram of a first video transmission apparatus according to an embodiment of the present disclosure.
Fig. 4 is a block diagram of a second video transmission apparatus according to an embodiment of the present application.
Fig. 5 is a schematic structural diagram of a terminal according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Based on the above problems, embodiments of the present application provide a video transmission method, an apparatus, a storage medium, and a terminal, which can effectively improve video transmission efficiency. The following are detailed below. It should be noted that the following description of the embodiments is not intended to limit the preferred order of the embodiments.
Referring to fig. 1, fig. 1 is a schematic flowchart illustrating a video transmission method according to an embodiment of the present disclosure. The video transmission method may be applied to mobile terminals such as mobile phones, tablet computers, notebook computers, palmtop computers, Portable Media Players (PMPs), and fixed terminals such as desktop computers. The specific flow of the video transmission method can be as follows:
101. and receiving a video transmission instruction, and acquiring a video frame to be transmitted according to the video transmission instruction.
Specifically, after the video transmission instruction is received, whether video transmission is needed can be judged from the user's operation on the current terminal, and the user can trigger the terminal to carry out video transmission in various ways.
For example, when it is detected that a user performs a video-related operation during the running of the terminal application, it may be determined that video transmission is required. The terminal application may be software installed on the terminal, and the software may perform video transmission.
For another example, when a wired connection with another terminal is detected, for instance through a data transmission cable, a video file can be transmitted over that connection.
Specifically, after receiving a video transmission instruction triggered by a user, a video file, that is, a video file that the user needs to transmit, may be determined according to the video transmission instruction. After the video file is determined, compression processing may be performed on the video file, where the compression processing may refer to performing a deletion operation on source video data of the video file, so as to save transmission time.
Specifically, the video frame to be transmitted is a video frame obtained from the compressed video file; in other words, the video frame to be transmitted is the result of compressing a video frame of the source file.
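The description does not fix how the deletion-based compression is performed. As a minimal sketch only, assuming OpenCV is available and treating "compression" as downscaling every frame (one way of deleting source image data), the video file to be transmitted could be produced as follows; the function name and the mp4v codec choice are illustrative assumptions.

```python
# Minimal sketch (assumption): "compression" is illustrated as spatial downscaling
# of every frame with OpenCV; the description only says that a deletion operation
# is performed on the source video data.
import cv2

def compress_for_transmission(src_path: str, dst_path: str, scale: float = 0.5) -> None:
    cap = cv2.VideoCapture(src_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    ok, frame = cap.read()
    if not ok:
        raise ValueError("could not read source video: " + src_path)
    h, w = frame.shape[:2]
    size = (int(w * scale), int(h * scale))
    writer = cv2.VideoWriter(dst_path, cv2.VideoWriter_fourcc(*"mp4v"), fps, size)
    while ok:
        writer.write(cv2.resize(frame, size))   # smaller frames mean less data to transmit
        ok, frame = cap.read()
    cap.release()
    writer.release()
```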
102. And determining a source video frame corresponding to the video frame to be transmitted from the source video file.
Specifically, a source video frame corresponding to the video frame to be transmitted is determined from the source video file. The source video file is the video file that the user selected for transmission on the terminal, and the source video frame is obtained from the source video file. The source video frame and the video frame to be transmitted show the same display picture; that is, the source video frame is the image that has not undergone compression processing, while the video frame to be transmitted is the image obtained by compressing the source video frame. The difference between the two may lie in the size of the image data, because image data may be lost after a video frame is compressed.
For example, the obtained video frame to be transmitted may be a first frame image displayed in a playing mode of a video file to be transmitted, and the obtained source video frame may be a first frame image displayed in a playing mode of a source video file.
In some embodiments, the step "determining a source video frame corresponding to the video frame to be transmitted from a source video file" may include the following procedures:
in the playing process of a source video file, collecting an image display picture according to a preset frame rate, and taking the image display picture as a source video frame.
Specifically, in the playing process of the source video file, the image display pictures are collected according to a preset frame rate, where the preset frame rate may be the frame rate at which the video file is played. For example, the frame rate may be 30 FPS (frames per second), which means that 30 video frame pictures are displayed per second.
The image display pictures are collected according to this preset frame rate: a plurality of image display pictures are collected, and a target display picture is determined from them. For example, if the preset frame rate is 30 FPS, the 30 video frames that can be collected in the first second of source video playback are determined, the frame with the highest image definition among those 30 frames can be chosen as the target display picture, and the target display picture can be taken as the source video frame.
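The description leaves the definition metric open. A minimal sketch, assuming OpenCV and using the variance of the Laplacian as a stand-in for image definition, of collecting one second's worth of frames and keeping the sharpest one as the source video frame:

```python
# Sketch (assumption): image definition is estimated with the variance of the
# Laplacian; the description only requires choosing the frame with the highest definition.
import cv2

def sharpness(frame) -> float:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var()

def pick_source_frame(video_path: str, preset_frame_rate: int = 30):
    """Collect `preset_frame_rate` display pictures and return the sharpest one."""
    cap = cv2.VideoCapture(video_path)
    best_frame, best_score = None, -1.0
    for _ in range(preset_frame_rate):          # e.g. the 30 frames of the first second
        ok, frame = cap.read()
        if not ok:
            break
        score = sharpness(frame)
        if score > best_score:
            best_frame, best_score = frame, score
    cap.release()
    return best_frame                           # used as the source video frame
```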
103. And acquiring source image data of the source video frame.
Specifically, source image data of a source video frame is obtained, the image data may be a numerical representation of image information, the image information may include a plurality of pixel points, and color, brightness, gray scale, and the like of each pixel point, and then the image data may include color value, brightness value, gray scale, and the like of each pixel point.
A color value may be defined by a hexadecimal symbol composed of red, green and blue (RGB) values, where the minimum of each color component is 0 (hexadecimal #00) and the maximum is 255 (hexadecimal #FF); for example, pure white may be #FFFFFF. The luminance value of a pixel lies between 0 and 255: pixels close to 255 are bright, pixels close to 0 are dark, and the remainder belong to the middle tones, around 128; this distinction in luminance is an absolute one. The gray value refers to the degree of shade of a color, and a gray histogram counts, for each gray value in a digital image, the number of pixels having that gray value. In a black-and-white image the R, G and B values are equal and are called the gray value; each pixel has a gray value, and for an 8-bit gray image the gray value range is 0 to 255.
For example, the acquired source image data may include: the color value of the first pixel is #FF0000, its luminance value may be 100 and its gray value may be 100; the color value of the second pixel is #FF0001, its luminance value may be 50 and its gray value may be 50; the color value of the third pixel is #000001, its luminance value may be 30 and its gray value may be 40; and so on.
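As an illustration of how such per-pixel source image data might be represented numerically, a small pure-Python sketch follows; the BT.601 luma weights used for the brightness value are an assumption, since the description only fixes the 0-255 ranges.

```python
# Sketch: one possible numerical representation of per-pixel source image data.
# The BT.601 luma weights are an assumption; the description only states the 0-255 ranges.
def pixel_data(r: int, g: int, b: int) -> dict:
    color_value = "#{:02X}{:02X}{:02X}".format(r, g, b)    # e.g. pure white -> "#FFFFFF"
    brightness = round(0.299 * r + 0.587 * g + 0.114 * b)  # 0 (dark) .. 255 (bright)
    gray = brightness                                      # one common convention for the gray value
    return {"color": color_value, "brightness": brightness, "gray": gray}

print(pixel_data(255, 0, 0))   # {'color': '#FF0000', 'brightness': 76, 'gray': 76}
```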
104. And updating the video frame to be transmitted based on the source image data, and taking the updated video frame to be transmitted as a target video frame.
Specifically, after the source image data is acquired, the image data to be transmitted that is included in the video frame to be transmitted may be updated based on the source image data. In this way the image data to be transmitted is supplemented, the image definition of the video frame to be transmitted is improved, and the playing effect of the video file is kept consistent before and after transmission.
In some embodiments, the source image data may include source image sub-data, and the source image sub-data may be divided according to a preset rule, for example, the source video frame may be divided into a plurality of regions, and the image information of each region may be one source image sub-data. Then, the step "updating the video frame to be transmitted based on the source image data" may comprise the following procedure:
acquiring data to be transmitted of the video frame to be transmitted, wherein the data to be transmitted comprises sub data to be transmitted;
acquiring the position of the sub data to be transmitted in the video frame to be transmitted;
determining source image subdata corresponding to the subdata to be transmitted in a source video frame according to the position;
and updating the sub data to be transmitted according to the source image sub data.
Specifically, the data to be transmitted of the video frame to be transmitted is obtained, and the data to be transmitted may include sub data to be transmitted, for example, the video frame to be transmitted may be divided into four image display areas, which are a first area, a second area, a third area, and a fourth area, and correspondingly, the data to be transmitted may include first sub data to be transmitted, second sub data to be transmitted, third sub data to be transmitted, and fourth sub data to be transmitted, where the first sub data to be transmitted represents the first area, the second sub data to be transmitted represents the second area, the third sub data to be transmitted represents the third area, and the fourth sub data to be transmitted represents the fourth area.
Specifically, the position of the sub-data to be transmitted in the video frame to be transmitted is obtained, and the position information corresponding to the sub-data to be transmitted can be determined through the image display area represented by the image sub-data.
For example, the source video frame may also be divided into four image display areas, which are a fifth area, a sixth area, a seventh area, and an eighth area, and correspondingly, the source image data may include first source image sub-data, second source image sub-data, third source image sub-data, and fourth source image sub-data, where the first source image sub-data represents the fifth area, the second source image sub-data represents the sixth area, the third source image sub-data represents the seventh area, and the fourth source image sub-data represents the eighth area.
After the position of the display area corresponding to the to-be-transmitted sub data is determined, the source image sub data corresponding to the to-be-transmitted sub data in the source video frame may be determined according to the position information, for example, the to-be-transmitted video frame may include a first area, a second area, a third area, and a fourth area, the source video frame may include a fifth area, a sixth area, a seventh area, and an eighth area, which may indicate that the first area corresponds to the fifth area, the second area corresponds to the sixth area, the third area corresponds to the seventh area, and the fourth area corresponds to the eighth area.
Specifically, the sub data to be transmitted is updated according to the source image sub data: the sub data to be transmitted can be deleted from the video frame to be transmitted, and the corresponding source image sub data can be added at its position. For example, the first source image sub data may include a color value of #FF0000, a luminance value of 100 and a gray value of 100, while the first sub data to be transmitted may include a color value of #000001, a luminance value of 30 and a gray value of 40; the color value of the first sub data to be transmitted can then be updated to #FF0000, its luminance value changed to 100, and its gray value changed to 100.
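A minimal sketch of this region-wise replacement, assuming the frames are NumPy arrays of identical shape divided into the four equal quadrants of the example above (the quadrant boundaries and the array layout are assumptions):

```python
# Sketch (assumptions: frames are NumPy arrays of equal shape, divided into four
# equal quadrants; a region index 0..3 refers to the same position in both frames).
import numpy as np

def region_slices(frame: np.ndarray):
    h, w = frame.shape[:2]
    return [
        (slice(0, h // 2), slice(0, w // 2)),   # first / fifth area
        (slice(0, h // 2), slice(w // 2, w)),   # second / sixth area
        (slice(h // 2, h), slice(0, w // 2)),   # third / seventh area
        (slice(h // 2, h), slice(w // 2, w)),   # fourth / eighth area
    ]

def update_region(frame_to_send: np.ndarray, source_frame: np.ndarray, region: int) -> None:
    """Delete the sub data to be transmitted and write the source sub data in its place."""
    ys, xs = region_slices(frame_to_send)[region]
    frame_to_send[ys, xs] = source_frame[ys, xs]
```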
In some embodiments, before the step of "updating the video frame to be transmitted based on the source image data", the following steps may be further included:
matching the image data to be transmitted with the source image data;
determining the matching degree of the image data to be transmitted and the source image data according to the matching result;
and if the matching degree is smaller than the preset matching degree, executing the step of updating the video frame to be transmitted based on the source image data.
Specifically, the data to be transmitted and the source image data may be matched in various ways. For example, the sub data to be transmitted included in the data to be transmitted may be matched against the source image sub data included in the source image data, or the data to be transmitted as a whole may be matched against the source image data, in which case the data to be transmitted may be the average of all the sub data to be transmitted and the source image data may be the average of all the source image sub data.
If the matching mode is to match the image subdata to be transmitted included in the data to be transmitted with the source image subdata included in the source image data, the image subdata to be transmitted and the corresponding source image subdata can be matched.
For example, the image subdata to be transmitted may be color value #000000, brightness value 100, and gray value 100, and the source image subdata may be color value #000011, brightness value 90, and gray value 90, and the image subdata to be transmitted is matched with the source image subdata.
Specifically, the matching degree between the image data to be transmitted and the source image data is determined according to the matching result, and the matching degree can represent the similarity between the image data to be transmitted and the source image data, for example, when the color value difference is 1000, the color value matching degree can be determined to be 90%; when the brightness value difference is 20, the brightness value matching degree can be determined to be 90%; when the gray value difference is 20, it may be determined that the gray value matching degree is 90%, and then the average value of the color value, the brightness value, and the gray value matching degree may be represented as the matching degree of the image sub-data to be transmitted and the source image sub-data.
For example, after the image data to be transmitted is matched with the source image data, when the color value difference in the matching result is 4000, the color value matching degree can be determined to be 60%; when the brightness value difference is 80, the brightness value matching degree can be determined to be 60%; when the gray value difference is 20, the gray value matching degree can be determined to be 90%, and then the matching degree of the image data to be transmitted and the source image data can be determined to be 70%.
Specifically, the matching degree between the image data to be transmitted and the source image data is compared with a preset matching degree. When the matching degree is greater than the preset matching degree, it can be determined that the difference between the image data to be transmitted and the source image data is small; in this case the data to be transmitted does not need to be updated, which saves processing on the terminal.
If the matching degree of the image data to be transmitted and the source image data is smaller than the preset matching degree, it can be shown that the difference between the image data to be transmitted and the source image data is large, the image data to be transmitted needs to be updated, and then the step of updating the video frame to be transmitted based on the source image data can be executed.
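The description defines the matching degree only through its examples (a color value difference of 1000 giving 90% and 4000 giving 60%, a brightness or gray value difference of 20 giving 90%). A sketch that reproduces those figures with linear scale factors, the factors themselves being inferred rather than stated:

```python
# Sketch: matching degree derived from the example figures in the description.
# The linear factors (1/100 for color, 1/2 for brightness and gray) are inferred
# from those examples and are an assumption.
PRESET_MATCHING_DEGREE = 80.0   # example threshold used later in the description

def component_degree(diff: float, factor: float) -> float:
    return max(0.0, 100.0 - diff * factor)

def matching_degree(color_diff: float, brightness_diff: float, gray_diff: float) -> float:
    degrees = [
        component_degree(color_diff, 1.0 / 100.0),   # 1000 -> 90 %, 4000 -> 60 %
        component_degree(brightness_diff, 0.5),      # 20 -> 90 %, 80 -> 60 %
        component_degree(gray_diff, 0.5),            # 20 -> 90 %
    ]
    return sum(degrees) / len(degrees)

def needs_update(color_diff, brightness_diff, gray_diff) -> bool:
    # Update the frame to be transmitted only when the match is below the preset degree.
    return matching_degree(color_diff, brightness_diff, gray_diff) < PRESET_MATCHING_DEGREE
```

With the figures from the example above (color difference 4000, brightness difference 80, gray difference 20), matching_degree returns 70, matching the 70% in the description, and needs_update is True for a preset matching degree of 80%.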
In some embodiments, the video transmission method may further include the steps of:
and if the matching degree of the image data to be transmitted and the source image data is greater than the preset matching degree, taking the video frame to be transmitted as a target video frame, and clearing the source video frame and the source image data.
Specifically, when the matching degree between the image data to be transmitted and the source image data is greater than the preset matching degree, the video frame to be transmitted can be used as the target video frame, and the source video frame corresponding to the obtained video frame to be transmitted can be deleted, so that the storage space of the terminal can be saved, and the operation efficiency can be improved.
For example, when the color value difference is 1000, the color value matching degree may be determined to be 90%; when the brightness value difference is 20, the brightness value matching degree can be determined to be 90%; when the gray value difference is 20, it may be determined that the gray value matching degree is 90%, it may be determined that the matching degree between the image data to be transmitted and the source image data is 90%, and the preset matching degree may be 80%, it may be determined that the matching degree between the image data to be transmitted and the source image data is greater than the preset matching degree, and it may be determined that the data to be transmitted is the target transmission data, and the source image data is deleted.
105. And sending the target video frame to the opposite terminal.
Specifically, after the target video frame is determined, the target video frame may be transmitted to the counterpart terminal.
In some embodiments, before the step of "transmitting the target video frame to the counterpart terminal", the following steps may be further included:
acquiring a first definition value of a video frame to be transmitted;
acquiring a second definition value of the source video frame;
determining a definition difference value according to the first definition value and the second definition value;
and if the definition difference value is smaller than a preset threshold value, taking the video frame to be transmitted as a target video frame.
Specifically, a first definition value of the video frame to be transmitted is obtained, where the first definition value may be an overall definition value of an image frame of the video frame to be transmitted, and a second definition value of the source video frame is obtained, and the second definition value may be an overall definition value of an image frame of the source video frame corresponding to the video frame to be transmitted.
Here, definition refers to how clearly each detail and its boundary can be distinguished in the image. The term sharpness is generally used for video recorders, because image quality is compared by looking at how sharp the reproduced image is, whereas a camera typically uses the term resolution to measure its ability to resolve the details of the photographed scene; the unit is the "television line" (TV line), also known simply as a line.
After the first definition value and the second definition value are obtained, a difference value between the first definition value and the second definition value may be obtained, and a definition difference between the image frame to be transmitted and the source video frame may be determined, for example, the first definition value may be 400, the second definition value may be 100, and then the definition difference between the video frame to be transmitted and the source video frame may be 300.
Specifically, the definition difference value may be compared with a preset threshold value, and whether to update the video frame to be transmitted is judged according to the comparison result. When the definition difference value is smaller than the preset threshold value, the video frame to be transmitted may be used directly as the target video frame without being updated; if the definition difference value is greater than the preset threshold value, the video frame to be transmitted can be updated based on the source video frame, ensuring the definition of the video file transmitted to the opposite terminal.
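A sketch of this decision, again estimating the definition values with the variance of the Laplacian (an assumption; the description measures definition in television lines) and treating the update step as a caller-supplied function standing in for step 104:

```python
# Sketch (assumption): definition values are estimated with the variance of the
# Laplacian rather than measured in television lines; `update_fn` is a hypothetical
# hook for the update step described in 104.
import cv2

def definition_value(frame) -> float:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var()

def choose_target_frame(frame_to_send, source_frame, update_fn, preset_threshold: float = 300.0):
    # preset_threshold is only a placeholder; the description does not give a value.
    first = definition_value(frame_to_send)     # first definition value
    second = definition_value(source_frame)     # second definition value
    if abs(first - second) < preset_threshold:
        return frame_to_send                    # small difference: send as-is
    return update_fn(frame_to_send, source_frame)   # otherwise update from the source frame
```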
The embodiment of the application discloses a video transmission method, which comprises the following steps: receiving a video transmission instruction, and acquiring a video frame to be transmitted according to the video transmission instruction; determining a source video frame corresponding to the video frame to be transmitted from a source video file; acquiring source image data of the source video frame; updating the video frame to be transmitted based on the source image data, and taking the updated video frame to be transmitted as a target video frame; and sending the target video frame to the opposite terminal. In the embodiment of the application, each video frame of the video file is captured and its data collected, and the video frame data lost by the video file is supplemented, so that the transmitted video picture becomes clearer; at the same time, the storage space occupied by the video on the mobile terminal is reduced, and the video transmission efficiency can be effectively improved.
Referring to fig. 2, fig. 2 is a schematic flowchart of a second video transmission method according to an embodiment of the present application. The specific scene application of the video transmission method can be as follows:
201. When the terminal detects that a video file is to be transmitted to the opposite terminal, the video file to be transmitted is obtained.
Specifically, when the terminal detects that a video file is to be transmitted to the opposite terminal, it may trigger a video transmission instruction according to the user operation and determine the video file to be transmitted, which may be a source video file, according to the video transmission instruction. After the source video file is determined, image data compression is performed on the source video file, and the video file to be transmitted can be obtained.
202. The terminal obtains image data to be transmitted of a video file to be transmitted.
Specifically, the terminal acquires the image data to be transmitted of the video file to be transmitted: it first acquires a video frame to be transmitted from the video file to be transmitted, and then acquires the image data information included in that video frame.
The terminal can acquire the video frames at a preset frame rate, where the preset frame rate can be the frame rate at which the video file is played. For example, the preset frame rate can be 20 frames per second, so that 20 frames of images are displayed per second during playback, and any one of those 20 frames can be taken as the acquired video frame.
After the video frame to be transmitted is determined from the video file to be transmitted, the image data of the video frame to be transmitted is acquired and may be used as the data to be transmitted. The image data may include the data of each pixel point in the video frame; for example, the video frame to be transmitted may include 100 pixel points, and the data to be transmitted may include first data to be transmitted (representing the first pixel point to be transmitted), second data to be transmitted (representing the second pixel point to be transmitted), ..., hundredth data to be transmitted (representing the hundredth pixel point to be transmitted), and so on.
203. The terminal obtains source image data of a source video file.
Specifically, the terminal acquires the source image data of the source video file; the specific acquisition manner can follow the above step of acquiring the image data to be transmitted of the video file to be transmitted. For example, a source video frame can include first source image data (representing the first source image pixel point), second source image data (representing the second source image pixel point), ..., hundredth source image data (representing the hundredth source image pixel point), and so on.
204. And the terminal matches the image data to be transmitted with the source image data.
Specifically, after the image data to be transmitted and the source image data are acquired, the image data to be transmitted and the source image data can be matched.
For example, the data to be transmitted may include first data to be transmitted (representing the first pixel point to be transmitted), second data to be transmitted (representing the second pixel point to be transmitted), ..., hundredth data to be transmitted (representing the hundredth pixel point to be transmitted), and so on, and the source video frame may include first source image data (representing the first source image pixel point), second source image data (representing the second source image pixel point), ..., hundredth source image data (representing the hundredth source image pixel point), and so on. Then the first data to be transmitted may be matched with the first source image data, the second data to be transmitted with the second source image data, the third data to be transmitted with the third source image data, and so on; that is, the data to be transmitted and the source image data corresponding to pixel points at the same position are matched.
205. And determining target transmission image data according to the matching result, and sending the target transmission image data to the opposite terminal.
Specifically, when the image data to be transmitted is matched with the source image data, a matching result can be obtained, for example, the matching result can be that the image data to be transmitted is successfully matched with the source image data or the image data to be transmitted is unsuccessfully matched with the source image data, where successful matching indicates that the image data to be transmitted is the same as the source image data, and unsuccessful matching indicates that the image data to be transmitted is not the same as the source image data.
When it is determined that the image data to be transmitted is different from the source image data, the image data to be transmitted may be updated based on the source image data in order to ensure that the opposite terminal receives a complete video file. The updating may be done in various ways: for example, if the image data to be transmitted lacks part of the image data, the corresponding image data in the source image data may be added to the image data to be transmitted; for another example, if the image data to be transmitted differs in value from the source image data, the differing values in the image data to be transmitted can be changed according to the source image data.
Specifically, the updated image data to be transmitted can be used as target image data, and the target image data is sent to the opposite terminal, so that the quality of the transmitted video can be improved.
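Bringing the pixel-level matching and updating of this scenario together, a minimal NumPy sketch follows; reading "the same" as exact per-pixel equality and assuming both frames are color arrays of identical shape are interpretations, and the function name is illustrative.

```python
# Sketch: per-pixel matching of the data to be transmitted with the source image
# data, copying source pixels wherever the match fails (assumptions: both frames
# are NumPy color arrays of identical shape; "matching" means exact equality).
import numpy as np

def build_target_image(data_to_send: np.ndarray, source_data: np.ndarray) -> np.ndarray:
    mismatched = np.any(data_to_send != source_data, axis=-1)   # True where a pixel differs
    target = data_to_send.copy()
    target[mismatched] = source_data[mismatched]                # update from the source data
    return target                                               # target image data to send
```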
The embodiment of the application discloses a video transmission method, which comprises the following steps: receiving a video transmission instruction, and acquiring a video frame to be transmitted according to the video transmission instruction; determining a source video frame corresponding to the video frame to be transmitted from a source video file; acquiring source image data of the source video frame; updating the video frame to be transmitted based on the source image data, and taking the updated video frame to be transmitted as a target video frame; and sending the target video frame to the opposite terminal. In the embodiment of the application, each video frame of the video file is captured and its data collected, and the video frame data lost by the video file is supplemented, so that the transmitted video picture becomes clearer; at the same time, the storage space occupied by the video on the mobile terminal is reduced, and the video transmission efficiency can be effectively improved.
In order to better implement the video transmission method provided by the embodiment of the present application, an embodiment of the present application further provides a device based on the video transmission method. The terms are the same as those in the video transmission method, and details of implementation can be referred to the description in the method embodiment.
Referring to fig. 3, fig. 3 is a block diagram of a first video transmission device according to an embodiment of the present disclosure, where the video transmission device can be applied to a mobile terminal such as a mobile phone, a tablet computer, a notebook computer, a palm computer, a Portable Media Player (PMP), and a fixed terminal such as a desktop computer, and the device includes:
a receiving unit 301, configured to receive a video transmission instruction, and obtain a video frame to be transmitted according to the video transmission instruction;
a first determining unit 302, configured to determine, from a source video file, a source video frame corresponding to the video frame to be transmitted;
a first obtaining unit 303, configured to obtain source image data of the source video frame;
an updating unit 304, configured to update the video frame to be transmitted based on the source image data, and use the updated video frame to be transmitted as a target video frame;
a sending unit 305, configured to send the target video frame to a counterpart terminal.
In some embodiments, referring to fig. 4, fig. 4 is a block diagram of a second video transmission apparatus according to an embodiment of the present disclosure, where the updating unit 304 may include:
a first obtaining subunit 3041, configured to obtain data to be transmitted of the video frame to be transmitted, where the data to be transmitted includes sub data to be transmitted;
a second obtaining subunit 3042, configured to obtain a position of the sub data to be transmitted in the video frame to be transmitted;
a determining subunit 3043, configured to determine, according to the position, source image sub-data corresponding to the to-be-transmitted sub-data in the source video frame;
a replacing subunit 3044, configured to update the to-be-transmitted sub data according to the source image sub data.
In some embodiments, the video transmission apparatus may further include:
the matching unit is used for matching the image data to be transmitted with the source image data;
the second determining unit is used for determining the matching degree of the image data to be transmitted and the source image data according to the matching result;
and the first execution unit is used for updating the video frame to be transmitted based on the source image data if the matching degree is smaller than a preset matching degree.
In some embodiments, the video transmission apparatus may further include:
and the second execution unit is used for taking the video frame to be transmitted as a target video frame and clearing the source video frame and the source image data if the matching degree of the image data to be transmitted and the source image data is greater than a preset matching degree.
In some embodiments, the first determining unit 302 may include:
and the acquisition subunit is used for acquiring an image display picture according to a preset frame rate in the playing process of the source video file, and taking the image display picture as a source video frame.
In some embodiments, the video transmission apparatus may further include:
the second acquisition unit is used for acquiring a first definition value of a video frame to be transmitted;
a third obtaining unit configured to obtain a second sharpness value of the source video frame;
a third determining unit, configured to determine a sharpness difference according to the first sharpness value and the second sharpness value;
and the comparison unit is used for taking the video frame to be transmitted as a target video frame if the definition difference value is smaller than a preset threshold value.
The embodiment of the application discloses a video transmission apparatus, which is configured to: receive a video transmission instruction, and acquire a video frame to be transmitted according to the video transmission instruction; determine a source video frame corresponding to the video frame to be transmitted from a source video file; acquire source image data of the source video frame; update the video frame to be transmitted based on the source image data, and take the updated video frame to be transmitted as a target video frame; and send the target video frame to the opposite terminal. In the embodiment of the application, each video frame of the video file is captured and its data collected, and the video frame data lost by the video file is supplemented, so that the transmitted video picture becomes clearer; at the same time, the storage space occupied by the video on the mobile terminal is reduced, and the video transmission efficiency can be effectively improved.
The embodiment of the application also provides a terminal. As shown in fig. 5, the terminal may include a Radio Frequency (RF) circuit 601, a memory 602 including one or more storage media, an input unit 603, a display unit 604, a sensor 605, an audio circuit 606, a Wireless Fidelity (WiFi) module 607, a processor 608 including one or more processing cores, and a power supply 609. Those skilled in the art will appreciate that the terminal structure shown in fig. 5 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components. Wherein:
The RF circuit 601 may be used for receiving and transmitting signals during information transmission and reception; in particular, downlink information from a base station is received and handed over to the one or more processors 608 for processing, and uplink data is transmitted to the base station. In general, the RF circuit 601 includes, but is not limited to, an antenna, at least one amplifier, a tuner, one or more oscillators, a Subscriber Identity Module (SIM) card, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the RF circuit 601 may also communicate with networks and other devices via wireless communication.
The memory 602 may be used to store software programs and modules, and the processor 608 executes various functional applications and data processing by operating the software programs and modules stored in the memory 602. The memory 602 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function (such as a sound playing function, an image playing function, etc.), and the like. Further, the memory 602 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device. Accordingly, the memory 602 may also include a memory controller to provide the processor 608 and the input unit 603 access to the memory 602.
The input unit 603 may be used to receive input numeric or character information and generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control. In particular, in one particular embodiment, input unit 603 may include a touch-sensitive surface as well as other input devices. The touch-sensitive surface, also referred to as a touch display screen or a touch pad, may collect touch operations by a user (e.g., operations by a user on or near the touch-sensitive surface using a finger, a stylus, or any other suitable object or attachment) thereon or nearby, and drive the corresponding connection device according to a predetermined program. The input unit 603 may include other input devices in addition to the touch-sensitive surface. In particular, other input devices may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 604 may be used to display information input by or provided to the user and various graphical user interfaces of the terminal, which may be made up of graphics, text, icons, video, and any combination thereof. The display unit 604 may include a display panel, and optionally, the display panel may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. Further, the touch-sensitive surface may overlay the display panel, and when a touch operation is detected on or near the touch-sensitive surface, the touch operation is transmitted to the processor 608 to determine the type of touch event, and the processor 608 then provides a corresponding visual output on the display panel according to the type of touch event. Although in FIG. 5 the touch-sensitive surface and the display panel are two separate components to implement input and output functions, in some embodiments the touch-sensitive surface may be integrated with the display panel to implement input and output functions.
The terminal may also include at least one sensor 605, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor that adjusts the brightness of the display panel according to the brightness of ambient light, and a proximity sensor that turns off the display panel and the backlight when the terminal moves to the ear.
The audio circuit 606, a speaker, and a microphone may provide an audio interface between the user and the terminal. The audio circuit 606 may transmit the electrical signal converted from the received audio data to the speaker, which converts it into a sound signal for output; on the other hand, the microphone converts a collected sound signal into an electrical signal, which is received by the audio circuit 606 and converted into audio data. The audio data is then output to the processor 608 for processing and sent via the RF circuit 601 to, for example, another terminal, or output to the memory 602 for further processing. The audio circuit 606 may also include an earphone jack to provide communication between peripheral headphones and the terminal.
WiFi is a short-distance wireless transmission technology. Through the WiFi module 607, the terminal can help the user receive and send e-mails, browse web pages, access streaming media and the like, providing the user with wireless broadband Internet access. Although fig. 5 shows the WiFi module 607, it is understood that it is not an essential part of the terminal and may be omitted as needed without changing the essence of the application.
The processor 608 is the control center of the terminal. It connects the various parts of the entire terminal using various interfaces and lines, and performs the various functions of the terminal and processes data by running or executing software programs and modules stored in the memory 602 and calling data stored in the memory 602, thereby monitoring the terminal as a whole. Optionally, the processor 608 may include one or more processing cores; preferably, the processor 608 may integrate an application processor, which mainly handles the operating system, user interfaces, applications and the like, and a modem processor, which mainly handles wireless communication. It will be appreciated that the modem processor may also not be integrated into the processor 608.
The terminal also includes a power supply 609 (e.g., a battery) for powering the various components, which may preferably be logically connected to the processor 608 via a power management system that may be used to manage charging, discharging, and power consumption. The power supply 609 may also include any component of one or more dc or ac power sources, recharging systems, power failure detection circuitry, power converters or inverters, power status indicators, and the like.
Specifically, in this embodiment, the processor 608 in the terminal loads the executable file corresponding to the process of one or more application programs into the memory 602 according to the following instructions, and the processor 608 runs the application programs stored in the memory 602, thereby implementing various functions:
receiving a video transmission instruction, and acquiring a video frame to be transmitted according to the video transmission instruction;
determining a source video frame corresponding to the video frame to be transmitted from a source video file;
acquiring source image data of the source video frame;
updating the video frame to be transmitted based on the source image data, and taking the updated video frame to be transmitted as a target video frame;
and sending the target video frame to the opposite terminal.
The embodiment of the application discloses a video transmission method, a video transmission device, a storage medium and a terminal. The video transmission method comprises the following steps: receiving a video transmission instruction, and acquiring a video frame to be transmitted according to the video transmission instruction; determining a source video frame corresponding to the video frame to be transmitted from a source video file; acquiring source image data of the source video frame; updating the video frame to be transmitted based on the source image data, and taking the updated video frame to be transmitted as a target video frame; and sending the target video frame to the opposite terminal. In the embodiment of the application, each video frame of the video file is captured and its data collected, and the video frame data lost by the video file is supplemented, so that the transmitted video picture becomes clearer; at the same time, the storage space occupied by the video on the mobile terminal is reduced, and the video transmission efficiency can be effectively improved.
It will be understood by those skilled in the art that all or part of the steps in the methods of the above embodiments may be performed by instructions or by instructions controlling associated hardware, which may be stored in a storage medium and loaded and executed by a processor.
To this end, the present application provides a storage medium, in which a plurality of instructions are stored, where the instructions can be loaded by a processor to execute the steps in any one of the video transmission methods provided in the present application. For example, the instructions may perform the steps of:
receiving a video transmission instruction, and acquiring a video frame to be transmitted according to the video transmission instruction; determining a source video frame corresponding to the video frame to be transmitted from a source video file; acquiring source image data of the source video frame; updating the video frame to be transmitted based on the source image data, and taking the updated video frame to be transmitted as a target video frame; and sending the target video frame to the opposite terminal.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
Wherein the storage medium may include: read Only Memory (ROM), Random Access Memory (RAM), magnetic or optical disks, and the like.
Since the instructions stored in the storage medium can execute the steps in any video transmission method provided in the embodiments of the present application, beneficial effects that can be achieved by any video transmission method provided in the embodiments of the present application can be achieved, which are detailed in the foregoing embodiments and will not be described herein again.
The video transmission method, the video transmission device, the storage medium and the terminal provided by the embodiment of the present application are described in detail above, and a specific example is applied in the description to explain the principle and the implementation of the present application, and the description of the above embodiment is only used to help understanding the method and the core idea of the present application; meanwhile, for those skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.

Claims (10)

1. A video transmission method is applied to a terminal and is characterized by comprising the following steps:
receiving a video transmission instruction, and acquiring a video frame to be transmitted according to the video transmission instruction;
determining a source video frame corresponding to the video frame to be transmitted from a source video file;
acquiring source image data of the source video frame;
updating the video frame to be transmitted based on the source image data, and taking the updated video frame to be transmitted as a target video frame;
and sending the target video frame to the opposite terminal.
2. The method of claim 1, wherein the source image data comprises source image sub-data;
the updating the video frame to be transmitted based on the source image data comprises:
acquiring data to be transmitted of the video frame to be transmitted, wherein the data to be transmitted comprises sub-data to be transmitted;
acquiring a position of the sub-data to be transmitted in the video frame to be transmitted;
determining, according to the position, source image sub-data corresponding to the sub-data to be transmitted in the source video frame;
and updating the sub-data to be transmitted according to the source image sub-data.
3. The method of claim 1, further comprising, prior to said updating the video frame to be transmitted based on the source image data:
matching the image data to be transmitted with the source image data;
determining a matching degree between the image data to be transmitted and the source image data according to a matching result;
and if the matching degree is less than a preset matching degree, executing the step of updating the video frame to be transmitted based on the source image data.
4. The method of claim 3, further comprising:
and if the matching degree of the image data to be transmitted and the source image data is greater than the preset matching degree, taking the video frame to be transmitted as a target video frame, and clearing the source video frame and the source image data.
5. The method according to claim 1, wherein the determining a source video frame corresponding to the video frame to be transmitted from a source video file comprises:
during playback of the source video file, collecting an image display picture at a preset frame rate, and taking the image display picture as the source video frame.
6. The method according to claim 1, further comprising, before sending the target video frame to the opposite terminal:
acquiring a first definition value of the video frame to be transmitted;
acquiring a second definition value of the source video frame;
determining a definition difference value according to the first definition value and the second definition value;
and if the definition difference value is smaller than a preset threshold value, taking the video frame to be transmitted as a target video frame.
7. A video transmission apparatus, comprising:
a receiving unit, configured to receive a video transmission instruction and acquire a video frame to be transmitted according to the video transmission instruction;
a first determining unit, configured to determine, from a source video file, a source video frame corresponding to the video frame to be transmitted;
a first acquisition unit, configured to acquire source image data of the source video frame;
an updating unit, configured to update the video frame to be transmitted based on the source image data and take the updated video frame to be transmitted as a target video frame;
and a sending unit, configured to send the target video frame to the opposite terminal.
8. The apparatus of claim 7, wherein the updating unit comprises:
a first obtaining subunit, configured to obtain data to be transmitted of the video frame to be transmitted, wherein the data to be transmitted comprises sub-data to be transmitted;
a second obtaining subunit, configured to obtain a position of the sub-data to be transmitted in the video frame to be transmitted;
a determining subunit, configured to determine, according to the position, source image sub-data corresponding to the sub-data to be transmitted in the source video frame;
and a replacing subunit, configured to update the sub-data to be transmitted according to the source image sub-data.
9. A storage medium storing a plurality of instructions adapted to be loaded by a processor to perform the steps of the video transmission method according to any one of claims 1 to 6.
10. A terminal, comprising a processor and a memory, wherein the memory stores a plurality of instructions, and the processor loads the instructions to perform the steps in the video transmission method according to any one of claims 1 to 6.
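By way of illustration only, the following Python sketch gives one possible reading of the block-level logic in claims 2 to 4 and of the definition comparison in claim 6. The NumPy frame representation, the 16x16 block size, the exact pixel-equality matching rule, the 0.9 preset matching degree, the gradient-variance definition estimate and all function names are assumptions introduced for this sketch and do not appear in the claims; the clearing of the source video frame and source image data recited in claim 4 is omitted here.

import numpy as np

BLOCK = 16  # assumed side length of one piece of sub-data

def iter_blocks(frame):
    """Yield the (row, col) origin of every full BLOCK x BLOCK tile in the frame."""
    h, w = frame.shape[:2]
    for r in range(0, h - h % BLOCK, BLOCK):
        for c in range(0, w - w % BLOCK, BLOCK):
            yield r, c

def matching_degree(frame_to_send, source_frame):
    """Fraction of tiles whose pixels already equal the co-located source tile."""
    positions = list(iter_blocks(frame_to_send))
    if not positions:
        return 1.0
    matched = sum(
        np.array_equal(frame_to_send[r:r + BLOCK, c:c + BLOCK],
                       source_frame[r:r + BLOCK, c:c + BLOCK])
        for r, c in positions)
    return matched / len(positions)

def definition_value(frame):
    """Rough 'definition' (sharpness) estimate: variance of horizontal pixel differences."""
    return float(np.var(np.diff(frame.astype(np.float64), axis=1)))

def build_target_frame(frame_to_send, source_frame, preset_degree=0.9):
    """Update mismatching tiles from the source frame only when the overall match is poor."""
    if matching_degree(frame_to_send, source_frame) > preset_degree:
        # Claim 4 (reading): the frames already match closely, so transmit the frame as-is.
        return frame_to_send
    # Claims 2-3 (reading): replace each mismatching tile with the source tile at the same position.
    target = frame_to_send.copy()
    for r, c in iter_blocks(frame_to_send):
        src_tile = source_frame[r:r + BLOCK, c:c + BLOCK]
        if not np.array_equal(target[r:r + BLOCK, c:c + BLOCK], src_tile):
            target[r:r + BLOCK, c:c + BLOCK] = src_tile
    return target

Under the same assumptions, claim 6 could be read as comparing definition_value(frame_to_send) with definition_value(source_frame) and keeping the video frame to be transmitted as the target frame when the difference between the two values stays below a preset threshold.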
CN202010095189.2A 2020-02-14 2020-02-14 Video transmission method, device, storage medium and terminal Active CN111263216B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010095189.2A CN111263216B (en) 2020-02-14 2020-02-14 Video transmission method, device, storage medium and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010095189.2A CN111263216B (en) 2020-02-14 2020-02-14 Video transmission method, device, storage medium and terminal

Publications (2)

Publication Number Publication Date
CN111263216A true CN111263216A (en) 2020-06-09
CN111263216B CN111263216B (en) 2022-06-10

Family

ID=70954625

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010095189.2A Active CN111263216B (en) 2020-02-14 2020-02-14 Video transmission method, device, storage medium and terminal

Country Status (1)

Country Link
CN (1) CN111263216B (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130156109A1 (en) * 2011-12-19 2013-06-20 Canon Kabushiki Kaisha Method of transmitting video information over a wireless multi-path communication link and corresponding wireless station
CN103716630A (en) * 2012-09-29 2014-04-09 华为技术有限公司 Upsampling filter generation method and device
CN103812833A (en) * 2012-11-08 2014-05-21 上海心动企业发展有限公司 Control method and control device for updating multimedia data according to bandwidth
CN103281539A (en) * 2013-06-07 2013-09-04 华为技术有限公司 Method, device and terminal for image encoding and decoding processing
US20170099493A1 (en) * 2015-10-01 2017-04-06 Lattice Semiconductor Corporation Compressed video playback with raw data assist
CN107566733A (en) * 2017-09-29 2018-01-09 深圳市聚宝汇科技有限公司 A kind of picture transmission method and system
CN107809641A (en) * 2017-11-13 2018-03-16 北京京东方光电科技有限公司 Image data transfer method, processing method and image processing equipment, display device
CN110047060A (en) * 2019-04-15 2019-07-23 Oppo广东移动通信有限公司 Image processing method, device, storage medium and electronic equipment
CN110418209A (en) * 2019-06-24 2019-11-05 华为技术有限公司 A kind of information processing method and terminal device applied to transmission of video

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Li Ran (李然): "Research on Compressed Sensing of Images and Video" (《图像与视频压缩感知研究》), China Doctoral Dissertations Full-text Database, Information Science and Technology, Issue 04, 2016 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112836086A (en) * 2020-12-31 2021-05-25 维沃移动通信有限公司 Video processing method and device and electronic equipment
WO2022143971A1 (en) * 2020-12-31 2022-07-07 维沃移动通信有限公司 Video processing method and apparatus, and electronic device
CN112836086B (en) * 2020-12-31 2023-06-23 维沃移动通信有限公司 Video processing method and device and electronic equipment

Also Published As

Publication number Publication date
CN111263216B (en) 2022-06-10

Similar Documents

Publication Publication Date Title
CN107038715B (en) Image processing method and device
CN108287744B (en) Character display method, device and storage medium
CN106792120B (en) Video picture display method and device and terminal
WO2020215720A1 (en) Photographing method, storage medium, and electronic device
US11297328B2 (en) Video coding method, device, device and storage medium
CN112449120A (en) High dynamic range video generation method and device
CN106993136B (en) Mobile terminal and multi-camera-based image noise reduction method and device thereof
WO2022110687A1 (en) Image processing method and apparatus, electronic device, and readable storage medium
CN110187808B (en) Dynamic wallpaper setting method and device and computer-readable storage medium
WO2020134789A1 (en) Mobile terminal and method for controlling on and off of screen, and computer storage medium
CN109495769B (en) Video communication method, terminal, smart television, server and storage medium
US20220319378A1 (en) Screen color temperature control method, apparatus, storage medium, and mobile terminal
CN111158815B (en) Dynamic wallpaper blurring method, terminal and computer readable storage medium
CN111263216B (en) Video transmission method, device, storage medium and terminal
CN108320265B (en) Image processing method, terminal and computer readable storage medium
CN111432154A (en) Video playing method, video processing method and electronic equipment
CN108259808B (en) Video frame compression method and mobile terminal
CN111614901A (en) Image shooting method and device, storage medium and terminal
CN114140655A (en) Image classification method and device, storage medium and electronic equipment
CN113852751A (en) Image processing method, device, terminal and storage medium
CN113485667A (en) Method for screen projection display of terminal, terminal and storage medium
CN111290672A (en) Image display method and device, storage medium and terminal
CN111182351A (en) Video playing processing method and device, storage medium and terminal
CN111787228A (en) Shooting method, shooting device, storage medium and mobile terminal
CN111966271B (en) Screen panorama screenshot method and device, terminal equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20230627

Address after: 200137 Room 518, No. 1155 Jinhu Road, China (Shanghai) Pilot Free Trade Zone, Pudong New Area, Shanghai

Patentee after: Shanghai Wanrou Automotive Electronics Co.,Ltd.

Address before: 10 / F, No.5 Lane 999, Yangfan Road, high tech Zone, Ningbo City, Zhejiang Province

Patentee before: TCL mobile communication technology (Ningbo) Co.,Ltd.