CN112784108A - Data processing method and device - Google Patents

Data processing method and device

Info

Publication number
CN112784108A
Authority
CN
China
Prior art keywords
video, picture data, data, slice picture, slice
Legal status
Pending
Application number
CN202011642323.2A
Other languages
Chinese (zh)
Inventor
岳晓峰
钟文亮
赵子苏
杨春晖
Current Assignee
Visionvera Information Technology Co Ltd
Original Assignee
Visionvera Information Technology Co Ltd
Application filed by Visionvera Information Technology Co Ltd
Priority to CN202011642323.2A
Publication of CN112784108A

Classifications

    • G PHYSICS
      • G06 COMPUTING; CALCULATING OR COUNTING
        • G06F ELECTRIC DIGITAL DATA PROCESSING
          • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
            • G06F16/70 Information retrieval; Database structures therefor; File system structures therefor of video data
              • G06F16/74 Browsing; Visualisation therefor
              • G06F16/73 Querying
                • G06F16/738 Presentation of query results

Abstract

The embodiment of the invention provides a data processing method and a data processing device. The method includes: receiving a browsing request for a non-audio/video file sent by a video networking terminal; generating one or more slice picture data corresponding to the non-audio/video file; determining target slice picture data from the one or more slice picture data; and generating video frame data according to the target slice picture data and sending the video frame data to the video networking terminal. With the embodiment of the invention, the video networking terminal can directly consult non-audio/video files on the video networking storage gateway without any special modification of the terminal, which greatly improves the speed at which a user consults non-audio/video files, meets the user's usage requirements, and improves the user's experience with video networking equipment.

Description

Data processing method and device
Technical Field
The present invention relates to the field of video networking technologies, and in particular, to a method and an apparatus for data processing.
Background
Video networking is an important milestone in network development. It is a real-time network that can realize real-time transmission of high-definition video and pushes many Internet applications toward high-definition, face-to-face video.
At present, to make video networking easier to use, a video networking storage gateway may be deployed in the video network. Specifically, it may be a dedicated video networking storage interface host that provides a uniform interface for users' terminals and provides storage services for other video networking services. The storage service of the video networking storage gateway mainly stores office documents, audio, video, and content in other formats.
However, a video networking terminal is usually an embedded video device. Without special modification (mainly to the terminal software's functional support), it cannot directly display the non-audio/video document files in the video networking storage gateway. Therefore, the existing video networking storage gateway has difficulty meeting the functional requirements of enterprise customers.
Disclosure of Invention
In view of the above, the present invention has been developed to provide a data processing method and apparatus that overcome, or at least partially solve, the above problems, including:
in a first aspect, a method for data processing is provided, the method comprising:
receiving a browsing request for a non-audio/video file sent by a video networking terminal;
generating one or more slice picture data corresponding to the non-audio/video file;
determining target slice picture data from the one or more slice picture data;
and generating video frame data according to the target slice picture data, and sending the video frame data to the video networking terminal.
Optionally, the generating one or more slice picture data corresponding to the non-audio/video file includes:
opening the non-audio/video file in a preset virtual window;
intercepting the picture displayed by the virtual window to obtain screenshot picture data;
and slicing the screen capture picture data to obtain one or more sliced picture data.
Optionally, the determining target slice picture data from the one or more slice picture data includes:
acquiring screen resolution, and determining the number of slice pictures according to the screen resolution;
determining target slice picture data satisfying the slice picture number from the one or more slice picture data.
Optionally, the determining, from the one or more slice picture data, target slice picture data that satisfies the slice picture number includes:
determining target slice picture data which meets the slice picture number and contains initial slice picture data from the one or more slice picture data;
or, in response to a play control operation, determining target slice picture data satisfying the slice picture number from the one or more slice picture data.
Optionally, there are a plurality of target slice picture data, and generating video frame data according to the target slice picture data includes:
performing picture splicing on the plurality of target slice picture data to obtain spliced picture data;
and carrying out format conversion on the spliced picture data to obtain video frame data.
Optionally, the sending the video frame data to the video networking terminal includes:
and cyclically sending the video frame data to the video networking terminal until the next play control operation is detected or browsing of the non-audio/video file is stopped.
Optionally, the one or more slice picture data have numbering information.
In a second aspect, an apparatus for data processing is provided, the apparatus comprising:
the browsing request receiving module is used for receiving a browsing request for a non-audio/video file sent by a video networking terminal;
the slice picture data generation module is used for generating one or more slice picture data corresponding to the non-audio/video file;
a target slice picture data determination module, configured to determine target slice picture data from the one or more slice picture data;
and the video frame data generating module is used for generating video frame data according to the target slice picture data and sending the video frame data to the video network terminal.
Optionally, the slice picture data generating module further includes:
the non-audio and video file opening submodule is used for opening the non-audio and video file in a preset virtual window;
the virtual window intercepting submodule is used for intercepting the picture displayed by the virtual window to obtain screenshot picture data;
and the screen capture picture data slicing submodule is used for slicing the screen capture picture data to obtain one or more slice picture data.
Optionally, the target slice picture data determining module further includes:
the slice picture quantity determining submodule is used for acquiring the screen resolution and determining the quantity of slice pictures according to the screen resolution;
and the target slice picture data determining sub-module is used for determining target slice picture data meeting the slice picture number from the one or more slice picture data.
Optionally, the target slice picture data determination sub-module further includes:
a target slice picture data determination unit, configured to determine, from the one or more slice picture data, target slice picture data that satisfies the slice picture number and includes initial slice picture data;
or, in response to a play control operation, determining target slice picture data satisfying the slice picture number from the one or more slice picture data.
Optionally, the video frame data generating module includes:
the target slice picture data splicing submodule is used for carrying out picture splicing on the plurality of target slice picture data to obtain spliced picture data;
and the spliced picture data conversion submodule is used for carrying out format conversion on the spliced picture data to obtain video frame data.
Optionally, the video frame data generating module further includes:
and the video frame data circulating and sending submodule is used for circularly sending the video frame data to the video network terminal until the next playing control operation is detected or the browsing of the non-audio and video file is stopped.
Optionally, the one or more slice picture data has numbering information.
In a third aspect, a server is provided, comprising a processor, a memory and a computer program stored on the memory and capable of running on the processor, the computer program, when executed by the processor, implementing the method of data processing as described above.
In a fourth aspect, a computer-readable storage medium is provided, on which a computer program is stored, which computer program, when being executed by a processor, carries out the method of data processing as described above.
The embodiment of the invention has the following advantages:
in the embodiment of the invention, a browsing request for a non-audio/video file sent by a video networking terminal is received, one or more slice picture data corresponding to the non-audio/video file are generated, target slice picture data are determined from the one or more slice picture data, video frame data are generated according to the target slice picture data, and the video frame data are sent to the video networking terminal. In this way, the video networking terminal can directly consult non-audio/video files on the video networking storage gateway without any special modification of the terminal, which greatly improves the speed at which a user consults non-audio/video files, meets the user's usage requirements, and improves the user's experience with video networking equipment.
Drawings
In order to more clearly illustrate the technical solution of the present invention, the drawings needed in the description of the present invention are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention, and those skilled in the art can obtain other drawings based on these drawings without inventive effort.
Fig. 1 is a schematic networking diagram of a video network according to an embodiment of the present invention;
fig. 2 is a schematic hardware structure diagram of a node server according to an embodiment of the present invention;
fig. 3 is a schematic hardware structure diagram of an access switch according to an embodiment of the present invention;
fig. 4 is a schematic hardware structure diagram of an ethernet protocol conversion gateway according to an embodiment of the present invention;
FIG. 5 is a flow chart illustrating steps of a method for data processing according to an embodiment of the present invention;
FIG. 6 is a flow chart of steps in another method of data processing according to an embodiment of the invention;
FIG. 7 is a diagram of a method for data processing according to an embodiment of the present invention;
FIG. 8 is a flowchart illustrating a method of data processing according to an embodiment of the present invention;
fig. 9 is a block diagram of a data processing apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below. It is to be understood that the embodiments described are only a few embodiments of the present invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Video networking is an important milestone in network development. It is a real-time network that can realize real-time transmission of high-definition video and pushes many Internet applications toward high-definition, face-to-face video.
Video networking adopts real-time high-definition video switching technology and can integrate dozens of required services, such as video, voice, pictures, text, communication, and data, on a single network platform. These include high-definition video conferencing, video monitoring, intelligent monitoring analysis, emergency command, digital broadcast television, time-shifted television, network teaching, live broadcast, video on demand (VOD), television mail, personal video recorder (PVR), intranet (self-run) channels, intelligent video broadcast control, and information distribution, delivering high-definition video through a television or a computer.
To better understand the embodiments of the present invention, video networking is described below.
Some of the technologies applied in video networking are as follows:
network technology (network technology)
The network technology innovation of video networking improves on traditional Ethernet to cope with the potentially enormous video traffic on the network. Unlike pure network Packet Switching or network Circuit Switching, video networking technology adopts Packet Switching to meet streaming requirements. Video networking technology has the flexibility, simplicity, and low cost of packet switching, together with the quality and security guarantees of circuit switching, realizing seamless connection of switched virtual circuits and data formats across the whole network.
Switching Technology (Switching Technology)
Video networking adopts the two advantages of Ethernet, asynchronism and packet switching, and eliminates Ethernet's defects on the premise of full compatibility. It offers end-to-end seamless connection across the whole network, communicates directly with user terminals, and directly carries IP data packets. User data does not require any format conversion across the entire network. Video networking is a higher-level form of Ethernet and a real-time switching platform; it can realize whole-network, large-scale, real-time transmission of high-definition video that the existing Internet cannot, and pushes numerous network video applications toward high definition and unification.
Server Technology (Server Technology)
The server technology of the video networking and unified video platforms differs from that of traditional servers. Their streaming media transmission is built on a connection-oriented basis, their data processing capability is independent of traffic and communication time, and a single network layer can carry both signaling and data transmission. For voice and video services, the complexity of streaming media processing on the video networking and unified video platforms is much lower than that of data processing, and the efficiency is more than one hundred times higher than that of a traditional server.
Storage Technology (Storage Technology)
To adapt to media content of very large capacity and very large traffic, the ultra-high-speed storage technology of the unified video platform adopts the most advanced real-time operating system. The program information in a server instruction is mapped to specific hard disk space, and the media content no longer passes through the server but is sent directly and instantly to the user terminal, with a typical user waiting time of less than 0.2 seconds. Optimized sector distribution greatly reduces the mechanical seek movement of the hard disk head; resource consumption is only 20% of that of an IP Internet system of the same grade, yet concurrent traffic three times larger than that of a traditional hard disk array is generated, and overall efficiency is improved by more than 10 times.
Network Security Technology (Network Security Technology)
The structural design of video networking eliminates the network security problems that trouble the Internet, through mechanisms such as independent permission control for each service and complete isolation of equipment and user data. It generally requires no antivirus programs or firewalls, avoids attacks by hackers and viruses, and provides a structurally worry-free and secure network for users.
Service Innovation Technology (Service Innovation Technology)
The unified video platform integrates services and transmission: whether for a single user, a private network user, or an aggregation of networks, only one automatic connection is needed. The user terminal, set-top box, or PC connects directly to the unified video platform to obtain a variety of multimedia video services in various forms. The unified video platform adopts a menu-style configuration table instead of traditional, complex application programming, so complex applications can be realized with very little code, enabling unlimited new service innovation.
Networking of the video network is as follows:
the video network is a centralized control network structure, and the network can be a tree network, a star network, a ring network and the like, but on the basis of the centralized control node, the whole network is controlled by the centralized control node in the network.
As shown in fig. 1, the video network is divided into an access network and a metropolitan network.
The devices of the access network part can be mainly classified into 3 types: node server, access switch, terminal (including various set-top boxes, coding boards, memories, etc.). The node server is connected to an access switch, which may be connected to a plurality of terminals and may be connected to an ethernet network.
The node server is a node which plays a centralized control function in the access network and can control the access switch and the terminal. The node server can be directly connected with the access switch or directly connected with the terminal.
Similarly, devices of the metropolitan network portion may also be classified into 3 types: a metropolitan area server, a node switch and a node server. The metro server is connected to a node switch, which may be connected to a plurality of node servers.
The node server is a node server of the access network part, namely the node server belongs to both the access network part and the metropolitan area network part.
The metropolitan area server is a node which plays a centralized control function in the metropolitan area network and can control a node switch and a node server. The metropolitan area server can be directly connected with the node switch or directly connected with the node server.
Therefore, the whole video network is a network structure with layered centralized control, and the network controlled by the node server and the metropolitan area server can be in various structures such as tree, star and ring.
The access network part can form a unified video platform (the part in the dotted circle), and a plurality of unified video platforms can form a video network; each unified video platform may be interconnected via metropolitan area and wide area video networking.
1. Video networking device classification
1.1 The devices in the video network of the embodiment of the present invention can be mainly classified into 3 types: servers, switches (including Ethernet protocol conversion gateways), and terminals (including various set-top boxes, coding boards, memories, etc.). The video network as a whole can be divided into a metropolitan area network (or national network, global network, etc.) and an access network.
1.2 The devices of the access network part can be mainly classified into 3 types: node servers, access switches (including Ethernet protocol conversion gateways), and terminals (including various set-top boxes, coding boards, memories, etc.).
The specific hardware structure of each access network device is as follows:
a node server:
as shown in fig. 2, the system mainly includes a network interface module 201, a switching engine module 202, a CPU module 203, and a disk array module 204;
the network interface module 201, the CPU module 203, and the disk array module 204 all enter the switching engine module 202; the switching engine module 202 performs an operation of looking up the address table 205 on the incoming packet, thereby obtaining the direction information of the packet; and stores the packet in a queue of the corresponding packet buffer 206 based on the packet's steering information; if the queue of the packet buffer 206 is nearly full, it is discarded; the switching engine module 202 polls all packet buffer queues for forwarding if the following conditions are met: 1) the port send buffer is not full; 2) the queue packet counter is greater than zero. The disk array module 204 mainly implements control over the hard disk, including initialization, read-write, and other operations on the hard disk; the CPU module 203 is mainly responsible for protocol processing with an access switch and a terminal (not shown in the figure), configuring an address table 205 (including a downlink protocol packet address table, an uplink protocol packet address table, and a data packet address table), and configuring the disk array module 204.
The access switch:
as shown in fig. 3, the network interface module mainly includes a network interface module (a downlink network interface module 301 and an uplink network interface module 302), a switching engine module 303 and a CPU module 304;
wherein a packet (uplink data) coming from the downlink network interface module 301 enters the packet detection module 305; the packet detection module 305 detects whether the Destination Address (DA), Source Address (SA), packet type, and packet length of the packet meet the requirements, and if so, allocates a corresponding stream identifier (stream-id) and passes the packet to the switching engine module 303; otherwise, the packet is discarded. A packet (downlink data) coming from the uplink network interface module 302 enters the switching engine module 303; an incoming data packet from the CPU module 304 also enters the switching engine module 303. The switching engine module 303 looks up the address table 306 for the incoming packet, thereby obtaining the direction information of the packet. If the packet entering the switching engine module 303 is going from the downlink network interface to the uplink network interface, the packet is stored in the queue of the corresponding packet buffer 307 in association with the stream-id; if that queue is nearly full, the packet is discarded. If the packet entering the switching engine module 303 is not going from the downlink network interface to the uplink network interface, the data packet is stored in the queue of the corresponding packet buffer 307 according to the packet's direction information; if that queue is nearly full, the packet is discarded.
The switching engine module 303 polls all packet buffer queues, which in this embodiment of the present invention is divided into two cases:
if the queue is from the downlink network interface to the uplink network interface, the following conditions are met for forwarding: 1) the port send buffer is not full; 2) the queued packet counter is greater than zero; 3) obtaining a token generated by a code rate control module;
if the queue is not from the downlink network interface to the uplink network interface, the following conditions are met for forwarding: 1) the port send buffer is not full; 2) the queue packet counter is greater than zero.
The rate control module 308 is configured by the CPU module 304, and generates tokens for packet buffer queues from all downstream network interfaces to upstream network interfaces at programmable intervals to control the rate of upstream forwarding.
The CPU module 304 is mainly responsible for protocol processing with the node server, configuration of the address table 306, and configuration of the code rate control module 308.
Ethernet protocol conversion gateway:
as shown in fig. 4, the apparatus mainly includes a network interface module (a downlink network interface module 401 and an uplink network interface module 402), a switching engine module 403, a CPU module 404, a packet detection module 405, a rate control module 408, an address table 406, a packet buffer 407, a MAC adding module 409, and a MAC deleting module 410.
Wherein, a data packet coming from the downlink network interface module 401 enters the packet detection module 405; the packet detection module 405 detects whether the Ethernet MAC DA, Ethernet MAC SA, Ethernet length or frame type, video networking destination address DA, video networking source address SA, video networking packet type, and packet length of the packet meet the requirements, and if so, allocates a corresponding stream identifier (stream-id); the MAC deletion module 410 then strips the MAC DA, MAC SA, and length or frame type (2 bytes), and the packet enters the corresponding receiving buffer; otherwise, the packet is discarded;
the downlink network interface module 401 detects the sending buffer of the port, and if there is a packet, obtains the Ethernet MAC DA of the corresponding terminal according to the video networking destination address DA of the packet, adds the Ethernet MAC DA of the terminal, the MAC SA of the Ethernet protocol conversion gateway, and the Ethernet length or frame type, and sends the packet.
The other modules in the Ethernet protocol conversion gateway function similarly to those of the access switch.
A terminal:
the system mainly comprises a network interface module, a service processing module and a CPU module; for example, the set-top box mainly comprises a network interface module, a video and audio coding and decoding engine module and a CPU module; the coding board mainly comprises a network interface module, a video and audio coding engine module and a CPU module; the memory mainly comprises a network interface module, a CPU module and a disk array module.
1.3 The devices of the metropolitan area network part can be mainly classified into 2 types: node switches and metropolitan area servers. The node switch mainly comprises a network interface module, a switching engine module, and a CPU module; the metropolitan area server mainly comprises a network interface module, a switching engine module, and a CPU module.
2. Video networking packet definition
2.1 Access network packet definition
The data packet of the access network mainly comprises the following parts: Destination Address (DA), Source Address (SA), reserved bytes, payload (PDU), and CRC, arranged as shown below:
DA | SA | Reserved | Payload | CRC
wherein:
the Destination Address (DA) is composed of 8 bytes (byte), the first byte represents the type of the data packet (such as various protocol packets, multicast data packets, unicast data packets, etc.), there are 256 possibilities at most, the second byte to the sixth byte are metropolitan area network addresses, and the seventh byte and the eighth byte are access network addresses;
the Source Address (SA) is also composed of 8 bytes (byte), defined as the same as the Destination Address (DA);
the reserved byte consists of 2 bytes;
the payload part has different lengths according to the type of datagram: 64 bytes for the various protocol packets, and 32 + 1024 = 1056 bytes for a unicast data packet; of course, the length is not limited to these two cases;
the CRC consists of 4 bytes and is calculated in accordance with the standard ethernet CRC algorithm.
2.2 metropolitan area network packet definition
The topology of a metropolitan area network is a graph, and there may be 2 or even more connections between two devices, i.e., there may be more than 2 connections between a node switch and a node server, or between two node switches. However, the metropolitan area network address of a metropolitan area network device is unique, so to accurately describe the connection relationship between metropolitan area network devices, a parameter is introduced in the embodiment of the present invention: a label, to uniquely describe a metropolitan area network device.
In this specification, the definition of a label is similar to that of a label in MPLS (Multi-Protocol Label Switching). Assuming there are two connections between device A and device B, there are 2 labels for packets from device A to device B, and 2 labels for packets from device B to device A. Labels are classified into incoming labels and outgoing labels; assuming that the label of a packet entering device A (the incoming label) is 0x0000, the label of the packet leaving device A (the outgoing label) may become 0x0001. The network access process of the metropolitan area network is one of centralized control, that is, both address allocation and label allocation of the metropolitan area network are directed by the metropolitan area server, and the node switches and node servers execute passively. This differs from MPLS, where label allocation is the result of mutual negotiation between switch and server.
As shown in the following table, the data packet of the metropolitan area network mainly includes the following parts:
DA | SA | Reserved | Label | Payload | CRC
namely Destination Address (DA), Source Address (SA), Reserved bytes, Label, payload (PDU), and CRC. The format of the label may be defined with reference to the following: the label is 32 bits, with the upper 16 bits reserved and only the lower 16 bits used; its position is between the reserved bytes and the payload of the packet.
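As a reading aid only (the patent text defines no code), the two packet layouts above can be sketched in Python. The field widths follow the description; the function name, the example addresses, and the use of zlib's CRC-32 as the "standard Ethernet CRC algorithm" are assumptions for illustration.

```python
import struct
import zlib  # zlib.crc32 uses the standard Ethernet CRC-32 polynomial

def build_packet(da: bytes, sa: bytes, payload: bytes, label=None) -> bytes:
    """Access-network packet: DA | SA | Reserved | Payload | CRC.
    Metro-network packet (label given): DA | SA | Reserved | Label | Payload | CRC."""
    assert len(da) == 8 and len(sa) == 8        # 8-byte destination and source addresses
    body = da + sa + b"\x00\x00"                # 2 reserved bytes
    if label is not None:
        # 32-bit label: upper 16 bits reserved, only the lower 16 bits are used
        body += struct.pack(">I", label & 0xFFFF)
    body += payload
    crc = struct.pack(">I", zlib.crc32(body) & 0xFFFFFFFF)  # 4-byte CRC
    return body + crc

# Unicast example: first DA byte is the packet type, bytes 2-6 the metro address,
# bytes 7-8 the access-network address (all values here are made up for illustration).
da = bytes([0x02]) + bytes(5) + bytes([0x00, 0x05])
sa = bytes([0x02]) + bytes(5) + bytes([0x00, 0x09])
metro_pkt = build_packet(da, sa, payload=bytes(1056), label=0x0001)
```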
Referring to fig. 5, a flowchart illustrating steps of a data processing method according to an embodiment of the present invention is shown, which may specifically include the following steps:
step 501: receiving a browsing request aiming at a non-audio/video file sent by a video networking terminal;
the terminal of the video network may include a physical terminal such as a set-top box (SetTopBox, STB), which may be called a set-top box or set-top box, and is a device for connecting a television set and an external signal source, and converting a compressed digital signal into television content and displaying the television content on the television set. Generally, the set-top box may be connected to a camera and a microphone for collecting multimedia data such as video data and audio data, and may also be connected to a television for playing multimedia data such as video data and audio data.
And the non-audio and video files can refer to office format files of non-audio and video types. Such as text files, word files, excel files, pdf files, etc.
In practical application, the video networking terminal can provide a video-on-demand channel for non-audio/video files. When a user needs to consult a non-audio/video file directly within the video network, the file can be requested through the video-on-demand function of the video networking terminal, and the video networking storage gateway can then receive the browsing request for the non-audio/video file generated and sent by the user's video networking terminal and extract the non-audio/video file. It should be noted that the browsing request may be transmitted in encrypted or unencrypted form, and the video networking storage gateway may receive it over a wireless or wired medium such as Bluetooth, a wireless cellular network, a wireless local area network, optical fiber, or coaxial cable, which is not limited in this disclosure.
In a specific implementation, the browsing request for the non-audio/video file may include the file name and classification tag of the non-audio/video file, page number information, the address of the video networking terminal, and the like, identifying the browsing object and the requesting terminal, so that the video networking storage gateway can flexibly select the non-audio/video file according to the browsing request and flexibly select the specific content to be consulted within the non-audio/video file.
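A minimal sketch of such a request, with hypothetical field names (the patent only lists the kinds of information carried), might look like this:

```python
from dataclasses import dataclass

@dataclass
class BrowseRequest:
    # Hypothetical field names; only the kinds of information come from the description above.
    file_name: str         # name of the non-audio/video file to consult
    category_tag: str      # classification tag identifying the browsing object
    page_number: int       # page the user wants to start from
    terminal_address: str  # video networking address of the requesting terminal

# Example request for the first page of a Word document (values are illustrative).
req = BrowseRequest("annual_report.docx", "office-document", 1, "metro-0.access-5")
```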
Step 502: generating one or more slice picture data corresponding to the non-audio/video file;
the one or more slice picture data may be data intercepted by the video network storage gateway for the non-audio/video file and adopting a picture type, and the intercepted one or more slice picture data may record substantial content in the non-audio/video file.
In practical application, after receiving a browsing request sent by a video networking terminal, the video networking storage gateway can parse the browsing request and determine the non-audio/video file that the terminal actually needs to consult. The video networking storage gateway can then open the non-audio/video file with the software corresponding to it, and call a screenshot tool to capture the non-audio/video file that the video networking terminal needs to consult, generating one or more slice picture data. The screenshot specifications adopted by the screenshot tool can be the same, that is, the one or more slice picture data it captures can all have the same pixel width and the same pixel height, so that the slice picture data can later be spliced more easily according to the common screenshot specification.
It should be noted that one or more slice picture data may be in a conventional picture data format, for example, the picture data format may be a bitmap format, a jpeg format, or the like. The present invention is not particularly limited in this regard.
Step 503: determining target slice picture data from the one or more slice picture data;
in practical application, when a non-audio and video file is requested through a video network terminal, a user often needs to carefully look up and analyze a specific part of content in the non-audio and video file, and after a video network storage gateway receives a browsing request of the video network terminal for a certain non-audio and video file, target slice picture data containing the specific part of content which the user needs to look up can be selected from one or more slice picture data generated aiming at the non-audio and video file.
In specific implementation, after target slice picture data is selected for one or more slice picture data generated by a non-audio/video file, the remaining one or more slice picture data is not required to be deleted, but can be continuously stored in the video network storage gateway.
Step 504: and generating video frame data according to the target slice picture data, and sending the video frame data to the video network terminal.
The video frame data may be data in a video format that is formed by combining one frame of picture data and can be played at a certain frame rate.
In practical application, the video network storage gateway may synthesize the determined target slice picture data to obtain video frame data that can be played by the video network terminal, and then the video network storage gateway may transmit the video frame data obtained by synthesizing the target slice picture data to the video network terminal that has sent the browsing request for the non-audio/video file through the video network. Therefore, the video networking terminal is used as embedded video equipment, the video frame data can be directly played, and the requirement of a user for looking up the non-audio and video files is met.
As an example, the target slice picture data may be a bitmap, and in the process of generating video frame data, the video network storage gateway may synthesize and convert the target slice picture into a YUV420 frame format, perform encoding and compression on the YUV420 format according to an H264 mode, then push the H264 video frame according to a preset frame rate to form a video stream output, and send the video stream output to the video network terminal that requests the non-audio/video file.
In the embodiment of the invention, a browsing request aiming at a non-audio/video file sent by a video network terminal is received; generating one or more slice picture data corresponding to the non-audio/video file; determining target slice picture data from the one or more slice picture data; according to the target slice picture data, video frame data are generated and sent to the video network terminal, the video network terminal can directly look up non-audio and video files on the video network storage gateway, special transformation aiming at the video network terminal is not needed, the speed of looking up the non-audio and video files by a user is greatly improved, the use requirement of the user is met, and the use experience of the user on video network equipment is improved.
Referring to fig. 6, a flowchart illustrating steps of another data processing method according to an embodiment of the present invention is shown, which may specifically include the following steps:
step 601: receiving a browsing request aiming at a non-audio/video file sent by a video networking terminal;
in practical application, the storage gateway of the video network can receive a browsing request sent by the video network terminal aiming at various non-audio and video files, the storage gateway of the video network can analyze the non-audio and video file objects requested by the video network terminal through the browsing request, and can determine the specific file format adopted by the non-audio and video files.
Step 602: opening the non-audio/video file in a preset virtual window;
in practical application, after receiving a browsing request sent by a video network terminal, a video network storage gateway can call application software corresponding to a specific file format adopted by the non-audio/video file, and then the video network storage gateway can open the non-audio/video file requested by the video network terminal in a virtual window through the application software.
The content of the virtual window is not on the screen; it is generated into a buffer area in memory through an API provided by the operating system on the video networking storage gateway, and the buffer area is used by the application software that opens the various non-audio/video files. From a program's perspective, the virtual window looks like a desktop, but its contents are not shown on the screen; they are "displayed" in a particular memory area.
Step 603: intercepting the picture displayed by the virtual window to obtain screenshot picture data;
in practical application, the video network storage gateway can call a preset screenshot tool, and screenshot is carried out on the non-audio and video file started by the application software in the virtual window until all the essential contents in the non-audio and video file are intercepted. Moreover, the video network storage gateway can obtain the position of the application software for showing the non-audio and video files through a Graphical User Interface (GUI) provided by the operating system.
In a specific implementation, screenshots of the non-audio/video file displayed by the application software can be taken at a preset width and height, and a virtual keyboard "page down" instruction can be issued to turn pages. The screenshots can be buffered and stored so that, when the video is played, they can be spliced according to the screen's pixel height, converted into a video format, and output to the video networking terminal.
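A sketch of this capture loop is shown below; the helpers open_in_virtual_window, capture_window, reached_end, and send_page_down are hypothetical stand-ins for the operating-system APIs the description refers to, not real library calls.

```python
CAPTURE_WIDTH, CAPTURE_HEIGHT = 1280, 720   # preset screenshot width and height

def capture_document(path):
    """Open the file in the off-screen virtual window and screenshot it page by page."""
    window = open_in_virtual_window(path)               # assumed gateway API
    screenshots = []
    while True:
        screenshots.append(capture_window(window, CAPTURE_WIDTH, CAPTURE_HEIGHT))
        if reached_end(window):                         # all substantive content captured
            break
        send_page_down(window)                          # virtual keyboard "page down"
    return screenshots
```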
Step 604: slicing the screen capture picture data to obtain one or more sliced picture data;
in practical applications, after the screenshot picture data is obtained by capturing the picture displayed by the virtual window, the screenshot content may be further sliced according to a specific pixel height, where the specific pixel height may be an integer fraction of the resolution of the video on demand, for example: when the standard definition resolution is 1280 × 720, the pixel width of the video or image is 1280 pixels, the pixel height is fixed to 720 pixels, and the pixel height of the slice can be set to 240 pixels, which is one third of 720. Therefore, after the slice pictures are spliced and converted, the screen of the video network terminal can be covered completely.
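A small sketch of the slicing step, assuming the Pillow imaging library is available on the gateway (an assumption; the patent does not name a library), with the slices numbered as described in the following paragraph:

```python
from PIL import Image  # assumed image library; the patent does not specify one

SLICE_HEIGHT = 240  # one third of the 720-pixel standard-definition height

def slice_screenshot(screenshot: Image.Image, slice_height: int = SLICE_HEIGHT):
    """Cut a full-width screenshot into slices of a fixed pixel height."""
    width, height = screenshot.size
    slices = []
    for top in range(0, height, slice_height):
        # (left, upper, right, lower) box for each horizontal strip
        slices.append(screenshot.crop((0, top, width, min(top + slice_height, height))))
    return slices

# Numbering the slices sequentially, e.g. 0001.jpeg, 0002.jpeg, ...
for i, s in enumerate(slice_screenshot(Image.open("page.png")), start=1):
    s.save(f"{i:04d}.jpeg")
```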
In an embodiment of the present application, the one or more slice picture data have numbering information.
Obtaining the one or more slice picture data may involve sequentially generating one or more slice picture format files and numbering them, so that the video networking storage gateway can select slice pictures according to the ordered numbering, for example a series of JPEG files numbered sequentially from 0001. It should be noted that the specific numbers are merely an example for explanation; the numbers and the numbering method may be adjusted according to the actual situation, and the present invention is not limited in this respect.
Step 605: acquiring screen resolution, and determining the number of slice pictures according to the screen resolution;
in practical application, after obtaining one or more slice picture data, the video network storage gateway can obtain the screen resolution of the video network terminal through a browsing request sent by the video network terminal, and then calculate the maximum number of slices that can be displayed by the video network terminal under the condition of keeping the requested video resolution unchanged according to the screen resolution of the video network terminal and the pixel height of the slices.
It should be noted that the resolution of the video on demand can adopt the two resolution modes conventional in video networking, namely a high-definition mode of 1920 × 1080p and a standard-definition mode of 1280 × 720p. Most video networking terminals in the video network can display both of these conventional resolution modes, so when the resolution of the video on demand is one of the conventional video networking modes, the screen resolutions of the various video networking terminals do not need to be detected, and 1920 × 1080 can be used directly as the screen resolution of the video networking terminal.
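The slice-count calculation reduces to a simple division; the following is a minimal sketch, assuming the 240-pixel slice height used in the earlier example:

```python
def max_slices_on_screen(screen_height_px: int, slice_height_px: int = 240) -> int:
    """Largest number of fixed-height slices that fit on the terminal screen."""
    return screen_height_px // slice_height_px

max_slices_on_screen(720)    # standard definition (1280 x 720)  -> 3
max_slices_on_screen(1080)   # high definition (1920 x 1080)     -> 4
```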
Step 606: determining target slice picture data satisfying the slice picture number from the one or more slice picture data;
in practical application, after the number of slice pictures is determined according to the screen resolution, target slice picture data meeting the number of slice pictures, which is actually required to be referred by the terminal of the video network, can be selected from one or more slice picture data.
In an embodiment of the present application, step 606 may include the following sub-steps:
determining target slice picture data which meets the slice picture number and contains initial slice picture data from the one or more slice picture data;
or, in response to a play control operation, determining target slice picture data satisfying the slice picture number from the one or more slice picture data.
In practical application, when a user sends a browsing request through the video networking terminal to open the non-audio/video file without specifying the pages to be consulted, the video networking storage gateway can select target slice picture data satisfying the slice picture number starting from the beginning of the non-audio/video file, i.e., target slice picture data that includes the initial slice picture data. When the user specifies, through the remote control device of the video networking terminal, the page of the non-audio/video file to be consulted first, or performs play control operations such as fast forward, fast rewind, or page up/down on the opened non-audio/video file, the video networking storage gateway can respond to the play control operation and select the target slice picture data corresponding to that operation.
The following are examples of several of the described play control operations:
when the number of pages to be consulted is appointed through the remote control device, page number detection can be carried out on a non-audio/video file opened by application software, the content meeting the appointed number of pages to be consulted is intercepted and cut, and target slice picture data meeting the number of slice pictures is selected.
When a page-turning play control operation is sent by the remote control device, the next slice pictures are selected, by the determined number of slice pictures, each time the page is turned down. For example, when the determined number of slice pictures is three (i.e., three slice pictures are displayed on the current screen), three more slice pictures are selected according to their numbering information each time the page is turned down.
When a fast-forward play control operation is sent through the remote control device, the currently displayed slice pictures and the slice pictures to be displayed can be spliced into one large picture, and a screen-height picture is then captured from the large picture by sliding downward, where the number of pixels slid downward can be set according to the fast-forward degree: the higher the fast-forward degree, the more pixels the window slides down before a picture of the screen's pixel height is captured.
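One possible reading of these selection rules, sketched with assumed function names (none of which come from the patent), is:

```python
def select_slices(numbered_slices, start, count):
    """Pick `count` consecutive slices beginning at `start` (0 is the initial slice)."""
    return numbered_slices[start:start + count]

def page_down(start, count, total):
    """Advance the window by one full screen of slices, clamped to the last screen."""
    return min(start + count, max(total - count, 0))

def fast_forward(offset_px, step_px, max_offset_px):
    """Slide the screen-height capture window down within the spliced large picture;
    a higher fast-forward degree maps to a larger step_px."""
    return min(offset_px + step_px, max_offset_px)
```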
As shown in fig. 7, in a specific example of the data processing method, in a virtual desktop (virtual window) of the video networking storage gateway, a corresponding Word file (non-audio/video file) may be opened through Word (the application software). The video networking storage gateway takes screenshots of the document content display portion and may automatically turn pages until all content in the document has been captured, and then slices the screenshots, for example into a 0001 slice picture, a 0002 slice picture, a 0003 slice picture, and so on. The video networking terminal can display at most three slice pictures. Then the 0003-0005 slice pictures can be selected for slice splicing and, after format conversion, output as a pushed video stream. When the video networking storage gateway receives a page-down control from the video networking terminal, the 0006-0008 slice pictures can be selected for slice splicing, conversion, and output.
Step 607: and generating video frame data according to the target slice picture data, and sending the video frame data to the video network terminal.
In practical application, the video network storage gateway may synthesize the determined target slice picture data to obtain video frame data that can be played by the video network terminal, and then the video network storage gateway may transmit the video frame data obtained by synthesizing the target slice picture data to the video network terminal that has sent the browsing request for the non-audio/video file through the video network.
In an embodiment of the present application, step 607 may include the following sub-steps:
performing picture splicing on the plurality of target slice picture data to obtain spliced picture data;
and carrying out format conversion on the spliced picture data to obtain video frame data.
After the target slice image data meeting the slice image number is determined, the target slice image data can be synthesized in a splicing mode, so that spliced image data can be obtained. And then, carrying out format conversion on the spliced picture data, and converting the spliced picture data of the picture type into video frame data which is formed by combining one frame of picture data and can be played at a certain frame rate. For example, a target slice picture format in a bitmap and jpeg format may be converted into a frame format in YUV420, and then encoded and compressed in an H264 manner, and the H264 video frame is pushed according to a preset frame rate to form a video stream output.
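A minimal sketch of the splicing and conversion step follows, assuming Pillow for the splicing and an ffmpeg/libx264 command line for the YUV420/H.264 encoding (both assumptions; the patent does not name an encoder implementation):

```python
import subprocess
from PIL import Image  # assumed image library for the splicing step

def stitch(slices):
    """Vertically splice the target slice pictures into one screen-sized picture."""
    width = slices[0].width
    canvas = Image.new("RGB", (width, sum(s.height for s in slices)))
    top = 0
    for s in slices:
        canvas.paste(s, (0, top))
        top += s.height
    return canvas

def encode_h264(frame: Image.Image, out_path: str, fps: int = 3):
    """Convert the spliced RGB picture to YUV420 and compress it with H.264 by piping
    the raw frame to ffmpeg (assumed to be installed on the gateway)."""
    cmd = ["ffmpeg", "-y",
           "-f", "rawvideo", "-pix_fmt", "rgb24",
           "-s", f"{frame.width}x{frame.height}", "-r", str(fps), "-i", "-",
           "-c:v", "libx264", "-pix_fmt", "yuv420p", out_path]
    subprocess.run(cmd, input=frame.tobytes(), check=True)
```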
In another embodiment of the present application, step 607 may further include the following sub-steps:
and circularly sending the video frame data to the video networking terminal until the next playing control operation is detected or the browsing of the non-audio/video file is stopped.
In practical application, after the video frame data is generated, the H264 video frames converted from the picture format can be sent cyclically at a preset frame rate to form a continuous video stream output to the video networking terminal, ensuring that the picture displayed on the terminal remains unchanged. When the video networking terminal sends the next play control operation or stops browsing the non-audio/video file, the user no longer needs to consult the current page of the non-audio/video file, and the video networking storage gateway stops cyclically sending the H264 video frames to the video networking terminal.
It should be noted that although the H264 video frames converted from the picture format are sent cyclically at a preset frame rate, the video frame data is a splice of still pictures, so the picture only needs to change when a page is turned or browsing stops. A very low frame rate can therefore be chosen for the cyclic playback, for example 2-3 frames per second, which effectively reduces the bandwidth required to carry the video stream.
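A sketch of this low-frame-rate resend loop, with the transport call and stop signal passed in as assumed callables:

```python
import time

def push_frames(send, encoded_frame, fps=2.0, should_stop=lambda: False):
    """Resend the same encoded frame at a low frame rate (e.g. 2-3 fps) until the next
    play control operation arrives or browsing stops, as signalled by should_stop."""
    interval = 1.0 / fps
    while not should_stop():
        send(encoded_frame)   # assumed transport call toward the video networking terminal
        time.sleep(interval)
```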
In the embodiment of the invention, a browsing request for a non-audio/video file sent by a video networking terminal is received, the non-audio/video file is opened in a preset virtual window, the picture displayed by the virtual window is captured to obtain screenshot picture data, the screenshot picture data is sliced to obtain one or more slice picture data, the screen resolution is obtained and the number of slice pictures is determined according to the screen resolution, target slice picture data satisfying the slice picture number is determined from the one or more slice picture data, video frame data is generated according to the target slice picture data, and the video frame data is sent to the video networking terminal. In this way, the video networking terminal can directly consult non-audio/video files on the video networking storage gateway and can send play control operations, realizing various consultation operations on non-audio/video files, increasing the flexibility of consulting non-audio/video files through the video networking storage gateway, and improving the user experience.
It should be noted that, for simplicity of description, the method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the illustrated order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments of the present invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.
Referring to fig. 8, a specific flowchart of a data processing method according to an embodiment of the present invention is shown, which may specifically include the following steps:
the video networking V2V interface may be used to interface with video networking terminals.
When the video network terminal sends a browsing request aiming at the non-audio and video files, the video network storage gateway can receive the document browsing request aiming at the non-audio and video files by calling the API interface and process the document browsing request.
The video network storage gateway calls a virtual desktop (virtual window) for screen capture and slicing, and stores the virtual desktop in a file system of the video network storage gateway.
Then the video network storage gateway can perform slice playing selection and picture synthesis conversion, and convert the slices into video streams to be output to the video network terminal through a video network V2V interface.
When the video network storage gateway receives the play control operation of the user through the API interface, the play processing is carried out, namely, the slice play selection is executed again according to the type of the play control operation of the user.
And after the video network storage gateway re-executes slice playing selection, performing picture synthesis format conversion, and outputting the video stream to the video network terminal through a video network V2V interface.
Referring to fig. 9: a block diagram of a data processing apparatus according to an embodiment of the present invention is shown, which may specifically include the following modules:
a browsing request receiving module 901, configured to receive a browsing request for a non-audio/video file sent by a video networking terminal;
a slice picture data generating module 902, configured to generate one or more slice picture data corresponding to the non-audio/video file;
a target slice picture data determining module 903, configured to determine target slice picture data from the one or more slice picture data;
and the video frame data generating module 904 is configured to generate video frame data according to the target slice picture data, and send the video frame data to the video networking terminal.
In an embodiment of the present invention, the slice image data generating module further includes:
the non-audio and video file opening submodule is used for opening the non-audio and video file in a preset virtual window;
the virtual window intercepting submodule is used for intercepting the picture displayed by the virtual window to obtain screenshot picture data;
and the screen capture picture data slicing submodule is used for slicing the screen capture picture data to obtain one or more slice picture data.
In an embodiment of the present invention, the target slice picture data determining module further includes:
the slice picture quantity determining submodule is used for acquiring the screen resolution and determining the quantity of slice pictures according to the screen resolution;
and the target slice picture data determining sub-module is used for determining target slice picture data meeting the slice picture number from the one or more slice picture data.
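For the two sub-modules above, a small sketch of how the slice picture number might be derived from the screen resolution; the slice height and the way the resolution is passed in are illustrative assumptions:

```python
import math

def slice_picture_number(screen_width, screen_height, slice_height=360):
    """Number of slice pictures needed to fill one screen of the given
    resolution; width is unused here because the slices span the full width."""
    return math.ceil(screen_height / slice_height)

# e.g. a 1920x1080 terminal screen with 360 px slices needs 3 slice pictures
assert slice_picture_number(1920, 1080) == 3
```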
In an embodiment of the present invention, the target slice picture data determining sub-module further includes:
a target slice picture data determination unit, configured to determine, from the one or more slice picture data, target slice picture data that satisfies the slice picture number and includes initial slice picture data;
or, in response to a play control operation, determining target slice picture data satisfying the slice picture number from the one or more slice picture data.
In an embodiment of the present invention, the video frame data generating module includes:
the target slice picture data splicing submodule is used for carrying out picture splicing on the plurality of target slice picture data to obtain spliced picture data;
and the spliced picture data conversion submodule is used for carrying out format conversion on the spliced picture data to obtain video frame data.
In an embodiment of the present invention, the video frame data generating module further includes:
and the video frame data circulating and sending submodule is used for circularly sending the video frame data to the video network terminal until the next playing control operation is detected or the browsing of the non-audio and video file is stopped.
In an embodiment of the present invention, the one or more slice picture data have numbering information.
In the embodiment of the invention, a browsing request for a non-audio/video file sent by a video networking terminal is received; one or more slice picture data corresponding to the non-audio/video file are generated; target slice picture data is determined from the one or more slice picture data; and video frame data is generated according to the target slice picture data and sent to the video networking terminal. In this way, the video networking terminal can directly look up non-audio/video files on the video networking storage gateway without any special modification of the video networking terminal, which greatly improves the speed at which a user looks up non-audio/video files, meets the user's requirements, and improves the user's experience of video networking equipment.
An embodiment of the present invention also provides a server, which may include a processor, a memory, and a computer program stored on the memory and capable of running on the processor; when executed by the processor, the computer program implements the data processing method described above.
An embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, implements the above data processing method.
For the device embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, refer to the partial description of the method embodiment.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal to cause a series of operational steps to be performed on the computer or other programmable terminal to produce a computer implemented process such that the instructions which execute on the computer or other programmable terminal provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the embodiments of the invention.
Finally, it should also be noted that, herein, relational terms such as first and second may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises", "comprising", or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising a/an ..." does not exclude the presence of other like elements in the process, method, article, or terminal that comprises the element.
The data processing method and apparatus provided by the present invention are described in detail above. Specific examples are used herein to explain the principle and implementation of the present invention, and the description of the above embodiments is only intended to help understand the method and its core idea. Meanwhile, a person skilled in the art may, according to the idea of the present invention, make changes to the specific embodiments and the application scope. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (10)

1. A method of data processing, the method comprising:
receiving a browsing request aiming at a non-audio/video file sent by a video networking terminal;
generating one or more slice picture data corresponding to the non-audio/video file;
determining target slice picture data from the one or more slice picture data;
and generating video frame data according to the target slice picture data, and sending the video frame data to the video network terminal.
2. The method according to claim 1, wherein the generating one or more slice picture data corresponding to the non-audio-video file comprises:
opening the non-audio/video file in a preset virtual window;
intercepting the picture displayed by the virtual window to obtain screenshot picture data;
and slicing the screenshot picture data to obtain one or more slice picture data.
3. The method of claim 1 or 2, wherein determining target slice picture data from the one or more slice picture data comprises:
acquiring screen resolution, and determining the number of slice pictures according to the screen resolution;
determining target slice picture data satisfying the slice picture number from the one or more slice picture data.
4. The method of claim 3, wherein the determining, from the one or more slice picture data, target slice picture data that satisfies the slice picture number comprises:
determining target slice picture data which meets the slice picture number and contains initial slice picture data from the one or more slice picture data;
or, in response to a play control operation, determining target slice picture data satisfying the slice picture number from the one or more slice picture data.
5. The method according to claim 1, wherein there are a plurality of target slice picture data, and the generating video frame data from the target slice picture data comprises:
performing picture splicing on the plurality of target slice picture data to obtain spliced picture data;
and carrying out format conversion on the spliced picture data to obtain video frame data.
6. The method according to claim 1, wherein the sending the video frame data to the video networking terminal comprises:
and circularly sending the video frame data to the video networking terminal until the next playing control operation is detected or the browsing of the non-audio/video file is stopped.
7. The method of claim 1, wherein the one or more slice picture data has numbering information.
8. An apparatus for data processing, the apparatus comprising:
the browsing request receiving module is used for receiving a browsing request aiming at a non-audio/video file sent by a video networking terminal;
the slice picture data generation module is used for generating one or more slice picture data corresponding to the non-audio/video file;
a target slice picture data determination module, configured to determine target slice picture data from the one or more slice picture data;
and the video frame data generating module is used for generating video frame data according to the target slice picture data and sending the video frame data to the video network terminal.
9. A server comprising a processor, a memory and a computer program stored on the memory and capable of running on the processor, the computer program, when executed by the processor, implementing a method of data processing according to any one of claims 1 to 7.
10. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out a method of data processing according to any one of claims 1 to 7.
CN202011642323.2A 2020-12-31 2020-12-31 Data processing method and device Pending CN112784108A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011642323.2A CN112784108A (en) 2020-12-31 2020-12-31 Data processing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011642323.2A CN112784108A (en) 2020-12-31 2020-12-31 Data processing method and device

Publications (1)

Publication Number Publication Date
CN112784108A true CN112784108A (en) 2021-05-11

Family

ID=75753514

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011642323.2A Pending CN112784108A (en) 2020-12-31 2020-12-31 Data processing method and device

Country Status (1)

Country Link
CN (1) CN112784108A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination