
Service processing method and device

Info

Publication number
CN110769184A
Authority
CN
China
Prior art keywords
service
video
attribute information
target
dimensional code
Prior art date
Legal status
Pending
Application number
CN201810843377.1A
Other languages
Chinese (zh)
Inventor
韩杰
彭庆太
王艳辉
周国强
Current Assignee
Visionvera Information Technology Co Ltd
Original Assignee
Visionvera Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Visionvera Information Technology Co Ltd
Priority to CN201810843377.1A
Publication of CN110769184A
Legal status: Pending

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/14 Systems for two-way working
    • H04N7/141 Systems for two-way working between two video terminals, e.g. videophone
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00 Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10 Methods or arrangements for sensing record carriers, e.g. for reading patterns, by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14 Methods or arrangements for sensing record carriers, e.g. for reading patterns, using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404 Methods for optical code recognition
    • G06K7/1408 Methods for optical code recognition, the method being specifically adapted for the type of code
    • G06K7/1417 2D bar codes
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00 Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/1066 Session management
    • H04L65/1069 Session establishment or de-establishment
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/14 Systems for two-way working
    • H04N7/15 Conference systems

Abstract

The application provides a service processing method and device, applied to a video networking terminal. The method includes: creating a target video service when a first interactive instruction is received; determining service attribute information corresponding to the target video service; and, when a second interactive instruction is received, generating a two-dimensional code corresponding to the service attribute information and publishing it. Publishing the video networking service through a two-dimensional code improves the convenience of the video network.

Description

Service processing method and device
Technical Field
The present application relates to the field of video networking technologies, and in particular, to a method and an apparatus for service processing.
Background
With the development of science and technology, video networking technology has been widely applied in many technical fields. In daily life, work and study, users can use the video network for video communication such as video conferences and video teaching, which brings them a high-fidelity, high-definition video communication experience.
At present, when a video service such as a video call is carried out over the video network, the user is usually required to manually enter the video networking number of the opposite terminal, and the video service can only be carried out after the number entered by the user has been received.
Disclosure of Invention
In view of the above, the present application is proposed to provide a service processing method and apparatus that overcome, or at least partially solve, the above problems, comprising:
a method for processing service is applied to a video network terminal, and comprises the following steps:
when a first interactive instruction is received, creating a target video service;
determining service attribute information corresponding to the target video service;
and when a second interactive instruction is received, generating a two-dimensional code corresponding to the service attribute information, and issuing the two-dimensional code.
Optionally, the step of generating the two-dimensional code corresponding to the service attribute information includes:
determining a target coding mode corresponding to the target video service in a preset database; the preset database stores the corresponding relation between a plurality of video service identifications and coding modes;
and generating a two-dimensional code corresponding to the service attribute information by adopting the target coding mode.
Optionally, the step of generating the two-dimensional code corresponding to the service attribute information by using the target coding method includes:
when the target video service is a video conference service, extracting key information from the service attribute information; wherein the key information comprises a video conference identifier;
and coding the key information by adopting the target coding mode to obtain a two-dimensional code corresponding to the service attribute information.
Optionally, the step of generating the two-dimensional code corresponding to the service attribute information by using the target coding method includes:
when the target video service is a live broadcast service, acquiring a preview image;
and coding the preview image and the service attribute information by adopting the target coding mode to obtain a two-dimensional code corresponding to the service attribute information.
Optionally, the service attribute information includes any one of:
video networking number, video conference identification, and live broadcast identification.
A service processing device is applied to a video network terminal and comprises:
the target video service creating module is used for creating a target video service when receiving the first interactive instruction;
the service attribute information determining module is used for determining service attribute information corresponding to the target video service;
and the two-dimension code issuing module is used for generating a two-dimension code corresponding to the service attribute information and issuing the two-dimension code when receiving a second interactive instruction.
Optionally, the two-dimensional code publishing module includes:
the target coding mode determining submodule is used for determining a target coding mode corresponding to the target video service in a preset database; the preset database stores the corresponding relation between a plurality of video service identifications and coding modes;
and the two-dimension code generation submodule is used for generating the two-dimension code corresponding to the service attribute information by adopting the target coding mode.
Optionally, the two-dimensional code generation sub-module includes:
a key information extracting unit, configured to extract key information from the service attribute information when the target video service is a video conference service; wherein the key information comprises a video conference identifier;
and the first coding unit is used for coding the key information by adopting the target coding mode to obtain a two-dimensional code corresponding to the service attribute information.
Optionally, the two-dimensional code generation sub-module includes:
a preview image acquisition unit, configured to acquire a preview image when the target video service is a live broadcast service;
and the second coding unit is used for coding the preview image and the service attribute information by adopting the target coding mode to obtain a two-dimensional code corresponding to the service attribute information.
Optionally, the service attribute information includes any one of:
video networking number, video conference identification, and live broadcast identification.
The application has the following advantages:
In this application, a target video service is created when a first interactive instruction is received, and the service attribute information corresponding to the target video service is then determined; when a second interactive instruction is received, a two-dimensional code corresponding to the service attribute information is generated and published. The video networking service is thus published through a two-dimensional code, which improves the convenience of the video network.
Drawings
In order to more clearly illustrate the technical solutions of the present application, the drawings needed for the description are briefly introduced below. It is apparent that the drawings described below are only some embodiments of the present application, and a person skilled in the art could obtain other drawings from them without inventive effort.
Fig. 1 is a schematic networking diagram of a video network according to an embodiment of the present invention;
fig. 2 is a schematic hardware structure diagram of a node server according to an embodiment of the present invention;
fig. 3 is a schematic hardware structure diagram of an access switch according to an embodiment of the present invention;
fig. 4 is a schematic hardware structure diagram of an ethernet protocol conversion gateway according to an embodiment of the present invention;
FIG. 5 is a flowchart illustrating steps of a method for processing services according to an embodiment of the present invention;
fig. 6 is a block diagram illustrating a service processing apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, the present application is described in further detail with reference to the accompanying drawings and the detailed description. It is to be understood that the embodiments described are only a few embodiments of the present application and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Video networking is an important milestone in network development. It is a real-time network that can transmit high-definition video in real time, pushing many Internet applications toward high-definition, face-to-face video.
Video networking uses real-time high-definition video switching technology to integrate dozens of required services (video, voice, pictures, text, communication, data and the like) on one system and network platform, including high-definition video conferencing, video surveillance, intelligent monitoring and analysis, emergency command, digital broadcast television, time-shifted television, network teaching, live broadcast, video on demand (VOD), television mail, personal video recorder (PVR), intranet (self-office) channels, intelligent video broadcast control and information distribution, and delivers high-definition-quality video through a television or a computer.
To better understand the embodiments of the present invention, the video network is described below:
some of the technologies applied in the video networking are as follows:
Network Technology
The network technology innovation of video networking improves on traditional Ethernet in order to face the potentially huge video traffic on the network. Unlike pure network packet switching or network circuit switching, video networking uses packet switching to meet streaming requirements. Video networking technology retains the flexibility, simplicity and low cost of packet switching while providing the quality and security guarantees of circuit switching, achieving seamless whole-network switched virtual circuits and seamless connection of the data format.
Switching Technology
Video networking adopts the two advantages of Ethernet, asynchrony and packet switching, and eliminates Ethernet's defects while remaining fully compatible with it. It provides end-to-end seamless connection across the whole network, connects directly to user terminals, and directly carries IP data packets. User data requires no format conversion anywhere in the network. Video networking is a higher-level form of Ethernet and a real-time exchange platform; it enables the whole-network, large-scale, real-time transmission of high-definition video that the existing Internet cannot achieve, and pushes many network video applications toward high definition and unification.
Server Technology
The server technology of the video networking and unified video platform differs from that of traditional servers: its streaming media transmission is built on a connection-oriented basis, its data processing capability is independent of traffic and communication time, and a single network layer can carry both signaling and data transmission. For voice and video services, streaming media processing on the video networking and unified video platform is much simpler than general data processing, and its efficiency is improved by more than a hundred times over a traditional server.
Storage Technology
To handle very large media content and very high traffic, the ultra-high-speed storage technology of the unified video platform uses the most advanced real-time operating system. Program information in the server instruction is mapped to specific hard disk space, and media content no longer passes through the server but is sent directly and instantly to the user terminal, with a typical user waiting time of less than 0.2 seconds. Optimized sector distribution greatly reduces the mechanical seek motion of the hard disk head; resource consumption is only 20% of an IP Internet system of the same scale, yet it produces concurrent throughput three times larger than a traditional hard disk array, improving overall efficiency by more than ten times.
Network Security Technology
The structural design of the video network completely eliminates, at the structural level, the network security problems that trouble the Internet, through measures such as independent permission control for each service and complete isolation of devices and user data. It generally requires no antivirus software or firewall, shields users from hacker and virus attacks, and provides a structurally secure, worry-free network.
Service Innovation Technology
The unified video platform integrates services with transmission: whether for a single user, a private-network user or a network aggregate, the connection is established automatically in a single step. A user terminal, set-top box or PC connects directly to the unified video platform to obtain a wide variety of multimedia video services. The unified video platform uses a menu-style configuration table instead of traditional complex application programming, so complex applications can be realized with very little code, enabling essentially unlimited new service innovation.
Networking of the video network is as follows:
the video network is a centralized control network structure, and the network can be a tree network, a star network, a ring network and the like, but on the basis of the centralized control node, the whole network is controlled by the centralized control node in the network.
As shown in fig. 1, the video network is divided into an access network and a metropolitan network.
The devices of the access network part can be mainly classified into 3 types: node server, access switch, terminal (including various set-top boxes, coding boards, memories, etc.). The node server is connected to an access switch, which may be connected to a plurality of terminals and may be connected to an ethernet network.
The node server is a node which plays a centralized control function in the access network and can control the access switch and the terminal. The node server can be directly connected with the access switch or directly connected with the terminal.
Similarly, devices of the metropolitan network portion may also be classified into 3 types: a metropolitan area server, a node switch and a node server. The metro server is connected to a node switch, which may be connected to a plurality of node servers.
The node server here is the node server of the access network part; that is, the node server belongs to both the access network part and the metropolitan area network part.
The metropolitan area server is a node which plays a centralized control function in the metropolitan area network and can control a node switch and a node server. The metropolitan area server can be directly connected with the node switch or directly connected with the node server.
Therefore, the whole video network is a network structure with layered centralized control, and the network controlled by the node server and the metropolitan area server can be in various structures such as tree, star and ring.
The access network part can form a unified video platform (the part in the dotted circle), and a plurality of unified video platforms can form a video network; each unified video platform may be interconnected via metropolitan area and wide area video networking.
1. Video networking device classification
1.1 Devices in the video network of the embodiment of the present invention can be mainly classified into 3 types: servers, switches (including Ethernet protocol conversion gateways) and terminals (including various set-top boxes, coding boards, memories, etc.). The video network as a whole can be divided into a metropolitan area network (or national network, global network, etc.) and an access network.
1.2 The devices of the access network part can be mainly classified into 3 types: node servers, access switches (including Ethernet protocol conversion gateways) and terminals (including various set-top boxes, coding boards, memories, etc.).
The specific hardware structure of each access network device is as follows:
a node server:
as shown in fig. 2, the node server mainly includes a network interface module 201, a switching engine module 202, a CPU module 203, and a disk array module 204;
The network interface module 201, the CPU module 203 and the disk array module 204 all feed into the switching engine module 202. The switching engine module 202 looks up the address table 205 for each incoming packet to obtain the packet's direction information, and stores the packet in the queue of the corresponding packet buffer 206 according to that direction information; if the queue of the packet buffer 206 is nearly full, the packet is discarded. The switching engine module 202 polls all packet buffer queues and forwards when the following conditions are met: 1) the port send buffer is not full; 2) the queue packet counter is greater than zero. The disk array module 204 mainly controls the hard disks, including initialization, read/write and other operations; the CPU module 203 is mainly responsible for protocol processing with the access switches and terminals (not shown in the figure), for configuring the address table 205 (including a downlink protocol packet address table, an uplink protocol packet address table and a data packet address table), and for configuring the disk array module 204.
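The lookup-then-queue behaviour of the switching engine can be pictured with a short sketch. The table layout, the port index and the drop-on-miss behaviour are assumptions made purely for illustration, not the patent's implementation.

```python
from collections import deque

# Illustrative address table: (packet-type byte, destination address) -> output port index.
address_table = {
    (0x01, b"\x00\x00\x00\x00\x00\x00\x0a\x01"): 3,
}

# One bounded queue per output port, standing in for the packet buffer 206.
packet_buffers = {3: deque(maxlen=1024)}

def switch_packet(packet_type: int, destination: bytes, packet: bytes) -> None:
    """Look up the packet's direction information and queue it, dropping it when the buffer is nearly full."""
    port = address_table.get((packet_type, destination))
    if port is None:
        return                              # no matching entry: drop (an assumption of this sketch)
    queue = packet_buffers[port]
    if len(queue) >= queue.maxlen:          # queue nearly full: discard the packet
        return
    queue.append(packet)
```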
The access switch:
as shown in fig. 3, the network interface module mainly includes a network interface module (a downlink network interface module 301 and an uplink network interface module 302), a switching engine module 303 and a CPU module 304;
wherein, the packet (uplink data) coming from the downlink network interface module 301 enters the packet detection module 305; the packet detection module 305 detects whether the Destination Address (DA), the Source Address (SA), the packet type, and the packet length of the packet meet the requirements, and if so, allocates a corresponding stream identifier (stream-id) and enters the switching engine module 303, otherwise, discards the stream identifier; the packet (downstream data) coming from the upstream network interface module 302 enters the switching engine module 303; the data packet coming from the CPU module 204 enters the switching engine module 303; the switching engine module 303 performs an operation of looking up the address table 306 on the incoming packet, thereby obtaining the direction information of the packet; if the packet entering the switching engine module 303 is from the downstream network interface to the upstream network interface, the packet is stored in the queue of the corresponding packet buffer 307 in association with the stream-id; if the queue of the packet buffer 307 is nearly full, it is discarded; if the packet entering the switching engine module 303 is not from the downlink network interface to the uplink network interface, the data packet is stored in the queue of the corresponding packet buffer 307 according to the guiding information of the packet; if the queue of the packet buffer 307 is nearly full, it is discarded.
The switching engine module 303 polls all packet buffer queues, which in this embodiment of the present invention is divided into two cases:
if the queue is from the downlink network interface to the uplink network interface, the following conditions are met for forwarding: 1) the port send buffer is not full; 2) the queued packet counter is greater than zero; 3) obtaining a token generated by a code rate control module;
if the queue is not from the downlink network interface to the uplink network interface, the following conditions are met for forwarding: 1) the port send buffer is not full; 2) the queue packet counter is greater than zero.
The code rate control module 308 is configured by the CPU module 304 and, at programmable intervals, generates tokens for all packet buffer queues going from downlink network interfaces to uplink network interfaces, in order to control the rate of uplink forwarding.
The CPU module 304 is mainly responsible for protocol processing with the node server, configuration of the address table 306, and configuration of the code rate control module 308.
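The polling rules above can be summarised in a small model. This is only a sketch under assumed names (PortQueue, tokens, transmit); it is not the switch firmware, and the queue and token bookkeeping are simplifications.

```python
from collections import deque
from dataclasses import dataclass, field

@dataclass
class PortQueue:
    """Illustrative model of one packet-buffer queue in the access switch."""
    upstream: bool                      # True for queues going from a downlink to an uplink interface
    packets: deque = field(default_factory=deque)
    send_buffer_free: bool = True       # condition 1: the port send buffer is not full
    tokens: int = 0                     # tokens granted by the code rate control module (upstream only)

def may_forward(q: PortQueue) -> bool:
    """Apply the forwarding conditions described for the switching engine."""
    if not q.send_buffer_free:          # condition 1
        return False
    if len(q.packets) == 0:             # condition 2: queue packet counter greater than zero
        return False
    if q.upstream and q.tokens <= 0:    # condition 3, upstream queues only: a token is required
        return False
    return True

def transmit(packet) -> None:
    pass                                # actual port transmission is out of scope for this sketch

def poll(queues: list[PortQueue]) -> None:
    """One polling pass over all packet-buffer queues."""
    for q in queues:
        if may_forward(q):
            packet = q.packets.popleft()
            if q.upstream:
                q.tokens -= 1           # consume the token generated by the rate control module
            transmit(packet)
```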
Ethernet protocol conversion gateway
As shown in fig. 4, the apparatus mainly includes a network interface module (a downlink network interface module 401 and an uplink network interface module 402), a switching engine module 403, a CPU module 404, a packet detection module 405, a rate control module 408, an address table 406, a packet buffer 407, a MAC adding module 409, and a MAC deleting module 410.
A data packet arriving from the downlink network interface module 401 enters the packet detection module 405; the packet detection module 405 checks whether the Ethernet MAC DA, Ethernet MAC SA, Ethernet length or frame type, video networking destination address DA, video networking source address SA, video networking packet type and packet length of the packet meet the requirements, and if so assigns a corresponding stream identifier (stream-id); the MAC deletion module 410 then strips the MAC DA, MAC SA and length or frame type (2 bytes), and the packet enters the corresponding receive buffer; otherwise the packet is discarded;
The downlink network interface module 401 checks the send buffer of its port and, if a packet is present, obtains the Ethernet MAC DA of the corresponding terminal according to the video networking destination address DA of the packet, adds the terminal's Ethernet MAC DA, the MAC SA of the Ethernet protocol conversion gateway, and the Ethernet length or frame type, and then sends the packet.
The other modules in the Ethernet protocol conversion gateway function similarly to those of the access switch.
A terminal:
the system mainly comprises a network interface module, a service processing module and a CPU module; for example, the set-top box mainly comprises a network interface module, a video and audio coding and decoding engine module and a CPU module; the coding board mainly comprises a network interface module, a video and audio coding engine module and a CPU module; the memory mainly comprises a network interface module, a CPU module and a disk array module.
1.3 Devices of the metropolitan area network part can be mainly classified into 3 types: node servers, node switches and metropolitan area servers. A node switch mainly comprises a network interface module, a switching engine module and a CPU module; a metropolitan area server mainly comprises a network interface module, a switching engine module and a CPU module.
2. Video networking packet definition
2.1 Access network packet definition
The data packet of the access network mainly comprises the following parts: destination Address (DA), Source Address (SA), reserved bytes, payload (pdu), CRC.
As shown in the following table, the data packet of the access network mainly includes the following parts:
DA SA Reserved Payload CRC
wherein:
the Destination Address (DA) is composed of 8 bytes (byte), the first byte represents the type of the data packet (such as various protocol packets, multicast data packets, unicast data packets, etc.), there are 256 possibilities at most, the second byte to the sixth byte are metropolitan area network addresses, and the seventh byte and the eighth byte are access network addresses;
the Source Address (SA) is also composed of 8 bytes (byte), defined as the same as the Destination Address (DA);
the reserved byte consists of 2 bytes;
the payload has different lengths depending on the type of datagram: it is 64 bytes for the various protocol packets and 32+1024 = 1056 bytes for unicast data packets; of course, the length is not limited to these two cases;
the CRC consists of 4 bytes and is calculated in accordance with the standard ethernet CRC algorithm.
2.2 metropolitan area network packet definition
The topology of a metropolitan area network is a graph, and there may be two or even more connections between two devices; that is, there can be more than two connections between a node switch and a node server, or between two node switches. However, the metropolitan area network address of a metropolitan area network device is unique, so in order to accurately describe the connection relationship between metropolitan area network devices, the embodiment of the present invention introduces a parameter: a label, used to uniquely describe a metropolitan area network device.
In this specification, the definition of the label is similar to that of an MPLS (Multi-Protocol Label Switching) label. Assuming that there are two connections between device A and device B, a packet going from device A to device B has 2 possible labels, and a packet going from device B to device A likewise has 2 possible labels. Labels are divided into incoming labels and outgoing labels: assuming that the label of a packet entering device A (its incoming label) is 0x0000, the label of the packet when it leaves device A (its outgoing label) may become 0x0001. The network access process of the metropolitan area network is carried out under centralized control; that is, both address allocation and label allocation in the metropolitan area network are directed by the metropolitan area server, and the node switches and node servers passively execute them. This differs from MPLS label allocation, which results from mutual negotiation between switches and servers.
As shown in the following table, the data packet of the metro network mainly includes the following parts:
DA SA Reserved Label Payload CRC
Namely Destination Address (DA), Source Address (SA), Reserved bytes (Reserved), Label, Payload (PDU) and CRC. The format of the label may be defined as follows: the label is 32 bits, with the upper 16 bits reserved and only the lower 16 bits used, and it is located between the reserved bytes and the payload of the packet.
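Continuing the packet sketch above, the label can be modelled as a 4-byte field inserted between the reserved bytes and the payload, with only the lower 16 bits carrying the label value; the helper names and the byte offsets derived from the earlier sketch are assumptions.

```python
import struct

def add_label(access_frame_body: bytes, out_label: int) -> bytes:
    """Insert the 32-bit label between the reserved bytes and the payload.

    The upper 16 bits stay reserved (zero) and only the lower 16 bits are used, as described above.
    `access_frame_body` is DA + SA + Reserved + Payload, without the trailing CRC.
    """
    header, payload = access_frame_body[:18], access_frame_body[18:]
    label_field = struct.pack(">I", out_label & 0xFFFF)   # upper 16 bits reserved
    return header + label_field + payload

def swap_label(metro_frame_body: bytes, out_label: int) -> bytes:
    """Replace the incoming label with the outgoing label when a node switch forwards the packet."""
    return metro_frame_body[:18] + struct.pack(">I", out_label & 0xFFFF) + metro_frame_body[22:]
```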
Referring to fig. 5, a flowchart illustrating steps of a method for service processing according to an embodiment of the present application is shown, and applied to a terminal of a video network.
The video networking terminal may be a physical terminal such as a set-top box (STB), a device that connects a television set to an external signal source and converts a compressed digital signal into television content displayed on the television set. Generally, the set-top box may be connected to a camera and a microphone to collect multimedia data such as video data and audio data, and may also be connected to a television to play such multimedia data.
The video networking terminal may also be a virtual video networking terminal, i.e., dedicated software through which the terminal accesses the video network to provide a particular service. The software creates an environment between the terminal and the end user, the end user operates the software within that environment, and the virtual video networking terminal runs its programs on the terminal much like a real machine.
Specifically, the method can comprise the following steps:
step 501, when a first interactive instruction is received, a target video service is created;
in a specific implementation, a user may interact with the video networking terminal through a remote controller or a touch screen, and when receiving a first interaction instruction, the video networking terminal may create a target video service, such as a live broadcast service, a video conference service, a video call service, and the like.
Step 502, determining service attribute information corresponding to the target video service;
as an example, the service attribute information may include any one of:
video networking number, video conference identification, and live broadcast identification.
After the target video service is created, the service attribute information corresponding to the target video service can be determined, for example, after a video conference is created, a video conference identifier (conference room number) corresponding to the video conference can be obtained, and for example, after a video live broadcast is created, a live broadcast identifier (live broadcast room number) corresponding to the live broadcast can be obtained.
Step 503, when receiving the second interactive instruction, generating a two-dimensional code corresponding to the service attribute information, and issuing the two-dimensional code.
In practical applications, the user may choose to publish information in the form of a two-dimensional code. When the second interactive instruction is received, the two-dimensional code corresponding to the service attribute information can then be generated and published, for example by displaying it on a display screen or by sending it to other terminals through the video network.
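A compact sketch of how steps 501 to 503 might look on the terminal side. The qrcode package is a real third-party library used here only for illustration; the class name, method names and attribute string format are assumptions, not the patent's implementation.

```python
import qrcode  # third-party package: pip install qrcode[pil]

class VideoNetworkingTerminal:
    """Illustrative terminal-side flow for steps 501-503; all internals here are assumptions."""

    def on_first_instruction(self, service_type: str) -> None:
        # Step 501: create the target video service (e.g. "conference", "live", "call").
        self.service = self.create_service(service_type)
        # Step 502: determine the service attribute information for that service.
        self.attributes = self.get_attributes(self.service)

    def on_second_instruction(self) -> None:
        # Step 503: generate the two-dimensional code for the attributes and publish it.
        image = qrcode.make(self.attributes)          # encode the attribute string
        self.publish(image)                           # e.g. show on screen or send to other terminals

    # --- placeholders standing in for terminal internals ---
    def create_service(self, service_type: str) -> str:
        return service_type

    def get_attributes(self, service: str) -> str:
        # e.g. a video networking number, video conference identifier, or live broadcast identifier
        return f"{service}:123456"

    def publish(self, image) -> None:
        image.save("service_qrcode.png")
```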
In practical applications, the video network is a network with a centralized control function, comprising a main control server and lower-level network devices, the lower-level network devices including terminals. One of the core ideas of the video network is that the main control server notifies the switching devices to configure a table for the downlink communication link of the current service, and data packets are then transmitted according to the configured table.
Namely, the communication method in the video network includes:
the main control server configures a downlink communication link of the current service;
and transmitting the data packet of the current service sent by the source terminal to the target terminal according to the downlink communication link.
In the embodiment of the present invention, configuring the downlink communication link of the current service includes: informing the switching equipment related to the downlink communication link of the current service to allocate a table;
further, transmitting according to the downlink communication link includes: the configured table is consulted, and the switching equipment transmits the received data packet through the corresponding port.
In particular implementations, the services include unicast communication services and multicast communication services. That is, whether for multicast or unicast communication, the core configure-table concept can be used to realize communication in the video network.
As described above, the video network includes an access network portion, in which the master server is a node server, and the lower-level network device includes an access switch and a terminal.
For the unicast communication service in the access network, the step of configuring the downlink communication link of the current service by the master server may include the following steps:
in the substep S11, the main control server obtains downlink communication link information of the current service according to the service request protocol packet initiated by the source terminal, wherein the downlink communication link information includes downlink communication port information of the main control server and the access switch participating in the current service;
in sub-step S12, the main control server sets, in its internal packet address table, the downlink port to which packets of the current service are directed, according to the downlink communication port information of the main control server, and sends a port configuration command to the corresponding access switch according to the downlink communication port information of that access switch;
in sub-step S13, the access switch sets the downstream port to which the packet of the current service is directed in its internal packet address table according to the port configuration command.
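Sub-steps S11 to S13 boil down to writing the downlink port for the current service into the node server's packet address table and into the table of each participating access switch. The sketch below models that with plain dictionaries; every structure and name in it is an assumption made for illustration.

```python
def configure_unicast_link(node_server_table: dict, access_switch_tables: dict,
                           service_id: str, link_info: dict) -> None:
    """Write the downlink ports for the current service into the relevant address tables."""
    # Sub-step S12 (node server side): set the downlink port for the service's packets.
    node_server_table[service_id] = link_info["node_server_port"]
    # Sub-steps S12/S13 (access switch side): each switch applies its port configuration command.
    for switch_id, port in link_info["access_switch_ports"].items():
        access_switch_tables[switch_id][service_id] = port
```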
For a multicast communication service (e.g., video conference) in the access network, the step of the master server obtaining downlink information of the current service may include the following sub-steps:
in the substep S21, the main control server obtains a service request protocol packet initiated by the target terminal and applying for the multicast communication service, wherein the service request protocol packet includes service type information, service content information and an access network address of the target terminal; wherein, the service content information comprises a service number;
substep S22, the master control server extracts the access network address of the source terminal in the preset content-address mapping table according to the service number;
in the substep of S23, the main control server obtains the multicast address corresponding to the source terminal and distributes the multicast address to the target terminal; and acquiring the communication link information of the current multicast service according to the service type information and the access network addresses of the source terminal and the target terminal.
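For the multicast case, sub-steps S21 to S23 amount to a lookup in the content-address mapping table followed by multicast address assignment. The sketch below uses assumed table contents, address formats and names purely for illustration.

```python
# Illustrative model of sub-steps S21-S23; table contents and names are assumptions.
content_address_map = {
    # service number -> access network address of the source terminal
    "conference-0001": "0x0A01",
}
multicast_addresses = {
    # source terminal address -> multicast address allocated for its stream
    "0x0A01": "0xE001",
}

def handle_multicast_request(service_number: str, target_terminal_addr: str) -> dict:
    """Resolve the source terminal and multicast address for a multicast service request."""
    source_addr = content_address_map[service_number]            # sub-step S22
    multicast_addr = multicast_addresses[source_addr]            # sub-step S23
    return {
        "source": source_addr,
        "target": target_terminal_addr,
        "multicast_address": multicast_addr,                     # distributed to the target terminal
    }
```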
In an embodiment of the present application, step 503 may include the following sub-steps:
determining a target coding mode corresponding to the target video service in a preset database; and generating a two-dimensional code corresponding to the service attribute information by adopting the target coding mode.
The preset database can store the corresponding relation between a plurality of video service identifications and coding modes.
In a specific implementation, a preset database may be established in advance, and after the target video service is determined, the identifier of the target video service may be obtained, and matching may be performed in the preset database to determine a target coding mode corresponding to the target video service.
After the coding mode is determined, a target coding mode can be adopted to code the service attribute information to obtain the two-dimensional code corresponding to the service attribute information.
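One way to read the "preset database" and "target coding mode" above is as a mapping from video service identifiers to two-dimensional-code parameters. The sketch below assumes the coding mode consists of a QR error-correction level and module size, which the text does not actually specify, and uses the third-party qrcode package.

```python
import qrcode
from qrcode.constants import ERROR_CORRECT_L, ERROR_CORRECT_H

# Preset database: video service identifier -> coding mode (contents are an assumption).
PRESET_DATABASE = {
    "video_conference": {"error_correction": ERROR_CORRECT_H, "box_size": 8},
    "live_broadcast":   {"error_correction": ERROR_CORRECT_L, "box_size": 10},
}

def generate_service_qrcode(service_id: str, attribute_info: str):
    """Look up the target coding mode for the service and encode the attribute information."""
    mode = PRESET_DATABASE[service_id]                       # match in the preset database
    qr = qrcode.QRCode(error_correction=mode["error_correction"],
                       box_size=mode["box_size"])
    qr.add_data(attribute_info)
    qr.make(fit=True)
    return qr.make_image()
```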
In an embodiment of the present application, the step of generating the two-dimensional code corresponding to the service attribute information by using the target coding method may include the following sub-steps:
when the target video service is a video conference service, extracting key information from the service attribute information; and coding the key information by adopting the target coding mode to obtain a two-dimensional code corresponding to the service attribute information.
The key information may include, among other things, a video conference identification.
For the video conference service, the video network terminal can screen out key information such as a video conference identifier from the service attribute information, and then can encode the key information by adopting a target encoding mode to obtain a two-dimensional code corresponding to the service attribute information.
In another embodiment of the present application, the step of generating the two-dimensional code corresponding to the service attribute information by using the target coding method may include the following sub-steps:
when the target video service is a live broadcast service, acquiring a preview image; and coding the preview image and the service attribute information by adopting the target coding mode to obtain a two-dimensional code corresponding to the service attribute information.
For the live broadcast service, the video networking terminal may obtain a preview image, for example an image captured from the current live broadcast picture or an image set by the broadcaster (anchor), and may then encode the preview image and the service attribute information using the target coding mode to obtain the two-dimensional code corresponding to the service attribute information.
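The two embodiments above (video conference and live broadcast) can be pictured as one dispatch on the service type. The payload format, the dictionary keys, and in particular the choice to encode a reference to the preview image rather than the raw image (raw image data would generally exceed two-dimensional-code capacity) are assumptions of this sketch, not details fixed by the text.

```python
import qrcode

def attribute_qrcode(service_type: str, attributes: dict):
    """Encode service attribute information, specialised per service type (illustrative only)."""
    if service_type == "video_conference":
        # keep only the key information, which includes the video conference identifier
        payload = f"conference_id={attributes['conference_id']}"
    elif service_type == "live_broadcast":
        # a raw preview image would not fit in a QR code, so a reference to it is encoded instead
        payload = f"live_id={attributes['live_id']};preview={attributes['preview_image_url']}"
    else:
        payload = str(attributes)
    return qrcode.make(payload)   # stands in for "encoding with the target coding mode"
```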
In this application, a target video service is created when a first interactive instruction is received, and the service attribute information corresponding to the target video service is then determined; when a second interactive instruction is received, a two-dimensional code corresponding to the service attribute information is generated and published. The video networking service is thus published through a two-dimensional code, which improves the convenience of the video network.
It should be noted that, for simplicity of description, the method embodiments are described as a series of action combinations, but those skilled in the art should understand that the embodiments are not limited by the order of actions described, since some steps may be performed in other orders or concurrently. Further, those skilled in the art should also understand that the embodiments described in the specification are preferred embodiments, and the actions involved are not necessarily required by the embodiments of the application.
Referring to fig. 6, a block diagram of a service processing apparatus according to an embodiment of the present application is shown, and is applied to a video network terminal, where the service processing apparatus specifically includes the following modules:
a target video service creation module 601, configured to create a target video service when receiving a first interaction instruction;
a service attribute information determining module 602, configured to determine service attribute information corresponding to the target video service;
and a two-dimension code issuing module 603, configured to generate a two-dimension code corresponding to the service attribute information when a second interactive instruction is received, and issue the two-dimension code.
In an embodiment of the present application, the two-dimensional code publishing module 603 includes:
the target coding mode determining submodule is used for determining a target coding mode corresponding to the target video service in a preset database; the preset database stores the corresponding relation between a plurality of video service identifications and coding modes;
and the two-dimension code generation submodule is used for generating the two-dimension code corresponding to the service attribute information by adopting the target coding mode.
In an embodiment of the present application, the two-dimensional code generation submodule includes:
a key information extracting unit, configured to extract key information from the service attribute information when the target video service is a video conference service; wherein the key information comprises a video conference identifier;
and the first coding unit is used for coding the key information by adopting the target coding mode to obtain a two-dimensional code corresponding to the service attribute information.
In an embodiment of the present application, the two-dimensional code generation submodule includes:
a preview image acquisition unit, configured to acquire a preview image when the target video service is a live broadcast service;
and the second coding unit is used for coding the preview image and the service attribute information by adopting the target coding mode to obtain a two-dimensional code corresponding to the service attribute information.
In an embodiment of the present application, the service attribute information includes any one of:
video networking number, video conference identification, and live broadcast identification.
In this application, a target video service is created when a first interactive instruction is received, and the service attribute information corresponding to the target video service is then determined; when a second interactive instruction is received, a two-dimensional code corresponding to the service attribute information is generated and published. The video networking service is thus published through a two-dimensional code, which improves the convenience of the video network.
For the device embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, refer to the partial description of the method embodiment.
An embodiment of the present application also provides an electronic device, which may include a processor, a memory, and a computer program stored on the memory and capable of running on the processor, and when the computer program is executed by the processor, the steps of the method for processing the business as above are implemented.
An embodiment of the present application further provides a computer-readable storage medium, on which a computer program is stored, and the computer program, when executed by a processor, implements the steps of the above method for business processing.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
As will be appreciated by one of skill in the art, embodiments of the present application may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal to cause a series of operational steps to be performed on the computer or other programmable terminal to produce a computer implemented process such that the instructions which execute on the computer or other programmable terminal provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present application have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including the preferred embodiment and all such alterations and modifications as fall within the true scope of the embodiments of the application.
Finally, it should also be noted that, herein, relational terms such as first and second are used only to distinguish one entity or action from another, and do not necessarily require or imply any actual relationship or order between such entities or actions. Moreover, the terms "comprises", "comprising" and any other variants thereof are intended to cover a non-exclusive inclusion, so that a process, method, article or terminal that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article or terminal. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article or terminal that comprises that element.
The service processing method and apparatus provided by the present application have been described in detail above. Specific examples are used herein to explain the principle and implementation of the application, and the description of the above embodiments is only intended to help understand the method and its core idea. Meanwhile, a person skilled in the art may, following the idea of the present application, make changes to the specific embodiments and the application scope. In summary, the content of this specification should not be construed as limiting the present application.

Claims (10)

1. A method for processing service is applied to a video network terminal, and comprises the following steps:
when a first interactive instruction is received, creating a target video service;
determining service attribute information corresponding to the target video service;
and when a second interactive instruction is received, generating a two-dimensional code corresponding to the service attribute information, and issuing the two-dimensional code.
2. The method of claim 1, wherein the step of generating the two-dimensional code corresponding to the service attribute information comprises:
determining a target coding mode corresponding to the target video service in a preset database; the preset database stores the corresponding relation between a plurality of video service identifications and coding modes;
and generating a two-dimensional code corresponding to the service attribute information by adopting the target coding mode.
3. The method according to claim 2, wherein the step of generating the two-dimensional code corresponding to the service attribute information by using the target coding scheme includes:
when the target video service is a video conference service, extracting key information from the service attribute information; wherein the key information comprises a video conference identifier;
and coding the key information by adopting the target coding mode to obtain a two-dimensional code corresponding to the service attribute information.
4. The method according to claim 2, wherein the step of generating the two-dimensional code corresponding to the service attribute information by using the target coding scheme includes:
when the target video service is a live broadcast service, acquiring a preview image;
and coding the preview image and the service attribute information by adopting the target coding mode to obtain a two-dimensional code corresponding to the service attribute information.
5. The method according to claim 1, wherein the service attribute information comprises any one of:
video networking number, video conference identification, and live broadcast identification.
6. A service processing device is applied to a video network terminal, and comprises:
the target video service creating module is used for creating a target video service when receiving the first interactive instruction;
the service attribute information determining module is used for determining service attribute information corresponding to the target video service;
and the two-dimension code issuing module is used for generating a two-dimension code corresponding to the service attribute information and issuing the two-dimension code when receiving a second interactive instruction.
7. The apparatus of claim 6, wherein the two-dimensional code publishing module comprises:
the target coding mode determining submodule is used for determining a target coding mode corresponding to the target video service in a preset database; the preset database stores the corresponding relation between a plurality of video service identifications and coding modes;
and the two-dimension code generation submodule is used for generating the two-dimension code corresponding to the service attribute information by adopting the target coding mode.
8. The apparatus of claim 7, wherein the two-dimensional code generation submodule comprises:
a key information extracting unit, configured to extract key information from the service attribute information when the target video service is a video conference service; wherein the key information comprises a video conference identifier;
and the first coding unit is used for coding the key information by adopting the target coding mode to obtain a two-dimensional code corresponding to the service attribute information.
9. The apparatus of claim 7, wherein the two-dimensional code generation submodule comprises:
a preview image acquisition unit, configured to acquire a preview image when the target video service is a live broadcast service;
and the second coding unit is used for coding the preview image and the service attribute information by adopting the target coding mode to obtain a two-dimensional code corresponding to the service attribute information.
10. The apparatus according to claim 6, wherein the service attribute information comprises any one of:
video networking number, video conference identification, and live broadcast identification.
CN201810843377.1A (priority date 2018-07-27, filing date 2018-07-27), Service processing method and device, status: Pending, published as CN110769184A (en)

Priority Applications (1)

Application Number: CN201810843377.1A (CN110769184A); Priority Date: 2018-07-27; Filing Date: 2018-07-27; Title: Service processing method and device

Publications (1)

Publication Number: CN110769184A; Publication Date: 2020-02-07

Family

Family ID: 69327821

Family Applications (1)

Application Number: CN201810843377.1A (CN110769184A, Pending); Priority Date: 2018-07-27; Filing Date: 2018-07-27; Title: Service processing method and device

Country Status (1)

Country: CN; Publication: CN110769184A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102630004A (en) * 2012-04-12 2012-08-08 Huawei Technologies Co., Ltd. Authentication method for video conference and related device
CN104917764A (en) * 2015-06-09 2015-09-16 Shenzhen Skyworth-RGB Electronic Co., Ltd. Multimedia service pushing method and system based on two-dimensional code
CN107690085A (en) * 2017-04-25 2018-02-13 Tencent Technology (Shenzhen) Co., Ltd. Data sharing method and device
CN107370610A (en) * 2017-08-30 2017-11-21 Baidu Online Network Technology (Beijing) Co., Ltd. Meeting synchronization method and device
CN107734287A (en) * 2017-09-27 2018-02-23 Suzhou Chengye Network Technology Co., Ltd. Method and system for automatically creating a video conference

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200207