CN109672615B - Data message caching method and device

Data message caching method and device

Info

Publication number
CN109672615B
Authority
CN
China
Prior art keywords
cache
function module
message
data
data message
Prior art date
Legal status
Active
Application number
CN201710963684.9A
Other languages
Chinese (zh)
Other versions
CN109672615A (en)
Inventor
李汉成
周汉
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN201710963684.9A
Publication of CN109672615A
Application granted
Publication of CN109672615B


Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L45/00 - Routing or path finding of packets in data switching networks
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 - Network arrangements or protocols for supporting network services or applications
    • H04L67/50 - Network services
    • H04L67/56 - Provisioning of proxy services
    • H04L67/568 - Storing data temporarily at an intermediate stage, e.g. caching

Abstract

The application provides a data message caching method and device. The method includes the following steps: the UP receives a flow table entry for caching messages sent by the CP; the UP receives data messages and sends the data messages that match the flow table entry for caching messages to a cache function module, which stores them; the cache function module is a module internal to the switch or external to the switch. In this way, user data messages can be cached at the UP, avoiding the load and delay caused by caching messages at the CP.

Description

Data message caching method and device
Technical Field
The present application relates to the field of communications technologies, and in particular, to a data packet caching method and apparatus.
Background
The message caching function is a telecommunication service application: when a user is offline, the switch caches the user's data messages, and when the user comes online it sends the cached user data messages to the user. In a conventional switch, the control function and the message forwarding function are integrated in one device. Under the network architecture of the 5th Generation mobile communication technology (5G), in which the Control Plane (CP) is separated from the User Plane (UP), the switch is split into a CP and a UP that are deployed separately and connected through a standard communication interface. The CP is responsible for signaling message processing and for issuing flow table entries for user data messages to the UP. A flow table entry consists of a header field, a counter and an operation: the header field is a ten-tuple that identifies the flow table entry, the counter records statistics for the flow table entry, and the operation specifies what is to be done with user data messages that match the flow table entry. The UP receives user data messages and processes them (for example, modifies or forwards them) according to the flow table entries issued by the CP.
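For illustration only, the following Python sketch shows one way such a flow table entry (header fields, counter, operation) and the UP's lookup could be modelled; the field names and the dictionary-based matching scheme are assumptions made for readability and are not defined by this application.

```python
# Illustrative sketch only: field names and the matching scheme are assumptions.
from dataclasses import dataclass

@dataclass
class FlowEntry:
    """A flow table entry: header (match) fields, counters and an operation."""
    match: dict        # header fields to match, e.g. {"dst_ip": "10.0.0.2", ...}
    operation: str     # e.g. "forward:3", "modify", "cache"
    packets: int = 0   # counter: number of matched messages
    bytes: int = 0     # counter: number of matched bytes

def lookup(flow_table, headers):
    """Return the first entry whose match fields all appear in the message headers."""
    for entry in flow_table:
        if all(headers.get(k) == v for k, v in entry.match.items()):
            entry.packets += 1
            entry.bytes += headers.get("length", 0)
            return entry
    return None
```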
Under the CP-UP separation architecture, common switches include the OpenFlow switch and the Programming Protocol-independent Packet Processors (P4) switch. Acting as the UP, an OpenFlow switch only matches received user data messages against the flow table entries issued by the CP and modifies or forwards the matched messages according to the operations defined in those entries; it has no storage function when processing user data messages. P4 defines registers, but their number and function are not intended for bulk storage such as caching user data messages, so message storage cannot be accomplished there either. In the prior art, to implement data message caching, the UP sends user data messages to the controller of the CP, and the controller stores them.
However, since the CP and the UP are deployed separately and one CP manages a large number of UPs, caching data messages at the CP imposes a heavy load, and the UP has to send user data messages to the CP through a standard communication interface, which results in a large delay.
Disclosure of Invention
The application provides a data message caching method and device in which a module with a storage function is added on the user plane, so that user data messages are cached on the user plane and the load and delay caused by caching messages on the control plane are avoided.
In a first aspect, the present application provides a data message caching method, including: the user plane UP receives a flow table entry for caching messages sent by the control plane CP; the UP receives data messages and sends the data messages that match the flow table entry for caching messages to a cache function module, which stores them; the cache function module is a module internal to the switch or external to the switch.
According to the data message caching method provided by the first aspect, a cache function module is added to the UP; the CP sends the flow table entry for caching messages to the UP; the UP receives data messages and sends those that match the flow table entry to the cache function module; and the cache function module stores the data messages according to the user information in the messages and the order in which they are received. The forwarding capability of the UP is thus used to forward messages to the cache function module, and the cache function module stores the data messages in the manner agreed with the CP, so user data messages can be cached at the UP and the load and delay caused by caching messages at the CP are avoided.
In one possible design, the method further includes: the UP receives a flow table entry for forwarding messages sent by the CP; the UP receives cached data messages sent by the cache function module, the cached data messages being sent by the cache function module, in storage order and according to a message sending instruction, when it receives the message sending instruction from the CP; the message sending instruction includes second user information of the cached data messages to be sent, the second user information including at least one of a MAC address, an IP address, a tunnel identifier and an identifier of a hit flow table entry; and the UP forwards the cached data messages according to the flow table entry for forwarding messages.
According to the data message caching method provided by this embodiment, a cache function module is added to the UP; the CP sends the flow table entry for caching messages to the UP; the UP receives data messages and sends those matching the flow table entry to the cache function module; and the cache function module stores the data messages according to the user information in the messages and the order in which they are received. When the cached data messages are to be forwarded, the CP sends a flow table entry for forwarding messages to the UP and sends a message sending instruction, which includes the user information of the cached messages to be sent, to the cache function module; the cache function module sends the cached data messages to the UP in storage order and according to the message sending instruction; and finally the UP forwards the cached data messages according to the flow table entry for forwarding messages. The forwarding capability of the UP is thus used to forward messages to the cache function module, which stores the data messages in the manner agreed with the CP and sends them back to the UP according to the agreed message sending instruction, so user data messages can be cached at the UP and the load and delay caused by caching messages at the CP are avoided.
In one possible design, the cache function module is directly connected to a port of the UP, the function of the cache function module is solely message storage, and the operation indicated by the flow table entry for caching messages is to match specified data messages and send them to the cache function module through the port connected to the cache function module;
the UP sending the data messages that match the flow table entry for caching messages to the cache function module includes:
the UP sends the data messages that match the flow table entry for caching messages to the cache function module through the port connected to the cache function module, for the cache function module to store the data messages according to first user information in the data messages and the order in which they are received, the first user information including at least one of a Media Access Control (MAC) address, an IP address, a tunnel identifier and an identifier of a hit flow table entry.
In one possible design, the cache function module is directly connected to a port of the UP, the cache function module has functions other than message storage, and the operation indicated by the flow table entry for caching messages is to match specified data messages, add first control information, which includes at least a caching instruction, and send the data messages with the first control information added to the cache function module through the port connected to the cache function module;
the UP sending the data messages that match the flow table entry for caching messages to the cache function module includes: the UP adds the first control information to the data messages that match the flow table entry for caching messages and sends the data messages with the first control information added to the cache function module through the port connected to the cache function module, for the cache function module to store the data messages according to first user information in the data messages and the order in which they are received, or according to the sequence number in the first control information, the first user information including at least one of a MAC address, an IP address, a tunnel identifier and an identifier of a hit flow table entry.
According to the data message caching method provided by this embodiment, the cache function module supports not only message storage but also other functions, and the UP instructs the cache function module to cache the data messages by adding the first control information to them; when the cached messages are to be sent, the CP may add a data message sending port to the message sending instruction, and the cache function module adds second control information to the messages when sending the cached data messages, so as to instruct the UP to perform the corresponding operation on them. The cache function module in this embodiment is not restricted to a single function, and the control information makes the control and handling of caching and of sending cached data messages more flexible.
In a possible design, the second user information further includes a data message sending port; the cached data messages are sent to the UP after the cache function module adds second control information to them in storage order, and the second control information includes the data message sending port.
In one possible design, the cache function module communicates with the UP through routing or switching, the cache function module has functions other than message storage, and the operation indicated by the flow table entry for caching messages is to match specified data messages, encapsulate the matched data messages in a service chain header and send the encapsulated data messages to the cache function module through the port connected to the cache function module; the service chain header carries information indicating that the data messages are to be cached and information that the UP needs to pass to the cache function module, and the service chain header includes a header of a service chain protocol or a header of a tunnel protocol;
the UP sending the data messages that match the flow table entry for caching messages to the cache function module includes: the UP encapsulates the data messages that match the flow table entry for caching messages in the service chain header and sends the encapsulated data messages to the cache function module through the port connected to the cache function module, for the cache function module to store the data messages according to first user information in the data messages and the order in which they are received, or according to the sequence number in the service chain header, the first user information including at least one of a MAC address, an IP address, a tunnel identifier and an identifier of a hit flow table entry.
According to the data message caching method provided by this embodiment, the cache function module places no requirement on the networking mode and is not limited to a direct connection with the UP: the data messages to be cached and the cached data messages to be sent reach their destination through routing or switching, and the information indicating that data messages are to be cached can be carried in the service chain header, so control remains flexible without constraining the networking mode.
In a possible design, the second user information further includes a data message sending port; the cached data messages are sent to the UP after the cache function module encapsulates them in the service chain header in storage order, and the service chain header carries the data message sending port.
In a second aspect, the present application provides a data message caching method, including: a cache function module receives data messages, sent by the user plane UP, that match a flow table entry for caching messages, the cache function module being a module internal to the switch or external to the switch and the flow table entry for caching messages being sent to the UP by the control plane CP; and the cache function module stores the data messages.
According to the data message caching method provided by the second aspect, a cache function module is added to the UP; the CP sends the flow table entry for caching messages to the UP; the UP receives data messages and sends those that match the flow table entry to the cache function module; and the cache function module stores the data messages according to the user information in the messages and the order in which they are received. The forwarding capability of the UP is thus used to forward messages to the cache function module, and the cache function module stores the data messages in the manner agreed with the CP, so user data messages can be cached at the UP and the load and delay caused by caching messages at the CP are avoided.
In one possible design, the method further includes: the cache function module receives a message sending instruction sent by the CP, the message sending instruction including second user information of the cached data messages to be sent, the second user information including at least one of a MAC address, an IP address, a tunnel identifier and an identifier of a hit flow table entry; and the cache function module sends the cached data messages to the UP in storage order and according to the message sending instruction, for the UP to forward the received cached data messages according to the flow table entry for forwarding messages sent by the CP.
According to the data message caching method provided by this embodiment, a cache function module is added to the UP; the CP sends the flow table entry for caching messages to the UP; the UP receives data messages and sends those matching the flow table entry to the cache function module; and the cache function module stores the data messages according to the user information in the messages and the order in which they are received. When the cached data messages are to be forwarded, the CP sends a flow table entry for forwarding messages to the UP and sends a message sending instruction, which includes the user information of the cached messages to be sent, to the cache function module; the cache function module sends the cached data messages to the UP in storage order and according to the message sending instruction; and finally the UP forwards the cached data messages according to the flow table entry for forwarding messages. The forwarding capability of the UP is thus used to forward messages to the cache function module, which stores the data messages in the manner agreed with the CP and sends them back to the UP according to the agreed message sending instruction, so user data messages can be cached at the UP and the load and delay caused by caching messages at the CP are avoided.
In one possible design, the cache function module is directly connected to a port of the UP, the function of the cache function module is solely message storage, and the operation indicated by the flow table entry for caching messages is to match specified data messages and send them to the cache function module through the port connected to the cache function module;
the cache function module storing the data messages includes: the cache function module stores the data messages according to first user information in the data messages and the order in which they are received, the first user information including at least one of a Media Access Control (MAC) address, an Internet Protocol (IP) address, a tunnel identifier and an identifier of a hit flow table entry.
In one possible design, the cache function module is directly connected to a port of the UP, the cache function module has functions other than message storage, and the operation indicated by the flow table entry for caching messages is to match specified data messages, add first control information, which includes at least a caching instruction, and send the data messages with the first control information added to the cache function module through the port connected to the cache function module;
the cache function module storing the data messages includes: the cache function module stores the data messages according to first user information in the data messages and the order in which they are received, or according to the sequence number in the first control information, the first user information including at least one of a MAC address, an IP address, a tunnel identifier and an identifier of a hit flow table entry.
According to the data message caching method provided by this embodiment, the cache function module supports not only message storage but also other functions, and the UP instructs the cache function module to cache the data messages by adding the first control information to them; when the cached messages are to be sent, the CP may add a data message sending port to the message sending instruction, and the cache function module adds second control information to the messages when sending the cached data messages, so as to instruct the UP to perform the corresponding operation on them. The cache function module in this embodiment is not restricted to a single function, and the control information makes the control and handling of caching and of sending cached data messages more flexible.
In a possible design, the second user information further includes a data message sending port, and the cache function module sending the cached data messages to the UP in storage order and according to the message sending instruction includes: the cache function module adds second control information to the cached data messages to be sent, in storage order, and sends them to the UP, the second control information including the data message sending port.
In one possible design, the cache function module communicates with the UP through routing or switching, the cache function module has functions other than message storage, and the operation indicated by the flow table entry for caching messages is to match specified data messages, encapsulate the matched data messages in a service chain header and send the encapsulated data messages to the cache function module through the port connected to the cache function module; the service chain header carries information indicating that the data messages are to be cached and information that the UP needs to pass to the cache function module, and the service chain header includes a header of a service chain protocol or a header of a tunnel protocol;
the cache function module storing the data messages includes: the cache function module stores the data messages according to first user information in the data messages and the order in which they are received, or according to the sequence number in the service chain header, the first user information including at least one of a MAC address, an IP address, a tunnel identifier and an identifier of a hit flow table entry.
According to the data message caching method provided by this embodiment, the cache function module places no requirement on the networking mode and is not limited to a direct connection with the UP: the data messages to be cached and the cached data messages to be sent reach their destination through routing or switching, and the information indicating that data messages are to be cached can be carried in the service chain header, so control remains flexible without constraining the networking mode.
In one possible design, the second user information further includes a data message sending port, and the cache function module sending the cached data messages to the UP in storage order and according to the message sending instruction includes: the cache function module encapsulates the cached data messages to be sent in the service chain header, in storage order, and sends them to the UP, the service chain header carrying the data message sending port.
In a third aspect, the present application provides a data message caching device, including: a receiving module, configured to receive a flow table entry for caching messages sent by the control plane CP and further configured to receive data messages; and a sending module, configured to send the data messages that match the flow table entry for caching messages to a cache function module, which stores them, the cache function module being a module internal to the switch or external to the switch.
In one possible design, the receiving module is further configured to: receive a flow table entry for forwarding messages sent by the CP; and receive cached data messages sent by the cache function module, the cached data messages being sent by the cache function module, in storage order and according to a message sending instruction, when it receives the message sending instruction from the CP, the message sending instruction including second user information of the cached data messages to be sent, and the second user information including at least one of a MAC address, an IP address, a tunnel identifier and an identifier of a hit flow table entry. The sending module is further configured to forward the cached data messages according to the flow table entry for forwarding messages.
In one possible design, the cache function module is directly connected to a port of the data message caching device, the function of the cache function module is solely message storage, and the operation indicated by the flow table entry for caching messages is to match specified data messages and send them to the cache function module through the port connected to the cache function module;
the sending module is configured to: send the data messages that match the flow table entry for caching messages to the cache function module through the port connected to the cache function module, for the cache function module to store the data messages according to first user information in the data messages and the order in which they are received, the first user information including at least one of a Media Access Control (MAC) address, an IP address, a tunnel identifier and an identifier of a hit flow table entry.
In one possible design, the cache function module is directly connected to a port of the data message caching device, the cache function module has functions other than message storage, and the operation indicated by the flow table entry for caching messages is to match specified data messages, add first control information, which includes at least a caching instruction, and send the data messages with the first control information added to the cache function module through the port connected to the cache function module;
the sending module is configured to: add the first control information to the data messages that match the flow table entry for caching messages and send the data messages with the first control information added to the cache function module through the port connected to the cache function module, for the cache function module to store the data messages according to first user information in the data messages and the order in which they are received, or according to the sequence number in the first control information, the first user information including at least one of a MAC address, an IP address, a tunnel identifier and an identifier of a hit flow table entry.
In a possible design, the second user information further includes a data message sending port; the cached data messages are sent to the data message caching device after the cache function module adds second control information to them in storage order, and the second control information includes the data message sending port.
In one possible design, the cache function module communicates with the data message caching device through routing or switching, the cache function module has functions other than message storage, and the operation indicated by the flow table entry for caching messages is to match specified data messages, encapsulate the matched data messages in a service chain header and send the encapsulated data messages to the cache function module through the port connected to the cache function module; the service chain header carries information indicating that the data messages are to be cached and information that the data message caching device needs to pass to the cache function module, and the service chain header includes a header of a service chain protocol or a header of a tunnel protocol;
the sending module is configured to: encapsulate the data messages that match the flow table entry for caching messages in the service chain header and send the encapsulated data messages to the cache function module through the port connected to the cache function module, for the cache function module to store the data messages according to first user information in the data messages and the order in which they are received, or according to the sequence number in the service chain header, the first user information including at least one of a MAC address, an IP address, a tunnel identifier and an identifier of a hit flow table entry.
In a possible design, the second user information further includes a data message sending port; the cached data messages are sent to the data message caching device after the cache function module encapsulates them in the service chain header in storage order, and the service chain header carries the data message sending port.
For the beneficial effects of the data message caching device provided by the third aspect and its possible designs, reference may be made to the beneficial effects of the first aspect and its possible designs, which are not repeated here.
In a fourth aspect, the present application provides a data message caching device, including:
a receiving module, configured to receive data messages, sent by the user plane UP, that match a flow table entry for caching messages, the data message caching device being a module internal to the switch or external to the switch and the flow table entry for caching messages being sent to the UP by the control plane CP; and a storage module, configured to store the data messages.
In one possible design, the receiving module is further configured to: receive a message sending instruction sent by the CP, the message sending instruction including second user information of the cached data messages to be sent, the second user information including at least one of a MAC address, an IP address, a tunnel identifier and an identifier of a hit flow table entry; and the device further includes a sending module, configured to send the cached data messages to the UP in storage order and according to the message sending instruction, for the UP to forward the received cached data messages according to the flow table entry for forwarding messages sent by the CP.
In one possible design, the data message caching device is directly connected to a port of the UP, the function of the data message caching device is solely message storage, and the operation indicated by the flow table entry for caching messages is to match specified data messages and send them to the data message caching device through the port connected to the data message caching device;
the storage module is configured to: store the data messages according to first user information in the data messages and the order in which they are received, the first user information including at least one of a Media Access Control (MAC) address, an Internet Protocol (IP) address, a tunnel identifier and an identifier of a hit flow table entry.
In one possible design, the data message caching device is directly connected to a port of the UP, the data message caching device has functions other than message storage, and the operation indicated by the flow table entry for caching messages is to match specified data messages, add first control information, which includes at least a caching instruction, and send the data messages with the first control information added to the data message caching device through the port connected to the data message caching device;
the storage module is configured to: store the data messages according to first user information in the data messages and the order in which they are received, or according to the sequence number in the first control information, the first user information including at least one of a MAC address, an IP address, a tunnel identifier and an identifier of a hit flow table entry.
In one possible design, the second user information further includes a data message sending port, and the sending module is configured to: add second control information to the cached data messages to be sent, in storage order, and send them to the UP, the second control information including the data message sending port.
In one possible design, the data message caching device communicates with the UP through routing or switching, the data message caching device has functions other than message storage, and the operation indicated by the flow table entry for caching messages is to match specified data messages, encapsulate the matched data messages in a service chain header and send the encapsulated data messages to the data message caching device through the port connected to the data message caching device; the service chain header carries information indicating that the data messages are to be cached and information that the UP needs to pass to the data message caching device, and the service chain header includes a header of a service chain protocol or a header of a tunnel protocol;
the storage module is configured to: store the data messages according to first user information in the data messages and the order in which they are received, or according to the sequence number in the service chain header, the first user information including at least one of a MAC address, an IP address, a tunnel identifier and an identifier of a hit flow table entry.
In one possible design, the second user information further includes a data message sending port, and the sending module is configured to:
encapsulate the cached data messages to be sent in the service chain header, in storage order, and send them to the UP, the service chain header carrying the data message sending port.
For the beneficial effects of the data message caching device provided by the fourth aspect and its possible designs, reference may be made to the beneficial effects of the second aspect and its possible designs, which are not repeated here.
In a fifth aspect, the present application provides a data packet buffering apparatus, including: a memory and a processor;
the memory is used for storing program instructions;
the processor is configured to call program instructions in the memory to perform the data packet caching method in any one of the possible designs of the first aspect and the first aspect or in any one of the possible designs of the second aspect and the second aspect.
In a sixth aspect, the present application provides a readable storage medium storing execution instructions which, when executed by at least one processor of the data message caching device, cause the data message caching device to perform the data message caching method in the first aspect or any possible design of the first aspect, or in the second aspect or any possible design of the second aspect.
In a seventh aspect, the present application provides a program product comprising execution instructions stored in a readable storage medium. At least one processor of the data message caching device may read the execution instructions from the readable storage medium, and executing them causes the data message caching device to implement the data message caching method in the first aspect or any possible design of the first aspect, or in the second aspect or any possible design of the second aspect.
Drawings
FIG. 1 is a schematic diagram of a network architecture of a 5G system;
FIG. 2 is a schematic diagram of CU-DU split in the 5G system;
FIG. 3 is a schematic diagram of a partition of a CU;
fig. 4 is an interaction flowchart of an embodiment of a data message caching method provided in the present application;
fig. 5 is an interaction flowchart of an embodiment of a process of sending a cached data packet in a data packet caching method provided by the present application;
fig. 6 is a schematic structural diagram of a related component for implementing a data packet caching method according to the present application;
fig. 7a is a schematic diagram illustrating a process of buffering a data packet in a data packet buffering method according to the present application;
fig. 7b is a schematic diagram of a process of sending a cached data packet in a data packet caching method provided by the present application;
fig. 8a is a schematic diagram illustrating a process of caching a data packet in another data packet caching method according to the present application;
fig. 8b is a schematic diagram of a process of sending a cached data packet in another data packet caching method provided by the present application;
fig. 9a is a schematic diagram illustrating a process of caching a data packet in another data packet caching method according to the present application;
fig. 9b is a schematic diagram of a process of sending a cached data packet in another data packet caching method provided by the present application;
fig. 10 is a schematic structural diagram of an embodiment of a data packet caching apparatus provided in the present application;
fig. 11 is a schematic structural diagram of an embodiment of a data packet buffering apparatus provided in the present application;
fig. 12 is a schematic structural diagram of an embodiment of a data packet buffering apparatus provided in the present application;
fig. 13 is a schematic diagram of another data packet caching apparatus provided in the present application.
Detailed Description
The technical solution of the present application may be applied to a mobile communication system such as the 5th Generation mobile communication technology (5G) system or an LTE system, and may also be applied to various types of systems including a system in which some functions of a base station are separated.
Fig. 1 is a schematic diagram of the network architecture of a 5G system. In the 5G system, the access network is the Next Generation Radio Access Network (NG-RAN) and the core network is the 5G Core Network (5GC). The base station is called a gNB/ng-eNB and mainly comprises the Radio Resource Control (RRC), Service Data Adaptation Protocol (SDAP), Packet Data Convergence Protocol (PDCP), Radio Link Control (RLC), Medium Access Control (MAC) and Physical Layer (PHY) protocol layers. Hereinafter, and in Fig. 1, gNB collectively denotes a base station in the 5G system; gNBs are connected to each other through the Xn interface, and a gNB is connected to the 5GC through the Ng interface. The Access and Mobility Management Function (AMF)/User Plane Function (UPF) is equivalent to the Mobility Management Entity (MME) in the LTE system; the AMF is mainly responsible for access, and the UPF is mainly responsible for session management.
It should be understood that the character "/" in this application indicates that the former and latter associated objects are in an "or" relationship.
A CU-DU separation scenario is introduced below with reference to Fig. 2 and Fig. 3. Fig. 2 is a schematic diagram of the CU-DU split in the 5G system. As shown in Fig. 2, a base station may consist of a Centralized Unit (CU) and a Distributed Unit (DU); that is, the functions of the base station in the original access network are split, with part of the functions deployed in one CU and the remaining functions in a DU, and multiple DUs sharing one CU, which saves cost and facilitates network expansion. The CU and the DU are connected through the F1 interface, and the CU represents the gNB and is connected to the core network through the Ng interface. The CU and DU may be split according to the protocol stack; one possible way is to deploy the RRC and PDCP layers in the CU and the remaining RLC, MAC and PHY layers in the DU. Furthermore, the CU may be further divided into a control-plane CU (CU-CP) and a user-plane CU (CU-UP). Fig. 3 is a schematic diagram of the division of a CU; as shown in Fig. 3, the CU-CP and the CU-UP are connected via the E1 interface. The CU-CP represents the gNB and is connected to the core network via the Ng interface; the CU-CP is connected to the DU via F1-C (control plane), and the CU-UP is connected to the DU via F1-U (user plane). Another possible implementation is that PDCP-C is also in the CU-UP. The CU-CP is responsible for control-plane functions and mainly comprises RRC and the control-plane PDCP (PDCP-C), which is mainly responsible for encryption and decryption, integrity protection and data transmission of control-plane data. The CU-UP is responsible for user-plane functions and mainly comprises the SDAP, which processes core-network data and maps data flows to bearers, and the user-plane PDCP (PDCP-U), which is mainly responsible for encryption and decryption of user-plane data, integrity protection, header compression, sequence number maintenance, data transmission and the like.
Under the network architecture in which the CU-CP and the CU-UP are separated, the switch is divided into a CP and a UP; the CP and the UP are deployed separately and connected through a standard communication interface; the CP is responsible for processing signaling messages and for issuing flow table entries for user data messages to the UP; and the UP receives user data messages and processes them (for example, modifies or forwards them) according to the flow table entries issued by the CP.
The technical solution of this application mainly applies to the question of how user data messages are cached once the switch is divided into a CP and a UP. A scenario in which user data messages need to be cached is, for example: when a mobile phone is in Idle state and there are downlink data messages on the network side, the data messages are cached on the gateway device, and after the mobile phone comes online the gateway device sends the cached data messages to the phone. In the prior art, caching of user data messages is implemented by the UP sending the user data messages to the controller of the CP, which stores them. However, because the CP and the UP are deployed separately and one CP manages a large number of UPs, caching messages at the CP imposes a heavy load, and the UP has to send the user data messages to the CP through a standard communication interface, which results in a large delay. To solve this problem, a cache function module is added at the UP and the forwarding capability of the UP is used to forward messages to the cache function module; the cache function module stores the data messages in the manner agreed with the CP and can send the cached data messages back to the UP according to the agreed message sending instruction. User data messages can therefore be cached at the UP, avoiding the load and delay caused by caching messages at the CP. The technical solution of this application is described in detail below with reference to the accompanying drawings.
Fig. 4 is an interaction flowchart of an embodiment of a data packet caching method provided in the present application, and as shown in fig. 4, the method of this embodiment may include:
S101, the CP sends a flow table entry for caching messages to the UP.
S102, the UP receives data messages.
In this embodiment and the following, the user data message is simply referred to as a data message.
S103, the UP matches against the flow table entry for caching messages and sends the data messages that match it to the cache function module.
In this embodiment of the application, the cache function module is a logical module and may be a network element, a software module, a server or the like; when the cache function module is a software module, it may be, for example, an internal thread or process of a server. The cache function module may be a module internal to the switch or external to the switch, such as a process or a service chain processing node. The cache function module in this embodiment may be a module with only a storage function, that is, its function is solely message storage, or it may be a general-purpose module that includes the message storage function together with other functions, that is, its function is not limited to message storage.
S104, the cache function module stores the data messages.
According to the data message caching method provided by this embodiment, a cache function module is added at the UP; the CP sends the flow table entry for caching messages to the UP; the UP receives data messages and sends those that match the flow table entry to the cache function module; and the cache function module stores the data messages according to the user information in the messages and the order in which they are received. The forwarding capability of the UP is thus used to forward messages to the cache function module, which stores them in the manner agreed with the CP, so user data messages can be cached at the UP and the load and delay caused by caching messages at the CP are avoided.
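For illustration only, the following sketch outlines S101 to S103 from the UP's point of view, reusing the FlowEntry/lookup sketch above; the port number and the operation strings are assumptions, not values defined by this application.

```python
# Illustrative sketch of S101-S103; assumes the FlowEntry/lookup sketch above.
CACHE_PORT = 7   # hypothetical UP port connected to the cache function module

flow_table = []  # S101: populated with flow table entries issued by the CP

def on_data_message(headers, raw_message, send_to_port):
    """S102/S103: match a received data message and dispatch it."""
    entry = lookup(flow_table, headers)
    if entry is None:
        return                                    # no matching flow table entry
    if entry.operation == "cache":
        send_to_port(CACHE_PORT, raw_message)     # hand the message to the cache module
    elif entry.operation.startswith("forward:"):
        send_to_port(int(entry.operation.split(":", 1)[1]), raw_message)
```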
The above is the data message caching process. After data messages have been sent from the user plane to the cache function module for caching, when the user plane needs to forward or modify the cached data messages, the cache function module has to send the cached data messages back to the user plane. The specific process of sending cached data messages is described below with reference to fig. 5. Fig. 5 is an interaction flowchart of an embodiment of the process of sending cached data messages in a data message caching method provided by the present application. As shown in fig. 5, the method of this embodiment may include:
S105, the CP sends a flow table entry for forwarding messages to the UP.
The operation indicated by the flow table entry for forwarding messages is to match specified data messages and send the data messages that match the flow table entry for forwarding messages to a specified port.
S106, the CP sends a message sending instruction to the cache function module.
S107, after receiving the message sending instruction, the cache function module sends the cached data messages to the UP in storage order and according to the message sending instruction.
S108, the UP forwards the cached data messages according to the flow table entry for forwarding messages.
Specifically, the message sending instruction includes second user information of the cached data messages to be sent, the second user information including at least one of a MAC address, an IP address, a tunnel identifier and an identifier of a hit flow table entry; that is, the message sending instruction includes at least one of the MAC address, IP address and tunnel identifier of the data messages to be sent, or other information that can identify the data messages. In other words, the message sending instruction indicates which messages are to be sent, and the cache function module sends all data messages specified in the message sending instruction to the UP, in storage order, through the port between the cache function module and the UP.
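A minimal sketch of S106/S107, under the assumption that the instruction is a simple dictionary: the key names ("user_key") and the dictionary-shaped cache are illustrative choices, not part of this application.

```python
# Illustrative sketch of S106/S107. "user_key" stands for the second user
# information (a MAC/IP address, tunnel identifier or hit flow entry identifier).
def handle_send_instruction(cache, instruction, send_to_up):
    """cache: dict mapping user key -> cached messages in storage order."""
    for message in cache.pop(instruction["user_key"], []):
        send_to_up(message)   # the UP forwards it per the forwarding flow table entry
```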
In the above embodiment, depending on the function of the cache function module and on how the cache function module is connected to the UP, there are three ways of transferring data messages between the UP and the cache function module, as follows:
Mode one: the cache function module is directly connected to a port of the UP, the function of the cache function module is solely message storage, and the operation indicated by the flow table entry for caching messages is to match specified data messages and send them to the cache function module through the port connected to the cache function module.
Specifically, the specified data messages may be specified by message identifiers; the operation indicated by the flow table entry for caching messages is, for example, to match data message 5 to data message 10 and send data message 5 to data message 10 to the port connected to the cache function module. Accordingly, S103 may specifically be: the UP matches against the flow table entry for caching messages and sends the data messages that match it to the cache function module through the port connected to the cache function module. In this mode, the data messages transferred between the UP and the cache function module are the original data messages. Correspondingly, in S104 the cache function module may store the data messages according to first user information in the data messages and the order in which they are received, the first user information including at least one of a MAC address, an IP address, a tunnel identifier and an identifier of a hit flow table entry.
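As a rough illustration of this storage rule (first user information plus arrival order), the sketch below keys the cache on the destination MAC address; keying on an IP address, tunnel identifier or hit flow entry identifier would look the same. The helper name and the choice of key are assumptions.

```python
# Illustrative sketch of S104 in mode one: store messages per user, in arrival order.
from collections import defaultdict

def extract_user_key(headers):
    # First user information: here the destination MAC address is used as the key.
    return headers["dst_mac"]

cache = defaultdict(list)   # user key -> cached data messages in arrival order

def store(headers, raw_message):
    cache[extract_user_key(headers)].append(raw_message)
```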
In this mode, the forwarding capability of the UP is used to forward messages to the cache function module, the cache function module stores the data messages in the manner agreed with the CP, and it can send the cached data messages back to the UP according to the agreed message sending instruction, so user data messages can be cached at the UP and the load and delay caused by caching messages at the CP are avoided.
Mode two: the cache function module is directly connected to a port of the UP; the cache function module is not limited to message storage and also has other functions. The operation indicated by the flow table entry for caching messages is to match specified data messages, add first control information and send the data messages with the first control information added to the cache function module through the port connected to the cache function module; the first control information includes at least a caching instruction, which instructs the cache function module to cache.
Correspondingly, S103 may specifically be: the UP adds the first control information to the data messages that match the flow table entry for caching messages and sends the data messages with the first control information added to the cache function module through the port connected to the cache function module. If the caching function is to be performed, the first control information includes a caching instruction; if another function is to be performed, the first control information includes the instruction for the corresponding operation. As an example, the location of the first control information is shown in Table 1 below:
table-example of the location of a first control information in a data message
D-MAC | S-MAC | Control-head | Payload
Here, D-MAC is the destination MAC address, S-MAC is the source MAC address, the control information (Control-head) is located after the source MAC address, and Payload is the partial or complete user message to be cached. The definition of the first control information is shown in Table 2 below:
definition of the second first control information
ethType=control
controlType=buffering
flowRuleId
SequenceID
length
protocolType
In Table 2, the first control information includes: ethType, set to control; controlType, the instruction type, here buffering; flowRuleId, the identifier of the hit flow rule; SequenceID, the sequence number; length, the message length; and protocolType, the user message type.
It should be noted that the location and definition of the first control information described above are only one implementable manner, and the present application is not limited to this implementation.
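Under that caveat, a hypothetical fixed-width packing of the Control-head of Tables 1 and 2 might look as follows; the field widths and the EtherType value are assumptions chosen only so the sketch runs, not values defined by this application.

```python
import struct

# Hypothetical encoding of the Control-head (Tables 1 and 2): 2-byte ethType,
# 2-byte controlType, 4-byte flowRuleId, 4-byte SequenceID, 2-byte length,
# 2-byte protocolType. All widths and constants are assumptions.
ETH_TYPE_CONTROL = 0x88B5        # placeholder "control" EtherType
CONTROL_TYPE_BUFFERING = 1       # placeholder code for the buffering instruction

def build_control_head(flow_rule_id, sequence_id, length, protocol_type):
    return struct.pack("!HHIIHH", ETH_TYPE_CONTROL, CONTROL_TYPE_BUFFERING,
                       flow_rule_id, sequence_id, length, protocol_type)

def parse_control_head(data):
    eth_type, ctrl_type, rule_id, seq_id, length, proto = struct.unpack("!HHIIHH", data[:16])
    return {"ethType": eth_type, "controlType": ctrl_type, "flowRuleId": rule_id,
            "SequenceID": seq_id, "length": length, "protocolType": proto}
```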
Accordingly, S104 may be: the cache function module stores the data messages according to the first user information in the data messages and the order in which they are received, or the cache function module stores the data messages according to the sequence number in the first control information, the first user information including at least one of a MAC address, an IP address, a tunnel identifier and an identifier of a hit flow table entry.
In mode one, the cache function module can only store data messages in the order in which they arrive; because it is directly connected to the UP, the order in which the cache function module receives the messages is the same as the order in which the UP receives them. In this mode, because the first control information is present, the data messages may be stored either in the order in which the cache function module receives them or in the order specified in the first control information. For example, if the UP sends the data messages to the cache function module by routing, the order in which the data messages reach the cache function module after routing is not necessarily the same as the order in which the UP received them. Mode three below is similar in this respect.
In this transfer mode, in S106, the second user information of the cached data messages to be sent that is included in the message sending instruction sent by the CP to the cache function module may include a data message sending port in addition to at least one of the MAC address, the IP address, the tunnel identifier and the identifier of the hit flow table entry, or the second user information may include other information that can identify the data messages. When the identifier of the hit flow table entry is included, and the flow table entry is created at user granularity, the user to which the flow table entry belongs and the service for which it was issued can be determined from the flow table entry identifier, so the correspondence between messages, users and services can be determined simply and accurately from that identifier. When the second user information further includes a data message sending port, optionally, when the cache function module sends the cached data messages in S107, in the same way as the UP sends data messages to the cache function module in this mode, it may add second control information to the cached data messages according to the message sending instruction and then send the cached data messages with the second control information added to the UP in storage order; the second control information may include information such as the data message sending port issued in the message sending instruction. Optionally, if a data message received by the UP contains the second control information, the UP may forward the data message according to the data message sending port contained in the second control information.
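A minimal sketch of that optional behaviour, assuming the cache and instruction shapes used in the earlier sketches; the 2-byte port encoding is an assumption.

```python
import struct

# Illustrative sketch of S107 in mode two: prepend second control information
# (here only the data message sending port from the CP's instruction) before
# returning each cached message to the UP, in storage order.
def send_cached_with_control(cache, instruction, send_to_up):
    out_port = instruction["send_port"]               # data message sending port
    for message in cache.pop(instruction["user_key"], []):
        second_control = struct.pack("!H", out_port)  # assumed 2-byte encoding
        send_to_up(second_control + message)          # UP forwards on out_port
```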
In this mode, the cache function module supports not only the message storage function but also other functions, and the UP instructs the cache function module to cache a data message by adding the first control information to it. When the cached messages are to be sent, the CP may carry a data message sending port in the message sending instruction, and the cache function module adds the second control information to each message when sending the cached data messages, so as to instruct the UP to perform the corresponding operation on them. Compared with the first mode, this mode removes the restriction that the cache function module have only a single function, and the addition of control information makes the control of caching and of sending the cached data messages more flexible.
In mode three, the cache function module may be directly connected to a port of the UP or may communicate with the UP through routing or switching, and it is not limited to a single message storage function. The operation indicated by the flow table entry of the cache message is to match the specified data messages, encapsulate each matched data message in a service chain packet header, and send the encapsulated data message to the cache function module through the port connected to the cache function module, where the service chain packet header carries information indicating that the data message is to be cached and information that the UP needs to transfer to the cache function module.
Correspondingly, S103 may specifically be: the UP encapsulates the data messages that match the flow table entry of the cache message in a service chain packet header and sends the encapsulated data messages to the cache function module through the port connected to the cache function module.
Optionally, S103 may instead send the data messages to a specified port of the UP; when sending them out of that port, the UP encapsulates each data message in a service chain packet header according to convention or configuration. Correspondingly, when the cache function module receives the data messages in S104, it stores them in receiving order according to the service chain packet header information and the first user information in the data messages; or the cache function module stores the data messages according to the sequence number in the service chain packet header. The first user information includes at least one of a MAC address, an IP address, a tunnel identifier, and the identifier of the hit flow table entry, or other information that can identify the message.
Specifically, the service chain packet header is used to carry the data message so that it can reach the cache function module through routing or switching. As an implementable manner, the service chain packet header may be a service chain encapsulation such as a Network Service Header (NSH), a tunnel encapsulation (e.g., GRE or VXLAN), or another defined encapsulation that serves the same purpose. The service chain packet header may carry information indicating that the data message is to be cached and information that the UP needs to transfer to the cache function module. Taking NSH as an example, the data message is encapsulated in NSH as shown in table three below:
Table Three: data message encapsulated in NSH
D-MAC1 | S-MAC1 | NSH | User-Packet
Wherein D-MAC1 is the destination MAC address, S-MAC1 is the source MAC address, and User-Packet is the user data message; the NSH packet header is defined as shown in table four below:
Table Four: definition of the NSH packet header
ethType=IP
UDP_Port=NSH
Next Protocol
Path ID|Service Index
Context
Wherein, the definition of Context in the NSH header is shown in the following table five:
Table Five: definition of the Context field
flowRuleID
sequence
Wherein the complete data message is encapsulated behind the NSH packet header, and the encapsulated data message is switched or routed by means of the added Ethernet header or IP header. Depending on the networking, the NSH packet may be encapsulated at Layer 2 or Layer 3. Table four takes the Layer 3 encapsulation of NSH as an example; in this embodiment, to simplify the description, the IP header is treated as part of the NSH encapsulation. The service chain identifier (Path ID and Service Index) in the NSH header may be used to mark a data message to be cached, and other information that the UP needs to transfer to the cache function module may be added in the Context field. Of course, the way of defining the control information in mode two may also be used to define the indication of the data message in the Context field, instead of relying on the service chain identifier for the indication. As shown in table five, the Context field may be used to transfer information such as the identifier of the hit flow rule (flowRuleId) and the sequence number (sequenceId) to the cache function module.
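The sketch below illustrates the idea of tables three to five: the matched data message is wrapped in a simplified NSH-like service chain header whose Path ID and Service Index mark the message as one to be cached, and whose context words carry flowRuleId and sequenceId. The byte layout is deliberately simplified and is not a byte-exact RFC 8300, GRE, or VXLAN encoding; the outer Ethernet/IP header used for routing is omitted.

```python
import struct

def encapsulate_in_service_chain(user_packet: bytes, path_id: int, service_index: int,
                                 flow_rule_id: int, sequence_id: int) -> bytes:
    """UP side: encapsulate a matched data message in a simplified NSH-like
    service chain header (Path ID | Service Index, followed by two context
    words carrying flowRuleId and sequenceId, as in tables four and five)."""
    service_path = struct.pack("!I", ((path_id & 0xFFFFFF) << 8) | (service_index & 0xFF))
    context = struct.pack("!II", flow_rule_id, sequence_id)
    return service_path + context + user_packet

def decapsulate_service_chain(packet: bytes):
    """Cache function module side: recover the context information and the
    original user data message."""
    (sp,) = struct.unpack("!I", packet[:4])
    path_id, service_index = sp >> 8, sp & 0xFF
    flow_rule_id, sequence_id = struct.unpack("!II", packet[4:12])
    info = {
        "pathId": path_id,
        "serviceIndex": service_index,
        "flowRuleId": flow_rule_id,
        "sequenceId": sequence_id,
    }
    return info, packet[12:]
```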
In this transmission mode, correspondingly, in S106 the second user information may include a data message sending port in addition to at least one of the MAC address, the IP address, the tunnel identifier, and the identifier of the hit flow table entry. Optionally, when the cache function module sends the cached data messages in S107, it may encapsulate each cached data message in a service chain packet header, in the same way that the UP sends data messages to the cache function module in this mode, and then send the encapsulated data messages to the UP in the storage order, where the service chain packet header may carry information such as the data message sending port delivered in the message sending instruction. Optionally, if a data message received by the UP carries a data message sending port, the UP may forward the data message according to that port.
In this mode, there is no requirement on the networking mode or on a direct connection between the cache function module and the UP: the data messages to be cached and the cached data messages to be sent reach the cache function module and the UP, respectively, through routing or switching. Meanwhile, the information indicating that a data message is to be cached can be carried in the service chain packet header, so this mode has the advantages of mode two while imposing no networking requirement and remaining flexible to control.
In the data message caching method provided by this embodiment, a cache function module is added for the UP. The CP sends the flow table entry of the cache message to the UP; the UP receives data messages and sends the data messages that match the flow table entry of the cache message to the cache function module, which stores them according to the user information in the data messages and the order in which they are received. When the cached data messages are to be forwarded, the CP sends the flow table entry of the forwarding message to the UP and sends a message sending instruction, which carries the user information of the cached data messages to be sent, to the cache function module; the cache function module sends the cached data messages to the UP according to the storage order and the message sending instruction, and finally the UP forwards the cached data messages according to the flow table entry of the forwarding message. In this way, the forwarding capability of the UP is used to forward messages to the cache function module, the cache function module stores the data messages in the manner agreed with the CP and can send the cached data messages to the UP according to the agreed message sending instruction, so user data messages can be cached at the UP, avoiding the load and delay problems caused by caching messages at the CP.
The technical solutions of the method embodiments shown in Fig. 4 and Fig. 5 are described in detail below through several specific embodiments.
Fig. 6 is a schematic structural diagram of the components involved in a data message caching method provided by the present application. As shown in Fig. 6, the CP is exemplified by a controller, the cache function module by an internal module of the switch or an external module of the switch, and the UP by the switch pipeline (PipeLine). The data message caching procedure is as follows: the PipeLine receives a data message and sends it to the cache function module for storage. The cache function module is made to cache the data message either because it is dedicated to the storage function, because control information is added to the data message, or because the data message is encapsulated in a service chain packet header. By specifying the flow table entries of the PipeLine (the flow table entry of the cache message or the flow table entry of the forwarding message), the controller controls whether the PipeLine adds control information to the message to be cached, encapsulates it in a service chain packet header, or sends it directly to the cache function module. When the cached messages are to be forwarded, the controller controls the cache function module to send the cached data messages by sending it a message sending instruction. In the present application, for simplicity of description, the controller is described as a single module, but the controller that issues flow table entries to the user plane and the controller that sends the message sending instruction may be different modules. Three implementable modes are described below with reference to the accompanying drawings. In the following embodiments, the CP is exemplified by the controller.
Fig. 7a is a schematic diagram of the data message caching procedure in a data message caching method provided by the present application, and Fig. 7b is a schematic diagram of the procedure for sending cached data messages in the same method. In this embodiment, the cache function module provides only a message storage function and is directly connected, through a virtual or physical connection, to a port of the UP; the cache function module is an internal storage module or an external storage module. As shown in Fig. 7a, the data message caching procedure may include:
S201, the controller sends the flow table entry of the cache message to the Pipeline. The operation indicated by the flow table entry of the cache message is to match the specified data messages and send them to the cache function module through the port connected to the cache function module.
If the cache function module is an internal storage module, the port connected to the cache function module is an internal virtual port; if it is an external storage module, the port connected to it is an external port.
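As a minimal sketch of the flow table entry issued in S201 (the match field values, the key names, and the action encoding are illustrative placeholders; the document only requires that the entry match the specified data messages and output them to the port connected to the cache function module):

```python
# Illustrative flow table entry of the cache message (structure per the
# header-field / counter / operation model; all concrete values are assumed).
cache_flow_entry = {
    "header_fields": {                 # match on the specified user data messages
        "eth_dst": "00:11:22:33:44:55",
        "ipv4_dst": "10.0.0.8",
    },
    "counters": {"packets": 0, "bytes": 0},
    "operations": [
        # Output to the internal virtual port (internal storage module) or to
        # the external port (external storage module) connected to the cache
        # function module.
        {"type": "OUTPUT", "port": "CACHE_MODULE_PORT"},
    ],
}
```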
S202, Pipeline receives the data message.
S203, the Pipeline hits the flow table entry of the cache message and sends the data messages matching that flow table entry to the internal storage module through the internal virtual port, or to the external storage module through the external port.
S204, the cache function module stores the data messages according to the first user information in the data messages and the receiving sequence of the data messages.
Wherein the first user information includes at least one of a MAC address, an IP address, and a tunnel identifier.
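A minimal sketch of S204 follows; the class name, the use of Python dictionaries and deques, and the key construction are implementation assumptions, and the document only requires that messages be stored per first user information and in receiving order:

```python
from collections import defaultdict, deque

class CacheFunctionModule:
    """Stores cached data messages per first user information, in receiving order."""

    def __init__(self):
        # key: first user information, value: FIFO queue of data messages
        self._queues = defaultdict(deque)

    @staticmethod
    def _user_key(first_user_info: dict) -> tuple:
        # At least one of MAC address, IP address and tunnel identifier is present.
        return (first_user_info.get("mac"),
                first_user_info.get("ip"),
                first_user_info.get("tunnel_id"))

    def store(self, first_user_info: dict, data_message: bytes) -> None:
        # Appending preserves the receiving order within each user's queue.
        self._queues[self._user_key(first_user_info)].append(data_message)
```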
When the UP needs to forward or modify the cached data messages, the cache function module must send them to the UP. As shown in Fig. 7b, the procedure for sending the cached data messages may include:
S205, the controller sends the flow table entry of the forwarding message to the Pipeline. The operation indicated by the flow table entry of the forwarding message is to match the specified data messages and send the data messages matching that flow table entry to the specified port.
S206, the controller sends a message sending instruction to the cache function module.
The message sending instruction contains the second user information of the cached data messages to be sent, where the second user information includes at least one of a MAC address, an IP address, and a tunnel identifier, or other information that can identify the data messages.
S207, after receiving the message sending instruction, the cache function module sends the cached data messages to the Pipeline according to the storage order and the message sending instruction.
Specifically, the cache function module sends all data packets specified in the packet sending instruction to the UP through the port between the cache function module and the UP according to the storage sequence.
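Continuing the storage sketch above, S207 could be realized as follows; the shape of the `queues` mapping (second-user-information key to FIFO queue) and the `send_to_up` callback are assumptions made for illustration:

```python
from collections import deque

def send_cached_messages(queues: dict, instruction_key: tuple, send_to_up) -> None:
    """S207: send every cached data message identified by the second user
    information in the message sending instruction to the UP, in storage order."""
    queue = queues.pop(instruction_key, deque())
    while queue:
        send_to_up(queue.popleft())   # popleft() preserves the FIFO storage order
```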
S208, the Pipeline forwards the cached data messages according to the flow table entry of the forwarding message.
In the data packet caching method provided by this embodiment, the packet is forwarded to the caching function module by using the forwarding capability of the UP, and the caching function module stores the data packet in a manner agreed with the CP and can send the cached data packet to the UP according to an agreed packet sending instruction, so that the user data packet can be cached in the UP, and the load and delay problems caused by the CP performing packet caching are avoided.
Fig. 8a is a schematic diagram of the data message caching procedure in another data message caching method provided by the present application, and Fig. 8b is a schematic diagram of the procedure for sending cached data messages in the same method. In this embodiment, the cache function module is not limited to a single message storage function and also provides other functions; it is directly connected, through a virtual or physical connection, to a port of the UP, and is an internal function module or an external function module. As shown in Fig. 8a, the data message caching procedure may include:
S301, the controller sends the flow table entry of the cache message to the Pipeline.
The operation indicated by the flow table entry of the cache message is to match the specified data messages, add the first control information, and send the data messages with the first control information added to the cache function module through the port connected to the cache function module; the first control information contains at least a cache instruction. For example, the first control information includes: the cache instruction, the identifier of the hit flow rule, the sequence number, the message length, and the user message type.
If the cache function module is an internal function module, the port connected to the cache function module is an internal virtual port; if it is an external function module, the port connected to it is an external port.
S302, PipeLine receives the data message.
S303, the PipeLine hits the flow table entry of the cache message, adds the first control information to the data messages matching that flow table entry, and sends the data messages with the first control information added to the internal function module through the internal virtual port, or to the external function module through the external port.
S304, the cache function module stores the data messages according to the first user information in the data messages and the receiving sequence of the data messages.
Alternatively, S304 may be: and the cache function module stores the data message according to the serial number in the first control information.
Wherein the first user information includes at least one of a MAC address, an IP address, a tunnel identifier, and an identifier of a hit flow entry.
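A minimal sketch of the alternative in S304 (storing by the sequence number carried in the first control information rather than by arrival order) is given below; keeping the per-user buffer as a sorted Python list is an assumption made purely for illustration:

```python
import bisect

def store_by_sequence(buffer: list, sequence_id: int, data_message: bytes) -> None:
    """Insert a data message into the per-user buffer ordered by the sequenceId
    from the first control information, so that messages arriving out of order
    are still stored (and later sent) in their original sequence."""
    bisect.insort(buffer, (sequence_id, data_message))
```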
When the UP needs to forward or modify the cached data messages, the cache function module must send them to the UP. As shown in Fig. 8b, the procedure for sending the cached data messages may include:
S305, the controller sends the flow table entry of the forwarding message to the Pipeline. The operation indicated by the flow table entry of the forwarding message is to match the specified data messages and send the data messages matching that flow table entry to the specified port.
S306, the controller sends a message sending instruction to the cache function module.
The message sending instruction contains the second user information of the cached data messages to be sent, where the second user information includes at least one of a MAC address, an IP address, a tunnel identifier, the identifier of the hit flow table entry, and a data message sending port, or other information that can identify the data messages.
S307, after receiving the message sending instruction, the cache function module sends the cached data messages to the Pipeline according to the storage order and the message sending instruction.
Specifically, the cache function module sends all data packets specified in the packet sending instruction to the UP through the port between the cache function module and the UP according to the storage sequence.
Optionally, if the second user information includes a data message sending port, S307 may instead be: the cache function module adds the second control information to the cached data messages and then sends the cached data messages with the second control information added to the Pipeline in the storage order, where the second control information includes the data message sending port.
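Sketch of this optional step; the one-byte instruction type and the two-byte port encoding are assumptions, since the document does not define a byte layout for the second control information:

```python
import struct

CTRL_TYPE_SEND = 2   # assumed instruction type for the second control information

def add_second_control_info(cached_message: bytes, sending_port: int) -> bytes:
    """Cache function module side: prepend the second control information
    (here just the data message sending port) before returning the cached
    data message to the Pipeline, which can then forward it out of that port."""
    return struct.pack("!BH", CTRL_TYPE_SEND, sending_port) + cached_message
```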
S308, the Pipeline forwards the cached data messages according to the flow table entry of the forwarding message.
Optionally, if the data packet received by the UP includes the second control information, the UP may forward the data packet according to the data packet transmission port included in the second control information.
In the data message caching method provided by this embodiment, the cache function module supports not only the message storage function but also other functions, and the UP instructs the cache function module to cache a data message by adding the first control information to it. When the cached messages are to be sent, the CP may carry a data message sending port in the message sending instruction, and the cache function module adds the second control information to each message when sending the cached data messages, so as to instruct the UP to perform the corresponding operation on them. Compared with the embodiments shown in Fig. 7a and Fig. 7b, the cache function module in this embodiment is not restricted to a single function, and the addition of control information makes the control of caching and of sending the cached data messages more flexible.
Fig. 9a is a schematic diagram of the data message caching procedure in another data message caching method provided by the present application, and Fig. 9b is a schematic diagram of the procedure for sending cached data messages in the same method. In this embodiment, the cache function module is not limited to a single message storage function and also provides other functions; it communicates with the UP through routing or switching and is an external function module. The data message caching procedure may include:
S401, the controller sends the flow table entry of the cache message to the Pipeline.
The operation indicated by the flow table entry of the cache message is to match the specified data messages, encapsulate each matched data message in a service chain packet header, and send the encapsulated data messages to the external function module through the port connected to the external function module; the service chain packet header carries information indicating that the data message is to be cached and information that the UP needs to transfer to the external function module. The service chain packet header is used to carry the data message so that it can reach the external function module through routing or switching.
If the cache function module is an internal function module, the port connected to it is an internal virtual port; if it is an external function module, the port connected to it is an external port.
S402, the Pipeline receives the data message.
S403, the PipeLine encapsulates the data messages matching the flow table entry of the cache message in a service chain packet header and sends the encapsulated data messages to the external function module through switching or routing.
S404, the external function module stores the data messages in receiving order according to the service chain packet header information and the first user information in the data messages.
Alternatively, S404 may be: the cache function module stores the data messages according to the sequence number in the service chain packet header.
Wherein the first user information includes at least one of a MAC address, an IP address, a tunnel identifier, and an identifier of a hit flow entry.
When the UP needs to forward or modify the cached data messages, the external function module must send them to the UP. As shown in Fig. 9b, which takes the cache function module being an external function module or an internal function module as an example, the procedure for sending the cached data messages may include:
S405, the controller sends the flow table entry of the forwarding message to the Pipeline. The operation indicated by the flow table entry of the forwarding message is to match the specified data messages and send the data messages matching that flow table entry to the specified port.
S406, the controller sends a message sending instruction to the external function module.
The message sending instruction comprises second user information of a cache data message to be sent, wherein the second user information comprises at least one of an MAC address, an IP address, a tunnel identifier, a hit flow table entry identifier (flowRuleId) and a data message sending port, or comprises other information capable of identifying the data message.
S407, after receiving the message sending instruction, the external function module sends the cached data messages to the Pipeline according to the storage order and the message sending instruction.
Specifically, the external function module sends all data packets specified in the packet sending command to the UP through the port between the external function module and the UP according to the storage sequence.
Optionally, if the second user information includes a data message sending port, S407 may instead be: the external function module encapsulates the cached data messages in a service chain packet header and then sends the encapsulated data messages to the PipeLine in the storage order, where the service chain packet header may carry information such as the data message sending port delivered in the message sending instruction.
S408, the Pipeline forwards the cached data messages according to the flow table entry of the forwarding message.
Optionally, if the data packet received by the PipeLine includes a data packet sending port, the PipeLine may forward the data packet according to the data packet sending port.
The data message caching method provided by this embodiment is also applicable when the cache function module is directly connected to a port of the UP.
In the data message caching method provided by this embodiment, there is no requirement on the networking mode or on a direct connection between the cache function module and the UP: the data messages to be cached and the cached data messages to be sent reach the cache function module and the UP, respectively, through routing or switching. Meanwhile, the information indicating that a data message is to be cached can be carried in the service chain packet header, so this embodiment has the advantages of the embodiments shown in Fig. 8a and Fig. 8b while imposing no networking requirement and remaining flexible to control.
Fig. 10 is a schematic structural diagram of an embodiment of a data message caching apparatus provided by this application; the data message caching apparatus of this embodiment may be the UP. As shown in Fig. 10, the apparatus of this embodiment may include a receiving module 11 and a sending module 12. The receiving module 11 is configured to receive the flow table entry of the cache message sent by the CP and is further configured to receive data messages; the sending module 12 is configured to send the data messages matching the flow table entry of the cache message to the cache function module, which stores the data messages and is an internal module of the switch or an external module of the switch.
Optionally, the receiving module 11 is further configured to: receive the flow table entry of the forwarding message sent by the CP, and receive the cached data messages sent by the cache function module, where the cached data messages are sent by the cache function module according to the storage order and a message sending instruction when it receives the message sending instruction sent by the CP; the message sending instruction contains the second user information of the cached data messages to be sent, and the second user information includes at least one of a MAC address, an IP address, a tunnel identifier, and the identifier of the hit flow table entry. The sending module 12 is further configured to forward the cached data messages according to the flow table entry of the forwarding message.
Optionally, the cache function module is directly connected to a port of the data message caching apparatus, the cache function module provides only a message storage function, and the operation indicated by the flow table entry of the cache message is to match the specified data messages and send them to the cache function module through the port connected to the cache function module;
the sending module 12 is configured to: and sending the data message matched with the flow table entry of the cache message to the cache function module through a port connected with the cache function module, wherein the cache function module is used for storing the data message according to first user information in the data message and the receiving sequence of the data message, and the first user information comprises at least one of a Media Access Control (MAC) address, an Internet Protocol (IP) address, a tunnel identifier and a hit flow table entry identifier.
Optionally, the cache function module is directly connected to a port of the data message caching apparatus, the cache function module is not limited to a single message storage function, and the operation indicated by the flow table entry of the cache message is to match the specified data messages, add the first control information (which contains at least a cache instruction), and send the data messages with the first control information added to the cache function module through the port connected to the cache function module;
the sending module 12 is configured to: adding first control information to a data message matched with a flow table entry of a cache message, sending the data message added with the first control information to a cache function module through a port connected with the cache function module, and storing the data message by the cache function module according to first user information in the data message and a receiving sequence of the data message, or storing the data message according to a serial number in the first control information, wherein the first user information comprises at least one of an MAC address, an IP address, a tunnel identifier and a hit identifier of the flow table entry.
Optionally, the second user information further includes a data message sending port; the cached data messages are sent to the data message caching apparatus after the cache function module adds the second control information to them in the storage order, and the second control information includes the data message sending port.
Optionally, the cache function module is intercommunicated with the data packet caching device through routing or switching, the cache function module has a non-single packet storage function, the operation indicated by the flow table entry of the cache packet is to match a specific data packet, the matched data packet is encapsulated in a service chain packet header, the encapsulated data packet is sent to the cache function module through a port connected with the cache function module, the service chain packet header carries information indicating the cached data packet and information that the data packet caching device needs to transmit to the cache function module, and the service chain packet header includes a packet header of a service chain protocol or a packet header of a tunnel protocol. The sending module is used for: and encapsulating the data message matched with the flow table entry of the cache message in a service chain packet header, and sending the encapsulated data message to the cache function module through a port connected with the cache function module, wherein the cache function module is used for storing the data message according to first user information in the data message and the receiving sequence of the data message, or storing the data message according to a serial number in the service chain packet header, and the first user information comprises at least one of an MAC address, an IP address, a tunnel identifier and a hit flow table entry identifier.
Optionally, the second user information further includes a data message sending port; the cached data messages are sent to the data message caching apparatus after the cache function module encapsulates them in a service chain packet header in the storage order, and the service chain packet header carries the data message sending port.
The apparatus of this embodiment may be used to implement the technical solutions of the method embodiments shown in fig. 4 or fig. 5, and the implementation principles and technical effects are similar, which are not described herein again.
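Purely as an illustration of the module split described above (the class name, method names, and the duck-typed flow entry interface are assumptions, not the application's API), a UP-side skeleton might look like this:

```python
class DataMessageCachingApparatusUP:
    """UP-side apparatus: a receiving module and a sending module (cf. Fig. 10)."""

    def __init__(self, cache_port, forward):
        self._cache_flow_entries = []     # flow table entries of the cache message from the CP
        self._forward_flow_entries = []   # flow table entries of the forwarding message from the CP
        self._cache_port = cache_port     # port connected to the cache function module
        self._forward = forward           # callback that forwards a message out of a given port

    # receiving module 11
    def on_flow_entry_from_cp(self, entry, is_cache_entry: bool) -> None:
        (self._cache_flow_entries if is_cache_entry
         else self._forward_flow_entries).append(entry)

    def on_data_message(self, message: bytes) -> None:
        if any(entry.matches(message) for entry in self._cache_flow_entries):
            self.send_to_cache_module(message)

    # sending module 12
    def send_to_cache_module(self, message: bytes) -> None:
        self._cache_port.send(message)

    def on_cached_message_from_module(self, message: bytes) -> None:
        for entry in self._forward_flow_entries:
            if entry.matches(message):
                self._forward(entry.out_port, message)
                break
```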
Fig. 11 is a schematic structural diagram of an embodiment of a data message caching apparatus provided by this application; the data message caching apparatus of this embodiment may be the cache function module. As shown in Fig. 11, the apparatus of this embodiment may include a receiving module 21 and a storage module 22. The receiving module 21 is configured to receive data messages, sent by the UP, that match the flow table entry of the cache message, where the data message caching apparatus is an internal module of the switch or an external module of the switch and the flow table entry of the cache message is sent by the CP to the UP; the storage module 22 is configured to store the data messages.
Fig. 12 is a schematic structural diagram of an embodiment of a data packet caching apparatus provided in this application, where the data packet caching apparatus in this embodiment may further include, on the basis of the apparatus shown in fig. 11: the sending module 23, the receiving module 21 are further configured to: receiving a message sending instruction sent by the CP, wherein the message sending instruction comprises second user information of a cache data message to be sent, and the second user information comprises at least one of an MAC address, an IP address, a tunnel identifier and an identifier of a hit flow table item; the sending module 23 is configured to send a cache data packet to the UP according to the storage sequence and the packet sending instruction, and is configured to forward the received cache data packet by the UP according to a flow entry of a forwarding packet sent by the CP.
Optionally, the data packet caching device is directly butted with a port of the UP, the data packet caching device has a single packet storage function, and an operation indicated by a flow entry of the cached packet matches a specified data packet and is sent to the data packet caching device through the port connected to the data packet caching device;
the storage module 22 is configured to: and storing the data messages according to first user information in the data messages and the receiving sequence of the data messages, wherein the first user information comprises at least one of a Media Access Control (MAC) address, an Internet Protocol (IP) address, a tunnel identifier and an identifier of a hit flow table item.
Optionally, the data packet caching device is directly connected with a port of the UP in a butt joint mode, the data packet caching device has a non-single packet storage function, the operation indicated by the flow table entry of the cached packet is to match the specified data packet, the first control information is added, the first control information at least comprises a caching instruction, and the data packet added with the first control information is sent to the data packet caching device through the port connected with the data packet caching device.
The storage module 22 is configured to: and storing the data messages according to the first user information in the data messages and the receiving sequence of the data messages, or storing the data messages according to the sequence number in the first control information, wherein the first user information comprises at least one of an MAC address, an IP address, a tunnel identifier and a hit identifier of the flow table entry.
Optionally, the second user information further includes a data packet sending port, and the sending module 23 is configured to:
and adding second control information to the cache data message to be sent according to the storage sequence and then sending the second control information to the UP, wherein the second control information comprises a data message sending port.
Optionally, the data packet caching device is intercommunicated with UP through routing or switching, the data packet caching device has a non-single packet storage function, the operation indicated by the flow table entry of the caching packet is to match a specified data packet, the matched data packet is encapsulated in a service chain packet header, the encapsulated data packet is sent to the data packet caching device through a port connected to the data packet caching device, the service chain packet header carries information used for indicating that the data packet is cached and information that UP needs to be transmitted to the data packet caching device, and the service chain packet header includes a packet header of a service chain protocol or a packet header of a tunnel protocol;
the storage module 22 is configured to: and storing the data messages according to the first user information in the data messages and the receiving sequence of the data messages, or storing the data messages according to the sequence number in the service chain packet header, wherein the first user information comprises at least one of an MAC address, an IP address, a tunnel identifier and a hit flow table entry identifier.
Optionally, the second user information further includes a data packet sending port, and the sending module 23 is configured to: and encapsulating the cache data messages to be sent in a service chain packet header according to the storage sequence and then sending the cache data messages to the UP, wherein the service chain packet header carries a data message sending port.
The apparatuses in the embodiments shown in fig. 11 and fig. 12 may be used to implement the technical solutions in the method embodiments shown in fig. 4 or fig. 5, and the implementation principles and technical effects are similar, which are not described herein again.
In the present application, the data message caching apparatus may be divided into function modules according to the above method examples; for example, each function may be assigned its own function module, or two or more functions may be integrated in one processing module. The integrated module may be implemented in hardware or as a software function module. It should be noted that the division of modules in the embodiments of the present application is schematic and is only a division by logical function; other division manners are possible in actual implementations.
Fig. 13 is a schematic diagram of another data packet caching apparatus provided in the present application, where the data packet caching apparatus 700 includes:
a memory 701 for storing program instructions, which may be a flash (flash memory).
a processor 702, configured to call and execute the program instructions in the memory to implement the steps of the data message caching method shown in Fig. 4 or Fig. 5; reference may be made to the description of the preceding method embodiments.
Alternatively, the memory 701 may be separate or integrated with the processor 702.
The data message caching apparatus of Fig. 13 may further include a transceiver (not shown in Fig. 13) for transmitting and receiving signals via an antenna.
The present application further provides a readable storage medium, where an execution instruction is stored in the readable storage medium, and when at least one processor of the data packet caching device executes the execution instruction, the data packet caching device executes the data packet caching method provided in the foregoing various embodiments.
The present application also provides a program product comprising execution instructions stored in a readable storage medium. At least one processor of the data message caching apparatus may read the execution instructions from the readable storage medium, and execution of the instructions by the at least one processor causes the data message caching apparatus to implement the data message caching method provided in the foregoing embodiments.
Those of ordinary skill in the art will understand that all or part of the above embodiments may be implemented by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented wholly or partially in the form of a computer program product. The computer program product includes one or more computer instructions; when the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example from one website, computer, server, or data center to another website, computer, server, or data center by wired means (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wireless means (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium that a computer can access, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., a Solid State Disk (SSD)), among others.

Claims (25)

1. A method for caching data messages, comprising:
a user plane UP receives a flow entry of a cache message sent by a control plane CP;
the UP receives a data message, and sends the data message matched with a flow table item of the cache message to a cache function module, wherein the data message is used for being stored by the cache function module, and the cache function module is an internal module of a switch or an external module of the switch;
the UP receives a flow table item of a forwarding message sent by the CP;
the UP receives a cache data message sent by the cache function module, wherein the cache data message is sent by the cache function module according to a storage sequence and a message sending instruction when the cache function module receives the message sending instruction sent by the CP;
if the cache function module is directly butted with the port of the UP, the function of the cache function module is a single message storage function, the operation indicated by the flow table item of the cache message is to match the specified data message, and the operation is sent to the cache function module through the port connected with the cache function module;
the UP sends the data packet matching with the flow entry of the cache packet to the cache function module, including:
and the UP sends the data message matched with the flow table entry of the cache message to the cache function module through a port connected with the cache function module, and the UP is used for the cache function module to store the data message according to first user information in the data message and the receiving sequence of the data message, wherein the first user information comprises at least one of a Media Access Control (MAC) address, an IP address, a tunnel identifier and an identifier of a hit flow table entry.
2. The method of claim 1, further comprising:
the message sending instruction comprises second user information of a cache data message to be sent, the second user information comprises at least one of an MAC address, an IP address, a tunnel identifier and an identifier of a hit flow table item, and the UP forwards the cache data message according to the flow table item of the forwarding message.
3. The method according to claim 2, wherein if the cache function module is directly interfaced with the port of the UP, the function of the cache function module is a non-single packet storage function, the operation indicated by the flow table entry of the cache packet is to match a specified data packet, add first control information, the first control information at least contains a cache instruction, and send the data packet added with the first control information to the cache function module through the port connected with the cache function module;
the UP sends the data packet matching with the flow entry of the cache packet to the cache function module, including:
the UP adds first control information to a data message matched with a flow table item of the cache message, and sends the data message added with the first control information to the cache function module through a port connected with the cache function module, so that the cache function module stores the data message according to first user information in the data message and the receiving sequence of the data message, or stores the data message according to a serial number in the first control information, wherein the first user information comprises at least one of an MAC address, an IP address, a tunnel identifier and a hit identifier of the flow table item.
4. The method according to claim 3, wherein the second user information further includes a data packet transmission port, the caching data packet is transmitted to the UP after the caching function module adds second control information according to a storage sequence, and the second control information includes the data packet transmission port.
5. The method according to claim 2, wherein if the cache function module interworks with the UP through routing or switching, the cache function module functions as a non-single packet storage function, the flow table entry of the cache packet indicates that the operation is to match a specific data packet, the matched data packet is encapsulated in a service chain packet header, and the encapsulated data packet is sent to the cache function module through a port connected to the cache function module, the service chain packet header carries information indicating that the data packet is cached and information that the UP needs to be transferred to the cache function module, and the service chain packet header includes a packet header of a service chain protocol or a packet header of a tunneling protocol;
the UP sends the data packet matching with the flow entry of the cache packet to the cache function module, including:
and the UP encapsulates the data message matched with the flow table entry of the cache message in a service chain packet header, and sends the encapsulated data message to the cache function module through a port connected with the cache function module, so that the cache function module stores the data message according to first user information in the data message and the receiving sequence of the data message, or stores the data message according to a serial number in the service chain packet header, wherein the first user information comprises at least one of an MAC address, an IP address, a tunnel identifier and a hit flow table entry identifier.
6. The method of claim 5, wherein the second user information further includes a data packet sending port, and wherein the buffered data packet is sent to the UP after the buffered data packet is encapsulated in a service chain packet header according to a storage order, and wherein the service chain packet header carries the data packet sending port.
7. A method for caching data messages, comprising:
a cache function module receives a data message which is sent by a user plane UP and matched with a flow table item of a cache message, wherein the cache function module is an internal module of a switch or an external module of the switch, and the flow table item of the cache message is sent to the UP by a control plane CP;
the cache function module stores the data message;
the cache function module receives a message sending instruction sent by the CP;
if the cache function module is directly butted with the port of the UP, the function of the cache function module is a single message storage function, the operation indicated by the flow table item of the cache message is to match the specified data message, and the operation is sent to the cache function module through the port connected with the cache function module;
the cache function module stores the data packet, including:
the cache function module stores the data message according to first user information in the data message and a receiving sequence of the data message, wherein the first user information comprises at least one of a Media Access Control (MAC) address, an Internet Protocol (IP) address, a tunnel identifier and an identifier of a hit flow table item.
8. The method of claim 7, further comprising:
the message sending instruction comprises second user information of a cache data message to be sent, wherein the second user information comprises at least one of an MAC address, an IP address, a tunnel identifier and an identifier of a hit flow table item;
and the cache function module sends cache data messages to the UP according to the storage sequence and the message sending instruction, and the cache function module is used for forwarding the received cache data messages by the UP according to the flow table items of the forwarding messages sent by the CP.
9. The method according to claim 8, wherein if the cache function module is directly interfaced with the port of the UP, the function of the cache function module is a non-single packet storage function, the operation indicated by the flow table entry of the cache packet is to match a specified data packet, add first control information, the first control information at least contains a cache instruction, and send the data packet added with the first control information to the cache function module through the port connected with the cache function module;
the cache function module stores the data packet, including:
the cache function module stores the data message according to first user information in the data message and a receiving sequence of the data message, or stores the data message according to a sequence number in the first control information, wherein the first user information comprises at least one of an MAC address, an IP address, a tunnel identifier and an identifier of a hit flow table item.
10. The method of claim 9, wherein the second user information further comprises a data message sending port, and wherein the sending, by the cache function module, of the cached data messages to the UP according to the storage order and the message sending instruction comprises:
and the cache function module adds second control information to the cache data message to be sent according to the storage sequence and then sends the second control information to the UP, wherein the second control information comprises the data message sending port.
11. The method according to claim 8, wherein if the caching function module interworks with the UP through routing or switching, the caching function module functions as a non-single packet storage function, the operation indicated by the flow table entry of the caching packet is to match a specific data packet, encapsulate the matched data packet in a service chain packet header, and send the encapsulated data packet to the caching function module through a port connected to the caching function module, wherein the service chain packet header carries information indicating that the data packet is cached and information that the UP needs to be transferred to the caching function module, and the service chain packet header includes a packet header of a service chain protocol or a packet header of a tunneling protocol;
the cache function module stores the data packet, including:
the cache function module stores the data message according to first user information in the data message and a receiving sequence of the data message, or stores the data message according to a sequence number in a service chain packet header, wherein the first user information comprises at least one of an MAC address, an IP address, a tunnel identifier and a hit flow table entry identifier.
12. The method of claim 11, wherein the second user information further comprises a data message sending port, and wherein the sending, by the cache function module, of the cached data messages to the UP according to the storage order and the message sending instruction comprises:
and the cache function module encapsulates cache data messages to be sent in a service chain packet header according to a storage sequence and then sends the cache data messages to the UP, wherein the service chain packet header carries the data message sending port.
13. A data message caching apparatus, comprising:
the receiving module is used for receiving a flow entry of the cache message sent by the control plane CP;
the receiving module is further configured to: receiving a data message;
the sending module is used for sending the data message matched with the flow table item of the cache message to the cache function module, the cache function module is used for storing the data message, and the cache function module is an internal module of the switch or an external module of the switch;
the receiving module is further configured to: receiving a flow table item of a forwarding message sent by the CP;
receiving a cache data message sent by the cache function module, wherein the cache data message is sent by the cache function module according to a storage sequence and a message sending instruction when the cache function module receives the message sending instruction sent by the CP;
if the cache function module is directly butted with a port of the data message cache device, the function of the cache function module is a single message storage function, the operation indicated by the flow entry of the cache message is to match the specified data message, and the operation is sent to the cache function module through the port connected with the cache function module;
the sending module is configured to:
and sending the data message matched with the flow table entry of the cache message to the cache function module through a port connected with the cache function module, wherein the cache function module is used for storing the data message according to first user information in the data message and the receiving sequence of the data message, and the first user information comprises at least one of a Media Access Control (MAC) address, an Internet Protocol (IP) address, a tunnel identifier and an identifier of a hit flow table entry.
14. The apparatus of claim 13, wherein:
the message sending instruction comprises second user information of a cache data message to be sent, wherein the second user information comprises at least one of an MAC address, an IP address, a tunnel identifier and an identifier of a hit flow table item;
the sending module is further configured to forward the cached data packet according to the flow entry of the forwarding packet.
15. The apparatus according to claim 14, wherein if the cache function module is directly interfaced with the port of the data packet cache apparatus, the function of the cache function module is a non-single packet storage function, the operation indicated by the flow table entry of the cache packet is to match a specified data packet, add first control information, the first control information at least contains a cache instruction, and send the data packet added with the first control information to the cache function module through the port connected with the cache function module;
the sending module is configured to:
adding first control information to a data message matched with a flow table entry of the cache message, sending the data message added with the first control information to the cache function module through a port connected with the cache function module, and storing the data message by the cache function module according to first user information in the data message and a receiving sequence of the data message, or storing the data message according to a serial number in the first control information, wherein the first user information comprises at least one of an MAC address, an IP address, a tunnel identifier and an identifier of a hit flow table entry.
16. The apparatus according to claim 15, wherein the second user information further includes a data packet sending port, the caching data packet is sent to the data packet caching apparatus after the caching function module adds second control information according to a storage sequence, and the second control information includes the data packet sending port.
17. The apparatus according to claim 14, wherein if the cache function module interworks with the data packet cache device through routing or switching, the cache function module functions as a non-single packet storage function, the operation indicated by the flow table entry of the cache packet is to match a specific data packet, the matched data packet is encapsulated in a service chain packet header, and the encapsulated data packet is sent to the cache function module through a port connected to the cache function module, the service chain packet header carries information indicating that the data packet is cached and information that the data packet cache device needs to transfer to the cache function module, and the service chain packet header includes a packet header of a service chain protocol or a packet header of a tunneling protocol;
the sending module is configured to:
and encapsulating the data message matched with the flow table entry of the cache message in a service chain packet header, and sending the encapsulated data message to the cache function module through a port connected with the cache function module, so that the cache function module stores the data message according to first user information in the data message and the receiving sequence of the data message, or stores the data message according to a serial number in the service chain packet header, wherein the first user information comprises at least one of an MAC address, an IP address, a tunnel identifier and a hit flow table entry identifier.
18. The apparatus according to claim 17, wherein the second user information further includes a data message sending port, the cached data message is encapsulated in a service chain packet header by the cache function module according to the storage order and then sent to the data message caching apparatus, and the service chain packet header carries the data message sending port.
19. A data message caching apparatus, comprising:
the apparatus comprises a receiving module and a storage module, wherein the receiving module is configured to receive a data message that is sent by a user plane UP and matches a flow table entry of a cached message, the data message caching apparatus is an internal module or an external module of a switch, and the flow table entry of the cached message is sent to the UP by a control plane CP;
the storage module is configured to store the data message;
the receiving module is further configured to receive a message sending instruction sent by the CP;
if the data message caching apparatus is directly interfaced with a port of the UP, the function of the data message caching apparatus is a single-message storage function, the operation indicated by the flow table entry of the cached message is to match a specified data message, and the data message is sent to the data message caching apparatus through the port connected with the data message caching apparatus;
the storage module is configured to:
storing the data message according to first user information in the data message and the receiving order of the data message, wherein the first user information comprises at least one of a media access control (MAC) address, an Internet Protocol (IP) address, a tunnel identifier and an identifier of a hit flow table entry.
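For illustration only, the sketch below models the storage module: cached packets are kept per user, keyed by whichever of the MAC address, IP address, tunnel identifier or hit flow entry identifier is available, and preserved in receiving order. The key derivation and method names are assumptions, not the claimed behaviour.

# Hypothetical sketch of the storage module keeping packets per user in receiving order.
from typing import Dict, List, Optional, Tuple


def user_key(mac: Optional[str] = None, ip: Optional[str] = None,
             tunnel_id: Optional[int] = None, flow_entry_id: Optional[int] = None) -> Tuple:
    """Build a lookup key from whichever first-user-information fields are present."""
    return (mac, ip, tunnel_id, flow_entry_id)


class StorageModule:
    def __init__(self):
        # user key -> packets in the order they were received
        self._cache: Dict[Tuple, List[bytes]] = {}

    def store(self, packet: bytes, **first_user_info) -> None:
        key = user_key(**first_user_info)
        self._cache.setdefault(key, []).append(packet)

    def pop_all(self, **first_user_info) -> List[bytes]:
        """Remove and return the user's cached packets, preserving storage order."""
        return self._cache.pop(user_key(**first_user_info), [])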
20. The apparatus of claim 19, wherein:
the message sending instruction comprises second user information of the cached data message to be sent, wherein the second user information comprises at least one of a MAC address, an IP address, a tunnel identifier and an identifier of a hit flow table entry;
the apparatus further comprises:
a sending module, configured to send the cached data message to the UP according to the storage order and the message sending instruction, so that the UP forwards the received cached data message according to the flow table entry of the forwarding message sent by the CP.
21. The apparatus of claim 20, wherein if the data message caching apparatus is directly interfaced with a port of the UP, the function of the data message caching apparatus is a non-single-message storage function, and the operation indicated by the flow table entry of the cached message is to match a specified data message, add first control information, the first control information at least containing a cache instruction, and send the data message with the first control information added to the data message caching apparatus through the port connected with the data message caching apparatus;
the storage module is configured to:
storing the data message according to first user information in the data message and the receiving order of the data message, or storing the data message according to a sequence number in the first control information, wherein the first user information comprises at least one of a MAC address, an IP address, a tunnel identifier and an identifier of a hit flow table entry.
22. The apparatus of claim 21, wherein the second user information further comprises a data message sending port, and wherein the sending module is configured to:
adding second control information to the cached data message to be sent according to the storage order and then sending the cached data message to the UP, wherein the second control information comprises the data message sending port.
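For illustration only, the following sketch prepends a second control information field carrying the data message sending port to each cached packet before it is returned to the UP in storage order. The 2-byte port field is an assumed layout, not the claimed encoding.

# Hypothetical sketch of second control information carrying the sending port.
import struct
from typing import Iterable, List


def add_second_control_info(packet: bytes, sending_port: int) -> bytes:
    """Prepend the egress port (2 bytes) that the UP should use for this cached packet."""
    return struct.pack("!H", sending_port) + packet


def replay_to_up(cached: Iterable[bytes], sending_port: int) -> List[bytes]:
    """Frames handed back to the UP in storage order; the UP strips the port and forwards."""
    return [add_second_control_info(p, sending_port) for p in cached]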
23. The apparatus of claim 20, wherein if the data message caching apparatus interworks with the UP through routing or switching, the function of the data message caching apparatus is a non-single-message storage function, and the operation indicated by the flow table entry of the cached message is to match a specified data message, encapsulate the matched data message in a service chain packet header, and send the encapsulated data message to the data message caching apparatus through the port connected with the data message caching apparatus, wherein the service chain packet header carries information indicating that the data message is to be cached and information that the UP needs to transfer to the data message caching apparatus, and the service chain packet header comprises a packet header of a service chain protocol or a packet header of a tunneling protocol;
the storage module is configured to:
storing the data message according to first user information in the data message and the receiving order of the data message, or storing the data message according to a sequence number in the service chain packet header, wherein the first user information comprises at least one of a MAC address, an IP address, a tunnel identifier and an identifier of a hit flow table entry.
24. The apparatus of claim 23, wherein the second user information further comprises a data message sending port, and wherein the sending module is configured to:
encapsulating the cached data messages to be sent in a service chain packet header according to the storage order and then sending the cached data messages to the UP, wherein the service chain packet header carries the data message sending port.
25. A readable storage medium storing instructions which, when executed by at least one processor of a data message caching apparatus, cause the data message caching apparatus to perform the data message caching method according to any one of claims 1 to 6 or any one of claims 7 to 12.
CN201710963684.9A 2017-10-17 2017-10-17 Data message caching method and device Active CN109672615B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710963684.9A CN109672615B (en) 2017-10-17 2017-10-17 Data message caching method and device

Publications (2)

Publication Number Publication Date
CN109672615A CN109672615A (en) 2019-04-23
CN109672615B true CN109672615B (en) 2022-06-14

Family

ID=66139531

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710963684.9A Active CN109672615B (en) 2017-10-17 2017-10-17 Data message caching method and device

Country Status (1)

Country Link
CN (1) CN109672615B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110366276B (en) * 2019-07-03 2022-04-12 中国联合网络通信集团有限公司 Service architecture base station
CN113141284B (en) * 2020-01-17 2022-07-19 大唐移动通信设备有限公司 Access network equipment and data transmission method
CN113596038B (en) * 2021-08-02 2023-04-07 武汉绿色网络信息服务有限责任公司 Data packet parsing method and server
CN115996192B (en) * 2023-03-14 2023-08-15 阿里巴巴(中国)有限公司 Data forwarding method, vehicle control method, private network equipment and equipment

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10021643B2 (en) * 2013-07-03 2018-07-10 Nokia Solutions And Networks Gmbh & Co. Kg User plane IDLE mode buffering within software defined network architecture
CN104348750B (en) * 2013-07-31 2019-07-26 中兴通讯股份有限公司 The implementation method and device of QoS in OpenFlow network
EP3217616B1 (en) * 2014-11-28 2018-11-21 Huawei Technologies Co., Ltd. Memory access method and multi-processor system
WO2017062066A1 (en) * 2015-10-06 2017-04-13 Intel IP Corporation Bearer-less architecture for a wireless cellular network

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105359472A (en) * 2014-05-16 2016-02-24 华为技术有限公司 Data processing method and apparatus for OpenFlow network
WO2017035723A1 (en) * 2015-08-31 2017-03-09 华为技术有限公司 Paging method and apparatus for distributed gateway
WO2017085570A1 (en) * 2015-11-17 2017-05-26 Telefonaktiebolaget Lm Ericsson (Publ) Service based intelligent packet-in buffering mechanism for openflow switches by having variable buffer timeouts

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
PiBuffer: An OpenFlow flow cache management model for data centers; Mao Jianbiao et al.; Chinese Journal of Computers (《计算机学报》); 2016-06-30; Vol. 39, No. 6; pp. 1092-1104 *

Also Published As

Publication number Publication date
CN109672615A (en) 2019-04-23

Similar Documents

Publication Publication Date Title
JP7079866B2 (en) Packet processing method and device
JP7303833B2 (en) Information transmission method and device
CN109672615B (en) Data message caching method and device
WO2018171791A1 (en) Data transmission method, access network device, terminal and communications system
CN109995654B (en) Method and device for transmitting data based on tunnel
CN104796227B (en) A kind of data transmission method and equipment
US10812292B2 (en) Packet processing method and device
CN110677345B (en) User message transmission method and communication equipment
JP2016511978A (en) Method, device and routing system for network virtualization data transmission
CN102857414A (en) Forwarding table writing method and device and message forwarding method and device
CN105531967B (en) Message transmission method, device and communication system
CN105379228A (en) Method, switch, and controller for implementing ARP
CN109936492A (en) A kind of methods, devices and systems by tunnel transmission message
US20210227608A1 (en) Method And Apparatus For Sending Multicast Data
WO2020034861A1 (en) Communication method and device
CN104796338A (en) Migration method and device of virtual machines
US10205610B2 (en) Uplink packet routing in a system-on-a-chip base station architecture
CN104780090A (en) VPN multicast transmission method and device PE equipment
US20230413154A1 (en) Data Unit Processing Method and Node
JP7298606B2 (en) Communication system and communication method
CA2975407A1 (en) Processing method for service flow packet, and apparatus
CN109873763A (en) A kind of communication means and equipment
JP2008219490A (en) Network system and address conversion method
CN109150752B (en) Cache control method, network element and controller
WO2019153295A1 (en) Mobile communication system, method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant