CN102523305B - Caching information pushing device, apparatus, method, system, and local area network - Google Patents


Publication number
CN102523305B
CN102523305B (application CN201110458436.1A)
Authority
CN
China
Prior art keywords
local area network (LAN), information, data
Prior art date
Legal status
Active
Application number
CN201110458436.1A
Other languages
Chinese (zh)
Other versions
CN102523305A (en)
Inventor
何少伟 (He Shaowei)
尹巍 (Yin Wei)
Current Assignee
Sangfor Technologies Co Ltd
Original Assignee
Sangfor Network Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Sangfor Network Technology Shenzhen Co Ltd filed Critical Sangfor Network Technology Shenzhen Co Ltd
Priority to CN201110458436.1A
Publication of CN102523305A
Application granted
Publication of CN102523305B

Abstract

The invention relates to a caching information pushing device, apparatus, method, and system, and to a local area network (LAN). The device is arranged in a LAN equipped with a wide area network (WAN) acceleration caching device. Based on the data cache information stored in the WAN acceleration caching device, it obtains the network information of all LANs that hold a given piece of data cache information, and pushes to each of those LANs the network information of the others, so that, for that piece of data cache information, any two of those LANs can perform cache acceleration of a data stream through their respective WAN acceleration caching devices. As a result, each client LAN learns the network information of the other client LANs that hold the same cached data, the learning mechanism of the client LANs is strengthened, and the limitation that cache acceleration takes effect only when the same data is retransmitted between the two WAN acceleration caching devices that the data stream has already passed through is avoided.

Description

Caching information pushing device, apparatus, method, system, and local area network (LAN)
Technical field
The present invention relates to information pushing devices and methods, and more particularly to a caching information pushing device, apparatus, method, and system, and to a local area network (LAN).
Background art
Network-based services suffer reduced performance over a wide area network (WAN) because of limited bandwidth. A WAN acceleration caching device addresses this bandwidth shortage, generally using compression and caching techniques to make full use of the available bandwidth. In operation, when a client LAN and a server-side LAN transmit a piece of data for the first time, the WAN acceleration caching devices in both LANs cache the new data. When the same data is later transmitted between the client LAN and the server-side LAN again, the sender's WAN acceleration caching device encodes the data so that only an index value crosses the WAN; the receiver's WAN acceleration caching device then decodes the index value and delivers the data it has cached to the destination terminal in the receiver's LAN.
In a WAN architecture, the server-side LAN and the client LANs form a tree structure expressing a hierarchical relationship, while the client LANs at the same level form a mesh network among themselves. At present, when the server-side LAN transmits certain data to one client LAN and simultaneously transmits the same data to another client LAN, both client LANs end up caching that data; yet the first transmission of that data directly between the two client LANs gains no caching benefit, because neither client LAN knows that the other holds the cached data.
Summary of the invention
The technical problem to be solved by the present invention is the defect, in prior-art WAN architectures, that cache acceleration takes effect only when identical data is retransmitted between the two WAN acceleration caching devices that the data stream has already passed through, there being no learning mechanism. To this end, a caching information pushing device, apparatus, method, and system, and a LAN, are provided.
The technical solution adopted by the present invention is to construct a caching information pushing device for a LAN. The caching information pushing device is arranged in a LAN equipped with a WAN acceleration caching device. Based on the data cache information stored in the WAN acceleration caching device, it obtains the network information of all LANs that hold a given piece of data cache information, and pushes to each of those LANs the network information of the other LANs (excluding that LAN itself), so that, for that piece of data cache information, any two of those LANs can perform cache acceleration of a data stream through their respective WAN acceleration caching devices.
In the caching information pushing device for a LAN of the present invention, the caching information pushing device triggers the push of the network information periodically; or only when the cached data in the WAN acceleration caching device of the LAN in which it is located reaches a predetermined amount; or upon receiving a query request sent by another LAN.
In the caching information pushing device for a LAN of the present invention, the LAN is a server-side LAN or a client LAN.
According to another aspect of the present invention, a multi-branch shared-stream cache acceleration apparatus is provided for at least two LANs that exchange data over a WAN, one such apparatus being arranged at the data exchange interface of each LAN. The apparatus comprises a WAN acceleration caching device, so that when two LANs exchange data, cache acceleration of the data stream is performed through their respective WAN acceleration caching devices. The apparatus further comprises a caching information pushing device which, based on the data cache information stored in the WAN acceleration caching device, obtains the network information of all LANs that hold a given piece of data cache information, and pushes to each of those LANs the network information of the other LANs (excluding that LAN itself), so that, for that piece of data cache information, cache acceleration of a data stream can be performed between any two of those LANs.
In the multi-branch shared-stream cache acceleration apparatus of the present invention, the caching information pushing device triggers the push of the network information periodically; or only when the cached data in the WAN acceleration caching device of the LAN in which it is located reaches a predetermined amount; or upon receiving a query request sent by another LAN.
In the multi-branch shared-stream cache acceleration apparatus of the present invention, the LAN is a server-side LAN or a client LAN.
According to another aspect of the present invention, a cache information pushing method for a LAN is provided, a WAN acceleration caching device being arranged at the data exchange interface of the LAN to perform cache acceleration of data streams. Based on the data cache information stored in the WAN acceleration caching device, the network information of all LANs that hold a given piece of data cache information is obtained, and the network information of the other LANs (excluding each LAN itself) is pushed to each of those LANs, so that, for that piece of data cache information, cache acceleration of a data stream can be performed between any two of those LANs.
In the cache information pushing method for a LAN of the present invention, the push of the network information is triggered periodically; or only when the cached data in the WAN acceleration caching device of the LAN reaches a predetermined amount; or upon receiving a query request sent by another LAN.
According to a further aspect of the present invention, a multi-branch shared-stream cache acceleration system is provided. It comprises at least two LANs that exchange data over a WAN, a WAN acceleration caching device being arranged at the data exchange interface of each LAN; one LAN is a server-side LAN and the remaining LANs are client LANs. The server-side LAN and a client LAN perform cache acceleration of a data stream through their respective WAN acceleration caching devices. A caching information pushing device is further arranged in the server-side LAN which, based on the data cache information stored in the WAN acceleration caching device of the server-side LAN, obtains the network information of all LANs that hold a given piece of data cache information, and pushes to each of those LANs the network information of the other LANs (excluding that LAN itself), so that, for that piece of data cache information, any two of those LANs can perform cache acceleration of a data stream through their respective WAN acceleration caching devices.
According to a further aspect of the present invention, a LAN is provided. A WAN acceleration caching device is arranged at the data exchange interface of the LAN to perform cache acceleration of data streams, and a caching information pushing device is arranged in the LAN. Based on the data cache information stored in the WAN acceleration caching device, the caching information pushing device obtains the network information of all LANs that hold a given piece of data cache information, and pushes to each of those LANs the network information of the other LANs (excluding that LAN itself), so that, for that piece of data cache information, any two of those LANs can perform cache acceleration of a data stream through their respective WAN acceleration caching devices.
Implementing the caching information pushing device, apparatus, method, system, and LAN of the present invention has the following beneficial effects: the caching information pushing device of the server-side LAN pushes the relevant network information to each client LAN, so that each client LAN learns the network information of the other client LANs that hold the same cached data. This strengthens the learning mechanism of the client LANs and avoids the limitation that cache acceleration takes effect only when identical data is retransmitted between the two WAN acceleration caching devices that the data stream has already passed through.
Brief description of the drawings
The invention is further described below with reference to the drawings and embodiments, in which:
Fig. 1 is a structural diagram of a first embodiment of the multi-branch shared-stream cache acceleration system of the present invention;
Fig. 2 is a structural diagram of a second embodiment of the multi-branch shared-stream cache acceleration system of the present invention;
Fig. 3 is a flowchart of the multi-branch shared-stream cache acceleration method of the present invention.
Embodiments
To make the objects, technical solutions, and advantages of the present invention clearer, the invention is further elaborated below with reference to the drawings and embodiments. It should be understood that the specific embodiments described herein serve only to explain the present invention and are not intended to limit it.
As shown in Figs. 1 and 2, the multi-branch shared-stream cache acceleration system of the present invention mainly comprises at least two LANs that exchange data over a WAN 3. Understandably, the data exchange may take place between clients, or between a client and a server; accordingly, a LAN may be a server-side LAN 1 or a client LAN 2. To speed up data transmission and reduce the bandwidth limitation, a WAN acceleration caching device 4 is arranged at the data exchange interface of each LAN (each client LAN 2 and the server-side LAN 1). Thus, the server-side LAN 1 and a client LAN 2 can perform cache acceleration of a data stream through the WAN acceleration caching device 4 of the server-side LAN 1 and the WAN acceleration caching device 4 of the client LAN 2. Specifically: when a client 21 in a client LAN 2 and a server 11 in the server-side LAN 1 transmit a piece of data for the first time, the respective WAN acceleration caching devices 4 in the client LAN 2 and the server-side LAN 1 cache the transmitted data. When the client 21 in that client LAN 2 later transmits the same data to the server 11 in the server-side LAN 1 again, the data is encoded in the WAN acceleration caching device 4 of the client LAN 2 so that only an index value is transmitted over the WAN 3; the index value is then decoded in the WAN acceleration caching device 4 of the server-side LAN 1, and finally the data cached by that device 4 is sent to the server 11. Likewise, two client LANs 2 can perform cache acceleration of a data stream through their respective WAN acceleration caching devices 4; the specific operation is similar to the workflow between a client LAN 2 and the server-side LAN 1.
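The first-transfer/repeat-transfer behavior described above can be sketched as a toy model in Python. The class below is purely illustrative (the patent specifies no API): on a first transfer the raw data crosses the WAN and both devices cache it, while on a repeat transfer only a short index value crosses; here a SHA-256 digest stands in for whatever index value a real device would use.

```python
import hashlib


class WanCacheDevice:
    """Toy sketch of a WAN acceleration caching device (hypothetical API)."""

    def __init__(self):
        self.cache = {}  # index value -> cached data

    def encode(self, data: bytes):
        """Sender side: return ('index', key) for cached data, else ('raw', data)."""
        key = hashlib.sha256(data).hexdigest()
        if key in self.cache:
            return ("index", key)      # repeat transfer: only the index crosses the WAN
        self.cache[key] = data         # first transfer: cache locally, send raw data
        return ("raw", data)

    def decode(self, kind: str, payload):
        """Receiver side: resolve an index against the local cache, or cache new data."""
        if kind == "index":
            return self.cache[payload]  # decode the index back into the cached data
        key = hashlib.sha256(payload).hexdigest()
        self.cache[key] = payload       # new data: cache it for later repeat transfers
        return payload
```

With a `sender` and `receiver` device, the first `encode`/`decode` round trip carries the raw bytes; the second carries only the index, matching the flow between client LAN 2 and server-side LAN 1 above.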
In addition, to perform data-stream cache acceleration between LANs accurately and safely, data matching and verification is required. That is, when data is transferred between the server-side LAN 1 and a client LAN 2, or between two client LANs 2, data matching must be performed before the transfer is cache-accelerated by the sender's and receiver's respective WAN acceleration caching devices 4. If the match succeeds, only the index value is sent and the transfer is accelerated from cache; if the match fails, the original data is transmitted. In operation, the sender's terminal sends the data to the sender's WAN acceleration caching device 4, which stores data cache information. Each piece of data cache information records the sender/receiver pair for which cache acceleration can be performed and the characteristics of the corresponding transmitted data; for example, a piece of data cache information may comprise a characteristic value, the cached data, and peer network information. Upon receiving data to be transmitted, the WAN acceleration caching device 4 matches it against each piece of data cache information in turn. For a given piece of data cache information, the characteristic value of the data to be transmitted is first extracted and matched against the stored characteristic value; if this succeeds, the next matching stage is entered; if it fails, the next piece of data cache information is tried. If every piece of data cache information has been tried without success, the data to be transmitted is new data: the original data is sent to the receiver over the WAN 3, and the WAN acceleration caching device 4 meanwhile derives a new piece of data cache information from the receiver's information and the data to be transmitted. After the characteristic values match, the WAN acceleration caching device 4 performs data matching, i.e. it compares the data to be transmitted with the cached data of that piece of data cache information. If they are consistent, the match succeeds and the next stage is entered; if not, the match fails and the next piece of data cache information is tried, falling back to transmitting the original data (and deriving a new piece of data cache information) once all pieces have been exhausted. After the data matching succeeds, the peer network information is matched: this final stage compares the receiver's information with the peer network information recorded in the piece of data cache information. If they are consistent, the match succeeds, and the WAN acceleration caching device sends the index value to the receiver; the receiver's WAN acceleration caching device 4 decodes the index value, and the data cached by the receiver's device 4 is finally sent to the appropriate terminal in the receiver's LAN. Otherwise, the next piece of data cache information is tried; and if all pieces fail to match, the data to be transmitted is treated as new data, the original data is sent to the receiver over the WAN 3, and a new piece of data cache information is derived from the receiver's information and the data to be transmitted.
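The three-stage match described above (characteristic value, then cached data, then peer network information) can be sketched as follows. This is a minimal illustration, not the patent's implementation: the patent does not specify how the characteristic value is computed, so a simple byte-sum stands in for it, and `CacheEntry`, `make_entry`, and `match_entry` are hypothetical names.

```python
from dataclasses import dataclass


@dataclass
class CacheEntry:
    feature: int   # characteristic value of the cached data
    data: bytes    # the cached data itself
    peer: str      # peer network information recorded for this entry


def feature_of(data: bytes) -> int:
    """Stand-in for the patent's unspecified characteristic value."""
    return sum(data) % 2**16


def make_entry(data: bytes, peer: str) -> CacheEntry:
    return CacheEntry(feature_of(data), data, peer)


def match_entry(entries, data: bytes, receiver: str):
    """Return the entry passing all three stages, or None (send raw data)."""
    feature = feature_of(data)
    for e in entries:
        if e.feature != feature:
            continue  # stage 1 (characteristic value) failed: try next entry
        if e.data != data:
            continue  # stage 2 (cached data) failed: try next entry
        if e.peer != receiver:
            continue  # stage 3 (peer network information) failed: try next entry
        return e      # all stages matched: the index value can be sent
    return None       # new data: transmit raw and derive a new cache entry
```

When `match_entry` returns an entry, only the index value crosses the WAN; when it returns `None`, the original data is transmitted and a new piece of data cache information is created, mirroring the fallback path in the text.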
However, while one client LAN 2 and the server-side LAN 1 are performing cache acceleration of a certain data transfer, even if another client LAN 2 has already transmitted the same data with the server-side LAN 1, that other client LAN 2 cannot learn that this client LAN also caches the data. When the two clients later transfer that data to each other, they must therefore still perform the first-time transfer of new data; cache acceleration takes effect only when identical data is retransmitted between two WAN acceleration caching devices that the data stream has already passed through. For this reason, as shown in Fig. 1, a caching information pushing device 5 is further arranged in the server-side LAN 1. Based on the data cache information stored in the WAN acceleration caching device 4 of the server-side LAN 1, it pushes, to every client LAN 2 holding a given piece of cached data, the network information of the other client LANs holding the same cached data; in other words, each client LAN 2 is told which other client LANs 2 hold cached data identical to its own. Consequently, when any client LAN 2 exchanges data with another client LAN 2, even on a first-time transfer between those two client LANs, the sending client LAN 2 knows whether the receiving client LAN 2 holds cached data identical to the data to be transmitted, so that cache acceleration of the data stream can be performed between two client LANs 2 holding the same cached data. The WAN acceleration caching device 4 and the caching information pushing device 5 together form a multi-branch shared-stream cache acceleration apparatus 6.
To keep each client LAN 2's knowledge of the peer network information of the other client LANs holding the same cached data timely and up to date, the server-side LAN 1 triggers its caching information pushing device 5 periodically to push the network information. Understandably, the server-side LAN 1 can implement active pushing in several ways: besides a periodic trigger, it may, for example, push only after the amount of data cached in the server-side LAN 1 reaches a certain threshold. In addition to active pushing by the server-side LAN 1, active querying by a client LAN 2 can also be used: a client LAN searches its locally cached data to discover other network nodes holding the same cached data, i.e. the caching information pushing device 5 of the server-side LAN 1 triggers a push of the network information upon receiving a query request sent by a client LAN 2.
To further support pushing of the network information from a LAN other than the server-side LAN, a caching information pushing device 5 may also be arranged in a client LAN 2, as shown in Fig. 2. Based on the data cache information stored in the WAN acceleration caching device of that client LAN, it pushes to each other client LAN the network information of the further client LANs holding the same cached data. That is, the caching information pushing device 5 scans the cached data in the WAN acceleration caching device 4, computes the network information of the client LANs associated with each piece of cached data, and informs each client LAN of all the other client LANs holding the same cached data, so that cache acceleration of a data stream can be performed between two client LANs holding the same cached data.
Fig. 3 shows the flow of the multi-branch shared-stream cache acceleration method of the present invention, based on the system structure shown in Fig. 2. The detailed process is:
S31: Based on the data cache information stored in the WAN acceleration caching device 4, the caching information pushing device 5 obtains the network information of all LANs that hold a given piece of data cache information. In operation, the caching information pushing device 5 can perform this step for each piece of data cache information in the WAN acceleration caching device 4; that is: for data cache information a1, obtain the network information of all LANs holding a1; for data cache information a2, obtain the network information of all LANs holding a2; ...; for data cache information an, obtain the network information of all LANs holding an. Preferably, the push of the network information is triggered periodically; or only when the cached data in the WAN acceleration caching device of the LAN reaches a predetermined amount; or upon receiving a query request sent by another LAN.
S32: The network information of the other LANs (excluding each LAN itself) is pushed to each of those LANs, so that cache acceleration of a data stream can be performed between any two of those LANs for that piece of data cache information. That is, each client LAN 2 is told which other client LANs 2 hold cached data identical to its own, so that when any client LAN 2 exchanges data with another client LAN 2, even on a first-time transfer between those two client LANs, the sending client LAN 2 knows whether the receiving client LAN 2 holds cached data identical to the data to be transmitted, and cache acceleration of the data stream can be performed between two client LANs 2 holding the same cached data.
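Steps S31 and S32 amount to computing, per piece of data cache information, the set of holding LANs, and then sending each LAN the holders other than itself. A minimal sketch (function and record names are illustrative, not the patent's):

```python
from collections import defaultdict


def build_push_lists(cache_records):
    """S31/S32 sketch.

    cache_records maps a cache-entry id (a1, a2, ..., an) to the set of LANs
    known to hold it. Returns, per LAN, a dict mapping each entry id to the
    sorted list of peer LANs that also hold it (the LAN itself excluded).
    """
    push = defaultdict(dict)
    for entry_id, lans in cache_records.items():
        for lan in lans:
            peers = sorted(lans - {lan})  # every holder except the LAN itself
            if peers:                     # nothing to push if no peer holds it
                push[lan][entry_id] = peers
    return dict(push)
```

For example, if `a1` is held by LANs A, B, and C, LAN A is told about B and C while B is told about A and C; an entry held by a single LAN produces no push at all, matching S32's exclusion of the LAN itself.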
The foregoing are only preferred embodiments of the present invention and are not intended to limit it; any modifications, equivalent replacements, and improvements made within the spirit and principles of the present invention shall fall within the scope of protection of the present invention.

Claims (10)

1. A caching information pushing device for a local area network (LAN), characterized in that the caching information pushing device is arranged in a LAN equipped with a WAN acceleration caching device; based on the data cache information stored in the WAN acceleration caching device, it obtains the network information of all LANs that hold a given piece of data cache information, and pushes to each of those LANs the network information of the other LANs (excluding that LAN itself), so as to inform each LAN of the network information of the other LANs holding the same data cache information, whereby, for that piece of data cache information, cache acceleration of a data stream is performed between any two of those LANs through their respective WAN acceleration caching devices.
2. The caching information pushing device for a LAN according to claim 1, characterized in that the caching information pushing device triggers the push of the network information periodically; or only when the cached data in the WAN acceleration caching device of the LAN in which it is located reaches a predetermined amount; or upon receiving a query request sent by another LAN.
3. The caching information pushing device for a LAN according to claim 1, characterized in that the LAN is a server-side LAN or a client LAN.
4. A multi-branch shared-stream cache acceleration apparatus for at least two LANs that exchange data over a WAN, one such apparatus being arranged at the data exchange interface of each LAN, the apparatus comprising a WAN acceleration caching device through which, when two LANs exchange data, cache acceleration of the data stream is performed; characterized in that the apparatus further comprises a caching information pushing device for obtaining, based on the data cache information stored in the WAN acceleration caching device, the network information of all LANs that hold a given piece of data cache information, and pushing to each of those LANs the network information of the other LANs (excluding that LAN itself), so as to inform each LAN of the network information of the other LANs holding the same data cache information, whereby cache acceleration of a data stream is performed between any two of those LANs for that piece of data cache information.
5. The multi-branch shared stream cache acceleration device according to claim 4, wherein the caching information pushing device pushes the network information on a timed trigger; or triggers a push of the network information only when the cached data in the wide area network cache acceleration device of the local area network in which it is located reaches a predetermined amount; or triggers a push of the network information upon receiving a query request sent by another local area network.
6. The multi-branch shared stream cache acceleration device according to claim 4, wherein the local area network is a server-side local area network or a client-side local area network.
7. A caching information pushing method for a local area network, wherein a wide area network cache acceleration device is arranged at the data exchange interface of the local area network to perform cache acceleration of data flows; characterized in that, according to the data cache information stored in the wide area network cache acceleration device, the network information of all local area networks holding a certain piece of data cache information is obtained, and the network information of the other local area networks, excluding itself, is pushed to each of those local area networks, so as to inform each local area network of the other local area networks holding the same data cache information, whereby cache acceleration of the data flow for that data cache information can be performed between any two of those local area networks.
8. The caching information pushing method for a local area network according to claim 7, wherein the push of the network information is triggered on a timer; or the push of the network information is triggered only when the cached data in the wide area network cache acceleration device of the local area network reaches a predetermined amount; or the push of the network information is triggered upon receiving a query request sent by another local area network.
9. A multi-branch shared stream cache acceleration system, comprising at least two local area networks that exchange data over a wide area network, wherein a wide area network cache acceleration device is arranged at the data exchange interface of each local area network; one local area network is a server-side local area network and the remaining local area networks are client-side local area networks, and the server-side local area network and a client-side local area network perform cache acceleration of data flows through their respective wide area network cache acceleration devices; characterized in that a caching information pushing device is further arranged in the server-side local area network, which, according to the data cache information stored in the wide area network cache acceleration device of the server-side local area network, obtains the network information of all local area networks holding a certain piece of data cache information, and pushes to each of those local area networks the network information of the other local area networks, excluding itself, so as to inform each local area network of the other local area networks holding the same data cache information, whereby, for that data cache information, cache acceleration of the data flow can be performed between any two of those local area networks through their respective wide area network cache acceleration devices.
10. A local area network, wherein a wide area network cache acceleration device is arranged at the data exchange interface of the local area network to perform cache acceleration of data flows; characterized in that a caching information pushing device is arranged in the local area network, which, according to the data cache information stored in the wide area network cache acceleration device, obtains the network information of all local area networks holding a certain piece of data cache information, and pushes to each of those local area networks the network information of the other local area networks, excluding itself, so as to inform each local area network of the other local area networks holding the same data cache information, whereby, for that data cache information, cache acceleration of the data flow can be performed between any two of those local area networks through their respective wide area network cache acceleration devices.
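As a minimal sketch of the pushing logic shared by the device, method and system claims above: given each local area network's report of which data-cache entries its wide area network cache acceleration device holds, compute for every entry the set of holders, then push to each holder the network information of the other holders, excluding itself. All names here (`compute_push_sets`, the fingerprint strings, the LAN identifiers) are hypothetical illustrations, not terms from the patent.

```python
from collections import defaultdict

def compute_push_sets(cache_reports):
    """cache_reports maps a LAN's network information (e.g. its WAN-side
    address) to the set of data-cache fingerprints its wide area network
    cache acceleration device holds.  Returns, per LAN and per fingerprint,
    the sorted list of OTHER LANs holding the same cached data, i.e. the
    network information the pushing device would send to that LAN."""
    holders = defaultdict(set)            # fingerprint -> set of LANs holding it
    for lan, fingerprints in cache_reports.items():
        for fp in fingerprints:
            holders[fp].add(lan)

    push = defaultdict(dict)              # lan -> {fingerprint: other holders}
    for fp, lans in holders.items():
        if len(lans) < 2:                 # no peer shares this data; nothing to push
            continue
        for lan in lans:
            push[lan][fp] = sorted(lans - {lan})
    return dict(push)
```

For example, if `lan_a` and `lan_c` both cache the data behind fingerprint `"f2"`, each is told the other's network information, so a later transfer of that data between them can be cache-accelerated directly rather than re-sent in full over the wide area network.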
CN201110458436.1A 2011-12-31 2011-12-31 Caching information pushing device, member, method and system and local area network Active CN102523305B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201110458436.1A CN102523305B (en) 2011-12-31 2011-12-31 Caching information pushing device, member, method and system and local area network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201110458436.1A CN102523305B (en) 2011-12-31 2011-12-31 Caching information pushing device, member, method and system and local area network

Publications (2)

Publication Number Publication Date
CN102523305A (en) 2012-06-27
CN102523305B (en) 2014-10-15 (granted)

Family

ID=46294092

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201110458436.1A Active CN102523305B (en) 2011-12-31 2011-12-31 Caching information pushing device, member, method and system and local area network

Country Status (1)

Country Link
CN (1) CN102523305B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103401792A * 2013-07-04 2013-11-20 Institute of Acoustics, Chinese Academy of Sciences Adaptive upload acceleration apparatus for mobile terminal

Citations (4)

Publication number Priority date Publication date Assignee Title
CN101690079A * 2007-03-12 2010-03-31 Citrix Systems, Inc. Systems and methods for using compression histories to improve network performance
CN101841387A * 2009-03-19 2010-09-22 China Mobile Group Jiangxi Co., Ltd. Wide area network data speed acceleration method, device and system
CN101945103A * 2010-08-09 2011-01-12 The 54th Research Institute of China Electronics Technology Group Corporation IP (Internet Protocol) network application accelerating system
CN102263687A * 2011-08-11 2011-11-30 Wuhan Siwei Tongfei Network Technology Co., Ltd. VPN (virtual private network) speed-up gateway in a WAN (wide area network) and speed-up communication method thereof

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US7613131B2 (en) * 2005-11-10 2009-11-03 Citrix Systems, Inc. Overlay network infrastructure

Also Published As

Publication number Publication date
CN102523305A (en) 2012-06-27

Similar Documents

Publication Publication Date Title
CN103297343B (en) Routing method based on delay tolerant network
CN104519036A (en) Method and device for sending service request message
CN101547161A (en) Folder transmission system, folder transmission device and folder transmission method
CN109905409A (en) Things-internet gateway real time bidirectional communication system based on Socket.IO
EP2571296B1 (en) Method, device and mobile multi-media broadcasting service system for transmitting data information
GB2589211A (en) Methods and systems of using remote subscriber identification modules at device
CN101136943A (en) System and method for implementing extended Diameter protocol application
CN110752943B (en) Distributed fault diagnosis system and method for power transmission line
CN103440142A (en) GPRS (General Packet Radio Service)-based remote upgrade and dynamic loading method
CN102420876A (en) Active peer-to-peer file transmission method for android-based intelligent mobile terminal
CN108494826A (en) A kind of distribution cloud storage method and system
CN109118360A (en) Block chain account checking method, device, equipment and storage medium
CN106059936B (en) The method and device of cloud system Multicast File
CN105656964B (en) The implementation method and device of data-pushing
CN109673018A (en) Novel cache contents in Wireless Heterogeneous Networks are placed and content caching distribution optimization method
CN103209195A (en) Data acquisition method, terminal and far-end device
CN107302582A (en) The data acquisition of millions scenes of internet of things and weak method for pushing
CN102523305B (en) Caching information pushing device, member, method and system and local area network
CN105978796A (en) Message communication method and system based on unstable mobile network
CN101087263B (en) A method and system for capturing user status information via search engine
CN102368767B (en) Internet acceleration method and system based on HFC (hybrid fiber coaxial) network
CN103701865A (en) Data transmission method and system
CN106982165A (en) Data compression method and its system
CN106535231B (en) Content transmission method for 5G user-oriented central network Cache deployment
EP2999266A1 (en) Method, device and system for obtaining mobile network data resources

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20200623

Address after: Building A1, Nanshan Zhi Park, No. 1001 Xueyuan Road, Nanshan District, Shenzhen, Guangdong 518000

Patentee after: SANGFOR TECHNOLOGIES Inc.

Address before: 518000 Nanshan Science and Technology Pioneering service center, No. 1 Qilin Road, Guangdong, Shenzhen 418, 419,

Patentee before: Sangfor Network Technology (Shenzhen) Co., Ltd.