Summary of the Invention
The main purpose of the present invention is to provide a load-balancing method, apparatus, and system, so as to solve the problem that a single point of failure on a server has a large impact.
To achieve this goal, according to one aspect of the present invention, a load-balancing method is provided. The load-balancing method according to the present invention comprises: a distribution server receives a request instruction from a client, where the request instruction requests from the distribution server the information of the currently available front-end servers, and a front-end server provides a data access channel for the client to access an application server; and after receiving the request instruction, the distribution server sends a data list to the client, where the data list stores the information of the currently available front-end servers, and the client accesses a front-end server according to the data list.
Further, after the distribution server sends the data list to the client, the method also comprises: the distribution server obtains updated information of the currently available front-end servers; the distribution server updates the data list according to the updated information; and the distribution server sends the updated data list to the client.
Further, there are multiple distribution servers, including a first distribution server and a second distribution server, and multiple data lists, including a first data list and a second data list. The first data list stores the currently available front-end server information obtained by the first distribution server, and the second data list stores the currently available front-end server information obtained by the second distribution server; the first data list and the second data list store the same information. After the request instruction is received, sending the data list to the client comprises: judging whether access to the first distribution server succeeds; if access to the first distribution server succeeds, the first distribution server sends the first data list to the client; if access to the first distribution server fails, the second distribution server receives a first instruction, where the first instruction is a pre-stored access instruction for accessing the second distribution server, and the second distribution server sends the second data list to the client.
Further, the request instruction is an instruction that accesses the first distribution server through the domain name corresponding to the first distribution server.
Further, the first instruction is an instruction that accesses the second distribution server through the IP address corresponding to the second distribution server.
To achieve this goal, according to another aspect of the present invention, another load-balancing method is provided. The load-balancing method according to the present invention comprises: a client sends a request instruction to a distribution server, where the request instruction requests from the distribution server the information of the currently available front-end servers, and a front-end server provides a data access channel for the client to access an application server; the client receives a data list, where the data list stores the information of the currently available front-end servers and is sent by the distribution server; and the client accesses a front-end server according to the data list.
Further, there are multiple front-end servers, including a first front-end server and a second front-end server, and the client accessing a front-end server according to the data list comprises: the client obtains the data list; the client determines, according to the data list, a connection request instruction for connecting to the first front-end server, where the connection request instruction requests connected access to a front-end server, and the availability of the first front-end server is higher than the availability of the second front-end server; and the client accesses the first front-end server through the connection request instruction.
To achieve this goal, according to a further aspect of the present invention, a load-balancing apparatus is provided. The load-balancing apparatus according to the present invention comprises: a receiving unit, configured to receive a request instruction from a client, where the request instruction requests from the distribution server the information of the currently available front-end servers, and a front-end server provides a data access channel for the client to access an application server; and a sending unit, configured to send, after the request instruction is received, a data list to the client, where the data list stores the information of the currently available front-end servers, and the client accesses a front-end server according to the data list.
To achieve this goal, according to a further aspect of the present invention, another load-balancing apparatus is provided. The load-balancing apparatus according to the present invention comprises: a sending unit, configured to send a request instruction to a distribution server, where the request instruction requests from the distribution server the information of the currently available front-end servers, and a front-end server provides a data access channel for the client to access an application server; a receiving unit, configured to receive a data list, where the data list stores the information of the currently available front-end servers and is sent by the distribution server; and an access unit, configured to access a front-end server according to the data list.
To achieve this goal, according to a further aspect of the present invention, a load-balancing system is provided. The load-balancing system according to the present invention comprises: a client, configured to access a distribution server, receive the data list sent by the distribution server, where the data list stores the information of the currently available front-end servers, and access a front-end server according to the data list; a distribution server, configured to receive a request instruction from the client, where the request instruction requests from the distribution server the information of the currently available front-end servers, and to send the data list to the client after the request instruction is received; and a front-end server, configured to provide a data access channel for the client to access an application server.
Through the present invention, a method comprising the following steps is adopted: a distribution server receives a request instruction from a client, where the request instruction requests from the distribution server the information of the currently available front-end servers, and a front-end server provides a data access channel for the client to access an application server; and after receiving the request instruction, the distribution server sends a data list to the client, where the data list stores the information of the currently available front-end servers, and the client accesses a front-end server according to the data list. Because the client obtains the data list storing the information of the currently available front-end servers and accesses a front-end server according to this list, the problem that a single point of failure on a server has a large impact is solved, and the impact caused by a single point of failure is significantly reduced.
Embodiments
It should be noted that, provided there is no conflict, the embodiments of the application and the features in the embodiments can be combined with one another. The present invention is described in detail below with reference to the accompanying drawings and in conjunction with the embodiments.
To enable those skilled in the art to better understand the solution of the application, the technical solutions in the embodiments of the application are described clearly and completely below in conjunction with the accompanying drawings in the embodiments of the application. Obviously, the described embodiments are only some of the embodiments of the application, not all of them. Based on the embodiments in the application, all other embodiments obtained by those of ordinary skill in the art without creative work shall fall within the scope of protection of the application.
It should be noted that the terms "first", "second", and the like in the specification, claims, and accompanying drawings of the application are used to distinguish similar objects and are not necessarily used to describe a specific order or sequence. It should be understood that data so used may be interchanged where appropriate, so that the embodiments of the application described herein can be implemented. In addition, the terms "comprise" and "have" and any variations thereof are intended to cover non-exclusive inclusion; for example, a process, method, system, product, or device that contains a series of steps or units is not necessarily limited to the steps or units expressly listed, but may include other steps or units that are not expressly listed or that are inherent to the process, method, product, or device.
In the present invention, some technical terms are explained as follows:
Load balancing (Load Balance): distributing tasks in a balanced way across multiple operating units for execution, such as web servers, FTP servers, enterprise mission-critical application servers, and other mission-critical servers, so that the work is completed jointly.
Server (Server): broadly, a server refers to a computer system that can provide services to other machines in a network. If a computer provides an FTP service externally, it can also be called a server.
Client (Client), also called the user side: a program that corresponds to a server and provides local services to the user. Except for applications that run purely locally, a client is generally installed on an ordinary client computer and needs to run in cooperation with a server side. With the development of the Internet, common clients include the web browser used for the World Wide Web, email clients used for sending and receiving email, and instant-messaging client software. For this class of applications, a corresponding server and service program must exist in the network to provide the corresponding service, such as a database service or an email service; a specific communication connection must therefore be established between the client and the server to ensure the normal operation of the application.
Single point of failure (Single Point of Failure): literally, a failure occurring at a single point. The term is usually applied to computer systems and networks: as soon as one place or one server in the entire network fails, the whole network is paralyzed.
Fig. 1 is a flowchart of a load-balancing method according to a first embodiment of the present invention. As shown in Fig. 1, the method comprises the following steps S101 to S103:
Step S101: the client sends a request instruction to the distribution server.
The client sends a request instruction to the distribution server, where the request instruction requests from the distribution server the information of the currently available front-end servers, and a front-end server provides a data access channel for the client to access an application server. The request instruction is an instruction that accesses the distribution server through the domain name corresponding to the distribution server.
For example, the client requests access to the distribution server through http://www.sina.com, that is, the client sends the request instruction to the distribution server corresponding to www.sina.com. This distribution server obtains the relevant information of the currently available front-end servers, specifically information such as the server name, IP address, server state, and availability, and stores this relevant information in a data list. If the client successfully accesses the distribution server through www.sina.com, the distribution server sends this data list to the client.
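As a minimal sketch of how a client might parse such a data list: the JSON encoding and the field names name, ip, state, and availability below are illustrative assumptions (the embodiment lists this information but does not specify a wire format).

```python
import json

# Hypothetical wire format for the data list; the field names are
# assumptions drawn from the information listed above (server name,
# IP address, server state, availability).
def parse_data_list(payload: str) -> list:
    """Parse the JSON data list, keeping only front-end servers marked up."""
    servers = json.loads(payload)
    return [s for s in servers if s.get("state") == "up"]

# Illustrative payload such as the distribution server might return.
example_payload = json.dumps([
    {"name": "fe-1", "ip": "10.0.0.1", "state": "up", "availability": 0.99},
    {"name": "fe-2", "ip": "10.0.0.2", "state": "down", "availability": 0.0},
])
```

A real client would obtain the payload over the network from the distribution server rather than constructing it locally as done here.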
Step S102: the client receives the data list.
The distribution server sends the data list to the client, and the client receives it. The data list stores the information of the currently available front-end servers. The data lists received by the client may be different data lists sent by different distribution servers, but the data they store is identical, because the currently available front-end server information obtained by the different distribution servers is the same.
Step S103: the client accesses a front-end server according to the data list.
After receiving the data list, the client selects a front-end server from the data list of currently available front-end servers and accesses it.
Preferably, in order to improve the efficiency with which the client accesses a front-end server, in the load-balancing method provided by this embodiment of the present invention, there are multiple front-end servers, including a first front-end server and a second front-end server, and the client accessing a front-end server according to the data list comprises: the client obtains the data list; the client determines, according to the data list, a connection request instruction for connecting to the first front-end server, where the connection request instruction requests connected access to a front-end server, and the availability of the first front-end server is higher than the availability of the second front-end server; and the client accesses the first front-end server through the connection request instruction.
By consulting the relevant information of the currently available front-end servers stored in the data list and selecting the front-end server with the best availability as the access server, the efficiency with which the client accesses a front-end server is improved.
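This selection step can be sketched as follows, under the assumption that each data-list entry carries a numeric availability score; the patent does not define how availability is measured, so the score and entries below are illustrative only.

```python
# Pick the front-end server with the highest availability from the data
# list; the "state" and "availability" fields are assumed names.
def choose_front_end(data_list):
    available = [s for s in data_list if s.get("state") == "up"]
    if not available:
        raise RuntimeError("no currently available front-end server")
    return max(available, key=lambda s: s["availability"])

sample_data_list = [
    {"name": "fe-1", "state": "up", "availability": 0.99},
    {"name": "fe-2", "state": "up", "availability": 0.90},
]
```

Here fe-1 would be chosen, since its availability is higher than that of fe-2.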
In the load-balancing method provided by this embodiment of the present invention, the client sends a request instruction to the distribution server, the client receives a data list, and the client accesses a front-end server according to the data list. Because the client obtains the data list storing the information of the currently available front-end servers and accesses a front-end server according to this list, the problem that a single point of failure on a server has a large impact is solved, and the impact caused by a single point of failure is significantly reduced.
Fig. 2 is a flowchart of a load-balancing method according to a second embodiment of the present invention. As shown in Fig. 2, the method comprises the following steps S201 to S202:
Step S201: the distribution server receives a request instruction from the client.
The distribution server receives the request instruction from the client, where the request instruction requests from the distribution server the information of the currently available front-end servers, and a front-end server provides a data access channel for the client to access an application server.
Step S202: after receiving the request instruction, the distribution server sends a data list to the client.
After receiving the request instruction, the distribution server sends the data list to the client, where the data list stores the information of the currently available front-end servers, and the client accesses a front-end server according to this data list.
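The distribution server's side of steps S201 and S202 can be sketched as follows; the class, the in-memory list, and the literal request-instruction token are illustrative assumptions, not an interface defined by the patent.

```python
# Sketch of steps S201-S202: on receiving the request instruction,
# reply with the stored data list of currently available front-end
# servers; otherwise reject the request.
class DistributionServer:
    def __init__(self, data_list=None):
        # The data list of currently available front-end servers.
        self.data_list = list(data_list or [])

    def handle(self, request):
        if request == "GET_AVAILABLE_FRONT_ENDS":  # the request instruction
            return list(self.data_list)  # send the data list to the client
        raise ValueError("unknown request instruction")

server = DistributionServer([{"name": "fe-1", "state": "up"}])
```

In a real deployment the request would arrive over the network (e.g. via the domain name described above) rather than as an in-process method call.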
Specifically, there are multiple distribution servers, including a first distribution server and a second distribution server, and multiple data lists, including a first data list and a second data list. The first data list stores the currently available front-end server information obtained by the first distribution server, and the second data list stores the currently available front-end server information obtained by the second distribution server; the two lists store the same information. After the request instruction is received, sending the data list to the client comprises: judging whether access to the first distribution server succeeds; if access to the first distribution server succeeds, the first distribution server sends the first data list to the client; if access to the first distribution server fails, the second distribution server receives a first instruction, where the first instruction is a pre-stored access instruction for accessing the second distribution server, and the second distribution server sends the second data list to the client.
The request instruction is an instruction that accesses the first distribution server through the domain name corresponding to the first distribution server. The first instruction is an instruction that accesses the second distribution server through the IP address corresponding to the second distribution server.
The first instruction is an instruction that is pre-stored and is used when it is judged that access to the first distribution server has failed; it accesses the second distribution server by its IP address. This ensures that when the first distribution server suffers a fault such as a domain-name resolution failure, the client can still reach the second distribution server and thereby obtain the data list of currently available front-end servers.
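The failover just described can be sketched as follows; the two fetch callables stand in for real domain-name and IP-address access and are assumptions made for illustration.

```python
# Try the first distribution server via its domain name; on any failure
# (e.g. DNS resolution failure), fall back to the second distribution
# server via its pre-stored IP address (the "first instruction").
def get_data_list_with_failover(fetch_by_domain, fetch_by_ip):
    try:
        return fetch_by_domain()
    except Exception:
        # Domain-name access failed: use the pre-stored IP address
        # of the second distribution server instead.
        return fetch_by_ip()

# Stand-ins simulating a failed domain lookup and a working IP fallback.
def failing_domain_fetch():
    raise ConnectionError("domain name resolution failed")

def ip_fetch():
    return [{"name": "fe-1", "state": "up"}]
```

With these stand-ins, the client still obtains the data list even though domain-name access to the first distribution server fails.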
Preferably, in order to ensure the accuracy of the information in the data list, in the load-balancing method provided by this embodiment of the present invention, after the distribution server sends the data list to the client, the method also comprises: the distribution server obtains updated information of the currently available front-end servers; the distribution server updates the data list according to the updated information; and the distribution server sends the updated data list to the client.
By periodically updating the data list and sending it to the client, the accuracy of the information in the data list is ensured, and the accuracy with which the client accesses a front-end server according to this data list is improved at the same time.
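One way to sketch this periodic update, assuming a health-check callable probe() (an assumption; the patent does not say how the updated availability information is obtained):

```python
# Rebuild the data list from the front-end servers that currently pass
# a health check. In production this would run on a timer, and the
# updated list would then be sent to clients.
def refresh_data_list(front_ends, probe):
    return [fe for fe in front_ends if probe(fe)]

known_front_ends = [{"name": "fe-1"}, {"name": "fe-2"}]

def probe(fe):
    # Illustrative health check: pretend only fe-1 is responding.
    return fe["name"] == "fe-1"
```

After a refresh, only the front-end servers that passed the check remain in the data list, so the copy sent to clients stays accurate.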
In the load-balancing method provided by this embodiment of the present invention, the distribution server receives a request instruction from the client and, after receiving the request instruction, sends a data list to the client. Because the client obtains the data list storing the information of the currently available front-end servers and accesses a front-end server according to this list, the problem that a single point of failure on a server has a large impact is solved, and the impact caused by a single point of failure is significantly reduced.
It should be noted that the steps shown in the flowcharts of the accompanying drawings may be performed in a computer system, such as one executing a set of computer-executable instructions, and, although a logical order is shown in the flowcharts, in some cases the steps shown or described may be performed in an order different from that given here.
The present invention also provides a load-balancing apparatus. This load-balancing apparatus is arranged on a distribution server, or acts as a distribution server. The load-balancing apparatus is introduced below:
Fig. 3 is a schematic diagram of a load-balancing apparatus according to a first embodiment of the present invention. As shown in Fig. 3, the apparatus comprises: a receiving unit 10 and a sending unit 12.
The receiving unit 10 is configured to receive a request instruction from a client, where the request instruction requests from the distribution server the information of the currently available front-end servers, and a front-end server provides a data access channel for the client to access an application server.
The sending unit 12 is configured to send, after the request instruction is received, a data list to the client, where the data list stores the information of the currently available front-end servers, and the client accesses a front-end server according to the data list.
In the load-balancing apparatus provided by this embodiment of the present invention, the receiving unit 10 receives the request instruction from the client, where the request instruction requests from the distribution server the information of the currently available front-end servers, and a front-end server provides a data access channel for the client to access an application server; after the request instruction is received, the sending unit 12 sends the data list to the client, where the data list stores the information of the currently available front-end servers, and the client accesses a front-end server according to the data list. Through the present invention, the problem that a single point of failure on a server has a large impact is solved, and the impact caused by a single point of failure is significantly reduced.
The present invention also provides another load-balancing apparatus. This load-balancing apparatus is arranged in a client, or acts as a client. The load-balancing apparatus is introduced below.
Fig. 4 is a schematic diagram of a load-balancing apparatus according to a second embodiment of the present invention. As shown in Fig. 4, the apparatus comprises: a sending unit 20, a receiving unit 22, and an access unit 24.
The sending unit 20 is configured to send a request instruction to a distribution server, where the request instruction requests from the distribution server the information of the currently available front-end servers, and a front-end server provides a data access channel for the client to access an application server.
The receiving unit 22 is configured to receive a data list, where the data list stores the information of the currently available front-end servers and is sent by the distribution server.
The access unit 24 is configured to access a front-end server according to the data list.
In the load-balancing apparatus provided by this embodiment of the present invention, the sending unit 20 sends the request instruction to the distribution server, where the request instruction requests from the distribution server the information of the currently available front-end servers, and a front-end server provides a data access channel for the client to access an application server. The receiving unit 22 receives the data list, where the data list stores the information of the currently available front-end servers and is sent by the distribution server. The access unit 24 accesses a front-end server according to the data list. Through the present invention, the problem that a single point of failure on a server has a large impact is solved, and the impact caused by a single point of failure is significantly reduced.
This embodiment of the present invention also provides a load-balancing system. It should be noted that the load-balancing system of this embodiment may be used to perform the load-balancing methods provided by the embodiments of the present invention. The load-balancing system provided by this embodiment of the present invention is introduced below.
Fig. 5 is a schematic diagram of a load-balancing system according to the present invention. As shown in Fig. 5, the system comprises: a client 100, a distribution server 200, and a front-end server 300.
The client 100 is configured to access the distribution server 200 and receive the data list sent by the distribution server 200, where the data list stores the information of the currently available front-end servers 300, and the client 100 accesses a front-end server 300 according to the data list.
The distribution server 200 is configured to receive a request instruction from the client 100, where the request instruction requests from the distribution server 200 the information of the currently available front-end servers 300; after receiving the request instruction, the distribution server 200 sends the data list to the client 100.
The front-end server 300 is configured to provide a data access channel for the client 100 to access an application server.
In the load-balancing system provided by this embodiment of the present invention, the client 100 accesses the distribution server 200 and receives the data list sent by the distribution server 200, where the data list stores the information of the currently available front-end servers 300, and the client 100 accesses a front-end server 300 according to the data list; the distribution server 200 receives the request instruction from the client 100, where the request instruction requests from the distribution server 200 the information of the currently available front-end servers 300, and after receiving the request instruction, the distribution server 200 sends the data list to the client 100; and the front-end server 300 provides a data access channel for the client 100 to access an application server. Through the present invention, the problem that a single point of failure on a server has a large impact is solved, and the impact caused by a single point of failure is significantly reduced.
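The interaction among the three roles in Fig. 5 can be sketched end to end as a toy in-process wiring; all class names, methods, and return values below are illustrative assumptions, and a real deployment would use network calls between the components.

```python
# Toy wiring of client 100, distribution server 200, and front-end
# server 300: the client asks the distribution server for the data
# list, picks a front-end server, and accesses the application
# server through that front-end server's data access channel.
class FrontEnd:
    def __init__(self, name):
        self.name = name

    def serve(self, path):
        # Stand-in for the data access channel to the application server.
        return f"{self.name}:{path}"

class Distribution:
    def __init__(self, front_ends):
        self.data_list = [{"name": fe.name, "server": fe} for fe in front_ends]

    def handle_request(self):
        return list(self.data_list)

class Client:
    def fetch(self, distribution, path):
        data_list = distribution.handle_request()  # the request instruction
        front_end = data_list[0]["server"]         # select a front-end server
        return front_end.serve(path)               # access via the channel

result = Client().fetch(Distribution([FrontEnd("fe-1")]), "/index")
```

The client never contacts the application server directly; it reaches it only through whichever front-end server the data list names, which is what confines the impact of any single server's failure.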
Obviously, those skilled in the art should understand that each of the above modules or steps of the present invention can be implemented with a general-purpose computing device; they can be concentrated on a single computing device or distributed over a network formed by multiple computing devices. Alternatively, they can be implemented with program code executable by a computing device, so that they can be stored in a storage device and executed by the computing device; or they can be made into individual integrated-circuit modules, or multiple modules or steps among them can be made into a single integrated-circuit module. In this way, the present invention is not restricted to any specific combination of hardware and software.
The foregoing describes only the preferred embodiments of the present invention and is not intended to limit the present invention; for those skilled in the art, the present invention may have various modifications and variations. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present invention shall be included within the scope of protection of the present invention.