CN104699628A - Method and device for cache prestorage - Google Patents

Method and device for cache prestorage

Info

Publication number
CN104699628A
Authority
CN
China
Prior art keywords
server
threshold value
load
user
default
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510106743.1A
Other languages
Chinese (zh)
Other versions
CN104699628B (en)
Inventor
沈建荣
谭国斌
马哲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Technology Co Ltd
Xiaomi Inc
Original Assignee
Xiaomi Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xiaomi Inc filed Critical Xiaomi Inc
Priority to CN201510106743.1A
Publication of CN104699628A
Application granted
Publication of CN104699628B
Active legal status (Current)
Anticipated expiration

Abstract

The invention provides a method and a device for cache pre-storage. The method comprises: receiving a user request sent by a client; acquiring load information of a first server, wherein the first server is used for processing a new service; and when the load information of the first server is less than a preset first load threshold, sending the user request to the first server and instructing the first server to acquire and cache the data information requested by the user request. The method and the device reduce the impact of an excessive initial request volume on a newly deployed or newly upgraded server.

Description

Method and device for cache pre-storage
Technical field
The present disclosure relates to the field of computer technology, and in particular to a method and a device for cache pre-storage.
Background technology
In the related art, a web server that provides services to clients or third parties needs to use a cache. Because data can be obtained from a cache much faster than from a database, a cache enables the server to process user requests more quickly.
Summary of the invention
To overcome the problems in the related art, the present disclosure provides a method and a device for cache pre-storage, so as to reduce the impact of an excessive initial request volume on a newly deployed server or a newly upgraded server.
According to a first aspect of the embodiments of the present disclosure, a method for cache pre-storage is provided, comprising: receiving a user request sent by a client; acquiring load information of a first server, wherein the first server is used for processing a new service; and when the load information of the first server is less than a preset first load threshold, sending the user request to the first server and instructing the first server to acquire and cache the data information requested by the user request.
The technical solution provided by the embodiments of the present disclosure may have the following beneficial effects: by distributing only a small number of user requests to the first server that handles the new service, the first server completes caching of the data information of the new service shortly after initial start-up; the impact of an excessive initial request volume on the first server is reduced, fewer maintenance technicians are needed, and maintenance costs are lowered.
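For illustration only, the dispatch rule of the first aspect can be sketched as follows in Python; the probe and handler names (get_load, handle_and_cache, handle) and the 70% figure are assumptions taken from the embodiments described later, not limitations of the disclosure.

```python
PRESET_FIRST_LOAD_THRESHOLD = 70.0  # e.g. CPU usage in percent, per the embodiments below

def dispatch(user_request, first_server, second_server):
    """Route one user request according to the first server's current load."""
    load = first_server.get_load()  # assumed probe returning the load information
    if load < PRESET_FIRST_LOAD_THRESHOLD:
        # Below the preset first load threshold: let the new-service server take the
        # request, fetch the data from the database, and pre-store it in its cache.
        return first_server.handle_and_cache(user_request)
    # At or above the threshold: fall back to a server whose cache is already warm.
    return second_server.handle(user_request)
```

Under this rule the first server only ever sees as much traffic as its current load allows, which is the pre-storage behaviour described above.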
The method further comprises: when the duration for which the load information of the first server stays between a preset third load threshold and the preset first load threshold exceeds a preset duration threshold, increasing the preset first load threshold to a preset second load threshold, wherein the preset third load threshold is less than the preset first load threshold.
The technical solution provided by the embodiments of the present disclosure may have the following beneficial effects: when the load of the first server has dropped and stabilized, the number of user requests distributed to the first server is gradually increased, so that the throughput of the first server grows while it keeps running normally, and the processing capacity of the first server improves as the pre-storage of its cache is completed.
The method further comprises: when the load information of the first server is greater than or equal to the preset first load threshold, sending the user request to a second server and instructing the second server to acquire and cache the data information requested by the user request.
The technical solution provided by the embodiments of the present disclosure may have the following beneficial effects: when the number of user requests distributed to the first server drives its load up to the first load threshold, further user requests are sent to the second server, so the first server and the second server process user requests at the same time; while the first server handles only a small number of user requests, the remaining user requests are still processed normally. The second server is used for processing the old service, i.e. the service as it was before the upgrade.
The method further comprises: sending an upgrade file to the second server, so that the second server serves as a first server for processing the new service.
The technical solution provided by the embodiments of the present disclosure may have the following beneficial effects: the second server is upgraded in turn, so that it acts as a first server and processes the new service. By upgrading part of the servers step by step and distributing only a small number of user requests to each newly upgraded server, the caches of the servers are pre-stored gradually, which reduces the impact of a large number of user requests on servers that have just started to process the new service.
The method further comprises: receiving a processing result for the user request sent by the first server, and sending the processing result to the client that sent the user request.
The technical solution provided by the embodiments of the present disclosure may have the following beneficial effects: even while a server is processing the new service, processing results can still be fed back to the client normally.
According to a second aspect of the embodiments of the present disclosure, a device for cache pre-storage is provided, comprising: a first receiving module, configured to receive a user request sent by a client; an acquiring module, configured to acquire load information of a first server, wherein the first server is used for processing a new service; and a first sending module, configured to, when the load information of the first server is less than a preset first load threshold, send the user request to the first server and instruct the first server to acquire and cache the data information requested by the user request.
The device further comprises: an increasing module, configured to, when the duration for which the load information of the first server stays between a preset third load threshold and the preset first load threshold exceeds a preset duration threshold, increase the preset first load threshold to a preset second load threshold, wherein the preset third load threshold is less than the preset first load threshold.
The device further comprises: a second sending module, configured to, when the load information of the first server is greater than or equal to the preset first load threshold, send the user request to a second server and instruct the second server to acquire and cache the data information requested by the user request.
The device further comprises: a third sending module, configured to send an upgrade file to the second server, so that the second server serves as a first server for processing the new service.
The device further comprises: a second receiving module, configured to receive a processing result for the user request sent by the first server; and a fourth sending module, configured to send the processing result to the client that sent the user request.
According to a third aspect of the embodiments of the present disclosure, a device for cache pre-storage is provided, comprising: a processor; and a memory for storing processor-executable instructions; wherein the processor is configured to: receive a user request sent by a client; acquire load information of a first server, wherein the first server is used for processing a new service; and when the load information of the first server is less than a preset first load threshold, send the user request to the first server and instruct the first server to acquire and cache the data information requested by the user request.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory, and do not limit the present disclosure.
Brief description of the drawings
The accompanying drawings, which are incorporated in and form a part of this specification, illustrate embodiments consistent with the present disclosure and serve, together with the specification, to explain the principles of the present disclosure.
Fig. 1 is a flowchart of a method for cache pre-storage according to an exemplary embodiment.
Fig. 2 is a detailed flowchart of a method for cache pre-storage according to an exemplary embodiment.
Fig. 3 is a detailed flowchart of a method for cache pre-storage according to an exemplary embodiment.
Fig. 4 is a block diagram of a device for cache pre-storage according to an exemplary embodiment.
Fig. 5 is a detailed block diagram of a device for cache pre-storage according to an exemplary embodiment.
Fig. 6 is a detailed block diagram of a device for cache pre-storage according to an exemplary embodiment.
Fig. 7 is a detailed block diagram of a device for cache pre-storage according to an exemplary embodiment.
Fig. 8 is a detailed block diagram of a device for cache pre-storage according to an exemplary embodiment.
Fig. 9 is a block diagram of a device for cache pre-storage according to an exemplary embodiment.
Detailed description
Exemplary embodiments will now be described in detail, examples of which are illustrated in the accompanying drawings. Where the following description refers to the drawings, the same numbers in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure; rather, they are merely examples of apparatuses and methods consistent with aspects of the present disclosure as recited in the appended claims.
When a web server provides services to clients or third parties, it needs to use a cache. The cache reduces the number of requests sent to the relatively slow database and therefore lets the server respond to users faster. The data stored in the cache is normally fetched from the database in response to user requests. A server that has just gone online, or that has just been upgraded, has not processed any user requests yet, so it has not fetched and cached any data information from the database. When such a server starts processing user requests, all of the data it sends to users must be fetched from the database. Fetching data directly from the database takes longer and occupies more resources, so a large number of user requests sharply increases the server load and can easily bring the server down. Therefore, after a server goes online or is upgraded, its cache needs to be pre-stored; that is, the data for the new service needs to be pre-stored in the server's cache.
Possible approaches include active caching and passive pre-storage. In active caching, technicians generate the cached data on the server themselves. In passive pre-storage, every user request is treated as data to be cached, each cached item is given an expiry time, and if the same request arrives again before the item expires, the expiry time is refreshed. Active caching, however, tends to be expensive: the technicians do not know in advance which data needs to be cached, so a fairly complex prediction model has to be built.
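As a minimal sketch (not the disclosure's implementation), passive pre-storage behaves like a time-to-live cache in which every requested item is stored and an identical request within the expiry window refreshes the expiry time; the PassiveCache class and the backend.fetch accessor below are illustrative assumptions.

```python
import time

class PassiveCache:
    """Passive pre-storage: cache every requested item with an expiry time and
    refresh that expiry time whenever the same request arrives again in time."""

    def __init__(self, ttl_seconds, backend):
        self.ttl = ttl_seconds
        self.backend = backend   # the (slow) database, assumed to expose fetch(key)
        self.entries = {}        # key -> (value, expires_at)

    def get(self, key):
        now = time.time()
        hit = self.entries.get(key)
        if hit is not None and hit[1] > now:
            value = hit[0]                    # still valid: serve from the cache
        else:
            value = self.backend.fetch(key)   # expired or missing: go to the database
        self.entries[key] = (value, now + self.ttl)  # (re)store and refresh the expiry
        return value
```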
A server that processes the new service receives a fairly high volume of user requests from the start. If all user requests are sent directly to a server that has no cached data, the server is hit by a sudden surge. The usual countermeasure is to throttle the request traffic during this period and drop some of the user requests, which is detrimental to serving users.
To solve this problem, for a newly launched server with a large user base, or for a server that has just been upgraded, the embodiments of the present disclosure distribute only a small number of user requests to the first server that handles the new service and send the large remainder of user requests to other servers that handle the old service. In this way the first server completes caching of the data information of the new service shortly after initial start-up; the impact of an excessive initial request volume on the first server is reduced, fewer maintenance technicians are needed, and maintenance costs are lowered.
Fig. 1 is a flowchart of a method for cache pre-storage according to an exemplary embodiment. As shown in Fig. 1, the method for cache pre-storage is used in a server and comprises the following steps.
In step 101, a user request sent by a client is received.
In step 102, load information of a first server is acquired, wherein the first server is used for processing a new service. The first server may be a newly launched server, or a server that was already running and has received a functional upgrade.
In step 103, when the load information of the first server is less than a preset first load threshold, the user request is sent to the first server, and the first server is instructed to acquire and cache the data information requested by the user request.
The embodiments of the present disclosure provide a method for cache pre-storage aimed at a newly launched server with a large user base or at a server that has just been upgraded. At initial start-up such a server has to handle a large number of user requests. If all user requests were sent directly to the server handling the new service, the server would have to fetch a large amount of data information from the database, and because fetching data information from the database is slow, this would overload it. Sending a large number of user requests directly to the server handling the new service therefore hits that server hard, and a newly deployed or newly upgraded server usually needs many technicians for up-front maintenance or pre-launch testing to soften the blow. In the solution provided by the embodiments of the present disclosure, only a small number of user requests are distributed to the first server that handles the new service, so the first server completes caching of the data information of the new service shortly after initial start-up; the impact of an excessive initial request volume on the first server is reduced, fewer maintenance technicians are needed, and maintenance costs are lowered.
In one embodiment, the method may further comprise step A1.
In step A1, when the duration for which the load information of the first server stays between a preset third load threshold and the preset first load threshold exceeds a preset duration threshold, the preset first load threshold is increased to a preset second load threshold. The preset third load threshold is less than the preset first load threshold.
When the load of the first server has dropped and stabilized, it can be determined that the rate at which the first server processes user requests and the rate at which it receives user requests are in balance, which indicates that a considerable amount of data information has already been cached on the first server. At this point the number of user requests distributed to the first server is increased, so that more data information is pre-stored in its cache. Gradually increasing the number of user requests distributed to the first server raises its throughput while it keeps running normally, until the pre-storage of the server's cache is complete.
In one embodiment, the method may further comprise step B1.
In step B1, when the load information of the first server is greater than or equal to the preset first load threshold, the user request is sent to a second server, and the second server is instructed to acquire and cache the data information requested by the user request.
When the number of user requests distributed to the first server has driven its load up to the first load threshold, the processing capacity of the first server has reached its limit. At this point, user requests are sent instead to a second server that is not processing the new service, so that the first server and the second server process user requests at the same time; while the first server handles only a small number of user requests, the situation in which the remaining user requests cannot be processed is avoided.
In one embodiment, the method may further comprise step C1.
In step C1, an upgrade file is sent to the second server, so that the second server serves as a first server for processing the new service.
The second server is upgraded in turn, so that it works as a first server and processes the new service. By upgrading part of the servers step by step and distributing only a small number of user requests to each newly upgraded server, the caches of the servers are pre-stored gradually, which reduces the impact of a large number of user requests on servers that have just started to process the new service.
In one embodiment, the method may further comprise steps D1 and D2.
In step D1, a processing result for the user request sent by the first server is received.
In step D2, the processing result is sent to the client that sent the user request.
This achieves the goal that, even while a server is processing the new service, processing results can still be fed back to the client normally.
When a group of servers is upgraded to a new service and the upgraded servers must still process user requests normally, the embodiments of the present disclosure provide the following approach.
As shown in Fig. 2, an embodiment of the present disclosure provides a method for cache pre-storage whose concrete implementation steps are as follows.
In step 201, an upgrade file is sent to the first server.
The group of servers is divided into three parts: a first server, a second server and a third server. The upgrade file is sent to the first server, and the first server is instructed to upgrade so that it processes the new service.
In step 202, a user request sent by a client is received.
In step 203, the load information of the first server is acquired.
The load information comprises the CPU (central processing unit) usage, the occupied memory size and the number of running processes. In the disclosed embodiment, the load information acquired is the CPU usage.
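The disclosure does not say how these metrics are collected; purely as an illustration, a psutil-based probe on the first server could report them as below (the library choice and field names are assumptions).

```python
import psutil  # assumed to be available on the first server

def sample_load_information():
    """Collect the load metrics listed above; this embodiment only uses CPU usage."""
    return {
        "cpu_usage": psutil.cpu_percent(interval=1.0),      # percent over a 1 s window
        "memory_used_bytes": psutil.virtual_memory().used,  # occupied memory size
        "running_processes": len(psutil.pids()),            # number of running processes
    }
```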
In step 204, it is judged whether the load information of the first server is less than the preset first load threshold. When the load information of the first server is less than the preset first load threshold, step 205 is performed; when the duration for which the load information of the first server stays between a preset third load threshold and the preset first load threshold exceeds a preset duration threshold, step 207 is performed; when the load information of the first server is greater than or equal to the preset first load threshold, step 209 is performed.
The preset first load threshold is a CPU usage of 70%.
In step 205, when the load information of the first server is less than the preset first load threshold, the user request is sent to the first server, and the first server is instructed to acquire and cache the data information requested by the user request.
A small number of user requests are sent to the first server while the acquired CPU usage is monitored. If the CPU usage has not reached 70%, user requests continue to be sent to the first server, and the first server is instructed to acquire the requested data information and pre-store it in its cache.
In step 206, it is judged whether the duration for which the load information of the first server stays between the preset third load threshold and the preset first load threshold exceeds the preset duration threshold. When that duration exceeds the preset duration threshold, step 207 is performed.
The preset third load threshold is a CPU usage of 50%, and the preset duration threshold is 20 s.
In step 207, when the duration for which the load information of the first server stays between the preset third load threshold and the preset first load threshold exceeds the preset duration threshold, the preset first load threshold is increased to a preset second load threshold; the preset third load threshold is less than the preset first load threshold. When that duration is less than or equal to the preset duration threshold, the load threshold of the first server is kept at the preset first load threshold.
When the CPU usage of the first server has stayed between 50% and 70% for 20 s, it can be determined that the processing rate of the first server and the rate at which newly distributed user requests arrive are in balance. The number of user requests sent to the first server can then be increased until its CPU usage reaches 80%, so that the first server handles the largest workload it can while its processing speed is guaranteed. Controlling the number of user requests sent to the first server so that it grows from small to large builds up a buffering phase for the first server and prevents a large number of user requests from hitting it all at once.
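A sketch of the ramp-up rule of steps 206 and 207, using the example values of this embodiment (third threshold 50%, first threshold 70%, second threshold 80%, duration threshold 20 s); the polling loop and the sample_cpu callback are assumptions.

```python
import time

THIRD_LOAD_THRESHOLD = 50.0   # CPU %, lower bound of the "balanced" band
FIRST_LOAD_THRESHOLD = 70.0   # CPU %, initial routing threshold
SECOND_LOAD_THRESHOLD = 80.0  # CPU %, threshold after the ramp-up
DURATION_THRESHOLD = 20.0     # seconds the load must stay inside the band

def ramp_up_threshold(sample_cpu):
    """Raise the routing threshold once CPU usage has stayed between the third
    and first load thresholds for longer than the duration threshold."""
    threshold = FIRST_LOAD_THRESHOLD
    entered_band_at = None
    while threshold < SECOND_LOAD_THRESHOLD:
        cpu = sample_cpu()                      # e.g. the probe sketched after step 203
        if THIRD_LOAD_THRESHOLD <= cpu < threshold:
            if entered_band_at is None:
                entered_band_at = time.time()
            elif time.time() - entered_band_at > DURATION_THRESHOLD:
                threshold = SECOND_LOAD_THRESHOLD   # processing and arrival rates balanced
        else:
            entered_band_at = None              # left the band, restart the timer
        time.sleep(1.0)
    return threshold
```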
In step 208, an upgrade file is sent to the second server, so that the second server serves as a first server for processing the new service.
Meanwhile, the data cached by the first server may also be sent to the second server, and the second server may be instructed to cache it.
When the CPU usage of the first server remains steady at 80%, the pre-storage of the first server's cache is complete, and the data information requested by the user requests it processes can be obtained directly from the cache. At this point the second server is upgraded, and steps 203 to 208 are repeated with the second server acting as the first server, until the pre-storage of the second server's cache is complete as well.
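The rolling upgrade of step 208 can be summarised with the short sketch below; install, seed_cache, cache_snapshot and pre_store stand in for delivering the upgrade file, copying the already-built cache, and running steps 203 to 208, none of which are named this way in the disclosure.

```python
def rolling_upgrade(servers, upgrade_file, pre_store):
    """Upgrade the server group one member at a time; each freshly upgraded server
    acts as the 'first server' and pre-stores its cache before the next one starts."""
    previous = None
    for server in servers:
        server.install(upgrade_file)                    # step 201 / step 208
        if previous is not None:
            # Optionally seed the new server with the cache built up by the
            # previously upgraded server, as noted above.
            server.seed_cache(previous.cache_snapshot())
        pre_store(server)                               # run steps 203-208 for this server
        previous = server
```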
In step 209, when the load information of the first server is greater than or equal to the preset first load threshold, the user request is sent to the second server, and the second server is instructed to acquire and cache the data information requested by the user request.
Before the processing rate of the first server and the rate at which it receives user requests have reached a balance, once the volume of user requests sent to the first server has pushed its CPU usage to 70%, the excess user requests still need to be processed; they are sent to servers that have not yet been upgraded, for example the second server or the third server. Having the servers that have not been upgraded process the user requests exceeding the first load threshold of the first server ensures that all user requests are processed while the first server is shielded from the impact of a large number of user requests.
In step 210, a processing result for the user request sent by the first server is received.
The processing result obtained by the first server according to the user request is received.
In step 211, the processing result is sent to the client that sent the user request.
The embodiments of the present disclosure provide a method for cache pre-storage. After a server already in use is upgraded and starts processing the new service, the data information of the new service is not yet stored in its cache, so receiving a large number of user requests would hit the upgraded server hard. In the embodiments of the present disclosure, the first server is therefore upgraded first and receives only a small number of user requests, so that cache pre-storage completes on the first server. The second server is then upgraded and the same steps are carried out for it, the remaining servers are upgraded step by step, and finally the upgrade of all servers is complete. While a newly upgraded server is being pre-stored, the user requests it cannot handle are processed by other servers, which may be servers that have already been upgraded or servers that have not yet been upgraded, so every incoming user request receives its corresponding processing. This ensures that servers continue to process user requests normally during the upgrade, reduces the impact on a newly upgraded server of directly processing a large number of user requests, saves maintenance manpower, and lowers maintenance costs.
When a new server goes online in a server group serving a large number of clients and must process user requests normally after going online, the embodiments of the present disclosure provide the following approach.
As shown in Fig. 3, an embodiment of the present disclosure provides a method for cache pre-storage whose concrete implementation steps are as follows.
In step 301, a user request sent by a client is received.
In step 302, first load information and second load information of the first server are acquired.
The group of servers is divided into two parts: the first server is a newly launched server used for processing the new service, and the second server is a server in normal use that processes the old service. The load information comprises the CPU (central processing unit) usage, the number of user requests, the occupied memory size and the number of running processes. In the disclosed embodiment, the first load information is the CPU usage and the second load information is the number of user requests.
In step 303, it is judged whether the first load information is greater than or equal to the preset first load threshold.
When the first load information of the first server is greater than or equal to the preset first load threshold, the method continues with step 304.
When the first load information of the first server is less than the preset first load threshold, the method continues with step 305.
The preset first load threshold is a CPU usage of 70%.
In step 304, when the first load information of the first server is greater than or equal to the preset first load threshold, the user request is sent to the second server. After the second server has processed the user request, the processing result for the user request sent by the second server is received and is sent to the client that sent the user request.
In step 305, when the first load information of the first server is less than the preset first load threshold, it is judged whether the second load information is greater than or equal to a preset second load threshold.
When the second load information of the first server is less than the preset second load threshold, the method continues with steps 306, 307 and 309.
When the second load information is greater than or equal to the preset second load threshold, the method continues with step 306.
The preset second load threshold is a user request count of 1000.
In step 306, the user request is sent to the first server.
When the first load information of the first server is less than the preset first load threshold and the second load information of the first server is less than the preset second load threshold, the user request is sent to the first server.
If the CPU usage has not reached 70% and the number of user requests has not reached 1000, user requests continue to be sent to the first server, and the first server is instructed to acquire the requested data information and pre-store it in its cache.
When the first load information of the first server is less than the preset first load threshold but the second load information is greater than or equal to the preset second load threshold, user requests continue to be sent to the first server until the first load information of the first server equals the preset first load threshold.
Before the processing rate of the first server and the rate at which it receives user requests have reached a balance, if the CPU usage of the first server has not reached 70% even though the number of user requests has already reached 1000, user requests continue to be sent to the first server until its CPU usage reaches 70%.
Here, the first load threshold and the second load threshold correspond to the preset first load threshold in step 203.
In step 307, it is judged whether the duration for which the first load information of the first server stays between a preset third load threshold and the preset first load threshold exceeds a preset first duration threshold.
When the duration for which the first load information of the first server stays between the preset third load threshold and the preset first load threshold exceeds the preset first duration threshold, the method continues with step 308; when that duration is less than or equal to the preset first duration threshold, the first load threshold of the first server is kept at the preset first load threshold.
In step 308, when the duration for which the first load information of the first server stays between the preset third load threshold and the preset first load threshold exceeds the preset first duration threshold, the preset first load threshold is increased to a preset fifth load threshold.
When the CPU usage of the first server has stayed between 50% and 70% for 20 s, it can be determined that the processing rate of the first server and the rate at which newly distributed user requests arrive are in balance. The number of user requests sent to the first server can then be increased until its CPU usage reaches 80%, so that the first server handles the largest workload it can while its processing speed is guaranteed. Controlling the number of user requests sent to the first server so that it grows from small to large builds up a buffering phase for the first server and prevents a large number of user requests from hitting it all at once.
In step 309, it is judged whether the duration for which the second load information of the first server stays between a preset fourth load threshold and the preset second load threshold exceeds a preset second duration threshold.
When the duration for which the second load information stays between the preset fourth load threshold and the preset second load threshold exceeds the preset second duration threshold, the method continues with step 310; when that duration is less than or equal to the preset second duration threshold, the second load threshold of the first server is kept at the preset second load threshold.
In step 310, when the duration for which the second load information stays between the preset fourth load threshold and the preset second load threshold exceeds the preset second duration threshold, the preset second load threshold is increased to a preset sixth load threshold.
When the number of user requests has stayed between 800 and 1000 for 20 s, it can be determined that the processing rate of the first server and the rate at which newly distributed user requests arrive are in balance. The number of user requests sent to the first server can then be increased until the number of user requests being processed reaches 1200, so that the first server handles the largest workload it can while its processing speed is guaranteed. Controlling the number of user requests sent to the first server so that it grows from small to large builds up a buffering phase for the first server and prevents a large number of user requests from hitting it all at once.
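Bringing the Fig. 3 embodiment together, the sketch below shows that only CPU usage decides where a request is routed (steps 303 to 306), while each metric's band test (steps 307 to 310) decides whether its limit is raised; the probe name cpu_usage and the helper raised_limit are assumptions, and the numeric values are the examples given in this embodiment.

```python
CPU_LIMIT = 70.0        # preset first load threshold, raised to 80.0 (fifth threshold)
REQUEST_LIMIT = 1000    # preset second load threshold, raised to 1200 (sixth threshold)
CPU_BAND_LOW = 50.0     # preset third load threshold
REQUEST_BAND_LOW = 800  # preset fourth load threshold
DURATION_LIMIT = 20.0   # preset first/second duration thresholds, in seconds

def route(request, first_server, second_server, cpu_limit=CPU_LIMIT):
    """Steps 303-306: the new server keeps the request unless its CPU usage has
    reached the current limit; the request count alone never diverts traffic."""
    if first_server.cpu_usage() >= cpu_limit:       # assumed probe
        return second_server.handle(request)         # step 304
    return first_server.handle_and_cache(request)    # step 306

def raised_limit(current_limit, raised, value, band_low, seconds_in_band):
    """Steps 307-310: once a metric has stayed between its band lower bound and the
    current limit for longer than the duration threshold, raise that limit."""
    if band_low <= value < current_limit and seconds_in_band > DURATION_LIMIT:
        return raised
    return current_limit
```

For example, raised_limit(CPU_LIMIT, 80.0, 60.0, CPU_BAND_LOW, 21.0) returns 80.0, mirroring step 308; applying the same helper to the request count mirrors step 310.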
The preset fifth load threshold in step 308 above and the preset sixth load threshold in step 310 correspond to the preset second load threshold in step 205.
In step 311, a processing result for the user request sent by the first server is received.
The processing result obtained by the first server according to the user request is received.
In step 312, the processing result is sent to the client that sent the user request.
The embodiments of the present disclosure provide a method for cache pre-storage. When a new server is added to a server group already in use, a large number of user requests that need to be processed can be sent to the newly launched server so that it completes pre-storage quickly and builds up its cache; and to ensure that the newly launched server is not hit by too many user requests, once its load information reaches the threshold, user requests are sent to the servers in normal use instead. When a new server goes online in the server group, this speeds up pre-storage of the new server and does not require technicians to pre-store it manually, which saves maintenance manpower and lowers maintenance costs.
Fig. 4 is a block diagram of a device for cache pre-storage according to an exemplary embodiment. Referring to Fig. 4, the device comprises:
a first receiving module 401, configured to receive a user request sent by a client;
an acquiring module 402, configured to acquire load information of a first server, wherein the first server is used for processing a new service; and
a first sending module 403, configured to, when the load information of the first server is less than a preset first load threshold, send the user request to the first server and instruct the first server to acquire and cache the data information requested by the user request.
As shown in Fig. 5, the device further comprises:
an increasing module 501, configured to, when the duration for which the load information of the first server stays between a preset third load threshold and the preset first load threshold exceeds a preset duration threshold, increase the preset first load threshold to a preset second load threshold, wherein the preset third load threshold is less than the preset first load threshold.
As shown in Fig. 6, the device further comprises:
a second sending module 601, configured to, when the load information of the first server is greater than or equal to the preset first load threshold, send the user request to a second server and instruct the second server to acquire and cache the data information requested by the user request.
As shown in Fig. 7, the device further comprises:
a third sending module 701, configured to send an upgrade file to the second server, so that the second server serves as a first server for processing the new service.
As shown in Fig. 8, the device further comprises:
a second receiving module 801, configured to receive a processing result for the user request sent by the first server; and
a fourth sending module 802, configured to send the processing result to the client that sent the user request. With respect to the devices in the above embodiments, the specific manner in which each module performs its operations has been described in detail in the embodiments of the method and will not be elaborated here.
Fig. 9 is a block diagram of a device 900 for cache pre-storage according to an exemplary embodiment. For example, the device 900 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, fitness equipment, a personal digital assistant, or the like.
Referring to Fig. 9, the device 900 may comprise one or more of the following components: a processing component 902, a memory 904, a power component 906, a multimedia component 908, an audio component 910, an input/output (I/O) interface 912, a sensor component 914, and a communication component 916.
The processing component 902 generally controls the overall operation of the device 900, such as operations associated with display, telephone calls, data communication, camera operation and recording. The processing component 902 may comprise one or more processors 920 to execute instructions so as to complete all or part of the steps of the methods described above. In addition, the processing component 902 may comprise one or more modules that facilitate interaction between the processing component 902 and the other components. For example, the processing component 902 may comprise a multimedia module to facilitate interaction between the multimedia component 908 and the processing component 902.
The memory 904 is configured to store various types of data to support the operation of the device 900. Examples of such data include instructions for any application or method operated on the device 900, contact data, phonebook data, messages, pictures, videos and so on. The memory 904 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, a magnetic disk or an optical disk.
The power component 906 provides power to the various components of the device 900. The power component 906 may comprise a power management system, one or more power supplies, and other components associated with generating, managing and distributing power for the device 900.
The multimedia component 908 comprises a screen that provides an output interface between the device 900 and the user. In some embodiments, the screen may comprise a liquid crystal display (LCD) and a touch panel (TP). If the screen comprises a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel comprises one or more touch sensors to sense touches, swipes and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or swipe action, but also detect the duration and pressure associated with the touch or swipe action. In some embodiments, the multimedia component 908 comprises a front camera and/or a rear camera. When the device 900 is in an operating mode, such as a photographing mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each front camera and rear camera may be a fixed optical lens system or have focus and optical zoom capability.
The audio component 910 is configured to output and/or input audio signals. For example, the audio component 910 comprises a microphone (MIC) which is configured to receive external audio signals when the device 900 is in an operating mode such as a call mode, a recording mode or a voice recognition mode. The received audio signals may be further stored in the memory 904 or transmitted via the communication component 916. In some embodiments, the audio component 910 also comprises a speaker for outputting audio signals.
The I/O interface 912 provides an interface between the processing component 902 and peripheral interface modules, which may be a keyboard, a click wheel, buttons and the like. These buttons may include, but are not limited to, a home button, volume buttons, a start button and a lock button.
The sensor component 914 comprises one or more sensors for providing status assessments of various aspects of the device 900. For example, the sensor component 914 may detect the open/closed state of the device 900 and the relative positioning of components, such as the display and keypad of the device 900; the sensor component 914 may also detect a change in position of the device 900 or of a component of the device 900, the presence or absence of user contact with the device 900, the orientation or acceleration/deceleration of the device 900, and a change in temperature of the device 900. The sensor component 914 may comprise a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 914 may also comprise a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 914 may also comprise an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor or a temperature sensor.
The communication component 916 is configured to facilitate wired or wireless communication between the device 900 and other devices. The device 900 can access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In one exemplary embodiment, the communication component 916 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 916 also comprises a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology and other technologies.
In an exemplary embodiment, the device 900 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors or other electronic components, for performing the methods described above.
In an exemplary embodiment, there is also provided a non-transitory computer-readable storage medium comprising instructions, such as the memory 904 comprising instructions, which can be executed by the processor 920 of the device 900 to perform the methods described above. For example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device or the like.
A non-transitory computer-readable storage medium is provided, wherein when the instructions in the storage medium are executed by a processor of a mobile terminal, the mobile terminal is enabled to perform a method for cache pre-storage, the method comprising:
receiving a user request sent by a client;
acquiring load information of a first server, wherein the first server is used for processing a new service; and
when the load information of the first server is less than a preset first load threshold, sending the user request to the first server and instructing the first server to acquire and cache the data information requested by the user request.
The non-transitory computer-readable storage medium is further configured such that:
the method further comprises: when the duration for which the load information of the first server stays between a preset third load threshold and the preset first load threshold exceeds a preset duration threshold, increasing the preset first load threshold to a preset second load threshold, wherein the preset third load threshold is less than the preset first load threshold.
The non-transitory computer-readable storage medium is further configured such that:
the method further comprises: when the load information of the first server is greater than or equal to the preset first load threshold, sending the user request to a second server and instructing the second server to acquire and cache the data information requested by the user request.
The non-transitory computer-readable storage medium is further configured such that:
the method further comprises: sending an upgrade file to the second server, so that the second server serves as a first server for processing the new service.
The non-transitory computer-readable storage medium is further configured such that:
the method further comprises: receiving a processing result for the user request sent by the first server; and
sending the processing result to the client that sent the user request.
A device for cache pre-storage comprises: a processor; and a memory for storing processor-executable instructions; wherein the processor is configured to: receive a user request sent by a client; acquire load information of a first server, wherein the first server is used for processing a new service; and when the load information of the first server is less than a preset first load threshold, send the user request to the first server and instruct the first server to acquire and cache the data information requested by the user request.
The processor is further configured such that:
the method further comprises: when the duration for which the load information of the first server stays between a preset third load threshold and the preset first load threshold exceeds a preset duration threshold, increasing the preset first load threshold to a preset second load threshold, wherein the preset third load threshold is less than the preset first load threshold.
The processor is further configured such that:
the method further comprises: when the load information of the first server is greater than or equal to the preset first load threshold, sending the user request to a second server and instructing the second server to acquire and cache the data information requested by the user request.
The processor is further configured such that:
the method further comprises: sending an upgrade file to the second server, so that the second server serves as a first server for processing the new service.
The processor is further configured such that:
the method further comprises: receiving a processing result for the user request sent by the first server, and sending the processing result to the client that sent the user request.
Other embodiments of the present disclosure will readily occur to those skilled in the art upon consideration of the specification and practice of the invention disclosed herein. The present application is intended to cover any variations, uses or adaptations of the present disclosure that follow its general principles and include common knowledge or customary technical means in the art not disclosed herein. The specification and embodiments are to be regarded as exemplary only, with the true scope and spirit of the present disclosure being indicated by the following claims.
It should be understood that the present disclosure is not limited to the precise structures described above and illustrated in the accompanying drawings, and that various modifications and changes may be made without departing from its scope. The scope of the present disclosure is limited only by the appended claims.

Claims (11)

1. A method for cache pre-storage, characterized by comprising:
receiving a user request sent by a client;
acquiring load information of a first server, wherein the first server is used for processing a new service; and
when the load information of the first server is less than a preset first load threshold, sending the user request to the first server and instructing the first server to acquire and cache the data information requested by the user request.
2. The method according to claim 1, characterized in that the method further comprises:
when the duration for which the load information of the first server stays between a preset third load threshold and the preset first load threshold exceeds a preset duration threshold, increasing the preset first load threshold to a preset second load threshold, wherein the preset third load threshold is less than the preset first load threshold.
3. The method according to claim 1, characterized in that the method further comprises:
when the load information of the first server is greater than or equal to the preset first load threshold, sending the user request to a second server and instructing the second server to acquire and cache the data information requested by the user request.
4. The method according to claim 3, characterized in that the method further comprises:
sending an upgrade file to the second server, so that the second server serves as a first server for processing the new service.
5. The method according to claim 1, characterized in that the method further comprises:
receiving a processing result for the user request sent by the first server; and
sending the processing result to the client that sent the user request.
6. A device for cache pre-storage, characterized by comprising:
a first receiving module, configured to receive a user request sent by a client;
an acquiring module, configured to acquire load information of a first server, wherein the first server is used for processing a new service; and
a first sending module, configured to, when the load information of the first server is less than a preset first load threshold, send the user request to the first server and instruct the first server to acquire and cache the data information requested by the user request.
7. The device according to claim 6, characterized in that the device further comprises:
an increasing module, configured to, when the duration for which the load information of the first server stays between a preset third load threshold and the preset first load threshold exceeds a preset duration threshold, increase the preset first load threshold to a preset second load threshold, wherein the preset third load threshold is less than the preset first load threshold.
8. The device according to claim 6, characterized in that the device further comprises:
a second sending module, configured to, when the load information of the first server is greater than or equal to the preset first load threshold, send the user request to a second server and instruct the second server to acquire and cache the data information requested by the user request.
9. The device according to claim 8, characterized in that the device further comprises:
a third sending module, configured to send an upgrade file to the second server, so that the second server serves as a first server for processing the new service.
10. The device according to claim 6, characterized in that the device further comprises:
a second receiving module, configured to receive a processing result for the user request sent by the first server; and
a fourth sending module, configured to send the processing result to the client that sent the user request.
11. A device for cache pre-storage, characterized by comprising:
a processor; and
a memory for storing processor-executable instructions;
wherein the processor is configured to:
receive a user request sent by a client;
acquire load information of a first server, wherein the first server is used for processing a new service; and
when the load information of the first server is less than a preset first load threshold, send the user request to the first server and instruct the first server to acquire and cache the data information requested by the user request.
CN201510106743.1A 2015-03-11 2015-03-11 Method and device for cache pre-storage Active CN104699628B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510106743.1A CN104699628B (en) 2015-03-11 2015-03-11 Method and device for cache pre-storage

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510106743.1A CN104699628B (en) 2015-03-11 2015-03-11 Method and device for cache pre-storage

Publications (2)

Publication Number Publication Date
CN104699628A true CN104699628A (en) 2015-06-10
CN104699628B CN104699628B (en) 2018-07-27

Family

ID=53346777

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510106743.1A Active CN104699628B (en) 2015-03-11 2015-03-11 Method and device for cache pre-storage

Country Status (1)

Country Link
CN (1) CN104699628B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105743772A (en) * 2016-01-26 2016-07-06 深圳宸睿科技有限公司 Message processing method and system
CN110209968A (en) * 2019-04-24 2019-09-06 北京奇艺世纪科技有限公司 A kind of data-storage system, method, apparatus and computer readable storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20080082227A (en) * 2007-03-08 2008-09-11 (주)에임투지 Request proportion apparatus in load balancing system and load balancing method
CN101631360A (en) * 2009-08-19 2010-01-20 中兴通讯股份有限公司 Method, device and system for realizing load balancing
CN101937467A (en) * 2010-09-17 2011-01-05 北京开心人信息技术有限公司 High-efficiency caching method and system of server

Also Published As

Publication number Publication date
CN104699628B (en) 2018-07-27

Similar Documents

Publication Publication Date Title
EP2975821A1 (en) Network connection method and apparatus
CN104766005A (en) Management method and device for application software access authority
US20170171321A1 (en) Methods and devices for managing accounts
CN104660685A (en) Method and device for obtaining equipment information
CN105263196A (en) Connection state prompting method and device
CN104933170A (en) Information exhibition method and device
CN104125162B (en) The access processing method and device of Internet resources
CN105468767A (en) Method and device for acquiring calling card information
CN105337800A (en) Polling frequency adjustment method and polling frequency adjustment device
CN107040591A (en) A kind of method and device being controlled to client
CN107948093A (en) Adjust the method and device that network speed is applied in terminal device
CN105554064A (en) Method and device for setting head portrait
CN105391683A (en) Remote method invocation method, device and system
CN103916468A (en) System upgrading method, terminal, server and upgrading system
CN105138564A (en) Data file reading method and apparatus
CN105893268A (en) Cached image processing method and device
CN105357669A (en) WiFi connecting method and device
CN104536787A (en) Resource preloading method and device
CN104125267A (en) Account protection method, device and terminal equipment
CN105282162A (en) Processing method and device for account management business
CN105389083A (en) Large font implementation method and apparatus, and intelligent terminal
CN106909481B (en) Interface test method, interface test device and electronic equipment
CN105187154A (en) Response packet reception time delay method and response packet reception time delay device
CN105188024A (en) Method, apparatus and system for accessing network
CN107819836A (en) The remapping method and device of facility information

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant