CN105743950B - Data cache method, device and electronic equipment - Google Patents

Data cache method, device and electronic equipment

Info

Publication number
CN105743950B
Authority
CN
China
Prior art keywords
network resource
data
caching
preset threshold
reaches
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201410764573.1A
Other languages
Chinese (zh)
Other versions
CN105743950A (en)
Inventor
张德麟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Tencent Computer Systems Co Ltd
Original Assignee
Shenzhen Tencent Computer Systems Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Tencent Computer Systems Co Ltd filed Critical Shenzhen Tencent Computer Systems Co Ltd
Priority to CN201410764573.1A priority Critical patent/CN105743950B/en
Publication of CN105743950A publication Critical patent/CN105743950A/en
Application granted granted Critical
Publication of CN105743950B publication Critical patent/CN105743950B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Data Exchanges In Wide-Area Networks (AREA)

Abstract

The embodiment of the invention discloses a data caching method, a data caching device and an electronic device. A currently running network resource is determined as a main cache network resource. During the running of the main cache network resource, if the data caching amount of the main cache network resource reaches a first preset threshold, a target network resource is determined and caching of its data is started; that is, in the embodiment of the present invention, the data of the target network resource is cached while data caching is performed on the currently running network resource. Because the target network resource has already cached a certain amount of data during the running of the currently running network resource, if the user switches the running network resource to the target network resource, the target network resource can start running directly, so the user can use the target network resource without waiting, thereby reducing the occurrence probability of the situation in which the user needs to wait when switching network resources.

Description

Data caching method and device and electronic equipment
Technical Field
The present invention relates to the field of data processing technologies, and in particular, to a data caching method and apparatus, and an electronic device.
Background
With the popularization of the internet and the continuous expansion of Wi-Fi coverage, more and more users no longer download the data of network resources (such as videos and audios) to the local device, but directly use the network resources online, such as listening to songs online or watching TV series online.
To use a network resource online, that is, to run it online, the data of the network resource must first be cached, and the cached data is then processed, thereby implementing online running of the network resource. During the online running of a network resource, if the user does not like the currently running network resource, the user can choose to switch network resources, namely, to run another network resource. For example, when a user listens to a song online, if the user does not want to listen to the currently played song, the user may switch to any other song for playing.
However, after the user selects to switch the network resource, the user needs to wait for a certain time before the newly selected network resource can run online normally.
Disclosure of Invention
The invention aims to provide a data caching method, a data caching device and electronic equipment, so as to reduce the occurrence probability of the situation that a user needs to wait when switching network resources.
In order to achieve the purpose, the invention provides the following technical scheme:
a data caching method is applied to electronic equipment, and the method comprises the following steps:
determining the currently running network resource as a main cache network resource;
judging whether the caching amount of the data of the main cache network resource determined this time reaches a first preset threshold, wherein the data amount of the main cache network resource corresponding to the first preset threshold is less than the total data amount of the main cache network resource;
when the cache amount of the data of the main cache network resource reaches the first preset threshold value, determining a network resource which is not cached in the data mode as a target network resource through a network resource list; wherein, a plurality of network resources including the currently running network resource are recorded in the network resource list;
and caching the data of the target network resource.
An embodiment of the present invention further provides a data caching apparatus, which is applied to an electronic device, and the apparatus includes:
the first determining module is used for determining the currently running network resource as a main cache network resource;
the judging module is used for judging whether the caching amount of the data of the main cache network resource determined this time reaches a first preset threshold, wherein the data amount of the main cache network resource corresponding to the first preset threshold is less than the total data amount of the main cache network resource;
the second determining module is used for determining a network resource which is not subjected to data caching as a target network resource through the network resource list when the judging module judges that the caching amount of the data of the main caching network resource reaches the first preset threshold; wherein, a plurality of network resources including the currently running network resource are recorded in the network resource list;
and the first cache module is used for caching the data of the target network resource.
An embodiment of the present invention further provides an electronic device, including the data caching apparatus described above.
According to the scheme, the data caching method, the data caching device and the electronic equipment provided by the application determine that the currently running network resource is the main caching network resource, and judge whether the caching amount of the data of the determined main caching network resource reaches a first preset threshold value; when the cache amount of the data of the main cache network resource reaches the first preset threshold value, determining a target network resource through a network resource list; the data volume of the main cache network resource determined by the first preset threshold is less than the total data volume of the main cache network resource; a plurality of network resources including the currently running network resource are recorded in the network resource list; and caching the data of the target network resource.
That is to say, in the embodiment of the present invention, an operating network resource is determined as a primary cache network resource, and in the process of operating the primary cache network resource, if a data caching amount of the primary cache network resource reaches a first preset threshold, a target network resource is determined and caching of data of the target network resource is started, that is, in the embodiment of the present invention, data caching is performed on the operating network resource while data caching is performed on the target network resource. Because the target network resource caches a certain amount of data in the running process of the running network resource, if the user switches the running network resource to the target network resource, the user can directly start running the target network resource, so that the user can use the target network resource without waiting, and the occurrence probability of the situation that the user needs to wait when switching the network resource is reduced.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a flowchart of an implementation of a data caching method according to an embodiment of the present invention;
fig. 2 is a flowchart of another implementation manner of determining, by a network resource list, that a network resource not subject to data caching is a target network resource according to an embodiment of the present invention;
fig. 3 is a flowchart of an implementation of obtaining information about favorite types of network resources of a user according to an embodiment of the present invention;
fig. 4 is a flowchart of another implementation of a data caching method according to an embodiment of the present invention;
fig. 5 is a flowchart of another implementation of the data caching method according to the embodiment of the present invention;
fig. 6 is a schematic structural diagram of a data caching apparatus according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of a second determining module according to an embodiment of the present invention;
fig. 8 is another schematic structural diagram of a second determining module according to an embodiment of the present invention;
fig. 9 is a schematic structural diagram of an obtaining unit according to an embodiment of the present invention;
fig. 10 is a block diagram of a hardware structure of an electronic device according to an embodiment of the present invention;
FIG. 11 is a block diagram of a data processing system according to an embodiment of the present invention;
fig. 12 is a schematic diagram of an initial cache state according to an embodiment of the present invention;
fig. 13 is a schematic diagram of another cache state according to an embodiment of the present invention;
fig. 14 is a schematic diagram of another cache state according to an embodiment of the present invention;
fig. 15 is a schematic diagram of another cache state according to an embodiment of the present invention;
fig. 16 is a schematic diagram of another cache state according to an embodiment of the present invention;
fig. 17 is a schematic diagram of another cache state according to an embodiment of the present invention.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims, as well as in the drawings described above, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It should be understood that the data so used may be interchanged under appropriate circumstances such that embodiments of the application described herein may be practiced otherwise than as specifically illustrated.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The data caching method and device provided by the embodiment of the invention are applied to electronic equipment, and the electronic equipment can be accessed to the Internet.
Referring to fig. 1, fig. 1 is a flowchart illustrating an implementation of a data caching method according to an embodiment of the present invention, which may include:
step S11: determining the currently running network resource as a main cache network resource;
when the network resource runs, the running network resource is determined as the main cache network resource.
Step S12: judging whether the buffer amount of the data of the main cache network resource determined this time reaches a first preset threshold value, if so, executing a step S13;
the data volume of the main cache network resource determined by the first preset threshold is less than the total data volume of the main cache network resource;
the buffer amount of data of any network resource may refer to a percentage of the amount of data that the network resource has buffered to the total amount of data of the network resource; accordingly, the first preset threshold is a positive number less than 1, for example, the first preset threshold may be 50%.
The buffer amount of data of any network resource may also refer to the absolute amount of data that the network resource has cached; the corresponding first preset threshold is then the product of the total data amount of that network resource and a preset product factor, where the preset product factor is a positive number smaller than 1.
The amount of buffer reaching the first preset threshold may include: the buffer amount is at least the first preset threshold value.
The currently running network resource may be running online for the first time, or may be running again after having finished at least one online run. When it runs online for the first time, data caching and running proceed simultaneously, so the caching amount of its data reaches the first preset threshold only after a certain time; in this case, reaching the first preset threshold means that the caching amount equals the first preset threshold. When it runs again after at least one complete online run, its data has already been fully cached, so the data caching amount exceeds the first preset threshold when it runs again, that is, the caching amount is greater than the first preset threshold.
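The two readings of the buffer amount above can be sketched as follows (an illustrative sketch, not part of the patent; the function names and the 50% default are assumptions):

```python
def reached_first_threshold_percent(cached_bytes, total_bytes, threshold=0.5):
    """Percentage reading: the buffer amount is the fraction of data cached,
    and the first preset threshold is a positive number less than 1 (e.g. 50%)."""
    return cached_bytes / total_bytes >= threshold


def reached_first_threshold_absolute(cached_bytes, total_bytes, factor=0.5):
    """Absolute reading: the buffer amount is the cached data amount itself,
    compared against the total data amount times a preset product factor (< 1)."""
    return cached_bytes >= total_bytes * factor
```

With the same factor, both readings agree on whether the threshold is reached; they differ only in the units being compared.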
Step S13: determining a network resource which is not subjected to data caching as a target network resource through a network resource list;
wherein, the network resource list records a plurality of network resources including the currently operating network resource.
Step S14: and caching the data of the target network resource.
In the embodiment of the invention, the data of the target network resource is cached while the currently running network resource runs.
According to the data caching method provided by the application, the running network resource is determined as the main caching network resource, and in the running process of the main caching network resource, if the data caching amount of the main caching network resource reaches a first preset threshold value, the target network resource is determined and caching of data of the target network resource is started. Because the target network resource caches a certain amount of data in the running process of the running network resource, if the user switches the running network resource to the target network resource, the user can directly start running the target network resource, so that the user can use the target network resource without waiting, and the occurrence probability of the situation that the user needs to wait when switching the network resource is reduced.
In the foregoing embodiment, optionally, the determining, by the network resource list, that one network resource which is not subjected to data caching is a target network resource may include:
and determining a network resource which is not subjected to data caching as a target network resource according to the preset operation sequence of the network resources in the network resource list.
The preset operation sequence of the network resources in the network resource list may be set by a user, for example, the user may select to operate according to the sequence of the network resources in the list, or may select to operate randomly, etc.
After the preset operation sequence is determined, the target network resource may be the network resource that is to be run after the currently running network resource according to the preset operation sequence and that has not yet been subjected to data caching.
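Selection by the preset operation sequence can be sketched as below (a hypothetical model: `order` is the network resource list in its preset running order, and `cached` holds the resources that already have cached data):

```python
import random

def next_target(order, current, cached, mode="sequence"):
    """Pick the next target resource by the preset operation sequence.
    mode="sequence" follows the list order after the current resource,
    wrapping around; mode="random" picks any uncached resource at random."""
    candidates = [r for r in order if r != current and r not in cached]
    if not candidates:
        return None  # every other resource already has cached data
    if mode == "random":
        return random.choice(candidates)
    # sequence mode: first uncached resource positioned after the current one
    idx = order.index(current)
    rotated = order[idx + 1:] + order[:idx]
    return next(r for r in rotated if r in candidates)
```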
In the foregoing embodiment, optionally, a flowchart of another implementation manner of determining, by the network resource list, that a network resource not subject to data caching is a target network resource is shown in fig. 2, and may include:
step S21: acquiring the favorite type information of a user on network resources;
the user's preference for network resources may be determined by user selection.
Step S22: and determining a network resource which is not subjected to data caching in the network resource list as a target network resource according to the preference type information and the type information of the network resource in the network resource list.
After the preference type information of the user on the network resources is acquired, one network resource which is not subjected to data caching and has the same type as the preference type of the user can be preferentially selected from the network resource list to serve as a target network resource;
when there are a plurality of network resources without data caching, the types of which are the same as the user preference types, one network resource can be randomly selected as the target network resource.
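The preference-based selection of steps S21–S22 can be sketched as follows (names and data structures are illustrative; the random tie-break follows the text above):

```python
import random

def pick_target_by_preference(resource_list, resource_type, cached, preferred_types):
    """Prefer an uncached resource whose type matches the user's preference;
    if several match, choose one at random. Falls back to any uncached
    resource when no type matches (an assumed fallback, not stated above)."""
    uncached = [r for r in resource_list if r not in cached]
    matching = [r for r in uncached if resource_type[r] in preferred_types]
    if matching:
        return random.choice(matching)
    return uncached[0] if uncached else None
```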
In the foregoing embodiment, optionally, an implementation flowchart of obtaining the preference type information of the user on the network resource is shown in fig. 3, and may include:
step S31: acquiring a historical operating record of network resources used by a user within a preset time length;
the preset duration may refer to several days before the current day, such as a week before the current day, or a month before the current day.
That is to say, in the embodiment of the present invention, the operation attribute information (such as the number of operations, the type of the network resource, and the like) of the network resource is recorded each time the user uses the network resource.
Step S32: according to the types of all network resources in the network resource list and historical running records of the network resources used by the users in the preset time length, carrying out classified statistics on the network resources used by the users in the preset time length;
in the embodiment of the invention, the network resources which are used by the user in the preset time length and have the same type as the network resources in the network resource list can be determined through the historical running records of the network resources used by the user in the preset time length; statistics may then be made on the determined number of uses of each type of network resource.
Step S33: and determining the preference type information of the user to the network resources according to the classification statistical result.
Preferably, the type with the largest number of uses may be determined as the user preference type; alternatively, a type of usage number greater than a certain threshold may be determined as the user preference type, that is, the user preference type may be various.
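The classified statistics of steps S31–S33 can be sketched as below (an illustrative model: `history` is a list of `(resource, type)` run records within the preset duration, and only types that also appear among the resources in the network resource list are counted):

```python
from collections import Counter

def preferred_types(history, list_types, top_only=True, min_count=0):
    """Determine the user's preference types: either the most-used type(s),
    or every type used more than `min_count` times (so the preference
    type may be plural, as noted above)."""
    counts = Counter(t for _, t in history if t in list_types)
    if not counts:
        return set()
    if top_only:
        best = max(counts.values())
        return {t for t, c in counts.items() if c == best}
    return {t for t, c in counts.items() if c > min_count}
```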
In the above embodiment, it is preferable that the method further includes:
stopping caching the data of the target network resource when the caching amount of the data of the target network resource reaches a second preset threshold value;
and the second preset threshold is smaller than the first preset threshold.
Optionally, as shown in fig. 4, another implementation flowchart of the data caching method provided in the embodiment of the present invention may include:
step S41: determining the currently running network resource as a main cache network resource;
step S42: judging whether the buffer amount of the data of the main cache network resource determined this time reaches a first preset threshold value, if so, executing step S43;
step S43: determining a network resource which is not subjected to data caching as a target network resource through a network resource list;
step S44: caching the data of the target network resource;
in the embodiment of the present invention, the implementation manner of step S41 to step S44 is the same as the implementation manner of step S11 to step S14, and is not described herein again.
Step S45: judging whether the buffer amount of the data of the target network resource reaches a second preset threshold value, if so, executing the step S46;
wherein the second preset threshold is smaller than the first preset threshold;
step S46: stopping caching the data of the target network resource;
step S47: judging whether the number of network resources whose data caching amount has reached the second preset threshold but which are not completely cached reaches a first preset number; if yes, executing step S48; otherwise, returning to step S43.
After stopping caching the data of the target network resource, it is judged whether the number of network resources that have been partially cached reaches the first preset number; if not, the process returns to step S43, that is, a new target network resource is determined for data caching. A caching cycle is thus formed.
Step S48: only the data of the main cache network resource is cached.
When the number of network resources whose data caching amount has reached the second preset threshold but which are not completely cached reaches the first preset number, data of two network resources is no longer cached simultaneously; instead, only the data of the main cache network resource is cached.
In the embodiment of the invention, in the running process of the currently running network resource, data caching can be carried out on a plurality of non-running network resources in the network resource list, so that the occurrence probability of the situation that a user needs to wait when switching the network resource is further reduced.
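The pre-caching cycle of steps S43–S48 (Fig. 4) can be sketched as follows (a simplified simulation under assumed data structures: `total[r]` is each resource's total data amount, and each target is filled up to the second preset threshold before the next one is chosen):

```python
def precache_cycle(resource_list, main, total, second_threshold, first_preset_number):
    """Cache each not-yet-cached resource up to the second preset threshold
    until `first_preset_number` partially cached resources exist; the main
    cache resource is modeled as already fully cached for simplicity."""
    cached = {main: total[main]}
    partially_cached = []
    for res in resource_list:            # step S43: next resource with no cached data
        if res == main or res in cached:
            continue
        cached[res] = second_threshold * total[res]   # steps S44-S46
        partially_cached.append(res)
        if len(partially_cached) >= first_preset_number:  # step S47
            break                        # step S48: continue with the main resource only
    return cached
```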
On the basis of the embodiment shown in fig. 4, as shown in fig. 5, another implementation flowchart of the data caching method provided by the present application may further include, after only caching data of the primary cache network resource:
step S49: judging whether the data of the main cache network resource is cached completely, if so, executing the step S50;
step S50: judging whether there exists a network resource whose caching amount has reached the second preset threshold but whose caching is not complete; if yes, executing step S51; otherwise, ending.
Step S51: determining that the data caching amount reaches a second preset threshold value firstly and network resources which are not cached completely are main cache network resources;
step S52: caching the data of the determined main cache network resource;
step S53: and judging whether the network resources which are not subjected to data caching exist in the network resource list. If yes, go back to step S42; otherwise, the execution returns to step S49.
In the embodiment of the invention, when the data caching of the main cache network resource is finished, if the network resource list still contains a network resource whose caching amount has reached the second preset threshold but whose caching is not finished, the main cache network resource is determined anew, and the data of the newly determined main cache network resource is cached. When the data of the re-determined main cache network resource has been cached, if there are still network resources in the network resource list that have not been subjected to data caching, the process returns to the step of judging whether the caching amount of the data of the main cache network resource determined this time reaches the first preset threshold. If no such network resource exists in the network resource list, the process returns to the step of determining, among the network resources whose caching amount has reached the second preset threshold but whose caching is not finished, the one that reached the second preset threshold first as the main cache network resource. A caching cycle is thus formed, so that more data of the network resources not running in the network resource list can be cached, further reducing the occurrence probability of the situation in which the user needs to wait when switching network resources.
On the basis of the embodiment shown in fig. 1, in another embodiment of the data caching method provided by the present application, after starting to cache the data of the target network resource, the method may further include:
When the caching amount of the data of the target network resource reaches the second preset threshold, if the number of network resources in the network resource list whose data caching amount has reached the second preset threshold but which are not completely cached has not reached a second preset number, the process returns to the step of determining, through the network resource list, a network resource that has not been subjected to data caching as the target network resource.
In the embodiment of the present invention, when the buffer amount of the data of the target network resource currently performing data caching reaches the second preset threshold, the data caching is not stopped, but a new target network resource is determined to perform data caching, that is, data of a plurality of (that is, a second preset number of) target network resources are cached at the same time, and the occurrence probability of the situation that a user needs to wait when switching the network resources can also be reduced.
Optionally, if the number of network resources in the network resource list whose data caching amount has reached the second preset threshold but which are not completely cached reaches the second preset number, only the data of the main cache network resource is cached.
That is, in this case caching of the data of the second preset number of target network resources is stopped, and only the data of the main cache network resource is cached.
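The decision of whether another concurrent target may be started can be sketched as below (an illustrative model; `cached` and `totals` map resource names, which are assumptions, to cached and total data amounts):

```python
def should_add_target(resource_list, main, cached, totals, second_threshold, second_preset_number):
    """In this variant, a target whose buffer reached the second preset
    threshold keeps its cache while a new target is added, until
    `second_preset_number` partially cached resources exist."""
    reached = [r for r in resource_list
               if r != main
               and cached.get(r, 0.0) >= second_threshold * totals[r]
               and cached.get(r, 0.0) < totals[r]]
    return len(reached) < second_preset_number
```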
In the foregoing embodiment, optionally, when the determined main cache network resource stops operating, the data of the main cache network resource is stopped being cached.
In the embodiment of the invention, if data caching and running of the main cache network resource proceed simultaneously, then when the main cache network resource stops running, caching of its data is discontinued even if the caching has not finished.
In the foregoing embodiment, optionally, when the main cache network resource is a running network resource, the bandwidth occupied by caching the main cache network resource is greater than or equal to the minimum bandwidth required to meet the code rate requirement of the main cache network resource.
When the data of the main cache network resource and the data of the target network resource are cached simultaneously, in order to ensure smooth running of the main cache network resource, in the embodiments of the present application the bandwidth allocated to the main cache network resource is not less than the minimum bandwidth required to meet its code rate requirement.
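One possible bandwidth split honoring this constraint is sketched below (the even split of the remainder among targets is an assumed policy, not stated in the patent):

```python
def allocate_bandwidth(total_bw, main_min_bw, n_targets):
    """Guarantee the running main cache resource at least the minimum
    bandwidth its code rate requires; share any remainder evenly among
    the target resources being pre-cached. Returns (main_bw, per_target_bw)."""
    if total_bw < main_min_bw:
        # Not enough bandwidth even for smooth playback: give everything to main.
        return total_bw, 0.0
    spare = total_bw - main_min_bw
    per_target = spare / n_targets if n_targets else 0.0
    return main_min_bw, per_target
```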
Corresponding to the method embodiment, an embodiment of the present invention further provides a data caching device, and a schematic structural diagram of the data caching device provided in the embodiment of the present invention is shown in fig. 6, and the data caching device may include:
a first determining module 61, a judging module 62, a second determining module 63 and a first caching module 64; wherein,
the first determining module 61 is configured to determine that a currently running network resource is a primary cache network resource;
the judging module 62 is configured to judge whether the caching amount of the data of the main cache network resource determined this time reaches a first preset threshold, where the data amount of the main cache network resource corresponding to the first preset threshold is smaller than the total data amount of the main cache network resource;
the second determining module 63 is configured to determine, through the network resource list, a network resource for which data caching is not performed as a target network resource when the determining module 62 determines that the caching amount of the data of the main caching network resource reaches the first preset threshold; wherein, a plurality of network resources including the currently running network resource are recorded in the network resource list;
the first caching module 64 is configured to cache data of the target network resource.
According to the data caching device provided by the application, the running network resource is determined as the main caching network resource, and in the running process of the main caching network resource, if the data caching amount of the main caching network resource reaches a first preset threshold value, the target network resource is determined and caching of data of the target network resource is started. Because the target network resource caches a certain amount of data in the running process of the running network resource, if the user switches the running network resource to the target network resource, the user can directly start running the target network resource, so that the user can use the target network resource without waiting, and the occurrence probability of the situation that the user needs to wait when switching the network resource is reduced.
In the foregoing embodiment, optionally, a schematic structural diagram of the second determining module 63 is shown in fig. 7, and may include:
a first determining unit 71, configured to determine, according to a preset operation sequence of network resources in the network resource list, that a network resource for which data caching is not performed is a target network resource.
In the above embodiment, optionally, another schematic structural diagram of the second determining module 63 is shown in fig. 8, and may include:
an acquisition unit 81 and a second determination unit 82; wherein,
the acquiring unit 81 is configured to acquire preference type information of a user on a network resource;
the second determining unit 82 is configured to determine, according to the preference type information and the type information of the network resource in the network resource list, a network resource that is not cached as the target network resource in the network resource list.
In the above embodiment, optionally, a schematic structural diagram of the obtaining unit 81 is shown in fig. 9, and may include:
an acquisition subunit 91, a statistics subunit 92, and a determination subunit 93; wherein,
the obtaining subunit 91 is configured to obtain a historical operation record of a network resource used by a user within a preset duration;
the statistics subunit 92 is configured to perform classification statistics on the network resources used by the user within the preset duration according to the types of all the network resources in the network resource list and the historical operating records of the network resources used by the user within the preset duration;
the determining subunit 93 is configured to determine preference type information of the user for the network resource according to the classification statistical result.
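The classification statistics performed by the acquiring, statistics, and determining subunits above can be sketched as follows. This is a minimal illustration: the record format (name, type) and the type labels are hypothetical assumptions, since the patent does not specify how the historical operation records are represented.

```python
from collections import Counter

# Hypothetical history records gathered within the preset duration:
# (resource_name, resource_type) pairs from the user's operation log.
history = [
    ("song_a", "pop"), ("song_b", "pop"),
    ("video_x", "news"), ("song_c", "pop"),
]

def preference_type(records):
    """Classify the used resources by type (the statistics subunit)
    and return the most frequent type as the user's preference type
    (the determining subunit)."""
    counts = Counter(rtype for _, rtype in records)
    return counts.most_common(1)[0][0]

print(preference_type(history))  # prints: pop
```

The second determining unit would then pick, from the not-yet-cached resources in the network resource list, one whose type matches the returned preference type.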
In the foregoing embodiment, optionally, the first caching module 64 may be further configured to stop caching the data of the target network resource when the caching amount of the data of the target network resource reaches a second preset threshold;
wherein the second preset threshold is smaller than the first preset threshold.
In the above embodiment, optionally, the second determining module 63 may be further configured to, when the first caching module 64 stops caching the data of the target network resource, determine, through the network resource list, another network resource for which data caching has not been performed as the target network resource, provided that the number of network resources in the list whose data caching amount has reached the second preset threshold but whose caching is not complete has not reached a first preset number;
the second preset threshold is smaller than the first preset threshold.
Optionally, the second determining module 63 may be further configured to, when the first caching module 64 stops caching the data of the target network resource, skip the step of determining another target network resource if the number of network resources in the list whose data caching amount has reached the second preset threshold but whose caching is not complete has reached the first preset number. That is, once that number reaches the first preset number, no further target network resource is determined, and only the data of the main cache network resource is cached.
Further, the data caching apparatus provided in the embodiment of the present invention may further include:
the third determining module is configured to, when the data caching of the main cache network resource is completed, if network resources whose caching amount has reached the second preset threshold but whose caching is not complete still exist in the network resource list, determine the network resource whose data caching amount first reached the second preset threshold and whose caching is not complete as the new main cache network resource;
the second cache module is used for caching the data of the determined main cache network resource;
the first triggering module is configured to generate a first triggering instruction if a network resource for which data caching is not performed still exists in the network resource list, where the first triggering instruction is used to trigger the determining module to perform a step of determining whether the caching amount of the data of the currently determined main cache network resource reaches a first preset threshold.
Further, the data caching apparatus provided in the embodiment of the present invention may further include:
and the second triggering module is configured to generate a second triggering instruction if no network resource for which data caching has not been performed remains in the network resource list, where the second triggering instruction triggers the third determining module to perform the step of, when the data caching of the main cache network resource is completed, determining the network resource whose data caching amount first reached the second preset threshold but whose caching is not complete as the new main cache network resource.
On the basis of the embodiment shown in fig. 6, the second determining module 63 may be further configured to, when the caching amount of the data of the target network resource reaches a second preset threshold, determine, through the network resource list, another network resource for which data caching has not been performed as the target network resource, provided that the number of network resources in the list whose data caching amount has reached the second preset threshold but whose caching is not complete has not reached a second preset number.
The first caching module is specifically configured to cache the data of the determined target network resources (at most, the second preset number of them).
In this embodiment of the present invention, data caching is thus performed on a plurality of network resources simultaneously (at most, the second preset number + 1 of them).
Further, the second determining module 63 may be further configured to, when the caching amount of the data of the target network resource reaches the second preset threshold, skip the step of determining another target network resource if the number of network resources in the list whose data caching amount has reached the second preset threshold but whose caching is not complete has reached the second preset number.
The first caching module is further configured to stop caching the data of the at least one target network resource in that case, that is, only the data of the main cache network resource continues to be cached.
In the foregoing embodiment, preferably, if the primary cache network resource is a currently operating network resource, when the primary cache network resource stops operating, the cache module that performs data caching on the primary cache network resource stops caching data of the primary cache network resource.
In the foregoing embodiment, preferably, when the main cache network resource is a running network resource, the bandwidth occupied by caching the main cache network resource is greater than or equal to the minimum bandwidth required to meet the code rate requirement of the main cache network resource.
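The bandwidth constraint above can be illustrated with a small sketch. The allocation policy shown here (reserving a fixed fraction for the target resource) and all names are illustrative assumptions; the patent only states the constraint that the main resource's caching bandwidth must not fall below the minimum bandwidth required by its code rate.

```python
def split_bandwidth(total_kbps, code_rate_kbps, target_fraction=0.3):
    """Split the available caching bandwidth between the running
    (main) network resource and a target network resource, while
    guaranteeing the main resource at least the bandwidth matching
    its code rate so that playback never outruns its cache."""
    target = int(total_kbps * target_fraction)
    main = total_kbps - target
    if main < code_rate_kbps:       # enforce the patent's constraint
        main = code_rate_kbps
        target = max(total_kbps - main, 0)
    return main, target

# With 1000 kbps available and a 320 kbps stream, the main resource
# keeps 700 kbps, comfortably above its code rate.
main, target = split_bandwidth(1000, 320)
assert main >= 320 and main + target <= 1000
```

When total bandwidth is scarce (e.g., 300 kbps for a 320 kbps stream), the policy gives the main resource everything and the target resource nothing, matching the intent that the running resource's playback is never starved.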
An embodiment of the present invention further provides an electronic device, where the electronic device has the data caching apparatus according to any one of the above apparatus embodiments.
Referring to fig. 10, fig. 10 is a block diagram illustrating a hardware structure of an electronic device according to an embodiment of the present invention, where the electronic device may be a smart phone, a tablet computer, a notebook computer, a PC, or the like. The electronic device may include:
a processor 1, a communication interface 2, a memory 3 and a communication bus 4;
wherein, the processor 1, the communication interface 2 and the memory 3 complete the communication with each other through the communication bus 4;
optionally, the communication interface 2 may be an interface of a communication module, such as an interface of a GSM module, an interface of a GPRS module, or other wireless network interfaces;
a processor 1 for executing a program;
a memory 3 for storing a program;
the program may include program code including computer operating instructions.
The processor 1 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement embodiments of the present invention.
The memory 3 may comprise a high-speed RAM memory and may also comprise a non-volatile memory, such as at least one disk memory.
Specifically, the program may be used for:
determining the currently running network resource as a main cache network resource;
judging whether the caching amount of the data of the main cache network resource determined this time reaches a first preset threshold, wherein the data amount indicated by the first preset threshold is less than the total data amount of the main cache network resource;
when the caching amount of the data of the main cache network resource reaches the first preset threshold, determining, through a network resource list, a network resource for which data caching has not been performed as a target network resource; wherein a plurality of network resources including the currently running network resource are recorded in the network resource list;
and caching the data of the target network resource.
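The steps of the program above can be sketched roughly as follows. This is a minimal illustration under assumed data structures (a plain list for the network resource list and a dict of cached fractions); the 0.5 threshold value is only an example, echoing the 50% figure used in the song example later in the description.

```python
FIRST_THRESHOLD = 0.5   # first preset threshold, as a fraction of total data

def choose_target(resource_list, cached):
    """Determine, through the network resource list, one resource for
    which no data caching has been performed (here simply the first
    such entry in list order)."""
    for res in resource_list:
        if cached.get(res, 0.0) == 0.0:
            return res
    return None

def step(running, resource_list, cached):
    """One decision step: the running resource is the main cache
    resource; once its cached fraction reaches the first preset
    threshold, a target resource is selected so its data can be
    cached alongside the main resource's."""
    main = running
    if cached.get(main, 0.0) >= FIRST_THRESHOLD:
        return choose_target(resource_list, cached)
    return None

playlist = ["A", "B", "C"]
cache = {"A": 0.5}                  # A has just reached the threshold
assert step("A", playlist, cache) == "B"
```

Below the threshold, `step` returns `None` and only the main resource keeps caching, which matches the judging step of the program.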
Referring to fig. 11, fig. 11 is a schematic structural diagram of a data processing system according to an embodiment of the present invention, which includes a network server 111 and an electronic device 112; the electronic device 112 may be a tablet computer, a mobile phone, or a desktop computer.
The network server 111 stores data of a plurality of network resources. After the electronic device 112 establishes a connection with the network server 111, it may send the network server 111 a data caching request that includes identification information of a network resource. Upon receiving the request, the network server 111 sends the data of that network resource to the electronic device 112, which caches the data and processes it to run the network resource online.
In an embodiment of the present invention, the electronic device 112 is configured to:
determining the currently running network resource as a main cache network resource;
judging whether the caching amount of the data of the main cache network resource determined this time reaches a first preset threshold, wherein the data amount indicated by the first preset threshold is less than the total data amount of the main cache network resource;
when the caching amount of the data of the main cache network resource reaches the first preset threshold, determining, through a network resource list, a network resource for which data caching has not been performed as a target network resource; wherein a plurality of network resources including the currently running network resource are recorded in the network resource list;
caching the data of the target network resource; specifically, a data caching request including the identification of the target network resource may be sent to the network server 111, and after receiving the data caching request including the identification of the target network resource, the network server 111 sends the data of the target network resource to the electronic device 112 so that the electronic device 112 caches the data of the target network resource.
An implementation of the embodiment of the present invention is described below by taking a user listening to a song as an example.
In this example, the user has created a list of songs, including A, B, C, D, E five songs. As shown in fig. 12, fig. 12 is a schematic diagram of an initial cache state according to an embodiment of the present invention; in fig. 12, it is assumed that the user selects songs to be played in order from a, and when song a is played, the data of song a starts to be buffered at the same time.
In this example, the data of song B may start to be cached when the data of song a is cached by 50%, as shown in fig. 13, where fig. 13 is another schematic diagram of a caching state provided in the embodiment of the present invention. In this example, data caching for song a and data caching for song B are performed simultaneously;
when the data of song B is cached to 5%, caching of song B stops and caching of song C begins, that is, data caching of song A and data caching of song C proceed simultaneously, as shown in fig. 14, which is a schematic diagram of another caching state provided by the embodiment of the present invention.
When song C is cached to 5%, caching of song C stops, and only the data of song A continues to be cached until it is complete. As shown in fig. 15, fig. 15 is a schematic diagram of another cache state according to an embodiment of the present invention.
When song A finishes caching, caching of the data of song B resumes and continues until it reaches 50%. As shown in fig. 16, fig. 16 is a schematic diagram of another cache state according to an embodiment of the present invention.
When the data of song B is cached to 50%, caching of the data of song D begins, forming a caching cycle. As shown in fig. 17, fig. 17 is a schematic diagram of another cache state according to an embodiment of the present invention.
If the user starts to listen to song B before song A has finished playing, the data caching of song A stops immediately, the data of song B continues to be cached to 50%, and the caching cycle described above repeats.
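The caching cycle in this example can be replayed with a simplified simulation. It assumes event-level progress with no mid-song switching, and takes the second preset number as 2, consistent with songs B and C being pre-cached while A plays; it illustrates only the scheduling order, not the patented method itself.

```python
# Thresholds from the example: main song caches to 50% before
# look-ahead starts; each look-ahead song is pre-cached to 5%.
FIRST, SECOND, LOOKAHEAD = 0.50, 0.05, 2

def simulate(playlist):
    """Return (main_song, precached_song) events in the order the
    caching cycle produces them."""
    cached = {s: 0.0 for s in playlist}
    events = []
    for main in playlist:
        # cache the main song up to the first preset threshold
        cached[main] = max(cached[main], FIRST)
        # pre-cache the next songs that have no data yet, to 5% each
        pending = [s for s in playlist if cached[s] == 0.0][:LOOKAHEAD]
        for tgt in pending:
            cached[tgt] = SECOND
            events.append((main, tgt))
        # then finish caching the main song completely
        cached[main] = 1.0
    return events

# While A is the main song, B and C are pre-cached; once B becomes
# the main song, D and E are pre-cached, and so on.
assert simulate(["A", "B", "C", "D", "E"])[:2] == [("A", "B"), ("A", "C")]
```

Running the full simulation yields the event order A→B, A→C, B→D, B→E, mirroring the cache states of figs. 12 through 17.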
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (22)

1. A data caching method is applied to electronic equipment, and is characterized by comprising the following steps:
determining the currently running network resource as a main cache network resource;
judging whether the caching amount of the data of the main cache network resource determined this time reaches a first preset threshold, wherein the data amount indicated by the first preset threshold is less than the total data amount of the main cache network resource;
when the caching amount of the data of the main cache network resource reaches the first preset threshold, determining, through a network resource list, a network resource for which data caching has not been performed as a target network resource; wherein a plurality of network resources including the currently running network resource are recorded in the network resource list;
caching the data of the target network resource while caching the data of the main cache network resource;
when the data caching of the main cache network resource is finished, if a network resource whose caching amount has reached a second preset threshold but whose caching is not complete exists in the network resource list, determining the network resource whose data caching amount first reached the second preset threshold and whose caching is not complete as the main cache network resource; the second preset threshold is smaller than the first preset threshold;
caching the data of the determined main cache network resource;
and if the network resources which are not subjected to data caching still exist in the network resource list, returning to the step of judging whether the caching quantity of the data of the main caching network resource determined this time reaches a first preset threshold value.
2. The method of claim 1, wherein determining a non-cached network resource as a target network resource via the network resource list comprises:
and determining a network resource which is not subjected to data caching as a target network resource according to the preset operation sequence of the network resources in the network resource list.
3. The method of claim 1, wherein determining a non-cached network resource as a target network resource via the network resource list comprises:
acquiring the favorite type information of a user on network resources;
and determining a network resource which is not subjected to data caching in the network resource list as a target network resource according to the preference type information and the type information of the network resource in the network resource list.
4. The method of claim 3, wherein the obtaining of the user preference type information on the network resource comprises:
acquiring a historical operating record of network resources used by a user within a preset time length;
according to the types of all network resources in the network resource list and historical running records of the network resources used by the users in the preset time length, carrying out classified statistics on the network resources used by the users in the preset time length;
and determining the preference type information of the user to the network resources according to the classification statistical result.
5. The method of claim 1, further comprising:
and stopping caching the data of the target network resource when the caching amount of the data of the target network resource reaches a second preset threshold value.
6. The method of claim 5, further comprising, after ceasing to cache the data of the target network resource:
and if, in the network resource list, the number of network resources whose data caching amount has reached a second preset threshold but whose caching is not complete has not reached a first preset number, returning to the step of determining, through the network resource list, one network resource for which data caching has not been performed as the target network resource.
7. The method of claim 5, further comprising, after ceasing to cache the data of the target network resource:
if the data caching amount in the network resource list reaches a second preset threshold value and the number of the network resources which are not cached completely reaches the first preset number, caching the data of the main cache network resources.
8. The method of claim 1, further comprising:
if no network resource for which data caching has not been performed remains in the network resource list, returning to the step of, when the data caching of the main cache network resource is finished, determining the network resource whose data caching amount first reached the second preset threshold but whose caching is not complete as the main cache network resource.
9. The method of claim 1, further comprising:
and when the caching amount of the data of the target network resource reaches a second preset threshold, if, in the network resource list, the number of network resources whose data caching amount has reached the second preset threshold but whose caching is not complete has not reached a second preset number, returning to the step of determining, through the network resource list, one network resource for which data caching has not been performed as the target network resource.
10. The method of claim 9, further comprising:
if the data caching amount reaches a second preset threshold value and the number of the network resources which are not cached is up to the second preset number in the network resource list, caching the data of the main cache network resources.
11. The method according to claim 1, wherein when the determined primary cache network resource stops running, stopping caching data of the primary cache network resource.
12. A data caching device applied to electronic equipment is characterized by comprising:
the first determining module is used for determining the currently running network resource as a main cache network resource;
the judging module is used for judging whether the caching amount of the data of the main cache network resource determined this time reaches a first preset threshold, wherein the data amount indicated by the first preset threshold is less than the total data amount of the main cache network resource;
the second determining module is used for determining, through the network resource list, a network resource for which data caching has not been performed as a target network resource when the judging module judges that the caching amount of the data of the main cache network resource reaches the first preset threshold; wherein a plurality of network resources including the currently running network resource are recorded in the network resource list;
the first cache module is used for caching the data of the target network resource while caching the data of the main cache network resource;
the third determining module is used for, when the data caching of the main cache network resource is completed, if network resources whose caching amount has reached the second preset threshold but whose caching is not complete still exist in the network resource list, determining the network resource whose data caching amount first reached the second preset threshold and whose caching is not complete as the main cache network resource; the second preset threshold is smaller than the first preset threshold;
the second cache module is used for caching the data of the determined main cache network resource;
the first triggering module is configured to generate a first triggering instruction if a network resource for which data caching is not performed still exists in the network resource list, where the first triggering instruction is used to trigger the determining module to perform a step of determining whether the caching amount of the data of the currently determined main cache network resource reaches a first preset threshold.
13. The apparatus of claim 12, wherein the second determining module comprises:
and the first determining unit is used for determining a network resource which does not carry out data caching as a target network resource according to the preset running sequence of the network resources in the network resource list.
14. The apparatus of claim 12, wherein the second determining module comprises:
the acquisition unit is used for acquiring the preference type information of the user on the network resources;
and a second determining unit, configured to determine, according to the preference type information and the type information of the network resource in the network resource list, a network resource for which data caching is not performed in the network resource list as a target network resource.
15. The apparatus of claim 14, wherein the obtaining unit comprises:
the acquisition subunit is used for acquiring the historical operating record of the network resources used by the user within the preset time length;
the statistical subunit is configured to perform classification statistics on the network resources used by the user within the preset duration according to the types of all the network resources in the network resource list and the historical operating records of the network resources used by the user within the preset duration;
and the determining subunit is used for determining the preference type information of the user on the network resources according to the classification statistical result.
16. The apparatus of claim 12, wherein the first caching module is further configured to stop caching the data of the target network resource when the caching amount of the data of the target network resource reaches a second preset threshold.
17. The apparatus of claim 16,
the second determining module is further configured to, when the first caching module stops caching the data of the target network resource, determine, through the network resource list, that one network resource on which data caching is not performed is the target network resource if, in the network resource list, the data caching amount reaches a second preset threshold and the number of network resources on which caching is not performed does not reach a first preset number;
the second preset threshold is smaller than the first preset threshold.
18. The apparatus according to claim 16, wherein the second determining module is further configured to, when the first caching module stops caching the data of the target network resource, not perform the step of determining, by the network resource list, that one network resource that is not cached is the target network resource if the data caching amount in the network resource list reaches a second preset threshold and the number of network resources that are not cached reaches a first preset number.
19. The apparatus of claim 12, further comprising:
and the second triggering module is used for generating a second triggering instruction if no network resource for which data caching has not been performed remains in the network resource list, wherein the second triggering instruction triggers the third determining module to perform the step of, when the data caching of the main cache network resource is completed, determining the network resource whose data caching amount first reached the second preset threshold but whose caching is not complete as the main cache network resource.
20. The apparatus according to claim 12, wherein the second determining module is further configured to, when the buffer amount of the data of the target network resource reaches a second preset threshold, determine, through the network resource list, that a network resource that is not cached is the target network resource if the buffer amount of the data in the network resource list reaches the second preset threshold and the number of network resources that are not cached does not reach a second preset number;
the first caching module is specifically configured to cache data of the determined at least one target network resource.
21. The apparatus according to claim 20, wherein the second determining module is further configured to, when the buffer amount of the data of the target network resource reaches a second preset threshold, if the buffer amount of the data in the network resource list reaches the second preset threshold and the number of network resources that are not cached reaches the second preset number, not perform the step of determining, by the network resource list, that one network resource that is not cached is the target network resource;
the first caching module is further configured to stop caching the data of the at least one target network resource if the data caching amount in the network resource list reaches a second preset threshold and the number of network resources which are not cached reaches the second preset number.
22. An electronic device, comprising a data caching apparatus as claimed in any one of claims 12 to 21.
CN201410764573.1A 2014-12-11 2014-12-11 Data cache method, device and electronic equipment Active CN105743950B (en)


Publications (2)

Publication Number Publication Date
CN105743950A CN105743950A (en) 2016-07-06
CN105743950B true CN105743950B (en) 2019-11-19



Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101005606A (en) * 2006-12-31 2007-07-25 华为技术有限公司 Method and device for reducing medium playing delay
CN101674356A (en) * 2008-09-10 2010-03-17 鸿富锦精密工业(深圳)有限公司 Electronic device and method thereof for shortening operation response time
CN103440276A (en) * 2013-08-08 2013-12-11 星云融创(北京)信息技术有限公司 Method and device for improving webpage display speed
CN103607634A (en) * 2013-11-19 2014-02-26 四川长虹电器股份有限公司 Method for reducing network video ordering terminal user waiting time
CN103873883A (en) * 2014-03-06 2014-06-18 小米科技有限责任公司 Video playing method and device and terminal equipment
CN103974097A (en) * 2014-05-22 2014-08-06 南京大学镇江高新技术研究院 Personalized user-generated video prefetching method and system based on popularity and social networks


Also Published As

Publication number Publication date
CN105743950A (en) 2016-07-06

Similar Documents

Publication Publication Date Title
US10509842B2 (en) Method and device for refreshing news list
CN104424484B (en) Application program switching, the method and device for adding access information
CN103092700B (en) Internal memory method for cleaning, device and terminal unit
CN107948735A (en) Video playing method and device and electronic equipment
CN112087633B (en) Video decoding method, device and storage medium
CN104090781B (en) Upgrade information processing method and device
US10620810B2 (en) Method and a system for performing scrubbing in a video stream
CN112104897B (en) Video acquisition method, terminal and storage medium
CN105187733A (en) Video processing method, device and terminal
EP3651421A1 (en) Multimedia display method, apparatus, and device
US20150134846A1 (en) Method and apparatus for media segment request retry control
CN103474080A (en) Processing method, device and system of audio data based on code rate switching
CN106649645B (en) Playlist processing method and device
CN103516856A (en) Method and apparatus for information combination
CN113163255B (en) Video playing method, device, terminal and storage medium
CN105743950B (en) Data cache method, device and electronic equipment
CN108966315A (en) Wireless network acquisition methods, device and electronic equipment
US10165245B2 (en) Pre-fetching video content
CN106254908B (en) Multimedia resource playing method and device
CN109240790B (en) Multi-window management method and system and android terminal
CN107734396B (en) A kind of multimedia resource playback method, device and storage medium
CN108156514B (en) Media file playing method and device and storage medium
CN112995705A (en) Method and device for video processing and electronic equipment
CN106131659A (en) Video cache method and device
CN105657473A (en) Data processing method and device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant