CN106790363A - Data caching method and device - Google Patents

Data caching method and device

Info

Publication number
CN106790363A
CN106790363A (application CN201611032253.2A)
Authority
CN
China
Prior art keywords
link
newly-added data
request
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201611032253.2A
Other languages
Chinese (zh)
Inventor
王坤辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
TVMining Beijing Media Technology Co Ltd
Original Assignee
TVMining Beijing Media Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by TVMining Beijing Media Technology Co Ltd filed Critical TVMining Beijing Media Technology Co Ltd
Priority to CN201611032253.2A priority Critical patent/CN106790363A/en
Publication of CN106790363A publication Critical patent/CN106790363A/en
Pending legal-status Critical Current


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/50 Network services
    • H04L67/56 Provisioning of proxy services
    • H04L67/568 Storing data temporarily at an intermediate stage, e.g. caching
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/95 Retrieval from the web
    • G06F16/958 Organisation or management of web site content, e.g. publishing, maintaining pages or automatic linking
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/50 Network services
    • H04L67/56 Provisioning of proxy services
    • H04L67/568 Storing data temporarily at an intermediate stage, e.g. caching
    • H04L67/5682 Policies or rules for updating, deleting or replacing the stored data

Abstract

The invention discloses a data caching method and device, so that the content of a page can still be displayed completely after the terminal enters an offline environment, which is convenient for the user to browse. The method includes: receiving a trigger event for refreshing a current page; obtaining, according to the trigger event, links corresponding to newly-added data in the current page; and caching the newly-added data corresponding to the links. With the method provided by the present invention, all newly-added data in the currently opened page are cached in advance, so that the page content can still be displayed completely after entering an offline environment, which is convenient for the user to browse the page.

Description

Data caching method and device
Technical field
The present invention relates to the field of Internet technology, and in particular to a data caching method and device.
Background
Websites of certain types update their data very quickly, for example websites with photo sharing, social networking and blogging functions, on which a user can see moods, blog posts, photos and videos shared by friends. When such a web page is opened by the user, the terminal detects whether there is updated data, and when there is updated data, a refresh of the page is triggered.
In the prior art, when the new data includes pictures or videos, the terminal only loads the pictures or videos displayed on the current screen; for pictures or videos not displayed on the current screen, only their corresponding links are loaded. As a result, when the user switches from a network-connected environment to an offline environment, the pictures or videos that have not been loaded cannot be viewed, which is very inconvenient for the user. A data caching method and device are therefore needed, so that the content of the page can still be displayed completely after entering an offline environment, which is convenient for the user to browse.
Summary of the invention
The present invention provides a data caching method and device, so that the content of a page can still be displayed completely after entering an offline environment, which is convenient for the user to browse.
The present invention provides a data caching method, including:
receiving a trigger event for refreshing a current page;
obtaining, according to the trigger event, a link corresponding to newly-added data in the current page;
caching the newly-added data corresponding to the link.
The beneficial effects of the present invention are: all newly-added data in the currently opened page are cached in advance, so that the page content can still be displayed completely after entering an offline environment, which is convenient for the user to browse the page.
In one embodiment, the caching of the newly-added data corresponding to the link includes:
judging whether the newly-added data corresponding to the link has already been cached locally;
when the newly-added data corresponding to the link is not cached locally, downloading the newly-added data corresponding to the link;
when the download of the newly-added data corresponding to the link is completed, storing the newly-added data corresponding to the link in a local buffer.
The beneficial effect of this embodiment is: it is judged in advance whether the newly-added data corresponding to the link has already been cached locally, thereby avoiding repeated caching of the newly-added data and reducing the waste of storage resources.
In one embodiment, during the caching of the newly-added data, the method further includes:
when a network request is received, adding the network request to a request queue.
The beneficial effect of this embodiment is: network requests are stored in a request queue, so that they are stored in an orderly manner; when multiple network requests are received, they can easily be processed in the order in which they were received.
In one embodiment, the method further includes:
when a network request is detected in the request queue, pausing the caching of the newly-added data;
obtaining the target network request at the head of the request queue;
processing the target network request;
when the target network request has been processed, deleting the target network request from the head of the request queue.
The beneficial effect of this embodiment is: when a network request is detected in the request queue, the caching of the newly-added data is paused, thereby accelerating the processing of the network request; furthermore, when the target network request has been processed, it is deleted from the request queue, thereby saving storage resources.
In one embodiment, when all network requests in the request queue have been processed, the method further includes:
judging whether the server corresponding to the link supports resumable download (breakpoint resume);
when the server corresponding to the link supports resumable download, continuing the paused caching operation;
when the server corresponding to the link does not support resumable download, re-executing the paused caching operation from the beginning.
The beneficial effect of this embodiment is: when all network requests in the request queue have been processed, it is judged whether the server corresponding to the link supports resumable download; when resumable download is supported, the paused caching operation is continued, so that the caching does not have to be re-executed and the caching speed is increased.
The present invention further provides a data caching device, including:
a receiving module, configured to receive a trigger event for refreshing a current page;
a first obtaining module, configured to obtain, according to the trigger event, a link corresponding to newly-added data in the current page;
a first caching module, configured to cache the newly-added data corresponding to the link.
In one embodiment, the first caching module includes:
a judging submodule, configured to judge whether the newly-added data corresponding to the link has already been cached locally;
a downloading submodule, configured to download the newly-added data corresponding to the link when the newly-added data corresponding to the link is not cached locally;
a storing submodule, configured to store the newly-added data corresponding to the link in a local buffer when the download of the newly-added data corresponding to the link is completed.
In one embodiment, the device further includes:
an adding module, configured to add a received network request to a request queue.
In one embodiment, the device further includes:
a pausing module, configured to pause the caching of the newly-added data when a network request is detected in the request queue;
a second obtaining module, configured to obtain the target network request at the head of the request queue;
a processing module, configured to process the target network request;
a deleting module, configured to delete the target network request from the head of the request queue when the target network request has been processed.
In one embodiment, the device further includes:
a judging module, configured to judge whether the server corresponding to the link supports resumable download;
a second caching module, configured to continue the paused caching operation when the server corresponding to the link supports resumable download;
a third caching module, configured to re-execute the paused caching operation when the server corresponding to the link does not support resumable download.
Other features and advantages of the present invention will be set forth in the following description, and will in part become apparent from the description or be understood by practicing the present invention. The objects and other advantages of the present invention can be realized and obtained through the structures particularly pointed out in the written description, the claims and the accompanying drawings.
The technical solution of the present invention is described in further detail below with reference to the accompanying drawings and embodiments.
Brief description of the drawings
The accompanying drawings are provided for a further understanding of the present invention and constitute a part of the specification; together with the embodiments of the present invention, they serve to explain the present invention and are not to be construed as limiting the present invention. In the drawings:
Fig. 1 is a flow chart of a data caching method in one embodiment of the present invention;
Fig. 2 is a flow chart of a data caching method in one embodiment of the present invention;
Fig. 3 is a block diagram of a data caching device in one embodiment of the present invention;
Fig. 4 is a block diagram of a data caching device in one embodiment of the present invention.
Detailed description of the embodiments
The preferred embodiments of the present invention are described below with reference to the accompanying drawings. It should be understood that the preferred embodiments described herein are only used to illustrate and explain the present invention and are not intended to limit the present invention.
Fig. 1 is a flow chart of a data caching method in one embodiment of the present invention. As shown in Fig. 1, the data caching method according to the present invention can be used in a terminal and comprises the following steps S101-S103:
In step S101, a trigger event for refreshing a current page is received;
In step S102, a link corresponding to newly-added data in the current page is obtained according to the trigger event;
In step S103, the newly-added data corresponding to the link is cached.
When the user opens a website with, for example, photo sharing, social networking or blogging functions, the terminal detects whether there is updated data, and when there is updated data, a refresh of the page is triggered. During the refresh, if the new data includes pictures or videos, only the pictures or videos displayed on the current screen are loaded; for pictures or videos not displayed on the current screen, only their corresponding links are loaded.
With such a processing mechanism, after the user switches from a network-connected environment to an offline environment, the pictures or videos that have not been loaded cannot be viewed, which is very inconvenient for the user.
In view of the above problem, in this embodiment, when a trigger event for refreshing the current page is received (for example, it is detected that the user opens the current page or clicks a refresh button to refresh the page), it is checked whether there is newly-added data. If there is newly-added data, it is judged whether the newly-added data contains data such as videos or pictures; if so, the links corresponding to the newly-added data are obtained, and the newly-added data is downloaded according to the corresponding links.
In this way, the newly-added data can be cached while a network connection is available, so that the content of the page can still be displayed completely after entering an offline environment, which is convenient for the user to browse.
The beneficial effects of the present invention are: all newly-added data in the currently opened page are cached in advance, so that the page content can still be displayed completely after entering an offline environment, which is convenient for the user to browse the page.
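For illustration only (the patent itself does not prescribe any implementation), the following minimal TypeScript sketch shows how steps S101-S103 could be arranged; the names onRefreshTriggered, extractNewDataLinks and cacheNewData, and the data-src attribute used to find unloaded links, are assumptions introduced for this sketch.

```typescript
// Illustrative sketch of steps S101-S103 only; names and data shapes are assumptions.
interface NewDataItem {
  link: string;                    // link of a newly-added picture or video (S102)
  type: "picture" | "video";
}

const localBuffer = new Map<string, Blob>();   // local buffer area used in S103

// S101: a trigger event for refreshing the current page is received.
async function onRefreshTriggered(currentPageHtml: string): Promise<void> {
  // S102: obtain the links corresponding to newly-added data in the current page.
  const newItems = extractNewDataLinks(currentPageHtml);
  // S103: cache the newly-added data corresponding to each link.
  for (const item of newItems) {
    await cacheNewData(item);
  }
}

function extractNewDataLinks(html: string): NewDataItem[] {
  // Assumed helper: collect links of pictures/videos that were not loaded for display.
  const links = Array.from(html.matchAll(/data-src="([^"]+)"/g), m => m[1]);
  return links.map(link => ({ link, type: /\.mp4$/.test(link) ? "video" : "picture" }));
}

async function cacheNewData(item: NewDataItem): Promise<void> {
  const response = await fetch(item.link);            // download the newly-added data
  localBuffer.set(item.link, await response.blob());  // store it in the local buffer
}
```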
In one embodiment, as shown in Fig. 2, the above step S103 can be implemented as the following steps S201-S203:
In step S201, it is judged whether the newly-added data corresponding to the link has already been cached locally;
In step S202, when the newly-added data corresponding to the link is not cached locally, the newly-added data corresponding to the link is downloaded;
In step S203, when the download of the newly-added data corresponding to the link is completed, the newly-added data corresponding to the link is stored in a local buffer.
In this embodiment, after the link corresponding to the newly-added data in the current page is obtained, the newly-added data corresponding to the link is not cached directly; instead, it is first judged whether the newly-added data corresponding to the link has already been cached locally. When the newly-added data corresponding to the link is not cached locally, the newly-added data corresponding to the link is downloaded. When the download is completed, the newly-added data corresponding to the link is stored in the local buffer.
If the newly-added data corresponding to the link has already been cached locally, the newly-added data corresponding to the link is not cached again. Taking a photo-sharing website as an example, a friend of the user previously uploaded picture A, and the data corresponding to picture A was stored in the local buffer at that time. During the current refresh, if the newly-added data includes the data corresponding to picture A, then, since picture A has already appeared in the current page and has already been saved in the local buffer, the data corresponding to picture A is not cached again.
It should be noted that this solution can also store the newly-added data while it is being downloaded, that is, as soon as a part of the newly-added data has been downloaded, that part is stored.
The beneficial effect of this embodiment is: it is judged in advance whether the newly-added data corresponding to the link has already been cached locally, thereby avoiding repeated caching of the newly-added data and reducing the waste of storage resources.
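As a sketch of steps S201-S203, including the download-and-store variant mentioned above, the following TypeScript fragment checks the local buffer before downloading and keeps each received part as it arrives; cacheLink and the Map-based buffer are assumptions, not the patent's own API.

```typescript
// Illustrative sketch of steps S201-S203; all names are assumptions.
const localBuffer = new Map<string, Blob>();

async function cacheLink(link: string): Promise<void> {
  // S201: judge whether the newly-added data for this link is already cached locally.
  if (localBuffer.has(link)) {
    return;                        // already cached (e.g. picture A), do not cache again
  }
  // S202: the data is not cached locally, so download the newly-added data.
  const response = await fetch(link);
  if (!response.body) {
    return;
  }
  // Download-and-store variant: keep each part as soon as it has been downloaded.
  const reader = response.body.getReader();
  const parts: Uint8Array[] = [];
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    parts.push(value);             // store this part of the newly-added data
  }
  // S203: the download is complete, store the data in the local buffer.
  localBuffer.set(link, new Blob(parts));
}
```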
In one embodiment, during the caching of the newly-added data, the method can also be implemented as the following step:
when a network request is received, the network request is added to a request queue.
In this embodiment, when a network request is received, the network request is added to a request queue. For example, suppose the newly-added data includes video A and video B, with video A located before video B. During caching, video A is cached first. If, while video A is being cached, the user drags the content displayed on the current screen from the current region to the region corresponding to video B, this is equivalent to a network request for loading video B; when the terminal receives this network request for loading video B, the request is added to the request queue.
Since bandwidth resources are limited, continuing to cache video A at this point would slow down the loading of video B. Therefore, the caching of video A can be paused and video B cached first, thereby speeding up the loading of video B so that the user can watch the content of video B sooner.
The beneficial effect of this embodiment is: network requests are stored in a request queue, so that they are stored in an orderly manner; when multiple network requests are received, they can easily be processed in the order in which they were received.
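A minimal sketch of the request queue described here, assuming a plain FIFO array, is given below; NetworkRequest and enqueueNetworkRequest are illustrative names only.

```typescript
// Minimal FIFO request queue sketch; names and fields are assumptions.
interface NetworkRequest {
  url: string;          // e.g. the link of video B that the user scrolled to
  receivedAt: number;   // lets requests be processed in the order they were received
}

const requestQueue: NetworkRequest[] = [];

// Called whenever a network request is received while newly-added data is being cached.
function enqueueNetworkRequest(url: string): void {
  requestQueue.push({ url, receivedAt: Date.now() });
}
```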
In one embodiment, after the network request is added to the request queue, the method can also be implemented as the following steps A1-A4:
In step A1, when a network request is detected in the request queue, the caching of the newly-added data is paused;
In step A2, the target network request at the head of the request queue is obtained;
In step A3, the target network request is processed;
In step A4, when the target network request has been processed, the target network request is deleted from the head of the request queue.
In this embodiment, when a network request is detected in the request queue, the caching of the newly-added data is paused, the target network request at the head of the request queue is obtained, and the obtained target network request is processed. When the target network request has been processed, it is deleted from the head of the request queue.
For example, suppose the newly-added data includes video A and video B, with video A located before video B. During caching, video A is cached first.
If, while video A is being cached, the user drags the content displayed on the current screen from the current region to the region corresponding to video B, this is equivalent to a network request b for loading video B; when the terminal receives the network request b for loading video B, the network request b is added to the request queue.
Since network requests are usually issued actively by the user, and since bandwidth resources are limited and caching the newly-added data would slow down the processing of network requests, network requests are given a higher priority than the caching of newly-added data, thereby accelerating their processing. When a network request is detected in the local request queue, the caching of the newly-added data is paused. Assuming that only network request b is in the queue, network request b is obtained from the head of the request queue and processed (that is, video B is loaded); when network request b has been processed (that is, video B has finished loading), network request b is deleted from the head of the request queue.
The beneficial effect of this embodiment is: when a network request is detected in the request queue, the caching of the newly-added data is paused, thereby accelerating the processing of the network request; furthermore, when the target network request has been processed, it is deleted from the request queue, thereby saving storage resources.
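The following TypeScript sketch illustrates steps A1-A4 under the same FIFO-queue assumption as the previous sketch; drainRequestQueue and the cachingPaused flag are assumptions introduced for illustration.

```typescript
// Illustrative sketch of steps A1-A4; names are assumptions.
interface NetworkRequest { url: string; }

const requestQueue: NetworkRequest[] = [];
let cachingPaused = false;          // the caching loop is assumed to check this flag

async function drainRequestQueue(): Promise<void> {
  // A1: a network request is detected in the queue, so pause the caching.
  if (requestQueue.length > 0) {
    cachingPaused = true;
  }
  while (requestQueue.length > 0) {
    // A2: obtain the target network request at the head of the request queue.
    const target = requestQueue[0];
    // A3: process the target network request (e.g. load video B).
    await fetch(target.url);
    // A4: the target request has been processed, delete it from the head of the queue.
    requestQueue.shift();
  }
  cachingPaused = false;            // all requests processed, caching may continue (see B1-B3)
}
```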
In one embodiment, when all network requests in the request queue have been processed, the method can also be implemented as the following steps B1-B3:
In step B1, it is judged whether the server corresponding to the link supports resumable download;
In step B2, when the server corresponding to the link supports resumable download, the paused caching operation is continued;
In step B3, when the server corresponding to the link does not support resumable download, the paused caching operation is re-executed from the beginning.
In this embodiment, when all network requests in the request queue have been processed, the previously paused caching of the newly-added data needs to be continued. At this point, it is necessary to judge whether the server corresponding to the link of the previously executed caching operation supports resumable download.
If the server corresponding to the link supports resumable download, the paused caching operation is continued; for example, if 30% of the newly-added data corresponding to the link has already been downloaded, the download can continue from 30%.
If the server corresponding to the link does not support resumable download, the paused caching operation is re-executed: for example, if 30% of the newly-added data corresponding to the link has already been downloaded, the downloaded data is deleted and the newly-added data corresponding to the link is downloaded again from the beginning.
The beneficial effect of this embodiment is: when all network requests in the request queue have been processed, it is judged whether the server corresponding to the link supports resumable download; when resumable download is supported, the paused caching operation is continued, so that the caching does not have to be re-executed and the caching speed is increased.
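The patent does not say how support for resumable download is detected; one common possibility is the HTTP range mechanism, and the sketch below uses the Accept-Ranges and Range headers purely as an assumed example of steps B1-B3.

```typescript
// Assumed realisation of steps B1-B3 with HTTP range requests; not prescribed by the patent.
async function resumeOrRestartCaching(link: string, downloadedBytes: number): Promise<Blob> {
  // B1: judge whether the server corresponding to the link supports resumable download.
  const head = await fetch(link, { method: "HEAD" });
  const supportsResume = head.headers.get("Accept-Ranges") === "bytes";

  if (supportsResume && downloadedBytes > 0) {
    // B2: continue the paused caching from the bytes already downloaded
    // (e.g. continue from the 30% downloaded before the pause).
    const rest = await fetch(link, { headers: { Range: `bytes=${downloadedBytes}-` } });
    return rest.blob();             // caller appends this to the previously stored part
  }

  // B3: resumable download is not supported, so discard the partial data
  // and re-execute the paused caching operation from the beginning.
  const full = await fetch(link);
  return full.blob();
}
```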
Fig. 3 is a block diagram of a data caching device in one embodiment of the present invention. As shown in Fig. 3, the data caching device according to the present invention can be used in a terminal and includes the following modules:
a receiving module 31, configured to receive a trigger event for refreshing a current page;
a first obtaining module 32, configured to obtain, according to the trigger event, a link corresponding to newly-added data in the current page;
a first caching module 33, configured to cache the newly-added data corresponding to the link.
In one embodiment, as shown in Fig. 4, the first caching module 33 includes:
a judging submodule 41, configured to judge whether the newly-added data corresponding to the link has already been cached locally;
a downloading submodule 42, configured to download the newly-added data corresponding to the link when the newly-added data corresponding to the link is not cached locally;
a storing submodule 43, configured to store the newly-added data corresponding to the link in a local buffer when the download of the newly-added data corresponding to the link is completed.
In one embodiment, the device further includes:
an adding module, configured to add a received network request to a request queue.
In one embodiment, the device further includes:
a pausing module, configured to pause the caching of the newly-added data when a network request is detected in the request queue;
a second obtaining module, configured to obtain the target network request at the head of the request queue;
a processing module, configured to process the target network request;
a deleting module, configured to delete the target network request from the head of the request queue when the target network request has been processed.
In one embodiment, the device further includes:
a judging module, configured to judge whether the server corresponding to the link supports resumable download;
a second caching module, configured to continue the paused caching operation when the server corresponding to the link supports resumable download;
a third caching module, configured to re-execute the paused caching operation when the server corresponding to the link does not support resumable download.
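For illustration, the modules of Figs. 3 and 4 could be grouped as in the following TypeScript sketch; the class name DataCachingDevice and its methods are assumptions, and only the receiving, obtaining and caching modules (31-33 and submodules 41-43) are shown.

```typescript
// Illustrative grouping of modules 31-33 and submodules 41-43; not a prescribed API.
class DataCachingDevice {
  private localBuffer = new Map<string, Blob>();      // written by storing submodule 43

  // Receiving module 31: receive the trigger event for refreshing the current page,
  // here reduced to being handed the links of the newly-added data.
  onRefresh(newDataLinks: string[]): Promise<void[]> {
    // First obtaining module 32 + first caching module 33.
    return Promise.all(newDataLinks.map(link => this.cache(link)));
  }

  // First caching module 33, composed of submodules 41-43.
  private async cache(link: string): Promise<void> {
    if (this.localBuffer.has(link)) return;            // judging submodule 41
    const response = await fetch(link);                // downloading submodule 42
    this.localBuffer.set(link, await response.blob()); // storing submodule 43
  }
}
```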
It should be understood by those skilled in the art that embodiments of the present invention may be provided as a method, a system or a computer program product. Therefore, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware. Furthermore, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including, but not limited to, disk storage and optical storage) containing computer-usable program code.
The present invention is described with reference to flow charts and/or block diagrams of methods, devices (systems) and computer program products according to embodiments of the present invention. It should be understood that each flow and/or block in the flow charts and/or block diagrams, and combinations of flows and/or blocks in the flow charts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor or another programmable data processing device to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing device produce a device for implementing the functions specified in one or more flows of the flow charts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or another programmable data processing device to work in a specific manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device that implements the functions specified in one or more flows of the flow charts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or another programmable data processing device, so that a series of operation steps is performed on the computer or other programmable device to produce computer-implemented processing, such that the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flow charts and/or one or more blocks of the block diagrams.
Obviously, those skilled in the art can make various changes and modifications to the present invention without departing from the spirit and scope of the present invention. Thus, if these modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalent technologies, the present invention is also intended to include these changes and modifications.

Claims (10)

1. A data caching method, characterized by comprising:
receiving a trigger event for refreshing a current page;
obtaining, according to the trigger event, a link corresponding to newly-added data in the current page;
caching the newly-added data corresponding to the link.
2. The method of claim 1, characterized in that the caching of the newly-added data corresponding to the link comprises:
judging whether the newly-added data corresponding to the link has already been cached locally;
when the newly-added data corresponding to the link is not cached locally, downloading the newly-added data corresponding to the link;
when the download of the newly-added data corresponding to the link is completed, storing the newly-added data corresponding to the link in a local buffer.
3. The method of claim 1, characterized in that, during the caching of the newly-added data, the method further comprises:
when a network request is received, adding the network request to a request queue.
4. The method of claim 3, characterized in that the method further comprises:
when a network request is detected in the request queue, pausing the caching of the newly-added data;
obtaining the target network request at the head of the request queue;
processing the target network request;
when the target network request has been processed, deleting the target network request from the head of the request queue.
5. The method of claim 4, characterized in that, when all network requests in the request queue have been processed, the method further comprises:
judging whether the server corresponding to the link supports resumable download;
when the server corresponding to the link supports resumable download, continuing the paused caching operation;
when the server corresponding to the link does not support resumable download, re-executing the paused caching operation.
6. A data caching device, characterized by comprising:
a receiving module, configured to receive a trigger event for refreshing a current page;
a first obtaining module, configured to obtain, according to the trigger event, a link corresponding to newly-added data in the current page;
a first caching module, configured to cache the newly-added data corresponding to the link.
7. The device of claim 6, characterized in that the first caching module comprises:
a judging submodule, configured to judge whether the newly-added data corresponding to the link has already been cached locally;
a downloading submodule, configured to download the newly-added data corresponding to the link when the newly-added data corresponding to the link is not cached locally;
a storing submodule, configured to store the newly-added data corresponding to the link in a local buffer when the download of the newly-added data corresponding to the link is completed.
8. The device of claim 6, characterized in that the device further comprises:
an adding module, configured to add a received network request to a request queue.
9. The device of claim 8, characterized in that the device further comprises:
a pausing module, configured to pause the caching of the newly-added data when a network request is detected in the request queue;
a second obtaining module, configured to obtain the target network request at the head of the request queue;
a processing module, configured to process the target network request;
a deleting module, configured to delete the target network request from the head of the request queue when the target network request has been processed.
10. The device of claim 9, characterized in that the device further comprises:
a judging module, configured to judge whether the server corresponding to the link supports resumable download;
a second caching module, configured to continue the paused caching operation when the server corresponding to the link supports resumable download;
a third caching module, configured to re-execute the paused caching operation when the server corresponding to the link does not support resumable download.
CN201611032253.2A 2016-11-22 2016-11-22 Data caching method and device Pending CN106790363A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611032253.2A CN106790363A (en) 2016-11-22 2016-11-22 Data caching method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611032253.2A CN106790363A (en) 2016-11-22 2016-11-22 Data caching method and device

Publications (1)

Publication Number Publication Date
CN106790363A true CN106790363A (en) 2017-05-31

Family

ID=58970992

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611032253.2A Pending CN106790363A (en) 2016-11-22 2016-11-22 Data caching method and device

Country Status (1)

Country Link
CN (1) CN106790363A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019223137A1 (en) * 2018-05-24 2019-11-28 平安科技(深圳)有限公司 Cache data update method and apparatus, computer device, and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102737037A (en) * 2011-04-07 2012-10-17 北京搜狗科技发展有限公司 Webpage pre-reading method, device and browser
US8832288B1 (en) * 2012-07-13 2014-09-09 Google Inc. Transitions between remotely cached and live versions of a webpage
CN106095999A (en) * 2016-06-22 2016-11-09 腾讯科技(深圳)有限公司 Obtain the method and device of content of pages



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20170531