CN110633434A - Page caching method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN110633434A
Authority
CN
China
Prior art keywords
page
target
cache region
caching
page data
Prior art date
Legal status
Granted
Application number
CN201910656405.3A
Other languages
Chinese (zh)
Other versions
CN110633434B (en)
Inventor
Inventor not disclosed
Current Assignee
Beijing Youzhuju Network Technology Co Ltd
Original Assignee
Beijing Infinite Light Field Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Infinite Light Field Technology Co Ltd
Priority to CN201910656405.3A
Publication of CN110633434A
Application granted
Publication of CN110633434B
Status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/90 Details of database functions independent of the retrieved data types
    • G06F 16/95 Retrieval from the web
    • G06F 16/957 Browsing optimisation, e.g. caching or content distillation
    • G06F 16/9574 Browsing optimisation of access to content, e.g. by caching
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

Embodiments of the disclosure provide a page caching method, a page caching apparatus, an electronic device and a storage medium. The method includes: when an operation triggering entry from the current page to the next page is detected, determining the current storage information in a first cache region; and when the current storage information meets a first preset condition, caching page data of a target page in the first cache region to a target position, where the first cache region is a memory and the target position includes a local disk. The technical solution addresses the prior-art problem that, when a large number of pages are opened step by step, all of them must be kept in system memory so that the function of returning to the previous page can be executed; this occupies a large amount of memory and can cause insufficient system memory and sluggish operation, degrading the user experience. By caching page data in advance, the solution optimizes the storage strategy, reduces the amount of system memory occupied, and improves the user experience.

Description

Page caching method and device, electronic equipment and storage medium
Technical Field
The embodiment of the disclosure relates to the technical field of computers, and in particular relates to a page caching method and device, an electronic device and a storage medium.
Background
With the development of electronic products, users can read articles or watch videos through application programs. For example, after a user opens an application, a page is displayed for browsing; if the user triggers a control on the current page, the next page of the current page is entered. In other words, the user can open new pages an unlimited number of times through the links or controls in the pages. A return control is usually also provided in a page, and clicking it returns to the previous page of the current page.
Therefore, if many pages are opened step by step, all of these pages need to be kept in memory to ensure that the function of returning to the previous page can be executed. Storing multiple pages in memory occupies a large amount of memory, which leads to the technical problems of insufficient system memory and sluggish operation, and in turn degrades the user experience.
Disclosure of Invention
The embodiments of the disclosure provide a page caching method and apparatus, an electronic device and a storage medium, so as to optimize the storage strategy, reduce the amount of system memory occupied, and improve the user experience.
In a first aspect, an embodiment of the present disclosure further provides a page caching method, where the method includes:
when an operation triggering entry from the current page to the next page is detected, determining the current storage information in a first cache region;
when the current storage information meets a first preset condition, caching target page data in a target page in the first cache region to a target position;
the first cache region is a memory, and the target position includes a local disk.
In a second aspect, an embodiment of the present disclosure further provides a page caching apparatus, where the apparatus includes:
the detection module is used for determining the current storage information in the first cache region when the operation of triggering the current page to enter the next page is detected;
the cache module is used for caching the target page data in the target page in the first cache region to a target position when the current storage information meets a first preset condition;
the first cache region is a memory, and the target position includes a local disk.
In a third aspect, an embodiment of the present disclosure further provides an electronic device, where the electronic device includes:
one or more processors;
a storage device for storing one or more programs,
when the one or more programs are executed by the one or more processors, the one or more processors are enabled to implement the page caching method according to any one of the embodiments of the present disclosure.
In a fourth aspect, the embodiments of the present disclosure further provide a storage medium containing computer-executable instructions, which when executed by a computer processor, are used to perform the page caching method according to any one of the embodiments of the present disclosure.
According to the technical solution of the embodiments of the disclosure, when an operation triggering entry from the current page to the next page is detected, the current storage information in a first cache region is determined; when the current storage information meets a first preset condition, page data of a target page in the first cache region is cached to a target position, where the first cache region is a memory and the target position includes a local disk. This solves the prior-art problem that, when many pages are opened step by step, all of them must be stored in system memory so that the function of returning to the previous page can be executed, which occupies a large amount of memory and leads to insufficient system memory, sluggish operation and the like, thereby degrading the user experience. Page data is moved out once the amount stored in system memory meets the preset condition, the storage strategy is optimized, the memory occupied is reduced, and the user experience is thus improved.
Drawings
The above and other features, advantages and aspects of various embodiments of the present disclosure will become more apparent by referring to the following detailed description when taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that elements and features are not necessarily drawn to scale.
Fig. 1 is a schematic flowchart of a page caching method according to a first embodiment of the present disclosure;
fig. 2 is a schematic flowchart of a page rollback method according to a second embodiment of the present disclosure;
fig. 3 is a schematic flow chart of a preferred embodiment provided in the third embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of a page caching apparatus according to a fourth embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of an electronic device according to a fifth embodiment of the disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that the present disclosure will be understood more thoroughly and completely. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order, and/or performed in parallel. Moreover, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "include" and variations thereof as used herein are open-ended, i.e., "including but not limited to". The term "based on" is "based, at least in part, on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Relevant definitions for other terms will be given in the following description.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It is noted that the modifiers "a", "an", and "the" in this disclosure are intended to be illustrative rather than limiting; those skilled in the art will understand that they should be read as "one or more" unless the context clearly indicates otherwise.
Example one
Fig. 1 is a schematic flowchart of a page caching method provided by the first embodiment of the present disclosure. The embodiment is applicable to the situation where pages stored in system memory are cached elsewhere once the amount stored in system memory meets a preset condition. The method may be executed by a page caching apparatus, which may be implemented in software and/or hardware and integrated in an electronic device such as a mobile terminal or a PC.
As shown in fig. 1, the method of the present embodiment includes:
s110, when the operation of triggering the operation of entering the next page from the current page is detected, the current storage information in the first cache region is determined.
Here, the page currently browsed by the user is taken as the current page. The current page may include at least two controls that the user can trigger to enter a next page; when the user triggers one of these controls, the view jumps from the current page to the next page, so that the user can browse deeper page by page. In other words, the next page of the current page is entered by triggering a link or a control in the current page. The first cache region may be understood as the system memory of an application program or a browser; it serves as the system's first memory region and stores each page opened step by step by the user. The current storage information may be understood as the memory occupied by the pages stored in the first cache region, and/or the number of pages stored in the first cache region.
Specifically, when it is detected that a user triggers a certain link or control on the current page to jump to the next page, the number of pages stored in the first cache region and/or the memory occupied by each page stored in the first cache region may be obtained.
And S120, caching the target page data in the target page in the first cache region to a target position when the current storage information meets a first preset condition.
Here, a preset threshold reached by the storage amount indicated in the current storage information is used as the first preset condition. The first preset condition may be that the number of pages containing page data stored in the first cache region is larger than a first preset page number, and/or that the memory occupied by the stored pages exceeds a preset storage memory. That is, as long as the current storage information satisfies either of these conditions, a target page can be acquired from the first cache region. The target page is the page to be taken from the first cache region and stored at the target position. A page includes page data and a page frame, and the page data of the target page is taken as the target page data. The target position includes a local disk, and the first cache region is the system memory.
Specifically, when the number of pages stored in the first cache region is greater than the first preset page number, and/or the total memory occupied by the stored pages exceeds the preset storage memory, a target page to be stored at the target position may be determined from the first cache region. After the target page is determined, its page data may be cached in the local disk.
Illustratively, suppose the first preset condition is that the page number threshold of the first cache region is 10 and/or the preset storage memory is 512 MB. When the user triggers a control on the current page to enter the next page and the current page is stored in the first cache region, it may be found that 11 pages are now stored in the first cache region, which exceeds the page number threshold of 10; a target page is then selected from the 11 pages and its page data is stored in the local disk.
It should be noted that, if the current storage information does not satisfy the first preset condition, the above operation may not be performed.
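By way of illustration only, the following Kotlin sketch shows one possible reading of S110 and the first preset condition, using the example thresholds above (10 pages, 512 MB). The names CachedPage, StorageInfo, currentStorageInfo and meetsFirstPresetCondition are assumptions introduced for this sketch and are not part of the patent.

```kotlin
// Illustrative sketch only; names and default thresholds are assumptions, not the patent's API.
data class CachedPage(
    val number: Int,          // numbering information (order of entry into the region)
    val data: ByteArray,      // page data (empty once it has been moved to disk)
    val frame: String,        // page frame, represented here as a plain string
    val cachedAtMillis: Long  // caching time: when the page entered the first cache region
)

data class StorageInfo(val pageCount: Int, val occupiedBytes: Long)

// "Current storage information": number of pages that still hold page data and
// the total memory their data occupies.
fun currentStorageInfo(firstCacheRegion: List<CachedPage>): StorageInfo =
    StorageInfo(
        pageCount = firstCacheRegion.count { it.data.isNotEmpty() },
        occupiedBytes = firstCacheRegion.sumOf { it.data.size.toLong() }
    )

// First preset condition: too many pages and/or too much occupied memory.
fun meetsFirstPresetCondition(
    info: StorageInfo,
    maxPages: Int = 10,                    // example "first preset page number"
    maxBytes: Long = 512L * 1024 * 1024    // example "preset storage memory" (512 MB)
): Boolean = info.pageCount > maxPages || info.occupiedBytes > maxBytes
```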
In this embodiment, the target page may be determined from the first cache region as follows: when the current storage information meets the first preset condition, the caching time of each page cached in the first cache region is obtained; the time difference between each caching time and the current moment is determined, the page corresponding to the maximum time difference from the current moment is taken as the target page, and the target page data of the target page is cached in the local disk. The current moment is the moment at which the current storage information is detected to meet the first preset condition.
The moment at which a page is stored in the first cache region is taken as the caching time of that page; different pages stored in the first cache region therefore have different caching times. The current moment refers to the moment at which, after it is detected that the user triggers entry from the current page to the next page and the current page has been stored in the first cache region, the storage information in the first cache region is found to meet the preset condition.
It can be understood that, if the current storage information meets the first preset condition, the caching time of each page stored in the first cache region may be obtained, and the time difference between each caching time and the current moment may be calculated. The larger the time difference, the earlier the page was cached in the first cache region, i.e. the earlier the user browsed it. The page corresponding to the maximum time difference from the current moment, that is, the page browsed earliest among those in the first cache region, may be taken as the target page, and the page data of the target page is cached in the local disk.
Illustratively, suppose the moment at which the current storage information meets the first preset condition is A, and the caching times of the pages stored in the first cache region are A1, A2, A3, A4, …, A10, where A1 is the time at which the first page was stored in the first cache region and A10 is the time at which the last page was stored. The page corresponding to caching time A1 is then the target page, and its page data is acquired and stored to the local disk.
Alternatively, the pages stored in the first cache region may be numbered: the first page stored is numbered 1, the second page is numbered 2, and so on, and the pages are kept in numbering order. When the page numbered 11 is stored, the page numbered 1 is taken as the target page and its page data is stored to the local disk; when the page numbered 12 is then stored, the page numbered 2 in the first cache region is taken as the target page and its page data is cached to the target position, and so on, so that target pages are obtained from the first cache region one by one. That is to say, during the process of determining target pages and storing them to the target position, the number of pages stored in the first cache region is always kept less than or equal to the preset page number threshold.
On the basis of the technical scheme, caching the target page data to a target position comprises the following steps: caching the target page data into a local disk, and keeping a target page frame corresponding to the target page data in a first cache region; or caching the target page data to a local disk, and deleting the target page frame corresponding to the target page data from the first cache region.
That is, two embodiments may be adopted to process the target page frame corresponding to the target page. The first embodiment may be: and acquiring a target page in the first cache region, caching target page data in the target page into a local disk, and keeping a target page frame corresponding to the target page in the first cache region. The second embodiment may be: caching the page data in the target page acquired from the first cache region into a local disk, and deleting the page frame from the first cache region.
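Continuing the sketch above (and reusing its CachedPage type), the following hedged example shows one way the target page could be selected by earliest caching time and its page data cached to the local disk, with the two frame-handling options described in this paragraph. The file-naming scheme page_<number>.bin is an assumption made for illustration.

```kotlin
import java.io.File

// Illustrative sketch only; reuses the CachedPage type from the previous sketch.
// The page with the earliest caching time (equivalently, the smallest number) is
// taken as the target page, its page data is written to the local disk, and its
// page frame is either retained in the first cache region or deleted.
fun cacheTargetPageToDisk(
    firstCacheRegion: MutableList<CachedPage>,
    diskDir: File,
    keepFrame: Boolean = true
) {
    // Target page: earliest caching time among pages that still hold page data.
    val target = firstCacheRegion
        .filter { it.data.isNotEmpty() }
        .minByOrNull { it.cachedAtMillis } ?: return

    // Target page data goes to the local disk, keyed by the page number so that
    // the retained frame can later be matched with it again (file name is assumed).
    diskDir.mkdirs()
    File(diskDir, "page_${target.number}.bin").writeBytes(target.data)

    val index = firstCacheRegion.indexOfFirst { it.number == target.number }
    if (keepFrame) {
        // Option 1: keep the target page frame in memory, drop only the page data.
        firstCacheRegion[index] = target.copy(data = ByteArray(0))
    } else {
        // Option 2: delete the whole entry (frame included) from the first cache region.
        firstCacheRegion.removeAt(index)
    }
}
```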
According to the technical solution of the embodiments of the disclosure, when an operation triggering entry from the current page to the next page is detected, the current storage information in a first cache region is determined; when the current storage information meets a first preset condition, page data of a target page in the first cache region is cached to a target position, where the first cache region is a memory and the target position includes a local disk. This solves the prior-art problem that, when many pages are opened step by step, all of them must be stored in system memory so that the function of returning to the previous page can be executed, which occupies a large amount of memory and leads to insufficient system memory, sluggish operation and the like, thereby degrading the user experience. Page data is moved out once the amount stored in system memory meets the preset condition, the storage strategy is optimized, the memory occupied is reduced, and the user experience is thus improved.
Example two
On the basis of the above technical solution, when an operation by which the user triggers a page rollback is detected, the page data stored in the local disk can be processed according to the information stored in the first cache region. Fig. 2 is a schematic flow chart of a page rollback method according to a second embodiment of the present disclosure.
As shown in fig. 2, the method includes:
s210, when the operation of triggering the rollback from the current page to the previous page is detected, determining the current storage information in the first cache region.
The previous page may be understood as a page with a minimum time interval from the current page among all browsed pages before the current page.
Specifically, when the trigger operation of triggering the rollback from the current page to the previous page is detected, the storage information in the first cache area is read, and the read storage information at this time is used as the current storage information.
Illustratively, the current storage information obtained to the first cache region is that 4 pages are stored.
And S220, when the current storage information meets a second preset condition, determining target page data to be recalled from the target position.
It should be noted that, when the trigger operation from the current page to the next page is detected, whether the page stored in the first cache region needs to be cached to the target position needs to meet a first preset condition. Correspondingly, when it is detected that the user triggers the rollback operation, whether to acquire the page data to be recalled from the target position needs to meet a second preset condition.
Optionally, the second preset condition may be: the number of pages stored in the current storage information is less than the first preset page number; and/or the total memory occupied by each page stored in the current storage information is smaller than the preset storage memory.
Specifically, when the number of pages stored in the current storage information is smaller than the first preset number of pages, and/or a memory occupied by each page stored in the current storage information is smaller than a preset storage memory, the target page data to be recalled may be acquired from the local disk.
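A minimal sketch of such a second-preset-condition check, mirroring the example thresholds used earlier; the function name and default values are assumptions.

```kotlin
// Illustrative sketch of the second preset condition: recall from the local disk
// only while the first cache region is below its limits ("and/or" is modelled as "or").
fun meetsSecondPresetCondition(
    pageCount: Int,
    occupiedBytes: Long,
    maxPages: Int = 10,
    maxBytes: Long = 512L * 1024 * 1024
): Boolean = pageCount < maxPages || occupiedBytes < maxBytes
```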
Optionally, the target page data to be recalled may be determined as follows: acquiring the storage time of each piece of page data stored in the local disk, and obtaining the time difference corresponding to each piece of page data according to its storage time and the return time, where the return time is the time at which the user triggers the operation of returning from the current page to the previous page; and taking the page data corresponding to the minimum time difference as the target page data to be recalled.
When the page data is stored to the local disk from the first cache area, the corresponding time is used as the storage time. The return time refers to the time when the user triggers the operation of returning from the current page to the previous page. The target page data to be called back is page data to be called back from the target position to the first cache region.
According to the storage time and the return time, the time difference between each piece of page data in the local disk and the return time can be determined. The larger the time difference, the earlier the user browsed the page corresponding to that storage time; correspondingly, the smaller the time difference, the more recently the user browsed the page corresponding to that storage time.
It should be noted that when pages are rolled back, the first page returned to is the one the user browsed most recently. For example, if the user browses page A, triggers content on page A to enter page B, and then triggers a control on page B to enter page C, triggering the rollback operation first returns to page B, and triggering it again returns to page A. That is, when it is detected that the user triggers a page rollback, the page data corresponding to the minimum time difference may be taken as the target page data to be recalled.
Specifically, the storage time of each piece of page data stored in the local disk is obtained, and the time difference between each piece of page data and the return time is determined. The page data corresponding to the smallest time difference is taken as the target page data to be recalled. It should be noted that the page data stored in the local disk may also be numbered: the page data stored first is marked 1, the page data stored second is marked 2, and so on, in storage order. When page data is recalled, the page data with the largest number may be preferentially taken as the target page data to be recalled.
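Under the same assumed on-disk layout as in the earlier sketch (one page_<number>.bin file per offloaded page), the selection of the target page data to be recalled could look as follows; the file's last-modified time stands in for the storage time.

```kotlin
import java.io.File

// Illustrative sketch: choose the page data whose storage time is closest to the
// return time, i.e. the most recently offloaded file. Assumes the "page_<number>.bin"
// naming used earlier; lastModified() is used as a stand-in for the storage time.
fun pickTargetPageDataToRecall(diskDir: File): File? =
    diskDir.listFiles { f -> f.name.startsWith("page_") && f.name.endsWith(".bin") }
        ?.maxByOrNull { it.lastModified() }
```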
And S230, obtaining a target callback page according to the target page data to be recalled and the target page frame to be used corresponding to the target page data to be recalled in the first cache region, and storing the target callback page to a preset position in the first cache region.
It should be noted that this step applies to the case where the page frame is retained in the first cache region; it no longer applies if the page frame of the target page has been deleted from the first cache region.
That is to say, when the frame corresponding to the target page is stored in the first cache region, the page frame corresponding to the target data to be recalled may be determined according to the identifier in the target data to be recalled, and the data to be recalled and the page frame are rendered.
It should be noted that, because page frames and pages are stored in the first cache region, a page that includes both a page frame and page data counts as one page when evaluating the first preset condition and the second preset condition. Optionally, the target page frame to be used corresponding to the target page data to be recalled, and the position where that frame is located (taken as the preset position), are determined in the first cache region according to the identification information in the target page data to be recalled; the target page data to be recalled and the target page frame to be used are then rendered to obtain a target callback page, which is stored at the preset position.
It should be noted that, when pages are stored in the first cache region, they may be numbered sequentially from earliest to latest according to the caching time at which each page was stored, and arranged in that order. Correspondingly, when target page data in the first cache region is stored to the local disk, each piece of page data stored in the local disk may also be numbered, and the page numbers in the first cache region correspond to the numbers in the local disk. When the target page data in the first cache region is stored to the local disk and the target page frame is retained in the first cache region, the target page frame and the target page data carry corresponding identification information, optionally numbering information.
The preset position can be understood as the position corresponding to the identifier of the target data to be recalled. For example, if the number of the target data to be recalled is 10, the preset position in the first cache region is position 10; that is, the recalled data numbered 10 is rendered and placed at the position numbered 10. The target page frame to be used is the page frame corresponding to the identifier of the target page data to be recalled.
Specifically, after the target page data to be recalled is acquired, according to the identification information in the target page data to be recalled, a page frame corresponding to the identification information is searched from the first cache region and is used as the target page frame. And the position of the target page frame is a preset position. And rendering the target page frame and the target page data to be recalled to obtain a target callback page. The target callback page may be stored to a preset location in the first cache region. It should be noted that, if the number of pages stored in the first cache region satisfies the second preset condition and no page data is stored in the local disk, the page data in the local disk may not be read any more even if the page rollback operation is detected.
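A hedged sketch of this recall-and-render step, reusing the CachedPage type and file naming assumed earlier: the recalled page data is matched to its retained frame by the shared number and put back at the frame's position, which plays the role of the preset position.

```kotlin
import java.io.File

// Illustrative sketch, reusing the CachedPage type and "page_<number>.bin" naming
// from the earlier sketches. The recalled page data is matched with its retained
// page frame by the shared number (the identification information), "rendered" by
// re-attaching the data to the frame, and stored back at the frame's position.
fun recallTargetPage(firstCacheRegion: MutableList<CachedPage>, dataFile: File): Boolean {
    val number = dataFile.name
        .removePrefix("page_")
        .removeSuffix(".bin")
        .toIntOrNull() ?: return false

    // The preset position is where the matching target page frame was retained.
    val presetPosition = firstCacheRegion.indexOfFirst { it.number == number }
    if (presetPosition < 0) return false  // frame was deleted, so this path does not apply

    // "Rendering" here simply reunites the page frame with its recalled page data.
    val recalled = firstCacheRegion[presetPosition].copy(data = dataFile.readBytes())
    firstCacheRegion[presetPosition] = recalled

    dataFile.delete()  // the page data now lives in the first cache region again
    return true
}
```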
According to the technical solution of the embodiments of the disclosure, when an operation triggering entry from the current page to the next page is detected, the current storage information in a first cache region is determined; when the current storage information meets a first preset condition, page data of a target page in the first cache region is cached to a target position, where the first cache region is a memory and the target position includes a local disk. This solves the prior-art problem that, when many pages are opened step by step, all of them must be stored in system memory so that the function of returning to the previous page can be executed, which occupies a large amount of memory and leads to insufficient system memory, sluggish operation and the like, thereby degrading the user experience. Page data is moved out once the amount stored in system memory meets the preset condition, the storage strategy is optimized, the memory occupied is reduced, and the user experience is thus improved.
Example three
As a preferred embodiment of the above embodiments, fig. 3 is a schematic flow chart of a preferred embodiment provided in a third embodiment of the present disclosure.
As shown in fig. 3, the method includes:
s310, when the control on the current page is triggered to enter the next page, the number of the current storage pages in the first cache region is obtained.
Illustratively, when a user triggers a certain control on a current page and enters a next page, the current page may be stored in the first cache region, and meanwhile, the number of currently stored pages in the first cache region is acquired to be 6 pages. That is to say, when a user browses a current page, the number of pages stored in the first cache region is 5, and when the current page enters a next page from the current page, after the current page is stored in the first cache region, the number of pages currently stored in the first cache region is 6.
S320, judging whether the number of the current storage pages is larger than a preset page number threshold value, if so, executing S330; if not, the process returns to step S310.
Wherein, the preset page number threshold is 5.
That is, when the number of the current memory pages is greater than the preset page number threshold, the page in the first cache region may be processed, that is, S330 is executed. If the number of the current storage pages is smaller than the preset page number threshold, continuously detecting whether the user triggers the operation of entering the next page from the current page, namely, repeatedly executing S310 and S320.
S330, acquiring a target page in the first cache region, and caching target page data in the target page to a local disk.
In this embodiment, the target page may be determined according to the cache time of storing each page in the first cache region, or the number of storing each page. The above two ways can be taken as examples respectively.
Illustratively, the cache time of each page stored in the first cache region is obtained, and the page with the largest time difference from the current time is taken as the target page. And caching the page data in the target page to a local disk.
Illustratively, when each page is cached to the first cache region, the pages may be numbered in the order in which they are cached: the page cached first in the first cache region is numbered 1, and so on, up to the page numbered 6. The page with the smallest number, i.e. the page that was stored in the first cache region first, is taken as the target page, and its page data is cached to the local disk.
On the basis of the above technical solution, after the target page data in the target page is cached to the local disk, at least two implementations may be adopted for the target page frame corresponding to the target page: optionally, keeping the target page frame in the first cache region, or deleting the target page frame from the first cache region.
In this embodiment, the target page frame is retained in the first cache region for example.
S340, when the operation of triggering the rollback from the current page to the previous page is detected, the current storage information in the first cache region is obtained.
When it is detected that the user triggers a rollback from the current page to the previous page, the number of pages currently stored in the first cache region also needs to be acquired.
Illustratively, when a rollback operation is detected, that is, while the current page is being returned to the previous page or during the jump to the previous page, the number of pages currently stored in the first cache region is found to be 4.
S350, judging whether the number of the current storage pages is smaller than a preset page number threshold value, if so, executing S360; if not, the process returns to the step S340.
Illustratively, the preset page number threshold is 5 pages. When the number of the current storage pages is 4 pages and is less than the preset page number threshold value 5, page data to be recalled can be obtained from the local disk, that is, S360 is executed. If the number of the current storage pages is not less than the preset page number threshold, it may be continuously detected whether the user triggers a page rollback operation, that is, S340 is executed.
And S360, acquiring the target page data to be recalled from the local disk, and acquiring a target callback page.
Specifically, the page data whose storage time has the smallest time difference from the current moment is taken as the target page data to be recalled. The page frame corresponding to the target data to be recalled is determined from the first cache region according to the identification information in the target data to be recalled, and the target data to be recalled and the page frame are rendered to obtain the target callback page. The target callback page can then be inserted at the corresponding position, according to the position or the number of the target page frame, so as to complete the rollback.
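Tying the flow of this embodiment together, the following usage sketch assumes the CachedPage type and the helper functions from the earlier sketches are in scope and uses the threshold of 5 pages from this example; the way the current page is dropped on rollback is a deliberate simplification.

```kotlin
import java.io.File

// Purely illustrative walk-through of S310-S360 with a page number threshold of 5,
// relying on the helper functions sketched in the earlier embodiments.
fun main() {
    val firstCacheRegion = mutableListOf<CachedPage>()
    val diskDir = File("page_cache_demo")

    // S310-S330: open six pages step by step; once the sixth page is stored, the
    // count exceeds the threshold of 5 and the oldest page's data goes to disk.
    for (n in 1..6) {
        firstCacheRegion += CachedPage(
            number = n,
            data = ByteArray(1024) { n.toByte() },   // stand-in page data
            frame = "frame-$n",
            cachedAtMillis = System.currentTimeMillis() + n
        )
        val info = currentStorageInfo(firstCacheRegion)
        if (meetsFirstPresetCondition(info, maxPages = 5)) {
            cacheTargetPageToDisk(firstCacheRegion, diskDir, keepFrame = true)
        }
    }

    // S340-S360: a rollback is detected; the current page is dropped, and since
    // fewer than 5 pages still hold data in memory, the most recently offloaded
    // page data is recalled and reunited with its retained frame.
    firstCacheRegion.removeAt(firstCacheRegion.lastIndex)
    val info = currentStorageInfo(firstCacheRegion)
    if (meetsSecondPresetCondition(info.pageCount, info.occupiedBytes, maxPages = 5)) {
        pickTargetPageDataToRecall(diskDir)?.let { recallTargetPage(firstCacheRegion, it) }
    }
}
```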
According to the technical solution of the embodiments of the disclosure, when an operation triggering entry from the current page to the next page is detected, the current storage information in a first cache region is determined; when the current storage information meets a first preset condition, page data of a target page in the first cache region is cached to a target position, where the first cache region is a memory and the target position includes a local disk. This solves the prior-art problem that, when many pages are opened step by step, all of them must be stored in system memory so that the function of returning to the previous page can be executed, which occupies a large amount of memory and leads to insufficient system memory, sluggish operation and the like, thereby degrading the user experience. Page data is moved out once the amount stored in system memory meets the preset condition, the storage strategy is optimized, the memory occupied is reduced, and the user experience is thus improved.
Example four
Fig. 4 is a schematic structural diagram of a page caching apparatus according to a fourth embodiment of the present disclosure, where the apparatus includes: a detection module 410 and a caching module 420.
The detection module 410 is configured to determine current storage information in the first cache region when an operation that triggers entering into a next page from a current page is detected; a caching module 420, configured to cache page data in a target page in the first caching area to a target location when the current storage information meets a first preset condition; the first cache region is a memory, and the target position includes a local disk.
According to the technical solution of the embodiments of the disclosure, when an operation triggering entry from the current page to the next page is detected, the current storage information in a first cache region is determined; when the current storage information meets a first preset condition, page data of a target page in the first cache region is cached to a target position, where the first cache region is a memory and the target position includes a local disk. This solves the prior-art problem that, when many pages are opened step by step, all of them must be stored in system memory so that the function of returning to the previous page can be executed, which occupies a large amount of memory and leads to insufficient system memory, sluggish operation and the like, thereby degrading the user experience. Page data is moved out once the amount stored in system memory meets the preset condition, the storage strategy is optimized, the memory occupied is reduced, and the user experience is thus improved.
On the basis of the above technical solutions, the cache module further includes: a preset condition judgment unit for:
when at least one condition is detected to be met, caching page data in a target page in the first cache region to a target position: the number of pages including page data stored in the current storage information is larger than a first preset page number; the memory occupied by storing each page in the current storage information exceeds the preset storage memory; wherein, the page comprises page data and a page frame.
On the basis of the above technical solutions, the cache module is further configured to:
obtaining the caching time of caching each page in the first caching area; determining a time difference value from the current time according to the caching time, taking a page corresponding to the maximum time difference value from the current time as a target page, and caching page data in the target page to the target position; and the current moment is the moment when the current storage information is detected to meet a first preset condition.
On the basis of the above technical solutions, the apparatus further includes:
the first cache unit is used for caching the target page data into the local disk and retaining a target page frame corresponding to the target page data in the first cache region; or
and the second cache unit is used for caching the target page data to the local disk and deleting the target page frame corresponding to the target page data from the first cache region.
On the basis of the above technical solutions, the cache module further includes:
the rollback detection unit is used for determining the current storage information in the first cache region when detecting that the operation of rollback from the current page to the previous page is triggered;
the page data calling unit is used for determining target page data to be called back from the target position when the current storage information meets a second preset condition;
and the callback page caching unit is used for obtaining a target callback page according to the target to-be-callback page data and the target to-be-used page frame corresponding to the target to-be-callback data in the first caching area, and storing the target callback page to a preset position in the first caching area.
On the basis of the above technical solutions, the page data calling unit is further configured to:
when detecting that the current storage information meets at least one of the following conditions, determining target page data to be recalled from the target position:
the number of pages in the current storage information, which include page data, is less than the first preset number of pages;
and the total memory occupied by each page stored in the current storage information is smaller than the preset storage memory.
On the basis of the above technical solutions, the page data calling unit is further configured to:
acquiring the storage time of each page data stored in a local disk, and acquiring a time difference value corresponding to each page data according to the storage time and the return time; the return time is the time when the user triggers the operation of returning from the current page to the previous page;
and taking the page data corresponding to the minimum time difference value as target page data to be recalled.
On the basis of the above technical solutions, the callback page caching unit is further configured to:
determining a target page frame to be used corresponding to the target page data to be recalled in the first cache region and a position of the target page frame to be used as a preset position according to the identification information in the target page data to be recalled;
and rendering the data of the target page to be recalled and the target page frame to be used to obtain a target callback page, and storing the target callback page to the preset position.
The page caching device provided by the embodiment of the disclosure can execute the page caching method provided by any embodiment of the disclosure, and has corresponding functional modules and beneficial effects of the execution method.
It should be noted that, the units and modules included in the apparatus are merely divided according to functional logic, but are not limited to the above division as long as the corresponding functions can be implemented; in addition, specific names of the functional units are only used for distinguishing one functional unit from another, and are not used for limiting the protection scope of the embodiments of the present disclosure.
Example five
Referring now to fig. 5, a schematic diagram of an electronic device (e.g., the terminal device or the server in fig. 5) 500 suitable for implementing embodiments of the present disclosure is shown. The terminal device in the embodiments of the present disclosure may include, but is not limited to, a mobile terminal such as a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), a vehicle terminal (e.g., a car navigation terminal), and the like, and a stationary terminal such as a digital TV, a desktop computer, and the like. The electronic device shown in fig. 5 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 5, electronic device 500 may include a processing means (e.g., central processing unit, graphics processor, etc.) 501 that may perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 502 or a program loaded from a storage means 506 into a Random Access Memory (RAM) 503. In the RAM 503, various programs and data necessary for the operation of the electronic apparatus 500 are also stored. The processing device 501, the ROM 502, and the RAM 503 are connected to each other through a bus 504. An input/output (I/O) interface 505 is also connected to bus 504.
Generally, the following devices may be connected to the I/O interface 505: input devices 506 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; output devices 507 including, for example, a Liquid Crystal Display (LCD), speakers, vibrators, and the like; storage devices 506 including, for example, magnetic tape, hard disk, etc.; and a communication device 509. The communication means 509 may allow the electronic device 500 to communicate with other devices wirelessly or by wire to exchange data. While fig. 5 illustrates an electronic device 500 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program carried on a non-transitory computer readable medium, the computer program containing program code for performing the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means 509, or installed from the storage means 506, or installed from the ROM 502. The computer program performs the above-described functions defined in the methods of the embodiments of the present disclosure when executed by the processing device 501.
The terminal provided by the embodiment of the present disclosure and the page caching method provided by the embodiment belong to the same inventive concept, and technical details that are not described in detail in the embodiment of the present disclosure may be referred to the embodiment, and the embodiment of the present disclosure have the same beneficial effects.
Example six
The embodiment of the present disclosure provides a computer storage medium, on which a computer program is stored, and when the program is executed by a processor, the page caching method provided by the above embodiment is implemented.
It should be noted that the computer readable medium in the present disclosure can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), internetworks (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to:
when an operation triggering entry from the current page to the next page is detected, determining the current storage information in the first cache region;
when the current storage information meets a first preset condition, caching page data in a target page in the first cache region to a target position;
the first cache region is a memory, and the target position includes a local disk.
Computer program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including but not limited to object oriented programming languages such as Java, Smalltalk and C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the latter case, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. The name of a unit does not in some cases constitute a limitation on the unit itself; for example, the page data calling unit may also be described as a "target page data acquisition unit".
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
According to one or more embodiments of the present disclosure, [ example one ] there is provided a page caching method, the method comprising:
when an operation triggering entry from the current page to the next page is detected, determining the current storage information in a first cache region;
when the current storage information meets a first preset condition, caching page data in a target page in the first cache region to a target position;
wherein the first cache region is a memory, and the target position includes a local disk.
According to one or more embodiments of the present disclosure, [ example two ] provides a page caching method, further comprising:
optionally, when the current storage information meets a first preset condition, caching page data in a target page in the first cache region to a target position, where the caching step includes:
when at least one condition is detected to be met, caching page data in a target page in the first cache region to a target position:
the number of pages including page data stored in the current storage information is larger than a first preset page number;
the memory occupied by storing each page in the current storage information exceeds the preset storage memory;
wherein, the page comprises page data and a page frame.
According to one or more embodiments of the present disclosure, [ example three ] there is provided a page caching method, further comprising:
optionally, the caching of the page data in the target page in the first cache region to the target position comprises:
obtaining the caching time at which each page was cached in the first cache region;
determining, for each page, the time difference between its caching time and the current time, taking the page with the maximum time difference from the current time as the target page, and caching the page data in the target page to the target position;
wherein the current time is the time at which the current storage information is detected to meet the first preset condition.
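Example three is effectively a least-recently-cached selection: the page whose caching time lies furthest from the current moment is the one spilled. A one-function sketch, assuming the caching times are tracked in a map keyed by page identifier, could look like this.

```kotlin
// Picks the target page per example three: the page whose caching time differs most
// from the current time, i.e. the oldest page in the first cache region.
// The Map<pageId, cachedAtMillis> shape is an illustrative assumption.
fun pickTargetPageId(cacheTimes: Map<String, Long>, nowMillis: Long): String? =
    cacheTimes.entries.maxByOrNull { nowMillis - it.value }?.key
```

Maximizing the difference from the current time is the same as taking the entry with the smallest caching timestamp, so the selection is a single linear pass over the cached pages.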
According to one or more embodiments of the present disclosure, [ example four ] there is provided a page caching method, further comprising:
optionally, the caching of the target page data to the target position comprises:
caching the target page data to the local disk and retaining the target page frame corresponding to the target page data in the first cache region; or
caching the target page data to the local disk and deleting the target page frame corresponding to the target page data from the first cache region.
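The two branches of example four differ only in whether the page frame survives in memory after the page data has been written out. The sketch below illustrates both under assumed names; PageSpiller, the ".page" file naming, and the frames map are not taken from the disclosure.

```kotlin
import java.io.File

// Illustrative sketch of example four. `frames` stands in for the page frames kept
// in the first cache region; the directory layout and file naming are assumptions.
class PageSpiller(private val cacheDir: File, private val frames: MutableMap<String, Any>) {

    init {
        cacheDir.mkdirs()   // make sure the on-disk target position exists
    }

    // Variant 1: write the page data to the local disk, keep the page frame in memory
    // so a later rollback only needs to reload the data.
    fun spillKeepingFrame(pageId: String, pageData: ByteArray) {
        File(cacheDir, "$pageId.page").writeBytes(pageData)
    }

    // Variant 2: write the page data to the local disk and drop the page frame too,
    // freeing more memory at the cost of rebuilding the frame on rollback.
    fun spillDroppingFrame(pageId: String, pageData: ByteArray) {
        File(cacheDir, "$pageId.page").writeBytes(pageData)
        frames.remove(pageId)
    }
}
```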
According to one or more embodiments of the present disclosure, [ example five ] there is provided a page caching method, further comprising:
optionally, after caching the target page data in the local disk and retaining the target page frame corresponding to the target page data in the first cache region, the method further includes:
when an operation triggering a return from the current page to the previous page is detected, determining the current storage information in the first cache region;
when the current storage information meets a second preset condition, determining target page data to be recalled from the target position;
and obtaining a target callback page according to the target page data to be recalled and the corresponding target page frame to be used in the first cache region, and storing the target callback page at a preset position in the first cache region.
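On the return path, the spilled data first has to be read back from the local disk before it can be combined with the retained page frame. A minimal sketch that mirrors the file naming assumed in the spill sketch above:

```kotlin
import java.io.File

// Illustrative sketch of the disk read in example five; the ".page" file naming
// follows the earlier spill sketch and is an assumption, not part of the disclosure.
fun recallPageData(cacheDir: File, pageId: String): ByteArray? {
    val file = File(cacheDir, "$pageId.page")
    return if (file.isFile) file.readBytes() else null
}
```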
According to one or more embodiments of the present disclosure, [ example six ] there is provided a page caching method, further comprising:
optionally, when the current storage information meets a second preset condition, determining target page data to be recalled from the target position includes:
when detecting that the current storage information meets at least one of the following conditions, determining target page data to be recalled from the target position:
the number of pages containing page data stored in the current storage information is less than the first preset page number;
and the total memory occupied by the pages stored in the current storage information is less than the preset storage memory.
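The second preset condition in example six is essentially the complement of the first: the first cache region has spare room again, either in page count or in memory. A small check under the same assumed thresholds and types as before:

```kotlin
// Illustrative check of the "second preset condition" from example six; the
// RegionStats type and the default thresholds are assumptions.
data class RegionStats(val pageCount: Int, val totalBytes: Long)

fun shouldRecall(
    stats: RegionStats,
    maxPageCount: Int = 10,            // the "first preset page number"
    maxTotalBytes: Long = 64L shl 20   // the "preset storage memory"
): Boolean =
    stats.pageCount < maxPageCount || stats.totalBytes < maxTotalBytes
```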
According to one or more embodiments of the present disclosure, [ example seven ] there is provided a page caching method, further comprising:
optionally, the determining the target page data to be recalled from the target position includes:
acquiring the storage time at which each piece of page data is stored in the local disk, and obtaining a time difference for each piece of page data according to the storage time and the return time, the return time being the time at which the user triggers the operation of returning from the current page to the previous page;
and taking the page data corresponding to the minimum time difference value as target page data to be recalled.
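Minimizing the difference between storage time and return time means the most recently spilled page is restored first, which matches the order in which a user walks back through the page stack. A sketch, again assuming a map from page identifier to storage time:

```kotlin
// Example seven: pick the spilled page whose storage time is closest to the return
// time, i.e. the most recently spilled page (storage is assumed to precede the return).
// The Map<pageId, storedAtMillis> shape is an illustrative assumption.
fun pickPageToRecall(storeTimes: Map<String, Long>, returnTimeMillis: Long): String? =
    storeTimes.entries.minByOrNull { returnTimeMillis - it.value }?.key
```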
According to one or more embodiments of the present disclosure, [ example eight ] there is provided a page caching method, further comprising:
optionally, the obtaining of a target callback page according to the target page data to be recalled and the corresponding target page frame to be used in the first cache region, and the storing of the target callback page at a preset position in the first cache region, comprises:
determining, according to identification information in the target page data to be recalled, the corresponding target page frame to be used in the first cache region, and taking the position of the target page frame to be used as the preset position;
and rendering the target page data to be recalled with the target page frame to be used to obtain the target callback page, and storing the target callback page at the preset position.
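Example eight ties the recalled data back to its frame through identification information carried in the data, and the frame's slot in the first cache region becomes the preset position for the rebuilt page. In the sketch below, rendering is mocked as simple composition; every type and name is an assumption.

```kotlin
// Illustrative sketch of example eight; PageData, PageFrame and RenderedPage are
// stand-in types, and "rendering" is mocked as composing the two objects.
data class PageData(val pageFrameId: String, val payload: ByteArray)  // identification info + content
data class PageFrame(val frameId: String, val positionIndex: Int)     // retained in the first cache region
data class RenderedPage(val frame: PageFrame, val data: PageData)

fun rebuildCallbackPage(
    recalled: PageData,
    framesInRegion: List<PageFrame>,
    region: MutableMap<Int, RenderedPage>      // position index -> page in the first cache region
): RenderedPage? {
    // The frame the recalled data identifies; its position is taken as the preset position.
    val frame = framesInRegion.firstOrNull { it.frameId == recalled.pageFrameId } ?: return null
    val page = RenderedPage(frame, recalled)   // "rendering": combine the data with its frame
    region[frame.positionIndex] = page         // store the target callback page at the preset position
    return page
}
```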
According to one or more embodiments of the present disclosure, [ example nine ] there is provided a page buffering apparatus, including:
the detection module is used for determining the current storage information in the first cache region when an operation triggering entry from the current page into the next page is detected;
the cache module is used for caching page data in a target page in the first cache region to a target position when the current storage information meets a first preset condition;
the first cache region is a memory, and the target position includes a local disk.
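Read as software, the apparatus of example nine is the two halves of the method split into cooperating units. A thin interface sketch with assumed names:

```kotlin
// Thin sketch of the apparatus in example nine; StorageInfo and all member names
// are assumptions used only to illustrate the division of responsibilities.
data class StorageInfo(val pageCount: Int, val totalBytes: Long)

interface DetectionModule {
    // Invoked when an operation entering the next page from the current page is detected;
    // returns the current storage information of the first cache region.
    fun onEnterNextPage(): StorageInfo
}

interface CacheModule {
    // Caches the page data of a target page to the target position (the local disk)
    // when the storage information meets the first preset condition.
    fun cacheTargetPageIfNeeded(info: StorageInfo)
}
```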
The foregoing description is only exemplary of the preferred embodiments of the disclosure and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the disclosure is not limited to the particular combination of features described above, but also encompasses other technical solutions formed by any combination of the above features or their equivalents without departing from the spirit of the disclosure, for example, technical solutions in which the above features are replaced with (but not limited to) features having similar functions disclosed in this disclosure.
Further, while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. Under certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limitations on the scope of the disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims (11)

1. A page caching method is characterized by comprising the following steps:
when an operation triggering entry from the current page into the next page is detected, determining the current storage information in the first cache region;
when the current storage information meets a first preset condition, caching target page data in a target page in the first cache region to a target position;
the first cache region is a memory, and the target position includes a local disk.
2. The method according to claim 1, wherein the caching of target page data in a target page in the first cache region to a target position when the current storage information satisfies the first preset condition comprises:
caching the target page data in the target page in the first cache region to the target position when at least one of the following conditions is detected:
the number of pages containing page data stored in the current storage information is greater than a first preset page number;
the total memory occupied by the pages stored in the current storage information exceeds a preset storage memory;
wherein a page comprises page data and a page frame.
3. The method of claim 1, wherein the caching of the target page data in the target page in the first cache region to the target position comprises:
obtaining the caching time at which each page was cached in the first cache region;
determining, for each page, the time difference between its caching time and the current time, taking the page with the maximum time difference from the current time as the target page, and caching the target page data in the target page to the target position;
wherein the current time is the time at which the current storage information is detected to meet the first preset condition.
4. The method of claim 1, wherein the caching of the target page data to the target position comprises:
caching the target page data to the local disk and retaining a target page frame corresponding to the target page data in the first cache region; or
caching the target page data to the local disk and deleting the target page frame corresponding to the target page data from the first cache region.
5. The method of claim 4, wherein, after the target page data is cached in the local disk and the target page frame corresponding to the target page data is retained in the first cache region, the method further comprises:
when an operation triggering a return from the current page to the previous page is detected, determining the current storage information in the first cache region;
when the current storage information meets a second preset condition, determining target page data to be recalled from the target position;
and obtaining a target callback page according to the target page data to be recalled and the corresponding target page frame to be used in the first cache region, and storing the target callback page at a preset position in the first cache region.
6. The method according to claim 5, wherein the determining, when the current storage information satisfies the second preset condition, of the target page data to be recalled from the target position comprises:
determining the target page data to be recalled from the target position when the current storage information is detected to meet at least one of the following conditions:
the number of pages containing page data stored in the current storage information is less than the first preset page number;
and the total memory occupied by the pages stored in the current storage information is less than the preset storage memory.
7. The method of claim 5, wherein the determining of the target page data to be recalled from the target position comprises:
acquiring the storage time at which each piece of page data is stored in the local disk, and obtaining a time difference for each piece of page data according to the storage time and the return time, the return time being the time at which the user triggers the operation of returning from the current page to the previous page;
and taking the page data corresponding to the minimum time difference as the target page data to be recalled.
8. The method of claim 7, wherein the obtaining of a target callback page according to the target page data to be recalled and the corresponding target page frame to be used in the first cache region, and the storing of the target callback page at a preset position in the first cache region, comprises:
determining, according to identification information in the target page data to be recalled, the corresponding target page frame to be used in the first cache region, and taking the position of the target page frame to be used as the preset position;
and rendering the target page data to be recalled with the target page frame to be used to obtain the target callback page, and storing the target callback page at the preset position.
9. A page caching apparatus, comprising:
the detection module is used for determining the current storage information in the first cache region when an operation triggering entry from the current page into the next page is detected;
the cache module is used for caching page data in a target page in the first cache region to a target position when the current storage information meets a first preset condition;
the first cache region is a memory, and the target position includes a local disk.
10. An electronic device, characterized in that the electronic device comprises:
one or more processors;
a storage device for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the page caching method according to any one of claims 1-8.
11. A storage medium containing computer-executable instructions for performing the page caching method of any one of claims 1 to 8 when executed by a computer processor.
CN201910656405.3A 2019-07-19 2019-07-19 Page caching method and device, electronic equipment and storage medium Active CN110633434B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910656405.3A CN110633434B (en) 2019-07-19 2019-07-19 Page caching method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910656405.3A CN110633434B (en) 2019-07-19 2019-07-19 Page caching method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN110633434A true CN110633434A (en) 2019-12-31
CN110633434B CN110633434B (en) 2024-02-27

Family

ID=68968920

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910656405.3A Active CN110633434B (en) 2019-07-19 2019-07-19 Page caching method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110633434B (en)


Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030182510A1 (en) * 2002-03-22 2003-09-25 Asim Mitra Multiple-level persisted template caching
CN101729590A (en) * 2008-10-15 2010-06-09 北大方正集团有限公司 Method, system and device for providing web page
CN102368258A (en) * 2011-09-30 2012-03-07 广州市动景计算机科技有限公司 Webpage page caching management method and system
CN103336812A (en) * 2013-06-27 2013-10-02 优视科技有限公司 Webpage resource caching method and device for improving secondary loading efficiency
CN104050253A (en) * 2014-06-12 2014-09-17 北京金山网络科技有限公司 Webpage display method and browser
CN104199684A (en) * 2014-08-13 2014-12-10 百度在线网络技术(北京)有限公司 Browser cold-booting method and device
CN104462455A (en) * 2014-12-16 2015-03-25 北京京东尚科信息技术有限公司 Method and device for displaying and processing network data
CN104573025A (en) * 2015-01-12 2015-04-29 北京京东尚科信息技术有限公司 Method and system for increasing page loading rate
CN108153588A (en) * 2016-12-06 2018-06-12 阿里巴巴集团控股有限公司 A kind of page navigation method and device, a kind of memory allocation method and device
CN109558251A (en) * 2017-09-26 2019-04-02 北京京东尚科信息技术有限公司 The method and terminal of page structure information modification
CN107943825A (en) * 2017-10-19 2018-04-20 阿里巴巴集团控股有限公司 Data processing method, device and the electronic equipment of page access
CN108391009A (en) * 2018-02-13 2018-08-10 广东欧珀移动通信有限公司 Display methods, device, storage medium and the electronic equipment of five application page
CN109165369A (en) * 2018-07-12 2019-01-08 北京猫眼文化传媒有限公司 Webpage display process and device
CN109740085A (en) * 2019-01-10 2019-05-10 北京字节跳动网络技术有限公司 A kind of methods of exhibiting of content of pages, device, equipment and storage medium

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113010891A (en) * 2021-02-26 2021-06-22 中科天齐(山西)软件安全技术研究院有限公司 Application program safety detection method and device, electronic equipment and storage medium
CN113010890A (en) * 2021-02-26 2021-06-22 中科天齐(山西)软件安全技术研究院有限公司 Application program safety detection method and device, electronic equipment and storage medium
CN113010891B (en) * 2021-02-26 2023-02-07 中科天齐(山西)软件安全技术研究院有限公司 Application program safety detection method and device, electronic equipment and storage medium
CN113010890B (en) * 2021-02-26 2023-02-07 中科天齐(山西)软件安全技术研究院有限公司 Application program safety detection method and device, electronic equipment and storage medium
CN115391582A (en) * 2022-09-27 2022-11-25 杭州涂鸦信息技术有限公司 Card processing method, electronic device and system
CN115509670A (en) * 2022-11-08 2022-12-23 广州文石信息科技有限公司 Page display method and device, ink screen equipment and storage medium

Also Published As

Publication number Publication date
CN110633434B (en) 2024-02-27

Similar Documents

Publication Publication Date Title
CN110633434B (en) Page caching method and device, electronic equipment and storage medium
CN111488185B (en) Page data processing method, device, electronic equipment and readable medium
CN110471709B (en) Method, device, medium and electronic equipment for accelerating webpage opening speed
CN110633433B (en) Page caching method and device, electronic equipment and storage medium
CN111400625B (en) Page processing method and device, electronic equipment and computer readable storage medium
CN113395572A (en) Video processing method and device, storage medium and electronic equipment
CN110647702A (en) Picture preloading method and device, electronic equipment and readable medium
CN110781437A (en) Method and device for acquiring webpage image loading duration and electronic equipment
CN111258736B (en) Information processing method and device and electronic equipment
CN111309496A (en) Method, system, device, equipment and storage medium for realizing delay task
CN111258800A (en) Page processing method and device and electronic equipment
CN112565890B (en) Video clipping method and device, storage medium and electronic equipment
CN111353296B (en) Article processing method, apparatus, electronic device and computer readable storage medium
CN110717126A (en) Page browsing method and device, electronic equipment and computer readable storage medium
CN113727172B (en) Video cache playing method and device, electronic equipment and storage medium
CN115357361A (en) Task processing method, device, equipment and medium
CN111459893B (en) File processing method and device and electronic equipment
CN111770385A (en) Card display method and device, electronic equipment and medium
CN110795670A (en) Webpage image monitoring method and device, electronic equipment and readable storage medium
CN111240758A (en) Material display method and device, electronic equipment and storage medium
CN111143355A (en) Data processing method and device
CN111291294A (en) Information loading method and device, terminal and storage medium
CN111258670B (en) Method and device for managing component data, electronic equipment and storage medium
CN111562913B (en) Method, device and equipment for pre-creating view component and computer readable medium
CN111209042A (en) Method, device, medium and electronic equipment for establishing function stack

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20230426

Address after: Room 802, Information Building, 13 Linyin North Street, Pinggu District, Beijing, 101299

Applicant after: Beijing youzhuju Network Technology Co.,Ltd.

Address before: No. 715, 7th floor, building 3, 52 Zhongguancun South Street, Haidian District, Beijing 100081

Applicant before: Beijing infinite light field technology Co.,Ltd.

GR01 Patent grant