CN106708568B - Method and device for loading client content in page-by-page manner


Info

Publication number
CN106708568B
CN106708568B
Authority
CN
China
Prior art keywords: data, content, client, page, cache data
Legal status
Active
Application number
CN201611116729.0A
Other languages: Chinese (zh)
Other versions: CN106708568A
Inventor
李枨煊
Current Assignee
Weimeng Chuangke Network Technology China Co Ltd
Original Assignee
Weimeng Chuangke Network Technology China Co Ltd
Application filed by Weimeng Chuangke Network Technology China Co Ltd filed Critical Weimeng Chuangke Network Technology China Co Ltd
Priority to CN201611116729.0A
Publication of CN106708568A
Application granted
Publication of CN106708568B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/445: Program loading or initiating
    • G06F 9/44505: Configuring for program initiating, e.g. using registry, configuration files
    • G06F 9/451: Execution arrangements for user interfaces

Abstract

The embodiment of the invention provides a method and a device for loading client content page by page. The method comprises the following steps: when a server receives a paging access request initiated by a client for certain client content, judging whether cache data corresponding to the client content is cached and whether that cache data has exceeded a preset expiration time, wherein the cache data was acquired from a data source and established when the client content was previously accessed; and if the cache data corresponding to the client content is cached and the current cache data has not exceeded the preset expiration time, loading the corresponding data content based on the current cache data and the page specified by the paging access request, and feeding the data content back to the client. The technical scheme has the following beneficial effect: it solves the problem of duplicate data appearing when a user turns pages while the data is being updated, in scenarios where the update frequency is high and the data is re-sorted on every update.

Description

Method and device for loading client content in page-by-page manner
Technical Field
The invention relates to the technical field of internet, in particular to a method and a device for loading client content in a page-by-page manner.
Background
Currently, list data in network applications (on both PCs (personal computers) and mobile phones) is usually obtained directly from a unified data source, and page turning generally uses one of the following two methods: (1) Offset-based: the start position of the data to be read is calculated from the page number to be browsed and the number of items per page, i.e. start position = (page number to be browsed − 1) × items per page. For example, with 20 items per page and positions starting at 0, the first page takes the data at positions 0–19, the second page takes positions 20–39, and so on. (2) Cursor-based: the start position of the next load is determined by the ID (identification number) of the last item on the current page; the data to be read is the data whose ID is greater than (when sorted in ascending order) or less than (when sorted in descending order) the ID of that last item. For example, a microblog feed (an interface for receiving information-source updates, as in RSS (Really Simple Syndication)) list is sorted by time in descending order; since microblog IDs are unique and increase with time, the implementation sorts by ID in descending order. When the first page is loaded, the first 20 items of the sorted data are taken directly; when the second page is loaded, the ID of the last microblog on the first page is passed in, and the first 20 items with smaller IDs are taken; the third page and beyond work in the same way.
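The two page-turning schemes above can be sketched as follows. This is an illustrative sketch only; the names (`offset_page`, `cursor_page`, `PAGE_SIZE`) are assumptions, not anything defined by the patent:

```python
PAGE_SIZE = 20  # items per page, as in the example above

def offset_page(items, page_number):
    """Scheme (1): start position = (page_number - 1) * items per page."""
    start = (page_number - 1) * PAGE_SIZE
    return items[start:start + PAGE_SIZE]

def cursor_page(items, last_id=None):
    """Scheme (2): sort by ID descending (IDs increase with time) and take
    the first PAGE_SIZE items whose ID is smaller than the previous page's
    last ID; pass last_id=None to load the first page."""
    ordered = sorted(items, key=lambda it: it["id"], reverse=True)
    if last_id is not None:
        ordered = [it for it in ordered if it["id"] < last_id]
    return ordered[:PAGE_SIZE]
```

With 50 items, scheme (2) returns IDs 50–31 for the first page and, given `last_id=31`, IDs 30–11 for the second.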
The existing paging technology obtains data from a unified data source (a database or other storage) and can satisfy most list-paging requirements, including continuously growing time-ordered lists (for example, a microblog feed list in reverse chronological order, where newly inserted data always appears at the top) and lists sorted by a heat value that rarely changes (for example, a hot-microblog list for a topic that is updated once per hour). However, in a sorting scenario where the heat value changes continuously over a short time, updating the data source while the user turns pages or lazy-loads (when the user's mobile phone or browser scrolls to the bottom of the screen, the next page is loaded automatically and appended at the bottom) can cause content on the next page to repeat data from the previous page; with lazy loading on a mobile phone, this hurts the user experience even more.
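A minimal sketch of this failure mode (with hypothetical item labels and a page size of 3): when the heat-value ordering changes between two requests, the offset for page 2 is computed against a different list, and an item from page 1 reappears.

```python
PAGE_SIZE = 3

source = ["A", "B", "C", "D", "E", "F"]   # sorted by heat value
page1 = source[0:PAGE_SIZE]               # user sees A, B, C

# Heat values change before the user turns the page: D jumps to the top.
source = ["D", "A", "B", "C", "E", "F"]
page2 = source[PAGE_SIZE:2 * PAGE_SIZE]   # user now sees C, E, F

assert "C" in page1 and "C" in page2      # "C" appears on both pages
```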
Disclosure of Invention
The embodiments of the invention provide a method and a device for loading client content page by page, aiming to solve the problem, during client loading, of duplicate data appearing when a user turns pages while the data is being updated, in scenarios where the update frequency is high and the data is re-sorted on every update.
In one aspect, an embodiment of the present invention provides a method for loading client content in a page-wise manner, where the method includes:
when a server receives a paging access request initiated by a client for certain client content, judging whether cache data corresponding to the client content is cached and whether that cache data has exceeded a preset expiration time, wherein the cache data was acquired from a data source and established when the client content was previously accessed;
and if the cache data corresponding to the client content is cached and the current cache data has not exceeded the preset expiration time, loading the corresponding data content based on the current cache data and the page specified by the paging access request, and feeding the data content back to the client.
On the other hand, an embodiment of the present invention provides an apparatus for loading client contents in a page-wise manner, where the apparatus is arranged at a server, and the apparatus includes:
the system comprises a receiving unit, a processing unit and a processing unit, wherein the receiving unit is used for receiving a paging access request initiated by a client aiming at a certain client content;
a judging unit, used for judging whether cache data corresponding to the client content is cached and whether the cache data has exceeded a preset expiration time, wherein the cache data was acquired from a data source and established when the client content was previously accessed;
and the first loading unit is used for loading the corresponding data content based on the current cache data and the specified paging of the paging access request and feeding back the data content to the client if the judgment result of the judging unit is positive.
The technical scheme has the following beneficial effect: it solves the problem of duplicate data appearing when a user turns pages while the data is being updated, in scenarios where the update frequency is high and the data is re-sorted on every update. When a user browses client content (the client content may be page content or the like) for the first time, the client content is obtained from the data source and a cache with a preset expiration time is established for it; depending on the actual scenario, the cache may be one per user or one shared by multiple users. When the user browses the second page of content, the cached data is used, so even if the underlying data has changed, the data source and ordering used for the second page are consistent with those used for the first page, ensuring that the user does not see the same content on the second page as on the first. When the user browses or refreshes the first page of content, the server judges whether the cached data has exceeded a preset data update time; if it has exceeded this relatively short update time, the content is recalculated according to the service logic and the cached data is updated, so that the data the user sees stays fresh.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description show only some embodiments of the present invention, and those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a flowchart of a method for loading client contents in a page-wise manner according to an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of an apparatus for loading client contents page by page according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of a first loading unit according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As shown in fig. 1, a flowchart of a method for loading client content in a page-wise manner according to an embodiment of the present invention is shown, where the method includes:
101. when a server receives a paging access request initiated by a client for certain client content, judge whether cache data corresponding to the client content is cached and whether that cache data has exceeded a preset expiration time, wherein the cache data was acquired from a data source and established when the client content was previously accessed;
102. if the cache data corresponding to the client content is cached and the current cache data has not exceeded the preset expiration time, load the corresponding data content based on the current cache data and the page specified by the paging access request, and feed the data content back to the client.
Preferably, the loading and feeding back the data content corresponding to the specified page based on the current cached data and the page access request to the client specifically includes: judging whether the specified page of the page access request is a first page; and if the specified page of the paging access request is not the first page, loading corresponding data content from the current cache data according to the specified page of the paging access request and feeding back the data content to the client.
Preferably, the method further comprises: if the specified page of the page access request is a first page, further judging whether the current cache data exceeds a preset data updating time, wherein the preset data updating time is less than the preset expiration time; if the current cache data does not exceed the preset data updating time, loading the data content of the first page from the current cache data and feeding back the data content to the client; and if the current cache data exceeds the preset data updating time, acquiring the client content from a data source, reestablishing the cache data corresponding to the client content, loading the data content of the first page from the reestablished cache data and feeding back the data content to the client.
Preferably, the method further comprises: if the cache data corresponding to the client content is not cached or the current cache data exceeds the preset expiration time, the client content is obtained from a data source and the cache data corresponding to the client content is reestablished, and the corresponding data content is loaded from the reestablished cache data according to the specified paging of the paging access request and fed back to the client.
Preferably, the data source comprises a database, or other storage device.
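A minimal sketch of this decision flow, combining steps 101–102 with the first-page and update-time checks of the preferred embodiments. The function names, the in-memory dict cache, and the constant values are assumptions (the TTLs mirror the 15-minute / 5-minute example given later):

```python
import time

EXPIRATION = 15 * 60   # preset expiration time (assumed value)
UPDATE_TIME = 5 * 60   # preset data update time, shorter than EXPIRATION
PAGE_SIZE = 20

cache = {}  # client-content key -> (creation timestamp, full sorted item list)

def fetch_from_source(key):
    # Placeholder: recompute the sorted list from the data source per service logic.
    return list(range(100))

def load_page(key, page):
    now = time.time()
    entry = cache.get(key)
    # Rebuild when there is no usable cache (missing, or past the expiration
    # time), or when the first page is requested and the shorter data update
    # time has passed.
    expired = entry is None or now - entry[0] > EXPIRATION
    if expired or (page == 1 and now - entry[0] > UPDATE_TIME):
        entry = (now, fetch_from_source(key))
        cache[key] = entry
    # Every page within the expiration window slices the same cached snapshot,
    # so page 2 cannot repeat items already shown on page 1.
    start = (page - 1) * PAGE_SIZE
    return entry[1][start:start + PAGE_SIZE]
```

Calling `load_page` for pages 1 and 2 in quick succession serves both pages from the same snapshot; only a later first-page request past `UPDATE_TIME` triggers a rebuild.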
Corresponding to the above method embodiment, fig. 2 is a schematic structural diagram of an apparatus for loading client content page by page according to an embodiment of the present invention; the apparatus is arranged at a server and includes:
a receiving unit 21, configured to receive a paging access request initiated by a client for a certain client content;
a determining unit 22, configured to determine whether cache data corresponding to the client content is cached and whether the cache data has exceeded a preset expiration time, where the cache data was obtained from a data source and established when the client content was previously accessed;
the first loading unit 23 is configured to, if the determination result of the determining unit is yes, load the corresponding data content based on the current cache data and the specified page of the page access request and feed back the data content to the client.
Preferably, as shown in fig. 3, which is a schematic structural diagram of a first loading unit in an embodiment of the present invention, the first loading unit 23 specifically includes: a first determining module 231, configured to determine whether a specified page of the page access request is a first page; a first loading module 232, configured to, if the determination result of the first determining module is negative, load corresponding data content from the current cache data according to the specified page of the page access request and feed back the data content to the client.
Preferably, the first loading unit 23 further includes: a second determining module 233, configured to further determine whether the current cached data exceeds a preset data update time if the determination result of the first determining module is yes, where the preset data update time is less than the preset expiration time; a second loading module 234, configured to load the data content of the first page from the current cache data and feed back the data content to the client if the determination result of the second determining module is negative; if the judgment result of the second judgment module is yes, the client content is obtained from the data source, the cache data corresponding to the client content is reestablished, and the data content of the first page is loaded from the reestablished cache data and fed back to the client.
Preferably, the apparatus further comprises: a second loading unit 24, configured to, if the determination result of the determining unit 22 is negative, obtain the client content from the data source and reestablish the cache data corresponding to the client content, and load the corresponding data content from the reestablished cache data according to the specified page of the page access request and feed back the data content to the client.
Preferably, the data source comprises a database, or other storage device.
The following is exemplified by taking the client content as the page content:
when a user browses certain page content for the first time, this application example establishes, for the data corresponding to the paging access request initiated by the client for that page content, a cache with a preset expiration time; depending on the actual scenario, the cache may be one per user or one shared by multiple users. When the user browses the second page of content, the cached data is used, so even if the underlying data has changed, the data source and ordering used for the second page are consistent with those used for the first page, ensuring that the user does not see the same content on the second page as on the first. When the user browses or refreshes the first page of content, the server judges whether the cached data has exceeded a preset data update time; if it has exceeded this relatively short update time, the content is recalculated according to the service logic and the cached data is updated, so that the data the user sees stays fresh.
For example: a hot-topic mixed stream is a topic-word list calculated, using operationally recommended material, according to the interests, location, and topic list of the current visiting user. If data is obtained from a unified data source as in the prior art, then because the material data changes constantly, the content the user sees may differ on each access; when browsing on a mobile phone, after the first page of content is loaded and the second page is then loaded automatically, the data the user sees may repeat the data above it. With the technical scheme of the present application, when a user accesses the first page of content, all of the calculated topic-word list data is cached with a relatively long preset expiration time (for example, 15 minutes); within this expiration time, the data for the user's subsequent accesses to the second page and beyond is consistent with the data source calculated when the first page was accessed, so no duplicate content is produced. If the user accesses the first page again after the cache has exceeded a relatively short preset data update time (for example, 5 minutes), the cached data is recalculated to ensure that the data is updated in time.
The technical scheme has the following beneficial effect: it solves the problem of duplicate data appearing when a user turns pages while the data is being updated, in scenarios where the update frequency is high and the data is re-sorted on every update. When a user browses client content (the client content may specifically be page content or the like) for the first time, a cache with a preset expiration time is established for the data corresponding to the paging access request initiated by the client for that client content; depending on the actual scenario, the cache may be one per user or one shared by multiple users. When the user browses the second page of content, the cached data is used, so even if the underlying data has changed, the data source and ordering used for the second page are consistent with those used for the first page, ensuring that the user does not see the same content on the second page as on the first. When the user browses or refreshes the first page of content, the server judges whether the cached data has exceeded a preset data update time; if it has exceeded this relatively short update time, the content is recalculated according to the service logic and the cached data is updated, so that the data the user sees stays fresh.
It should be understood that the specific order or hierarchy of steps in the processes disclosed is an example of exemplary approaches. Based upon design preferences, it is understood that the specific order or hierarchy of steps in the processes may be rearranged without departing from the scope of the present disclosure. The accompanying method claims present elements of the various steps in a sample order, and are not intended to be limited to the specific order or hierarchy presented.
In the foregoing detailed description, various features are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments of the subject matter require more features than are expressly recited in each claim. Rather, as the following claims reflect, invention lies in less than all features of a single disclosed embodiment. Thus, the following claims are hereby expressly incorporated into the detailed description, with each claim standing on its own as a separate preferred embodiment of the invention.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the disclosure. Thus, the present disclosure is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
What has been described above includes examples of one or more embodiments. It is, of course, not possible to describe every conceivable combination of components or methodologies for purposes of describing the aforementioned embodiments, but one of ordinary skill in the art may recognize that many further combinations and permutations of various embodiments are possible. Accordingly, the embodiments described herein are intended to embrace all such alterations, modifications and variations that fall within the scope of the appended claims. Furthermore, to the extent that the term "includes" is used in either the detailed description or the claims, such term is intended to be inclusive in a manner similar to the term "comprising" as "comprising" is interpreted when employed as a transitional word in a claim. Furthermore, any use of the term "or" in the specification of the claims is intended to mean a "non-exclusive or".
Those of skill in the art will further appreciate that the various illustrative logical blocks, units, and steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate the interchangeability of hardware and software, various illustrative components, elements, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design requirements of the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present embodiments.
The various illustrative logical blocks, or elements, described in connection with the embodiments disclosed herein may be implemented or performed with a general purpose processor, a digital signal processor, an Application Specific Integrated Circuit (ASIC), a field programmable gate array or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general-purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a digital signal processor and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a digital signal processor core, or any other similar configuration.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may be stored in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. For example, a storage medium may be coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC, which may be located in a user terminal. In the alternative, the processor and the storage medium may reside in different components in a user terminal.
In one or more exemplary designs, the functions described above in connection with the embodiments of the invention may be implemented in hardware, software, firmware, or any combination of the three. If implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Computer-readable media include both computer storage media and communication media that facilitate transfer of a computer program from one place to another. Storage media may be any available media that can be accessed by a general-purpose or special-purpose computer. For example, such computer-readable media can include, but are not limited to, RAM, ROM, EEPROM, CD-ROM or other optical disc storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to carry or store program code in the form of instructions or data structures and which can be read by a general-purpose or special-purpose computer or processor. Additionally, any connection is properly termed a computer-readable medium; thus, if the software is transmitted from a website, server, or other remote source via a coaxial cable, fiber-optic cable, twisted pair, Digital Subscriber Line (DSL), or wirelessly, e.g., by infrared, radio, or microwave, those media are included in the definition. Disk and disc, as used here, include compact disc, laser disc, optical disc, DVD, floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above may also be included within computer-readable media.
The above-mentioned embodiments are intended to illustrate the objects, technical solutions and advantages of the present invention in further detail, and it should be understood that the above-mentioned embodiments are merely exemplary embodiments of the present invention, and are not intended to limit the scope of the present invention, and any modifications, equivalent substitutions, improvements and the like made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (6)

1. A method for paging client content, the method comprising:
when a server receives a paging access request initiated by a client for certain client content, judging whether cache data corresponding to the client content is cached and whether that cache data has exceeded a preset expiration time, wherein the cache data was acquired from a data source and established when the client content was previously accessed;
if the cache data corresponding to the client content is cached and the current cache data has not exceeded the preset expiration time, loading the corresponding data content based on the current cache data and the page specified by the paging access request, and feeding the data content back to the client; which specifically comprises the following steps:
judging whether the specified page of the page access request is a first page;
if the page specified by the paging access request is not the first page, loading the corresponding data content from the current cache data according to the specified page and feeding the data content back to the client;
if the specified page of the page access request is a first page, further judging whether the current cache data exceeds a preset data updating time, wherein the preset data updating time is less than the preset expiration time;
if the current cache data does not exceed the preset data updating time, loading the data content of the first page from the current cache data and feeding back the data content to the client;
and if the current cache data exceeds the preset data updating time, acquiring the client content from a data source, reestablishing the cache data corresponding to the client content, loading the data content of the first page from the reestablished cache data and feeding back the data content to the client.
2. The method of paginating client content according to claim 1, wherein the method further comprises:
if the cache data corresponding to the client content is not cached or the current cache data exceeds the preset expiration time, the client content is obtained from a data source and the cache data corresponding to the client content is reestablished, and the corresponding data content is loaded from the reestablished cache data according to the specified paging of the paging access request and fed back to the client.
3. The method for pagination loading of client content according to claim 1, wherein the data source comprises a database, or other storage device.
4. An apparatus for paging and loading client contents, which is arranged at a server side, is characterized in that the apparatus comprises:
a receiving unit, used for receiving a paging access request initiated by a client for certain client content;
a judging unit, used for judging whether cache data corresponding to the client content is cached and whether the cache data has exceeded a preset expiration time, wherein the cache data was acquired from a data source and established when the client content was previously accessed;
the first loading unit is used for loading corresponding data content based on the current cache data and the specified paging of the paging access request and feeding back the data content to the client if the judgment result of the judging unit is positive;
the first loading unit specifically includes: the first judgment module is used for judging whether the specified paging of the paging access request is a first page or not;
the first loading module is used for loading corresponding data content from the current cache data according to the specified paging of the paging access request and feeding back the data content to the client if the judgment result of the first judging module is negative;
the second judging module is used for further judging whether the current cache data exceeds preset data updating time if the judging result of the first judging module is yes, wherein the preset data updating time is less than the preset expiration time;
the second loading module is used for loading the data content of the first page from the current cache data and feeding back the data content to the client if the judgment result of the second judgment module is negative; if the judgment result of the second judgment module is yes, the client content is obtained from the data source, the cache data corresponding to the client content is reestablished, and the data content of the first page is loaded from the reestablished cache data and fed back to the client.
5. The apparatus for page-by-page loading of client content according to claim 4, wherein the apparatus further comprises:
a second loading unit, configured to, if the judgment result of the judging unit is negative, acquire the client content from a data source, reestablish the cache data corresponding to the client content, load the corresponding data content from the reestablished cache data according to the specified page of the paging access request, and feed it back to the client.
6. The apparatus for page-by-page loading of client content according to claim 4, wherein the data source comprises a database or other storage device.
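The claimed scheme amounts to a two-tier TTL cache: a longer "expiration time" that forces a refresh on any cache miss or expiry (claim 2), and a shorter "data updating time" that forces a refresh only when the first page is requested (claim 1). A minimal sketch of that decision flow follows; all names (`PagedCache`, `fetch_from_source`) and the TTL values are illustrative assumptions, not taken from the patent.

```python
import time

class PagedCache:
    """Illustrative sketch of the claimed paging cache (names hypothetical).

    Two thresholds govern each cached entry:
      - update_ttl: shorter; exceeding it triggers a refresh, but only when
        the first page is requested (claim 1);
      - expire_ttl: longer; exceeding it, or a cache miss, triggers a refresh
        for any requested page (claim 2).
    """

    def __init__(self, fetch_from_source, page_size=10,
                 update_ttl=60, expire_ttl=600):
        self.fetch = fetch_from_source  # data-source loader, e.g. a database query (claim 3)
        self.page_size = page_size
        self.update_ttl = update_ttl
        self.expire_ttl = expire_ttl
        self.store = {}                 # content_id -> (created_at, items)

    def _rebuild(self, content_id):
        # Acquire the client content from the data source and reestablish the cache.
        items = self.fetch(content_id)
        self.store[content_id] = (time.time(), items)
        return items

    def get_page(self, content_id, page):
        entry = self.store.get(content_id)
        now = time.time()
        if entry is None or now - entry[0] > self.expire_ttl:
            items = self._rebuild(content_id)   # cache miss or expired: refresh for any page
        elif page == 1 and now - entry[0] > self.update_ttl:
            items = self._rebuild(content_id)   # first page requested and data is stale: refresh
        else:
            items = entry[1]                    # serve directly from current cache data
        start = (page - 1) * self.page_size
        return items[start:start + self.page_size]
```

Because only first-page requests check the shorter TTL, a client scrolling through later pages keeps seeing one consistent snapshot of the list, while a client who returns to the top gets reasonably fresh data without refreshing on every request.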
CN201611116729.0A 2016-12-07 2016-12-07 Method and device for loading client content in page-by-page manner Active CN106708568B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611116729.0A CN106708568B (en) 2016-12-07 2016-12-07 Method and device for loading client content in page-by-page manner

Publications (2)

Publication Number Publication Date
CN106708568A CN106708568A (en) 2017-05-24
CN106708568B true CN106708568B (en) 2020-03-27

Family

ID=58936084

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611116729.0A Active CN106708568B (en) 2016-12-07 2016-12-07 Method and device for loading client content in page-by-page manner

Country Status (1)

Country Link
CN (1) CN106708568B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107577701A (en) * 2017-07-26 2018-01-12 努比亚技术有限公司 A kind of data reordering method, sequence server and computer-readable recording medium
CN109241099A (en) * 2018-08-22 2019-01-18 中国平安人寿保险股份有限公司 A kind of data query method and terminal device
CN109885729B (en) * 2019-02-20 2021-07-20 北京奇艺世纪科技有限公司 Method, device and system for displaying data
CN111143414A (en) * 2019-12-26 2020-05-12 五八有限公司 Feedback method and device of cache data, electronic equipment and storage medium
CN111782304B (en) * 2020-07-21 2024-04-02 深圳赛安特技术服务有限公司 Paging loading data logic control method, device, computer equipment and medium

Citations (3)

Publication number Priority date Publication date Assignee Title
CN103425708A (en) * 2012-05-25 2013-12-04 金蝶软件(中国)有限公司 Optimized web paging query method and device
CN104850627A (en) * 2015-05-21 2015-08-19 北京京东尚科信息技术有限公司 Method and apparatus for performing paging display
CN104965717A (en) * 2014-06-05 2015-10-07 腾讯科技(深圳)有限公司 Method and apparatus for loading page

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US8880652B2 (en) * 2011-09-14 2014-11-04 Hewlett-Packard Development Company, L.P. Heuristic browser predictive pre-caching



Similar Documents

Publication Publication Date Title
CN106708568B (en) Method and device for loading client content in page-by-page manner
US10771946B2 (en) Dynamic types for activity continuation between electronic devices
US10389826B2 (en) Webpage pre-reading method, apparatus and smart terminal device
US8843608B2 (en) Methods and systems for caching popular network content
US9380123B2 (en) Activity continuation between electronic devices
KR101932395B1 (en) Activity continuation between electronic devices
US20150074289A1 (en) Detecting error pages by analyzing server redirects
US20160110414A1 (en) Information searching apparatus and control method thereof
US20220261407A1 (en) Search results based on subscription information
US10803232B2 (en) Optimizing loading of web page based on aggregated user preferences for web page elements of web page
US10158740B2 (en) Method and apparatus for webpage resource acquisition
US10061806B2 (en) Presenting previously selected search results
WO2017219524A1 (en) Page saving method and electronic device
WO2017076073A1 (en) Method and apparatus for search and recommendation
WO2012060866A1 (en) Determination of category information using multiple stages
WO2015081848A1 (en) Socialized extended search method and corresponding device and system
US20160092441A1 (en) File Acquiring Method and Device
CN103246713A (en) Web surfing method and web surfing device
WO2019041500A1 (en) Pagination realization method and device, computer equipment and storage medium
CN104462283A (en) Method, device and client for requesting webpage elements in mobile terminal
US10437830B2 (en) Method and apparatus for identifying media files based upon contextual relationships
CN113886683A (en) Label cluster construction method and system, storage medium and electronic equipment
CN107040454B (en) Unread message reminding method and device under large-data-volume quick updating scene
US20160315997A1 (en) File transfer method, device, and system
CN114528486A (en) Book recommendation method, server, system and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant