CN111339122A - Active caching method of travel platform, travel query method and related products - Google Patents


Info

Publication number
CN111339122A
CN111339122A (application CN202010092215.6A)
Authority
CN
China
Prior art keywords: travel, public, line, information, business trip
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010092215.6A
Other languages
Chinese (zh)
Other versions
CN111339122B (en)
Inventor
石路路
金长宁
刘纪方
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Yibo Software Technology Co ltd
Original Assignee
Nanjing Yibo Software Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Yibo Software Technology Co ltd filed Critical Nanjing Yibo Software Technology Co ltd
Priority to CN202010092215.6A priority Critical patent/CN111339122B/en
Publication of CN111339122A publication Critical patent/CN111339122A/en
Application granted granted Critical
Publication of CN111339122B publication Critical patent/CN111339122B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/24Querying
    • G06F16/242Query formulation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/24Querying
    • G06F16/245Query processing
    • G06F16/2455Query execution
    • G06F16/24552Database cache management
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/14Travel agencies

Abstract

The embodiments of the present application disclose an active caching method for a travel platform, a travel query method, and related products. The active caching method for the travel platform includes: when the query entry is the business-travel query entry, the travel platform acquires, from the business-trip application platform, travel information of trips on which the user has not yet departed; predicting, according to a business-travel active caching policy, the business-travel actively cached travel lines that match the user's not-yet-departed travel information; and acquiring data corresponding to the business-travel actively cached travel lines from a supplier according to a business-travel active refresh policy and storing the data in a cache. By acquiring the not-yet-departed travel information when the query entry is the business-travel query entry and predicting the matching travel lines according to the business-travel active caching policy, the technical solution provided by the embodiments improves how well the actively cached data matches the user's query operations and raises the active-cache hit rate for business-travel queries, thereby improving the user experience.

Description

Active caching method of travel platform, travel query method and related products
Technical Field
The present application relates to the field of network and computer technologies, and in particular, to an active caching method for a travel platform, a travel query method, and a related product.
Background
With the development of the travel industry and the Internet, travel information service providers such as Online Travel Agencies (OTAs) and Travel Management Companies (TMCs) have emerged; these providers offer travel query and booking services to users.
Large-scale computer information service systems applied to passenger transportation and the travel industry, such as a Global Distribution System (GDS), can obtain large amounts of travel information in a timely manner from transportation companies, hotels, car rental companies, travel agencies, and the like; such systems are collectively referred to as suppliers in this application and can provide customers with fast, convenient, and reliable travel information and services. The travel modes involved may include one or more of the following: travel by airplane, high-speed rail, car, or ship. In the following description, travel by airplane is taken as an example; the other travel modes are handled similarly and are not described one by one.
When querying air tickets, a user can search route information through the platform of a travel information service provider. When many users query simultaneously, the high query volume puts great pressure on the suppliers and can also cause response delays that degrade the user experience. For example, if a query for an international route returns 100 flights and all 100 flights must be fetched through real-time access, the time to assemble the final result depends on the slowest request; from the user's perspective, more than 20 seconds can pass between submitting the query and seeing the result page.
To reduce the pressure on suppliers, current platforms generally query air tickets by combining active-cache queries with passive real-time queries.
Active caching predicts in advance the travel lines the user is most likely to query, sends query requests to the supplier according to a preset refresh policy, and stores the returned information in the cache. Passive real-time querying sends a real-time acquisition request to the supplier based on the query conditions.
At query time, if the user's query matches information in the cache, an active-cache hit occurs and the average query page is displayed within 1 to 4 seconds; otherwise, a real-time query request is sent to the supplier to acquire travel information, and the average query page takes more than 20 seconds to display.
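As an illustration only, the combination of the two query modes can be sketched roughly as follows; `cache`, `supplier`, and their methods are hypothetical placeholders for the platform's cache store and supplier gateway, not names taken from the patent.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Query:
    origin: str
    destination: str
    depart_date: str  # e.g. "2019-12-18"

def query_itineraries(cache, supplier, query: Query):
    """Return itinerary data, preferring the active cache over a real-time call."""
    key = (query.origin, query.destination, query.depart_date)
    cached = cache.get(key)
    if cached is not None:
        # Active-cache hit: the page is displayed within roughly 1-4 seconds.
        return cached
    # Cache miss: fall back to a passive real-time query against the supplier
    # (over 20 seconds on average), then store the result for later queries.
    data = supplier.fetch_realtime(query)
    cache.put(key, data)
    return data
```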
At present, the contents of the active cache are mainly determined by processing the query information of all users, and practice shows that the active cache in the prior art has a low hit rate.
Disclosure of Invention
The embodiments of the present application provide an active caching method for a travel platform, a travel query method, and related products, which help to improve the hit rate of the active cache.
In a first aspect, an embodiment of the present application provides an active caching method for a travel platform, including: when the query entry is the business-travel query entry, the travel platform acquires, from the business-trip application platform, travel information of trips on which the user has not yet departed; predicting, according to a business-travel active caching policy, the business-travel actively cached travel lines that match the user's not-yet-departed travel information, where a travel line includes one or more of the following: a flight route for travel by airplane, a train line for travel by high-speed rail, a bus line for travel by car, or a shipping line for travel by ship; and acquiring data corresponding to the business-travel actively cached travel lines from a supplier according to a business-travel active refresh policy and storing the data in the cache.
In the technical solution provided by this embodiment, when the query entry is the business-travel query entry, the travel information of trips the user has not yet departed on is acquired, the matching travel lines are predicted according to the business-travel active caching policy, and the corresponding data is acquired from the supplier according to the business-travel active refresh policy and stored in the cache. This helps the cached data match the user's query operations, improves the active-cache hit rate for business-travel queries, and thus improves the user experience.
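For orientation only, the three steps of the first aspect can be sketched as follows; every object and method name (`trip_application_platform.get_pending_trips`, `predictor.predict_business_travel_lines`, and so on) is a hypothetical stand-in for a component described in the text.

```python
def business_travel_active_cache(trip_application_platform, predictor, supplier, cache):
    # Step 1: when the query entry is the business-travel entry, obtain the
    # user's not-yet-departed trips from the business-trip application platform.
    pending_trips = trip_application_platform.get_pending_trips()

    # Step 2: predict the travel lines to cache for those trips according to
    # the business-travel active caching policy.
    lines = predictor.predict_business_travel_lines(pending_trips)

    # Step 3: fetch the corresponding data from the supplier according to the
    # business-travel active refresh policy and store it in the cache.
    for line in lines:
        cache.put(line, supplier.fetch(line))
```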
Based on the first aspect, in some possible embodiments of the present application, the method further includes: predicting first information according to a personal-travel prediction policy, where the first information includes the hot travel lines searched for personal travel and the refresh frequency of those hot travel lines; and acquiring data corresponding to the hot travel lines according to the first information and storing the data in the cache.
In some possible embodiments, the personal-travel prediction policy may be executed at a specified time each day, for example, at midnight each day.
In some possible embodiments, the first information may further include an upper limit on the number of refreshes from each supplier, that is, an upper limit on the number of times query data may be obtained from that supplier. For example, the first information may specify an upper limit of 20,000 refreshes per day for supplier A, meaning that query data may be obtained from supplier A at most 20,000 times per day.
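The "first information" produced by the personal-travel prediction policy could be represented, for example, by a structure like the following; the 20,000-per-day figure follows the supplier A example above, while the route name and refresh frequency are purely illustrative.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class FirstInformation:
    # Hot travel lines searched for personal travel, e.g. city pairs.
    hot_lines: List[str]
    # Refresh frequency per hot line (refreshes per day, illustrative unit).
    refresh_frequency: Dict[str, int]
    # Optional upper limit on refreshes per supplier per day.
    supplier_refresh_cap: Dict[str, int]

example = FirstInformation(
    hot_lines=["Beijing-Shanghai"],
    refresh_frequency={"Beijing-Shanghai": 48},
    supplier_refresh_cap={"supplier_a": 20_000},  # at most 20,000 fetches per day
)
```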
This describes the personal-travel prediction policy of the technical solution provided by this embodiment. Acquiring the data corresponding to the hot travel lines from the supplier according to the predicted first information and storing it in the cache helps to improve the active-cache hit rate at query time.
Based on the first aspect, in some possible embodiments of the present application, predicting, according to the business-travel active caching policy, the business-travel actively cached travel lines that match the user's not-yet-departed travel information includes: acquiring the user's historical business-trip information from a business-travel database; and predicting, from that historical information, the business-travel actively cached travel lines that match the user's not-yet-departed travel information.
This describes how the actively cached travel lines are predicted for business-travel queries: they are predicted from the user's historical business-trip information obtained from the business-travel database. Because the user's preferences can be derived from the historical travel information, the predicted actively cached travel lines better match the searches the user is likely to perform, which helps to improve the active-cache hit rate.
Based on the first aspect, in some possible embodiments of the present application, predicting the business-travel actively cached travel lines that match the user's not-yet-departed travel information further includes: when the business-travel database contains no historical travel information for the user, determining preference information of other users from the historical travel information of other users stored in the business-travel database; and predicting, according to the preference information of the other users, the business-travel actively cached travel lines that match the user's not-yet-departed travel information.
In this technical solution, when the business-travel database has no historical travel information for the user, preference information of other users is determined from their historical travel information in the database, and the business-travel actively cached travel lines are predicted from those preferences. This implementation helps to improve the hit rate of the business-travel active cache.
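A minimal sketch of this prediction step under stated assumptions: `business_db`, the trip attributes, and the placeholder helpers `derive_preferences` and `match_lines` are all hypothetical, since the patent does not define them.

```python
def derive_preferences(history):
    # Placeholder: in practice this would aggregate routing choices, departure
    # offsets, preferred suppliers, etc. from the historical trip records.
    return {"records": list(history)}

def match_lines(trip, preferences):
    # Placeholder: map one pending trip plus the derived preferences to the
    # concrete travel line (route and date pair) that should be cached.
    return (trip.departure_city, trip.arrival_city, trip.departure_date)

def predict_cached_lines(business_db, user_id, pending_trips):
    """Predict the business-travel lines to actively cache for one user's pending trips."""
    history = business_db.get_history(user_id)
    if history:
        preferences = derive_preferences(history)  # the user's own history
    else:
        # No history for this user: fall back to preferences derived from the
        # historical records of other users in the business-travel database.
        preferences = derive_preferences(business_db.get_all_history())
    return [match_lines(trip, preferences) for trip in pending_trips]
```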
Based on the first aspect, in some possible embodiments of the present application, acquiring the data corresponding to the business-travel actively cached travel lines from the supplier according to the business-travel active refresh policy and storing it in the cache includes: when the data of a business-travel actively cached travel line changes, acquiring the corresponding data from the supplier in real time and storing it in the cache.
This technical solution specifies the refresh behavior for the business-travel actively cached travel lines: acquiring the corresponding data from the supplier in real time keeps the cached data for those travel lines up to date, which helps to reduce the failure rate caused by stale cached data.
Based on the first aspect, in some possible embodiments of the present application, with respect to predicting the refresh frequency of a hot travel line, the personal-travel prediction policy specifically includes: adjusting the refresh frequency of the hot travel line according to the historical number of changes of that hot travel line.
In some possible embodiments, adjusting the refresh frequency of a hot travel line according to its historical number of changes may include: predicting the number of changes of the hot travel line within a fixed time slot of the current day from its historical number of changes within that time slot, determining the popularity value of the hot travel line for that time slot from the predicted number of changes, and determining the refresh frequency of the hot travel line for that time slot from its popularity value. It can be understood that the higher a line ranks by popularity value, the higher its refresh frequency.
In this technical solution, the refresh frequency for the personal-travel active cache is specified: determining the refresh frequency of a hot travel line for the current day from its historical number of changes optimizes the refresh frequency so that it better matches how the hot travel line actually changes, which helps to reduce the failure rate caused by stale cached data.
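One way such an adjustment could work is sketched below: the predicted change count per route ranks the routes, and the refresh count for the time slot is interpolated from that rank. The mean-based prediction and the max/min bounds are illustrative assumptions, not values from the patent.

```python
def refresh_frequencies(change_history, max_per_slot=6, min_per_slot=1):
    """
    change_history maps each hot route to its change counts within one fixed
    time slot over past days; returns today's refresh count per route for that slot.
    """
    # Predict today's change count from the historical counts (mean, for illustration).
    predicted = {route: sum(counts) / len(counts) for route, counts in change_history.items()}
    # Rank routes by predicted change count (their "popularity value"); the
    # higher a route ranks, the more often it is refreshed within the slot.
    ranked = sorted(predicted, key=predicted.get, reverse=True)
    n = len(ranked)
    freqs = {}
    for rank, route in enumerate(ranked):
        share = 1.0 if n == 1 else 1.0 - rank / (n - 1)
        freqs[route] = round(min_per_slot + share * (max_per_slot - min_per_slot))
    return freqs

# e.g. refresh_frequencies({"PEK-SHA": [6, 8, 7], "PEK-CAN": [2, 1, 3]})
# -> {"PEK-SHA": 6, "PEK-CAN": 1}
```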
Based on the first aspect, in some possible embodiments of the present application, with respect to predicting the upper limit on the number of refreshes for a supplier, the personal-travel prediction policy specifically includes: predicting the supplier's number of Passenger Name Records (PNRs, i.e., passenger booking records) from the supplier's historical PNR counts, and determining the upper limit on the supplier's refresh count by combining the supplier's look-to-book ratio with the predicted PNR count.
In this technical solution, the upper limit on the supplier's refresh count is determined from the supplier's historical PNR counts and look-to-book ratio, which helps to reduce query costs.
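As an arithmetic illustration (the specific numbers are assumptions, not figures from the patent), the cap could be computed as the predicted PNR count multiplied by the look-to-book ratio:

```python
def supplier_refresh_cap(historical_pnr_counts, look_to_book_ratio):
    """
    Estimate today's refresh-count upper limit for one supplier as the
    predicted PNR count (here, the mean of historical daily PNR counts)
    multiplied by the supplier's look-to-book ratio (queries per booking).
    """
    predicted_pnr = sum(historical_pnr_counts) / len(historical_pnr_counts)
    return int(predicted_pnr * look_to_book_ratio)

# Hypothetical numbers: ~400 PNRs/day predicted and a look-to-book ratio of
# 50 queries per booking cap refreshes at about 20,000 per day.
print(supplier_refresh_cap([380, 420, 400], 50))  # -> 20000
```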
In a second aspect, an embodiment of the present application provides a travel query method, including: when the query entry is the business-travel query entry, the travel platform acquires, from the business-trip application platform, travel information of trips on which the user has not yet departed; predicting, according to a business-travel active caching policy, the business-travel actively cached travel lines that match the user's not-yet-departed travel information, where a travel line includes one or more of the following: a flight route for travel by airplane, a train line for travel by high-speed rail, a bus line for travel by car, or a shipping line for travel by ship; and acquiring data corresponding to the business-travel actively cached travel lines from a supplier according to a business-travel active refresh policy and storing the data in the cache.
In the technical solution provided by this embodiment, when the query entry is the business-travel query entry, the travel information of trips the user has not yet departed on is acquired, the matching travel lines are predicted according to the business-travel active caching policy, and the corresponding data is acquired from the supplier according to the business-travel active refresh policy and stored in the cache. This helps the cached data match the user's query operations, improves the active-cache hit rate for business-travel queries, and thus improves the user experience.
Based on the second aspect, in some possible embodiments of the present application, the travel query method may further include: predicting first information according to a personal-travel prediction policy, where the first information includes the hot travel lines searched for personal travel and the refresh frequency of those hot travel lines; and acquiring data corresponding to the hot travel lines according to the first information and storing the data in the cache.
In some possible embodiments, the personal-travel prediction policy may be executed at a specified time each day, for example, at midnight each day.
In some possible embodiments, the first information may further include an upper limit on the number of refreshes from each supplier.
This describes the personal-travel prediction policy of the technical solution provided by this embodiment. Acquiring the data corresponding to the hot travel lines from the supplier according to the predicted first information and storing it in the cache helps to improve the active-cache hit rate at query time.
Based on the second aspect, in some possible embodiments of the present application, after the data corresponding to the business-travel actively cached travel lines has been acquired from the supplier according to the business-travel active refresh policy and stored in the cache, the method further includes: when the user triggers a business-travel query operation in the business-travel query interface, searching the cache for data matching that query operation; if such data exists, displaying it; if not, sending a first real-time query request to the supplier according to the query conditions, acquiring and displaying the data of the business-travel passively cached travel lines matching the first real-time query request, and acquiring the data corresponding to those passively cached travel lines from the supplier according to a business-travel passive refresh policy and storing it in the cache.
In this technical solution, after the data corresponding to the business-travel actively cached travel lines has been stored in the cache, the matching data is displayed whenever a triggered business-travel query operation finds it in the cache; if no match is found, an acquisition request is sent to the supplier according to the query conditions, the data of the matching business-travel passively cached travel lines is acquired and displayed, and that data is also acquired from the supplier according to the business-travel passive refresh policy and stored in the cache. This allows the needed information to be returned promptly when the cache matches the query during a business-travel query, while still guaranteeing that the user can obtain the needed information when the cache does not match.
Based on the second aspect, in some possible embodiments of the present application, when the query entry is the personal-travel query entry, the method further includes: when the user triggers a personal-travel query operation in the personal-travel query interface, searching the cache for data matching that query operation; if such data exists, displaying it; if not, sending a second real-time query request to the supplier according to the query conditions, acquiring and displaying the data of the personal-travel passively cached travel lines matching the second real-time query request, and acquiring the data corresponding to those passively cached travel lines from the supplier according to a personal-travel passive refresh policy and storing it in the cache.
This describes the case where the query entry is the personal-travel query entry: when the user triggers a personal-travel query operation in the personal-travel query interface, the cache is searched for matching data; if it exists, it is displayed; if not, a second real-time query request is sent to the supplier according to the query conditions, the data of the matching personal-travel passively cached travel lines is acquired and displayed, and that data is also acquired from the supplier according to the personal-travel passive refresh policy and stored in the cache. This allows the needed information to be returned promptly when the active cache matches the query during a personal-travel query, while still guaranteeing that the user can obtain the needed information when it does not match.
Based on the second aspect, in some possible embodiments of the present application, predicting, according to the business-travel active caching policy, the business-travel actively cached travel lines that match the user's not-yet-departed travel information includes: acquiring the user's historical business-trip information from a business-travel database; and predicting, from that historical information, the business-travel actively cached travel lines that match the user's not-yet-departed travel information.
This describes how the actively cached travel lines are predicted for business-travel queries: they are predicted from the user's historical business-trip information obtained from the business-travel database. Because the user's preferences can be derived from the historical travel information, the predicted actively cached travel lines better match the searches the user is likely to perform, which helps to improve the active-cache hit rate.
Based on the second aspect, in some possible embodiments of the present application, predicting the business-travel actively cached travel lines that match the user's not-yet-departed travel information further includes: when the business-travel database contains no historical travel information for the user, determining preference information of other users from the historical travel information of other users stored in the business-travel database; and predicting, according to the preference information of the other users, the business-travel actively cached travel lines that match the user's not-yet-departed travel information.
In this technical solution, when the business-travel database has no historical travel information for the user, preference information of other users is determined from their historical travel information in the database, and the business-travel actively cached travel lines are predicted from those preferences. This implementation helps to improve the hit rate of the business-travel active cache.
Based on the second aspect, in some possible embodiments of the present application, acquiring the data corresponding to the business-travel actively cached travel lines from the supplier according to the business-travel active refresh policy and storing it in the cache includes: when the data of a business-travel actively cached travel line changes, acquiring the corresponding data from the supplier in real time and storing it in the cache.
This technical solution specifies the refresh behavior for the business-travel actively cached travel lines: acquiring the corresponding data from the supplier in real time keeps the cached data for those travel lines up to date, which helps to reduce the failure rate caused by stale cached data.
Based on the second aspect, in some possible embodiments of the present application, with respect to predicting the refresh frequency of a hot travel line, the personal-travel prediction policy specifically includes: adjusting the refresh frequency of the hot travel line according to the historical number of changes of that hot travel line.
In some possible embodiments, adjusting the refresh frequency of a hot travel line according to its historical number of changes may include: predicting the number of changes of the hot travel line within a fixed time slot of the current day from its historical number of changes within that time slot, determining the popularity value of the hot travel line for that time slot from the predicted number of changes, and determining the refresh frequency of the hot travel line for that time slot from its popularity value. It can be understood that the higher a line ranks by popularity value, the higher its refresh frequency.
In this technical solution, the refresh frequency for the personal-travel active cache is specified: determining the refresh frequency of a hot travel line for the current day from its historical number of changes optimizes the refresh frequency so that it better matches how the hot travel line actually changes, which helps to reduce the failure rate caused by stale cached data.
Based on the second aspect, in some possible embodiments of the present application, with respect to predicting the upper limit on the number of refreshes for a supplier, the personal-travel prediction policy specifically includes: predicting the supplier's number of Passenger Name Records (PNRs) from the supplier's historical PNR counts, and determining the upper limit on the supplier's refresh count by combining the supplier's look-to-book ratio with the predicted PNR count.
In this technical solution, the upper limit on the supplier's refresh count is determined from the supplier's historical PNR counts and look-to-book ratio, which helps to reduce query costs.
In a third aspect, an embodiment of the present application provides an active caching apparatus, including: a first acquisition unit, configured to acquire, from the business-trip application platform, travel information of trips on which the user has not yet departed when the query entry is the business-travel query entry; a first prediction unit, configured to predict, according to a business-travel active caching policy, the business-travel actively cached travel lines that match the user's not-yet-departed travel information, where a travel line includes one or more of the following: a flight route for travel by airplane, a train line for travel by high-speed rail, a bus line for travel by car, or a shipping line for travel by ship; and a first processing unit, configured to acquire data corresponding to the business-travel actively cached travel lines from a supplier according to a business-travel active refresh policy and store the data in the cache.
In the technical solution provided by this embodiment, when the query entry is the business-travel query entry, the travel information of trips the user has not yet departed on is acquired, the matching travel lines are predicted according to the business-travel active caching policy, and the corresponding data is acquired from the supplier according to the business-travel active refresh policy and stored in the cache. This helps the cached data match the user's query operations, improves the active-cache hit rate for business-travel queries, and thus improves the user experience.
Based on the third aspect, in some possible embodiments of the present application, the apparatus further includes: a second prediction unit, configured to predict first information according to a personal-travel prediction policy, where the first information includes the hot travel lines searched for personal travel and the refresh frequency of those hot travel lines; and a second processing unit, configured to acquire data corresponding to the hot travel lines according to the first information and store the data in the cache.
In some possible embodiments, the second prediction unit may execute the personal-travel prediction policy at a specified time each day, for example, at midnight each day.
In some possible embodiments, the first information may further include an upper limit on the number of refreshes from each supplier.
This describes the personal-travel prediction policy of the technical solution provided by this embodiment. Acquiring the data corresponding to the hot travel lines from the supplier according to the predicted first information and storing it in the cache helps to improve the active-cache hit rate at query time.
Based on the third aspect, in some possible embodiments of the present application, the first prediction unit is specifically configured to acquire the user's historical business-trip information from a business-travel database, and to predict, from that historical information, the business-travel actively cached travel lines that match the user's not-yet-departed travel information.
This describes how the actively cached travel lines are predicted for business-travel queries: they are predicted from the user's historical business-trip information obtained from the business-travel database. Because the user's preferences can be derived from the historical travel information, the predicted actively cached travel lines better match the searches the user is likely to perform, which helps to improve the active-cache hit rate.
Based on the third aspect, in some possible embodiments of the present application, the first prediction unit is further specifically configured to, when the business-travel database contains no historical travel information for the user, determine preference information of other users from the historical travel information of other users stored in the business-travel database, and predict, according to the preference information of the other users, the business-travel actively cached travel lines that match the user's not-yet-departed travel information.
In this technical solution, when the business-travel database has no historical travel information for the user, preference information of other users is determined from their historical travel information in the database, and the business-travel actively cached travel lines are predicted from those preferences. This implementation helps to improve the hit rate of the business-travel active cache.
Based on the third aspect, in some possible embodiments of the present application, the first processing unit is specifically configured to, when the data of a business-travel actively cached travel line changes, acquire the corresponding data from the supplier in real time and store it in the cache.
This technical solution specifies the refresh behavior for the business-travel actively cached travel lines: acquiring the corresponding data from the supplier in real time keeps the cached data for those travel lines up to date, which helps to reduce the failure rate caused by stale cached data.
Based on the third aspect, in some possible embodiments of the present application, with respect to predicting the refresh frequency of a hot travel line, the second prediction unit is specifically configured to adjust the refresh frequency of the hot travel line according to the historical number of changes of that hot travel line.
In some possible embodiments, in determining the refresh frequency of a hot travel line from its historical number of changes, the second prediction unit may be specifically configured to predict the number of changes of the hot travel line within a fixed time slot from its historical number of changes within that time slot, determine the popularity value of the hot travel line for that time slot from the predicted number of changes, and determine the refresh frequency of the hot travel line for that time slot from its popularity value. It can be understood that the higher a line ranks by popularity value, the higher its refresh frequency.
In this technical solution, the refresh frequency for the personal-travel active cache is specified: determining the refresh frequency of a hot travel line from its historical number of changes optimizes the refresh frequency so that it better matches how the hot travel line actually changes, which helps to reduce the failure rate caused by stale cached data.
Based on the third aspect, in some possible embodiments of the present application, with respect to predicting the upper limit on the number of refreshes for a supplier, the second prediction unit is specifically configured to predict the supplier's PNR count from the supplier's historical PNR counts, and to determine the upper limit on the supplier's refresh count by combining the supplier's look-to-book ratio with the predicted PNR count.
In this technical solution, the upper limit on the supplier's refresh count is determined from the supplier's historical PNR counts and look-to-book ratio, which helps to reduce query costs.
In a fourth aspect, an embodiment of the present application provides an air-ticket query apparatus, including: a first acquisition unit, configured to acquire, from the business-trip application platform, travel information of trips on which the user has not yet departed when the query entry is the business-travel query entry; a first prediction unit, configured to predict, according to a business-travel active caching policy, the business-travel actively cached travel lines that match the user's not-yet-departed travel information, where a travel line includes one or more of the following: a flight route for travel by airplane, a train line for travel by high-speed rail, a bus line for travel by car, or a shipping line for travel by ship; and a first processing unit, configured to acquire data corresponding to the business-travel actively cached travel lines from a supplier according to a business-travel active refresh policy and store the data in the cache.
In this technical solution, when the query entry is the business-travel query entry, the travel information of trips the user has not yet departed on is acquired and the matching travel lines are predicted according to the business-travel active caching policy, which helps the cached data match the user's operations, improves the active-cache hit rate for business-travel queries, and thus improves the user experience.
Based on the fourth aspect, in some possible embodiments of the present application, the apparatus further includes: a second prediction unit, configured to predict first information according to a personal-travel prediction policy, where the first information includes the hot travel lines searched for personal travel and the refresh frequency of those hot travel lines; and a second processing unit, configured to acquire data corresponding to the hot travel lines according to the first information and store the data in the cache.
Based on the fourth aspect, in some possible embodiments of the present application, the personal-travel prediction policy may be executed at a specified time each day, for example, at midnight each day.
In some possible embodiments, the first information may further include an upper limit on the number of refreshes from each supplier.
This describes the personal-travel prediction policy of the technical solution provided by this embodiment. Acquiring the data corresponding to the hot travel lines from the supplier according to the predicted first information and storing it in the cache helps to improve the active-cache hit rate at query time.
Based on the fourth aspect, in some possible embodiments of the present application, the apparatus further includes: a first display unit, configured to, after the first processing unit has acquired the data corresponding to the business-travel actively cached travel lines from the supplier according to the business-travel active refresh policy and stored it in the cache, display the data matching a business-travel query operation when the user triggers that operation in the business-travel query interface and matching data is found in the cache; and, when no matching data is found in the cache and a first real-time query request has been sent to the supplier according to the query conditions, display the data of the business-travel passively cached travel lines matching the first real-time query request.
In this technical solution, after the data corresponding to the business-travel actively cached travel lines has been stored in the cache, the matching data is displayed whenever a triggered business-travel query operation finds it in the cache; if no match is found, an acquisition request is sent to the supplier according to the query conditions, the data of the matching business-travel passively cached travel lines is acquired and displayed, and that data is also acquired from the supplier according to the business-travel passive refresh policy and stored in the cache. This allows the needed information to be returned promptly when the cache matches the query during a business-travel query, while still guaranteeing that the user can obtain the needed information when the cache does not match.
Based on the fourth aspect, in some possible embodiments of the present application, the apparatus further includes: a second display unit, configured to, when the query entry is the personal-travel query entry and the user triggers a personal-travel query operation in the personal-travel query interface, display the data in the cache matching that operation; and, when no matching data exists in the cache and a second real-time query request has been sent to the supplier according to the query conditions, display the data of the personal-travel passively cached travel lines matching the second real-time query request.
This describes the case where the query entry is the personal-travel query entry: when the user triggers a personal-travel query operation in the personal-travel query interface, the cache is searched for matching data; if it exists, it is displayed; if not, a second real-time query request is sent to the supplier according to the query conditions, the data of the matching personal-travel passively cached travel lines is acquired and displayed, and that data is also acquired from the supplier according to the personal-travel passive refresh policy and stored in the cache. In this way, the needed information can be displayed promptly during personal-travel queries.
Based on the fourth aspect, in some possible embodiments of the present application, in predicting the business-travel actively cached travel lines that match the user's not-yet-departed travel information according to the business-travel active caching policy, the first prediction unit is specifically configured to: acquire the user's historical business-trip information from a business-travel database; and predict, from that historical information, the business-travel actively cached travel lines that match the user's not-yet-departed travel information.
Based on the fourth aspect, in some possible embodiments of the present application, in predicting the business-travel actively cached travel lines that match the user's not-yet-departed travel information according to the business-travel active caching policy, the first prediction unit is further specifically configured to: when the business-travel database contains no historical travel information for the user, determine preference information of other users from the historical travel information of other users stored in the business-travel database; and predict, according to the preference information of the other users, the business-travel actively cached travel lines that match the user's not-yet-departed travel information.
Based on the fourth aspect, in some possible embodiments of the present application, in acquiring the data corresponding to the business-travel actively cached travel lines from the supplier according to the business-travel active refresh policy and storing it in the cache, the first processing unit is specifically configured to, when the data of a business-travel actively cached travel line changes, acquire the corresponding data from the supplier in real time and store it in the cache.
Based on the fourth aspect, in some possible embodiments of the present application, in predicting the refresh frequency of a hot travel line, the second prediction unit is specifically configured to adjust the refresh frequency of the hot travel line according to the historical number of changes of that hot travel line.
Based on the fourth aspect, in some possible embodiments of the present application, with respect to predicting the upper limit on the number of refreshes for a supplier, the personal-travel prediction policy specifically includes: predicting the supplier's PNR count from the supplier's historical Passenger Name Record (PNR) counts, and determining the upper limit on the supplier's refresh count by combining the supplier's look-to-book ratio with the predicted PNR count.
In this technical solution, the upper limit on the supplier's refresh count is determined from the supplier's historical PNR counts and look-to-book ratio, which helps to reduce query costs.
In a fifth aspect, an embodiment of the present application provides a travel platform, including a communication interface, a processor, a memory, and a bus. The communication interface is configured to receive travel query operations from clients and to communicate with the business-trip application platform; the memory is configured to store executable program code; and the processor is configured to run a program corresponding to the executable program code by reading the executable program code stored in the memory, so as to perform some or all of the steps of any method of the first aspect or some or all of the steps of any method of the second aspect.
In the technical solution provided by this embodiment, when the query entry is the business-travel query entry, the travel information of trips the user has not yet departed on is acquired, the matching travel lines are predicted according to the business-travel active caching policy, and the corresponding data is acquired from the supplier according to the business-travel active refresh policy and stored in the cache. This helps the cached data match the user's query operations, improves the active-cache hit rate for business-travel queries, and thus improves the user experience.
In a sixth aspect, an embodiment of the present application provides a computer-readable storage medium storing a computer program, where the computer program is executed by hardware to perform some or all of the steps of any method of the first aspect or some or all of the steps of any method of the second aspect.
In the technical solution provided by this embodiment, when the query entry is the business-travel query entry, the travel information of trips the user has not yet departed on is acquired, the matching travel lines are predicted according to the business-travel active caching policy, and the corresponding data is acquired from the supplier according to the business-travel active refresh policy and stored in the cache. This helps the cached data match the user's query operations, improves the active-cache hit rate for business-travel queries, and thus improves the user experience.
Drawings
Some drawings to which embodiments of the present application relate will be described below.
Fig. 1 is a schematic diagram of a system architecture for performing a travel query according to an embodiment of the present application.
Fig. 2A is a flowchart illustrating an active caching method for a travel platform according to an embodiment of the present application.
Fig. 2B is a flowchart illustrating an active caching method for a travel platform according to another embodiment of the present application.
Fig. 3A is an interaction flow diagram of a travel query method according to an embodiment of the present application.
Fig. 3B is a schematic diagram of active cache hit rates before and after the travel query method shown in fig. 3A is adopted.
Fig. 3C is a schematic diagram of average query time consumption before and after the travel query method shown in fig. 3A is adopted.
Fig. 3D is a schematic diagram of active cache hit rates before and after the travel query method shown in fig. 3A is used.
Fig. 4A is a schematic structural diagram of an active cache apparatus according to an embodiment of the present application.
Fig. 4B is a schematic structural diagram of an active cache apparatus according to another embodiment of the present disclosure.
Fig. 5A is a schematic structural diagram of a travel inquiry apparatus according to an embodiment of the present application.
Fig. 5B is a schematic structural diagram of a travel inquiry apparatus according to another embodiment of the present application.
Fig. 6 is a schematic structural diagram of a travel platform according to an embodiment of the present application.
Detailed Description
The embodiments of the present application will be described below with reference to the drawings.
Referring to fig. 1, fig. 1 is a schematic structural diagram of a travel system according to an embodiment of the present application. The travel system may include a travel platform 101, a supplier 102, and a business-trip application platform 103 interconnected by a network. The user 100 can perform operations such as queries and ticket booking through the travel platform 101.
The travel platform 101 may be a professional travel management service platform provided to specific users (for example, to an enterprise). The supplier 102 may be a global distribution system (GDS); for example, the GDS may be a large-scale computer information service system applied to one or more travel modes such as civil aviation, high-speed rail, road passenger transport, and shipping, and to the travel industry as a whole. Through a GDS, travel sales organizations around the world can obtain large amounts of travel-related information in a timely manner from transportation companies, hotels, car rental companies, and travel agencies, thereby providing fast, convenient, and reliable service to customers. The business-trip application platform 103 is a platform for submitting and approving business-trip applications within an enterprise; it stores historical business-trip information, business-trip applications, the business-trip information approved for each user, and so on. Of course, the modular architecture of the travel system is not limited to the above example.
The technical solutions of the embodiments of the present application can be implemented based on the travel system with the example architecture illustrated in fig. 1, or a variant of that architecture.
Referring to fig. 2A, fig. 2A is a schematic flowchart of an active caching method for a travel platform according to an embodiment of the present application, where the method includes, but is not limited to, the following steps.
201. When the query entry is the business-travel query entry, the travel platform acquires, from the business-trip application platform, travel information of trips on which the user has not yet departed.
User operations are performed in a foreground (the foreground may be a client, a web page, or a user interface such as H5), and the foreground may be deployed on a mobile terminal, a desktop computer, or another terminal device. The query entries of the foreground operation interface may include a personal-travel query entry and a business-travel query entry. When traveling on business, the user queries through the business-travel query entry; when traveling for personal reasons, the user queries through the personal-travel query entry. The travel mode corresponding to the travel information may be one or more of travel by airplane, high-speed rail, car, or ship; the following description takes travel by airplane as an example, and the flows for the other travel modes are similar and are not described in detail.
When the query entry operated by the user is the business-travel query entry, the travel platform can acquire, from the business-trip application platform, the user's travel information for trips that have not yet departed. A not-yet-departed travel record may contain: a user identification, a trip type, a travel number, a departure city, an arrival city, a departure date, a return date, a supplier, and so on. For example, one not-yet-departed travel record includes: user A; a one-way trip; travel number TR001; departure city Beijing; arrival city Jakarta; departure date December 18, 2019; return date March 30, 2020; and the supplier numbered 2 in the travel platform, which may be, for example, China's Internet Booking Engine (IBE).
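The not-yet-departed travel record from the example above could be represented, for instance, as follows; the field and type names are illustrative, and the one-way/return-date combination is taken directly from the example in the text.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PendingTrip:
    user_id: str                 # user identification
    trip_type: str               # "one-way" or "round-trip"
    travel_number: str           # e.g. "TR001"
    departure_city: str          # e.g. "Beijing"
    arrival_city: str            # e.g. "Jakarta"
    departure_date: str          # e.g. "2019-12-18"
    return_date: Optional[str]   # e.g. "2020-03-30", or None
    supplier_id: int             # e.g. 2 (an IBE supplier in the platform)

trip = PendingTrip("user_a", "one-way", "TR001", "Beijing", "Jakarta",
                   "2019-12-18", "2020-03-30", 2)
```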
In some possible embodiments, the travel system executes the personal-travel prediction policy at a specified time each day and predicts the following information: the hot routes searched for personal travel that day, the date pairs of those hot routes, the maximum number of refreshes allowed for each supplier that day, and the refresh frequency of the hot routes that day. Data corresponding to the hot routes is then acquired from the suppliers according to the predicted hot routes, their date pairs, the per-supplier maximum refresh counts for the day, and the refresh frequencies for the day, and is stored in the cache. This helps to improve the active-cache hit rate at query time.
202. And predicting, according to the public-cause active caching policy, the public-cause actively cached travel line matching the user's not-yet-departed travel information.
Wherein, the travel line can include one or more of the following: a flight line for taking an airplane, a train line for taking a high-speed rail, a shift line for taking an automobile or a ship line for taking a ship.
In some possible embodiments, the public-cause active caching policy may include: obtaining the user's historical travel information from the public-cause database, and predicting, from that historical travel information, the public-cause actively cached travel line matching the user's not-yet-departed travel information. For the one-way case, the interval between the query date and the departure date of each route in the historical travel information is counted, and the intervals whose proportion reaches a set threshold are used as the predicted date pairs of that route. Continuing the example in step 201, suppose the user's history shows that 80% of flights were taken on the business trip's start date and 20% on the day after, and that 90% of trips flew directly from Beijing to Jakarta while 10% flew from Beijing to Hong Kong and then from Hong Kong to Jakarta. Based on this history, the Beijing-to-Jakarta route for 18 December 2019 can be taken as information to be cached under the public-cause active caching mode.
For the one-way case, the interval between the query date and the departure date of each route is counted, and the intervals whose proportion reaches the given threshold are used as the predicted date pairs of that route. For the round-trip case, the interval between the return date and the departure date may be counted in addition to the interval between the query date and the departure date, and the combinations whose proportion reaches the preset threshold across both date intervals are used as the predicted date pairs.
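The date-pair statistics described above can be sketched as follows; this is a minimal Python illustration rather than the patent's reference implementation, and the record field names, the 50% threshold and the helper functions are assumptions made for the example.

```python
# Keep the query-to-departure intervals (and, for round trips, the departure-to-return
# intervals) whose share of the history reaches the threshold, then turn them into
# concrete date pairs to cache.
from collections import Counter
from datetime import date, timedelta

def dominant_intervals(history, threshold=0.5, round_trip=False):
    """history: iterable of dicts with date objects 'query_date',
    'departure_date' and, for round trips, 'return_date'."""
    if round_trip:
        keys = [((r["departure_date"] - r["query_date"]).days,
                 (r["return_date"] - r["departure_date"]).days) for r in history]
    else:
        keys = [(r["departure_date"] - r["query_date"]).days for r in history]
    counts = Counter(keys)
    total = len(keys)
    return [k for k, c in counts.items() if c / total >= threshold]

def predicted_date_pairs(base_date, intervals, round_trip=False):
    """Map dominant intervals onto a base date to obtain (departure, return) pairs."""
    pairs = []
    for k in intervals:
        if round_trip:
            ahead, stay = k
            dep = base_date + timedelta(days=ahead)
            pairs.append((dep, dep + timedelta(days=stay)))
        else:
            pairs.append((base_date + timedelta(days=k), None))  # one-way: no return date
    return pairs

# Example: 80% of historical one-way queries were made on the departure day itself.
hist = ([{"query_date": date(2019, 12, 18), "departure_date": date(2019, 12, 18)}] * 8
        + [{"query_date": date(2019, 12, 16), "departure_date": date(2019, 12, 18)}] * 2)
print(predicted_date_pairs(date(2019, 12, 18), dominant_intervals(hist)))
# -> one-way departure on 2019-12-18, no return date
```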
In some possible embodiments, the public active caching policy may include: when the historical travel information of the user does not exist in the public database, determining the preference information of other users according to the historical travel information of other users stored in the public database; and predicting the business trip line cached by the public initiative matched with the business trip information of the user without departure according to the preference information of other users.
After the public-cause actively cached travel line matching the user's not-yet-departed travel information has been predicted according to the public-cause active caching policy, it is generally stored in the form of a public-cause active cache table, as shown in Table 1. The public-cause active cache table generally contains the following fields: travel application number, voyage type, departure city, arrival city, departure time, return time, refresh sequence, and supplier type.
Table 1: Public-cause active cache table
N1 and N2 in Table 1 are the numbers of refresh date pairs predicted for the respective travel application numbers. Depending on each user's historical business trip behaviour, the number of entries to be cached for each travel application number differs.
In a specific embodiment, suppose the public-cause active cache table contains N1 + N2 records. The first record is: travel application number TR112, voyage type one-way, departure city NKG, arrival city KUL, departure time 20191001, return time empty, refresh sequence 1, supplier types supplier 1 and supplier 4. Records 2 through N1-1 are the other entries to be cached for that travel application number. Record N1 is: travel application number TR112, voyage type round-trip, departure city NKG, arrival city KUL, departure time 20191001, return time 20191007, refresh sequence N1, supplier types supplier 1 and supplier 4. Another record is: travel application number TR113, voyage type round-trip, departure city NKG, arrival city PEX, departure time 20191001, return time 20191007, refresh sequence 1, supplier type supplier 3; the remaining N2-1 records are interpreted in the same way as those of TR112.
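For illustration only, one record of the public-cause active cache table of Table 1 might be represented as follows; the field names mirror the fields listed above, while the types and the Python representation itself are assumptions.

```python
# A sketch of public-cause active cache table records (Table 1).
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ActiveCacheRecord:
    travel_application_no: str      # e.g. "TR112"
    voyage_type: str                # "one-way" or "round-trip"
    departure_city: str             # city code, e.g. "NKG"
    arrival_city: str               # e.g. "KUL"
    departure_time: str             # e.g. "20191001"
    return_time: Optional[str]      # None for one-way records
    refresh_sequence: int           # position within the application's refresh sequence
    supplier_types: List[str]       # e.g. ["supplier 1", "supplier 4"]

cache_table = [
    ActiveCacheRecord("TR112", "one-way", "NKG", "KUL", "20191001", None, 1,
                      ["supplier 1", "supplier 4"]),
    ActiveCacheRecord("TR113", "round-trip", "NKG", "PEX", "20191001", "20191007", 1,
                      ["supplier 3"]),
]
```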
203. And acquiring, according to the public-cause active refresh policy, the data corresponding to the public-cause actively cached travel line from a supplier and storing the data in the cache.
In some possible embodiments, when the data of a public-cause actively cached travel line changes, the data corresponding to that travel line is obtained from the supplier in real time and stored in the cache.
The technical scheme provided by the embodiment of the application acquires the travel information of the user who does not depart when the query entry is the public query entry, and predicts the travel line matched with the travel information of the user who does not depart according to the public active cache strategy, so that the matching of data in the public active cache and user operation and the hit rate of the active cache in the public query are favorably improved, and the user experience is further improved.
Referring to fig. 2B, fig. 2B is a flowchart illustrating an active caching method for a travel platform according to an embodiment of the present disclosure, where the active caching method includes, but is not limited to, the following steps.
201. And when the query entry is a public-cause query entry, the travel platform acquires, from the business trip application platform, the user's not-yet-departed travel information.
In some possible embodiments, before step 201, it may further include determining whether the query entry is a public query entry or a private query entry.
202. And predicting, according to the public-cause active caching policy, the public-cause actively cached travel line matching the user's not-yet-departed travel information.
203. And acquiring, according to the public-cause active refresh policy, the data corresponding to the public-cause actively cached travel line from a supplier and storing the data in the cache.
In step 201 to step 203, reference is made to the foregoing description of each step in fig. 2A, which is not described herein again.
204. And when the query entrance is the private-cause query entrance, predicting the first information according to the private-cause prediction strategy.
The first information includes the hot travel lines searched for private trips and the refresh frequency of those hot travel lines; it may further include the date pairs of the hot routes and the upper limit of the number of refreshes from the corresponding supplier.
When the user queries through the private-cause query entry, the statistics collected on the user's private ticket queries may include: departure city, arrival city, departure time, return time, search time, number of adults, number of children and number of infants. The historical query volume is counted by the frequency of each route city pair (departure city + arrival city), and routes whose frequency percentage exceeds a given threshold (e.g., 50%) are classified as hot routes. The threshold may be adjusted based on the ratio of the refresh volume to the PNR count.
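A minimal sketch of the hot-route classification described above, assuming the query log has been reduced to (departure city, arrival city) pairs; the 50% threshold is the example value from the text and would in practice be tuned against the refresh-to-PNR ratio.

```python
# Classify hot routes from private query logs by city-pair frequency.
from collections import Counter

def hot_routes(query_log, threshold=0.5):
    counts = Counter(query_log)           # frequency of each (departure, arrival) pair
    total = sum(counts.values())
    return [pair for pair, c in counts.most_common() if c / total >= threshold]

log = [("SZX", "BKK")] * 6 + [("NKG", "KUL")] * 3 + [("NKG", "PEK")] * 1
print(hot_routes(log))  # [('SZX', 'BKK')]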
Predicting hot routes based on privacy-driven caching strategies, the most likely date pairs (departure time + arrival time) for a user to query can be predicted in one-way and round-trip using statistical methods. For the one-way case, the interval of each route inquiry date and departure date is counted, and the ratio of the inquiry dates and the departure dates reaches a given threshold value to be used as each route forecast date pair. For the round trip situation, in addition to the interval between the inquiry date and the departure date of each route, the interval between the return date and the departure date is counted, and according to the two date intervals, the date with the ratio reaching the preset threshold value is used as a forecast date pair.
In one embodiment, the hot routes are shown in Table 2.
Table 2: Hot routes
NKG, KUL and PEK in Table 2 are city codes. In the above example, the refresh amount of supplier 1 is 1 × (the total number of refreshes under refresh policy 1) + 1 × (the total number of refreshes under refresh policy 2). Query requests are sent to the supplier according to the predicted refresh frequency of each route, and the results are stored in the cache library.
205. And acquiring data corresponding to the hot air route according to the predicted first information and storing the data into the cache.
In terms of predicting the supplier's maximum number of refreshes, the private-cause prediction policy specifically includes: predicting the supplier's PNR count for the day according to the supplier's historical PNR counts, and determining the upper limit of the supplier's number of refreshes for the day by combining the supplier's look-to-book ratio with the predicted PNR count for the day.
In one embodiment, the refresh frequency of the hot routes is shown in Table 3.
Table 3: Route refresh policy table
As can be seen from Table 3, the SZX-BKK route is refreshed using refresh policy No. 1: it is refreshed every 1.5 hours (h) in the 0:00-2:59 period, 2 refreshes in total, with a cache validity of 1.5 h; every 3 h in the 3:00-8:59 period, 2 refreshes in total, cache validity 3 h; every 1 h in the 9:00-11:59 period, 3 refreshes in total, cache validity 1 h; every 2 h in the 12:00-13:59 period, 1 refresh in total, cache validity 2 h; every 0.5 h in the 14:00-18:59 period, 10 refreshes in total, cache validity 0.5 h; and every 1 h in the 19:00-23:59 period, 5 refreshes in total, cache validity 1 h.
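For illustration, refresh policy No. 1 as described above can be represented as a list of (start hour, end hour, refresh interval) entries, from which the per-period and daily refresh counts follow; the data structure and function names are assumptions, not part of the described system.

```python
# Refresh policy No. 1: each entry is (start hour, end hour, refresh interval in hours).
REFRESH_POLICY_1 = [
    (0, 3, 1.5),    # 0:00-2:59, every 1.5 h -> 2 refreshes, cache valid 1.5 h
    (3, 9, 3.0),    # 3:00-8:59, every 3 h   -> 2 refreshes
    (9, 12, 1.0),   # 9:00-11:59, every 1 h  -> 3 refreshes
    (12, 14, 2.0),  # 12:00-13:59, every 2 h -> 1 refresh
    (14, 19, 0.5),  # 14:00-18:59, every 0.5 h -> 10 refreshes
    (19, 24, 1.0),  # 19:00-23:59, every 1 h -> 5 refreshes
]

def refreshes_per_day(policy):
    """Total refreshes per day under a policy; feeds the supplier refresh-amount
    formula mentioned for Table 2 (routes x policy totals)."""
    return sum(int((end - start) / interval) for start, end, interval in policy)

# SZX-BKK uses policy 1, so it is refreshed this many times per day:
print(refreshes_per_day(REFRESH_POLICY_1))  # 23
```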
In this embodiment, when the user logs in to the foreground, it is determined whether the user has an air-ticket business trip application; if so, the latest business trip information is returned. The cache library is then checked, according to the user's business trip application, for predicted cache data; if such data exists, a real-time active query request is initiated. Otherwise, a real-time query request is initiated according to the query conditions entered by the user.
In some possible embodiments, the frequency at which hot-route data is obtained from the suppliers can be determined according to the historical number of changes of each hot route, and the data corresponding to the hot routes is obtained from the suppliers at that refresh frequency and stored in the cache. Changes to a hot route may include a change in the number of remaining seats, a change in price, a change in the flight list, and so on. For example, if the historical data shows that the route information obtained at two adjacent sampling moments did not change, or changed with low probability, the sampling interval may be extended, for example from once every 1 h to once every 1.5 h. If the route information changed at every pair of adjacent sampling moments, or changed with high probability, the sampling interval may be shortened, for example from once every 1 h to once every 0.5 h.
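A sketch of the adaptive sampling just described, under the assumption that a boolean change history is kept per route; the thresholds, step size and bounds are illustrative values, not values given in the text.

```python
# Relax the sampling interval for routes that rarely change; tighten it for
# routes that change often.
def adjust_interval(current_interval_h, change_history,
                    low=0.2, high=0.8, step_h=0.5,
                    min_interval_h=0.5, max_interval_h=3.0):
    """change_history: list of booleans, True if the route data changed
    between two adjacent samples."""
    if not change_history:
        return current_interval_h
    change_rate = sum(change_history) / len(change_history)
    if change_rate <= low:       # e.g. stretch 1 h sampling to 1.5 h
        current_interval_h += step_h
    elif change_rate >= high:    # e.g. tighten 1 h sampling to 0.5 h
        current_interval_h -= step_h
    return min(max(current_interval_h, min_interval_h), max_interval_h)

print(adjust_interval(1.0, [False] * 9 + [True]))   # 1.5
print(adjust_interval(1.0, [True] * 9 + [False]))   # 0.5
```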
In some possible embodiments, the PNR count of each supplier in the order data may be counted; after the data is preprocessed, a prediction model is built with a machine-learning time-series regression algorithm to predict the PNR count for a given day, and the maximum daily refresh amount is determined from each supplier's charging standard and look-to-book ratio. The suppliers configured for each route according to the business rules are obtained, and the final supplier of each route and the data to be refreshed are adjusted and determined according to each supplier's maximum daily refresh amount. The supplier's daily historical PNR data is preprocessed by steps including, but not limited to, outlier handling, missing-value handling, sequence stationarity testing and white-noise testing. The data is split into a training set and a test set, and the model is trained on the training set; the candidate time-series regression algorithms include the autoregressive integrated moving average model (ARIMA), Holt-Winters exponential smoothing, and seasonal-trend decomposition (STL). The algorithm with the smallest root mean square error (RMSE) on the test set is selected to predict the PNR count of a given day. For example, if supplier A has a look-to-book ratio of 200:1 (query volume : PNR count) and its PNR count for a given day is predicted to be 200, the maximum query volume for that day is 200 × threshold_A, where threshold_A denotes the supplier's maximum refresh coefficient and can be adjusted flexibly. It should be noted that, besides active caching, practical operation must also take into account the passive caching of data obtained from the supplier, the query budget, the supplier's charging standard and so on, so the supplier's maximum refresh amount is usually multiplied by a coefficient (e.g., 0.6). The maximum query volume determined from the PNR count can also be applied to scenarios such as an online travel agency obtaining data from suppliers.
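The forecasting-and-budget step above might be sketched as follows, assuming the cleaned daily PNR counts are available as a pandas Series; only the ARIMA and Holt-Winters candidates are shown (the STL variant is omitted), the model orders and train/test split are assumptions, and the budget function is one possible reading of the 200 × threshold_A formula, with the look-to-book ratio and the example 0.6 coefficient folded into the adjustable threshold.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.holtwinters import ExponentialSmoothing

def forecast_next_day_pnr(daily_pnr: pd.Series, test_days: int = 14) -> float:
    """Pick the candidate model with the lowest RMSE on a hold-out window,
    refit it on the full history and forecast the next day's PNR count."""
    train, test = daily_pnr[:-test_days], daily_pnr[-test_days:]
    candidates = {
        "arima": lambda s: ARIMA(s, order=(1, 1, 1)).fit(),
        "holt_winters": lambda s: ExponentialSmoothing(
            s, trend="add", seasonal="add", seasonal_periods=7).fit(),
    }
    best_name, best_rmse = None, np.inf
    for name, build in candidates.items():
        pred = np.asarray(build(train).forecast(len(test)))
        rmse = float(np.sqrt(np.mean((np.asarray(test) - pred) ** 2)))
        if rmse < best_rmse:
            best_name, best_rmse = name, rmse
    return float(np.asarray(candidates[best_name](daily_pnr).forecast(1))[0])

def max_daily_refreshes(predicted_pnr: float, look_to_book: float = 200.0,
                        coefficient: float = 0.6) -> int:
    # e.g. 200 predicted PNR * 200:1 look-to-book * 0.6 safety coefficient;
    # look_to_book * coefficient plays the role of the adjustable threshold above.
    return int(predicted_pnr * look_to_book * coefficient)
```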
In the technical solution provided by this embodiment, in addition to the case where the query entry is a public-cause query entry, the private-cause prediction policy executed at a specified time each day is also described. Data corresponding to the hot routes is obtained from the supplier according to the predicted hot routes searched for private trips on the day, the date pairs of those hot routes, their refresh frequency for the day and the supplier's maximum number of refreshes for the day, and is stored in the cache, which helps improve the hit rate of the active cache at query time.
Referring to fig. 3A, fig. 3A is an interaction flow diagram of a travel query method according to an embodiment of the present application, in which the travel query method includes the following steps.
301. And when the query entry is a public-cause query entry, the travel platform acquires, from the business trip application platform, the user's not-yet-departed travel information.
302. And predicting, according to the public-cause active caching policy, the public-cause actively cached travel line matching the user's not-yet-departed travel information.
In some possible embodiments, the user's historical travel information is obtained from the public-cause database, and the public-cause actively cached travel line matching the user's not-yet-departed travel information is predicted according to that historical travel information.
In some possible embodiments, when there is no historical travel information of the user in the cause public database, determining preference information of other users according to historical travel information of other users stored in the cause public database;
and predicting the business trip line cached by the public initiative matched with the business trip information of the user without departure according to the preference information of the other users.
303. And acquiring, according to the public-cause active refresh policy, the data corresponding to the public-cause actively cached travel line from a supplier and storing the data in the cache.
In some possible embodiments, when the data of an actively cached route changes, the data corresponding to the public-cause actively cached travel line may be obtained from the supplier in real time and stored in the cache.
304. And when the user triggers the public inquiry operation in the public inquiry interface, searching whether data matched with the public inquiry operation exist in the cache.
If yes, go to step 3041. If not, go to step 3042.
For example, if the cache holds all one-way flight information from Shenzhen to Bangkok for the next 7 days (11.21-11.27), then when the user queries a one-way ticket from Shenzhen to Bangkok for 11.22, data matching the public-cause query operation is found in the cache; when the user queries a one-way ticket from Shenzhen to Bangkok for 11.28, there is no matching data in the cache.
3041. And displaying the data matched with the public inquiry operation.
3042. A first real-time query request is initiated to the supplier according to the query conditions, the data corresponding to the public-cause passively cached travel line matching the first real-time query request is obtained and displayed, and the data corresponding to that passively cached travel line is obtained from the supplier according to the public-cause passive refresh policy and stored in the cache.
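The hit-or-miss flow of steps 304, 3041 and 3042 can be sketched as follows; cache.get, cache.put and supplier.query are placeholder interfaces assumed for the example, not an actual API of the travel platform.

```python
# Serve a public-cause query from the cache when possible; otherwise fall back
# to a real-time supplier query and keep the result as passively cached data.
def handle_public_query(query, cache, supplier, passive_ttl_s=1800):
    key = (query["departure_city"], query["arrival_city"],
           query["departure_date"], query.get("return_date"))
    data = cache.get(key)                 # step 304: look for matching cached data
    if data is not None:
        return data                       # step 3041: display the cached data
    # Step 3042: real-time query, display the result, store it per the passive policy.
    data = supplier.query(query)
    cache.put(key, data, ttl_s=passive_ttl_s)
    return data
```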
305. And executing a privacy-related prediction strategy, wherein the predicted information comprises: a hot route for private search, a date pair for the hot route, a refresh frequency for the hot route, and an upper limit on the number of refreshes from the supplier.
In some possible implementations, the privacy-related prediction strategy may be performed at zero-point of the day. It is understood that the privacy-related prediction policy may be executed at other time points set by the user.
306. And acquiring data corresponding to the hot air route according to the predicted hot air route searched by the reason of privacy, the date pair of the hot air route, the refreshing frequency of the hot air route and the upper limit of the refreshing times of the supplier, and storing the data into the cache.
In some possible embodiments, in predicting the refresh frequency of the hot route, the privacy-related prediction strategy specifically includes: and determining the refreshing frequency of the hot air route according to the historical change times of the hot air route.
In some possible embodiments, in predicting the supplier's maximum number of refreshes, the private-cause prediction policy specifically includes: predicting the supplier's look-to-book ratio for the day according to the supplier's historical look-to-book ratios, and determining the upper limit of the supplier's number of refreshes for the day by combining the supplier's charging standard with the predicted look-to-book ratio for the day.
307. And when the user triggers the private query operation in the private query interface, searching whether data matched with the private query operation exist in the cache.
If yes, go to step 3071. If not, go to step 3072.
3071. And displaying the data matched with the private query operation.
3072. And according to the private passive refreshing strategy, acquiring the data corresponding to the private cached travel line from the supplier and storing the data into the cache.
Through the above experiments, the performance before and after adopting the method is compared in figs. 3B to 3D. As shown in fig. 3B, after the method provided by this embodiment was adopted in August 2019, the active cache hit rate increased from X to X + 16.9%. As shown in fig. 3C, the average query duration decreased by about 2 seconds, and as shown in fig. 3D, the warehouse inspection (cabin check) failure rate remained stable.
Referring to fig. 4A, fig. 4A is a schematic structural diagram of an active cache apparatus according to an embodiment of the present disclosure, where the active cache apparatus 400 includes: a first acquisition unit 401, a first prediction unit 402 and a first processing unit 403. The first obtaining unit 401 is configured to, when the query entry is a public query entry, obtain, by the business trip platform, business trip information that the user did not depart from the business trip application platform. A first prediction unit 402, configured to predict a cause public initiative cached travel line matching the travel information that the user did not depart according to a cause public initiative caching policy. The first processing unit 403 is configured to obtain data corresponding to the business trip line cached by the public initiative from a provider according to the business trip refresh policy, and store the data in a cache. The specific implementation of each module refers to the description in the method embodiment shown in fig. 2A, and is not described here again.
The technical scheme provided by the embodiment of the application obtains travel information of a user who does not depart when the query entry is a public query entry, predicts the travel line which is matched with the travel information of the user who does not depart according to a public initiative cache strategy, and obtains data corresponding to the travel line of the public initiative cache from a supplier according to a public initiative refreshing strategy and stores the data into the cache, so that the matching of the data in the cache and the query operation of the user and the hit rate of the active cache during public query are favorably improved, and the user experience is further improved.
Referring to fig. 4B, fig. 4B is a schematic structural diagram of an active cache apparatus according to an embodiment of the present disclosure, where the active cache apparatus 400' includes: a first acquisition unit 401, a first prediction unit 402, a first processing unit 403, a second prediction unit 404, and a second processing unit 405. The first obtaining unit 401 is configured to, when the query entry is a public query entry, obtain, by the business trip platform, business trip information that the user did not depart from the business trip application platform. A first prediction unit 402, configured to predict a cause public initiative cached travel line matching the travel information that the user did not depart according to a cause public initiative caching policy. The first processing unit 403 is configured to obtain data corresponding to the business trip line cached by the public initiative from a provider according to the business trip refresh policy, and store the data in a cache. A second prediction unit 404, configured to predict, according to the private prediction policy, first information, where the first information includes: and the hot travel line searched by the private and the refreshing frequency of the hot travel line. And a second processing unit 405, configured to obtain data corresponding to the hot travel line according to the first information, and store the data in the cache. The specific implementation of each module refers to the description in the method embodiment shown in fig. 2B, and is not described here again. In some possible embodiments of the present application, the privacy-related prediction policy may be executed at a specified time of day, for example, at a zero point of each day.
In some possible embodiments, the first information may further include: an upper limit on the number of refreshes from the vendor.
According to the technical scheme provided by the embodiment of the application, the privacy-related prediction strategy is described. And according to the first information obtained by prediction, obtaining data corresponding to the hot travel line from the supplier and storing the data into the cache, thereby being beneficial to improving the hit rate of the active cache during query.
In some possible embodiments, the first prediction unit 402 is specifically configured to obtain the user's historical travel information from the public-cause database, and to predict, from that historical travel information, the public-cause actively cached travel line matching the user's not-yet-departed travel information.
In some possible embodiments, the first prediction unit 402 is further configured to, when there is no historical travel information of the user in the cause public database, determine preference information of other users according to historical travel information of other users stored in the cause public database; and predicting the business trip line cached by the public initiative matched with the business trip information of the user without departure according to the preference information of the other users.
In some possible embodiments, the first processing unit 403 is specifically configured to, when the data of the public-cause actively cached travel line changes, obtain the data corresponding to that travel line from the supplier in real time and store it in the cache.
In some possible embodiments, in predicting the refresh frequency of the hot trip line, the second prediction unit is specifically configured to: and adjusting the refreshing frequency of the hot travel line according to the historical change times of the hot travel line.
According to the technical scheme provided by the embodiment of the application, the private initiative refreshing frequency is limited, and the refreshing frequency of the hot air line on the same day is determined according to the historical change times of the hot air line, so that the refreshing frequency can be optimized, the setting of the refreshing frequency is more matched with the change of the hot air line, and the reduction of the failure rate of the warehouse inspection is facilitated.
In some possible embodiments, in terms of predicting the upper limit of the number of supplier refreshes, the private-cause prediction policy specifically includes: predicting the supplier's PNR count according to the supplier's historical passenger name record (PNR) counts, and determining the upper limit of the supplier's number of refreshes by combining the supplier's look-to-book ratio with the predicted PNR count.
According to the technical scheme provided by the embodiment of the application, the upper limit of the refreshing times of the supplier is determined according to the historical PNR number and the searching and booking ratio of the supplier, and the query cost is favorably reduced.
Referring to fig. 5A, fig. 5A is a device for querying travel, which includes: a first acquisition unit 501, a first prediction unit 502 and a first processing unit 503. The first obtaining unit 501 is configured to, when the query entry is a public-cause query entry, obtain, by the business trip platform, business trip information that the user did not depart from the business trip application platform; a first prediction unit 502, configured to predict a cause public initiative cached travel route matching the travel information that the user did not depart according to a cause public initiative caching policy; the travel line includes one or more of: a flight line for taking an airplane, a train line for taking a high-speed rail, a shift line for taking an automobile or a ship line for taking a ship. The first processing unit 503 is configured to obtain data corresponding to the business trip line cached by the public initiative from a provider according to the business trip refresh policy, and store the data in the cache.
The technical scheme provided by the embodiment of the application acquires the travel information of the user who does not depart when the query entry is the public query entry, and predicts the travel line matched with the travel information of the user who does not depart according to the public active cache strategy, so that the matching performance of data in the public active cache and user operation and the hit rate of the active cache in the public query are favorably improved, and the user experience is further improved.
Referring to fig. 5B, fig. 5B is a schematic structural diagram of a travel query apparatus according to an embodiment of the present application; the travel query apparatus 500' shown in fig. 5B includes: a first acquisition unit 501, a first prediction unit 502, a first processing unit 503, a first display unit 504, a second prediction unit 505, a second processing unit 506, a third processing unit 507, a second display unit 508, and a fourth processing unit 509. The first acquisition unit 501 is configured to, when the query entry is a public-cause query entry, obtain, from the business trip application platform, the user's not-yet-departed travel information. The first prediction unit 502 is configured to predict, according to the public-cause active caching policy, the public-cause actively cached travel line matching the user's not-yet-departed travel information; the travel line includes one or more of: a flight line for taking an airplane, a train line for taking a high-speed rail, a shift line for taking an automobile or a ship line for taking a ship. The first processing unit 503 is configured to obtain the data corresponding to the public-cause actively cached travel line from a supplier according to the public-cause active refresh policy and store the data in the cache. The first display unit 504 is configured to, after the first processing unit 503 has obtained and cached that data, display the data matching a public-cause query operation when the user triggers the public-cause query operation in the public-cause query interface and matching data is found in the cache; and to display the data corresponding to the public-cause passively cached travel line matching a first real-time query request after no matching data is found in the cache and the first real-time query request has been sent to the supplier according to the query conditions.
A second prediction unit 505, configured to predict first information according to the private cause prediction policy, where the first information includes: and the hot travel line searched by the private and the refreshing frequency of the hot travel line. And a second processing unit 506, configured to obtain data corresponding to the hot travel line according to the first information, and store the data in the cache.
A third processing unit 507, configured to, after the first processing unit obtains the data corresponding to the active cache route from the provider according to the cause-public active refresh policy and stores the data in the cause-public active cache, initiate a first real-time query request to the provider according to a query condition when the user triggers a cause-public query operation in the cause-public query interface and no data matching the cause-public query operation is searched in the cache, obtain data corresponding to the cause-public passive cache travel route matching the first real-time request, and obtain data corresponding to the cause-public passive cache travel route from the provider according to the cause-public passive refresh policy and store the data in the cache. A fourth processing unit 509, configured to, when the query entry is a private-cause query entry and a user triggers a private-cause query operation in a private-cause query interface, search, in the cache, whether there is data matching the private-cause query operation; if yes, the second display unit 508 displays data matched with the private query operation; if not, a second real-time query request is sent to the supplier according to the query condition, data corresponding to the private-passive cache travel line matched with the second real-time query request is obtained and displayed through the second display unit 508, and data corresponding to the private-passive cache travel line is obtained from the supplier according to a private-passive refresh policy and stored in the cache.
In some possible embodiments, the first prediction unit 502 is specifically configured to obtain the user's historical travel information from the public-cause database, and to predict, from that historical travel information, the public-cause actively cached travel line matching the user's not-yet-departed travel information.
In some possible embodiments, the first prediction unit 502 is further configured to, when there is no historical travel information of the user in the cause public database, determine preference information of other users according to historical travel information of other users stored in the cause public database; and predicting the business trip line cached by the public initiative matched with the business trip information of the user without departure according to the preference information of the other users.
In some possible embodiments, the first processing unit 503 is specifically configured to, when the data of the public-cause actively cached travel line changes, obtain the data corresponding to that travel line from the supplier in real time and store it in the cache.
In some possible embodiments, in predicting the refresh frequency of a hot route for the current day, the second prediction unit 505 is specifically configured to determine the refresh frequency of the hot route for the day according to the historical number of changes of that route. Determining the refresh frequency of the hot route for the day according to its historical number of changes may include: predicting the number of changes of the hot route within a fixed time period of the day according to the historical number of changes within that time period, determining a heat value of the hot route for that time period from the predicted number of changes, and determining the refresh frequency of the hot route within that time period according to its heat value. It will be appreciated that the higher the heat value ranks, the higher the refresh frequency.
In some possible embodiments, in terms of predicting the supplier's maximum number of refreshes for the current day, the second prediction unit 505 is specifically configured to predict the supplier's PNR count for the day according to the supplier's historical PNR counts, and to determine the upper limit of the supplier's number of refreshes for the day by combining the supplier's look-to-book ratio with the predicted PNR count for the day.
The specific implementation of each module refers to the description in the foregoing method embodiment, and is not described here again.
Referring to fig. 6, fig. 6 shows a travel platform according to an embodiment of the present application. The travel platform 600 includes: a communication interface 601, a processor 602, a memory 603 and a bus 604. The communication interface 601 is used for receiving travel query operations from the client and communicating with the business trip application platform. The memory 603 is used for storing executable program code. The processor 602 is configured to run a program corresponding to the executable program code, by reading the executable program code stored in the memory, so as to perform the method described in any of the foregoing method embodiments.
The embodiment of the application also provides a computer readable storage medium, which stores a computer program, and the computer program is executed by hardware to realize the method of any previous method embodiment.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When loaded and executed on a computer, cause the processes or functions described in accordance with the embodiments of the application to occur, in whole or in part. The computer may be a general purpose computer, a special purpose computer, a network of computers, or other programmable device. The computer instructions may be stored in a computer readable storage medium or transmitted from one computer readable storage medium to another, for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, fiber optic, digital subscriber line) or wirelessly (e.g., infrared, wireless, microwave, etc.). The computer-readable storage medium can be any available medium that can be accessed by a computer or a data storage device, such as a server, a data center, etc., that incorporates one or more of the available media. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., compact disk), or a semiconductor medium (e.g., solid state disk), among others. In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is merely a logical division, and the actual implementation may have another division, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the indirect coupling or direct coupling or communication connection between each other shown or discussed may be through some interfaces, indirect coupling or communication connection of devices or units, and may be electrical or in other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in the form of hardware, or may also be implemented in the form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application may be substantially implemented or contributed to by the prior art, or all or part of the technical solution may be embodied in a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage media may include, for example: various media capable of storing program codes, such as a usb disk, a removable hard disk, a read-only memory (ROM), a Random Access Memory (RAM), a magnetic disk, and an optical disk.

Claims (20)

1. An active caching method for a travel platform, comprising:
when the query entry is a public-cause query entry, acquiring, by the travel platform from the travel application platform, the not-yet-departed travel information of a user;
predicting, according to a public-cause active caching policy, a public-cause actively cached travel line matching the user's not-yet-departed travel information, the travel line including one or more of: a flight line for taking an airplane, a train line for taking a high-speed rail, a shift line for taking an automobile or a ship line for taking a ship;
and acquiring, according to a public-cause active refresh policy, data corresponding to the public-cause actively cached travel line from a supplier and storing the data in a cache.
2. The active caching method of claim 1, further comprising:
predicting the first information according to a privacy-related prediction strategy; the first information includes: the hot line of travel and the refreshing frequency of the hot line of travel due to private search;
and acquiring data corresponding to the hot travel line according to the first information and storing the data into the cache.
3. The active caching method of claim 2,
the first information further includes: an upper limit on the number of refreshes from the vendor.
4. The active caching method of any one of claims 1 to 3, wherein predicting, according to the public-cause active caching policy, the public-cause actively cached travel line matching the user's not-yet-departed travel information comprises:
acquiring historical travel information of the user from a public-cause database;
and predicting, according to the historical travel information of the user, the public-cause actively cached travel line matching the user's not-yet-departed travel information.
5. The active caching method of claim 4, wherein predicting, according to the public-cause active caching policy, the public-cause actively cached travel line matching the user's not-yet-departed travel information further comprises:
when the historical travel information of the user does not exist in the public database, determining the preference information of other users according to the historical travel information of other users stored in the public database;
and predicting the business trip line cached by the public initiative matched with the business trip information of the user without departure according to the preference information of the other users.
6. The active caching method according to any one of claims 1 to 5, wherein acquiring, according to the public-cause active refresh policy, the data corresponding to the public-cause actively cached travel line from a supplier and storing the data in a cache comprises:
when the data corresponding to the public-cause actively cached travel line changes, acquiring the data corresponding to that travel line from the supplier in real time and storing it in the cache.
7. The active caching method according to any one of claims 2 to 6,
in terms of predicting the refresh frequency of the hot trip line, the privacy-related prediction strategy specifically includes: and adjusting the refreshing frequency of the hot travel line according to the historical change times of the hot travel line.
8. The active caching method according to any one of claims 3 to 6,
in terms of predicting the upper limit of the number of supplier refreshes, the private-cause prediction policy specifically comprises: predicting the supplier's PNR count according to the supplier's historical passenger name record (PNR) counts, and determining the upper limit of the supplier's number of refreshes by combining the supplier's look-to-book ratio with the predicted PNR count.
9. A travel inquiry method is characterized by comprising the following steps:
when the query entry is a public-cause query entry, acquiring, by the travel platform from the travel application platform, the not-yet-departed travel information of a user;
predicting, according to a public-cause active caching policy, a public-cause actively cached travel line matching the user's not-yet-departed travel information; the travel line includes one or more of: a flight line for taking an airplane, a train line for taking a high-speed rail, a shift line for taking an automobile or a ship line for taking a ship;
and acquiring, according to a public-cause active refresh policy, data corresponding to the public-cause actively cached travel line from a supplier and storing the data in a cache.
10. The travel query method of claim 9, further comprising:
predicting the first information according to a privacy-related prediction strategy; the first information includes: the hot line of travel and the refreshing frequency of the hot line of travel due to private search;
and acquiring data corresponding to the hot travel line according to the first information and storing the data into the cache.
11. The travel query method of claim 10,
the first information further includes: an upper limit on the number of refreshes from the vendor.
12. The travel query method of any one of claims 9 to 11, wherein after obtaining the data corresponding to the public-cause actively cached travel line from the supplier according to the public-cause active refresh policy and storing the data in the cache, the method further comprises:
when the user triggers the public inquiry operation in the public inquiry interface, searching whether data matched with the public inquiry operation exist in the cache;
if so, displaying the data matched with the public inquiry operation; if not, a first real-time query request is sent to a supplier according to a query condition, data corresponding to the travel line of the public and passive cache matched with the first real-time query request are obtained and displayed, and data corresponding to the travel line of the public and passive cache are obtained from the supplier according to a public and passive refreshing strategy and are stored in the cache.
13. The travel inquiry method of claim 10 or 11, wherein when the inquiry portal is a private inquiry portal, the method further comprises:
when the user triggers a private query operation in a private query interface, searching whether data matched with the private query operation exist in the cache;
if so, displaying the data matched with the private query operation; if not, a second real-time query request is sent to a supplier according to a query condition, data corresponding to the private-passive cache travel line matched with the second real-time query request are obtained and displayed, and data corresponding to the private-passive cache travel line are obtained from the supplier according to a private-passive refreshing strategy and are stored in the cache.
14. The travel query method of any one of claims 9 to 13, wherein predicting, according to the public-cause active caching policy, the public-cause actively cached travel line matching the user's not-yet-departed travel information comprises:
acquiring historical travel information of the user from a public-cause database;
and predicting, according to the historical travel information of the user, the public-cause actively cached travel line matching the user's not-yet-departed travel information.
15. The travel query method of claim 14, wherein predicting, according to the public-cause active caching policy, the public-cause actively cached travel line matching the user's not-yet-departed travel information further comprises:
when the historical travel information of the user does not exist in the public database, determining the preference information of other users according to the historical travel information of other users stored in the public database;
and predicting the business trip line cached by the public initiative matched with the business trip information of the user without departure according to the preference information of the other users.
16. The travel query method of any one of claims 9 to 15, wherein acquiring, according to the public-cause active refresh policy, the data corresponding to the public-cause actively cached travel line from a supplier and storing the data in a cache comprises:
when the data of the public-cause actively cached travel line changes, acquiring the data corresponding to that travel line from the supplier in real time and storing it in the cache.
17. The travel inquiry method according to any one of claims 10 to 16,
in terms of predicting the refresh frequency of the hot trip line, the privacy-related prediction strategy specifically includes: and adjusting the refreshing frequency of the hot travel line according to the historical change times of the hot travel line.
18. The travel inquiry method of any one of claims 11 to 17,
in terms of predicting the upper limit of the number of supplier refreshes, the private-cause prediction policy specifically comprises: predicting the supplier's PNR count according to the supplier's historical passenger name record (PNR) counts, and determining the upper limit of the supplier's number of refreshes by combining the supplier's look-to-book ratio with the predicted PNR count.
19. A travel platform, comprising: a communication interface, a processor, a memory, and a bus, wherein,
the communication interface is used for receiving travel inquiry operation from the client and communicating with the business trip application platform;
the memory for storing executable program code;
the processor, configured to execute a program corresponding to the executable program code by reading the executable program code stored in the memory, for performing the method according to any one of claims 1 to 18.
20. A computer-readable storage medium, in which a computer program is stored, the computer program being executed by hardware to implement the method of any one of claims 1 to 18.
CN202010092215.6A 2020-02-13 2020-02-13 Active caching method of travel platform, travel query method and related products Active CN111339122B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010092215.6A CN111339122B (en) 2020-02-13 2020-02-13 Active caching method of travel platform, travel query method and related products

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010092215.6A CN111339122B (en) 2020-02-13 2020-02-13 Active caching method of travel platform, travel query method and related products

Publications (2)

Publication Number Publication Date
CN111339122A true CN111339122A (en) 2020-06-26
CN111339122B CN111339122B (en) 2023-11-10

Family

ID=71185144

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010092215.6A Active CN111339122B (en) 2020-02-13 2020-02-13 Active caching method of travel platform, travel query method and related products

Country Status (1)

Country Link
CN (1) CN111339122B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120330693A1 (en) * 2011-06-27 2012-12-27 Damien Ciabrini Method and system for a pre-shopping reservation system with increased search efficiency
CN107291901A (en) * 2017-06-23 2017-10-24 携程旅游网络技术(上海)有限公司 Main passive combination obtains method, system, equipment and the storage medium of air ticket data
CN109558545A (en) * 2019-01-07 2019-04-02 北京三快在线科技有限公司 Information recommendation method, device, electronic equipment and readable storage medium storing program for executing
CN109783545A (en) * 2019-01-24 2019-05-21 深圳市活力天汇科技股份有限公司 A kind of air ticket real-time recommendation method

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111970269A (en) * 2020-08-14 2020-11-20 中国民航信息网络股份有限公司 Server access behavior identification method and device and server
CN111970269B (en) * 2020-08-14 2022-04-08 中国民航信息网络股份有限公司 Server access behavior identification method and device and server
CN112507205A (en) * 2020-10-14 2021-03-16 中国民航信息网络股份有限公司 Data management method and device
CN112507205B (en) * 2020-10-14 2024-03-29 中国民航信息网络股份有限公司 Data management method and device

Also Published As

Publication number Publication date
CN111339122B (en) 2023-11-10

Similar Documents

Publication Publication Date Title
CN104169950B (en) Utilize the Database Systems of the calculating towards batch processing
JP6557662B2 (en) Method and server for providing fare availability, eg air fare availability
JP5433640B2 (en) Information providing apparatus, information providing method, information providing program, and recording medium
EP2840543A1 (en) Contextualized travel offers
JP6473158B2 (en) Method and server for providing a set of quoted prices, eg airfare price quotes
US10332038B2 (en) Travel inventory demand modeling
US20080168093A1 (en) Providing travel information using a layered cache
WO2013082151A1 (en) Layover management system and method
US20090271226A1 (en) Cache poller for providing travel planning information
US20150286960A1 (en) Media input reservation system
US20160180256A1 (en) History-based probability forecasting
WO2008086150A2 (en) Providing travel information using a layered cache
CN111339122A (en) Active caching method of travel platform, travel query method and related products
Gutiérrez et al. High-speed rail, tourists’ destination choice and length of stay: A survival model analysis
CN112925974A (en) Optimized product determination system
CN112215453A (en) Inventory information processing method and device, electronic equipment and storage medium
KR20160034223A (en) Corporate recognition for travel related services
CN107506435B (en) Special price air ticket query method based on price prediction
AU2015201731A1 (en) Media input reservation system
CA2859643A1 (en) Contextualized travel offers
CN110766420A (en) Service information determination method and device
EP3046054A1 (en) Travel inventory demand modeling
US20230376499A1 (en) Predictive data source selection for request handling systems
US20150106137A1 (en) Enhanced Content Delivery Systems and Methods
CN103635928B (en) Information provider unit, information providing method

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant