US20170228424A1 - Prefetching for computing airline routes - Google Patents
- Publication number
- US20170228424A1 (application US15/016,373)
- Authority
- US
- United States
- Prior art keywords
- engine
- routings
- fetching
- airline
- request
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/24—Querying
- G06F16/245—Query processing
- G06F16/2453—Query optimisation
- G06F16/24534—Query rewriting; Transformation
- G06F16/24539—Query rewriting; Transformation using cached or materialised query results
- G06F17/30457
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/24—Querying
- G06F16/245—Query processing
- G06F16/2455—Query execution
- G06F16/24552—Database cache management
- G06F17/3048
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/02—Reservations, e.g. for tickets, services or events
- G06Q10/025—Coordination of plural reservations, e.g. plural trip segments, transportation combined with accommodation
Definitions
- The instant disclosure relates to airline management. More specifically, this disclosure relates to the computing of airline routes.
- Identifying routes between an origin and a destination is an important part of routing a package or seating a passenger. One system for identifying routes is shown in FIG. 1A. A system 150 may include an AirCore or other computing system 152 that accesses a routings engine 154.
- a computing system 152 may generate a request for routes between San Francisco and New York City at a particular time and date.
- the routings engine 154 searches a listing of flights, such as stored in a database, and returns a response containing a listing of flight paths that match San Francisco and New York City for approximately the particular time and date.
- the number of available routes from the origin to the destination can be nearly endless when considering more than only direct routes. For example, a trip from San Francisco to New York City can be flown directly, or can be flown through Chicago, Dallas, Nashville, Seattle, etc.
- the permutations through possible routes from the origin to the destination may take several seconds to traverse and generate a list for display to a user. These calculations, particularly during peak times when many users are accessing the computing system to search routes, may cause unpleasant delays for users of the computing system for determining flight routes.
- One conventional solution to enhancing the experience of a user searching for routes is to cache certain results. A first query to the routings engine 154 for flights from San Francisco to New York City may take a long period of time, such as several seconds, to execute. The user is then provided the results in a response from the routings engine 154. These results may also be cached in memory.
- a second query to the routings engine 154 for the same San Francisco to New York City path at the same time and date may return results from the cache, and thus be returned to the user in a response in a much shorter period of time, such as fractions of a second.
- One conventional application of caching in the routings engine is shown in FIG. 1B.
- FIG. 1B is a flow chart illustrating caching of flight routes in a routing engine.
- a method 100 begins with receiving a route request at block 102 and determining whether the results of the response are present in the cache at block 104 . If so, the response is loaded from the cache at block 110 , the response framed in a response message at block 112 , and the response sent at block 114 . If the request is not present in the cache at block 104 , then the method 100 processes routes for the request at block 106 , stores the request and response into the cache at block 108 , frames the response in a message at block 112 , and sends the response at block 114 .
- Caching may improve the response time of repeated, identical requests to the routings engine. Although caching improves the response time in some situations, the first user making a request that is not stored in cache must still wait a relatively long duration of time before receiving a response. This long duration of time creates an unpleasant experience for the user.
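The FIG. 1B flow (blocks 102 through 114) can be sketched as follows; the request key, the dictionary cache, and the `process_routes` stand-in are illustrative assumptions, not the patent's implementation:

```python
def process_routes(request):
    # Stand-in for the expensive route computation at block 106.
    origin, destination, journey_date = request
    return [f"{origin}->{destination} direct on {journey_date}"]

def handle_route_request(request, cache):
    """Return a framed response, serving from cache when possible."""
    if request in cache:                       # block 104: results in cache?
        response = cache[request]              # block 110: load from cache
    else:
        response = process_routes(request)     # block 106: compute routes
        cache[request] = response              # block 108: store in cache
    return {"routes": response}                # blocks 112/114: frame and send

cache = {}
req = ("SFO", "JFK", "2016-02-05")
first = handle_route_request(req, cache)   # computed, then cached
second = handle_route_request(req, cache)  # served from cache
```

The second call returns the same response without recomputation, which is the benefit and the limitation the surrounding text describes: only the first requester pays the full cost.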
- a user experience for searching and identifying flight routes from an origin to a destination location may be enhanced by providing pre-fetching of flight routes based on predictions of user requests for particular flight routes.
- a server may be configured to perform pre-fetching of flight routes between predicted origin and destination locations and store the pre-fetched flight routes in a cache. When a user queries for flight routes between an origin and a destination, the results may be retrieved from the cache, if available, and the user provided a much quicker response to the query.
- the pre-fetching of flight routes may pre-fetch predicted flight routes based on historical data regarding queries being executed against a database of flight routes.
- a method may include the steps of receiving, at a routings engine from an airline reservation system, a request for first airline routes between an origin and a destination based on specified route parameters; storing, by the routings engine, the request in a historical record of requests; fetching, by the routings engine, first airline routes matching the origin, the destination, and the specified route parameters; and pre-fetching, by the routings engine, second airline routes based, at least in part, on the historical record of requests.
- a computer program product may include a non-transitory medium having instructions, when executed by a processor of a computing system, cause the processor to perform the steps of receiving, at a routings engine from an airline reservation system, a request for first airline routes between an origin and a destination based on specified route parameters; storing, by the routings engine, the request in a historical record of requests; fetching, by the routings engine, first airline routes matching the origin, the destination, and the specified route parameters; and pre-fetching, by the routings engine, second airline routes based, at least in part, on the historical record of requests.
- an apparatus may include a processor and a memory coupled to the processor.
- the processor may be configured to perform the steps of receiving, at a routings engine from an airline reservation system, a request for first airline routes between an origin and a destination based on specified route parameters; storing, by the routings engine, the request in a historical record of requests; fetching, by the routings engine, first airline routes matching the origin, the destination, and the specified route parameters; and pre-fetching, by the routings engine, second airline routes based, at least in part, on the historical record of requests.
- FIG. 1A is a conventional system for responding to route requests.
- FIG. 1B is a flow chart illustrating caching of flight routes in a routing engine.
- FIG. 2 is a flow chart illustrating operation of a routings engine with pre-fetching enabled when responding to a route request according to one embodiment of the disclosure.
- FIG. 3 is a flow chart illustrating a pre-fetching operation according to one embodiment of the disclosure.
- FIG. 4 is a flow chart illustrating a method of pre-fetching at specific times according to one embodiment of the disclosure.
- FIG. 5 is a block diagram illustrating a database table for recording route requests for use during pre-fetching according to one embodiment of the disclosure.
- FIG. 6 is a flow chart illustrating a method of pre-fetching routes according to one embodiment of the disclosure.
- FIG. 7 is a block diagram illustrating a computer network according to one embodiment of the disclosure.
- FIG. 8 is a block diagram illustrating a computer system according to one embodiment of the disclosure.
- FIG. 2 is a flow chart illustrating operation of a routings engine with pre-fetching enabled when responding to a route request according to one embodiment of the disclosure.
- a method 200 begins with receiving a route request at block 202 and determining whether the results of the response are present in the cache at block 204. If so, the response is loaded from the cache at block 212, the response framed in a response message at block 214, and the response sent at block 216. If the request is not present in the cache at block 204, then the method 200 processes routes for the request at block 206.
- the route request may then be sent to a pre-fetch engine at block 208 , which may be integrated with the routings engine, where the pre-fetch engine may store the request and use the set of stored requests to predict future route requests.
- a route request may only be forwarded to the pre-fetch engine if processing the request at block 206 exceeded a predetermined threshold amount of time. In this method of operation, only long-running route requests are pre-fetched. For example, any route request that takes longer than one second to process at block 206 may be forwarded to the pre-fetching engine.
- the request and response may be stored into the cache. Then, at block 214 , the method 200 may include framing the response in a message, and sending the response at block 216 .
- the route requests may be sent to the pre-fetch engine and processed by the pre-fetch engine in parallel with responding to the route requests.
- the pre-fetching engine may process requests at certain intervals, such as intervals defined by a scheduled timer.
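The method-200 path with threshold-based forwarding (blocks 204 through 216) might look like the following sketch; the queue, the one-second default taken from the example above, and the compute callback are assumptions:

```python
import queue
import time

# assumed, tunable value following the one-second example in the text
PREFETCH_THRESHOLD_SECONDS = 1.0

prefetch_queue = queue.Queue()  # consumed by the pre-fetch engine in parallel

def handle_request_with_prefetch(request, cache, compute,
                                 threshold=PREFETCH_THRESHOLD_SECONDS):
    """Serve a route request, forwarding long-running requests for pre-fetch."""
    if request in cache:                      # block 204: cache hit
        return cache[request]                 # block 212: load from cache
    start = time.monotonic()
    response = compute(request)               # block 206: compute routes
    elapsed = time.monotonic() - start
    if elapsed > threshold:                   # forward only long-running requests
        prefetch_queue.put(request)           # block 208: send to pre-fetch engine
    cache[request] = response                 # block 210: store in cache
    return response                           # blocks 214/216: frame and send
```

Using a thread-safe queue lets the pre-fetch engine consume requests in parallel with normal request handling, as the text describes.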
- FIG. 3 is a flow chart illustrating a pre-fetching operation according to one embodiment of the disclosure.
- a method 300 may begin at block 304 with a trigger event being received to process received route requests.
- the pre-fetch engine determines if any new requests have been received from the routings engine (RE). If not, the method 300 proceeds to block 316 to delay for a duration of time and return to block 306 to check for new requests.
- the method 300 continues to block 308 to update a data object to include the received request.
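The polling loop of method 300 (blocks 306, 308, and 316) can be sketched as below; the batch source, the `Counter` statistics object, and the cycle limit are assumptions for illustration:

```python
import time
from collections import Counter

def run_prefetch_cycle(fetch_new_requests, stats, poll_delay=0.01, max_cycles=3):
    """Poll for new route requests (block 306); on none, delay (block 316);
    otherwise fold them into the request-statistics object (block 308)."""
    for _ in range(max_cycles):
        new_requests = fetch_new_requests()
        if not new_requests:
            time.sleep(poll_delay)       # block 316: wait, then re-check
            continue
        stats.update(new_requests)       # block 308: update the data object
    return stats

# simulated batches arriving from the routings engine
batches = [[], [("SFO", "JFK")], [("SFO", "JFK"), ("SFO", "ORD")]]
stats = run_prefetch_cycle(lambda: batches.pop(0) if batches else [], Counter())
```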
- FIG. 4 is a flow chart illustrating a method of pre-fetching at specific times according to one embodiment of the disclosure.
- a configuration may be read, such as from a database table named CONFIGURATION_PARAMETERS.
- a frequency value of historical request information may be read from the configuration.
- the pre-fetch engine may identify one or more of a busy time, an idle time, a weighted arithmetic mean (WAM) for historical request time, and a next interval.
- schedule information may be retrieved from the REQUEST_DETAILS table of FIG. 3 based on a frequency identified at block 402 .
- a busy time value may be identified as a mean between a total WAM and a highest time, and an idle time value may be identified as a mean between the total WAM and a lowest time. Then, at block 406, all route requests whose response time is greater than a busy time threshold value may be collected.
- the busy time threshold value may set a value that indicates that a particular route request is predicted to be repeated. For example, if a busy time threshold value is set to four and the historical records show five requests made for San Francisco to New York City routes over a certain duration of time, then the request for San Francisco to New York City routes may be marked for pre-fetching at the next interval by queueing the request at block 408 .
- the busy time threshold value may be predetermined and set by an administrator or may be adaptive and set to, for example, a value to cache a certain percentage of requests, such as top 25% busiest requests.
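The busy/idle threshold arithmetic described above can be sketched as follows, assuming equal weights and an illustrative list of historical response times:

```python
def weighted_arithmetic_mean(times, weights):
    return sum(t * w for t, w in zip(times, weights)) / sum(weights)

def busy_idle_thresholds(response_times, weights):
    """Busy = mean of total WAM and highest time; idle = mean of WAM and lowest."""
    wam = weighted_arithmetic_mean(response_times, weights)
    busy = (wam + max(response_times)) / 2
    idle = (wam + min(response_times)) / 2
    return busy, idle

times = [0.2, 0.5, 3.0, 6.0]   # seconds per historical request (illustrative)
weights = [1, 1, 1, 1]         # equal weights reduce the WAM to the plain mean
busy, idle = busy_idle_thresholds(times, weights)

# requests slower than the busy threshold become pre-fetch candidates (block 406)
candidates = [t for t in times if t > busy]
```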
- a pre-fetch engine may determine whether a current date is present in a list of critical dates and may execute a process when it is.
- the critical dates may be obtained from a DB table named CRITICAL_DATE. This table may indicate that on a given date, one can expect an increase in travel requests from a given origin and/or to a given destination. Additional details are described below.
- the process may include processing all requests from the historical record of route requests for a certain origin-destination (OD) pair, if both are given; processing all requests from the historical record with the given origin, if only an origin is given; and/or processing all requests from the historical record with the given destination, if only the destination is given.
- All route requests identified as candidates for pre-fetching during blocks 402 , 404 , and 406 may be queued at block 408 for pre-fetching at a next interval.
- the next interval value may specify when to execute the route requests to pre-fetch and store results in the cache.
- the next interval value may be selected to be a time when the routings engine (RE) is known to be operating well below its capacity to respond to route requests.
- the data for certain predicted routes may be pre-fetched and stored in a cache when the routings engine is not busy. This pre-fetching may create little negative impact to the users of the routing engine while providing increased response to frequently performed route requests.
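Draining the queued requests at the next interval, only while the routings engine is below capacity, might be sketched as follows; the idleness check and the compute callback are assumptions:

```python
import queue

def drain_prefetch_queue(pending, cache, compute, engine_is_idle):
    """Execute queued route requests (block 408) and store results in the
    cache, but only while the routings engine is operating below capacity."""
    executed = []
    while not pending.empty() and engine_is_idle():
        request = pending.get()
        cache[request] = compute(request)   # pre-fetch and cache the routes
        executed.append(request)
    return executed

pending = queue.Queue()
pending.put(("SFO", "JFK"))
pending.put(("SFO", "ORD"))
cache = {}
done = drain_prefetch_queue(pending, cache,
                            lambda r: [f"{r[0]}->{r[1]}"],
                            lambda: True)   # assume the engine is idle
```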
- route requests received during a busy time, such as a time period when demand on the routings engine is above a mean between a total weighted arithmetic mean and a highest load.
- route requests received during an idle time such as a time period when demand on the routings engine is below a mean between a total weighted arithmetic mean and a lowest load.
- predictions for route requests that are pre-fetched may be based on the day. For example, route requests may be analyzed to determine common origin-destination pairs requests on Mondays and pre-fetch route requests for those common origin-destination pairs on Monday. In one example situation, many flyers may be returning home from vacation cities on Mondays and thus route requests specifying non-vacation oriented cities may be cached.
- unique days such as holidays that fall on different days each year, may be scheduled into the pre-fetch engine and used to select routes for pre-fetching.
- the server may be programmed with the dates of holidays, such as Labor Day, in the future and past, such that the pre-fetch engine may predict likely flights around such holidays or other special days.
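The critical-date selection (a full OD pair, origin only, or destination only) can be sketched with a `*` wildcard convention; the tuple layout models the CRITICAL_DATE entries described below, and the record format is an assumption:

```python
import datetime

CRITICAL_DATES = [
    ("1", datetime.date(2014, 12, 22), "*", "MCO"),   # inbound to Orlando
    ("2", datetime.date(2014, 12, 26), "MCO", "*"),   # outbound from Orlando
]

def requests_to_prefetch(today, historical_requests):
    """Select historical requests matching any critical-date entry for today."""
    selected = []
    for _, date, c_origin, c_dest in CRITICAL_DATES:
        if date != today:
            continue
        for origin, dest in historical_requests:
            # '*' matches any origin or any destination
            if c_origin in ("*", origin) and c_dest in ("*", dest):
                selected.append((origin, dest))
    return selected

history = [("SFO", "MCO"), ("JFK", "MCO"), ("MCO", "SFO")]
picks = requests_to_prefetch(datetime.date(2014, 12, 22), history)
```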
- FIG. 5 is a block diagram illustrating a database table for recording route requests for use during pre-fetching according to one embodiment of the disclosure.
- One table 502 such as a CONFIGURATION_PARAMETERS table, may include fields for start time, daily frequency, weekly frequency, monthly frequency, yearly frequency, and/or sort criteria.
- the daily frequency, weekly frequency, monthly frequency, and yearly frequency values may be Boolean values indicating a schedule for re-evaluating which routes to pre-fetch. For example, if the weekly frequency value is set to TRUE, then the pre-fetch engine may re-evaluate each week the list of route requests to pre-fetch.
- Another table 504 may include fields for serial, critical date, critical origin, and/or critical destination.
- this table may be entered by a user based on experience and/or automatically generated based on statistical approaches. For example, during the Christmas holiday season there may be an increase in bookings to Orlando, Fla. on December 22 and 23, indicated by the entries CRITICAL_DATE {'1', 22 Dec 2014, *, MCO} and CRITICAL_DATE {'1', 23 Dec 2014, *, MCO}. Because the 22nd and 23rd are important travel dates, it may be anticipated that these requests could come at least a few weeks earlier, such as from December 1 onward.
- the system may be updated with requests for routes required on the 22nd and 23rd from December 1 onward, by prefetching each day from the 1st until the 23rd.
- prefetch may obtain routes departing from Orlando, Fla. on December 26, such as with the entry CRITICAL_DATE {'1', 26 Dec 2014, MCO, *}.
- Other examples of known events for prefetching may include, for example, football league events.
- a further table 506 may include fields for serial, request date, request time, origin, destination, journey date, and computation time.
- the table 506 may, for example, store the historical records of a selection of route requests received at the routings engine. Each request may include an identifier value (serial), the request date and time, the origin and destination of the request, route parameters such as journey date, and a computation time of when results for that route request were stored in the cache.
- the computation time field may allow, for example, the pre-fetch engine to delete stale route response computations from the cache.
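A table-506 record and stale-entry eviction keyed on the computation time field might be sketched as below; the 24-hour retention window and the in-memory cache keyed by serial are illustrative assumptions:

```python
import datetime
from dataclasses import dataclass

@dataclass
class RequestDetail:
    # fields follow the table 506 layout in FIG. 5
    serial: int
    request_date: datetime.date
    request_time: datetime.time
    origin: str
    destination: str
    journey_date: datetime.date
    computation_time: datetime.datetime  # when results were stored in the cache

def evict_stale(cache, details, now, max_age=datetime.timedelta(hours=24)):
    """Delete cached responses whose computation time is older than max_age."""
    for d in details:
        if now - d.computation_time > max_age:
            cache.pop(d.serial, None)
    return cache

now = datetime.datetime(2014, 12, 23, 12, 0)
details = [
    RequestDetail(1, datetime.date(2014, 12, 22), datetime.time(6, 0),
                  "SFO", "MCO", datetime.date(2014, 12, 26),
                  now - datetime.timedelta(hours=30)),   # stale
    RequestDetail(2, datetime.date(2014, 12, 23), datetime.time(11, 0),
                  "JFK", "MCO", datetime.date(2014, 12, 26),
                  now - datetime.timedelta(hours=1)),    # fresh
]
cache = evict_stale({1: ["stale routes"], 2: ["fresh routes"]}, details, now)
```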
- FIG. 6 is flow chart illustrating a method of pre-fetching routes according to one embodiment of the disclosure.
- a method 600 begins at block 602 with receiving, at a routings engine from an airline reservation system, a request for first airline routes between an origin and a destination based on specified route parameters. Then at block 604 , the method 600 continues with storing, by the routings engine, the request in a historical record of requests. At block 606 , the method 600 continues with fetching, by the routings engine, first airline routes matching the origin, the destination, and the specified route parameters. Then, at block 608 , the method 600 continues with pre-fetching, by the routings engine, second airline routes based, at least in part, on the historical record of requests.
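The four steps of method 600 can be sketched end to end; the route database, the journey-date key, and the repeat-count prediction rule are illustrative assumptions:

```python
from collections import Counter

class RoutingsEngine:
    """Minimal sketch of method 600; internals are assumptions."""

    def __init__(self, route_db):
        self.route_db = route_db   # (origin, destination, journey_date) -> routes
        self.history = Counter()   # historical record of requests
        self.cache = {}

    def handle(self, origin, destination, journey_date):
        key = (origin, destination, journey_date)       # block 602: receive request
        self.history[key] += 1                          # block 604: store in history
        routes = self.route_db.get(key, [])             # block 606: fetch first routes
        self.prefetch()                                 # block 608: pre-fetch second routes
        return routes

    def prefetch(self, min_count=2):
        # pre-fetch routes for requests repeated at least min_count times
        for key, count in self.history.items():
            if count >= min_count and key not in self.cache:
                self.cache[key] = self.route_db.get(key, [])

db = {("SFO", "JFK", "2016-02-05"): ["SFO->JFK direct"]}
engine = RoutingsEngine(db)
engine.handle("SFO", "JFK", "2016-02-05")
engine.handle("SFO", "JFK", "2016-02-05")  # repetition makes the pair a candidate
```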
- FIG. 7 illustrates one embodiment of an information system 700, including a system for responding to flight requests.
- the system 700 may include a server 702 , a data storage device 706 , a network 708 , and a user interface device 710 .
- the system 700 may include a storage controller 704 , or storage server configured to manage data communications between the data storage device 706 and the server 702 or other components in communication with the network 708 .
- the storage controller 704 may be coupled to the network 708 .
- the user interface device 710 is referred to broadly and is intended to encompass a suitable processor-based device such as a desktop computer, a laptop computer, a personal digital assistant (PDA) or tablet computer, a smartphone, or other mobile communication device having access to the network 708 .
- the user interface device 710 may access the Internet or other wide area or local area network to access a web application or web service hosted by the server 702 and may provide a user interface for controlling the information system.
- the network 708 may facilitate communications of data between the server 702 and the user interface device 710 .
- the network 708 may include any type of communications network including, but not limited to, a direct PC-to-PC connection, a local area network (LAN), a wide area network (WAN), a modem-to-modem connection, the Internet, a combination of the above, or any other communications network now known or later developed within the networking arts which permits two or more computers to communicate.
- FIG. 8 illustrates a computer system 800 adapted according to certain embodiments of the server 702 and/or the user interface device 710 .
- the central processing unit (“CPU”) 802 is coupled to the system bus 804 . Although only a single CPU is shown, multiple CPUs may be present.
- the CPU 802 may be a general purpose CPU or microprocessor, graphics processing unit (“GPU”), and/or microcontroller. The present embodiments are not restricted by the architecture of the CPU 802 so long as the CPU 802 , whether directly or indirectly, supports the operations as described herein.
- the CPU 802 may execute the various logical instructions according to the present embodiments.
- the computer system 800 may also include random access memory (RAM) 808, which may be static RAM (SRAM), dynamic RAM (DRAM), synchronous dynamic RAM (SDRAM), or the like.
- the computer system 800 may utilize RAM 808 to store the various data structures used by a software application.
- the computer system 800 may also include read only memory (ROM) 806 which may be PROM, EPROM, EEPROM, optical storage, or the like.
- the ROM may store configuration information for booting the computer system 800 .
- the RAM 808 and the ROM 806 hold user and system data, and both the RAM 808 and the ROM 806 may be randomly accessed.
- the computer system 800 may also include an input/output (I/O) adapter 810 , a communications adapter 814 , a user interface adapter 816 , and a display adapter 822 .
- the I/O adapter 810 and/or the user interface adapter 816 may, in certain embodiments, enable a user to interact with the computer system 800 .
- the display adapter 822 may display a graphical user interface (GUI) associated with a software or web-based application on a display device 824 , such as a monitor or touch screen.
- the I/O adapter 810 may couple one or more storage devices 812 , such as one or more of a hard drive, a solid state storage device, a flash drive, a compact disc (CD) drive, a floppy disk drive, and a tape drive, to the computer system 800 .
- the data storage 812 may be a separate server coupled to the computer system 800 through a network connection to the I/O adapter 810 .
- the communications adapter 814 may be adapted to couple the computer system 800 to the network 708 , which may be one or more of a LAN, WAN, and/or the Internet.
- the user interface adapter 816 couples user input devices, such as a keyboard 820 , a pointing device 818 , and/or a touch screen (not shown) to the computer system 800 .
- the keyboard 820 may be an on-screen keyboard displayed on a touch panel.
- the display adapter 822 may be driven by the CPU 802 to control the display on the display device 824 . Any of the devices 802 - 822 may be physical and/or logical.
- the applications of the present disclosure are not limited to the architecture of computer system 800 .
- the computer system 800 is provided as an example of one type of computing device that may be adapted to perform the functions of the server 702 and/or the user interface device 710 .
- any suitable processor-based device may be utilized including, without limitation, personal data assistants (PDAs), tablet computers, smartphones, computer game consoles, and multi-processor servers.
- the systems and methods of the present disclosure may be implemented on application specific integrated circuits (ASIC), very large scale integrated (VLSI) circuits, or other circuitry.
- persons of ordinary skill in the art may utilize any number of suitable structures capable of executing logical operations according to the described embodiments.
- the computer system may be virtualized for access by multiple users and/or applications.
- Computer-readable media includes physical computer storage media.
- a storage medium may be any available medium that can be accessed by a computer.
- such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer.
- Disk and disc include compact discs (CD), laser discs, optical discs, digital versatile discs (DVD), floppy disks, and Blu-ray discs. Generally, disks reproduce data magnetically, and discs reproduce data optically. Combinations of the above should also be included within the scope of computer-readable media. Additionally, the firmware and/or software may be executed by processors integrated with the components described above.
- instructions and/or data may be provided as signals on transmission media included in a communication apparatus.
- a communication apparatus may include a transceiver having signals indicative of instructions and data. The instructions and data are configured to cause one or more processors to implement the functions outlined in the claims.
Description
- The foregoing has outlined rather broadly the features and technical advantages of the present invention in order that the detailed description of the invention that follows may be better understood. Additional features and advantages of the invention will be described hereinafter that form the subject of the claims of the invention. It should be appreciated by those skilled in the art that the conception and specific embodiment disclosed may be readily utilized as a basis for modifying or designing other structures for carrying out the same purposes of the present invention. It should also be realized by those skilled in the art that such equivalent constructions do not depart from the spirit and scope of the invention as set forth in the appended claims. The novel features that are believed to be characteristic of the invention, both as to its organization and method of operation, together with further objects and advantages will be better understood from the following description when considered in connection with the accompanying figures. It is to be expressly understood, however, that each of the figures is provided for the purpose of illustration and description only and is not intended as a definition of the limits of the present invention.
- For a more complete understanding of the disclosed system and methods, reference is now made to the following descriptions taken in conjunction with the accompanying drawings.
-
FIG. 1A illustrates a conventional system for responding to route requests. -
FIG. 1B is a flow chart illustrating caching of flight routes in a routings engine. -
FIG. 2 is a flow chart illustrating operation of a routings engine with pre-fetching enabled when responding to a route request according to one embodiment of the disclosure. -
FIG. 3 is a flow chart illustrating a pre-fetching operation according to one embodiment of the disclosure. -
FIG. 4 is a flow chart illustrating a method of pre-fetching at specific times according to one embodiment of the disclosure. -
FIG. 5 is a block diagram illustrating a database table for recording route requests for use during pre-fetching according to one embodiment of the disclosure. -
FIG. 6 is a flow chart illustrating a method of pre-fetching routes according to one embodiment of the disclosure. -
FIG. 7 is a block diagram illustrating a computer network according to one embodiment of the disclosure. -
FIG. 8 is a block diagram illustrating a computer system according to one embodiment of the disclosure. -
FIG. 2 is a flow chart illustrating operation of a routings engine with pre-fetching enabled when responding to a route request according to one embodiment of the disclosure. A method 200 begins with receiving a route request at block 202 and determining whether the results of the response are present in the cache at block 204. If so, the response is loaded from the cache at block 212, framed in a response message at block 214, and sent at block 216. If the request is not present in the cache at block 204, then the method 200 processes routes for the request at block 206. The route request may then be sent to a pre-fetch engine at block 208, which may be integrated with the routings engine, where the pre-fetch engine may store the request and use the set of stored requests to predict future route requests. In one embodiment, a route request may only be forwarded to the pre-fetch engine if processing the request at block 206 exceeded a predetermined threshold amount of time. In this mode of operation, only long-running route requests are pre-fetched. For example, any route request that takes longer than one second to process at block 206 may be forwarded to the pre-fetching engine. At block 210, the request and response may be stored in the cache. Then, at block 214, the method 200 may include framing the response in a message, and sending the response at block 216. - The route requests may be sent to the pre-fetch engine and processed by the pre-fetch engine in parallel with responding to the route requests. In one embodiment, the pre-fetching engine may process requests at certain intervals, such as intervals defined by a scheduled timer.
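The cache-first flow of method 200 can be sketched as follows. This is a minimal illustration, not the patented implementation; the class, parameter, and request-field names (`route_processor`, `prefetch_engine.record`, `journey_date`) are assumptions.

```python
import time

SLOW_REQUEST_THRESHOLD = 1.0  # seconds; the one-second example above

class RoutingsEngine:
    """Cache-first handling of route requests (a sketch of method 200)."""

    def __init__(self, route_processor, prefetch_engine):
        self.cache = {}                         # response cache (blocks 204/210)
        self.route_processor = route_processor  # computes routes on a miss (block 206)
        self.prefetch_engine = prefetch_engine  # records long-running requests (block 208)

    def handle(self, request):
        key = (request["origin"], request["destination"], request["journey_date"])
        if key in self.cache:                   # block 204: result already cached
            return {"routes": self.cache[key]}  # blocks 212-216: load, frame, send
        start = time.monotonic()
        routes = self.route_processor(request)  # block 206: compute the routes
        elapsed = time.monotonic() - start
        if elapsed > SLOW_REQUEST_THRESHOLD:    # forward only long-running requests
            self.prefetch_engine.record(request)
        self.cache[key] = routes                # block 210: store for future hits
        return {"routes": routes}               # blocks 214-216: frame and send
```

In practice the pre-fetch engine would run asynchronously, as the description notes; here `record` simply stands for whatever method the pre-fetch engine exposes for storing a request.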
FIG. 3 is a flow chart illustrating a pre-fetching operation according to one embodiment of the disclosure. A method 300 may begin at block 304 with a trigger event being received to process received route requests. At block 306, the pre-fetch engine determines whether any new requests have been received from the routings engine (RE). If not, the method 300 proceeds to block 316 to delay for a duration of time and return to block 306 to check for new requests. When new requests are identified at block 306, the method 300 continues to block 308 to update a data object to include the received request. At block 310, it is determined whether the received request matches a previously received request. If so, a counter for the request is incremented at block 312. If not, a new record for the request is created at block 314. - Additional details for one embodiment of pre-fetching route data are described in
FIG. 4. FIG. 4 is a flow chart illustrating a method of pre-fetching at specific times according to one embodiment of the disclosure. At block 402, a configuration may be read, such as from a database table named CONFIGURATION_PARAMETERS. For example, a frequency value for historical request information may be read from the configuration. At block 404, the pre-fetch engine may identify one or more of a busy time, an idle time, a weighted arithmetic mean (WAM) for historical request time, and a next interval. For example, schedule information may be retrieved from the REQUEST_DETAILS table of FIG. 5 based on a frequency identified at block 402. In another example, a busy time value may be identified as a mean between a total WAM and a highest time, and an idle time value may be identified as a mean between the total WAM and a lowest time. Then, at block 406, all route requests whose response time is greater than a busy time threshold value may be collected. - The busy time threshold value may be set to a value that indicates that a particular route request is predicted to be repeated. For example, if a busy time threshold value is set to four and the historical records show five requests made for San Francisco to New York City routes over a certain duration of time, then the request for San Francisco to New York City routes may be marked for pre-fetching at the next interval by queueing the request at
block 408. The busy time threshold value may be predetermined and set by an administrator, or it may be adaptive and set to, for example, a value that caches a certain percentage of requests, such as the top 25% busiest requests. - While collecting routes at
block 406, a pre-fetch engine may determine whether a current date is present in a critical date table and may execute a process when the current date is present. In one embodiment, the critical dates may be obtained from a database table named CRITICAL_DATE. This table may indicate that, on a given date, an increase in travel requests from a given origin and/or to a given destination can be expected. Additional details are described below. The process may include processing all requests for a certain origin-destination (OD) pair, if both are given; processing all requests from the historical record of route requests with the given origin, if only an origin is given; and/or processing all requests from the historical record of route requests with the given destination, if only the destination is given. - All route requests identified as candidates for pre-fetching during
block 406 may be queued at block 408 for pre-fetching at a next interval. The next interval value may specify when to execute the route requests to pre-fetch and store results in the cache. The next interval value may be selected to be a time when the routings engine (RE) is known to be operating well below its capacity to respond to route requests. Thus, the data for certain predicted routes may be pre-fetched and stored in a cache when the routings engine is not busy. This pre-fetching may create little negative impact on the users of the routings engine while providing faster responses to frequently performed route requests. - In one embodiment, route requests received during a busy time, such as a time period when demand on the routings engine is above a mean between a total weighted arithmetic mean and a highest load, may be identified and processed during an idle time, such as a time period when demand on the routings engine is below a mean between a total weighted arithmetic mean and a lowest load.
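The request counting of blocks 310-314 and the threshold computations described above can be sketched as follows. The function names and the representation of a request as an origin-destination key are assumptions; the patent stores richer per-request records.

```python
from collections import Counter

def record_request(counts, origin, destination):
    """Blocks 310-314 (a sketch): increment the counter for a previously
    seen origin-destination pair, or start a new record at one."""
    counts[(origin, destination)] += 1  # Counter creates missing keys at zero
    return counts

def busy_idle_thresholds(times, weights):
    """Block 404 (a sketch): the busy time is the mean of the total
    weighted arithmetic mean (WAM) and the highest request time; the
    idle time is the mean of the WAM and the lowest request time."""
    wam = sum(t * w for t, w in zip(times, weights)) / sum(weights)
    return (wam + max(times)) / 2, (wam + min(times)) / 2

def adaptive_busy_threshold(counts, top_fraction=0.25):
    """The adaptive alternative (an assumption about one way to do it):
    choose the request count at the boundary of the busiest fraction of
    requests, so roughly that share gets marked for pre-fetching."""
    ranked = sorted(counts.values(), reverse=True)
    return ranked[max(1, int(len(ranked) * top_fraction)) - 1]
```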
- In another embodiment, predictions for route requests that are pre-fetched may be based on the day of the week. For example, route requests may be analyzed to determine common origin-destination pairs requested on Mondays, and route requests for those common origin-destination pairs may be pre-fetched on Mondays. In one example situation, many flyers may be returning home from vacation cities on Mondays, and thus route requests specifying non-vacation-oriented cities may be cached.
- In a further embodiment, unique days, such as holidays that fall on different days each year, may be scheduled into the pre-fetch engine and used to select routes for pre-fetching. For example, the American Labor Day holiday may fall on a different date each year, but the server may be programmed with past and future Labor Day dates, such that the pre-fetch engine may predict likely flights around such holidays or other special days.
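Rather than storing every date by hand, many such holidays follow a rule and can be computed. A sketch for the Labor Day example (the patent only says the server may be programmed with the dates; computing them is an assumption about one way to seed that table):

```python
from datetime import date, timedelta

def labor_day(year):
    """American Labor Day: the first Monday of September. Its calendar
    date shifts each year, so the pre-fetch engine could be seeded with
    computed dates to anticipate holiday travel."""
    first = date(year, 9, 1)
    # Days to add to reach the next Monday (weekday 0), zero if already Monday.
    return first + timedelta(days=(0 - first.weekday()) % 7)
```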
- Example database tables for storing data discussed above with reference to
FIG. 3 and FIG. 4 are shown in FIG. 5. FIG. 5 is a block diagram illustrating a database table for recording route requests for use during pre-fetching according to one embodiment of the disclosure. One table 502, such as a CONFIGURATION_PARAMETERS table, may include fields for start time, daily frequency, weekly frequency, monthly frequency, yearly frequency, and/or sort criteria. The daily frequency, weekly frequency, monthly frequency, and yearly frequency values may be Boolean values indicating a schedule for re-evaluating which routes to pre-fetch. For example, if the weekly frequency value is set to TRUE, then the pre-fetch engine may re-evaluate the list of route requests to pre-fetch each week. - Another table 504, such as a CRITICAL_DATE table, may include fields for serial, critical date, critical origin, and/or critical destination. In one embodiment, entries in this table may be entered by a user based on experience and/or automatically generated based on statistical approaches. For example, during the Christmas holiday season there may be an increase in bookings to Orlando, Fla. on December 22 and 23, indicated by the entries CRITICAL_DATE{'1', 22 Dec 2014, *, MCO} and CRITICAL_DATE{'1', 23 Dec 2014, *, MCO}. Because the 22nd and 23rd are important travel dates, it may be anticipated that requests for these routes could come at least a few weeks earlier, such as from December 1 onward. Thus, the system may be updated with requests for routes required on the 22nd and 23rd by pre-fetching each day from December 1 until the 23rd. Similarly, pre-fetching may obtain routes from Orlando, Fla. on December 26, such as with the entry CRITICAL_DATE{'1', 26 Dec 2014, MCO, *}. Other examples of known events for pre-fetching may include, for example, football league events.
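The wildcard matching implied by the CRITICAL_DATE entries above, where `*` stands for any origin or destination, can be sketched as follows; the dictionary keys are assumptions, not the patent's schema.

```python
def matches_critical_entry(entry, request):
    """Return True when `request` falls under a CRITICAL_DATE entry.
    When both origin and destination are given, the OD pair must match;
    when only one is given (the other being the '*' wildcard), any
    request with that origin, or that destination, matches."""
    origin_given = entry["origin"] != "*"
    dest_given = entry["destination"] != "*"
    if origin_given and dest_given:
        return (request["origin"] == entry["origin"]
                and request["destination"] == entry["destination"])
    if origin_given:
        return request["origin"] == entry["origin"]
    if dest_given:
        return request["destination"] == entry["destination"]
    return False
```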
- A further table 506, such as a REQUEST_DETAILS table, may include fields for serial, request date, request time, origin, destination, journey date, and computation time. The table 506 may, for example, store the historical records of a selection of route requests received at the routings engine. Each request may include an identifier value (serial), the request date and time, the origin and destination of the request, route parameters such as journey date, and a computation time of when results for that route request were stored in the cache. The computation time field may allow, for example, the pre-fetch engine to delete stale route response computations from the cache.
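The three tables of FIG. 5 can be mocked up in SQLite for experimentation. The column names and types below are assumptions inferred from the fields described, not the patent's actual schema.

```python
import sqlite3

# Illustrative schemas for tables 502, 504, and 506 (assumed columns).
SCHEMA = """
CREATE TABLE CONFIGURATION_PARAMETERS (
    start_time        TEXT,
    daily_frequency   INTEGER,  -- Boolean flags for the re-evaluation schedule
    weekly_frequency  INTEGER,
    monthly_frequency INTEGER,
    yearly_frequency  INTEGER,
    sort_criteria     TEXT
);
CREATE TABLE CRITICAL_DATE (
    serial               TEXT,
    critical_date        TEXT,
    critical_origin      TEXT,  -- '*' acts as a wildcard
    critical_destination TEXT
);
CREATE TABLE REQUEST_DETAILS (
    serial           TEXT,
    request_date     TEXT,
    request_time     TEXT,
    origin           TEXT,
    destination      TEXT,
    journey_date     TEXT,
    computation_time TEXT   -- when results were cached; used to evict stale entries
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(SCHEMA)
# One of the December examples from the description above.
conn.execute("INSERT INTO CRITICAL_DATE VALUES ('1', '2014-12-22', '*', 'MCO')")
```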
-
FIG. 6 is a flow chart illustrating a method of pre-fetching routes according to one embodiment of the disclosure. A method 600 begins at block 602 with receiving, at a routings engine from an airline reservation system, a request for first airline routes between an origin and a destination based on specified route parameters. Then, at block 604, the method 600 continues with storing, by the routings engine, the request in a historical record of requests. At block 606, the method 600 continues with fetching, by the routings engine, first airline routes matching the origin, the destination, and the specified route parameters. Then, at block 608, the method 600 continues with pre-fetching, by the routings engine, second airline routes based, at least in part, on the historical record of requests. -
FIG. 7 illustrates one embodiment of a system 700 for an information system, including a system for responding to flight requests. The system 700 may include a server 702, a data storage device 706, a network 708, and a user interface device 710. In a further embodiment, the system 700 may include a storage controller 704, or storage server configured to manage data communications between the data storage device 706 and the server 702 or other components in communication with the network 708. In an alternative embodiment, the storage controller 704 may be coupled to the network 708. - In one embodiment, the
user interface device 710 is referred to broadly and is intended to encompass a suitable processor-based device such as a desktop computer, a laptop computer, a personal digital assistant (PDA) or tablet computer, a smartphone, or other mobile communication device having access to the network 708. In a further embodiment, the user interface device 710 may access the Internet or other wide area or local area network to access a web application or web service hosted by the server 702 and may provide a user interface for controlling the information system. - The
network 708 may facilitate communications of data between the server 702 and the user interface device 710. The network 708 may include any type of communications network including, but not limited to, a direct PC-to-PC connection, a local area network (LAN), a wide area network (WAN), a modem-to-modem connection, the Internet, a combination of the above, or any other communications network now known or later developed within the networking arts which permits two or more computers to communicate. -
FIG. 8 illustrates a computer system 800 adapted according to certain embodiments of the server 702 and/or the user interface device 710. The central processing unit ("CPU") 802 is coupled to the system bus 804. Although only a single CPU is shown, multiple CPUs may be present. The CPU 802 may be a general-purpose CPU or microprocessor, graphics processing unit ("GPU"), and/or microcontroller. The present embodiments are not restricted by the architecture of the CPU 802 so long as the CPU 802, whether directly or indirectly, supports the operations as described herein. The CPU 802 may execute the various logical instructions according to the present embodiments. - The
computer system 800 may also include random access memory (RAM) 808, which may be static RAM (SRAM), dynamic RAM (DRAM), synchronous dynamic RAM (SDRAM), or the like. The computer system 800 may utilize RAM 808 to store the various data structures used by a software application. The computer system 800 may also include read-only memory (ROM) 806, which may be PROM, EPROM, EEPROM, optical storage, or the like. The ROM may store configuration information for booting the computer system 800. The RAM 808 and the ROM 806 hold user and system data, and both the RAM 808 and the ROM 806 may be randomly accessed. - The
computer system 800 may also include an input/output (I/O) adapter 810, a communications adapter 814, a user interface adapter 816, and a display adapter 822. The I/O adapter 810 and/or the user interface adapter 816 may, in certain embodiments, enable a user to interact with the computer system 800. In a further embodiment, the display adapter 822 may display a graphical user interface (GUI) associated with a software or web-based application on a display device 824, such as a monitor or touch screen. - The I/O adapter 810 may couple one or more storage devices 812, such as one or more of a hard drive, a solid state storage device, a flash drive, a compact disc (CD) drive, a floppy disk drive, and a tape drive, to the computer system 800. According to one embodiment, the data storage 812 may be a separate server coupled to the computer system 800 through a network connection to the I/O adapter 810. The communications adapter 814 may be adapted to couple the computer system 800 to the network 708, which may be one or more of a LAN, WAN, and/or the Internet. The user interface adapter 816 couples user input devices, such as a keyboard 820, a pointing device 818, and/or a touch screen (not shown), to the computer system 800. The keyboard 820 may be an on-screen keyboard displayed on a touch panel. The display adapter 822 may be driven by the CPU 802 to control the display on the display device 824. Any of the devices 802-822 may be physical and/or logical. - The applications of the present disclosure are not limited to the architecture of
computer system 800. Rather, the computer system 800 is provided as an example of one type of computing device that may be adapted to perform the functions of the server 702 and/or the user interface device 710. For example, any suitable processor-based device may be utilized including, without limitation, personal digital assistants (PDAs), tablet computers, smartphones, computer game consoles, and multi-processor servers. Moreover, the systems and methods of the present disclosure may be implemented on application-specific integrated circuits (ASICs), very large scale integrated (VLSI) circuits, or other circuitry. In fact, persons of ordinary skill in the art may utilize any number of suitable structures capable of executing logical operations according to the described embodiments. For example, the computer system may be virtualized for access by multiple users and/or applications. - If implemented in firmware and/or software, the functions described above may be stored as one or more instructions or code on a computer-readable medium. Examples include non-transitory computer-readable media encoded with a data structure and computer-readable media encoded with a computer program. Computer-readable media includes physical computer storage media. A storage medium may be any available medium that can be accessed by a computer. By way of example, and not limitation, such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer. Disk and disc include compact discs (CD), laser discs, optical discs, digital versatile discs (DVD), floppy disks, and Blu-ray discs. Generally, disks reproduce data magnetically, and discs reproduce data optically. 
Combinations of the above should also be included within the scope of computer-readable media. Additionally, the firmware and/or software may be executed by processors integrated with components described above.
- In addition to storage on a computer-readable medium, instructions and/or data may be provided as signals on transmission media included in a communication apparatus. For example, a communication apparatus may include a transceiver having signals indicative of instructions and data. The instructions and data are configured to cause one or more processors to implement the functions outlined in the claims.
- Although the present disclosure and its advantages have been described in detail, it should be understood that various changes, substitutions, and alterations can be made herein without departing from the spirit and scope of the disclosure as defined by the appended claims. Moreover, the scope of the present application is not intended to be limited to the particular embodiments of the process, machine, manufacture, composition of matter, means, methods, and steps described in the specification. As one of ordinary skill in the art will readily appreciate from the present disclosure, processes, machines, manufacture, compositions of matter, means, methods, or steps, presently existing or later to be developed, that perform substantially the same function or achieve substantially the same result as the corresponding embodiments described herein may be utilized according to the present disclosure. Accordingly, the appended claims are intended to include within their scope such processes, machines, manufacture, compositions of matter, means, methods, or steps.
Claims (18)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/016,373 US10146832B2 (en) | 2016-02-05 | 2016-02-05 | Prefetching for computing airline routes |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/016,373 US10146832B2 (en) | 2016-02-05 | 2016-02-05 | Prefetching for computing airline routes |
Publications (2)
Publication Number | Publication Date |
---|---|
US20170228424A1 true US20170228424A1 (en) | 2017-08-10 |
US10146832B2 US10146832B2 (en) | 2018-12-04 |
Family
ID=59497731
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/016,373 Active 2036-12-07 US10146832B2 (en) | 2016-02-05 | 2016-02-05 | Prefetching for computing airline routes |
Country Status (1)
Country | Link |
---|---|
US (1) | US10146832B2 (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110348971A (en) * | 2019-07-18 | 2019-10-18 | 海南太美航空股份有限公司 | A kind of route information response method, device, system and readable storage medium storing program for executing |
CN110489647A (en) * | 2019-08-15 | 2019-11-22 | 海南太美航空股份有限公司 | A kind of route information recommended method, system, storage medium and computer equipment |
CN111008257A (en) * | 2019-11-28 | 2020-04-14 | 海南太美航空股份有限公司 | Airline data competition analysis method and system based on airline big data platform |
US20230409572A1 (en) * | 2022-05-24 | 2023-12-21 | Kayak Software Corporation | Reducing latency in query-based search engines |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6067565A (en) * | 1998-01-15 | 2000-05-23 | Microsoft Corporation | Technique for prefetching a web page of potential future interest in lieu of continuing a current information download |
US6385641B1 (en) * | 1998-06-05 | 2002-05-07 | The Regents Of The University Of California | Adaptive prefetching for computer network and web browsing with a graphic user interface |
US6993591B1 (en) * | 1998-09-30 | 2006-01-31 | Lucent Technologies Inc. | Method and apparatus for prefetching internet resources based on estimated round trip time |
US7130890B1 (en) * | 2002-09-04 | 2006-10-31 | Hewlett-Packard Development Company, L.P. | Method and system for adaptively prefetching objects from a network |
US20050108069A1 (en) * | 2003-11-18 | 2005-05-19 | Tomer Shiran | System and a method for prefetching travel information |
US8667224B1 (en) * | 2007-12-20 | 2014-03-04 | Emc Corporation | Techniques for data prefetching |
US8938548B2 (en) * | 2008-12-23 | 2015-01-20 | At&T Mobility Ii Llc | Streaming enhancements through pre-fetch background |
US9075665B2 (en) * | 2010-06-29 | 2015-07-07 | International Business Machines Corporation | Smoothing peak system load via behavior prediction in collaborative systems with temporal data access patterns |
-
2016
- 2016-02-05 US US15/016,373 patent/US10146832B2/en active Active
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110348971A (en) * | 2019-07-18 | 2019-10-18 | 海南太美航空股份有限公司 | A kind of route information response method, device, system and readable storage medium storing program for executing |
CN110348971B (en) * | 2019-07-18 | 2022-05-03 | 海南太美航空股份有限公司 | Airline information response method, device and system and readable storage medium |
CN110489647A (en) * | 2019-08-15 | 2019-11-22 | 海南太美航空股份有限公司 | A kind of route information recommended method, system, storage medium and computer equipment |
CN111008257A (en) * | 2019-11-28 | 2020-04-14 | 海南太美航空股份有限公司 | Airline data competition analysis method and system based on airline big data platform |
US20230409572A1 (en) * | 2022-05-24 | 2023-12-21 | Kayak Software Corporation | Reducing latency in query-based search engines |
Also Published As
Publication number | Publication date |
---|---|
US10146832B2 (en) | 2018-12-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11036805B2 (en) | Presenting anticipated user search query results prompted by a trigger | |
US11537584B2 (en) | Pre-caching of relational database management system based on data retrieval patterns | |
US9652538B2 (en) | Web crawler optimization system | |
US9032000B2 (en) | System and method for geolocation of social media posts | |
US20180225350A1 (en) | Query dispatching system and method | |
US10146832B2 (en) | Prefetching for computing airline routes | |
AU2017204075A1 (en) | Method and apparatus for managing visitor interactions | |
US20140365452A1 (en) | Discovering Trending Content of a Domain | |
US9378235B2 (en) | Management of updates in a database system | |
JP2017525026A (en) | Generating a contextual search presentation | |
US10375157B2 (en) | System and method for reducing data streaming and/or visualization network resource usage | |
US10769547B2 (en) | Mobile searches utilizing a query-goal-mission structure | |
US9563366B2 (en) | Using queues corresponding to attribute values associated with units of work and sub-units of the unit of work to select the units of work and their sub-units to process | |
US8082342B1 (en) | Discovery of short-term and emerging trends in computer network traffic | |
JP2017501501A (en) | Generating news timelines and recommended news editions | |
US9229968B2 (en) | Management of searches in a database system | |
US20160055203A1 (en) | Method for record selection to avoid negatively impacting latency | |
US20220391379A1 (en) | Assistant nodes in director-based database system for transactional consistency | |
US20220391377A1 (en) | Time alignment in director-based database system for transactional consistency | |
US20220067004A1 (en) | Merges using key range data structures | |
JP2023522690A (en) | cached updatable top k indices | |
US20230409572A1 (en) | Reducing latency in query-based search engines | |
JP5674850B2 (en) | Database management system and method | |
KR100966477B1 (en) | Data cache method and system using region quarter tree for three dimesion map service | |
US10706190B2 (en) | Transfer and visualization of time streams for forecast simulation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: WELLS FARGO BANK, NATIONAL ASSOCIATION, PENNSYLVANIA Free format text: SECURITY INTEREST;ASSIGNOR:UNISYS CORPORATION;REEL/FRAME:038792/0820 Effective date: 20160527 Owner name: WELLS FARGO BANK, NATIONAL ASSOCIATION, PENNSYLVAN Free format text: SECURITY INTEREST;ASSIGNOR:UNISYS CORPORATION;REEL/FRAME:038792/0820 Effective date: 20160527 |
|
AS | Assignment |
Owner name: WELLS FARGO BANK, NATIONAL ASSOCIATION, AS COLLATERAL TRUSTEE, NEW YORK Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:UNISYS CORPORATION;REEL/FRAME:042354/0001 Effective date: 20170417 Owner name: WELLS FARGO BANK, NATIONAL ASSOCIATION, AS COLLATE Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:UNISYS CORPORATION;REEL/FRAME:042354/0001 Effective date: 20170417 |
|
AS | Assignment |
Owner name: JPMORGAN CHASE BANK, N.A., AS ADMINISTRATIVE AGENT, ILLINOIS Free format text: SECURITY INTEREST;ASSIGNOR:UNISYS CORPORATION;REEL/FRAME:044144/0081 Effective date: 20171005 Owner name: JPMORGAN CHASE BANK, N.A., AS ADMINISTRATIVE AGENT Free format text: SECURITY INTEREST;ASSIGNOR:UNISYS CORPORATION;REEL/FRAME:044144/0081 Effective date: 20171005 |
|
AS | Assignment |
Owner name: UNISYS CORPORATION, PENNSYLVANIA Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:WELLS FARGO BANK, NATIONAL ASSOCIATION;REEL/FRAME:044416/0114 Effective date: 20171005 |
|
AS | Assignment |
Owner name: UNISYS CORPORATION, PENNSYLVANIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CK, PRADEEP;R., RAKSHA;SANJEEVA, VENKATESH RAMACHAR;AND OTHERS;SIGNING DATES FROM 20150410 TO 20150414;REEL/FRAME:046989/0579 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
AS | Assignment |
Owner name: WELLS FARGO NATIONAL ASSOCIATION, NEW YORK Free format text: SECURITY INTEREST;ASSIGNOR:UNISYS CORPORATION;REEL/FRAME:048581/0126 Effective date: 20190305 |
|
AS | Assignment |
Owner name: JPMORGAN CHASE BANK, N.A., AS ADMINISTRATIVE AGENT, ILLINOIS Free format text: SECURITY INTEREST;ASSIGNOR:UNISYS CORPORATION;REEL/FRAME:051682/0465 Effective date: 20190305 |
|
AS | Assignment |
Owner name: UNISYS CORPORATION, PENNSYLVANIA Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:WELLS FARGO BANK, NATIONAL ASSOCIATION;REEL/FRAME:054231/0496 Effective date: 20200319 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |