WO2024013755A1 - A system to estimate the fastest shipping time of a package and a method thereof - Google Patents

A system to estimate the fastest shipping time of a package and a method thereof

Info

Publication number
WO2024013755A1
WO2024013755A1 (PCT/IN2022/050664)
Authority
WO
WIPO (PCT)
Prior art keywords
shipping
package
time
fastest
deliver
Prior art date
Application number
PCT/IN2022/050664
Other languages
French (fr)
Inventor
Chinmay S. BORKAR
Sagar R. DESHMUKH
Original Assignee
Montezuma Private Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Montezuma Private Limited filed Critical Montezuma Private Limited
Publication of WO2024013755A1 publication Critical patent/WO2024013755A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/08Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
    • G06Q10/083Shipping
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning

Definitions

  • the present invention relates to a system to estimate package shipping time using machine learning and a method thereof.
  • the present invention relates to a system to estimate the fastest time to ship a package from one location to another using machine learning and a method thereof.
  • the present invention discloses a cloud-based system to estimate package-shipping times with 99% accuracy using machine learning and a method thereof.
  • the shipping services field is important to people and businesses across the world. Providing accurate delivery time is a challenging task with many facets. While the shipping services field has grown and matured over the years, delivery and shipping inefficiencies still cause substantial problems for shipping senders, shipping recipients, and shipping carriers alike. In particular, it can be difficult for shipping senders and/or recipients to accurately estimate transit time (and thus delivery time) for a given shipment.
  • the purchase price paid by a consumer includes a shipping fee
  • customers require the most accurate estimated time of delivery.
  • Many vendors typically supply many different retailers, each having separate warehouses, physical storerooms, and even warehouses for carriers that make the last mile delivery. Because of this, most retailers establish their own supply chain network. While a dedicated supply chain network has an upside in providing increased control and flexibility within the network, there are normally significant costs associated with having too little or too much capacity in the network.
  • consumers tend to wait additional time for their delivery because retailers or carriers may need to ship a full truckload for economic reasons.
  • Another issue is with synchronizing other carrier loads with the ones that are expected by the consumers. These costs can translate to higher shipping costs to be paid by the customer or be absorbed by the retailer or vendor, or lost sales because delivery times are too long, to name a few.
  • WO2020034044A1 aims to overcome the problem of estimating an accurate time of shipping a package by providing a system and method for providing delivery options.
  • the method includes interfacing a supply-chain management system with a retailer user interface and detecting a request from the retailer user interface to schedule a delivery of one or more items through an integrated supply chain network coordinated by the supply-chain management system.
  • the method also includes, based on the request, determining from the integrated supply chain network, at least one delivery option for at least one delivery date, each delivery option comprising a route, one or more carriers for the route, and a time window.
  • the method also includes, for each delivery option, using a prediction engine and at least one data model to compute at least one delivery prediction parameter indicative of a likelihood of success for completing the route within the time window using the one or more carriers for the route, the at least one data model having been generated using historical delivery data.
  • the method also includes providing, in response to the request, each of the at least one delivery option augmented with the at least one delivery prediction parameter to enable the retailer user interface to display the at least one delivery option augmented with the at least one delivery prediction parameter in a user interface element enabling selection of a delivery date for the one or more items.
  • prior art CN107276896A provides a point-to-point shortest-route search method that improves Dijkstra's algorithm and includes the following steps: 1) an adjacency matrix from graph theory is introduced to assist the search procedure, i.e., each time a vertex's shortest distance to the initial vertex is determined, the row of the adjacency matrix corresponding to that vertex is searched for elements equal to 1; 2) distance values are calculated and analysed only for the vertices whose elements in step 1) equal 1, effectively reducing the amount of calculation in the search procedure.
  • This prior art aims to avoid useless calculations for unconnected nodes by performing path analysis on the adjacency matrix, effectively reducing the amount of calculation while ensuring that the found route is the shortest.
  • US20180204229A1 discloses a method for improving parcel delivery.
  • the method comprises generating a first and a second cross-carrier delivery prediction model based on delivery data retrieved from a plurality of carriers; at a delivery data database of a server system, storing a plurality of cross-carrier delivery models comprising the first and the second cross-carrier delivery prediction model, the first cross-carrier delivery prediction model stored in association with a first carrier identifier and the second cross-carrier delivery prediction model stored in association with a second carrier identifier, for improving upon data storage and retrieval functionality to enable substantially real-time analysis of cross-carrier data to determine delivery estimates of improved accuracy; determining a first parcel delivery estimate for a first parcel based on processing first parcel data, from a first carrier of the plurality of carriers, with the first cross-carrier delivery prediction model retrieved from the delivery data database; and automatically determining a second parcel delivery estimate for a second parcel based on a first parcel delivery status.
  • none of the prior art addresses the issue of estimating the delivery time of a package with close to absolute accuracy.
  • none of the cited prior art documents proposes a solution that considers real-time obstructions in estimating the shipping time of a package.
  • Prior Art depends on the assumption of static travel times between different nodes, which is incorrect (for instance, the travel time from Spokane, WA to Seattle, WA can be from 4 hours to 3 days depending upon traffic, closures, and other road conditions). Any model that relies on static data about distances or times or seasonality - as all Prior Art does - is bound to introduce inaccuracies, as the data does not always reflect reality.
  • Prior Art also uses shipment-by-shipment data to estimate shipping times, thereby ignoring multi-modal configurations that may involve backtracking.
  • As an example, just because shipments have only been carried from Seattle, WA to Sacramento, CA by truck does not mean that a complex yet faster routing (say, through San Francisco, CA by plane) is impossible.
  • Prior Art also does not factor in seasonality and holiday schedule in making delivery estimations, which is bound to introduce further errors into the prediction.
  • the present invention provides a method and a system that leverage three critical elements to estimate the fastest shipping time: a. a machine learning model that continually creates and optimizes a directed graph (digraph) that models the various modes of transportation available between two non-terminal end points.
  • the machine-learning model continually observes both transit times and nodes for any changes (as an example, there may be new nodes introduced in Kansas during peak times, like Christmas, that are operational only in November and December) and recreates the digraph every 15 minutes (~100x per day).
  • b. a shortest-path algorithm implementation that uses the correct digraph (depending upon the shipping speed) to find the fastest shipping time possible.
  • the present invention radically improves the accuracy of shipping time predictions made for e-commerce websites that rely on multi-modal, multi-speed logistics networks to 99%.
  • the system of the invention accepts as inputs through an Application Programming Interface (API): the source and destination ZIP codes and the chosen shipping speed; shipping history (source, destination, hops, and shipping speed and time by hop); all ZIP codes and their corresponding longitude/latitude; and population by ZIP code, and returns (through the same API) the fastest shipping time for the combination as the output.
  • a system for determining a fastest shipping time to deliver a package to a shipping location comprising a memory module configured to store predefined databases, a handler configured to input raw data and to update said predefined databases based on the raw data, a formatter configured to format said updated predefined databases and to generate vertices of locations stored in the updated predefined databases based on said formatted updated predefined databases, and a shipping digraph optimizer comprising a graph vertex determiner, an edge weight optimizer and a machine learning model tuner.
  • the graph vertex determiner is configured to receive said vertices from the formatter and to generate shipping vertices with edge weights distributed in all directions for the input location.
  • the edge weight optimizer is configured to receive the vertices from the formatter and to compute raw edge weights on the vertices based on historical shipping outcomes for said shipping vertices stored in said predefined databases.
  • the machine learning model tuner is configured to receive said shipping vertices with edge weights distributed in all directions and generate raw shipping graphs based on the said shipping vertices, receive computed raw edge weights and generate raw shipping edge weights, apply machine learning inputs to said generated raw shipping graphs and to said generated raw shipping edge weights for tuning of said generated raw shipping graphs, wherein a shipping-interruption-database provides said machine learning inputs, and output said tuned raw shipping graphs.
  • the system further comprises a processing module configured to process said tuned shipping graphs to output the fastest shipping time to deliver the package.
  • the machine learning model tuner is further configured to store said machine learning inputted generated raw shipping graphs among the predefined databases.
  • the machine learning model tuner comprises a short-path interface module configured to tune said generated shipping graphs.
  • the processing module comprises an input pre-processor, a shortest pathfinder and a rationalizer.
  • the input pre-processor is configured to receive data packets through an Internet-based interface (API) for determining the fastest shipping time to deliver the package; unpack the data packets and validate the variables of shipping information on the basis of information stored in said predefined databases; and confirm that data points, related to source/destination/shipping speed, present in the variables satisfy the requirement to accurately determine the shipping time of the package.
  • the shortest pathfinder module is configured to initiate determining the shortest path to deliver the package on said tuned shipping graphs based on a refined Dijkstra's shortest-path algorithm and to output the shortest path.
  • the rationalizer is configured to fetch shipping-interruption information from the shipping-interruption database; overlay the shipping-interruption information on the output of the shortest-path-finder module; and factor in a potential delay in the shipping time of the package on the shortest path to output the fastest time to deliver the package.
  • the shortest-path-finder module refreshes shipping graphs every 15 minutes to consider any updates in the shipping data.
  • the machine learning model tuner further comprises a machine-learning-feedback module configured to monitor the accuracy of determinations made by said graph vertex determiner and said edge weight optimizer.
  • a method for determining a fastest shipping time to deliver a package to a shipping location comprising: accepting, by an input pre-processor, a call from a caller, wherein the call comprises variables through an API, unpacking said variables containing ZIP codes and validating the ZIP codes using a ZIP code database, validating the presence of a sufficient number of data points in the variables, invoking a shortest path finder module to find out a shortest path to deliver a package from a starting location to the shipping location, wherein the shortest path finder module uses graphs stored in the database, tuning said graphs with machine learning inputs, selecting one said tuned graph for finding the fastest shipping time and feeding the selected tuned graph for rationalizing, fetching shipping-interruption details from the database and overlaying them on input from the shortest-path-finder module, and returning the fastest-shipping-time as an API response to the caller.
  • the method for determining a fastest shipping time to deliver a package to a shipping location further comprising monitoring the accuracy of the fastest-shipping-time predictions by comparing the realized shipping times to predicted shipping times for all predictions.
  • the method for determining a fastest shipping time to deliver a package to a shipping location further comprising, if the accuracy of the predictions is greater than 99%: for each vertex, monitoring the accuracy of the last k predictions by comparing the realized shipping times to the predicted shipping times for all predictions involving that vertex; and for each route, monitoring the accuracy of the last k predictions by comparing the realized shipping time for each prediction with the predicted shipping time.
  • the method for determining a fastest shipping time to deliver a package to a shipping location further comprising, if the accuracy is not greater than 99%, adjusting machine learning input parameters by launching model recalibration.
  • the present invention is made with an objective to provide a system to estimate package shipping time using machine learning and a method thereof.
  • the present invention is made with an objective to provide a system that estimates package shipping time using machine learning with high accuracy and a method thereof.
  • the present invention is made with an objective to provide a method to reduce the margin of error in generated shipping time estimates.
  • the present invention is made with an objective to provide a system to estimate a package shipping time that factors in seasonality and holiday schedules in making delivery estimations, the absence of which is bound to introduce further errors into the prediction.
  • Figure 1 of the present invention illustrates a system to estimate package shipping time using machine learning according to an embodiment of the present invention.
  • Figure 2 of the present invention is a flowchart of a method to estimate package shipping time using machine learning according to an embodiment of the present invention.
  • Figure 3 of the present invention is a flow chart of a machine learning recalibration model according to an embodiment of the present invention.
  • Figure 4 of the present invention is a flow chart of a continuous validation for a machine-learning model according to an embodiment of the present invention.
  • any terms used herein such as but not limited to “includes,” “comprises,” “has,” “consists,” and grammatical variants thereof do not specify an exact limitation or restriction and certainly do not exclude the possible addition of one or more features or elements, unless otherwise stated, and furthermore must not be taken to exclude the possible removal of one or more of the listed features and elements, unless otherwise stated with the limiting language “must comprise” or “needs to include.”
  • one or more particular features and/or elements described in connection with one or more embodiments may be found in one embodiment, or may be found in more than one embodiment, or may be found in all embodiments, or may be found in no embodiments.
  • one or more features and/or elements may be described herein in the context of only a single embodiment, or alternatively in the context of more than one embodiment, or further alternatively in the context of all embodiments, the features and/or elements may instead be provided separately or in any appropriate combination or not at all.
  • any features and/or elements described in the context of separate embodiments may alternatively be realized as existing together in the context of a single embodiment.
  • Figure 1 of the present invention illustrates a system (1) to estimate package shipping time using machine learning according to an embodiment of the present invention.
  • the system (1) for determining a fastest shipping time to deliver a package to a shipping location comprises a memory module (2), a handler (4), a formatter (6), a shipping digraph optimizer (8) and a processing module (10).
  • the memory module (2) is fed with databases, like a shipping history database (I), a ZIP code database (II) and a population database (III).
  • the shipping history database (I) comprises data on the historical shipping outcomes.
  • the shipping history database may include data related to source, destination, speed, time-average, time standard deviation, count and last updates.
  • the ZIP code database (II) comprises data on the various ZIP codes and the distances between them, including each ZIP code's associated longitudinal and latitudinal position.
  • the population database (III) comprises data on the various population centers, their longitude and latitude positions.
  • the handler (4) and the formatter (6) may be included as one module.
  • This module is configured to accept new updates to the three datasets, and formats and converts the existing data to feed the graph-vertex-determiner module and the edge-weight-optimizer module.
  • the shipping digraph optimizer (8) comprises a graph vertex determiner (8a), an edge weight optimizer (8b) and a machine learning model tuner (8c).
  • the graph vertex determiner (8a) comprises a ZIP code vertex checker (V), a hop granularity checker (VI), and a position gravity checker (VII).
  • the ZIP code vertex checker (V) may be a module that is configured to check each ZIP code to see if that ZIP code should be grouped into a larger vertex or not (e.g. 10027 and 10031 should all be grouped into a 10001 New York City vertex).
  • the hop granularity checker (VI) may be a module that is configured to check if the hops in the shipping data are at a comparable granularity to the vertices in the ZIP-code data, and decides the overall granularity for the digraph.
  • the position gravity checker (VII) may be a module configured to make a final check on the grain of the data, corroborating the grouping decisions made by the zip-code-vertex-checker by ensuring that all grouped vertices have a population.
  • the edge weight optimizer (8b) is configured to calculate the edge weights (time taken to travel from one node to another) based on the historical shipping outcomes stored in shipping-history-database.
  • the system comprises shipping-edge-weights that reflect the raw edge weights that are computed without any machine learning input. These weights will be used to compute an ML-improved digraph every 15 minutes.
  • a raw-shipping-graph is a set of raw vertices without any machine learning inputs applied to it, ready for further processing every 15 minutes.
  • the machine learning model tuner (8c) comprises a short-path interface module (12).
  • the short-path interface module (12) is configured to apply all the machine learning inputs to the digraphs that are finally stored in the shortest-path-finder module, and ensures that the learnings captured from the new realities in the incoming data are reflected.
  • the short-path interface module (12) is configured to tune said generated shipping graphs, extract the latest tuned shipping graphs, and select the right digraph corresponding to the selected shipping speed.
  • the processing module (10) comprises an input pre-processor (14), a shortest pathfinder module (16), and a rationalizer (18).
  • the input pre-processor (14) is configured to handle the incoming API call, ensuring that the ZIP codes and shipping speeds are valid and that there are enough data points to make a meaningful prediction.
  • the input pre-processor (14) is configured to receive data packets through an Internet-based interface (API) for determining the fastest shipping time to deliver the package.
  • the input pre-processor (14) is further configured to unpack the data packets and to validate the variables of shipping information on the basis of information stored in said predefined database.
  • the input preprocessor (14) is further configured to confirm that data points, related to source/destination/shipping speed, present in the variables satisfy the requirement of number of data points required to accurately determine the shipping time of the package.
  • the shortest pathfinder module (16) is configured to, when invoked by the input pre-processor (14), run a refined version of Dijkstra's shortest-path algorithm to find the fastest shipping time between the source and destination vertices.
  • the shortest pathfinder module (16) is configured to initiate determining the shortest path to deliver the package using said right digraph based on a refined Dijkstra's shortest-path algorithm and to output the shortest path.
  • the shortest-path-finder module (16) is configured to refresh the tuned shipping graphs every 15 minutes to consider any updates in the shipping data and to ensure that the tuned shipping graphs are recent, so that the right digraph is selected for the selected shipping speed.
  • the rationalizer (18) is configured to factor in the potential delay (by leveraging #XVI, the shipping-interruption-database) that may be caused by holidays and planned/implied interruptions due to weather or any other factors, and to output the fastest shipping time based on the speed chosen.
  • the rationalizer (18) is configured to fetch shipping-interruption information from the shipping-interruption database.
  • the rationalizer (18) is further configured to overlay the shipping-interruption information on the output of the shortest-path-finder module.
  • the rationalizer (18) is further configured to factor in a potential delay in the shipping time of the package on the shortest path to output the fastest time to deliver the package.
  • the system further comprises a shipping interruption database (XVI).
  • the shipping interruption database (XVI) comprises data on the various scheduled and implied shipping disruptions, like source, destination, method, delta start date, delta end date, and delta.
  • the memory module (2) is configured to store predefined databases.
  • the predefined databases include the shipping history database (I), the ZIP code database (II) and/or the population database (III).
  • the handler (4) is configured to input raw data and to update said predefined databases based on the raw data.
  • the formatter (6) is configured to format the updated predefined databases and to generate vertices of locations stored in the updated predefined databases based on the formatted updated predefined databases.
  • the graph vertex determiner (8a) is configured to receive the vertices from the formatter (6) and to generate shipping vertices with edge weights distributed in all directions for the input location.
  • the edge weight optimizer (8b) is configured to receive the vertices from the formatter (6) and to compute raw edge weights on the vertices based on historical shipping outcomes for the shipping vertices stored in the predefined databases.
  • the machine learning model tuner (8c) is configured to receive said shipping vertices with edge weights distributed in all directions and generate raw shipping graphs based on the said shipping vertices.
  • the machine learning model tuner (8c) is configured to receive computed raw edge weights and generate raw shipping edge weights
  • the machine learning model tuner (8c) is configured to apply machine learning inputs to the generated raw shipping graphs and to the generated raw shipping edge weights for tuning of said generated raw shipping graphs.
  • the machine learning model tuner (8c) is configured to output said tuned raw shipping graphs.
  • the shipping-interruption-database provides the machine learning inputs.
  • the processing module (10) is configured to process the tuned shipping graphs to output the fastest shipping time to deliver the package.
  • Figure 2 of the present invention is a flowchart of a method to estimate package shipping time using machine learning according to an embodiment of the present invention.
  • the method for determining a fastest shipping time to deliver a package to a shipping location comprising: accepting (S1), by an input pre-processor, a call from a caller, wherein the call comprises variables through an API, unpacking (S2) said variables containing ZIP codes, validating (S3) the ZIP codes using a ZIP code database, validating (S4) the presence of a sufficient number of data points in the variables, invoking (S5) a shortest path finder module, checking (S6) the timings of the stored fine-tuned graphs to extract recent tuned graphs and selecting the right digraph to find the fastest shipping time, tuning said digraph with machine learning inputs, selecting one said tuned graph for finding the fastest shipping time and feeding the selected tuned graph for rationalizing, fetching (S8) shipping-interruption details from the database and overlaying them on input from the shortest-path-finder module, and returning the fastest-shipping-time as an API response to the caller.
  • the method comprises a step of accepting, by the input-pre-processor, API calls and unpacking the variables {source-ZIP, destination-ZIP, shipping-speed}.
  • the next step is of validating the ZIP codes by confirming that they exist in the databases, confirming that there are enough data points for the source/destination/shipping speed combination to predict accurately, and invoking the shortest-path-finder module.
  • the method comprises a step of checking, by the shortest-path-finder module, the timestamp on the tuned-shipping-graphs store and requesting a refresh if it has been more than 15 minutes.
  • In the next steps, the shortest-path-finder module is configured to select the right digraph for the chosen shipping speed and to run a fine-tuned implementation of the Dijkstra algorithm to find the fastest shipping time between source and destination.
  • The next step is for the shortest-path-finder module to invoke the rationalizer module.
  • the method includes applying machine learning transforms, by the shortest-path-interface module when invoked by the raw-shipping-graph or shipping-edge-weights modules, to adjust the graph vertices and edge weights so as to include machine learning insights in the raw digraph (e.g. changing vertex groupings or adjusting edge-weight standard deviations upward or downward depending upon recent accuracy readings).
  • the method includes triggering the recalibration of the entire predictive model, by the shortest-path-interface module when invoked by the shortest-path-finder module, to re-run processes (A), (B), (B1), (B2), (B3), (B4), (C), (C1), (C2), and (D), as shown in Fig. 3, to ingest any incremental new data and re-generate the digraphs to reflect the new reality.
  • the method comprises a step, by the rationalizer module, of fetching {shipping-interruption} details from the database and overlaying them on the output of the shortest-path-finder module.
  • the next step is, by the rationalizer module, packing the output variable {fastest-shipping-time} into an API response and returning it to the caller, to be processed further and displayed to the end-user as appropriate.
  • Figure 3 of the present invention is a flow chart of a machine learning recalibration model according to an embodiment of the present invention.
  • step (A) illustrates the data handling and formatting module as configured to ingest raw data, format the data and perform validations, shape the data and store it into the database(s), and invoke step (B) and step (C) if there are any changes.
  • step (B) illustrates the zip-code-vertex-checker module as configured to run a customized k-means clustering algorithm to identify the appropriate centroid for each ZIP code, create a mapping matrix {zip-centroid-map} for each ZIP code to define what centroid ZIP code it should be replaced by, rewrite the shipping-history-database table to replace all occurrences of the original ZIP codes with the centroids, and invoke step (B1) and step (B2) (an illustrative sketch of this clustering step and of the raw edge-weight computation of step (C) is given after this Definitions list).
  • step (B1) illustrates the hop-granularity-checker module as configured to run a test to check if all vertices are grouped relatively evenly.
  • Step (B1) further illustrates the hop-granularity-checker module as configured to re-run the k-means clustering algorithm of step (B) to ensure that the E1 value is in the range [0.1, 0.3]. According to an embodiment of the invention, if it is on the smaller end of that range, the k-means clustering algorithm is run to find a greater number of centroids. If it is on the larger end of that range, the k-means clustering algorithm is run to find fewer centroids. Step (B1) further illustrates the hop-granularity-checker module as configured to store the output through method (B3).
  • This step (B3) relates to a raw-shipping-graph module that is configured to store the vertices of the shipping graph from step (B2) with a unit edge weight in all directions.
  • step (C) illustrates the edge-weight-optimizer module as configured to ingest formatted records from the shipping-history-database and create a raw edge weight mapping matrix {raw-edge-map} as follows (see the sketch following this Definitions list):
  • where μ and σ are the mean and standard deviation of the time taken to go from source to destination
  • step (C1) illustrates the shipping-edge-weights module as configured to store the raw edge weights that are computed without any machine learning input, which are then used in step (B4) and step (C2) to overlay ML inputs on.
  • the steps (B4) and (C2) illustrate the shortest-path-interface module as configured to apply the machine learning inputs from steps (D1) and (D2) to the raw graphs and raw edges in step (B3) and step (C1), before exporting the serialized ML-enhanced digraphs to be stored in the Part 1 flow.
  • step (D1) illustrates the machine-learning-feedback module for vertices as configured to, for each vertex j, monitor the accuracy of the last k predictions according to the following formula: where R(j,k,s) is the realized shipping time for prediction k involving vertex j at shipping speed s, and μ and σ are the mean shipping time and standard deviation of shipping time for the last k predictions involving vertex j at shipping speed s.
  • step (D2) illustrates the machine-learning-feedback module for edge weights as configured to, for each route j, monitor the accuracy of the last k predictions according to the following formula: where R(j,k,s) is the realized shipping time for prediction k on route j at shipping speed s, and μ and σ are the mean shipping time and standard deviation of shipping time for the last k predictions on route j at shipping speed s.
  • Figure 4 of the present invention is a flow chart of a continuous validation for a machine-learning model according to an embodiment of the present invention.
  • the method comprises a step of monitoring the accuracy of the fastest-shipping-time predictions by comparing (S10) the realized shipping times to the predicted shipping times for all predictions. In some embodiments, if the accuracy of the predictions (S11) is greater than 99%, the method comprises a step of, for each vertex, monitoring (S10) the accuracy of the last k predictions by comparing the realized shipping times to the predicted shipping times for all predictions involving that vertex.
  • the method comprises a step of, for each route, monitoring the accuracy of last k predictions by comparing the realized shipping time for each prediction with predicted shipping time (S10). In some embodiments, the method comprises a step of, if the accuracy is not greater than 99%, adjusting machine learning (S12) input parameters by launching model recalibration.
  • the present invention may be implemented as a cloud-based system that allows customers on e-Commerce websites to get a 99% accurate prediction of how long it will take for the item to be shipped from the origin to their ZIP code.
  • the system is accessible through an Internet-based interface (an API) that can be called by making a request to a URL that includes three parameters - the source ZIP code, the destination ZIP code, and shipping speed (ground, saver, priority).
  • the system then returns the fastest shipping time for that tuple, based on the latest version of the machine-learning model.
  • the system continually analyzes new shipping data as it comes in, from other orders placed on the same e-Commerce website, and updates its own machine learning models to reflect the new realities represented by this data.
  • the system leverages continuous machine learning to drastically improve the reality reflected in the model; the invention allows this predicted shipping time to be accurate 99% of the time.
  • This invention significantly reduces the noise and uncertainty in shipping times predicted through existing methods. It uses machine learning to improve prediction accuracy and takes into account the ground realities and holiday schedules to ensure that 99% of its predictions are accurate. If the accuracy percentage starts slipping, the proposed method and the system adjust itself to bring the accuracy percentage back up.
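The following is a minimal, illustrative Python sketch of steps (B) and (C) referenced above: grouping ZIP codes into graph vertices with k-means over their latitude/longitude, and computing raw edge weights from the mean and standard deviation of historical transit times. It is not the implementation disclosed by the invention: scikit-learn's KMeans stands in for the customized k-means of step (B); the field names (lat, lon, hours, speed), the choice of representative ZIP code per cluster, and combining the mean plus one standard deviation into an edge weight are assumptions made for illustration, since the exact formula of step (C) is not reproduced here.

```python
import statistics
from sklearn.cluster import KMeans


def build_zip_centroid_map(zip_code_db, n_vertices):
    """Group ZIP codes into graph vertices by clustering their coordinates (step (B)).

    `zip_code_db` maps ZIP code -> {"lat": ..., "lon": ...}; `n_vertices` is supplied
    directly here, whereas step (B1) tunes the number of centroids iteratively.
    """
    zips = list(zip_code_db)
    coords = [[zip_code_db[z]["lat"], zip_code_db[z]["lon"]] for z in zips]
    model = KMeans(n_clusters=n_vertices, n_init=10, random_state=0).fit(coords)

    zip_centroid_map = {}
    for cluster in range(n_vertices):
        members = [z for z, label in zip(zips, model.labels_) if label == cluster]
        representative = members[0]  # any member stands in for the centroid in this sketch
        for member in members:
            zip_centroid_map[member] = representative
    return zip_centroid_map


def raw_edge_map(shipping_history, zip_centroid_map):
    """Compute raw edge weights per (source vertex, destination vertex, speed) from the
    mean and standard deviation of historical transit times (step (C)). Combining them
    as mean plus one standard deviation is an assumption made for illustration."""
    samples = {}
    for record in shipping_history:
        key = (zip_centroid_map[record["source"]],
               zip_centroid_map[record["destination"]],
               record["speed"])
        samples.setdefault(key, []).append(record["hours"])

    edges = {}
    for key, hours in samples.items():
        mu = statistics.mean(hours)
        sigma = statistics.pstdev(hours)
        edges[key] = mu + sigma
    return edges
```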

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Economics (AREA)
  • General Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Strategic Management (AREA)
  • Tourism & Hospitality (AREA)
  • Operations Research (AREA)
  • Marketing (AREA)
  • Human Resources & Organizations (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Artificial Intelligence (AREA)
  • Quality & Reliability (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Development Economics (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

A system (1) for determining a fastest shipping time to deliver a package to a shipping location, comprising a memory module (2) configured to store predefined databases, a handler (4) configured to input raw data and to update said predefined databases based on the raw data, a formatter (6) configured to format said updated predefined databases and to generate vertices of locations stored in the updated predefined databases based on said formatted updated predefined databases, a shipping digraph optimizer (8) comprising a graph vertex determiner (8a), an edge weight optimizer (8b) and a machine learning model tuner (8c), and a processing module (10) configured to process said tuned shipping graphs to output the fastest shipping time to deliver the package.

Description

A SYSTEM TO ESTIMATE THE FASTEST SHIPPING TIME OF A PACKAGE AND A METHOD THEREOF
Field of the invention:
The present invention relates to a system to estimate package shipping time using machine learning and a method thereof. In particular, the present invention relates to a system to estimate the fastest time to ship a package from one location to another using machine learning and a method thereof. More particularly, the present invention discloses a cloud-based system to estimate package-shipping times with 99% accuracy using machine learning and a method thereof.
Background of the invention:
It is known in the prior art that merchants implement a supply chain that allows goods to be transferred from manufacturers and factories, wholesalers, and distributors, to target locations. In particular, after obtaining the goods from the manufacturers and factories, the goods are then distributed to warehouses or are made available online on e-commerce websites. In today's online environment, a customer who wishes to purchase goods from a merchant website utilises e-commerce benefits, like receiving an estimate of the time to ship a package of goods to the customer's delivery location. When shopping online, goods of all sizes require shipping, and this often occurs between a retailer's warehouse or storeroom and the customer's home, office, or other designated delivery location.
The shipping services field is important to people and businesses across the world. Providing accurate delivery time is a challenging task with many facets. While the shipping services field has grown and matured over the years, delivery and shipping inefficiencies still cause substantial problems for shipping senders, shipping recipients, and shipping carriers alike. In particular, it can be difficult for shipping senders and/or recipients to accurately estimate transit time (and thus delivery time) for a given shipment.
In fact, the purchase price paid by a consumer includes a shipping fee, and customers require the most accurate estimated time of delivery. Many vendors typically supply many different retailers, each having separate warehouses, physical storerooms, and even warehouses for carriers that make the last mile delivery. Because of this, most retailers establish their own supply chain network. While a dedicated supply chain network has an upside in providing increased control and flexibility within the network, there are normally significant costs associated with having too little or too much capacity in the network. Moreover, consumers tend to wait additional time for their delivery because retailers or carriers may need to ship a full truckload for economic reasons. Another issue is with synchronizing other carrier loads with the ones that are expected by the consumers. These costs can translate to higher shipping costs to be paid by the customer or be absorbed by the retailer or vendor, or lost sales because delivery times are too long, to name a few.
In this regard, WO2020034044A1 aims to overcome the problem of estimating an accurate time of shipping a package by providing a system and method for providing delivery options. The method includes interfacing a supply-chain management system with a retailer user interface and detecting a request from the retailer user interface to schedule a delivery of one or more items through an integrated supply chain network coordinated by the supply-chain management system. The method also includes, based on the request, determining from the integrated supply chain network, at least one delivery option for at least one delivery date, each delivery option comprising a route, one or more carriers for the route, and a time window. The method also includes, for each delivery option, using a prediction engine and at least one data model to compute at least one delivery prediction parameter indicative of a likelihood of success for completing the route within the time window using the one or more carriers for the route, the at least one data model having been generated using historical delivery data. The method also includes providing, in response to the request, each of the at least one delivery option augmented with the at least one delivery prediction parameter to enable the retailer user interface to display the at least one delivery option augmented with the at least one delivery prediction parameter in a user interface element enabling selection of a delivery date for the one or more items.
The above-cited prior art does not contribute to estimating the shortest route when estimating the shipping time of a package. Thus, prior art CN107276896A provides a point-to-point shortest-route search method that improves Dijkstra's algorithm and includes the following steps: 1) an adjacency matrix from graph theory is introduced to assist the search procedure, i.e., each time a vertex's shortest distance to the initial vertex is determined, the row of the adjacency matrix corresponding to that vertex is searched for elements equal to 1; 2) distance values are calculated and analysed only for the vertices whose elements in step 1) equal 1, effectively reducing the amount of calculation in the search procedure.
This prior art aims to avoid useless calculations for unconnected nodes by performing path analysis on the adjacency matrix, effectively reducing the amount of calculation while ensuring that the found route is the shortest. However, there are still problems in estimating the fastest time of shipping a package or goods.
To overcome the problems of prior art, another patent application US20180204229A1 discloses a method for improving parcel delivery. The method comprises generating a first and a second cross-carrier delivery prediction model based on delivery data retrieved from a plurality of carriers; at a delivery data database of a server system, storing a plurality of cross-carrier delivery models comprising the first and the second cross-carrier delivery prediction model, the first cross-carrier delivery prediction model stored in association with a first carrier identifier and the second cross-carrier delivery prediction model stored in association with a second carrier identifier, for improving upon data storage and retrieval functionality to enable substantially real-time analysis of cross-carrier data to determine delivery estimates of improved accuracy; determining a first parcel delivery estimate for a first parcel based on processing first parcel data, from a first carrier of the plurality of carriers, with the first cross-carrier delivery prediction model retrieved from the delivery data database; automatically determining a second parcel delivery estimate for a second parcel based on a first parcel delivery status and upon retrieving, from a second carrier of the plurality of carriers, second parcel data, wherein determining the second parcel delivery estimate comprises: transmitting a first API request to the second carrier associated with the second parcel; establishing a webhook endpoint Uniform Resource Locator (URL) for receiving communication from the first carrier associated with the first parcel; at the webhook endpoint Uniform Resource Locator (URL) associated with the first carrier, receiving a request from the first carrier in response to the first carrier updating the first parcel delivery status; in response to receiving the request at the webhook endpoint URL associated with the first carrier, automatically transmitting a second API request to the second carrier to obtain updated second parcel data; and outputting the second parcel delivery estimate upon retrieving the second cross-carrier delivery prediction model from the delivery data database and processing updated second parcel features with the second cross-carrier delivery prediction model in substantially real-time, as facilitated by the improved data storage and retrieval functionality at the remote server system; and in response to determining the second parcel delivery estimate, automatically selecting a service level for delivering the second parcel; generating, at a remote delivery system, a set of control instructions; and executing the delivery of the second parcel based on the selected service level and the set of control instructions.
However, none of the prior art addresses the issue of estimating the delivery time of a package with close to absolute accuracy. Moreover, none of the cited prior art documents proposes a solution that considers real-time obstructions in estimating the shipping time of a package.
Prior Art depends on the assumption of static travel times between different nodes, which is incorrect (for instance, the travel time from Spokane, WA to Seattle, WA can be from 4 hours to 3 days depending upon traffic, closures, and other road conditions). Any model that relies on static data about distances or times or seasonality - as all Prior Art does - is bound to introduce inaccuracies, as the data does not always reflect reality.
Prior Art also uses shipment-by-shipment data to estimate shipping times, thereby ignoring multi-modal configurations that may involve backtracking. As an example, just because shipments have only been carried from Seattle, WA to Sacramento, CA by truck does not mean that a complex yet faster routing (say, through San Francisco, CA by plane) is impossible.
Prior Art also does not factor in seasonality and holiday schedule in making delivery estimations, which is bound to introduce further errors into the prediction.
Thus, there is a requirement to provide a system to accurately estimate shipping time of a package to overcome the drawbacks of prior art.
Summary of the invention:
The present invention provides a method and a system that leverage three critical elements to estimate the fastest shipping time: a. a machine learning model that continually creates and optimizes a directed graph (digraph) that models the various modes of transportation available between two non-terminal end points. The machine-learning model continually observes both transit times and nodes for any changes (as an example, there may be new nodes introduced in Kansas during peak times, like Christmas, that are operational only in November and December) and recreates the digraph every 15 minutes (~100x per day). b. a shortest-path algorithm implementation that uses the correct digraph (depending upon the shipping speed) to find the fastest shipping time possible. c. a shipping time rationalizer that incorporates the impact of holidays and other scheduled or known disruptions (e.g. airport closures).
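As a minimal sketch only, the following Python fragment illustrates how one speed-specific digraph per shipping speed might be represented and recreated on a roughly 15-minute cadence; the class name, field names, and the simple dictionary-of-dictionaries edge store are illustrative assumptions rather than the disclosed implementation.

```python
import time
from collections import defaultdict

REFRESH_SECONDS = 15 * 60  # digraphs are recreated roughly every 15 minutes (~100x/day)


class ShippingDigraph:
    """One directed graph per shipping speed; edge weights are transit times in hours."""

    def __init__(self, speed):
        self.speed = speed              # e.g. "ground", "saver", "priority"
        self.edges = defaultdict(dict)  # edges[src_vertex][dst_vertex] = hours
        self.built_at = 0.0             # timestamp of the last rebuild

    def add_edge(self, src, dst, hours):
        self.edges[src][dst] = hours

    def is_stale(self):
        return time.time() - self.built_at > REFRESH_SECONDS


def rebuild_digraphs(shipping_history, speeds=("ground", "saver", "priority")):
    """Recreate every speed-specific digraph from the latest shipping-history records."""
    graphs = {}
    for speed in speeds:
        graph = ShippingDigraph(speed)
        for record in shipping_history:
            if record["speed"] == speed:
                graph.add_edge(record["source"], record["destination"], record["hours"])
        graph.built_at = time.time()
        graphs[speed] = graph
    return graphs
```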
The present invention radically improves the accuracy of shipping time predictions made for e-commerce websites that rely on multi-modal, multi-speed logistics networks to 99%. The system of the invention accepts as inputs through an Application Programming Interface (API): the source and destination ZIP codes and the chosen shipping speed; shipping history (source, destination, hops, and shipping speed and time by hop); all ZIP codes and their corresponding longitude/latitude; and population by ZIP code, and returns (through the same API) the fastest shipping time for the combination as the output.
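A hedged illustration of the API contract described above is given below; the endpoint URL, query parameter names, and JSON field name are hypothetical, since the disclosure only specifies the three inputs (source ZIP, destination ZIP, shipping speed) and a fastest-shipping-time output.

```python
import json
import urllib.parse
import urllib.request

# Hypothetical endpoint; the disclosure only specifies the three input parameters
# (source ZIP, destination ZIP, shipping speed) and a fastest-shipping-time output.
BASE_URL = "https://shipping-estimator.example.com/v1/fastest-shipping-time"


def fastest_shipping_time(source_zip, destination_zip, shipping_speed):
    """Call the estimation API and return the predicted fastest shipping time."""
    query = urllib.parse.urlencode({
        "source_zip": source_zip,
        "destination_zip": destination_zip,
        "shipping_speed": shipping_speed,  # "ground", "saver" or "priority"
    })
    with urllib.request.urlopen(f"{BASE_URL}?{query}") as response:
        payload = json.loads(response.read())
    return payload["fastest_shipping_time_hours"]


# Example call (hypothetical ZIP codes):
# fastest_shipping_time("98101", "10001", "priority")
```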
According to an embodiment of the present invention, a system for determining a fastest shipping time to deliver a package to a shipping location comprising a memory module configured to store predefined databases, a handler configured to input raw data and to update said predefined databases based on the raw data, a formatter configured to format said updated predefined databases and to generate vertices of locations stored in the updated predefined databases based on said formatted updated predefined databases, and a shipping digraph optimizer comprising a graph vertex determiner, an edge weight optimizer and a machine learning model tuner.
According to an embodiment of the present invention, the graph vertex determiner is configured to receive said vertices from the formatter and to generate shipping vertices with edge weights distributed in all directions for the input location.
According to an embodiment of the present invention, the edge weight optimizer is configured to receive the vertices from the formatter and to compute raw edge weights on the vertices based on historical shipping outcomes for said shipping vertices stored in said predefined databases.
According to an embodiment of the present invention, the machine learning model tuner is configured to receive said shipping vertices with edge weights distributed in all directions and generate raw shipping graphs based on the said shipping vertices, receive computed raw edge weights and generate raw shipping edge weights, apply machine learning inputs to said generated raw shipping graphs and to said generated raw shipping edge weights for tuning of said generated raw shipping graphs, wherein a shipping-interruption-database provides said machine learning inputs, and output said tuned raw shipping graphs.
According to an embodiment of the present invention, the system further comprises a processing module configured to process said tuned shipping graphs to output the fastest shipping time to deliver the package.
According to an embodiment of the present invention, the machine learning model tuner is further configured to store said machine learning inputted generated raw shipping graphs among the predefined databases.
According to an embodiment of the present invention, the machine learning model tuner comprises a short-path interface module configured to tune said generated shipping graphs.
According to an embodiment of the present invention, the processing module comprises an input pre-processor, a shortest pathfinder and a rationalizer.
According to an embodiment of the present invention, the input pre-processor is configured to receive data packets through an Internet-based interface (API) for determining the fastest shipping time to deliver the package; unpack the data packets and validate the variables of shipping information on the basis of information stored in said predefined databases; and confirm that data points, related to source/destination/shipping speed, present in the variables satisfy the requirement to accurately determine the shipping time of the package.
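A minimal sketch of such an input pre-processor is shown below; the minimum-data-point threshold, the accepted speed values, and the in-memory database shapes are assumptions, as the disclosure only requires that ZIP codes and shipping speeds be validated and that enough data points exist for a meaningful prediction.

```python
MIN_DATA_POINTS = 30  # assumed threshold; the disclosure only requires "enough data points"
VALID_SPEEDS = {"ground", "saver", "priority"}


def preprocess_request(payload, zip_code_db, shipping_history_db):
    """Unpack and validate an incoming API call before the shortest-path search runs."""
    source = payload["source_zip"]
    destination = payload["destination_zip"]
    speed = payload["shipping_speed"]

    # Validate the ZIP codes against the ZIP code database and the shipping speed value.
    if source not in zip_code_db or destination not in zip_code_db:
        raise ValueError("unknown source or destination ZIP code")
    if speed not in VALID_SPEEDS:
        raise ValueError(f"unsupported shipping speed: {speed}")

    # Confirm there are enough historical data points for this combination
    # to make a meaningful prediction.
    matching = [
        r for r in shipping_history_db
        if r["source"] == source and r["destination"] == destination and r["speed"] == speed
    ]
    if len(matching) < MIN_DATA_POINTS:
        raise ValueError("insufficient data points for this source/destination/speed")

    return source, destination, speed
```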
According to an embodiment of the present invention, the shortest pathfinder module is configured to initiate determining the shortest path to deliver the package on said tuned shipping graphs based on a refined Dijkstra's shortest-path algorithm and to output the shortest path.
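The refinements to Dijkstra's algorithm are not detailed in this disclosure, so the sketch below shows a plain textbook Dijkstra over a time-weighted digraph (a mapping from vertex to {neighbour: hours}) that returns both the fastest time and the hop sequence for later rationalization.

```python
import heapq


def fastest_path_hours(edges, source, destination):
    """Plain Dijkstra over a time-weighted digraph.

    `edges` maps each vertex to a dict of {neighbour: transit hours}. Returns the
    fastest total time in hours and the hop sequence from source to destination.
    """
    best = {source: 0.0}
    previous = {}
    queue = [(0.0, source)]
    visited = set()

    while queue:
        hours, vertex = heapq.heappop(queue)
        if vertex in visited:
            continue
        visited.add(vertex)
        if vertex == destination:
            # Reconstruct the hop sequence for the rationalizer.
            path = [vertex]
            while path[-1] != source:
                path.append(previous[path[-1]])
            return hours, list(reversed(path))
        for neighbour, edge_hours in edges.get(vertex, {}).items():
            candidate = hours + edge_hours
            if candidate < best.get(neighbour, float("inf")):
                best[neighbour] = candidate
                previous[neighbour] = vertex
                heapq.heappush(queue, (candidate, neighbour))

    return float("inf"), []  # destination unreachable in this digraph
```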
According to an embodiment of the present invention, the rationalizer is configured to fetch shipping-interruption information from the shipping-interruption database; overlay the shipping-interruption information on the output of the shortest-path-finder module; and factor in a potential delay in the shipping time of the package on the shortest path to output the fastest time to deliver the package.
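An illustrative sketch of the rationalizer step follows; the record fields loosely mirror the shipping-interruption database fields described elsewhere (source, destination, method, delta), while the date-window check on the delta start/end dates is omitted for brevity and the field names themselves are assumptions.

```python
def rationalize(path, base_hours, shipping_speed, interruption_db):
    """Overlay scheduled/implied shipping interruptions (holidays, weather closures)
    on the shortest-path result and return the adjusted fastest shipping time."""
    delay_hours = 0.0
    # Assumed record shape, loosely mirroring the shipping-interruption database fields:
    # {"source": ..., "destination": ..., "method": ..., "delta_hours": ...}
    for src, dst in zip(path, path[1:]):
        for interruption in interruption_db:
            if (interruption["source"] == src
                    and interruption["destination"] == dst
                    and interruption["method"] == shipping_speed):
                delay_hours += interruption["delta_hours"]
    return base_hours + delay_hours
```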
According to an embodiment of the present invention, the shortest-path-finder module refreshes shipping graphs every 15 minutes to consider any updates in the shipping data. According to an embodiment of the present invention, the machine learning model tuner further comprises a machine-learning-feedback module configured to monitor the accuracy of determinations made by said graph vertex determiner and said edge weight optimizer.
According to an embodiment of the present invention, a method for determining a fastest shipping time to deliver a package to a shipping location, comprising: accepting, by an input pre-processor, a call from a caller, wherein the call comprises variables through an API, unpacking said variables containing ZIP codes and validating the ZIP codes using a ZIP code database, validating the presence of a sufficient number of data points in the variables, invoking a shortest path finder module to find out a shortest path to deliver a package from a starting location to the shipping location, wherein the shortest path finder module uses graphs stored in the database, tuning said graphs with machine learning inputs, selecting one said tuned graph for finding the fastest shipping time and feeding the selected tuned graph for rationalizing, fetching shipping-interruption details from the database and overlaying them on input from the shortest-path-finder module, and returning the fastest-shipping-time as an API response to the caller.
According to an embodiment of the present invention, the method for determining a fastest shipping time to deliver a package to a shipping location further comprising monitoring the accuracy of the fastest-shipping-time predictions by comparing the realized shipping times to predicted shipping times for all predictions.
According to an embodiment of the present invention, the method for determining a fastest shipping time to deliver a package to a shipping location further comprising, if the accuracy of the predictions is greater than 99%: for each vertex, monitoring the accuracy of the last k predictions by comparing the realized shipping times to the predicted shipping times for all predictions involving that vertex; and for each route, monitoring the accuracy of the last k predictions by comparing the realized shipping time for each prediction with the predicted shipping time.
According to an embodiment of the present invention, the method for determining a fastest shipping time to deliver a package to a shipping location further comprising, if the accuracy is not greater than 99%, adjusting machine learning input parameters by launching model recalibration.
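The sketch below illustrates, under assumptions, the continuous-validation loop of the two preceding paragraphs: overall accuracy is monitored, per-vertex and per-route accuracy is tracked while the 99% target is met, and model recalibration is launched otherwise. How a single prediction is scored as accurate is not specified in the disclosure, so a fixed tolerance window is assumed here.

```python
TARGET_ACCURACY = 0.99
TOLERANCE_HOURS = 12.0  # assumed: a prediction counts as accurate if within this window


def prediction_accuracy(predictions):
    """Share of predictions whose realized time matched the predicted time within tolerance."""
    if not predictions:
        return 1.0
    hits = sum(
        1 for p in predictions
        if abs(p["realized_hours"] - p["predicted_hours"]) <= TOLERANCE_HOURS
    )
    return hits / len(predictions)


def continuous_validation(all_predictions, by_vertex, by_route, recalibrate):
    """Monitor overall accuracy; drill into vertices and routes while above the 99% target,
    and launch model recalibration when the target is not met."""
    if prediction_accuracy(all_predictions) > TARGET_ACCURACY:
        vertex_accuracy = {v: prediction_accuracy(p) for v, p in by_vertex.items()}
        route_accuracy = {r: prediction_accuracy(p) for r, p in by_route.items()}
        return vertex_accuracy, route_accuracy
    recalibrate()  # adjust the machine-learning input parameters via model recalibration
    return None, None
```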
The present invention is made with an objective to provide a system to estimate package shipping time using machine learning and a method thereof. The present invention is made with an objective to provide a system that estimates package shipping time using machine learning with high accuracy and a method thereof.
The present invention is made with an objective to provide a method to reduce the margin of error in generated shipping time estimates.
The present invention is made with an objective to provide a system to estimate a package shipping time that factors in seasonality and holiday schedules in making delivery estimations, the absence of which is bound to introduce further errors into the prediction.
BRIEF DESCRIPTION OF THE DRAWINGS
These and other features, aspects, and advantages of the present invention will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
Figure 1 of the present invention illustrates a system to estimate package shipping time using machine learning according to an embodiment of the present invention.
Figure 2 of the present invention is a flowchart of a method to estimate package shipping time using machine learning according to an embodiment of the present invention.
Figure 3 of the present invention is a flow chart of a machine learning recalibration model according to an embodiment of the present invention.
Figure 4 of the present invention is a flow chart of a continuous validation for a machine-learning model according to an embodiment of the present invention.
DETAILED DESCRIPTION OF THE INVENTION WITH NON-LIMITING ILLUSTRATIONS AND EMBODIMENTS
For the purpose of promoting an understanding of the principles of the invention, reference will now be made to the embodiment illustrated in the drawings and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the invention is thereby intended; such alterations and further modifications in the illustrated system, and such further applications of the principles of the invention as illustrated therein, are contemplated as would normally occur to one skilled in the art to which the invention relates. Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The system, methods, and examples provided herein are illustrative only and not intended to be limiting. Embodiments of the present invention will be described below in detail with reference to the accompanying drawings.
The term “some” as used herein is to be understood as “none or one or more than one or all.” Accordingly, the terms “none,” “one,” “more than one,” “more than one, but not all” or “all” would all fall under the definition of “some.” The term “some embodiments” may refer to no embodiments or to one embodiment or to several embodiments or to all embodiments, without departing from the scope of the present disclosure.
The terminology and structure employed herein is for describing, teaching, and illuminating some embodiments and their specific features. It does not in any way limit, restrict or reduce the spirit and scope of the claims or their equivalents.
More specifically, any terms used herein such as but not limited to “includes,” “comprises,” “has,” “consists,” and grammatical variants thereof do not specify an exact limitation or restriction and certainly do not exclude the possible addition of one or more features or elements, unless otherwise stated, and furthermore must not be taken to exclude the possible removal of one or more of the listed features and elements, unless otherwise stated with the limiting language “must comprise” or “needs to include.”
Whether or not a certain feature or element was limited to being used only once, either way, it may still be referred to as “one or more features” or “one or more elements” or “at least one feature” or “at least one element.” Furthermore, the use of the terms “one or more” or “at least one” feature or element do not preclude there being none of that feature or element, unless otherwise specified by limiting language such as “there needs to be one or more . . . ” or “one or more element is required.”
Unless otherwise defined, all terms, and especially any technical and/or scientific terms, used herein may be taken to have the same meaning as commonly understood by one having ordinary skills in the art.
Reference is made herein to some “embodiments.” It should be understood that an embodiment is an example of a possible implementation of any features and/or elements presented in the attached claims. Some embodiments have been described for the purpose of illuminating one or more of the potential ways in which the specific features and/or elements of the attached claims fulfil the requirements of uniqueness, utility and non-obviousness. Use of the phrases and/or terms including, but not limited to, “a first embodiment,” “a further embodiment,” “an alternate embodiment,” “one embodiment,” “an embodiment,” “multiple embodiments,” “some embodiments,” “other embodiments,” “further embodiment”, “furthermore embodiment”, “additional embodiment” or variants thereof do not necessarily refer to the same embodiments. Unless otherwise specified, one or more particular features and/or elements described in connection with one or more embodiments may be found in one embodiment, or may be found in more than one embodiment, or may be found in all embodiments, or may be found in no embodiments. Although one or more features and/or elements may be described herein in the context of only a single embodiment, or alternatively in the context of more than one embodiment, or further alternatively in the context of all embodiments, the features and/or elements may instead be provided separately or in any appropriate combination or not at all. Conversely, any features and/or elements described in the context of separate embodiments may alternatively be realized as existing together in the context of a single embodiment.
Any particular and all details set forth herein are used in the context of some embodiments and therefore should not be necessarily taken as limiting factors to the attached claims. The attached claims and their legal equivalents can be realized in the context of embodiments other than the ones used as illustrative examples in the description below. Embodiments of the present invention will be described below in detail with reference to the accompanying drawings.
Figure 1 of the present invention illustrates a system (1) to estimate package shipping time using machine learning according to an embodiment of the present invention. The system (1) for determining a fastest shipping time to deliver a package to a shipping location comprises a memory module (2), a handler (4), a formatter (6), a shipping digraph optimizer (8) and a processing module (10).
In some embodiments, the memory module (2) is fed with databases, like a shipping history database (I), a ZIP code database (II) and a population database (III). The shipping history database (I) comprises data on the historical shipping outcomes. In some embodiments, the shipping history database may include data related to source, destination, speed, time-average, time standard deviation, count and last updates. In some embodiments, the ZIP code database (II) comprises data on the various zip-codes, the distance between them, and includes data of zip codes, associated longitudinal position and latitudinal position. In some embodiments, the population database (III) comprises data on the various population centers, their longitude and latitude positions.
In some embodiments, the handler (4) and the formatter (6) may be included as one module. This module is configured to accept new updates to the three datasets and to format and convert the existing data to feed the graph-vertex-determiner module and the edge-weight-optimizer module.
In some embodiments, the shipping digraph optimizer (8) comprises a graph vertex determiner (8a), an edge weight optimizer (8b) and a machine learning model tuner (8c).
In some embodiments, the graph vertex determiner (8a) comprises a ZIP code vertex checker (V), a hop granularity checker (VI), and a population gravity checker (VII). In some embodiments, the ZIP code vertex checker (V) may be a module that is configured to check each ZIP code to see whether that ZIP code should be grouped into a larger vertex (e.g. 10027 and 10031 should both be grouped into a 10001 New York City vertex). In some embodiments, the hop granularity checker (VI) may be a module that is configured to check whether the hops in the shipping data are at a comparable granularity to the vertices in the ZIP code data, and decides the overall granularity for the digraph. In some embodiments, the population gravity checker (VII) may be a module configured to make a final check on the grain of the data, corroborating the grouping decisions made by the ZIP-code-vertex-checker by ensuring that all grouped vertices have a population.
In some embodiments, the edge weight optimizer (8b) is configured to calculate the edge weights (time taken to travel from one node to another) based on the historical shipping outcomes stored in shipping-history-database. In some embodiments, the system comprises shipping-edge-weights that reflect the raw edge weights that are computed without any machine learning input. These weights will be used to compute an ML-improved digraph every 15 minutes. For example, a raw-shipping-graph is a set of raw vertices without any machine learning inputs applied to it, ready for further processing every 15 minutes.
In some embodiments, the machine learning model tuner (8c) comprises a short-path interface module (12). The short-path interface module (12) is configured to apply all the machine learning inputs to the digraphs that are finally stored in the shortest-path-finder module, and ensures that the learnings captured from new realities in the incoming data are reflected in those digraphs. The short-path interface module (12) is configured to tune said generated shipping graphs, extract the latest said tuned generated shipping graphs, and select a right digraph corresponding to a selected shipping speed.
In some embodiments, the processing module (10) comprises an input pre-processor (14), a shortest pathfinder module (16), and a rationalizer (18).
The input pre-processor (14) is configured to handle the incoming API call, ensuring that the ZIP codes and shipping speeds are valid and that there are enough data points to make a meaningful prediction. In some embodiments, the input pre-processor (14) is configured to receive data packets through an Internet-based interface (API) for determining the fastest shipping time to deliver the package. The input pre-processor (14) is further configured to unpack the data packets and to validate the variables of shipping information on the basis of information stored in said predefined databases. The input pre-processor (14) is further configured to confirm that the data points, related to source/destination/shipping speed, present in the variables satisfy the requirement for the number of data points needed to accurately determine the shipping time of the package.
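By way of a non-limiting illustration only, the Python sketch below shows what such input-pre-processor checks could look like. The function and variable names, the minimum-data-point threshold, and the in-memory lookup tables are assumptions introduced for illustration and are not part of the disclosed system.

MIN_DATA_POINTS = 30  # assumed threshold for a meaningful prediction

def validate_request(source_zip, dest_zip, speed, zip_db, history_counts):
    """Return an error string, or None if the request can be served."""
    if speed not in {"ground", "saver", "priority"}:
        return "unknown shipping speed: " + speed
    for z in (source_zip, dest_zip):
        if z not in zip_db:  # ZIP-code-database lookup
            return "unknown ZIP code: " + z
    # confirm enough historical data points exist for this source/destination/speed tuple
    if history_counts.get((source_zip, dest_zip, speed), 0) < MIN_DATA_POINTS:
        return "insufficient data points for this source/destination/speed"
    return None

zip_db = {"10001": (40.75, -73.99), "94105": (37.79, -122.39)}
history_counts = {("10001", "94105", "ground"): 120}
print(validate_request("10001", "94105", "ground", zip_db, history_counts))  # prints None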
When invoked by the input pre-processor (14), the shortest pathfinder module (16) is configured to run a refined version of Dijkstra's shortest-path algorithm to find the fastest shipping time between the source and destination vertices. In some embodiments, the shortest pathfinder module (16) is configured to initiate determining the shortest path to deliver the package using said right digraph based on the refined Dijkstra's shortest-path algorithm and to output the shortest path. The shortest-path-finder module (16) is configured to refresh the tuned shipping graphs every 15 minutes to consider any updates in the shipping data and to ensure that the tuned shipping graphs are recent, so that a right digraph is selected for the selected shipping speed.
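The disclosure refers to a "refined" version of Dijkstra's algorithm without detailing the refinement; purely as an illustrative sketch, a textbook Dijkstra search over a digraph keyed by ZIP-code vertices with edge weights in hours could look as follows. The toy graph literal and function name are assumptions.

import heapq

def fastest_time(graph, source, destination):
    """Textbook Dijkstra over a digraph of the form {vertex: {neighbor: hours}}."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == destination:
            return d
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in graph.get(u, {}).items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return float("inf")  # destination not reachable in this digraph

graph = {
    "10001": {"60601": 20.0, "30301": 18.0},  # illustrative centroid ZIP vertices
    "60601": {"94105": 30.0},
    "30301": {"94105": 36.0},
}
print(fastest_time(graph, "10001", "94105"))  # prints 50.0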
The rationalizer (18) is configured to factor in the potential delay (by leveraging the shipping-interruption database, #XVI) that may be caused by holidays and planned or implied interruptions due to weather or any other factors, and to output the fastest shipping time based on the speed chosen. In some embodiments, the rationalizer (18) is configured to fetch shipping-interruption information from the shipping-interruption database. In some embodiments, the rationalizer (18) is further configured to overlay the shipping-interruption information on the output of the shortest-path-finder module. In some embodiments, the rationalizer (18) is further configured to factor in a potential delay in the shipping time of the package along the shortest path to output the fastest time to deliver the package.
In some embodiments, the system further comprises a shipping interruption database (XVI). The shipping interruption database (XVI) comprises data on the various scheduled and implied shipping disruptions, such as source, destination, method, delta start date, delta end date, and delta.
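As a non-limiting sketch only, overlaying such interruption records on a shortest-path estimate could look as follows; the record shape mirrors the fields listed above (source, destination, method, delta start/end dates, delta), but the field names, the delta being expressed in hours, and the example values are assumptions.

from datetime import date

interruptions = [
    {"source": "10001", "destination": "94105", "method": "ground",
     "delta_start": date(2022, 12, 23), "delta_end": date(2022, 12, 26),
     "delta_hours": 24.0},
]

def rationalize(base_hours, source, dest, method, ship_date, records):
    """Add the delta of every applicable interruption to the base estimate."""
    total = base_hours
    for rec in records:
        if ((rec["source"], rec["destination"], rec["method"]) == (source, dest, method)
                and rec["delta_start"] <= ship_date <= rec["delta_end"]):
            total += rec["delta_hours"]
    return total

print(rationalize(50.0, "10001", "94105", "ground", date(2022, 12, 24), interruptions))  # prints 74.0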
According to an embodiment of the present invention, the memory module (2) is configured to store predefined databases. The predefined databases include the shipping history database (I), the ZIP code database (II) and/or the population database (III). According to an embodiment of the present invention, the handler (4) is configured to input raw data and to update said predefined databases based on the raw data. According to an embodiment of the present invention, the formatter (6) is configured to format the updated predefined databases and to generate vertices of locations stored in the updated predefined databases based on the formatted updated predefined databases.
According to an embodiment of the present invention, the graph vertex determiner (8a) is configured to receive the vertices from the formatter (6) and to generate shipping vertices with edge weights distributed in all directions for the input location. According to an embodiment of the present invention, the edge weight optimizer (8b) is configured to receive the vertices from the formatter (6) and to compute raw edge weights on the vertices based on historical shipping outcomes for the shipping vertices stored in the predefined databases.
According to an embodiment of the present invention, the machine learning model tuner (8c) is configured to receive said shipping vertices with edge weights distributed in all directions and to generate raw shipping graphs based on the said shipping vertices. In some embodiments, the machine learning model tuner (8c) is configured to receive the computed raw edge weights and to generate raw shipping edge weights. The machine learning model tuner (8c) is configured to apply machine learning inputs to the generated raw shipping graphs and to the generated raw shipping edge weights for tuning of said generated raw shipping graphs. In some embodiments, the machine learning model tuner (8c) is configured to output said tuned raw shipping graphs. The shipping-interruption database provides the machine learning inputs. According to an embodiment of the present invention, the processing module (10) is configured to process the tuned shipping graphs to output the fastest shipping time to deliver the package.
Figure 2 of the present invention is a flowchart of a method to estimate package shipping time using machine learning according to an embodiment of the present invention. According to an embodiment of the present invention, the method for determining a fastest shipping time to deliver a package to a shipping location comprises: accepting (S1), by an input pre-processor, a call from a caller, wherein the call comprises variables passed through an API; unpacking (S2) said variables containing ZIP codes; validating (S3) the ZIP codes using a ZIP code database; validating (S4) the presence of a sufficient number of data points in the variables; invoking (S5) a shortest-path-finder module; checking (S6) the timings of the stored fine-tuned graphs to extract recent tuned graphs and selecting a right digraph to find the fastest shipping time; tuning said digraph with machine learning inputs, selecting one said tuned graph for finding the fastest shipping time, and feeding the selected tuned graph for rationalizing; fetching (S8) shipping-interruption details from the database and overlaying them on the input from the shortest-path-finder module; and returning (S9) the fastest-shipping-time as an API response to the caller.
In some embodiments, the method comprises a step of accepting, by the input-pre-processor, API calls and unpacking the variables {source-ZIP, destination-ZIP, shipping-speed}. The next step is validating the ZIP codes by confirming that they exist in the databases, confirming that there are enough data points for the source/destination/shipping-speed combination to predict accurately, and invoking the shortest-path-finder module.
In some embodiments, the method comprises a step of checking, by the shortest-path-finder module, the timestamp on the tuned-shipping-graphs store and requesting a refresh if it has been more than 15 minutes. In the next steps, the shortest-path-finder module is configured to select the right digraph for the chosen shipping speed and run a fine-tuned implementation of the Dijkstra algorithm to find the fastest shipping time between source and destination. The next step is that the shortest-path-finder module is configured to invoke the rationalizer module.
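As an illustrative sketch of the staleness check described above (the store layout and function names are assumptions made for illustration):

import time

REFRESH_SECONDS = 15 * 60  # the 15-minute window described above

def current_digraph(store, speed, rebuild_fn):
    """Return the tuned digraph for the chosen speed, requesting a refresh if stale.
    `store` is an assumed dict: {"built_at": epoch seconds, "graphs": {speed: digraph}}."""
    if time.time() - store["built_at"] > REFRESH_SECONDS:
        store["graphs"] = rebuild_fn()   # re-apply the machine learning inputs
        store["built_at"] = time.time()
    return store["graphs"][speed]

store = {"built_at": 0.0, "graphs": {}}
g = current_digraph(store, "ground", lambda: {"ground": {}, "saver": {}, "priority": {}})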
In some embodiments, the method includes applying machine learning transforms, by the shortest-path-interface module when invoked by raw-shipping-graph or shipping-edge-weights, to adjust the graph vertices and edge weights to include machine learning insights in the raw digraph (e.g. change vertex groupings or adjust edge-weight standard deviations upward or downward depending upon recent accuracy readings).
In some embodiments, the method includes triggering the recalibration of the entire predictive model, by the shortest-path-interface module when invoked by the shortest-path-finder module, to re-run processes (A), (B), (B1), (B2), (B3), (B4), (C), (C1), (C2), and (D), as shown in Fig. 3, to ingest any incremental new data and re-generate the digraphs to reflect the new reality.
In some embodiments, the method comprises a step, by the rationalizer module, of fetching {shipping-interruption} details from the database and overlaying them on the output of the shortest-path-finder module. The next step is, by the rationalizer module, packing the output variable {fastest-shipping-time} into an API response and returning it to the caller, to be processed further and displayed to the end-user as appropriate.
Figure 3 of the present invention is a flow chart of a machine learning recalibration model according to an embodiment of the present invention. In some embodiments, step (A) illustrates the data handling and formatting module as configured to ingest raw data, format the data and perform validations, shape the data and store it into the database(s), and invoke step (B) and step (C) if there are any changes.
In some embodiments, step (B) illustrates the zip-code-vertex-checker module as configured to run a customized k-means clustering algorithm to identify the appropriate centroid for each ZIP code, create a mapping matrix {zip-centroid-map} defining, for each ZIP code, the centroid ZIP code it should be replaced by, rewrite the shipping-history-database table to replace all occurrences of the original ZIP codes with the centroids, and invoke steps (B1) and (B2).
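The customized clustering itself is not disclosed in detail; as a non-limiting sketch, a plain k-means over ZIP-code coordinates (here via scikit-learn, chosen only for illustration) could produce a {zip-centroid-map} as follows. The choice of library, the nearest-member rule for naming each centroid, and the coordinate inputs are assumptions.

import numpy as np
from sklearn.cluster import KMeans

def build_zip_centroid_map(zip_latlon, n_centroids):
    """Cluster ZIP coordinates and map every ZIP code to a centroid ZIP code."""
    zips = list(zip_latlon)
    coords = np.array([zip_latlon[z] for z in zips])
    km = KMeans(n_clusters=n_centroids, n_init=10, random_state=0).fit(coords)
    centroid_zip = {}
    for c in range(n_centroids):
        members = [i for i, lbl in enumerate(km.labels_) if lbl == c]
        # name the cluster after the member ZIP closest to its centre
        best = min(members, key=lambda i: float(np.linalg.norm(coords[i] - km.cluster_centers_[c])))
        centroid_zip[c] = zips[best]
    return {zips[i]: centroid_zip[km.labels_[i]] for i in range(len(zips))}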
In some embodiments, step (B1) illustrates the hop-granularity-checker module as configured to run a test to check whether all vertices are grouped relatively evenly. Step (B1) further illustrates the hop-granularity-checker module as configured to calculate the edge importance distribution EI over the edge traversal times, where T_i is the time taken to traverse the i-th of the N total edges and μ is the average of the times taken to traverse those N edges. Step (B1) further illustrates the hop-granularity-checker module as configured to re-run the k-means clustering algorithm of step (B) to ensure that the EI value is in the range [0.1, 0.3]. According to an embodiment of the invention, if EI is at the smaller end of that range, the k-means clustering algorithm is run to find a greater number of centroids; if it is at the larger end of that range, the k-means clustering algorithm is run to find fewer centroids. Step (B1) further illustrates the hop-granularity-checker module as configured to store the output through step (B3).
In some embodiments, step (B2) illustrates the population-gravity-checker as configured to use the population database and run a test on {zip-centroid-map} to compare the population of each centroid, calculating the vertex importance distribution VI, where C_i is the population of the i-th of the N total centroids and μ is the average of the populations of those N centroids. If VI > 0.3, it runs an adjustment algorithm to bring VI under 0.3. Once VI is under 0.3, it stores the output through step (B3). Step (B3) relates to a raw-shipping-graph module that is configured to store the vertices of the shipping graph from step (B2) with a unit edge weight in all directions.
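The equations for EI and VI are reproduced only as images in the published document; purely as an assumed rendering for illustration, both can be read as a normalized dispersion (standard deviation divided by mean) over edge traversal times and centroid populations respectively. The sketch below uses that assumed form together with illustrative values.

def dispersion(values):
    """Assumed form of EI / VI: population standard deviation divided by the mean."""
    n = len(values)
    mu = sum(values) / n
    return (sum((v - mu) ** 2 for v in values) / n) ** 0.5 / mu

edge_times = [22.0, 25.5, 19.0, 31.0, 27.5]        # hours per edge (illustrative)
centroid_pops = [8_400_000, 2_700_000, 3_900_000]  # population per centroid (illustrative)

ei = dispersion(edge_times)     # target range [0.1, 0.3]: low -> more centroids, high -> fewer
vi = dispersion(centroid_pops)  # a value above 0.3 would trigger the adjustment algorithm
print(round(ei, 3), round(vi, 3))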
Further, step (C) illustrates the edge-weight-optimizer module as configured to ingest formatted records from the shipping-history-database and create a raw edge weight mapping matrix {raw-edge-map} as follows:
• source = same as source ZIP
• destination = same as destination ZIP
• edge weight EW = MAX(μ(source, destination) + α × σ(source, destination), t_crow), where μ and σ are the mean and standard deviation of the time taken to go from the source to the destination and α is a multiplier applied to the standard deviation
• serialize and store the EWs in the {shipping-edge-weights} matrix and trigger step (C1)
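As a non-limiting sketch under stated assumptions: the multiplier α is taken to be a configurable constant, and t_crow is read as a straight-line ("as the crow flies") lower bound derived from the ZIP coordinates; neither interpretation is spelled out in the published text.

import math

def crow_hours(lat1, lon1, lat2, lon2, speed_kmh=60.0):
    """Assumed lower bound: great-circle distance divided by an assumed line-haul speed."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a)) / speed_kmh

def edge_weight(mu, sigma, t_crow, alpha=1.0):
    """EW = MAX(mu + alpha * sigma, t_crow), per the bullet above."""
    return max(mu + alpha * sigma, t_crow)

t_crow = crow_hours(40.75, -73.99, 41.88, -87.63)     # e.g. New York -> Chicago
print(edge_weight(mu=22.0, sigma=4.0, t_crow=t_crow))  # prints 26.0 (mu + sigma exceeds t_crow here)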
Further, step (C1) illustrates the shipping-edge-weights module as configured to store the raw edge weights that are computed without any machine learning input, which are then used in step (B4) and step (C2) to overlay ML inputs on. Steps (B4) and (C2) illustrate the shortest-path-interface module as configured to apply the machine learning inputs from steps (D1) and (D2) to the raw graphs and raw edges from step (B3) and step (C1), before exporting the serialized ML-enhanced digraphs to be stored in the Part 1 flow.
Further, step (D1) illustrates the machine-learning-feedback module for vertices as configured to, for each vertex j, monitor the accuracy of the last k predictions, where R(j, k, s) is the realized shipping time for a prediction involving vertex j with shipping speed s, and μ(j, k, s) and σ(j, k, s) are the mean shipping time and standard deviation of shipping time for the last k predictions involving vertex j at shipping speed s.
Accordingly, if accuracy(j, k, s) < 0.99, the need for a correction is identified. The values of σ(j, k, s) are then adjusted upward for all j so that the condition accuracy(j, k, s) > 0.99 is satisfied again. The values μ(j, k, s) and σ(j, k, s) are updated in the appropriate databases for vertex j and shipping speed s.
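The accuracy formula of step (D1) is likewise reproduced as an image; as an assumed reading for illustration only, accuracy can be treated as the fraction of the last k realized shipping times that do not exceed the predicted bound μ + σ for that vertex and speed.

def vertex_accuracy(realized, mu, sigma):
    """Assumed form: share of the last k realized times within the predicted bound."""
    bound = mu + sigma
    return sum(1 for r in realized if r <= bound) / len(realized)

realized = [23.0, 25.5, 24.0, 27.0, 22.5]             # last k realized times (illustrative)
print(vertex_accuracy(realized, mu=24.0, sigma=3.5))   # prints 1.0, so no correction is needed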
Further, step (D2) illustrates the machine-learning-feedback module for edge weights as configured to, for each route j, monitor the accuracy of the last k predictions, where R(j, k, s) is the realized shipping time for a prediction on route j with shipping speed s, and μ(j, k, s) and σ(j, k, s) are the mean shipping time and standard deviation of shipping time for the last k predictions on route j at shipping speed s.
Accordingly, if accuracy(j, k, s) < 0.99, the need for a correction is identified. The value of k is then adjusted upward or downward so that the value of σ(j, k, s) increases to a point where the condition accuracy(j, k, s) > 0.99 is satisfied again. The values μ(j, k, s) and σ(j, k, s) are updated in the appropriate databases for route j and shipping speed s.
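Continuing the same assumed reading, the correction step can be sketched as widening the prediction bound until the 0.99 condition holds again; note that this sketch turns a multiplier on σ rather than changing the look-back window k described above, and the step size and update rule are assumptions.

def recalibrate(realized, mu, sigma, factor=1.0, target=0.99, step=0.1, max_iter=100):
    """Raise the multiplier on sigma until the accuracy condition is met again."""
    def accuracy(f):
        bound = mu + f * sigma
        return sum(1 for r in realized if r <= bound) / len(realized)
    for _ in range(max_iter):
        if accuracy(factor) >= target:
            break
        factor += step
    return factor

realized = [23.0, 25.5, 24.0, 29.5, 22.5]
print(recalibrate(realized, mu=24.0, sigma=3.5))   # the multiplier grows until 29.5 falls inside the bound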
Figure 4 of the present invention is a flow chart of a continuous validation for a machine-learning model according to an embodiment of the present invention. The method comprises a step of monitoring the accuracy of the fastest-shipping-time predictions by comparing (S10) the realized shipping times to predicted shipping times for all predictions. In some embodiments, if the accuracy of the predictions (S11) is greater than 99%, the method comprises a step of, for each vertex, monitoring (S10) the accuracy of the last k predictions by comparing the realized shipping times to the predicted shipping times for all predictions involving that vertex. Further, if the accuracy of the predictions (S11) is greater than 99%, the method comprises a step of, for each route, monitoring the accuracy of the last k predictions by comparing the realized shipping time for each prediction with the predicted shipping time (S10). In some embodiments, the method comprises a step of, if the accuracy is not greater than 99%, adjusting machine learning (S12) input parameters by launching model recalibration.
In one embodiment, the present invention may be implemented as a cloud-based system that allows customers on e-Commerce websites to get a 99% accurate prediction of how long it will take for an item to be shipped from the origin to their ZIP code. The system is accessible through an Internet-based interface (an API) that can be called by making a request to a URL that includes three parameters - the source ZIP code, the destination ZIP code, and the shipping speed (ground, saver, priority). The system then returns the fastest shipping time for that tuple, based on the latest version of the machine-learning model.
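Purely as a hypothetical illustration of such an API call: the URL, parameter names, and response shape below are assumptions and do not reflect any actual endpoint.

import json
import urllib.parse

params = {"source_zip": "10001", "destination_zip": "94105", "speed": "priority"}
url = "https://api.example.com/fastest-shipping-time?" + urllib.parse.urlencode(params)
print(url)

# An illustrative response body:
print(json.dumps({"fastest_shipping_time_hours": 52.5, "speed": "priority"}))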
Advantages of the invention:
The system continually analyzes new shipping data as it comes in, from other orders placed on the same e-Commerce website, and updates its own machine learning models to reflect the new realities represented by this data. Thus, the system leverages continuous machine learning to drastically improve the reality reflected in the model; the invention allows this predicted shipping time to be accurate 99% of the time.
This invention significantly reduces the noise and uncertainty in shipping times predicted through existing methods. It uses machine learning to improve prediction accuracy and takes into account the ground realities and holiday schedules to ensure that 99% of its predictions are accurate. If the accuracy percentage starts slipping, the proposed method and system adjust themselves to bring the accuracy percentage back up.

Claims:
1. A system (1) for determining a fastest shipping time to deliver a package to a shipping location, comprising:
- a memory module (2) configured to store predefined databases;
- a handler (4) configured to input raw data and to update said predefined databases based on the raw data;
- a formatter (6) configured to format said updated predefined databases and to generate vertices of locations stored in the updated said predefined databases based on said formatted updated predefined databases;
- a shipping digraph optimizer (8) comprising a graph vertex determiner (8a), an edge weight optimizer (8b) and a machine learning model tuner (8c), wherein said graph vertex determiner (8a) is configured to receive said vertices from the formatter (6) and to generate shipping vertices with edge weights distributed in all directions for the input location, wherein said edge weight optimizer (8b) is configured to receive the vertices from the formatter (6) and to compute raw edge weights on the vertices based on historical shipping outcomes for said shipping vertices stored in said predefined databases, wherein said machine learning model tuner (8c) is configured to:
- receive said shipping vertices with edge weights distributed in all directions and generate raw shipping graphs based on the said shipping vertices,
- receive computed raw edge weights and generate raw shipping edge weights,
- apply machine learning inputs to said generated raw shipping graphs and to said generated raw shipping edge weights for tuning of said generated raw shipping graphs, wherein a shipping-interruption-database provides said machine learning inputs, and
- output said tuned raw shipping graphs; and
- a processing module (10) configured to process said tuned shipping graphs to output the fastest shipping time to deliver the package.
2. The system (1) for determining a fastest shipping time to deliver a package to a shipping location according to claim 1, wherein the machine learning model tuner (8c) comprises a short-path interface module (12) configured to:
- tune said generated shipping graphs;
- extract the latest said tuned generated shipping graphs; and
- select a right digraph corresponding to a selected shipping speed.
3. The system (1) for determining a fastest shipping time to deliver a package to a shipping location according to claim 1, wherein said processing module (10) comprises an input pre-processor (14), a shortest pathfinder (16) and a rationalizer (18).
4. The system (1) for determining a fastest shipping time to deliver a package to a shipping location according to claim 3, wherein the input pre-processor (14) is configured to:
- receive data packets through an Internet based Interface (API) for determining the fastest shipping time to deliver the package;
- unpack the data packets and to validate the variables of shipping information on the basis of information stored in said predefined database; and
- confirm that data points, related to source/destination/shipping speed, present in the variables satisfy the requirement of number of data points required to accurately determine the shipping time of the package.
5. The system (1) for determining a fastest shipping time to deliver a package to a shipping location according to claim 4, wherein the shortest pathfinder module (16) is configured to initiate determining shortest path to deliver the package using said right digraph and to output the shortest path.
6. The system (1) for determining a fastest shipping time to deliver a package to a shipping location according to claims 3 to 5, wherein the rationalizer (18) is configured to:
- fetch a shipping-interruption information from the shipping-interruption database;
- overlay the shipping-interruption information on the output of the shortest-path-finder module; and
- factor in a potential delay in the shipping time of the package in the shortest path to output the fastest time to deliver the package.
7. The system (1) for determining a fastest shipping time to deliver a package to a shipping location according to claim 6, wherein the shortest-path-finder module (16) refreshes said tuned shipping graphs every 15 minutes to consider any updates in the shipping data and to ensure that the tuned shipping-graphs are recent.
8. A method for determining a fastest shipping time to deliver a package to a shipping location, comprising:
- accepting (S1), by an input pre-processor, a call from a caller, wherein the call comprises variables passed through an API;
- unpacking (S2) said variables containing ZIP codes;
- validating (S3) ZIP codes using a ZIP code database;
- validating (S4) a presence of sufficient number of data points present in the variables;
- invoking (S5) a shortest path finder module;
- checking (S6) the timings of the stored fine-tuned graphs to extract recent tuned graphs and selecting a right digraph to find the fastest shipping time;
- tuning said digraph with machine learning inputs, selecting one said tuned graph for finding the fastest shipping time, and feeding the selected tuned graph for rationalizing;
- fetching (S8) shipping-interruption details from the database and overlaying them on the input from the shortest-path-finder module; and
- returning (S9) the fastest-shipping-time as an API response to the caller.
9. The method for determining a fastest shipping time to deliver a package to a shipping location according to claim 8, further comprising monitoring the accuracy of the fastest-shipping-time predictions by comparing (S10) the realized shipping times to predicted shipping times for all predictions.
10. The method for determining a fastest shipping time to deliver a package to a shipping location according to claim 9, further comprising, if the accuracy of the predictions (S11) is greater than 99%: for each vertex, monitor (S10) the accuracy of the last k predictions by comparing the realized shipping times to the predicted shipping times for all predictions involving that vertex; and for each route, monitor the accuracy of the last k predictions by comparing the realized shipping time for each prediction with the predicted shipping time (S10), and if the accuracy is not greater than 99%, adjust machine learning (S12) input parameters by launching model recalibration.
PCT/IN2022/050664 2022-07-15 2022-07-24 A system to estimate the fastest shipping time of a package and a method thereof WO2024013755A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN202221040558 2022-07-15
IN202221040558 2022-07-15

Publications (1)

Publication Number Publication Date
WO2024013755A1 true WO2024013755A1 (en) 2024-01-18

Family

ID=89536188

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IN2022/050664 WO2024013755A1 (en) 2022-07-15 2022-07-24 A system to estimate the fastest shipping time of a package and a method thereof

Country Status (1)

Country Link
WO (1) WO2024013755A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180204229A1 (en) * 2013-09-18 2018-07-19 Simpler Postage, Inc. Method and system for generating delivery estimates
KR102035864B1 (en) * 2018-09-07 2019-10-23 정완식 Method for providing multiple shortest-way finding service
US20210216968A1 (en) * 2019-11-21 2021-07-15 Rockspoon, Inc. Delivery driver routing and order preparation timing system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22950996

Country of ref document: EP

Kind code of ref document: A1