WO2021031222A1 - Systems and methods for predicting service demand information - Google Patents

Systems and methods for predicting service demand information

Info

Publication number
WO2021031222A1
Authority
WO
WIPO (PCT)
Prior art keywords
service demand
demand parameters
parameters
historical
trained model
Prior art date
Application number
PCT/CN2019/102588
Other languages
English (en)
Inventor
Haibo Li
Original Assignee
Beijing Didi Infinity Technology And Development Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Didi Infinity Technology And Development Co., Ltd.
Publication of WO2021031222A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/02 Reservations, e.g. for tickets, services or events
    • G06Q10/04 Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • G06Q10/06 Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063 Operations research, analysis or management
    • G06Q10/0631 Resource planning, allocation, distributing or scheduling for enterprises or organisations
    • G06Q10/06315 Needs-based resource requirements planning or analysis
    • G06Q30/00 Commerce
    • G06Q30/02 Marketing; Price estimation or determination; Fundraising
    • G06Q30/0201 Market modelling; Market analysis; Collecting market data
    • G06Q30/0202 Market predictions or forecasting for commercial activities
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10 Services
    • G06Q50/40 Business processes related to the transportation industry

Definitions

  • the present disclosure generally relates to systems and methods for on-demand service, and in particular, to systems and methods for predicting service demand information.
  • O2O is short for "Online to Offline."
  • a system providing the O2O services may match available service providers and allocate the service requests to the available service providers to direct them to provide services for the users.
  • service demand and service supply may be unmatched in the spatial dimension or in the time dimension. For example, in an office area, the service demand may be much higher than the service supply in evening peak hours, whereas the service demand may be relatively lower than the service supply in working hours.
  • if service demand information can be predicted in the spatial dimension and the time dimension, the system can perform suitable service supply scheduling in advance, thereby improving service efficiency and user experience. Therefore, it is desirable to provide systems and methods for predicting service demand information in the spatial dimension and the time dimension, which can provide a reference for service supply scheduling.
  • An aspect of the present disclosure relates to a system for predicting service demand information.
  • the system may include a storage medium storing a set of instructions and a processor communicatively coupled to the storage medium. When the processor executes the set of instructions, the processor may perform one or more of the following operations.
  • the processor may receive a plurality of service demand parameters, each corresponding to a respective one of a plurality of time periods, wherein the plurality of service demand parameters may be associated with a target area including a plurality of regions.
  • the processor may extract, using a first trained model, spatial features of each of the plurality of service demand parameters based on the plurality of regions.
  • the processor may determine historical features of each of the plurality of service demand parameters based on historical service demand parameters associated with each of the plurality of service demand parameters.
  • the processor may further determine, using a second trained model, a plurality of future service demand parameters based on the spatial features and the historical features, each of the plurality of future service demand parameters corresponding to a respective one of a plurality of future time periods.
  • each of the plurality of service demand parameters may include a plurality of sub-parameters corresponding to the plurality of regions respectively.
  • the processor may determine a predetermined number count of candidate regions surrounding a region specified by the sub-parameter for each of the plurality of sub-parameters.
  • the processor may determine, using the first trained model, spatial dependence information among sub-parameters specifying the candidate regions and the sub-parameter specifying the region for each of the plurality of sub-parameters.
  • the processor may further extract the spatial features of each of the plurality of service demand parameters based on the spatial dependence information.
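  • As a concrete illustration of the above, the sketch below gathers, for each region of a gridded target area, the demand of the surrounding candidate regions and summarizes their spatial dependence with uniform weights. This is a hand-rolled stand-in for a single convolutional filter, not the claimed first trained model; the function name, grid layout, and 3 x 3 window are assumptions for illustration.

```python
import numpy as np

def extract_spatial_features(demand_grid, k=1):
    """For each region in an H x W demand grid, gather the (2k+1)^2 - 1
    surrounding candidate regions and summarize the spatial dependence as a
    uniformly weighted neighborhood mean (a stand-in for one convolutional
    filter; a trained CNN would learn these weights)."""
    H, W = demand_grid.shape
    padded = np.pad(demand_grid, k, mode="constant")  # zero-pad the border
    features = np.zeros_like(demand_grid, dtype=float)
    for i in range(H):
        for j in range(W):
            # candidate regions: the (2k+1) x (2k+1) window centred on (i, j)
            window = padded[i:i + 2 * k + 1, j:j + 2 * k + 1]
            features[i, j] = window.mean()  # uniform weights for illustration
    return features

grid = np.array([[1., 2., 3.],
                 [4., 5., 6.],
                 [7., 8., 9.]])
print(extract_spatial_features(grid, k=1))
```

A trained CNN would replace the uniform weights with learned ones and stack several such filters.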
  • the historical features of the plurality of service demand parameters may include a maximum historical service demand parameter within a predetermined time period, a minimum historical service demand parameter within the predetermined time period, an average historical service demand parameter within the predetermined time period, and/or a deviation of a plurality of historical service demand parameters within the predetermined time period.
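  • The four historical features above are plain window statistics. A minimal sketch, assuming the history is a list of per-period demand counts for one region and taking "deviation" to mean standard deviation (the disclosure does not fix the exact measure):

```python
import numpy as np

def historical_features(history):
    """Summarize the historical service demand parameters of one region
    within a predetermined time period as the four statistics named in the
    disclosure: maximum, minimum, average, and deviation."""
    h = np.asarray(history, dtype=float)
    return {
        "max": float(h.max()),
        "min": float(h.min()),
        "avg": float(h.mean()),
        "dev": float(h.std()),  # standard deviation, an assumed measure
    }

# e.g. ride requests in one region over the same hour on seven past days
print(historical_features([12, 15, 9, 14, 13, 30, 11]))
```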
  • each of the plurality of future service demand parameters may include a plurality of future sub-parameters corresponding to the plurality of regions respectively.
  • the processor may determine a comprehensive parameter by combining the spatial features and the historical features for each of the plurality of service demand parameters.
  • the processor may determine a vector based on a plurality of comprehensive parameters corresponding to the plurality of service demand parameters by using the second trained model.
  • the processor may determine the plurality of future service demand parameters based on the vector by using the second trained model.
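  • A minimal sketch of these three operations, assuming concatenation as the way spatial and historical features are combined into a comprehensive parameter, and a hand-written LSTM cell standing in for the second trained model (all dimensions, weights, and names are illustrative assumptions, not from the disclosure):

```python
import numpy as np

rng = np.random.default_rng(0)

def lstm_step(x, h, c, W, U, b):
    """One step of a minimal LSTM cell. x: input, h/c: hidden and cell
    state, W/U/b: parameters stacked for the input, forget, and output
    gates and the candidate update."""
    z = W @ x + U @ h + b
    n = h.size
    i, f, o = (1 / (1 + np.exp(-z[k * n:(k + 1) * n])) for k in range(3))
    g = np.tanh(z[3 * n:])
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

# One comprehensive parameter per time slice, built by concatenating the
# spatial and historical features (the combination rule is an assumption);
# the final hidden state is the vector summarizing the whole sequence.
spatial = [rng.random(4) for _ in range(6)]     # 6 time slices, 4 spatial feats
historical = [rng.random(4) for _ in range(6)]  # 6 time slices, 4 hist. feats
d_in, d_hid = 8, 5
W = rng.normal(0, 0.1, (4 * d_hid, d_in))
U = rng.normal(0, 0.1, (4 * d_hid, d_hid))
b = np.zeros(4 * d_hid)
h, c = np.zeros(d_hid), np.zeros(d_hid)
for s, hist in zip(spatial, historical):
    comprehensive = np.concatenate([s, hist])   # one comprehensive parameter
    h, c = lstm_step(comprehensive, h, c, W, U, b)
print(h.shape)
```

In the claimed system, a readout over the final hidden state h would then produce the plurality of future service demand parameters.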
  • the first trained model and/or the second trained model may be trained based on a plurality of sample service demand parameters of the target area corresponding to a plurality of sample time periods respectively.
  • the first trained model may include a convolutional neural network model.
  • the second trained model may include a long short-term memory network model.
  • the method may be implemented on a computing device having at least one processor, at least one storage medium, and a communication platform connected to a network.
  • the method may include one or more of the following operations.
  • the method may include receiving a plurality of service demand parameters, each corresponding to a respective one of a plurality of time periods, wherein the plurality of service demand parameters may be associated with a target area comprising a plurality of regions.
  • the method may include extracting, using a first trained model, spatial features of each of the plurality of service demand parameters based on the plurality of regions.
  • the method may also include determining historical features of each of the plurality of service demand parameters based on historical service demand parameters associated with each of the plurality of service demand parameters.
  • the method may further include determining, using a second trained model, a plurality of future service demand parameters based on the spatial features and the historical features, each of the plurality of future service demand parameters corresponding to a respective one of a plurality of future time periods.
  • each of the plurality of service demand parameters may include a plurality of sub-parameters corresponding to the plurality of regions respectively.
  • the extracting, using the first trained model, the spatial features of each of the plurality of service demand parameters based on the plurality of regions may include determining a predetermined number count of candidate regions surrounding a region specified by the sub-parameter for each of the plurality of sub-parameters; determining, using the first trained model, spatial dependence information among sub-parameters specifying the candidate regions and the sub-parameter specifying the region for each of the plurality of sub-parameters; and extracting the spatial features of each of the plurality of service demand parameters based on the spatial dependence information.
  • the historical features of the plurality of service demand parameters may include a maximum historical service demand parameter within a predetermined time period, a minimum historical service demand parameter within the predetermined time period, an average historical service demand parameter within the predetermined time period, and/or a deviation of a plurality of historical service demand parameters within the predetermined time period.
  • each of the plurality of future service demand parameters may include a plurality of future sub-parameters corresponding to the plurality of regions respectively.
  • the determining, using the second trained model, the plurality of future service demand parameters based on the spatial features and the historical features may include determining a comprehensive parameter by combining the spatial features and the historical features for each of the plurality of service demand parameters; determining a vector based on a plurality of comprehensive parameters corresponding to the plurality of service demand parameters by using the second trained model; and determining the plurality of future service demand parameters based on the vector by using the second trained model.
  • the first trained model and/or the second trained model may be trained based on a plurality of sample service demand parameters of the target area corresponding to a plurality of sample time periods respectively.
  • the first trained model may include a convolutional neural network model.
  • the second trained model may include a long short-term memory network model.
  • a further aspect of the present disclosure relates to a system for predicting service demand information.
  • the system may include a receiving module, an extraction module, a determination module, and a prediction module.
  • the receiving module may be configured to receive a plurality of service demand parameters, each corresponding to a respective one of a plurality of time periods, wherein the plurality of service demand parameters may be associated with a target area comprising a plurality of regions.
  • the extraction module may be configured to extract, using a first trained model, spatial features of each of the plurality of service demand parameters based on the plurality of regions.
  • the determination module may be configured to determine historical features of each of the plurality of service demand parameters based on historical service demand parameters associated with each of the plurality of service demand parameters.
  • the prediction module may be configured to determine, using a second trained model, a plurality of future service demand parameters based on the spatial features and the historical features, each of the plurality of future service demand parameters corresponding to a respective one of a plurality of future time periods.
  • each of the plurality of service demand parameters may include a plurality of sub-parameters corresponding to the plurality of regions respectively.
  • the extraction module may determine a predetermined number count of candidate regions surrounding a region specified by the sub-parameter for each of the plurality of sub-parameters.
  • the extraction module may determine, using the first trained model, spatial dependence information among sub-parameters specifying the candidate regions and the sub-parameter specifying the region for each of the plurality of sub-parameters.
  • the extraction module may further extract the spatial features of each of the plurality of service demand parameters based on the spatial dependence information.
  • the historical features of the plurality of service demand parameters may include a maximum historical service demand parameter within a predetermined time period, a minimum historical service demand parameter within the predetermined time period, an average historical service demand parameter within the predetermined time period, and/or a deviation of a plurality of historical service demand parameters within the predetermined time period.
  • each of the plurality of future service demand parameters may include a plurality of future sub-parameters corresponding to the plurality of regions respectively.
  • the prediction module may determine a comprehensive parameter by combining the spatial features and the historical features for each of the plurality of service demand parameters.
  • the prediction module may determine a vector based on a plurality of comprehensive parameters corresponding to the plurality of service demand parameters by using the second trained model.
  • the prediction module may determine the plurality of future service demand parameters based on the vector by using the second trained model.
  • the first trained model and/or the second trained model may be trained based on a plurality of sample service demand parameters of the target area corresponding to a plurality of sample time periods respectively.
  • the first trained model may include a convolutional neural network model.
  • the second trained model may include a long short-term memory network model.
  • a still further aspect of the present disclosure relates to a non-transitory computer readable medium.
  • the non-transitory computer readable medium may include a set of instructions for predicting service demand information.
  • when the set of instructions is executed by at least one processor, the processor may be directed to perform one or more of the following operations.
  • the processor may receive a plurality of service demand parameters, each corresponding to a respective one of a plurality of time periods, wherein the plurality of service demand parameters may be associated with a target area including a plurality of regions.
  • the processor may extract, using a first trained model, spatial features of each of the plurality of service demand parameters based on the plurality of regions.
  • the processor may determine historical features of each of the plurality of service demand parameters based on historical service demand parameters associated with each of the plurality of service demand parameters.
  • the processor may further determine, using a second trained model, a plurality of future service demand parameters based on the spatial features and the historical features, each of the plurality of future service demand parameters corresponding to a respective one of a plurality of future time periods.
  • FIG. 1 is a schematic diagram illustrating an exemplary on-demand service system according to some embodiments of the present disclosure;
  • FIG. 2 is a schematic diagram illustrating exemplary hardware and/or software components of an exemplary computing device according to some embodiments of the present disclosure;
  • FIG. 3 is a schematic diagram illustrating exemplary hardware and/or software components of an exemplary mobile device according to some embodiments of the present disclosure;
  • FIG. 4 is a block diagram illustrating an exemplary processing engine according to some embodiments of the present disclosure;
  • FIG. 5 is a flowchart illustrating an exemplary process for determining a plurality of future service demand parameters according to some embodiments of the present disclosure;
  • FIG. 6 is a schematic diagram illustrating an exemplary target area including a plurality of regions according to some embodiments of the present disclosure;
  • FIG. 7 is a flowchart illustrating an exemplary process for extracting spatial features of a service demand parameter according to some embodiments of the present disclosure;
  • FIG. 8 is a schematic diagram illustrating exemplary candidate regions surrounding a specific region according to some embodiments of the present disclosure;
  • FIG. 9 is a flowchart illustrating an exemplary process for determining a plurality of future service demand parameters according to some embodiments of the present disclosure; and
  • FIG. 10 is a schematic diagram illustrating an exemplary process for determining a plurality of future service demand parameters according to some embodiments of the present disclosure.
  • the flowcharts used in the present disclosure illustrate operations that systems implement according to some embodiments of the present disclosure. It is to be expressly understood that the operations of the flowcharts need not be implemented in the order shown; the operations may be implemented in inverted order or simultaneously. Moreover, one or more other operations may be added to the flowcharts, and one or more operations may be removed from the flowcharts.
  • although the systems and methods disclosed in the present disclosure are described primarily regarding a transportation system, it should be understood that this is only one exemplary embodiment.
  • the systems and methods of the present disclosure may be applied to any other kind of on-demand service.
  • the systems and methods of the present disclosure may be applied to transportation systems of different environments including ocean, aerospace, or the like, or any combination thereof.
  • the vehicle of the transportation systems may include a taxi, a private car, a hitch, a bus, a train, a bullet train, a high-speed rail, a subway, a vessel, an aircraft, a spaceship, a hot-air balloon, a driverless vehicle, or the like, or any combination thereof.
  • the transportation system may also include any transportation system for management and/or distribution, for example, a system for sending and/or receiving an express.
  • the application of the system or method of the present disclosure may include a web page, a plug-in of a browser, a client terminal, a custom system, an internal analysis system, an artificial intelligence robot, or the like, or any combination thereof.
  • The terms "passenger," "requester," "requestor," "service requester," "service requestor," and "customer" in the present disclosure are used interchangeably to refer to an individual, an entity, or a tool that may request or order a service.
  • The terms "driver," "provider," "service provider," and "supplier" in the present disclosure are used interchangeably to refer to an individual, an entity, or a tool that may provide a service or facilitate the providing of the service.
  • The term "user" in the present disclosure may refer to an individual, an entity, or a tool that may request a service, order a service, provide a service, or facilitate the providing of the service.
  • the user may be a passenger, a driver, an operator, or the like, or any combination thereof.
  • The terms "passenger" and "passenger terminal" may be used interchangeably, and the terms "driver" and "driver terminal" may be used interchangeably.
  • The term "service request" in the present disclosure refers to a request that may be initiated by a passenger, a requester, a service requester, a customer, a driver, a provider, a service provider, a supplier, or the like, or any combination thereof.
  • the service request may be accepted by any one of a passenger, a requester, a service requester, a customer, a driver, a provider, a service provider, or a supplier.
  • the service request may be chargeable or free.
  • the positioning technology used in the present disclosure may be based on a global positioning system (GPS), a global navigation satellite system (GLONASS), a compass navigation system (COMPASS), a Galileo positioning system, a quasi-zenith satellite system (QZSS), a wireless fidelity (WiFi) positioning technology, or the like, or any combination thereof.
  • An aspect of the present disclosure relates to systems and methods for predicting service demand information.
  • the systems may receive a plurality of service demand parameters (e.g., a number count of service requests) associated with a target area (e.g., a city) including a plurality of regions.
  • Each of the plurality of service demand parameters may correspond to a respective one of a plurality of time slices.
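  • One way such a per-time-slice demand parameter could be assembled from raw request records; the record layout (minute of day, region id) and the 30-minute slice length are illustrative assumptions, not from the disclosure:

```python
from collections import Counter

def demand_parameters(requests, n_regions, slice_minutes=30):
    """Aggregate raw service requests into one demand parameter per time
    slice: a vector whose sub-parameters are the request counts of each
    region. requests: iterable of (minute_of_day, region_id) pairs."""
    slices = {}
    for minute, region in requests:
        counts = slices.setdefault(minute // slice_minutes, Counter())
        counts[region] += 1
    # one count vector (sub-parameters) per time slice, regions in order
    return {s: [counts.get(r, 0) for r in range(n_regions)]
            for s, counts in sorted(slices.items())}

reqs = [(5, 0), (12, 0), (17, 2), (31, 1), (44, 1), (44, 2)]
print(demand_parameters(reqs, n_regions=3))  # {0: [2, 0, 1], 1: [0, 2, 1]}
```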
  • the systems may extract spatial features of each of the plurality of service demand parameters based on the plurality of regions by using a first trained model (e.g., a Convolutional Neural Network (CNN) model).
  • the systems may also determine historical features of each of the plurality of service demand parameters based on historical service demand parameters associated with each of the plurality of service demand parameters.
  • the systems may determine a plurality of future service demand parameters based on the spatial features and the historical features by using a second trained model (e.g., a Long Short-Term Memory (LSTM) network model).
  • Each of the plurality of future service demand parameters may correspond to a respective one of a plurality of future time slices.
  • in this way, a plurality of future service demand parameters corresponding to a plurality of future time slices can be predicted.
  • a plurality of service demand parameters corresponding to a plurality of time slices are used, and spatial features and historical features of each of the plurality of service demand parameters are both taken into consideration, which can improve the accuracy of the prediction.
  • an online on-demand transportation service, such as online taxi hailing (including taxi-hailing combination services), is a new form of service rooted only in the post-Internet era. It provides technical solutions to users and service providers that could arise only in the post-Internet era.
  • in the pre-Internet era, when a user hails a taxi on the street, the taxi request and acceptance occur only between the passenger and one taxi driver that sees the passenger. If the passenger hails a taxi through a telephone call, the service request and acceptance may occur only between the passenger and one service provider (e.g., one taxi company or agent).
  • Online taxi hailing, however, allows a user of the service to distribute a service request, in real time and automatically, to a vast number of individual service providers (e.g., taxis) a distance away from the user.
  • the online on-demand transportation systems may provide a much more efficient transaction platform for the users and the service providers that may never meet in a traditional pre-Internet transportation service system.
  • FIG. 1 is a schematic diagram illustrating an exemplary on-demand service system according to some embodiments of the present disclosure.
  • the on-demand service system 100 may include a server 110, a network 120, a requester terminal 130, a provider terminal 140, and a storage 150.
  • the server 110 may be a single server or a server group.
  • the server group may be centralized or distributed (e.g., the server 110 may be a distributed system).
  • the server 110 may be local or remote.
  • the server 110 may access information and/or data stored in the requester terminal 130, the provider terminal 140, and/or the storage 150 via the network 120.
  • the server 110 may be directly connected to the requester terminal 130, the provider terminal 140, and/or the storage 150 to access stored information and/or data.
  • the server 110 may be implemented on a cloud platform.
  • the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof.
  • the server 110 may be implemented on a computing device 200 including one or more components illustrated in FIG. 2 in the present disclosure.
  • the server 110 may include a processing engine 112.
  • the processing engine 112 may process information and/or data associated with service demand information to perform one or more functions described in the present disclosure. For example, the processing engine 112 may determine a plurality of future service demand parameters corresponding to a plurality of future time slices respectively based on spatial features and/or historical features of each of a plurality of service demand parameters corresponding to a plurality of time slices respectively.
  • the processing engine 112 may include one or more processing engines (e.g., single-core processing engine(s) or multi-core processor(s)).
  • the processing engine 112 may include a central processing unit (CPU), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a graphics processing unit (GPU), a physics processing unit (PPU), a digital signal processor (DSP), a field programmable gate array (FPGA), a programmable logic device (PLD), a controller, a microcontroller unit, a reduced instruction-set computer (RISC), a microprocessor, or the like, or any combination thereof.
  • the network 120 may facilitate exchange of information and/or data.
  • one or more components (e.g., the server 110, the requester terminal 130, the provider terminal 140, or the storage 150) of the on-demand service system 100 may transmit information and/or data to other component(s) of the on-demand service system 100 via the network 120.
  • the server 110 may obtain service demand parameters associated with a target area from the storage 150 via the network 120.
  • the network 120 may be any type of wired or wireless network, or combination thereof.
  • the network 120 may include a cable network, a wireline network, an optical fiber network, a telecommunications network, an intranet, the Internet, a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN), a metropolitan area network (MAN), a public telephone switched network (PSTN), a Bluetooth network, a ZigBee network, a near field communication (NFC) network, or the like, or any combination thereof.
  • the network 120 may include one or more network access points.
  • the network 120 may include wired or wireless network access points, through which one or more components of the on-demand service system 100 may be connected to the network 120 to exchange data and/or information.
  • a requester may be a user of the requester terminal 130.
  • the user of the requester terminal 130 may be someone other than the requester.
  • a user A of the requester terminal 130 may use the requester terminal 130 to send a service request for a user B, or receive service and/or information or instructions from the server 110.
  • a provider may be a user of the provider terminal 140.
  • the user of the provider terminal 140 may be someone other than the provider.
  • a user C of the provider terminal 140 may use the provider terminal 140 to receive a service request for a user D, and/or information or instructions from the server 110.
  • “requester” and “requester terminal” may be used interchangeably, and “provider” and “provider terminal” may be used interchangeably.
  • the requester terminal 130 may include a mobile device 130-1, a tablet computer 130-2, a laptop computer 130-3, a built-in device in a motor vehicle 130-4, or the like, or any combination thereof.
  • the mobile device 130-1 may include a smart home device, a wearable device, a mobile device, a virtual reality device, an augmented reality device, or the like, or any combination thereof.
  • the smart home device may include a smart lighting device, a control device of an intelligent electrical apparatus, a smart monitoring device, a smart television, a smart video camera, an interphone, or the like, or any combination thereof.
  • the wearable device may include a smart bracelet, a smart footgear, a smart glass, a smart helmet, a smart watch, a smart clothing, a smart backpack, a smart accessory, or the like, or any combination thereof.
  • the mobile device may include a mobile phone, a personal digital assistant (PDA) , a gaming device, a navigation device, a point of sale (POS) device, a laptop, a desktop, or the like, or any combination thereof.
  • the virtual reality device and/or the augmented reality device may include a virtual reality helmet, a virtual reality glass, a virtual reality patch, an augmented reality helmet, augmented reality glasses, an augmented reality patch, or the like, or any combination thereof.
  • the virtual reality device and/or the augmented reality device may include a Google Glass TM , an Oculus Rift TM , a Hololens TM , a Gear VR TM , etc.
  • a built-in device in the motor vehicle 130-4 may include an onboard computer, an onboard television, etc.
  • the requester terminal 130 may be a device with positioning technology for locating the position of the requester and/or the requester terminal 130.
  • the provider terminal 140 may be similar to, or the same device as the requester terminal 130. In some embodiments, the provider terminal 140 may be a device with positioning technology for locating the position of the provider and/or the provider terminal 140. In some embodiments, the requester terminal 130 and/or the provider terminal 140 may communicate with another positioning device to determine the position of the requester, the requester terminal 130, the provider, and/or the provider terminal 140. In some embodiments, the requester terminal 130 and/or the provider terminal 140 may send positioning information to the server 110.
  • the storage 150 may store data and/or instructions. In some embodiments, the storage 150 may store data obtained from the requester terminal 130 and/or the provider terminal 140. In some embodiments, the storage 150 may store data and/or instructions that the server 110 may execute or use to perform exemplary methods described in the present disclosure. In some embodiments, storage 150 may include a mass storage, a removable storage, a volatile read-and-write memory, a read-only memory (ROM) , or the like, or any combination thereof. Exemplary mass storage may include a magnetic disk, an optical disk, a solid-state drive, etc. Exemplary removable storage may include a flash drive, a floppy disk, an optical disk, a memory card, a zip disk, a magnetic tape, etc.
  • Exemplary volatile read-and-write memory may include a random access memory (RAM) .
  • RAM may include a dynamic RAM (DRAM) , a double data rate synchronous dynamic RAM (DDR SDRAM) , a static RAM (SRAM) , a thyristor RAM (T-RAM) , and a zero-capacitor RAM (Z-RAM) , etc.
  • Exemplary ROM may include a mask ROM (MROM) , a programmable ROM (PROM) , an erasable programmable ROM (EPROM) , an electrically-erasable programmable ROM (EEPROM) , a compact disk ROM (CD-ROM) , and a digital versatile disk ROM, etc.
  • the storage 150 may be implemented on a cloud platform.
  • the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof.
  • the storage 150 may be connected to the network 120 to communicate with one or more components (e.g., the server 110, the requester terminal 130, the provider terminal 140) of the on-demand service system 100.
  • One or more components of the on-demand service system 100 may access the data or instructions stored in the storage 150 via the network 120.
  • the storage 150 may be directly connected to or communicate with one or more components (e.g., the server 110, the requester terminal 130, the provider terminal 140) of the on-demand service system 100.
  • the storage 150 may be part of the server 110.
  • one or more components of the on-demand service system 100 may access the storage 150.
  • one or more components of the on-demand service system 100 may read and/or modify information relating to the requester, the provider, and/or the public when one or more conditions are met.
  • the server 110 may read and/or modify one or more users’ information after a service is completed.
  • the provider terminal 140 may access information relating to the requester when receiving a service request from the requester terminal 130, but the provider terminal 140 cannot modify the relevant information of the requester.
  • information exchanging of one or more components of the on-demand service system 100 may be achieved by way of requesting a service.
  • the object of the service request may be any product.
  • the product may be a tangible product or immaterial product.
  • the tangible product may include food, medicine, commodity, chemical product, electrical appliance, clothing, car, housing, luxury, or the like, or any combination thereof.
  • the immaterial product may include a servicing product, a financial product, a knowledge product, an internet product, or the like, or any combination thereof.
  • the internet product may include an individual host product, a web product, a mobile internet product, a commercial host product, an embedded product, or the like, or any combination thereof.
  • the mobile internet product may be used in a software of a mobile terminal, a program, a system, or the like, or any combination thereof.
  • the mobile terminal may include a tablet computer, a laptop computer, a mobile phone, a personal digital assistant (PDA) , a smart watch, a point of sale (POS) device, an onboard computer, an onboard television, a wearable device, or the like, or any combination thereof.
  • the product may be any software and/or application used on the computer or mobile phone.
  • the software and/or application may relate to socializing, shopping, transporting, entertainment, learning, investment, or the like, or any combination thereof.
  • the software and/or application relating to transporting may include a traveling software and/or application, a vehicle scheduling software and/or application, a mapping software and/or application, etc.
  • the vehicle may include a horse, a carriage, a rickshaw (e.g., a wheelbarrow, a bike, a tricycle) , a car (e.g., a taxi, a bus, a private car) , a train, a subway, a vessel, an aircraft (e.g., an airplane, a helicopter, a space shuttle, a rocket, a hot-air balloon) , or the like, or any combination thereof.
  • the on-demand service system 100 may be a navigation system.
  • the navigation system may include a user terminal (e.g., the requester terminal 130, the provider terminal 140) and a server (e.g., the server 110) .
  • the navigation system may obtain GPS information associated with the user terminal while providing navigation service for the user terminal. Further, the navigation system may predict future traffic information based on historical traffic information (which may be determined based on historically obtained GPS information) according to the methods disclosed in the present disclosure.
  • FIG. 2 is a schematic diagram illustrating exemplary hardware and/or software components of an exemplary computing device according to some embodiments of the present disclosure.
  • the server 110 may be implemented on the computing device 200.
  • the processing engine 112 may be implemented on the computing device 200 and configured to perform functions of the processing engine 112 disclosed in this disclosure.
  • the computing device 200 may be used to implement any component of the on-demand service system 100 of the present disclosure.
  • the processing engine 112 of the on-demand service system 100 may be implemented on the computing device 200, via its hardware, software program, firmware, or a combination thereof.
  • the computer functions related to the on-demand service system 100 as described herein may be implemented in a distributed manner on a number of similar platforms to distribute the processing load.
  • the computing device 200 may include communication (COM) ports 250 connected to and from a network (e.g., the network 120) connected thereto to facilitate data communications.
  • the computing device 200 may also include a processor (e.g., a processor 220) , in the form of one or more processors (e.g., logic circuits) , for executing program instructions.
  • the processor may include interface circuits and processing circuits therein.
  • the interface circuits may be configured to receive electronic signals from a bus 210, wherein the electronic signals encode structured data and/or instructions for the processing circuits to process.
  • the processing circuits may conduct logic calculations, and then determine a conclusion, a result, and/or an instruction encoded as electronic signals. Then the interface circuits may send out the electronic signals from the processing circuits via the bus 210.
  • the computing device 200 may further include program storage and data storage of different forms, for example, a disk 270, and a read only memory (ROM) 230, or a random access memory (RAM) 240, for storing various data files to be processed and/or transmitted by the computing device 200.
  • the computing device 200 may also include program instructions stored in the ROM 230, the RAM 240, and/or other type of non-transitory storage medium to be executed by the processor 220.
  • the methods and/or processes of the present disclosure may be implemented as the program instructions.
  • the computing device 200 also includes an I/O component 260, supporting input/output between the computing device 200 and other components therein.
  • the computing device 200 may also receive programming and data via network communications.
  • step A and step B may also be performed by two different CPUs and/or processors jointly or separately in the computing device 200 (e.g., the first processor executes step A and the second processor executes step B, or the first and second processors jointly execute steps A and B) .
  • FIG. 3 is a schematic diagram illustrating exemplary hardware and/or software components of an exemplary mobile device according to some embodiments of the present disclosure.
  • the requester terminal 130 and/or the provider terminal 140 may be implemented on the mobile device 300.
  • the mobile device 300 may include a communication platform 310, a display 320, a graphic processing unit (GPU) 330, a central processing unit (CPU) 340, an I/O 350, a memory 360, and a storage 390.
  • any other suitable component including but not limited to a system bus or a controller (not shown) , may also be included in the mobile device 300.
  • a mobile operating system 370 (e.g., iOS TM , Android TM , Windows Phone TM ) and one or more applications 380 may be loaded into the memory 360 from the storage 390 in order to be executed by the CPU 340.
  • the applications 380 may include a browser or any other suitable mobile apps for receiving and rendering information relating to the on-demand service system 100.
  • User interactions with the information stream may be achieved via the I/O 350 and provided to one or more components of the on-demand service system 100 via the network 120.
  • computer hardware platforms may be used as the hardware platform (s) for one or more of the elements described herein.
  • a computer with user interface elements may be used to implement a personal computer (PC) or any other type of work station or terminal device.
  • a computer may also act as a server if appropriately programmed.
  • FIG. 4 is a block diagram illustrating an exemplary processing engine according to some embodiments of the present disclosure.
  • the processing engine 112 may include a receiving module 410, an extraction module 420, a determination module 430, and a prediction module 440.
  • the receiving module 410 may be configured to receive a plurality of service demand parameters associated with a target area (e.g., a city, a district, a block) , wherein each of the plurality of service demand parameters corresponds to a respective one of a plurality of time slices (e.g., 1 minute, 2 minutes, 5 minutes, 10 minutes) .
  • the target area may include a plurality of regions.
  • a shape of each of the plurality of regions may be any shape (e.g., rectangle, triangle, circle, irregular shape) .
  • each of the plurality of service demand parameters may include a plurality of sub-parameters corresponding to the plurality of regions respectively.
  • the extraction module 420 may be configured to extract spatial features of each of the plurality of service demand parameters based on the plurality of regions by using a first trained model.
  • the extraction module 420 may extract the spatial features of the specific service demand parameter based on spatial dependence information among the plurality of sub-parameters included in the specific service demand parameter. More descriptions of the spatial features of the plurality of service demand parameters may be found elsewhere in the present disclosure (e.g., FIG. 7, FIG. 8, and the relevant descriptions thereof) .
  • the determination module 430 may be configured to determine historical features of each of the plurality of service demand parameters based on historical service demand parameters associated with each of the plurality of service demand parameters. Taking a specific service demand parameter corresponding to a specific time slice as an example, the historical features of the specific service demand parameter may include a maximum historical service demand parameter within a predetermined time period (e.g., last month, last three months) , a minimum historical service demand parameter within the predetermined time period, an average historical service demand parameter within the predetermined time period, a deviation of a plurality of historical service demand parameters within the predetermined time period, or the like, or a combination thereof.
  • the prediction module 440 may be configured to determine a plurality of future service demand parameters based on the spatial features and the historical features by using a second trained model, wherein each of the plurality of future service demand parameters corresponds to a respective one of a plurality of future time slices. More descriptions of the future service demand parameters may be found elsewhere in the present disclosure (e.g., FIG. 9, FIG. 10, and the relevant descriptions thereof) .
  • the modules in the processing engine 112 may be connected to or communicate with each other via a wired connection or a wireless connection.
  • the wired connection may include a metal cable, an optical cable, a hybrid cable, or the like, or any combination thereof.
  • the wireless connection may include a Local Area Network (LAN) , a Wide Area Network (WAN) , a Bluetooth, a ZigBee, a Near Field Communication (NFC) , or the like, or any combination thereof.
  • Two or more of the modules may be combined as a single module, and any one of the modules may be divided into two or more units.
  • the extraction module 420 and the determination module 430 may be combined as a single module which may both determine spatial features and historical features of each of the plurality of service demand parameters.
  • the processing engine 112 may include a storage module (not shown in FIG. 4) which may be configured to store the plurality of service demand parameters, the spatial features of each of the plurality of service demand parameters, the historical features of each of the plurality of service demand parameters, the plurality of future service demand parameters, etc.
  • FIG. 5 is a flowchart illustrating an exemplary process for predicting service demand information according to some embodiments of the present disclosure.
  • the process 500 may be executed by the on-demand service system 100.
  • the process 500 may be implemented as a set of instructions (e.g., an application) stored in the storage ROM 230 or RAM 240.
  • the processor 220 and/or the modules in FIG. 4 may execute the set of instructions, and when executing the instructions, the processor 220 and/or the modules may be configured to perform the process 500.
  • the operations of the illustrated process/method presented below are intended to be illustrative. In some embodiments, the process 500 may be accomplished with one or more additional operations not described and/or without one or more of the operations discussed. Additionally, the order of the operations of the process 500 as illustrated in FIG. 5 and described below is not intended to be limiting.
  • the processing engine 112 (e.g., the receiving module 410) (e.g., the interface circuits of the processor 220) may receive a plurality of service demand parameters associated with a target area (e.g., a city, a district, a block) , wherein each of the plurality of service demand parameters corresponds to a respective one of a plurality of time slices (e.g., 1 minute, 2 minutes, 5 minutes, 10 minutes) .
  • the processing engine 112 may determine the plurality of time slices by segmenting a first predetermined time period into a plurality of parts. For example, it is assumed that a current time point is 10:00, the processing engine 112 may define a time period "8:00~10:00" and segment the time period into 24 parts. In this situation, the plurality of time slices may be expressed as a first time set below:
  • T 1 = {"8:00~8:05," "8:05~8:10," ..., "9:55~10:00"} (1)
  • the plurality of service demand parameters corresponding to the plurality of time slices may be expressed as below:
  • D = {D 1 , D 2 , ..., D t , ..., D n } (2)
  • D refers to a set including the plurality of service demand parameters
  • t refers to a tth time slice
  • D t refers to a tth service demand parameter corresponding to the tth time slice
  • n refers to a number count of the plurality of time slices (i.e., a number count of the plurality of service demand parameters) , which may be default settings of the on-demand service system 100 or may be adjustable under different situations.
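The segmentation of the first predetermined time period into equal time slices can be sketched as follows. This is a minimal Python illustration; the function name, the example date, and the leading-zero time format are assumptions made for the sketch, not part of the disclosure:

```python
from datetime import datetime, timedelta

def make_time_slices(start, end, n_parts):
    """Segment the period [start, end] into n_parts equal time slices,
    formatted like the first time set T1 of the disclosure."""
    step = (end - start) / n_parts
    slices = []
    for k in range(n_parts):
        lo = start + k * step
        hi = lo + step
        slices.append(f"{lo:%H:%M}~{hi:%H:%M}")
    return slices

# Segment "8:00~10:00" into 24 five-minute slices (an arbitrary date).
day = datetime(2019, 8, 23)
t1 = make_time_slices(day.replace(hour=8), day.replace(hour=10), 24)
```

With n = 24 the list runs from "08:00~08:05" to "09:55~10:00", matching the first time set T 1.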
  • a corresponding service demand parameter may refer to a number count of service requests (e.g., a transportation service request initiated by a requester via the requester terminal 130) in the target area within the specific time slice.
  • the target area may include a plurality of regions.
  • a shape of each of the plurality of regions may be any shape (e.g., rectangle, triangle, circle, irregular shape) .
  • each of the plurality of service demand parameters may include a plurality of sub-parameters corresponding to the plurality of regions respectively.
  • the processing engine 112 may segment the target area into a plurality of squares, wherein a side length of each of the plurality of squares is 200 meters. Accordingly, take a tth service demand parameter as an example, the tth service demand parameter may be expressed as below:
  • D t = {D t [1] [1] , D t [1] [2] , ..., D t [i] [j] , ..., D t [H] [M] } (3)
  • D t refers to the tth service demand parameter which may be considered as a set including a plurality of sub-parameters
  • [i] [j] refers to a region (i.e., a square) in an ith row and a jth column of the target area
  • H refers to a number count of rows
  • M refers to a number count of columns.
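The grid structure of a service demand parameter D t can be sketched as follows; the coordinate convention (meters from the south-west corner) and the helper name are assumptions made for illustration:

```python
import numpy as np

def demand_grid(requests, H, M, side=200.0):
    """Count service requests per region for one time slice.
    `requests` holds (x, y) positions in meters relative to one corner
    of the target area; each region is a square of `side` meters,
    giving an H-row by M-column grid of sub-parameters D_t[i][j]."""
    D_t = np.zeros((H, M), dtype=int)
    for x, y in requests:
        i = min(int(y // side), H - 1)  # row index of the square
        j = min(int(x // side), M - 1)  # column index of the square
        D_t[i, j] += 1
    return D_t

# Three requests, two of them in the same 200 m x 200 m square.
grid = demand_grid([(50, 50), (120, 30), (450, 250)], H=4, M=4)
```

Each cell of `grid` is one sub-parameter, i.e. the number count of service requests in that region within the time slice.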
  • the processing engine 112 (e.g., the extraction module 420) (e.g., the processing circuits of the processor 220) may extract spatial features of each of the plurality of service demand parameters based on the plurality of regions by using a first trained model.
  • the first trained model may include a convolutional neural network (CNN) model used in image processing, which may be trained based on a plurality of sample service demand parameters of the target area corresponding to a plurality of sample time periods.
  • the processing engine 112 may extract the spatial features of the specific service demand parameter based on spatial dependence information among the plurality of sub-parameters included in the specific service demand parameter. More descriptions of the spatial features of the plurality of service demand parameters may be found elsewhere in the present disclosure (e.g., FIG. 7, FIG. 8, and the relevant descriptions thereof) .
  • the processing engine 112 (e.g., the determination module 430) (e.g., the processing circuits of the processor 220) may determine historical features of each of the plurality of service demand parameters based on historical service demand parameters associated with each of the plurality of service demand parameters.
  • the historical features of the specific service demand parameter may include a maximum historical service demand parameter within a predetermined time period (e.g., last month, last three months) , a minimum historical service demand parameter within the predetermined time period, an average historical service demand parameter within the predetermined time period, a deviation of a plurality of historical service demand parameters within the predetermined time period, or the like, or a combination thereof.
  • the predetermined time period refers to a time set including a plurality of historical time slices, each of which corresponds to the specific time slice corresponding to the specific service demand parameter. For example, it is assumed that the specific time slice is 9:00~9:05 a.m. at a weekend, the plurality of historical time slices may be a plurality of corresponding time slices (i.e., 9:00~9:05 a.m.) at weekends within the predetermined time period (e.g., last month, last three months) .
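The historical features listed above can be computed with a short sketch; interpreting "deviation" as the standard deviation is an assumption, since the disclosure does not fix the exact statistic:

```python
import numpy as np

def historical_features(hist):
    """Summarize the historical service demand parameters for one
    region and one time slice: maximum, minimum, average, and
    deviation (standard deviation assumed here)."""
    hist = np.asarray(hist, dtype=float)
    return {
        "max": hist.max(),
        "min": hist.min(),
        "avg": hist.mean(),
        "deviation": hist.std(),
    }

# Demand counts at 9:00~9:05 a.m. on the corresponding historical days.
feats = historical_features([12, 18, 15, 15])
```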
  • the processing engine 112 (e.g., the prediction module 440) (e.g., the processing circuits of the processor 220) may determine a plurality of future service demand parameters based on the spatial features and the historical features by using a second trained model, wherein each of the plurality of future service demand parameters corresponds to a respective one of a plurality of future time slices.
  • the processing engine 112 may determine the plurality of future time slices by segmenting a second predetermined time period (which is connected to the first predetermined time period via the current time point) into a plurality of parts. For example, as described in connection with operation 510, it is also assumed that the current time point is 10:00, the processing engine 112 may define a time period "10:00~10:30" (which is connected to "8:00~10:00" ) and segment the time period into 6 parts. In this situation, the plurality of future time slices may be expressed as a second time set below:
  • T 2 = {"10:00~10:05," "10:05~10:10," ..., "10:25~10:30"} (4)
  • the plurality of future service demand parameters may be expressed as below:
  • D′ = {D′ n+1 , D′ n+2 , ..., D′ n+m } (5)
  • D′ refers to a set including the plurality of future service demand parameters
  • D′ t refers to a (t-n) th future service demand parameter (which corresponds to a tth time slice)
  • m refers to a number count of the plurality of future service demand parameters.
  • a corresponding future service demand parameter refers to predicted service demand information (e.g., a predicted number count of service requests) in the target area within the specific future time slice.
  • each of the plurality of future service demand parameters also includes a plurality of future sub-parameters corresponding to the plurality of regions included in the target area.
  • the second trained model may be a Long Short-Term Memory (LSTM) network model used in machine learning associated with time series data, which may be trained based on a plurality of sample service demand parameters of the target area corresponding to a plurality of sample time periods.
  • the processing engine 112 may determine a comprehensive parameter by combining the spatial features and the historical features. Then the processing engine 112 may determine a vector based on a plurality of comprehensive parameters corresponding to the plurality of service demand parameters by using the second trained model. Further, the processing engine 112 may determine the plurality of future service demand parameters based on the vector by using the second trained model. More descriptions of the future service demand parameters may be found elsewhere in the present disclosure (e.g., FIG. 9, FIG. 10, and the relevant descriptions thereof) .
  • the processing engine 112 may determine suitable scheduling strategies based on the plurality of future service demand parameters, thereby improving service efficiency.
  • the processing engine 112 may receive the plurality of service demand parameters from a storage device (e.g., the storage 150) disclosed elsewhere in the present disclosure or an external data resource.
  • one or more other optional operations (e.g., a storing operation) may be added elsewhere in the process 500. In the storing operation, the processing engine 112 may store the plurality of service demand parameters, the spatial features, the historical features, the plurality of future service demand parameters, etc.
  • operation 520 and operation 530 may be combined as a single operation in which the processing engine 112 may both determine the spatial features and the historical features.
  • FIG. 6 is a schematic diagram illustrating an exemplary target area including a plurality of regions according to some embodiments of the present disclosure.
  • a target area 600 may be a quadrangle area (which can be considered as an image) and may be segmented into H × W squares (i.e., the plurality of regions) , wherein a side length of each of the squares is 200 meters.
  • the specific service demand parameter refers to a number count of service requests in the target area 600 within the corresponding specific time slice.
  • the specific service demand parameter includes a plurality of sub-parameters corresponding to the plurality of regions (i.e., the squares) respectively, that is, each of the plurality of sub-parameters refers to a number count of service requests in a corresponding region within the specific time slice.
  • the sub-parameter X refers to a number count of service requests in a corresponding region [M, N] .
  • FIG. 7 is a flowchart illustrating an exemplary process for extracting spatial features of a service demand parameter according to some embodiments of the present disclosure.
  • the process 700 may be executed by the on-demand service system 100.
  • the process 700 may be implemented as a set of instructions (e.g., an application) stored in the storage ROM 230 or RAM 240.
  • the processor 220 and/or the modules in FIG. 4 may execute the set of instructions, and when executing the instructions, the processor 220 and/or the modules may be configured to perform the process 700.
  • the operations of the illustrated process/method presented below are intended to be illustrative. In some embodiments, the process 700 may be accomplished with one or more additional operations not described and/or without one or more of the operations discussed.
  • operation 520 of process 500 may be performed based on process 700.
  • the description below takes a specific service demand parameter as an example.
  • the processing engine 112 (e.g., the extraction module 420) (e.g., the processing circuits of the processor 220) may obtain a plurality of sub-parameters corresponding to the plurality of regions respectively.
  • for each of the plurality of sub-parameters, the processing engine 112 (e.g., the extraction module 420) (e.g., the processing circuits of the processor 220) may determine a predetermined number count of candidate regions surrounding a region specified by the sub-parameter (i.e., a region corresponding to the sub-parameter) .
  • the processing engine 112 may determine a predetermined number count (e.g., 24) of candidate regions (e.g., the regions that dashed lines pass through) surrounding the region [M, N] specified by the sub-parameter X. It can be seen that the candidate regions and the region [M, N] form a quadrangle (which includes 25 regions) with the region [M, N] as a center.
  • the processing engine 112 may determine spatial dependence information among sub-parameters specifying the candidate regions and the sub-parameter specifying the region by using the first trained model.
  • the spatial dependence information refers to a spatial relationship among variables (e.g., the sub-parameters) , each of which may be associated with a geographic region (e.g., the regions in the target area) .
  • the spatial dependence information may include cross-correlation information and/or autocorrelation information.
  • the first trained model may be a CNN model which can be used in image processing. In the present disclosure, the CNN model may be used to extract the spatial dependence information among the sub-parameters.
  • the specific service demand parameter refers to service demand information (e.g., a number count of service requests) in the target area within a corresponding specific time slice.
  • the target area can be considered as an image including a plurality of regions.
  • the specific service demand parameter can be considered as an image indicating the service demand information in the target area within the corresponding specific time slice.
  • the plurality of service demand parameters can be considered as a series of images in time dimension, which can be determined as input of the CNN model.
  • the CNN model may include one or more model parameters, for example, a number count of filters (e.g., 32) , a kernel size (e.g., 5 × 5) , etc.
  • the predetermined number count of candidate regions may be associated with the kernel size.
  • the kernel size indicates a size of the quadrangle formed by the region and the candidate regions surrounding the region. It is assumed that the kernel size is "5 × 5, " the predetermined number count of candidate regions may be 24.
  • the one or more model parameters may be default settings of the on-demand service system 100 or may be adjustable under different situations.
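The way a 5 × 5 kernel aggregates a region together with its 24 surrounding candidate regions can be illustrated with a naive single-channel convolution. This is a hand-rolled sketch with a uniform averaging kernel, not the trained CNN model of the disclosure:

```python
import numpy as np

def conv2d_same(grid, kernel):
    """Naive 2-D convolution with zero padding: every output cell
    aggregates the cell itself plus the surrounding candidate regions
    covered by the kernel (24 neighbours for a 5 x 5 kernel).
    The kernel here is symmetric, so flipping is omitted."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(grid, ((ph, ph), (pw, pw)))
    out = np.zeros_like(grid, dtype=float)
    H, W = grid.shape
    for i in range(H):
        for j in range(W):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kernel)
    return out

# A uniform 5 x 5 averaging kernel applied to a demand grid of ones.
grid = np.ones((6, 6))
kernel = np.full((5, 5), 1 / 25)
features = conv2d_same(grid, kernel)
```

An interior cell sees a full 25-region window, while a corner cell sees only the 9 regions inside the target area (the rest is zero padding).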
  • the processing engine 112 may extract the spatial features of the specific service demand parameter based on the spatial dependence information.
  • the spatial features may be expressed as values, vectors, matrices, determinants, or the like, or a combination thereof.
  • FIG. 9 is a flowchart illustrating an exemplary process for determining a plurality of future service demand parameters according to some embodiments of the present disclosure.
  • the process 900 may be executed by the on-demand service system 100.
  • the process 900 may be implemented as a set of instructions (e.g., an application) stored in the storage ROM 230 or RAM 240.
  • the processor 220 and/or the modules in FIG. 4 may execute the set of instructions, and when executing the instructions, the processor 220 and/or the modules may be configured to perform the process 900.
  • the operations of the illustrated process/method presented below are intended to be illustrative. In some embodiments, the process 900 may be accomplished with one or more additional operations not described and/or without one or more of the operations discussed. Additionally, the order in which the operations of the process 900 illustrated in FIG. 9 and described below is not intended to be limiting. In some embodiments, operation 540 of process 500 may be performed based on process 900.
  • the processing engine 112 may determine a comprehensive parameter by combining the spatial features and the historical features. As described in connection with operation 520 and operation 530, taking the tth service demand parameter corresponding to the tth time slice as an example, the processing engine 112 may combine the spatial features and the historical features to determine the comprehensive parameter, for example, as a combination of the form f t all [i] [j] = (f t [i] [j] , D t [i] [j] max , D t [i] [j] min , D t [i] [j] avg , D t [i] [j] deviation , is_we, ts) , where:
  • f t all [i] [j] refers to the tth comprehensive parameter corresponding to the tth service demand parameter;
  • f t [i] [j] refers to an expression indicating the spatial features of the tth service demand parameter;
  • D t [i] [j] max refers to the maximum historical service demand parameter;
  • D t [i] [j] min refers to the minimum historical service demand parameter;
  • D t [i] [j] avg refers to the average historical service demand parameter;
  • D t [i] [j] deviation refers to the deviation of the plurality of historical service demand parameters;
  • is_we refers to whether the tth time slice falls on a working day or at a weekend ( "is" refers to working days and "we" refers to weekends) ; and
  • ts refers to an index indicating the tth time slice.
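The combination above might be sketched as follows. The function name, the concatenation order, and the reading of "deviation" as a standard deviation are assumptions made for illustration only:

```python
def comprehensive_parameter(spatial, hist, is_weekend, slice_index):
    """Concatenate spatial features with historical statistics for one cell."""
    d_max, d_min = max(hist), min(hist)
    d_avg = sum(hist) / len(hist)
    # "deviation" is interpreted here as the population standard deviation
    deviation = (sum((x - d_avg) ** 2 for x in hist) / len(hist)) ** 0.5
    return list(spatial) + [d_max, d_min, d_avg, deviation,
                            1.0 if is_weekend else 0.0, float(slice_index)]

f_t = comprehensive_parameter([0.3, 0.7], [10, 12, 14],
                              is_weekend=False, slice_index=96)
print(f_t)  # spatial features followed by max, min, avg, deviation, is_we, ts
```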
  • the index indicating the time slice may be a serial number, a code, etc.
  • assuming that each of the plurality of time slices is "5 minutes," a day (which includes 24 hours) includes 288 time slices.
  • the 288 time slices can be indexed as a serial number range [0, 287] .
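The 5-minute slice indexing in the example above can be sketched with a small helper (names are hypothetical; slices are assumed to be indexed from midnight):

```python
from datetime import time

SLICE_MINUTES = 5  # each time slice covers 5 minutes, as in the example

def time_slice_index(t: time) -> int:
    """Index of the 5-minute slice containing time t, in the range [0, 287]."""
    return (t.hour * 60 + t.minute) // SLICE_MINUTES

print(time_slice_index(time(0, 0)))    # 0   - first slice of the day
print(time_slice_index(time(23, 59)))  # 287 - last slice of the day
```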
  • the processing engine 112 (e.g., the prediction module 440) (e.g., the processing circuits of the processor 220) may determine a vector based on a plurality of comprehensive parameters corresponding to the plurality of service demand parameters by using the second trained model.
  • the second trained model may be a long short-term memory (LSTM) network model including an encoding portion and a decoding portion.
  • the plurality of comprehensive parameters corresponding to the plurality of service demand parameters may be input into the encoding portion of the LSTM network model and the vector may be determined based on the plurality of comprehensive parameters, which can be regarded as an encoded result of the model.
  • the processing engine 112 (e.g., the prediction module 440) (e.g., the processing circuits of the processor 220) may determine the plurality of future service demand parameters based on the vector by using the second trained model.
  • the vector may be input into the decoding portion of the LSTM network model and the plurality of future service demand parameters may be determined based on the vector, which can be regarded as decoding results of the model.
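The encode-then-decode flow can be sketched schematically as below. A real LSTM network uses gated cells and learned weights, so every function form here is a deliberately trivial stand-in:

```python
def encode(comprehensive_params):
    """Fold a sequence of feature vectors into one context vector C
    (stand-in for the encoding portion of the LSTM network)."""
    n = len(comprehensive_params)
    dim = len(comprehensive_params[0])
    return [sum(p[k] for p in comprehensive_params) / n for k in range(dim)]

def decode(context, m):
    """Unroll m future predictions from the context vector
    (stand-in for the decoding portion of the LSTM network)."""
    state = list(context)
    outputs = []
    for _ in range(m):
        state = [0.9 * s for s in state]  # stand-in for a learned transition
        outputs.append(state[0])          # stand-in for a learned readout
    return outputs

params = [[1.0, 2.0], [3.0, 4.0]]  # two comprehensive parameters (n = 2)
C = encode(params)                 # context vector "C"
future = decode(C, m=3)            # three future demand estimates
print(C, future)
```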
  • FIG. 10 is a schematic diagram illustrating an exemplary process for determining a plurality of future service demand parameters according to some embodiments of the present disclosure.
  • the plurality of service demand parameters (i.e., D 1 [i] [j] , D 2 [i] [j] , ..., D n [i] [j] ) corresponding to a plurality of time slices are obtained and input into the CNN model. Then spatial features of each of the plurality of service demand parameters are extracted by using the CNN model. Simultaneously or successively, historical features of each of the plurality of service demand parameters are also obtained. Further, for each of the plurality of service demand parameters, the spatial features and the historical features are combined and a comprehensive parameter is determined. A plurality of comprehensive parameters corresponding to the plurality of service demand parameters are input into the LSTM network model respectively.
  • the LSTM network model includes an encoding portion and a decoding portion.
  • a vector (i.e., "C") may be determined based on the plurality of comprehensive parameters by using the encoding portion of the LSTM network model.
  • the plurality of future service demand parameters (i.e., D n+1 [i] [j] , ..., D n+m [i] [j] ) are determined based on the vector by using the decoding portion of the LSTM network model.
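The FIG. 10 pipeline might be wired together end to end as in this toy sketch, where each stage (CNN, feature combination, encoder, decoder) is replaced by a trivial stand-in:

```python
def pipeline(demand_slices, historical_feats, m):
    """Toy end-to-end flow: spatial features -> combination -> encode -> decode."""
    spatial = [[sum(s) / len(s)] for s in demand_slices]       # stand-in "CNN"
    combined = [sp + h for sp, h in zip(spatial, historical_feats)]
    n, dim = len(combined), len(combined[0])
    context = [sum(c[k] for c in combined) / n for k in range(dim)]  # "encoder"
    return [[0.5 * x for x in context] for _ in range(m)]      # "decoder"

slices = [[4, 6], [8, 10]]  # demand over 2 time slices (2 regions each)
hist = [[5.0], [9.0]]       # one historical feature per slice
print(pipeline(slices, hist, m=2))
```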
  • the first trained model (e.g., the CNN model)
  • the second trained model (e.g., the LSTM network model)
  • the processing engine 112 may obtain a first preliminary model and train the first preliminary model based on the plurality of sample service demand parameters until a preset condition is satisfied, for example, a value of a loss function is less than a loss threshold, an accuracy rate is larger than an accuracy threshold, a number count of iterations reaches an iteration threshold, etc.
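The preset stopping conditions named above (a loss threshold and an iteration threshold) can be illustrated with a toy training loop. The quadratic loss and gradient-descent update are stand-ins, not the actual model training procedure:

```python
def train(initial_w, lr=0.1, loss_threshold=1e-4, max_iterations=1000):
    """Iterate until the loss is below a threshold or the iteration limit hits."""
    w = initial_w
    for iteration in range(1, max_iterations + 1):
        loss = (w - 3.0) ** 2          # toy loss with its minimum at w = 3
        if loss < loss_threshold:      # preset condition: loss small enough
            return w, iteration, loss
        grad = 2.0 * (w - 3.0)
        w -= lr * grad                 # gradient-descent update
    return w, max_iterations, (w - 3.0) ** 2  # stopped at the iteration limit

w, iters, loss = train(initial_w=0.0)
print(round(w, 3), loss < 1e-4)  # converges near 3.0 before hitting the limit
```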
  • aspects of the present disclosure may be illustrated and described herein in any of a number of patentable classes or contexts, including any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof. Accordingly, aspects of the present disclosure may be implemented as entirely hardware, entirely software (including firmware, resident software, micro-code, etc.), or as an implementation combining software and hardware, all of which may generally be referred to herein as a "unit," "module," or "system." Furthermore, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer readable media having computer readable program code embodied thereon.
  • a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including electro-magnetic, optical, or the like, or any suitable combination thereof.
  • a computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that may communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer readable signal medium may be transmitted using any appropriate medium, including wireless, wireline, optical fiber cable, RF, or the like, or any suitable combination of the foregoing.
  • Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, Python or the like, conventional procedural programming languages, such as the "C" programming language, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, ABAP, dynamic programming languages such as Python, Ruby and Groovy, or other programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN) , or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider) or in a cloud computing environment or offered as a service such as a Software as a Service (SaaS) .
  • LAN local area network
  • WAN wide area network
  • SaaS Software as a Service

Landscapes

  • Business, Economics & Management (AREA)
  • Engineering & Computer Science (AREA)
  • Human Resources & Organizations (AREA)
  • Strategic Management (AREA)
  • Economics (AREA)
  • Tourism & Hospitality (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Development Economics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Marketing (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • Finance (AREA)
  • Game Theory and Decision Science (AREA)
  • Accounting & Taxation (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Data Mining & Analysis (AREA)
  • Educational Administration (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

Systems and methods for predicting service demand information are provided. The systems receive a plurality of service demand parameters associated with a target area including a plurality of regions, each of the plurality of service demand parameters corresponding to one of a plurality of time slices; extract, by using a first trained model, spatial features of each of the plurality of service demand parameters based on the plurality of regions; determine historical features of each of the plurality of service demand parameters based on historical service demand parameters associated with each of the plurality of service demand parameters; and determine, by using a second trained model, a plurality of future service demand parameters based on the spatial features and the historical features, each of the plurality of future service demand parameters corresponding to one of a plurality of future time slices.
PCT/CN2019/102588 2019-08-22 2019-08-26 Systems and methods for predicting service demand information WO2021031222A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910779274.8A CN111860926B (zh) 2019-08-22 2019-08-22 Systems and methods for predicting service demand information
CN201910779274.8 2019-08-22

Publications (1)

Publication Number Publication Date
WO2021031222A1 (fr) 2021-02-25

Family

ID=72970594

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/102588 WO2021031222A1 (fr) 2019-08-22 2019-08-26 Systèmes et procédés de prédiction d'informations de demande de service

Country Status (2)

Country Link
CN (1) CN111860926B (fr)
WO (1) WO2021031222A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113792906A (zh) * 2021-08-05 2021-12-14 交控科技股份有限公司 列车长时间窗运行轨迹预测方法、装置、设备及存储介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160111004A1 (en) * 2013-04-22 2016-04-21 Ineo Method and device for dynamic management of urban mobility
CN107451714A (zh) * 2017-06-22 2017-12-08 浙江力石科技股份有限公司 一种基于大数据分析的共享交通资源时域配置方法及系统
CN108090646A (zh) * 2016-11-23 2018-05-29 重庆邮电大学 一种公共自行车智能调度系统预测调度数据的获取方法
CN109117973A (zh) * 2017-06-26 2019-01-01 北京嘀嘀无限科技发展有限公司 一种网约车订单量预测方法及装置
CN109993408A (zh) * 2019-02-28 2019-07-09 河海大学 一种基于服务区域划分的网约车运力调配方法

Also Published As

Publication number Publication date
CN111860926B (zh) 2024-03-29
CN111860926A (zh) 2020-10-30

Similar Documents

Publication Publication Date Title
US11348042B2 (en) Systems and methods for determining predicted distribution of future transportation service time point
US10883842B2 (en) Systems and methods for route searching
US11631027B2 (en) Systems and methods for allocating service requests
WO2018191856A1 (fr) Système et procédé de détermination d'un score de sécurité de conducteur
WO2017157069A1 (fr) Systèmes et procédés de prédiction de point temporel de service
WO2021012342A1 (fr) Systèmes et procédés de prédiction de trafic
AU2017419266B2 (en) Methods and systems for estimating time of arrival
WO2019242286A1 (fr) Systèmes et procédés d'attribution de demandes de service
US11068815B2 (en) Systems and methods for vehicle scheduling
AU2016377721A1 (en) Systems and methods for updating sequence of services
US11120091B2 (en) Systems and methods for on-demand services
US11748860B2 (en) Systems and methods for new road determination
WO2019061129A1 (fr) Systèmes et procédés d'évaluation de stratégie de programmation associée à des services de conduite désignés
US11303713B2 (en) Systems and methods for on-demand services
WO2021031222A1 (fr) Systèmes et procédés de prédiction d'informations de demande de service
WO2021051221A1 (fr) Systèmes et procédés d'évaluation de trajet de conduite
WO2019100366A1 (fr) Systèmes et procédés de répartition de demandes de service à la demande
WO2021022487A1 (fr) Systèmes et procédés de détermination d'une heure d'arrivée estimée
WO2020243963A1 (fr) Systèmes et procédés de détermination d'informations recommandées de demande de service

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19941862

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19941862

Country of ref document: EP

Kind code of ref document: A1