EP3571859B1 - System and method for evaluating wireless device and wireless network performance - Google Patents
System and method for evaluating wireless device and wireless network performance
- Publication number
- EP3571859B1 (application EP18741154A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- network
- data
- wireless device
- optionally
- wireless
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- H04W24/08—Testing, supervising or monitoring using real traffic
- H04L41/5009—Determining service level performance parameters or violations of service level contracts, e.g. violations of agreed response time or mean time between failures [MTBF]
- H04L41/40—Arrangements for maintenance, administration or management of data switching networks using virtualisation of network functions or resources, e.g. SDN or NFV entities
- H04L43/0876—Network utilisation, e.g. volume of load or congestion level
- H04L43/20—Arrangements for monitoring or testing data switching networks where the monitoring system or the monitored elements are virtualised, abstracted or software-defined entities, e.g. SDN or NFV
- H04L43/50—Testing arrangements
- H04L43/045—Processing captured monitoring data, e.g. for logfile generation, for graphical visualisation of monitoring data
- H04L43/08—Monitoring or testing based on specific metrics, e.g. QoS, energy consumption or environmental parameters
Definitions
- The following relates to a method and a computer readable medium for evaluating wireless device and wireless network performance, and wireless network usage trends.
- The number of wireless devices accessing wireless communication networks is continually growing. These devices may access the various networks via cellular, WiFi and other access points. As the number of devices grows, the strain on these networks grows, affecting the performance of both the networks and the devices.
- A mobile wireless communication device for use with a wireless network, and a method and a system for use with a wireless network.
- The mobile wireless communication device may, for example, comprise a processor, a memory coupled to the processor, and an application data module communicatively connected to the processor to collect information related to an application in response to an application management event invoked on the mobile wireless communication device.
- The invention sets out a method of evaluating wireless device and/or wireless network performance and/or wireless network usage trends according to claim 1.
- The invention also sets out a computer readable medium comprising computer executable instructions for performing the method according to any one of claims 1 to 14.
- The following provides a system and method that enable wireless device and wireless network performance to be evaluated by embedding wireless device software in a plurality of applications (or operating systems) deployed and running on a plurality of wireless electronic device types and across a plurality of network types, to enable the aggregation of a more comprehensive collection of data.
- This allows a larger and more meaningful data set to be created for subsequent analyses and reporting.
- The aggregated data set(s) can be used to provide raw data, reports, and dashboard-type interfaces to third parties, and/or to prepare and send feedback data to the wireless device software to control testing behaviour and, if desired, the amount and type of data that is collected.
- The feedback data can be used for many different operations, including to adapt and improve performance of the application and/or device, taking into account data acquired and aggregated from a multitude of applications, devices, and networks.
- The raw data, reports, and dashboard interfaces can be used to enable network carriers or service providers, device manufacturers, game and/or application developers, and other interested parties to perform actions based on a more complete data set, for example, for device and network benchmarking, mobile advertising, application traction and popularity, investment decision making, quality of experience (QoE), network planning, etc.
- A method of evaluating wireless device and/or wireless network performance and/or wireless network usage trends comprises: providing wireless device software to each of a plurality of wireless electronic devices connected to one or more of a plurality of networks by having the wireless device software embedded in the corresponding electronic device, wherein the wireless device software is embedded in or operable with a plurality of types of applications and performs at least one test associated with characteristics and/or location of the device, and/or performance of the device and/or the network, and/or usage of the device by a user; receiving, via one or more collection servers, test data obtained by the wireless device software of each of the plurality of wireless electronic devices; aggregating the received data; and storing and outputting the aggregated data.
- The following provides a system and method that enable wireless device and wireless network performance and wireless network usage trends to be evaluated by embedding wireless device software in the background of a plurality of applications (or operating systems) deployed and running on a plurality of device types and across a plurality of network types, to enable an aggregation of data types for the analysis and reporting of a more meaningful dataset.
- A larger and more meaningful data set is obtained not only for performing analytics on devices, applications, and networks, but also for providing feedback to such applications, devices, and networks to modify and/or improve testing behaviour.
- The aggregated data set(s) can be used to provide raw data, reports, and dashboard-type interfaces to third parties.
- Data such as network quality of service (QoS), app and device data that is crowdsourced from mobile devices can be used to: determine and illustrate the customer's perspective of a network, show device and application usage data, deliver insights that are immediately actionable, and test various aspects of device and network performance that are useful in various ongoing applications.
- FIG. 1 illustrates an example of a wireless environment 10 which includes a number of wireless networks 14 that can be of different types, e.g., different cellular network types (2G, 3G, 4G, etc.).
- The different network types can also include other types of wireless networks, such as WiFi networks accessible through available WiFi access points.
- Various electronic communication devices 12 having wireless capabilities operate by connecting to one or more of the different networks/network types 14 and/or directly with each other over peer-to-peer or mesh-type networks 14.
- Various types of electronic devices 12 are configured to connect to and utilize one or more wireless network types 14, thereby providing a heterogeneous network environment 10 for which performance can be monitored, measured, analyzed, feedback provided, and operability adjusted as herein described.
- Each of the devices 12 includes a software functionality (described below) that is capable of performing tests, monitoring existing device operations and usage, and otherwise collecting data on or related to one or more applications on that device 12.
- The software functionality can be distributed to millions of mobile devices 12 to anonymously collect QoS, device and app usage data, among other things.
- Various partnership arrangements can be implemented, such as: 1) revenue sharing on raw data and/or report sales, e.g., providing insights by way of analysis or convenient presentation of information for subsequent analyses; 2) payment of upfront fees to app/game developers; and 3) providing app/game developers with useful reports to encourage adoption of the software functionality. It can be appreciated that partnerships would not necessarily be required in order to deploy the software functionality on devices by distribution through existing channels, apps, OS, etc.
- Such software functionality can also be integrated with carrier apps or other purpose-built apps to be used on a number of mobile devices 12.
- The software functionality can be embedded in apps and games running on various mobile platforms/OS such as Android, iOS, Windows, etc., as well as other mobile platforms such as those used for wearables, gaming, vehicle systems, wireless sensors, etc. That is, any device/platform with location-tracking capabilities (e.g., GPS, network-based location, etc.), network (e.g., Internet) connectivity, and the ability to run the software functionality described herein is applicable to the data collection, analysis, and feedback/reporting mechanisms described herein. In some implementations, devices with only network connectivity and without location-based capabilities may also be incorporated into the system.
- The data 16 that is collected is preferably tied to a location or otherwise considered "location-based" such that the data 16, or information derived from the data 16, can be placed on a map.
- The data 16 is also preferably collected in an anonymous manner such that no personally identifiable information is collected and/or stored by the system 18.
- The system 18 should be configured not to collect a device's advertiser ID, device ID, or other information that could be used in conjunction with another dataset to identify the user of the device 12.
- The software functionality described herein can be configured to generate and append a unique random number which is specific to the particular installation, and which is reset (e.g., regenerated) periodically (e.g., each day). This can be done to ensure that an adversary cannot observe data reported from one device over the course of several days to determine who that device may belong to.
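The daily-rotating installation identifier described above can be sketched as follows. This is a minimal illustration, not the patent's actual scheme: the function names are hypothetical, and the idea of deriving the daily ID by hashing a never-reported install seed with the current date is an assumption consistent with the stated goal of unlinkability across days.

```python
import hashlib
import secrets
from datetime import date

def generate_install_seed() -> str:
    """Generate a one-time random seed at installation (never reported)."""
    return secrets.token_hex(16)

def daily_report_id(install_seed: str, day: date) -> str:
    """Derive the identifier appended to each report for a given day."""
    material = f"{install_seed}:{day.isoformat()}".encode()
    return hashlib.sha256(material).hexdigest()[:16]

seed = generate_install_seed()
id_monday = daily_report_id(seed, date(2018, 1, 15))
id_tuesday = daily_report_id(seed, date(2018, 1, 16))
assert id_monday != id_tuesday  # reports from different days cannot be linked
```

Because the seed never leaves the device and the reported ID changes daily, an observer of the reported data cannot correlate one device's reports across days.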
- The data 16 can include, without limitation: device metrics (device location, device manufacturer name, device model, OS name and version, network operator ID, % memory free, CPU utilization, battery drain rate, storage utilization); mobile application metrics (application name, download bytes, upload bytes, first install time, last updated time); network QoS metrics (upload throughput, download throughput, latency, link speed, signal strength, jitter, packet discard rate, packet loss, number of radio frequency conflicts); Wi-Fi scan metrics (BSSID, SSID, signal strength); and connection metrics (connection start/end times, connection type, technology, service provider, cell ID, LAC, MCC, MNC, DHCP response time); etc.
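A single collected-data payload combining the metric categories listed above might look like the following. The field names and values are illustrative assumptions, not the patent's actual schema.

```python
import json

# Hypothetical example of one data payload grouping the metric categories
# described in the text (device, application, network QoS, connection).
payload = {
    "device": {
        "manufacturer": "ExampleCo",
        "model": "X100",
        "os_name": "Android",
        "os_version": "8.0",
        "memory_free_pct": 42.5,
        "cpu_utilization_pct": 17.0,
    },
    "application": {
        "name": "com.example.weather",
        "download_bytes": 1048576,
        "upload_bytes": 65536,
    },
    "network_qos": {
        "download_throughput_kbps": 12000,
        "upload_throughput_kbps": 3000,
        "latency_ms": 45,
        "jitter_ms": 4,
        "packet_loss_pct": 0.2,
    },
    "connection": {
        "type": "cellular",
        "technology": "LTE",
        "mcc": "302",
        "mnc": "720",
    },
}

# Serialized for upload to the collection server (e.g., over HTTPS).
encoded = json.dumps(payload)
```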
- The collected data 16 is fed to a central system 18 that includes modules and processes for collecting the data 16, processing and analyzing the data 16, generating feedback for the devices 12, and preparing user interfaces and reports therefor.
- Multiple "central" systems 18 can be used, e.g., to comply with data-handling laws requiring that data from a particular jurisdiction be stored in that jurisdiction.
- The data can be securely stored in cloud-based databases and transmitted via secure connections (e.g., HTTPS).
- The databases can be globally dispersed and can be configured to provide direct access to the clients of the system 18.
- The reports and user interfaces are generated and made available to one or more third parties 22.
- Third party types include, without limitation, network carriers (multiple different ones), game and/or application (app) developers, and other third party systems such as data analytics firms, mobile advertising entities, investment and financial entities, etc.
- The reports and user interfaces can be provided using data visualization tools such as graphical reports, interactive dashboards, web tools, etc. Reports can be delivered on a periodic basis or in real time, with dashboards being available online at any time.
- The raw data and/or reports, dashboards and other user interfaces can be used for innumerable applications and use cases.
- The data 16 that is collected can be used for network planning: since the system 18 observes potentially all networks 14, wireless service providers can interact with the system 18 to see how they are performing relative to the competition in specific areas. This can indicate, for example, which areas offer the highest value for improving a network, since the best new sites for network improvement are typically areas where there is a lot of network traffic and where the provider is being outperformed by the competition.
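The network-planning idea above can be illustrated with a small sketch that ranks areas where a given carrier is both heavily used and outperformed by a rival. The scoring (sample count as a traffic proxy, average throughput gap) is a simplifying assumption for illustration, not the patent's method.

```python
from collections import defaultdict

# (area, carrier, measured download throughput in kbps) samples, as might
# be aggregated from crowdsourced device reports.
samples = [
    ("area_1", "carrier_a", 5000), ("area_1", "carrier_b", 9000),
    ("area_1", "carrier_a", 5200), ("area_1", "carrier_b", 8800),
    ("area_2", "carrier_a", 9500), ("area_2", "carrier_b", 6000),
]

def improvement_candidates(samples, carrier):
    """Rank areas where `carrier` is outperformed by its best rival."""
    by_area = defaultdict(lambda: defaultdict(list))
    for area, c, kbps in samples:
        by_area[area][c].append(kbps)
    candidates = []
    for area, per_carrier in by_area.items():
        own = per_carrier.get(carrier)
        if not own:
            continue
        own_avg = sum(own) / len(own)
        best_rival = max(
            (sum(v) / len(v) for c, v in per_carrier.items() if c != carrier),
            default=0.0,
        )
        if best_rival > own_avg:  # outperformed by the competition here
            traffic = len(own)    # crude proxy for demand in this area
            candidates.append((area, traffic, best_rival - own_avg))
    # highest traffic and largest performance gap first
    return sorted(candidates, key=lambda t: (t[1], t[2]), reverse=True)

# carrier_a averages ~5100 kbps in area_1 while carrier_b averages ~8900,
# so area_1 surfaces as the candidate site for improvement.
ranked = improvement_candidates(samples, "carrier_a")
```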
- The data can also be useful to wireless service provider customers, who may wish to see how their service provider is performing relative to alternatives.
- The system 18 configured as herein described can determine, for the devices 12 having the software functionality obtaining the crowdsourced data 16, whether or not data 16 is being collected from a roaming device 12; if the device 12 is roaming, the system 18 can be configured to determine the home service provider from the data 16 that is collected. This tells a carrier whether its roaming partners are performing well enough and whom it should choose as roaming partners in the future.
- Self-Organizing Networks (SONs) can also benefit from the data 16 that is crowdsourced by the system 18.
- SONs dynamically adjust the network, antenna, or beamforming characteristics in response to network quality. The idea is that the network is self-healing, such that if there is an issue, the network will adjust to eliminate the issue on its own.
- SONs typically require access to large amounts of field data to operate, which can be satisfied with the large datasets that can be obtained using the system 18.
- Use cases include, without limitation: device and network benchmarking, in which the system reports on which networks, devices, cell towers, network equipment, operating systems, etc. perform best for consumers; and investment applications. Regarding informing investments, it can be appreciated that the system 18 can determine which mobile applications exist and are being used on mobile devices 12, and determine on a day-to-day basis how much those mobile applications are being used. This allows correlations to be made between user activity (e.g., mobile shopping, browsing, etc.) and a company's performance, and thus its share value.
- Yet another use case can include reporting on mobile application usage trends and which apps are gaining or losing popularity with customers.
- A wireless service provider could use this information to predict the network impact of consumer application trends.
- A financial institution could use this information to make predictions related to the stock market.
- The device 12 includes a processor 30, memory 32, and an operating system 42.
- The device 12 is also operable in this example to provide graphical user interfaces to a user via a display 34.
- A visual component can be provided directly on the device 12 for displaying a portion of the collected information to the user, if desired by the user.
- The device 12 also includes one or more communication interfaces 36 that are operable to connect the device 12 to one or more networks 14.
- The device 12 can include multiple apps 38, of different types as discussed above.
- Each app 38 embeds the aforementioned software functionality, depicted as wireless device software (WDS) 40, which runs in the background of the app 38 to gather particular data, perform tests, etc.
- The WDS 40 is not only capable of accessing components on the device 12 such as the processor 30, battery (not shown) and OS 42; it can also be configured to communicate, either directly or via the app 38 in which it resides, on one or more networks 14 by interfacing with the one or more communication interfaces 36.
- While each app 38 can include an embedded instance of the WDS 40 for monitoring and testing the app 38 and/or device 12, the WDS 40 can be deployed in various other configurations.
- FIG. 2B illustrates that the WDS 40 can instead (or in addition to) reside in the OS 42 and centrally interact with a number of the apps 38.
- The WDS 40 may also reside as a stand-alone application, or in another location or component of the device 12 as shown in dashed lines, with functionality to interact with a number of (or all of) the apps 38.
- One or more of the apps 38 can additionally have the WDS 40 reside thereon (also shown in dashed lines), e.g., apps 38 that need to have such operations controlled internally rather than being opened up to an external program, module or routine.
- The WDS 40 can therefore be installed in several different apps (e.g., in a weather app and in a completely different game), and these different apps could potentially be installed on the same phone or on a multitude of different phones. This allows for the scenario wherein the WDS 40 is installed several times on the same phone (e.g., as illustrated), in which case the WDS 40 should identify that it is getting data from the same device 12.
- The WDS 40 can have a hardcoded limit on the number of tests that can be performed over a time period, which limit is unalterable by the configuration server.
- The WDS 40 can also be operable to identify its own code running in a different application on the same electronic device, and respond by having only one instance of the wireless device software operate at a time.
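The two safeguards above (a hardcoded test cap and single-instance operation) can be sketched as follows. The shared registry dict stands in for whatever on-device cross-app mechanism a real implementation would use (e.g., a lock file); the class and method names, and the one-hour window, are hypothetical.

```python
import time

# Compiled-in ceiling on tests per hour; by design no server
# configuration can raise it (the exact limit here is illustrative).
HARD_MAX_TESTS_PER_HOUR = 60

class WdsInstance:
    def __init__(self, app_name, device_registry):
        self.app_name = app_name
        self.registry = device_registry  # shared per-device state
        self.tests_run = []

    def is_active_instance(self):
        # The first instance to claim the slot runs tests; later
        # instances detect the claim and defer.
        holder = self.registry.setdefault("active_wds", self.app_name)
        return holder == self.app_name

    def try_run_test(self, now=None):
        now = time.time() if now is None else now
        if not self.is_active_instance():
            return False  # another WDS instance on this device is active
        recent = [t for t in self.tests_run if now - t < 3600]
        if len(recent) >= HARD_MAX_TESTS_PER_HOUR:
            return False  # hardcoded limit reached
        self.tests_run.append(now)
        return True

registry = {}  # stands in for a per-device cross-app mechanism
wds_in_game = WdsInstance("game_app", registry)
wds_in_weather = WdsInstance("weather_app", registry)
assert wds_in_game.try_run_test() is True
assert wds_in_weather.try_run_test() is False  # defers to active instance
```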
- A data collection configuration is shown at a high level in FIG. 3 .
- Each mobile device 12 that is configured to operate the WDS 40 (using one or more apps 38) provides data to a collection server 50 that is deployed as part of the system 18.
- The collected data 16 is processed as herein described, along with data 16 obtained from other devices 12, to generate information and data for third party systems 22.
- FIG. 4 provides further detail for the configuration shown in FIG. 3 , in which the collection server 50 collects the data 16 and has the collected data processed by a data processing stage 52. The data thus processed is then provided to a data distribution stage 54 for distribution to the third party systems 22.
- The data distribution stage 54 can also enable the system 18 to provide feedback to the mobile device 12 by communicating with a device software support functionality 56 that is connectable to the WDS 40 to complete the feedback loop.
- FIG. 5 illustrates a configuration similar to that shown in FIG. 4 , but illustrating the collection of data 16 from multiple devices 12 via multiple WDSs 40.
- The plurality of mobile devices 12 shown can be served by a common device software support entity 56 and can provide data 16 to a common collection server 50.
- The system 18 may employ multiple regional collection servers 50 and device software support entities 56 as needed; thus the example shown in FIG. 5 is illustrative only.
- FIG. 6 illustrates operations that can be performed in the data processing stage 52 to collect and aggregate the data 16 that is received from potentially a multitude of different types of sources.
- The collected data is therefore aggregated, in this example using a data aggregation module 60 that can utilize rule set(s) 62, template(s), or other data structure(s) defining how to meaningfully aggregate the data for subsequent analysis, generating one or more aggregated dataset(s) 66 that can be stored within the data processing stage 52, or elsewhere within or accessible to the system 18.
- The data aggregation module 60 can also utilize any other third party metadata 64 from third party data sources that is useful in aggregating and analyzing the data 16.
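Rule-driven aggregation of the kind described above might be sketched like this. The rule format (group-by fields plus a metric to average) is an assumption for illustration; the patent does not specify the rule set structure.

```python
from collections import defaultdict
from statistics import mean

# Records as might arrive from many devices via the collection server.
records = [
    {"network": "carrier_a", "device": "X100", "latency_ms": 40},
    {"network": "carrier_a", "device": "X100", "latency_ms": 60},
    {"network": "carrier_b", "device": "X100", "latency_ms": 30},
]

def aggregate(records, rule):
    """Group records by the rule's fields and average its metric."""
    groups = defaultdict(list)
    for rec in records:
        key = tuple(rec[f] for f in rule["group_by"])
        groups[key].append(rec[rule["metric"]])
    return {key: mean(vals) for key, vals in groups.items()}

# A hypothetical rule from rule set 62: average latency per network.
rule = {"group_by": ["network"], "metric": "latency_ms"}
result = aggregate(records, rule)
```

Different rules (e.g., grouping by device model or by region) would produce the differently sliced aggregated datasets 66 that the analyses and reports consume.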
- FIG. 7 illustrates at a high level the use of the aggregated dataset(s) 66 to perform subsequent analyses using a feedback analytics module 70. This allows the analysis to be performed on data that has been aggregated or "stitched" across the data types, app types and network types to provide more meaningful feedback 72 that can be tailored according to different devices 12, different apps 38, utilization in different regions, on different networks or network types, etc.
- The WDS 40 in this example is embedded in a mobile app 38 and includes a software interface 84 for interfacing between the app 38 and a software controller 86 for controlling the tests and other operations of the WDS 40.
- The WDS 40 also includes a test data storage 88 for storing data acquired during the tests, and a testing SDK 90 for performing one or more particular tests that involve operation of the app 38 and/or the device itself via the device OS 42.
- The WDS 40 also includes a utilities SDK 92 that includes methods, functions, and APIs that can be used to pull data and information from the device OS 42. Such methods can be used to export data to the collection server 50.
- The SDK 92 is also operable to communicate with the collection server 50.
- The collection server 50 includes a reporting server 96 for receiving test data and any other data being reported by the WDS 40, and a reporting database 98 for storing the test data for use by the data processing module 52.
- The data processing module 52 includes a central data services (CDS) server 100 that provides data source APIs for different third party data sources and metadata.
- The CDS server 100 can also provide local storage for quick responses to the data aggregation operations.
- The CDS server 100 also interfaces externally with the one or more third party data sources 64 and internally with the data aggregation module 60 discussed above.
- The data aggregation module 60 obtains (i.e., pulls, requests or otherwise receives) the data collected by the collection server 50.
- The data aggregation module 60 also performs aggregation of the various data and data types and stores the aggregated data in a reports database 104 to be accessed by a report generation module 106 for generating various types of reports, dashboards, etc.
- Data can also be pulled in from third party data sources, not only the collection server. For example, external databases can be pulled in that help translate latitude and longitude into the names of the cities where the data was collected.
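The latitude/longitude-to-city translation mentioned above can be illustrated with a toy nearest-city lookup. A real deployment would join against a full geocoding dataset; the table and coordinates below are approximate placeholders.

```python
import math

# Tiny stand-in for an external geocoding database.
CITY_TABLE = [
    ("Toronto", 43.65, -79.38),
    ("Ottawa", 45.42, -75.70),
    ("Montreal", 45.50, -73.57),
]

def nearest_city(lat, lon):
    """Return the name of the closest city in the lookup table."""
    def dist(city):
        _, clat, clon = city
        return math.hypot(lat - clat, lon - clon)
    return min(CITY_TABLE, key=dist)[0]

# Enrich a collected record with a human-readable location during
# aggregation, so reports can be grouped by city rather than raw
# coordinates.
sample = {"lat": 43.70, "lon": -79.40, "latency_ms": 45}
sample["city"] = nearest_city(sample["lat"], sample["lon"])
```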
- The report generation module 106 can generate various types of data for distribution to third parties 22 as shown in FIG. 8 .
- The report generation module 106 can generate reports 110 and/or dashboards 112, and can prepare raw data 114 to be analyzed elsewhere.
- The report generation module 106 can also prepare feedback data 116 to be sent to the device support software 56; in this example configuration, to a feedback server 126 that is part of such device support software 56.
- The device support software 56 can include various servers that can communicate with and control, monitor, update, fix, kill, or otherwise interact with the WDS 40 in the various devices 12.
- The device support software 56 includes the feedback server 126 mentioned above, as well as a configuration server 124 for managing the configurations for the WDS 40, and an authentication server 122 for authenticating the WDS 40 to ensure that it is from an appropriate app and app developer.
- The device support software 56 also includes a testing server 120 for interacting with the testing SDK 90 to provide and update/configure tests and test sets to be performed by the WDS 40.
- The WDS 40 can be configured as a software library that is embedded in the mobile device apps 38 in order to report to and integrate with the collection server 50 and data processing module 52.
- The libraries of the WDS 40 can be added to an existing application to collect device, connection, network QoS, Wi-Fi, and application key performance indicators (KPIs). It can be appreciated that this over-the-top approach only requires the WDS 40 to have the ability to communicate with the system 18 over a network connection, for example either Wi-Fi or mobile. This allows for the flexibility of deploying through a cloud infrastructure anywhere around the world.
- the WDS 40 interacts with the device software support entity 56, which can include different servers with which the WDS 40 can communicate during its operation.
- the device support software 56 includes servers responsible for authentication and initiation (authentication server 122), configuration (configuration server 124), testing (testing server 120), and reporting (reporting server 96) that communicate with the WDS 40.
- the authentication server 122 can be used to dictate which application programming interface (API) keys and apps 38 are allowed to operate and collect data through the WDS 40.
- the configuration server 124 can be used to set specific rules and parameters for the operation of the WDS 40.
- the WDS 40 can also use testing servers 120 to perform active tests on the connected network 14.
- the reporting servers 96 are used to upload the data payloads from the WDS 40 to the system 18.
- the authentication server 122 can be used to verify that applications 38 are using the correct API key for each developer, and to provision each app with a unique deployment key.
- Each application developer can be assigned an API key, which is used to generate a unique deployment key for each application 38. This deployment key is used to control the configuration of the WDS 40, as well as track the data collected by each application 38.
- the authentication server 122 can also check that the app 38 has not been registered with the system 18 previously. This ensures that the data collected through the WDS 40 is associated back to the correct application 38 and developer, e.g., to account for revenue sharing.
- the authentication server 122 also allows the control of shutting down specific applications or developers from collecting data at any time, e.g. for implementing a "kill switch".
- the WDS 40 can be configured to check with the authentication server 122 on first initialization of the WDS 40, and periodically (e.g., every few days) following initialization. This allows the authentication server 122 to shut off any application 38 from collecting data 16. All communication and data transferred between the WDS 40 and the authentication server 122 is preferably secured and encrypted. For example, the WDS 40 can be given a three day local cache on the device 12 so that it does not check in with the authentication server 122 on every initialization, preventing extra traffic or chattiness over the network 14.
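The check-in-with-cache behaviour described above can be sketched as follows. This is a minimal illustration; the class and field names are hypothetical and not part of the WDS 40:

```python
CACHE_TTL_S = 3 * 24 * 3600  # three-day local cache, per the example above

class AuthCheckCache:
    """Hypothetical helper deciding when the WDS must re-contact the
    authentication server rather than rely on its cached verdict."""

    def __init__(self):
        self.last_check_ts = None
        self.allowed = False

    def needs_server_check(self, now_ts):
        # first initialization, or the cached verdict has expired
        return self.last_check_ts is None or (now_ts - self.last_check_ts) >= CACHE_TTL_S

    def record_verdict(self, now_ts, allowed):
        # store the authentication server's decision locally
        self.last_check_ts = now_ts
        self.allowed = allowed
```

On each initialization the WDS would consult `needs_server_check()` and only generate network traffic to the authentication server 122 when it returns true.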
- the testing servers 120 are used to perform active tests on a network 14 through interaction with the WDS 40.
- the testing servers 120 can host various files of different sizes for performing download throughput tests. For upload throughput tests, the testing servers 120 can provide an un-throttled bucket to upload files of any size. Furthermore, the testing servers 120 can also echo back packets for the corresponding communication protocol (e.g., UDP packets) sent from the WDS 40 for server response tests. Multiple testing servers 120 can be set up as necessary around the world.
- the testing servers 120 can be deployed in a cloud or on-premises hosting environment.
- the WDS 40 determines which server 120 to use for performing active tests by choosing the most appropriate server 120 based on the device's geographic location.
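Selecting the most appropriate testing server 120 by geographic location can be sketched as a nearest-server computation. The server names and coordinates below are hypothetical:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    # great-circle distance between two lat/lon points, in kilometres
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_server(device_lat, device_lon, servers):
    # servers: list of (name, lat, lon); returns the closest server's name
    return min(servers, key=lambda s: haversine_km(device_lat, device_lon, s[1], s[2]))[0]
```

A device near London would thus be directed to a European server rather than a North American one, keeping active tests representative of the local network path.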
- the testing servers 120 used by the WDS 40 can be configured through the configuration server 124. All communication and data transferred between the WDS 40 and the testing servers 120 is preferably secured and encrypted.
- the configuration server 124 is designed to allow full control over the WDS 40.
- the configuration server 124 allows the system 18 to adjust data collection frequencies, data reporting frequencies, and the types of data being collected for devices 12 out in the field.
- Each WDS deployment can be assigned a unique deployment key, used by the WDS 40 to periodically check what data collecting/reporting behaviors the WDS 40 should be adhering to. This allows the dynamic adjustment of the WDS 40 performance to fine tune battery consumption, network chattiness, and other parameters.
- a configuration profile held by the configuration server 124 is downloaded to the WDS 40 upon the initialization of the WDS 40.
- the configuration server 124 may hold a new policy that says "Do not collect data in Country X". That new policy, or that new profile for data collection, would be downloaded and executed by the WDS 40.
- a new configuration profile is pulled to the WDS 40 on a specified frequency.
- the WDS 40 can also have a local cache on the device 12 (e.g., three days), of the configuration server 124, to prevent the WDS 40 from pulling configurations from the configuration server 124 too frequently. All communications and data transferred between the WDS 40 and the configuration server 124 are preferably secured and encrypted.
- the configuration file/data can be signed by the service with a known, trusted security certificate.
- the signature is passed down with the configuration server's configuration where it is verified in the WDS 40 on the device 12.
- the WDS 40 may then try to match the signature on the server configuration with one generated locally on the device 12 using the same certificate as the server side. If the signature generated on the WDS 40 does not match the one provided by the configuration server 124, the WDS 40 can be configured to reject the configuration and continue to use the previous configuration, or a default. This co-signing verification between the server 124 and WDS 40 ensures that the configuration is not compromised. Compromising the configuration supplied to the WDS 40 can have varying degrees of impact on the user device, the amount of data used, the battery impact, etc.
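The co-signing verification described above can be illustrated in simplified form. The sketch below substitutes a shared-key HMAC for the certificate-based signature described in the text, which keeps the compare-and-reject logic visible without modelling certificates:

```python
import hmac
import hashlib

def sign_config(config_bytes, key):
    # server side: sign the configuration payload before sending it down
    return hmac.new(key, config_bytes, hashlib.sha256).hexdigest()

def verify_config(config_bytes, signature, key):
    # device side: regenerate the signature locally and compare in
    # constant time; on a mismatch the WDS would keep its previous
    # (or default) configuration instead of applying the new one
    local = hmac.new(key, config_bytes, hashlib.sha256).hexdigest()
    return hmac.compare_digest(local, signature)
```

Any tampering with the configuration in transit changes the payload bytes, so the locally generated signature no longer matches and the configuration is rejected.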
- the WDS 40 can initialize by checking with the authentication server 122 to run or not.
- the WDS 40 then pulls a configuration file from the configuration server 124 to direct the operation of the WDS 40.
- Data is then collected by the WDS 40 by interacting with the device OS to capture various KPIs about the device, network connection, network QoS, WiFi scan information, and application data usage, etc. as discussed herein.
- the WDS 40 can also perform network performance tests against the testing server(s) 120.
- Data is collected by the WDS 40 and stored in a database (e.g., SQLite) over a particular time period, e.g., a 24 hour period.
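The collect-then-export cycle described above can be sketched with Python's built-in SQLite module. The table schema below is illustrative, not the WDS 40's actual schema:

```python
import sqlite3

def store_and_export(records):
    # collect KPI records locally for a reporting period, then export
    # the rows in timestamp order for upload to the reporting server
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE kpis (ts INTEGER, kind TEXT, value REAL)")
    conn.executemany("INSERT INTO kpis VALUES (?, ?, ?)", records)
    rows = conn.execute("SELECT ts, kind, value FROM kpis ORDER BY ts").fetchall()
    conn.close()
    return rows
```

In a real deployment the database would live on the device 12 and be exported once per period (e.g., every 24 hours) rather than held in memory.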
- the database is then exported to the reporting server(s) 96.
- the reporting servers 96 can parse through the database to split the data into different tables, e.g., within BigQuery.
- the data is stored in various BigQuery reporting tables depending on the type of data.
- dataflow jobs can be run to add additional metadata to the raw data uploaded from the WDS 40. This metadata includes tagging the raw data with country, region, and city metadata, etc.
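A dataflow job of this kind essentially joins each raw record against a geographic lookup. The sketch below uses a tiny in-memory city table as a stand-in for the third-party geo database; the city entries and field names are illustrative:

```python
# hypothetical lookup table: (city, country, lat, lon)
CITIES = [
    ("Toronto", "CA", 43.65, -79.38),
    ("London", "GB", 51.51, -0.13),
]

def tag_record(record):
    # attach city/country metadata to a raw record by nearest lookup
    lat, lon = record["lat"], record["lon"]
    name, country, _, _ = min(
        CITIES, key=lambda c: (c[2] - lat) ** 2 + (c[3] - lon) ** 2
    )
    return {**record, "city": name, "country": country}
```

The tagged records can then be grouped by the added metadata (country, region, city) in the reporting tables.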
- the collection server 50 is configured to collect data from multiple mobile devices 12 by having the reporting server 96 interfaced or otherwise communicable with the WDS 40 in each of the multiple devices 12. It can be appreciated that while the collection server 50 can communicate with multiple devices 12, the wider system can include multiple collection servers 50, e.g., regionally placed, each collection server 50 being capable of communicating with the data processing module 52. FIG. 9 also illustrates that the feedback data 116 generated by the report generation module 106 can be provided to multiple different third parties 22 in addition to the feedback server 126. The feedback server 126 can be configured to communicate with multiple mobile devices 12 via the respective WDS(s) 40.
- FIG. 10 illustrates data flow in gathering, aggregating, and analyzing data from mobile devices 12 for preparing and providing reports and/or raw data.
- the mobile application or operating system, etc. initiates the WDS 40 to begin collecting test data on the mobile device 12 at step 206.
- the OS 42 or other components of the device 12 can be used to initiate the WDS 40 to begin the data collection.
- the data collection at step 206 is performed based on network tests performed in connection with the device support software 56 at step 202 and by communicating with the device OS 42 at step 204.
- the collected data is stored at step 208 and uploaded to the system 18 at step 210.
- the uploaded data is collected and aggregated at step 212 and stored at step 214 in the reporting data storage as noted above.
- the aggregated data can be correlated in various ways at step 216 by referencing third party data sources 82 in order to generate and store reports data at 220. This enables the various data reports to be provided at step 222.
- the data can be aggregated at step 212 by adding the uploaded data to a large set of tables, e.g., split by day.
- the large set of tables can then be queried according to certain variables.
- data for all apps 38, devices 12 and networks 14 can be placed in the same data storage, and can be grouped in various ways depending on what is meant to be shown in the reports, dashboards, etc.
- the data is analyzed in various ways. For example, the data can be broken down by country, region, city, etc.; as well as by time periods (e.g., month). Custom groupings can also be performed by network type (2G vs 3G vs 4G) and statistics determined and displayed for those groupings. Custom groupings can also be performed by application package name and application name. It can be appreciated that determining application package names is non-trivial since a single application can have multiple packages as part of its installation, and also different names in different languages.
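The custom groupings described above amount to a group-and-aggregate pass over the uploaded records. A minimal sketch, with illustrative field names:

```python
from collections import defaultdict

def group_stats(records, key="network_type", metric="download_kbps"):
    # group uploaded records (e.g. by 2G/3G/4G) and compute a mean per group
    groups = defaultdict(list)
    for r in records:
        groups[r[key]].append(r[metric])
    return {k: sum(v) / len(v) for k, v in groups.items()}
```

The same function applies to any grouping variable (country, operator, application name) by changing the `key` argument.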
- the system 18 is configured to coalesce the packages to obtain a single-language list of app names and their associated package names (since package names are globally unique).
- Custom groupings can also be prepared for service providers based on mobile country codes (MCCs) and mobile network codes (MNCs). This allows brands to be matched up with operators for a given network 14, rather than relying solely on the network 14 reported by the device 12 (e.g., since there may exist a roaming situation or other scenario where the provider listed by devices 12 may be inconsistent).
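Matching brands to operators via MCC/MNC reduces to a lookup keyed on the code pair. A minimal sketch, using a small illustrative table rather than a complete registry:

```python
# illustrative MCC/MNC -> brand entries; a real system would load a
# full numbering registry rather than hard-code a table
OPERATORS = {
    ("302", "720"): "Rogers",
    ("310", "260"): "T-Mobile US",
}

def operator_for(mcc, mnc):
    # resolve the serving operator's brand from the reported code pair
    return OPERATORS.get((mcc, mnc), "unknown")
```

Because the MCC/MNC pair identifies the actual serving network, this lookup remains correct even when the device is roaming and reports a different provider name.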
- the system 18 can therefore combine the uploaded data from a multitude of different mobile applications 38 and deployments from a multitude of devices in various networks, regions, etc.
- the system 18 is also able to pull additional metadata 64 from several other third-parties and open data sources 82.
- the system 18 can output raw data files as well as make data available for visualizations through user interfaces (e.g., dashboards).
- a set of the dataflow jobs can be used to add additional metadata 64 to the raw data being uploaded from the WDS 40.
- These dataflow jobs can be performed periodically, e.g., hourly on the last hour of data upload from the WDS 40.
- the results can then be grouped into daily tables at a particular time, e.g., GMT midnight, for querying.
- the data reports generated at step 222 can therefore be constructed in various ways and, if desired, additional third party data sources 82 can be incorporated. Since the data is collected from a multitude of WDSs 40 deployed within various types of applications running on various types of OSs 42 and device types, all within, crossing between, and/or interacting with various network types 14 and regions, a more comprehensive view of a network, device, application, operating system, or electronic environment more generally can be obtained. The data that is collected and stored can be queried in many ways for many purposes to suit the needs of different third parties 22 wanting access to such a wider and more complete set of data.
- because the WDS 40 can be deployed within various types of apps 38, such as games that enjoy substantial circulation and reach across multiple platforms and regions, an unobtrusive tool is deployed that can be leveraged to gather such desired data on a periodic and ongoing basis without adversely affecting the performance of the devices 12 or apps 38.
- FIG. 11 illustrates a process flow similar to FIG. 10 , wherein the data stored at step 220 can be additionally used to conduct feedback analyses at step 250 (e.g., as shown illustratively in FIG. 7). While the reports provide feedback in the form of raw data, analyzed data, graphical user interfaces, dashboards, etc., the data that is collected can also be used to distribute feedback to and affect the operation of the WDSs 40 and the mobile devices 12 themselves. As shown in FIG. 11 , the feedback analysis at 250 can be followed by a feedback distribution stage at step 252 to complete a "feedback loop" with the data collection operations performed at step 206. The feedback can be used in various ways.
- the feedback could: 1) affect the WDS 40 to change how/when data is collected; 2) affect the mobile application itself; and 3) affect the device.
- For 2), one can consider a case where it is identified that all networks in a particular city are particularly slow. A game in that city may choose to download lower resolution images, avoid gameplay features that require interaction with many other players, or avoid asking the user to buy anything since the credit card payment may fail.
- For 3), the mobile device 12 could decide to use a different type of network based on the information that is in the feedback package, or in a SON-type use case the feedback could direct the device 12 to connect to a specific cell tower. In the cases of 2) and 3), the actions taken will ultimately affect the type and quantity of data collected by the WDS 40.
- FIGS. 12 to 19 illustrate screen shots of example user interfaces that can be generated using the data collected from devices 12 as herein described.
- FIG. 12 illustrates an example of a hex map showing coverage availability for 2G/3G/4G networks for a particular geographic region. The performance is shown in coloured hexagons of consistent size, with the radius being dynamically re-sized for different applications.
- FIG. 13 illustrates another hex map with network provider rankings. It can be appreciated that for both FIGS. 12 and 13 , specific key performance indicators (KPIs) can be shown, as well as radio technology coverage, operator comparisons, and other data types.
- the hex maps shown in FIGS. 12 and 13 can be useful for seeing pockets of coverage type and quality in certain areas, seeing competitor and roaming partner experience, and identifying areas of poor experience (e.g., high packet loss, etc.), among others.
- FIG. 14 illustrates two examples of region maps, one showing download performance, and the other showing performance change for specified time periods.
- the region maps can be used to show regions of interest and can be set to a particular country, region, postal/zip code, etc. Colour coding can also be used to allow comparisons between regions.
- Such region maps can be useful for identifying performance quality or lack thereof in regions of interest, as well as the ability to see area performance for customer support and marketing purposes. For example, by having data from multiple network types 14, carriers can determine metrics such as "the best provider in your postal area", etc.
- FIG. 15 illustrates a regional map with highways and other points of interest (POIs). This allows for network QoS to be shown relative to highways and other POIs like airports, train stations, train/transit lines, sports stadiums and other places that users may gather and expect or desire good network coverage.
- the screen shot shown in FIG. 15 can also be incorporated into a user interface or dashboard that allows a user to drill down into specific venues, junctions, and isolate based on date ranges.
- the data that is collected by the system 18 can also be used to allow users to drill down into various KPIs such as download speed, latency, packet-loss, etc., therefore allowing service providers, venue operators and other interested parties to determine network QoS for metrics in which they are interested.
- FIG. 16 illustrates a screen shot of a user interface for displaying overview statistics for a particular region.
- network statistics are shown for all operators in a selected geography and the data can be displayed for specific date ranges. Since data is collected by the system 18 over a multitude of devices 12 in a multitude of networks and network types 14, the overview provided in FIG. 16 can be obtained and periodically updated over time. The information provided can be useful for competitor benchmarking, since data concerning other networks is available, as opposed to only having data for one particular network. The data shown in FIG. 16 can also be useful for making roaming partner selections, since a network can obtain data for all operators in a particular region and can assess the quality of service that can be expected should they choose that roaming partner.
- QoS trends can be used for issue resolution and performance monitoring.
- Regional performance tables can be provided to show network QoS performance broken down by region (e.g., city) and by operator in selected countries. The tables can be colour coded to highlight improvements or degradation. These tables can be useful for competitor benchmarking, roaming partner selection, and for identifying areas requiring investment/improvement.
- Device performance statistics can also be provided to show performance by device and how these devices compare when used on home and competitor networks. Device performance statistics can be useful for device manufacturer considerations and issue resolutions, recalls, warranty issues, etc.
- app usage statistics can be provided to show, for example, total active users, total data usage, etc.
- the app statistics can be filtered by geography, operator, device type, radio technology, etc.
- the app statistics can be considered useful for determining trends in user behaviour (e.g. growth in app types), and for optimizing networks for popular applications.
- the data gathered and analyzed by the system 18 can also be used for infrastructure planning tools in which poor performing locations or infrastructure can be displayed on a map. These maps can be made interactive such that clicking on a location displays a street-view to search for possible infrastructure locations, etc.
- the maps can also display a list of local businesses for potential partnership (e.g., for small cell or WiFi access points).
- FIG. 17 illustrates an example of a web-based platform that can be provided to conduct network analyses.
- the network analysis dashboard in FIG. 17 can utilize multiple panes or portions with options to deep-dive to street level and cell-tower performance analysis, select different statistical tables or mappings to be displayed, etc.
- the dashboard shown in FIG. 17 can be used for infrastructure planning and validation.
- the dashboard can be used to display device statistics for understanding macro-level trends, as shown in FIG. 18 , or to show detailed network coverage mappings of areas and venues as shown in FIG. 19 .
- the system 18 described above contemplates testing networks 14 and generating test data in a few different ways, as described below.
- the system 18 described herein can be configured to perform network tests that are either initiated by user actions, or informed by user actions. This can be done by being given, or otherwise having access to, additional user or mobile service information, which can greatly enhance passive testing (and testing in general).
- mobile apps 38 can track user actions such as the user clicking a button to upload a photo. When the mobile app 38 sees that a user has clicked the button "upload photo", it can run a passive network test on that data upload while knowing: 1) It was a photo; 2) the size of the photo being uploaded; and 3) the destination server address.
- the mobile app 38 and WDS 40 are in a position to leverage an increased understanding of the nature of the file transfer to perform a more effective and accurate passive throughput test.
- This can be done, for example, by having the WDS 40 utilize an API to ingest information from the mobile app 38.
- the mobile app 38 passes information to the WDS 40, such as "the user just clicked a button to upload a photo of size x". Accessing this information provides context that may not have previously been available for passive testing, for instance when a file has been uploaded, not knowing that it was a photo, the resolution or size of the photo, or the destination server and routing details.
- the system 18 can therefore be adapted such that the user's interaction with a mobile service would dictate what type of passive network test to perform and how to interpret the results. For example, if the user uploads a photo on a particular mobile service such as Instagram, the system 18 can use that additional information to perform a passive network test that is designed to monitor the network's ability to handle photo uploads. This additional information is typically provided by the mobile application 38 which contains the network testing code; however, other sources for that additional information are possible. In this event, the system's passive test would have access to additional information such as: 1) that the user is trying to upload a photo; 2) the size of that photo; and 3) the destination server, etc.
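With this additional information, the passive throughput computation becomes straightforward: the known payload size divided by the observed transfer time. A sketch, with a hypothetical context-passing API whose field names are illustrative rather than taken from the source:

```python
def passive_throughput_kbps(payload_bytes, start_ts, end_ts):
    # with the app telling the WDS the exact payload size, a passive
    # throughput measurement is just bits over elapsed seconds
    elapsed = end_ts - start_ts
    if elapsed <= 0:
        raise ValueError("end time must follow start time")
    return (payload_bytes * 8 / 1000.0) / elapsed

def on_photo_upload(context):
    # hypothetical hook through which the app hands upload context
    # (content type, size, destination) to the testing library
    return {
        "test_type": "passive_upload",
        "content": "photo",
        "size_bytes": context["size_bytes"],
        "destination": context["destination"],
        "kbps": passive_throughput_kbps(
            context["size_bytes"], context["start_ts"], context["end_ts"]),
    }
```

The resulting record carries both the measurement and the context (photo, size, destination), which is what distinguishes user-informed passive testing from a blind observation of traffic.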
- user informed testing does not need to be limited to passive network tests.
- the mobile user's behaviour, characteristics, location, etc. could dictate specific active tests which should be run based on the types of tests desired by the controller of the system.
- User informed testing also allows the system to consider whether an active test or a passive test would be most appropriate. For example, it may be best to only run passive tests, which do not create new network traffic, when the user is watching a video or doing something with their device 12 which is sensitive to network performance. In other words, this "additional information" and user informed testing can help dictate when and where tests should be performed to: 1) not interfere with user experience, or 2) provide the information which is most needed by the system.
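The active-versus-passive decision just described can be expressed as a small policy function. The activity labels and the rule itself are illustrative assumptions:

```python
NETWORK_SENSITIVE = ("video_streaming", "voice_call", "gaming")

def choose_test(user_activity, data_gap):
    # never add active-test traffic while the user is doing something
    # network sensitive; otherwise, run an active test only where the
    # system is missing performance data for this location/network
    if user_activity in NETWORK_SENSITIVE:
        return "passive"
    return "active" if data_gap else "passive"
```

A real policy would likely also weigh battery state, network type, and the configuration pushed from the configuration server 124.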
- the user informed test results can be used to modify or dictate the hardware, software or implementation of the network 14 itself by informing the network's requirements based on the services and applications 38 being used by users and the actions they take.
- the system 18 described herein can therefore be used to perform user informed/dictated testing, that is, where the user does not specifically choose to run a network test.
- network tests are selected and initiated based on the actions performed by a user of a mobile device 12 which contains the network testing software (e.g., downloading a photo).
- the details of those actions performed by the user can be used as an input into the analysis of the results (e.g., a network's ability to serve a photo).
- the action performed by the user is something that is not the user choosing to run a network test.
- the above-described systems and methods contemplate tracking mobile devices 12 as they access and make use of wireless networks 14. These mobile devices 12 and their users can be identified and tracked on a day-to-day basis in various ways, including:
- Each device tracking approach has its own privacy implications which typically need to be considered and managed. That is, a selected tracking approach would normally need to be both acceptable to the mobile device user and compliant with applicable legal requirements.
- the system 18 may be used to inform wireless service providers about user churn. For example, if an application ID is used to log-in on a phone on a first network 14a one day, and then later the same application ID is used to log-in on a phone on a second network 14b, then it can be reported that this user likely churned. That is, in this case it can be expected that this user left the first network 14a and became a customer on the second network 14b.
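The application-ID-based churn inference can be sketched as a scan over log-in events. The event format below is an illustrative assumption:

```python
def detect_churn(login_events):
    # login_events: iterable of (app_id, network, day) in chronological
    # order per app_id. A user last seen on one network who then logs in
    # on another is flagged as a likely churn event.
    last_network = {}
    churned = []
    for app_id, network, day in login_events:
        prev = last_network.get(app_id)
        if prev is not None and prev != network:
            churned.append((app_id, prev, network, day))
        last_network[app_id] = network
    return churned
```

A production system would add safeguards (e.g., ignoring brief roaming episodes) before reporting a churn event to a wireless provider.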
- Such churn reporting on its own provides a valuable service to wireless providers. However, this reporting becomes even more powerful when combined with other data sets to enable predictive capabilities which create the possibility of advertising to influence churn.
- this historical network churn information when combined with other information sets such as wireless network coverage, wireless network performance, website cookies, recent searches, mobile device hardware/software, user network subscription plans, what people are saying about the wireless network operator on social media, and other information sets, can be used to perform churn prediction on individual users or on large aggregate portions of the population.
- the same mobile IDs can be used to target specific users or IDs with appropriate advertisements.
- the system's wireless network performance tests can be used to compare networks and inform targeted advertising campaigns. If the second network provider discovers that they are the best wireless network in a specific city they could adjust their advertising to devices in that city to promote their network as being the highest performer. It is then possible for mobile applications 38 and services to suggest wireless operators to their users. Users may opt-in to allow a wireless service, such as Facebook, to track network performance, their usage patterns, and location and then suggest to them the best wireless network 14 for their requirements.
- the system 18 may track which groupings of mobile devices 12 tend to show up on specific networks 14. For example, if the same four mobile devices consistently access the same WiFi access point, or access networks via the same IP address, it is reasonable to assume that this is a family unit or associated group. If suddenly one of those devices 12 leaves that grouping and a new device 12 appears which is authenticated with a different cellular wireless network 14 it can be reasonably assumed that there has been a network churn event by the user of that newly appearing device.
- tracking one or more IDs associated with a user or device 12, and obtaining access to or otherwise tracking user-related events such as social media posts, can enhance churn identification and churn reporting and/or targeted advertising.
- the system 18 can be adapted for such churn prediction by tracking a user as they move across networks 14 and across mobile devices 12 using their social media log-in IDs, such that an analysis of network/device churn can be performed.
- Wireless network performance tracking by the system 18, which can be performed by crowdsourcing from mobile end points as described above, can also be used to determine which areas, users, or services are being throttled; as well as which areas, users or services are being provided with enhanced levels of service.
- Identifying and comparing low performance and high performance cases can be used in a variety of ways, for example:
- the system 18 can therefore be adapted such that the network test results or service quality is compared against a threshold of quality dictated by a wireless regulator or home network provider to see if requirements are met.
- Network quality and coverage is often considered critical to certain emerging cyber-physical domains such as self-driving vehicles and ehealth.
- the end mobile device 12 has a core purpose, which is network sensitive. It is important that these devices 12 maintain access to network quality that is good enough to meet their core purpose requirements. For example, an ehealth device designed to inform hospitals of heart attacks should be able to send a message to hospitals or emergency dispatchers when a heart attack is detected.
- Network testing capabilities for these devices 12 may then be considered critical to their performance, with tests being triggered by events which are inherent to the device's core purpose.
- a self-driving vehicle or vehicle network may choose to run tests whenever vehicles need to perform emergency maneuvers (e.g., avoid an animal or other obstruction on the road) to track the performance of these maneuvers.
- the vehicle grouping may run tests only in cases when it is known that there are portions of the road or route where network performance information is lacking.
- a network testing system can have its tests triggered by external events.
- the resulting network dataset can be combined with information about the cyber-physical device's operation and requirements to determine if the network 14 is adequate for that cyber-physical device's requirements.
- an e-health device 12 may perform event-driven tests on the network 14 to ensure that the network 14 is performing well enough to handle the network requirements of an emergency situation (and that the device is connected to the appropriate server).
- Example events in this case may be: 1) the user is sleeping or in no immediate health danger; 2) the user's health readings are reaching dangerous levels which could get worse; 3) the user is in danger.
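One way to act on these three event levels is to map each to a testing cadence; the state names and intervals below are purely illustrative:

```python
# hypothetical mapping from patient state to network-test interval
TEST_INTERVAL_S = {
    "stable": 3600,        # user sleeping / no immediate danger
    "deteriorating": 300,  # readings approaching dangerous levels
    "emergency": 10,       # user in danger: verify the link continuously
}

def next_test_interval(state):
    # unknown states fall back to the least aggressive cadence
    return TEST_INTERVAL_S.get(state, 3600)
```

The point of the escalating cadence is that by the time an emergency message must be sent, the device already knows whether the network 14 can carry it.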
- the devices 12 are in a great position to map network quality across huge areas and therefore may be relied upon or otherwise play an increased role in future network testing. It can also be appreciated that vehicles are not just limited to automobiles, and may include drones or other autonomous devices.
- the mobile devices 12 used to perform network testing typically need to have the ability to preserve user privacy to degrees that are informed by the user themselves. For example, if a user inputs that they either opt-in or opt-out of the service, or portions of the service, the overall system should be responsive to that input and adjust what is collected accordingly. The analysis and handling of that data should also be informed by those same user inputs.
- the system 18 can also be adapted to ensure that it is capable of consuming information about the jurisdictional and geographic difference in privacy rules and be responsive to those rules.
- a global testing system may perform differently in Russia than in the European Union depending on the current governing privacy legislation in both areas.
- the system 18 orchestrates the tests performed amongst the full network of testing end points to preserve the privacy of users.
- the system 18 may choose to distribute the tests amongst the mobile devices 12 in such a way that makes it even more difficult to track the movement or characteristics of a specific device 12.
- for tests taken in a low-population area, the system 18 can be configured to handle that data differently, or to not collect data from that area, since it would be easier than normal to associate those tests with the person or persons known to live in or access that area.
- Multiple-input multiple-output (MIMO) and SON systems 22b may have a multiplicity of channels available, each of which can be evaluated. Also, MIMO and SON systems 22b can use beamforming to broadcast specific channels and network resources to specific mobile devices 12, namely based on their unique requirements. As a result, each user in the network 14 can be experiencing something completely different, such that the importance of crowdsourcing network quality increases.
- Information crowdsourced from the mobile devices 12 themselves can ultimately be used to inform the network 14 about the network characteristics which are required to be broadcasted to each mobile device 12 and how this beamforming needs to take place (generally based on the application being used or subscription tier of the user).
- the mobile device's application and network experience information (crowdsourced via the system 18) can be used in a feedback loop to inform the waveforming and beamforming processes.
- beamforming allows every user to get access to different network characteristics.
- network crowdsourcing as herein described.
- the network testing/monitoring agent e.g. the WDS 40
- the network testing/monitoring agent can be used to detect/identify compromised mobile devices 12. For example, if the WDS 40 normally sees that a mobile device 12, or an IoT device 12, only uses 2MB/day of data, and that suddenly jumps to 100MB, the system 18 can be used to identify this abnormal network behaviour and flag the device 12 as possibly being compromised.
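- A minimal sketch of such anomaly flagging, assuming a simple per-device baseline of daily usage (the function name and the three-sigma threshold are illustrative choices, not taken from this disclosure):

```python
# Illustrative sketch: flag a device whose daily data usage deviates
# sharply from its own historical baseline.
from statistics import mean, stdev

def is_usage_abnormal(history_mb, today_mb, min_sigma=3.0):
    """Return True if today's usage is more than `min_sigma` standard
    deviations above the device's historical daily mean."""
    if len(history_mb) < 2:
        return False  # not enough history to judge
    mu = mean(history_mb)
    sigma = stdev(history_mb)
    if sigma == 0:
        return today_mb > mu * 2  # flat history: flag a large multiple
    return (today_mb - mu) / sigma > min_sigma
```

For example, an IoT device averaging about 2MB/day that suddenly reports 100MB would be flagged for review.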
- Abnormal Access Point Behavior: It is recognized that adversaries are beginning to use rogue access points and fake cell towers to lure mobile devices 12 into connecting. They can then monitor the traffic over the network 14 or use these malicious connections to install malware.
- the system 18 can also be used to identify abnormal access point behaviours. For example, if users are accessing the same access point from various locations, then that access point may be a rogue access point which is being driven around luring connections. Alternatively, if the cell tower ID, or some other identifier of a cell tower, or a cell tower's characteristics suddenly change, it can be flagged as possibly being a false tower made to appear similar to the non-malicious access point.
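- The location-based check described above can be sketched as follows; this is an assumption-laden illustration (the 1 km radius and function names are invented here), not the claimed implementation:

```python
# Illustrative sketch: flag an access point whose identifier (e.g.,
# BSSID) is reported from locations too far apart to be one fixed
# transmitter -- a sign it may be driven around luring connections.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def looks_like_rogue_ap(sightings, max_radius_km=1.0):
    """`sightings` is a list of (lat, lon) reports for one access
    point identifier. A fixed access point should stay within a small
    radius; widely spread reports suggest a mobile rogue AP."""
    for i in range(len(sightings)):
        for j in range(i + 1, len(sightings)):
            if haversine_km(*sightings[i], *sightings[j]) > max_radius_km:
                return True
    return False
```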
- the system 18 can therefore be adapted such that the performance and details of mobile devices 12 and network access points are compared against the expected details/performance to search for network issues and compromised systems.
- Certain networks are not intended to be seen outside of specific geographic areas and certain facilities.
- the system 18 can report if certain networks 14 are seen where they should not be seen.
- the network tests described above can be used to report the performance or likely performance of network applications 38 such as Skype, YouTube, Netflix, etc. without ever interacting directly with the proprietary servers used by those applications. Instead, the network requirements of those applications 38 are understood and compared against the network characteristics being observed and collected by the network testing agent (e.g., WDS 40) in order to report on application performance.
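- As an illustrative sketch of this comparison, assumed per-application requirement thresholds (invented here for illustration; real values would come from profiling the applications 38) can be checked against the crowdsourced measurements, without any traffic to the applications' own servers:

```python
# Illustrative sketch: compare observed network metrics against an
# application's assumed requirements. The threshold values below are
# invented for illustration, not taken from any vendor specification.
APP_REQUIREMENTS = {
    # app name: (min download Mbps, max latency ms, max packet loss %)
    "video_streaming_hd": (5.0, 100, 1.0),
    "voip_call": (0.5, 150, 2.0),
}

def predict_app_experience(app, download_mbps, latency_ms, loss_pct):
    """Return 'good' if every observed metric meets the app's assumed
    requirement, else 'degraded'."""
    min_dl, max_lat, max_loss = APP_REQUIREMENTS[app]
    ok = download_mbps >= min_dl and latency_ms <= max_lat and loss_pct <= max_loss
    return "good" if ok else "degraded"
```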
- the system 18 can therefore be configured such that the results are used to report the performance or likely performance of network applications 38.
- the above-described crowdsourcing can provide alarms to network operators indicating specific areas or network access points which are providing less than optimal performance. These alarms and this information can be used to inform network maintenance or indicate which areas of a network 14 require additional testing by other methods.
- the system 18 can therefore be configured such that the performance and details of mobile devices 12 and network access points are compared against the expected details/performance to search for network issues and compromised systems.
- the system 18 can pinpoint areas with large foot traffic or population densities that are also underserved by wireless service providers. These are the areas where network improvements are expected to provide the largest gains to the overall subscriber base. By comparing this performance to that of competitors, the system 18 can suggest areas where the network operator should focus to be more competitive and perform better customer acquisition.
- the system 18 can therefore be configured such that the results are used in conjunction with user density information collected from the system 18 or external sources to inform a network operator on the most beneficial location for network maintenance, expansions, and upgrades.
- the system 18 can be used to inform a network operator on: 1) what new towers or technologies are being implemented by competitors; 2) which network operators are gaining the most subscribers and where; 3) what types of applications/services competitive networks are running and how that is changing over time; and 4) the performance of competitive networks and how that is evolving over time.
- the system 18 can therefore be configured such that the results are used to inform a wireless operator on the performance being delivered by their competitors to their competitors' subscribers and in which the new network implementations/alterations of competitors are recorded, predicted, and reported.
- the system 18 can also be configured to interact with a device connection management platform (not shown), as may be provided by a mobile phone operating system, or as may be controlled by the network operator, to help a mobile device 12 select an appropriate network 14 or access point connection.
- the data collected by the WDS 40 is transmitted, either in its raw form or after an analysis of the data, to the connection management platform via an API for use in the network or access point selection process.
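- Purely as an illustration of how a connection management platform might consume such data, candidate access points could be scored from aggregated WDS 40 metrics (the weights and field names below are invented for this sketch):

```python
# Hypothetical sketch: rank candidate access points using crowdsourced
# quality metrics so a connection manager can pick the best option.
def score_access_point(ap):
    """Higher is better. `ap` carries aggregated metrics reported by
    devices that recently used this access point. Weights are
    illustrative only."""
    return (ap["avg_download_mbps"] * 2.0
            - ap["avg_latency_ms"] * 0.05
            - ap["packet_loss_pct"] * 5.0)

def select_best(candidates):
    """Return the SSID of the highest-scoring candidate."""
    return max(candidates, key=score_access_point)["ssid"]
```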
- the system can also benefit from the use of Artificial Intelligence (AI) and Machine Learning (ML) in addition to data analysis.
- Data reported by the WDS 40 may be input to AI and ML platforms (not shown) for processing into enhanced information to be used by network operators for purposes such as network planning, network maintenance, customer care, customer advertising, and operations.
- this enhanced information may be input to SON, software defined network (SDN), network function virtualization (NFV), or MIMO systems such that the network 14 can be responsive to this enhanced information produced by AI and ML processes run on the data supplied by the WDS 40. Groups other than network operators may similarly benefit from the enhanced information produced by AI and ML applied to the WDS test data.
- any module or component exemplified herein that executes instructions may include or otherwise have access to computer readable media such as storage media, computer storage media, or data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape.
- Computer storage media may include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data.
- Examples of computer storage media include RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by an application, module, or both. Any such computer storage media may be part of the system 18, any component of or related to the system 18, etc., or accessible or connectable thereto. Any application or module herein described may be implemented using computer readable/executable instructions that may be stored or otherwise held by such computer readable media.
Description
- The following relates to a method and a computer readable medium for evaluating wireless device and
wireless network performance, and wireless network usage trends. - The number of wireless devices that are accessing wireless communication networks is continually growing. These devices may access the various networks via cellular, WiFi and other access points. As the number of devices grows, the strain on these networks grows, affecting the performance of both the networks and the devices.
- In order to address the performance of wireless devices and wireless networks, network service providers, device manufacturers, application developers and other entities that have a stake in affecting such performance require performance and usage data. Various techniques exist for collecting and evaluating performance and usage data, for example, standalone on-device applications or modules that perform periodic testing. Wireless carriers may also have native applications that have access to certain performance data that can be evaluated. However, these techniques can be either intrusive to the devices and users of those devices, or be limited to the type of network and/or type of device and/or type of user onto which the applications or modules are deployed.
- In
EP 2 378 712 A1 - The invention sets out a method of evaluating wireless device and/or wireless network performance and/or wireless network usage trends according to
claim 1. The invention also sets out a computer readable medium comprising computer executable instructions for performing the method according to any one of claims 1 to 14. - The following provides a system and method that enables wireless device and wireless network performance to be evaluated by embedding wireless device software in a plurality of applications (or operating systems) deployed and running on a plurality of wireless electronic device types and across a plurality of network types, to enable an aggregation of a more comprehensive collection of data. This allows a larger and more meaningful data set to be created for subsequent analyses and reporting. The aggregated data set(s) can be used to provide raw data, reports, and dashboard-type interfaces to third parties, and/or prepare and send feedback data to the wireless device software to control testing behaviour and, if desired, to control the amount and type of data that is collected.
- The feedback data can be used for many different operations, including to adapt and improve performance of the application and/or device, taking into account data acquired and aggregated from a multitude of applications, devices, and networks. The raw data, reports, and dashboard interfaces can be used to enable network carriers or service providers, device manufacturers, game and/or application developers, and other interested parties to perform actions based on a more complete data set, for example, for device and network benchmarking, mobile advertising, application traction and popularity, investment decision making, quality of experience (QoE), network planning, etc.
- In one aspect, there is provided a method of evaluating wireless device and/or wireless network performance and/or wireless network usage trends. The method comprises providing wireless device software to each of a plurality of wireless electronic devices connected to one or more of a plurality of networks by having the wireless device software embedded in the corresponding electronic device, wherein the wireless device software is embedded in or operable with a plurality of types of applications and performs at least one test associated with characteristics and/or location of the device, and/or performance of the device and/or the network, and/or usage of the device by a user; receiving via one or more collection servers, test data obtained by the wireless device software of each of the plurality of wireless electronic devices; aggregating the received data; and storing and outputting the aggregated data.
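- The receiving and aggregating steps of the method above can be sketched in simplified form (the function name, record fields, and grouping key are illustrative assumptions only, not a schema from this disclosure):

```python
# Simplified sketch of the aggregation step: heterogeneous test
# reports are grouped by (operator, region) and reduced to a
# per-group average, standing in for rule-driven aggregation.
from collections import defaultdict

def aggregate_reports(reports):
    """`reports` are dicts with 'operator', 'region', and
    'download_kbps' keys. Returns {(operator, region): mean download
    throughput} for storage and output."""
    groups = defaultdict(list)
    for r in reports:
        groups[(r["operator"], r["region"])].append(r["download_kbps"])
    return {key: sum(vals) / len(vals) for key, vals in groups.items()}
```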
- In other aspects there are systems and computer readable medium configured or operable to perform the method.
- Embodiments will now be described by way of example with reference to the appended drawings wherein:
-
FIG. 1 is a schematic block diagram of a wireless communication environment that includes a number of network and device types; -
FIG. 2A is a block diagram of a configuration for a wireless device; -
FIG. 2B is a block diagram of another configuration for a wireless device; -
FIG. 3 is a block diagram of a configuration for collecting data from mobile devices using wireless device software (WDS); -
FIG. 4 is a block diagram of a configuration for collecting data from mobile devices using WDS with data processing, data distribution, and device software support; -
FIG. 5 is a block diagram illustrating the configuration shown in FIG. 4 for a plurality of devices and a plurality of third party systems; -
FIG. 6 is a schematic diagram illustrating a configuration for a data processing module to perform data aggregation from a plurality of devices; -
FIG. 7 is a schematic diagram illustrating a configuration for a data distribution module to perform feedback analytics for distribution to devices; -
FIG. 8 is a block diagram illustrating additional detail for the configuration shown in FIG. 4; -
FIG. 9 is a block diagram illustrating a configuration in which a feedback server is used to communicate feedback data to mobile devices; -
FIG. 10 is a flow chart illustrating computer executable instructions performed in aggregating data from a plurality of devices; -
FIG. 11 is a flow chart illustrating computer executable instructions performed in analyzing and distributing feedback data; -
FIG. 12 is a screen shot of an example user interface for displaying network coverage availability for a region; -
FIG. 13 is a screen shot of an example user interface for displaying a best provider map; -
FIG. 14 is a screen shot of an example user interface for displaying download performance and performance change for a region; -
FIG. 15 is a screen shot of an example user interface for displaying network quality of service (QoS) for highways in a region; -
FIG. 16 is a screen shot of an example user interface for displaying network statistics for all operators in a selected geography; -
FIG. 17 is a screen shot of an example user interface for performing web-based network analyses; -
FIG. 18 is a screen shot of an example user interface for displaying macro-level trends; and -
FIG. 19 is a screen shot of an example user interface for displaying micro-level network performance details. - The following provides a system and method that enables wireless device and wireless network performance and wireless network usage trends to be evaluated by embedding wireless device software in the background of a plurality of applications (or operating systems) deployed and running on a plurality of device types and across a plurality of network types, to enable an aggregation of data types for the analysis and reporting of a more meaningful dataset.
- It has been found that by crowdsourcing data from a plurality of applications on a plurality of device types in a plurality of network types, a larger and more meaningful data set is obtained, not only for performing analytics on devices, applications, and networks, but also for providing feedback to such applications, devices, and networks to modify and/or improve testing behaviour. Moreover, the aggregated data set(s) can be used to provide raw data, reports, and dashboard-type interfaces to third parties. In this way, data such as network quality of service (QoS), app and device data that is crowdsourced from mobile devices can be used to: determine and illustrate the customer's perspective of a network; show device and application usage data; deliver insights that are immediately actionable; and test various parts of device and network performance that are useful in various ongoing applications.
- Turning now to the figures,
FIG. 1 illustrates an example of a wireless environment 10 which includes a number of wireless networks 14 that can be of different types, e.g., different cellular network types (2G, 3G, 4G, etc.). The different network types can also include other types of wireless networks such as WiFi networks accessible through available WiFi access points. Within the wireless network environment 10, various electronic communication devices 12 having wireless capabilities operate by connecting to one or more of the different networks/network types 14 and/or directly with each other over peer-to-peer or mesh-type networks 14. As illustrated in FIG. 1, various types of electronic devices 12 are configured to connect to and utilize one or more wireless network types 14, therefore providing a heterogeneous network environment 10 for which performance can be monitored, measured, analyzed, feedback provided, and operability adjusted as herein described. - In order to obtain a more meaningful data set and to provide data analytics on a more
representative environment 10, each of the devices 12 includes a software functionality (described below) that is capable of performing tests, monitoring existing device operations and usage, and otherwise collecting data on or related to one or more applications on that device 12. By partnering with publishers of various mobile apps (e.g., games), the software functionality can be distributed to millions of mobile devices 12 to anonymously collect QoS, device and app usage data, among other things. Various partnership arrangements can be implemented, such as 1) revenue sharing on raw data and/or report sales, e.g. providing insights by way of analysis or convenient presentation of information for subsequent analyses; 2) payment of upfront fees to app/game developers; 3) providing app/game developers with useful reports to encourage adoption of the software functionality; etc. It can be appreciated that partnerships would not necessarily be required in order to deploy the software functionality on devices by distribution through existing channels, apps, OS, etc. - It can be appreciated that such software functionality can also be integrated with carrier apps or other purpose-built apps to be used on a number of
mobile devices 12. The software functionality can be embedded in apps and games running on various mobile platforms/OS such as Android, iOS, Windows, etc.; as well as other mobile platforms such as those used for wearables, gaming, vehicle systems, wireless sensors, etc. That is, any other device/platform with location-tracking capabilities (e.g., GPS, network-based location, etc.) and network (e.g., Internet) connectivity with an ability to run the software functionality described herein is applicable to the data collection, analysis, and feedback/reporting mechanisms described herein. In some implementations, devices with only network connectivity and without location-based capabilities may also be incorporated into the system. - The
data 16 that is collected is preferably tied to a location or otherwise considered "location-based" such that the data 16 or information derived from the data 16 can be placed on a map. The data 16 is also preferably collected in an anonymous manner such that no personally identifiable information is collected and/or stored by the system 18. For example, the system 18 should be configured to not collect a device's advertiser ID, device ID, or other information that could be used in conjunction with another dataset to identify the user of the device 12. In one implementation, the software functionality described herein can be configured to generate and append a unique random number which is specific to the particular installation, and which is reset (e.g. regenerated) periodically (e.g., each day). This can be done to ensure that an adversary cannot observe data reported from one device over the course of several days to determine who that device may belong to. - The
data 16 can include, without limitation: device location, device manufacturer name, device model, OS name and version, network operator ID, % memory free, CPU utilization, battery drain rate, storage utilization (i.e. device metrics); application name, download bytes, upload bytes, first install time, last updated time (i.e. mobile application metrics); upload throughput, download throughput, latency, link speed, signal strength, jitter, packet discard rate, packet loss, # of radio frequency conflicts (i.e. network QoS metrics); BSSID, SSID, signal strength (i.e. Wi-Fi scan metrics); connection start/end times, connection type, technology, service provider, cell ID, LAC, MCC, MNC, DHCP response time (i.e. connection metrics); etc. - The collected
data 16 is fed to a central system 18 that includes modules and processes for collecting the data 16, processing and analyzing the data 16, generating feedback for the devices 12, and preparing user interfaces and reports therefor. It can be appreciated that multiple "central" systems 18 can be used, e.g., to comply with handling laws requiring that data from a particular jurisdiction be stored in that jurisdiction, etc. The data can be securely stored in cloud-based databases and securely transmitted via secure connections (e.g., HTTPS). The databases can be globally dispersed and can be configured to provide direct access to the clients of the system 18. - The reports and user interfaces are generated and made available to one or more
third parties 22. In FIG. 1, several examples of third party types are provided, including without limitation, network carriers (multiple different ones), game and/or application (app) developers, and other 3rd party systems such as data analytics firms, mobile advertising entities, investment and financial entities, etc. The reports and user interfaces can be provided using data visualization tools such as graphical reports, interactive dashboards, web tools, etc. Reports can be delivered on a periodic basis or in real time, with dashboards being available when needed online at any time. - The raw data and/or reports, dashboards and other user interfaces can be used for innumerable applications and use cases. For example, the
data 16 that is collected can be used for network planning, wherein since the system 18 observes potentially all networks 14, wireless service providers can interact with the system 18 to see how they are performing relative to competition in specific areas. This can indicate, for example, what the highest value areas are for improving a network, since the best new sites for network improvement are typically areas where there is a lot of network traffic and in which a network provider is being outperformed by the competition. The data can also be useful to wireless service provider customers who may wish to see how their service provider is performing relative to alternatives. - Another use case is for roaming monitoring, since network carriers typically lack knowledge regarding the quality of experience their users get when they roam onto other networks. The
system 18 configured as herein described can determine, for the devices 12 having the software functionality obtaining the crowdsourced data 16, whether or not data 16 is being collected from a roaming device 12, and if the device 12 is roaming, the system 18 can be configured to determine who the home service provider is from the data 16 that is collected. This tells the carrier if their roaming partners are performing well enough and who they should be choosing as their roaming partner in the future. - Self-Organizing Networks (SONs) can also benefit from the
data 16 that is crowdsourced by the system 18. SONs dynamically adjust the network, antennae, or wave forming characteristics in response to network quality. The idea is that the network is self-healing, such that if there is an issue, the network will adjust to eliminate the issue all on its own. SONs typically require access to large amounts of field data to operate, which can be satisfied with the large datasets that can be obtained using the system 18. - Other use cases include, without limitation: device and network benchmarking, in which the system reports on which networks, devices, cell towers, network equipment, operating systems, etc. perform best for consumers; and investment applications. Regarding informing investments, it can be appreciated that the
system 18 can determine what various mobile applications exist and are being used on mobile devices 12, and be able to determine on a day-to-day basis how much those mobile applications are being used. This allows correlations to be made between user activity (e.g., mobile shopping, browsing, etc.) and a company's performance and thus share value. Yet another use case can include reporting on mobile application usage trends and which apps are gaining or losing popularity with customers. A wireless service provider could use this information to predict the network impact of consumer application trends. A financial institution could use this information to make predictions related to the stock market. - Turning now to
FIG. 2A, an example of a configuration for an electronic device 12 is shown. The device 12 includes a processor 30, memory 32, and an operating system 42. The device 12 is also operable in this example to provide graphical user interfaces to a user via a display 34. For example, a visual component can be provided directly on the device 12 for displaying a portion of the information collected to the user, if desired by the user. The device 12 also includes one or more communication interfaces 36 that are operable to connect the device 12 to one or more networks 14. As also shown in FIG. 2A, the device 12 can include multiple apps 38, of different types as discussed above. In order to collect and send data 16 relevant across multiple apps 38 and app types, in this example configuration, each app 38 embeds the aforementioned software functionality, depicted as wireless device software (WDS) 40, that is embedded in, and runs in the background of, the app 38 to gather particular data, perform tests, etc. The WDS 40 is not only capable of accessing components on the device 12 such as the processor 30, battery (not shown) and OS 42; the WDS 40 can also be configured to either directly, or via the app 38 on which it resides, communicate on one or more networks 14 by interfacing with the one or more communication interfaces 36. - It can be appreciated that while in
FIG. 2A each app 38 includes an embedded instance of the WDS 40 for monitoring and testing the app 38 and/or device 12, the WDS 40 can be deployed in various other configurations. For example, FIG. 2B illustrates that the WDS 40 can instead (or in addition) reside in the OS 42 and centrally interact with a number of the apps 38. The WDS 40 may also reside as a stand-alone application or in another location or component of the device 12, as shown in dashed lines, with functionality to interact with a number (or all) of the apps 38. Similarly, one or more of the apps 38 can additionally have the WDS 40 reside thereon (also shown in dashed lines), e.g., apps 38 that need to have such operations controlled internally rather than being opened up to an external program, module or routine. The WDS 40 can therefore be installed in several different apps (i.e. in a weather app and then a totally different game) and these different apps could potentially be installed on the same phone or a multitude of different phones. This allows for the scenario wherein the WDS 40 is installed several times on the same phone (e.g., as illustrated), in which case the WDS 40 should identify that it is getting data from the same device 12. It can be appreciated that the WDS 40 can have a hardcoded limit on the number of tests that can be performed over a time period, which limits are unalterable by the configuration server. The WDS 40 can also be operable to identify its own code running in a different application on a same electronic device, and be responsive to identifying its own code running in the different application by having only one instance of the wireless device software operating at the same time. - A data collection configuration is shown at a high level in
FIG. 3. Each mobile device 12 that is configured to operate the WDS 40 (using one or more apps 38) provides data to a collection server 50 that is deployed as part of the system 18. The collected data 16 is processed as herein described, along with data 16 obtained from other devices 12, to generate information and data for third party systems 22. FIG. 4 provides further detail for the configuration shown in FIG. 3, in which the collection server 50 collects the data 16 and has the collected data processed by a data processing stage 52. The data thus processed is then provided to a data distribution stage 54 for distribution to the third party systems 22. FIG. 4 also illustrates that the data distribution stage 54 can enable the system 18 to provide feedback to the mobile device 12 by communicating with a device software support functionality 56 that is connectable to the WDS 40 to complete the feedback loop. By having the WDS 40 deployed in multiple different app types on multiple different device types operating with multiple different network types, not only can data be collected from a wider range of sources to provide a more meaningful and complete data set; a more comprehensive feedback network can be established, thus providing the ability to reach a wider range of devices 12. Such a feedback network can be used for various purposes, including to modify the behaviour of the WDS 40. -
FIG. 5 illustrates a configuration similar to that shown in FIG. 4, but illustrating the collection of data 16 from multiple devices 12 via multiple WDSs 40. As shown in FIG. 5, the plurality of mobile devices 12 shown can be served by a common device software support entity 56 and can provide data 16 to a common collection server 50. The system 18 may employ multiple regional collection servers 50 and device software support entities 56 as needed, and thus the example shown in FIG. 5 is illustrative only. - On the data collection side,
FIG. 6 illustrates operations that can be performed in the data processing stage 52 to collect and aggregate the data 16 that is received from potentially a multitude of different types of sources. As illustrated in FIG. 6, since the data 16 originates from different apps 38 on different device types operating across different network types, while the data may be collectively relevant, it is not necessarily homogeneous. The collected data is therefore aggregated, in this example using a data aggregation module 60 that can utilize rule set(s) 62 or template(s) or other data structure(s) defining how to meaningfully aggregate the data for subsequent analysis, by generating one or more aggregated dataset(s) 66 that can be stored within the data processing stage 52 or elsewhere within or accessible to the system 18. The data aggregation module 60 can also utilize any other 3rd party metadata 64 from third party data sources that is useful in aggregating and analyzing the data 16. -
FIG. 7 illustrates at a high level the use of the aggregated dataset(s) 66 to perform subsequent analyses using a feedback analytics module 70. This allows the analysis to be performed on data that has been aggregated or "stitched" across the data types, app types and network types to provide more meaningful feedback 72 that can be tailored according to different devices 12, different apps 38, utilization in different regions, on different networks or network types, etc. - Further detail concerning the functional blocks shown in
FIGS. 4 and 5 is provided in FIG. 8. Beginning with the mobile device 12, the WDS 40 in this example is embedded in a mobile app 38 and includes a software interface 84 for interfacing between the app 38 and a software controller 86 for controlling the tests and other operations of the WDS 40. The WDS 40 also includes a test data storage 88 for storing data acquired during the tests, and a testing SDK 90 for performing one or more particular tests that involve operation of the app 38 and/or the device itself via the device OS 42. The WDS 40 also includes a utilities SDK 92 that includes methods, functions, and APIs that can be used to pull data and information from the device OS 42. Such methods can be used to export data to the collection server 50. - The
SDK 92 is also operable to communicate with the collection server 50. The collection server 50 includes a reporting server 96 for receiving test and any other data being reported by the WDS 40, and a reporting database 98 for storing the test data for use by the data processing module 52. - The
data processing module 52 includes a central data services (CDS) server 100 that provides data source APIs for different third party data sources and metadata. The CDS server 100 can also provide local storage for quick responses to the data aggregation operations. The CDS server 100 also interfaces externally with the one or more third party data sources 64 and internally with the data aggregation module 60 discussed above. The data aggregation module 60 obtains (i.e., pulls, requests or otherwise receives) the data collected by the collection server 50. The data aggregation module 60 also performs aggregation of the various data and data types and stores the aggregated data in a reports database 104 to be accessed by a report generation module 106 for generating various types of reports, dashboards, etc. It can be appreciated that data can also be pulled in from third party data sources and not only the collection server. For example, external databases can be pulled in that help translate latitude and longitude into the names of the cities where the data was collected. - The
report generation module 106 can generate various types of data for distribution to third parties 22 as shown in FIG. 8. For example, the report generation module 106 can generate reports 110 and/or dashboards 112, and can prepare raw data 114 to be analyzed elsewhere. The report generation module 106 can also prepare feedback data 116 to be sent to the device support software 56; in this example configuration, to a feedback server 126 that is part of such device support software 56. - The
device support software 56 can include various servers that can communicate with and control, monitor, update, fix, kill, or otherwise interact with the WDS 40 in the various devices 12. In this example, the device support software 56 includes the feedback server 126 mentioned above, as well as a configuration server 124 for managing the configurations for the WDS 40, and an authentication server 122 for authenticating the WDS 40 to ensure that it is from an appropriate app and app developer. The device support software 56 also includes a testing server 120 for interacting with the testing SDK 90 for providing and updating/configuring tests and test sets to be performed by the WDS 40. - The
WDS 40 can be configured as a software library that is embedded in the mobile device apps 38 in order to report and integrate with the collection server 50 and data processing module 52. The libraries of the WDS 40 can be added to an existing application to collect device, connection, network QoS, Wi-Fi, and application key performance indicators (KPIs). It can be appreciated that using this over-the-top approach only requires the WDS 40 to have the ability to communicate with the system 18 over a network connection, for example, either on Wi-Fi or mobile. This allows for the flexibility of deploying through a cloud infrastructure anywhere around the world. As shown in FIG. 8, the WDS 40 interacts with the device software support entity 56, which can include different servers with which the WDS 40 can communicate during its operation. The example configuration shown in FIG. 8 includes servers responsible for authentication and initiation (authentication server 122), configuration (configuration server 124), testing (testing server 120), and reporting (reporting server 96) that communicate with the WDS 40. The authentication server 122 can be used to dictate which application programming interface (API) keys and apps 38 are allowed to operate and collect data through the WDS 40. The configuration server 124 can be used to set specific rules and parameters for the operation of the WDS 40. The WDS 40 can also use testing servers 120 to perform active tests on the connected network 14. The reporting servers 96 are used to upload the data payloads from the WDS 40 to the system 18. - As indicated above, the
authentication server 122 can be used to verify that applications 38 are using the correct API key for each developer, and to provision each app with a unique deployment key. Each application developer can be assigned an API key, which is used to generate a unique deployment key for each application 38. This deployment key is used to control the configuration of the WDS 40, as well as track the data collected by each application 38. - The
authentication server 122 can also check that the app 38 has not been registered with the system 18 previously. This ensures that the data collected through the WDS 40 is associated back to the correct application 38 and developer, e.g., to account for revenue sharing. The authentication server 122 also allows the control of shutting down specific applications or developers from collecting data at any time, e.g., for implementing a "kill switch". - The
WDS 40 can be configured to check with the authentication server 122 on first initialization of the WDS 40, and periodically (e.g., every few days) following initialization. This allows the authentication server 122 to shut off any application 38 from collecting data 16. All communication and data transferred between the WDS 40 and the authentication server 122 are preferably secured and encrypted. For example, the WDS 40 can be given a three-day local cache on the device 12 so that the WDS 40 does not check in with the authentication server 122 on every initialization, preventing extra traffic or chattiness over the network 14. - The
testing servers 120 are used to perform active tests on a network 14 through interaction with the WDS 40. The testing servers 120 can host various files of different sizes for performing download throughput tests. For upload throughput tests, the testing servers 120 can provide an un-throttled bucket to upload files of any size. Furthermore, the testing servers 120 can also echo back packets for the corresponding communication protocol (e.g., UDP packets) sent from the WDS 40 for server response tests. Multiple testing servers 120 can be set up as necessary around the world. The testing servers 120 can be deployed on a cloud or on-premises hosting environment. The WDS 40 determines which server 120 to use for performing active tests by choosing the most appropriate server 120 based on the device's geographic location. For example, the closest route may require using undersea cable whereas a server slightly farther away may be able to make use of faster land-based cable (i.e., to account for more than just geographical proximity). The testing servers 120 used by the WDS 40 can be configured through the configuration server 124. All communication and data transferred between the WDS 40 and the testing servers 120 are preferably secured and encrypted. - The configuration server 124 is designed to allow full control over the
WDS 40. The configuration server 124 allows the system 18 to adjust data collection frequencies, data reporting frequencies, and the types of data being collected for devices 12 out in the field. Each WDS deployment can be assigned a unique deployment key, used by the WDS 40 to periodically check what data collecting/reporting behaviors the WDS 40 should be adhering to. This allows the dynamic adjustment of the WDS 40 performance to fine tune battery consumption, network chattiness, and other parameters. - A configuration profile held by the configuration server 124 is downloaded to the
WDS 40 upon the initialization of the WDS 40. For example, the configuration server 124 may hold a new policy that says "Do not collect data in Country X". That new policy, or that new profile for data collection, would be downloaded and executed by the WDS 40. A new configuration profile is pulled to the WDS 40 on a specified frequency. The WDS 40 can also keep a local cache (e.g., three days) of the configuration server 124 on the device 12, to prevent the WDS 40 from pulling configurations from the configuration server 124 too frequently. All communications and data transferred between the WDS 40 and the configuration server 124 are preferably secured and encrypted. - The configuration file/data can be signed by the service with a known, trusted security certificate. The signature is passed down with the configuration server's configuration where it is verified in the
WDS 40 on the device 12. The WDS 40 may then try to match the signature on the server configuration with one generated locally on the device 12 using the same certificate as the server side. If the signature generated on the WDS 40 does not match the one provided by the configuration server 124, the WDS 40 can be configured to reject the configuration and continue to use the previous configuration, or a default. This co-signing verification between the server 124 and WDS 40 ensures that the configuration is not compromised. Compromising the configuration supplied to the WDS 40 can have varying degrees of impact on the user device, the amount of data used, the battery impact, etc. - With the configuration shown in
FIG. 8, the following process flow can be implemented. The WDS 40 can initialize by checking with the authentication server 122 whether or not to run. The WDS 40 then pulls a configuration file from the configuration server 124 to direct the operation of the WDS 40. Data is then collected by the WDS 40 by interacting with the device OS to capture various KPIs about the device, network connection, network QoS, WiFi scan information, and application data usage, etc., as discussed herein. The WDS 40 can also perform network performance tests against the testing server(s) 120. - Data is collected by the
WDS 40 and stored in a database (e.g., SQLite) over a particular time period, e.g., a 24-hour period. The database is then exported to the reporting server(s) 96. The reporting servers 96 can parse through the database to split the data into different tables, e.g., within BigQuery. In this example, the data is stored in various BigQuery reporting tables depending on the type of data. On a periodic basis, e.g., hourly, dataflow jobs can be run to add additional metadata to the raw data uploaded from the WDS 40. This metadata includes tagging the raw data with country, region, and city metadata, etc. Once the data is processed by the dataflow jobs, data is made available in various tables and views. These tables and views allow raw data export or building visualizations and standard reports with other tools as herein described. It can be appreciated that standard reports, custom reports, customer dashboards, and raw data can all be made available through a combination of custom reports and dashboards or through different views and exports from the tables (e.g., from BigQuery). - As illustrated in
FIG. 9, the collection server 50 is configured to collect data from multiple mobile devices 12 by having the reporting server 96 interfaced or otherwise communicable with the WDS 40 in each of the multiple devices 12. It can be appreciated that while the collection server 50 can communicate with multiple devices 12, the wider system can include multiple collection servers 50, e.g., regionally placed, each collection server 50 being capable of communicating with the data processing module 52. FIG. 9 also illustrates that the feedback data 116 generated by the report generation module 106 can be provided to multiple different third parties 22 in addition to the feedback server 126. The feedback server 126 can be configured to communicate with multiple mobile devices 12 via the respective WDS(s) 40. -
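The parsing step described earlier, in which the reporting servers 96 split an uploaded payload into per-type tables before loading them (e.g., into BigQuery), can be sketched as follows; the "type" field and the table layout are illustrative assumptions rather than the actual upload schema:

```python
def split_into_tables(payload):
    """Partition one uploaded WDS payload into per-type row groups,
    mirroring the reporting servers' split into separate tables."""
    tables = {}
    for row in payload:
        tables.setdefault(row["type"], []).append(row)
    return tables
```

Each resulting group would then be appended to the corresponding reporting table for that data type.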
FIG. 10 illustrates data flow in gathering, aggregating, and analyzing data from mobile devices 12 for preparing and providing reports and/or raw data. At step 200 the mobile application (or operating system, etc.) that contains the WDS 40 initiates the WDS 40 to begin test data collection on the mobile device 12 at step 206. It can be appreciated that, as shown in FIG. 2, the OS 42 or other components of the device 12 can be used to initiate the WDS 40 to begin the data collection. The data collection at step 206 is performed based on network tests performed in connection with the device support software 56 at step 202 and by communicating with the device OS 42 at step 204. The collected data is stored at step 208 and uploaded to the system 18 at step 210. The uploaded data is collected and aggregated at step 212 and stored at step 214 in the reporting data storage as noted above. The aggregated data can be correlated in various ways at step 216 by referencing third party data sources 82 in order to generate and store reports data at 220. This enables the various data reports to be provided at step 222. - The data can be aggregated at
step 212 by adding the uploaded data to a large set of tables, e.g., split by day. The large set of tables can then be queried according to certain variables. In one configuration, data for all apps 38, devices 12 and networks 14 can be placed in the same data storage, and can be grouped in various ways depending on what is meant to be shown in the reports, dashboards, etc. - The data is analyzed in various ways. For example, the data can be broken down by country, region, city, etc.; as well as by time periods (e.g., month). Custom groupings can also be performed by network type (2G vs 3G vs 4G) and statistics determined and displayed for those groupings. Custom groupings can also be performed to determine application package names and application names. It can be appreciated that determining application package names is non-trivial since a single application can have multiple packages as part of its installation, and also different names in different languages. The
system 18 is configured to coalesce the packages to obtain a single-language list of app names and their associated package names (since package names are globally unique). Custom groupings can also be prepared for service providers based on mobile country codes (MCCs) and mobile network codes (MNCs). This allows brands to be matched up with operators for a given network 14, rather than relying solely on the network 14 reported by the device 12 (e.g., since there may exist a roaming situation or other scenario where the provider listed by devices 12 may be inconsistent). - The
system 18 can therefore combine the uploaded data from a multitude of different mobile applications 38 and deployments from a multitude of devices in various networks, regions, etc. The system 18 is also able to pull additional metadata 64 from several other third parties and open data sources 82. The system 18 can output raw data files as well as make data available for visualizations through user interfaces (e.g., dashboards). - For example, a set of the dataflow jobs can be used to add
additional metadata 64 to the raw data being uploaded from the WDS 40. These dataflow jobs can be performed periodically, e.g., hourly on the last hour of data upload from the WDS 40. The results can then be grouped into daily tables at a particular time, e.g., GMT midnight, for querying. - The following is a summary of the processes that can take place throughout the dataflow jobs:
- 1. For many fields, enumerators can be used in the
WDS 40 for simplicity and for reducing the amount of data uploaded. The dataflow jobs can be used to swap out the enumerations for human-readable strings. - 2. Country, region, and city tags can be added to the data based on the reported latitude and longitude.
- 3. The geohash can be calculated for the reported latitude and longitude.
- 4. The device storage remaining and device memory remaining can be calculated.
- 5. Mapping from MCC and MNC to a service provider branding can be added.
- 6. Mapping from an application package name to application name can also be added.
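A single record passing through these enrichment steps might be handled as sketched below. The enumeration values, field names, and brand table are illustrative placeholders rather than the actual schema (the geo-tagging, geohash, and package-name steps would hook in the same way):

```python
NETWORK_TYPE = {1: "2G", 2: "3G", 3: "4G"}   # step 1: enumeration -> readable string
BRANDS = {("310", "260"): "Operator A"}       # step 5: (MCC, MNC) -> provider brand

def enrich(row):
    """Apply dataflow-job style enrichment to one uploaded row."""
    out = dict(row)
    out["network_type"] = NETWORK_TYPE.get(out.pop("network_type_enum"), "unknown")
    out["storage_remaining_mb"] = out["storage_total_mb"] - out["storage_used_mb"]  # step 4
    out["operator"] = BRANDS.get((out["mcc"], out["mnc"]), "unknown")
    return out
```

The compact enumeration uploaded by the WDS 40 is swapped for a human-readable value, and derived fields are added before the record lands in the daily tables.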
- It can be appreciated that several open and paid third party sources can be used to complement the raw data collected by the
WDS 40. - The data reports generated at
step 222 can therefore be constructed in various ways and, if desired, additional third party data sources 82 can be incorporated. Since the data is collected from a multitude of WDSs 40 deployed within various types of applications running on various types of OSs 42 and device types, all within, crossing between and/or interacting with various network types 14 and regions, a more comprehensive view of how a network, device, application, operating system, or electronic environment more generally is performing can be assembled. The data that is collected and stored can be queried in many ways for many purposes to suit the needs of different third parties 22 wanting access to such a wider and more complete set of data. Since the WDS 40 can be deployed within various types of apps 38, such as games that enjoy substantial circulation and reach across multiple platforms and regions, an unobtrusive tool is deployed that can be leveraged to gather such desired data on a periodic and ongoing basis without adversely affecting the performance of the devices 12 or apps 38. -
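The custom groupings described earlier (e.g., per-network-type statistics across all collected data) can be sketched as follows, with field names standing in for the actual table schema:

```python
from statistics import mean

def group_stats(records, key="network_type", metric="download_kbps"):
    """Group stored test records and compute a per-group average,
    e.g., 2G vs 3G vs 4G download performance."""
    groups = {}
    for r in records:
        groups.setdefault(r[key], []).append(r[metric])
    return {k: mean(v) for k, v in groups.items()}
```

Changing the `key` argument yields the other groupings mentioned (by country, operator, application, etc.) from the same stored data.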
FIG. 11 illustrates a process flow similar to FIG. 10, wherein the data stored at step 220 can be additionally used to conduct feedback analyses at step 250 (e.g., as shown illustratively in FIG. 7). While the reports provide feedback in the form of raw data, analyzed data, graphical user interfaces, dashboards, etc., the data that is collected can also be used to distribute feedback to and affect the operation of the WDSs 40 and the mobile devices 12 themselves. As shown in FIG. 11, the feedback analysis at 250 can be followed by a feedback distribution stage at step 252 to complete a "feedback loop" with the data collection operations performed at step 206. The feedback can be used in various ways. For example, it could: 1) affect the WDS 40 to change how/when data is collected; 2) affect the mobile application itself; and 3) affect the device. For 2), one can consider a case where it is identified that all networks in a particular city are particularly slow. A game in that city may choose to download lower resolution images, avoid gameplay features that require interaction with many other players, or avoid asking the user to buy anything since the credit card payment may fail. For 3), the mobile device 12 could decide to use a different type of network based on the information that is in the feedback package, or in a SON-type use case the feedback could direct the device 12 to connect to a specific cell tower. In the case of 2) and 3), the actions taken will ultimately affect the type and quantity of data collected by the WDS 40. -
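How a feedback package might be turned into the three kinds of actions above can be sketched as follows; the feedback schema and action names are illustrative assumptions, not a defined interface of the system 18:

```python
def apply_feedback(feedback):
    """Map analyzed conditions in a feedback package to behaviour changes."""
    actions = []
    if feedback.get("network_slow"):
        actions.append("reduce_test_frequency")  # 1) adjust WDS data collection
        actions.append("use_low_res_images")     # 2) adjust the app's own behaviour
    if feedback.get("preferred_cell"):
        # 3) SON-type steering: direct the device toward a specific cell tower
        actions.append("attach_to_" + feedback["preferred_cell"])
    return actions
```

As noted above, whichever actions are taken will in turn change the type and quantity of data the WDS 40 subsequently collects, closing the loop.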
FIGS. 12 to 19 illustrate screen shots of example user interfaces that can be generated using the data collected from devices 12 as herein described. FIG. 12 illustrates an example of a hex map showing coverage availability for 2G/3G/4G networks for a particular geographic region. The performance is shown in coloured hexagons of consistent size, with the radius being dynamically re-sized for different applications. FIG. 13 illustrates another hex map with network provider rankings. It can be appreciated that for both FIGS. 12 and 13, specific key performance indicators (KPIs) can be shown, as well as radio technology coverage, operator comparisons, and other data types. The hex maps shown in FIGS. 12 and 13 can be useful for seeing pockets of coverage type and quality in certain areas, seeing competitor and roaming partner experience, and identifying areas of poor experience (e.g., high packet loss, etc.), among others. -
FIG. 14 illustrates two examples of region maps, one showing download performance, and the other showing performance change for specified time periods. The region maps can be used to show regions of interest and can be set to a particular country, region, postal/zip code, etc. Colour coding can also be used to allow comparisons between regions. Such region maps can be useful for identifying performance quality or lack thereof in regions of interest, as well as the ability to see area performance for customer support and marketing purposes. For example, by having data from multiple network types 14, carriers can determine metrics such as "the best provider in your postal area", etc. -
FIG. 15 illustrates a regional map with highways and other points of interest (POIs). This allows for network QoS to be shown relative to highways and other POIs like airports, train stations, train/transit lines, sports stadiums and other places where users may gather and expect or desire good network coverage. The screen shot shown in FIG. 15 can also be incorporated into a user interface or dashboard that allows a user to drill down into specific venues and junctions, and to isolate based on date ranges. Also, the data that is collected by the system 18 can be used to allow users to drill down into various KPIs such as download speed, latency, packet loss, etc., therefore allowing service providers, venue operators and other interested parties to determine network QoS for the metrics in which they are interested. -
FIG. 16 illustrates a screen shot of a user interface for displaying overview statistics for a particular region. In this example, network statistics are shown for all operators in a selected geography and the data can be displayed for specific date ranges. Since data is collected by the system 18 over a multitude of devices 12 in a multitude of networks and network types 14, the overview provided in FIG. 16 can be obtained and periodically updated over time. The information provided can be useful for competitor benchmarking, since data concerning other networks is available, as opposed to only having data for one particular network. The data shown in FIG. 16 can also be useful for making roaming partner selections, since a network can obtain data for all operators in a particular region and can assess the quality of service that can be expected should they choose that roaming partner. - In addition to the overview stats, other views can be provided, such as QoS trends to show trends for certain KPIs, with selectable geographical and date ranges. QoS trends can be used for issue resolution and performance monitoring. Regional performance tables can be provided to show network QoS performance broken down by region (e.g., city) and by operator in selected countries. The tables can be colour coded to highlight improvements or degradation. These tables can be useful for competitor benchmarking, roaming partner selection, and for identifying areas requiring investment/improvement. Device performance statistics can also be provided to show performance by device and how these devices compare when used on home and competitor networks. Device performance statistics can be useful for device manufacturer considerations and issue resolutions, recalls, warranty issues, etc. Similarly, app usage statistics can be provided to show, for example, total active users, total data usage, etc. The app statistics can be filtered by geography, operator, device type, radio technology, etc.
The app statistics can be considered useful for determining trends in user behaviour (e.g., growth in app types), and for optimizing networks for popular applications. The data gathered and analyzed by the
system 18 can also be used for infrastructure planning tools in which poor performing locations or infrastructure can be displayed on a map. These maps can be made interactive such that clicking on a location displays a street view to search for possible infrastructure locations, etc. The maps can also display a list of local businesses for potential partnership (e.g., for small cell or WiFi access points). -
FIG. 17 illustrates an example of a web-based platform that can be provided to conduct network analyses. The network analysis dashboard in FIG. 17 can utilize multiple panes or portions with options to deep-dive to street-level and cell-tower performance analysis, select different statistical tables or mappings to be displayed, etc. The dashboard shown in FIG. 17 can be used for infrastructure planning and validation. For example, the dashboard can be used to display device statistics for understanding macro-level trends, as shown in FIG. 18, or to show detailed network coverage mappings of areas and venues as shown in FIG. 19. - In addition to providing a system and method that enables wireless device and wireless network performance and wireless network usage trends to be evaluated by embedding wireless device software in the background of a plurality of applications (or operating systems) deployed and running on a plurality of device types and across a plurality of network types, to enable an aggregation of data types for the analysis and reporting of a more meaningful dataset as described above; various other applications, configurations, and use cases making use of or configuring the
underlying system 18 will now be described. - The
system 18 described above contemplates testing networks 14 and generating test data in a few different ways, namely: - a) Requesting the
mobile device OS 42 for information (i.e. device API calls). - b) Creating network traffic and running "active tests". For example, determining the throughput of a network by downloading a file from a controlled
testing server 120 and then watching the performance of that owned and controlled download. In this case, the network traffic being analyzed was created for the express purpose of performing a test. - c) Watching network traffic initiated by the user or some other mobile device service that has not been generated for the specific purpose of performing a test, i.e., a "passive test". For example, a network testing service can examine how quickly a user is able to upload a photo on Facebook or download a YouTube video, and then determine throughput by passively watching the performance of those non-controlled operations.
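The arithmetic underlying tests b) and c) is the same and can be sketched as follows; in the active case the payload is a file the WDS 40 fetches from a testing server 120, while in the passive case it is user traffic whose size is known from context, so no extra traffic is created:

```python
def throughput_kbps(payload_bytes, seconds):
    """Kilobits per second moved during a generated or observed transfer."""
    if seconds <= 0:
        raise ValueError("transfer interval must be positive")
    return payload_bytes * 8 / 1000 / seconds

# A 2,500,000-byte transfer observed to take 4 seconds:
rate = throughput_kbps(2_500_000, 4.0)  # 5000.0 kbps
```

The difference between the two test types lies only in where `payload_bytes` comes from: a controlled file in b), versus contextual knowledge of the user's transfer in c).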
- It is recognized that access to more user information makes it possible to enhance these three types of tests. For example, the actions, behaviours, or locations of the users (or mobile services) could dictate which of the three types of tests to perform. These same actions, behaviours, or locations could also provide additional information which can inform the approach to testing or how the results should be interpreted to generate more valuable and accurate insights.
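Such user-informed selection among the three test types can be sketched as a simple dispatch; the activity labels and the selection policy are illustrative assumptions:

```python
def choose_test(user_activity):
    """Let the user's current action dictate which test type a) to c) to run."""
    if user_activity in ("watching_video", "uploading_photo"):
        # Performance-sensitive or observable traffic: piggyback passively (c).
        return "passive"
    if user_activity == "idle":
        return "active"        # safe to generate dedicated test traffic (b)
    return "device_api_query"  # otherwise just query the device OS (a)
```

The same activity signal that selects the test can also be attached to the result to inform how it is later interpreted.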
- Traditionally, passive testing has been found to be less accurate than active testing. This is because less is known about the traffic being analyzed, that is, passive testing is less controlled. The
system 18 described herein can be configured to perform network tests that are either initiated by user actions, or informed by user actions. This can be done by being given, or otherwise having access to, additional user or mobile service information, which can greatly enhance passive testing (and testing in general). This is because mobile apps 38 can track user actions such as the user clicking a button to upload a photo. When the mobile app 38 sees that a user has clicked the button "upload photo", it can run a passive network test on that data upload while knowing: 1) it was a photo; 2) the size of the photo being uploaded; and 3) the destination server address. In other words, the mobile app 38 and WDS 40 are in a position to leverage an increased understanding of the nature of the file transfer to perform a more effective and accurate passive throughput test. This can be done, for example, by having the WDS 40 utilize an API to ingest information from the mobile app 38. In this way, the mobile app 38 passes information to the WDS 40, such as "the user just clicked a button to upload a photo of size x". Accessing this information provides context that may not have previously been available for passive testing, for instance when a file has been uploaded, not knowing that it was a photo, the resolution or size of the photo, or the destination server and routing details. - The
system 18 can therefore be adapted such that the user's interaction with a mobile service would dictate what type of passive network test to perform and how to interpret the results. For example, if the user uploads a photo on a particular mobile service such as Instagram, the system 18 can use that additional information to perform a passive network test that is designed to monitor the network's ability to handle photo uploads. This additional information can be provided by a mobile application 38, and is typically provided by the mobile application 38 which contains the network testing code; however, other sources for that additional information are possible. In this event, the system's passive test would have access to additional information such as: 1) that the user is trying to upload a photo; 2) the size of that photo; and 3) the destination server, etc. - It can be appreciated that user informed testing does not need to be limited to passive network tests. The mobile user's behaviour, characteristics, location, etc. could dictate specific active tests which should be run based on the types of tests desired by the controller of the system. User informed testing also allows the system to consider when an active test or a passive test would be most appropriate. For example, it may be best to only run passive tests, which do not create new network traffic, when the user is watching a video or doing something with their
device 12 which is sensitive to network performance. In other words, this "additional information" and user informed testing can help dictate when and where tests should be performed to: 1) not interfere with user experience, or 2) provide the information which is most needed by the system. - Furthermore, as wireless networks move more and more towards being virtualized or software defined, the user informed test results can be used to modify or dictate the hardware, software or implementation of the
network 14 itself by informing the network's requirements based on the services and applications 38 being used by users and the actions they take. - The
system 18 described herein can therefore be used to perform user informed/dictated testing, that is, testing where the user does not specifically choose to run a network test. In this case, network tests are selected and initiated based on the actions performed by a user of a mobile device 12 which contains the network testing software (e.g., downloading a photo). The details of those actions performed by the user can be used as an input into the analysis of the results (e.g., a network's ability to serve a photo). The triggering action performed by the user is thus something other than the user choosing to run a network test. -
- The above-described systems and methods contemplate tracking
mobile devices 12 as they access and make use of wireless networks 14. These mobile devices 12 and their users can be identified and tracked on a day-to-day basis in various ways, including: - a) The mobile device ID: For example, the MAC Address, IMEI, or IMSI of the mobile device.
- b) The advertising ID of the device: Advertiser ID or IDFA are non-persistent IDs of the
mobile device 12 used to serve targeted mobile advertisements. - c) Cookies: IDs that are installed on devices as they access and use networks and network services.
- d) The mobile software ID (or WDS ID): A unique ID generated by mobile device software to identify a specific installation of the software.
- e) An ID used to log-in to mobile software: For example, a Facebook ID, Netflix ID or Gmail ID that is used by a user to log-in to a
mobile application 38. - f) A set of behaviour characteristics: For example, a set of characteristics defined based on a number of factors, which may include locations of the device, IP addresses used by the device, or WiFi/Cellular access points generally used by the user.
- Each device tracking approach has its own privacy implications, which typically need to be considered and managed. That is, a selected tracking approach would normally need to be both acceptable to the mobile device user and compliant with applicable legal requirements.
- By tracking how these IDs flow through
networks 14, the system 18 may be used to inform wireless service providers about user churn. For example, if an application ID is used to log-in on a phone on a first network 14a one day, and then later the same application ID is used to log-in on a phone on a second network 14b, then it can be reported that this user likely churned. That is, in this case it can be expected that this user left the first network 14a and became a customer on the second network 14b. Such churn reporting on its own provides a valuable service to wireless providers. However, this reporting becomes even more powerful when combined with other data sets to enable predictive capabilities which create the possibility of advertising to influence churn. - For example, this historical network churn information when combined with other information sets such as wireless network coverage, wireless network performance, website cookies, recent searches, mobile device hardware/software, user network subscription plans, what people are saying about the wireless network operator on social media, and other information sets, can be used to perform churn prediction on individual users or on large aggregate portions of the population.
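The churn inference described above can be sketched as follows, assuming login records of the form (application ID, network, day); the function name and record layout are illustrative assumptions, not part of the patent.

```python
def detect_churn(logins):
    """Infer likely churn events from a chronologically ordered list of
    (app_id, network, day) login records: when the same application ID is
    later seen logging in on a different network, report a churn event
    (app_id, from_network, to_network)."""
    last_network = {}
    events = []
    for app_id, network, _day in logins:
        previous = last_network.get(app_id)
        if previous is not None and previous != network:
            events.append((app_id, previous, network))
        last_network[app_id] = network
    return events

# The same application ID logs in on a first network, then later on a second:
events = detect_churn([
    ("user-123", "network-14a", 1),
    ("user-123", "network-14a", 2),
    ("user-123", "network-14b", 9),
])
# events -> [("user-123", "network-14a", "network-14b")]
```

In practice this simple rule would be combined with the other data sets mentioned above (coverage, performance, subscription plans, social media signals) before a churn event is reported with confidence.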
- This enables enhanced targeted advertising by wireless operators to users who are either: 1) high probability candidates to leave their
network 14; or 2) high probability candidates to leave their competitor's networks 14. The same mobile IDs can be used to target specific users or IDs with appropriate advertisements. - As an example, the system's wireless network performance tests can be used to compare networks and inform targeted advertising campaigns. If the second network provider discovers that they are the best wireless network in a specific city, they could adjust their advertising to devices in that city to promote their network as being the highest performer. It is then possible for
mobile applications 38 and services to suggest wireless operators to their users. Users may opt-in to allow a wireless service, such as Facebook, to track network performance, their usage patterns, and location and then suggest to them the best wireless network 14 for their requirements. - As an alternative approach to tracking user churn, the
system 18 may track which groupings of mobile devices 12 tend to show up on specific networks 14. For example, if the same four mobile devices consistently access the same WiFi access point, or access networks via the same IP address, it is reasonable to assume that this is a family unit or associated group. If suddenly one of those devices 12 leaves that grouping and a new device 12 appears which is authenticated with a different cellular wireless network 14, it can be reasonably assumed that there has been a network churn event by the user of that newly appearing device. - As such, tracking one or more IDs associated with a user or
device 12, and obtaining access to or otherwise tracking user-related events such as social media posts, can enhance churn identification and churn reporting and/or targeted advertising. The system 18 can be adapted for such churn prediction by tracking a user as they move across networks 14 and across mobile devices 12 using their social media log-in IDs, such that an analysis of network/device churn can be performed. - Wireless network performance tracking by the
system 18, which can be performed by crowdsourcing from mobile end points as described above, can also be used to determine which areas, users, or services are being throttled; as well as which areas, users or services are being provided with enhanced levels of service. - Identifying and comparing low performance and high performance cases can be used in a variety of ways, for example:
- a) To inform cities and governments on which areas are being properly served by wireless service providers. Wireless regulators often require that carriers provide certain levels of service to rural areas and/or less privileged neighborhoods, and violators can be identified and penalized using the testing data.
- b) To inform Mobile Virtual Network Operators (MVNOs) on whether or not a home network is providing adequate levels of service or if the home network operator is providing inferior service to the MVNO's subscribers compared to their own. This allows the MVNO to determine if their home operator is in violation of service level agreement (SLA) rules.
- c) To inform
wireless networks 14 on which network 14 they should have their subscribers roam to and whether or not those roaming networks 14 are adhering to or violating SLAs, and how the roaming quality experienced by their roaming subscribers compares to the quality being received by that network's home subscribers. - d) Whether or not net neutrality laws are being adhered to or violated. For example, it can be seen if a network operator is throttling a third party streaming service, and promoting their own streaming service, and to what extent.
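Several of the uses above reduce to comparing measured results against a dictated threshold. A minimal sketch of such a comparison follows; the metric names and the (direction, value) threshold convention are illustrative assumptions.

```python
def meets_requirements(results, thresholds):
    """Compare aggregated test results against regulator- or SLA-dictated
    thresholds. Each threshold is (direction, value): 'min' means the
    measured value must be at least the threshold (e.g. throughput),
    'max' means it must not exceed it (e.g. latency)."""
    verdict = {}
    for metric, (direction, limit) in thresholds.items():
        value = results.get(metric)
        if value is None:
            verdict[metric] = False  # no measurement counts as unmet
        elif direction == "min":
            verdict[metric] = value >= limit
        else:
            verdict[metric] = value <= limit
    return verdict

verdict = meets_requirements(
    {"download_mbps": 12.0, "latency_ms": 180.0},
    {"download_mbps": ("min", 10.0), "latency_ms": ("max", 100.0)},
)
# download requirement met, latency requirement violated
```

A real SLA check would also aggregate over areas and time windows (e.g. percentile latency per region) before declaring a violation.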
- The
system 18 can therefore be adapted such that the network test results or service quality is compared against a threshold of quality dictated by a wireless regulator or home network provider to see if requirements are met. - Network quality and coverage is often considered critical to certain emerging cyber-physical domains such as self-driving vehicles and ehealth. In these cases, the end
mobile device 12 has a core purpose, which is network sensitive. It is important that these devices 12 maintain access to network quality that is good enough to meet their core purpose requirements. For example, an ehealth device designed to inform hospitals of heart attacks should be able to send a message to hospitals or emergency dispatchers when a heart attack is detected. - Network testing capabilities for these
devices 12 may then be considered critical to their performance, with tests being triggered by events which are inherent to the device's core purpose. - In one example, a self-driving vehicle or vehicle network may choose to run tests whenever vehicles need to perform emergency maneuvers (e.g., avoid an animal or other obstruction on the road) to track the performance of these maneuvers. Alternatively, the vehicle grouping may run tests only in cases when it is known that there are portions of the road or route where network performance information is lacking. In these cases, a network testing system can have its tests triggered by external events. The resulting network dataset can be combined with information about the cyber-physical device's operation and requirements to determine if the
network 14 is adequate for that cyber-physical device's requirements. - In another example, an
e-health device 12 may perform event driven tests on the network 14 to ensure that the network 14 is performing well enough to handle the network requirements of an emergency situation (and that the device is connected to the appropriate server). Example events in this case may be: 1) the user is sleeping or in no immediate health danger; 2) the user's health readings are reaching dangerous levels which could get worse; 3) the user is in danger. - It can be appreciated that in applications such as self-driving vehicles the
devices 12 are in a great position to map network quality across huge areas and therefore may be relied upon or otherwise play an increased role in future network testing. It can also be appreciated that vehicles are not just limited to automobiles, and may include drones or other autonomous devices. - The
mobile devices 12 used to perform network testing typically need to have the ability to preserve user privacy to degrees that are informed by the user themselves. For example, if a user inputs that they either opt-in or opt-out of the service, or portions of the service, the overall system should be responsive to that input and adjust what is collected accordingly. The analysis and handling of that data should also be informed by those same user inputs. - The
system 18 can also be adapted to ensure that it is capable of consuming information about jurisdictional and geographic differences in privacy rules and be responsive to those rules. For example, a global testing system may perform differently in Russia than in the European Union depending on the current governing privacy legislation in both areas. - It can also be important that the
system 18 orchestrate the tests performed amongst the full network of testing end points to preserve privacy of users. For example, the system 18 may choose to distribute the tests amongst the mobile devices 12 in such a way that makes it even more difficult to track the movement or characteristics of a specific device 12. Or, for example, if a specific area is known to be private property and have a very low population density, the system 18 can be configured to be able to handle that data differently, or not collect data from that area, since it would be easier than normal to associate the tests taken in that low-population area with the person or persons known to live in or access that area. There may also be specific geographic areas in which it becomes illegal to run tests or measure location, and the system 18 may need to be adapted accordingly. - Multi-input Multi-output (MIMO) and SON systems 22b may have a multiplicity of channels available, each of which is evaluated. Also, MIMO and SON systems 22b can use beamforming to broadcast specific channels and network resources to specific
mobile devices 12, based on their unique requirements. As a result, each user in the network 14 can be experiencing something completely different, such that the importance of crowdsourcing network quality increases. - Information crowdsourced from the
mobile devices 12 themselves can ultimately be used to inform the network 14 about the network characteristics which are required to be broadcast to each mobile device 12 and how this beamforming needs to take place (generally based on the application being used or subscription tier of the user). As the waveforming and beamforming takes place, the mobile device's application and network experience information (crowdsourced via the system 18) can be used in a feedback loop to inform the waveforming and beamforming processes. - In other words, beamforming allows every user to get access to different network characteristics. However, in order to understand if this is working well, there needs to be a feedback loop informed by network crowdsourcing as herein described.
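One iteration of such a crowdsourced feedback loop can be sketched as follows. This is an assumed, highly simplified model (the quality scale, target, and step size are illustrative, not from the patent): devices reporting poor experience are given a larger share of beamformed resources, and devices exceeding the target give some back.

```python
def beamforming_feedback(allocations, reported_quality, target=0.8, step=0.1):
    """One iteration of the feedback loop: each device reports its
    experienced quality (0..1); devices below target get a larger share of
    beamformed resources, devices above target give some back. Shares are
    renormalized so they still sum to the available capacity."""
    updated = {}
    for device, share in allocations.items():
        quality = reported_quality.get(device, target)
        if quality < target:
            share *= (1 + step)
        elif quality > target:
            share *= (1 - step)
        updated[device] = share
    total = sum(updated.values())
    return {d: s / total for d, s in updated.items()}

shares = beamforming_feedback(
    {"dev-a": 0.5, "dev-b": 0.5},
    {"dev-a": 0.4, "dev-b": 0.95},
)
# dev-a (poor experience) ends up with a larger share than dev-b
```

Actual beamforming control sits far below this level of abstraction, but the sketch captures the closed loop the passage describes: crowdsourced experience in, adjusted per-device resource decisions out.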
- Abnormal Mobile Device Behavior: The network testing/monitoring agent (e.g. the WDS 40) can be used to detect/identify compromised
mobile devices 12. For example, if the WDS 40 sees that a mobile device 12, or an IoT device 12, normally only uses 2MB/day of data, and then that suddenly jumps to 100MB, the system 18 can be used to identify this abnormal network behaviour and flag the device 12 as possibly being compromised.
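The anomaly flagging just described might look like this minimal sketch; the baseline window and the ×10 factor are illustrative assumptions, not values from the patent.

```python
def is_abnormal_usage(daily_history_mb, today_mb, factor=10.0):
    """Flag a device as possibly compromised when today's data usage far
    exceeds its historical daily baseline (simple mean over past days)."""
    if not daily_history_mb:
        return False  # no baseline yet, nothing to compare against
    baseline = sum(daily_history_mb) / len(daily_history_mb)
    return today_mb > factor * baseline

# An IoT device that normally uses ~2MB/day suddenly moves 100MB:
flagged = is_abnormal_usage([2, 2, 2, 2, 2], 100)   # flagged
normal = is_abnormal_usage([2, 2, 2, 2, 2], 3)      # within normal variation
```

A production system would likely use a more robust baseline (median, seasonal patterns) and combine usage with other behavioural signals before flagging a device.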
mobile devices 12 into connecting. They can then monitor the traffic over the network 14 or use these malicious connections to install malware. The system 18 can also be used to identify abnormal access point behaviours. For example, if users are accessing the same access point from various locations, then that access point may be a rogue access point which is being driven around luring connections. Alternatively, if the cell tower ID, or some other identifier of a cell tower, or a cell tower's characteristics suddenly change, it can be flagged as possibly being a false tower made to appear similar to the non-malicious access point. - The
system 18 can therefore be adapted such that the performance and details of mobile devices 12 and network access points are compared against the expected details/performance to search for network issues and compromised systems. - Leaking of Private Network: Certain networks are not intended to be seen outside of specific geographic areas and certain facilities. The
system 18 can report if certain networks 14 are seen where they should not be seen. - Additional features which can make the
system 18 more secure include: - a) The network of
mobile devices 12 can be controlled by several network controllers instead of just one (i.e. system fragmentation). For example, the mobile devices 12 can use a different configuration server 124. It can be appreciated that there may also be benefits in fragmentation, which would require subset populations of devices 12 to use all different servers (i.e. different testing servers 120, different authentication servers 122, and different configuration servers 124). This way, if one of the controllers is compromised, then the whole system 18 is not compromised at once. In the scope of the above principles, the network controllers are generally used to control which devices 12 run which tests and under what conditions. The network controllers are also used to control which servers are used for those tests. If those servers are compromised, then the entire system could be used to run a DDoS attack. - b) The mobile device agents (e.g., WDS 40) which perform the tests can be set up so that they re-authenticate every so often or they otherwise go dormant. This characteristic can be hardcoded into the
WDS 40 so that if the WDS 40 becomes compromised (e.g., to run a DDoS attack) then after a certain period of time the WDS 40 shuts off because it stops being able to re-authenticate.
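The hardcoded re-authentication behaviour can be sketched as a lease: the agent may only run tests while it holds a fresh authentication lease, so an agent whose credentials are revoked (or which is cut off from the authentication server) goes dormant on its own. The class name and the one-hour lease are illustrative assumptions.

```python
class TestingAgentSketch:
    """Agent that must periodically re-authenticate; if re-authentication
    fails (e.g. after a compromise is detected and credentials are revoked),
    the agent stops testing once its current lease expires."""

    def __init__(self, lease_seconds=3600):
        self.lease_seconds = lease_seconds
        self.lease_expiry = None  # no lease until the first authentication

    def authenticate(self, now, credentials_valid):
        # A successful re-authentication renews the lease; a failed one
        # leaves the old expiry in place, so the agent will go dormant.
        if credentials_valid:
            self.lease_expiry = now + self.lease_seconds

    def may_run_tests(self, now):
        return self.lease_expiry is not None and now < self.lease_expiry

agent = TestingAgentSketch(lease_seconds=3600)
agent.authenticate(now=0, credentials_valid=True)
# agent.may_run_tests(10)   -> lease fresh, testing allowed
# agent.may_run_tests(5000) -> lease expired, agent is dormant
```

The important property is that the shut-off path requires no new command from the controllers: dormancy is the default outcome of doing nothing, which is what makes it robust against a compromised control channel.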
network applications 38 such as Skype, YouTube, Netflix, etc. without ever interacting directly with the proprietary servers used by those applications. Instead, the network requirements of those applications 38 are understood and compared against the network characteristics being observed and collected by the network testing agent (e.g., WDS 40) in order to report on application performance. The system 18 can therefore be configured such that the results are used to report the performance or likely performance of network applications 38. - Network Operations: The above-described crowdsourcing can provide alarms to network operators indicating specific areas or network access points which are providing less than optimal performance. These alarms and this information can be used to inform network maintenance or indicate which areas of a
network 14 require additional testing by other methods. The system 18 can therefore be configured such that the performance and details of mobile devices 12 and network access points are compared against the expected details/performance to search for network issues and compromised systems. - Network Planning: The
system 18 can pinpoint areas with large foot traffic or population densities that are also underserved by wireless service providers. These are the areas where network improvements are expected to provide the largest gains to the overall subscriber base. By comparing this performance to that of competitors, the system 18 can suggest areas where the network operator should focus to be more competitive and perform better customer acquisition. The system 18 can therefore be configured such that the results are used in conjunction with user density information collected from the system 18 or external sources to inform a network operator on the most beneficial location for network maintenance, expansions, and upgrades. - Competitor Tracking: The
system 18 can be used to inform a network operator on: 1) what new towers or technologies are being implemented by competitors; 2) which network operators are gaining the most subscribers and where; 3) what types of applications/services the competitive networks are running and how that is changing over time; and 4) the performance of competitive networks and how that is evolving over time. The system 18 can therefore be configured such that the results are used to inform a wireless operator on the performance being delivered by their competitors to their competitors' subscribers and in which the new network implementations/alterations of competitors are recorded, predicted, and reported. - Furthermore, the
system 18 can also be configured to interact with a device connection management platform (not shown), as may be provided by a mobile phone operating system, or as may be controlled by the network operator, to help a mobile device 12 select an appropriate network 14 or access point connection. In this case, the data collected by the WDS 40 is transmitted, either in its raw form or after an analysis of the data, to the connection management platform via an API for use in the network or access point selection process. - Furthermore, the system can also benefit from the use of Artificial Intelligence (AI) and Machine Learning (ML) in addition to data analysis. Data reported by the
WDS 40 may be input to AI and ML platforms (not shown) for processing into enhanced information to be used by network operators for purposes such as network planning, network maintenance, customer care, customer advertising, and operations. In addition, this enhanced information may be input to SON, software defined network (SDN), network function virtualization (NFV), or MIMO systems such that the network 14 can be responsive to this enhanced information produced by AI and ML processes run on the data supplied by the WDS 40. Groups other than network operators may similarly benefit from the enhanced information produced by AI and ML applied to the WDS test data. - For simplicity and clarity of illustration, where considered appropriate, reference numerals may be repeated among the figures to indicate corresponding or analogous elements. In addition, numerous specific details are set forth in order to provide a thorough understanding of the examples described herein. However, it will be understood by those of ordinary skill in the art that the examples described herein may be practiced without these specific details. In other instances, well-known methods, procedures and components have not been described in detail so as not to obscure the examples described herein. Also, the description is not to be considered as limiting the scope of the examples described herein.
- It will be appreciated that the examples and corresponding diagrams used herein are for illustrative purposes only. Different configurations and terminology can be used without departing from the principles expressed herein. For instance, components and modules can be added, deleted, modified, or arranged with differing connections without departing from these principles.
- It will also be appreciated that any module or component exemplified herein that executes instructions may include or otherwise have access to computer readable media such as storage media, computer storage media, or data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape. Computer storage media may include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data. Examples of computer storage media include RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by an application, module, or both. Any such computer storage media may be part of the
system 18, any component of or related to the system 18, etc., or accessible or connectable thereto. Any application or module herein described may be implemented using computer readable/executable instructions that may be stored or otherwise held by such computer readable media. - The steps or operations in the flow charts and diagrams described herein are just for example. There may be many variations to these steps or operations without departing from the principles discussed above. For instance, the steps may be performed in a differing order, or steps may be added, deleted, or modified.
- Although the above principles have been described with reference to certain specific examples, various modifications thereof will be apparent to those skilled in the art as outlined in the appended claims.
Claims (15)
- A method of evaluating wireless device and/or wireless network performance and/or wireless network usage trends, the method comprising:
deploying wireless device software (40) on each of a plurality of wireless devices (12) connected to one or more of a plurality of networks (14) by having the wireless device software (40) embedded in an application or software component (38) running on the corresponding wireless device (12), wherein the wireless device software (40) is embedded in or operable with a plurality of types of applications (38) and performs at least one test associated with characteristics and/or location of the wireless device (12), and/or performance of the wireless device (12) and/or the network (14), and/or usage of the wireless device (12) by a user;
providing an external testing server (120), wherein the wireless device software (40) communicates with the external testing server (120) for testing quality of a wireless network (14) and producing test data;
receiving, via one or more collection servers (50), test data pertaining to the at least one test, obtained by the wireless device software (40) from each of the plurality of wireless devices (12);
aggregating the received test data;
storing, analyzing, and outputting the aggregated data; and
sending configurations based on the aggregated test data, to the plurality of wireless devices (12), to modify the operation of the wireless device software to collect a type of test data based on the aggregated test data.
- The method of claim 1, wherein the wireless device software (40) is embedded in at least one application (38) for at least one of the corresponding wireless devices (12), and/or wherein the wireless device software (40) is embedded in an operating system of at least one corresponding wireless device (12) and, optionally, wherein the at least one application (38) embedding the wireless device software (40) is downloadable from an application store.
- The method of claim 1, wherein the wireless device software (40) is provided to at least some of the wireless devices (12) by embedding the wireless device software (40) into a third party application (38) or game that is downloaded by the wireless devices (12) independently of an analytics and reporting system that comprises the collection server (50) and, optionally, further comprising tracking a proportion of data acquired by particular third party applications (38) or games for determining revenue sharing.
- The method of claim 1, further comprising:
providing a plurality of testing servers (120) to enable the wireless device software (40) to communicate with a particular one of the testing servers (120); and, optionally,
providing an authentication server (122) for registering the wireless device software (40) and approving use of the wireless device software (40) within the wireless device (12).
- The method of claim 1, further comprising communicating with a configuration server (124) for specifying testing behaviours for the wireless device software (40), wherein the wireless device software (40) or the testing server (120) communicates with the configuration server (124) to obtain configuration data for performing the at least one test and, optionally, further comprising using the configuration server (124) to control a frequency of testing and/or when and where tests are performed, including an ability to cease further testing by a particular application (38) or wireless device (12) and, optionally, wherein the configuration server (124) is configured to enable a kill message to be sent after which the wireless device software (40) stops performing testing, and is no longer responsive to new commands to change testing behaviour and, optionally, wherein the wireless device software (40) has a hardcoded limit of a number of tests that can be performed over a time period, which limits are unalterable by the configuration server (124).
- The method of claim 1, further comprising providing test data to one or more external entities or systems and, optionally, further comprising processing the test data to generate a user interface for the one or more external entities or systems and, optionally, wherein the one or more external entities or systems register with an online portal for obtaining the processed test data.
- The method of claim 6, wherein:
the one or more external entities or systems comprise any one or more of equipment manufacturers, application developers, and wireless network operators for analyzing and modifying systems and practices according to data contained in the test data and, optionally, further comprising enabling future test results to be monitored against previous test results to enable modifications to the systems and practices to be tracked by the one or more external entities or systems over time; and, optionally,
further comprising analyzing the test data on behalf of a particular third party using an analytics engine and, optionally, further comprising having the wireless device software (40) modulate the testing behaviour in accordance with a request from the particular third party.
- The method of claim 1, wherein the collected test data is location-based and is stored by the collection server (50) based on its location and, optionally, wherein the location-based data is collected anonymously using a random ID regenerated periodically or is anonymized after collection.
- The method of claim 1, further comprising:
waiting, by the wireless device software (40), for the wireless device (12) to connect to a WiFi network (14) prior to sending the test data; and, optionally,
obtaining metadata from at least one third party data source, and incorporating the metadata into the stored data.
- The method of claim 1, further comprising:
identifying, by the wireless device software (40), its own code running in a different application (38) on a same wireless device (12) and, optionally,
responding, by the wireless device software (40), to identifying its own code running in the different application (38), by having only one instance of the wireless device software (40) operating at the same time.
- The method of claim 1, further comprising:
using the received test data to track a user, whether the user is personally identified or not, as the user moves across networks (14) and/or wireless devices (12) by identifying use of an associated social media login identifier (ID); and
determining an analysis of network or device churn based on such tracking; and, optionally,
using the received test data to track a user or group of users, whether the user or
group of users are personally identified or not, by use of associated mobile advertising identifiers; and
using such data directly or indirectly to target that user or group of users with wireless network operator mobile advertisements; and, optionally,
using the received data to compare network test results or service quality to a threshold of quality dictated by a wireless regulator or service level agreement to determine if one or more requirements have been met.
- The method of claim 11, further comprising:
wherein the test data is transmitted to a Self-Organizing Network, software defined network, network function virtualization, or multi-input multi-output system, to allow the network to respond to the received test data; and, optionally,
wherein the test data is transmitted to a connection management system such that mobile devices (12) may be responsive to the data in selecting their network connections; and, optionally,
further comprising applying artificial intelligence or machine learning to the test data to produce enhanced information.
- The method of claim 1, further comprising:
using the received data to compare performance and/or details of wireless devices (12) and network access points against expected performance and/or details to identify network issues, compromised systems, or cybersecurity threats; and, optionally,
using the received data to determine network characteristics and report on a likely experience provided to users from applications providing data or information to their wireless device (12); and, optionally,
using the received data in conjunction with user density information collected from the collection servers (50) or external sources to inform a network operator on a recommended location for network maintenance and/or network expansion and/or network upgrades; and, optionally,
using the received data to inform a first wireless operator on performance delivered by a competitor second network operator to the second network operator's subscribers; and recording, predicting, or reporting new network implementations or alterations applied by the second network operator.
- The method of claim 1, further comprising:
performing data analysis and machine learning operations on the received data to process it into enhanced information to be used by the network operators; and, optionally,
wherein the wireless device software (40) is configured to run network tests against a plurality of different test servers (120); and, optionally,
wherein the wireless device software (40) is configured to run network tests using a plurality of protocols.
- A computer readable medium comprising computer executable instructions for performing the method according to any one of claims 1 to 14.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201762447239P | 2017-01-17 | 2017-01-17 | |
PCT/CA2018/050042 WO2018132901A1 (en) | 2017-01-17 | 2018-01-16 | System and method for evaluating wireless device and wireless network performance |
Publications (3)
Publication Number | Publication Date |
---|---|
EP3571859A1 EP3571859A1 (en) | 2019-11-27 |
EP3571859A4 EP3571859A4 (en) | 2020-11-04 |
EP3571859B1 true EP3571859B1 (en) | 2022-03-16 |
Family
ID=62841240
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP18741154.1A Active EP3571859B1 (en) | 2017-01-17 | 2018-01-16 | System and method for evaluating wireless device and wireless network performance |
Country Status (5)
Country | Link |
---|---|
US (1) | US10667154B2 (en) |
EP (1) | EP3571859B1 (en) |
CA (1) | CA3050164C (en) |
ES (1) | ES2922650T3 (en) |
WO (1) | WO2018132901A1 (en) |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10142165B2 (en) * | 2017-04-24 | 2018-11-27 | Hall Labs Llc | Adaptive communication channel redundancy in a hub-based intermediate-range system |
US10838950B2 (en) * | 2017-04-29 | 2020-11-17 | Cisco Technology, Inc. | Dynamic review cadence for intellectual capital |
US20200007410A1 (en) * | 2018-06-27 | 2020-01-02 | Viasat, Inc. | Vehicle communication service performance monitoring |
CN109347950B (en) * | 2018-10-17 | 2021-04-06 | 南京邮电大学 | Kaa Project-based Internet of things intelligent service system |
US10785123B2 (en) * | 2018-11-19 | 2020-09-22 | Facebook, Inc. | Communication network optimization |
US11212186B2 (en) | 2019-03-13 | 2021-12-28 | Facebook, Inc. | Measuring the impact of network deployments |
US10601640B1 (en) * | 2019-05-23 | 2020-03-24 | Accenture Global Solutions Limited | Enriched self-healing for cloud platforms |
US11089485B2 (en) * | 2019-08-02 | 2021-08-10 | Verizon Patent And Licensing Inc. | Systems and methods for network coverage optimization and planning |
US11349880B2 (en) * | 2019-09-05 | 2022-05-31 | Zscaler, Inc. | Cloud application design for efficient troubleshooting |
CN113098708B (en) * | 2021-03-23 | 2022-07-05 | 北京首都在线科技股份有限公司 | Public network quality evaluation method and device, electronic equipment and storage medium |
CN113301531B (en) * | 2021-05-25 | 2024-06-07 | 上海商汤临港智能科技有限公司 | Network access system, method and device for vehicle automatic driving test |
CN115515173A (en) * | 2021-06-04 | 2022-12-23 | 中兴通讯股份有限公司 | Method, system, electronic device and storage medium for analyzing performance of base station |
US11864104B2 (en) | 2021-09-08 | 2024-01-02 | Cisco Technology, Inc. | Dynamic frequency coordination in shared wireless communication environments |
US11523289B1 (en) * | 2021-09-22 | 2022-12-06 | T-Mobile Innovations Llc | Method and system for enhancing cellular network coverage |
US20240202010A1 (en) * | 2022-12-15 | 2024-06-20 | Vmware, Inc. | Aggregating metrics of network elements of a software-defined network for different applications based on different aggregation criteria |
Family Cites Families (39)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6745011B1 (en) | 2000-09-01 | 2004-06-01 | Telephia, Inc. | System and method for measuring wireless device and network usage and performance metrics |
US7302420B2 (en) * | 2003-08-14 | 2007-11-27 | International Business Machines Corporation | Methods and apparatus for privacy preserving data mining using statistical condensing approach |
US20050125408A1 (en) * | 2003-11-20 | 2005-06-09 | Beena Somaroo | Listing service tracking system and method for tracking a user's interaction with a listing service |
WO2006099473A2 (en) | 2005-03-15 | 2006-09-21 | Mformation Technologies Inc. | System and method for monitoring and measuring end-to-end performance using wireless devices |
US7873321B2 (en) | 2005-03-29 | 2011-01-18 | Qualcomm Incorporated | Apparatus and methods for determining network access performance of a wireless device |
US8954045B2 (en) | 2006-09-29 | 2015-02-10 | Qualcomm Incorporated | Method and apparatus for managing resources at a wireless device |
US8483068B2 (en) | 2007-03-30 | 2013-07-09 | Verizon Patent And Licensing Inc. | System and method of performance monitoring of multicast services with mobility support |
US9137664B2 (en) * | 2007-05-01 | 2015-09-15 | Qualcomm Incorporated | Application logging interface for a mobile device |
US9235956B2 (en) * | 2007-12-27 | 2016-01-12 | Bally Gaming, Inc. | Group games and rewards in wagering systems |
US20100041391A1 (en) | 2008-08-12 | 2010-02-18 | Anthony Wayne Spivey | Embedded mobile analytics in a mobile device |
US8345599B2 (en) * | 2008-09-29 | 2013-01-01 | Telcordia Technologies, Inc. | Pre-evaluation of multiple network access points |
US8355945B1 (en) | 2009-05-11 | 2013-01-15 | Sprint Communications Company L.P. | Identifying and ranking high-impact churn sectors |
US8811977B2 (en) * | 2010-05-06 | 2014-08-19 | At&T Mobility Ii Llc | Device-driven intelligence and feedback for performance optimization and planning of a service network |
AU2011258873B2 (en) | 2010-05-25 | 2015-09-24 | Headwater Research Llc | Device- assisted services for protecting network capacity |
CN101902688A (en) * | 2010-07-09 | 2010-12-01 | 中兴通讯股份有限公司 | Counting and acquiring system and method of navigation information |
US8676196B2 (en) * | 2010-11-30 | 2014-03-18 | Ta-gang Chiou | Apparatus and method for competitor network monitoring |
US9444692B2 (en) | 2011-04-26 | 2016-09-13 | Openet Telecom Ltd. | Systems, devices and methods of crowd-sourcing across multiple domains |
US8572290B1 (en) * | 2011-05-02 | 2013-10-29 | Board Of Supervisors Of Louisiana State University And Agricultural And Mechanical College | System and architecture for robust management of resources in a wide-area network |
US20130159081A1 (en) * | 2011-07-08 | 2013-06-20 | Vishwanath Shastry | Bidirectional bandwidth reducing notifications and targeted incentive platform apparatuses, methods and systems |
US10217117B2 (en) * | 2011-09-15 | 2019-02-26 | Stephan HEATH | System and method for social networking interactions using online consumer browsing behavior, buying patterns, advertisements and affiliate advertising, for promotions, online coupons, mobile services, products, goods and services, entertainment and auctions, with geospatial mapping technology |
US8862950B1 (en) * | 2011-09-22 | 2014-10-14 | Amazon Technologies, Inc. | Testing the operation of an application programming interface |
US9451451B2 (en) | 2011-09-30 | 2016-09-20 | Tutela Technologies Ltd. | System for regulating wireless device operations in wireless networks |
WO2013078541A1 (en) * | 2011-11-29 | 2013-06-06 | Energy Aware Technology Inc. | Method and system for forecasting power requirements using granular metrics |
US9465668B1 (en) * | 2012-04-30 | 2016-10-11 | Google Inc. | Adaptive ownership and cloud-based configuration and control of network devices |
US8976695B2 (en) * | 2012-08-23 | 2015-03-10 | Harris Corporation | Wireless communications system having selective wireless communications network and related methods |
US9178807B1 (en) * | 2012-09-20 | 2015-11-03 | Wiretap Ventures, LLC | Controller for software defined networks |
GB2507994A (en) * | 2012-11-16 | 2014-05-21 | Vodafone Ip Licensing Ltd | Mobile Device Application Analysis |
US9195574B1 (en) | 2012-11-30 | 2015-11-24 | Mobile Labs, LLC | Systems, methods, and apparatuses for testing mobile device applications |
US9274935B1 (en) * | 2013-01-15 | 2016-03-01 | Google Inc. | Application testing system with application programming interface |
US20150371163A1 (en) | 2013-02-14 | 2015-12-24 | Adaptive Spectrum And Signal Alignment, Inc. | Churn prediction in a broadband network |
US9530168B2 (en) | 2013-03-28 | 2016-12-27 | Linkedin Corporation | Reducing churn rate for a social network service |
WO2014165631A1 (en) | 2013-04-04 | 2014-10-09 | Pulse.io, Inc. | Mobile application performance prediction |
US20150186952A1 (en) | 2014-01-01 | 2015-07-02 | SlamAd.com, Inc. | Apparatus and method to facilitate downloading mobile software applications into a portable electronic device, which software applications include advertisements that are embedded within the software application and are re-transmitted to others through use of the portable electronic device |
US20160100325A1 (en) | 2014-01-27 | 2016-04-07 | Google Inc. | Wireless network monitoring device |
US20160134508A1 (en) * | 2014-11-12 | 2016-05-12 | International Business Machines Corporation | Non-disruptive integrated network infrastructure testing |
DE102015121484A1 (en) | 2015-12-10 | 2017-06-14 | P3 Insight GmbH | Method for determining a data transmission speed of a telecommunication network |
US10756988B2 (en) * | 2016-01-22 | 2020-08-25 | Charter Communications Operating, LLC | System and method of isolating QoS fault |
US10949771B2 (en) | 2016-01-28 | 2021-03-16 | Facebook, Inc. | Systems and methods for churn prediction |
US20180070866A1 (en) * | 2016-09-13 | 2018-03-15 | Kaamran Raahemifar | Non-invasive nanosensor system to determine analyte concentration in blood and/or bodily fluids. |
2018
- 2018-01-16 WO PCT/CA2018/050042 patent/WO2018132901A1/en unknown
- 2018-01-16 US US15/872,209 patent/US10667154B2/en active Active
- 2018-01-16 EP EP18741154.1A patent/EP3571859B1/en active Active
- 2018-01-16 ES ES18741154T patent/ES2922650T3/en active Active
- 2018-01-16 CA CA3050164A patent/CA3050164C/en active Active
Also Published As
Publication number | Publication date |
---|---|
US20180206135A1 (en) | 2018-07-19 |
WO2018132901A1 (en) | 2018-07-26 |
US10667154B2 (en) | 2020-05-26 |
CA3050164A1 (en) | 2018-07-26 |
EP3571859A1 (en) | 2019-11-27 |
CA3050164C (en) | 2021-08-24 |
ES2922650T3 (en) | 2022-09-19 |
EP3571859A4 (en) | 2020-11-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3571859B1 (en) | System and method for evaluating wireless device and wireless network performance | |
US11671856B2 (en) | System and method for evaluating wireless device and/or wireless network performance | |
US20180262533A1 (en) | Monitoring Device Data and Gateway Data | |
US10819613B2 (en) | System and method for interacting with and controlling testing of wireless device and/or wireless network performance on wireless electronic devices | |
US10129777B2 (en) | Device-driven intelligence and feedback for performance optimization and planning of a service network | |
US20220225065A1 (en) | Systems and methods to determine mobile edge deployment of microservices | |
US9204329B2 (en) | Distributed RAN information collection, consolidation and RAN-analytics | |
Goel et al. | Survey of end-to-end mobile network measurement testbeds, tools, and services | |
JP2023522199A (en) | mobile management system | |
US9967156B2 (en) | Method and apparatus for cloud services for enhancing broadband experience | |
US10779178B1 (en) | Systems and methods of using network slicing for test platform | |
US20160029218A1 (en) | Controlling network access using a wrapper application executing on a mobile device | |
US20210397438A1 (en) | Remote detection of device updates | |
Frias et al. | Measuring Mobile Broadband Challenges and Implications for Policymaking | |
Berto | Assurance-Aware 5G Edge-Cloud Architectures for Intensive Data Analytics |
Pelloni et al. | Analytics with Passive Wi-Fi Signals | |
Dominguez | Design and realization of a low-cost, location-aware sensor network for noise pollution monitoring |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20190813 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
A4 | Supplementary search report drawn up and despatched |
Effective date: 20201002 |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: H04W 64/00 20090101ALN20200928BHEP Ipc: H04L 12/26 20060101ALI20200928BHEP Ipc: H04L 12/24 20060101ALN20200928BHEP Ipc: H04W 24/08 20090101AFI20200928BHEP |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R079 Ref document number: 602018032331 Country of ref document: DE Free format text: PREVIOUS MAIN CLASS: H04W0024000000 Ipc: H04W0024080000 |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: H04L 12/24 20060101ALN20210726BHEP Ipc: H04W 64/00 20090101ALN20210726BHEP Ipc: H04L 12/26 20060101ALI20210726BHEP Ipc: H04W 24/08 20090101AFI20210726BHEP |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: H04L 12/24 20060101ALN20210830BHEP Ipc: H04L 12/26 20060101ALI20210830BHEP Ipc: H04W 24/08 20090101AFI20210830BHEP |
|
INTG | Intention to grant announced |
Effective date: 20210927 |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE PATENT HAS BEEN GRANTED |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: EP Ref country code: DE Ref legal event code: R096 Ref document number: 602018032331 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: REF Ref document number: 1476740 Country of ref document: AT Kind code of ref document: T Effective date: 20220415 |
|
REG | Reference to a national code |
Ref country code: FI Ref legal event code: FGE |
|
REG | Reference to a national code |
Ref country code: LT Ref legal event code: MG9D |
|
REG | Reference to a national code |
Ref country code: SE Ref legal event code: TRGR |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: MP Effective date: 20220316 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: RS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20220316 Ref country code: NO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20220616 Ref country code: LT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20220316 Ref country code: HR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20220316 Ref country code: BG Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20220616 |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: MK05 Ref document number: 1476740 Country of ref document: AT Kind code of ref document: T Effective date: 20220316 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LV Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20220316 Ref country code: GR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20220617 |
|
REG | Reference to a national code |
Ref country code: ES Ref legal event code: FG2A Ref document number: 2922650 Country of ref document: ES Kind code of ref document: T3 Effective date: 20220919 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: NL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20220316 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SM Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20220316 Ref country code: SK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20220316 Ref country code: RO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20220316 Ref country code: PT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20220718 Ref country code: EE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20220316 Ref country code: CZ Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20220316 Ref country code: AT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20220316 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: PL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20220316 Ref country code: IS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20220716 Ref country code: AL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20220316 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R097 Ref document number: 602018032331 Country of ref document: DE |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: DK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20220316 |
|
26N | No opposition filed |
Effective date: 20221219 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20220316 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20220316 |
|
REG | Reference to a national code |
Ref country code: BE Ref legal event code: MM Effective date: 20230131 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: BE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20230131 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: GB Payment date: 20231219 Year of fee payment: 7 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20230116 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: SE Payment date: 20231219 Year of fee payment: 7 Ref country code: LU Payment date: 20231219 Year of fee payment: 7 Ref country code: FR Payment date: 20231219 Year of fee payment: 7 Ref country code: FI Payment date: 20231219 Year of fee payment: 7 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: ES Payment date: 20240202 Year of fee payment: 7 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: DE Payment date: 20231219 Year of fee payment: 7 Ref country code: CH Payment date: 20240202 Year of fee payment: 7 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MC Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20220316 |
|