
Mobile Network Performance from User Devices: A Longitudinal, Multidimensional Analysis

Ashkan Nikravesh, David R. Choffnes, Ethan Katz-Bassett, Z. Morley Mao, Matt Welsh

PAM 2014

Problem

Mobile network performance:
• Poor visibility into user-perceived performance.
• It is difficult to capture a representative view of network performance.

Why is that difficult?
• Performance depends on many factors.

How can we improve visibility?
• Pervasive network monitoring is needed:
◦ Continuous
◦ Large-scale
• Sampling performance of devices across:
◦ Carriers
◦ Access technologies
◦ Location
◦ Time

Previous Work

Their limitations:
• Passively collected from cellular network infrastructure (e.g., the GGSN)
• Only one month of data
• Limited to a single carrier
• Collected from mobile devices, but not continuously

Our work differs from previous related work:
• Longitudinal
• Continuous
• Gathered from mobile devices using controlled experiments

Data Analysis

Analyzing data collected from 144 carriers over 17 months, covering 11 cellular network technologies, to identify patterns, trends, anomalies, and the evolution of cellular network performance.

We find:
• Significant variance in end-to-end performance for all carriers.
• Part of the high variability is due to the geographic and temporal properties of the network.
• Routing and signal strength are potential sources of performance variability.
• Performance is inherently unstable.

Methodology

• User-perceived performance metrics:
◦ HTTP GET throughput
◦ Ping RTT
◦ DNS lookup time
• Context recorded with each measurement:
◦ Traceroute
◦ Carrier and cellular network technology
◦ Signal strength
◦ Location
◦ Timestamp

These annotations let us identify and isolate the performance impact of each factor.
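As a rough illustration (not the authors' actual measurement client), the three user-perceived metrics can be sampled from an unprivileged app along these lines. The TCP-handshake stand-in for ICMP ping is an assumption, since apps usually lack raw-socket privileges:

```python
import socket
import time
import urllib.request

def dns_lookup_ms(hostname):
    """DNS lookup time: wall-clock cost of resolving a name."""
    start = time.monotonic()
    socket.gethostbyname(hostname)
    return (time.monotonic() - start) * 1000.0

def tcp_rtt_ms(host, port=80, timeout=3.0):
    """RTT proxy: time to complete a TCP handshake."""
    start = time.monotonic()
    with socket.create_connection((host, port), timeout=timeout):
        return (time.monotonic() - start) * 1000.0

def throughput_kbps(num_bytes, elapsed_s):
    """HTTP GET throughput: payload bits over elapsed seconds, in Kbps."""
    return num_bytes * 8 / 1000.0 / elapsed_s

def http_get_throughput_kbps(url):
    """Fetch url and report the observed goodput."""
    start = time.monotonic()
    with urllib.request.urlopen(url) as resp:
        body = resp.read()
    return throughput_kbps(len(body), time.monotonic() - start)
```

For example, a 224 KB payload fetched in one second works out to about 1835 Kbps.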

Dataset

• Mainly collected by Speedometer:
◦ 2011-10 to 2013-02 (17 months)
◦ Internal Android app developed by Google
◦ Anonymized data
◦ 4-5 measurements per minute
• Mobiperf:
◦ 11 months of data
◦ Used only for our signal strength analysis
◦ Controlled experiments
• Code is open source and the data is publicly available.

Performance across Carriers

How well does observed performance match expectations across access technologies?
• Ping RTT latency
1. Latency varies significantly across carriers and access technologies.
2. Different access technologies can deliver the same performance.

[Figure: Ping RTT (ms, log scale) per carrier, broken down by access technology (GPRS, EDGE, UMTS, HSDPA, HSPA). Carriers: T-Mobile, AT&T, Yes, Optus, Swisscom, Vodafone (DE/NL/IE/UK), O2 (UK), Airtel, Telkomsel, Rogers, SingTel, NTT DoCoMo, Telstra, SFR, SK Telecom, Emobile.]

Performance across Carriers

• HTTP GET throughput
1. Relatively smaller differences between carriers.
2. The download size (224 KB) is not sufficiently large.
3. Lower latency is generally correlated with higher throughput.

[Figure: HTTP throughput (Kbps, log scale) per carrier, broken down by access technology (GPRS, EDGE, UMTS, HSDPA, HSPA), for the same set of carriers as the previous figure.]
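A back-of-the-envelope sketch of why a 224 KB download can understate link capacity: under idealized TCP slow start (an assumed initial window of 10 MSS, doubling each RTT; both parameters are illustrative, not from the slides), the whole transfer completes in a handful of RTTs, before the connection reaches steady state:

```python
def slow_start_rtts(transfer_bytes, mss=1460, init_cwnd_segs=10):
    """Count RTTs needed to send transfer_bytes if the congestion
    window starts at init_cwnd_segs segments and doubles each RTT
    (idealized slow start, no losses)."""
    cwnd = init_cwnd_segs * mss
    sent, rtts = 0, 0
    while sent < transfer_bytes:
        sent += cwnd
        cwnd *= 2
        rtts += 1
    return rtts

# A 224 KB fetch finishes in about 5 RTTs under these assumptions,
# so the measured throughput is dominated by slow start rather than
# by the link's steady-state capacity.
```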

Performance across Different Locations

• Different topologies in different regions
• New York, Bay Area, and Seattle

[Figure: Verizon LTE ping RTT (ms, mean and standard error) in the Bay Area, Seattle, and New York, Oct 2011 to Jun 2012.]

Performance over Time

• How much does performance depend on time?
◦ Time of day
◦ Stability
• When should we measure the network?

Time of day

1. Throughput decreases during busy hours.
2. Carriers experience their minimum throughput at different times.
3. Carriers experience different variation in performance during busy hours.

[Figure: HTTP throughput (Kbps, mean and standard error) vs. local time (00:00 to 24:00) for AT&T HSDPA, T-Mobile HSDPA, Sprint EVDO_A, and Verizon EVDO_A.]
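The time-of-day curves above are mean throughput with standard error per local hour; a minimal sketch of that binning (the sample values are hypothetical):

```python
from collections import defaultdict
import math

def hourly_mean_stderr(samples):
    """samples: iterable of (local_hour, throughput_kbps) pairs.
    Returns {hour: (mean, standard_error_of_the_mean)}."""
    buckets = defaultdict(list)
    for hour, kbps in samples:
        buckets[hour].append(kbps)
    stats = {}
    for hour, xs in sorted(buckets.items()):
        n = len(xs)
        mean = sum(xs) / n
        # Sample variance; zero when only one measurement fell in the bin.
        var = sum((x - mean) ** 2 for x in xs) / (n - 1) if n > 1 else 0.0
        stats[hour] = (mean, math.sqrt(var / n))
    return stats
```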

Performance over Time

Stability of Performance
• Users want a stable network!
• Measurements are expensive!

Two metrics:
• Auto-correlation
• Weighted moving average

[Figure: weighted-moving-average example. Samples at 1PM through 5PM with a 2-hour sampling period; windows w1 and w2 of size 2 predict the 5PM value, and the prediction error is the deviation from the measured 5PM sample.]
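A sketch of the two stability metrics. The particular weights and the relative-error definition are illustrative assumptions, since the slide does not give the exact formulas:

```python
def wma_predict(history, weights=(0.6, 0.4)):
    """Predict the next sample as a weighted average of the most
    recent len(weights) samples (weights[0] applies to the newest)."""
    recent = list(reversed(history[-len(weights):]))
    return sum(w * x for w, x in zip(weights, recent))

def relative_error(predicted, actual):
    """Prediction error relative to the measured value."""
    return abs(predicted - actual) / actual

def autocorr(xs, lag=1):
    """Normalized sample autocorrelation at the given lag; values near 1
    mean the series is highly predictable from its recent past."""
    n = len(xs)
    m = sum(xs) / n
    num = sum((xs[i] - m) * (xs[i + lag] - m) for i in range(n - lag))
    den = sum((x - m) ** 2 for x in xs)
    return num / den
```

For example, with 2-hour RTT samples [50, 60, 55] ms, the predicted next value is 0.6*55 + 0.4*60 = 57 ms; if the 5PM measurement comes in at 52 ms, the relative error is about 9.6%.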

Performance over Time

Stability of Performance (Weighted Moving Average)

1. Prediction accuracy varies significantly by carrier.
2. Prediction error increases with longer sampling periods, with the exception of 24-hour sampling periods.

[Figure: prediction error (%) vs. sampling period (1-36 hours) for Verizon LTE (Bay Area), Sprint EVDO_A (Seattle), Sprint EVDO_A (Bay Area), and T-Mobile HSDPA (Bay Area).]

Performance Degradation: Root Causes

Focus on cases where persistent performance degradations were:
• observed on consecutive days
• affecting both latency and throughput

Inefficient paths:
• Time evolution
• Impact on performance

[Figure: (a) T-Mobile HSDPA (Seattle): median ping RTT (25th/50th/75th percentiles) to Seattle and Los Angeles landmarks, Feb 2012. (b) Verizon LTE (Bay Area): median ping RTT (ms) and number of hops, Nov 2011 to Jan 2012.]
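The selection criterion, degradation that persists across consecutive days, can be sketched as a simple filter over daily medians; the threshold factor and the run length are hypothetical knobs, not values from the paper:

```python
def persistent_degradation(daily_medians, baseline, factor=1.5, min_days=3):
    """True if at least min_days consecutive daily medians exceed
    factor * baseline (a sustained shift, not a one-day spike)."""
    run = longest = 0
    for value in daily_medians:
        run = run + 1 if value > factor * baseline else 0
        longest = max(longest, run)
    return longest >= min_days
```

For example, daily median RTTs of [50, 52, 90, 95, 100, 52] ms against a 50 ms baseline trip the detector, while an isolated 90 ms day does not.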

A. Nikravesh, D. R. Choffnes, E. Katz-Bassett, Z. M. Mao, M. Welsh PAM 2014 14 / 16 Performance Degradation: Root Causes Signal Strength How much it can affect performance?

[Figure: AT&T HSDPA (Seattle), performance vs. signal strength in Arbitrary Strength Units (ASU): (a) mean ping RTT (ms) and mean packet loss (%); (b) mean HTTP throughput (Kbps).]

Accounting for signal strength is important when interpreting measurement results.
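One way to quantify how much signal strength matters is the correlation between ASU and a performance metric; a minimal Pearson correlation sketch (the example inputs below are made up, not measured data):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Higher ASU tending to lower RTT shows up as a negative coefficient.
```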

A. Nikravesh, D. R. Choffnes, E. Katz-Bassett, Z. M. Mao, M. Welsh PAM 2014 15 / 16 Future Work

• We need more monitoring and diagnosis.
• Data is difficult to obtain.

Mobilyzer

An open platform for mobile network measurement: a comprehensive codebase for issuing measurements, aimed at researchers and developers.

Any Questions?

Thank You!

A. Nikravesh, D. R. Choffnes, E. Katz-Bassett, Z. M. Mao, M. Welsh. PAM 2014.