Cloud Provider Global Performance Ranking – 12 Month Average

Take twelve months, more than half a million tests, one sample application and 25 cloud service provider sites and you get this:

Wait, no. You actually get this: alltopperformers.zip

To be more specific, this is the breakdown of the average global response times recorded against the exact same sample application running on all the major cloud service providers, courtesy of the Global Provider View.

I’ll leave the interpretation of the data to you, but let me explain why we chose to display the data in this fashion.

In Fig 1 we’ve displayed the top-ranked service provider sites in order of fastest average performance across the globe. You’ll notice that not all 25 provider sites are listed; download the full report for the remaining data points. This representation only includes provider sites we’ve been testing for 6 months or longer, reducing dispersion and, I hope, yielding the most honest display of response times across the best of the best.

Fig 1. Seventeen cloud provider sites ranked in order of average global GPV performance, Aug 2010 – July 2011.

Here in Fig 2 things get really interesting. We’ve created this chart to show the trend in global average performance of the top cloud service providers. The same rule applies here: each provider site has at least 6 months of data behind its average, and outliers above 10 s response times were removed.

Fig 2. Twelve provider site response times averaged globally and mapped over time
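For anyone who wants to play with similar data on their own, here is a minimal sketch of the kind of aggregation behind Figs 1 and 2. This is not our production pipeline; the sample rows, field layout and function names are hypothetical, but the filtering rules match what was described above: a provider site needs at least 6 months of data, and samples above 10 s are dropped before averaging.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical raw measurements: (provider_site, timestamp, response_time_seconds)
samples = [
    ("Provider A - US East", datetime(2011, 3, 14, 9, 30), 4.2),
    ("Provider A - US East", datetime(2011, 6, 2, 17, 5), 3.8),
    ("Provider B - EU West", datetime(2011, 5, 20, 11, 45), 6.1),
    # ... roughly half a million more rows in the real data set
]

OUTLIER_CUTOFF_SECONDS = 10.0   # samples above 10 s are discarded
MIN_MONTHS_OF_DATA = 6          # provider sites with less history are excluded

def rank_provider_sites(samples):
    """Return provider sites ranked by average response time, fastest first."""
    by_site = defaultdict(list)
    months_seen = defaultdict(set)

    for site, ts, seconds in samples:
        if seconds > OUTLIER_CUTOFF_SECONDS:
            continue                       # drop outliers above the cutoff
        by_site[site].append(seconds)
        months_seen[site].add((ts.year, ts.month))

    averages = {
        site: sum(times) / len(times)
        for site, times in by_site.items()
        if len(months_seen[site]) >= MIN_MONTHS_OF_DATA
    }
    return sorted(averages.items(), key=lambda item: item[1])

for site, avg in rank_provider_sites(samples):
    print(f"{site}: {avg:.2f} s")
```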

Now here is the fun part, and where we need your help.

  • What observations do you gather from this information?
  • What questions do you have for our included providers?
  • What suggestions do you have for representing this data more accurately?

Comment below with your thoughts. If you are one of the providers included in this report and you’d like to provide some feedback, contact admin@cloudsleuth.net.

Approach

Let’s talk a little more about the approach we took in gathering this information. The Global Provider View's approach is conceptually very simple: we deploy an identical target application (a cute, fake e-commerce site that sells boots) to each cloud service provider. The Gomez Performance Network (GPN) then runs test transactions against each deployed target application and monitors response time and availability from multiple points around the globe.
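To make the idea concrete, here is a rough sketch of what a single synthetic test transaction might look like. The real GPN agents drive full browser transactions through the boot-store checkout, so this is a heavy simplification; the URL and function below are hypothetical and purely for illustration.

```python
import time
import urllib.request

# Hypothetical URL of the identical target application deployed to one provider.
TARGET_URL = "http://example-provider.example.com/boots/checkout"

def run_test_transaction(url, timeout_seconds=30):
    """Fetch the target page once, returning (available, response_time_seconds)."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout_seconds) as response:
            response.read()                # pull the full body, as a browser would
            available = response.status == 200
    except Exception:
        available = False
    return available, time.monotonic() - start

# Each backbone node would run something like this on a schedule
# and report its results to a central store for aggregation.
ok, seconds = run_test_transaction(TARGET_URL)
print(f"available={ok} response_time={seconds:.2f}s")
```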

 

Fig 3. The Gomez Performance Network backbone test locations

A global perspective is essential when evaluating service provider performance, but it is often best to start from a regional perspective. The Global Provider View provides both (see Fig 4). Your first map view will be a regionalized perspective based on your ISP, selected from North America, South America, Europe or Asia. To get data from another regional perspective, or from a global perspective, simply change the location with the "Locations" pull-down at the top of the page.

Performance calculations are based on the region's backbone nodes. For example, the response time averages displayed in the "Provider Response Time" list for North America are based on data retrieved from the 19 backbone nodes that make up that region, while European averages are based on the 5 nodes in that region. The World perspective includes data from all 30 backbone nodes and is the view represented in the data provided here.

Fig 4. The Regional view representing response times for local providers
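As a quick illustration of how the regional and World figures relate, here is a small sketch of the averaging described above. The per-node numbers are made up; only the idea of averaging over a region's backbone nodes versus all 30 nodes together comes from the text.

```python
# Hypothetical per-node averages (seconds) grouped by region; the real network
# has 19 North American nodes, 5 European nodes and 30 worldwide.
node_averages = {
    "North America": [3.1, 2.8, 3.4],    # one entry per backbone node
    "Europe":        [4.0, 3.7],
    "South America": [6.5],
    "Asia":          [7.2, 6.9],
}

def regional_average(region):
    """Average response time across the backbone nodes in one region."""
    times = node_averages[region]
    return sum(times) / len(times)

def world_average():
    """Average across every backbone node, regardless of region."""
    all_times = [t for times in node_averages.values() for t in times]
    return sum(all_times) / len(all_times)

print(f"North America: {regional_average('North America'):.2f} s")
print(f"World:         {world_average():.2f} s")
```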

I hope you find this information intriguing, and I look forward to hearing your feedback. We are always entertaining new ideas to increase the usefulness of this app, so don’t be shy. You can reach me personally at @ryanbateman or on ryanbateman.com.