CHAPTER 18

Shortcuts to Success

Revenue relates to response time, which, as we saw in the last chapter, can be improved through parallelization. But when an application has a global base of users (employees, customers, or partners), network latency (delay due to signal transmission time over distance) is a critical factor as well. Taking the shortest path between the user and the application helps, as does reducing round trips. After that, the only thing left to do is to reduce the distance between the user and the service: As the saying goes, either the prophet must go to the mountain, or the mountain must come to the prophet.
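To see why distance itself becomes the binding constraint, it helps to put numbers on it. The sketch below is illustrative only: it assumes a signal speed in optical fiber of roughly 200,000 km/s (about two-thirds the speed of light in a vacuum) and ignores switching, queuing, and routing detours, so it gives a lower bound on latency, not a prediction.

```python
# Illustrative lower bound on round-trip latency due to distance alone.
# Assumption: signal speed in fiber is ~200,000 km/s (about 2/3 of c);
# real paths add switching, queuing, and non-great-circle routing.

SPEED_IN_FIBER_KM_PER_S = 200_000

def min_round_trip_ms(distance_km: float, round_trips: int = 1) -> float:
    """Lower bound, in milliseconds, for the given one-way distance."""
    one_way_s = distance_km / SPEED_IN_FIBER_KM_PER_S
    return 2 * one_way_s * round_trips * 1000  # seconds -> milliseconds

# A user ~15,700 km from the data center (roughly Sydney to Virginia
# by great circle) pays at least:
print(f"{min_round_trip_ms(15_700):.0f} ms per round trip")       # 157 ms
# Protocols needing several round trips multiply the penalty:
print(f"{min_round_trip_ms(15_700, round_trips=4):.0f} ms total")  # 628 ms
```

No amount of server-side parallelization recovers this time, which is why the remaining levers are fewer round trips and shorter distances.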

The first strategy involves bringing the users closer to the data center. Most people think that the New York Stock Exchange (NYSE) is on Wall Street, but, for many, it is in Mahwah, New Jersey: Almost half of the equity and options trading in North America is processed there. There we find users such as hedge funds, which make use of the NYSE’s Capital Markets Community Platform, a colocation/cloud data center that guarantees a round-trip message time of 70 microseconds.1 For the entire system to perform in times measured in microseconds, its individual components must be faster still. Microseconds have been shaved end to end by using ultra-high-performance network switching gear that routes messages, from a buyer to a seller, for example, in less than 700 nanoseconds (billionths of a second).2
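The arithmetic behind "components must be faster still" can be made concrete. The figures below come from the text (a 70-microsecond round-trip guarantee, sub-700-nanosecond switching); the ten-hop path is a hypothetical example, not a description of the NYSE network.

```python
# Illustrative latency-budget check using the figures in the text.
# The hop count is a hypothetical example for the arithmetic.
ROUND_TRIP_BUDGET_NS = 70_000  # 70-microsecond round-trip guarantee
SWITCH_HOP_NS = 700            # worst-case per-hop switching latency

hops = 10
switching_ns = hops * SWITCH_HOP_NS
remaining_ns = ROUND_TRIP_BUDGET_NS - switching_ns

# Ten such hops consume only a tenth of the budget, leaving the rest
# for serialization, propagation, and matching-engine processing.
print(switching_ns, remaining_ns)  # 7000 63000
```

The point is that a 70-microsecond end-to-end guarantee is only achievable because each element on the path is engineered to operate in nanoseconds.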

This strategy—migrating the user to the data center—works ...
