Chapter 7
Common Standards in Cloud Computing
7.1 Chapter Overview
In Internet circles, everything eventually gets driven by a working group of one sort or another. A working group is a cooperative collaboration of researchers pursuing an activity that would be difficult for any one member to develop alone. A working group can exist for anywhere between a few months and many years. Working groups generally strive to create an informational document or a standard, or to find some resolution for problems related to a system or network. Most often, the working group attempts to assemble experts on a topic; together, they work intensively toward their goal. Working groups are sometimes also referred to as task groups or technical advisory groups. In this chapter, we will discuss the Open Cloud Consortium (OCC) and the Distributed Management Task Force (DMTF) as examples of cloud-related working groups. We will also discuss the most common standards currently used in cloud environments.
7.2 The Open Cloud Consortium
The purpose of the Open Cloud Consortium is to support the development of standards for cloud computing and to develop a framework for interoperability among various clouds. The OCC supports the development of benchmarks for cloud computing and is a strong proponent of open source software for cloud computing. The OCC manages a testing platform and a test-bed for cloud computing called the Open Cloud Test-bed. The group also sponsors workshops and other events related to cloud computing.
The OCC is organized into several different working groups. For
example, the Working Group on Standards and Interoperability for Clouds
That Provide On-Demand Computing Capacity focuses on developing standards for interoperating clouds that provide on-demand computing capacity. One architecture for clouds that was popularized by a series of Google technical reports describes a storage cloud providing a distributed file system, a compute cloud supporting MapReduce, and a data cloud supporting table services. The open source Hadoop system follows this architecture. These types of cloud architectures support the concept of on-demand computing capacity.
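To make the compute-cloud layer of this architecture concrete, the following is a minimal word-count job written against Hadoop's MapReduce Java API (the org.apache.hadoop.mapreduce classes). It is a sketch, not part of any OCC standard: the class names and the input/output paths passed on the command line are illustrative, and the API shown is a later revision of Hadoop than the one contemporaneous with this chapter.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Minimal word-count job: the mapper emits (word, 1) pairs and the
// reducer sums the counts for each word.
public class WordCount {

    public static class TokenizerMapper
            extends Mapper<Object, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            // Split each input line into tokens and emit (token, 1).
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, ONE);
            }
        }
    }

    public static class IntSumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        private final IntWritable result = new IntWritable();

        @Override
        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            // Sum all counts seen for this word across the cluster.
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        // args[0] and args[1] are illustrative input and output paths in the
        // distributed file system (the "storage cloud" of the architecture above).
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

The same mapper and reducer run unchanged on a single machine or across a large cluster; the framework handles splitting the input, shuffling the intermediate (word, count) pairs, and writing the output back to the distributed file system, which is exactly the division of labor between the storage and compute clouds described above.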
There is also a Working Group on Wide Area Clouds and the Impact of Network Protocols on Clouds. The focus of this working group is on developing technology for wide area clouds, including the creation of methodologies and benchmarks to be used for evaluating wide area clouds. This working group is tasked to study the applicability of variants of TCP (Transmission Control Protocol) and the use of other network protocols for clouds.
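The reason TCP variants matter over wide-area paths comes down to the bandwidth-delay product: with tens of milliseconds of latency on a multi-gigabit wave, stock TCP with default window sizes cannot keep the pipe full. The sketch below is purely illustrative and is not an OCC methodology or benchmark; it simply times a bulk transfer between two hosts using Java sockets, with the buffer size, host, and port as assumed parameters. (Java exposes socket buffer sizing but not the choice of TCP congestion-control variant, which has to be configured at the operating-system level.)

import java.io.InputStream;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.net.ServerSocket;
import java.net.Socket;

// Illustrative bulk-transfer timer for a wide-area TCP path.
// Start "sink <port>" on the receiving host, then "send <host> <port> <MB>" on the sender.
public class WideAreaThroughput {

    // Large socket buffers so the TCP window can cover a high bandwidth-delay product.
    static final int BUFFER_BYTES = 4 * 1024 * 1024;

    // Receives and discards bytes until the sender closes the connection.
    static void sink(int port) throws Exception {
        try (ServerSocket server = new ServerSocket()) {
            // Request the large receive buffer before binding so window scaling is negotiated.
            server.setReceiveBufferSize(BUFFER_BYTES);
            server.bind(new InetSocketAddress(port));
            try (Socket s = server.accept(); InputStream in = s.getInputStream()) {
                byte[] buf = new byte[64 * 1024];
                while (in.read(buf) != -1) { /* discard */ }
            }
        }
    }

    // Sends the requested number of megabytes and reports the achieved rate.
    // Rough sender-side estimate; some data may still be in kernel buffers when the timer stops.
    static void send(String host, int port, int megabytes) throws Exception {
        try (Socket s = new Socket(host, port)) {
            s.setSendBufferSize(BUFFER_BYTES);
            OutputStream out = s.getOutputStream();
            byte[] buf = new byte[64 * 1024];
            long totalBytes = (long) megabytes << 20;
            long start = System.nanoTime();
            for (long sent = 0; sent < totalBytes; sent += buf.length) {
                out.write(buf);
            }
            out.flush();
            double seconds = (System.nanoTime() - start) / 1e9;
            System.out.printf("%d MB in %.2f s = %.1f Mbps%n",
                    megabytes, seconds, totalBytes * 8 / (seconds * 1e6));
        }
    }

    public static void main(String[] args) throws Exception {
        if ("sink".equals(args[0])) {
            sink(Integer.parseInt(args[1]));
        } else {
            send(args[1], Integer.parseInt(args[2]), Integer.parseInt(args[3]));
        }
    }
}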
The Open Cloud Test-bed uses Cisco C-Wave and the UIC Teraflow Network for its network connections. C-Wave makes network resources available to researchers to conduct networking and applications research. It is provided at no cost to researchers and allows them access to 10G waves (Layer-1 point-to-point circuits) on a per-project allocation. It provides links to a 10GE (10-gigabit Ethernet) switched network backbone. The Teraflow Test-bed (TFT) is an international application network for exploring, integrating, analyzing, and detecting changes in massive and distributed data over wide-area high-performance networks. The Teraflow Test-bed analyzes streaming data with the goal of developing innovative technology for data streams at very high speeds. It is hoped that prototype technology can be deployed over the next decade to analyze 100-gigabit-per-second (Gbps) and 1,000-Gbps streams.
Both of these networks use wavelengths provided by the National Lambda Rail (NLR). The NLR can support many distinct networks for the U.S. research community using the same core infrastructure. Experimental and production networks exist side by side but are physically and operationally separate. Production networks support cutting-edge applications by providing users guaranteed levels of reliability, availability, and performance. At the same time, experimental networks enable the deployment and testing of new networking technologies, providing researchers national-scale test-beds without the limitations typically associated with production networks.
The Working Group on Information Sharing, Security, and Clouds
has a primary focus on standards and standards-based architectures for
sharing information between clouds. This is especially true for clouds