Here are the slides to follow along as you watch the video or just browse and read the transcript of the talk...
Naba Barkakati, IT Solutions for New Business - Keynote - KISDI Global Conference 2010
Good afternoon!
First, I wanted to thank our host — the Korea Information Society Development Institute — for inviting me to deliver this session keynote and moderate a session at this conference. I am honored to have this opportunity to address this distinguished audience.
In the next half an hour, I hope to provide a unique perspective on some innovative information technology (IT) solutions for businesses — solutions that are enabled, in general, by “convergence”, and specifically, by the convergence of IT and the network infrastructure.
The uniqueness of my perspective comes from over 30 years of experience I have had in the ICT sector and my U.S. government policy perspective — having worked for the past 13 years as a Technologist at the U.S. Government Accountability Office, or GAO, where we review IT systems and solutions at U.S. government agencies. These government IT solutions are now beginning to take advantage of the convergence that’s the focus of this conference. So I hope that perspective will be helpful to you as you think of future directions for IT.
When we talk about “digital convergence,” we are referring to what Nicholas Negroponte of MIT’s Media Lab called the transformation of "atoms to bits," the conversion of everything from voice, video, TV, etc. into digital information flow across platforms on the Internet or any IP—Internet Protocol—network. The network includes IT systems of varying sizes and functions from network devices to back-end servers that store and process the digital information.
Nowadays we see the results of this convergence in our daily lives — particularly in smartphones or other smart devices that you can use to browse the web, make phone calls, take pictures, shoot video, provide a Wi-Fi hotspot for other devices to connect to the Internet, and much more — these are the technologies that are moving all of us towards ubiquitous computing and connectivity, where information processing is integrated into everyday objects and activities.
Let’s start by taking stock of what facilitated and continues to drive this convergence.
To understand what facilitated this convergence, we have to look at some recent history. The idea is to learn from history and shape the future for continued innovations in IT convergence.
Let’s begin in the 1990s, when Sun Microsystems trademarked the phrase: "The Network Is the Computer." That’s as good a point in time as any to think of as the beginning of the convergence.
When Sun coined that phrase, the conditions were just right for computing and networking technologies to make that claim — that the network is a force-multiplier for computing. Computing power had been growing exponentially since the first microprocessors appeared in the early 1970s — and continues to do so — following the trend described by Moore’s Law.
And with the Internet we had the ability to connect TCP/IP networks to one another. Best of all, TCP/IP networks were architected using a “layered model,” conceptually similar to the Open Systems Interconnection (OSI) reference model that dates back to the late 1970s.
The layered model is a conceptual breakthrough — in this model, information always flows from one layer to the next. For example, when an application sends data to another application, the data go down through the layers at the transmitting end and then back up from the Physical layer to the Application layer at the receiving end. And each layer has its own set of protocols for handling and formatting the data.
The benefit of the layered model is that each layer takes care of only its specific task, leaving the rest to the other layers. Implementations of the layers can be mixed and matched. That’s why TCP/IP networks can work over any type of physical network medium, from Ethernet to radio waves (in a wireless network). Also, each layer can be implemented in different modules. For example, the transport and network layers typically already exist as part of the operating system, and any application can make use of these layers without having to include them in the application. Each layer can also innovate and change independently of the other layers.
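To make this concrete, here is a minimal Python sketch of an application that simply relies on the operating system’s transport (TCP) and network (IP) layers through the standard socket API; the host name is only a placeholder, and the application never needs to know whether the physical layer underneath is Ethernet or a wireless link.

```python
# Illustrative sketch: the application uses the OS-provided transport (TCP)
# and network (IP) layers via the standard socket API. It neither knows nor
# cares whether the physical layer underneath is Ethernet, Wi-Fi, or
# something else entirely.
import socket

def fetch_homepage(host: str = "example.org", port: int = 80) -> bytes:
    """Send a minimal HTTP request and return the raw response bytes."""
    with socket.create_connection((host, port), timeout=10) as sock:
        request = f"GET / HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n"
        sock.sendall(request.encode("ascii"))   # hand bytes to the transport layer
        chunks = []
        while True:
            data = sock.recv(4096)              # TCP delivers a reliable byte stream
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks)

if __name__ == "__main__":
    response = fetch_homepage()
    print(response.split(b"\r\n", 1)[0].decode())  # e.g. "HTTP/1.1 200 OK"
```

The same code runs unchanged whether the machine is plugged into Ethernet or connected over Wi-Fi, which is the mix-and-match property of the layered model in action.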
The bottom line is that the layered networking model greatly facilitated interconnections among different computer systems and software applications that could easily communicate with one another. And, as I will explain a little later, the layered model comes up in other contexts as well, for example, in another key driver of convergence—virtualization. The layered model also comes up in discussions of telecom regulations such as those relating to “network neutrality.”
Last, but not least, in the U.S., there was an additional driver — this one is regulatory — the Telecommunications Act of 1996. This was the first major overhaul of telecommunications law in the United States since the Communications Act of 1934, more than 60 years earlier. In a nutshell, the goal of the 1996 Act was to let anyone enter any telecommunications business and to let any telecommunications business compete in any market against any other. The 1996 Act also directed the Federal Communications Commission (FCC) to encourage the deployment of advanced telecommunications capability, which includes broadband, to all Americans. The FCC classified broadband Internet access (regardless of the platform) as an information service — a classification that reduced regulatory requirements.
The 1996 Act was meant to foster “intramodal” competition — among companies that used the same underlying technology such as local and long distance wireline carriers and new competitive local exchange carriers, all of which were offering voice services over circuit-switched networks.
What the 1996 telecom act did not foresee was the “intermodal” competition that came about—for example, wireless service competing with both local and long distance wireline service, VoIP competing with wireline and wireless telephony, IP video competing with cable television. In any case, in the U.S., we now have intermodal competition between bundles of services (voice, Internet, and TV) provided by either telecom or cable TV service. The end result has been growth in broadband Internet access to U.S. homes, primarily through wireline services.
Many well-known Internet companies representing different business models began during these early years of digital convergence, for example—online commerce: Amazon.com (1994), online auction: eBay (1995), search engines: Yahoo! (1995), Google (1998), and, more recently, social networking: Facebook (2004) and many more…
By now, these business models are well-known and most businesses know to set up an online presence and pay attention to things like “search engine optimization (SEO)” — the process of improving the visibility of a web site in search engines, which is a common Internet marketing strategy.
But recent developments in IT and network convergence take us beyond the routine Internet strategies to new business solutions.
Many service providers have now built large data centers that, coupled with virtualization, give rise to the convergence of IT and network infrastructure. And this brings us to the concept of computing in the cloud, or “cloud computing,” where the term “cloud” refers to the Internet because we typically depict the Internet as a cloud in network diagrams.
Before going any further, let me bring in the layered model again to first explain the concept of virtualization.
Virtualization refers to the creation of multiple virtual machines that can run on a single physical system. In this case, the layered model starts with the physical hardware and the physical peripheral devices — processor, memory, disk drive, and network interface — on top of which we have a virtual representation of these hardware devices. In the next higher layer, we have virtual machines, each with its own internal layers of operating system and multiple applications. And each virtual machine’s operating system can be different from the others. So, we end up with one physical server supporting multiple virtual machines of different types.
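As a purely conceptual sketch (a toy model I am using for illustration, not a real hypervisor), the Python fragment below mirrors that layering: one physical host exposes a fixed pool of processors and memory, and each virtual machine carved out of that pool can run a different guest operating system.

```python
# Conceptual sketch only: a toy model of the virtualization layering described
# above, not a real hypervisor. A physical host exposes a pool of CPUs and
# memory; each virtual machine gets a slice of that pool and can run its own,
# different operating system.
from dataclasses import dataclass, field

@dataclass
class VirtualMachine:
    name: str
    guest_os: str        # each VM can run a different operating system
    vcpus: int
    memory_gb: int

@dataclass
class PhysicalHost:
    total_cpus: int
    total_memory_gb: int
    vms: list = field(default_factory=list)

    def create_vm(self, name, guest_os, vcpus, memory_gb):
        """Allocate a slice of the physical resources to a new virtual machine."""
        used_cpus = sum(vm.vcpus for vm in self.vms)
        used_mem = sum(vm.memory_gb for vm in self.vms)
        if used_cpus + vcpus > self.total_cpus or used_mem + memory_gb > self.total_memory_gb:
            raise RuntimeError("not enough physical resources left on this host")
        vm = VirtualMachine(name, guest_os, vcpus, memory_gb)
        self.vms.append(vm)
        return vm

if __name__ == "__main__":
    host = PhysicalHost(total_cpus=16, total_memory_gb=64)
    host.create_vm("web-server", guest_os="Linux", vcpus=4, memory_gb=16)
    host.create_vm("erp-system", guest_os="Windows", vcpus=8, memory_gb=32)
    for vm in host.vms:
        print(f"{vm.name}: {vm.guest_os}, {vm.vcpus} vCPUs, {vm.memory_gb} GB")
```

A real hypervisor does far more, of course (scheduling, device emulation, isolation), but the allocation bookkeeping above captures why one physical server can support multiple different virtual machines.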
Cloud computing has three commonly accepted service models: infrastructure as a service, platform as a service, and software as a service (see figure on slide).
Notice the boxes in the drawing — the different models depend on that box showing what the cloud computing vendor offers versus what the user (business) provides.
Infrastructure as a service provides various infrastructure components such as hardware, storage, and other fundamental computing resources.
Platform as a service provides a service that runs over an underlying infrastructure. A platform vendor offers a ready-to-use platform, such as an operating system like Microsoft Windows or Linux, which runs on vendor-provided infrastructure. Customers can build applications on a platform using application development frameworks, middleware capabilities, and functions such as databases.
Software as a service runs on an underlying platform and infrastructure managed by the vendor and provides a self-contained operating environment used to deliver a complete application such as Web-based e-mail and related management capabilities.
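From the customer’s side, “software as a service” largely reduces to calling an application over the Internet while the vendor runs the platform and infrastructure underneath. The short Python sketch below illustrates that idea; the CRM endpoint URL and API token are hypothetical placeholders, not any real vendor’s interface.

```python
# Illustrative sketch: consuming a "software as a service" application.
# The URL and token below are hypothetical placeholders; the point is that
# the customer only sees an application reachable over the Internet, while
# the vendor manages the platform and infrastructure behind it.
import json
import urllib.request

SERVICE_URL = "https://crm.example.com/api/contacts"   # hypothetical SaaS endpoint
API_TOKEN = "replace-with-your-token"                  # credential issued by the vendor

def list_contacts():
    """Fetch the contact list from the hosted CRM application."""
    request = urllib.request.Request(
        SERVICE_URL,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
    )
    with urllib.request.urlopen(request, timeout=10) as response:
        return json.loads(response.read().decode("utf-8"))

if __name__ == "__main__":
    for contact in list_contacts():
        print(contact)
```

Everything below that HTTP call (the application code, operating system, servers, and network) is the vendor’s responsibility under this model.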
A side note: although cloud computing has gained popularity recently, the concept of “software as a service” is not new — for example, Salesforce.com has been offering its Customer Relationship Management (CRM) application as a service since around 2000.
In addition to the service models that describe what can be provided, there are four deployment models that relate to how the cloud service is implemented. These four cloud models are private, community, public, and hybrid (see figure on slide).
In a private cloud, the service is set up specifically for one organization, although there may be multiple customers within that organization, and the cloud may exist on or off the premises.
In a community cloud, the service is set up for related organizations that have similar requirements.
A public cloud is available to any paying customer and is owned and operated by the service provider.
A hybrid cloud is a composite of the deployment models.
U.S. government agencies are beginning to consider and, in some cases, use cloud computing services. In 2010, about half of the 24 major U.S. government agencies reported using some form of cloud computing for infrastructure, platform, or software services. Examples:
Defense Department’s Rapid Access Computing Environment or RACE program (2008) provides “platform as a service” to support defense department’s systems development efforts within a private cloud.
NASA’s Nebula is an open-source cloud computing project that provides an “infrastructure as a service” implementation for scientific data and Web-based applications, with everything housed in a standard shipping container that is mounted in place, but could be transported if needed.
Department of Transportation’s CARS program used a public cloud for part of its system — this was a program that allowed owners of certain less fuel-efficient vehicles to receive a credit for trading in a vehicle and purchasing or leasing a new, more fuel-efficient vehicle (this was part of the recent stimulus program).
Also, Google is offering Google Apps for Government where the data generated by the government's use of Gmail and calendaring applications will be segregated from everybody else's “cloud-based” data on servers located in the continental U.S.
So, to reiterate, the convergence of IT with the network infrastructure brings large-scale computing and storage capabilities that can now be provided over the Internet, giving us cloud computing… this has the potential to make computing a service similar to other public utilities such as electricity, water, gas etc.
Cloud computing, whether in a private cloud or in the public Internet, provides new IT solutions for businesses.
Here’s an example:
Businesses can run applications off of servers that are located remotely in the Internet — this would be an example of “software as a service”.
There are some good reasons why businesses may use applications in the cloud:
* Applications in the cloud are always ready to go. You can simply sign up for a service (such as email or office productivity applications) and begin using it without having to go through the time, cost, and management expense of setting up and running your own servers with the applications.
* You can use the applications from anywhere. Cloud applications are typically web-based, so you are not tied to a specific computer. This means that you can run the applications at your office, at home, from your laptop while on the road, and even from a Web-enabled smartphone, all with full access to all your data and resources.
* You get the benefits of economies of scale because the cloud computing resources are shared among many users. You get to use the services of large data centers that have multiple Internet connections, backup electric power, and security and redundancy. Virtualization enables these service providers to build massive computing infrastructures with a low cost per CPU cycle. Software developers can share applications among millions of users, lowering the per-user cost. For example, a CRM application like Salesforce.com could easily cost many thousands of dollars a month to purchase, install, and maintain in-house, but anyone can sign up for Salesforce.com for $125 per month.
* You get the benefit of the fault tolerance and redundancy that service providers build into their systems — something that most small businesses will never be able to match. Cloud computing is often more reliable than the networks run by small companies, which cannot afford redundant electric power, hot backup systems, or the specialized management software and expertise needed for data center operations.
* You can easily add computing resources in a cloud model. As your company adds employees, you can purchase more computing resources from the cloud provider. Similarly, when no longer needed, you can as easily reduce the user or server count and the resulting monthly expenses.
* You can pay for cloud services as you go. Typically you buy cloud services via monthly payments instead of a large up-front payment. This enables growing companies to conserve cash for other investments and for growing their business operations. (A rough cost comparison sketch follows this list.)
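As promised above, here is a rough, purely illustrative cost comparison in Python. The dollar figures are hypothetical placeholders, not actual vendor pricing; the point is simply that a subscription spreads cost over time while an in-house deployment front-loads it.

```python
# Illustrative arithmetic only -- the figures below are hypothetical placeholders,
# not real vendor pricing. Compare the cumulative cost of an up-front, in-house
# deployment (purchase plus ongoing maintenance) with a pay-as-you-go subscription.
def cost_in_house(months, upfront, monthly_maintenance):
    return upfront + months * monthly_maintenance

def cost_cloud(months, monthly_fee):
    return months * monthly_fee

if __name__ == "__main__":
    UPFRONT, MAINTENANCE, CLOUD_FEE = 50_000.0, 500.0, 125.0  # hypothetical values
    for months in (6, 12, 24, 36):
        print(f"{months:>2} months: in-house ${cost_in_house(months, UPFRONT, MAINTENANCE):>9,.0f}"
              f"  vs  cloud ${cost_cloud(months, CLOUD_FEE):>7,.0f}")
```

With numbers like these, the subscription stays cheaper over any horizon a small business is likely to care about, which is why the pay-as-you-go model helps conserve cash.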
What are some of the issues with cloud computing? In our work with U.S. government agencies, we identified information security as an area of concern. We found that cloud computing can both increase and decrease the security of information systems in federal agencies. Potential information security benefits come from the use of virtualization and automation, which can enable rapid implementation of secure configurations for virtual machines.
However, the use of cloud computing can also create information security risks for federal agencies. Specifically, 22 of 24 major U.S. federal agencies reported that they are either concerned or very concerned about the potential information security risks associated with cloud computing. These concerns include risks related to dependence on the vendor and concerns related to sharing computing resources. Agencies also identified challenges and concerns in implementing existing federal information security laws and guidance. These concerns include limitations on their ability to conduct independent audits and assessments of security controls of cloud computing service providers and concerns related to the division of information security responsibilities between customer and vendor.
We expect that specific U.S. government agencies will develop guidance related to cloud computing security to help other agencies.
Cloud computing is a back-end IT solution for new businesses, but the innovation in convergence is most visible at the user/customer end. Wireless connectivity has been the driver for this, and wireless adoption continues to grow rapidly, moving us towards the ubiquitous computing that has long been predicted. Based on industry data, as of December 2009, the wireless penetration rate was 91 percent in the United States.
Additionally, the number of adults living in U.S. households with only wireless telephone service has increased from less than 5 percent in 2003 to nearly 23 percent in 2009. According to one study of wireless use, wireless connections in California now exceed the combined connections of both wireline and broadband services.
The implication for a business is that many of your customers/clients now have powerful devices that enable them to access your services, which may be hosted in the cloud. Smartphones such as the Samsung Galaxy S and Apple iPhone support applications that can be tailored to work with specific back-end services, so businesses can gain an advantage by providing smartphone apps to their customers.
In the few minutes I have left, I’d like to close with the recent focus on telecom regulations and convergence in the U.S.
In the United States, Congressional policymakers think that additional regulations are needed to address the changing telecom environment. Because of the growing convergence in the telecom sector, many policymakers consider it necessary to "rewrite," or revise the laws governing these markets.
Whether regulators should play a role to ensure that the Internet remains open to all, often referred to as "open access" or "net neutrality," has also become part of the dialogue.
As a segue at this point, I’d like to point out that the layered network model comes into play in discussions of net neutrality — for example, advocates of net neutrality point out that trying to affect the physical transport of bits based on application type violates the separation of layers in the network model.
Recent events in the net neutrality debate: a federal court overturned the FCC’s ruling against Comcast, and Google and Verizon proposed a legislative framework that would apply neutrality rules to wired broadband but exclude wireless for now.
The FCC auctions off licenses for wireless spectrum, but the challenge is to find open bands of spectrum.
In March 2010, the FCC published the National Broadband Plan (http://www.broadband.gov/plan/), which outlines policies and actions intended to ensure that everyone in the United States has high-speed Internet access. The plan’s long-term goals include ensuring that at least 100 million U.S. homes have access to affordable broadband services with speeds of 100 megabits per second and enabling citizens to use broadband services to track energy consumption.
One of the linchpins of the National Broadband Plan is reallocating, or crafting new sharing arrangements for, a large amount of spectrum currently designated for use by federal agencies and commercial services, which are managed by the National Telecommunications and Information Administration (NTIA) and licensed by the FCC, respectively. In a July 2010 memorandum, President Obama directed the FCC and NTIA to complete, by October 1, 2010, a specific plan and timetable for identifying 500 megahertz of spectrum that could be used for wireless broadband services over the next 10 years.
A recent development is the September 24, 2010, announcement by the FCC approving the use of unused airwaves in the broadcast TV spectrum for unlicensed mobile broadband operations. These “white spaces” were freed up when the U.S. transitioned from analog to digital TV broadcasting; they will be open to all users and do not require a license.
The expectation is that the new spectrum will be used in new consumer devices with wireless capabilities that have both longer range and greater bandwidth than current Wi-Fi solutions — the technology has been referred to as “super Wi-Fi” because of its improved bandwidth and ability to more easily penetrate buildings. This is the biggest block of spectrum freed up by the FCC in the last two decades.
So I’ll leave you with the note that we have some interesting developments, and some uncertainties, with potential new laws in the U.S.
Bottom line: we have made progress, but we’re not there yet; witness the broadband penetration chart for the top few countries…
So the journey continues…
Added July 2013: Cloud computing trends
• Mobile computing – bring your own device (BYOD) or provided by employer
• Broadband wireless connectivity – cloud computing just won’t be possible without network connectivity
• “Apps” to perform specific tasks – employees/customers/clients now have powerful devices that enable them to access services that are hosted on the cloud. Smartphones support applications (“apps”) that can be tailored to work with specific back-end services, so businesses can gain an advantage by providing smartphone apps to their employees and customers.
• Big data analytics – businesses collect large amounts of data in a central location in the cloud, which makes it amenable to applying analytics to that “big data” and gaining insights
• More sensors – the Samsung Galaxy S4 comes with 9 sensors (gesture, proximity, gyro, accelerometer, geomagnetic, light, cover open/close, temperature+humidity, and barometer) and there are apps that use these sensors; AT&T Digital Life offers home automation through smartphone apps…
• Internet of Things (IoT) – objects (sensors) on the Internet, accessible from anywhere