Friday, June 8, 2012

Father of the Bride Speech at Daughter's Wedding

As the summer wedding season gets going, you, as the proud father of the bride, may have to deliver a brief welcome speech at the reception. If so, here's my "father of the bride" speech for you to use as an example. This one is also an exception to the 3-minute rule for my NBTMV videos -- weighing in at almost 6 minutes -- but you can't blame me for being long-winded; after all, this is my daughter's wedding, and the first one at that!
Below is the speech as I had prepared it. I spoke from memory, so the video does not match it exactly, but it always helps to prepare (notice how short the speech is in print, yet it took almost 6 minutes to deliver):

Good evening everyone. On behalf of my wife Leha and our family, we’d like to welcome friends and relatives of both families who are gathered here today to celebrate the marriage of Emily and Anran.
Thank you for taking time from your busy lives to join us this evening, in some cases traveling quite a distance to be here. We hope you'll enjoy the rest of the evening and remember this day as fondly as we surely will.
I feel so proud to be standing here today as the father of the bride, who looks beautiful beside the man of her dreams.
I remember it as if it were only yesterday... when Emily was 4, she started to play children's songs by ear on our piano, and that's when we decided to give her piano lessons when she turned 5.
The other thing I remember is how, when she was 9 and in elementary school, she announced one day that she wanted to learn the violin. Despite discouragement from my wife, who thought the violin is much harder to learn than the piano, Emily's progress within 6 months was so impressive that her mother was convinced Emily would be able to play the violin after all. I remember her having violin lessons with Jody Gatwood and, in her senior year, taking her to Juilliard in New York City for lessons. After years of chauffeuring her to violin lessons and orchestra rehearsals, we were so happy when she got her driver's license and could drive herself to her fellowship with the National Symphony Orchestra at the Kennedy Center. Emily is so good at everything -- from handicrafts to cooking to math and science -- she got a perfect score in math on both the SAT and the GRE -- she could have been anything, but she chose music and the violin because she loves it.
You know how some parents think no one is good enough for their daughter? It's quite the opposite for us; we think Anran is great for Emily. He's very considerate and thoughtful -- not to mention extremely smart.

Emily and Anran have been friends since middle school and became best friends in college. With their love and friendship Emily and Anran can face anything that life may throw at them.
Traditionally, at this point, I should offer some advice on marriage, but you two have known each other and been together now for quite some time, so you don’t really need much advice on that front. The only practical advice I can think of is -- cook for the entire week during the weekend, live like you’re still a graduate student, and take vacations each year. That should see you through a long and happy married life.
Before I close, may I propose a toast to the most important couple tonight... if you’d please join me...
Ladies and Gentlemen, Emily and Anran -- here’s to a long and happy marriage!


P.S. As for the father-daughter dance, we danced the "foxy" to Rufus Wainwright's rendition of "Across the Universe." Here's a video:


Tuesday, April 3, 2012

Visiting Lisbon, Portugal


We visited Lisbon, Portugal for a week in late February 2012 and enjoyed it very much. We rented an apartment and explored the city on our own. As usual, I started by buying a Lisbon travel guide to figure out the layout of the city, and then searched VRBO.com for apartments near Rossio, one of the main areas of the city.

We found an apartment called Casa Travessa, a short walk up the hill from Rossio, the busy central plaza. The apartment served as a very convenient base from which to explore the great city of Lisbon. When we arrived on Saturday, we were greeted by Maria, a friend of the owners, Jordan and Deb Kleber, who happened to be away in Italy that week. Maria gave us the key and took us on a very helpful tour of the neighborhood -- how to get down to Rossio, the restaurant street, the metro station, the Rossio train station, the tram stops, etc. We found a welcome basket with bread, cheese, fruit, and vinho verde (green wine) to get us started. We shopped at Pingo Doce for milk, bread, and other staples for breakfast and light meals, and the apartment served as a great base for our daily excursions. Jordan and Deb had also sent us helpful information about the neighborhood and restaurants.

Before I forget, here's some helpful advice: get a €5 "7 Colinas" card from the Casa da Sorte store on Rossio; it's good for 24 hours of unlimited tram and metro rides, and you can refill it every morning (the first time, you'll pay €0.50 extra for the card itself). If possible, go to Belém and the museums on Sunday, when admission is free. If you plan to go to Sintra, check the days when the Pena Palace and the National Palace are open.

We did all the usual sightseeing in Lisbon. On Sunday morning, after buying the "7 Colinas" card, we took Tram 15E to Belém to see the Belém Tower, the National Coach Museum, and the Jerónimos Monastery, had lunch at the Os Jeronimos restaurant with coffee plus the famous Pastéis de Belém (called Pastéis de Nata elsewhere), and then took the metro to see the Gulbenkian museum in the afternoon. We rode the famous Tram 28E many times from the nearby Martim Moniz park all the way through Alfama and Bairro Alto. We got off Tram 28E to see the Castelo de São Jorge (the castle), the Miradouro de Santa Luzia, and the Sé cathedral. We also took the Elevador da Glória funicular up to Chiado, to see the city from up there and to taste port wine at the Solar do Vinho do Porto. One day we took the metro to the Oriente station and visited the Parque das Nações. Nearly every afternoon we walked the plaza in Rossio, then down the Rua Augusta pedestrian street to the Praça do Comércio and back, with coffee and Pastéis de Nata along the way. We liked the area around Rossio a lot.

We took a day trip to Sintra by train from the Rossio train station and another half-day trip by train to Cascais (from the Cais do Sodré station) to enjoy the beach. In Sintra we saw the Pena palace and walked around the city. We also had a good lunch at the GSpot Gastronomia restaurant near Sintra train station.

There are lots of good restaurants in Lisbon. We ate meals at the Bom Jardim (slowly roasted chicken with piri piri sauce) and Cafe Tighelina nearby, and took the ferry across Rio Tejo to Cacilhas for a seafood lunch at Farol restaurant (shrimp in garlic sauce, Bacalhau à Farol platter -- bacalhau is salted cod, very popular in Portugal). We also ate at a small restaurant near the Castle called Claras em Castelo that was quite good.

Monday, March 19, 2012

Cybersecurity is hard; we need some help


Cybersecurity is so much harder than physical security! In physical security, you have a well-defined physical perimeter -- your doors, windows, gates, etc. -- to watch and protect, but in cybersecurity the perimeter is not well-defined. Think of cybersecurity for a typical home or business: you usually have a box -- the "router," perhaps a wireless one -- that connects your network of PCs and other devices to the Internet via your Internet Service Provider. Although there is a single physical connection to the Internet, the software applications on your PCs and devices make lots of network connections. If you think of each connection as a door, it's like trying to watch over and protect thousands of doors at once; on top of that, you need to check the packets of information coming in and going out of these "doors" -- like checking each visitor to a building, only the number of visitors is in the billions! To make matters worse, the software applications themselves -- Web browsers, office suites, PDF readers, etc. -- have their own weaknesses and can serve as gateways through which bad guys get access to your information... so that's even more "doors" to protect. Anyway, you get the idea: compared to physical security, cybersecurity is too difficult for us to tackle in a routine manner.

Does that mean we do nothing about cybersecurity? Of course not! We already try to do our best with antivirus software, firewalls, and the like, but to keep up with the ever-changing number and types of "doors" we have to watch over, we need some help from the information security companies.

First, we need a way to monitor the status of our cybersecurity, similar to the way guards monitor doors, fences, and gates through video cameras. In cybersecurity, though, someone needs to build a simple dashboard to show us how well our defenses are working against the torrent of potentially malicious packets coming through the cyber "doors" to our network.

Second, someone needs to build a consumer "cybersecurity appliance" -- I envision a box that sits between the router and the rest of your network, watching over all the network connections and doing whatever is needed to keep the internal network safe. Come to think of it, such a cybersecurity appliance could both monitor cybersecurity and provide protection.
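To make the monitoring idea concrete, here is a minimal sketch in Python of the kind of bookkeeping such an appliance or dashboard might do: count traffic per "door" and surface only the busy ones. All names and the threshold are invented for illustration; a real product would inspect actual packets.

```python
from collections import Counter

class ConnectionMonitor:
    """Toy model of the 'cybersecurity appliance' dashboard idea:
    treat each (port, direction) pair as one 'door', count packets
    through it, and flag doors whose traffic exceeds a threshold."""

    def __init__(self, alert_threshold=1000):
        self.alert_threshold = alert_threshold
        self.packets_per_door = Counter()

    def observe(self, port, direction):
        # Each observed packet bumps the count for its 'door'.
        self.packets_per_door[(port, direction)] += 1

    def dashboard(self):
        # Summarize which doors look busy enough to deserve a look.
        return {door: count
                for door, count in self.packets_per_door.items()
                if count > self.alert_threshold}

monitor = ConnectionMonitor(alert_threshold=2)
for _ in range(3):
    monitor.observe(443, "in")   # e.g., inbound HTTPS traffic
monitor.observe(22, "in")        # a single inbound SSH packet
print(monitor.dashboard())       # only the busy door shows up
```

The point of the sketch is that even a crude per-door tally turns billions of packets into a short, human-readable summary -- which is exactly what a dashboard is for.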

I hope someone takes up the challenge and builds us a “cybersecurity appliance” someday soon.


Here's some more information to help you...
You may find the following books useful:




Here's an old GAO report that's still quite relevant: 

Cybersecurity for Critical Infrastructure Protection

GAO-04-321, May 28, 2004

Here's a presentation on cybersecurity research and development (R&D) based on this report:

Tuesday, February 28, 2012

Assessing impact of using facial recognition technology



In a previous video I talked about the importance of assessing the impact of technology before adopting it. Today I want to talk to you about the impact of increased use of facial recognition technology. You may have already heard about the use of facial recognition technology to detect the mix of male and female customers at bars. Facial recognition technology is also used to identify people on the street or in a crowd.

Facial recognition is a form of biometrics -- the identification of people from features of their body such as fingerprints, iris patterns, or facial characteristics. No matter which biometric we pick, if you stop and think about it for a moment, you realize that your fingerprint or facial image does not come with your name stamped on it. Rather, somewhere in a computer system, your facial image and your name are linked together. That linkage between your name and your facial image must be correct to begin with, and the information must be stored securely.
Additionally, facial recognition systems are not error-free, so there is a potential for misidentification. As useful as facial recognition is, you need some protection for when the technology misidentifies you as someone else -- some way of correcting the mistake.
There is also the potential for someone to try to fool a facial recognition system by presenting, for example, a photograph or a video of you in place of your face. Something has to be done to detect such misuse and not accept a photograph as a substitute for a real face. This is not an easy problem, because a facial recognition system works precisely by identifying people from photographs or videos.
This means that if you are using facial recognition technology to identify customers or perhaps suspected criminals, you have to make sure that appropriate security precautions are in place to protect the linkage between a person’s facial image and their identity information and that the information is correct in the first place. You must also provide some sort of recourse when someone is misidentified.
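To illustrate the linkage point above, here is a tiny Python sketch of an enrollment database: the face template itself carries no name, the name-to-template link lives in the database, and there is an explicit correction path for when the link is wrong. All names, values, and the exact-hash "matcher" are invented for illustration.

```python
import hashlib

# Toy enrollment database: the link between a face template and an
# identity lives here, and it must be correct and kept secure.
enrollment_db = {}

def enroll(person_name, face_template: bytes):
    # Index by a hash of the template; store the claimed identity.
    key = hashlib.sha256(face_template).hexdigest()
    enrollment_db[key] = person_name
    return key

def identify(face_template: bytes):
    # Exact-hash lookup stands in for a real (fuzzy) face matcher.
    key = hashlib.sha256(face_template).hexdigest()
    return enrollment_db.get(key, "no match")

def correct_link(key, corrected_name):
    # The redress path: fix a wrong name-to-template link.
    enrollment_db[key] = corrected_name

key = enroll("E. Smith", b"template-bytes-for-emily")
print(identify(b"template-bytes-for-emily"))  # E. Smith
correct_link(key, "Emily Smith")              # fixing a recording error
print(identify(b"template-bytes-for-emily"))  # Emily Smith
```

The `correct_link` step is the part the post argues for: without a designed-in way to fix the linkage, a misrecorded or misidentified person has no recourse.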

Here's some more information to help you...
You may find my previous videos on technology impact assessment and biometric technology helpful:


NBTMV - On the need to assess impact of technology...

NBTMV on using biometric technologies to identify ...



Friday, February 17, 2012

How to lose weight - a story to motivate you


If you want to lose weight, here's a story to help and motivate you.

I always knew that good food with fewer calories plus exercise is important for managing your weight, but when I was young I did not follow this advice. My diet included lots of rice and very little exercise. So, over many years, bit by bit, I gained weight. When I was 44, my weight was over 200 pounds. Around this time, I had a stroke of good luck. My youngest daughter wanted to run in a 5K race and asked me to run with her. I agreed and started running on the treadmill to get ready for the race. Soon I was running five kilometers in less than 30 minutes and was so proud that I could run five kilometers!

At the same time, I saw a TV show where a doctor discussed how to improve your diet. He said it is important to eat plenty of vegetables, fish, and white meat. I changed my diet after watching that program, replacing the rice with green leafy vegetables. For breakfast and lunch I ate vegetable and chicken (or fish) soup. For dinner, I ate salads with fish, such as grilled salmon (but without any dressing). I made sure that I ate the same volume of food as before, so I was not hungry even though I was eating salads instead of rice. By the way, I also started drinking black tea instead of coffee with milk and sugar.

So, overall, I made two changes at the same time: I started exercising more and reduced my calorie intake (because the veggies had fewer calories than rice). The body responded by losing weight. Within a few weeks I lost over 30 pounds. I really liked this because I could see the results very soon. I could also stick with the program of diet and exercise because I was not hungry. I really felt better than I ever had before.

Even though I have continued this program since 2003, my weight has gone up some. Over the past eight years, I have gained about 8 pounds. But it's still a great way to lose weight and manage your health. When I wonder why the weight has gone up despite maintaining about the same diet and exercise, all I can think is that the body has a mind of its own and fights your attempts to lose weight.

If you are thinking about trying to lose weight, I recommend that you make two changes in your life: start an exercise program and change your diet by replacing carbohydrates with veggies. Stop eating so much rice or pasta and eat lots of salads with fish (or chicken), but skip the dressing. Remember that to succeed you should eat the same amount of food as before making these changes, because it is very important not to be hungry. Good luck!


Here's some more information to help you...
You may find my previous video on weight loss helpful:
To lose weight, think of the human body as a syste...

How to Lose Weight (Cómo perder peso)


If you want to lose weight, here is a story to help and motivate you.

I always knew that good food with fewer calories and exercise were important for managing your weight, but when I was young I did not follow this advice. I ate a lot of rice and got little exercise. So, over many years, little by little, I gained weight. When I was 44, I weighed more than 200 pounds. Around that time, it was a stroke of luck that my youngest daughter wanted to run a five-kilometer race and asked me to accompany her. I agreed to run with my daughter and started running on the treadmill to get ready. Very soon I was running five kilometers in less than 30 minutes. I was proud to be able to run that fast.

At the same time, I saw a television program where a doctor discussed how to improve your diet. He said it is important to eat plenty of vegetables, fish, and white meat. I decided to change my diet after watching that program. I replaced the rice in my diet with green leafy vegetables. For breakfast and lunch I ate vegetable and chicken soup. For dinner, I ate vegetables with fish, such as grilled salmon -- like a salad without dressing. The food had the same volume as before I changed my diet, so I was not hungry even though I was eating vegetables instead of rice.

So I made two changes at the same time: exercising more and reducing the calories in my meals. The body responded by losing weight. Within a few weeks I lost more than 30 pounds. I liked this a lot because I could see the results in a short time. I could keep up the program of diet and exercise because I was not hungry. I truly felt better than before.

Although I have kept up this program since 2003, my weight has gone up a little. Over the last eight years it has increased by 7 or 8 pounds. Even so, this has been a good method for losing weight and managing your health. When I think about why the weight goes up despite the same diet and exercise, I believe the body has a mind of its own and fights against losing weight.

If you have not yet tried to manage your weight, I recommend that you make two changes in your life: start an exercise program and, at the same time, change your diet by cutting back on carbohydrates. Stop eating so much rice or tortillas and eat plenty of salads with fish (or chicken), but without dressing. Remember that to succeed you should eat the same volume of food as you ate before making these changes, because it is very important not to be hungry.

Thursday, February 9, 2012

NBTMV on using biometric technologies to identify people



Biometric technologies use specific features of your body, such as your fingerprint, facial image, iris pattern, or even the geometry of your hand, to identify you. Assessing the impact of using biometrics depends on the exact type of use. If your laptop has a fingerprint reader where you swipe your finger to log in, then all we may need to worry about is the possibility that the system somehow cannot match your fingerprint with what was previously stored, in which case there should be some alternate method for logging into the laptop.

On the other hand, if biometric technology such as facial recognition is used to identify a person from among millions in a database, then we have to think about many more things.

For starters, you know that your fingerprint, facial image, or iris pattern does not come with your name stamped on it, which means that some computer system has to store your identifying information along with your biometric features. That association must be correct when it's initially stored (meaning that your biometrics are linked to the right identifying information), and the information must be securely stored and maintained.

Next, we have to consider the potential of errors in the system that compares and matches your biometric features such as your fingerprint or your facial image with information stored in a database. There is a possibility of misidentification where someone else’s biometric may be identified as yours or vice versa. This means that we need to provide some redress mechanism for anyone who is misidentified. This is especially important if the biometric is being used to look for a criminal, for example, and you are mistakenly tagged as that criminal.
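The misidentification problem comes from the fact that biometric matching is a similarity score compared against a threshold, not an exact lookup. Here is a deliberately simplified Python sketch of that idea; the feature vectors, similarity function, and threshold are all invented for illustration, and real systems use far richer features.

```python
def match_score(probe, stored):
    # Hypothetical similarity: fraction of positions where two
    # feature vectors agree (a stand-in for real face features).
    return sum(p == s for p, s in zip(probe, stored)) / len(probe)

THRESHOLD = 0.8  # an assumed decision threshold

database = {
    "alice": [1, 0, 1, 1, 0],
    "bob":   [1, 1, 1, 1, 0],
}
probe = [1, 1, 1, 1, 1]  # features captured at a checkpoint

# Everyone whose score clears the threshold counts as a 'match'.
hits = [name for name, feats in database.items()
        if match_score(probe, feats) >= THRESHOLD]
print(hits)  # only 'bob' clears the 0.8 threshold here
```

Note the trade-off: lowering `THRESHOLD` to 0.6 would also pull in "alice" as a match. Whatever threshold is chosen, some false matches are possible, which is why a redress mechanism for misidentified people is essential.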

Another potential problem arises when someone intentionally tries to fool the system by presenting, for example, a photograph of you in place of your face.

These are some of the issues that we need to consider when assessing the impact of using biometric technologies to identify people.

Here's some more information to help you...
In 2002, GAO published the following report assessing the use of biometric technologies for improving border security:
GAO-03-174, Nov 15, 2002

There are a number of books on biometric technologies as well:

Thursday, January 19, 2012

Cloud security from a user’s point of view



As more and more of us begin to rely on cloud services, security is definitely one of the topmost concerns on our minds. From the user's perspective, whether the user is an individual or an organization, cloud security translates to protecting the information you store in the cloud. You want the information to be available when you want it, not shared with anyone you don't want to share it with, and not changed by anybody without your permission or knowledge.
When you use cloud services the cloud provider has a lot of responsibilities for security, but you also share the responsibility of making sure that your login information is secure and you use any security features that are available to you. For example, if the cloud provider offers 2-step verification, as Google does, then you must make sure that you turn on this extra security feature.
As a user, you know that the same cloud services are used by many other users, so you definitely want your cloud service to offer good walls between you and your neighbors so that information cannot maliciously or inadvertently be shared with your neighbors. You’d also want the cloud provider to back up your information and protect it from loss due to technical or natural disasters.
In a nutshell, cloud security is a shared responsibility: you take care of using all the security features available to you for the login process, and the provider carries a much larger burden of protecting your information from other users as well as from outsiders. In particular, you'd expect the cloud provider to employ the best security personnel available, follow the latest security procedures, and adopt the latest technology to keep users separate from each other and to keep outside attackers away from your information. From my perspective, this is what users expect from cloud security.
Here's some more information to help you...
For my earlier videos on cloud security, please see:



Saturday, January 14, 2012

Cloud Computing from the perspective of IT Convergence

(Rules are meant to be broken.... so here's a video that's not 3 minutes, but almost 30 minutes long... for you to peruse when you have some time :-)

Here are the slides to follow along as you watch the video or just browse and read the transcript of the talk...
Good afternoon!  
First, I wanted to thank our host — the Korea Information Society Development Institute —for inviting me to deliver this session keynote and moderate a session at this conference. I am honored to have this opportunity to address this distinguished audience.

In the next half an hour, I hope to provide a unique perspective on some innovative information technology (IT) solutions for businesses — solutions that are enabled, in general, by “convergence”, and specifically, by the convergence of IT and the network infrastructure.

The uniqueness of my perspective comes from the over 30 years of experience I have had in the ICT sector and from my U.S. government policy work -- for the past 13 years I have been a Technologist at the U.S. Government Accountability Office, or GAO, where we review IT systems and solutions at U.S. government agencies. These government IT solutions are now beginning to take advantage of the convergence that is the focus of this conference, so I hope this perspective will be helpful to you as you think about future directions for IT.


When we talk about “digital convergence,” we are referring to what Nicholas Negroponte of MIT’s Media Lab called the transformation of "atoms to bits," the conversion of everything from voice, video, TV, etc. into digital information flow across platforms on the Internet or any IP—Internet Protocol—network. The network includes IT systems of varying sizes and functions from network devices to back-end servers that store and process the digital information.


Nowadays we see the results of this convergence in our daily lives — particularly in smartphones or other smart devices that you can use to browse the web, make phone calls, take pictures, shoot video, provide a Wi-Fi hotspot for other devices to connect to the Internet, and much more — these are the technologies that are moving all of us towards ubiquitous computing and connectivity, where information processing is integrated into everyday objects and activities.


Let’s start by taking stock of what facilitated and continues to drive this convergence.
 

To understand what facilitated this convergence, we have to look at some recent history. The idea is to learn from history and shape the future for continued innovations in IT convergence.


Let’s begin in the 1990s, when Sun Microsystems trademarked the phrase: "The Network Is the Computer."  That’s as good a point in time as any, to think of as the beginning of the convergence.
When Sun coined that phrase, the conditions were just right for computing and networking technologies to make that claim -- that the network is a force-multiplier for computing. Computing power had been growing exponentially since the first microprocessors appeared in the late 1970s -- and continues to do so -- following the trend described by Moore's Law.

And with the Internet we had the ability to connect TCP/IP networks to one another. Best of all, the TCP/IP networks were architected using a "layered model," based on the Open Systems Interconnection (OSI) model that dates back to the late 1970s.
The layered model is a conceptual breakthrough -- in this model, information always flows from one layer to the next. For example, when an application sends data to another application, the data go down through the layers at the transmitting end and then up from the Physical layer to the Application layer at the receiving end. And each layer has its own set of protocols for handling and formatting the data.

The benefit of the layered model is that each layer takes care of only its specific task, leaving the rest to the other layers. The layers can mix and match. That’s why TCP/IP networks can work over any type of physical network medium, from Ethernet to radio waves (in a wireless network). Also, each layer can be implemented in different modules. For example, typically the transport and network layers already exist as part of the operating system, and any application can make use of these layers without having to include them in the application. Each layer can also innovate and change independently of other layers.
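The down-and-up flow described above can be sketched in a few lines of Python. This is only a toy model of layered encapsulation -- the layer names are the usual ones, but the "headers" are just string tags, not real protocol headers.

```python
# Toy model of layered encapsulation: each layer wraps the data with
# its own header on the way down and strips it on the way up.
LAYERS = ["application", "transport", "network", "link"]

def send(data):
    for layer in LAYERS:              # going down the stack
        data = f"[{layer}]{data}"     # each layer adds its header
    return data                       # what goes out on the wire

def receive(frame):
    for layer in reversed(LAYERS):    # going up the stack
        header = f"[{layer}]"
        assert frame.startswith(header), f"malformed {layer} header"
        frame = frame[len(header):]   # each layer strips its header
    return frame                      # original data, delivered

wire = send("hello")
print(wire)           # the link-layer frame wraps everything else
print(receive(wire))  # hello
```

Because each layer only adds and removes its own header, any layer's implementation can be swapped out without touching the others -- which is precisely the mix-and-match property described above.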
The bottom line is that the layered networking model greatly facilitated interconnections among different computer systems and software applications that could easily communicate with one another. And, as I will explain a little later, the layered model comes up in other contexts as well, for example, in another key driver of convergence—virtualization. The layered model also comes up in discussions of telecom regulations such as those relating to “network neutrality.”

So we have computing power and a versatile network architecture as drivers, but another key driver was the emergence of the World Wide Web in the mid-1990s, which, through Web servers, Web browsers, and standards such as HTML and HTTP, enabled delivery of information from anywhere on the Internet to any device.


Last, but not least, in the U.S. there was an additional driver -- a regulatory one -- the Telecommunications Act of 1996, the first major overhaul of telecommunications law in the United States in almost 62 years. In a nutshell, the goal of the 1996 Act was to let anyone enter any telecommunications business and to let any telecommunications business compete in any market against any other. The 1996 Act also directed the Federal Communications Commission (FCC) to encourage the deployment of advanced telecommunications capability, which includes broadband, to all Americans. The FCC classified broadband Internet access (regardless of platform) as an information service -- a classification that reduced regulatory requirements.

The 1996 Act was meant to foster “intramodal” competition — among companies that used the same underlying technology such as local and long distance wireline carriers and new competitive local exchange carriers, all of which were offering voice services over circuit-switched networks.


What the 1996 telecom act did not foresee was the “intermodal” competition that came about—for example, wireless service competing with both local and long distance wireline service, VoIP competing with wireline and wireless telephony, IP video competing with cable television. In any case, in the U.S., we now have intermodal competition between bundles of services (voice, Internet, and TV) provided by either telecom or cable TV service. The end result has been growth in broadband Internet access to U.S. homes, primarily through wireline services.

All in all, the technological factors—exponential growth in computing power at lower cost, universal TCP/IP network connectivity afforded by the Internet—combined with the U.S. telecom regulations enabling competition, and attendant growth in broadband Internet access, fueled the growth of Internet businesses and digital convergence in the U.S.

Many well-known Internet companies representing different business models began during these early years of digital convergence, for example—online commerce: Amazon.com (1994), online auction: eBay (1995), search engines: Yahoo! (1995), Google (1998), and, more recently, social networking: Facebook (2004) and many more…

By now, these business models are well-known and most businesses know to set up an online presence and pay attention to things like “search engine optimization (SEO)” — the process of improving the visibility of a web site in search engines, which is an Internet market strategy.

But recent developments in IT and network convergence take us beyond the routine Internet strategies to new business solutions.

Many service providers have now built large data centers that, coupled with virtualization, give rise to the convergence of IT and network infrastructure. And this brings us to the concept of computing in the cloud, or "cloud computing," where the term "cloud" refers to the Internet, because we typically depict the Internet as a cloud in network diagrams.

Before going any further, let me bring in the layered model again to first explain the concept of virtualization.


Virtualization refers to the creation of multiple virtual machines that can run on a single physical system. In this case, the layered model starts with the physical hardware and peripheral devices -- processor, memory, disk drive, and network interface -- on top of which we have a virtual representation of these hardware devices. In the next higher layer, we have the virtual machines, each with its own internal layers of operating system and applications. And each operating system can differ from the others, so we end up with one physical server supporting multiple virtual machines of different types.
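As a rough sketch of the idea, here is a toy Python model of one physical host carving its resources into several virtual machines, each with its own operating system. The class, resource numbers, and VM names are all invented for illustration; a real hypervisor does vastly more.

```python
# Toy model of a hypervisor handing each VM a slice of one
# physical server's resources.
class Host:
    def __init__(self, cpus, ram_gb):
        self.free_cpus, self.free_ram = cpus, ram_gb
        self.vms = []

    def launch_vm(self, name, os, cpus, ram_gb):
        # Each VM gets a virtual slice of the physical hardware,
        # and each can run a different operating system.
        if cpus > self.free_cpus or ram_gb > self.free_ram:
            raise RuntimeError("not enough physical resources left")
        self.free_cpus -= cpus
        self.free_ram -= ram_gb
        self.vms.append({"name": name, "os": os, "cpus": cpus})

host = Host(cpus=16, ram_gb=64)
host.launch_vm("web",  os="Linux",   cpus=4, ram_gb=8)
host.launch_vm("mail", os="Windows", cpus=4, ram_gb=16)
print(len(host.vms), host.free_cpus, host.free_ram)  # 2 8 40
```

The one-host-many-VMs picture is what lets data-center operators pack many customers' workloads onto shared hardware -- the foundation of the cloud services discussed next.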

The combination of virtualization with the large data centers at the Internet core gives us the latest trend of "cloud computing" -- where a business pays a service provider to deliver IT applications, computing power, and storage via the Internet. This enables businesses to access and share computing resources and potentially gain significant cost savings. Businesses with heavy demands for computing power, such as banks, can easily acquire huge processing capabilities on demand by processing data across large groups of servers.

Cloud computing has three commonly accepted service models: infrastructure as a service, platform as a service, and software as a service (see figure on slide).


Notice the boxes in the drawing — the different models depend on that box showing what the cloud computing vendor offers versus what the user (business) provides.
Infrastructure as a service provides various infrastructure components such as hardware, storage, and other fundamental computing resources.
Platform as a service provides a service that runs over an underlying infrastructure. A platform vendor offers a ready-to-use platform, such as an operating system like Microsoft Windows or Linux, which runs on vendor-provided infrastructure. Customers can build applications on a platform using application development frameworks, middleware capabilities, and functions such as databases.
Software as a service runs on an underlying platform and infrastructure managed by the vendor and provides a self-contained operating environment used to deliver a complete application such as Web-based e-mail and related management capabilities.
A side note: although cloud computing has gained popularity recently, the concept of "software as a service" is not new; for example, Salesforce.com has been offering its Customer Relationship Management (CRM) application as a service since 1999.
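The division of responsibility among the three service models can be made concrete with a small sketch. The layer names and the vendor/customer split below are illustrative simplifications of the descriptions above, not any provider's actual terms.

```python
# Which layers of the stack the vendor manages under each cloud
# service model, and which the customer is left to provide.

LAYERS = ["infrastructure", "platform", "application"]

# vendor-managed layers per service model (illustrative)
VENDOR_MANAGED = {
    "IaaS": {"infrastructure"},
    "PaaS": {"infrastructure", "platform"},
    "SaaS": {"infrastructure", "platform", "application"},
}

def responsibilities(model):
    """Return (vendor_layers, customer_layers) for a service model."""
    vendor = VENDOR_MANAGED[model]
    customer = [layer for layer in LAYERS if layer not in vendor]
    return sorted(vendor), customer

for model in ("IaaS", "PaaS", "SaaS"):
    vendor, customer = responsibilities(model)
    print(f"{model}: vendor manages {vendor}, customer provides {customer}")
```

Running it shows the progression in the figure: moving from IaaS to SaaS, the vendor's box grows and the customer's shrinks, until under SaaS the customer provides nothing but their data and users.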
In addition to the service models that describe what can be provided, there are four deployment models that relate to how the cloud service is implemented. These four cloud models are private, community, public, and hybrid (see figure on slide).
In a private cloud, the service is set up specifically for one organization, although there may be multiple customers within that organization, and the cloud may exist on or off the premises.
In a community cloud, the service is set up for related organizations that have similar requirements.
A public cloud is available to any paying customer and is owned and operated by the service provider.
A hybrid cloud combines two or more of the other deployment models.
U.S. government agencies are beginning to consider and, in some cases, use cloud computing services. In 2010, about half of the 24 major U.S. government agencies reported using some form of cloud computing for infrastructure, platform, or software services.

Examples:
The Defense Department's Rapid Access Computing Environment (RACE) program (2008) provides "platform as a service" to support the department's systems development efforts within a private cloud.

NASA’s Nebula is an open-source cloud computing project that provides an “infrastructure as a service” implementation for scientific data and Web-based applications, with everything housed in a standard shipping container that is mounted in place but could be transported if needed.
Department of Transportation’s CARS program used a public cloud for part of its system — this was a program that allowed owners of certain less fuel-efficient vehicles to receive a credit for trading in a vehicle and purchasing or leasing a new, more fuel-efficient vehicle (this was part of the recent stimulus program).
Also, Google is offering Google Apps for Government where the data generated by the government's use of Gmail and calendaring applications will be segregated from everybody else's “cloud-based” data on servers located in the continental U.S.
So, to reiterate, the convergence of IT with the network infrastructure brings large-scale computing and storage capabilities that can now be delivered over the Internet, giving us cloud computing. This has the potential to make computing a service similar to other public utilities such as electricity, water, and gas.
Cloud computing, whether in a private cloud or in the public Internet, provides new IT solutions for businesses.
Here’s an example:
Businesses can run applications on servers located remotely on the Internet; this is an example of “software as a service.”
There are some good reasons why businesses may use applications in the cloud:
* Applications in the cloud are always ready to go. You can simply sign up for a service (such as email or office productivity applications) and begin using it without the time, cost, and management expense of setting up and running your own servers and applications.
* You can use the applications from anywhere. Cloud applications are typically Web-based, so you are not tied to a specific computer. You can run the applications at your office, at home, from your laptop while on the road, and even from a Web-enabled smartphone, all with full access to your data and resources.
* You get the benefits of economies of scale because cloud computing resources are shared among many users. You get to use the services of large data centers that have multiple Internet connections, backup electric power, security, and redundancy. Virtualization enables these service providers to build massive computing infrastructures with a low cost per CPU cycle, and software developers can share applications among millions of users, lowering the per-user cost. For example, a CRM application could easily cost many thousands of dollars a month to purchase, install, and maintain, but anyone can sign up for Salesforce.com for $125 per month.
* You get the benefit of the fault tolerance and redundancy that service providers build into their systems, something most small businesses will never be able to match. Cloud computing is often more reliable than the networks run by small companies that cannot afford redundant electric power, hot backup systems, specialized management software, and expertise in data center operations.
* You can easily add computing resources in a cloud model. As your company adds employees, you can purchase more computing resources from the cloud provider. Similarly, when resources are no longer needed, you can just as easily reduce the user or server count and the resulting monthly expense.
* You can pay for cloud services as you go. Typically you buy cloud services via monthly payments instead of a large up-front payment, which enables growing companies to conserve cash for other investments and for growing their business operations.
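The pay-as-you-go point above lends itself to a back-of-the-envelope comparison. All dollar figures in this sketch are illustrative assumptions, not quotes from any vendor.

```python
# Rough cost comparison: self-hosted (large up-front capital outlay
# plus ongoing maintenance) versus a per-user monthly cloud subscription.

def self_hosted_cost(months, upfront=50_000, monthly_ops=2_000):
    # one large capital purchase, then ongoing operations cost
    return upfront + monthly_ops * months

def cloud_cost(months, users, per_user_monthly=125):
    # no capital outlay; cost scales with user count and time
    return per_user_monthly * users * months

for months in (12, 36, 60):
    print(months, self_hosted_cost(months), cloud_cost(months, users=10))
```

Under these assumed numbers, a small 10-user shop pays far less in the cloud and, just as importantly, pays nothing up front, which is the cash-conservation argument made above.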
What are some of the issues with cloud computing? In our work with U.S. government agencies, we identified information security as an area of concern. We found that cloud computing can both increase and decrease the security of information systems in federal agencies. Potential information security benefits come from the use of virtualization and automation, which can enable rapid implementation of secure configurations for virtual machines.
However, the use of cloud computing can also create information security risks for federal agencies. Specifically, 22 of 24 major U.S. federal agencies reported that they are either concerned or very concerned about the potential information security risks associated with cloud computing. These concerns include risks related to dependence on the vendor and concerns related to sharing computing resources. Agencies also identified challenges and concerns in implementing existing federal information security laws and guidance. These concerns include limitations on their ability to conduct independent audits and assessments of security controls of cloud computing service providers and concerns related to the division of information security responsibilities between customer and vendor.
We expect that specific U.S. government agencies will develop guidance related to cloud computing security to help other agencies.
Cloud computing is a back-end IT solution for new businesses, but the innovation in convergence is most visible at the user/customer end. Wireless connectivity has been the driver for this. Wireless activity continues to grow rapidly, moving us towards the ubiquitous computing that has long been predicted. Based on industry data, as of December 2009, the wireless penetration rate was 91 percent in the United States.
Additionally, the number of adults living in U.S. households with only wireless telephone service has increased from less than 5 percent in 2003 to nearly 23 percent in 2009. According to one study of wireless use, wireless connections in California now exceed the combined connections of both wireline and broadband services.
The implication for a business is that many of your customers and clients now have powerful devices that enable them to access your services, which may be hosted in the cloud. Smartphones such as the Samsung Galaxy S and Apple iPhone support applications that can be tailored to work with specific back-end services, so businesses can gain an advantage by providing smartphone apps to their customers.
In the few minutes I have left, I’d like to close with the recent focus on telecom regulation and convergence in the U.S.
In the United States, Congressional policymakers think that additional regulations are needed to address the changing telecom environment. Because of the growing convergence in the telecom sector, many policymakers consider it necessary to "rewrite," or revise the laws governing these markets.
Whether regulators should play a role to ensure that the Internet remains open to all, often referred to as "open access" or "net neutrality," has also become part of the dialogue.
As a segue at this point, I’d like to point out that the layered network model comes into play in discussions of net neutrality — for example, advocates of net neutrality point out that trying to affect the physical transport of bits based on application type violates the separation of layers in the network model.
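The layering argument can be made concrete with a toy example. The packet fields and function names below are hypothetical, invented purely to illustrate the point.

```python
# A neutral forwarder consults only network-layer fields; a
# non-neutral one reaches up into the application layer, which is
# the layer violation that neutrality advocates object to.

def forward_neutral(packet):
    # decision uses only the network-layer destination address
    return packet["dst_address"]

def forward_non_neutral(packet):
    # peeking at the application type to throttle certain traffic
    # crosses the layer boundary
    if packet["app_type"] == "video":
        return None  # dropped or throttled
    return packet["dst_address"]

pkt = {"dst_address": "192.0.2.1", "app_type": "video"}
print(forward_neutral(pkt))
print(forward_non_neutral(pkt))
```

The same packet is forwarded by the neutral function but throttled by the non-neutral one, solely because of what application generated it.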
Recent events in the net neutrality debate: the FCC’s ruling against Comcast was overturned in court, and Google and Verizon proposed a legislative framework that calls for neutrality on wired broadband but excludes wireless for now.

Spectrum and National Broadband Plan
Another crucial component in the telecommunications policy debate is the allocation and regulation of radio-frequency spectrum. The private sector has a growing appetite for wireless communications services, such as high-speed Internet access and digital television broadcasts, that require a lot of spectrum. The public sector also requires spectrum for a number of uses, including voice and data support for emergency communications. The challenge is to meet both public- and private-sector needs for usable spectrum.
The FCC auctions off licenses for wireless spectrum, but the challenge is to find open bands of spectrum.
In March 2010, the FCC published the National Broadband Plan (http://www.broadband.gov/plan/), which outlines policies and actions intended to ensure that everyone in the United States has high-speed Internet access. The plan’s long-term goals include ensuring that at least 100 million U.S. homes have access to affordable broadband services with speeds of 100 megabits per second and enabling citizens to use broadband services to track energy consumption.
One of the linchpins of the National Broadband Plan is reallocating, or crafting new sharing arrangements for, a large amount of spectrum currently designated for use by federal agencies and commercial services, which are licensed by the National Telecommunications and Information Administration (NTIA) and the FCC, respectively. In a July 2010 memorandum, President Obama directed the FCC and NTIA to complete, by Oct 1, 2010, a specific plan and timetable for identifying 500 megahertz of spectrum that could be used for wireless broadband services over the next 10 years.
A recent development is the FCC’s Sep 24, 2010 announcement approving the use of unused airwaves in the broadcast TV spectrum for unlicensed mobile broadband operations. These “white spaces” were freed up when the U.S. transitioned from analog to digital TV broadcasting; they will be open to all users and do not require a license.
The expectation is that the new spectrum will be used in new consumer devices with wireless capabilities that have both longer range and greater bandwidth than current Wi-Fi solutions. The technology has been referred to as “super Wi-Fi” because of its improved bandwidth and its ability to penetrate buildings more easily. This is the biggest block of spectrum freed up by the FCC in the last two decades.
So I’ll leave you with the note that we have some interesting developments, and some uncertainties, with potential new laws in the U.S.

Bottom line: we have made progress, but we’re not there yet. Witness the broadband penetration chart for the top few countries…

So the journey continues…

Added July 2013: Cloud computing trends

• Mobile computing – bring your own device (BYOD) or provided by employer

• Broadband wireless connectivity – cloud computing just won’t be possible without network connectivity

• “Apps” to perform specific tasks: Employees, customers, and clients now have powerful devices that enable them to access services hosted in the cloud. Smartphones support applications (“apps”) that can be tailored to work with specific back-end services, so businesses can gain an advantage by providing smartphone apps to their employees and customers.

• Big data analytics – businesses collect large amounts of data in a central location in the cloud, which makes it amenable to applying analytics on that “big data” to gain insights

• More sensors – the Samsung Galaxy S4 comes with 9 sensors (gesture, proximity, gyro, accelerometer, geomagnetic, light, cover open/close, temperature and humidity, and barometer), and there are apps that use these sensors; AT&T Digital Life offers home automation through smartphone apps…

• Internet of Things (IoT) – objects (sensors) on the Internet accessible from anywhere

• Everything “as a service” (aaS): “data as a service,” “desktop as a service” (outsourcing VDI, Virtual Desktop Infrastructure), “business process as a service” (BPaaS) (for example, payroll, printing, e-commerce, etc.)