Thursday, January 19, 2012

Cloud security from a user’s point of view



As more and more of us begin to rely on cloud services, security is definitely one of the topmost concerns on our minds. From the user’s perspective, whether it’s an individual or an organization, cloud security translates to protecting the information that you’re storing on the cloud. You want the information to be available when you want it, you don’t want it shared with anyone you haven’t authorized, and you want to make sure that nobody changes that information without your permission or knowledge.
When you use cloud services, the cloud provider carries much of the responsibility for security, but you share that responsibility: you must keep your login information secure and use any security features that are available to you. For example, if the cloud provider offers 2-step verification, as Google does, then make sure you turn on this extra security feature.
As a user, you know that the same cloud services are used by many other users, so you definitely want your cloud service to offer good walls between you and your neighbors so that information cannot maliciously or inadvertently be shared with your neighbors. You’d also want the cloud provider to back up your information and protect it from loss due to technical or natural disasters.
In a nutshell, cloud security is a shared responsibility: you take care of using all the security features available to you for the login process, and the provider carries a much larger burden of protecting your information from other users as well as from outsiders. In particular, you’d expect the cloud provider to employ the best security personnel available, follow the latest security procedures, and adopt the latest technology to keep users separate from each other and to keep outside attackers away from your information. From my perspective, that is what users expect from cloud security.
Here's some more information to help you...
For my earlier videos on cloud security, please see:



Saturday, January 14, 2012

Cloud Computing from the perspective of IT Convergence

(Rules are meant to be broken.... so here's a video that's not 3 minutes, but almost 30 minutes long... for you to peruse when you have some time :-)

Here are the slides to follow along as you watch the video or just browse and read the transcript of the talk...
Good afternoon!  
First, I wanted to thank our host — the Korea Information Society Development Institute — for inviting me to deliver this session keynote and moderate a session at this conference. I am honored to have this opportunity to address this distinguished audience.

In the next half an hour, I hope to provide a unique perspective on some innovative information technology (IT) solutions for businesses — solutions that are enabled, in general, by “convergence”, and specifically, by the convergence of IT and the network infrastructure.

The uniqueness of my perspective comes from the more than 30 years of experience I have had in the ICT sector and from my U.S. government policy work — having spent the past 13 years as a Technologist at the U.S. Government Accountability Office, or GAO, where we review IT systems and solutions at U.S. government agencies. These government IT solutions are now beginning to take advantage of the convergence that is the focus of this conference, so I hope this perspective will be helpful to you as you think about future directions for IT.


When we talk about “digital convergence,” we are referring to what Nicholas Negroponte of MIT’s Media Lab called the transformation of "atoms to bits": the conversion of everything from voice to video to TV into digital information that flows across platforms on the Internet or any IP (Internet Protocol) network. The network includes IT systems of varying sizes and functions, from network devices to back-end servers that store and process the digital information.


Nowadays we see the results of this convergence in our daily lives — particularly in smartphones and other smart devices that you can use to browse the web, make phone calls, take pictures, shoot video, provide a Wi-Fi hotspot for other devices to connect to the Internet, and much more. These are the technologies that are moving all of us towards ubiquitous computing and connectivity, where information processing is integrated into everyday objects and activities.


Let’s start by taking stock of what facilitated and continues to drive this convergence.
 

To understand what facilitated this convergence, we have to look at some recent history. The idea is to learn from history and shape the future for continued innovations in IT convergence.


Let’s begin in the 1990s, when Sun Microsystems trademarked the phrase "The Network Is the Computer."  That’s as good a point in time as any to mark the beginning of the convergence.
When Sun coined that phrase, the conditions were just right for computing and networking technologies to make that claim — that the network is a force-multiplier for computing. Computing power had been growing exponentially since the first microprocessors appeared in the early 1970s — and continues to do so — following the trend described by Moore’s Law.

And with the Internet we had the ability to connect TCP/IP networks to one another. Best of all, TCP/IP networks were architected using a “layered model,” similar in spirit to the Open Systems Interconnection (OSI) reference model that dates back to the late 1970s.
The layered model is a conceptual breakthrough — in this model, information always flows from one layer to the next. For example, when an application sends data to another application, the data go down through the layers at the transmitting end and then up from the Physical layer to the Application layer at the receiving end. And each layer has its own set of protocols for handling and formatting the data.

The benefit of the layered model is that each layer takes care of only its specific task, leaving the rest to the other layers. Layers can be mixed and matched. That’s why TCP/IP networks can work over any type of physical network medium, from Ethernet to radio waves (in a wireless network). Also, each layer can be implemented in a separate module. For example, the transport and network layers typically already exist as part of the operating system, and any application can make use of these layers without having to include them in the application. Each layer can also innovate and change independently of the other layers.
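To make the layering concrete, here is a minimal sketch in Python (my own illustration, not something from the original talk): the application works only at the top of the stack, the operating system supplies the transport (TCP) and network (IP) layers underneath, and the physical medium can be Ethernet or Wi-Fi without the application changing at all.

```python
# A minimal sketch (not from the talk): the application asks the operating
# system for a TCP connection and never has to deal with IP packets or the
# physical medium; those are handled by the lower layers.
import socket

# Transport (TCP) and network (IP) layers are provided by the OS.
with socket.create_connection(("example.com", 80), timeout=10) as conn:
    # Application layer: a bare-bones HTTP request over that connection.
    conn.sendall(b"GET / HTTP/1.0\r\nHost: example.com\r\n\r\n")
    reply = conn.recv(4096)
    print(reply.split(b"\r\n", 1)[0])  # the HTTP status line
```

Whether this runs over an office Ethernet or a wireless hotspot, the code stays the same; that independence is exactly what the mix-and-match layering buys us.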
The bottom line is that the layered networking model greatly facilitated interconnections among different computer systems and software applications that could easily communicate with one another. And, as I will explain a little later, the layered model comes up in other contexts as well, for example, in another key driver of convergence—virtualization. The layered model also comes up in discussions of telecom regulations such as those relating to “network neutrality.”

So we have computing power and a versatile network architecture as drivers, but another key driver was the emergence of the World Wide Web in the mid-1990s, which, through Web servers, Web browsers, and standards such as HTML and HTTP, enabled delivery of information from anywhere on the Internet to any device.


Last, but not least, in the U.S. there was an additional driver — a regulatory one — the Telecommunications Act of 1996, the first major overhaul of telecommunications law in the United States in 62 years. In a nutshell, the goal of the 1996 Act was to let anyone enter any telecommunications business and to let any telecommunications business compete in any market against any other. The 1996 Act also directed the Federal Communications Commission (FCC) to encourage the deployment of advanced telecommunications capability, which includes broadband, to all Americans. The FCC classified broadband Internet access (regardless of the platform) as an information service — a classification that reduced regulatory requirements.

The 1996 Act was meant to foster “intramodal” competition — among companies that used the same underlying technology such as local and long distance wireline carriers and new competitive local exchange carriers, all of which were offering voice services over circuit-switched networks.


What the 1996 telecom act did not foresee was the “intermodal” competition that came about—for example, wireless service competing with both local and long distance wireline service, VoIP competing with wireline and wireless telephony, IP video competing with cable television. In any case, in the U.S., we now have intermodal competition between bundles of services (voice, Internet, and TV) provided by either telecom or cable TV service. The end result has been growth in broadband Internet access to U.S. homes, primarily through wireline services.

All in all, the technological factors—exponential growth in computing power at lower cost, universal TCP/IP network connectivity afforded by the Internet—combined with the U.S. telecom regulations enabling competition, and attendant growth in broadband Internet access, fueled the growth of Internet businesses and digital convergence in the U.S.

Many well-known Internet companies representing different business models began during these early years of digital convergence, for example—online commerce: Amazon.com (1994), online auction: eBay (1995), search engines: Yahoo! (1995), Google (1998), and, more recently, social networking: Facebook (2004) and many more…

By now, these business models are well known, and most businesses know to set up an online presence and pay attention to things like “search engine optimization (SEO)” — the process of improving the visibility of a web site in search engines — as part of their Internet marketing strategy.

But recent developments in IT and network convergence take us beyond the routine Internet strategies to new business solutions.

Many service providers have now built large data centers that, coupled with virtualization, give rise to the convergence of IT and network infrastructure. And this brings us the concept of computing in the cloud, or “cloud computing,” where the term “cloud” refers to the Internet because we typically depict the Internet as a cloud in network diagrams.

Before going any further, let me bring in the layered model again to first explain the concept of virtualization.


Virtualization refers to the creation of multiple virtual machines that can run on a single physical system. In this case, the layered model starts with the physical hardware and peripheral devices — processor, memory, disk drive, and network interface — on top of which we have a virtual representation of these hardware devices. In the next higher layer, we have the virtual machines, each with its own internal layers of an operating system and multiple applications. And the operating systems can differ from one another. So we end up with one physical server supporting multiple virtual machines of different types.
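As a small, hedged illustration (my own sketch, not part of the talk), here is how you might list the guests running on one physical host in Python. It assumes the libvirt Python bindings and a local KVM/QEMU hypervisor, and it simply reports each virtual machine's name, memory, and virtual CPU count.

```python
# A small sketch (my illustration, not part of the talk): list the guests
# running on one physical host. Assumes the libvirt Python bindings and a
# local KVM/QEMU hypervisor; other hypervisors use a different URI.
import libvirt

conn = libvirt.openReadOnly("qemu:///system")
for dom_id in conn.listDomainsID():
    dom = conn.lookupByID(dom_id)
    # info() returns [state, max memory (KiB), memory (KiB), vCPUs, CPU time]
    _state, _max_kib, mem_kib, vcpus, _cpu_time = dom.info()
    # Each guest can run a different operating system, yet all of them share
    # the same physical processor, memory, and network interface.
    print(f"{dom.name()}: {vcpus} vCPU(s), {mem_kib // 1024} MiB RAM")
conn.close()
```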

The combination of virtualization with the large data centers at the Internet core gives us the latest trend of “cloud computing” — where a business pays a service provider to deliver IT applications, computing power, and storage via the Internet. This enables businesses to access and share computer resources and potentially gain significant cost savings. Businesses with heavy demands for computing power, such as banks, could easily acquire huge processing capabilities on demand by processing data across large groups of servers.

Cloud computing has three commonly accepted service models: infrastructure as a service, platform as a service, and software as a service (see figure on slide).


Notice the boxes in the drawing — the models differ in where the boundary falls between what the cloud computing vendor provides and what the user (the business) provides.
Infrastructure as a service provides various infrastructure components such as hardware, storage, and other fundamental computing resources.
Platform as a service provides a service that runs over an underlying infrastructure. A platform vendor offers a ready-to-use platform, such as an operating system like Microsoft Windows or Linux, which runs on vendor-provided infrastructure. Customers can build applications on a platform using application development frameworks, middleware capabilities, and functions such as databases.
Software as a service runs on an underlying platform and infrastructure managed by the vendor and provides a self-contained operating environment used to deliver a complete application such as Web-based e-mail and related management capabilities.
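Since the figure itself is on the slide, here is a rough stand-in for it: a small sketch of who manages each part of the stack under each service model (my approximation, not a reproduction of the slide).

```python
# A rough sketch of the responsibility split implied by the three service
# models (an approximation of the slide, not a reproduction of it).
# "vendor" means the cloud provider manages that layer; "customer" means
# the business using the service manages it.
RESPONSIBILITY = {
    # layer            IaaS         PaaS        SaaS
    "application":    ("customer", "customer", "vendor"),
    "platform (OS)":  ("customer", "vendor",   "vendor"),
    "infrastructure": ("vendor",   "vendor",   "vendor"),
}

for layer, (iaas, paas, saas) in RESPONSIBILITY.items():
    print(f"{layer:15} IaaS: {iaas:9} PaaS: {paas:9} SaaS: {saas}")
```

Reading down the SaaS column, the vendor manages everything, which is why a Web-based e-mail service feels like a finished product rather than something you build on.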
A side note: although cloud computing has gained popularity recently, the concept of “software as a service” is not new — for example, Salesforce.com began offering its Customer Relationship Management (CRM) application as a service back in 1999.
In addition to the service models that describe what can be provided, there are four deployment models that relate to how the cloud service is implemented. These four cloud models are private, community, public, and hybrid (see figure on slide).
In a private cloud, the service is set up specifically for one organization, although there may be multiple customers within that organization, and the cloud may exist on or off the premises.
In a community cloud, the service is set up for related organizations that have similar requirements.
A public cloud is available to any paying customer and is owned and operated by the service provider.
A hybrid cloud is a composite of the deployment models.
U.S. government agencies are beginning to consider and, in some cases, to use cloud computing services. In 2010, about half of the 24 major U.S. government agencies reported using some form of cloud computing for infrastructure, platform, or software services.

Examples:
The Defense Department’s Rapid Access Computing Environment, or RACE, program (2008) provides “platform as a service” to support the department’s systems development efforts within a private cloud.

NASA’s Nebula is an open-source cloud computing project that provides an “infrastructure as a service” implementation for scientific data and Web-based applications, with everything housed in a standard shipping container that is mounted in place but could be transported if needed.
The Department of Transportation’s CARS program used a public cloud for part of its system — this program allowed owners of certain less fuel-efficient vehicles to receive a credit for trading in a vehicle and purchasing or leasing a new, more fuel-efficient vehicle (this was part of the recent stimulus program).
Also, Google is offering Google Apps for Government where the data generated by the government's use of Gmail and calendaring applications will be segregated from everybody else's “cloud-based” data on servers located in the continental U.S.
So, to reiterate, the convergence of IT with the network infrastructure brings large-scale computing and storage capabilities that can now be provided over the Internet, giving us cloud computing… this has the potential to make computing a service similar to other public utilities such as electricity, water, and gas.
Cloud computing, whether in a private cloud or in the public Internet, provides new IT solutions for businesses.
Here’s an example:
Businesses can run applications off servers located remotely on the Internet — this would be an example of “software as a service.”
There are some good reasons why businesses may use applications in the cloud:
* Applications on the cloud are always ready to go. You can simply sign up for a service (such as email or office productivity applications) and begin using them without having to go through the time, cost and management expense of setting up and running your own servers with the applications.
* You can use the applications from anywhere -- cloud applications are typically web-based, and with these you are not tied to a specific computer. This means that you can run the applications at your office, at home, from your laptop while on the road, and even via a Web-enabled smartphone, all with full access to your data and resources.
* You get the benefits of economies of scale because the cloud computing resources are shared among many. You get to use the services of large data centers that have multiple Internet connections, backup electric power, security, and redundancy. Virtualization enables these service providers to build massive computing infrastructures with a low cost per CPU cycle. Software developers can share applications among millions of users, lowering the per-user cost. For example, an on-premises CRM application could easily cost many thousands of dollars a month to purchase, install, and maintain, but anyone can sign up for Salesforce.com for $125 per month.
* You get the benefit of the fault-tolerance and redundancy that service providers build into their systems — something that most small businesses will never be able to match. Cloud computing is often more reliable than the networks run by small companies that cannot afford redundant electric power, hot backup systems, specialized management software, and expertise in data center operations.
* You can easily add computing resources in a cloud model. As your company adds employees, you can purchase more computing resources from the cloud provider. Similarly, when they are no longer needed, you can just as easily reduce the user or server count and the resulting monthly expenses (see the sketch after this list).
* You can pay for cloud services as you go. Typically you buy cloud services via monthly payments instead of a large up-front payment. This enables growing companies to conserve cash for other investments and for growing their business operations.
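Here is a minimal sketch of what easily adding or removing computing resources can look like in practice. It is my own illustration using the AWS SDK for Python (boto3), which postdates this talk; the Auto Scaling group name and the capacity numbers are made up.

```python
# A minimal sketch using the AWS SDK for Python (boto3), which is newer than
# this talk. The Auto Scaling group name "web-asg" and the numbers are made
# up for illustration; assumes AWS credentials are already configured.
import boto3

autoscaling = boto3.client("autoscaling", region_name="us-east-1")

# Check how many servers the group is currently asked to run.
group = autoscaling.describe_auto_scaling_groups(
    AutoScalingGroupNames=["web-asg"]
)["AutoScalingGroups"][0]
print("current desired capacity:", group["DesiredCapacity"])

# Grow (or shrink) the fleet; the provider adds or removes servers and the
# monthly bill follows the new size.
autoscaling.set_desired_capacity(
    AutoScalingGroupName="web-asg",
    DesiredCapacity=6,
    HonorCooldown=True,
)
```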
What are some of the issues with cloud computing? In our work with U.S. government agencies, we identified information security as an area of concern. We found that cloud computing can both increase and decrease the security of information systems in federal agencies. Potential information security benefits come from the use of virtualization and automation, which can enable rapid implementation of secure configurations for virtual machines.
However, the use of cloud computing can also create information security risks for federal agencies. Specifically, 22 of 24 major U.S. federal agencies reported that they are either concerned or very concerned about the potential information security risks associated with cloud computing. These concerns include risks related to dependence on the vendor and concerns related to sharing computing resources. Agencies also identified challenges and concerns in implementing existing federal information security laws and guidance. These concerns include limitations on their ability to conduct independent audits and assessments of security controls of cloud computing service providers and concerns related to the division of information security responsibilities between customer and vendor.
We expect that specific U.S. government agencies will develop guidance related to cloud computing security to help other agencies.
Cloud computing is a back-end IT solution for new businesses, but the innovation in convergence is most visible at the user/customer end. Wireless connectivity has been the driver for this. Wireless activity continues to grow rapidly, moving us towards the ubiquitous computing that has long been predicted. Based on industry data, as of December 2009, the wireless penetration rate was 91 percent in the United States.
Additionally, the number of adults living in U.S. households with only wireless telephone service has increased from less than 5 percent in 2003 to nearly 23 percent in 2009. According to one study of wireless use, wireless connections in California now exceed the combined connections of both wireline and broadband services.
The implication for a business is that many of your customers and clients now have powerful devices that enable them to access your services, which may be hosted on the cloud. Smartphones such as the Samsung Galaxy S and Apple iPhone support applications that can be tailored to work with specific back-end services, so businesses can gain an advantage by providing smartphone apps for their customers.
In the few minutes I have left, I’d like to close with the recent focus on telecom regulations and convergence in the U.S.
In the United States, Congressional policymakers think that additional regulations are needed to address the changing telecom environment. Because of the growing convergence in the telecom sector, many policymakers consider it necessary to "rewrite," or revise, the laws governing these markets.
Whether regulators should play a role to ensure that the Internet remains open to all, often referred to as "open access" or "net neutrality," has also become part of the dialogue.
As a segue at this point, I’d like to point out that the layered network model comes into play in discussions of net neutrality — for example, advocates of net neutrality point out that trying to affect the physical transport of bits based on application type violates the separation of layers in the network model.
Recent events in the net neutrality debate: a federal appeals court overturned the FCC's ruling against Comcast, and Google and Verizon proposed a legislative framework that would apply neutrality rules to wired broadband but exclude wireless for now.

Spectrum and National Broadband Plan
Another crucial component in the telecommunications policy debate is the allocation and regulation of radio-frequency spectrum. The private sector has a growing appetite for wireless communications services, such as high-speed Internet access and digital television broadcasts, that require lots of spectrum. The public sector also requires spectrum for a number of uses, including voice and data support for emergency communications. The challenge is to meet both the public and private sector needs for usable spectrum.
The FCC auctions off licenses for wireless spectrum, but the challenge is to find open bands of spectrum.
In March 2010, the FCC published the National Broadband Plan (http://www.broadband.gov/plan/), which outlines policies and actions intended to ensure that everyone in the United States has high-speed Internet access. The plan’s long-term goals include ensuring that at least 100 million U.S. homes have access to affordable broadband services with speeds of 100 megabits per second and enabling citizens to use broadband services to track energy consumption.
One of the linchpins of the National Broadband Plan is reallocating, or crafting new sharing arrangements for, a large amount of spectrum currently designated for use by federal agencies and commercial services, which are licensed by the National Telecommunications and Information Administration (NTIA) and the FCC, respectively. In a July 2010 memorandum, President Obama directed the FCC and NTIA to complete, by October 1, 2010, a specific plan and timetable for identifying 500 megahertz of spectrum that could be used for wireless broadband services over the next 10 years.
A recent development is the September 24, 2010 announcement by the FCC approving the use of unused airwaves in the broadcast TV spectrum for unlicensed mobile broadband operations. These “white spaces” were freed up after the U.S. transitioned from analog to digital TV broadcasting. The freed-up white spaces will be open to all users and do not require a license.
The expectation is that the new spectrum will be used in new consumer devices with wireless capabilities that have both longer range and greater bandwidth than current Wi-Fi solutions—the technology has been referred to as “super Wi-Fi” because of its improved bandwidth and ability to more easily penetrate buildings. This is the biggest block of spectrum space freed up by FCC in the last two decades.
So I’ll leave you with the note that we have some interesting developments, along with some uncertainty about potential new laws, in the U.S.

Bottom line: we have made progress, but we’re not there yet -- witness the broadband penetration chart for the top few countries…

So the journey continues…

Added July 2013: Cloud computing trends

• Mobile computing – bring your own device (BYOD) or provided by employer

• Broadband wireless connectivity – cloud computing just won’t be possible without network connectivity

• “Apps” to perform specific tasks: Employees, customers, and clients now have powerful devices that enable them to access services that are hosted on the cloud. Smartphones support applications (“apps”) that can be tailored to work with specific back-end services, so businesses can gain an advantage by providing smartphone apps for their employees and customers.

• Big data analytics – businesses collect large amounts of data in a central location in the cloud, which makes it amenable to applying analytics to that “big data” to gain insights

• More sensors – the Samsung Galaxy S4 comes with 9 sensors (gesture, proximity, gyro, accelerometer, geomagnetic, light, cover open/close, temperature + humidity, and barometer) and there are apps that use these sensors; AT&T Digital Life offers home automation through smartphone apps…

• Internet of Things (IoT) – objects (sensors) on the Internet accessible from anywhere

• Everything “as-a-Service” (aaS): “data as a service”, “desktop as a service” (outsourcing VDI – Virtual Desktop Infrastructure), “business process as a service” (BPaaS) (for example, payroll, printing, e-commerce, etc.)

Thursday, January 12, 2012

NBTMV - update on video recording setup, now with Sony Alpha NEX-3 and external mic






In this video I provide an update on my video recording setup at home. For the last few months, I have been using the same setup, recording the videos with a Panasonic Lumix DMC-LX3 digital camera. It has been performing very well, but it doesn’t have an external microphone input, and I thought I could improve the sound quality of the recordings by using a camera that accepts an external microphone. So I began searching and found a website called snapsort.com where you can specify the features you want in a camera (for example, external microphone, low-light sensitivity, a small form factor, and specific price points) and the site recommends cameras matching those features. When I put in everything I wanted, snapsort.com showed the Sony Alpha NEX-3 as a top choice. I checked it out on amazon.com, and they had a good price for a Sony Alpha NEX-3 with a 16mm wide-angle lens, so I bought it along with an external microphone (model Sony ECM-SST1) for recording my videos. Here are some photos of the camera and of the camera on a tripod in my video recording setup.



Sony Alpha NEX-3 with external stereo microphone (front view)




Sony Alpha NEX-3 with external stereo microphone (back view)

Sony Alpha NEX-3 with ECM-SST1 external mic mounted on tripod for video recording

Other than the new camera with the external microphone, the rest of the video recording setup is the same, with a three-point lighting system and a black muslin sheet as the background.

I recorded this video with the new setup and I think it does sound better with the external microphone. You can judge for yourself by comparing this video with previous ones.
Here's some more information to help you...

Here are links to the camera and external microphone and some accessories:

Friday, January 6, 2012

NBTMV - Ideas for new year’s resolution 2012



Happy new year. I bought a Sony Alpha NEX-3 camera with an external stereo microphone and I’m recording this first video of 2012 using that camera. I hope this video sounds better than my old videos recorded without an external microphone.

It’s early enough in the new year that there’s still time for you to make some new year’s resolutions and follow through on them. Here are three ideas for new year’s resolutions, for each of which I have videos and other information in my blog.
The first idea has to do with managing your body weight. What with all the food and celebrations during the holidays leading up to the new year, your weight and fitness must be at the top of your mind. I had taken some steps in 2003 to lose over 30 lbs by thinking of my body as a system with food and exercise as input and the weight as the output. I got good results by simultaneously changing my diet (less carbohydrate, more veggies) and increasing the level of exercise. I have more information about this in a previous video.
The second idea for new year’s resolution would be to learn a new language. With all the Spanish-speaking population here in the United States, learning to speak Spanish is an ideal choice. I took the plunge over two years ago and used a 3-step approach to start speaking Spanish -- (1) Listen and Repeat, (2) Listen and Understand, and (3) Start Speaking. You can learn more about my method from some earlier videos.  
And last, but not least, if you are empty-nesters like us and you’re considering taking up traveling abroad, then I have a number of videos where I talk about strategies for starting your world travels. I think you’ll find it worthwhile to start with some guided tours, followed by exploring cities on your own, and finally renting apartments in a city and doing everything on your own.
Those are some of the ideas for new year’s resolutions. I hope you pick one or more of these, consult my videos, and follow through on your new year’s resolutions. Good luck!