As with any technology, the cloud brings its own alphabet soup of terms. This article will hopefully help you navigate your way through the terminology, and provide you the knowledge and power to make the decisions you need to make when considering a new cloud implementation.
Here’s the list of terms we will cover in this article:

- SaaS (Software as a Service)
- PaaS (Platform as a Service)
- IaaS (Infrastructure as a Service)
- DaaS (Desktop as a Service)
- Private cloud
- Hybrid cloud
- Personal cloud
- Community cloud
- Anything as a Service
Phew—that’s a lot. Let’s dig in to the definitions and examples to help drive home the meanings of the list of terms above.
SaaS (Software as a Service)
This is probably the most common implementation of cloud services that end users experience: software that users access through their web browser. Some software may be installed locally to augment functionality or provide a richer user experience, but that locally installed software has minimal impact on the user’s computer. Figure 1 provides a high-level overview of this concept.
Figure 1 High-level overview of Software as a Service
You are probably a user of Facebook, Google Docs, Office 365, Salesforce, or LinkedIn either at home or at work, so you’ve experienced SaaS firsthand, and probably for a long time. What SaaS tools are you using outside of those mentioned here? Reach out and let me know; I’m very curious.
PaaS (Platform as a Service)
PaaS allows a developer to deploy code to an environment that supports their software, but without full access to the operating system. In this case the developer has no server responsibility or server access.
When I first started writing about cloud technology three years ago, this was a fairly primitive service. The provider would just give you access to a folder somewhere on the server along with a bit of documentation, and then you were on your own.
Now there are tools, such as Cloud Foundry, that allow a developer to deploy right from their Integrated Development Environment (IDE) or from a command-line production release tool. Cloud Foundry then takes the transmitted release and installs it correctly into the cloud environment. With a little trial and error, anyone with a bit of technical skill can deploy to a tool like Cloud Foundry, whereas the older style of PaaS took a lot of skill and experience to deploy correctly.
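To give you a feel for how little configuration this takes, here is a minimal sketch of a Cloud Foundry application manifest. The app name, memory size, and instance count are hypothetical values for illustration, not settings from any real project:

```yaml
# manifest.yml -- minimal Cloud Foundry application manifest (illustrative sketch)
applications:
- name: my-sample-app   # hypothetical application name
  memory: 256M          # memory allocated to each instance
  instances: 2          # number of instances the platform should run
```

With a file like this sitting next to your code, a single `cf push` from the command line uploads the release and the platform handles provisioning, so the developer never touches the underlying server.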
IaaS (Infrastructure as a Service)
Originally, IaaS dealt with a provider giving a user access to a virtual machine located on a system in the provider’s data center. A virtual machine (VM) is a guest operating system that runs inside a piece of virtualization software on the host computer. VirtualBox, Parallels, and VMware are examples of software that provide this kind of operating system virtualization.
Virtualization of servers was all the rage for a while, but when you try to scale in the cloud with multiple virtual servers, there are a lot of drawbacks. First, it’s a lot of work to make VMs aware of each other, and they don’t always share filesystems and resources easily. Plus, as your needs grow, VMs with a lot of memory and disk space are very expensive, and very often an application on a VM uses only a portion of the OS. For example, if you are deploying a tool that does data aggregation and runs as a service, you won’t be taking advantage of the web server that might also be running on that server.
The issues mentioned in the previous paragraph are common headaches for those moving their on-premises implementations to the cloud, and those headaches gave rise to Docker. Docker is a lighter-weight form of virtualization that allows for easier sharing of files, versioning, and configuration. Servers that could only host a few VMs can host thousands of Docker containers, so providers get a better bang for the buck on their server purchases.
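As a taste of that lighter-weight approach, here is a minimal sketch of a Dockerfile. The base image and the script name are illustrative assumptions, not taken from a real project; the point is how little it takes to package a small service without dragging along an entire guest OS:

```dockerfile
# Dockerfile -- illustrative sketch of a minimal container image
FROM python:3-slim             # small official base image (assuming a Python service)
WORKDIR /app
COPY aggregate.py .            # hypothetical data-aggregation script
CMD ["python", "aggregate.py"] # command run when the container starts
```

Building this with `docker build` and launching it with `docker run` starts the service in an isolated container that shares the host’s kernel rather than booting its own operating system, which is exactly why so many more containers than VMs fit on one server.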
Further explanation of Docker is an article all by itself (provide link to my future Docker article), but for now it’s important to realize that Docker needs to be part of any discussion of moving your applications to the cloud.
DaaS (Desktop as a Service)
Desktop computers are expensive for large corporations to implement and maintain. The cost of the OS, hardware, security software, productivity software, and more adds up to the point where it makes a major impact on any corporation’s budget. Then, just as they finish deploying new systems to everyone in the company, it’s time to start upgrading again because Microsoft just released a new OS.
Another fact about most desktop computers is that they are heavily underutilized, and DaaS allows an IT department to dynamically allocate RAM and disk space based on user need. In addition, backups and restores are a breeze in this environment, and if you are using a third-party provider, all you need to do is make a phone call when a restore of a file or desktop is needed. Plus, upgrades to new operating systems are seamless because the DaaS provider takes care of them for you.
The main advantage I see with DaaS is security. On one project I was involved with, we restored each desktop to a base configuration every night. While this did not affect user files, it did remove any malware that might have been accidentally installed by a user clicking on the wrong email. Documents from Microsoft Office or Adobe products were scanned with a separate antivirus program residing on the storage system they were part of, and the network appliance we used did not allow for the execution of software. That made it very secure for the client I was working with.
So what does a user have on their desk? Luckily, in recent years there has been an explosion of low-cost computing devices, such as the Raspberry Pi, that support Remote Desktop Protocol (RDP), so your users can access a Windows desktop from a Linux-based Pi, which you can get for a measly $35.
DaaS is awesome for your average information worker, but in my experience this setup doesn’t work well for a power user like a software developer. Your average developer needs too much control over the OS to install the widgets and tools they need to do their job. However, as DaaS moves forward, I’m sure providers will come up with a way to accommodate these developer needs and deliver an environment that makes developers even more productive.
Private Cloud
A private cloud is a cloud implemented by your company within a corporate-owned datacenter. This was the approach a telecommunications company I worked at took in 2007 and 2008. It was a huge improvement in service at the company, because I could call up and have access to a new VM in a few hours; before, you would have to go through the procurement process for months to get access to a server running in a data center. Figure 2 gives a high-level overview of how a private cloud may be configured.
A private cloud is more expensive because there is still the procurement of servers, networking gear, backup solutions, and more to support the infrastructure. Today, with most corporations looking to reduce their datacenter footprint and IT spend, private clouds are slowly disappearing.
Figure 2 A Private Cloud Configuration
Hybrid Cloud
A hybrid cloud is implemented when some of the servers and software reside within the corporate datacenter and other parts of the implementation are hosted in the cloud.
Microsoft provides hybrid cloud solutions for SharePoint. Many companies already have installations of SharePoint within their datacenter. As usage increases, there is often a need to purchase more servers to throw into the SharePoint farm, and as you know, it’s sometimes difficult to get the budget for new servers when you really need it. This is where the hybrid cloud can help. Consider Figure 3:
Figure 3 Existing SharePoint Farm Augmented for Hybrid Cloud
Here is an existing SharePoint farm that needs some extra resources because of the large number of running workflows. Instead of purchasing additional servers, Microsoft has made it easy to offload certain functions, such as workflow, to the cloud while you keep your existing farm in place and avoid worrying about a migration for now. Instead of the expense and time involved in procuring new servers, you just take out a credit card, set up your workflow engine, and you’re up and running in a few hours.
SharePoint isn’t the only software and Microsoft isn’t the only provider of this type of solution. All the major players offer solutions like this, and it’s a great way to dip your toe in the water with cloud technology.
Personal Cloud
This is an interesting trend: individuals downloading software or purchasing devices so they can create their own cloud at home. With most of us now having high-speed Internet, this is a trend that could take off. You can bypass cloud providers and more easily hide your files from the NSA.
Using software like Tonido or BitTorrent Sync, you can host your own cloud at home by downloading some software onto a local computer and then accessing your pictures and other files through the web. There’s usually some online setup so that you can reach your system at home, but the setup is straightforward.
If setting up your own hardware doesn’t appeal to you, several manufacturers, such as Transporter and Seagate, offer personal cloud devices. These are appealing because you plug them into your network, do a bit of setup, and you’re up and running. If you’re spending hundreds of dollars a year on tools like Dropbox or other file-sharing services, this can be a viable and fun alternative.
Community Cloud
I hesitated to document the idea of a community cloud since, to me, it feels like a term looking for an implementation. The idea is much like a private cloud, but instead of a single organization setting it up and paying for software and hardware, a group would pool their resources and create it together.
With cloud hosting companies’ prices falling all the time, I don’t see this concept taking off. The community would still need to staff and pay for software and hardware, and would also need to monitor security and uptime. I think there are too many disadvantages to this approach for it ever to become reality.