What are the prerequisites and requirements to learn cloud computing?
This is the first article in a series introducing our members to the prerequisites for learning cloud computing. It's a question I've been emailed countless times by our users, and while we have Learning Paths, AWS Certification Prep Learning Paths, Hands-on Labs, and Skill Assessments to help guide you on this journey, this article can serve as a useful guide on the topic.
Prerequisites to learning cloud computing
At one point or another, most new members of Cloud Academy ask us, “What are the prerequisites and requirements to start learning cloud computing?”
In this article, we’ll give you the information you need to answer this question and make sure you’re ready to start learning cloud computing without any worries.
The term “cloud computing” refers to a wide area of information technology (IT) that touches on the following areas:
- Hardware infrastructure
- Software infrastructure
- Data center facilities
- Virtualization technologies
- Software engineering concepts
All these areas are connected and can provide you with a strong background as you start exploring and working with cloud computing platforms. However, in this article we’ll focus only on Infrastructure-as-a-Service (IaaS) cloud providers such as Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP).
We can start with a basic assumption: you don’t need a computer science or computer engineering degree to learn cloud computing. You can start learning cloud computing from scratch even if you have very basic IT skills. You will just need to learn some of the main concepts and how those concepts relate to each other, then get some hands-on practice, especially in fixing problems.
Assumptions about learning cloud computing
- To learn cloud computing you should be able to code.
FALSE.
To learn cloud computing, you can start using a public or private cloud computing service, even if you are not a software developer. It doesn’t require coding chops to get an account or launch resources in a cloud environment. It will, however, be helpful to think about things in an organized and deliberate way, the way that coders do. And it’s good if you like to figure out what causes problems. There are always problems to be solved.
- You should have some previous experience in the IT world.
FALSE.
Cloud computing is a technology used in all industries globally. Understanding it will help everyone, not just technical people. Chances are, you work at a company that is already using cloud infrastructure. It doesn’t matter if you’re completely new to cloud tech; you’ll still be able to learn about its value and see that value applied all around you.
- Cloud computing is only for technical people and developers.
FALSE.
Cloud computing is changing how companies build and use IT systems, and that involves all of their software and all of the teams that use it, not just the teams that create the tools. It is a topic that managers, marketing people, executives, finance experts, system administrators, and developers must learn about. Of course, each role will approach it differently and focus on specific aspects, but that’s the beauty and the opportunity within this field: it touches everyone.
Operating Systems, Virtualization, and Networking
Starting at the beginning: Operating Systems
Since cloud computing is a broad area, to learn it you should have some familiarity with the basic concepts of an operating system (OS) such as Windows or Linux — what it does and how it works at a high level. Really, if you are used to working with computers for everyday office tasks, that should provide you with enough context.
Virtualization in Cloud Computing
Once you get your feet wet with operating systems and servers, you’ll see that virtualization technologies play a very big role in cloud computing: from a hands-on point of view, there is nothing like starting your own virtual machine (hint: you can do that with VirtualBox) to understand how a virtualized environment works.
Virtualization allows you to create virtual environments that have specific amounts of CPU, RAM, and disk space assigned and their own installation of an operating system like Linux or Windows. They share the same physical hardware and network equipment, but they are logically isolated from the other virtual machines.
One of the pioneers in this industry is VMware. Virtualization was a well-known technology before VMware, but they changed the landscape by commercializing it as packaged software that is now very common in small, medium, and large companies. A key advantage VMware currently has is that its technology is used across different cloud providers, so while AWS and others fight over the biggest market share, VMware can win business in many areas.
Virtualization introduced an economic and strategic concept called consolidation: instead of allocating a dedicated physical server for every application or workload they run, companies can share the resources of a few physical servers across many virtual machines. For example, ten lightly used applications that would otherwise each need their own physical server can run as ten VMs on one or two physical hosts.
Virtualization: Basic Concepts
Virtual Machine (VM): A VM is the basic element in virtualization. Think of it as a custom, ephemeral computer created for you on demand. In reality, the virtualization layer creates a functioning “instance” of a computer, and what’s cool is that many VMs can exist on one physical machine. Each virtual machine has its own operating system, CPU, RAM, and disk space, and may have a more complicated networking configuration to deal with the complexities that are inherent in virtualization.
Virtualization advantages: As mentioned above, consolidation is the first plus. There is also the option to move VMs between different physical nodes, and the flexibility to add new resources to an existing virtual machine, which makes it very modular and adaptable to your needs. Another key advantage is economic: you can create VMs of many different sizes, from tiny to huge, and a cloud provider can accomplish this with a fixed pool of physical resources. Users reap the benefits because they are given many different options at varying costs (see the short sketch after these definitions).
If you want to deepen your understanding of the most important technical advantages of server virtualization, this course on virtualization technologies gives you everything you need to get started.
Hypervisor: This is the name of the “virtualization core” that runs all the virtual machines — it’s the boss of all the VMs on a given machine. Some examples of hypervisors are VMware ESXi, KVM, and Xen (OpenVZ takes a related, container-based approach). All cloud environments use some kind of hypervisor; one example is Xen, which is used by AWS. You can get in-depth info on what a hypervisor is here.
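To make these definitions a bit more concrete, here is a minimal sketch (in Python, using AWS’s boto3 library) of what it looks like to ask a public cloud for a new, hypervisor-backed virtual machine. It assumes you already have an AWS account with credentials configured, and the region, image ID, and instance type below are placeholder choices, not recommendations:

```python
# Minimal sketch: requesting a new virtual machine from a public cloud (AWS via boto3).
# Assumes an AWS account with credentials already configured; the AMI ID below is a
# placeholder, not a real image to rely on.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder OS image (e.g., a Linux AMI)
    InstanceType="t3.micro",          # the VM "size": how much CPU and RAM it gets
    MinCount=1,
    MaxCount=1,
)

instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched virtual machine {instance_id}")
```

Behind that single API call, the provider’s hypervisor carves out CPU, RAM, and disk for your VM on a physical host it manages for you.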
Networking
Once you have basic knowledge of virtualization technologies and operating systems, you can consider some introductory courses on networking. From there, learning cloud computing will come much more easily.
Networking can be a very difficult topic and even those with strong analytical skills typically require time to fully understand it, so don’t lose faith if it all seems like some kind of mystical magic at the beginning — that’s totally normal.
You can start by checking out a whole learning path we have that details AWS networking basics, starting with a VPC — a virtual private cloud. To get you started, all you need to know is that your virtual machine also needs a virtualized network to live in. This is the VPC, and within it there are a whole host of configuration options you can use to set up your deployment and make it secure. Don’t worry too much about understanding all the details right away — some people make whole careers out of networking!
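If you’re curious what that looks like in practice, here is a small sketch in Python (again assuming AWS and the boto3 library, with purely illustrative CIDR ranges) that creates a VPC and one subnet inside it. A real deployment needs more pieces — route tables, an internet gateway, security rules — which is exactly what the learning path walks through:

```python
# Small sketch: creating a VPC (a private, virtual network) and one subnet on AWS.
# The CIDR ranges are illustrative assumptions; nothing is reachable from the
# internet until route tables, gateways, and security rules are added.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

vpc = ec2.create_vpc(CidrBlock="10.0.0.0/16")  # the VPC's private address space
vpc_id = vpc["Vpc"]["VpcId"]

subnet = ec2.create_subnet(VpcId=vpc_id, CidrBlock="10.0.1.0/24")  # a slice for your VMs
print(f"Created VPC {vpc_id} with subnet {subnet['Subnet']['SubnetId']}")
```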
Public vs. Private Cloud Computing
Finally, we want you to have a clear idea of what is meant when you hear discussions about public cloud computing vs. private cloud computing.
Public Cloud: This refers to publicly accessible infrastructure where you can store data, virtual machines, and any other kind of cloud resource. You can do this autonomously and programmatically, and you don’t have to invest in hardware or infrastructure. You use public clouds with a pay-per-use approach (there’s a short sketch of this idea after this section). Think of it like this: you are not buying the car; you are renting it for a specific period of time.
Private Cloud: As a company, you may want all the flexibility and advantages of cloud computing, but you may still keep your own on-premises data center and infrastructure for security or compliance reasons. In a private cloud, you are responsible for managing the whole infrastructure, which can behave in a similar way to a large public cloud, though at a much smaller scale.
Hybrid Cloud: As the name suggests, this is a combination of the two, and for many groups is driven by business needs. There are cool (and proprietary) services created to connect the public and private clouds directly, such as the aptly named AWS Direct Connect.
In this short lecture, you’ll gain the knowledge you need to understand the differences between these models: Cloud Deployment Models: Public, Private, and Hybrid Clouds.
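To make the public cloud’s pay-per-use model concrete, here is a tiny sketch in Python (boto3 again, with a placeholder instance ID) that stops a virtual machine when you’re done with it so you stop paying for its compute time — handing the rental car back, in other words:

```python
# Tiny sketch of pay-per-use: stop a VM when you're done so you stop paying for
# compute time. The instance ID is a placeholder assumption.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

ec2.stop_instances(InstanceIds=["i-0123456789abcdef0"])   # hand the rental car back
# Later, start it again only when you actually need it:
# ec2.start_instances(InstanceIds=["i-0123456789abcdef0"])
```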
Hands-on Experience
Research shows that hands-on learning is the best way to learn something. So take some of the ideas we’ve shared above and try them out. We have labs that make it quick and easy for you to get your hands dirty and get reps in real cloud environments.
Check out some quick lab walkthroughs below:
And check out a beginner Hands-on Lab that is designed to help you learn quickly and retain your knowledge: