Cloud Storage | Cloud Academy Blog https://cloudacademy.com/blog/category/cloud-storage/ Tue, 05 Sep 2023 17:08:11 +0000 en-US hourly 1 https://wordpress.org/?v=6.4.1 10 Benefits of Cloud Storage https://cloudacademy.com/blog/10-benefits-of-using-cloud-storage/ https://cloudacademy.com/blog/10-benefits-of-using-cloud-storage/#respond Fri, 30 Jun 2023 13:30:00 +0000 https://cloudacademy.com/?p=41854 It’s 2023, and cloud storage has become one of the most convenient and efficient methods to store data online. With more than 60% of the world’s corporate data stored in the cloud, this service area has exploded, with cloud infrastructure services generating $178 billion in revenue annually. It’s become so...

The post 10 Benefits of Cloud Storage appeared first on Cloud Academy.

It’s 2023, and cloud storage has become one of the most convenient and efficient methods to store data online. With more than 60% of the world’s corporate data stored in the cloud, this service area has exploded, with cloud infrastructure services generating $178 billion in revenue annually. It’s become so big in fact that even big tech has gotten in on the action, with many companies now owning and operating separate data storage facilities.

With cloud computing data storage, a business, rather than storing data at an onsite storage facility, stores data at a remote location, which can be accessed using an internet connection. The cloud computing provider handles the management and security of the storage servers, infrastructure, and networks to guarantee access to data at any time with elastic capacity. Experts estimate that by the year 2025 cloud data storage will exceed 100 ZB.

If you want to find out how Cloud Academy can help your business transition to cloud storage or simply optimize its use by training and upskilling your team, fill out the form below and get a free demo!

Like other technologies and services, cloud storage has its advantages and disadvantages. In this article, we will discuss ten benefits of using a cloud storage service, along with some significant disadvantages you should be aware of.

To dive deeper into these topics, check out Cloud Academy’s Get Started Building Cloud Solutions Learning Path. In this learning path, you’ll build your knowledge of cloud services, starting with the fundamentals of AWS compute and AWS storage.

What are the benefits of cloud storage?

Below are the 10 benefits of cloud-based storage:

  1. Usability and accessibility
  2. Security
  3. Cost-efficient
  4. Convenient sharing of files
  5. Automation
  6. Multiple users
  7. Synchronization
  8. Convenience
  9. Scalable
  10. Disaster recovery

Let’s dive into all cloud storage advantages one by one.

1. Usability and accessibility

The first of the main cloud storage benefits concerns usability and accessibility. Most cloud data storage services come with an easy-to-use interface and support drag-and-drop uploads. Think of Google Drive from Google or iCloud from Apple: both have a simple interface that lets you upload files to your online drive without any expert knowledge. If you save a file from a mobile device, you can retrieve it later from a computer or any other device with internet connectivity. Wherever you are, as long as you have a decent internet connection, you can access your files, which are stored in the provider's data centers.

2. Security

Before choosing a cloud service for your business, it’s important to make sure that the service provided offers maximum security.

Cloud storage saves your data across redundant servers, so even if one data center fails, your data is served from another, keeping it safe and available. Data could only be lost if every data center holding a copy were destroyed at once, which is highly unlikely, as large providers operate many geographically distributed data centers.

Some cloud storage providers keep copies of your data at different data centers, so even if the data gets lost or corrupted at the server, the backup will be there.

As a tech manager, you know best that security is one of the most significant issues holding back cloud adoption. Cloud Academy’s training library focuses deeply on IT Security, allowing your team to stay up to date with new security breaches and ways to resolve them.

If you’re interested in understanding how to keep your cloud environment secure with Cloud Academy, contact us and request a free demo!

3. Cost-efficient

Cloud data storage can help reduce internal expenses. The company does not need internal resources and support staff to manage and store its data; the cloud storage vendor handles it all. Some cloud storage options even provide lifetime storage at an affordable rate, a win-win for small-to-medium businesses and individual users.

4. Convenient sharing of files

All cloud storage options provide file-sharing features that let you share files with other users. You can either send a file to another user or invite multiple users to view your data. Most vendors provide a cloud environment in which two users of the same service can share data, though only a few offer cross-platform file sharing.

5. Automation

Cloud storage services work like a hard disk attached to your system: storing a file in the cloud does not interrupt your ongoing tasks. Multiple users can use the same storage service concurrently, and one user's activity does not affect another's, since the provider manages and automates it all.

6. Multiple users

The same cloud environment can have more than one user associated with it. With cloud storage, multiple users can collaborate on a common file. For instance, you can allow multiple users to access and edit your files, and each authorized person can work on the file from anywhere in the world in real time.

7. Synchronization

Another benefit of cloud storage is synchronization. Every cloud storage provider offers a sync feature, which keeps your cloud data consistent across any devices you choose. As discussed above, you can access your data from any device anywhere in the world, and synchronization is what makes that possible: log in to your subscribed storage service from any device with the proper credentials, and all your stored data is there. There is no need to copy data from one device to another, though you do need a good internet connection to access your files.
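As a rough illustration of how a sync client decides what to upload, here is a minimal sketch in Python: it fingerprints file contents with SHA-256 and compares them against the hashes the server last saw. The file names, the `files_to_sync` helper, and the in-memory dictionaries are all hypothetical; real sync clients (Dropbox, Google Drive, and so on) use their own protocols.

```python
import hashlib

def file_digest(data: bytes) -> str:
    """Content fingerprint used to decide whether a file changed."""
    return hashlib.sha256(data).hexdigest()

def files_to_sync(local: dict, remote: dict) -> list:
    """Names whose local content differs from (or is missing in) the cloud copy."""
    return sorted(
        name for name, data in local.items()
        if remote.get(name) != file_digest(data)
    )

# Local files on this device, and the hash index the server last recorded.
local_files = {"notes.txt": b"v2 of my notes", "pic.jpg": b"\xff\xd8..."}
remote_index = {"notes.txt": file_digest(b"v1 of my notes")}  # stale server-side hash

print(files_to_sync(local_files, remote_index))  # -> ['notes.txt', 'pic.jpg']
```

Comparing hashes instead of whole files is what lets a sync client skip unchanged data and upload only the differences.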

8. Convenience

You do not need a hard disk or flash drive to access or view your data; everything is done online. Browsing your data occupies no space on your device, and if you do want a local copy, you can simply download the file to a device of your choice.

Even if you make changes to the data, changes will be reflected on every device that is synced with the storage service. Expert or technical knowledge is not required to use the cloud storage service. All the heavy lifting is managed by the vendor itself. 

9. Scalable

Cloud storage is scalable and flexible: if your current plan is not enough, you can simply upgrade it. You do not need to move any data from one location to another; the extra space (often with extra features) is added to your existing storage environment.

10. Disaster recovery

Cloud storage is considered an excellent solution for both business continuity and disaster recovery, and it eliminates the need to run a separate disaster recovery data center. One of the main advantages of building a disaster recovery plan on cloud storage is that you can back up an entire server quickly: all the information, together with the systems and applications, is grouped into one software block or virtual server for easy backup and retrieval.
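The idea of grouping everything into one restorable block can be sketched with a local backup in Python. This is only an illustration using a tar archive on disk, not any provider's actual backup API; the directory layout and file names are made up.

```python
import pathlib
import tarfile
import tempfile

def snapshot(src: pathlib.Path, archive: pathlib.Path) -> None:
    # Bundle the whole directory into one artifact, like grouping
    # systems and applications into a single restorable block.
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(src, arcname=src.name)

def restore(archive: pathlib.Path, dest: pathlib.Path) -> None:
    # Recover the entire snapshot in one operation.
    with tarfile.open(archive, "r:gz") as tar:
        tar.extractall(dest)

work = pathlib.Path(tempfile.mkdtemp())
site = work / "site"
site.mkdir()
(site / "config.ini").write_text("retention_days = 30")

backup = work / "site-backup.tar.gz"
snapshot(site, backup)

recovered = work / "recovered"
restore(backup, recovered)
print((recovered / "site" / "config.ini").read_text())  # -> retention_days = 30
```

In a real disaster recovery plan, the archive (or virtual server image) would be replicated to the provider's remote storage rather than kept on the same machine.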

Disadvantages of cloud storage

Below are the main drawbacks of using cloud storage services.

1. Drag and drop

The drag-and-drop option may move your original file rather than copy it, leaving no local copy behind. If you want to keep the original where it is, use copy and paste instead.

2. Internet dependency

Without an internet connection, you cannot access your data at all. And if the connection fails mid-transfer, the file you were downloading or uploading may end up incomplete or corrupted.

3. Data security and privacy

Some cloud storage vendors fall short on data security and privacy, and there have been many cases of data leaking from cloud storage services.

4. Costs

Most of the best cloud storage services are expensive, as they are mainly designed for business use. If you go for a less expensive plan, you might have to compromise on some features.

Conclusion

Over the last ten years, cloud storage services have seen tremendous success in the software world. Nowadays, it is practically a necessity for any company, regardless of size, to use cloud storage to some degree. With its incredible potential, cloud technology is capable of completely revolutionizing how information is stored and communicated.

We hope you are now more aware of the main benefits of cloud storage, as well as its disadvantages, so you'll be better informed when choosing the cloud provider that's best for you.

If you want to find out how Cloud Academy can help your business transition to cloud storage or simply optimize its use by training and upskilling your team, feel free to contact us and get a free demo!

Azure Storage: Overview and Introduction to the Various Solutions https://cloudacademy.com/blog/azure-storage-service-overview/ https://cloudacademy.com/blog/azure-storage-service-overview/#respond Thu, 22 Sep 2022 09:01:00 +0000 https://cloudacademy.com/blog/?p=9466 Learn about the features and benefits of Azure Storage, with details explaining services and storage accounts.

The post Azure Storage: Overview and Introduction to the Various Solutions appeared first on Cloud Academy.

The competition for customers among cloud vendors is driving a great deal of innovation. Services are becoming easier to use and more robust, scalable, and available. Not to mention cheaper. Both AWS and Azure storage options are constantly improving, with AWS’s S3 and Glacier emerging as mature and reliable platforms. But Microsoft certainly hasn’t been quietly watching from the sidelines and Azure Storage has become a real player in the Cloud Storage game.

What is Azure Storage?

As the Microsoft Azure documentation puts it, “Azure Storage is a cloud storage solution for modern data storage scenarios.”

Azure Storage capacity is virtually limitless. It uses a pay-as-you-go service model that will fit your business whether you’re storing only a few hundred GBs or trillions of objects. Under the hood, it comes with Microsoft’s guaranteed, massively scalable, highly available, robust and economical architecture. For developers, Azure storage supports a variety of clients, such as .NET, Ruby, and Java for REST, and access from multiple Windows and Linux operating systems.

In this article, we’ll explore Azure Storage’s benefits and features, the services/solutions it offers, and more.

Azure Storage features and benefits

These features apply to all Azure Storage offerings:

Durability

Azure Storage data is replicated multiple times for durability. There are four redundancy options: Locally Redundant Storage (LRS), Zone-Redundant Storage (ZRS), Geo-Redundant Storage (GRS), and Read-Access Geo-Redundant Storage (RA-GRS).

Using LRS, three copies of all data are maintained in a single facility within a single region. With ZRS, three copies of your data are spread across separate availability zones within a region, which achieves greater durability than LRS. With GRS, six copies of data are stored across two regions: three in a primary region and three in a secondary region that is usually geographically distant from the primary. If the primary region fails, the secondary region is used as part of a fail-over mechanism. RA-GRS stores data just like GRS, except that you also get read-only access to the secondary region.
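The copy counts above can be summarized in a small table. The sketch below is purely illustrative arithmetic over those numbers (it is not an Azure API), and the `surviving_copies` helper assumes copies are split evenly across regions.

```python
# Copy counts per redundancy mode, as described above (illustrative only).
REDUNDANCY = {
    "LRS":    {"copies": 3, "regions": 1},  # one facility, one region
    "ZRS":    {"copies": 3, "regions": 1},  # spread across availability zones
    "GRS":    {"copies": 6, "regions": 2},  # 3 primary + 3 secondary
    "RA-GRS": {"copies": 6, "regions": 2},  # like GRS, secondary is readable
}

def surviving_copies(mode: str, failed_regions: int) -> int:
    """Rough count of copies left after whole regions are lost."""
    info = REDUNDANCY[mode]
    per_region = info["copies"] // info["regions"]
    return max(info["copies"] - failed_regions * per_region, 0)

print(surviving_copies("GRS", failed_regions=1))  # -> 3 (secondary region remains)
print(surviving_copies("LRS", failed_regions=1))  # -> 0 (everything was in one region)
```

This is why geo-redundant modes cost more: they are the only ones that keep copies alive through the loss of an entire region.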

Geo-Redundant Storage (GRS) and Read-Access Geo-Redundant Storage (RA-GRS) provide the highest level of durability, but at a higher cost. GRS is the default redundancy mode. If you need to switch from LRS to GRS or RA-GRS, a one-time data transfer cost applies. However, if you choose ZRS, you cannot subsequently change to another redundancy mode.

High Availability

Such durable replication also makes the storage services highly available. If you choose GRS or RA-GRS, your data is replicated in multiple facilities across multiple regions, so even a catastrophic failure of one data center will not result in permanent data loss.

Scalability

Data is automatically scaled out and load-balanced to meet peak demands. Azure Storage provides a global namespace to access data from anywhere.

Security

Azure Storage relies on a Shared Key model for authentication security. Access can be further restricted through the use of a shared access signature (SAS). SAS is a token that can be appended to a URI, defining specific permissions for a specified period of time. With SAS, you can access standard stores like Blob, Table, Queue, and File. You can also provide anonymous access, although that is generally not recommended.
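To make the shared-key idea concrete, here is a minimal sketch of how a time-limited, HMAC-signed token could work in principle. This is not Azure's actual SAS format (in practice you would use the Azure SDK's SAS helpers); the secret, the query parameters, and both helper functions are invented for illustration.

```python
import hashlib
import hmac

SECRET = b"account-key"  # stands in for the storage account's shared key

def make_sas(resource: str, permissions: str, expires_at: int) -> str:
    """Sign resource + permissions + expiry, and append the proof to the URI."""
    msg = f"{resource}|{permissions}|{expires_at}".encode()
    sig = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    return f"{resource}?perm={permissions}&exp={expires_at}&sig={sig}"

def check_sas(url: str, now: int) -> bool:
    """Accept only if the signature matches and the token has not expired."""
    resource, query = url.split("?", 1)
    params = dict(p.split("=", 1) for p in query.split("&"))
    expected = make_sas(resource, params["perm"], int(params["exp"]))
    return hmac.compare_digest(url, expected) and now < int(params["exp"])

url = make_sas("/container/report.pdf", "r", expires_at=1_700_000_000)
print(check_sas(url, now=1_699_999_999))  # -> True  (valid, not yet expired)
print(check_sas(url, now=1_700_000_001))  # -> False (expired)
```

Because the permissions and expiry are covered by the signature, a recipient cannot upgrade "read" to "write" or extend the deadline without invalidating the token, which is the property that makes SAS-style delegation safe.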

Accessibility

Azure Storage data is accessible from anywhere in the world over HTTP or HTTPS. A variety of languages are supported, such as .NET, Java, Node.js, Python, PHP, Ruby, and Go.

Azure Storage Services

With an Azure Storage account, you can choose from different types of storage services:

  1. Azure Blob Storage (Azure Blobs)
  2. Azure File Storage (Azure Files)
  3. Azure Queue Storage (Azure Queues)
  4. Azure Table Storage (Azure Tables)
  5. Azure Disk Storage (Azure Disks)

Let’s give an overview of each Azure Storage Data Service.

Azure Blob Storage (Azure Blobs)

Azure Blob Storage is storage for unstructured data, which can include pictures, videos, music files, documents, raw data, and log data, along with their metadata. Blobs are stored in a directory-like structure called a "container". If you are familiar with AWS S3, containers work much the same way as S3 buckets. You can store any number of blob files up to a total size of 500 TB and, as with S3, you can apply security policies. Azure Blob Storage is also commonly used for data or device backup.

The Azure Blob Storage service comes with three types of blobs: block blobs, append blobs, and page blobs. Block blobs suit documents, images, and video files. Append blobs are similar to block blobs but are optimized for append operations such as logging. Page blobs are designed for frequent random read-write operations, which is why Azure VMs use them to store OS and data disks.

Azure File Storage (Azure Files)

Azure File Storage is aimed at applications that rely on conventional file shares, including legacy workloads. Azure VMs and services share data via mounted file shares, while on-premises applications access the files using the File Service REST API. Azure File Storage offers file shares in the cloud using the standard SMB protocol and supports both SMB 3.0 and SMB 2.1.

Azure Queue Storage (Azure Queues)

The Azure Queue Storage service is used to exchange messages between components running in the cloud or on-premises (compare it to Amazon SQS). You can store large numbers of messages to be shared between independent application components and communicated asynchronously via HTTP or HTTPS. Typical use cases include processing backlog messages or exchanging messages between Azure Web roles and Worker roles.
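The asynchronous exchange between a front-end and a back-end component can be imitated locally with Python's standard `queue` module. This is a toy stand-in for Azure Queue Storage, not the real service; the "web role" and "worker role" functions and the sentinel convention are illustrative.

```python
import queue
import threading

tasks = queue.Queue()  # stands in for a cloud queue between two components

def web_role():
    # Front-end component enqueues work instead of processing it inline.
    for order_id in (101, 102, 103):
        tasks.put({"order": order_id})
    tasks.put(None)  # sentinel: no more work

processed = []

def worker_role():
    # Back-end component drains the queue asynchronously, at its own pace.
    while (msg := tasks.get()) is not None:
        processed.append(msg["order"])

t = threading.Thread(target=worker_role)
t.start()
web_role()
t.join()
print(processed)  # -> [101, 102, 103]
```

The point of the pattern is decoupling: the producer never waits for the consumer, and a backlog simply accumulates in the queue until workers catch up.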

Azure Table Storage (Azure Tables)

Azure Table Storage, as the name indicates, is preferred for tabular data and is ideal for key-value NoSQL storage. Table Storage is massively scalable and extremely easy to use. Like other NoSQL data stores, it is schema-less and accessed via a REST API. Azure Table Storage is now part of Azure Cosmos DB.
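A toy model of the (PartitionKey, RowKey) addressing scheme, using a plain Python dictionary, may help. This is not the Azure Tables SDK; the helper names and sample entities are made up, but the keying and the schema-less values mirror how Table Storage entities are organized.

```python
# Toy key-value table: entities addressed by (PartitionKey, RowKey),
# with schema-less values, mirroring how Table Storage entities are keyed.
table = {}

def upsert(partition, row, entity):
    table[(partition, row)] = entity

def get(partition, row):
    return table.get((partition, row))

def query_partition(partition):
    # Scanning within a single partition is the cheap access pattern.
    return [e for (p, _), e in table.items() if p == partition]

upsert("customers-EU", "c-001", {"name": "Anna", "tier": "gold"})
upsert("customers-EU", "c-002", {"name": "Ben"})        # different "columns": schema-less
upsert("customers-US", "c-001", {"name": "Carol"})

print(get("customers-EU", "c-001")["name"])   # -> Anna
print(len(query_partition("customers-EU")))   # -> 2
```

Note that the same RowKey ("c-001") can appear in different partitions, and two entities in one partition need not share the same fields, which is what "schema-less" means in practice.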

Azure Disk Storage (Azure Disks)

Azure Disk Storage allows data to be persistently stored and accessed from an attached virtual hard disk. The available types of disks are ultra disks, premium solid-state drives (SSD), standard SSDs, and standard hard disk drives (HDD). Azure-managed disks are stored as page blobs, which are a random IO storage object in Azure.

Azure Storage Account Overview

An Azure Storage Account can be considered as a container that combines a set of Azure Storage services together. Integrating data services into a storage account allows you to manage them as a group. Azure Storage offers several types of storage accounts. Each type supports different features and has its own pricing model. Let’s take a look at each type of storage account you can choose.

Standard general-purpose v2

Supported storage services

Blob Storage (including Data Lake Storage), Queue Storage, Table Storage, and Azure Files.

Usage

Standard storage account type for blobs, file shares, queues, and tables. Recommended for most scenarios using Azure Storage.

Premium block blobs

Supported storage services

Blob Storage (including Data Lake Storage).

Usage

Premium storage account type for block blobs and append blobs. Recommended for scenarios with high transaction rates or that use smaller objects or require consistently low storage latency.

Premium file shares

Supported storage services

Azure File Storage.

Usage

Premium storage account type for file shares only. Recommended for enterprise or high-performance scale applications.

Premium page blobs

Supported storage services

Page blobs only.

Usage

Premium storage account type for page blobs only.

Learn about Azure Storage on Cloud Academy

The Azure Storage service is a fine example of well-designed architecture that fits many use cases, including enterprise needs. With SLAs ranging from 99.9% to 99.99%, it is an easy choice for users looking for scalable, reliable, and effectively infinite space. Microsoft itself uses Azure Storage for popular services like Skype, Xbox, Bing, and OneDrive (formerly SkyDrive).

In this post, we introduced you to the basics of the Azure Storage service. In subsequent posts, we will explore the features, architecture, and hands-on experiences with Azure Storage in greater detail. As you move forward in your learning, we strongly recommend the course Introduction to Azure Storage.

This course is completely free for those who register on the Cloud Academy Free Courses and Free Content page. Once you register (no credit card required) you’ll be able to take the Introduction to Azure Storage course.

If you’ve got something to add, please do add a comment below!

Linux vs. Windows Hosting: Which Should You Choose? https://cloudacademy.com/blog/linux-vs-windows-hosting-which-should-you-choose/ https://cloudacademy.com/blog/linux-vs-windows-hosting-which-should-you-choose/#respond Mon, 20 Jul 2020 12:54:00 +0000 https://cloudacademy.com/?p=41823 Whether you need to create a website for your online blog or e-commerce business, or you want to practice running your own servers, you’re faced with one very important choice: Should you run a server on Linux or Windows? Most hosting services would have you start a Linux server, but...

The post Linux vs. Windows Hosting: Which Should You Choose? appeared first on Cloud Academy.

Whether you need to create a website for your online blog or e-commerce business, or you want to practice running your own servers, you’re faced with one very important choice: Should you run a server on Linux or Windows?

Most hosting services would have you start a Linux server, but is running it on Windows a better option? This guide tells you everything you need to know about Linux vs. Windows hosting. If you’re already familiar with the basics and want to learn how to administer Linux, check out Cloud Academy’s intermediate-level Linux Server Professional Learning Path.


Licensing and cost

The first difference is pretty straightforward. Windows is sold by a for-profit company, Microsoft, while Linux is open-source. This means that hosting companies that run Windows servers have to pay for a license. The hosting company will then pass that cost on to you, making Windows hosting a bit more expensive than Linux.

Linux is completely open-source and free. When you get Linux hosting, you'll only be paying for the hardware and services, not for a Windows license.

For those unfamiliar with how hosting works: you can't run a hosting server on a desktop copy of Windows 10; you need a Windows Server Datacenter license for that.

That said, the difference in price is not huge. The lowest price you’ll pay for decent Linux hosting is about $2 per month. For Windows hosting, you’ll have to part with $4 per month.

Popularity

When it comes to popularity, Unix-like systems lead hands down. As of 2020, over 71% of all websites on the internet run on a Unix-like system. Linux, an open-source Unix-like OS, is used by 42.8% of the websites in that Unix family; 33% of all websites run on one of the major Linux distros, while other Unix systems like BSD, Darwin, Solaris, and Minix serve only 0.1% of all websites.

The best guess is that those unidentified systems are either advanced Linux distros or were simply not mentioned on the host websites. Even if they aren’t, they’re still using Unix or Linux systems, not Windows.

Windows is only used by 28% of the websites. Here’s a graph by W3Techs to visualize that.

[Graph: Linux market position]

Source: W3Techs

Note that this graph shows that Windows is used by websites with higher traffic. Linux, however, isn’t that far behind. Some of the biggest websites run on it (Wikipedia being one example).

Why is Linux so popular that it beats Windows almost 3 to 1 when it comes to usage stats? Apart from Linux being free, this system has been around for a while and now has simple and intuitive management tools.

Unlike when using Linux as a personal OS, you don't need to be a programmer to use Linux hosting. Creating a personal website can be as simple as registering a name, building the site on WordPress, and publishing it. Naturally, many people have done exactly that, resulting in huge numbers of websites hosted on Linux.

Even if you don’t host a website on WordPress directly, cPanel on Linux is a very intuitive piece of software.

Technology

Technology is arguably the most important difference between Linux and Windows hosting. Linux hosting uses open-source software and supports most programming languages, while Windows hosting uses Microsoft-specific software to run and manage data.

Linux hosting typically runs MySQL, while Windows servers use Microsoft SQL Server (MSSQL). Windows hosting also uses Microsoft's ASP.NET as its main framework. These are not as widespread as their open-source counterparts, and most beginner programmers would have to invest considerable time to master them.

Windows Server Datacenter and MSSQL are mostly used in large corporations to develop proprietary servers. If you're looking for a job in one of those, or you think your future projects may include handling legacy systems from big corporations, Windows hosting may be a good idea. Likewise, if you need MSSQL or ASP.NET for your future employment or to run applications on the website, Windows is the way to go.

In most other cases, Linux should serve your needs just fine. MySQL is the most popular open-source database management system and has plenty of supporting software, and Linux hosting supports PHP, Perl, and Python, which makes it accessible to developers of many backgrounds.

That said, if you’re planning to configure Linux hosting on your own, you will need to be familiar with Apache or NGINX. If you’re a regular user and don’t need to configure the hosting yourself, Linux is one of the easiest systems to use. You just need to figure out cPanel, which is largely intuitive, and there are thousands of guides covering every aspect of it if you run into trouble.

Customizability

The last significant difference between Linux and Windows hosting is the potential to customize it. Linux is a leader here as well.

For starters, if you own a server, you can install any Linux distro there. With Windows, you can only get a licensed version of Windows Server Datacenter. This may mean nothing if you’re a beginner, but for people who know how to configure Linux distros, this is a great opportunity to make their servers more effective.

Linux hosting also offers an easier layer of customizability. Popular website builder platforms like WordPress, Drupal, and Joomla are all built for Linux. They can work on Windows servers too, but that requires a bit of tinkering.

Many open-source applications that can help you customize a website or server may not support ASP.NET, making Linux the better option.

Linux vs. Windows Hosting: Which should you choose?

So, what’s the final verdict? Should you choose Linux or Windows hosting? While the decision may be more complicated for many developers, here’s a quick guide to help you decide.

If you’re not a developer, you should probably go for Linux. cPanel covers most needs of people who run e-commerce websites, blogs, or wikis. You also get customization tools that don’t require a lot of learning. And if you ever need third-party help, you’ll find contractors more cheaply than you would for a Windows server.

If you’re a developer, it gets a bit more complicated. For developers with a corporate background, it would make more sense to use Windows, as this is what many corporations use for internal servers, and you’re probably used to it.

For those who want to work for a corporation, running a server on MSSQL would be great practice and a positive feature on their resume. The same goes for developers who either are already familiar with ASP.NET stack or want to learn more about it for future employment.

On the other hand, if your main language is PHP, Python, or Perl, you’re probably going to be better off using Linux hosting. The same goes for technologies like MySQL, Apache, and NGINX. If you know them already or want to gain a better understanding of them, Linux is the way to go. You may have to learn about Linux and how to configure distros, but that’s not a necessity for having a good experience hosting on a Linux system. You could also go with an ordinary build.

Use these tips to choose the OS you need for your hosting, and don’t forget to learn more about AWS and Azure, as these clouds can be used to host both Windows and Linux servers.

Top 10 Cloud Storage & File-Sharing Services for 2020 https://cloudacademy.com/blog/top-10-cloud-storage-file-sharing-services-for-2020/ https://cloudacademy.com/blog/top-10-cloud-storage-file-sharing-services-for-2020/#respond Wed, 20 May 2020 15:06:48 +0000 https://cloudacademy.com/?p=40254 The internet has drastically changed the IT industry. It not only connects a person with the world, but it also introduces new features every year. In the last decade, “cloud” was a new term tossed in the market, and soon it gained so much popularity that it now covers a...

The post Top 10 Cloud Storage & File-Sharing Services for 2020 appeared first on Cloud Academy.

The internet has drastically changed the IT industry. It not only connects people with the world, but it also introduces new capabilities every year. In the last decade, "cloud" was a new term tossed around the market, and it soon gained so much popularity that it now covers a large part of the industry.

To some extent, we are all familiar with cloud technology and how it stores our data at remote locations, and every big tech company now uses it to store its own and its customers' data. The cloud is not limited to large organizations or enterprises: ordinary people also use it to store their data.

The cloud has become so popular that every big tech giant has its own cloud service. For example, if you are an Android user, your device connects to Google Drive, which stores your data at a remote location, so even if your phone is lost or reset, you can recover your data from the remote server.

If you are looking to further your knowledge and learn the basics of cloud computing, head over to Cloud Academy and check out Cloud Literacy. Whether you’re involved in sourcing IT services, need to understand the cloud to make your business work more efficiently, or just want to know about what the cloud is, this learning path is for you. 


What is the cloud?

In the software industry, the cloud is a service that runs over the internet and stores your data at a "remote location", meaning the data center of a cloud service provider. With cloud storage, data that would otherwise live on your device or hard drive is kept in the provider's data center instead. Cloud storage works like a bank for your data, holding it so you can access it from anywhere and from any device.

How does cloud storage work?

Cloud storage works on a client-server model: a client sends a request to the subscribed cloud storage service, and the server at the data center returns the appropriate response. The main objective is that, instead of saving data on local storage, a user's data is kept at a data center so the user can retrieve it from any device.
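The request/response flow described above can be simulated end to end with Python's standard library: a tiny in-process HTTP server stands in for the provider's data center, and a client uploads a file with PUT and fetches it back with GET. Everything here (the store, the paths, the handler) is a self-contained illustration, not any vendor's API.

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

STORE = {}  # in-memory "data center": path -> file bytes

class StorageHandler(BaseHTTPRequestHandler):
    def do_PUT(self):
        # Client uploads: save the request body under the requested path.
        length = int(self.headers.get("Content-Length", 0))
        STORE[self.path] = self.rfile.read(length)
        self.send_response(201)
        self.end_headers()

    def do_GET(self):
        # Client downloads: return the stored bytes, or 404 if unknown.
        data = STORE.get(self.path)
        if data is None:
            self.send_response(404)
            self.end_headers()
            return
        self.send_response(200)
        self.send_header("Content-Length", str(len(data)))
        self.end_headers()
        self.wfile.write(data)

    def log_message(self, *args):
        pass  # keep the demo output quiet

server = ThreadingHTTPServer(("127.0.0.1", 0), StorageHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_port}"

# One "device" uploads a file; any other device with the URL can retrieve it.
req = urllib.request.Request(f"{base}/notes.txt", data=b"hello cloud", method="PUT")
urllib.request.urlopen(req)
fetched = urllib.request.urlopen(f"{base}/notes.txt").read()
print(fetched.decode())  # -> hello cloud
server.shutdown()
```

A real provider adds authentication, replication, and durability on top, but the core interaction is exactly this: HTTP requests against a remote store rather than reads and writes against a local disk.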

What if the provider's data center collapses or is destroyed? Would the user's data be destroyed too? The answer is no: providers replicate data across many data centers, so even if one fails, others still hold copies that let you retrieve and save your data.

This article is not about the nitty-gritty of cloud storage; it focuses on the top 10 cloud storage services of 2020. Competition in this field is fierce, since all the big tech giants run their own cloud storage services. So, when selecting a cloud storage service, consider essential features such as security, privacy, mobile apps, pricing, complexity, and speed.

Top 10 cloud storage services of 2020

1. Dropbox

Dropbox is one of the best cloud file storage and file synchronization services, developed by the American company Dropbox, Inc.

Pros of Dropbox

  • Ideal for organizations of any size.
  • No data limit; storage plans range from 2GB to unlimited space.
  • Supported on Windows, macOS, Linux, Android, iOS, and Windows Phone.
  • Store and share any file type on the platform.
  • Easy-to-use admin controls.
  • Strong focus on customer privacy and security.
  • The web version lets you manage files without downloading them.

Dropbox pricing

  • Standard: 5GB of storage, $15 per month ($12.50/month billed yearly)
  • Advanced: customizable plans, $25 per month ($20/month billed yearly)
  • Enterprise: customizable plans with custom pricing

Cons of Dropbox

  • Offers a 30-day free trial of its paid plans, but very little free storage compared to competitors.
  • Expensive for freelancers and solo users.

2. iCloud

Apple Inc. provides this cloud storage service, and every Apple user is aware of it. It is available only on Apple devices and on Windows 7 or later; there is no mobile application for Android. iCloud serves more than 850 million active users.

With iCloud, you can store data such as documents, music, and pictures on the remote servers of Apple's data centers. It also stores data from other Apple apps such as Mail, Calendar, Contacts, Reminders, and Safari.

iCloud pricing

iCloud pricing varies from country to country:

United States (USD)
  • 50GB: $0.99
  • 200GB: $2.99
  • 2TB: $9.99
India (INR)
  • 50GB: Rs 75
  • 200GB: Rs 219
  • 2TB: Rs 749

Cons of iCloud

  • Only people with an Apple ID can use this service.
  • There is no Android app for this service.
  • Some app data can only be transferred from one Apple device to another.

3. Google Drive

Google Drive was developed by Google itself, and it comes with built-in integration on Android devices.

Pros of Google Drive

  • It integrates with many other services.
  • Provides 15GB of free space to every user, one of the most generous free tiers among the major services.
  • Very easy to use.
  • Mobile and desktop apps are available for every major operating system.
  • It can be synced with any device.
  • One of the best cloud storage services for businesses big and small.
  • With Google Drive, you can store any type of file.
  • You can also integrate many third-party applications with Google Drive.

Google Drive pricing

Storage Price
15GB Free
100GB $1.99 per month
200GB $2.99 per month
2TB $9.99 per month
10TB $99.99 per month
20TB $199.99 per month
30TB $299.99 per month

Cons of Google Drive

  • 5TB individual file size limit in Google Drive.
  • No password protection feature for shared files.
  • The web interface of Google Drive can be confusing.

4. Microsoft OneDrive

Microsoft OneDrive was specially designed for Microsoft users, so that documents from Microsoft applications can be saved to cloud storage.

Pros of Microsoft OneDrive

  • Considered one of the best cloud storage services for data management, project and workflow management, user management, and branding.
  • Provides mobile and desktop applications for all the popular operating systems, including Android, Windows, and Apple platforms.
  • 5GB of free cloud storage for every user.
  • Integrates easily with Windows File Explorer so you can quickly back up your data.

OneDrive pricing

Plan        Edition               Price           Storage
Personal    Personal              Free            5GB
            Personal Unlimited    $9.95/month     Unlimited storage
            Custom                $5/month        500GB
Business    Custom                $7/month        500GB
            Business Unlimited    $29.95/month    Unlimited (includes branding features)
            Reseller Unlimited    $59.95/month    Unlimited (comes with a partner account)

Cons of OneDrive

  • Often compared with Google Drive, and it provides less free space than Google.
  • Individual file uploads are limited to 15GB.
  • Does not have an easy-to-use interface.

5. IDrive

IDrive is a cloud storage service that mostly focuses on the data backup feature.

Pros of IDrive

  • Highly recommended for freelancers and small organizations.
  • Storage plans vary from 5GB to 1.25TB.
  • Supported on Windows, Mac, iOS, and Android.
  • Individual files cannot exceed 2GB.
  • The free plan provides 5GB of space.
  • Deleted files can be recovered within 30 days.
  • It can be synced with any device.

IDrive Pricing

Plan                                                        Storage   Price
Basic                                                       5GB       Free
IDrive Personal                                             2TB       $104.25 for two years
IDrive Personal                                             5TB       $149.25 for two years
IDrive Business (unlimited users, computers, and servers)   250GB     $149.25 for two years
IDrive Business                                             500GB     $299.25 for two years
IDrive Business                                             1.25TB    $749.25 for two years

Cons of IDrive

  • It has a confusing interface that is hard to use.
  • Its software can slow down your system.

6. Mega

Mega is a cloud storage and file hosting service offered by an Auckland-based company known as Mega Limited. It is the only top cloud storage service that provides 50GB of free data storage.

Pros of Mega

  • Provides mobile and desktop applications for Windows, Linux, Android, iOS, and macOS.
  • Browser extensions for Mega cloud storage are available for many popular web browsers.
  • Well known for its security features: files are end-to-end encrypted before they are uploaded, so even Mega Limited employees cannot access the data without the proper key.
  • Serves more than 100 million registered users.

Mega pricing

Plan            Storage   Price/month
Mega Free       50GB      Free
Mega Lite       200GB     $4.99
Mega Pro I      500GB     $9.99
Mega Pro II     2TB       $19.99
Mega Pro III    4TB       $29.99

Mega Cons

  • Its web interface is only supported by Chrome and Mozilla Firefox.

7. Box

Box is highly recommended for enterprise solutions and for small teams.

Pros of Box

  • It comes with 10GB of free storage.
  • Focuses on the security and privacy of its customers.
  • Can store any type of file.
  • Integrates with G Suite, so any Google files can be stored and shared with Box.
  • Accessible from any device.
  • Easy to use.
  • You can invite other Box users to view your files and easily share data.

Box pricing

Plan                Storage     Price
Individual          10GB        Free
Personal Pro        100GB       $10 per month
Business Starter    100GB       $5 per user/month
Business            Unlimited   $15 per user/month
Pro                 1TB         $4250
Custom              Unlimited   Contact the company

Cons of Box

  • Very expensive
  • 5GB file size upload limitation.

8. pCloud

pCloud is specially designed to store large files, and it is often used by individuals and small businesses. pCloud offers two main storage plans: Premium (500GB) and Premium Plus (2TB).

Pros of pCloud

  • Supported on every major operating system, including Windows, Linux, iOS, Mac, and Android.
  • Provides TLS/SSL encryption for data security.
  • Multiple file-sharing options.
  • Pictures and videos from any social media platform can be stored in pCloud.
  • Offers lifetime plans, which most other cloud storage providers do not.

pCloud Pricing

Plan Storage Price/month
Premium 500GB $3.99
Premium Plus 2TB $7.99

Cons of pCloud 

  • No free storage plan.

9. Tresorit

Tresorit is a cloud storage service that emphasizes data security and encryption, and it is an ideal cloud storage service for individuals and businesses.

Pros of Tresorit

  • Provides end-to-end data encryption on shared files.
  • Supported by Windows, Linux, Android, Mac, and iOS.
  • Highly recommended for confidential data storage.
  • Very secure: Tresorit encrypts every file before it is uploaded to the cloud, using the Advanced Encryption Standard (AES) algorithm with 256-bit keys.

Tresorit pricing

Plan                        Storage    Price/month
Small Business (2 users)    1,000GB    $20
Big Business (10 users)     1TB        $12
Enterprise (100 users)      1TB        $24

Cons of Tresorit

  • Does not provide any free storage.

10. Amazon Drive

Amazon Drive was formerly known as Amazon Cloud Drive. Amazon operates it as a cloud storage service offering file backup, file sharing, and photo printing.

  • This cloud service is limited to certain countries, including the United States, United Kingdom, Japan, Germany, Spain, France, Italy, India, Australia, Canada, China, and Brazil.
  • Provides unlimited free photo storage for Amazon Prime subscribers.

Amazon Drive pricing

Plans Storage Price/Month
Prime Membership 5GB Free
100 GB Amazon Storage plan 100GB $19.99
1 TB Amazon Storage Plan 1TB $59.99

Cons of Amazon Drive

  • For personal use only; not suitable for business use.
  • Files larger than 2GB cannot be uploaded.
  • Streaming is not available for videos longer than 20 minutes or larger than 2GB.

Conclusion

There are many cloud storage services, but here we have covered only the top 10 of 2020. Before you choose any one of these services, make sure all of its features meet your needs. If you need storage for a large enterprise, your focus should be on the security features the service provides.

We hope you liked this article. If you have any suggestions, or if we have missed a cloud storage service, please let us know by dropping a comment below.

The post Top 10 Cloud Storage & File-Sharing Services for 2020 appeared first on Cloud Academy.

]]>
S3 FTP: Build a Reliable and Inexpensive FTP Server Using Amazon’s S3 https://cloudacademy.com/blog/s3-ftp-server/ https://cloudacademy.com/blog/s3-ftp-server/#respond Sat, 13 Apr 2019 11:32:00 +0000 https://cloudacademy.com/blog/?p=6649 Is it possible to create an S3 FTP file backup/transfer solution, minimizing associated file storage and capacity planning administration headache? FTP (File Transfer Protocol) is a fast and convenient way to transfer large files over the Internet. You might, at some point, have configured an FTP server and used block...

The post S3 FTP: Build a Reliable and Inexpensive FTP Server Using Amazon’s S3 appeared first on Cloud Academy.

]]>
Is it possible to create an S3 FTP file backup/transfer solution, minimizing associated file storage and capacity planning administration headache?

FTP (File Transfer Protocol) is a fast and convenient way to transfer large files over the Internet. You might, at some point, have configured an FTP server and used block storage, NAS, or a SAN as your backend. However, using this kind of storage requires infrastructure support and can cost you a fair amount of time and money.

Could an S3 FTP solution work better? Since AWS’s reliable and competitively priced infrastructure is just sitting there waiting to be used, we were curious to see whether AWS can give us what we need without the administration headache.

Updated 14/Aug/2019 – streamlined instructions and confirmed that they are still valid and work.

Why S3 FTP?

Amazon S3 is reliable and accessible, that’s why. Also, in case you missed it, AWS just announced some new Amazon S3 features during the last edition of re:Invent.

  • Amazon S3 provides infrastructure that’s “designed for durability of 99.999999999% of objects.”
  • Amazon S3 is built to provide “99.99% availability of objects over a given year.”
  • You pay for exactly what you need with no minimum commitments or up-front fees.
  • With Amazon S3, there’s no limit to how much data you can store or when you can access it.
  • Last but not least, you can always optimize Amazon S3’s performance.
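To put those durability numbers in perspective, here is a rough back-of-the-envelope calculation (assuming, as AWS's wording suggests, that the 11-nines figure is an annual per-object durability, and that object losses are independent):

```python
# Rough expected-loss math for S3's "11 nines" durability figure.
# Assumption: 99.999999999% is the probability that a single object
# survives one year, and object losses are independent events.

durability = 0.99999999999           # 11 nines
annual_loss_rate = 1 - durability    # chance one object is lost in a year

objects = 10_000_000                 # suppose you store 10 million objects
expected_losses_per_year = objects * annual_loss_rate

print(f"Expected losses per year: {expected_losses_per_year:.4f}")
print(f"Roughly one lost object every {1 / expected_losses_per_year:,.0f} years")
```

In other words, even with ten million objects stored, you would expect to lose a single object only about once every 10,000 years.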

NOTE: FTP is not a secure protocol and should not be used to transfer sensitive data. You might consider using the SSH File Transfer Protocol (sometimes called SFTP) for that.

Using S3 FTP: object storage as filesystem

SAN, iSCSI, and local disks are block storage devices. That means block storage volumes are attached directly to a machine running an operating system that drives your filesystem operations. S3, however, is built for object storage: interactions occur at the application level via an API, which means you can’t mount S3 directly within your operating system.

S3FS To the Rescue

S3FS-Fuse will let us mount a bucket as a local filesystem with read/write access. On S3FS-mounted file systems, we can simply use cp, mv, ls, and all the basic Unix file management commands to manage S3 resources as if they were on locally attached disks. S3FS-Fuse is a FUSE-based file system that enables fully functional filesystems in a userspace program.

GitHub S3FS Repository

So it seems that we’ve got all the pieces for an S3 FTP solution. How will it actually work?

S3FTP Environment Settings

In this documented setup, the following environment settings have been used. If you deviate from these values, be sure to use the values you set within your own environment:

  1. S3 Bucket Name: ca-s3fs-bucket (you’ll need to use your own unique bucket name)
  2. S3 Bucket Region: us-west-2
  3. S3 IAM Policy Name: S3FS-Policy
  4. EC2 IAM Role Name: S3FS-Role
  5. EC2 Public IP Address: 18.236.230.74 (yours will definitely be different)
  6. FTP User Name: ftpuser1
  7. FTP User Password: your-strong-password-here

Note: The remainder of this setup assumes that the S3 bucket and EC2 instance are deployed/provisioned in the same AWS Region.

S3FTP Installation and Setup

Step 1: Create an S3 Bucket

The first step is to create an S3 bucket, which will be the end location for our FTP-uploaded files. We can do this simply by using the AWS console:

Create S3 Bucket

Step 2: Create an IAM Policy and Role for S3 Bucket Read/Write Access

Next, we create an IAM Policy and Role to control access into the previously created S3 bucket.

Later on, our EC2 instance will be launched with this role attached to grant it read and write bucket permissions. Note: it is very important to take this approach when granting permissions to the S3 bucket, as we want to avoid hard-coding credentials within any of the scripts and/or configuration later applied to our EC2 FTP instance.

We can use the following AWS CLI command and JSON policy file to perform this task:

aws iam create-policy \ 
--policy-name S3FS-Policy \ 
--policy-document file://s3fs-policy.json

Where the contents of the s3fs-policy.json file are:

Note: the bucket name (ca-s3fs-bucket) needs to be replaced with the S3 bucket name that you use within your own environment.

{
   "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:ListBucket"],
            "Resource": ["arn:aws:s3:::ca-s3fs-bucket"]
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:DeleteObject"
            ],
            "Resource": ["arn:aws:s3:::ca-s3fs-bucket/*"]
        }
    ]
}
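The policy deliberately splits bucket-level and object-level permissions: s3:ListBucket applies to the bucket ARN itself, while the object actions apply to the `/*` resource. A quick way to sanity-check the JSON before running create-policy is to parse it, for example with a short Python snippet like this sketch (the `ca-s3fs-bucket` name matches this post's example and should be replaced with your own):

```python
import json

# Parse the s3fs-policy.json document and confirm which statement
# grants bucket-level access and which grants object-level access.
policy = json.loads("""
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:ListBucket"],
            "Resource": ["arn:aws:s3:::ca-s3fs-bucket"]
        },
        {
            "Effect": "Allow",
            "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject"],
            "Resource": ["arn:aws:s3:::ca-s3fs-bucket/*"]
        }
    ]
}
""")

for stmt in policy["Statement"]:
    # The "/*" suffix marks object-level resources; its absence marks
    # the bucket itself.
    scope = "objects" if any(r.endswith("/*") for r in stmt["Resource"]) else "bucket"
    print(stmt["Action"], "->", scope)
```

If listing works but reads or writes fail once the bucket is mounted, a mixed-up Resource suffix in one of these two statements is the usual culprit.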

Using the AWS IAM console, we then create the S3FS-Role and attach the S3FS-Policy like so:

Create IAM Role

Step 3: Launch FTP Server (EC2 instance – Amazon Linux)

Launch EC2 Instance

We’ll use AWS’s Amazon Linux 2 for our EC2 instance that will host our FTP service. Again using the AWS CLI we can launch an EC2 instance by running the following command – ensuring that we launch with the S3FS-Role attached.

Note: in this case we are lazily using the --associate-public-ip-address parameter to temporarily assign a public IP address for demonstration purposes. In a production environment we would provision an Elastic IP (EIP) address and use that instead.

aws ec2 run-instances \
--image-id ami-0d1000aff9a9bad89 \
--count 1 \
--instance-type t3.micro \
--iam-instance-profile Name=S3FS-Role \
--key-name EC2-KEYNAME-HERE \
--security-group-ids SG-ID-HERE \
--subnet-id SUBNET-ID-HERE \
--associate-public-ip-address \
--region us-west-2 \
--tag-specifications \
'ResourceType=instance,Tags=[{Key=Name,Value=s3fs-instance}]' \
'ResourceType=volume,Tags=[{Key=Name,Value=s3fs-volume}]'

EC2 Running Instance

Step 4: Build and Install S3FS from Source

Note: The remainder of the S3 FTP installation as follows can be quickly performed by executing the s3ftp.install.sh script on the EC2 instance that you have just provisioned. The script assumes that the S3 bucket has been created in the Oregon (us-west-2) region. If your setup is different, you can simply update the variables at the top of the script to address differences.

Otherwise, here is the S3 FTP installation performed manually…

Next we need to update the local operating system packages and install extra packages required to build and compile the s3fs binary.

sudo yum -y update && \
sudo yum -y install \
jq \
automake \
openssl-devel \
git \
gcc \
libstdc++-devel \
gcc-c++ \
fuse \
fuse-devel \
curl-devel \
libxml2-devel

Download the S3FS source code from GitHub, run the pre-build scripts, build and install the s3fs binary, and confirm s3fs binary is installed correctly.

git clone https://github.com/s3fs-fuse/s3fs-fuse.git
cd s3fs-fuse/

./autogen.sh
./configure

make
sudo make install

which s3fs
s3fs --help

Step 5: Configure FTP User Account and Home Directory

We create our ftpuser1 user account which we will use to authenticate against our FTP service:

sudo adduser ftpuser1
sudo passwd ftpuser1

We create the directory structure for the ftpuser1 user account, which we will later configure within our FTP service and which will be mounted using the s3fs binary:

sudo mkdir /home/ftpuser1/ftp
sudo chown nfsnobody:nfsnobody /home/ftpuser1/ftp
sudo chmod a-w /home/ftpuser1/ftp
sudo mkdir /home/ftpuser1/ftp/files
sudo chown ftpuser1:ftpuser1 /home/ftpuser1/ftp/files

Step 6: Install and Configure FTP Service

We are now ready to install and configure the FTP service. We do so by installing the vsftpd package:

sudo yum -y install vsftpd

Take a backup of the default vsftpd.conf configuration file:

sudo mv /etc/vsftpd/vsftpd.conf /etc/vsftpd/vsftpd.conf.bak

We’ll now regenerate the vsftpd.conf configuration file by running the following commands:

sudo -s
EC2_PUBLIC_IP=`curl -s ifconfig.co`
cat > /etc/vsftpd/vsftpd.conf << EOF
anonymous_enable=NO
local_enable=YES
write_enable=YES
local_umask=022
dirmessage_enable=YES
xferlog_enable=YES
connect_from_port_20=YES
xferlog_std_format=YES
chroot_local_user=YES
listen=YES
pam_service_name=vsftpd
tcp_wrappers=YES
user_sub_token=\$USER
local_root=/home/\$USER/ftp
pasv_min_port=40000
pasv_max_port=50000
pasv_address=$EC2_PUBLIC_IP
userlist_file=/etc/vsftpd.userlist
userlist_enable=YES
userlist_deny=NO
EOF
exit

The previous commands should have now resulted in the following specific set of VSFTP configuration properties:

sudo cat /etc/vsftpd/vsftpd.conf

anonymous_enable=NO
local_enable=YES
write_enable=YES
local_umask=022
dirmessage_enable=YES
xferlog_enable=YES
connect_from_port_20=YES
xferlog_std_format=YES
chroot_local_user=YES
listen=YES
pam_service_name=vsftpd
userlist_enable=YES
tcp_wrappers=YES
user_sub_token=$USER
local_root=/home/$USER/ftp
pasv_min_port=40000
pasv_max_port=50000
pasv_address=X.X.X.X
userlist_file=/etc/vsftpd.userlist
userlist_deny=NO

Where X.X.X.X has been replaced with the public IP address assigned to your own EC2 instance.

Before we move on, consider the firewall requirements implied by the vsftpd configuration we have just saved:

  • This configuration leverages passive ports (40000-50000) for the actual FTP data transmission. Your FTP client must be able to make outbound connections to both the default FTP command port (21) and the passive port range (40000-50000).
  • The FTP EC2 instance’s security group will need to allow inbound connections on those same ports, where the source IP address of the inbound FTP traffic generated by your FTP client is your external public IP address.

Since we are configuring a user list file, we need to add our ftpuser1 user account into the vsftpd.userlist file:

echo "ftpuser1" | sudo tee -a /etc/vsftpd.userlist

Finally, we are ready to start up the FTP service. We do so by running the command:

sudo systemctl start vsftpd

Let’s check to ensure that the FTP service started up and is running correctly:

sudo systemctl status vsftpd

● vsftpd.service - Vsftpd ftp daemon
Loaded: loaded (/usr/lib/systemd/system/vsftpd.service; disabled; vendor preset: disabled)
Active: active (running) since Tue 2019-08-13 22:52:06 UTC; 29min ago
Process: 22076 ExecStart=/usr/sbin/vsftpd /etc/vsftpd/vsftpd.conf (code=exited, status=0/SUCCESS)
Main PID: 22077 (vsftpd)
CGroup: /system.slice/vsftpd.service
└─22077 /usr/sbin/vsftpd /etc/vsftpd/vsftpd.conf

Step 7: Test FTP with FTP client

Ok so we are now ready to test our FTP service – we’ll do so before we add the S3FS mount into the equation.

On a Mac we can use Brew to install the FTP command line tool:

brew install inetutils

Let’s now authenticate against our FTP service using the public IP address assigned to the EC2 instance. In this case the public IP address I am using is 18.236.230.74 – this will be different for you. We authenticate using the ftpuser1 user account we previously created:

ftp 18.236.230.74
Connected to 18.236.230.74.
220 (vsFTPd 3.0.2)
Name (18.236.230.74): ftpuser1
331 Please specify the password.
Password:
230 Login successful.
ftp>

We need to ensure we are in passive mode before we perform the FTP put (upload). In this case I am uploading a local file named mp3data – again, this will be different for you:

ftp> passive
Passive mode on.
ftp> cd files
250 Directory successfully changed.
ftp> put mp3data
227 Entering Passive Mode (18,236,230,74,173,131).
150 Ok to send data.
226 Transfer complete.
131968 bytes sent in 0.614 seconds (210 kbytes/s)
ftp>
ftp> ls -la
227 Entering Passive Mode (18,236,230,74,181,149).
150 Here comes the directory listing.
drwxrwxrwx    1 0 0             0 Jan 01 1970 .
dr-xr-xr-x    3 65534 65534          19 Oct 25 20:17 ..
-rw-r--r--    1 1001 1001       131968 Oct 25 21:59 mp3data
226 Directory send OK.
ftp>
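The `227 Entering Passive Mode (18,236,230,74,173,131)` replies above encode the server address and the negotiated data port: the first four numbers are the IP address octets, and the last two are the high and low bytes of the port. A small sketch of the decoding, handy for verifying that the negotiated port really falls inside the 40000-50000 passive range we set in vsftpd.conf:

```python
def decode_pasv(reply):
    """Decode an FTP 227 (Entering Passive Mode) reply into (ip, data_port)."""
    nums = reply[reply.index("(") + 1 : reply.index(")")].split(",")
    ip = ".".join(nums[:4])
    port = int(nums[4]) * 256 + int(nums[5])  # high byte * 256 + low byte
    return ip, port

ip, port = decode_pasv("227 Entering Passive Mode (18,236,230,74,173,131).")
print(ip, port)                  # 18.236.230.74 44419
assert 40000 <= port <= 50000    # matches pasv_min_port/pasv_max_port
```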

Let’s now delete the remote file and then quit the FTP session:

ftp> del mp3data
ftp> quit

Ok that looks good!

We are now ready to move on and configure the S3FS mount…

Step 8: Startup S3FS and Mount Directory

Run the following commands to launch the s3fs process.

Notes:

  • The s3fs process requires the hosting EC2 instance to have the S3FS-Role attached, as it uses the security credentials provided through this IAM Role to gain read/write access to the S3 bucket.
  • The following commands assume that the S3 bucket and EC2 instance are deployed/provisioned in the same AWS Region. If this is not the case for your deployment, hardcode the REGION variable to be the region that your S3 bucket resides in.
EC2METALATEST=http://169.254.169.254/latest && \
EC2METAURL=$EC2METALATEST/meta-data/iam/security-credentials/ && \
EC2ROLE=`curl -s $EC2METAURL` && \
S3BUCKETNAME=ca-s3fs-bucket && \
DOC=`curl -s $EC2METALATEST/dynamic/instance-identity/document` && \
REGION=`jq -r .region <<< $DOC`
echo "EC2ROLE: $EC2ROLE"
echo "REGION: $REGION"
sudo /usr/local/bin/s3fs $S3BUCKETNAME \
-o use_cache=/tmp,iam_role="$EC2ROLE",allow_other /home/ftpuser1/ftp/files \
-o url="https://s3-$REGION.amazonaws.com" \
-o nonempty

Let’s now do a process check to ensure that the s3fs process has started:

ps -ef | grep s3fs

root 12740 1  0 20:43 ? 00:00:00 /usr/local/bin/s3fs 
ca-s3fs-bucket -o use_cache=/tmp,iam_role=S3FS-Role,allow_other 
/home/ftpuser1/ftp/files -o url=https://s3-us-west-2.amazonaws.com

Looks Good!!

Note: If required, the following debug command can be used for troubleshooting and debugging of the S3FS Fuse mounting process:

sudo /usr/local/bin/s3fs ca-s3fs-bucket \
-o use_cache=/tmp,iam_role="$EC2ROLE",allow_other /home/ftpuser1/ftp/files \
-o dbglevel=info -f \
-o curldbg \
-o url="https://s3-$REGION.amazonaws.com" \
-o nonempty

Step 9: S3 FTP End-to-End Test

In this test, we are going to use FileZilla, an FTP client, with Site Manager to configure our connection. Note that we explicitly set the encryption option to insecure for demonstration purposes. Do NOT do this in production if transferring sensitive files; instead, set up SFTP or FTPS.

FileZilla Site Manager

With our FTP connection and credential settings in place we can go ahead and connect…

Ok, we are now ready to do an end-to-end file transfer test using FTP. In this example we FTP the mp3data file across by dragging and dropping it from the left hand side to the right hand side into the files directory – and kaboom it works!!

FileZilla FTP Application

The acid test is to now review the AWS S3 web console and confirm the presence of the mp3data file within the configured bucket, which we can clearly see here:

S3 Bucket FTP File

From now on, any files you FTP into your user directory will automatically be uploaded and synchronized into the respective Amazon S3 bucket. How cool is that!

Summary

Voila! An S3 FTP server!

As you have just witnessed – we have successfully proven that we can leverage the S3FS-Fuse tool together with both S3 and FTP to build a file transfer solution. Let’s again review the S3 associated benefits of using this approach:

  • Amazon S3 provides infrastructure that’s “designed for durability of 99.999999999% of objects.”
  • Amazon S3 is built to provide “99.99% availability of objects over a given year.”
  • You pay for exactly what you need, with no minimum commitments or up-front fees.
  • With Amazon S3, there’s no limit to how much data you can store or when you can access it.

If you want to deepen your understanding of how S3 works, then check out the CloudAcademy course Storage Fundamentals for AWS.

CloudAcademy S3 Storage Fundamentals

 

The post S3 FTP: Build a Reliable and Inexpensive FTP Server Using Amazon’s S3 appeared first on Cloud Academy.

]]>
Understanding Object Storage and Block Storage Use Cases https://cloudacademy.com/blog/object-storage-block-storage/ https://cloudacademy.com/blog/object-storage-block-storage/#comments Tue, 12 Mar 2019 00:30:15 +0000 https://cloudacademy.com/blog/?p=7308 Cloud Computing, like any computing, is a combination of CPU, memory, networking, and storage. Infrastructure as a Service (IaaS) platforms allow you to store your data in either Block Storage or Object Storage formats. Understanding the differences between these two formats – and how they can sometimes be used together...

The post Understanding Object Storage and Block Storage Use Cases appeared first on Cloud Academy.

]]>
Cloud Computing, like any computing, is a combination of CPU, memory, networking, and storage. Infrastructure as a Service (IaaS) platforms allow you to store your data in either Block Storage or Object Storage formats.

Understanding the differences between these two formats – and how they can sometimes be used together – can be a critical part of designing an overall storage profile. And the relatively low costs of cloud storage, along with its durability and high availability, can make it attractive even for local infrastructure projects.

What is Block Storage?

Block storage devices provide fixed-sized raw storage capacity. Each storage volume can be treated as an independent disk drive and controlled by an external server operating system. Such a block device can be mounted by the guest operating system as if it were a physical disk. The most common examples of block storage are SAN, iSCSI, and local disks.

Block storage is the most commonly used storage type for most applications. It can be attached either locally or over a network, and volumes are typically formatted with a file system such as FAT32, NTFS, EXT3, or EXT4.

Use cases

  • Ideal for databases, since a DB requires consistent I/O performance and low-latency connectivity.
  • Use block storage for RAID volumes, where you combine multiple disks organized through striping or mirroring.
  • Any application which requires server-side processing, such as Java, PHP, or .NET, will require block storage.
  • Running mission-critical applications like Oracle, SAP, Microsoft Exchange, and Microsoft SharePoint.

Block storage options in the cloud

  1. AWS Elastic Block Storage (EBS): Amazon EBS provides raw storage – just like a hard disk – which you can attach to your Elastic Cloud Compute (EC2) instances. Once attached, you create a file system and get immediate access to your storage. You can create EBS General Purpose (SSD) and Provisioned IOPS (SSD) volumes up to 16 TB in size, and slower, legacy magnetic volumes.
  2. Rackspace Cloud Block Storage: Rackspace provides raw storage devices capable of delivering super fast 10GbE internal connections.
  3. Azure Premium Storage: Premium Storage delivers high-performance, low-latency disk support for I/O intensive workloads running on Azure Virtual Machines. Volumes allow up to 32 TB of storage.
  4. Google Persistent Disks: Compute Engine Persistent Disks provide network-attached block storage, much like a high speed and highly reliable SAN, for Compute Engine instances. You can remove a disk from one server and attach it to another server, or attach one volume to multiple nodes in read-only mode. Two types of block storage are available: Standard Persistent Disk and Solid-State Persistent Disks.
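The distinction above can be made concrete with a toy model: at the device level, a block volume is nothing more than an array of fixed-size blocks addressed by number, and it is the filesystem layered on top (NTFS, EXT4, and so on) that turns those raw blocks into files and directories. A minimal illustrative sketch, not how any real driver is implemented:

```python
BLOCK_SIZE = 4096  # bytes; a common filesystem block size

class ToyBlockVolume:
    """A block device reduced to its essence: numbered fixed-size blocks."""

    def __init__(self, num_blocks):
        self.blocks = [bytes(BLOCK_SIZE) for _ in range(num_blocks)]

    def write_block(self, index, data):
        # Writes happen in place, at a specific block address.
        assert len(data) <= BLOCK_SIZE
        self.blocks[index] = data.ljust(BLOCK_SIZE, b"\x00")

    def read_block(self, index):
        return self.blocks[index]

vol = ToyBlockVolume(num_blocks=1024)   # a 4 MiB "volume"
vol.write_block(7, b"hello")
print(vol.read_block(7)[:5])            # b'hello'
```

The ability to seek to an arbitrary block and rewrite it in place is exactly what databases and filesystems rely on, and it is what object stores deliberately do not offer.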

What is Object Storage

Block storage volumes can only be accessed when they’re attached to an operating system. Data kept on object storage devices, which consists of the object data plus metadata, can instead be accessed directly through APIs or over HTTP/HTTPS. You can store any kind of data: photos, videos, log files, and more. Object stores are designed for extremely high durability; data can be replicated across different data centers, and the services offer simple web services interfaces for access.

A simple use case: application developers who deal with large amounts of user-generated media can use object storage to store an effectively unlimited number of media files. As data stores scale to hundreds of terabytes and then into the petabyte range and beyond, object storage becomes even more attractive.
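The object-storage semantics described above (a flat namespace of keys, each holding data plus metadata, accessed through an API rather than mounted as a disk) can be sketched in a few lines. This is an illustrative toy that omits buckets, the HTTP layer, and replication:

```python
class ToyObjectStore:
    """Object storage reduced to its essence: key -> (data, metadata)."""

    def __init__(self):
        self._objects = {}

    def put_object(self, key, data, metadata):
        # Objects are written and replaced wholesale; there is no
        # "seek and edit in place" as there is with block storage.
        self._objects[key] = {"data": data, "metadata": metadata}

    def get_object(self, key):
        return self._objects[key]

store = ToyObjectStore()
store.put_object("photos/cat.jpg", b"\xff\xd8...", {"content-type": "image/jpeg"})
obj = store.get_object("photos/cat.jpg")
print(obj["metadata"]["content-type"])    # image/jpeg
```

Real services such as Amazon S3 expose exactly this put/get shape through their APIs, with the metadata travelling alongside the object rather than living in a separate filesystem structure.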

Use Cases

  • Storage of unstructured data like music, image, and video files.
  • Storage for backup files, database dumps, and log files.
  • Large data sets. Whether you’re storing pharmaceutical or financial data, or multimedia files such as photos and videos, storage can be used as your big data object store.
  • Archive files in place of local tape drives. Media assets such as video footage can be stored in object storage and archived to Amazon Glacier.

Object storage options in the Cloud

  1. Amazon S3: Amazon S3 stores data as objects within resources called “buckets.” AWS S3 offers features like 99.999999999% durability, cross-region replication, event notifications, versioning, encryption, and flexible storage options (redundant and standard).
  2. Rackspace Cloud Files: Cloud Files provides online object storage for files and media. Cloud Files writes each file to three storage disks on separate nodes that have dual power supplies. All traffic between your application and Cloud Files uses SSL to establish a secure, encrypted channel. You can host static websites (for example: blogs, brochure sites, small company sites) entirely from Cloud Files with a global CDN.
  3. Azure Blob Storage: For users with large amounts of unstructured data to store in the cloud, Blob storage offers a cost-effective and scalable solution. Every blob is organized into a container with up to a 500 TB storage account capacity limit.
  4. Google Cloud Storage: Cloud Storage allows you to store data in Google’s cloud. Google Cloud Storage supports individual objects that are terabytes in size, as well as a large number of buckets per account. It provides strong read-after-write consistency for all upload and delete operations. Two storage classes are available: Standard and Nearline (with Nearline being MUCH cheaper).

Conclusion

Object storage and Block storage both have unique advantages and limitations. Understanding the use cases and costs associated with each medium will help you get the best possible mileage out of your application storage profile.

The post Understanding Object Storage and Block Storage Use Cases appeared first on Cloud Academy.

]]>
Inside the Cloud – Episode 3: Security, Migration, and Storage on Azure Cloud https://cloudacademy.com/blog/inside-the-cloud-security-migration-and-storage-on-azure/ https://cloudacademy.com/blog/inside-the-cloud-security-migration-and-storage-on-azure/#respond Thu, 12 Oct 2017 01:07:13 +0000 https://cloudacademy.com/blog/?p=21953 Our third episode of Inside the Cloud is all about Microsoft Azure. In this episode, we’ll be focusing on the host of new services and updates on Azure Security, Migration, and Storage recently announced on Microsoft Azure following its annual Ignite conference, held last month in Orlando. Microsoft Azure’s hybrid cloud platform...

The post Inside the Cloud – Episode 3: Security, Migration, and Storage on Azure Cloud appeared first on Cloud Academy.

]]>
Our third episode of Inside the Cloud is all about Microsoft Azure. In this episode, we’ll be focusing on the host of new services and updates on Azure Security, Migration, and Storage recently announced on Microsoft Azure following its annual Ignite conference, held last month in Orlando.
Microsoft Azure’s hybrid cloud platform continues to be a strong choice for companies like Dun & Bradstreet, Geico, and recently, Bank of America, who want to take advantage of the benefits of the cloud even while maintaining some on-premises infrastructure for legacy applications. With the new announcements, it’s clear that Microsoft is continuing to focus on strengthening its Azure cloud platform as it works to keep pace with its main competitor, Amazon Web Services.
Ben Lambert, our resident Microsoft Azure expert, will dig into the top 10 Azure announcements that you need to know about, including:

  • How to use the new PowerShell for Cloud Shell, now in public preview.
  • Why you need to immediately check out the new Azure DDoS Protection service.
  • Speaking of security, learn how to use Azure Storage Firewalls and Virtual Networks for layered defense.
  • How the new Azure Migrate service can help move your on-premises VMs and databases to the cloud.
  • How to use Azure File Sync to sync Azure files to an on-premises Windows server.
  • What the new porting of Azure Functions to .NET Core 2.0 means for developers.

The post Inside the Cloud – Episode 3: Security, Migration, and Storage on Azure Cloud appeared first on Cloud Academy.

]]>
0
How to Migrate Your SQL Server Database to Amazon RDS https://cloudacademy.com/blog/how-to-migrate-sql-server-database-to-amazon-rds/ https://cloudacademy.com/blog/how-to-migrate-sql-server-database-to-amazon-rds/#comments Thu, 25 Aug 2016 19:41:55 +0000 https://cloudacademy.com/blog/?p=15805 Amazon RDS is a web service that provides cloud database functionality for developers looking for a cost-effective and simple way to manage databases. If you’re looking to migrate your existing SQL database to RDS, this is the guide for you. RDS offers six database engines: Amazon Aurora Microsoft SQL Server Oracle...

The post How to Migrate Your SQL Server Database to Amazon RDS appeared first on Cloud Academy.

]]>
Amazon RDS is a web service that provides cloud database functionality for developers looking for a cost-effective and simple way to manage databases. If you’re looking to migrate your existing SQL database to RDS, this is the guide for you.
RDS offers six database engines:

  1. Amazon Aurora
  2. Microsoft SQL Server
  3. Oracle
  4. PostgreSQL
  5. MySQL
  6. MariaDB

With RDS, there is no need to buy, rack, and stack hardware or install any software.
The complexity of moving your existing Microsoft SQL Server to Amazon RDS is mostly determined by the size of your database and the types of database objects that you are transferring.

For example, migrating a database that has data sets on the order of gigabytes, along with stored triggers and procedures, is going to be more complex than migrating a modest database with only a few megabytes of test data and no stored procedures or triggers.

Why You Might Consider Migrating Your SQL Database to Amazon RDS

RDS allows developers to set up database instances in the cloud. Developers are relieved of the complexity of managing and maintaining the database. Instead, they can focus on developing successful products.
There’s one issue, however: there is no file system access. Though this is usually not a huge problem, it becomes a concern if you are trying to restore or create an SQL Server backup (.bak) file.
The biggest limitation of an Amazon SQL Server RDS instance is that it cannot restore data from, or import, .bak files. When a situation like this arises, what should a developer do?


Editor’s Note: AWS launched a native SQL server backup and restore feature in late July. You can read more about it here: Amazon RDS for SQL Server – Support for Native Backup/Restore to Amazon S3.


For these situations, a developer can migrate the SQL Server database to Amazon RDS. This may sound like a daunting task, but it is in fact quite easy.
In this post, we have tried to provide you with some easy steps to migrate your SQL Server database to Amazon RDS:

  1. First, take a snapshot of the source RDS instance.
  2. Second, disable automatic backups on the source RDS instance.
  3. Create your target database, disabling all foreign key constraints and triggers.
  4. Import all the logins into the destination database.
  5. Create the schema DDL with the help of the Generate and Publish Scripts Wizard in SSMS.
  6. Execute the SQL commands on your target database to create your schema.
  7. Use either the bulk copy utility (bcp) or the Import/Export Wizard in SSMS to migrate your data from the source database to the target database.
  8. Clean up the target database by re-enabling the foreign key constraints and triggers.
  9. Finally, re-enable automatic backups on the source RDS instance.

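The bulk copy in step 7 can be scripted with SQL Server's `bcp` utility. The Python sketch below only assembles the export and import command strings so you can see the general shape of a `bcp` invocation; the server names, credentials, and file names are placeholders, and this is not a tested migration script.

```python
# Build bcp export/import commands for one table (placeholder credentials).
# "out" dumps a table to a native-format (-n) data file; "in" loads that
# file into the target instance.
def bcp_export(table: str, datafile: str, server: str, user: str, pwd: str) -> str:
    return f"bcp {table} out {datafile} -n -S {server} -U {user} -P {pwd}"

def bcp_import(table: str, datafile: str, server: str, user: str, pwd: str) -> str:
    return f"bcp {table} in {datafile} -n -S {server} -U {user} -P {pwd}"


cmd = bcp_export("MyDb.dbo.Orders", "orders.dat", "localhost", "sa", "secret")
print(cmd)  # bcp MyDb.dbo.Orders out orders.dat -n -S localhost -U sa -P secret
```

You would run the export command against the source server and the corresponding import command against the RDS endpoint, once per table.
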
Thankfully, after experimenting with this process many times, we found a better solution that is not covered in the AWS documentation.

SQL Azure Migration Wizard

To save time and avoid errors, we discovered a better solution: the SQL Azure Migration Wizard. With SQL Azure Migration Wizard, the process of migrating databases (including views, tables, and stored procedures) in, out of, or between RDS instances is much easier and faster.
To migrate your SQL database to Amazon RDS using SQL Azure Migration Wizard, follow these easy steps.

Step 1: Download the SQLAzureMW Tool
Download SQL Azure Migration Wizard from CodePlex and extract the SQLAzureMW.exe file. You can use SQL Server Management Studio to connect to your local SQL Server and your Amazon Web Services RDS instance. Before doing anything else, make sure that you have a good connection to both servers.
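
A quick way to sanity-check connectivity to both servers is to try opening a TCP connection to each instance's SQL Server port (1433 by default). The Python sketch below does just that; the hostnames you would pass in are placeholders for your local server and your RDS endpoint.

```python
import socket

# Return True if a TCP connection to host:port succeeds within the timeout.
# SQL Server listens on TCP 1433 by default; RDS endpoints expose the same port.
def can_reach(host: str, port: int = 1433, timeout: float = 3.0) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers DNS failures, refusals, and timeouts
        return False


# Example (placeholder endpoints):
#   can_reach("localhost")
#   can_reach("mydb.abc123.us-east-1.rds.amazonaws.com")
```

If either check fails, fix your firewall or security group rules before launching the wizard.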

Step 2: Begin the Migration
Double-click the SQLAzureMW.exe file. When the wizard opens, select Database under the Analyze/Migrate category, then click Next.

Step 3: Source Database Tasks
Enter your source SQL Server database connection details and click Connect.
Choose the source database, click Next, and then select the ‘Script all database objects’ option.
This option performs a complete migration of the database. If you don’t want to migrate the entire database, select ‘Select specific database objects’ instead.

Step 4: Create Scripts
Create scripts for all selected SQL Server objects. Save the script to your local hard drive, then click Next.

Step 5: Destination Database Process
Now that you have created a script of your database, enter your RDS SQL Server connection credentials and connect.

Step 6: Select the Target Database
Choose the target database that you would like to migrate to. If you have not created a database yet, create a new one using the Create Database option and continue. Do a quick check to confirm there are no errors.

Step 7: The Grand Finale
You can now open SQL Server Management Studio and verify all the migrated data.
As you can see, SQL Azure Migration Wizard saves a lot of time.
If your database is on-premises, you will have to modify settings in your corporate firewall. If your database is already hosted on Amazon Web Services, you can instead add an entry to your instance’s security group.

Next, launch and prepare an Amazon RDS instance running SQL Server. Restore the database from the SQL dump and take note of the current log file name. You will use the database dump to start the RDS instance.

Once the RDS instance is initialized, run the replication stored procedures supplied as part of the release to configure it. Once the RDS instance has caught up with any changes that have taken place, you can modify the application configuration to use it in place of the existing version.
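
Launching the RDS instance itself can also be automated. The dictionary below sketches the kind of parameters you would pass to the RDS CreateDBInstance API (for example via boto3's `create_db_instance`); the identifier, instance class, storage size, and password shown here are placeholders to adapt to your own environment.

```python
# Placeholder parameters for an RDS CreateDBInstance call running SQL Server.
# Adjust the identifier, class, storage, and credentials for your workload.
def rds_sqlserver_params(identifier: str, master_password: str) -> dict:
    return {
        "DBInstanceIdentifier": identifier,
        "Engine": "sqlserver-se",          # SQL Server Standard Edition
        "DBInstanceClass": "db.m4.large",  # instance size; adjust as needed
        "AllocatedStorage": 100,           # storage in GiB
        "MasterUsername": "admin",
        "MasterUserPassword": master_password,
        "LicenseModel": "license-included",
    }


# A boto3 client would consume these parameters as:
#   boto3.client("rds").create_db_instance(**rds_sqlserver_params("mydb", "..."))
```

Keeping the parameters in one place like this makes it easy to spin up identical staging and production instances.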

Summary

That sums up the process of migrating a SQL Server database to Amazon RDS. As you have seen, data migration is not a complicated process at all. We hope this post is useful to anyone who wants to migrate their SQL Server database to Amazon RDS.

The post How to Migrate Your SQL Server Database to Amazon RDS appeared first on Cloud Academy.

]]>
10
Cloud Learning Management Systems Support Corporate Training and Development https://cloudacademy.com/blog/cloud-learning-management-systems-support-corporate-training-and-development/ https://cloudacademy.com/blog/cloud-learning-management-systems-support-corporate-training-and-development/#comments Tue, 15 Mar 2016 16:25:18 +0000 https://cloudacademy.com/blog/?p=13675 Corporate training is an investment in the present. It yields immediate results for all involved. Image courtesy of pixabay.com, licensed under CC0 Public Domain. Business success depends on a well-trained workforce. It makes sense that a motivated, engaged staff will ultimately result in improved profitability. I contend that most businesses should...

The post Cloud Learning Management Systems Support Corporate Training and Development appeared first on Cloud Academy.

]]>
Corporate training is an investment in the present. It yields immediate results for all involved.
Image courtesy of pixabay.com, licensed under CC0 Public Domain.

Business success depends on a well-trained workforce. It makes sense that a motivated, engaged staff will ultimately result in improved profitability.
I contend that most businesses should implement employee training periodically, and some industries should make an effort to secure continuous training and development for their teams. It makes sense that various tech-related roles, such as engineers, IT professionals, and developers, must stay on top of emerging trends and ongoing technological developments. Their roles depend on maintaining a mastery of emerging tech trends and putting these concepts into practical use for the benefit of their organizations. Tech businesses carefully monitor developments and constantly train their workforces to adapt to new marketplace demands. What is the best use of resources in this pursuit? I suggest smart, flexible corporate training.

There is an ecosystem of commercial education and training vendors, such as Cloud Academy, which offers consumer and enterprise programs on a variety of technology and cloud-related topics at beginner, intermediate, and advanced levels.

A deeper insight into the matter made me realize that development experts and human resources professionals increasingly rely on cloud-based platforms for managing training because workers typically embrace the convenience with enthusiasm. Companies of all sizes finally have access to affordable digital employee training that delivers outstanding results. The flexibility of modern systems allows employers to schedule training in ways that don’t disrupt core workflows. Learners consume content on standard and mobile devices when their schedules permit. I believe this to be a game-changer.

Importance of Training and Development
Traditionally there has been a misconception that training costs detract from profitability. Some managers put employees to work with minimal training and are puzzled when workers appear indifferent about their jobs.

I contend that failure to invest resources in training people sets the stage for dead-end careers, unproductive workers, and customer dissatisfaction. From a financial perspective, employers have a genuine interest in developing human capital. When employees succeed, they enjoy a sense of purpose and accomplishment. Satisfied workers welcome professional engagement, and their success breeds a culture of success.
It’s safe to say that providing meaningful career support empowers employees and offers them a positive vision for the future and a greater purpose in life.

Training and Development Requirements
Businesses in different industries require different levels of training. Most organizations should spend time determining their training/development requirements. In some settings, the process should begin with a formal analysis of workflows and job designs. A thorough review should provide enough data identifying the key training objectives that will improve job performance and satisfaction at all levels. Training plans should include a long-range view that anticipates which skills employees will need in future roles as they advance.
After identifying training requirements and objectives, companies can more accurately gather data to estimate their potential investment costs. It’s only natural that businesses will search for the most efficient ways of developing and delivering training materials to their workers. The solutions may be external, internal or, more typically, a hybrid model. Thanks to technology-based training options, businesses have a near-unlimited selection of great offerings, from simple clerical training to extremely specific technical topics such as a Solutions Architect Professional Level Certification for AWS course.

After considering organizational needs and the options available to fulfill them, companies can decide on the solution that works best for them. Available Cloud-based Learning Management System (LMS) programs conform to the requirements of almost any business.
Below I offer an overview of some LMS programs that I’ve found work well for corporate training.

Cloud-Based LMS Explained
Web-based LMS platforms make administering and using employee training programs possible from almost anywhere in the world. These cloud-based systems enable you to easily deliver content, record activity, and track and report performance. Learning occurs entirely online, where the data is secured by providers.
Cloud-based solutions offer more convenience than traditional LMS applications that required the use of company servers, IT staff, and hands-on maintenance. Cloud-based LMS products require no installation because everyone involved accesses them through web browsers. The time and money saved by using the Cloud for LMS can go toward the training courses.

Cloud LMS products work on a pay-as-you-go pricing model that enables companies to pay only for the resources they use. System maintenance is typically included in the base price. Gone is the fear of system updates or configuration changes. Program plans and subscriptions can scale dynamically based on demand. You can expand or shrink training and development projects based on your business needs.
Trainers accessing cloud-based LMS products have an efficient process for creating and delivering content based on their particular needs. After importing course content into the cloud, trainers can assign learners to the various classes and run reports on the training activities of individual employees and the organization as a whole. LMS administrators can modify or improve courses based on real-time feedback to continually improve the effectiveness of their training program.

Learners have the ability to access training materials wherever and whenever they want, as long as they have an internet connection. Thanks to smartphones and tablets, training via a cloud-based LMS is easy to schedule and complete.

Top Cloud-Based LMS Programs
1. Docebo – A modular LMS that includes an advanced test engine, robust reporting, and performance tracking.
2. TalentLMS – This enterprise-friendly solution lets trainers assemble e-learning courses in minutes and customize them with a look and feel consistent with other corporate online assets.
3. Litmos – Multi-language and localization support, as well as custom branding, make Litmos an appealing LMS. The product allows the development of premium course content for customers and internal content for employees.
4. Firmwater LMS – Designed primarily for training companies, this robust platform can work for any business that will commit to online learning.
5. CourseMill – This LMS also has a traditional software-based counterpart, giving companies many choices for deploying e-learning. It also comes with a compliance interface that ends employee confusion over their responsibilities for completing a course.
Most LMS products offer free trial periods, so companies may evaluate them before purchase.

Mastering Cloud Computing
The growing omnipresence and utilization of Cloud Computing technology increases the demand for cloud-savvy IT professionals. By providing cloud integration, LMS helps businesses automate their training processes while synchronizing the data between the LMS and other platforms. Businesses can leverage various online learning platforms, such as Cloud Academy, aimed at developing their staff’s Cloud Computing skills, by integrating such platforms with the LMS of their choice to provide their staff with specific, high-quality learning materials that ensure skill building.

I encourage you to review some learning paths, courses, labs, and quizzes designed and built by working professionals around cloud topics. These programs lead to greater professional growth in critical areas. Users may follow a path to certification or simply for greater subject mastery and job mobility.

Conclusion
Businesses should always consider the training needs of their human capital. My own experience has taught me that well-trained people produce more and have a greater sense of satisfaction with their jobs. When people have the opportunity to learn and develop at both professional and personal levels, they can look forward to new opportunities for advancement and achievement with their current employers. Owners and managers develop a stable, professional organization that satisfies customers and generates profits.

Choosing Cloud-based LMS solutions reduces the implementation costs of training programs while making corporate training easier to administer. People have the advantage of completing their training courses at times and places convenient to them. Businesses have many LMS options from which to choose. With some research you will find a solution that best serves the needs of your organization.

I have included some success stories from Cloud Academy users that offer insight into the process and results of online corporate training.

The post Cloud Learning Management Systems Support Corporate Training and Development appeared first on Cloud Academy.

]]>
1