Why Linux Containers Are As Cool As Docker Containers!

Four fundamental reasons why you should learn Linux Containers along with Docker, and why they are as cool as Docker containers. Of course, they serve different purposes, but they are complementary.

Whether you are a developer, an infrastructure or operations engineer, or a cloud professional, you ought to learn about Linux Containers! Even more so if you are a college student studying computer science and like to dive deep into system architecture – you should have learned this yesterday. Linux Containers are the next generation of OS virtualization and help you get rid of your long-time virtual machine woes. Let's see why these "System Containers" are so cool. Why do we even call Linux Containers "System Containers"? Simply because they are full-fledged Linux operating systems in themselves. And guess what: multiple such Linux system containers can co-exist on the same physical host in a completely isolated, independent, and efficient manner. Therefore,

What Docker Containers are to Applications, LXD is to Linux Machines!

Linux Containers are Super Efficient

If you have ever tried creating two or more Linux-based virtual machines on your local computer (desktop or laptop) using any hypervisor, you know it's a nightmare in terms of system performance. Each Linux VM can take anywhere from 600 MB to well over 1 GB of disk space, needs a separate OS installation, a static RAM allocation, and much more. Even if the VM's operating system is exactly the same as your physical host's OS, the VM still needs its own duplicate copy of that operating system.

This OS duplication is at the heart of the VM vs. system container comparison. Linux system containers address this issue and do not need a duplicate OS copy to run, irrespective of which Linux distribution runs on your host and inside your containers. Of course, this works because all Linux distributions share the same kernel – every container runs on the host's kernel.
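Here is a minimal sketch (Python driving the standard `lxc` CLI) that makes the shared-kernel point concrete; the container name `c1` is a hypothetical placeholder for any running LXD container on your host.

```python
# Minimal sketch: confirm that a system container shares the host's kernel.
# `c1` is a hypothetical name for an already-running LXD container.
import subprocess

def kernel_release(cmd):
    return subprocess.run(cmd, check=True, capture_output=True, text=True).stdout.strip()

host_kernel = kernel_release(["uname", "-r"])
container_kernel = kernel_release(["lxc", "exec", "c1", "--", "uname", "-r"])

# Both values are the same release string: there is only one kernel, the host's.
print(host_kernel, container_kernel)
```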

To illustrate the above-mentioned scenario, imagine that you have an experiment to perform or a project to implement which needs the following simple configuration:

1.)  Five independent Linux machines – one CentOS, two Ubuntu, one Linux Mint, and one Fedora

2.)  Two HTTP servers, a database server, a cache manager, and a load balancer

Such a multi-Linux-machine scenario can easily be implemented by creating five independent Linux system containers on a single physical host running a single Linux OS. To replicate the same setup with five Linux VMs, you would need five separate copies of a Linux OS, and much more.
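As a rough sketch of how this could look with LXD (again driving the `lxc` CLI from Python), the container names and image aliases below are purely illustrative – check `lxc image list images:` to see which aliases your image remote actually provides:

```python
# Minimal sketch: launch the five machines from the scenario above as LXD
# system containers. Names and image aliases are illustrative placeholders.
import subprocess

machines = {
    "web1":  "images:ubuntu/22.04",    # HTTP server 1
    "web2":  "images:ubuntu/22.04",    # HTTP server 2
    "db1":   "images:centos/9-Stream", # database server
    "cache": "images:fedora/39",       # cache manager
    "lb":    "images:mint/ulyana",     # load balancer
}

for name, image in machines.items():
    # `lxc launch <image> <name>` creates and starts a container in one step.
    subprocess.run(["lxc", "launch", image, name], check=True)

# Show everything now running on the single host.
subprocess.run(["lxc", "list"], check=True)
```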

Linux Containers are Disposable (Infrastructure)

You can literally create, use, and dispose of these system containers (LXD containers) as and when you need them, without disturbing your original Linux host.

Imagine a situation where you have a tricky project to execute that requires a whole bunch of experimental scenarios to work through. This could be a new server setup or a single-node installation of OpenStack. Any infrastructure professional would agree that if you perform such experimentation on your localhost (your physical computer with its base OS), a multitude of changes get made to your base system. This is because complex projects like a single-node OpenStack installation or a comprehensive server setup require you to manipulate system libraries, services, file permissions, and directory structures, to mention just a few things.

It's a safe bet that unless you run a well-defined installation script from the likes of DevStack/Packstack (in the case of OpenStack), or similarly comprehensive and well-tested server setup scripts from GitHub and the like, things will fail many times before you reach a stable state for your desired project outcome. Given all of this, you really do not want to mess up your local system by running such experimental scenarios over and over again.

For such experimentation, just use Linux system containers: create the machines you need, use them, and simply dispose of them at the end. This way, you have experimented enough, completed your infra project, and cleaned up everything you created without ever touching your base system's configuration. Of course, it's not necessary to throw away all your hard work. Much like Docker container images, you can save your Linux system container images, export them, push them to a central repository, and let the wider world benefit from your work.
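A minimal sketch of that dispose-or-keep workflow, assuming a finished container hypothetically named `openstack-lab`:

```python
# Minimal sketch of the disposable-infrastructure workflow via the `lxc` CLI.
# The container name `openstack-lab` and image alias `openstack-lab-snapshot`
# are hypothetical placeholders.
import subprocess

def lxc(*args):
    subprocess.run(["lxc", *args], check=True)

# Keep the result of your hard work: publish the stopped container as a
# reusable image, then export it to a tarball you can share or re-import.
lxc("stop", "openstack-lab")
lxc("publish", "openstack-lab", "--alias", "openstack-lab-snapshot")
lxc("image", "export", "openstack-lab-snapshot", "./openstack-lab-snapshot")

# Dispose of the experiment itself; the host's base system is untouched.
lxc("delete", "openstack-lab")
```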

[Image: an example of a publicly available/shared Linux system container.]

Multiple Linux OS on a Single Linux Host

Linux Containers are managed using LXC/LXD, and it's super awesome. To give you a realistic idea, you can run almost all of the widely used Linux distros – Ubuntu, Fedora, Oracle Linux, openSUSE, Debian, CentOS, Gentoo, and more – using pre-built LXD images, and even customize them further to your needs.

Guess what: you do not need to create multiple VMs with multiple OS copies to achieve this. Just pull the relevant container image from the LXD image repository, start the container, and you're good to go.
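To get a feel for what is on offer, you can browse the public image remote before pulling anything. A small sketch follows; the distro filter strings are illustrative:

```python
# Minimal sketch: browse pre-built images on the public `images:` remote that
# ships preconfigured with LXD. The filter strings are illustrative.
import subprocess

for distro in ["fedora", "debian", "centos", "opensuse", "gentoo"]:
    # `lxc image list <remote>: <filter>` shows matching pre-built images.
    subprocess.run(["lxc", "image", "list", "images:", distro], check=True)
```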

But what is LXD anyway? LXD is a next-generation system container manager. It offers a user experience similar to virtual machines, but uses Linux containers instead. It is image-based, with pre-made images available for a wide range of Linux distributions, and is built around a very powerful yet pretty simple REST API.
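You can poke at that REST API directly with `lxc query`, which sends raw requests to the local LXD daemon over its Unix socket. A minimal sketch; note that older LXD releases expose `/1.0/containers` rather than `/1.0/instances`:

```python
# Minimal sketch: talk to LXD's REST API through `lxc query`.
import json
import subprocess

def query(endpoint):
    result = subprocess.run(
        ["lxc", "query", endpoint], check=True, capture_output=True, text=True
    )
    return json.loads(result.stdout)

# Ask the daemon about itself: versions, kernel, storage driver, and so on.
print(query("/1.0")["environment"]["server_version"])

# List every instance the daemon knows about (URLs under the /1.0 API).
print(query("/1.0/instances"))
```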

Experimentation for Innovation

This aspect is particularly important for curious souls, especially college students wanting to build projects around a plethora of use cases. In the age of cloud computing, the Internet, and containerization, there is really no reason for anyone to stop experimenting.

It’s literally the age of disposable infrastructure!

Cloud computing allows you to create, use, and dispose of almost any infrastructure component you can imagine today. From ready-to-use machine learning/artificial intelligence toolkits to Tableau engines, from Internet of Things frameworks to Python Jupyter notebooks, from container clusters to database engines – it's all out there, waiting to be used for your next awesome use case or project.

But what do you do if you want to experiment with a few machines and apps on a single Linux laptop/desktop WITHOUT an Internet connection? Assuming you have all the prerequisites installed along with LXC/LXD, you can use Linux system containers to build out your use case and experiment.

I am sure by now you realize what Linux system containers allow you to do in terms of machine isolation, independence, efficiency, replication, sharing, and disposability. Sounds similar to what Docker containers do? Yes – but in the context of whole systems rather than individual applications.

So, are you ready to create your first Linux System Container?

5 Reasons Why Cloud Computing Will Grow in the Next 10 Years

The sky is the limit for cloud computing. We are witnessing massive growth in the adoption and evolution of cloud computing. In some sense, cloud providers like Microsoft, AWS, and Google have become to computing and IT what Walmart, IKEA, and Tesco are to retail. They have huge global infrastructures (read: data centers and networks) to offer seamless, on-demand, real-time IT services to individuals as well as enterprises of all sizes.

Can you believe AWS even offers "human work as a service"? Yes, you heard that right. If you have a job that needs a human workforce, AWS can offer you that as well (through Amazon Mechanical Turk).

It may look like the cloud ecosystem has stabilized, with a few major players from the West and the typical consumers being start-ups, large enterprises, and niche use cases. But you are in for a surprise here! Within the next couple of years, the global market for cloud is projected to exceed $200 billion. And this is only going to get bigger and better over the next decade!

Imagine a world where, from the moment you wake up until the time you sip your morning cup of tea or coffee, you have already created megabytes of data on your quality of sleep, heart rate, calories burnt, and so on. Isn't this happening already? Well, yes – but today your phone doesn't recommend a personalized breakfast based on last night's sleep. Nor do you get a fully personalized holiday package with your best friends or family exactly when you need it! We're still far from this, at least at mass scale.

These technologies will become commonplace when we accept them ubiquitously – at work, at home, during travel, and in public services. It is no surprise to see working professionals feeling an urgency to re-skill and upskill themselves in hot tech areas like cloud computing, big data engineering, cloud-based analytics, and IoT. However, pursuing isolated cloud computing or analytics courses is not very helpful. What you need is a truly experiential learning program or pathway.

"A world where every important physical entity (including us) is attached to various types of sensors that generate data 24×7 and automatically push it to a cloud platform, only to be consumed by a machine learning model hosted and managed by the cloud platform, to generate useful artificial intelligence (AI) based results."

But how do we even get there? Where does all this data go? How is it analyzed? Where are all these millions of intelligent computers? How will an individual ever be able to afford them? So, what will propel the cloud over the next 10 years? Multiple factors: 5G networks, big data, artificial intelligence, the Internet of Things, and smart cities.

Super High-Speed Networks

Until last year, Indian mobile consumers kept close track of their Internet usage and its associated costs. However, that habit is history now. Indians together consume more mobile data than the US and China combined. Astonishingly, this has happened in the short span of 12 months. "People have learned the habit of consuming gigabytes of data every day."

In the context of India and most of the world, we have 4G telecom networks today, with data speeds far better than anything 3G offered. Very soon, the world will witness networks 10 to 100 times faster with 5G. For all the right reasons, this might happen in India sooner than in any other country. This means all of us will create and consume terabytes of data on a daily basis. And it is not just us, but everything around us: all our connected devices, home appliances, vehicles, buildings, and so on will continuously generate 10 to 100 times the data being generated today – simply because, with 5G, we will be able to do so at the tap of our fingers!

Such high-speed networks will enable cloud providers to deliver services on an almost real-time basis, without the slight "yet acceptable" latency of the networks available today. This is a positive development for the growth of the cloud.

Big Data

“A faster Internet creates highly engaged customers and happier companies (pun intended).” Simply because you create more data.

When people are given the convenient option of downloading an HD video over inferior resolutions, they will almost always prefer HD. And arguably, they'll do it many times a day. More and more people will move to such media streaming services, which will explode the subscriber base for companies like Netflix, Amazon Prime, and Hotstar. Interestingly, the more you consume from such platforms, the better for them – your engagement generates continuous behavioural data points, which allow the likes of Netflix to create better and more relevant content for various user groups. If you don't already know, Netflix is now transforming itself from a content streaming service into a content creation platform as well.

It is amazing to note that “Televisions never had this big data advantage”.

However, to create truly personalized content engines and recommendations, we need highly skilled big data and cloud computing professionals. Remember, big data only became possible thanks to mobile devices, cheap storage, and cloud computing platforms.

Another exciting area of development is the gaming industry! It's already huge. But with companies and individuals attempting to create real-time massive online games, enhanced with AR/VR and blessed with upcoming 5G speeds, it is only going to get better. With such an immersive experience, there is no doubt that people across the world will spend even more time playing these online games – and, in this entertaining process, generate BIG DATA!

Artificial Intelligence

What do you do with all the data we've talked about? Well, it's used to understand YOU!

If businesses understand you really well (excluding PII – personally identifiable information), they are better able to serve you with the most personalized product or service. But how do they achieve this business goal?

All the big data created by users is ultimately either consumed directly by an analytical tool like Tableau or used to train machine learning models. Like humans, machines learn from observations. The better the data in terms of quality and quantity, the better the learning outcome of these machines (here, "machines" means software models created using code).

Higher network speeds, bigger data pools, faster computing capabilities and, most importantly, ubiquitous real-time connectivity to the Internet are all positives for AI's continuing evolution. A recent MIT Technology Review article suggests that AI will help double the cloud market to over $260 billion in the coming years. And this is just the start.

IoT & Smart Cities

With all these connected devices, data, cloud, and artificial intelligence, it is only natural and imperative that we create communities truly built on these technologies. These communities (read: smart cities) will require real-time insights and intelligent "thinking applications" built using big data, machine learning, and AI. And the cloud helps us do exactly this – from anywhere, at any time, and for whatever we need to build.

To cash in on this trend and technological evolution, tech professionals are trying to jump on the bandwagon of learning cloud computing, AI, and IoT through online courses in cloud computing and allied areas. To reiterate, isolated cloud certifications and courses are only as helpful as getting your foot in the door. To become a true expert, you will need more than an online certification – a truly experiential cloud computing learning program. One such program is offered by Great Learning.

So, in order to create tomorrow's IT solutions based on billions of data-generating devices, applications, algorithms, and people, you will have to change how you design and consume IT infrastructure and applications. But more than that, the cloud fundamentally changes what we can possibly build tomorrow. This is why we think the cloud is a paradigm shift in computing.

Are you ready for the next wave of Cloud revolution?