AWS Solution to Build a Real-Time Data Processing Application Using Kinesis, Lambda, DynamoDB, and S3

Reading Time: 4 minutes

A Capstone Project by Amit Bajaj and Sathya Guruprasad

Introduction

Cloud Computing has become very popular due to the multiple benefits it provides and is being adopted by businesses worldwide. Flexibility to scale up or down as per business needs, faster and more efficient disaster recovery, subscription-based models that reduce the high cost of hardware, and flexible working for employees are some of the benefits of cloud that attract businesses. Like cloud, Data Analytics is another crucial area that businesses are exploring for their growth. The exponential rise in the amount of data available on the internet is a result of the boom in the usage of social media, mobile apps, IoT devices, sensors and so on. It has become imperative for organisations to analyse this data to get insights into their businesses and take appropriate action.

AWS provides a reliable platform for solving complex problems, where cost-effective infrastructure can be built with great ease. AWS offers a wide range of managed services, including computing, storage, networking, database, analytics, application services and many more.

Problem Statement:

We have analysed multiple software solutions that analyse data collected from the market and provide information and suggestions to deliver a better customer experience. These include trading applications providing stock prices, taxi companies providing locations of nearby taxis, journey-planning applications providing live updates on different transport modes, and many more.

We have considered a "server-less" computing execution model to build the real-time data-processing app. The architecture is based on managed services provided by AWS.

What is “Server-less”?

A cloud-based execution model in which the cloud provider dynamically allocates and runs the servers. This is a consumption-based model where pricing is directly proportional to consumer use. AWS takes complete ownership of operational responsibilities, eliminating infrastructure management for the customer and providing higher availability and uptime.

Services Consumed:

  1. Kinesis – Kinesis Data Streams, Kinesis Data Analytics, Kinesis Firehose
  2. Athena
  3. Lambda
  4. DynamoDB
  5. Amazon S3
  6. AWS CLI

Architecture:

(Architecture diagram: data flows from the data generator into a Kinesis data stream, is processed by Kinesis Data Analytics, and is then written via Lambda to DynamoDB and delivered via Kinesis Firehose to S3 for querying with Athena.)

How can data be received from different sources without building a sizable infrastructure?

Amazon Kinesis, a managed service from AWS, makes it easy to collect, process, and analyse real-time streaming data so you can get timely insights and react quickly to new information. A Kinesis data stream allows the user to receive data from the data-generation source. We have created an Amazon Kinesis data stream using AWS CLI commands, which is expected to consume data from the data source.

Technical + Functional Flow 

Create Kinesis data streams: 

      1. Create two streams in Kinesis using the AWS Console or AWS CLI commands: one to receive data from the data generator and another to write the processed output. The data generator produces the data, which is read and written to the input/source data stream. The Kinesis Analytics application processes it and writes the result to the output/destination stream.
      2. We have created a program to generate data, which is transmitted to Kinesis Data Streams with the help of AWS SDKs and AWS CLI commands (a minimal sketch follows this list). Data can be generated in various ways:
        1. Using IoT devices
        2. Live trackers
        3. GPS trackers
        4. API
        5. Data generator tools (in case of Analysis)
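
To make the stream setup and the data generator concrete, here is a minimal Python (boto3) sketch of what these steps could look like; the stream name, region, and record fields are illustrative assumptions rather than the exact values used in the project.

```python
# Minimal sketch (names are illustrative): create the input stream and push a few
# sample records to it, as a pseudo data producer would.
import json
import random
import time

import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")

# One-time setup: a single-shard source stream (the destination stream is created the same way).
kinesis.create_stream(StreamName="input-stream", ShardCount=1)
kinesis.get_waiter("stream_exists").wait(StreamName="input-stream")

# Pseudo data producer: send a ticker-style record once a second.
for _ in range(60):
    record = {"ticker": random.choice(["AAA", "BBB", "CCC"]),
              "price": round(random.uniform(10, 100), 2)}
    kinesis.put_record(
        StreamName="input-stream",
        Data=json.dumps(record).encode("utf-8"),
        PartitionKey=record["ticker"],
    )
    time.sleep(1)
```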

Create a Kinesis Analytics App to Aggregate data

      1. Build a Kinesis Data Analytics application to read from the input/source data stream and write aggregated, formatted records to the output/destination data stream at a specified time interval.
      2. It is very important to stop the application when not in use to avoid unnecessary cost (see the sketch below).
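
As a simple cost-control aid, a boto3 call like the one below could be run (or scheduled) to stop the analytics application when it is idle; the application name is an illustrative assumption.

```python
# Minimal sketch (application name is illustrative): stop the Kinesis Data Analytics
# application so it stops accruing charges when it is not needed.
import boto3

analytics = boto3.client("kinesisanalytics", region_name="us-east-1")

detail = analytics.describe_application(ApplicationName="stream-aggregator")["ApplicationDetail"]
if detail["ApplicationStatus"] == "RUNNING":
    analytics.stop_application(ApplicationName="stream-aggregator")
```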

Data Storage and Processing:

      1. Lambda, another managed service from AWS, processes records from the triggering data stream and writes them to DynamoDB.
      2. The Lambda function works on a trigger basis, and the cost model is strictly driven by consumption: no cost is incurred when the function is not running. The data is stored in DynamoDB and can be accessed in the standard fashion (a handler sketch follows).
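
A minimal sketch of such a handler is shown below, assuming an illustrative table name and record layout; Kinesis delivers each record payload base64-encoded, so it is decoded before being written to DynamoDB.

```python
# Minimal Lambda handler sketch (table and field names are illustrative):
# decode the Kinesis trigger records and persist each one to DynamoDB.
import base64
import json

import boto3

table = boto3.resource("dynamodb").Table("ticker-events")

def lambda_handler(event, context):
    for record in event["Records"]:
        # The Kinesis payload arrives base64-encoded.
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        table.put_item(Item={
            "ticker": payload["ticker"],                      # partition key (illustrative)
            "sequence": record["kinesis"]["sequenceNumber"],  # sort key (illustrative)
            "payload": json.dumps(payload),
        })
    return {"processed": len(event["Records"])}
```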

Kinesis Firehose, S3 and Athena:

    1. Kinesis Firehose acts as the mediator between the Kinesis data stream and S3: data received from the Kinesis data stream is delivered to a predefined S3 bucket in a specified format.
    2. Amazon Athena is a server-less interactive query service which enables the user to query the data stored in the S3 bucket for analysis (a query sketch follows).
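
Once Firehose has delivered the records to S3, they can be queried with Athena. The sketch below assumes an illustrative database, table, and results bucket, not the exact schema used in the project.

```python
# Minimal sketch (database, table, and bucket names are illustrative): run an Athena
# query over the data Firehose delivered to S3 and print the result rows.
import time

import boto3

athena = boto3.client("athena", region_name="us-east-1")

execution = athena.start_query_execution(
    QueryString="SELECT ticker, avg(price) AS avg_price FROM ticker_events GROUP BY ticker",
    QueryExecutionContext={"Database": "streaming_analytics"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = execution["QueryExecutionId"]

# Poll until the query finishes.
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    print(rows)
```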

The AWS CLI, AWS CloudFormation and AWS IAM also play a very important role in building cloud-based infrastructure and ensuring secure connectivity within and outside the AWS cloud.

Conclusion:

Using AWS services, we were able to create a real-time data processing application based on a serverless architecture, capable of accepting data through Kinesis data streams, processing it through Kinesis Data Analytics, triggering a Lambda function and storing the results in DynamoDB. The architecture can be reused for multiple data types from various data sources and formats with minor modifications. We used managed services provided by AWS throughout, which led to zero infrastructure management effort.

The capstone project has helped us build practical expertise in AWS services like Kinesis, Lambda, DynamoDB, Athena, S3 and Identity and Access Management, as well as in serverless architecture and managed services. We also learnt the Go programming language to build pseudo data producer programs. The AWS CLI helped us connect on-premises infrastructure with cloud services.

This project is a part of Great Learning’s post graduate program in Cloud Computing. 

Authors
Amit Bajaj – Project Manager at Cognizant
Sathya Guruprasad – Infrastructure Specialist at IBM Pvt Ltd

Setting up a hospitality business model on AWS

Reading Time: 5 minutes

A capstone project by Sajal Biswas and Shreya Sharma

Use Case: Accommodation options in the travel industry are not limited to hotels and resorts. People often look for homestay options as this model benefits both the parties. Tourists can enjoy home-like comfort while owners can earn reasonable revenues on the rent.

Introduction:

We have taken the Airbnb business model as a reference, and we have analyzed how to utilize AWS cloud services so that the business only needs to focus on its model.

We are following ‘server-less architecture’ for our proposed solution. Serverless architectures help in significantly reducing operational cost, complexity, and engineering lead time, at the price of increased reliance on the vendor. 

Architecture:

CICD Architecture:


Tech stack used:

– ReactJs for creating the web application using AWS AMPLIFY

– Profile Management using AWS COGNITO

– ChatBot using AWS LEX and AWS AMPLIFY

– Static website hosting on S3 bucket

– CLOUDFRONT for CDN

– Code repository in CODECOMMIT

– Backend APIs using Lambda functions (in Python), triggered via API Gateway

– AWS Elasticsearch for efficient search functionality

– DynamoDB database for storing data in key-value pairs

– Static files like images are kept in an S3 bucket

– CloudWatch Alarms are being used for monitoring purpose

– AWS SES service to send emails to customers

– AWS Pinpoint and Athena for analytics purpose

Case Studies:

  1. Without provisioning infrastructure or load balancers, and at low cost, how can we develop APIs as fast as the business needs to launch in the market?

For this requirement, Serverless architecture is the best choice. So, we have implemented the same so that business need not worry about Infrastructure changes and management.

  2. What if we want to track email communication with users and process the data based on their replies?

Enterprise solutions not only need to send promotional and service emails, but are also interested in user replies and in tracking user communication. AWS SES is implemented for this feature; we have integrated only sending email using a Lambda function, but other features can also be explored.

  3. The design approach for searching and listing properties on the website

We have considered that a large amount of data will be generated and the transaction volume will be huge as well, so we have chosen DynamoDB. We maintain the property list with a partition key of the form <propertyCode>_<stateCode>_<pinCode> so that we can search easily, and so that a large volume of requests is spread across partitions and the hot partition key issue does not arise.
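
A minimal boto3 sketch of this keying scheme is shown below; the table name, attributes, and key values are illustrative assumptions.

```python
# Minimal sketch (table, attribute, and key values are illustrative): store and fetch
# a property using the composite partition key propertyCode_stateCode_pinCode.
import boto3
from boto3.dynamodb.conditions import Key

properties = boto3.resource("dynamodb").Table("properties")

partition_key = "P1001_MH_400001"   # <propertyCode>_<stateCode>_<pinCode>

properties.put_item(Item={
    "property_id": partition_key,
    "title": "2BHK homestay near the beach",
    "price_per_night": 2500,
})

result = properties.query(KeyConditionExpression=Key("property_id").eq(partition_key))
print(result["Items"])
```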

  4. Efficient search functionality using AWS Elasticsearch.

We are using AWS Elasticsearch to save each record alongside DynamoDB. We have also created a Lambda function that collects transaction data from DynamoDB and creates a CSV file in an S3 bucket, which is then used by Athena for analytics (a sketch follows).
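
A rough sketch of such an export function follows; the table name, bucket, and CSV columns are illustrative assumptions, and a production version would paginate the scan.

```python
# Minimal Lambda sketch (table, bucket, and column names are illustrative): scan the
# transactions table, write the rows as CSV, and upload the file to S3 for Athena.
import csv
import io

import boto3

dynamodb = boto3.resource("dynamodb")
s3 = boto3.client("s3")

def lambda_handler(event, context):
    items = dynamodb.Table("transactions").scan()["Items"]   # paginate for large tables

    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=["transaction_id", "user_id", "amount", "date"])
    writer.writeheader()
    for item in items:
        writer.writerow({field: str(item.get(field, "")) for field in writer.fieldnames})

    s3.put_object(
        Bucket="transaction-analytics-exports",
        Key="exports/transactions.csv",
        Body=buffer.getvalue().encode("utf-8"),
    )
    return {"exported": len(items)}
```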

  5. Is it possible to increase customer interaction instantly?

We have integrated LEX ChatBot with basic functionalities.

  6. What would be a good approach for user profile management?

The initial thought was to use the AWS RDS service for this, but we later used a managed service instead: AWS Cognito.

  7. Analytics from a business perspective.

Currently, we have used the below services for analytics purposes:

– AWS Pinpoint

– Athena queries

Technical Details:

Website hosting with API integration:

We have developed a static website using ReactJS and AWS Amplify. The website is hosted on an S3 bucket, and CloudFront is integrated for caching and CDN.

– User Registration, Login, Password Management, Logout and Session management using AWS Cognito.

– LEX Chatbot for basic functionalities

– Integration with backend APIs deployed on API Gateway. We use a consistent response JSON format, i.e. an array (list) of objects

– AWS Pinpoint for tracking user activity on the website

Deployment:

Repository Management: The website repository is maintained using AWS CodeCommit.

CI/CD: We have used AWS CodePipeline for website deployment.

API deployment: All backend APIs are deployed on API Gateway, integrated with AWS Lambda, and we have created a dev stage environment for them.

Monitoring and Metrics:

We have used CloudWatch Logs and Metrics for debugging and monitoring purposes, using various tags.

APIs and Database:

We have created the APIs using AWS Lambda as the backend. All functions are written in Python.

Although neither of us has expertise in Python, we learnt about it in the PGPCC course. 

Library:

We have used the pip package manager to install boto3 for Python.

API Endpoints:

All Lambda functions are exposed through API Gateway as POST requests, with an "action" field in the request body; based on this field, the API responds accordingly (a routing sketch follows).
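
A minimal sketch of this routing pattern is shown below; the action names and stubbed responses are illustrative, not the project's actual handlers.

```python
# Minimal sketch (action names are illustrative): a single Lambda behind an API Gateway
# POST endpoint that routes on the "action" field in the request body.
import json

def lambda_handler(event, context):
    body = json.loads(event.get("body") or "{}")
    action = body.get("action")

    if action == "create_product":
        result = [{"message": "product created"}]   # call the real create logic here
    elif action == "get_all_products":
        result = []                                 # call the real query logic here
    else:
        return {"statusCode": 400,
                "body": json.dumps({"error": "unknown action: %s" % action})}

    # Consistent response format: a JSON array (list) of objects.
    return {"statusCode": 200, "body": json.dumps(result)}
```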

Services details: 

We have created the following services:

Product Management Service:

We have created 3 functionalities by querying the DynamoDB database / Elasticsearch:

– Create product

– Get all products

– Get all products by state

For the same functionality, based on the "es_service" flag in the request body, we decide whether to query DynamoDB or Elasticsearch.

Transaction Management Service:

We have created 3 functionalities by querying DynamoDB:

– Create transaction

– Get all transactions, and transactions by UserID

– Transactions by date for a particular UserID

The Transaction Analytics service gathers all transaction data and dumps it into S3 as a CSV file, where we can query the data using Athena.

Conclusion:

Serverless computing offers several advantages over traditional cloud-based or server-centric infrastructure. For many developers, serverless architectures offer greater scalability, more flexibility, and quicker time to release, all at a reduced cost. With serverless architectures, developers do not need to worry about purchasing, provisioning, and managing backend servers.

We have observed the following advantages while working on this capstone project:

– No server management is necessary

– Developers are only charged for the server space they use, reducing cost

– Serverless architectures are inherently scalable

– Quick deployments and updates are possible

– Code can run closer to the end-user, decreasing latency

Authors’ Bio:

Shreya Sharma – Shreya is an AWS Certified Solutions Architect and is currently working as a Senior Software Developer with Hexaware Technologies Pvt Ltd. in Mumbai. She has a particular interest in all things related to AWS Cloud, migration from on-premises to cloud, and backend APIs. She has 8 years of extensive work experience in designing and developing full-stack applications, both on cloud and on-premises.

Sajal Biswas – Sajal is passionate about cloud computing development and architecting cloud migration projects with backend API development. He is an OCA 7 (Java), CSM, and Mule ESB certified professional and is currently working with Capgemini as a software consultant in Mule ESB technology. He has a total experience of 6.7 years, including extensive experience in API integration.

 

Experts Talk Series: Cloud Security Demystified

Reading Time: 5 minutes

Episode 2 – Cloud security overview

Cloud computing is a dynamic platform with continuous provisioning and de-provisioning of on-demand resources based on utility and consumption. It has caused considerable confusion on how this is both different and similar to conventional architectures and how this impacts information and application security from a technical standpoint.

Cloud security should be thought of not only from the perspective of the “physical” location of the resources but also the ones managing them and consuming them. It follows what is known as the shared responsibilities model, where the responsibility is shared between the customer and the cloud provider. 

But how do you know which responsibilities belong to you and which belong to the cloud provider? A good rule of thumb would be to break down the security aspect into the various dimensions first.

  1. Applications: This includes access control, application firewalls, and transactional security
  2. Information: This consists of the aspects of database encryption and monitoring
  3. Management: This includes patch management, configuration management, and monitoring
  4. Network: This holds firewalls and Anti-DDOS measures
  5. Compute and Storage: The focus here is on Host-based firewalls, Integrity, and file-log management
  6. Physical: This includes physical data centre security

After you have identified these aspects in your application, they can be mapped to your cloud provider to check which controls exist and which do not. The division of the responsibilities of these dimensions will depend on the classification of your cloud implementation based on the “SPI model”, i.e. SaaS, PaaS or IaaS.

Read Episode 1: Migrating to the cloud

SPI Model

Software as a Service (SaaS) – In this implementation, the customer is given the use of software or an application deployed and managed by the provider on a cloud infrastructure, which cannot be controlled or managed by the customer apart from limited customization and configuration options based on special requirements.

The user has the responsibility of managing access to applications and dictates policies on who has access to which resources. For example, an employee from the sales team may have access to only data from the CRM application, someone from the academic team may only have access to the LMS, etc. The rest of the cloud stack is the responsibility of the cloud provider including infrastructure and the platform.

Platform as a Service (PaaS) – This enables the customer to build, deploy and manage applications on the cloud using programming languages and tools supplied by the cloud provider. The organization can deploy applications without having to manage the underlying hardware and hosting capabilities.

The cloud provider takes the responsibility of securing the platform provided and all stacks below it. The customer has the responsibility of securing the developed application and all access to these applications. It is also recommended that customers encrypt all application data before storing it on the cloud provider's platform and plan for load balancing across different providers or across geographical regions in case of an outage.

Infrastructure as a Service (IaaS) – The cloud provider delivers computing infrastructure along with storage and networking needs via a platform virtualization service. The customer can then run and deploy applications and software on the infrastructure as per their need.

The responsibility of the underlying hardware along with all used storage and networking resources falls with the cloud provider. The customer is responsible for putting controls in place regarding how virtual machines are created and who has access to the machines to keep costs in control and reduce wastage of resources.

 

Recommended practices

– Encrypt data before migrating: Your cloud provider will do everything it can to make sure the data you have uploaded is secure; however, the application as such may not be infallible. If the data contains private information which should not be found by a third party, it needs to be encrypted before storing and/or uploading (a minimal sketch follows these practices).

– Take care of data security (at rest): This can primarily fall under the following categories

– Encrypt your data: All cloud providers will have some encryption systems in place to protect your data from rogue usage. Make sure these systems are in accordance with your organization’s policies. For security reasons, you may also want to manage the encryption keys yourself rather than let your provider do it; check whether this service is available. 

– Protect your keys: Some providers will allow you to manually handle encryption keys in the form of hardware security modules (HSM). This will place the responsibility of managing the keys on the customer but allows for better control. Also, you will certainly be issued SSH and API keys for access to various cloud services. These should be stored securely and protected against unauthorized access. Remember, if the keys are compromised, there is likely nothing your provider can do to help you!

– Data that is deleted stays deleted: Redundancy systems used by cloud providers often replicate data to maintain persistence. As such, sensitive data can often find its way into logging systems, backups, and management tools. It's highly recommended to be familiar with the cloud deployment system to keep track of where your data may have ended up.

– Secure your data in transit: Firewalls, network access control solutions, and organizational policies should be in place to make sure that your data is safe against malware attacks or intrusions. For example, policies should also be set up to automatically encrypt or block sensitive data when it is attached to an email or moved to another cloud storage or external drive. This can be made easier by categorizing and classifying all company data, no matter where it resides, to maintain easier access control.

– Unauthorized cloud usage: Strict policies will need to be set up to ensure that employees can only access the resources that they should. Similar measures will need to be put in place to regulate the number of virtual machines being run and make sure those machines are spun down when not in use.

Every cloud provider will have its own governance services to manage resource usage. It is highly recommended that an in-house cloud governance framework is put in place.

– Keep an audit trail: Cloud ecosystems run on a pay-as-you-go basis and can rack up huge bills and lead to considerable wastage when not used properly. Therefore, tracking the use of cloud resources is very important. Your cloud provider will likely have a system in place to generate audit trails, but if your cloud implementation is spread across multiple providers, creating an independent in-house audit trail becomes important. A Cloud Service Broker solution will be able to assist you in this by monitoring resource usage and identifying vulnerabilities and rogue users, which brings us to the next point.

– Ask your provider: Your cloud provider will have numerous manuals and whitepapers describing best practices to follow for various implementations. Make sure to take advantage of them!
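
As an illustration of the client-side encryption practice above, here is a minimal Python sketch using boto3 and the third-party cryptography package; the bucket, object key, and file names are assumptions, and key storage is only hinted at.

```python
# Minimal sketch (bucket, key, and file names are illustrative): encrypt a file locally
# before uploading it to S3, keeping the encryption key under your own control.
import boto3
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # store this securely, e.g. in an HSM or a key vault
cipher = Fernet(key)

with open("customer-data.csv", "rb") as source:
    ciphertext = cipher.encrypt(source.read())

boto3.client("s3").put_object(
    Bucket="example-migration-bucket",
    Key="encrypted/customer-data.csv.enc",
    Body=ciphertext,
)
```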

 

Cloud Security is a tricky jungle to navigate, but by following some simple guidelines and best practices, you can ensure that your organization's data and applications are safe, and rest easy.

Experts Talk Series is a repository of articles written and published by cloud experts. Here we talk in-depth about cloud concepts, applications, and implementation practices.  

Experts Talk Series: Migrating to the cloud

Reading Time: 5 minutes

Episode 1 – Cloud migration

Migrating to the cloud is a buzzword these days. Every enterprise wants to say that they are “100% cloud-enabled”. If you are an enterprise looking to move over to the cloud, how should you go about it?

First off, let’s just clarify that “100% cloud-enabled” is a myth. Most enterprises will have a portion of their business running in their own datacenter, also known as on-premise. Therefore, a better way to quantify cloud enablement would be “100% of all applications that have been found fit for the cloud have been migrated”.

How to decide if you really need to migrate?

To get the process off the ground, the first thing you have to decide is whether the cloud is the right fit for your use-case. If your application landscape consists of legacy code or is highly optimized for the hardware it is being run on, it is safe to say the cloud will do more harm than good. But if your application comprises a set of loosely coupled components, each being a small, highly specialized, hardware-independent function, these seem like ripe candidates for a cloud-based server-less implementation.

There should also be a good reason for this endeavour. Change for change's sake does not always equal progress. The pros and cons of a cloud-based infrastructure must be taken into account, along with factors like cost and manpower requirements and whether they can be met.

So you want to migrate. What’s next?

Have you decided that you want to jump into the cloud? If so, let’s venture together into the labyrinth of choices you will have to make during this journey.

First, you will have to look at various business dimensions while contemplating your cloud implementation. For example, immediate cost benefits will be highest on IaaS implementations, after a lift and shift of on-premises applications to the cloud. Likewise, other dimensions like time to market, functional responsiveness, and scaling have to be taken into consideration and a balance has to be found. This will help you to decide if your implementation will be IaaS, PaaS or SaaS-based. Perhaps a combination may yield the best results.

The next step is app evaluation. As mentioned earlier, it is necessary to check which applications are fit for the cloud. Low-risk applications from a business perspective can be safely migrated. However, an enterprise may feel more secure storing trade secrets, proprietary functionality, and security services on local servers. Let it be noted, though, that on-premises servers do not guarantee 100% security any more than cloud providers do. As a matter of fact, cloud providers take security very seriously and take strong measures to make sure that you know exactly where, and by whom, your data is being accessed, and that only authorized users can access it.

You may be on the fence about migrating certain services, like client-server applications and supporting functions. For such cases, an ROI analysis will help you decide. Please note that on-premises implementation allows the enterprise to take advantage of financial levers like depreciation. In the end, let me emphasize that these decisions are highly case-specific and are not cast in stone. 

An application in an enterprise is hardly ever standalone. Hence, you will have to go through various levels of integration. The usual options are synchronous and asynchronous integration. The on-premises data centre can be integrated with the cloud to create a hybrid cloud deployment topology. This means the cloud applications can access the on-premises applications directly, though a bit of latency will be at play. Maybe asynchronous or batch-based integration will help hide the latency.

The migration process 

It is a myth that cloud migration is a single-step process. As mentioned earlier, the first step is usually a lift-and-shift approach, where the existing on-premises architecture is cloned onto the cloud. This relieves the enterprise of the burden of maintaining a data centre, but that is about the only benefit this approach offers. After that, some of the functionality can gradually be re-engineered to take advantage of managed cloud services; for example, a database can be moved over to a cloud-provided database service. Then there is the concept of cloud-native applications, where new components or functionality are designed from the get-go to take advantage of platform-specific services built for media, analytics, or content distribution. This way the workload on the enterprise is reduced until you are responsible only for the business processes while letting the cloud handle the heavy lifting.

The next step is to choose a cloud provider. Your hired or in-house cloud expert can help you make an informed decision from the myriad choices available to you. Which of these is suitable for you is highly situational and requires you to take several factors into consideration, like cost, software or platform requirements, compliance requirements, and geographical zone availability. You may also want to take advantage of a specific API or managed service offered by a service provider. It should be noted that most of the top cloud providers have a nearly similar set of services, so if you don't have any highly specialised requirements, you cannot go wrong with any of them.

The on-premises setup then has to be restructured to fit the cloud architecture. Your cloud provider will definitely have a list of reference architectures available based on real-life use-cases and a list of best practices to follow, including but not limited to data and application migration tools. They also have an extensive collection of white papers to aid you in this task.

Implementing the migration plan

The above discussion concludes the planning and selection stage of cloud migration. All that is left now is to implement the plan. This should begin with drawing up and implementing a proof of concept. Not only will this allow you to run performance comparisons with your existing application, but it will also highlight unforeseen challenges and complexity levels which may show up during the actual migration process, allowing you to be prepared for the same. This will also give you a good idea of the reliability of the chosen cloud provider and will allow you to evaluate its support system.

While performing the actual migration, you should be careful to minimize the resulting disruption time and service outages. Dry runs should be conducted to identify potential failure points and minimize errors during the process.  Every use case will have its own set of steps to follow during the migration, but it generally starts by taking a backup of the databases, followed by the deployment of applications, and migrating the database. Also, there will be quite a few application components to manage and set up, like middleware, caching, warehousing, and file systems. All these components must be planned and mapped to the relevant cloud service. Don’t forget to set access roles and policies! Make sure you have a clear idea of who should be able to access your applications and which components they can access, then assign appropriate roles for them. Parallel deployments of the application in the cloud and on-premises must be performed to check performance and detect failures.

Benchmarking tests are a must. This will let you know how your cloud application runs in comparison to your on-premises setup and will allow you to fine-tune your setup and be sure if it is ready for deployment.

Congratulations! You have successfully migrated to the cloud. As mentioned before, cloud migration is not a goal but a journey. Every new application will have to be evaluated whether it is a better fit for cloud or on-premises implementation. If it is destined for the cloud, integration with other applications that may still be on-premises will have to be taken into account. As new services are released by the provider, existing on-premises applications will have to be re-evaluated to see if they can take advantage of those new services. 

As you can see, this journey is not easy, but once it has been completed, just sit back and watch the clouds do their magic! But with regular management and prompting from you of course!

Experts Talk Series is a repository of articles written and published by cloud experts. Here we talk in-depth about cloud concepts, applications, and implementation practices.  

Top Interview Questions For Cloud Computing You Should Know

Reading Time: 9 minutes

With 3.6 billion people actively using cloud services in 2018, Cloud Computing has become more popular than ever before. In this article, we will discuss the top Cloud Computing interview questions.

With an unfathomable volume of data, it becomes cumbersome for industries to manage it all. Cloud Computing is thus a lifeline for industries drowning in an ocean of data. Amazon, Microsoft, Deloitte, and Lockheed Martin are among the top recruiters for cloud computing professionals.


(Source: Forbes)

According to a survey, the average salary of an entry-level cloud professional is around 8 lacs per annum, 12-15 lacs for professionals with under 3 years of experience, and for individuals with 10+ years of experience, the salary is a whopping 30 lacs or more.

Check out our PGP program in Cloud Computing: get trained by industry professionals, solve 15+ use cases, and work on many more challenging projects. Enroll now!

For all aspiring cloud computing architects, here is a curated list of cloud computing interview questions.

So let’s begin:

1. How will you describe Cloud Computing as concisely and simply to a Layman?

Even though this might sound like a fundamental question, it was asked in one of the interviews (source: Quora).

Now, you must use simple words while answering this question. Use of technical terms is not advised.

In cloud computing, 'cloud' metaphorically refers to the internet. So cloud computing is a method in which the internet acts as the fuel for computing services. You can simply use the term internet-based computing.

2. Give the best example of open source Cloud Computing.

An open-source cloud is a cloud service or solution built using open-source software and technologies. This includes any public, private or hybrid cloud model providing SaaS, IaaS, PaaS, or XaaS built and operated entirely on open-source technologies.

The best example of open source Cloud Computing is OpenStack.

3. What are system integrators in cloud computing?

System Integrators emerged into the scene in 2006. System integration is the practice of bringing together components of a system into a whole and making sure that the system performs smoothly.

A person or a company which specializes in system integration is called a system integrator.

4. List the platforms which are used for large-scale cloud computing.

The timely processing of massive digital collection demands the use of large-scale distributed computing resources and the flexibility to customize the processing performed on the collections.

The platforms that are used for large-scale cloud computing are:

– Apache Hadoop

– MapReduce

5. Mention the different types of models used for deployment in cloud computing.

You need the perfect cloud deployment model to help you gain a competitive edge in the market. Through this, you will have access to IT resources and services that can make your business flexible and agile, both concerning volume and scale.

The different deployment models in cloud computing are:

– Private Cloud

– Public Cloud

– Community Cloud

– Hybrid Cloud

6. What do you mean by software as a service?

Software as a service (SaaS) is a software distribution model in which a third-party provider hosts applications and makes them available to their customers over the Internet. SaaS is one of three main categories of cloud computing, alongside infrastructure as a service (IaaS) and platform as a service (PaaS).

7. What is the platform as a service?

Platform as a service (PaaS) is a cloud computing model wherein a third-party provider delivers hardware and software tools, usually those needed for application development. PaaS services are provided to users over the internet, and the provider hosts the hardware and software. As a result, PaaS gives users the flexibility to use the service without installing the hardware and software needed to run an application.

8. What is a private cloud?

A private cloud is one which delivers advantages similar to a public cloud, like scalability and self-service. In a private cloud, this is done using a proprietary architecture. Private clouds focus on the needs and demands of a single organization.

As a result, the private cloud is best for businesses with dynamic or unpredictable computing needs that require direct control over their environments. Security, governance, and regulation are best suited for private cloud services.

Private clouds are used to keep strategic operations and other sensitive workloads secure. A private cloud is a complete platform which is fully functional and can be owned, operated and restricted to a single organization or industry. Nowadays, most organizations have moved to private clouds due to security concerns, often using a virtual private cloud operated by a hosting company.

9. What is the public cloud?

Be it a public or private cloud, the primary objective is to deliver services using the internet. Unlike a private cloud, public cloud services are third-party services which can be used by anybody who wants to access them. The service may be free or sold on demand.

Public clouds are open to people for use and deployment; Google and Amazon, for example, operate public clouds. Public clouds focus on a few layers like cloud applications, infrastructure provisioning, and platform markets.

10. What are Hybrid Clouds?

Hybrid cloud is a cloud computing environment where we can use the services available to us locally, use third-party private services, and public services as well to meet the demand. By allowing workloads to move between private and public clouds as computing needs and costs change, hybrid cloud gives businesses greater flexibility and more data deployment options.

Hybrid clouds are a combination of public clouds and private clouds. It is preferred over both the clouds because it applies the most robust approach to implement cloud architecture. It includes the functionalities and features of both worlds. It allows organizations to create their cloud and allow them to give control over to someone else as well.

11. What is the difference between cloud computing and mobile computing?

Cloud computing is when you store your files and folders in a "cloud" on the Internet; this gives you the flexibility to access all your files and folders wherever you are in the world, but you do need a physical device with Internet access to reach them.

Mobile computing means taking a physical device with you, such as a laptop, a mobile phone or some other device. Mobile computing and cloud computing are somewhat analogous, and mobile computing builds on the concept of cloud computing. Cloud computing provides users with the data they require, while in mobile computing, applications run on a remote server and give the user access for storing and managing the data.

12. What is the difference between scalability and elasticity?

Scalability is a characteristic of cloud computing that is used to handle an increasing workload by increasing resource capacity in proportion. Through scalability, the architecture provides on-demand resources when traffic raises the requirement. Elasticity, on the other hand, is the characteristic of dynamically commissioning and decommissioning large amounts of resource capacity. It is measured by the speed at which resources are made available on demand and by how fully the resources are used.

13. What are the security benefits of cloud computing?

Complete protection against DDoS: Distributed Denial of Service attacks have become very common and target companies' cloud data. Cloud computing security ensures that such traffic to the server is restricted, so traffic that could be a threat to the company and its data is averted.

Security of data: As data grows, data breaches become a significant issue and the servers become soft targets. Cloud data security solutions help in protecting sensitive information and also help the data stay secure against third parties.

Flexibility feature: Cloud offers flexibility, and this makes it popular. The user can scale up to avoid server crashes in case of excess traffic, and scale back down once the high traffic is over to reduce cost.

Identity management: Cloud computing authorizes the application server, so it is used in identity management. It provides permissions to users so that they can control the access of other users entering the cloud environment.

14. What is the usage of utility computing?

Utility computing, or The Computer Utility, is a service provisioning model in which a service provider makes computing resources and infrastructure management available to the customer as needed and charges them for specific usage rather than a flat rate.

Utility computing is a plug-in managed by an organization which decides what type of services has to be deployed from the cloud. It facilitates users to pay only for what they use.

15. Explain Security management regarding Cloud Computing.

– Identity management access provides the authorization of application services

– Access control permission is given to the users to have complete controlling access of another user who is entering into the cloud environment

– Authentication and Authorization provide access to authorized and authenticated users only to access the data and applications

16. How would you secure data for transport in the cloud?

This is a frequently asked question. Don’t forget to dive in more in-depth on this topic.

When transporting data in a cloud computing environment, keep two things in mind: Make sure that no one can intercept your data as it moves from point A to point B in the cloud, and make sure that no data leaks (malicious or otherwise) from any storage in the cloud.

A virtual private network (VPN) is one way to secure data while it is being transported in a cloud. A VPN lets you use a public network as if it were a private network. A well-designed VPN will incorporate two things:

A firewall that will act as a barrier between the public and any private network.

Encryption protects your sensitive data from hackers; only the computer that you send it to should have the key to decode the data.

Also check that the encryption key used for the data you send is not leaked while the data moves from point A to point B in the cloud.

17. What are some large cloud providers and databases?

Following are the most used large cloud providers and databases:

– Google BigTable

– Amazon SimpleDB

– Cloud-based SQL

18. List the open-source cloud computing platform databases.

Following are the open-source cloud computing platform databases:

– MongoDB

– CouchDB

– LucidDB

19. Explain what is the full form and usage of “EUCALYPTUS” in cloud computing.

“EUCALYPTUS” stands for Elastic Utility Computing Architecture for Linking Your Programs to Useful Systems.

Eucalyptus is an open-source software infrastructure for cloud computing, which enables us to implement clusters on a cloud computing platform. The main application of Eucalyptus is to build public, hybrid, and private clouds. Using it, you can turn your own data center into a private cloud and offer its functionality to various other organizations to make the most out of it.

20. Explain public, static, and void class.

Public: This is an access modifier; it is used to specify who can access a particular method. When you say public, it means that the method is accessible to any given class.

Static: This keyword in Java tells us that the member is class-based, which means it can be accessed without creating an instance of any particular class.

Void: Void defines a method which does not return any value; it is the return type of such a method.

21. Explain the difference between cloud and traditional data centers.

In a traditional data center, the major drawback is the expenditure. A traditional data center is comparatively expensive due to heating, hardware, and software issues. So, not only is the initial cost higher, but the maintenance cost is also a problem.

The cloud, on the other hand, is scaled only when there is an increase in demand. In a traditional setup, most of the expenditure goes into maintaining the data centres; these issues are not faced in cloud computing.

22. List down the three necessary functioning clouds in cloud computing.
– Professional cloud

– Personal cloud

– Performance cloud

23. What are the building blocks in cloud architecture?

– Reference architecture

– Technical architecture

– Deployment operation architecture


24. What do you mean by CaaS?

CaaS stands for Communication as a Service, a term used in the telecom industry. CaaS offers enterprise users features such as desktop call control, unified messaging, and desktop faxing.

25. What are the advantages of cloud services?

Following are the main advantages of cloud services:

Cost-saving: It helps in better utilization of the investments made in the corporate sector, so it is cost-saving.

Scalable and Robust: It helps in developing scalable and robust applications. Previously, scaling took months; now it takes far less time.

Time-saving: It helps in saving time regarding deployment and maintenance.

26. How can a user gain from utility computing?

Utility computing allows the user to pay only for what they are using. It is a plug-in managed by an organization which decides what type of services has to be deployed from the cloud.

Most organizations prefer a hybrid strategy.

27. Before going for a cloud computing platform, what are the essential things a user should take into consideration?

– Compliance

– Loss of data

– Data storage

– Business continuity

– Uptime

– Data integrity in cloud computing.

28. Give a brief introduction of the Windows Azure operating system.

The Windows Azure operating system is used to run cloud services on the Windows Azure Platform. Azure is preferred as it includes the essential features for hosting all the services in the cloud. You also get a runtime environment which consists of a web server, primary storage, management services, and load balancers, among others. The Windows Azure system also provides the fabric for developing and testing services before they are deployed to Windows Azure in the cloud.

29. Mention some of the top cloud applications used nowadays.

Top cloud computing applications include Google Docs, which is very fast and secure. There is also a mobile version of Google Docs, so you can access your data from a smartphone. Pixlr, Phoenix, and JayCut are other applications used for cloud computing.

30. What are the different data types used in cloud computing?

There are different data types in cloud computing, such as emails, contacts, images, blogs, etc. As data is increasing day by day, new data types are needed to store new kinds of data. For example, if you want to store video, then you need a suitable data type for it.

Now if you want to know more, you can enroll in our cloud computing course– with training from industry professionals, use cases, and hands-on projects.

So we wrap up our questions here. These questions will help you in the interview. All the best!

Check out the PGP-Cloud Computing program by Great Learning. With 3 million+ hours of learning delivered, 5000+ alumni, 300+ industry experts, and 8 top-ranked programs, Great Learning is among the top-ranked institutions for analytics.

Get in touch with us for further details and don’t forget to mention your questions in the comments section, we will get back to you with the most industry-relevant answer.

 

This program has elevated my role – Rajesh Kumar, Engagement Lead at Cognizant, UK

Reading Time: 1 minute

Cloud Computing is swiftly becoming one of the top skills tech professionals consider when switching careers. Read what Rajesh Kumar has to say about Great Learning's PG program in cloud computing and how it helped him work towards his AWS certifications.

I have recently completed the Post Graduate Program in Cloud Computing with Great Learning and would like to share my gratitude & experience.

First Things First, Kudos to

– Great Lakes Content Team for preparing & delivering the curriculum aligned for Managers to renew & scale on cutting edge technologies like Cloud, Containers, Microservices, Big Data, Business Transformations etc.

– Experienced mentors & Enter-trainers like Nirmallaya & Shiva, who delivered the content & their experience to students in an exceptional way rather than being monotonous & mediocre

– Emphatic Program Manager – Ekta Singh, Her timely support has kept me on the progress track for successful completion. She played a vital role in pushing to complete labs and projects that instilled confidence to pursue the technology ladder again (though I was core L4 techie a few years back). I truly appreciate her commitment to the overall success of the program.

This program has elevated my role to learn, practice & apply knowledge on all technology & business transformation in the digital world. My role is elevated from the Delivery Manager to Engagement lead focused on Business & Technology.  

I would definitely give credit to PGP-CC for the Training Program, Mentoring Sessions, Sharing abreast of all technology advancements under “Industry Focus” sections & Challenging with Labs, Projects & Capstone projects, that made me achieve AWS cloud practitioner certification and preparation for AWS Architect certification.

Upskill with Great Learning’s PG program in Cloud Computing and unlock your dream career.

5 Reasons Why Cloud Computing Will Grow in the Next 10 Years

Reading Time: 5 minutes

Sky is the limit for Cloud Computing. We are witnessing a massive growth in the adoption and evolution of cloud computing. In some sense, cloud providers like Microsoft, AWS, Google etc. have become to computing and IT, what Walmart, IKEA, and Tesco were for retail. They have huge global infrastructures (read data centers and networks) to offer seamless, on demand, real-time IT services for individuals as well as enterprises of all sizes.

Can you believe AWS even offers "human work as a service"? Yes, you heard it right. So if you have a job that needs a human workforce, AWS can offer you that as well!

It may look like the cloud ecosystem has stabilized, with a few major players from the west and the typical consumers being start-ups, large enterprises, and niche use cases. But you are in for a surprise here! Within the next couple of years, the global market for cloud is projected to exceed $200 billion. And this is only going to get bigger and better in the next decade!

Imagine a world, where from the moment you wake up until the time you sip your morning cup of tea or coffee, you’ve already created megabytes of data on your quality of sleep, heart rate, calories burnt etc. Isn’t this happening already? Well, yes, but today, your phone doesn’t recommend you a personalized breakfast option based on your last night’s sleep. Neither do you get a fully personalized holiday package deal with your best friends or family when you absolutely need it! We’re still far from this. At least at a mass scale.

These technologies will become commonplace when we accept them ubiquitously – at work, at home, during travel, and in public services. It is no surprise to see working professionals feeling an urgency to re-skill and upskill themselves in hot tech areas like cloud computing, big data engineering, cloud-based analytics and IoT. However, pursuing isolated cloud computing courses or analytics courses is not very helpful. What you need is a truly experiential learning program or pathway.

“A world where every important physical entity (including us) is attached to various types of sensors that generate data 24×7 and automatically push it to a cloud platform, only to be consumed by a machine learning model hosted and managed by the cloud platform, to generate useful artificial intelligence “AI” based results.”

But how do we even get there? Where does all this data go? How is it analyzed? Where are all these millions of intelligent computers? How will I (an individual) ever be able to afford these computers? So, what will propel cloud in the next 10 years? Multiple factors like 5G networks, Big Data, Artificial Intelligence, Internet of Things and Smart Cities.

Super High-Speed Networks

Until last year, Indian mobile consumers kept close track of their Internet usage and its associated costs. However, that habit is history now. Indians together consume more data than the US and China combined. Astonishingly, this has happened in a short span of 12 months. "People have learned the habit of consuming gigabytes of data every day."

In the context of India and most of the world, we have 4G telecom networks today, with data speeds far better than what 3G offered. Very soon, the world will witness 10 to 100 times faster networks with 5G. For all the right reasons, this might happen in India sooner than in any other country. This means all of us will create and consume terabytes of data on a daily basis. And it is not just us, but everything around us: all our connected devices, home appliances, vehicles, buildings etc. will continuously generate 10 to 100 times the data being generated today – simply because, with 5G, we will be able to do so at the tap of our fingers!

Such high-speed networks will only enable cloud providers to deliver services at an almost real-time basis without the slight “yet acceptable” latency of the networks available today. This is a positive development for the growth of cloud.

Big Data

“A faster Internet creates highly engaged customers and happier companies (pun intended).” Simply because you create more data.

When people are given a convenient option of downloading an HD video over other inferior resolutions, they will almost always prefer to consume HD. And arguably, they'll do it many times during a day. More and more people will move closer to such media streaming services, which will explode the subscriber base for companies like Netflix, Amazon Prime, Hotstar etc. Interestingly, the more you consume from such platforms, the better for them – your engagement generates continuous behavioral data points, which allow the likes of Netflix to create better and more relevant content for various user groups. If you don't already know, Netflix is now transforming itself from a content streaming service into a content creation platform as well.

It is amazing to note that “Televisions never had this big data advantage”.

However again, to create truly personalized content engines and recommendations, we would need highly skilled big data and cloud computing professionals. Remember, big data only became possible due to mobile devices, cheap storage, and cloud computing platforms.

Another exciting area of development is the gaming industry! It's already huge. But with companies and individuals attempting to create real-time massive online games, enhanced with AR/VR and blessed with upcoming 5G speeds, it is only going to get bigger. With such an immersive experience, there is no doubt that people will spend even more time playing these online games across the world. And in this entertaining process, generate BIG DATA!

Artificial Intelligence

What do you do with all the data that we've talked about? Well, it's used to understand YOU!

If businesses understand you really well (excluding PII – Personally Identifiable Information), they’re better enabled to serve you with the most personalized product or service. But how do they achieve this business goal?

All the big data created by users is ultimately consumed directly by an analytical tool like Tableau or is used to train machine learning models. Like humans, machines learn from observations. The better the data in terms of quality and quantity, the better the learning outcome of these machines (that is, software models created using code).

Higher network speeds, bigger data pools, faster computing capabilities and most importantly ubiquitous real-time connectivity to the Internet, are all positives for AI’s continuing evolution. A recent MIT Technology Review article already believes that AI will double the cloud market to over $260 billion over the coming years. And this is just the start.

IoT & Smart Cities

With all the connected devices, data, cloud and artificial intelligence, it is only natural and imperative that we create communities truly based on these technologies. These communities (read smart cities) will require real-time insights and intelligent “thinking applications” which are built using Big Data, Machine Learning, and AI. And cloud helps us do exactly this from anywhere, anytime and for whatever you need to build.

To cash in on this trend and technological evolution, tech professionals are trying to jump on the bandwagon of learning cloud computing, AI and IoT through online courses on cloud computing and allied areas. Again, to reiterate, isolated cloud certifications and courses are only as helpful as getting your foot in the door. In order to become a true expert, you will need more than an online certification – a truly experiential cloud computing learning program. One such program is offered by Great Learning.

So in order to create tomorrow’s IT solutions based on billions of data generating devices, applications, algorithms, and people, you will have to change your approach in how you design and consume IT infrastructure and Applications. But more than that, Cloud fundamentally changes what we can possibly build tomorrow. This is the reason why we think, Cloud is a paradigm shift in computing.

Are you ready for the next wave of Cloud revolution?