10 Most Common Business Analyst Interview Questions

Preparing for a Business Analyst Job Interview? Here are a few tips and the most useful and common business analyst interview questions that you might face. 

Before attending an interview for a business analyst position, one should be thorough about one's previous experience: the projects handled and the results achieved. The questions asked generally revolve around situational and behavioural scenarios, and the interviewer will judge both knowledge and listening skills from the answers one presents.

The most common business analyst interview questions are:

 

1. How do you categorize a requirement to be a good requirement?

A good requirement is one that meets the SMART criteria, i.e.,

Specific – A perfect description of the requirement, specific enough to be easily understandable

Measurable – The requirement’s success is measurable using a set of parameters

Attainable – Resources are present to achieve requirement success

Relevant – Aligned with the business goals; the stated results are realistic and achievable

Timely – The requirement is raised, and can be delivered, within the required timeframe


 

2. List the documents used by a Business Analyst in a project.

The various documents used by a Business Analyst are:

a. FSD – Functional Specification Document

b. Technical Specification Document

c. Business Requirement Document 

d. Use Case Diagram

e. Requirement Traceability Matrix, etc.

 

3. What is the difference between BRD and SRS?

An SRS (Software Requirements Specification) is an exhaustive description of the system to be developed, covering the software–user interactions, while a BRD (Business Requirements Document) is a formal agreement for a product between the organization and the client.

The key differences between the two are:

a. The BRD is created early in the project lifecycle from client interactions, while the SRS is derived from the BRD

b. The BRD describes high-level business needs, while the SRS describes the functional and non-functional requirements of the software in detail

c. The BRD is created by the Business Analyst, while the SRS is typically prepared by the technical team or system architect

 

4. Name and briefly explain the various diagrams used by a Business Analyst.

Activity Diagram – It is a flow diagram representing the transition from one activity to another. Here, an activity refers to a specific operation of the system.

Data Flow Diagram – It is a graphical representation of the data flowing in and out of the system. The diagram depicts how data moves between processes, data stores, and external entities.

Use Case Diagram – Also known as a behavioural diagram, the use case diagram depicts the set of actions performed by the system together with one or more actors (users).

Class Diagram – This diagram depicts the structure of the system by highlighting classes, objects, methods, operations, attributes, etc. It is the building block for detailed modelling used for programming the software.

Entity Relationship Diagram – It is a data modelling technique and a graphical representation of the entities and their relationships. 

Sequence Diagram – It describes the interaction between the objects. 

Collaboration Diagram – It represents the communication flow between objects by displaying the message flow among them.

 

5. Name the different actors in a use case diagram.

Broadly, there are two types of actors in a use-case:

a. Primary Actors – Start the process

b. Secondary Actors – Assist the primary actor

They can further be categorized as:

i. Human

ii. System

iii. Hardware

iv. Timer

 

6. Describe ‘INVEST’.

INVEST stands for Independent, Negotiable, Valuable, Estimable, Sized appropriately, and Testable. Requirements or user stories that meet these criteria help technical teams and project managers deliver quality products or services.

 

7. What is Pareto Analysis?

Also known as the 80/20 rule, Pareto Analysis is an effective decision-making technique for quality control. As per this analysis, it is inferred that 80% effects in a system are a result of 20% causes, hence the name 80/20 rule.
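To make this concrete, a Business Analyst might run a quick Pareto pass over a defect log. The sketch below uses a hypothetical defect list and walks the causes in descending frequency until roughly 80% of defects are covered:

```python
from collections import Counter

# Hypothetical defect log: each entry names the module where a defect was found.
defects = (["login"] * 40 + ["checkout"] * 35 + ["search"] * 10 +
           ["profile"] * 8 + ["reports"] * 4 + ["billing"] * 3)

counts = Counter(defects).most_common()   # causes ranked by frequency
total = sum(n for _, n in counts)

# Accumulate the top causes until they explain at least 80% of all defects.
cumulative, vital_few = 0, []
for module, n in counts:
    cumulative += n
    vital_few.append(module)
    if cumulative / total >= 0.8:
        break

print(vital_few)  # the "vital few" causes behind most of the defects
```

Here three of six modules account for 85% of the defects; in practice the split is rarely exactly 80/20, but the ranked, cumulative view is what makes the technique useful.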

 

8. Describe the Gap Analysis.

It is used to analyze the gaps between the existing system and its functionalities and the targeted system. The gap refers to the number of changes and tasks that need to be carried out to attain the targeted system; the analysis compares current performance against the targeted functionalities.

 

9. Name the different types of gaps that could be encountered during Gap Analysis.

There are mainly four types of gaps:

a. Performance Gap – Gap between expected and actual performance

b. Product/ Market Gap – Gap between budgeted and actual sales numbers

c. Profit Gap – Variance between targeted and actual profit

d. Manpower Gap – Gap between required and actual strength and quality of the workforce in the organization

 

10. What are the various techniques used in requirement prioritization?

Requirement prioritization, as the name suggests, is the process of assigning priorities to requirements based on business urgency, considering schedules, phases, and cost, among other factors.

The techniques for requirement prioritization are:

a. Requirements Ranking Method

b. Kano Analysis

c. 100 Dollar Method

d. MoSCoW Technique

e. Five Whys
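As an illustration of one of these techniques, the 100 Dollar Method can be sketched in a few lines: each stakeholder distributes a notional $100 across the competing requirements, and the totals give the priority order. The stakeholder names and requirements below are hypothetical:

```python
# Hypothetical ballots: each stakeholder spreads exactly 100 "dollars"
# across the competing requirements.
ballots = {
    "product_owner": {"export": 50, "sso": 30, "dark_mode": 20},
    "support_lead":  {"export": 20, "sso": 60, "dark_mode": 20},
    "sales":         {"export": 70, "sso": 20, "dark_mode": 10},
}

totals = {}
for voter, votes in ballots.items():
    assert sum(votes.values()) == 100, f"{voter} did not spend exactly 100"
    for req, dollars in votes.items():
        totals[req] = totals.get(req, 0) + dollars

ranked = sorted(totals, key=totals.get, reverse=True)
print(ranked)  # requirements in descending priority
```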

 

Stay tuned to this page for more such information on interview questions and career assistance. If you are not confident enough yet and want to prepare more to grab your dream job as a Business Analyst, upskill with Great Learning’s PG program in Business Analytics and Business Intelligence, and learn all about Business Analytics along with great career support.

How will Cybernetics And Artificial Intelligence build our future?

We live in a world where what was considered science fiction mere decades ago has become a reality. Global, wireless internet coverage, 3D printed technologies, the Internet of Things powered by AI-based assistants, and, of course, cyborgs, are all part of the reality we live in.

Cyborgs? Yes, those are real. Look at Dr Kevin Warwick. The man can operate lights, switches, and even computers with the power of his mind thanks to a handy chip implant. Neil Harbisson has overcome achromatopsia thanks to an implant that allows the artist to process colours in real-time on a level unachievable by anyone else on the planet. 

If you were to do some research, you’d find out that these pioneers are merely the tip of a cybernetically enhanced iceberg bringing up the real question: if we’ve already come so far, what awaits us in the future?

Cybernetics

Some of the most prominent projects diving into the exploration of cybernetics feel like they were taken from a cyberpunk novel. And yet they are real. More on the matter, they mark what is potentially the future of humankind as a species. 

Full-spectrum vision. Typically, humans believe that the way we “see” the world is the only possible way. Cybernetics engineers would beg to disagree. A simple injection of nanoantennae has proven to give lab mice the superpower of night vision. The experiment, taking place at the University of Massachusetts, has only recently moved towards practical studies of the effects the antennae have on rodents, but it has already proven itself to be among the first stepping stones towards cybernetically enhanced eyesight. Additional breakthroughs in the field have shown promising results in turning eyes into video cameras, and even in the development of artificial retinas capable of returning sight to the blind.

Artificial brain cells. Modern advancements in the niche of cybernetics have already grown neurons – the basic components of a human brain – in laboratory conditions. These cells, artificially raised on an array of electrodes, are proving themselves a superior replacement for the hardware and software controllers we have today.

More on the matter, scientists are already using brain-computer interfaces in medicine. Most are designed for therapeutic purposes such as the Deep Brain Stimulation designed to aid patients with Parkinson’s disease.

We will be able to use said technology to create connections that operate via the remaining neural signals allowing amputee patients to feel and move their limbs with the power of their mind. In some cases, as it was with Nigel Ackland, some might even go as far as to use the word enhancement when talking about top tier prosthetics.

Enhanced mobility. Stronger, faster, more durable – those are the goals of military-grade exoskeletons for soldiers, which are already branching out into the medical niche to serve as prosthetics for amputees. The combination of hardware and AI-based software eliminates the boundaries of human capabilities while monitoring the vitals of the wearer in real-time.

Technopsychics. The University of Minnesota is working on a computer to brain interface capable of remotely piloting drones. The machines can detect the electrical signals transmitted by the brain to control functioning machines in real-time. If you can navigate a quadcopter through an obstacle course using only the power of your mind today, imagine what we’ll be piloting remotely tomorrow. 

Nanorobots. Self-repair, growth, and immunity to diseases may soon become reality thanks to a simple infusion of nanobots into your bloodstream. Modern research explores the idea of developing little helpers for your blood cells that can be controlled via the cloud from your smartphone!

Artificial Intelligence

As you may have deduced from the examples above, the advancements in the cybernetics niche are directly related to the progress we make with Artificial Intelligence or Machine Learning technologies. 

We need software capable of driving the hardware to its limits if we are to dive deeper into cyborg technology. Artificial Intelligence is supposed to become the bridge between man and machine, according to prominent researchers such as Shimon Whiteson and Yaky Matsuka. These scientists are exploring new ways AI can help amputee patients operate their robotic prosthetics.

Furthermore, AI is expected to take control of machines doing sensitive work in hazardous areas. According to the BBC, we already have smart bots capable of defusing bombs and mines, yet they still require a human controlling them. In the future, these drones (and many more, responsible for such challenging tasks as toxic waste disposal, deep-sea exploration, and volcanic activity studies) will be powered purely by algorithms.

Lastly, machines are expected to analyze and understand colossal volumes of data. According to Stuart Russell, the combination of AI-powered algorithms and free access to Big Data can identify new, unexpected patterns we’ll be able to use to mathematically predict future events or solve global challenges like climate change.

What a time to be alive! 

If you wish to learn more about Artificial Intelligence technologies and applications, and want to pursue a career in the same, upskill with Great Learning’s PG course in Artificial Intelligence and Machine Learning.

 

15 Most Common Data Science Interview Questions

Data Science is a comparatively new concept in the tech world, and it can be overwhelming for professionals to seek career and interview advice while applying for jobs in this domain. There is also a vast range of skills to acquire before setting out to prepare for a data science interview. Interviewers seek practical knowledge of data science basics and their industry applications, along with a good command of tools and processes. Here is a list of the 15 most common data science interview questions that might be asked during a job interview. Read along.

 

1. How is Data Science different from Big Data and Data Analytics?

Ans. Data Science utilizes algorithms and tools to draw meaningful and commercially useful insights from raw data. It involves tasks like data modelling, data cleansing, analysis, and pre-processing.

Big Data is the enormous set of structured, semi-structured, and unstructured data in its raw form generated through various channels.

And finally, Data Analytics provides operational insights into complex business scenarios. It also helps in predicting upcoming opportunities and threats for an organization to exploit.


2. What is the use of Statistics in Data Science?

Ans. Statistics provides tools and methods to identify patterns and structures in data, offering deeper insight into it. It serves a great role in data acquisition, exploration, analysis, and validation, making it a really powerful part of Data Science.

 

3. What is the importance of Data Cleansing?

Ans. As the name suggests, data cleansing is a process of removing or updating the information that is incorrect, incomplete, duplicated, irrelevant, or formatted improperly. It is very important to improve the quality of data and hence the accuracy and productivity of the processes and organization as a whole.
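A minimal cleansing pass over some hypothetical records might look like this, dropping incomplete rows, normalising formatting, and removing duplicates:

```python
# Hypothetical raw records with formatting noise, a missing value, and a duplicate.
raw = [
    {"name": " Alice ", "email": "ALICE@EXAMPLE.COM"},
    {"name": "Bob",     "email": None},                   # incomplete: dropped
    {"name": "alice",   "email": "alice@example.com"},    # duplicate: dropped
    {"name": "Carol",   "email": "carol@example.com"},
]

seen, clean = set(), []
for row in raw:
    if not row["name"] or not row["email"]:         # discard incomplete rows
        continue
    record = {"name": row["name"].strip().title(),  # normalise formatting
              "email": row["email"].strip().lower()}
    if record["email"] in seen:                     # de-duplicate on email
        continue
    seen.add(record["email"])
    clean.append(record)

print(clean)  # complete, consistently formatted, unique records
```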

 

4. What is Linear Regression?

Ans. Linear regression fits a one-degree equation of the form Y = mX + C and is used when the response variable is continuous in nature, for example height, weight, or the number of hours. It is a simple linear regression if it involves one continuous dependent variable and one independent variable, and a multiple linear regression if it has multiple independent variables.
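A simple linear regression can be fitted with the closed-form least-squares formulas for m and C; the toy data below (hours studied vs. exam score) is made up for illustration:

```python
# Ordinary least squares for Y = mX + C, using the closed-form formulas
# m = sum((x - x_mean)(y - y_mean)) / sum((x - x_mean)^2), C = y_mean - m * x_mean.
X = [1, 2, 3, 4, 5]        # hypothetical: hours studied
Y = [52, 55, 61, 64, 68]   # hypothetical: exam score (a continuous response)

n = len(X)
mean_x, mean_y = sum(X) / n, sum(Y) / n
m = (sum((x - mean_x) * (y - mean_y) for x, y in zip(X, Y))
     / sum((x - mean_x) ** 2 for x in X))
C = mean_y - m * mean_x

def predict(x):
    return m * x + C

print(m, C)  # slope and intercept of the fitted line
```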

 

5. What is logistic regression?

Ans. When it comes to logistic regression, the outcome, also called the dependent variable, has a limited number of possible values and is categorical in nature, for example, yes/no or true/false. The model uses the logistic (sigmoid) function, Y = e^X / (1 + e^X), equivalently Y = 1 / (1 + e^(-X)).
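The sigmoid at the heart of logistic regression can be sketched directly; the coefficients below are hypothetical, standing in for values a fitted model would produce:

```python
import math

# The logistic (sigmoid) function squashes any real-valued score into (0, 1),
# so the output can be read as a probability and thresholded to yes/no.
def sigmoid(x):
    return 1 / (1 + math.exp(-x))

b0, b1 = -4.0, 1.5            # hypothetical fitted intercept and coefficient

def predict(x, threshold=0.5):
    p = sigmoid(b0 + b1 * x)  # P(outcome = "yes") for input x
    return ("yes" if p >= threshold else "no"), p

print(predict(1.0))  # low score: classified "no"
print(predict(4.0))  # high score: classified "yes"
```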

 

6. Explain Normal Distribution

Ans. Normal Distribution is also called the Gaussian Distribution. It has the following characteristics:

a. The mean, median, and mode of the distribution coincide

b. The distribution has a bell-shaped curve

c. The total area under the curve is 1

d. Exactly half of the values are to the right of the centre, and the other half to the left of the centre
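These characteristics can be checked empirically by simulating draws from a normal distribution (the mean of 100 and standard deviation of 15 below are arbitrary):

```python
import random
import statistics

random.seed(42)  # fixed seed so the simulation is repeatable
sample = [random.gauss(100, 15) for _ in range(100_000)]

mean = statistics.mean(sample)
median = statistics.median(sample)
share_below_mean = sum(1 for x in sample if x < mean) / len(sample)

# Mean and median coincide (approximately, in a finite sample), and about
# half of the values lie on each side of the centre.
print(round(mean, 1), round(median, 1), round(share_below_mean, 2))
```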

 

7. Mention some drawbacks of the linear model

Ans. Here are a few drawbacks of the linear model:

a. The assumption regarding the linearity of the errors

b. It is not usable for binary or count outcomes

c. It can’t solve certain overfitting problems

 

8. Which one would you choose for text analysis, R or Python?

Ans. Python would be a better choice for text analysis as it has the Pandas library, which provides easy-to-use data structures and high-performance data analysis tools. However, depending on the complexity of the data, one could use whichever suits best.

 

9. What steps do you follow while making a decision tree?

Ans. The steps involved in making a decision tree are:

a. Pick up the complete data set as input

b. Identify a split that would maximize the separation of the classes

c. Apply this split to input data

d. Re-apply steps ‘b’ and ‘c’ to each subset of the divided data

e. Stop when a stopping criterion is met

f. Clean up the tree by pruning
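Step ‘b’, identifying the split that maximizes class separation, is the heart of the algorithm. Below is a sketch of that step using Gini impurity on a made-up one-feature data set:

```python
# Scan candidate thresholds on a single feature and pick the split with the
# lowest weighted Gini impurity (i.e., the best class separation).
def gini(labels):
    n = len(labels)
    if n == 0:
        return 0.0
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def best_split(values, labels):
    best_threshold, best_score = None, float("inf")
    for threshold in sorted(set(values)):
        left  = [l for v, l in zip(values, labels) if v <= threshold]
        right = [l for v, l in zip(values, labels) if v > threshold]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(labels)
        if score < best_score:
            best_threshold, best_score = threshold, score
    return best_threshold, best_score

values = [2, 3, 10, 11, 12]                  # hypothetical feature values
labels = ["no", "no", "yes", "yes", "yes"]   # class labels
print(best_split(values, labels))  # threshold 3 separates the classes perfectly
```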


10. What is Cross-Validation? 

Ans. It is a model validation technique to assess how the outcomes of a statistical analysis will generalize to an independent data set. It is majorly used where prediction is the goal and one needs to estimate how accurately a predictive model will perform in practice.

The goal here is to define a data set for testing the model during its training phase and to limit overfitting and underfitting issues. The validation set and the training set are to be drawn from the same distribution to avoid making things worse.
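A minimal k-fold cross-validation loop can be written in plain Python. The "model" here is deliberately trivial (it predicts the mean of the training targets) so the sketch stays focused on the fold mechanics:

```python
# Split n samples into k folds; each fold takes a turn as the held-out test set.
def k_fold_indices(n, k):
    fold_size = n // k
    folds = []
    for i in range(k):
        start = i * fold_size
        end = start + fold_size if i < k - 1 else n
        test = list(range(start, end))
        train = [j for j in range(n) if j not in test]
        folds.append((train, test))
    return folds

data = [(x, 2 * x) for x in range(10)]  # toy (feature, target) pairs

errors = []
for train_idx, test_idx in k_fold_indices(len(data), k=5):
    train_targets = [data[i][1] for i in train_idx]
    prediction = sum(train_targets) / len(train_targets)  # "model": the mean
    errors.extend(abs(data[i][1] - prediction) for i in test_idx)

print(sum(errors) / len(errors))  # average error across all held-out samples
```

Every sample is scored exactly once while held out, which is what makes the averaged error an honest estimate of out-of-sample performance.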

 

11. Mention the types of biases that can occur during sampling.

Ans. The three types of biases that occur during sampling are:

a. Self-Selection Bias

b. Undercoverage Bias

c. Survivorship Bias

 

12. Explain the Law of Large Numbers

Ans. The ‘Law of Large Numbers’ states that if an experiment is repeated independently a large number of times, the average of the individual results gets close to the expected value. Likewise, the sample variance and sample standard deviation converge towards their population values.
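A quick simulation makes this concrete: a fair die roll has an expected value of 3.5, and the running average approaches it as the number of rolls grows:

```python
import random

random.seed(0)  # fixed seed for a repeatable simulation

def average_roll(n_rolls):
    # Average of n_rolls independent rolls of a fair six-sided die.
    return sum(random.randint(1, 6) for _ in range(n_rolls)) / n_rolls

print(average_roll(100))        # noisy: can be noticeably far from 3.5
print(average_roll(1_000_000))  # very close to the expected value 3.5
```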

 

13. What is the importance of A/B testing?

Ans. The goal of A/B testing is to pick the better of two variants. Use cases for this kind of testing include web page or application responsiveness, landing page redesigns, banner testing, and marketing campaign performance.

The first step is to confirm a conversion goal, and then statistical analysis is used to understand which alternative performs better for the given conversion goal.
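Once the conversion goal is fixed, the comparison is usually a statistical test. One common choice is a two-proportion z-test; the visitor and conversion counts below are hypothetical:

```python
import math

# Hypothetical results: conversions out of visitors for each variant.
conv_a, n_a = 200, 5000   # variant A: 4.0% conversion
conv_b, n_b = 260, 5000   # variant B: 5.2% conversion

p_a, p_b = conv_a / n_a, conv_b / n_b
p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
z = (p_b - p_a) / se

# Two-sided p-value via the normal CDF; p < 0.05 means the difference is
# unlikely to be due to chance at the conventional 5% level.
p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
print(round(z, 2), round(p_value, 4))
```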

 

14. What are over-fitting and under-fitting?

Ans. In the case of over-fitting, a statistical model fails to depict the underlying relationship and describes the random error and/or noise instead. It occurs when the model is overly complex, with too many parameters relative to the number of observations. An overfit model has poor predictive performance because it overreacts to minor fluctuations in the training data.

In the case of underfitting, the machine learning algorithm or the statistical model fails to capture the underlying trend in the data. It occurs when trying to fit a linear model to non-linear data. It also has poor predictive performance.
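The contrast can be demonstrated with a toy experiment: a constant model underfits quadratic data, while a model that memorises the training points scores perfectly on them but degrades on held-out data. Everything below is synthetic:

```python
import random

random.seed(1)

# Synthetic data: y = x^2 plus noise, split alternately into train and test.
data = [(x, x * x + random.uniform(-2, 2)) for x in range(20)]
train, test = data[::2], data[1::2]

def mse(model, points):
    return sum((model(x) - y) ** 2 for x, y in points) / len(points)

# Underfitting: a constant model ignores the trend entirely.
mean_y = sum(y for _, y in train) / len(train)
def underfit(x):
    return mean_y

# Overfitting: memorise the training points, predict from the nearest one.
table = dict(train)
def overfit(x):
    nearest = min(table, key=lambda t: abs(t - x))
    return table[nearest]

print(mse(underfit, train), mse(underfit, test))  # large on both
print(mse(overfit, train), mse(overfit, test))    # zero on train, larger on test
```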

 

15. Explain Eigenvectors and Eigenvalues

Ans. Eigenvectors depict the direction in which a linear transformation moves and acts by compressing, flipping, or stretching. They are used to understand linear transformations and are generally calculated for a correlation or covariance matrix. 

The eigenvalue is the strength of the transformation in the direction of the eigenvector. 
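For a 2×2 matrix, both can be computed by hand from the characteristic polynomial det(A − λI) = 0, and the defining property A·v = λ·v checked directly. The matrix below is an arbitrary example:

```python
import math

# A symmetric 2x2 matrix A = [[2, 1], [1, 2]] (e.g. a toy covariance matrix).
a, b, c, d = 2.0, 1.0, 1.0, 2.0

# Eigenvalues are the roots of lambda^2 - trace(A)*lambda + det(A) = 0.
trace, det = a + d, a * d - b * c
disc = math.sqrt(trace ** 2 - 4 * det)
lam1, lam2 = (trace + disc) / 2, (trace - disc) / 2
print(lam1, lam2)  # 3.0 and 1.0

# v = (1, 1) is an eigenvector for lam1: A only stretches it, never rotates it.
vx, vy = 1.0, 1.0
Avx, Avy = a * vx + b * vy, c * vx + d * vy
print(Avx, Avy)  # (3.0, 3.0) = lam1 * (1.0, 1.0)
```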

 

Stay tuned to this page for more such information on interview questions and career assistance. If you are not confident enough yet and want to prepare more to grab your dream job in the field of Data Science, upskill with Great Learning’s PG program in Data Science Engineering, and learn all about Data Science along with great career support.

What are the career prospects for a DevOps Engineer?

The DevOps domain is getting attention for its role in building better communication, collaboration, and agility between software development and operations teams. The role of a DevOps engineer is hard to understand because it is the product of a dynamic workforce which has not yet stopped evolving.

DevOps is a software development strategy that bridges the gap between developers and their IT counterparts. It is a practice that aims to merge software development, quality assurance, and deployment and integration operations into a consolidated and continuous set of processes.

DevOps is a natural extension for Agile and other continuous delivery approaches. With DevOps, organizations can release tiny features quickly and incorporate the feedback they receive from stakeholders rapidly.

It’s good to note that DevOps is not merely a set of actions, but more a philosophy that facilitates cross-functional team communication.

What is a DevOps Engineer?

DevOps engineers work with software developers and IT professionals to track code releases. They are the people who wear multiple hats – software development, deployment, network operations, and system administration. Teamwork stands at the core of a DevOps practice, and the overall success of the process depends on it.

As such, DevOps engineers are expected to have a thorough understanding of various concepts such as version control, serverless computing, integration, testing, and deployment. 

The role of a DevOps engineer arises from the need of businesses to get hold of their cloud infrastructure in a hybrid environment. Organizations that work with DevOps spend relatively less time on managing configurations, deploying applications, and making tweaks and updates.

The Skills Needed for a Successful DevOps Engineering Career

According to Puppet, the most critical skills for a DevOps engineer are:

– Coding and scripting

– Process re-engineering

– Communication and collaboration

*Out of these, process re-engineering is the most in-demand skill.

 

Other skills that can enhance a DevOps engineering career are:

– Software development, system administration, and an understanding of all basic IT operations.

– Experience and expertise with tools such as GitHub, Puppet, Jenkins, Chef, Nagios, Ansible, and Docker.

– Besides knowing off-the-shelf tools, a DevOps engineer should also be well-versed in basic coding and scripting languages such as Bash, PowerShell, C#, C++, Python, PHP, Ruby, Java, and so on.

– An understanding of database systems such as SQL and NoSQL database models.

– Communication and interpersonal skills are critical for a DevOps engineer, since they have to ensure that the entire team behind the software works effectively and shares and appreciates feedback to support continuous delivery.

The Roles and Responsibilities of a DevOps Engineer

In DevOps, frequent changes are made to any software system, which automatically entail testing and deployment. A DevOps Engineer is responsible for handling the IT infrastructure according to the business needs of the code deployed in a hybrid, multi-tenant environment, which requires continuous performance monitoring.

Therefore, a DevOps engineer must be aware of the various development tools which are used by software developers to write new code or enhance the existing code.

A DevOps engineer needs to collaborate with the team to handle challenges that spring up in the coding or scripting part including libraries and SDKs. A DevOps engineer handles code that needs to fit across multi-tenant environments, including the cloud.

Here are the roles and responsibilities of a DevOps engineer, in a nutshell:

– Apply cloud computing skills to deploy upgrades and bug fixes across AWS, Azure, or GCP.

– Design, develop, and implement software integrations on the basis of user feedback and reviews.

– Troubleshoot and resolve production issues and coordinate with the development team to simplify and streamline code deployment.

– Implement automation frameworks and tools – CI/CD pipelines.

– Manage the IT infrastructure, which comprises the network, software, hardware, storage, virtual and remote assets, and control over cloud data storage.

– Continuously monitor software environments for any loopholes.

– Analyze code continuously and communicate detailed feedback to software development teams to ensure improvement in software and timely completion of projects.

– Collaborate with team members to improve engineering tools, systems, procedures, and security arrangements.

– Optimize and enhance the business’ computing architecture.

– Conduct system checks for security, availability, and performance.

– Develop and maintain troubleshooting documentation to keep up with past and future fixes.

 

Apart from this explicit set of actions, DevOps engineers are also expected to follow the essential DevOps principles:

– Culture inherent in the need for communication, collaboration, and technical processes and tools.

– Automation of processes

– Measurement of the Key Performance Indicators

– Sharing feedback, knowledge, and best practices.

How Much Does a DevOps Engineer Earn?

The job of a DevOps engineer ranks #2 on Glassdoor’s Top 50 Jobs in America. The role has also witnessed a jump of 225 per cent in postings on Indeed. An important question among DevOps aspirants is: what is a DevOps engineer’s salary?

Glassdoor mentions that the average salary of a DevOps engineer in India starts from INR 5.65 lacs per annum for an average of two years experience. For the same set, PayScale suggests that the average salary of a DevOps engineer is around INR 6.6 lacs per annum. PayScale also mentions that pay is also a function of the skill sets acquired by a DevOps Engineer. Also, most of the professionals in DevOps move to other related roles in a span of 10 years.

It is safe to say that a DevOps engineer’s job is quite in demand as businesses try to become more agile and take on continuous delivery approaches over long development cycles.

If you are considering a career as a DevOps engineer, upskill yourself with Great Learning’s DevOps Engineer Certificate Program

With Career support, I got to interview with many companies – Sai Ramya Machavarapu, Data Analyst at Mercedes Benz.

A career transition can be a daunting experience for many. But given the right direction, learning, and support, it is more like a cakewalk. That’s why here at Great Learning, we strive to provide the right learning, practical exposure, and complete career support. 

What has your professional journey been like?

I completed my graduation in Electronics and Communications Engineering from Amrita College, Bangalore. Then I moved to the USA to pursue my Masters in Electrical Engineering from the University of Missouri, Kansas-City in the year 2014. I got placed in Reliable Software Resources as a QA Tester and worked until May 2017. I will be joining Mercedes Benz very soon as a Data Analyst.

How did you develop an interest in Data Science? Why did you choose GL to pursue it?

Previously, I was working as a manual tester for a consulting firm. The job involved manual testing for a banking project. The role was very limited and monotonous, so I decided not to go deeper into testing. I left my job and moved back to India. As I was from a non-programming background, I was very sceptical about getting into coding and related fields. I was looking into various technologies, and a friend suggested I consider Data Science as an option. I attended many seminars and workshops on Data Science organized by various companies. I developed an interest in the field and was looking for a classroom course. On the recommendation of the same friend, I joined Great Learning to pursue PGP-DSE.

Coming from a non-programming background, was it difficult for you to understand the subjects?

Not at all. As most of the students in the batch were from a non-IT background, the course is designed keeping them in mind. The faculty ensured that the basics were covered. I understood that the course is based on logic, so I slowly developed pace, and contrary to my presumptions, I didn’t find it difficult. The faculty put in a lot of time and attention and even repeated sessions whenever required.

How was your experience of the academic and career support given by GL?

The team was always available, especially Akhila, who helped us thoroughly in preparing for the interviews and gave regular suggestions and feedback for us to improve. Whenever we had any issues, Akhila and the team resolved them on priority.

With Career support, I got to interview with many companies like CTS, Mercedes, etc. Based on my experience, I realized that the curriculum is self-sufficient to crack any interview. The entire course is designed in a way to help us understand the concepts, crack interviews, and guide during the projects. 

What did you like the most in the program?

We were assigned mini projects on the completion of every topic. This gave a lot of hands-on experience of every topic in terms of understanding and its practical application. This hands-on experience on mini projects gave me a lot of confidence and helped me in exams as it gave a recap of all that we had learned in the course. After the completion of the course, during the capstone project, there were many remedial sessions to clear doubts. 

Share your experience of the interview with Mercedes.

The interview was organized by GL at Mercedes’s Bangalore office. It included a total of 3 rounds: 2 technical and 1 HR. The first round included questions based on whatever I had mentioned in my resume and basic questions on coding, ML, SQL, etc. The second round involved questions related to the business aspect. The final round was with HR, after which they gave me a confirmation.

Any advice to aspirants who wish to take this course?

They should be confident in sharing what they know and admit what they don’t know. Give your 100% to every interview, thinking that this is the last opportunity, as there is huge competition in the market. Their focus should be on developing a strong foundation in whatever they are learning. The interviews are based on basics and focus on testing your understanding of the field. So, have a stronghold on the basics and you will be good enough to crack it.

 

Upskill with Great Learning’s PG program in Data Science Engineering and unlock your dream career.

 

GL helped me to kick-start my career – Yeknath Merwade, Associate Analyst at Ugam Solutions

One needs career support the most when they are a fresh graduate. The right direction and support at the right time help multifold in shaping a successful career. What kind of support did Yeknath get? Read on:

What has your professional background been?

I completed my Graduation in Electrical, Electronics & Communications Engineering in 2018 from Belagavi, Karnataka. I then took a course in Data Science at Great Learning, Bangalore and currently, I am working in Ugam Solutions as an Associate Analyst.

How did you develop an interest in Data Science?

I finished my graduation with a 58% aggregate score. With this score, I was not eligible to attend interviews for any good role or company. I understood the need to upskill myself, and my father suggested I read about Data Science, which had created a lot of buzz. After researching online, I developed an interest in it and got fascinated with what this field can do.

Why did you choose GL to pursue a course in Data Science Engineering?

After viewing the scope and growth opportunities, I immediately started to search for courses. But choosing the best among them was a task in itself. All I wanted was a classroom program, as for a fresher that is better than online training. I visited GL’s website for weeks and saw it was regularly updated with relevant data and testimonials. I checked the reviews on Google and LinkedIn as well. Finally, I looked at the faculty profiles on LinkedIn and saw their experience. I understood that GL was the best institute in India to study Data Science, so I took up the course here.

What did you like the most in the program?

There were many things that I loved about the program.

  1. The Faculty: Since I looked at the LinkedIn profiles of almost all the teaching professionals, I got to know that they all were Industry experts and had a great experience in their respective fields. When I enrolled myself for the course, I was surprised to see how grounded and friendly they were. Also, they taught us everything from scratch. 
  2. Course-Curriculum: The course is well designed and well structured. The curriculum is exhaustive and gave me a good understanding of the domain. The course includes what is needed by the industry and everything is accommodated in the syllabus.
  3. Career Assistance: I got to sit for the campus drives of 7 companies and got shortlisted in all of them. Apart from this, the CV reviews and mock interviews helped me develop confidence and crack interviews. They also organized bootcamps for the students and helped us in all aspects. There were ample opportunities, and they got us placed.

Overall, it was a nice experience, as I made good friends and met faculty with whom I learned a lot, and I am still in touch with them. I feel very grateful to GL, which helped me kick-start my career.

Being from a non-programming background, did you face any issues with the course or the transition?

Initially, it was very hard for me to adjust to the syllabus as I was not at all familiar with coding or programming. The first week of the course started with Python, which was a new thing for me. Here, I would like to mention that the teaching faculty boosted my confidence by saying, “It is not rocket science and is easy to learn”. After the EDA session, I felt self-motivated and realised that, irrespective of branch, one can achieve success in one’s ventures. Slowly, things started to fall into place. I was in regular sync with the sessions, and the regular exams and quizzes kept us in constant touch with the topics. In the end, everything went well.

Share your experience of interviewing with Ugam?

I had 4 interview rounds: a 30-minute SQL test, followed by a 30-minute case study, a technical round, and finally an interview with the Vice President and HR. The technical round involved questions about the project mentioned in my resume and general technical questions to check my understanding of algorithms. The interview with the VP assessed how my skills could contribute to Ugam’s analytics team, along with general questions from HR. After the interview, I received a job confirmation from them. 

Any advice for future aspirants of this course?

I would suggest preparing well on statistics and SQL. The material is self-sufficient, with in-depth content and curriculum. The placement assistance is superb and helps everyone get placed, so there is no need to panic. Also, focus on your project, as all my interview questions revolved around my Capstone Project.

 

Upskill with Great Learning’s PG program in Data Science Engineering and unlock your dream career.

I got to interview with 3 companies – Pushpendra Nathawat, Programmer Analyst at Cognizant

Finance has evolved to position itself as an important business function. Given the nature of this domain, it overlaps with analytics in many areas. Finance professionals and executives are finding new ways to leverage this overlap and increase the value of this vertical in their organizations. 

What is your professional background?

I completed my MBA at Tapmi School of Business in 2015. I then joined Vodafone and worked as a Relationship Manager for 10 months. I switched to HDFC and worked there for over 1.75 years as an Assistant Manager. Currently, I am working with Cognizant as a Programmer Analyst.

Why did you think of upskilling? Why did you choose Great Learning?

I did an MBA with a specialization in Finance, and while working with HDFC, I enrolled in a Financial Risk course with IIM Kashipur. Though I had good knowledge of the finance domain, I had no understanding of coding or data science. I felt the need to upskill and looked for courses. While searching, I found high recommendations for GL, so I left my job in Jaipur and moved to Bangalore to pursue a full-time program in Data Science Engineering with Great Learning.

What did you like most about the program?

The management, staff, and faculty were all very helpful. The faculty took a great deal of interest in teaching students and explained every topic well. The management was very supportive in providing assistance whenever the batch needed extra sessions or special classes to better understand the programming subjects. 

How was your overall experience at Great Learning?

Since I was from a non-programming background, it was initially a bit difficult to follow the specific modules. But later, with the help of the faculty, I could cope with the subjects, and they became easier to understand and manage. The faculty were very helpful in providing material and guidance, especially in my weak areas, and took extra effort to organize classes on those topics over the weekends. Since I was very new to data science, I had to improve a lot in terms of my CV and interview performance. The career assistance provided by GL helped me prepare an impressive CV, and the mock interviews prepped me to crack interviews.

Share your experience of the career fair organized by GL.

I got to interview with 3 companies: Kinara Capital, Credi India, and Cognizant. I cleared the interview with CTS, which involved 3 rounds on the same day: two technical rounds of 45 minutes each and one HR round. The technical interviews tested my knowledge of machine learning. I received the job confirmation the same day.

 


The only resolution you should be making in 2017

Every New Year brings with it the hope of a new beginning in our lives, and along with it come the myriad resolutions we make to ourselves. Research indicates that most resolutions people make are towards fitness and weight loss. As a result, January becomes a windfall month for most gymnasiums and fitness studios. While most of us don’t become any leaner or fitter with the passing years, the one thing we can definitely achieve is becoming a better version of ourselves. To achieve that, you don’t have to make tall promises to yourself: just make learning a habit.

Learning new things is simple, achievable and one of the most profitable investments you can make each year.

 

1. Learning is like weight-loss


Let me make an uncanny analogy here: aspiring to become leaner is very similar to wanting to learn something new. Ultimately, you have to change something core in your behaviour to get the desired results. Both goals need focus, determination and lots of discipline. And lastly, in learning as in weight loss, there are no low-hanging fruits or express results. Both take time to fructify, but once you go the distance, there is no looking back.

 

2. Why ‘Learning’ in 2017?


The right question here should be ‘Why not?’. There has never been a better time to learn. Frankly speaking, with the changing dynamics of businesses and technology disruption impacting us, if we don’t make learning a habit in 2017 and onwards, our professional credentials will be questionable at best and irrelevant at worst. Learning new skills and upgrading one’s professional capabilities is no longer a matter of choice but a necessity for a fruitful career. In today’s day and age, the half-life of knowledge is forever decreasing, which means one needs to keep learning to stay professionally relevant. The new reality is that what you learn at 25 will not take you till 35.

 

3. What should I learn?


This is like standing by an ocean and trying to find the perfect starting point for your swim. What you can learn is limited only by your intellectual bandwidth and interest. For the sake of brevity, let us focus on what the professional in you needs to learn. Depending on the industry you are in, or aspire to be in, you need to understand the trends driving growth. If you are unclear about them, talk to your seniors in the industry and pick their brains. Pick an area that is affecting most companies in your space and will eventually impact everyone, and build your skill set in it. Professional competencies such as analytics, big data engineering, product management, information security, intellectual property, and digital marketing are high-growth areas where most companies are struggling for ‘good’ talent. Finding a sweet spot like this and making yourself competent in it will ensure your career benefits from this talent shortage.

 

4. Where should I learn?


Learning in 2017 will be easier than ever before. From blogs to YouTube and TED, from companies offering online learning to mobile apps, ‘lack of access’ cannot be your excuse for not learning. That said, having a plethora of options can make the choice overwhelming and confusing.

I come across some candidates who know what skills they need to acquire but are not sure if they will be able to learn them. I usually advise them to first test the waters by accessing some free content online. YouTube is usually a good source for this. See if you like what you are learning and are able to grasp it.

 

5. Why do we fail to learn online?


If you are the kind that does not suffer from such starting troubles, you will usually find your learning options to be either completely online courses or blended courses (online + occasional weekend classroom sessions). Given this spread, how do you decide which format to go for?

Completely online courses provide convenience since you don’t need to attend any classroom sessions. But online learning has been plagued by abysmally low completion rates. The main reason is that most of us learn better in a classroom setting, with peers and faculty we can talk to in person.

The flexibility of attending classroom sessions over a few weekends in a month gives you the best of both worlds: the flexibility of online learning and the effectiveness of classroom learning. In our blended analytics program, we have seen hundreds of candidates take our program after having done one or more online courses. When asked why, the most common response is that they felt their learning in the online programs was incomplete. Also, when it comes to acquiring hard skills such as analytics, big data or machine learning, it is important to focus on programs that are exhaustive and immersive, rather than ones that take a superficial approach by promising to teach something in a matter of hours.

 

6. What will it take?


Learning is for everyone. Among the thousands of candidates who take our programs every year, we see about 30% of them fall within the 15-30 year experience bracket. While there is no age limit to imbibing the habit of learning, just as with all good habits, the sooner you start, the better. Having said that, learning is hard work. Depending on when you were last in a classroom, you will need discipline, focus and perseverance to go the whole distance. We have usually seen that the first two months are the hardest, but once you settle into a routine within the first sixty days, you will go on to achieve the results you desire. The advice we give all our learners is to start small: begin by dedicating an hour every day for the first two weeks, then about 8-10 hours a week for the next thirty days. Small changes in your habits will ultimately lead to big gains in your learning and professional success.

On that note, in 2017, make a promise to yourself: to learn something new and to challenge your professional status quo. Make learning a habit and build the career you’ve always wanted. Oh, and as for fitness, try playing a sport 5 days a week. It is fun and just as effective (or ineffective) 🙂