Technology in Higher Education

Reading Time: 4 minutes

Read this article, published in Business Standard by Mohan Lakhamraju, Founder of Great Learning, to learn how technology has made its way peripherally into the teaching-learning process.

Technology has massively impacted every aspect of our lives over the past 30 years. It has transformed how we work, communicate, find information, stay in touch, travel, eat, shop, consume content, stay healthy and even get entertained. Remarkably, one of the most impactful aspects of our lives, namely higher education, has resiliently remained relatively undisrupted by technology.

By and large, at most higher education institutions (HEIs), students still attend classes taught by knowledgeable faculty, read textbooks and handouts, take tests and exams, do projects, and get grades and a degree certificate. This was the case half a century ago and remains the case today. Of course, it has become easier for students to research online (as well as to plagiarize) and to share and learn from each other. And there are high-tech projectors, computers, AV systems (and in some cases smartboards) in the classroom rather than slide projectors and chalkboards. But the basic teaching-learning process remains largely the same.

Is that really so?

While I have painted an extreme picture to make my point that technology has not really disrupted or impacted the core of higher education yet, that picture is not entirely accurate. If we look at the various dimensions of higher education, there are aspects where technology has gained massive penetration and others where there is very little. Let us take a deeper look.

The Business Process of Education

If the various activities in Higher Education are viewed as Business Processes, then Process and Workflow Automation through ERP (Enterprise Resource Planning) has gained substantial penetration across reputed HEIs around the world. Processes including admissions, fee payments, course planning and scheduling, faculty and other teaching resource management, learning material distribution and student information tracking have all been automated using technology. In fact, these systems – ERP, SIS (Student Information Systems), LMS (Learning Management Systems), etc. – have even moved to the cloud in the past few years. Thus, it is fair to say that the business processes of higher education have been streamlined and automated by technology, saving a lot of paper and time and reducing the possibility of human error.

What about the actual Teaching – Learning Process?

Technology has also made its way peripherally into the Teaching-Learning Process in several ways. Some of these include:

  1. Learning Management Systems: Many of the good HEIs around the world now use some kind of LMS, which digitally facilitates several aspects that were earlier handled manually or physically. These include digital distribution of learning material, communication and doubt clearing, assessments, and information sharing such as announcements and grades.
  2. Lecture Capture Systems: Fairly common now at high quality HEIs are these systems which capture the classroom lectures digitally and make them available as videos to learners to view at their leisure. This is of great help to students who need to hear things a few times before they understand or to those who could not attend for some reason.
  3. Online Learning Resources: There are many high-quality, free, open educational resources (OERs), like MERLOT, MIT OpenCourseWare, Apple University, YouTube, etc., now available to both teachers and students. These empower teachers by reducing the burden of creating material from scratch.
  4. Online Learning: There has been a massive surge in online learning itself, ranging from online programs now offered by many leading HEIs to MOOCs on platforms such as Coursera and edX, all of which happen through technology platforms.

I use the qualifier "peripherally" since none of these fundamentally change the predominant pedagogical model – the teacher educating the learner.

Does any of this improve learning?

Despite all the tech-enabled interventions mentioned above, the biggest determinant of learning effectiveness remains an inspiring and engaging teacher who is able to arouse curiosity, interest and engagement among learners. However, such teachers are too few and far between to make a broad-based impact. So, is there nothing technology can do to address this aspect – learning effectiveness? It turns out that a few ideas are being used with some promising initial success.

  1. Flipped Classroom: In this model, students review the course content, typically readings and recorded lectures, outside the classroom, and classroom time is used to apply the knowledge and solve problems. It inverts what has traditionally been classwork and homework, but it is more engaging for students and has been seen to result in better learning outcomes.
  2. Gamification: Experience from computer games has shown that game dynamics create very high engagement. These are being borrowed and implemented in the learning process by introducing fine-grained competition, leaderboards, levels, etc, across all aspects of the learning process and not just the exam results.
  3. Learning Analytics: Though still in its early days, Learning Analytics promises to be a big game changer. It enables aspects such as learner engagement, understanding of concepts, difficulties encountered, progress made and various performance parameters to be measured, analysed and even predicted. Learning Analytics brings measurement into aspects of education that have never been measured before. The resulting data can be analysed to gain insights into learners both individually and collectively, to identify topics that students did not follow, and to predict which students are likely to fail, thus enabling necessary corrective action. Further, it can enable Adaptive Learning, where learning resources and methods are personalized to suit the learning dynamics of individual learners. This analysis has hitherto happened only intuitively, in the minds of observant, caring and dedicated teachers. Now it can be applied broadly to have a real impact on learning effectiveness.
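To make the at-risk prediction idea concrete, here is a minimal illustrative sketch in Python, not any specific product's method. The metric names, weights and threshold are all hypothetical; a real learning analytics system would learn them from historical data rather than hard-coding them.

```python
# Illustrative sketch: flag potentially at-risk learners from a few
# simple engagement metrics. All names and weights are hypothetical.

def risk_score(logins_per_week, avg_quiz_score, videos_completed_pct):
    """Combine a few engagement signals into a 0-1 risk score.

    Each signal is normalised so that 1.0 means "concerning" and
    0.0 means "healthy"; higher overall score = more at risk.
    """
    low_activity = max(0.0, 1.0 - logins_per_week / 5.0)
    weak_scores = max(0.0, 1.0 - avg_quiz_score / 100.0)
    low_completion = max(0.0, 1.0 - videos_completed_pct / 100.0)
    # Hypothetical weights; a real system would fit these to outcomes.
    return 0.3 * low_activity + 0.5 * weak_scores + 0.2 * low_completion

def flag_at_risk(students, threshold=0.5):
    """Return the names of students whose risk score exceeds the threshold."""
    return [name for name, metrics in students.items()
            if risk_score(**metrics) > threshold]

students = {
    "asha": {"logins_per_week": 6, "avg_quiz_score": 85, "videos_completed_pct": 90},
    "ravi": {"logins_per_week": 1, "avg_quiz_score": 40, "videos_completed_pct": 30},
}
print(flag_at_risk(students))  # → ['ravi']
```

The point of the sketch is the workflow, measure engagement, score it, act on the outliers; swapping the hand-set weights for a model trained on past cohorts is what turns this into genuine predictive analytics.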

What more does the future hold?

An emerging area ripe with potential is the use of Augmented and Virtual Reality (AR/VR) to create immersive learning experiences that enable learning by doing rather than listening or watching, which can substantially improve learning effectiveness.

Further, with a data-driven approach slowly beginning to take hold in higher education, there is tremendous scope for impactful data-led innovations. Prime among them is the notion of an Artificial Intelligence (AI) enabled Learning Assistant. Think of your own Jarvis (from Iron Man) as a personal tutor who knows your learning preferences and behaviour intimately, knows what and how much you have learned, and can create a customized learning experience that best suits you. IBM is working on this with Watson.

Many innovative startups, as well as big tech giants like Google, IBM and Apple, are working on disruptive innovations in education, which makes the next decade very exciting for this important aspect of our lives.


How IT Professionals Can Prepare Themselves to Deal with Layoffs

Reading Time: 3 minutes

The recent IT layoffs are a testament to the sharp shift that the technology sector is experiencing in terms of jobs and skills. Read this article by our founder, Mohan Lakhamraju, in the Economic Times to find out what you can do to save your job and stay relevant for a long time!

“The nature of IT industry is changing where their clients are demanding higher levels of automation and greater leverage on their technology investments.

The IT industry’s original business model of deploying armies of programmers and charging for their time is rapidly becoming obsolete. Projects now have either outcome-based pricing or fixed price with high competition.

Therefore, IT companies are trying to achieve more with less manpower, which is resulting in more automation. The jobs most at risk are those that involve routine maintenance and workflow management, since these are increasingly being automated.

Legacy skills like mainframes and software testing are also becoming obsolete as newer technologies are adopted. The skills in demand are broadly in the areas referred to as SMAC – Social, Mobile, Analytics and Cloud.

Digital and Social media marketing is fast taking share from traditional media due to its data-driven nature and measurability. Every application is now moving to mobile, leading to more demand for mobile development. Applications are increasingly moving to cloud from being on large backend mainframes. Analytics and Data Engineering is permeating every domain.

To deal with this change and to withstand the fluctuations of the job market, professionals need to train themselves in skills that are in high demand. The days of "learn once, work forever" are gone. We are now in a world where continuous and lifelong learning will become the norm.

So, IT professionals will need to upgrade their skills at least every 3-4 years. They can do this by taking training programs offered by their companies. They can also learn by doing online courses.

For those who are motivated, there are plenty of free online resources to learn from. Those who need help and guidance should choose a more structured program from a reputed institution. The best solution is to pick up new skills on a regular basis by oneself. This requires following the latest technology trends and staying up to date with them. Those who do this will be best positioned to convert the threat into growth opportunities.

For many professionals, developing these new skills is a matter of survival and staying relevant in a dynamically changing environment.

For those whose skills are getting outdated, more than ROI, the relevant question is what happens if they do not make the investment. In most cases, those who complete the courses properly and develop the new skills are able to transition to newer, higher-paying jobs and achieve ROI within 1-2 years.

Lastly, disruptive shifts in the IT and Technology industry are not new. They have been happening every decade. So, experienced people in the industry who have worked across decades are already used to seeing these shifts.

It is the younger professionals, with less than 10 years of experience, who are usually caught off-guard, because they have not seen prior shifts and may get too comfortable with what they already know and stop learning. The advice to these young professionals is to make continuous learning a lifelong habit."



RIP Degree, Hello Competency?

Reading Time: 4 minutes

The inevitable transition of value from Degrees to Competencies in the knowledge economy

I had a conversation in Bangalore recently with a senior technology professional, one with over 20 years of experience in both large and small technology companies and currently in a VP role at one of the poster children of India's Internet businesses. He said that when he interviews people these days, he very rarely looks at what degree they possess. He is more interested in what they can do and in the most recent course they have done on Coursera.

Shift in Hiring Manager’s mindsets

Conversations with dozens of senior industry professionals over the past couple of years indicate that most of them, in their hiring decisions, look more at what candidates know (knowledge) and can do (competence) than at what degree they have.

This increasingly common phenomenon acknowledges the fact that our undergraduate and postgraduate degrees are no longer good enough. The rapid pace of change driven by technology has meant that we have to be constantly learning to keep up with the latest and best practices.

When we were growing up, people were introduced to each other as an engineer, a doctor, a CA, a lawyer, a commerce graduate, an arts graduate, etc. – essentially tying our identity to our formal education. Job requirements were specified in these terms. The most essential requirement for a job would be the undergraduate or postgraduate degree. Today, if you look at the job descriptions on Naukri or LinkedIn for knowledge workers/roles, the degree is the last or second-last thing mentioned, with most of the other requirements specified in terms of competencies: x years of digital marketing experience, y years of data analysis experience, z years of design experience, etc. Unless it is from a very reputed top institution, the degree seems to hardly contribute to the interview evaluation. Even the value added by a reputed institution seems to be attributed more to the filtering and the motivation/drive of its candidates that it signals than to the specific degree pursued there.

This transition is already manifesting in the recruitment practices of several of the most reputed companies in the world. A few weeks back, Ernst & Young, one of the most reputed consulting firms and a large recruiter of young talent globally, declared that its UK division would scrap the UG degree as a recruitment filter and instead rely on its internal competence assessment. Several technology companies like Google, Uber, Facebook and, closer to home, Flipkart, Snapdeal, etc., have started accepting candidates based on informal credentials like the "nanodegrees" from Udacity or the "specializations" and certificates from Coursera, which are merely signals of verified competencies and not accredited degrees or diplomas.

This phenomenon hit home for me recently from the most unlikely of sources. I was attending a talk at one of the TiE conferences on education in Delhi, and the keynote speaker was an ex Pro Vice Chancellor of IGNOU. IGNOU is the largest grantor of degrees in the world, with over 1 million enrolees, distributing hundreds of thousands of degrees and diplomas each year. I am sure these degrees are meaningful to a large number of people who were not fortunate enough to go through a full-time college experience, qualifying them for a large number of government and public sector jobs and thus serving a valuable purpose. However, it is widely acknowledged that they hold little value in the knowledge economy due to the poor learning outcomes associated with them.

Given this background, I went into the talk expecting to hear a very traditional perspective on education from this septuagenarian gentleman. However, what I heard blew my mind. He had some of the most progressive and creative ideas I had ever heard on the transformation that is happening, and will happen, in the world of education and in the global talent markets. One of the points he made really stuck with me. He said, "Today, what KRA stands for has changed. It stands for 'Kyun Rakhe Aapko (Why should I keep you)'." I thought this captured most succinctly the massive shift in focus in the talent market from degree to competency.

Dawn of the Portfolio

If competencies are becoming all-important, how does one showcase or communicate them? This is being done by creating a "portfolio" or "body of work" that demonstrates the competence. This approach is not new. It has been widely used in fields that require creativity and innovation, qualities that are increasingly important in the knowledge economy. A photographer, an architect, an artist, a designer, a writer, a journalist, a film director, a PR executive – all of these professionals are judged by reviewing a portfolio of their prior work. This is now being applied to knowledge professionals as well. Good programmers are being judged by the code libraries they have shared on GitHub, the hackathons they have participated in and their Topcoder rank. Good Data Analytics professionals are being judged by the analytics problems they have solved on platforms like Kaggle. Marketing professionals are being judged by the blog posts and social media presence they have created for their brands.

I believe that this trend will accelerate further, as it is in keeping with the general shift toward data-driven decision making. When recruiters can make decisions based on data that is directly relevant to them, like the portfolios of the candidates they are considering, they have little reason to depend on a stamp applied by a third-party education institution using a methodology that may or may not be relevant to them.

So, it’s high time all knowledge professionals, particularly those at the early stages of their career, start creating their personal portfolios. That will be their currency in the competence-driven world of the future.