Why Many Edupreneurs and Stakeholders in Education Do More Harm Than Good

The education sector has seen a surge in the number of edupreneurs and stakeholders who claim to have revolutionary solutions to transform the way we learn. However, some of these individuals and organizations are doing more harm than good, often under the guise of innovation and disruption.

Many of these edupreneurs have jumped on the artificial intelligence (AI) bandwagon, either without fully understanding the technology and its limitations or deliberately exploiting the hype to secure venture capital (VC) funding. These VCs, in turn, promote the ideas they have invested in through aggressive marketing, often without empirical evidence to support their claims.

They claim that AI will ultimately replace teachers and make books and open educational resources obsolete, which is ironic, given that AI itself is trained on these very resources.

They tout buzzwords like "personalized learning," "discovery learning," "student-led learning," "problem-based learning," and "inquiry learning" as the future of education, without providing concrete evidence of their effectiveness. The assumption is that students can learn on their own, without the guidance of a teacher, using AI-based proprietary edtech tools.

However, this assumption is flawed. Novice learners are not miniature experts, and what works for an expert usually doesn't work for a beginner. This is known as the expertise reversal effect (Kalyuga, Ayres, Chandler, & Sweller, 2003): instructional techniques that are effective for learners with low prior knowledge can lose their effectiveness, or even hinder learning, for learners with high prior knowledge, and vice versa.

While an expert can be handed a problem to solve after being taught a technique or principle, a novice needs a more structured route to applying that principle to the same problem, for example a worked example that demonstrates every step of the solution.

As learners progress, a fading procedure, in which steps of the worked solution are gradually left open for the learner to complete on their own, is more effective than an abrupt switch from worked examples to independent problem solving. This approach recognizes that learning is a gradual process that requires scaffolding and support, particularly in the early stages.
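To make the contrast concrete, here is a small hypothetical illustration in Rust (the language used in the tutorial experiment described later). The task, function names, and the choice of which step to fade are all invented for illustration; the point is only the structure: a fully annotated worked example, then a faded version where one step is handed over to the learner.

```rust
// Worked example: every step of computing an average is shown and explained.
fn average_worked(xs: &[f64]) -> f64 {
    // Step 1: sum all the elements.
    let sum: f64 = xs.iter().sum();
    // Step 2: divide the sum by the number of elements to get the mean.
    sum / xs.len() as f64
}

// Faded version: step 1 is still given, but step 2 is left for the learner.
// (The solution line is kept here so this sketch compiles; in a real
// faded example the learner would fill it in, e.g. via todo!().)
fn average_faded(xs: &[f64]) -> f64 {
    // Step 1 (given): sum all the elements.
    let sum: f64 = xs.iter().sum();
    // Step 2 (your turn): divide the sum by the count.
    sum / xs.len() as f64
}

fn main() {
    let data = [2.0, 4.0, 6.0];
    println!("{}", average_worked(&data)); // prints 4
    assert_eq!(average_worked(&data), 4.0);
    assert_eq!(average_faded(&data), 4.0);
}
```

Over a sequence of such problems, more and more steps would be blanked out until the learner is solving unassisted, which is exactly the gradual handover an abrupt jump to open problems skips.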

It is true that our education system has flaws that need to be addressed and criticized. However, this does not justify edupreneurs using these flaws as a pretext to introduce another precarious model that prioritizes their profits over improved learning outcomes.

We should be cautious of marketing gimmicks and critically evaluate the evidence behind these new approaches. It is essential to widely criticize and reject those that are not grounded in empirical evidence to prevent harm to our education system and, ultimately, to our students.

Why I think teachers can't be replaced:

Books, MOOCs, and AI are all valuable tools, but that doesn't mean AI will replace teachers and books. Instead, AI serves as an assistant for both teachers and students.

Here are a few points to consider, and there are many more:

  1. Finding the right kind of content is crucial, and discovering resources can be a daunting task that consumes hours of my time. Google search and AI are not silver bullets for finding tailored content. To get a comprehensive understanding, one needs to consult multiple sources such as Stack Overflow, Amazon ratings, online forums, GitHub, and peer-reviewed journals to determine which resources are available and which ones to select. Data collection and validation require time and expertise. Additionally, not everyone can afford expensive materials, which pushes some learners toward torrents or sites like Sci-Hub. This process becomes a research project in itself, which can be overwhelming for novices.

  2. Simply providing materials is not enough. I conducted a small experiment with a graduate student who had no prior programming experience. I gave her a good Rust video tutorial and asked her to complete it. However, she was lost, with hundreds of questions in mind. She didn't even know how to use an editor like VSCode, which the video didn't explain. There were many gaps in the video that a student needed to fill in to get started and understand the material. Once I began walking her through it, she was able to ask questions, get answers, watch demonstrations, and get help with compilation errors. After that, she could revisit the video on her own and understand it from a new perspective.

Don't tell me that AI can replicate this experience. To even ask a question, one needs background knowledge of what to ask. AI answers are not always optimal or accurate, and they are limited by the data they've been trained on and computational power.

  3. The language barrier of the material is another issue. Most materials are in English, which is not the native language of many Indians. The pronunciation is also different, and following along requires a significant amount of vocabulary knowledge that takes years to develop. This doesn't mean we should provide native-language materials exclusively, as not learning international languages can lead to less collaboration and segregation. Converting materials to native languages is a massive task that requires significant resources, and it's not feasible to convert all materials.

  4. Building grit is not easy, and students can't develop a growth mindset at the click of a button. It requires nudges and feedback from real humans to make students accountable for their learning.

  5. Furthermore, AI relies on content generated by teachers and researchers. Without data to train on, there can be no AI. So how can it replace books and research articles?

  6. AI cannot teach human values. A proprietary AI sold by a profit-making edupreneur is unlikely to advise using open-source software or developing and promoting free and open-source software (FOSS).