Editorial

History on Repeat: Higher Ed’s Tech Hype Undermines Faculty — Again

By Emily Barnes
As AI tools flood campuses, faculty are left out of planning and support. It’s a pattern higher ed can’t afford to repeat.

Tech budgets have soared. Faculty morale has sunk. And once again, higher education teeters on the edge of a familiar cliff, captivated by the illusion that AI will solve engagement, build online learning at scale and eliminate the “friction” of faculty altogether.

After 15 years straddling both sides of the academic aisle, the divide is painfully clear. Administrators question why faculty resist change. Why they will not lower entry thresholds, adopt edtech faster or align more closely with institutional priorities like enrollment and retention. Faculty, in turn, ask why administrators bypass input, dismiss academic rigor and make top-down decisions about pedagogy without setting foot in a classroom.

At its core, both sides are guilty of defending turf in a system already being rewritten. The era of mutual finger-pointing is over. Not because one side won, but because AI has entered the conversation, and it is not waiting for consensus.

The current wave of AI implementation in higher ed is not revolutionary; it is recycled from years of system integration. Institutions are once again investing in tools to work around faculty, rather than with them. But here’s the contradiction: the very metrics used to justify these purchases, such as student retention, satisfaction and learning outcomes, are still inextricably tied to faculty interaction. The research confirms it. The platforms know it. Yet the budget refuses to reflect it.

Historical Patterns in Tech Adoption

This playbook is predictable. Blackboard dominated the 2000s. CRMs flooded campuses in the 2010s. Predictive analytics dashboards followed, each marketed as a game-changer for student success. But for faculty, the experience was rarely transformative. It was more often a top-down mandate, layered onto already strained workloads with minimal training and maximal oversight.

In 2009, I entered higher education as a campus librarian. My job? Help faculty upload presentations, troubleshoot learning platforms and build their first online courses. My other job, unstated but constant, was helping students find those courses, open their assignments, fix their passwords and, yes, unjam the printer. Faculty were not resisting innovation. They were navigating it unsupported, and the same was true for students.

The research backs this up. According to a 2022 study from Educause, major educational technologies often failed to reach full adoption due to poor integration with teaching practices and insufficient faculty involvement in procurement decisions. Tools were purchased before institutions understood the needs of the people actually using them. That misstep continues today. Students still struggle with basic navigation. Faculty still patch together workflows with plugins and PDFs. Meanwhile, administrators chase platform consolidation, and vendors offer “turnkey” solutions that promise transformation at scale.

Now, AI has entered the cycle. With systems that generate syllabi, grade assignments and simulate instructional content, institutional buyers are once again being sold a vision of frictionless teaching. But what is really being optimized: time or thought? Connection or compliance? If the past two decades offer any insight, higher ed may soon possess extraordinary tools that no one has the time, training or trust to use.


Research on Faculty Readiness and Institutional Investment Gaps

The Time for Class 2025 report from Tyton Partners provides a jarring look at this disconnect. According to the report, while 77% of administrators are optimistic about the impact of AI on student outcomes, only 24% of faculty report feeling prepared to integrate AI into their teaching. Even more concerning, fewer than 20% said they had received any formal training or institutional support to do so. In other words, more than four out of five faculty are expected to adopt tools they have not been trained to use.

The investment chasm widens further when looking at how institutions fund system implementation versus faculty development. While nearly 60% of institutions reported significant new investments in AI and digital tools for 2025, fewer than 30% allocated parallel funding for faculty support, such as course release time, training stipends or instructional design partnerships. The story is clear: tools are being prioritized over people.

This is not a fringe concern. According to research from the Bill & Melinda Gates Foundation, effective student learning outcomes are most strongly correlated with high-quality instruction — not platform presence. Yet, the budgetary pattern continues to treat faculty as friction to be automated rather than value to be amplified.

The Disconnect Between Investment and Experience

Inside the LMS dashboards and AI product demos, the future looks sleek. Auto-graded assessments, nudging systems and engagement scores, all wrapped in user-friendly design. But for many faculty, the on-the-ground experience is anything but frictionless. Poor interface logic, limited customization and opaque algorithmic functions undermine pedagogical flexibility.

Meanwhile, support staff and instructional designers report surging workloads. Without investment in intermediary roles that help bridge tech and teaching, AI-enhanced platforms become yet another demand on already-stretched teams. A 2024 Educause survey on institutional transformation found that one of the top three barriers to effective AI integration was "insufficient staff capacity to support faculty." This is a human system problem.

In contrast, another recent study found that when institutions adopt co-design models, in which faculty help shape AI implementations, they report smoother rollouts and greater innovation. Yet these approaches remain the exception, not the norm. The consequence? High-cost systems with low utilization. Or worse, AI repurposed as a surveillance tool that ranks teaching behavior, monitors chat logs and commodifies learning without context.


A Better Way Forward

Higher education does not need another decade of tool-driven transformation that leaves educators behind. It needs intentional, equity-minded innovation that treats faculty not as end-users, but as co-designers. Institutions must reallocate a portion of system implementation budgets to build what Tyton Partners calls a “faculty support infrastructure,” including AI literacy programs, team-based course design and incentives for experimentation.

Ethical adoption also means asking better questions during procurement: Does the platform offer explainable AI? Are accessibility and student data protections embedded into the codebase, not just bolted on? Do faculty have agency over what gets automated? These are not wish-list items; they are non-negotiables if AI is to serve learning rather than efficiency alone.

Finally, institutions must get past the financial fantasy of a frictionless future free of faculty governance. Learning is not clean. It is complex, iterative and deeply human. AI can play a role, perhaps even a transformative one, but only when the faculty who facilitate that learning are resourced to adapt, evolve and lead.

We have been here before. We saw what happened when CRMs replaced advisors and dashboards replaced dialogue. Now, AI is being offered up as the next savior of higher ed. Only this time, it is not saving the classroom; it is quietly rewriting it. Unless institutions begin investing in faculty as aggressively as they invest in platforms, they are not building a future. They are funding a façade. And sooner or later, the system will break, again, not from too little technology, but from too little trust.


About the Author
Emily Barnes

Dr. Emily Barnes is a leader and researcher with over 15 years in higher education, focused on using technology, AI and machine learning to innovate education and to support women in STEM and leadership. She shares that expertise by teaching and developing related curricula. Her academic research and operational strategies are informed by her educational background: a Ph.D. in artificial intelligence from Capitol Technology University, an Ed.D. in higher education administration from Maryville University, an M.L.I.S. from Indiana University Indianapolis and a B.A. in humanities and philosophy from Indiana University.

Main image: Rawpixel.com on Adobe Stock