Every time Washington announces a new executive order on artificial intelligence, universities breathe in, and too often exhale responsibility. The assumption is predictable: if the federal government signals a deregulatory posture toward AI, then long-standing education privacy obligations must be loosening as well. That assumption is wrong, and it is becoming one of the most dangerous misreadings in higher education’s AI moment.
FERPA — the Family Educational Rights and Privacy Act — does not bend to executive orders. It does not flex with political winds. It does not care whether AI is framed as innovation, competitiveness or national strategy. FERPA is a statute. And statutes do not yield to executive rhetoric.
The uncomfortable truth is simple: AI systems create records. Records trigger FERPA. Nothing in an AI-related executive order changes that legal reality.
Table of Contents
- What Executive Orders Can Do — and What They Cannot
- FERPA Is Technology-Neutral by Design
- The False Comfort of Deregulatory Signals
- Vendors Do Not Become FERPA-Compliant By Executive Fiat
- Why This Moment Actually Raises the Stakes
- The Line That Survives Every Administration
What Executive Orders Can Do — and What They Cannot
An executive order can shape tone, priorities and enforcement emphasis within federal agencies. A pro-AI order may discourage new guidance, slow regulatory expansion or signal that innovation should not be “over-regulated.” Universities often mistake this for legal cover.
But executive orders cannot repeal or amend federal law. The Family Educational Rights and Privacy Act was enacted by Congress and codified at 20 U.S.C. § 1232g. Its requirements are implemented through binding regulations at 34 C.F.R. Part 99. No executive order, whether Trump’s, Biden’s or anyone else’s, can rewrite those provisions.
This distinction matters. Executive orders influence how agencies behave; statutes determine what institutions must do. Universities that conflate the two are not being bold; they are being reckless.
FERPA Is Technology-Neutral by Design
FERPA does not regulate tools. It regulates records.
Whether a student record is created through a paper form, an LMS clickstream, an email thread or an AI prompt is legally irrelevant. If the record is directly related to a student and maintained by an educational agency or a party acting for it, FERPA applies. That definition has remained stable across decades of technological change, precisely because Congress drafted FERPA to be medium-agnostic.
AI does not sit outside this framework. When an AI tutor logs student prompts, stores outputs, infers mastery, flags confusion or detects suspected misconduct, it is generating education records. When those records are retrievable, exportable or discoverable through enterprise compliance tools, they are squarely within FERPA’s scope.
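To see why, consider a deliberately stripped-down sketch. Everything below is hypothetical (the function names, fields and file-based storage are illustrative rather than drawn from any real product), but even this bare-bones tutor produces exactly the kind of retrievable, student-identifiable log described above.

```python
import json
from datetime import datetime, timezone
from pathlib import Path

# Hypothetical log location; real deployments typically write to a vendor
# database or analytics store rather than a local file.
LOG_PATH = Path("tutor_interactions.jsonl")

def log_tutor_interaction(student_id: str, prompt: str, response: str,
                          inferred_mastery: float) -> None:
    """Append one AI-tutor exchange to a persistent log.

    Each entry ties a prompt, a response and an inferred skill estimate to an
    identifiable student and is maintained on the institution's behalf, which
    is what makes the log an education record in the FERPA sense, no matter
    how "conversational" the interface feels.
    """
    record = {
        "student_id": student_id,              # directly identifies the student
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "prompt": prompt,                      # may itself contain personal details
        "response": response,
        "inferred_mastery": inferred_mastery,  # an inference about the student
    }
    with LOG_PATH.open("a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")

def export_student_records(student_id: str) -> list[dict]:
    """Retrieve every stored exchange for one student.

    That this export is trivial is the point: the data is retrievable and
    disclosable, which is exactly what brings it within FERPA's scope.
    """
    with LOG_PATH.open(encoding="utf-8") as f:
        return [rec for line in f
                if (rec := json.loads(line))["student_id"] == student_id]
```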
No executive order changes the definition of an education record. No executive order creates a new FERPA exception for “AI-native” systems.
The False Comfort of Deregulatory Signals
The most pernicious effect of AI-friendly executive orders is psychological, not legal. They create a false sense of safety inside institutions. Leadership teams interpret silence from regulators as permission to move faster, cut governance corners and accept vendor assurances at face value.
This is precisely when risk accelerates.
FERPA enforcement does not rely solely on the Department of Education’s proactive audits. It emerges through accreditation reviews, litigation discovery, state-level privacy claims, contractual disputes and whistleblower disclosures. An executive order may quiet one channel while amplifying others.
History is instructive here. Periods of deregulatory enthusiasm are often followed by sharper accountability when failures surface. Institutions that relaxed internal controls during the “green light” phase find themselves exposed later, unable to explain why basic governance safeguards were ignored.
Vendors Do Not Become FERPA-Compliant By Executive Fiat
Another dangerous misconception is that deregulatory signals somehow cleanse vendor behavior. They do not.
FERPA permits vendors to access education records only under the school official exception, which requires direct institutional control, legitimate educational interest and strict limits on redisclosure. These requirements are articulated in regulation, not guidance, and reinforced repeatedly by the U.S. Department of Education’s Student Privacy Policy Office.
An AI vendor that stores prompts, trains models or repurposes interaction data for analytics does not become FERPA-compliant because an executive order celebrates innovation. Institutions remain responsible for vendor conduct. Contracts remain enforceable. Violations remain violations.
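One way compliance teams translate that into practice is a simple intake checklist. The sketch below is illustrative only: the field names and pass/fail logic are assumptions layered on top of the requirements named above, not a legal test. But it makes the point that a vendor's data practices, not an executive order, determine whether the school official exception is even available.

```python
from dataclasses import dataclass

@dataclass
class VendorAIReview:
    """Illustrative checklist for reviewing an AI vendor against the
    school official exception. Fields mirror the requirements named above;
    the logic is a simplification, not legal advice."""
    direct_institutional_control: bool     # institution controls use and maintenance of the data
    legitimate_educational_interest: bool  # access limited to a defined educational purpose
    redisclosure_restricted: bool          # contract bars sharing records with other parties
    trains_models_on_student_data: bool    # interaction data used to train or improve models
    repurposes_data_for_analytics: bool    # data reused beyond the contracted service

    def qualifies_as_school_official(self) -> bool:
        # All three regulatory conditions must hold, and the vendor must not
        # repurpose education records for its own ends.
        return (self.direct_institutional_control
                and self.legitimate_educational_interest
                and self.redisclosure_restricted
                and not self.trains_models_on_student_data
                and not self.repurposes_data_for_analytics)

# Example: a vendor that keeps prompts for model training fails the review,
# whatever the current executive order says about innovation.
review = VendorAIReview(
    direct_institutional_control=True,
    legitimate_educational_interest=True,
    redisclosure_restricted=True,
    trains_models_on_student_data=True,
    repurposes_data_for_analytics=False,
)
assert review.qualifies_as_school_official() is False
```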
Why This Moment Actually Raises the Stakes
Ironically, a deregulatory AI posture increases the importance of internal compliance voices. When leadership feels emboldened to move quickly, the margin for error narrows. When AI systems are deployed at scale without governance, the resulting records multiply silently until they are discovered, often in the middle of a dispute rather than a routine review.
When that happens, the question is not, “What did the executive order say?” The question is, “Who knew, and when?”
In that context, FERPA is not an obstacle to innovation. It is an early warning system. It forces institutions to confront the reality that AI is not ephemeral conversation; it is durable documentation.
The Line That Survives Every Administration
Here is the principle that remains true regardless of who occupies the White House:
If a system creates, stores or analyzes student-identifiable information, FERPA applies.
That sentence holds in audits, in courtrooms, in accreditation reviews and in boardrooms. Executive orders do not dilute it. Political enthusiasm does not override it. Silence from regulators does not negate it.
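For governance teams that want to bake that line into an AI intake process, it reduces to a screening question asked before any other review begins. The sketch below is a hypothetical heuristic, not a legal determination.

```python
def ferpa_review_required(creates: bool, stores: bool, analyzes: bool,
                          student_identifiable: bool) -> bool:
    """The decision rule in one line: any of create/store/analyze, combined
    with student-identifiable information, triggers a FERPA review.
    (An intake-screening heuristic, not a substitute for counsel.)"""
    return student_identifiable and (creates or stores or analyzes)
```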
Universities that understand this will build AI governance that lasts. Those that do not will learn, later, publicly and at far greater cost, that statutes are indifferent to slogans.