In 1970, a Summer Job Could Pay for College. That Equation Broke Somewhere Along the Way.
Here's a number worth sitting with: in 1970, the average annual tuition at a four-year public university in the United States was around $394. In today's dollars, accounting for general inflation, that's roughly $3,000. The actual average tuition at a public university today? Closer to $11,000. And that's before room, board, and fees push the total cost of attendance toward $27,000 a year.
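The inflation math behind that comparison is worth making explicit. Here's a back-of-envelope sketch in Python; the CPI values are approximate annual averages and are my assumption, not figures from the comparison above:

```python
# Back-of-envelope check on the tuition comparison above.
# CPI values are approximate annual averages (assumed, not from the article).
CPI_1970 = 38.8
CPI_2024 = 313.0

tuition_1970 = 394        # average public four-year tuition, 1970 (from the article)
tuition_today = 11_000    # average public four-year tuition today (from the article)

# Express 1970 tuition in today's dollars.
tuition_1970_real = tuition_1970 * (CPI_2024 / CPI_1970)
print(f"1970 tuition in today's dollars: ${tuition_1970_real:,.0f}")  # ~ $3,180

# How much faster tuition grew than prices in general.
real_multiple = tuition_today / tuition_1970_real
print(f"Real increase: {real_multiple:.1f}x")                         # ~ 3.5x
```

However you tune the CPI inputs, the conclusion holds: tuition grew roughly three and a half times faster than prices in general.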
Something broke. And it didn't break overnight.
When College Was a Working-Class Option
The postwar American university was, by historical standards, a remarkably accessible institution. The GI Bill had expanded higher education dramatically after World War II, and the infrastructure built during that boom — state schools, community colleges, land-grant universities — was designed with affordability as a central feature.
Through the 1960s and into the early 1970s, a student who worked a summer job — waiting tables, working construction, staffing a factory floor — could realistically cover a significant portion of their tuition at a state school. Financial aid existed, but many students didn't need much of it. The cost of a degree was calibrated, more or less, to what a young person from an ordinary family could manage.
This wasn't just theoretical. Economists who have studied the period estimate that in 1970, a student working full-time at the federal minimum wage over a summer could earn enough to cover roughly 80% of annual tuition at a public university. Today, the same calculation produces a coverage rate closer to 15%.
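The coverage calculation itself is simple, though the exact percentage swings widely depending on the assumed hours and on whether "cost" means tuition alone or the full cost of attendance. A rough sketch, assuming a 12-week, 40-hour summer at today's federal minimum wage of $7.25:

```python
def summer_coverage(hourly_wage: float, hours: float, annual_cost: float) -> float:
    """Fraction of annual college cost covered by one summer's gross earnings."""
    return (hourly_wage * hours) / annual_cost

# Assumed: a 12-week summer at 40 hours per week.
SUMMER_HOURS = 12 * 40  # 480 hours

# Today's federal minimum wage, against the article's cost figures.
print(f"vs. tuition alone ($11,000): {summer_coverage(7.25, SUMMER_HOURS, 11_000):.0%}")  # ~32%
print(f"vs. full cost ($27,000):     {summer_coverage(7.25, SUMMER_HOURS, 27_000):.0%}")  # ~13%
```

The spread between those two numbers is a reminder that any single coverage figure bakes in assumptions about hours worked and which costs count. What doesn't change with the assumptions is the direction and scale of the drift.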
That gap — between what work could reasonably earn and what college actually costs — is where the American dream of accessible higher education quietly collapsed.
The Funding Shift Nobody Talked About at Graduation
To understand what happened, you have to follow the money — specifically, the state money that used to flow into public universities.
In the early 1970s, state governments funded roughly 75% of the operating costs of public universities. Tuition was essentially a small co-payment on an education that taxpayers were largely subsidizing as a public good. The assumption embedded in that model was that an educated workforce benefited everyone, and so everyone should help pay for it.
That assumption eroded steadily over the following decades. Beginning in the late 1970s and accelerating through the 1980s and 1990s, state legislatures began pulling back from higher education funding — first during recessions, when cuts were framed as temporary, and then more permanently as other budget priorities competed for the same dollars.
By 2020, the average state was covering less than 30% of public university operating costs. The gap had to come from somewhere. It came from tuition.
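The arithmetic of that substitution is stark. A stylized sketch, assuming flat real per-student costs and tuition as the only revenue that fills the hole the state leaves behind, both simplifications of how university budgets actually work:

```python
# Stylized model: tuition picks up whatever share the state stops paying.
# Assumes flat real per-student cost and no other revenue sources --
# a simplification, since real universities also draw on federal money,
# endowments, and auxiliary revenue.
state_share_1970s = 0.75
state_share_2020 = 0.30

tuition_share_then = 1 - state_share_1970s   # 25% of cost
tuition_share_now = 1 - state_share_2020     # 70% of cost

print(f"Tuition's share of cost: {tuition_share_then:.0%} -> {tuition_share_now:.0%}")
print(f"Implied real tuition increase: {tuition_share_now / tuition_share_then:.1f}x")  # 2.8x
```

Even if universities had not spent a dollar more per student, the funding shift alone implies tuition nearly tripling in real terms. Layer actual cost growth on top and you arrive in the neighborhood of the numbers at the top of this piece.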
This shift was so gradual, and so distributed across so many budget cycles in so many state capitals, that it never generated the kind of public outrage a single dramatic price increase might have triggered. The frog didn't notice the water getting hotter. Neither did most families — until their kid got an acceptance letter and a financial aid package that didn't add up.
The Loan Industry Filled the Gap — and Changed the Incentives
The expansion of federal student lending was, in theory, a compassionate response to rising costs. If families couldn't afford tuition out of pocket, the government would help them borrow the difference.
But the availability of loans had a side effect that economists call the Bennett Hypothesis, named for then-Secretary of Education William Bennett, who argued in a 1987 op-ed that federal aid enabled universities to raise prices without losing students. The research on this remains contested, but the pattern it describes is hard to ignore: as loan availability increased, so did tuition, in a cycle that reinforced itself over decades.
Universities, meanwhile, were competing on metrics that cost money. Rankings rewarded research output, campus amenities, and administrative staffing. Between 1976 and 2018, the number of administrators and professional staff at American universities grew by roughly 164%, while the number of faculty grew by 92%. The result was steadily more administrative overhead per student, spending that drove up costs without directly improving classroom instruction.
By the time the average student of the 2010s graduated, they carried around $30,000 in federal loan debt — a figure that would have been incomprehensible to their parents' generation, many of whom had attended the same institutions at a fraction of the cost.
What a Generation Absorbed Without Realizing It
The consequences of this shift are still unfolding. Americans now collectively hold more than $1.7 trillion in student loan debt — a figure larger than total US credit card debt. For many borrowers, the monthly payment on that debt functions as a second rent, compressing their ability to save, buy homes, start businesses, or build the kind of financial stability that a college degree was supposed to guarantee.
The cruelest irony is that the credential itself became more necessary even as it became less affordable. The labor market increasingly demanded degrees for jobs that previous generations had obtained without them. The price went up; the exit became harder to find.
The Shift That Defined a Generation
Comparing 1970 to today isn't just an exercise in nostalgia. It's a way of identifying the specific moment — or rather, the long slow series of moments — when a social contract changed without most people noticing.
America decided, gradually and without much explicit debate, to move the cost of higher education from a shared public investment to an individual financial burden. The effects of that decision are still accumulating, in the loan balances of millions of Americans who did exactly what they were told — went to college, got the degree, and are still paying for it thirty years later.