Until recently I worked at the CSU system office. Like other states, California is wondering about math – who needs it, how much of it, and for what. My own background in the humanities sometimes let me claim an outsider’s objectivity, but most who know me know I happen to like quantitative reasoning.

When you ask which quantitative skills are useful for all college graduates, you get surprising answers that change over time. I find that trajectory helpful to keep in mind as we try to guess what’s next.

Some of our earliest recorded uses of math relate to counting, taxation, and calendars – things that were useful for the emerging technologies of settlement, agriculture, and sharing resources with strangers. These yielded some of our biggest initial breakthroughs, and legacies like dividing the circle into 360 degrees, a figure approximating the number of days in a year.

As astronomy outgrew human eyesight, we needed to measure moving objects with more precision than counting alone could supply. We developed calculus surprisingly soon after the telescope’s wide adoption in the 17th century. That knack for representing rates of change that themselves change over time went on to liberate several centuries of engineering.

If there’s a shift in our own era, then people in many public universities and state systems are trying hard to recognize it. Insistence on calculus as the pinnacle of quantitative reasoning – and on the particular algebraic skills required to ascend it – has recently come under fire. It turns out that such algebra is beyond the reach of many students, or maybe just of their grade schools’ powers of preparation.

Yet these days more and more people need college, and relatively few of them go on to launch rockets. Weeding them all out over calculus feels like shortchanging both the students and the broader society, which regularly tells the state universities it would like us to produce more graduates.

Which has us wondering whether there are other kinds of quantitative reasoning that might do instead.

If there’s a front-runner in these sweepstakes then it’s statistics. Just as the telescope made fluency in calculus useful, the technological breakthroughs of virtually connected databases – big data – suddenly make ordinary people want to understand and work confidently with large pools of numeric information: how within those oceans to recognize patterns, to roil a record set, to surface significance.

This is no longer just a skill for government economists, the stats counterpart to algebra’s rocket scientist: these days we all need it. For most of us these vast record sets are as close as the phones in our pockets, and we’re expected as citizens and employees to respond intelligently to what they tell us.
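To make “surfacing significance” concrete, here is a minimal sketch of the kind of reasoning statistical fluency supports. The scenario and numbers are invented for illustration: given two large samples, a permutation test asks how often a difference this big would appear if chance alone were at work.

```python
import random
import statistics

random.seed(0)

# Hypothetical data: daily sign-ups before and after a site redesign.
# (These figures are simulated, not real measurements.)
before = [random.gauss(100, 10) for _ in range(200)]
after = [random.gauss(104, 10) for _ in range(200)]

observed = statistics.mean(after) - statistics.mean(before)

# Permutation test: if the redesign changed nothing, the labels
# "before"/"after" are arbitrary. Shuffle them many times and count
# how often a difference at least as large as the observed one
# appears purely by chance.
pooled = before + after
n = len(before)
trials = 2000
count = 0
for _ in range(trials):
    random.shuffle(pooled)
    diff = statistics.mean(pooled[n:]) - statistics.mean(pooled[:n])
    if diff >= observed:
        count += 1

p_value = count / trials  # small p_value => unlikely to be chance
```

The point isn’t the code so much as the habit of mind: before acting on a pattern in a large record set, ask how easily chance could have produced it.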

This way of life is becoming a given so quickly that it’s hard to picture today’s college students using the phrase “big data” into mid-career, any more than my generation is likely to say “color TV.” Soon enough, all data will be big.

So then are we already too late? Should higher ed be peering around the corner past statistics, and bracing ourselves for other kinds of quantitative reasoning? At this moment of transition we have an opportunity to embrace diverse kinds of quantitative reasoning, before we simply replace one hegemony with another.

Lately I’ve started to wonder. One of the detours relatively late along the road to calculus is a branch of math called “optimization,” which seems increasingly indispensable in a world with too many people and a finite store of food, water, and carbon sinks.

We won’t all need this the way we all need statistical fluency, but it is growing, and spilling out of work and into citizenship, a sure sign that it’s positioned for a GE requirement. Your first exposure to it, and mine, if we live another decade or two, is likely to come with the purchase of a drone or self-driving car.

For a couple of centuries this branch of mathematics has been fiddling with the Traveling Salesman Problem, which seeks the shortest round-trip route through a given set of destinations. It is surprisingly hard to solve, and above a threshold number of cities becomes computationally intractable: an exact answer exists, but no known method can find it in any reasonable time. (For a lucid account see the 2013 story in Wired.) This is relevant to more than Fuller Brush salesmen: shortest-route calculations could improve the design of computer chips, for example, or of chemically synthesized DNA.
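A small sketch shows why the problem gets hard so fast. The five city coordinates below are made up for illustration; the exact method tries every ordering (the number of orderings grows factorially), while a common shortcut, the nearest-neighbor heuristic, is fast but not guaranteed to find the best route.

```python
import math
from itertools import permutations

# Hypothetical 5-city instance; coordinates invented for illustration.
cities = [(0, 0), (1, 5), (5, 2), (6, 6), (8, 3)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def tour_length(order):
    """Total length of the round trip visiting cities in `order`."""
    return sum(dist(cities[order[i]], cities[order[(i + 1) % len(order)]])
               for i in range(len(order)))

def brute_force():
    """Exact answer: try every ordering from a fixed start city.
    The number of orderings is (n-1)!, which is why this approach
    is hopeless beyond a few dozen cities."""
    start = 0
    best = min(permutations(range(1, len(cities))),
               key=lambda rest: tour_length((start,) + rest))
    return (start,) + best, tour_length((start,) + best)

def nearest_neighbor():
    """Heuristic: always drive to the closest unvisited city.
    Fast, but can miss the optimal tour."""
    unvisited = set(range(1, len(cities)))
    tour = [0]
    while unvisited:
        nxt = min(unvisited, key=lambda c: dist(cities[tour[-1]], cities[c]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour, tour_length(tour)

exact_tour, exact_len = brute_force()
greedy_tour, greedy_len = nearest_neighbor()
# The heuristic can never beat the exact optimum.
assert exact_len <= greedy_len + 1e-9
```

At five cities the brute-force search checks 24 orderings; at 20 cities it would face more than 10^17, which is the intractability the Wired story describes.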

Meanwhile, in an unrelated development, we seem to be crossing an exciting milestone in the reduction of pilot and driver error, which is the reduction of pilots and drivers. Self-flying drones and self-driving cars have raised the prospect – with Detroit automakers at least – of a new kind of vehicle ownership, moving off the one-driver-one-car paradigm and getting to something closer to sharing and swapping, driverless cars going empty down a stretch of road, summoned by the next temporary user.

Think about that for a moment, all those GPS-enabled devices rolled up onto serverfuls of big data, mapped to an infinite combination of nodes on a round trip that never ends, not just calculating that elusive optimization problem but living it. It’s not hard to think of a machine-learning solution to a problem unassisted humans have called unsolvable.

Whether it comes to pass or not, that kind of discipline-crossing quantitative reasoning, dipping into just enough algebraic reasoning, arithmetic, and sheer number sense to support other kinds of math, seems worth building into college for everyone.

Image source: “Greek Astronomy” at ibiblio.org