oh, the humanities – part three


In the first of this set we looked at the humanities as a corner of the curriculum that makes meaning, and argued that it’s in for hard times – not because it attracts few majors, which is a constant worry, but for new reasons.

The second post looked at what some of those are: developments like machine learning and tools to offload our cognitive, biological, and physical processes, undermining the notion of Essentially Human.

Our interface with the world – the frontier between what’s human and what isn’t – has become strikingly porous. What’s left after the clever applications of prosthetics, 3-D printed organs, and machine learning? In effect, what remains is deciding what to do with all of it. But recent developments in neuroscience call free will itself into question, making that an unreliable peg on which to hang our existential hat.

At that point, the circular reasoning that has sustained us since the Renaissance – that humans matter because we’re human – will have run out of gas. And I’m not talking about circa George Jetson; this day is essentially here.

As I cast about for a new and improved raison d’être, I find a cause for optimism in a recent book on macroeconomics, and a day a couple of months ago when I went to prison. I’ll start with the book.

Last Christmas I got Robert Gordon’s The Rise and Fall of American Growth, which elaborates on his argument that the recent stagnation in real wage growth is here to stay. He says this is because that spurt in our productivity and standard of living was the real anomaly, an unusual period from 1870 to 1970 when some one-time inventions like the phone, electricity, and the internal combustion engine all converged to give us a boost. As he says, municipal water and sewage is something you invent only once.

It’s an enormous book but very good, and if you’re reading me this far then you may also like it; Gordon is a fellow fan of the long view. (I took his macroeconomics course as a sophomore at Northwestern, and liked him then too.) Yet his premise is valuable mostly for orienting a fascinating and otherwise unwieldy account of recent history. As a main argument it’s unconvincing in two directions.

First, before the time of his book were some other one-off inventions, like the loom, the steam engine, the telegraph, and for that matter fire and the wheel. They probably had the same long-term and irreversible impacts, whether or not they register in traditional GDP. And coming out the other end, the time from 1970 to now, it seems any tapering of growth will last only until the next game changer.

If we can’t see it yet, we should admit that such is the nature of every paradigm shift. That’s why they shift paradigms. For example, an observer in 1815 could think sure, the steam engine might make communication faster than you get from riding a horse, but that speed was probably approaching its limit – not anticipating that the telegraph was about to take that limit up to the speed of light.

In reading Gordon’s book, I thought of our possible next paradigm shifts in a couple of ways. One, the same profound disruption wrought by the national electric grid could lie ahead of us with networked computing and artificial intelligence, especially as thinking and culture cross national boundaries. (See Benjamin Bratton’s The Stack for a mind-boggling account of how IT could usher in the kind of post-state world that Jürgen Habermas has imagined for decades.)

Also in this tech-frontier category: the imminent likelihood we’ll find life on other planets, or inside moons of our own solar system, could give us a tech-driven growth spurt from flavors of biomimicry we haven’t yet imagined.

And then there’s the whole other category of untapped growth, equity. An awful lot of the world’s seven-and-a-half billion people have yet to benefit much from the inventions of 1870 to 1970. If the national economic engine isn’t revving like it did, that’s hardly because we’ve saturated global demand.


While fretting about what the humanities will do in a post-human-centric world, I thought of both of my reservations about Gordon’s book. On the technological paradigm side, getting to the summit of human cognition and free will – and maybe surrendering our presumptive monopoly on both – may just bring the next hill into sight. We’ve seen this before, for example in theoretical physics.

In other words, it’s possible that unraveling the human capacities we understand so far will reveal other mysteries we didn’t know were there.


And then we get to the equity question, and the graduation ceremony I attended at the California Institution for Women. The CIW is a full-on prison, and going for a visit means breaching multiple heavy-duty doors and razor wire, and forfeiting a civil liberty or two.

Once inside, you meet people who’ve studied and found salvation in, of all things, the humanities.


This is not a minimum-security Club Fed for embezzlers. These are some violent people doing, as one told me, “serious time.” Some of them are famous. Some, well into 30-year stretches, enrolled in a humanities graduate program offered by CSU Dominguez Hills, my employer, as a traditional postal correspondence course – important in a setting without internet access, and the last program of its kind.

Anitra Clark and Erica Hitchcock, Inmate Peer Educators, California Institution for Women

It made sense to me that a particularly popular genre is magic realism, but there are plenty of others. I have a stack of written testimonials from our graduates, describing not the escapism but the dignity that attends study in the humanities.


And of course, beyond driving distance from my campus are the billions as yet untouched by Gordon’s catalog of miracles from the 19th and 20th centuries. For them – the vast majority of present-day humans, who live in developing and not-so-developing corners of the world – questions of machine-aided cognition, prosthetics and 3D printing, and dubious free will are mostly moot. We may sweat such things in the ivory towers, but they’re just less pressing down in the dungeons, both at home and abroad.

By my count that makes Cause for Optimism #1: in the relatively short term of decades, the humanities will remain vital for the vast majority of human beings who don’t have it all. Over that time period, the handful who do can work on how else to answer Why.

And so help me, even for that rarefied group, the tip of the epistemological spear, I think the disciplines we’ve been grouping into the humanities are in for some of their best days yet. We will replace the circular reasoning at the heart of today’s boosterism with a much sounder line of reasoning.

I’ll call that Cause for Optimism #2, and save it for the fourth and last piece of the set, posted later.


Image credits: airships.net, Kenny Orthopedics, bankingtech.com, CDCR Today, maeryan.com, Just Detention International, The Girl Who Navigates the World in a Dream of Her Own Making by Paul Bond, BBC World Service

news from NACE

Last week the National Association of Colleges and Employers met in Las Vegas. Most of the national educational conferences I attend are pretty rarefied; they typically draw a few hundred people. This one had thousands, from all over the world. And, unlike the state-level workforce development events I used to attend, this one wasn’t all hype and hope. It was graced by actual employers, many of whom sponsored the event.


I was tagging along with the director of my university’s career center and a couple of her staff to hear firsthand what employers would like from us, unmediated by surveys or the higher ed press. Some takeaways while they’re fresh:

Not everyone recruits from campuses. The companies at this conference aren’t a cross section of the economy. There are few non-profits in NACE, even fewer from the public sector, and not a single mom-and-pop. The private businesses that trawl for freshly minted college grads are the ones big enough to have managers of “university relations,” and extensive in-house offices of onboarding and orientation. They see themselves – accurately – as our fellow educators. And since tenure in an entry-level job is typically under two years, their business looks a whole lot like ours. People come in, they get some personal development, they leave. Each year’s round of new hires is called a “cohort.” For the companies that choose to engage in it, this very early career guidance is almost a contribution to the public good.

There is really something in it for them. Almost a contribution to the public good. The large and well-heeled do this out of enlightened self-interest. A vivid explanation came from insurance monolith AIG, whose presenters – both in talent acquisition and early development, one from the New York office and the other from Chicago – spoke fluently of learning outcomes, engaging pedagogy, online portfolios, and learning management systems. I about fell out of the folding chair.

They went on to say how it pays off: the people they hire fresh from college are versatile, impressionable, and forming lifelong networks and habits. After their crash course in the insurance sector and AIG culture, they will move on to take jobs with AIG’s customers, clients, vendors, and partners, and an early and positive experience with AIG could pay off for decades.

No one wants to look at an ePortfolio. I went to NACE hoping to find a warm welcome for best practices in higher ed: experiential learning, ePortfolios with meta-cognitive accounts from students of their college experiences, supported with artifacts that demonstrate developing proficiency over the undergraduate years. I mean, we’re all dissatisfied with the transcript and resume, right? What I saw was that no one, I mean no one, wants to even glance at these. The only thing employers don’t like about the one-page resume is that it’s too long.

But what ePortfolios develop does matter. I skipped lunch for an intense conversation with the recruiter from a large and noticeably successful investment firm. (I didn’t get hired.) Like many in the financial services sector, they try to hire from the interns they host and get to know. So for our students that first cut is the hardest: there are thousands of applicants for only 40 internship slots, understood as the ticket to a job. Grades matter, but only to a point: once you’re above a 3.2, they don’t rank you on how far above. Instead they turn to the other things they care about, gleaned from your resume and – if you’re lucky – an interview.

But what an interview. The VP described a breathtaking zeal for quality control. The interviewers get daylong training retreats. Each one learns to focus on one or two of the dozen soft skills that her firm wants – results orientation, intellectual rigor, professionalism, communication skills. Finance doesn’t come up. Then all 120 or so interviewees go through six 30-minute interviews apiece, where pairs of interviewers ask them “not what you know, but how you think.” Applicants are expected to speak confidently about their past experiences, what those experiences taught them, how they’d behave differently next time, and how they know what they’re good at.

In other words, they’re effectively narrating what they’d discover about themselves by creating an ePortfolio. Except that the VP had never heard of ePortfolios, and when I gushed about them she was visibly unimpressed. She didn’t object, but their value is just utterly outside her sphere. It was like asking her to comment on an intern’s early childhood nutrition.

Soft skills are ascendant. I’m used to conferences where the sponsors and exhibitors sell ePortfolios, and software for tracking and reporting student learning outcomes, typically for accreditors and other oversight entities. Here it was also software, but outside of curriculum. There was a lot of CRM-style contacts management, alumni networking tools, job fair event management, and office workflow.

Two companies caught my attention, both peddling psychometric platforms to tell you whether your next hire would fit your company culture. The list of virtues was similar at both: problem-solving ability, performance under pressure, work with diverse teams, creativity. Both had been around for decades, but each reported – in separate conversations – that business had boomed in just the past couple of years. Their market seemed to be the companies that want the same things their large, NACE-member competitors provide, but have less in-house capacity or zeal.

I asked both reps whether they’d noticed that certain higher ed institutions or practices do a better job of developing these skills. I was hoping to hear support for undergraduate research, service learning, or team-based or project-based learning. From one I got a blank look; his company had been in this business since the 1980s and the question had never come up. He even seemed to doubt you could intentionally cultivate any of this – that instead some people were just naturally creative, or easy to get along with.

From the second, Maure Baker of Performance Assessment Network, I got better-informed responses, maybe because he’d worked on college campuses before leaving to start up AmIJobReady.com for PAN. In his opinion there were patterns in the student experiences that develop these outcomes, but they didn’t relate to particular campuses or practices. Instead, he believes the key is institutional integration: when faculty, career services, curriculum, and student affairs are all collaborating, the students come out with better soft skills. As near as I can tell, his experience matches the best working hypothesis in my field.

There is opportunity here. Sometimes the language barrier was unsettling, like I’d overflown Vegas and landed in Turkmenistan. But more often I felt like a railroad baron looking across an open desert, where the biggest challenge is suppressing an unseemly giggle.

I would like that psychometric data, the decades of soft-skill assessments homegrown and tested by cycles of employers and graduates, so far apparently disconnected from academic affairs. PAN sorts them by economic sector, showing for example how nurses need more resilience under pressure than client service reps.

It would be cool to show those pie charts to incoming freshmen with declared majors, and run them through the same assessment. We could then hand them a personalized report on where their gaps are, writing out a kind of higher ed medical prescription for their next four years, composed of courses as well as co-curriculars, and developed with faculty in the relevant majors. Then they’d go off and do it, keeping track of what they learn about the world and themselves along the way in – yes – an ePortfolio, not because anyone but their families may ever see it, but so that they’re ready to explain it all in a 90-second elevator pitch, or a 30-minute high-octane interview.
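If you’ll indulge a sketch: here is roughly what that gap report could look like in code. Everything in it is a placeholder I made up – the skill names, the sector benchmarks, and the 1-to-5 scale – standing in for whatever PAN’s actual instruments measure.

```python
# A minimal sketch of the "gap report" idea. The skill names, sector
# benchmarks, and 1-to-5 scoring scale are hypothetical, not PAN's data.

SECTOR_BENCHMARKS = {
    "nursing": {"resilience": 4.5, "communication": 4.0, "teamwork": 3.5},
    "client services": {"resilience": 3.0, "communication": 4.5, "teamwork": 4.0},
}

def gap_report(student_scores, sector):
    """Compare a student's assessment scores to a sector benchmark and
    return the skills where the student falls short, largest gap first."""
    benchmark = SECTOR_BENCHMARKS[sector]
    gaps = {
        skill: round(target - student_scores.get(skill, 0), 1)
        for skill, target in benchmark.items()
        if student_scores.get(skill, 0) < target
    }
    return sorted(gaps.items(), key=lambda item: item[1], reverse=True)

# A freshman declaring nursing might get back something like
# [('resilience', 1.5), ('teamwork', 0.5)] -- the raw material for a
# four-year "prescription" of courses and co-curriculars.
print(gap_report({"resilience": 3.0, "communication": 4.2, "teamwork": 3.0}, "nursing"))
```

The point isn’t the code, which any campus IT shop could write in an afternoon; it’s that the benchmarks on one side and the student record on the other currently live in different worlds.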

I know we already do some badging for these skills, and many summer orientation programs – including those at my university – include some kind of psychometric assessment and introduction to career services at entry. But we have far to go, and the two sides of higher ed seem to have been developing these from different directions.

All of this could be wrong. As first impressions these are naive, and suspect. Also, I have to beware sampling error. For example, in the public sector – not represented at last week’s conference – we have to detail and defend every hire against charges of cronyism, and for some the backup of an ePortfolio may be welcome. (The internship coordinator at my college of business and public affairs believes a large employer in Sacramento may be one.) And for small businesses that can’t afford NACE – the majority of U.S. employers – free online evidence of student attainment would be a lot more feasible than that battery of interviews.

And we know from talking to our alumni that some employers do look at ePortfolios – maybe not to replace the resume, but to supplement the interview.

But it’s worth learning more about this area, and trying to connect it to emerging best practices on the postsecondary side.

I feel like I spent a couple of days in earshot of Promontory, Utah, but haven’t yet heard the golden spike.


oh, the humanities – part two


The earlier post in this set gave a typical defense of the humanities – disciplines like religion, philosophy, languages, and literature – which is that we all benefit from understanding ourselves as part of a bigger purpose. That intellectual birthright, systematically developed in college, helps orient our work and lives afterward. So, good job security for the soul searchers and poets, right?

In recent centuries, at least in the west, the searchers and poets have clumped around varieties of humanism – an ethic different from the set of disciplines called “the humanities,” but arising from them. Humanism takes the human condition as a self-contained, self-evident good, man as the measure of all things.

For some of us – maybe the majority on American campuses – it’s “secular humanism,” agnostic if not godless. We readily admit to a universe that includes the supernatural, imperceptible, and mysterious – whether called branes, dark matter, or Quetzalcoatl. But we don’t take daily cues from it.

These days even formal religions seem influenced by humanism, as we see falling from favor those that discount or threaten human well-being, for example by preaching intolerance, mutilation, or virgin sacrifice. They just don’t draw crowds like they used to.

So, humanism for all, and the answer to the eternally nagging Why is apparently some version of “because it’s us.”

That alone fuels a lot of the human enterprise. We work hard, cure diseases, and write apps for smart phones not just to get ahead personally, but also to add to the overall stock of human happiness. The more you contribute to the common good instead of the personal one, the more virtuous you feel, but really it’s all just different versions of petting ourselves. It’s been a surprisingly durable way of avoiding the question Why, at least up to now.

If that’s about to change – and I think it is – then college curriculum in the humanities should brace itself. But for what? The acceleration of change on a few fronts has made it harder for colleges and universities to guess what’s around the next hedge.

Those fronts:

1. Machine learning is now mostly inductive, a lot like human learning. As they catch on, our gadgets are taking over not just factory work but also driving, diagnosing disease, and even making art. Already our computers can translate, and our phones can see. Whatever our colleges teach people to do next, we should take care that it’s work people will still be the ones doing.

2. To higher education this raises a not insignificant question: what is that remaining human work? In other words, at the maturity of our current AI growth spurt, what will remain as the competitive advantage of Homo sapiens, and how do we organize our curriculum to cultivate it, so our graduates can be employed? The answer, increasingly, may be volition.

Machines can find, solve, and invent many more things than they used to, but we humans still have a corner on wanting to.

So does it come down to that? Will college learning be mostly about purpose and meaning, about why we should want some things more than others? About the nature of the good life, of telling right from wrong?

I would welcome that, but also enjoy the irony that the build-out of our STEM-infatuated, high-tech world could usher in a golden age for the arts and humanities.

Except that:

3. The idea of free will itself is under new fire, beautifully summarized in a recent Atlantic article. It was never on solid footing, empirically speaking: we feel like we act for ourselves, but that’s been mighty hard to prove. Now we’re seeing that the order of deciding to act and acting may be reversed; that is, the brain appears to set an action in motion a moment or two before we’re aware of it, and in that interval we construct the intention, fooling ourselves into thinking our wishes matter.

In the context of machine learning, this may be an even bigger deal than we think. It seems that if a day comes when we have to concede we lack free will, then on that day we will really have run out of uniquely human capacities.

And that means we will also have to face up to the circularity of secular humanism and our longstanding measure of meaning, ourselves. It may not be enough anymore to say art, or health, or technology are valuable because they advance the human condition, because that condition will no longer be exclusively human.

On that day, maybe coming up fast, we’ll have reached the end of the street down which we kick the can that asks Why.


What does a forward-looking university do then? I am not sure, but hope I’m not sitting on the platform behind the commencement speaker who has to explain that intellectual bequest to the next generation. “It’s been a pretty good run but we kind of ran out of steam toward the end there.”

I was ruminating on this, rummaging around for a ready reason to go to work each day when the present one sputters out, when a couple of experiences gave me hope that the humanities may yet have some good continuing uses, even after they’re understood as not uniquely human. One was a macroeconomics book I read this Christmas, and the other was a day last month that I spent in prison.

But more on those later.

Image credits: airships.net, Tara in Poland, International Business Seminars

oh, the humanities – part one


Before getting into academic administration I taught film and wrote screenplays. I’ve always liked movies, but not the same way as other film people. I don’t enjoy being on sets, for instance, or have a strong opinion about different movie stars, or whether we shoot on film or video. What draws me is the stories, which can feel like novels brought to life. (Some people who haven’t met her feel a personal connection to Emma Stone; I feel closer to Tess of the d’Urbervilles.)

Colleges and universities have trouble categorizing film departments. Usually we end up in the visual and performing arts, next to painting, dance, photography, and theater. I like the company but never felt a part of it; I get tired at night and I don’t smoke.

Instead I’ve identified with the literature and philosophy people, whose raw material feeds Hollywood. It was a little isolating, a storyteller exiled to live with the artists, like I should have had my own bathroom.

In college administration for an entire campus, the difference is less stark: arts and humanities are typically grouped together, film courses can land with either group, and the two feel equally dissed by the public’s obsession with STEM, degree production, and gainful employment. We are united in beleagueredness; you can spot us by our short shrifts.

This post will add to the handwringing about the humanities, but in a different way.

Frankly I’m not as worried as others about the prospects for our departments of literature, languages, philosophy, and religion. We have all been stepchildren at least since the Athenian system office poured Socrates a glass of hemlock, and yet we’re all still here. Apparently there’s something inherently necessary about making meaning.

No matter their majors and eventual professions, our students need and want to know how to string together their experiences into something significant. They expect college to help them read purpose into their lives. At denominational institutions they learn one way to do that; at secular comprehensives like mine, they can learn them all.

That urge to assign significance marks my humanist colleagues at committee meetings. In my experience this is truest when they try to talk to social scientists who – burdened with insecurities of their own – resist embellishment of any kind. People in psychology, public policy, and sociology don’t want to make meaning so much as discover it. Crossing that line feels to them like fudging the findings.

Consider this sample paragraph from a text on a subject dear to my heart, student success:

Five years ago the California State University began requiring incoming students to take summer classes before the freshman year, whenever their test scores indicated they were short of college-level proficiency in English, math, or both. As the new policy has covered a greater share of the students who are eligible, the CSU has seen a dramatic reduction in rates of fall remediation. Other factors are also at play; for example, California high schools now encourage more of their students to take rigorous college prep courses. Still, these results suggest the policy is working.

If you take out the last sentence, then to humanists the argument feels incomplete.

But if you leave it in, social scientists will worry that you’ve said more than the evidence supports.

For much of this decade I’ve been a humanist among social scientists, who work in higher education research, learning science, and policy. Their writing often looks to me like all the last sentences are missing. I’ve gotten past longing for my own bathroom door, but I sometimes want a different color in Track Changes.

Yet in the long run these distinctions are a comfort. The urge to assign significance, to answer a why with a because, gives my tribe its staying power. Few may have the nerve to declare a major in philosophy, but we all need a dose.


So then where is the new threat?

Well, in my opinion, for much of our history when we get to the rest of the “because,” we have been kind of cheating.

If you back up to the scale of millennia, then you can see the answer to “why” evolving in a clear direction. As far as we can tell, the earliest humans thought deeply about their purpose and the meaning of their lives. Even before we took our current physical form as Cro-Magnons we were acknowledging our dead in burial rituals, signaling an awareness of our own mortality, and an urge to defy it.


Most embryonic cultures, including a handful that persist to this day, have venerated the dead and especially their ancestors. Moving forward in time you see the addition of supernatural beings, whether one or many, and myths of origin and destiny. These conceptions of a broader context orient our lives while we’re on earth, setting the tone for business deals, codes of law, and good manners, for example.

In Europe you can mark the apex of this approach around the 13th century, with Aquinas on the eve of the Western Schism and other fissures that challenged the assumption of One True anything beyond our immediate perception – challenges that included contact with cultures elsewhere, who read the invisible universe very differently, yet thrived.

From that point on – roughly the Italian Renaissance and the beginning of secular humanism in its present form – the argument gets strangely circular. Why are humans worth helping? Because they’re human. Why do we work hard? To promote human happiness. Why is human happiness valuable? Uh, it just is, and we take these truths to be self-evident. ‘Nother words, don’t hold your breath waiting for proof.

At a time of disruption and upheaval, at the trading crossroads of dozens of civilizations and three continents, the Italian humanists were relieved to rediscover the ancient Greek resort to the one indisputable universal: Man is the Measure of All Things.

With a tweak since then for gender equality, the slogan has served us surprisingly well for the last six or seven centuries.

But I think its time is running out, probably within our lifetimes. And for the life of me I’m not seeing a ready replacement.

I worry.

But more on that later.

Image credits: airships.net, National Geographic, fourthdoor.org

the next flavor of quantitative reasoning?

Until recently I worked at the CSU system office. Like other states, California is wondering about math – who needs it, how much of it, and for what. My own background in the humanities sometimes let me claim an outsider’s objectivity, but most who know me know I happen to like quantitative reasoning.

When you ask which quantitative skills are useful for all college graduates, you get strange answers that change over time. I find the resulting trajectory helpful to keep in mind as we try to guess what’s next.


Some of our earliest recorded uses of math relate to counting, taxation, and calendars – things that were useful for the emerging technologies of settlement, agriculture, and sharing resources with strangers. These yielded some of our biggest initial breakthroughs, and legacies like separating circles into a number of degrees approximating the number of days in a year.

As astronomy outgrew human eyesight, we measured moving objects with more precision than counting alone could capture. We developed calculus surprisingly soon after the telescope’s wide adoption in the 17th century. That knack for representing rates of change that themselves change over time went on to liberate several centuries of engineering.
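To see what I mean by rates of change that themselves change, here’s a toy numerical sketch – my own example, nothing from the historical record – where gravity changes an object’s velocity, and velocity changes its height:

```python
# A falling object: height changes with velocity, and velocity itself
# changes with acceleration -- a rate of change that changes. Stepping
# through time like this is a crude numerical stand-in for calculus.
g = 9.8          # gravitational acceleration, m/s^2 (assumed constant)
dt = 0.001       # time step, seconds
height = 100.0   # meters
velocity = 0.0   # m/s
t = 0.0

while height > 0:
    velocity += g * dt       # acceleration updates the velocity
    height -= velocity * dt  # velocity updates the height
    t += dt

# Calculus gives the closed-form answer, sqrt(2 * 100 / 9.8), about 4.5 s.
print(f"hits the ground after about {t:.2f} seconds")
```

The loop brute-forces what a derivative expresses in one line, which is roughly the trade the 17th century made in the other direction.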

If there’s a shift in our own era, then people in many public universities and state systems are trying hard to recognize it. Insistence on calculus as the pinnacle of quantitative reasoning – and the particular algebraic skills required to ascend it – has come under recent fire. It turns out that such algebra is beyond the reach of many students, or maybe just of their grade schools’ powers of preparation.

Yet these days more and more people need college, relatively few of whom go on to launch rockets. Weeding them all out over calculus feels like shortchanging both the students and the broader society, which regularly tells the state universities it would like us to produce more graduates.

Which has us wondering whether there are other kinds of quantitative reasoning that might do instead.

If there’s a front-runner in these sweepstakes then it’s statistics. Just as the telescope made fluency in calculus useful, the technological breakthroughs of virtually connected databases – big data – suddenly make ordinary people want to understand and work confidently with large pools of numeric information: how within those oceans to recognize patterns, to roil a record set, to surface significance.

This is no longer just a skill for government economists, the stats counterpart to algebra’s rocket scientist: these days we all need it. For most of us these vast record sets are as close as the phones in our pockets, and we’re expected as citizens and employees to respond intelligently to what they tell us.
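The kind of fluency I have in mind fits in a few lines. In this sketch the commute times are randomly generated stand-ins for whatever record set your phone actually keeps:

```python
# Everyday statistical fluency: summarize a record set and ask whether an
# apparent pattern is real. The data here is synthetic, for illustration.
import random
import statistics

random.seed(42)
# Pretend record set: a thousand daily commute times, in minutes.
commutes = [random.gauss(mu=34, sigma=6) for _ in range(1000)]

mean = statistics.mean(commutes)
stdev = statistics.stdev(commutes)
print(f"mean {mean:.1f} min, standard deviation {stdev:.1f} min")

# Is today's 52-minute commute unusual, or just noise? A z-score is the
# back-of-envelope judgment citizens and employees now make daily.
z = (52 - mean) / stdev
print(f"z-score {z:.1f}: {'unusual' if abs(z) > 2 else 'ordinary'}")
```

Nothing here is beyond a general education course; the shift is that everyone now carries data worth asking these questions of.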

This way of life is becoming a given so quickly that it’s hard to picture today’s college students using the phrase “big data” into mid-career, any more than my generation is likely to say “color TV.” It all is.

So then are we already too late? Should higher ed be peering around the corner past statistics, and bracing ourselves for other kinds of quantitative reasoning? At this moment of transition we have an opportunity to embrace diverse kinds of quantitative reasoning, before we simply replace one hegemony with another.

Lately I’ve started to wonder. One of the detours relatively late along the road to calculus is a branch of math called “optimization,” which seems increasingly indispensable in a world with too many people and a finite store of food, water, and carbon sinks.

We won’t all need this the way we all need statistical fluency, but it is growing, and spilling out of work and into citizenship, a sure sign that it’s positioned for a GE requirement. Your first exposure to it and mine, if we live another decade or two, is likely to come with your first purchase of a drone or self-driving car.

For a couple of centuries this branch of mathematics has been fiddling with the Traveling Salesman Problem, which seeks the shortest round-trip route through a number of destinations. It is surprisingly hard to solve, and above a threshold number of cities becomes computationally intractable to solve exactly. (For a lucid account see the 2013 story in Wired.) This is relevant to more than Fuller Brush salesmen: shortest-route calculations could improve the design of computer chips, for example, or of chemically synthesized DNA.
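A brute-force sketch makes the difficulty concrete. The coordinates below are made up, and the approach is the naive one – try every route – which is exactly what stops working: with n cities there are (n−1)! distinct round trips.

```python
# Brute-force Traveling Salesman: try every round trip and keep the
# shortest. Fixing the starting city still leaves (n-1)! routes -- the
# factorial blowup that makes exact solutions intractable at scale.
from itertools import permutations
from math import dist

cities = {"A": (0, 0), "B": (3, 4), "C": (7, 1), "D": (2, 6)}  # made-up points

def route_length(route):
    """Total distance of a round trip visiting each city once."""
    stops = list(route) + [route[0]]  # come back to the starting city
    return sum(dist(cities[a], cities[b]) for a, b in zip(stops, stops[1:]))

start = next(iter(cities))
best = min(((start,) + rest
            for rest in permutations([c for c in cities if c != start])),
           key=route_length)
print(best, round(route_length(best), 2))
```

Four cities means six routes; forty cities means more routes than atoms you could ever enumerate, which is why the field lives on approximations.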

Meanwhile, in an unrelated development, we seem to be crossing an exciting milestone in the reduction of pilot and driver error, which is the reduction of pilots and drivers. Self-flying drones and self-driving cars have raised the prospect – with Detroit automakers at least – of a new kind of vehicle ownership, moving off the one-driver-one-car paradigm and getting to something closer to sharing and swapping, driverless cars going empty down a stretch of road, summoned by the next temporary user.

Think about that for a moment, all those GPS-enabled devices rolled up onto serverfuls of big data, mapped to an infinite combination of nodes on a round trip that never ends, not just calculating that elusive optimization problem but living it. It’s not hard to think of a machine-learning solution to a problem unassisted humans have called unsolvable.
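I have no idea what the real routing engines will look like, but for flavor, here is the simplest trick in the optimization toolbox – a greedy nearest-neighbor route – applied to some hypothetical pickup points:

```python
# When exact optimization is out of reach, fleets can fall back on
# heuristics. Nearest-neighbor is the simplest: always drive to the
# closest unserved pickup. A stand-in sketch, not any real routing system.
from math import dist

def nearest_neighbor_route(depot, pickups):
    """Greedy route: from the current location, go to the closest
    remaining pickup until all are served, then return to the depot."""
    route, here, remaining = [depot], depot, list(pickups)
    while remaining:
        nxt = min(remaining, key=lambda p: dist(here, p))
        remaining.remove(nxt)
        route.append(nxt)
        here = nxt
    route.append(depot)  # the round trip that never quite ends
    return route

print(nearest_neighbor_route((0, 0), [(5, 5), (1, 2), (6, 1)]))
```

Greedy routes like this are rarely optimal, but they’re computable in real time; the interesting question is how much closer to optimal a learning system gets by finding patterns in the demand itself.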

Whether it comes to pass or not, that kind of discipline-crossing quantitative reasoning, dipping into just enough algebraic reasoning, arithmetic, and sheer number sense to support other kinds of math, seems worth building into college for everyone.

Image source:  “Greek Astronomy” at ibiblio.org


Alexandria

In antiquity Alexandria was second only to Rome. The North African port was home to a famous library and one of the seven wonders of the world, the lighthouse Pharos. Both are long gone, but you can see some remains of the physical library.

Visible remnants of the library at Alexandria, Egypt.

That was about all I thought of it until reading Stephen Greenblatt’s Pulitzer Prize-winning The Swerve, which recounts the discovery in 1417 of a poem by Lucretius believed lost. This can sound like dry stuff, but the story is vividly, almost luridly told, with Greenblatt arguing that this is a moment to which the modern world can trace its origin.


His passage on Alexandria fits into a larger discussion of how all that ancient learning got lost in the first place – fire and intolerance directed at the books themselves, but also sheer time, random periods of social unrest, excessive scrolling and unscrolling, and bookworms.

In the center of the city, at a lavish site known as the Museum, most of the intellectual inheritance of Greek, Latin, Babylonian, Egyptian, and Jewish cultures had been assembled at enormous cost and carefully archived for research. Starting as early as 300 BCE, the Ptolemaic kings who ruled Alexandria had the inspired idea of luring leading scholars, scientists, and poets to their city by offering them life appointments at the Museum, with handsome salaries, tax exemptions, free food and lodging, and the almost limitless resources of the library.

Maybe it’s just because I work in one, but this sounds to me a lot like a university, down to the institution of tenure – but a good dozen centuries before the medieval European institutions we usually cite as our beginnings.

The library at Alexandria as it may have looked.

And these weren’t merely repositories of knowledge: like our own, they were also expected to generate it:

The recipients of this largesse established remarkably high intellectual standards. Euclid developed his geometry in Alexandria; Archimedes discovered pi and laid the foundation for calculus; Eratosthenes posited that the Earth was round and calculated its circumference to within 1 percent; Galen revolutionized medicine. Alexandrian astronomers postulated a heliocentric universe; geometers deduced that the length of a year was 365 1/4 days and proposed adding a “leap day” every fourth year . . .

. . . The Alexandrian library was not associated with a particular doctrine or philosophical school; its scope was the entire range of intellectual inquiry. It represented a global cosmopolitanism, a determination to assemble the accumulated knowledge of the whole world and to perfect and add to this knowledge.

Who knew? Probably many who read this blog, but I found it a surprising and reassuring sign of something old and essentially human.

However bleak things get, or overrun with fire, unrest, and digital bookworms, we apparently feel driven to systematically and cooperatively keep track of what we know, and add to it.

Image credits: pegnsean.net, thelivingmoon.com

news from Bowling Green, KY


Western Kentucky University has a lot in common with the California State Universities that have employed me for around ten years. It’s an access-oriented, regional comprehensive university, it’s proud of its continuing academic quality in the face of unpredictable challenges, and it would like to improve its graduation rates.

To that end, the university leaders are looking at educational practices that engage their students personally in their learning, making them less likely to drop out.

On Friday I paid WKU a visit to learn more, and share what we’re doing in California. Our discussions focused on high-impact practices, and making them work for a greater share of WKU students by identifying a handful that can be offered consistently, equitably, and campus-wide.

For example, like some CSU campuses, WKU may decide to focus on service learning, undergraduate research, and internships in particular. Those few would then be systematically offered, coded into student records, and regularly assessed for impact.

The links in this sentence will take you to my slides from the morning presentation and the afternoon workshop.

For my part, these are points I want to remember from Friday’s meetings:

  1. Everyone is an educator. Although faculty are authors of WKU’s educational programs, I was struck that our meetings were attended in equal parts by advisers, staff, student leaders, administrators – pretty much everyone who interacts with students. I think one value of high-impact practices is that they take advantage of all the ways humans learn; to that end, this full-spectrum participation seems especially important.
  2. Intentional work requires ongoing professional development. WKU’s efforts in this area are led by Jerry Daday, Executive Director of its Center for Faculty Development. His involvement will be crucial: during a closing discussion of the resources needed for scale-up, people said they needed dedicated training for staff and faculty even more than they needed money.
  3. Colleges will want a role. This was the biggest surprise of my visit, that deans and associate deans need to see themselves in the emerging approach, and will be unhappy if they can’t. Because high-impact practices are often connected to the student’s choice of major, departments won’t feel their identities threatened. And at the large scale of the whole university, picking a handful of signature high-impact practices for everyone will strengthen the institution’s identity. But what about the layer in between the campus and its departments – say, the College of Arts and Letters, or the College of Nursing?

I’m not sure what to do about that. A good answer may lie in integrated approaches to curriculum, like the AAC&U GEMs project, or in “meta-majors,” broad clusters of related subjects that students pursue before they know exactly what to major in. Such integrated pathways may reside in a single college, and lend themselves to a distinct set of high-impact practices.

(References to meta-majors are getting more common, but the field doesn’t have a single authority I can link you to. One example I like is from Complete College America, which describes meta-majors in its “Guided Pathways to Success” toolkit. See the PowerPoint here, and especially slide 22.)

Or maybe, as some in the meetings believed, bringing along the colleges just isn’t a problem: we need those administrative units behind the scenes, and not because our students should know where they are on the org chart.

I get it, but I’m not so sure. We may find that more should be done at the college level, both with high-impact practices and with how colleges bring students in and support their decision to stay.