oh, the humanities – part four

This series has looked at the shifting value of the humanities, in a world where the word “human” is in fresh flux. As we augment our intellects with machine learning and artificial intelligence, and our bodies with prosthetics and gene therapies, we should note the less visible but no less dramatic changes to fundamentally human activities like paying attention, interacting, and creating.

The first post made a distinction between my current handwringing and the longer, honorable tradition of simply complaining that no one majors in poetry anymore. I think that one’s still on solid ground; our faculty friends in literature, languages, and philosophy can look forward to centuries of continuing disfavor.

The second post shifted to my present concern, which is that the question of “why are our lives meaningful” may soon need more than the widely accepted response of secular humanism, which is essentially “because we’re human, and man is the measure of all things.” That yardstick is macerating in a bath of technology, and may soon dissolve completely.

The third post got a little more hopeful, by looking into the places where traditional secular humanism still has plenty of ground to clear and fields to plow, for example among the impoverished and imprisoned.

This one will close the set with what I think is the most convincing case of optimism on behalf of the humanities.

I owe the thinking in this post mostly to a book called Sapiens, and its extended discussion of “imagined realities.” I read it when it came out a couple of years ago and didn’t think much of it, but parts keep sticking with me.

I’m especially taken by Harari’s premise that our remarkable evolutionary success is owed to a little-recognized human propensity. Yeah, brainpower is good, and so are upright walking and opposable thumbs. But the really significant innovation was significance itself, our tendency to collectively assign meaning and then believe in things that aren’t intrinsically true.

Paper money is one easy example of an imagined reality. So are traffic signals; nothing in the color red necessarily denotes stopping, but so long as we all agree to pretend it does, our streets work fine.

In the tradition of secular humanists, Harari puts religion into this category. But he defines religion to include a lot of things others don’t, such as the culture of Silicon Valley, and its zeal and evangelism on behalf of coding. He counts liberal democracy and American exceptionalism as religions, too.

Once you start seeing our world as a collection of collective fictions, they turn up everywhere. Today is Wednesday because we all say so, and not because there’s anything Wednesdayish about today’s spin around the poles.

In my world of higher education, the fictions feel especially arbitrary and rampant: the credit hour, the division of knowledge into disciplines, the agreed ways of organizing, explaining, and adding to what we know.

If your workday is like mine then it includes a meeting or two you’d rather not sit in. On the off-chance you find your mind wandering, I encourage you to note the collective fictions that go into all the assumptions, discussions, and resolutions you’re party to in a given hour, noting of course that the hour itself is a fiction, as are the job titles, resource allocations, and policy implications. For all the psychic investment – and it can be considerable – it’s amazing how purely hallucinatory the topics are.

Why?

I mean, let’s note that other animals do this less. Wolf packs may agree to imaginary boundaries between ranges and observe an invisible hierarchy, but they don’t codify it in zoning laws and tax brackets – to name two inventions that go back as far as recorded human history. There must be powerful reasons for doing this.

Harari believes we make these investments because the stories we tell ourselves do more than organize us. They are also keenly, eerily motivating. University presidents, fundraisers, consumer brand managers, and warmongers all make their livings on that energy.

Not that falsehood can power you for long – although it helps to dress truth to advantage (paraphrasing Pope), you still need truth. And it’s possible not only that history is written by the victors, but also that the victors won because they had a more compelling history to write.

So now that I’ve had a year or two to ruminate on Harari’s thesis, I can imagine a time after ours when this will be how the word “human” is used. By then we can expect “human” to serve less as a self-evident, self-contained noun, and more as an adjective to describe that predilection for shared fiction, whether demonstrated by us talking primates, or by whatever comes next.

I think if we still organize our collective learning at universities, then they’ll be mostly in the business of intentionally, systematically developing the intellectual capacity for literature, religion, and philosophy. And even though we’ll still go begging for poetry majors, the humanities may be our main job. Performing even complex tasks will have become trivially easy for machines that don’t need college, but seeing the purpose behind the tasks will not.


As the third post pointed out, the economist who argues that we’ve passed the last point of diminishing returns and finally hit the limits of economic growth fails to convince because he doesn’t allow for the next paradigm-breaker.

For macroeconomics as well as the humanities, I think we’re developing such paradigm-breakers right now, with connected intelligence, the imminent discovery of exoplanetary life forms that will challenge our assumptions about life itself, and – believe it or not – quantum entanglement, which some hope will do to light-speed communication what the telegraph did to the horse.

These shifts are already underway, poised to upend the secular humanist complacency. They may finally kill our longstanding tautology that humans are significant because significance is human.

Instead we’ll enter another Copernican reframing, knocking ourselves out of the center and looking for our bearings elsewhere.

That elsewhere will need to be understood and organized, more than ever, by the disciplines of philosophy, religion, languages, and literature.

Our awesome capacity to narrate collective fiction will then be more vital to us than ever, and become one of the biggest purposes of future higher education.

And the first new story we’ll need to tell ourselves will begin with the word Because.


If you’ve made it this far, then a note of gratitude: this series was harder than usual for me to write. Along the way I’ve benefited from comments and feedback from you, individually and over the social networks that syndicate this blog. Thank you.

Image credits: Airship.net, vexels.com, KQED, NY Daily News, The Cosmic Engine.


Dispositional Learning

At the annual Assessment Institute in Indianapolis on Sunday, I helped lead a workshop called – no kidding – Using ePortfolio to Document and Enhance the Dispositional Learning Impact of HIPs.

My co-facilitators were Marilee Bresciani Ludvik, Laura Gambino, and George Kuh. You can download our slides by clicking on Marilee’s brain, to the left.

Our hypothesis is that High-Impact Practices (HIPs) – things like learning communities, undergraduate research, and community engagement – may be especially good for promoting particular neuro-cognitive or “dispositional” learning.

These are the college learning outcomes that employers and society say they want more of – things like curiosity, persistence, self-regulation, flexibility, and the ability to work in diverse groups to tackle complex problems.

Leaders of HIPs programs on our campuses say they develop these outcomes in their students. They’re often frustrated that so far the benefits – the high impacts – have been counted in other ways, like personal satisfaction, higher grades, and increased likelihood to graduate. These are all valuable, but may gloss over some of the most powerful learning that comes with HIPs.

But this kind of learning can be hard to assess, which is where the student electronic portfolio comes in. By narrating and making sense of their educational experiences as they go along, and sharing with others their evolving sense of themselves, students may demonstrate how far they’ve developed these dispositional learning outcomes.

And as they do, they will give colleges a new set of tools for evaluating and improving the experiences we’re proudest to offer.


another habit I got from screenwriting


I first got into higher education as an adjunct professor in screenwriting, having worked for a dozen years mostly on stories begun by others, doing rewrites and adaptations. It was fun, trying to make stories more visual, lifelike, or involving. Like a lot of adjuncts, I kept doing that work as I taught college on the side, giving my students a very current and realistic understanding of the field.

In the early meetings I’d take with producers, either to interview for the job or, after signing, to agree on the story goals, I paid more attention to what the team wanted than to what my predecessors had written. Writing is very hard work and also a little daunting, so it’s tempting to go back to the previous draft and begin the repairs on each scene. But really, you want a clear idea of what you need to create before you can tell which parts should change.

If you look too closely at the existing script too early, then you lose the very freshness you were hired to bring. And it’s hard to get back, something I came to think of, and later teach, as the tyranny of the first draft. It insidiously affects how you think.

Cut to 2017. At my campus we’ve had people arguing for a few months about a set of contradictory policies, and who has standing and which document controls. As an administrator it’s my job to pore over them and then render judgment, but I resist.

For one thing, it’s kind of boring. But really, I think the written record is less important than finding agreement on how things should be. Since the present documents are inconsistent, we’re hardly bound by them anyway. They’re just a prior draft, created by well-meaning but flawed people a lot like us. So escape the tyranny, and focus instead on a way forward.

My insight, from writing straight-to-video science fiction.


It reminds me of a conversation I found puzzling a couple of years ago, but which now makes more sense to me. I was at a higher ed conference, and seated next to me for dinner was a senior administrator who’d taught chemistry.

She was insistent not only that it helped her do her current job, but that it was indispensable. She really believed that if you wanted to lead a university you should first study chemistry, because it disciplines your thinking about cause and effect, and limiting reagents, and the invisible bonds that make a process possible.

Fans of broad, liberal learning – that is, me and the people like me – will tell you the choice of major doesn’t matter. Just go to college, because no matter what your focus you won’t use it much after the first one or two jobs.

But what if we’re exactly wrong? It seems instead as if no matter what your focus, you will indeed use it, and for the rest of your life, just in ways that are impossible to predict.

Image credits: Green Communications, nde-ed.org

oh, the humanities – part three


In the first of this set we looked at the humanities as a corner of the curriculum that makes meaning, and argued that it’s in for hard times – not because it attracts few majors, which is a constant worry, but for new reasons.

The second post looked at what some of those are: developments like machine learning and tools to offload our cognitive, biological, and physical processes, undermining the notion of Essentially Human.

Our interface with the world – the frontier between what’s human and what isn’t – has become strikingly porous. What’s left after the clever applications of prosthetics, 3-D printed organs, and machine learning? In effect, what remains is deciding what to do with all of it. But recent developments in neuroscience call free will itself into question, making that an unreliable peg on which to hang our existential hat.

At that point, the circular reasoning that has sustained us since the Renaissance – that humans matter because we’re human – will have run out of gas. And I’m not talking about circa George Jetson; this day is essentially here.

As I cast about for a new and improved raison d’être, I find a cause for optimism in a recent book on macroeconomics, and a day a couple of months ago when I went to prison. I’ll start with the book.

Last Christmas I got Robert Gordon’s The Rise and Fall of American Growth, which elaborates on his argument that the recent stagnation in real wage growth is here to stay. He says this is because that spurt in our productivity and standard of living was the real anomaly, an unusual period from 1870 to 1970 where some one-time inventions like the phone, electricity, and the internal combustion engine all converged to give us a boost. As he says, municipal water and sewage is something you invent only once.

It’s an enormous book but very good, and if you’re reading me this far then you may also like it; Gordon is a fellow fan of the long view. (I took his macroeconomics course as a sophomore at Northwestern, and liked him then too.) Yet his premise is valuable mostly for orienting a fascinating and otherwise unwieldy account of recent history. As a main argument it’s unconvincing in two directions.

First, before the time of his book were some other one-off inventions, like the loom, the steam engine, the telegraph, and for that matter fire and the wheel. They probably had the same long-term and irreversible impacts, whether or not they register in traditional GDP. And coming out the other end, the time from 1970 to now, it seems any tapering of growth will last only until the next game changer.

If we can’t see it yet, we should admit that such is the nature of every paradigm shift. That’s why they shift paradigms. For example, an observer in 1815 could think sure, the steam engine might make communication faster than you get from riding a horse, but that speed was probably approaching its limit – not anticipating that the telegraph was about to take that limit up to the speed of light.

In reading Gordon’s book, I thought of our possible next paradigm shifts in a couple of ways. One, the same profound disruption wrought by the national electric grid could lie ahead of us with networked computing and artificial intelligence, especially as thinking and culture cross national boundaries. (See The Stack for a mind-boggling account from a computer scientist of how IT could usher in the kind of post-state world that’s been imagined for decades by Jürgen Habermas.)

Also in this tech-frontier category: the imminent likelihood we’ll find life on other planets, or inside moons of our own solar system, could give us a tech-driven growth spurt from flavors of biomimicry we haven’t yet imagined.

And then there’s the whole other category of untapped growth, equity. An awful lot of the world’s eight billion people have yet to benefit much from the inventions of 1870 to 1970. If the national economic engine isn’t revving like it did, then that’s hardly because we’ve saturated global demand.


While fretting about what the humanities will do in a post-human-centric world, I thought of both of my reservations about Gordon’s book. On the technological paradigm side, getting to the summit of human cognition and free will – and maybe surrendering our presumptive monopoly on both – may just bring the next hill into sight. We’ve seen this before, for example in theoretical physics.

In other words, it’s possible that unraveling the human capacities we understand so far will reveal other mysteries we didn’t know were there.


And then we get to the equity question, and the graduation ceremony I attended at the California Institute for Women. The CIW is a full-on prison, and going for a visit means breaching multiple heavy duty doors and razor wire, and forfeiting a civil liberty or two.

Once inside, you meet people who’ve studied and found salvation in, of all things, the humanities.


This is not a minimum-security Club Fed for embezzlers. These are some violent people doing, as one told me, “serious time.” Some of them are famous. Some of those in a 30-year stretch enrolled in a humanities graduate program offered by CSU Dominguez Hills, my employer, as a traditional postal correspondence course – important in a setting without internet access, and the last program of its kind.

Anitra Clark and Erica Hitchcock, Inmate Peer Educators, California Institute for Women

It made sense to me that a particularly popular genre is magic realism, but there are plenty of others. I have a stack of written testimonials from our graduates, describing not the escapism but the dignity that attends study in the humanities.


And of course, beyond driving distance from my campus are the billions as yet untouched by Gordon’s catalog of miracles from the 19th and 20th centuries. For them – the vast majority of present-day humans, who live in developing and not-so-developing corners of the world – questions of machine-aided cognition, prosthetics and 3D printing, and dubious free will are mostly moot. We may sweat such things in the ivory towers, but they’re just less pressing down in the dungeons, both at home and abroad.

By my count that makes Cause for Optimism #1: in the relatively short term of decades, the humanities will remain vital for the vast majority of human beings who don’t have it all. Over that time period, the handful who do can work on how else to answer Why.

And so help me, even for that rarefied group, the tip of the epistemological spear, I think the disciplines we’ve been grouping into the humanities are in for some of their best days yet. We will replace the circular reasoning at the heart of today’s humanistic boosters with a much better, sounder line of reasoning.

I’ll call that Cause for Optimism #2, and save it for the fourth and last piece of the set, posted later.


Image credits: airship.net, Kenny Orthopedics, bankingtech.com, CDCR Today, maeryan.com, Just Detention International, The Girl Who Navigates the World in a Dream of Her Own Making by Paul Bond, BBC World Service

news from NACE

Last week the National Association of Colleges and Employers met in Las Vegas. Most of the national educational conferences I attend are pretty rarefied; they typically draw a few hundred people. This one had thousands, from all over the world. And, unlike the state-level workforce development events I used to attend, this one wasn’t all hype and hope. It was graced by actual employers, many of whom sponsored the event.


I was tagging along with the director of my university’s career center and a couple of her staff to hear firsthand what employers would like from us, unmediated by surveys or the higher ed press. Some takeaways while they’re fresh:

Not everyone recruits from campuses. The companies at this conference aren’t a cross section of the economy. There are few non-profits in NACE, even fewer from the public sector, and not a single mom-and-pop. The private businesses that trawl for freshly minted college grads are the ones big enough to have managers of “university relations,” and extensive in-house offices of onboarding and orientation. They see themselves – accurately – as our fellow educators. And since tenure in an entry level job is typically under two years, their business looks a whole lot like ours. People come in, they get some personal development, they leave. Each year’s round of new hires is called a “cohort.” For the companies that choose to engage in it, this very early career guidance is almost a contribution to the public good.

There is really something in it for them. Almost a contribution to the public good. The large and well-heeled do this out of enlightened self-interest. A vivid explanation came from insurance monolith AIG, whose presenters – both in talent acquisition and early development, one from the New York office and the other from Chicago – spoke fluently of learning outcomes, engaging pedagogy, on-line portfolios and learning management systems. I about fell out of the folding chair.

They went on to say how it pays off: the people they hire fresh from college are versatile, impressionable, and forming lifelong networks and habits. After their crash course in the insurance sector and AIG culture, they will move on to take jobs with AIG’s customers, clients, vendors, and partners, and an early and positive experience with AIG could pay off for decades.

No one wants to look at an ePortfolio. I went to NACE hoping to find a warm welcome for best practices in higher ed: experiential learning, ePortfolios with meta-cognitive accounts from students of their college experiences, supported with artifacts that demonstrate developing proficiency over the undergraduate years. I mean, we’re all dissatisfied with the transcript and resume, right? What I saw was that no one, I mean no one, wants to even glance at these. The only thing employers don’t like about the one-page resume is that it’s too long.

But what ePortfolios develop does matter. I skipped lunch for an intense conversation with the recruiter from a large and noticeably successful investment firm. (I didn’t get hired.) Like many in the financial services sector they try to hire from the interns they host and get to know. So for our students that first cut is the hardest; there are thousands of applicants for only 40 internship slots, understood as the ticket to a job. Grades matter, but only to a point: once you’re above a 3.2 they don’t rank you on how far. Instead they turn to the other things they care about, gleaned from your resume and – if you’re lucky – an interview.
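
If it helps to see that screening logic laid bare, here is a minimal sketch of the two-stage cut as I understood it: GPA works as a gate rather than a ranking, and the ordering comes from soft-skill evidence. Every name, field, weight, and number below is an illustrative assumption, not the firm’s actual process.

```python
from dataclasses import dataclass

GPA_CUTOFF = 3.2  # assumed gate; "once you're above a 3.2 they don't rank you on how far"

@dataclass
class Applicant:
    name: str
    gpa: float
    soft_skill_scores: dict  # hypothetical, e.g. {"results_orientation": 4, "communication": 5}

def screen(applicants, slots=40):
    # Stage 1: GPA is pass/fail, not a ranking
    eligible = [a for a in applicants if a.gpa >= GPA_CUTOFF]
    # Stage 2: rank only on soft-skill evidence gleaned from resumes and interviews
    ranked = sorted(eligible,
                    key=lambda a: sum(a.soft_skill_scores.values()),
                    reverse=True)
    return ranked[:slots]
```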

But what an interview. The VP described a breathtaking zeal for quality control. The interviewers get daylong training retreats. Each one learns to focus on one or two of the dozen soft skills that her firm wants – results orientation, intellectual rigor, professionalism, communication skills. Finance doesn’t come up. Then all 120 or so interviewees go through six 30-minute interviews apiece, where they’re asked by pairs of interviewers “not what you know, but how you think.” Applicants are expected to speak confidently about their past experiences, what those experiences taught them, how they’d behave differently next time, how they know what they’re good at.

In other words, they’re effectively narrating what they’d discover about themselves by creating an ePortfolio. Except that the VP had never heard of ePortfolios, and when I gushed about them she was visibly unimpressed. She didn’t object, but their value is just utterly outside her sphere. It was like asking her to comment on an intern’s early childhood nutrition.

Soft skills are ascendant. I’m used to conferences where the sponsors and exhibitors sell ePortfolios, and software for tracking and reporting student learning outcomes, typically for accreditors and other oversight entities. Here it was also software, but outside of curriculum. There was a lot of CRM-style contacts management, alumni networking tools, job fair event management, and office workflow.

Two companies caught my attention, both peddling psychometric platforms to tell you whether your next hire would fit your company culture. The list of virtues was similar at both: problem solving ability, performance under pressure, work with diverse teams, creativity. Both had been around for decades, but each reported – in separate conversations – that business had boomed in just the past couple of years. Their market seemed to be the companies who wanted the same things their large, NACE-member competitors provide, but who have less in-house capacity or zeal.

I asked both reps whether they’d noticed certain higher ed institutions or practices doing a better job of developing these skills. I was hoping to hear support for undergraduate research, service learning, team-based or project-based learning. From one I got a blank look; his company had been in this business since the 1980s and it had never come up. He even seemed to doubt you could intentionally cultivate any of this, that instead some people were just naturally creative, or easy to get along with.

From the second, Maure Baker of Performance Assessment Network, I got better informed responses, maybe because he’d worked on college campuses before leaving to start up AmIJobReady.com for PAN. In his opinion there were patterns in the student experiences that develop these outcomes, but they didn’t relate to particular campuses or practices. Instead, he believes the key is institutional integration: when faculty, career services, curriculum, and student affairs are all collaborating, the students come out with better soft skills. As near as I can tell, his experience matches the best working hypothesis in my field.

There is opportunity here. Sometimes the language barrier was unsettling, like I’d overflown Vegas and landed in Turkmenistan. But more often I felt like a railroad baron looking across an open desert, where the biggest challenge is suppressing an unseemly giggle.

I would like that psychometric data, the decades of soft-skill assessments homegrown and tested by cycles of employers and graduates, so far apparently disconnected from academic affairs. PAN sorts them by economic sector, showing for example how nurses need more resilience under pressure than client service reps.

It would be cool to show those pie charts to incoming freshmen with declared majors, and run them through the same assessment. We could then hand them a personalized report on where their gaps are, writing out a kind of higher ed medical prescription for their next four years, composed of courses as well as co-curriculars, and developed with faculty in the relevant majors. Then they’d go off and do it, keeping track of what they learn about the world and themselves along the way in – yes – an ePortfolio, not because anyone but their families may ever see it, but so that they’re ready to explain it all in a 90-second elevator pitch, or 30-minute high-octane interview.
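
For the curious, here is a minimal sketch of what that gap report might look like, assuming we had access to sector benchmark profiles like PAN’s. The skill names, benchmark numbers, and threshold are all invented for illustration; a real version would draw on the vendor’s psychometric data and the relevant faculty.

```python
# Hypothetical sector benchmarks (made-up values on a 1-5 scale)
SECTOR_BENCHMARKS = {
    "nursing": {"resilience_under_pressure": 4.5, "teamwork": 4.0, "communication": 4.2},
    "client_services": {"resilience_under_pressure": 3.5, "teamwork": 4.2, "communication": 4.6},
}

def gap_report(student_scores, sector, min_gap=0.5):
    """List the skills where the student trails the sector benchmark by at least min_gap."""
    benchmark = SECTOR_BENCHMARKS[sector]
    gaps = {skill: round(target - student_scores.get(skill, 0.0), 2)
            for skill, target in benchmark.items()
            if target - student_scores.get(skill, 0.0) >= min_gap}
    # Largest gaps first, so advisors can prioritize courses and co-curriculars
    return sorted(gaps.items(), key=lambda kv: kv[1], reverse=True)

# Example: an incoming nursing major's entry assessment
print(gap_report({"resilience_under_pressure": 3.2, "teamwork": 4.1, "communication": 3.8}, "nursing"))
```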

I know we already do some badging for these skills, and many summer orientation programs – including those at my university – include some kind of psychometric assessment and introduction to career services at entry. But we have far to go, and the two sides of higher ed seem to have been developing these from different directions.

All of this could be wrong. As first impressions, these takeaways are naive and suspect. Also, I have to beware sampling error. For example, in the public sector – not represented at last week’s conference – we have to detail and defend every hire against charges of cronyism, and for some the backup of an ePortfolio may be welcome. (The internship coordinator at my college of business and public affairs believes a large employer in Sacramento may be one.) And for small businesses that can’t afford NACE – the majority of U.S. employers – free online evidence of student attainment would be a lot more feasible than that battery of interviews.

And we know from talking to our alumni that some employers do look at ePortfolios – maybe not to replace the resume, but to supplement the interview.

But it’s worth learning more about this area, and trying to connect it to emerging best practices on the postsecondary side.

I feel like I spent a couple of days in earshot of Promontory, Utah, but haven’t yet heard the golden spike.


oh, the humanities – part two


The earlier post in this set gave a typical defense of the humanities – disciplines like religion, philosophy, languages, and literature – which is that we all benefit from understanding ourselves as part of a bigger purpose. That intellectual birthright, systematically developed in college, helps orient our work and lives afterward. So, good job security for the soul searchers and poets, right?

In recent centuries, at least in the west, the searchers and poets have clumped around varieties of humanism – an ethic different from the set of disciplines called “the humanities,” but arising from them. Humanism takes the human condition as a self-contained, self-evident good, man as the measure of all things.

For some of us – maybe the majority on American campuses – it’s “secular humanism,” agnostic if not godless. We readily admit to a universe that includes the supernatural, imperceptible, and mysterious – whether called branes, dark matter, or Quetzalcoatl. But we don’t take daily cues from it.

These days even formal religions seem influenced by humanism, as we see falling from favor those that discount or threaten human well-being, for example by preaching intolerance, mutilation, or virgin sacrifice. They just don’t draw crowds like they used to.

So, humanism for all, and the answer to the eternally nagging Why is apparently some version of “because it’s us.”

That alone fuels a lot of the human enterprise. We work hard, cure diseases, and write apps for smart phones not just to get ahead personally, but also to add to the overall stock of human happiness. The more you contribute to the common good instead of the personal one, the more virtuous you feel, but really it’s all just different versions of petting ourselves. It’s been a surprisingly durable way of avoiding the question Why, at least up to now.

If that’s about to change – and I think it is – then college curriculum in the humanities should brace itself. But for what? The acceleration of change on a few fronts has made it harder for colleges and universities to guess what’s around the next hedge.

Those fronts:

1. Machine learning is now mostly inductive, a lot like human learning. As they catch on, our gadgets are taking over not just factory work but also driving, diagnosing disease, and even making art. Already our computers can translate, and our phones can see. Whatever our colleges teach people to do next, we should take care that they’re things people will still be the ones doing.

2. To higher education this raises a not insignificant question: what is that? In other words, at the maturity of our current AI growth spurt, what will remain as the competitive advantage of homo sapiens, and then how do we organize our curriculum to cultivate it, so our graduates can be employed? The answer, increasingly, may be volition.

Machines can find, solve, and invent many more things than they used to, but we humans still have a corner on wanting to.

So does it come down to that? Will college learning be mostly about purpose and meaning, about why we should want some things more than others? About the nature of the good life, of telling right from wrong?

I would welcome that, but also enjoy the irony that the build-out of our STEM-infatuated, high-tech world could usher in a golden age for the arts and humanities.

Except that:

3. The idea of free will itself is under new fire, beautifully summarized in a recent Atlantic article. It was never on solid footing, empirically speaking: we feel like we act for ourselves, but it’s been mighty hard to prove. Now we’re seeing evidence that the order of deciding and acting may be reversed; that is, the brain appears to set an action in motion a moment or two before we become aware of intending it, and behind that opaque cognitive curtain we construct the intention after the fact, fooling ourselves into thinking our wishes matter.

In the context of machine learning, this may be an even bigger deal than we think. It seems that if a day comes when we have to concede we lack free will, then on that day we will really have run out of uniquely human capacities.

And that means we will also have to face up to the circularity of secular humanism and our longstanding measure of meaning, ourselves. It may not be enough anymore to say art, or health, or technology are valuable because they advance the human condition, because that condition will no longer be exclusively human.

On that day, maybe coming up fast, we’ll have reached the end of the street down which we kick the can that asks Why.


What does a forward-looking university do then? I am not sure, but hope I’m not sitting on the platform behind the commencement speaker who has to explain that intellectual bequest to the next generation. “It’s been a pretty good run but we kind of ran out of steam toward the end there.”

I was ruminating on this, rummaging around for a ready reason to go to work each day when the present one sputters out, when a couple of experiences gave me hope that the humanities may yet have some good continuing uses, even after they’re understood as not uniquely human. One was a textbook I read this Christmas on macroeconomics, and the other was a day last month that I spent in prison.

But more on those later.

Image credits: airships.net, Tara in Poland, International Business Seminars