recipes and learning

Lately I’ve noticed a significant change in the way my wife Cyndi understands new recipes, and it has me wondering if there’s something here for hybrid learning platforms, especially for adult learners.

(I should confess first that for me cooking is mostly a spectator sport.)

As she trawls online for ideas, I’ve seen that she remains careful about what she eats, but not about what she looks at. She browses and lingers over recipes high in calories, cholesterol, and appeal. I am not the first to call it food porn.

What I find really interesting about these videos – and what prompted me to write about them here – is the educational approach.

If you can’t picture the scrolling videos I’m talking about, then you may first want to see this account from last year in The Cut, “Why These Recipe Videos Are Taking Over Your Facebook Wall.” The trend began with a feed from Tasty, whose sample video appears in that article:

There are a couple of things about this style of teaching and learning that I think could inform, and maybe reassure, those in my walk of life.

  1. Visual doesn’t mean aliterate. These short films have an interesting pedagogical approach, giving you an overview of the work you’re in for if you decide to make the dish. Because they’re short, sped up, and skimpy on the prep and clean-up, they can be criticized as misleading, but that’s what you get with any gist. For most users – Cyndi for one, but others I’ve checked with – the video is never the last stop. It’s just the preface, followed by familiar text-based learning.
  2. The comments are just as instructive. Following the text recipe comes the string of user exchanges, reactions to the recipe, ideas for substitution, arguments. It’s learning made personalized, interactive, and crowd-sourced.

This would be revolutionary, except that no regime was overthrown. The written recipes remain at the center, as they’ve always been. They would be familiar to 1960s readers of the Joy of Cooking, but as if it had been accompanied by 16mm previews and prepaid chain letters.


Only – and this was the surprise for me – the learning isn’t just easier. It’s better, deeper. The combination of video and text would delight a cognitive psychologist or instructional designer.

It’s hard to overstate how fully our brains are dominated by visual modes of learning – not as text but as moving images. One of my screenwriting professors encouraged us to exploit this dominance, for example by letting the staging contradict the dialogue. He cited an old Arabic saying, “the eye is the thief of the senses.” He’d encourage us to think carefully about where scenes were set, what was in the background, how people looked . . .  the things beyond words that tell an audience what’s really going on. (This old Arabic saying is, as near as I can tell, a sample of my professor’s creativity. But it stuck with me.)

Updating his advice are current brain scans suggesting that about two-thirds of the human brain is recruited for processing visual information, and that a quarter of those neurons have no other known use. That’s an awful lot of cognitive horsepower, enabling a kind of parallel processing that perceives traits like size, shape, color, distance, speed, and direction all at once, instead of in the single-file sequence of words.

The Tasty videos on Facebook make full use of that advantage, conveying more information faster than you could get it verbally. But if the user likes the video and decides to go further, then the mode of instruction switches to the plodding, sequential, precise world of words and numbers.

The interaction with fellow learners afterward is also impressive, harnessing a known leverage point for the social dimension of cognitive growth. That feedback loop, giving the learner a chance to paraphrase and personalize, is missing from the traditional cookbook.

To me this suggests that free online content like Khan Academy is akin to the video side of Tasty. And our ubiquitous arrangements of learning communities and peer mentoring seem like the comment sections that follow. Neither one seems ready or able to replace the precise, plodding modes of text and lecture in the middle, and maybe – to judge from this analogy, anyway – never needs to.

They’ll just make it better. They may help us devise efficiencies, like shorter semesters and three-year bachelor’s degrees, that bring college within easier reach.

Image credits: AV Club, SD Caterings.




On the eve of the last presidential election, The New Yorker published a book review using the word “epistocracy” to describe government by the educated, as contrasted with “democracy,” government by everyone.

In an epistocracy, a college degree or a refresher in civics might be a prerequisite to casting a ballot, or might earn you extra votes. It’s a fascinating book review and a thought-provoking idea – even if no one, including the author, would seriously defend it.

But then a few days after it was published, the presidential election returns came in. Many in my walk of life thought epistocracy might be worth a shot. The feeling was bipartisan: my Republican friends’ misgivings had simply begun earlier, around the conclusion of primary season.

Lately our national institutions seem to epitomize not our best instincts but our worst. Why do we share these important decisions so indiscriminately? Should we? And should higher education be doing something different?

The New Yorker story reminded me of something my mother told me in my teens, when for some reason she’d drawn multiple consecutive rounds of jury duty. She wanted it to stop, not because she resented the civic burden but because she felt ill-equipped. A few decades of adult life and her circa 1959 B.A. in psych did nothing to ease the sense of inadequacy she felt judging others.

Her broader point and this book review challenge our assumptions about innate wisdom. Maybe some things aren’t simply built in, but need to be intentionally cultivated and then assessed by formal education, toward a better world, an epistocracy.

But even in the present dark age, there are a couple of reasons I think that would be a dead end.

First, such thinking overlooks some shortcomings in education, and the limits of what our field can realistically contribute. The fact is we don’t know enough yet about brain science, behavior, and learning to confidently measure juridical potential, or civic-mindedness, as unambiguously as we measure dexterity or height.

We are getting there, slowly, and we do have confidence that certain practices and programs promote teamwork, pluralism, and equity. But the assessment tools are still too clumsy, the faith too blind.

My second reservation about epistocracies was harder for me to define, until I came across a lecture Jonathan Haidt delivered last month called “The Age of Outrage.” You can read it here, or watch the address itself by clicking on Jonathan’s face to the right.

He has spent years studying the polarizing effects of contemporary culture, and among academics he’s especially fearless about criticizing the extreme left. But even if you already know his work, this particular lecture is worth your attention. It benefits from all the practice.

And it highlights, I think, the main beef I had with epistocracy, even before I could put it into words: we already spend too much time filtering each other out.

Making distinctions among voters, even if we could put a number on what we mean by “qualified,” would take us even further into the echo chamber we need democracy to lead us out of.


oh, the humanities – part four

This series has looked at the shifting value of the humanities, in a world where the word “human” is in fresh flux. As we augment our intellects with machine learning and artificial intelligence, and our bodies with prosthetics and gene therapies, we should note the less visible but no less dramatic changes to fundamentally human activities like paying attention, interacting, and creating.

The first post made a distinction between my current handwringing and the longer, honorable tradition of simply complaining that no one majors in poetry anymore. I think that one’s still on solid ground; our faculty friends in literature, languages, and philosophy can look forward to centuries of continuing disfavor.

The second post shifted to my present concern, which is that the question of “why are our lives meaningful” may soon need more than the widely accepted response of secular humanism, which is essentially “because we’re human, and man is the measure of all things.” That yardstick is macerating in a bath of technology, and may soon dissolve completely.

The third post got a little more hopeful, by looking into the places where traditional secular humanism still has plenty of ground to clear and fields to plow, for example among the impoverished and imprisoned.

This one will close the set with what I think is the most convincing case for optimism on behalf of the humanities.

I owe the thinking in this post mostly to Yuval Noah Harari’s book Sapiens, and its extended discussion of “imagined realities.” I read it when it came out a couple of years ago and didn’t think much of it, but parts keep sticking with me.

I’m especially taken by Harari’s premise that our remarkable evolutionary success is owed to a little-recognized human propensity. Yeah, brainpower is good, and so are upright walking and opposable thumbs. But the really significant innovation was significance itself: our tendency to collectively assign meaning, and then believe in things that aren’t intrinsically true.

Paper money is one easy example of an imagined reality. So are traffic signals; nothing in the color red necessarily denotes stopping, but so long as we all agree to pretend it does, our streets work fine.

In the tradition of secular humanists, Harari puts religion into this category. But he defines religion to include a lot of things others don’t, such as the culture of Silicon Valley, and its zeal and evangelism on behalf of coding. He counts liberal democracy and American exceptionalism as religions, too.

Once you start seeing our world as a collection of collective fictions, they turn up everywhere. Today is Wednesday because we all say so, and not because there’s anything Wednesdayish about today’s spin around the poles.

In my world of higher education, the fictions feel especially arbitrary and rampant: the credit hour, the division of knowledge into disciplines, the agreed ways of organizing, explaining, and adding to what we know.

If your workday is like mine then it includes a meeting or two you’d rather not sit in. On the off-chance you find your mind wandering, I encourage you to note the collective fictions that go into all the assumptions, discussions, and resolutions you’re party to in a given hour, noting of course that the hour itself is a fiction, as are the job titles, resource allocations, and policy implications. For all the psychic investment – and it can be considerable – it’s amazing how purely hallucinatory the topics are.


Other animals, let’s note, do far less of this. Wolf packs may agree to imaginary boundaries between ranges and observe an invisible hierarchy, but they don’t codify it in zoning laws and tax brackets – to name two inventions that go back as far as recorded human history. There must be powerful reasons for doing this.

Harari believes we make these investments because the stories we tell ourselves do more than organize us. They are also keenly, eerily motivating. University presidents, fundraisers, consumer brand managers, and warmongers all make their livings on that energy.

That’s not to say you can do this with falsehood for long – although it helps to dress truth to advantage (paraphrasing Pope), you still need truth. And it’s possible not only that history is written by the victors, but also that the victors won because they had a more compelling history to write.

So now that I’ve had a year or two to ruminate on Harari’s thesis, I can imagine a time after ours when this will be how the word “human” is used. By then we can expect “human” to serve less as a self-evident, self-contained noun, and more as an adjective to describe that predilection for shared fiction, whether demonstrated by us talking primates, or by whatever comes next.

I think if we still organize our collective learning at universities, then they’ll be mostly in the business of intentionally, systematically developing the intellectual capacity for literature, religion, and philosophy. And even though we’ll still go begging for poetry majors, the humanities may be our main job. Performing even complex tasks will have become trivially easy for machines that don’t need college, but seeing the purpose behind the tasks will not.


As the third post pointed out, the economist who argues that we’ve passed the last point of diminishing returns and finally hit the limits of economic growth fails to convince because he doesn’t allow for the next paradigm-breaker.

For macroeconomics as well as the humanities, I think we’re developing such paradigm-breakers right now, with connected intelligence, the imminent discovery of exoplanetary life forms that will challenge our assumptions about life itself, and – believe it or not – quantum entanglement, which promises to do to lightspeed communication what the telegraph did to the horse.

These shifts are already underway, poised to upend the secular humanist complacency. They may finally kill our longstanding tautology that humans are significant because significance is human.

Instead we’ll enter another Copernican reframing, knocking ourselves out of the center and looking for our bearings elsewhere.

That elsewhere will need to be understood and organized, more than ever, by the disciplines of philosophy, religion, languages, and literature.

Our awesome capacity to narrate collective fiction will then be more vital to us than ever, and become one of the biggest purposes of future higher education.

And the first new story we’ll need to tell ourselves will begin with the word Because.


If you’ve made it this far, then a note of gratitude: this series was harder than usual for me to write. Along the way I’ve benefited from comments and feedback from you, individually and over the social networks that syndicate this blog. Thank you.

Image credits: KQED, NY Daily News, The Cosmic Engine.

Dispositional Learning

At the annual Assessment Institute in Indianapolis on Sunday, I helped lead a workshop called – no kidding – Using ePortfolio to Document and Enhance the Dispositional Learning Impact of HIPs.

My co-facilitators were Marilee Bresciani Ludvik, Laura Gambino, and George Kuh. You can download our slides by clicking on Marilee’s brain, to the left.

Our hypothesis is that High-Impact Practices (HIPs) – things like learning communities, undergraduate research, and community engagement – may be especially good for promoting particular neuro-cognitive or “dispositional” learning.

These are the college learning outcomes that employers and society say they want more of – things like curiosity, persistence, self-regulation, flexibility, and the ability to work in diverse groups to tackle complex problems.

Leaders of HIPs programs on our campuses say they develop these outcomes in their students. They’re often frustrated that so far the benefits – the high impacts – have been counted in other ways, like personal satisfaction, higher grades, and increased likelihood to graduate. These are all valuable, but may gloss over some of the most powerful learning that comes with HIPs.

But this kind of learning can be hard to assess, which is where the student electronic portfolio comes in. By narrating and making sense of their educational experiences as they go along, and sharing with others their evolving sense of themselves, students may demonstrate how far they’ve developed these dispositional learning outcomes.

And as they do, they will give colleges a new set of tools for evaluating and improving the experiences we’re proudest to offer.


another habit I got from screenwriting


I first got into higher education as an adjunct professor in screenwriting, having worked for a dozen years mostly on stories begun by others, doing rewrites and adaptations. It was fun, trying to make stories more visual, lifelike, or involving. Like a lot of adjuncts, I kept doing that work as I taught college on the side, giving my students a very current and realistic understanding of the field.

In the early meetings I’d take with producers – either to interview for the job or, after signing, to agree on the story goals – I paid more attention to what the team wanted than to what my predecessors had written. Writing is very hard work and a little daunting, so it’s tempting to go back to the previous draft and begin the repairs on each scene. But really, you want a clear idea of what you need to create before you can tell which parts should change.

If you look too closely at the existing script too early, then you lose the very freshness you were hired to bring. And it’s hard to get back, something I came to think of, and later teach, as the tyranny of the first draft. It insidiously affects how you think.

Cut to 2017. At my campus we’ve had people arguing for a few months about a set of contradictory policies, and who has standing and which document controls. As an administrator it’s my job to pore over them and then render judgment, but I resist.

For one thing, it’s kind of boring. But really, I think the written record is less important than finding agreement on how things should be. Since the present documents are inconsistent, we’re hardly bound by them anyway. They’re just a prior draft, created by well-meaning but flawed people a lot like us. So escape the tyranny, and focus instead on a way forward.

My insight, from writing straight-to-video science fiction.


It reminds me of a conversation I found puzzling a couple of years ago, but which now makes more sense to me. I was at a higher ed conference, and seated next to me for dinner was a senior administrator who’d taught chemistry.

She was insistent not only that it helped her do her current job, but that it was indispensable. She really believed that if you wanted to lead a university you should first study chemistry, because it disciplines your thinking about cause and effect, limiting reagents, and the invisible bonds that make a process possible.

Fans of broad, liberal learning – that is, me and people like me – will tell you the choice of major doesn’t matter. Just go to college, because no matter what your focus, you won’t use it much after the first one or two jobs.

But what if we’re exactly wrong? It seems instead as if no matter what your focus, you will indeed use it, and for the rest of your life, just in ways that are impossible to predict.

Image credits: Green Communications.