Convergence: Big data, digital texts, and… what?

I’ve written before in this space about digital curriculum, and about big data. Now, thanks to the wonders of human inventiveness, the impulse to engineer society for its own good, and the logic of the marketplace, it’s already hard to explore the educational possibilities of digital curriculum without simultaneously adopting the lenses of accountability and commerce. I wonder whether this instant move to control and commodification won’t impede progress in understanding the strengths and weaknesses of the new tools.

Let’s start with a recent column by Will Oremus at Slate.com (“No more pencils, no more books”). The article reports, and reflects, on two big developments. “While the thinkers are arguing,” says Oremus, “textbook publishers are acting.”

First is the current speedy transformation of Big Text (McGraw-Hill, Pearson, & Co.) into Big Digital Content Delivery, or something like it. A McGraw-Hill imagineer (without, I suppose, the Disney twinkle) says the company no longer thinks of itself as a textbook company: “We’re a learning science company.” And, Oremus tells us, the logic of techno-economic development is impelling the ed-products industry to incorporate more and more into its designs:

David Levin, CEO of McGraw-Hill Education, tells me his company views all of this as an imperative to reinvent its core products. To retain its value, Levin says, the textbook of the 21st century can’t just be a multimedia reference source. It has to take a more active role in the educational process. It has to be interactive, comprehensive, and maybe even intelligent. It has to make students’ and teachers’ lives easier by automating things they’d otherwise do themselves. The smarter it gets, the more aspects of the educational experience it can automate—and the more incentive schools and teachers will have to adopt it.

So what are these “smart things”? The early betting seems to be that more and more eduproducts will include “adaptive technology”: a digital environment with an artificial-intelligence layer that draws inferences about its user (customer, client, student) from her responses to various prompts and actions within the environment. Oremus digs into this development, one more manifestation of the apparently irresistible impulse to automate “differentiated learning” (or at least “differentiated instruction”; see an earlier blog post here for links and opinions).
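To make the mechanism concrete, here is a toy sketch of one common approach behind such adaptive layers, Bayesian knowledge tracing, which updates an estimate of whether a student has mastered a skill after each response. This is purely my illustration: the parameter names and values are assumptions, and nothing in Oremus’s article says any particular publisher works this way.

```python
# A minimal, illustrative Bayesian knowledge tracing (BKT) update.
# All parameter values here are invented for the example.

def bkt_update(p_mastery, correct,
               p_slip=0.10,    # chance a student who knows the skill still answers wrong
               p_guess=0.20,   # chance a student who doesn't know it answers right anyway
               p_learn=0.15):  # chance the skill is learned between one item and the next
    """Return an updated estimate that the student has mastered the skill."""
    if correct:
        knew_it = p_mastery * (1 - p_slip)
        posterior = knew_it / (knew_it + (1 - p_mastery) * p_guess)
    else:
        knew_it = p_mastery * p_slip
        posterior = knew_it / (knew_it + (1 - p_mastery) * (1 - p_guess))
    # Allow for learning before the next item is presented.
    return posterior + (1 - posterior) * p_learn

# Example: start at 30% estimated mastery and observe right, right, wrong.
p = 0.3
for answer in (True, True, False):
    p = bkt_update(p, answer)
    print(f"correct={answer}  estimated mastery={p:.2f}")
```

An adaptive environment would then use an estimate like this to decide which prompt or task to serve next, which is exactly where assumptions about learning get baked into the product.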

Reading the article, you can’t help but notice the little tell-tale buzzwords: “learning styles,” “personalized learning,” “efficiency,” “student outcomes,” “interactive,” and so on, words that carry a freight that includes a bit of meaning and a lot of atmosphere. Despite the good intentions and deep backgrounds of many of the visionaries and experts in this sector of the Ed Industry (once, education was a social enterprise; now it’s an Industry), the mutually reinforcing powers of policy and capitalism drive us all too quickly to seize upon ideas, hopes, and hypotheses about learning and reify them as products. Products are purchased, and then become imperatives — we have to justify our investment. So where’s the room for critique? For learning from experience? For research to inform practice (not to mention purchasing)?

This is not just the moaning of a slow-coach Luddite; the questions raised by ed tech are often deep and maybe revealing. But research is itself a design process, requiring time, intuition, and serendipity as well as data (information collected as part of a theoretically controlled inquiry) — not “data” as in “a massive record of events” in which to go fishing.

Critiques are always being made, of course, and not only in ivory towers or little blogs in the hills. I recently read the comments of a voice new to me, one Emmanuel Derman, whom you might enjoy as well. It’s not that he is saying anything new, but in reflecting on the worldwide fascination with Big Data (which drives much of the technological innovation that is coming to education), he reminds us of fundamentals.

Big Data is useful, but is not a replacement for the classic ways of understanding the world.  Data has no voice.  There is no “raw” data.  Choosing what data to collect takes insight;  making good sense of it requires the classic methods:  you still need a model, a theory, or an intuition to find a cause.   “Philosophy is a battle against the bewitchment of our intelligence by means of language,” wrote Wittgenstein. I take that to mean that language can deceive our natural intuition, and we need philosophy to reclaim it. In a similar sense, I would argue, science is a battle against the bewitchment of our intelligence by data.

In our work, we often use theories because they help us design processes, tools, events, and heuristics. We also need to encourage and challenge each other (and ourselves) to test those theories, even the ones we take for granted, using experience, logic, and, of course, data.
