Is adaptive learning ethical? Is it OK to experiment on one set of learners to improve results for another? Do EdTech companies have a higher duty of care to their users than regular businesses?
These are some of the questions that raced through my mind as I read that Facebook and Cornell University had manipulated the emotional content of 689,003 users’ news feeds to discover what effect it would have on those users’ emotions.
At first I was shocked. Why would a company deliberately manipulate the emotional state of their users, and then publicise it? And how could they possibly believe, as the study itself says, that:
“Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constitut[ed] informed consent for this research.”
Then I thought about how insignificant this must have seemed within that industry. In big data, everything is measurable, because interaction with the site is all that matters. Indeed, Facebook measured only the presence of key emotion words in users’ posts; no one was interviewed. The user counts (and dollars) of big high-tech companies are so high that their evidence-based design leaves ELT in its pseudo-scientific dust (forgive the hyperbole – great session at IATEFL). Take, for example, Google A/B testing 41 shades of blue for the links on its search page to see which one was most conducive to clicks. So is Facebook’s experiment just another A/B test, with the same aim as all the others (improve user retention, increase sales conversion, etc.)?
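For anyone unfamiliar with the mechanics, an A/B test is little more than random assignment plus a metric comparison. Here is a minimal sketch in Python; the variant names and the simulated traffic are invented purely for illustration:

```python
import hashlib
import random
from collections import defaultdict

# Hypothetical A/B test: bucket each user into one of two link-colour
# variants, then compare click-through rates. All names are invented.
VARIANTS = ["shade_a", "shade_b"]

def assign_variant(user_id: str) -> str:
    """Hash the user id so the same user always sees the same variant."""
    digest = int(hashlib.md5(user_id.encode()).hexdigest(), 16)
    return VARIANTS[digest % len(VARIANTS)]

clicks = defaultdict(int)
impressions = defaultdict(int)

def record_impression(user_id: str, clicked: bool) -> None:
    variant = assign_variant(user_id)
    impressions[variant] += 1
    clicks[variant] += int(clicked)

# Simulated traffic, purely for illustration.
for i in range(10_000):
    record_impression(f"user{i}", clicked=random.random() < 0.1)

for v in VARIANTS:
    print(v, clicks[v] / impressions[v])  # crude click-through rate
```

The point is how banal the machinery is: swap in a different metric and different variants, and the same few lines run almost any experiment you like.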
Is there actually a fundamental difference?
Or is it merely that, as James Grimmelmann, Professor of Law at the University of Maryland, puts it:
“This study is a scandal because it brought Facebook’s troubling practices into a realm – academia – where we still have standards of treating people with dignity and serving the common good.”
Which is where the ELT industry comes in. Could we get away with such an audacious manipulation of our students? Could an EdTech company do this to its users?
As a test, I propose a similar experiment for Knewton: manipulate the feedback on 689,003 users’ responses to discover what effect it has on their future answers. Give users incorrect solutions, and see if their performance suffers. Give them questions that are too difficult, and see whether other (easier) questions are then answered well (and vice versa). Present inaccurate content, and just see what happens.
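To be concrete about what this would involve (and to be clear, I hope no one ever builds it), here is a hypothetical sketch of the plumbing, with every condition name invented. The uncomfortable part is that, technically, it is indistinguishable from the A/B test sketched above; only the treatment changes:

```python
import hashlib

# Hypothetical (and deliberately unethical) experiment: each learner is
# silently bucketed into a condition that distorts what they are shown.
CONDITIONS = [
    "control",              # normal adaptive behaviour
    "incorrect_solutions",  # planted errors in worked solutions
    "too_difficult",        # items pitched above the learner's level
    "inaccurate_content",   # factually wrong content
]

def assign_condition(user_id: str) -> str:
    """Bucket a learner deterministically, exactly as in an A/B test."""
    digest = int(hashlib.md5(user_id.encode()).hexdigest(), 16)
    return CONDITIONS[digest % len(CONDITIONS)]

# Outcome measure: accuracy of future answers, recorded per condition.
outcomes = {condition: [] for condition in CONDITIONS}

def record_answer(user_id: str, correct: bool) -> None:
    outcomes[assign_condition(user_id)].append(correct)
```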
There would be clear educational value in the knowledge produced: understanding the effect of confidence and reinforcement on performance, and understanding the importance (or otherwise) of content accuracy. Future learners could benefit from that knowledge.
Somehow I don’t think Knewton will take me up on this (though I welcome comments below from anyone who works for an adaptive education company). An education company manipulating the education of its users is not like a marketing company – Facebook – manipulating the emotions of its users.
As Jaron Lanier puts it in the New York Times:
“It is unimaginable that a pharmaceutical firm would be allowed to randomly, secretly sneak an experimental drug, no matter how mild, into the drinks of hundreds of thousands of people, just to see what happens, without ever telling those people. […] Unfortunately, this seems to be [acceptable] when it comes to experimenting with people over social networks.”
And here’s that view, accepted in the Guardian: We shouldn’t expect Facebook to behave ethically.
Should we expect EdTech companies to?
Well, if adaptive learning is unethical, then the whole practice of teaching is unethical. Teachers are constantly gathering data about individuals (Oh god, Mohammed is late AGAIN!) and adapting their techniques to meet the needs of individual students (I’d better s-p-e-l-l this out for those two, they just can’t spell!). And the very best teachers are especially good at MANIPULATING people (I’ll…*insert manipulative statement here*… to motivate this class/student!).
Thank you, Lindsay, for your thought-provoking post.
In response to your question of whether EdTech companies have a higher duty of care, I would say ‘yes’, absolutely. Facebook collects data on primarily trivial, informal social interactions, but Knewton et al. collect data on something far more important and valuable (i.e. people’s education). The way that education data might be used (or misused) is a major issue.
I’d also add that your proposed experiment could hypothetically be run in a traditional classroom too. I guess there are good reasons why it never has been.