Exploring the British Council MOOC

Either ELTjam and its community of commenters can see the future, or the British Council closely followed this post from January 2013 when they created their 6-week course Exploring English Language and Culture in partnership with FutureLearn.

There’s one critical difference, though. ELTjam thought an ELT MOOC probably wouldn’t work. The British Council made sure that it did. Although, as we’ll see, that does depend on your definition of ‘work’.

https://www.class-central.com/mooc/2135/futurelearn-exploring-english-language-and-culture

The course, which over 100,000 people signed up for, demanded two hours of students’ time per week and made use of two unscripted videos, aimed at around B1 level. The videos were followed by comprehension, grammar and language activities; chat forum discussions around the topic; and a final written activity, also to be submitted via the forums. An optional participation certificate, costing £24, was available to anyone completing 50% of the course. Moderators were on hand to interact with participants, both conversationally and correctively, much as a live teacher would, but, more importantly, students could interact with each other. And they did – thousands of them.

Not 100,000 of them, though. The typical attrition rates are evident in the number of comments in the forums. I took the figures from the final assignment discussion boards as a very rough indicator of participation.

Let me stress how rough this is before anyone goes all ‘statistical research’ on me.

  • By the end of Week 6, people were still adding to the Week 1 discussion board, and the figures for all modules had increased from the previous week by anywhere between 400 and 1,000 comments. So we can assume that, as a few more weeks pass, those numbers will go up even further as people study at their own pace.
  • The number of comments includes moderator feedback, which, understandably, seemed to happen only while each week’s discussion was live, as the moderators moved on when the next week’s discussions opened.

Anna Searle, in an interview with Womanthology published roughly halfway through the course, said that 55% of the people who signed up had started the course. So let’s say 55,000 of those who signed up actually started. The comment counts for each week’s final forum discussion, as they stood at the end of Week 6, were:

  • Week 1: 17,500
  • Week 2: 10,369
  • Week 3: 7,824
  • Week 4: 5,830
  • Week 5: 4,534
  • Week 6: 2,678 (though the additional forum asking for feedback on the course had 3,206)
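
For anyone who wants the back-of-the-envelope arithmetic laid out, here’s a minimal sketch in Python. It simply treats each week’s comment count as a (very loose) proxy for active participants and expresses it as a share of the assumed 55,000 starters and 100,000 sign-ups; remember these are comments, not unique users, so the real participation figures are lower still.

```python
# Rough arithmetic behind the participation estimates above.
# Assumptions, not official figures: ~100,000 sign-ups, of whom ~55% started,
# with each week's final-discussion comment count used as a loose proxy for
# active participants (comments, not unique users, so real numbers are lower).

sign_ups = 100_000
starters = round(sign_ups * 0.55)  # ~55,000, per the figure quoted above

comments_per_week = {
    1: 17_500,
    2: 10_369,
    3: 7_824,
    4: 5_830,
    5: 4_534,
    6: 2_678,
}

for week, comments in comments_per_week.items():
    print(f"Week {week}: {comments:>6} comments "
          f"= {comments / starters:5.1%} of starters, "
          f"{comments / sign_ups:5.1%} of sign-ups")

# Allow for a few more weeks of post-course activity at roughly 1,000 extra
# comments per week, the kind of late growth seen on the Week 1 boards.
adjusted_week_6 = comments_per_week[6] + 5 * 1_000
print(f"Adjusted Week 6 estimate: {adjusted_week_6} comments "
      f"({adjusted_week_6 / starters:.1%} of starters)")
```

On those assumptions, Week 6 comes out at a little under 3% of sign-ups and just under 5% of starters.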

So, even allowing for five more weeks of post-course participation at 1,000 comments per week, active numbers seem to be drastically reduced. Nevertheless, I’d count this as a course that had ‘worked’. Those numbers are impressive. I’d guess the number who go on to buy a certificate is significantly lower than the number who complete the course, but I doubt that was the aim of the British Council. What’s more, the forums I skimmed through were brimming with enthusiasm. And feedback, as recorded on the site, was eulogistic:

  • “More than learning English, it was an opportunity to learn more about Britain and other global issues. I probably learned a lot these last weeks.”
  • “I would like to say the massive thank you to all of you, because this course was great and useful and I learned some knew things and I have got new grammar skills.”
  • “This is the best online course that I have done. I am already signed for new courses on FutureLearn.”
  • “When you watch Richard interviewing some people and going somewhere you have a strong feeling of visiting Britain.”
  • “It is wonderful to know something new about the greatest music and culture in the world!”

So positive was the reaction to Britain and British culture, explored through topics like British music and literature, Visiting Britain, English as a Global Language and Entrepreneurship (with British businessman Richard Branson), that I’m amazed we don’t do better at Eurovision. The course was pure advertising for Britain and studying with the Council, even to the point of camera angles which filled half the screen with their poster. (Forum moderators were also quick to link to other Council products like their apps or the forthcoming MOOC aimed at teachers.) But this is another way in which it can be said to have worked, since this is in line with the Council’s mission. Far from coursebook publisher concerns about producing a multi-cultural, globalised spread of content, thousands of people loved every second of learning about ‘this green and pleasant land’.

Although the topics, aside from Entrepreneurship, are fairly same-old (the dreaded Environment even features), I only saw one negative comment. The literature topic was mentioned often as a favourite, and people loved the social interaction. From my skimming I barely saw anyone talking about grammar and vocabulary or asking for more quizzes and tests. Except, that is, teachers:

  • “Something to consider? Perhaps if there were more tests/activities online for students to do in a course like this? It would [be] a way for people to check their knowledge.”

  • “… although, I would have liked to of [sic] seen more grammar points made throughout this.”

Now, all this ties in very closely with what the commenters on Nick’s post thought 18 months ago: the best MOOCs for learning English would be ones that taught a topic of interest in English, rather than English itself. I’d go one step further and say more EFL materials need to be content-rich and teach real topics instead of being blandified language and grammar vehicles. And the literature aspect? Graded Readers. I’ve said a million times how under-promoted and under-utilised these are.

The ELTjam voices of the future also knew that interactivity would be a key to a MOOC’s success:

… these constraints [would be] blown out of the water when MOOCs take on ‘social learning’ elements that allow users to participate in communication with their peers (in effect, allowing for unlimited student talking time). Social learning will, among many other things, create opportunities for peer teaching and review (two elements that are exactly in line with modern ELT pedagogy) which will bring a much-needed new dimension to digital learning, and, in my view, allow it to flourish like never before. Robert Vernon, ELTjam commenter.

The British Council course managed that as well as possible given the constraints of moderator time and capacity. Discussions looked lively and, at least during the week of a module going live, highly interactive, with people able to ‘like’ and respond to others’ posts. There was even some peer correction going on. This is the point where you might want to assess the MOOC on slightly different terms. Was it truly an ELT MOOC? That is, did it teach language?

Obviously, I don’t have any data to prove or disprove this. But it seems unlikely that 12 hours can make an awful lot of difference to anyone’s level. The grammar topics were things like the passive, relative pronouns, ‘so’, -ing vs infinitive, and ‘make’ vs ‘do’ – subjects that get taught over and over, often with no visible effect (in my classes at least!). For a student with no other access to lessons, it can’t have hurt, though. Judging by the level of language in the forums, many of the students seemed to have a much higher level than B1 anyway, so even if you were to assess improvement somehow, you probably wouldn’t see much change in measurable elements like right/wrong grammar test questions. However, the British Council set expectations carefully. It wasn’t ‘learn English’ but ‘explore English’, so, on its own terms, it wasn’t a true ELT MOOC.

The last way in which the course can be said to have ‘worked’ is the massive quantity of freely given data that can be mined from it. The qualitative data in the first forum discussion alone could feed several Second Language Acquisition PhDs. I could have written my MA ELT dissertation on the responses to a question about learners’ feelings of identity when speaking English. Add to that the stats on participation by country, gender (predominantly female, apparently), length of study time, preference for task type, grammar and language strengths, ad infinitum.

What do the British Council have now? Invaluable data for designing a course that people might pay for, and for working out where to market it. Oh, and the email addresses of over 100,000 people who might want to sign up.

I’d say, yes, that definitely worked.

17 thoughts on “Exploring the British Council MOOC”

  1. So over 100,000 people signed up for the course, but by week 6 only 2,678 were active participants. I’m sorry, but that sounds like a disaster to me.
    As I understand it, MOOCs typically have a 5% pass rate, on average. And that is far, far below what Sebastian Thrun et al were hoping for when they invented the MOOC.

    Also, you say that the feedback at the end of the course was eulogistic. Fair enough, those who stuck it out really found it valuable. But there are some 97,000 people who did not find it particularly valuable and who didn’t leave comments.

    The data collection you talk about is very interesting though. Will the British Council be making that data available to researchers, do you know?

  2. Hi Thomas,
    Thanks for the comment. I just want to make it clear how rough my figures are and that they should be viewed as such.

    I only took the numbers from the final discussion forum (perhaps the grammar exercises had different figures), and these are not numbers of users but numbers of comments (meaning the number of users is lower). Also, at the previously seen rate of participation in the weeks after a module goes live, they could quite possibly hit the 5% you mention rather than the 3% (or lower) the figures suggest now.

    As to whether that figure is a success or not: if you look at what app developers view as a good conversion rate for in-app purchases (I know this is different, as it’s about monetising a product), “The general consensus across forums, independent research and brand-provided analysis is that most apps have a 1-2% average conversion, so hitting anything above 2% should be considered a strong conversion rate.”
    http://info.localytics.com/blog/mobile-apps-whats-a-good-conversion-rate

    As for the BC collecting the data, I have no idea. I was really just speculating on the incredible wealth of it that they must have amassed. They’re a public organisation/charity, so really they should make it available, but I have no idea how that all works.

  3. I’d just add that the number of comments in a discussion forum isn’t really a measure of active participation in a MOOC or of course completion. I’ve taken part in two MOOCs but studiously avoided all discussion forums, message boards, etc. They’re often not what a student is looking for and can be a huge distraction. They’re an added extra and can be pretty chaotic, as you can imagine (Week 1 – 17,500 comments!). I think the fall-off in the number of comments is most usefully seen as an indication that these forums need to be handled in a different way. Perhaps dividing users into more manageable ‘seminar’ groups?

    • I agree. Without access to the actual user engagement data, I just wanted to take one aspect and show its progress.
      In fact, in the Writing MOOC I’m doing at the moment, the forums are the part I’m least interested in. But, then again, I’ve only done 1 of the 3 assignments so far too. However, since a marker of the whole “online learning success” discussion in that original ELTjam MOOC Shmooc post was interactivity, I thought I’d look at the British Council MOOC in that way for the purposes of my post.

  4. Fair enough. And users can post multiple times, of course. But your post does indicate that using the forums was an essential part of the course; students had to post their written assignments through the forums, for example.

    Also your point about being able to use the data for research is moot if the British Council don’t actually make the data available.

    I’m afraid I’m just not convinced about this project at all. It just seems like a little marketing gimmick for the British Council and not much more to me.

    • I didn’t mean to say any part of it was essential. Since getting a certificate only requires completing 50% of the course, someone could possibly achieve that with no forum participation.

      It is essential if interactivity is used as a measure of a successful ELT MOOC, which is what commenters on the original MOOC post on ELTjam suggested. I think the idea that the social side of learning online is important is starting to take hold, but I’m planning to write about that soon, so I’ll have more to say then.

      Pardon me, ever the cynic, but I think all any product is, is marketing for another product or for the company itself in some way. So, you’re probably right!

  5. Hi Nicola – I was the lead educator on the British Council’s Exploring English course and I’ve read your comments with interest. I’d certainly agree that the huge interest in and enthusiasm for Britain and British culture was one thing that came across really strongly. I’d suggest we put JK Rowling in as our Eurovision entry next year. We’d walk it.

    We’re now into the two-week grace period that all FutureLearn courses have to allow learners to finish the course, late starters to catch up, etc., and there still seems to be a fair amount of activity on the course. When that’s done and the dust has settled, there’ll be more information about the numbers.

  6. Hi Nicola,

    Firstly, great post – thanks very much for taking the time to write it.

    Nick’s original blog suggested that the key to MOOC success in ELT would be human interactivity (or what Rob Vernon describes as ‘social learning’), rather than the right-wrong paradigm of computer-generated, templated interaction. The reduced engagement in forums seems to suggest, however, that this isn’t what the learners necessarily want? Or … that forums don’t provide an adequate conduit for human interaction? What’s your take on this?

    I note also that moderators were employed. This is a good move from the BC, and reflects the lessons learned from the Udacity model. And yet, for any business or institution not subsidised by the taxpayer, this would represent a real cost that would have to be either a) passed on to the learner, or b) siphoned off from the profits of other commercial operations. This raises questions about the sustainability of moderated MOOCs, and whether their long-term destiny (that differentiates them from other, non-open online courses) is dependent on public funding.

    Finally, although this is an ELT MOOC and there was discussion in the first blog (rightly) about whether language acquisition presented different challenges for a MOOC, I’m struck by two factors which place the BC MOOC squarely within the matrix of standard MOOC projects: 1) the high attrition rate, and 2) the fact that the actual participants are at variance with the targeted learner demographic (i.e. they possess a skill set above the learning goals of the course – in the BC case, learners’ language skills were well above B1, and in undergrad-level courses run by universities, the learners who make it through to the end tend to be graduates).

    • Interesting that with MOOCs the level always ends up higher than the target. I didn’t know that. Now I think of it, I feel like my writing MOOC is a bit basic for me! But then, it’s not such a gradable skill and you can always improve the basics.

      I think I would need to go through all the forums to say whether user engagement was or wasn’t what the learners wanted. I think the question is whether they can learn without interacting in the case of a pure ELT MOOC. And no-one has tested this out yet.

      I think people prefer to interact online when it’s not a requirement and more just an aspect of the platform, e.g. Facebook, Twitter, ELTjam…

    • Hi Nicola and Brendan,

      What other forms of interaction between learners are there, besides forums? I personally don’t think they’re as useful as forum organisers would wish.

      I also read about some A/B results from an online provider (the name escapes me) who found similar attitudes – learners like self-study more. How can online providers such as MOOCs encourage better, or more directed, peer learning?

      David

      • I suppose a chat facility is the only other way I can think of, but then it’s not moderated. Maybe, as Diane says, smaller groups? As I said, people love interacting online – they just want it to be a choice, not a requirement.

  7. Very useful post Nicola, thanks. Two comments:

    I think we’ll start to move away from measuring success by the attrition rate in MOOCs set up like this – initial sign-ups will always be huge, as will the drop-out rate. Maybe we’ll just say “4K students completed this, really good.” We tend to view drop-outs as catastrophic, following the usual university model, and that’s not as applicable here.

    Regarding Brendan’s comment about student interaction and social learning vs. the right-wrong paradigm of self-study: the potential for S2S interaction via forums and other social media always seems so high, and I always get so enthusiastic about it, but I’ve seen time and time again that it doesn’t work out as hoped. For a wide range of reasons, most students seem to prefer the old Murphy gap-fill. Two examples: I hired a freelancer to write some content several years ago, whose main job was to write on one of the most visited ELT sites on the web. The site was monetised by ads, so page views were monitored obsessively. Anyway, for our content I kept pushing him to produce more creative, social, interactive learning content, and he told me that the old-fashioned discrete-answer content got about 95% of the page views on the site he wrote for, and that every time he set up more creative tasks, students ignored them.

    The other example is from a large-scale blended learning course by a major publisher for a university chain, deployed across a dozen campuses with many thousands of students. They used both traditional lexico-grammar exercises and a variety of forum and other social tools for student interaction. When they did quality surveys with students, the overwhelming result was that the Ss preferred the gap-fills.

    Now, of course it may be the case that, as Brendan pointed out, forums are not the best mode for social learning. Or that most students are just more comfortable with the way they have learned since primary school. Or that gap-fills are just easier and more mechanical, requiring less critical thinking time and effort. Or that, again, we always have a wide range of student profiles and learning preferences, and we should provide a range of task types and let the students decide what they prefer. Or that we need to be more topic-based for engagement purposes, as you pointed out. Probably a little of everything as we keep exploring.

  8. You’re right! The MOOC model is not the university one, so the drop-out/visible participation rate is less significant. I read somewhere that a lot of people like MOOCs precisely because they can stop or drop in and out when they feel like it. Hard to know what else to assess their success by as an outsider, though.

    You’re also completely right about what students like doing exercise-wise. I do the Facebook page for an exam prep site, and gap-fill grammar/vocab posts get way, way more interaction than putting up videos or asking for more creative input. There’s a reason Grammar in Use sells so well. I think it’s about what students get back… the verification of a gap-fill that took two seconds to do = the feeling of “I know English”. Participating in a forum takes longer, and there is no unambiguous confirmation to follow the output and effort.

  9. I’m coming to this discussion very late but just wanted to say this was very interesting. Over the last year or two, many people’s enthusiasm for MOOCs has really plummeted, so it’s great to get a different perspective with some data behind it, rough or not.

    As far as the attrition rate goes, there are plenty of good reasons to drop out of a MOOC that shouldn’t reflect badly on the course itself. Maybe you got exactly what you wanted out of the course halfway through. Maybe you realized the difficulty level wasn’t quite right for you. Maybe you got enough of a taste from the MOOC to jump to a paid course somewhere else. Maybe you wanted to find people to practice English with and, having done so on the forums, took your practice to a more convenient venue like WhatsApp, WeChat, Skype, etc.

    Also I think just the fact that the MOOC is free adds to the attrition rate. Maybe the data would disprove me but I think that if people were paying for the course, even just a small amount, they would value it more and be less likely to drop out. Of course fewer people would start the course if they had to pay even a dollar for it.

    I think the biggest issue MOOCs face is how to be sustainable (and/or profitable). Brendan makes a good point that paid moderation costs money, as does the technology and other assets. I have read that the average MOOC costs about $50,000 to produce (probably not including the costs of building the MOOC software platform itself). And right now MOOCs are still searching for revenue streams. They are free, to draw as many into the funnel as possible, but I’m not sure how many will convert — in this case paying for a certificate. What MOOCs want is accreditation but that has not happened yet. Of course, when / if you can get actual course credit for a MOOC then you can be sure that they’ll no longer be free!

