When I suggested we need an ongoing discussion with our students about EdTech, the responses varied. Some comments questioned the validity of the questions and what inferences could be drawn from them. Those comments were mostly fair, but I wonder if we’re learning as much as we can from the way the startup world does user feedback.
Sure, startups methodically measure user engagement via click-throughs and conversions, much as we conduct SLA research. But they also sit with their users and just talk about the product. The aim of these conversations is not to build a sound academic basis for theorising; it’s to gain an insight into how their users feel.
So here’s how the students I spoke to feel.
Targets
‘I want an app that helps me with something skils’
Most, though not all, of the students I spoke to are on a pathway to university study. Popular apps included IELTS 6 Trainer. Apps like Busuu and Duolingo simply didn’t get a mention; TOEIC and FCE apps were preferred. An (implied) assumption of the students is that these tests somehow have their own specific language. I mean, what is TOEIC grammar?
Content is King …
‘This one have an answer you can use it to practice.’
Dictionary apps need easy-to-understand definitions with good examples. Test-specific apps need quality model answers and explanations. Students expressed these wishes, although they didn’t seem well equipped to judge quality. Indeed, despite agreeing with their teachers that they should use an English-English dictionary, they often preferred Google Translate.
… but Function is Queen
‘You can check any word when you see it in the story.’
Students want in-situ, in-the-moment, context-relevant help. They praised graded readers and reading apps that offered explanations of words in context. They see value in that, even though they know their teachers might prefer them to try nutting out the meaning for themselves first.
Local is better
‘This app is made for Arab students so it is better for me.’
The favoured apps seemed to be ones specific to a certain culture or language group. I guess that’s no surprise: WhatsApp doesn’t have anywhere near the market penetration in Korea that it enjoys in the West, because KakaoTalk dominates there.
TOEIC tests are big in Japan and Korea, and those students chose apps marketed to them in Japanese and Korean. Sadly, none of the culturally specific apps students showed me targeted language problems specific to that group of learners.
We know it’s not perfect
‘Some students always use E-dictionary, not their brain to remember the meaning of words thus next time they still don’t know it.’
A lot of students expressed a concern about technology: that it made things too easy. Yet no students said they wanted a paper dictionary or to make their own flashcards. Some students talked about being distracted by their phones, yet none said they didn’t want to use their phone in class.
What students want vs. What teachers want
‘I can always check the answer when I want.’
Scott Thornbury joined a long list of those attempting to define principles of SLA-informed EdTech for us. Yet there is a conflict with what students liked. Students wanted model answers for their IELTS tests and word lists in their vocab apps. They wanted apps that would help them with their test (not their English). They wanted dictionary features integrated into their reading apps – technically a point of agreement, I guess – but the pull of visual prompts will lead most students to tap on a definition before using sentence clues to conjure one of their own.
What should we make of this?
What I glean from this is that you are a pretty good detective. But where are these students from, where were they asked and how old were they?
A lot can be gleaned from a chat. As you suggest, it may vary from cohort to cohort.
This group: South-East Asian and Middle-Eastern. Mostly young (20s). They were asked in class, in the library, and at home (via email).