This blog has been pretty quiet for the last few months. I’ve been busy studying ‘An Introduction to Digital Environments for Learning’, the first module of my part-time MSc in digital education at Edinburgh Uni.
It’s been great getting back into education. Compared to last time around, it’s amazing how much more you can learn as a father of three who gets to bed at a reasonable hour and drinks responsibly.
Anyway, the last 12 weeks have been pretty mind-blowing. In no particular order, here are the top five things I think I’ve learned:
- There’s no such thing as a ‘digital native’ or a ‘digital immigrant’. Contrary to a popular myth, people born after 1980 haven’t got oddly shaped brains just because they’ve grown up using digital technology. Universities’ widespread panic that they’ll be forced to ‘change or die’ by hordes of wild-eyed teenagers wielding smartphones has no basis in evidence.
- Technology can shape education, not just deliver it. We shouldn’t start out with a traditional (face-to-face) model of education and then look for a piece of technology that allows us to ‘virtualise’ it. Technology can inform education too: for instance, look at how people across the world can collaborate through Hangouts and wikis.
- I. Hate. Second. Life. Is it a game? Is it a social network? Is it virtual reality? I’ve spent many hours in it, and I still haven’t got a clue what I’m doing. But maybe that’s just me.
- Everyone’s different. Some people learn through dialogue, and some learn in silence. Neither of these options is ‘good’ or ‘bad’, and universities have to enable choice rather than being prescriptive.
- Research, research, research. There’s an awful lot of hyperbole and conjecture around how people learn in the digital age. No single age group, region or subject area can be pigeonholed. The only way to keep learning relevant and usable is to research what works and what doesn’t with a particular group of students.