When I started University in 2008, I was obliged to take an Introduction to ICT course in the first year of my Digital Media major. We spent a large amount of time in this course learning how to use Microsoft Office 2003. We sat in the computer lab, working our way through tedious worksheets which taught us how to format text, how to use tools like the Format Painter, and how to plug things into Excel spreadsheets. When we finished a worksheet, we had to have our work checked by a supervisor, and then we would proceed to the next one, which usually covered another way to do the same menial task.
There were people taking this course who had never used a computer before. They were struggling with the feeling that they were never going to be able to remember all these steps; that the minute they moved on to the next course, all their computer knowledge would desert them and they would be useless.
They weren’t being taught to read; they were just told to memorise the story.
At the other end of the spectrum were people who, like me, couldn’t get out of that lab fast enough. Unfortunately for us, there was no practical way to skip ahead by proving your competency in using a computer. Doing so required that you be able to demonstrate every possible combination of techniques for aligning table cells, say, rather than just the one you needed to get the job done. In other words, you had to have memorised almost every single feature of Microsoft Office 2003.
That course was a complete waste of time for all but one or two of the six hundred or so students who took it. I can say this for a few reasons, not least of which is that two years later, Office 2010 arrived at the University with its redesigned interface, and all those painfully memorised steps became mostly useless. The lecturers responsible for that particular course, having tied themselves so strongly to step-based skills in a particular software package, were still teaching Office 2003 when I left in 2011. I wonder what 2011’s class will think of the skills they learnt when they graduate, twelve years after that software was released.
I think they might just feel a little cheated.
Likewise, students who graduated high school around 2010 might be justified in feeling slightly cheated in their ICT education. They were probably taught to use computers in a lab at school, where machines were laid out for them in rows (a situation which hardly reflects the real world but, for some reason, persists in educational institutions). They were probably taught how to do basic word processing and how to use spreadsheets. They probably gave a PowerPoint or Keynote presentation or nine during their schooling years, and they most probably never encountered a touch screen.
Unfortunately for those students, the computing world has changed: not completely, but enough to make many of their skills obsolete if they acquired them by following task-driven sets of instructions from teachers or textbooks.
Some of those students would have learnt that in order to achieve task x, you first click on the File menu, and then click Open, and so on. They probably never got beyond this because they didn’t have the interest or the time to explore, and their teachers didn’t have the deep knowledge needed to teach students how to read an interface and find out for themselves how to use it. These students were probably never asked “Where would I find the Save feature? Why? Why is it called a File menu?”
They were not learning to read. They were memorising stories.
Some students would probably have been curious and interested enough to fiddle around and learn things. These students probably ended up doing most of the skill teaching during classes: having finished early, they had nothing better to do than help the people around them. These students probably understood how to use an iPhone before the school had even decided to ban them.
The first group of students is going to have to learn everything all over again when they hit a workforce which is suddenly employing tablets and smartphones which require a different paradigm — a paradigm that their teachers could not possibly have taught them at school, because it didn’t exist in the educational world then. They will probably follow their comfortable pattern of learning the steps this time too. They are, unfortunately, doomed to repeat this process every time a new paradigm comes along.
The second group of students won’t have this problem. The difference: curiosity, and the self-motivated desire to fiddle around and learn things; the desire, having learnt one way, to learn a better one.
Which group do you think is going to have more desirable skills for an employer? The group that learns and even improves a system off their own bat, or the one that requires a half-day workshop every time a new version of Microsoft Word comes along?
ICT teachers have an interesting job. They spend their time (or they should) preparing students to use tools that don’t exist yet. To do that successfully, we need every student to be in the second group — the curious, tinkering one. We don’t have that yet — not even close.
Our current approach is akin to a teacher who, rather than teach children to read, simply asks them to memorise the stories in the book.
We have got to stop pretending that computer skills can be taught. They cannot. They must be learnt.