Is the UK’s computing curriculum too focused on coding?

The government introduced its computing curriculum in 2014 to give young people the digital skills they need for the future, but is it fit for purpose?

England’s national computing curriculum, which was last revised in 2014, has been the subject of much criticism. One of the main complaints is that it is simply too academic and inflexible and does not teach learners the real-world skills they need.

Another common gripe is that it is too heavily focused on coding, a skill that is expected to become less useful over time as the activity becomes more automated in an increasingly artificial intelligence (AI)-based world.

Peter Twining, professor of education (futures) at the Open University, says: “Although computing science is as important a subject as English or maths and teaches students heuristics to help them tackle problems, it’s just too narrow as vital considerations such as digital literacy have been cut out.”

To make matters worse, whereas in the past there was a whole range of technology-related subjects available for learners to take, today computer science is the only option remaining. But the problem, says Twining, is that “we don’t all need to be computer scientists”. Instead, he says, “we need a workforce that is digitally competent” and understands issues such as ethics and how technology is likely to affect society.

“We’ve currently got an industrial age curriculum and learning approach that aren’t fit for purpose – and the last version of the curriculum made things worse by going backwards,” he says. “It became more about abstract teaching, so knowing stuff and remembering facts, rather than helping people learn in a contextual, experiential way, which is much more effective.”

Michelle Perkins, director of future talent initiatives at management consultancy Capgemini, agrees that change is necessary. “I’d like to see more diversity introduced into the curriculum in a similar way to Wales’ Digital Competence Framework, because, while we need computer scientists, the tech industry also needs a bunch of other skills too,” she says.

In fact, rather than simply offering computer science as a standalone subject, Perkins would like to see tech embedded across the entire school curriculum.

“All young people love their gadgets and they need tech skills to survive and grow, but most don’t want to get involved in programming, so they lose interest”
Michelle Perkins, Capgemini

“All young people love their gadgets and they need tech skills to survive and grow, but most don’t want to get involved in programming, so they lose interest,” she says. “To my mind, it’s a life skill but it’s only presented in one way, and so fewer people, especially girls, take it as a subject because they don’t see the correlation between tech skills and the real world.”

Instead, they tend to focus on old stereotypes of white, male programmers coding in darkened rooms, which turns them off.

But the current curriculum is not even particularly good at training computer scientists, says Fernando Hernandez, managing director for Europe at 3D printer supplier XYZ Printing. Because technology is changing at “an alarming rate”, the curriculum is “already out of date” only four years after it was created, he says. This means that the skills taught to 16-year-olds are often redundant by the time they enter the workforce.

Engaging with tomorrow’s technology

Another problem is the curriculum’s theoretical focus. Graham Hunter, vice-president of certifications at IT trade association CompTIA, says: “Although theory has to be taught and a certain amount of academic rigour is required, we would like to see the curriculum become much more practical and, rather than just focus on programming, explore and explain a range of emerging technologies.”

In fact, CompTIA would like to see 10 core skills being developed to make learners workforce-ready. These include hard, technical skills such as how to recognise and respond to cyber security threats, as well as a wide range of soft skills, such as how to work effectively in a multi-generational workforce.

XYZ Printing’s Hernandez endorses this approach. “We need the curriculum to teach students the principles that underpin technology and have them question the status quo,” he says. “Only by doing this will we give them the transferable problem-solving and innovation skills to engage with the technology of tomorrow and, more importantly, become active players in shaping it.”

Such considerations are also expected to gain importance as the AI era takes hold. As the Open University’s Twining says: “We don’t really know how things will change in future and so we don’t really know what new skills will be required. This means people will need to be effective at learning and coping with change.”

Read more about the computing curriculum

  • The number of girls taking computing subjects at GCSE or equivalent level is significantly lower than when the curriculum was introduced in 2014.
  • During the opening of Salesforce’s London Innovation Centre, minister for digital and the creative industries Margot James emphasised the importance of getting the younger generation involved in tech careers.

It is also likely they will need to demonstrate workplace skills such as creativity in problem solving, collaboration and critical thinking in order to differentiate themselves from machines.

But the problem today, says Twining, is that “most schools are not preparing children for the future”. He adds: “Current teaching techniques make you dependent, rather than independent, in your thinking and they turn a lot of people off learning completely.”

A key issue is that the education system was set up during the industrial revolution to “get people to conform”, whereas in the AI era, the opposite kind of characteristics will be required.

“Schools really should be preparing students for the real world, but the key thing that drives them is assessment and being accountable,” says Twining. “The problem there, though, is that the focus is on the wrong thing as it’s about three-hour paper-based exams that are unable to assess children’s ability to learn, solve problems, collaborate and communicate.”

As XYZ Printing’s Hernandez points out, the fact that students are still taking paper-based exams “could not be more unsuitable for a technology-based subject”, not least because learners “can appear to be proficient in the subject through knowing the theory, but lack the skills to perform basic tasks in real life”.

Effecting meaningful change

But in order to effect meaningful change, it will be necessary to do more than simply tweak the current computing curriculum. Instead, what will be required is a shake-up of the entire school system along with its existing pedagogical approaches.

On the one hand, says Twining, the focus should be less on teaching individual subjects and more on empowering children to discover things for themselves using approaches such as problem-based learning. But on the other, altering how and what is assessed in schools would serve to drive a significant “shift in practice” because of the focus placed upon it.

Another important change would be to create tighter links between what students learn and what they are likely to do in the workplace, says Capgemini’s Perkins. To achieve this, she advocates closer ties between education and the business community, which could include everything from employers having a certain amount of input into the computer science curriculum, to more spokespeople going into schools to talk about possible careers in tech.

But to create real and sustainable transformation, a major investment in teacher training would also be required beyond what has already been promised. At the end of last year, the government pledged to invest £84m to improve the skills of existing computer science teachers and treble the number of those trained to deliver the curriculum to 12,000 by 2022. It also committed to funding the creation of a National Centre for Computing, which would produce training material and provide support to schools.

“Current teaching techniques make you dependent, rather than independent, in your thinking and they turn a lot of people off learning completely”
Peter Twining, Open University

But a report from the Royal Society at the end of 2017 indicated just how much work still needs to be done even at a basic level. For example, England met only 68% of its recruitment targets for new entries into computing teacher-training courses that year. To make matters worse, over the past five years, the country has also experienced a shortfall of 1,000 computing teachers recruited into schools.

In fact, things have got so bad that, despite government targets to reduce immigration from outside the European Union, the Migration Advisory Committee has officially added computing teachers to its “shortage occupation list”, which means visa restrictions for the role have been relaxed.

Even more worryingly, of those teachers who are already teaching the curriculum, 67% feel ill-equipped to do so. Unsurprisingly, 54% of schools across England currently fail to offer computer science at GCSE at all.

But even those that do are experiencing problems. As CompTIA’s Hunter points out: “How many teachers teaching the curriculum have industry knowledge or have even studied IT as a distinct path? A lot of them end up teaching it as a secondary subject because there simply isn’t anyone else.”

Another important consideration is that, even if additional funding were to become available to support transformation in the educational system, its complexity and large number of moving parts means change takes a long time to filter through, even at the subject level. For example, if the curriculum were changed, it would also mean the final GCSE exam had to be altered, along with timetables and what is taught to learners in years eight and nine.

As Capgemini’s Perkins concludes: “While we’d all like to see more tech in the curriculum, as is the case in South Korea and Taiwan, it’s a difficult proposition because education is so much part of, and ingrained into, a country’s culture. So we’d have to think carefully about what would work in the UK – you can’t just take a model from somewhere else and drop it in, because it simply wouldn’t work.”

This was last published in July 2018

Join the conversation

5 comments

The curriculum itself is not actually focused on programming; it is split into three strands: Digital Literacy, Computer Science and Information Technology. The problem many teachers face is that they do not know how to approach the Computer Science strand and have concentrated on programming. Then there is the exam boards’ interpretation of the programmes of study, which leans heavily towards programming. The change in what is available to study at GCSE level is also a problem, as reforms have removed ICT almost entirely. All in all, it isn’t the curriculum’s fault but the reforms at GCSE level that have put us in this position.
I think, based on 25 years working in ICT after training as a Biochemist (it's an old but true saw that to succeed in ICT you probably failed to get a job in something else), that coding certainly has its place in computing education, but as a means much more than as an end. Teach kids the basics of coding and then build from that to understanding requirements, problem solving, testing &c. I've long thought subjects should be more integrated in schools, not just taught in isolation. For example, how the events of history relate to the technological changes and pressures from growing science, or how advances in mathematics and literacy led to changes in how businesses operated, which, coupled with the printing press and improved transportation, drove the Renaissance, which in turn led to a further increase in literacy and changes in the arts &c. Maybe instead of just teaching kids in history about Napier's Bones, a mechanism for extracting logarithms, show them how to write code to do the same job, then build on that to show how it can be used to solve real-world problems; link history to computing, maths, physics and the soft skills of problem solving.
The problem with discussions on computing in schools is that the basic disciplines change slowly, if at all. Meanwhile, the skills in current demand change faster than the timescale for agreeing a curriculum, let alone delivering it. Coding was fashionable in the early 1980s, when the micros in schools programme produced a generation of talent including the sound and visual effect engineers (alias real-time programmers) who enabled British bands to dominate the world. Then came the "use of IT", alias how to use Microsoft Office, which turned off the next generation. Now coding is back again. Its true value, however, is that it is a proxy for pure logic. Its great value is that it can draw in the non-academic, especially when the code is used to control physical objects and illustrate practical engineering disciplines. Add data analytics (probably best embedded in history or geography projects) and systems thinking (whose absence lies behind most IT disasters of recent years) and we have the basic disciplines needed for most of modern life - not just for "computer scientists". Meanwhile, learning how to cheat an AI-based examination system may well be the best way of cutting through the mix of hype and fudge to comprehend the reality behind the emerging "smart" world.
I agree.
Programming provides an introduction to logical and creative thinking, free of cultural bias, which in my view makes it a worthy part of the primary curriculum even if it's not in complete sync with industry trends.
