
Computer Science is not the answer to everything

By Martin Darby on March 13, 2012

This is reproduced with permission from the blog of Martin Darby, creative director at Remode studios.


Shared under CC by Michael Surran

In recent years there has been a lot of talk about how to fix education, to prepare young people for a career in growing tech sectors, such as games development. The Livingstone-Hope review has been a catalyst in the reform of ICT in secondary education by Michael Gove, to create a deeper computing education that instills foundation knowledge early in students’ careers.

I fully support this. I myself was a victim of ‘ICT’ lessons, so I know how boring and pointless they often were. However, I do not agree with the seemingly widespread consensus that getting more young people into computer science degrees is a magic bullet.

It wouldn’t provide 90% of students with the breadth of skills they need for where the industry is probably heading. It’s not a pragmatic response to the problem. And it narrowly focuses on the tech side of game development, even though proficient and competitive artwork is as important as ever.

The industry is changing

It’s no secret that ‘traditional’ games have become more and more complicated and costly to make over time. Back in the late 80s/early 90s, it was still a big step for games to be programmed in a third-generation language like C++. For the majority of big developers, it became the norm to create technology/tools/engines in-house, to serve the specialist needs of a single franchise. Doing this requires a high level of technical skill, from understanding hardware architecture right up to OOP.

However, this model is becoming unworkable. The retail packaged goods model is in decline. Studio closures and widespread job losses have been frequently reported lately, while the output of the remaining studios is increasingly homogeneous. In addition, Moore’s Law is charging forward at full force. Apple releases a new phone or tablet every year that is more powerful than the last, making the 5-7 year hardware cycles of consoles past seem positively limp.

The way games get made, and the skills required, have already started to change.

The two strands

I believe that the development industry is splitting into two strands: technology vendors and games developers. Most games are so complicated that it is no longer economically viable to develop both technology and games. New technology released by specialist vendors is simplifying the technological side for studios that design games.

I have seen this first hand. When I started Remode in 2007, if you wanted to make a half-decent real-time 3D game there were really only two options: license Unreal or an equivalent engine for big bucks, or do your own tech/heavily modify something open source. Fast forward five years to 2012, and the landscape has changed dramatically. At Remode we use Unity Pro to develop games for growing open platforms such as phones, tablets and the web. Technologically and financially, it’s a far superior solution. I would now only consider developing in-house tech for fledgling technology such as HTML5.


Chris Anderson argues in Free that online and digital businesses are predisposed to a freemium model. He describes how the economics of free lead to massive skews in market share that allow one or two key players to dominate (think Google or Zynga). This is exactly what we are seeing with Unity and Unreal. Unity have captured huge market share, creating a downward pressure that has caused Epic to review their licensing terms and give away a basic version of their engine for free. They are creating a de facto standard for 3D games development in the same way that Flash did for 2D web games development.

Fundamental market changes mean alterations in the demand for skills. In the future, developers will no longer be able to wage a technological arms race with one another.  Instead they will have to rely on creativity to innovate and compete.

In other words, it will get easier to make games but harder to make good games.

For me, this means that computer science grads are not the flawless hire that they used to be.

What compsci doesn’t teach

I think we need to remind ourselves that computer science is hardcore tech. It’s understanding how the computer works from the hardware, through binary, assembly and compilers, right up to third-generation languages and OOP; highly relevant for large engine vendors, less so for most games developers. What are they not learning? What is being left out that could help create these new great games?

1. Cultural Literacy

An understanding of culture, including pop culture, and the ability to work that understanding to one’s advantage. Cultural literacy would lead to more interesting game designs rather than poor, buggy, stereotypical rehashes of existing games. Culturally literate coders might not have advanced art skills, but they would be able to tell whether something was simple and good (e.g. Geometry Wars) or simple and naff (e.g. most of XBLIG and the App Store). They would think critically about where their product sits in a sea of other games. I did an art-tech hybrid course myself and learnt the hard way that the ability to question ‘why’ is a difficult skill to master.

2. Teamwork

I think development teamwork is something many grads lack. This means organisation, social skills and objective problem solving. As I have learnt from interviewing, hiring and running my team at Remode, you can have the best practical skills in the world, but if you cannot work well in a team you are a huge risk.

3. Experience

Poster by Catherine Anyango

The opportunity to fail earlier. We are all human, and as such we all learn from our mistakes (well, hopefully!). Using more standardised, higher-level tools such as Unity earlier on means more prototyping, faster. This speeds up the learning process and shifts the emphasis from solid OOP skills to better creative output.

The games coders of the future will be an evolution of what we consider the ‘game designer’ to be today.  Development is already changing at a fast pace and it will take time to fully realise how extensive these changes prove to be.  At Remode we have already begun running and structuring the tech team in this way and I have found working with modern-minded coders who are visually literate to be a rewarding experience that has led to better quality games in less time.

Putting three years to good use

Games education degrees are ripe for a revitalised creative-tech crossover. We need to wash away 90s multimedia monstrosities, but still cross-pollinate computer science to make it more relevant to where the industry is going. This seems to be a controversial opinion given how outspoken some senior developers are about computer science as a subject, and how “mickey mouse” games-specific degrees seem to be seen by the industry.

It is too easy for AAA programmers to criticise today’s students. They have had twenty, sometimes thirty years to learn low-level 3D graphics or networking pipelines gradually throughout their careers. Degrees are still three years long, like they were in the 80s. Is it really fair to expect students to acquire over 20 years’ worth of knowledge in the same amount of time?

If you are an undergraduate or university applicant reading this and are looking to get into games development, then you have some interesting choices to make. I know coders working in AAA/console development who get paid no more than I pay Unity coders to do work that is apparently less technically proficient.

Things have changed so much in the five years since I started Remode. There are some programmers out there who will see what I am saying as pure heresy. I’m not necessarily trying to slam computer science entirely. If you work hard at a good computer science degree, then the chances are you will be a good coder. I am just trying to make the point that it is not perfect. I believe it will become increasingly imperfect for the majority.

About Martin Darby

  • The purpose of this article was to identify the needs of the emergent *majority*, not split hairs over how a *minority* of enterprising programmers manage to create something completely left-field.


    “The underlying thrust however of the article is the main problem. You seem to be arguing that we won’t need technical teams because we can just use pre-built engines like Unity. This frankly, is ridiculous.”

    -No, I am not. I am saying that the role of the technical team will probably start to change.

    You miss my point about computing power. Graphics technology hasn’t taken any fundamentally huge leaps in a while, and in that time silicon catches up. You can now create current-gen-looking stuff with realtime shadowing, shaders etc. in Unity; 5 years ago the only way to do that at a reasonable speed was lower level. You talk about lengthy console cycles: I think that has more to do with market risk than whether it “can be done”. I think a good example of what I am talking about could be something like Minecraft, or Runescape, which are written in Java. But does that have anything to do with the merit or cultural relevance of the idea that draws attention from millions of adoring fans? No.

  • I think that Martin’s point is that we are heading into a “post-platform” world. We are moving past the stage where we have to design our own cameras or lenses (as Stanley Kubrick did for Barry Lyndon) and reaching the stage where the game-maker’s craft is about using known tools with flair and talent.

    We will be designing mechanics, story, artstyle and experiences, not raw AI or code.

    More importantly, when people no longer think they have to make a new engine every time, games will be cheaper to make, and hence more innovative.

    I’m inclined to agree with Martin.

  • The idea that team work isn’t taught on Computer Science degrees is laughable. Team Projects form a core part of most CompSci degrees. It’s important in far more industries than games development.

    The underlying thrust however of the article is the main problem. You seem to be arguing that we won’t need technical teams because we can just use pre-built engines like Unity. This frankly, is ridiculous. Every single game that has made a significant contribution to pushing games forward to improve the AI, to improve the graphics significantly, to design new modes of play requires a new or altered engine. If you look where the innovation is in the industry, it’s right next to the people writing their own engine.

    “I did an art-tech hybrid course myself and learnt the hard way that the ability to question ‘why’ is a difficult skill to master.” 

    Is it possible that this is taught on other courses? I think so. 

    Moore’s Law has nothing to do with the ability to use pre-built engines over and over again. In fact despite Moore’s Law we are finding that this console cycle has lasted longer than either of the two previous. This longer cycle length (which is the opposite of the prediction you make) is the reason that engines can be re-used so effectively.

    Finally, the initial point you raise is also wrong. It’s about getting more people to develop content rather than simply use it. ICT teaches the use of Office software right now, not any useful development skills for either a Unity coder or a computer scientist. It’s true that it would be nice if `Art` did CG as well as conventional art, but that’s a separate argument.

    Lastly, on a personal view, I’m not sure games development is really a University course. I’m not saying it’s not a good course, but is it really a University course? Does the research and non-industry based environment really benefit the applied content that games development should contain (such as the use of recent engines)?

  • Hugely agree with your point on the need for more creative-tech crossover courses. Even outside of the games industry, all software solutions today demand a much more intuitive UI, and in some cases the feel at the front end is what sells the solution!