Computer Science is not the answer to everything
In recent years there has been a lot of talk about how to fix education to prepare young people for careers in growing tech sectors such as games development. The Livingstone-Hope review has been a catalyst for Michael Gove's reform of ICT in secondary education, aiming to create a deeper computing education that instils foundational knowledge early in students' school careers.
I fully support this. I was a victim of 'ICT' lessons myself, so I know how boring and pointless they often were. However, I do not agree with the seemingly widespread consensus that getting more young people onto computer science degrees is a magic bullet.
It wouldn’t provide 90% of students with the breadth of skills they need for where the industry is probably heading. It’s not a pragmatic response to the problem. And it narrowly focuses on the tech side of game development, even though proficient and competitive artwork is as important as ever.
The industry is changing
It’s no secret that ‘traditional’ games have become more and more complicated and costly to make over time. Back in the late ’80s and early ’90s, it was still a big step for games to be programmed in a third-generation language like C++. For the majority of big developers, it became the norm to create technology, tools and engines in-house to serve the specialist needs of a single franchise. Doing this requires a high level of technical skill, from understanding hardware architecture right up to OOP.
However, this model is becoming unworkable. The retail packaged goods model is in decline. Studio closures and widespread job losses have been frequently reported lately, while the output of the remaining studios is increasingly homogeneous. In addition, Moore’s Law is charging forward at full force. Apple releases a new phone or tablet every year that is more powerful than the last, making the five-to-seven-year hardware cycles of consoles past seem positively limp.
The way games get made, and the skills required, have already started to change.
The two strands
I believe that the development industry is splitting into two strands: technology vendors and games developers. Games are now so complicated that, for most studios, it is no longer economically viable to develop both the technology and the games themselves. New technology released by specialist vendors is simplifying the technological side for studios that design games.
I have seen this first hand. When I started Remode in 2007, if you wanted to make a half-decent real-time 3D game there were really only two options: license Unreal or an equivalent engine for big bucks, or do your own tech/heavily modify something open source. Fast forward five years to 2012, and the landscape has changed dramatically. At Remode we use Unity Pro to develop games for growing open platforms such as phones, tablets and the web. Technologically and financially, it’s a far superior solution. I would now only consider developing in-house tech for fledgling technology such as HTML5.
Chris Anderson argues in Free that online and digital businesses are predisposed to a freemium model. He describes how the economics of free lead to massive skews in market share that allow one or two key players to dominate (think Google or Zynga). This is exactly what we are seeing with Unity and Unreal. Unity have captured huge market share, creating a downward pressure that has caused Epic to review their licensing terms and give away a basic version of their engine for free. They are creating a de facto standard for 3D games development in the same way that Flash did for 2D web games development.
Fundamental market changes mean alterations in the demand for skills. In the future, developers will no longer be able to wage a technological arms race with one another. Instead they will have to rely on creativity to innovate and compete.
In other words, it will get easier to make games but harder to make good games.
For me, this means that computer science grads are not the flawless hire that they used to be.
What compsci doesn’t teach
I think we need to remind ourselves that computer science is hardcore tech. It’s understanding how the computer works from the hardware, through binary, assembly and compilers, right up to third-generation languages and OOP; highly relevant for large engine vendors, less so for most games developers. What are students not learning? What is being left out that could help create these new great games?
1. Cultural Literacy
An understanding of culture, including pop culture, and the ability to work that understanding to one’s advantage. Cultural literacy would lead to more interesting game designs rather than poor, buggy, stereotypical rehashes of existing games. Culturally literate coders might not have advanced art skills, but they would be able to tell whether something was simple and good (e.g. Geometry Wars) or simple and naff (e.g. most of XBLIG and the App Store). They would think critically about where their product sits in a sea of other games. I did an art-tech hybrid course myself and learnt the hard way that the ability to question ‘why’ is a difficult skill to master.
2. Teamwork
Teamwork within a development team is something many grads lack. It means organisation, social skills and objective problem solving. As I have learnt from interviewing, hiring and running my team at Remode, you can have the best practical skills in the world, but if you cannot work well in a team you are a huge risk.
3. The opportunity to fail earlier
We are all human and as such we all learn from our mistakes (well, hopefully!). Using more standardised, higher-level tools such as Unity earlier on means more prototyping, faster. That accelerates the journey from solid OOP skills to better creative output.
The games coders of the future will be an evolution of what we consider the ‘game designer’ to be today. Development is already changing at a fast pace, and it will take time to see how extensive these changes prove to be. At Remode we have already begun running and structuring the tech team in this way, and working with modern-minded, visually literate coders has been a rewarding experience that has led to better-quality games in less time.
Putting three years to good use
Games degrees are ripe for a revitalised creative-tech crossover. We need to wash away the ’90s multimedia monstrosities, but still cross-pollinate with computer science to make courses more relevant to where the industry is going. This seems to be a controversial opinion, given how outspoken some senior developers are about computer science as a subject, and how “mickey mouse” games-specific degrees seem to be in the eyes of the industry.
It is too easy for AAA programmers to criticise today’s students. They have had twenty, sometimes thirty, years to learn low-level 3D graphics or networking pipelines gradually throughout their careers. Degrees are still three years long, as they were in the ’80s. Is it really fair to expect students to acquire over twenty years’ worth of knowledge in the same amount of time?
If you are an undergraduate or university applicant reading this and are looking to get into games development, then you have some interesting choices to make. I know coders in AAA/console development who get paid no more than I pay Unity coders, for work that is apparently more technically demanding.
Things have changed so much in the five years since I started Remode. Some programmers out there will see what I am saying as pure heresy. I am not necessarily trying to slam computer science entirely. If you work hard at a good computer science degree, then the chances are you will be a good coder. I am just trying to make the point that it is not perfect, and I believe it will become increasingly imperfect for the majority.