Richard Jacques & James Hannigan Interview: Panel at Barbican Centre
On May 18, 2011, I attended and assisted with a panel discussion at the Barbican Centre in London, right next to where I study at the time of writing (the Guildhall School of Music and Drama). At the time, the Barbican were also displaying an exhibit of the work of multimedia artist Cory Arcangel, so to coincide with this they decided to hold a panel discussion on the artistic merits of video game music and its evolution over the years.
The organisers at the Barbican contacted me in January through the Guildhall, asking if I knew of anyone who would be interested in taking part in this discussion and able to offer valuable insight into the medium. I sent them an email with a long list of names of composers, journalists and authors, and they booked three of those people. The panelists were Richard Jacques and James Hannigan (arguably the UK's two most successful current game composers), and the panel was chaired by Professor James Newman. Newman is an author and university professor at Bath Spa University who studies, researches, and lectures on video games and the culture surrounding the medium, as well as other forms of new media. He has also worked in collaboration with Nottingham Trent University on projects such as GameCity and the National Videogame Archive (part of the National Media Museum). For a university professor, he's got a really good sense of humour and made everyone laugh several times.
The discussion topics ranged from the interactivity that makes video game music unique and the evolution from generative/algorithmic scores to full orchestral ones, to licensing music, making linear soundtrack albums, and the challenges of composing music for modern video games.
James Newman opened the discussion with the necessary introductions and then showed a clip of Metal Gear Solid 4, demonstrating the direction that modern video games, particularly Western ones, seem to be taking: becoming more cinematic in nature. Many of the most popular modern video games feature guns, gritty settings, and big explosions, stuff that wouldn't be out of place in a big summer blockbuster film, and this is the way the music has gone too. The Metal Gear Solid series itself, whose lead composer since the second entry has been Hollywood heavyweight Harry Gregson-Williams, only proves his point further.
However, he did emphasise that, despite this direction, video games are still a non-linear medium no matter what form they take, and that is what makes video games, and therefore video game music, unique. These days, many gamers tend to think of non-linear video games as games with branching storylines, multiple possible endings and choices that affect the overall outcome, the kind of elements made popular in several of BioWare's franchises such as Mass Effect and The Old Republic. However, it doesn't matter whether you're taking part in a shoot-out, racing a car, investigating a mystery, building a city, playing a sport, leading and directing an army, or jumping from platform to platform: if it's a video game, it's non-linear, because you're controlling what happens on screen to a certain extent.
Even the original arcade game Dragon's Lair, the most linear game I can think of, has some element of interactivity about it: if you don't press the right direction or the right button at the right time, the main character Dirk gets killed. This presents two possibilities, that you progress to the next part of the game or that Dirk dies in some comical way, and the music has to adapt to either possibility seamlessly. This is interactivity in its simplest form, and how developers and composers make music interactive beyond that is where it gets interesting.
What I just described is known as re-ordering: because there are several things the player could do at any one time, the pieces of music have to be relatively interchangeable depending on what could happen next. This is seen in pretty much every video game ever made.
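A minimal sketch of re-ordering in Python may make the idea concrete. The cue names and structure here are entirely my own invention for illustration; in practice games use audio middleware with authored transition rules rather than a hand-rolled picker like this:

```python
import random

# Hypothetical cue names grouped by game state. Because any segment may
# need to follow any other, each cue would be written in the same key
# and tempo so the join sounds seamless.
SEGMENTS = {
    "explore": ["explore_a", "explore_b", "explore_c"],
    "success": ["advance_sting"],
    "failure": ["death_sting"],
}

def next_segment(current_state: str) -> str:
    """Pick the next interchangeable cue for the current game state."""
    return random.choice(SEGMENTS[current_state])
```

In the Dragon's Lair case, the game would call `next_segment("success")` or `next_segment("failure")` the instant the player's input is judged, and the chosen cue starts on the next bar line.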
The other basic form of interactivity used a lot in Western games is layering, which was talked about and demonstrated in a number of different ways. Richard Jacques demonstrated it by showing a clip of his most recent score, James Bond 007: Blood Stone, and then by embarrassing me (to be fair, I did volunteer). He opened up a track from the game, got me to dress up in a white shirt and white jacket, gave me a Super Soaker pistol, and had me act out being James Bond in three different states: no action/stealth, tension, and full-on action. While I did this, he changed the music by crossfading between layers in Vegas Pro 10 depending on what I was doing. I moved between the states quite gradually, as would happen in a video game, and then, while running to the front of the room during a 'full-on action' state, I dived and made a spectacular pratfall in front of everyone, slightly ripping the jacket in the process. A failed stunt which is now stored in the Barbican Centre archives! Still, it was all good fun, and it demonstrated that interactive music is actually simpler than it may seem at first in terms of how a track is implemented in a game and how it works.
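The crossfade Jacques performed by hand can be sketched in a few lines of Python. The stem names, mix targets and fade rate below are my own invented illustration, not values from Blood Stone:

```python
def layer_targets(state: str) -> dict:
    """Target volume (0.0-1.0) for each stem in a given gameplay state."""
    targets = {
        "stealth": {"stealth": 1.0, "tension": 0.0, "action": 0.0},
        "tension": {"stealth": 0.3, "tension": 1.0, "action": 0.0},
        "action":  {"stealth": 0.0, "tension": 0.4, "action": 1.0},
    }
    return targets[state]

def crossfade_step(current: dict, target: dict, rate: float = 0.1) -> dict:
    """Move each stem's volume a fraction of the way toward its target.

    Called once per audio tick, this produces a gradual crossfade
    rather than an abrupt switch when the gameplay state changes.
    """
    return {stem: vol + (target[stem] - vol) * rate
            for stem, vol in current.items()}
```

All three stems play in sync the whole time; only their volumes move, which is why the music never has to "restart" when the player goes loud.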
James Hannigan, meanwhile, demonstrated his BAFTA-nominated score for the strategy game Republic: The Revolution in action, which involves both re-ordering and layering in a vastly complex network of possibilities. The track that plays over the basic map screen, where players make their moves and decisions, is quite 'droney', but the music subtly yet noticeably changes via layering, fading and building depending on several factors including the time of day and the mood of the game. When a cutscene is triggered, on the other hand, the music that accompanies it is linear, similar to how a film or television show would be scored.
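Driving the mix from simulation variables like those can be sketched in Python too. The stem names and the mapping below are hypothetical examples of the technique, not details taken from Republic: The Revolution:

```python
def ambient_mix(hour: int, mood: float) -> dict:
    """Derive stem volumes from game variables (hypothetical mapping).

    hour: 0-23 in-game clock; mood: 0.0 (calm) to 1.0 (unrest).
    The base drone always plays, while night and unrest layers
    fade in and out with the state of the simulation.
    """
    is_night = hour < 6 or hour >= 20
    return {
        "drone": 1.0,
        "night_pad": 0.8 if is_night else 0.0,
        "unrest_pulse": round(mood, 2),
    }
```

The point is that no one "chooses" a track: the score is a function of the game state, so it shifts whenever the simulation does.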
For this reason, Hannigan and Jacques both agreed that making soundtrack albums is difficult, as there is a lot of editing involved. Furthermore, they mentioned that the music often loses some of its impact when heard out of context. I always try to bear this in mind when writing reviews of soundtracks. I think that Eastern game music is often easier to release as a soundtrack than Western game music, because many Eastern composers tend to write very melodic music, whereas current Western composers tend to focus on making more interactive background scores. Eastern composers do use interactive elements in their music, but differently and more obviously; for example, Super Mario Galaxy 2 has many tracks with different states or variables, but they are used in overt ways, such as the music getting faster and higher-pitched as you speed up on the sliding or ball-rolling levels. In contrast, Western composers tend to use interactivity a lot, but more subtly.
Perhaps the most interesting part of the discussion was Richard Jacques' and James Hannigan's thoughts on generative/algorithmic music, or what we now know as 8-bit or chiptune music. Neither of them is too keen on using chiptune music over the current orchestral sound palette, because they both find it counterintuitive, so the nostalgia for 8-bit music that has grown in recent years among people who grew up in the '80s and '90s has come as a surprise to them. To be fair to both of them, this attitude comes from a history of having no choice but to work with 'outdated' sounds, when both composers wanted to explore more 'realistic'-sounding music. This is a big contrast to what Bear McCreary said in Matt's interview with him: that the best and most melodic video game music is, or started out as, 8-bit music, because all composers had to work with were basic sounds, which forced them to overcompensate on the melody front. This is perhaps why people remember the chiptune music that James composed for the Cloudy With a Chance of Meatballs video game much more than the rest of the soundtrack; he said he gets a lot of emails about it.
One girl in the question-and-answer session at the end actually said to both of them, "Please don't ever forget your roots". It's interesting that they feel the way they do because, according to James, video game music became less respected in the mid-'90s, with the PlayStation, Nintendo 64 and Dreamcast era, when video games started using MIDI music, as the early MIDI cards sounded like really bad imitations of real instruments. However, many people knew it was only a matter of time before video games would advance to the point where full orchestral scores became commonplace.
Overall, this was a very interesting discussion of the processes and thinking behind the music we hear in video games. Other topics covered included the use of licensed music, with Wipeout as an example, and the various challenges composers may encounter; James mentioned once being challenged to write music for a financial spreadsheet screen in a management sim. It's a very creative process that requires knowledge and skills beyond traditional music theory and composition, but one that is immensely rewarding when it succeeds.
Posted on May 18, 2011 by Joe Hammond. Last modified on February 28, 2014.