MIDI Talk Episode 08: A Visit to Chase Bethea’s Interactive World
Video game composer Chase Bethea has a simple approach that guides him through the myriad complexities of his job: “I’m always thinking about the player first,” he offers. That is a perspective he comes by honestly. “I am a player first; I’ve been playing games since I was six. Most people start (learning) their first (musical) instrument at that age, but that was my instrument.”
Born in Chicago, Bethea received his higher education in Southern California, earning an Associate's Degree in Audio Engineering from The Los Angeles Recording School, another AA in Music Theory and Composition from Moorpark College, and, finally, a BM in Media Composition from California State University, Northridge.
After finishing LA Recording School in 2007, Bethea was mixing popular bands around Los Angeles and working at Music Plus TV (which became Vlaze Media) when he came to the realization that composing music for games was a viable career option. “I started writing music in 2001, so through high school I was making tracks and experimenting with everything I could possibly think of, and people would tell me, ‘(Your music) sounds like it should be in a video game.’ I didn’t understand what that was and how to tap into it, until the IT person at Music Plus TV said, ‘Hey, this sounds like it should be in Castle Crashers,’ which was a very popular game. So I thought ‘You know what, I’ve been told this for seven years. I think I’ll look into this more.’”
Since that time, Bethea has shipped music in more than 20 games, including I Can’t Escape: Darkness, Super Happy Fun Block, Aground, Cubic Climber, and Potions Please. His Cubic Climber score earned a “Noteworthy” on Destructoid.com, and in 2016, Bethea was nominated for an Outstanding Artist–Independent Composer award from VGMO (Video Game Music Online). He also worked on pre-production for Virtual Reality Company’s Jurassic World VR Expedition, and with a half dozen projects currently in progress, it’s amazing Bethea finds the time to serve on the IASIG (Interactive Audio Special Interest Group) Steering Committee, too!
Simone Capitani from Audio Modeling pinned Bethea down for an extended discussion that took a deep dive into the process of composing music for games and VR. What appears here is only a part of the conversation; for the complete exchange point your browser to: A Visit to Chase Bethea’s Interactive World — MIDI Talk Ep. 8.
From Fruit to Flux
Bethea has used technology since he started writing music, working in Image-Line Software’s FruityLoops (which morphed into FL Studio) for years before eventually migrating to his current primary composing tool, Steinberg Cubase. His first real exposure to the requirements of composing music for games came when he was contracted to provide music for Tim Karwoski’s Electron Flux, a game for Android devices. There were many lessons to be learned, Bethea recalls, including “understanding what loops were and how they were (used), understanding the limitations of the device (on which the game would be played), and understanding how much your music is going to be (data) compressed.” He learned to generate his finished content at a high resolution, so that it would survive the often brutal bit rate reduction to its delivery format with at least a shred of fidelity. And then there was the issue of audio file formats.
“MP3s do not loop well in games; they have a gap,” Bethea explains, “so if you were to send those (to the developer), it would be jarring for the player.” (This is because MP3 encoders work in fixed-size frames and add silent priming samples to the start of the file, so the decoded audio almost never lines up exactly with the musical loop, making gapless looping impractical.) “But you can’t send WAV files, either, they’re way too big. I wasn’t using OGG files just yet, so, at the time, what I had to do was figure out a way to do a different version of the WAV. I was natively compressing the best way I could. Obviously, it wasn’t the best utilization, but it worked.”
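The arithmetic behind that gap is easy to see. Here is a back-of-the-envelope sketch in Python (the BPM and bar count are made-up example values), using the 1,152-samples-per-frame size of MPEG-1 Layer III: a loop that is an exact number of samples long as a WAV gets padded out to a whole number of MP3 frames, and those extra samples of silence land right at the loop point.

```python
import math

SAMPLE_RATE = 44100
MP3_FRAME = 1152  # samples per MPEG-1 Layer III frame

def loop_samples(bpm, beats, rate=SAMPLE_RATE):
    """Exact sample length of a musical loop."""
    return round(beats * 60 / bpm * rate)

# A hypothetical 4-bar loop (8 beats) at 120 BPM:
n = loop_samples(bpm=120, beats=8)
print(n)  # 176400 samples -- loops seamlessly as WAV or OGG

# An MP3 encoder pads the audio out to whole frames (and adds
# silent priming samples besides), so the decoded file is longer
# than the musical loop:
frames = math.ceil(n / MP3_FRAME)
padded = frames * MP3_FRAME
print(padded - n)  # 1008 samples of silence at the loop point
```

Formats like OGG Vorbis and uncompressed WAV can carry sample-accurate loop lengths, which is why they became the standard delivery choices Bethea alludes to.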
We Control the Vertical and the Horizontal
As a composer for interactive media, Bethea views his work through an entirely different lens than composers working in linear media like film or TV. “You know where a movie is going to go. We design the game, but we never know what the player is going to do or at what speed, so things need to adapt to enhance the player experience overall,” he elucidates. “You really need to think in a design format to comprehend it, and this is what can trip up a lot of composers, because they typically won’t have that design mentality. You need to plan out what you’re going to do before you do it. Then, if the game needs an orchestra, you have to adapt to those things: you already wrote the music, you designed it, you designed the different layers – the vertical, the horizontal – but now you need an orchestra to perform it. It’s like an onion, with layers and layers.”
(Horizontal and vertical composition are two primary techniques used to create adaptive music. Horizontal composition, often called re-sequencing, stitches together fully composed and produced chunks of music, where the order of the chunks changes depending on gameplay. In vertical composition, or layering, the music is written in multiple synchronized stems that are brought in or out to change the texture and feeling in response to gameplay. The two techniques are commonly mixed and matched at different points in a game.)
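The two techniques can be sketched in a few lines of Python. Everything here is illustrative: the state names, transition table, and stem names are hypothetical, not taken from any of Bethea's scores or from a particular middleware API.

```python
import random

# Horizontal re-sequencing: pre-composed chunks whose playback
# order follows game state. Each chunk lists the chunks that may
# legally follow it (so transitions stay musical).
TRANSITIONS = {
    "explore": ["explore", "tension"],
    "tension": ["explore", "combat"],
    "combat":  ["combat", "victory"],
    "victory": ["explore"],
}

def next_chunk(current, game_state):
    """Jump to the chunk matching the game state if that transition
    is allowed; otherwise pick any legal successor."""
    options = TRANSITIONS[current]
    return game_state if game_state in options else random.choice(options)

# Vertical layering: synchronized stems faded in/out over one
# shared timeline. Map a 0.0-1.0 intensity value to per-stem gains.
def layer_mix(intensity):
    return {
        "pads":       1.0,                            # always present
        "percussion": min(1.0, intensity * 2),        # enters early
        "brass":      max(0.0, intensity * 2 - 1),    # only at high intensity
    }
```

In practice a middleware engine (FMOD, Wwise, or a custom system) handles the sample-accurate crossfades; the composer's "design mentality" Bethea describes is deciding these structures before a note is recorded.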
The 11-Day Virtual Sprint With Dinosaurs
Media production is typically performed in a high-stress, fast-paced environment, but projects involving cutting-edge technology carry the added challenge of unforeseen issues cropping up, and interactive media piles on constant changes to the fundamental design and structure of the project. The biggest and coolest projects tend to be the craziest, and so it proved to be with Bethea’s work on pre-production for The Virtual Reality Company’s Jurassic World VR Expedition.
“It was an 11-day sprint; I only had 11 days to conceptualize and get approved assets for this iconic IP (Intellectual Property). I have to say, it was pretty challenging,” recalls Bethea in a tone of awe. “(The project was being done) in the Unreal engine. I brought my hard drive of sounds and music things, and was trying to conceptualize those sounds that everybody knows.
“I’m in meetings every day, I’m driving down into Los Angeles, but I was not familiar with what pre-production was. Pre-production is something that radically changes almost every two hours! ‘We think we want this. OK, whatever meeting we had? We’re not doing that anymore. Now we’re doing this. Tomorrow, we’re doing this plus three other things. Oh, but, by the way, you better be in that meeting to do that, too, AND you’ve still got to get the work done.’ In 11 days!
“I freaked out for the first five days. I even went in on a weekend, but that weekend saved me, because when I did that, I actually finished a day early! I’m flying through Cubase doing these things and implementing the music into the system and giving feedback and testing the VR technology, finding limitations like: it doesn’t accept 24-bit (audio), it can only work with 16-bit. And it can have WAV files, but how do they interact with the nodes in the blueprint system? And using the hierarchies and the workflow of the repository, so that everyone is getting the check-ins and things are working together. You do the music, push it to the repository, demo it on the headset, listen, figure it out, it’s good, move on to the next thing, rinse, repeat, rinse, repeat. Long, long days, but good experience, I was pretty proud to finish in that time, and it was the most creative experience I could ever ask for. I would do it again; it was actually really great.”
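The 24-bit limitation Bethea ran into is a common one, and the workaround is a downconversion to 16-bit before the asset goes into the repository. A minimal sketch of that step, using only Python's standard-library `wave` module (real tools apply dither before truncating; this shows just the bare conversion, and the file paths are placeholders):

```python
import wave

def wav_24_to_16(src_path, dst_path):
    """Downconvert a 24-bit PCM WAV to 16-bit by dropping the
    low byte of each sample (truncation, no dither)."""
    with wave.open(src_path, "rb") as src:
        assert src.getsampwidth() == 3, "expected 24-bit input"
        params = src.getparams()
        raw = src.readframes(src.getnframes())

    # Each 24-bit little-endian sample is 3 bytes; keep the top two.
    out = bytearray()
    for i in range(0, len(raw), 3):
        out += raw[i + 1:i + 3]

    with wave.open(dst_path, "wb") as dst:
        dst.setparams(params._replace(sampwidth=2))
        dst.writeframes(bytes(out))
```

A real pipeline would add TPDF dither to avoid truncation distortion, but for quick check-in/demo/iterate loops like the one Bethea describes, even the bare conversion keeps the engine happy.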
Chase’s AI Sidekick
As a composer deeply enmeshed in technology and having to produce creative content in short timeframes, Bethea has some thoughts on how he’d like to see technology serve him better. “I have had some epiphanies of what I would like to have,” says Bethea as he lays out his dream. “I would like an AI assistant. I would love to design a product where, when I’m writing music and I know my weaknesses, I can ask the AI assistant, ‘Hey, with this Eb minor can I do this?’ And I play it, and it helps me along the way. ‘Well, actually, I found some stuff online and I thought that you might do this, let me pull this in.’ It enacts a MIDI drop and says, ‘Do you like this?’ and I’ll say ‘No, I don’t think I like that, but what if I did this instead?’ You can come up with some really different things. Our brains and our minds can only absorb so much in a day. I can only have so many of the books behind me (gesturing to a bookshelf in the background) that I can read, but if (the assistant is) reading that stuff for me, and saying, ‘You mentioned that you like this person for inspiration. Did you know that they used this melody style or this theory set for this?’ ‘No, I didn’t.’ – that would be really, really cool. I think it would be dangerous, but it would be cool at the same time. I conceptualize it as being better than Google Assistant, but for music.”
Modeling’s Massive Difference
Having written for both electronic and orchestral instruments, Bethea has great appreciation for the strengths of the modeled instruments Audio Modeling produces and is enthused by his experience with them. “They’re so great. Wow. I was a conductor’s assistant, so I was able to be around an orchestra every single week for, like, two years, and hearing the technology of how you have the expression really down and the vibratos in the instruments…it’s incredible. I’m really, really, really, really impressed. A few of my composer friends said, ‘You have got to try this and have it integrated.’ And it really makes a massive difference with the musicality. Obviously, nothing beats live musicians, but this is the second best thing and they can sit next to each other. I would love a piano version, a supernatural one. There’s so many great, great products that you’re doing, and it’s fantastic.”