what did I do?

Another official peeky peek of what I’m working on once removed.

https://tech.fb.com/facebook-reality-labs-replica-simulations-help-advance-ai-and-ar/

As the AI bot moves through the room, it navigates past two couches, a coffee table, and some small tan chairs. Zipping over the area rug, it heads toward the keys that researchers asked it to find.

In this case, though, the environment is a digital simulation that is part of Replica, a research project created by Facebook Reality Labs (FRL). Replica is a photo-realistic re-creation of 18 sample spaces, such as an office conference room and a two-story home set up by researchers.

FRL made these virtual spaces to help AI researchers teach machines about the real, physical world — an important step in developing more capable real-world assistants as well as next-gen augmented reality (AR) and virtual reality (VR) experiences. The idea is that if researchers can train an AI system to locate a virtual missing set of keys in a lifelike digital living room, it will eventually enable a real-life robot to do the same with real keys in a real room, too. And if AR and VR applications can learn how to interact with different physical environments, then one day we’ll be able to use our photo-realistic digital avatars to drop in on a family birthday party halfway across the world.

But researchers believe these systems will learn best if the simulated environments capture the subtle details — like mirror reflections and rug textures — necessary to make them virtually identical to the real thing. That’s why FRL created Replica.

“The Replica data set sets a new standard in the realism and quality of 3-D reconstructions of real spaces,” says Julian Straub, an FRL Research Scientist who worked on creating Replica. Straub studied electrical engineering in Germany before earning his PhD in computer science at MIT and eventually joining FRL to work on machine perception. As described by its Chief Scientist, Michael Abrash, FRL’s mission is to develop the technologies needed to establish AR and VR as the next computing platform. Projects like Replica will play an important part in realizing that vision.

The accuracy and fidelity of Replica’s digital re-creations come from the combination of a well-engineered and integrated camera rig with a high-accuracy depth capture system, a state-of-the-art simultaneous localization and mapping (SLAM) system, and a dense reconstruction system. FRL’s high-accuracy depth capture system, which uses dots projected into the scene in infrared, serves to capture the exact shape of big objects like tables and chairs and also smaller ones, like the remote control on the couch.

The custom-built SLAM and dense reconstruction system transform the raw video streams captured from the camera rig into Replica re-creations of the real spaces that even a careful observer might think are real. (More details can be found in the Replica data set white paper as well as in the team’s 2018 SIGGRAPH conference paper describing the mirror and glass reconstruction system.)

A virtual robot (known as an embodied agent in the research community) is spawned in an unfamiliar Replica environment. It starts at a random position, shown here as a blue dot, and is then asked to navigate to a destination (the red dot). The agent is given instructions relative to its starting position — e.g., “Go 10m north and 15m west” — but it has no map to guide it. The agent must use only its sensory input (a regular RGB camera and in some cases also a depth camera) to accomplish its goal.

Practicing a task millions of times in just one hour

Replica can be loaded up in AI Habitat, a new open platform for embodied AI research. Facebook AI created AI Habitat to be the most powerful and flexible way for researchers to train and test AI bots in simulated living and working spaces. AI Habitat allows researchers to put a bot into a Replica environment, so it can learn to tackle different tasks, like “go check if my laptop is on my desk in the kitchen.” These chores are simple for humans, but for machines to master them, they must recognize objects, understand language, and navigate effectively. Today’s machines — like robotic vacuums, for example — can respond to commands, but they don’t understand and adapt to the world around them as people do. AI Habitat will help researchers develop bots that understand the physical world. But it is also an important research tool for creating next-gen AR experiences that begin to merge the physical and digital worlds. If we can teach an AI system to understand the physical space around you, we might one day be able to use it in combination with AR glasses. For example, it could help us to place your grandma’s digital avatar in the seat next to you or to display digital reviews right next to a restaurant or store as you walk by.
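For anyone who wants to poke at this themselves, the open-sourced Habitat API follows a familiar reset-and-step loop. Here is a minimal sketch, closely modeled on the example in the Habitat repository, of running a random agent through a PointNav episode; the config path is taken from the repo's example and may differ between releases.

```python
import habitat

# Minimal sketch of a PointNav episode in AI Habitat.
# The config path below mirrors the repository's example and may vary by release.
env = habitat.Env(config=habitat.get_config("configs/tasks/pointnav.yaml"))

observations = env.reset()  # RGB (and optionally depth) observations for the agent

# Step through the environment with random actions until the episode ends.
while not env.episode_over:
    observations = env.step(env.action_space.sample())

env.close()
```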

Replica provides realistic 3D data, and AI Habitat provides simulation with speed and flexibility. While other simulation engines commonly run at 50 to 100 frames per second, AI Habitat runs at over 10,000 frames per second (multi-process on a single GPU). This enables researchers to test their bots much more quickly and effectively — an experiment that would take months on another simulator would take a few hours on Habitat. Facebook AI research intern Erik Wijmans, who is also a PhD student at Georgia Tech, and AI Resident Bhavana Jain used the system to do state-of-the-art research, training their bot with over a billion frames of experience. Using a rough estimate of how quickly people can look around and move in the real world, that would be the equivalent of more than 30 years of experience. A virtual bot can also bump into countless walls and make other mistakes as it learns, without any risk of doing real-world damage.
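The 30-year figure is easy to sanity-check. Assuming roughly one frame of real-world "experience" per second (an assumption for the back-of-envelope math, not a number from the team), a billion frames works out to a bit over 30 years:

```python
# Back-of-envelope check: ~1 frame of real-world experience per second (assumed)
frames = 1_000_000_000
seconds_per_year = 60 * 60 * 24 * 365
print(frames / seconds_per_year)  # ≈ 31.7 years
```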

Facebook is now open-sourcing AI Habitat and releasing its Replica data set, so anyone in the community can build on it, try new approaches, compare their results, and learn from others’ work. (Technical details on Habitat are available here, and the Replica environments can be downloaded here.) This kind of open sharing of information between researchers at different companies and organizations has been key to recent advances in AI technologies like natural language understanding, computer vision, and embodied QA, and researchers at Facebook AI and FRL believe the same will be true here.

To establish performance benchmarks that can be used by everyone in the field, Facebook AI also recently created the Habitat Challenge. The contest invited engineers and researchers from across the AI community to find the best way for bots to complete a particular navigation task in AI Habitat. “AI Habitat offers close to real-world experience for learning navigation,” says one of the challenge participants, Dmytro Bobrenko.

The Replica data set identifies and labels the objects found in its virtual spaces, assigning a different color to different categories of objects, like “chair” or “wall.” AI researchers can use this “semantic segmentation” data to help develop smarter systems.
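As a rough sketch of what that looks like in code, habitat-sim exposes the semantic annotations of a loaded scene as a list of labeled objects. The scene path below is hypothetical, and the exact configuration attribute names vary between habitat-sim versions.

```python
import habitat_sim

# Sketch: load a Replica scene and list its semantic labels.
# The scene path is hypothetical; the attribute for the scene file
# (e.g. scene_id vs. scene.id) differs between habitat-sim versions.
sim_cfg = habitat_sim.SimulatorConfiguration()
sim_cfg.scene_id = "Replica/apartment_0/habitat/mesh_semantic.ply"

agent_cfg = habitat_sim.agent.AgentConfiguration()
sim = habitat_sim.Simulator(habitat_sim.Configuration(sim_cfg, [agent_cfg]))

# Every annotated object carries a category such as "chair" or "wall".
for obj in sim.semantic_scene.objects:
    if obj is not None and obj.category is not None:
        print(obj.id, obj.category.name())

sim.close()
```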

Together, these technologies will one day enable machines to learn to operate intelligently in the real world rather than just on our smartphones or laptops, says Dhruv Batra, a Facebook AI Research Scientist and Georgia Tech professor who leads the Habitat team. He and his colleagues call this the shift from “internet AI” to “embodied AI.” It means teaching machines using not just static data sets (like photos of cars) but also interactive environments (such as a simulated parking lot full of virtual cars that an AI bot can explore and examine). Batra and many other AI researchers believe this kind of interaction is necessary for building a new wave of smart tools to help us in the physical world as well as in the digital one.

Building the tools to create ‘social presence’

By training systems through AI Habitat’s advanced, open-platform simulations, researchers can make progress on embodied AI technologies that, until now, have largely remained in the realm of science fiction. Batra foresees not just intelligent assistants but also tools to help the visually impaired better navigate their surroundings, for example.

For Richard Newcombe, a Research Director at FRL, one of the most exciting future applications is bringing “social presence” to the physical world around us. VR today enables people to share a virtual space with a friend who is hundreds of miles away. Newcombe is working to bring a new level of realism to the experience and to make social presence possible in everyday life, too, through the use of AR glasses. With this technology, you’ll one day be able to see and interact with that friend just as if he or she were sitting on the couch next to you. To create this kind of social presence, AI systems need to know how to make the digital version of your friend interact naturally and realistically with the actual physical couch and room around you, or put you in a simulated environment that looks and seems real.  

“Much as the FRL research work on virtual humans captures and enables transmission of the human presence, our reconstruction work captures what it is like to be in a place; at work, at home, or out and about in shops, museums, or coffee shops,” Newcombe says. He is passionate about developing technology that can sense and understand the state of the world. He started work in the field with an apprenticeship at age 16 and went on to study robotics, computer vision, and machine learning at the University of Essex before earning his PhD at Imperial College London. He joined Facebook four years ago to start the research and incubation team dedicated to the future of machine perception for AI and XR applications, and the launch of Replica represents an important step forward in making this a reality.

Another Replica environment captures details like the electrical outlets on the wall and objects behind the glass doors of a bookcase.

A responsible, open approach

Building experiences like social presence will require additional breakthroughs in hardware as well as continued advances with training resources like Replica and AI Habitat. But there are also important privacy and security considerations, says Newcombe.

“We must be incredibly diligent in generating the reconstruction, scene understanding, and AI reasoning systems,” he says. Researchers and engineers as well as outside experts and the general public will have to collaborate to work through the social and personal consequences of a potentially transformative new technology. To do so, they need to know what is possible and to share updates on their progress so others can be part of that public discussion. At F8, Facebook’s keynote speakers discussed recent work on ethical design and addressing bias, which will be important as research progresses on AR experiences and embodied AI.

With the Replica scans, the data was anonymized to remove any personal details (such as family photos) that could identify an individual. In building this 3D reconstruction technology, the FRL researchers also made sure they created a robust system for handling and storing data even before they started scanning. For example, the data is stored securely in a server that can be accessed only by a limited number of researchers, and the team has regular reviews with privacy, security, and systems experts to make sure they are following protocol and implementing the latest and most rigorous safeguards possible. The scans are made available to the broader research community only after these steps are completed.

It will take many more technological breakthroughs before experiences like AR social presence and advanced AI assistants are a reality. For example, Facebook AI researchers will explore ways to build realistic physics modeling into AI Habitat so an AI bot can learn what happens when it knocks a virtual glass off a virtual table. As this work progresses, the Replica and AI Habitat researchers believe these projects will play an important part in Facebook’s future. By enabling the next generation of embodied AI, these technologies will unlock the potential of AR glasses — giving people a better understanding of their world and helping them to connect, work, and collaborate in powerful new ways.

“Using AR glasses as the platform, social telepresence and useful AI assistants will help you be the most effective you, engaging in the world however you want to,” says Newcombe.

Scott Colburn (AKA Jabon) Talks Pasta, VR Audio, and Animal Collective

A couple weeks back, I went to Neumos to catch Avey Tare of Animal Collective. The evening’s opening act, Jabon, blew my mind – it was this totally whacked-out phantasmagoric performance piece complete with costumes, spoken word interludes, and props. To hear his website describe it: “avant garde ambient disco comedy.” Needless to say, I was dying to talk to the guy about his artistic process, so I started looking into him …Turns out he’s also Scott Colburn, a recording engineer who’s recorded indie classics for the likes of Animal Collective and Arcade Fire, live in-studios for countless bands over at KEXP, and is even in the process of remastering the discography of legendary weirdo band The Residents. I was admittedly a little worried that the Jabon performance style would carry over into our interview, but from the first few words we exchanged the guy was affable and totally open about his work on some legendary projects (even the Animal Collective stuff, which is often shrouded in mystery). Warning though: some of this interview definitely dips into total audio nerd territory.

(This interview has been edited for length and clarity.)


RIELY URBANO: So, how did the rest of your shows with Avey Tare go?

SCOTT COLBURN: It was cool, I just did this Seattle show and another show in Portland. Portland didn’t feel as magical as Seattle …Holocene is kind of a weird venue. It’s split into three separate spaces, so during my set everyone was over in a whole different space getting beers waiting to see Avey Tare. I could see that some people were kind of responding to the music, but it was tough to get those people really involved. It was also kinda bizarre cause there was this one guy in the front row at Portland who was just not fazed by my set at all – usually I can get a laugh, or a smile, or an eyebrow raise from the people up front at shows, but this guy was just giving me nothing.

RU: So, has the Jabon show always been like this? Or has it evolved over the years?

SC: Oh no, Jabon started in a totally different place. I mean the project started in ’84, and for that whole first decade from ’84 to ’94 I only played two shows. One was in Indianapolis and another in Pittsburgh, at Carnegie Mellon. It started out in college, when I was watching all these avant-garde movies and listening to bands like Black Flag – I was also really interested in some of Greg Ginn’s work on the side with Gone, some of that instrumental Black Flag stuff. For a while I was just doing things like tape loop recordings with some poetry or maybe comedy skits laid over the top. That was how I did the show at Indianapolis; at Pittsburgh I had on, I think, Méliès’ A Trip to the Moon, and I was kind of sound-tracking that. But I was also really inspired by the band Chrome – I really respected them because they didn’t have any interest in really performing live, they just wanted to record. It was definitely like that for me, I was mostly just interested in Jabon as a recording project. A lot of my early stuff too was comprised of me recording myself, playing the drums for like 20 minutes straight, and then going back and recording bass over the drum track, and then guitars, and so on, and then I’d go back and I’d pull out the best moments from the whole 20 minute piece and that would be the song. I had a few live shows like that, where I was playing guitar over these drum and bass parts that I had recorded as well.

RU: How did things develop to the current Jabon live experience?

SC: Past that point I was just staying really busy, working all day and then all night and through the weekends too usually, just recording bands. But then I ended up recording Feels and recording Strawberry Jam …and at the end of that, Dave (Avey Tare) told me he was really into my music, and ended up asking me to do some shows supporting them on the West Coast, and I said yes. And through the 2000s I also spent a lot of time playing with a band, Wizard Prison. We wore robes and everything, but there wasn’t a mask, because we played in silhouette behind a screen that we’d project avant-garde movies on. We played a show for 6/6/06, then 7/7/07, 8/8/08, and then the project ended with the show for 9/9/09. That was definitely a precursor to Jabon Mark II. Beyond that, it was also around then that I started using Absynth, a synth from Native Instruments, and the first patch I heard as soon as I started the software up just really connected with me. So, a lot of my live show is built around exploring Absynth as an instrument as well. And there were all sorts of creative experiments, like one year I made a song every Monday for the whole year. I remember my wife being like “there’s no way you’re gonna be able to make that happen,” and I was like “well okay, here, watch me.” And at the end of that I had like, four albums’ worth of material, I mean that’s fifty-two songs. And I also did a whole series of fifteen-minute shows, all over Seattle, and that was interesting because it became about playing to all these unconventional spaces throughout the city.

RU: So beyond Black Flag and Chrome, are there any major influences that guide your work, especially since you’ve gone more electronic?

SC: Well I’m really into those two, and then also the Residents. Actually, at this Portland show, I had two guys come up to me after my set and say, “you must be influenced by the Residents, aren’t you?” I said of course, one of them shows me their Residents tattoo, and then I tell them …You know I have all the Residents’ analog tapes at my place, right? And they go “yeah, we know …We know who you are.”

RU: You’ve got the Residents’ analog tapes at your house?

SC: Yeah, I’m remastering and digitizing their whole discography, it’ll be coming out over the next few years.

RU: So do you consider yourself to be of that tradition, The Residents, Captain Beefheart, and so on?

SC: Well I was honestly never too deep into Beefheart, what I dug about the Residents was really their anonymity. I am pretty big on Zappa though, and I dig Amon Tobin.

RU: I’m glad you brought up Animal Collective, because I’ve been meaning to ask …I mean, Feels and Strawberry Jam are two of the best sounding records I’ve ever heard. Do you have any sort of a guiding philosophy or technical system when you’re recording artists?

SC: Well, thank you for that. I apply analog techniques to the digital realm. There’s a sort of finality to analog recording that’s pretty valuable – if you think you can record a part better than you did the first time, you have to totally erase that first take. It really forces a level of decision making that I think tends to get the best performances out of artists. You have to really ask yourself “can I do this better?” and really be sure that you can when you’re recording analog, and I try to bring that out of artists when I’m recording them. Besides that, it’s really just a matter of figuring out what the band’s vision is and just beelining for that.

RU: Do you have any advice you’d give to young musicians or potential audio nerds?

SC: Well one thing I’d say is that it’s really a different world and a different industry than it was when I started, because almost anyone can start making music pretty quickly and pretty easily these days. Anyone that buys a MacBook already has a DAW (digital audio workstation) by default, and I mean from there you can upgrade to Logic Pro, or you can use Ableton, Cubase, Reason …there are tons of options. I think that Logic is a real musician’s DAW, ’cause it comes preloaded with so many incredible sounds …A lot of the younger musicians that I’ve worked with come to me having worked mostly in Logic. The flip side of that is what’s tough though, because a lot of people that come to me want me to mix their stuff and it’s recorded really poorly. It’s always gonna be easier to mix something if you recorded it well. That was actually one of the reasons I kind of decided to leave the music industry and say “alright, I’m gonna work on VR audio now.”

RU: And so what are the big technical questions and concerns with audio in VR?

SC: Well it’s all about sort of placing audio in space, which is super interesting in VR because it goes well beyond 5.1 or surround sound or anything like that. The sound is literally everywhere within this virtual space. A few years back me and my wife went to a VR film festival, and I didn’t know what to expect. I kind of had this idea that we’d come and all sit in a movie theater, all wearing VR goggles, and that none of these guys would have any idea what they were doing with sound, and that it would be really bizarre, but it wasn’t really like that. You’d come, stand in line for the movie you wanted to see, and then individually watch in this VR space for a few minutes, then go and get in line for the next movie you wanted to see. But I could tell that these guys didn’t know what they were doing with their sound, and I knew I could help. And the work I’m doing on this is stuff you probably won’t hear about for another five years. So I’ve done some work with Microsoft on HoloLens, some spatial stuff for that, and then for Oculus as well.

RU: And are there any plugins you use specifically for this precision level spatial stuff?

SC: Oh no, you make the original audio in a DAW like you would anything else, but the process of making it work in VR all happens in a game engine like Unreal or Unity. A lot of those engines have sound systems all of their own, and it’s all about fitting the audio into that. Some of it can be really creative, like “hey, I’m gonna take this sound and associate it to this character, or to this place, and so on.” It’s all work with these really big teams, tons of people involved. It’s kind of like making a record, in some ways …I mean when you’re recording a band you’ve gotta figure out okay, who’s the guy leading this band, who’s the guy who’s lazy and who isn’t gonna show up …It’s kind of like that, except with a programming department, a social media department, a writing department, and so on.
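(For anyone wanting to see the "placing audio in space" idea in code: here's a toy sketch I put together, not anything from Scott's actual pipeline or a real engine's audio system. A mono source gets inverse-distance attenuation and a crude left/right pan based on where it sits relative to the listener; real engines layer on HRTFs, occlusion, and room modeling.)

```python
import numpy as np

def spatialize(mono, source_pos, listener_pos):
    """Toy spatializer: inverse-distance attenuation plus a crude stereo pan.

    Illustration only: real VR audio pipelines (Unity, Unreal, etc.) use
    HRTFs, occlusion, and room modeling rather than a simple pan law.
    """
    offset = np.asarray(source_pos, dtype=float) - np.asarray(listener_pos, dtype=float)
    distance = np.linalg.norm(offset) + 1e-6
    gain = min(1.0, 1.0 / distance)             # quieter the farther away it is
    azimuth = np.arctan2(offset[0], offset[2])  # x = right, z = forward
    pan = (np.sin(azimuth) + 1.0) / 2.0         # 0 = hard left, 1 = hard right
    left = mono * gain * np.sqrt(1.0 - pan)
    right = mono * gain * np.sqrt(pan)
    return np.stack([left, right], axis=-1)

sr = 48000
tone = np.sin(2 * np.pi * 440 * np.arange(sr) / sr)  # one second of A440
stereo = spatialize(tone, source_pos=[2.0, 0.0, 1.0], listener_pos=[0.0, 0.0, 0.0])
print(stereo.shape)  # (48000, 2)
```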

RU: Do you think I could fire off some quick audio nerd questions at you?

SC: Yeah, sure, I don’t mind.

RU: Favorite microphone?

SC: I actually do have one …the U47. Now, I don’t own one, cause they’re tricky to find. First off, you’ve gotta find someone stupid enough to sell one to you, then you’ve gotta be ready to shell out like $25,000 or $30,000. A lot of the bigger studios you’ll go to have them, like for instance there are a couple studios in Seattle that have one. They’re not crazily uncommon, but they’re definitely expensive.

RU: And have you ever recorded with one?

SC: Yeah actually, I recorded some stuff for this French artist Cali, we recorded in Southern France which was just incredible. He’s not marketed to the US really ’cause he doesn’t sing in English, but he’s kind of a big deal in France. And one of the studios we were recording in had a U47 in it …and I was like “we need to record all of our vocals here, quick.” Another French band I recorded, I managed to get a few things recorded on a U47. In terms of mics I own, there’s the Coles 4038 for guitar, KSM32 for vocals or bass, AKG C3000 B for bass and for female vocals, and then the Blue Dragonfly for vocals and drums, interestingly enough.

RU: Reminds me of how the Tame Impala guy records his kicks with a ’57 …I’ve just accepted at this point that no one ever records drums the way I’d expect.

SC: Yeah, I know what you mean. Reminds me of the time I walked into a session because I wanted to see how they were doing what they were doing. And it was just ’58s recording almost everything. At first, I was confused, but then it kind of made sense to me – I mean, that’s the live sound. How many shows have you been to where it’s just ’58s run straight into the PA? And if you’re making a record with the intention of bringing people out to your live show, it totally makes sense to record the way it’s going to sound live. And if someone sees you live and then gets curious about the record, you want the record to convey how it sounded live. That’s definitely something I think about when I’m recording bands. I just remember listening through mixes when we were finishing up Feels, and some of those guys saying like, “man, this is what I hear when I’m up on stage at the shows.” There’s just something really cool about capturing what the musicians themselves are hearing when they’re performing live. I mean, at the end of the day, the ’58 is just that rock sound. You can record someone with a U47, but you’ve also gotta record them with the ’58 if you’re trying to get that rock sound.

RU: OK. Favorite kind of synthesis? Additive, subtractive, FM, anything like that?

SC: I don’t really think about that that much. I draw heavily from Absynth and work with it a lot; beyond that I also really dig the Moog synth emulators from Arturia, which are really powerful. With a lot of these software versions they are recreating the original thing, in an even higher definition than the original synth could have – I mean the old synths we would record on 24-bit systems, whereas the average computer someone’s making music on these days is 64-bit. Bob Moog himself has even given a lot of these software emulators his blessing, and as someone who’s used a lot of the originals, I really think the emulators do a pretty good job. A lot of the modern Jabon material is built mostly around Absynth patches, Moog emulators, and then a sample pack that isn’t even in circulation anymore; it’s a bummer cause the samples are really good.

RU: Would you say you have an all-purpose compressor?

SC: That’s an interesting question. For keyboards, I tend to like the stock Pro Tools compressor. For live drums, I like the Waves H-Comp, but on the stereo bus, I like the Waves L2 as a limiter. One thing I’m excited by is the Waves MV2, which is both a downwards and upwards compressor. It’s cool cause upwards compression isn’t like an expander, it literally just pulls the quiet stuff up from a given signal, pulling out a lot of little details. When you’ve got that working with a conventional downwards compressor at the same time, it gives you a really nice little band that you can then move upwards or downwards.
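(Again for the audio nerds: here's a toy static gain curve, my own sketch rather than how the MV2 actually works. A downward compressor pushes loud material down toward one threshold, an upward compressor pulls quiet material up toward a lower one, and running both at once squeezes the signal into that narrower band.)

```python
import numpy as np

def downward_gain_db(level_db, threshold_db=-18.0, ratio=4.0):
    """Above the threshold, each extra input dB yields only 1/ratio dB of output."""
    over = np.maximum(level_db - threshold_db, 0.0)
    return -over * (1.0 - 1.0 / ratio)

def upward_gain_db(level_db, threshold_db=-40.0, ratio=2.0):
    """Below the threshold, quiet material is pulled up toward the threshold."""
    under = np.maximum(threshold_db - level_db, 0.0)
    return under * (1.0 - 1.0 / ratio)

# Combined curve: loud parts come down, quiet parts come up, and everything
# lands in a narrower band you can then ride up or down with makeup gain.
for level in np.linspace(-60.0, 0.0, 7):
    out = level + downward_gain_db(level) + upward_gain_db(level)
    print(f"in {level:6.1f} dB -> out {out:6.1f} dB")
```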

RU: The way you’re discussing this subtler sort of technique reminds me of some of my favorite mixes of all time, like the ones on Aja or Off the Wall; it felt like there was so much less compression on everything back then.

SC: Well, I think there was probably more compression than you’d think on records like that, it’s just that they were using hardware compressors. The difference between hardware compressors and the software versions is huge …everything I said about synth emulators, I really can’t say about emulated gear.  You could take a hardware compressor, turn the knobs to the exact same spot they’re at on the emulator, and still end up with a totally different sound. There are companies out there that have really painstakingly tried to recreate the feel of old hardware compressors, and it’s just never quite right. Some of those old compressors, you could really be compressing the hell out of something and it would sound incredibly natural.

RU: Feels and Strawberry Jam feel like pretty dry records to me, mix wise. Do you have any long-held opinions or techniques regarding reverb?

SC: Actually, it’s interesting that you ask that, because the Animal Collective have a fan forum and fans have obsessed there for a pretty long time about the mix on Feels especially, up to the point where I even had people telling me that I should drop in and let them know how I recorded the album and just settle the score. So, I went on this forum and I told them the truth, which is that everything pretty much just had a basic compressor, a plate reverb, and a slap delay on it, and that’s it. But no one believed me! They kept going “no, that can’t be true because you can hear this, this, this and this,” kept asking me for more. But it really was just that, I mean a compressor for dynamic range, a plate because I think plate reverbs really just supplement the sound of the room and sound really cool, and then the slap for depth. Anything else you’re hearing in there, that’s the band.

RU: Well and yeah, I kind of got that impression based off the fact that Avey’s guitar tone at the show sounded almost exactly like the tone on Feels, which made me wonder if that wasn’t just his rig.

SC: I know, right? His live show was super bizarre too because, actually, all the samples were coming from Jeremy (the live drummer). Dave had all this gear around him, so I assumed he was triggering samples, but when I asked him about it, he was like “no man, that was Jeremy. No one ever expects the drummer to be the guy triggering the samples, that’s why we did it that way.”

RU: I’ve got to admit, I feel kind of like I’m cheating here, using my journalistic access just to soak up all this audio knowledge or whatever.

SC: Oh no, I don’t mind at all. I teach audio engineering on and off at the UW when I’m not off recording somewhere outside of Seattle; I kind of see it as a way of giving back.

RU: Having been in the business as long as you’ve been, is there anything you’re grumpy about?

SC: Well I’ve always been grumpy about Pro Tools – I just don’t get the obsession. The MIDI is horrible in Pro Tools, always has been, and it’s not like it’s the most popular software out there. It’s very big in the United States, but internationally, a lot of people use Cubase. I’ve always been a Nuendo user, which is kind of like Cubase on steroids, but then again, it’s the most expensive DAW on the market so I can see why people don’t necessarily bite for it. I still haven’t spent enough time with Ableton to feel like I really get that either.

RU: Yeah, working with Logic has really sort of trained me to think in a timeline, and Ableton definitely challenges that.

SC: Yeah, it’s just so different from how any other DAW works – I wouldn’t say I’m confused by it, I just really haven’t spent enough time with it.

RU: Anyways, I still have some questions about the Jabon live show. Now, you’re a musician, but you’re also an author. Can you tell me more about the book you read from at your live show?

SC: (laughs) Yeah, the book is definitely there just to sort of lighten up the mood at the live shows. I know that the energy can be kind of dark, and the music can get kind of sinister, so the book is really there just to sort of say “hey, this is supposed to be light and kind of funny.” I think a lot of what I do with Jabon is really funny. And as far as the words go, that poem comes from a really old piece, I think from ’84 or ’86, so they’ve been floating around for a really long time. It wasn’t always like the show you saw, with the jazz and listing all the pasta, but I’ve been saying that poem at live shows for like 20 years, I know those words by heart. The book is really there mostly just to ground it and have a concrete visual reference for people and drive the joke home, but really, it’s just a children’s book I found at a thrift store that I pasted the words of the poem over.

RU: Follow up question, what’s your favorite kind of pasta?

SC: That’s a really interesting question. I think I mentioned it at the show. Strangled priest. I’ve never actually tried it, but when I was in Italy, this crazy amazing restaurant I went to was all out of the Strangled Priest by lunch, so it must be pretty incredible.

RU: For any newcomers to the Jabon universe, would you recommend any specific album as an entry point?

SC: Well a lot of what I have out is on my bandcamp site. There isn’t really a record out that represents the Jabon live show. Like for instance, the show you saw, I refer to as the Jabon club experience. For one year I only did shows where I would announce them and then I’d play for fifteen minutes for free, and that’d be it. I did one at the Fremont Troll, at the Bus Tunnel, in West Seattle, all over …And it was a response to kind of what was bumming me out about Seattle club culture, where the artists were always getting paid last – I wanted to do something where it was like, “alright, no one’s making money here. We’re not selling beer, there’s no bouncer or booking agent, this is just gonna be really simple and direct.” Since then it doesn’t happen that often that it feels right to play a show. I mean just now, Dave called me up and said, “hey, do you wanna do these shows?” And I said yes …But even these shows kind of reminded me of the things about the music industry that made me want to leave in the first place. The next time I do a show is probably the next time Dave calls me up and asks, cause me and him are buds.

RU: So you wouldn’t say there’s a definitive Jabon release?

SC: Well, no, if there was it would be some sort of polished representation of that Jabon club experience and putting that together would be a ton of work. That show is composed of tons of different pieces I’ve made over the years and they’re stitched together for the show, but at this point it just feels like it’d be a huge undertaking to try and pull all that together within an album. I mean, I’ve got these Residents masters to do, I’ve got VR audio to work on, so the Jabon album is honestly pretty low on my list of priorities these days.