Separation In Real Time Is What Makes Streaming So Terrifying

Broadcasters have known this forever.

I was recently THE engineer for a concert live stream, my first time being in that position, and it got off to a bit of a rocky start.

My encoding settings were too much for the streaming machine (a laptop) to handle, and so the stream that the audience saw was a nightmare of frozen video and general “chop.” Folks were greatly displeased, quite understandably, and if you think live music is stressful…

…try adding the stress of trying to figure out why a streaming broadcast is going wrong, while also trying to mix the streaming broadcast AND the show for a few folks in the seats.

AND for the musicians.

Luckily, it was a pretty simple show, or I would have gotten flattened.
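
To put that "encoding settings were too much" problem in concrete terms: the usual relief valve is a faster encoder preset, a capped bitrate, and a modest resolution. What follows is only a sketch, with placeholder numbers and a placeholder ingest URL standing in for whatever a real rig would use; in ffmpeg terms, a laptop-friendly push looks something like this:

    # A sketch only: synthetic test sources stand in for the real video and audio
    # inputs, and the RTMP address is a placeholder, not a real ingest point.
    # The point is the encoder load: "veryfast" preset, 720p30, capped at 2500k.
    ffmpeg \
      -f lavfi -i "testsrc2=size=1280x720:rate=30" \
      -f lavfi -i "sine=frequency=440:sample_rate=44100" \
      -c:v libx264 -preset veryfast -tune zerolatency \
      -b:v 2500k -maxrate 2500k -bufsize 5000k -g 60 \
      -c:a aac -b:a 160k \
      -f flv rtmp://ingest.example.com/live/STREAM_KEY

Slower presets buy a bit more picture quality per bit, but the CPU cost climbs fast; on a laptop that's also doing other jobs, a fast preset (or a hardware encoder, if the machine has one) is the safer bet.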

For me, as a guy whose work is very heavily weighted towards concert sound, there's a special terror to broadcast. The fear comes from having real-time output that's essentially impossible for a single craftsperson to monitor effectively. Folks who specialize in broadcast have had to deal with this problem for ages, and wield all manner of effective strategies for navigating the conundrum. I didn't, though, and it was tremendously educational to have the shoe on the other foot.

What I mean by that is that I was facing what a lot of studio engineers run smack into when they come over to the live-music side: We use lots of the same technology and terminology, but the production disciplines are very different. You can't expect to be an ace at broadcast simply because you're a seasoned operator for live music. The needs are different, and the pitfalls don't have the same geometry.

In live music, what you get used to is hearing the same output (or largely the same output) as everyone else. You hear it immediately, and so does the audience. If something isn’t right, you have a common reference point.

With broadcast, the output is real time for all practical purposes, but you’re separated from it. You experience the show so differently from your audience that the two realities are almost totally unmoored. You can fix that by having a separate broadcast studio or broadcast truck, but that’s not something that happens for sole operators. You can open the stream on your phone and get something of a clue, but all the immediate output in the space basically drowns your sense of what’s happening.
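
If you do want more than a vague clue from your phone, one option is to interrogate the public stream itself from a second device. Again, just a sketch with a placeholder playback URL, but something like ffprobe will at least confirm that video and audio are actually arriving, at the resolution and frame rate you think you're sending:

    # Placeholder URL: point this at the actual public playback address (HLS here).
    # It reports codec, resolution, and frame rate as seen from the far end.
    ffprobe -v error \
      -show_entries stream=codec_type,codec_name,width,height,avg_frame_rate \
      -of default=noprint_wrappers=1 \
      https://cdn.example.com/live/stream.m3u8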

Plus, you can get everything else right and still fail the majority of your participants.

Monitors okay? Check.

Main mix okay? Check.

Streaming preview seems okay? Check.

Audience experience at the far end? HOUSTON WE HAVE A PROBLEM, SEND CHEESE-BALLS AND BEER!

The first three items are plenty difficult by themselves, but the last point is what really matters. It’s the creature that will eat you alive.

So, what’s the lesson here? Other than realizing that broadcast is a different discipline, I’d say it’s that “you haven’t done it until you’ve done it.” You can’t learn the realities of streaming production by thought experiment, or even by limited empirical work. Until you’ve actually put a stream through the Internet, and seen how your hardware and software truly interact with that task, you don’t know what you’re dealing with. It’ll sober you up in a hurry.

But getting sobered up is an opportunity to learn, because you begin to get a handle on knowing what you don’t know. That’s true for me, and I have to say that I’m pretty excited to have another crack at the whole thing in July. Once you see things go wrong, it’s a chance to do them the right way. I’m craving that chance.