Simulated Live with Wowza and Apple HTTP Live Streaming

Last summer, we switched our primary Flash streaming over to 316 Networks, partly because of the simulated live capability they offered and partly for the Media Suite backend. We continued to use Wowza on EC2 for our mobile users, since that solution works very well. Unfortunately, simulated live doesn’t work for our mobile users, who are limited to live streams.

We have 3 replays a week for each of our two web services: the traditional service from Sunday morning, and the praise service from Sunday night.

I should probably step back for a moment and explain what “Simulated Live” means. It’s a recorded event, but from a user standpoint, it behaves like a live event. There’s a set start time, and if you come in 20 minutes after the start of the event, you get the video 20 minutes in; there are no “trick play” DVR functions like fast-forward or rewind. This gives a shared experience for everyone who is watching, and also keeps us legit with the copyright restrictions, as our “live” events are considered extensions of the actual live event in the sanctuary. Simulated live is also known as “pseudostreaming”.

Wowza doesn’t natively support pseudostreaming (although the Stream Class API does have some scheduling capability — see this post), so we needed some way of broadcasting recordings on a schedule. I could use the VT5 machine and the Scheduler to replay the recordings on a schedule, but the big downside is that it consumes local bandwidth, which is in short supply.

What I needed was a way of streaming the archive files (high and low bandwidth) that were created by the Kulabyte encoder during the live event. Ideally, since I already upload these to 316 for rebroadcast, I’d rather not have to upload them to a second location.

Enter the lovely open-source encoder, ffmpeg. My concern about using ffmpeg was that re-encoding an already encoded file had the potential of introducing compression artifacts and adding CPU load. I was very happy to find an obscure command-line setting that tells it to copy the input file’s audio and video streams directly. The only thing ffmpeg would be doing is extracting the audio and video streams from the MPEG-4 container and stuffing them into a Flash container without molesting the actual streams at all. Added bonus: ffmpeg can not only output to RTMP, it can take RTMP as an input as well.

Fortunately, Media Suite’s media bin makes videos available via direct RTMP and HTTP and helpfully provides the CDN URL for those files. Another storage option would be Amazon S3, pulling the files via either HTTP or CloudFront RTMP.

Attempt #1: I installed ffmpeg on a local Ubuntu box via apt-get. No dice; it refused to connect to the RTMP server. After some research, I found that the version of ffmpeg shipped with Ubuntu Jaunty is 0.50, and it seems to have some weird build options.

Attempt #2: I downloaded the ffmpeg 0.61 source, ran a quick configure/make/make install on it, and tried again. Success! I was taking an MP4 recording on disk and streaming it to Wowza. I then changed the input from a local file to the URL provided by 316, and weird stuff started happening. Then I realized that the URL had GET parameters in it, leaving a question mark and two ampersands that I needed to escape before bash would parse the command correctly. Once I fixed that, it started running happily, and I was seeing the stream on my iPod.
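
For the curious, the fix is just a matter of quoting or escaping the URL so bash doesn’t interpret the ? and & characters. A minimal sketch (the hostname and query parameters here are made up, not the real 316 URL):

# Single quotes keep bash from treating ? and & as shell metacharacters
# (the hostname and query parameters below are hypothetical):
ffmpeg -i 'rtmp://cdn.example.com/vod/filename.mp4?token=abc123&start=0&end=0' -re -sameq -acodec copy -vcodec copy -f flv rtmp://localhost/live/stream

# Alternatively, escape the individual characters with backslashes:
ffmpeg -i rtmp://cdn.example.com/vod/filename.mp4\?token=abc123\&start=0\&end=0 -re -sameq -acodec copy -vcodec copy -f flv rtmp://localhost/live/stream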

Attempt #3: I downloaded the ffmpeg 0.61 source to my Wowza server on EC2 and crossed my fingers that the build would go off without a hitch. Success! I then transplanted the command line I used on my test box, pointed the destination at the local Wowza install, and fired it off. Success again!

Now that I’d proved the concept, it was simply a matter of putting that command into a cron job and waiting to see if it fired off. And then realizing that the server is on Eastern time. And then realizing that I should probably specify the full path to ffmpeg (doh!). But once I got those silly details ironed out, off it went.
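
For reference, the finished cron entry looked something like this (the schedule, ffmpeg path, source URL, and stream name here are placeholders; remember the server clock is Eastern):

# Example: replay Sundays at 7:00 PM server (Eastern) time.
# The schedule, paths, URLs, and stream names below are placeholders.
0 19 * * 0 /usr/local/bin/ffmpeg -i 'rtmp://rtmp.server/filename.mp4' -re -sameq -acodec copy -vcodec copy -f flv rtmp://localhost/live/stream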

So now I have ffmpeg on my EC2 system, consuming virtually no CPU, pulling my archive files from 316 (only had to upload them once), streaming to Wowza on a scheduled basis, without chewing up T1 bandwidth at our main site.

Unfortunately, there’s a little bit of brain damage involved in pulling it from 316, because I’ll have to go change the filename in the cron job every week. Perhaps I’ll end up uploading it to S3 after all and just giving it a static name.
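
If I do go the S3 route, the weekly upload becomes a one-liner with a tool like s3cmd, overwriting the same key each week so the cron job never has to change (bucket and file names here are hypothetical):

# Bucket and file names below are hypothetical:
s3cmd put sunday-traditional.mp4 s3://my-replay-bucket/traditional-replay.mp4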

Here’s the command line used:

/path_to/ffmpeg -i rtmp://rtmp.server/filename.mp4 -re -sameq -acodec copy -vcodec copy -f flv rtmp://localhost/live/stream

Command-line options used:

  • -i : specifies the input. This can be RTMP, HTTP, or a local file.
  • -re : read the input at its native frame rate (near-realtime mode)
  • -sameq : keep the same quality settings (strictly “same quantizer”; redundant when both streams are copied, but harmless)
  • -acodec copy : send input audio stream unmolested to the output
  • -vcodec copy : send input video stream unmolested to the output
  • -f flv : Force output to FLV container

Sony VISCA RS-422 Control

Update – January 2014: Wow, 3 years later this is still one of the most popular posts on this blog! I’ve had some questions about using this with the EVI-HD1, which has only RS-232 DIN ports. Theoretically, it should work, but you might need to alter some pinouts in the breakout box, and I would highly recommend using shielded/grounded cable, as RS-232 is an unbalanced signal. A reader is going to give it a try, and if it’s successful, I’ll update the post with some pictures.

Update – October 2021: It blows my mind that nearly 11 years later, this is still one of the most frequently visited posts on the site…

Now, back to our regularly scheduled blog post!

We recently acquired a few more of Sony’s excellent EVI-D70 cameras for use in our chapel for streaming weddings, funerals, and other events in our smaller worship space.

When we remodeled the space a few years back, it was originally designed with these cameras in mind.  The original intent was to provide some additional angles for videographers to use, but the idea never really took off. Due to lack of use, the cameras originally installed were re-purposed for Resurrection Online in the main sanctuary. Things have come full circle now, and the ability to stream services and events from this space is being requested. As a result, we acquired some more cameras, and are in the process of updating the camera system in that room.

The original design used an AMX touchscreen/joystick controller and a custom integration over RS-232, with each camera homerun to the control rack. There were numerous difficulties with the cameras randomly freezing up and not responding to controls, requiring someone to get on a ladder and power-cycle the unit.

As part of the updated system, we’ve ditched the AMX controller and are using Sony’s RM-BR300 control unit which is designed for this particular camera. We also have user familiarity, since we already have one of these controllers in our main sanctuary for the BRC-H700 remote camera mounted on the catwalks (aka, the “SkyCam”). The controller can do Sony’s VISCA protocol over RS-232 (via a Mini-DIN) or RS-422 (via a Phoenix connector).

This is where it got sticky. We have an 8-conductor homerun cable from each camera position, but the Sony controller is designed to daisy-chain the VISCA ports. Each camera has two RS-232 Mini-DIN ports (one in, one out). Fortunately, both RS-422 and RS-232 in this application only require four wires, so we can loop out and back on the same cable.

Due to the annoyance factor of having to re-terminate Mini-DIN connectors, I opted for the RS-422 port, which uses a Phoenix screw terminal (Part # 1840434 in case you need to order one; Sony wants an obscene amount of money for them, and they’re dramatically cheaper at an electronics supplier like Mouser). RS-422 also has the advantage of a much longer signal path due to its balanced signal. Since we’re also adding a new location, I wanted to be able to wire it up with standard Category 6 twisted-pair cabling. This cable also has eight conductors, making it ideal for the task. In terms of flexibility, RJ-45 is king in the twisted-pair world, so I had to design a means of daisy-chaining my VISCA ports via ordinary patch cords.

At first, I was a little baffled by the wiring of VISCA, since the RM-BR300 connector pinout is exactly backwards from the one on the cameras, and the documentation provided is a little confusing. Fortunately, the Sony POSC was quick to help and e-mailed me a wiring diagram for this specific application (and was kind enough to allow me to post it). I translated that into two main components, a breakout box and a standard cable, that would work on either the controller or the cameras.

To make the cable, I simply took a patch cord off the shelf, lopped one end off, and terminated it on the Phoenix connector:

The wiring is as follows:

Now, you’ll notice my wiring diagram shows the orange pair on the first two, and the picture shows green. This is because I found out (after much frustrating signal tracing) that the patch cord I grabbed happened to be wired for 568A rather than the more common 568B. Simply swap orange and green if this is the case.

Once I got the cables sorted out, I then replicated Sony’s wiring diagram with a handful of data jacks. The connections go like this:

I used bits I had on the shelf, but I would recommend using a different jack color for the control input so you don’t get it confused. Once I got it wired up, this is what I had (I colored the control jack black with a Sharpie):

Even though this install only has three cameras, I wired it up for five, to fill a six-way biscuit box that I had on the shelf (these are Lucent/Avaya components):

… and put the lid on it with some labels:

As for the hookup, set the DIP switches on the bottom of the controller and the cameras to use RS-422 and either 9600 or 38400 bps, and hook them up. Note that the cameras must be connected in sequence; if you skip a slot, the rest of the chain will be broken. The cameras self-enumerate on startup in order from closest to farthest on the chain, so the camera plugged into #1 becomes camera #1 on the controller. Connecting a camera will cause the controller to re-initialize.

Action Shot:

I used a biscuit box, but you could also use a modular patch panel to do the same thing. I hope to use a second category 6 run with an S-Video termination on it (2 pairs) and power (other 2 pairs) so that the whole system can run off a standard 2-cable pull.

The video game is changing

Nope. Not talking about your XBox or Playstation or even your Wii.

A while back, I posted about why Blockbuster is screwed. The scene just got bleaker, and not just for Blockbuster. Now the entire Cable TV industry is facing a major conceptual shift.

Mark Melanson blogged today about Netflix mulling over the idea of ditching the physical media distribution concept that they perfected. Netflix has already induced a lot of insomnia among the senior management at Blockbuster. The cable people need to start worrying for two reasons:

  1. This is going to clobber Pay-Per-View revenues, especially if Netflix gets major licensing deals on fresh content.
  2. This is going to clobber the data networks that these same cable operators are selling to their TV customers.

But there’s more. The way we watch content in general, not just movies, is changing dramatically. What the cable companies fail to realize is that they’re not really in the content business. They’re in the business of selling a wire into your house, and they need to provide you with a compelling reason to pay them for that wire, so they piggyback a bunch of TV on it. In many cases, they’ll bundle IP and phone service too.

One of the problems is that when you’re selling a wire as a content delivery mechanism, you either have to produce a lot of compelling content, or acquire it somehow. There’s plenty of that out there to be had, but at a price. And that can lead to the content producers holding their customers hostage as a bargaining chip against the middleman. By the way, Fox and Cablevision, have you noticed that this makes your customers very angry? I bet Major League Baseball is selling a ton of online viewing subscriptions. That was revenue that could have been yours.

Fortunately, consumers have a few options to consume content that isn’t dependent on the company providing the wire into the house. One only has to look at the success of Hulu, Major League Baseball, and Netflix to see that. Of course if your internet access is coming from the same place as your TV, the content provider can quite easily lock you out, as Fox did to their Cablevision consumers.

The problem is, in the current environment, TV is still very much something tied to time and place. What content you get over cable or broadcast is subject to the scheduling whims and programming choices made by the stations, networks, and cable operators.

We as consumers have tried to work around this with DVRs (timeshifting) and devices like SlingBox (placeshifting) in order to consume content on our terms.

This works, but only to a point. It also puts unnecessary stress on the last mile of the networks, and it’s ridiculously expensive for the consumer. I no longer have cable. Or a TV, for that matter. Most everything I watch is picked up over-the-air by my Windows Media Center DVR and watched via another machine on my network, or online via Hulu or the content provider’s website.

The downside to this arrangement is that when watching online, there’s still a delay from the original airtime to when it’s actually made available on the web. This generally doesn’t bother me as I’m not a slave to TV schedules, but I do miss out somewhat on the shared experience of millions of others watching (and tweeting about) a show at the same time.

Then there are other shows that aren’t available in either format. I can’t watch Mythbusters on the web very easily without violating copyright law. There’s always Netflix and TV Show DVDs (which have been hugely successful) for that, but it’s not convenient.

Here’s what most of the content companies are failing to realize: Consumers will find a way to watch the content they want to watch, when they want to, on the device they want to, and generally care little about intellectual property laws meant to preserve originality.

If you’re a content company that’s not making your full content available via streaming, you’re missing out on a potential audience. It also has to be easier to consume legally than illegally.

Hulu is a great example of making it easy to consume content. Netflix is doing a great job of adapting.

The other great challenge with cable providers is that there’s a finite amount of content that can be stuffed down the wire. The current model involves sending everything down the wire at once and having the machine at the consumer’s end of the wire display whichever channel is being watched. Some great technological progress has been made to increase that capacity, but it’s still finite. Wouldn’t it be a lot simpler to send only the content actively being consumed down the wire?

Better still, give me a virtual DVR in the cloud and let me pick from a whole host of content. I still want to watch my favourite shows when I’m on the road. I can’t do that with cable. I may have eclectic tastes that don’t line up with what makes money for the cable operator. If I like to watch Curling and Cricket, I’m out of luck, because there may be 3 of us in the whole area who care about those sports.

Say you’re the Discovery Channel, or Fox. Instead of selling your content wholesale to the cable operators, stream your content directly to the consumer, in HD. I still think you can make money doing this, either with advertising or paywalls.

Imagine a virtual “cable” operator: not bound by geography or cable plants, but open to the entire planet, offering a menu of content. Charge by the channel. Or by the show. (We’re talking micropayments here, but if most people are willing to shell out 60-100 bucks a month for a buffet of channels and ultimately go back to the same 10 channels, there should be money to be made.) You don’t even have to provide the streaming infrastructure; let the content providers worry about that. Just sell/broker access to it. The distribution is handled by the major CDNs anyway. You can even offer obscure content that doesn’t have a lot of demand. Stop being a slave to schedules. Sure, release new content every week, but let people watch it on their schedule. If you’re not sure how to make that work for you, go ask Felicia Day. She’s got it figured out.

The 2010 Winter Olympics were a good step in that direction. Even so, geographical restrictions on content (imposed mainly due to licensing issues) really got in the way. Many people found ways around it with proxies. Here’s a clue to content providers: Consumers don’t really care about geography. Why should I be disallowed from watching a show or event on the CBC or the BBC simply because of where I happen to live? Your content is compelling to me! I’m even willing to pay for it, either with real money or by watching your ads (just don’t get too crazy with the ads or I’ll go somewhere else). You’re missing out on a revenue opportunity when you should be going after every one of those you can get.

Suddenly, the guys in the business who are charging for a wire to the house should be getting nervous. The current cable paradigm is tantamount to charging $100 for a Chinese buffet with only one steam table. The value proposition simply isn’t there. The fact that you’re still in business at all is a testament to the power of monopolies and heavy-handed legal action.

Cable operators need to get out of the content business. It’s killing them. Might as well get out of the voice business too, since that’s not going to stick around long. But if you’re willing to take that wire and provide me with a transport mechanism for all this content out there (in other words, IP access), I’m all over it. I’m a customer of my cable company. And all I buy from them is data. I’m fortunate enough to have cable competition in my area, but the competitor wants to charge me extra for not having TV content clogging up my wire. Sorry, that doesn’t fly with me.

Why on earth would you want to restrict the size of your audience? There are millions of consumers wanting to consume a ton of available content out there. Don’t get in the way. If you do, the consumers will cut you out of the action and you’ll eventually find yourself off in the booth in the corner with the magazines and newspapers, crying into your beer and wondering why nobody loves you anymore.

Update (10/23/10): CBS, ABC, and NBC demonstrate that they don’t get it. They are shutting out Google TV users from viewing their content. Oh well, they’ll figure it out eventually. If they’re lucky, before they become completely irrelevant.

Update 2 (10/29/10): And now we hear that XBox Live is bigger than Comcast.

Blockbuster is screwed.

Digital distribution is the future of media. Physical media is dead. Yeah, I know, you’ve heard it but don’t believe it.

Today, Penny preached at Resurrection and showed a clip from The Blind Side. Andrea and I had been meaning to see the movie for a while. Since the kids actually went to bed quietly and early, we figured we’d RedBox it and have a nice movie night at home.

One problem, though: when your pastor preaches to a couple thousand people and includes a clip, there’s a pretty good chance that you’re going to have a hard time finding said movie anywhere near the church. As a backup plan, I fired up Google and searched for a torrent version of it. Within about 30 seconds, it was downloading. I then went to Redbox.com, searched for the movie (got lucky, and the local box actually had one!), reserved it online, hopped into the car, drove over to the Price Chopper, and picked up the movie (in retrospect, it would have been faster to take my bike, but it was warm and VERY humid). I took less than thirty seconds at the kiosk, and drove home. As I sat down in front of my computer, the download had just completed.

Total time elapsed: 17 minutes. In that time, 700MB had downloaded, and I hadn’t even uploaded a single complete block (so don’t worry, MPAA, I didn’t actually share any of it). Since I had the DVD, I watched that instead, and had to confront issues such as cleaning the last renter’s fingerprints off the disc and sitting through commercials on a DVD that I paid to rent. I’ll go on faith that the file I downloaded contained the video, and it was kinda nice having a backup plan in case the disc was unusable. Either way, the content owners did get paid.

The process of reserving and picking up a movie on Redbox is insanely easy and quick. And downloading a torrent was even easier. If you’re in the business of physical entertainment media, I hope that you’re trying to figure out your exit from that strategy. Browsing and renting at Blockbuster is a painful and expensive process, and that’s why their days are numbered. Redbox has a good thing going, but looking five to ten years down the line, they should be seeing a world of digitally distributed content, not physical media. Netflix has the right idea, but their streaming catalog could use much improvement.

How can content producers leverage the ease and efficiency of peer-to-peer technologies like BitTorrent? The distributed model is incredibly efficient, as several companies have discovered where software distribution is concerned. They need to stop fearing peer-to-peer digital distribution and instead leverage its power.

Browser-Aware Player

One of the big challenges of streaming to the web is the sheer diversity of devices out there.

This past week, I pushed out some modifications to the player code on our live page that switch the player code based on what the user is connecting with. The genesis of this change was a problem with our move to JW Player version 5: it broke viewing for our PlayStation users, since JW v5 requires Flash 10 and Sony apparently doesn’t care about its customers. After a successful test with the PlayStation, I extended the code to provide an HTML5 <VIDEO> tag for our iPhone users (allowing us to clear up some of the clutter on the sidebar), as well as MMS and RTSP links around a graphic mimicking the Flash-based player, in order to provide a consistent user experience for our Android/WebOS/BlackBerry/WinMo users.

EDIT: The main reason I’m not doing straight HTML5 with Flash fallback (a much more elegant solution) is that we’re sending out VP6 for our Flash users and a lower-bandwidth H.264 stream for our mobile users. We’re not currently using H.264 for our Flash users because of the poor quality of the H.264 encoder in Flash Media Live Encoder. Once we get a “real encoder“, we’ll send out a single set of H.264 streams and use HTML5 with Flash fallback.

The code is here.

Anatomy of an online worship service

(or, How Amazon CloudWatch helps manage Wowza server load)

This morning I woke up to two things: Beautiful Kansas City February weather (aka, an ice storm), and a voicemail from the Senior Pastor, asking if we had sufficient online capacity to support a larger-than-usual stream audience. Online worship streaming is a great option for these weather events that have been so common this winter (and not just in the KC area – we see increased online attendance when the weather gets foul elsewhere, like the DC storms of a few weeks ago).

My first indication that this was going to be a big event was Woopra showing 30 people on the web page half an hour before we start sending any kind of video (which is itself 75 minutes before we actually start the morning service). Usually there are two or three. Fifteen minutes after we started sending video, we were already cranking out 20-30 streams (again, we usually only have a small handful at this point).

AWS CPU Usage

Most weeks, we run two Wowza repeaters pulling from a single origin server, which gives us plenty of capacity. I had to spin up a third repeater by the beginning of the pre-service music, a fourth about 10 minutes later, and a fifth after five more minutes. I set my threshold for spinning up a new server at 75% CPU on the repeaters, as indicated by the AWS CloudWatch monitors. In the case of a heavy influx of viewers, this gives the new instance enough time to get up and running before the other repeaters hit 100% CPU. Wowza tells me this is at about 180Mbit/sec on a small instance, which for us means around 300 streams. The CPU threshold of 75% works out to about 260 streams.

Unfortunately for our online worshipers, our web server was bogging down pretty hard at the beginning of the service, where the two CPU cores were maxed out for about 15-20 minutes, which translated into slower page loads. The database server wasn’t sweating too hard, so I suspect this could have been helped with better PHP caching. Fortunately for me, this had the effect of slowing down the rate of incoming streams, which allowed me to get new repeaters going before the existing ones started choking.

Web Server CPU usage

You can see in the graph where we added new repeaters, and how fast they ramped up. It also shows how incredibly well Wowza’s built-in load balancing works. We eventually leveled out at a little over 1100 streams, which meant our EC2 instances were cranking out 600-700 Mbps for nearly an hour:

AWS Network Usage

Meanwhile, this is what we were seeing on Woopra (note the fortunate souls escaping the ice storm in Aruba and the Cayman Islands!):

The next step is to define rules in CloudWatch for automatic scaling. For that to work, I’m going to need to build my own Wowza AMI, since the current method of starting repeaters involves sending the instance a startup package from the client. I’ll need to build that configuration into the server image for CloudWatch scaling to work properly.
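
As a rough sketch of where this is headed, an alarm on repeater CPU might look something like the following. This uses the present-day AWS CLI rather than the tools that were available at the time, and the instance ID and scaling-policy ARN are placeholders:

# Alarm when a repeater averages over 75% CPU for two consecutive minutes.
# Instance ID and scaling-policy ARN below are placeholders.
aws cloudwatch put-metric-alarm \
  --alarm-name wowza-repeater-cpu-high \
  --namespace AWS/EC2 \
  --metric-name CPUUtilization \
  --dimensions Name=InstanceId,Value=i-0123456789abcdef0 \
  --statistic Average \
  --period 60 \
  --evaluation-periods 2 \
  --threshold 75 \
  --comparison-operator GreaterThanThreshold \
  --alarm-actions arn:aws:autoscaling:us-east-1:123456789012:scalingPolicy:example-scale-out-policy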

Making Sense of Mobile Streaming

Now that we’ve gotten streaming to computers down pat, I’ve set my sights on delivering a good experience for mobile users. Unfortunately, with the wide variety of mobile platforms out there, this is not an especially easy task. The Mac/PC/Linux issues are complicated enough, and it gets really tricky when the platform ecosystem has half a dozen major players (and a truckload of minor ones).

Since July or so, we’ve been using a preview version of the recently released Wowza V2 server software to deliver our video content to iPhone/iPod devices that support Apple’s new HTTP Streaming format. With minimal changes, Wowza V2 can also rebroadcast the same H.264/AAC stream over RTSP, which reaches a lot more devices. But this is where it gets complicated. BlackBerry has supported RTSP for some time, but only recently added support for H.264/AAC media. According to their KB article on the subject, you can do H.264 on the following:

  • Bold 9000/9700
  • Tour 9630
  • Storm 9500/9520/9530/9550
  • Curve 8900/8520

Most HTC phones have a streaming media app that supports RTSP, but only recent versions seem to support H.264. For example, my Mogul has the app, but I can only hear the audio. Brian‘s Touch Pro 2 gets both (and on the TP2’s WVGA screen, it looks amazing!).

Windows Media Player supports RTSP, but doesn’t come with an H.264 codec (even in Windows 7!!!! BOOO!!!!). I have yet to get the RTSP stream to work on Windows Media Player. The mobile player doesn’t support RTSP at all, just MMS and HTTP (but not the same HTTP as Apple! Grr!), and with the 9.5 generation of Windows Media Services (2008), MMS has gone away in favor of HTTP (which Microsoft calls Smooth Streaming, also not supported on WiMo).

The Palm Pre is supposedly able to do RTSP and H.264, but I’m waiting to hear back from one of our pre-wielding pastors to see if this is actually the case.

Thanks to Daryl Hunter at lifechurch.tv for letting me know that it works on his HTC Hero (Android 1.5). It seems that on Android you can’t manually enter an RTSP URL into the browser bar, but a web link or tinyurl redirect that goes to an RTSP URL does work.

Meanwhile, VLC player will play just about anything you throw at it, including the RTMP flash stream. Pity it’s not available in a mobile version.

So, as it stands now, in order to deliver a mobile experience to as many people as possible, I’m still going to need to run a separate Windows Media server for our Windows Mobile clients, but everyone else should be able to pull from the “iPhone” stream (which I’m probably going to need to rename), as long as the device supports H.264/AAC and RTSP.
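
For anyone wiring up players, the playback URLs for a single Wowza stream generally end up looking something like this (the server, application, and stream names here are placeholders):

# Apple HTTP Live Streaming for iPhone/iPod touch (placeholder host/application/stream names):
http://wowza.example.com:1935/live/myStream/playlist.m3u8

# RTSP/RTP for BlackBerry, Android, and other H.264/AAC-capable handsets:
rtsp://wowza.example.com:1935/live/myStream

# RTMP for the Flash player on computers (stream name: myStream):
rtmp://wowza.example.com/live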


Amazon and new Cloudfront Streaming

Earlier this week, I got an e-mail from Amazon Web Services, with some new goodies being announced.

The first was new pricing for Wowza. While the cost of each instance-hour is going up slightly, the cost of bandwidth is dropping about 15%. This makes me happy.

The second was that Wowza has released version 2. I’ve been working with preview versions of this since last summer with our iPhone streaming. They’ve made some very cool improvements to the product.

Lastly was an item that intrigued me. Amazon has, via Flash Media Server, added RTMP streaming capability to CloudFront, Amazon’s answer to the CDN. Now, in addition to distributing files widely, you have the capability of setting up a distribution that provides RTMP streaming of any supported video file in an S3 bucket. If you use S3 for on-demand video content, this is big for you. No longer do your viewers have to download the entire file (and run up the bandwidth meter in the process). They can now skip directly to the points they need and only use bandwidth for the stuff they actually watch.
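
As I understand it, a streaming distribution gives you an RTMP endpoint that a Flash player can point at, something along these lines (the distribution hostname and S3 key below are made up):

# CloudFront streaming (RTMP) distribution endpoint; the player connects to the
# cfx/st application and requests the file by its S3 key (names are placeholders):
rtmp://s0123456789abcd.cloudfront.net/cfx/st/mp4:videos/sermon.mp4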

This is potentially very good news for services that serve on-demand content from S3 (such as blip.tv)

It’s not so good news for the folks at Wowza, because I no longer have to spin up a Wowza instance to serve content stored in S3. Luckily for Wowza, the Cloudfront streaming doesn’t do live video.

It’ll be interesting to see how this plays out. As it currently stands, we could replace blip.tv with this functionality, but for the small cost, we get a whole lot of value from Blip.

On the perils of LED lighting and video…

For the last couple of years, we’ve used LED Christmas lights in our sanctuary. Considering how many we have (hundreds), the electricity savings are probably non-trivial.

All our LED strings in the sanctuary are plugged into either a stage dimmer or, where a dimmer port was unavailable, an Elation UniBar hooked into an RC4 Magic wireless DMX receiver (with the transmitter wired into DMX up in the catwalks). This allows us to control the Christmas lights along with the rest of the theatrical lighting via the Hog. It’s a very nice setup.

The other day, when Frank was running the stream, he saw that the Christmas lights were fading in and out in sequence, so he called up to the Penalty Box (the plexiglass-wrapped area at the back of house where the lighting operator and worship producer sit) and asked them to quit playing with the lights. As it turns out, they weren’t, and the lights were all on. Mysteriously, they were fading in and out in sequence on the wide shot camera. When we looked at them on one of the other remote cameras, everything looked normal.

Then it hit me. I went to the remote control on the wide camera and cranked down the shutter speed, and lo, the lights gradually came together until they were all on. This is what it looked like:

Most stage dimmers operate by rapidly switching the AC waveform on and off, a form of pulse width modulation. An LED string responds almost instantly to that switching (and typically only lights on one half of the AC sine wave), so the lights strobe in step with the chopped waveform feeding them. What we were seeing on the cameras was a beat frequency between the camera’s shutter speed and the strobing of the LEDs. You don’t see this on incandescent lighting because of the thermal persistence of the filaments. But why were the lights cycling at different times? Each string was connected to a different dimmer circuit, and those circuits are spread among the three AC phases coming into the dimmer room (which has a monster 2000-amp breaker).
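
To put some rough, illustrative (not measured) numbers on it: on 60 Hz mains, a string that only lights on one half of each cycle flickers at roughly 60 Hz, while the camera samples at 59.94 frames per second. The small difference is what shows up as the slow fade:

beat frequency ≈ |60 Hz - 59.94 Hz| = 0.06 Hz, i.e. one full fade cycle roughly every 17 seconds
with the three AC phases offset by 120 degrees, the three groups of strings fade staggered by about a third of that cycle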

So, if you’re shooting video of anything that has LED lights in it, make sure your shutter speed is at 1/60, or the lights are going to start acting strangely.