Gear Review: GoFanco Wireless HDMI

A few weeks ago, I went to Amazon and picked up a cheap wireless HDMI transmitter to solve a camera connection challenge at the church. I needed to send a GoPro feed back to the booth without running cables all over the floor (or worrying about the GoPro’s live-streaming latency: it uses HLS with a very short segment size, and getting that stream into the switcher was not a trivial task).

As is so often the case with these “off-brand” (or less well-known brand) devices, my expectations were low, and I fully expected to return it after a week.

That didn’t happen. Not only was it easy to set up, but the picture quality was excellent and the latency almost nonexistent. I immediately picked up another set to run a mobile confidence monitor cart. It’s also able to send IR control to the receiver.

As it turns out, GoFanco is starting to make a name for themselves in the video accessories market for both pros and consumers, not entirely unlike how Blackmagic got their start. They offer quite a wide assortment of gizmos for moving video signals around.

The mobile GoPro rig. This could also potentially be used for linking live UAS footage back to a switcher.

Since I am a wireless network engineer by profession, I had significant concerns about how well this would behave in the spectrum – it advertises that it uses 5GHz, and I expected it to grab as much spectrum as it could (as most wireless video devices tend to), and walk all over everything else in the band. So I hauled out my Ekahau Sidekick and its spectrum analyzer to see how well-behaved it would be… And I was pleasantly surprised to discover that it was very well-behaved on the Wi-Fi spectrum… because it’s actually running Wi-Fi!

It’s running 802.11ac on a 20MHz channel (and the channel selection allows 10 different channels, which tells me it’s running on UNII-1 and UNII-3 and avoiding the DFS bands). Airtime usage is quite efficient, around 4%, which is shocking for a video application. And perhaps most useful is that it runs on 5VDC with a supply rated at 2A, which means I can use a USB battery to power the transmitter and the GoPro (and a 20Ah slice will run this rig All. Day. Long.)
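For the curious, the battery arithmetic behind that claim is easy to sketch. The current draws below are my assumptions, not measurements:

```python
# Back-of-the-envelope runtime for the mobile rig. The supply is rated
# for 2A, but these draw figures are assumptions, not measurements.
PACK_CAPACITY_AH = 20.0   # the 20Ah USB battery "slice"
TX_DRAW_A = 1.0           # assumed average transmitter draw at 5V
GOPRO_DRAW_A = 1.0        # assumed GoPro charging draw at 5V

hours = PACK_CAPACITY_AH / (TX_DRAW_A + GOPRO_DRAW_A)
print(f"Estimated runtime: {hours:.0f} hours")  # ~10 hours of continuous use
```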

Additional features allow not just a 1:1 link, but also 1TX:2RX and 2TX:1RX, all using a single channel. And because it’s quite efficient in spectrum/airtime usage, it does this in a way that coexists peacefully with your Wi-Fi.

It also means that if a presenter brings a laptop and wants to put it up on the screen, the transmitter can be powered from the laptop itself. This thing is definitely a worthwhile addition to your tool kit.

The gory technical details:

  • Channel 0: Wi-Fi Channel 36 (20MHz) (Default)
  • Channel 1: Wi-Fi Channel 44 (20MHz)
  • Channel 2: Wi-Fi Channel 157 (20MHz)
  • Channel 3: Wi-Fi Channel 157 (40MHz)
  • Channel 4: Wi-Fi Channel 149 (20MHz)
  • Channel 5: Wi-Fi Channel 153 (20MHz)
  • Channel 6: Wi-Fi Channel 149 (40MHz)
  • Channel 7: Wi-Fi Channel 153 (40MHz)
  • Channel 8: Wi-Fi Channel 165 (20MHz)
  • Channel 9: Wi-Fi Channel 161 (20MHz)

This looks like fairly smart frequency selection behaviour: judging by the preprogrammed channels, the unit will never set itself up on the secondary channel of a 40MHz pair, which is good for coexistence with other Wi-Fi. On power-up, the unit starts on channel 0 and then switches to the last channel it was using once it finishes booting, which takes only about 5 seconds.
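To make the coexistence point concrete, here’s a quick sketch (in Python) of the preset table as data, printing which 20MHz channels each preset actually occupies. The 40MHz bonding pairs are the standard UNII-3 pairs:

```python
# Device channel -> (Wi-Fi primary channel, width in MHz), from the table above.
PRESETS = {
    0: (36, 20), 1: (44, 20), 2: (157, 20), 3: (157, 40), 4: (149, 20),
    5: (153, 20), 6: (149, 40), 7: (153, 40), 8: (165, 20), 9: (161, 20),
}

# Standard UNII-3 40MHz bonding pairs (lower, upper).
PAIRS_40 = {149: (149, 153), 153: (149, 153), 157: (157, 161), 161: (157, 161)}

for dev_ch, (wifi_ch, width) in sorted(PRESETS.items()):
    occupied = PAIRS_40[wifi_ch] if width == 40 else (wifi_ch,)
    print(f"device ch {dev_ch}: primary {wifi_ch}, {width}MHz wide, occupies {occupied}")
```

Every preset beacons on a proper primary channel, so neighbouring Wi-Fi gear can always see it coming.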

Each channel is its own encrypted SSID named LK_<Channel #>.

There is a pair of LEDs on each unit. On the transmitter, one indicates that it is getting a good HDMI signal, and the other indicates that it has synced up with the receiver. On the receiver, one indicates a good wireless signal (solid means connected, blinking means active data transmission), and the other indicates that it has synced up with the transmitter. A mild annoyance: changing the channel on the receiver will not trigger the transmitter to switch channels, but the included IR remote will let you do so. The transmitter also has an HDMI pass-through, so you can insert it between a source and a monitor.

Here’s about a minute of traffic captured from the Sidekick, with a channel change happening around 30 seconds in. The channel change process is pretty straightforward and exactly what you’d expect. When initiated from the receiver, the receiver sends a deauth frame, and the transmitter continues to beacon on its existing channel. When the change is initiated on the transmitter, it sends a series of broadcast deauth frames to the SSID, changes to the new channel (and its SSID), and starts beaconing (this takes less than a second). Meanwhile, the receiver is looking for beacons from its pal, and when it sees the right SSID on its channel, it sends a broadcast probe request, gets the response from the transmitter, and goes through the standard association process. Management frames do not appear to be protected, so this device is vulnerable to deauth spoofing.
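If you want to watch these exchanges yourself, spotting the (unprotected) deauth frames is straightforward. Here’s a minimal sketch using Python’s Scapy, assuming a NIC already in monitor mode on the link’s channel; the interface name is a placeholder:

```python
# Watch for 802.11 deauthentication frames (requires root and a
# monitor-mode interface tuned to the device's channel).
from scapy.all import sniff
from scapy.layers.dot11 import Dot11, Dot11Deauth

def on_packet(pkt):
    if pkt.haslayer(Dot11Deauth):
        dot11 = pkt[Dot11]
        print(f"deauth: {dot11.addr2} -> {dot11.addr1}, "
              f"reason code {pkt[Dot11Deauth].reason}")

sniff(iface="wlan0mon", prn=on_packet, store=False)
```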

Data rate hovers around 100Mbps according to AristaPackets analysis of the capture. Given their use of off-the-shelf Wi-Fi for the networking component, I wouldn’t be surprised in the least to discover that the video protocol running under the hood is NDI, or something based on it. Why reinvent the wheel? I’d really love to crack open the encryption on this guy and see…
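As a rough sanity check, multiplying the observed PHY rate by the observed airtime gives a ballpark for the actual video bitrate (ignoring MAC overhead, ACKs, and retries):

```python
# Rough effective throughput from the capture's numbers: a ~100Mbps PHY
# rate in use ~4% of the time implies a video stream of only a few Mbps,
# plausible for a compressed feed.
phy_rate_mbps = 100.0    # observed data rate
airtime_fraction = 0.04  # observed airtime utilization

print(f"Approximate video bitrate: {phy_rate_mbps * airtime_fraction:.1f} Mbps")
```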

Given that this is using standard 802.11, the advertised range is about 50 metres, but it could easily be made to go longer distances simply by attaching a 2×2 MIMO directional antenna. The antenna connectors are RP-SMA.
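As a rough illustration of why a directional antenna buys range, here’s a free-space path loss sketch. The panel gain is an assumption, and real-world propagation is of course messier than free space:

```python
import math

def fspl_db(d_m: float, f_mhz: float) -> float:
    """Free-space path loss in dB for distance in metres, frequency in MHz."""
    return 20 * math.log10(d_m) + 20 * math.log10(f_mhz) - 27.55

f_mhz = 5500.0                   # mid-band 5GHz
base_loss = fspl_db(50, f_mhz)   # loss at the advertised 50m range
gain_db = 9.0                    # assumed gain of a modest 2x2 panel antenna

# In free space, every 6dB added to the link budget doubles the range.
new_range_m = 50 * 10 ** (gain_db / 20)
print(f"FSPL at 50m: {base_loss:.1f} dB")
print(f"Adding {gain_db:.0f} dBi stretches 50m to roughly {new_range_m:.0f}m")
```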

One caveat: when I first set it up, the receiver was having a hard time staying up and maintaining signal. I quickly discovered that I had grabbed the wrong 5V power supply, one only able to source 1A, and this device definitely needs more juice than that. Once I grabbed the correct 5V supply, everything worked great. If you use a USB cable to power it, make sure the USB supply can source the full 2A (any supply designed for tablets or higher-end smartphones should be adequate).

All in all, not a bad little setup for $200 and small change. It appears to be engineered above its price point, which makes it a great value.

On the perils of LED lighting and video…

For the last couple of years, we’ve used LED Christmas lights in our sanctuary. Considering how many we have (hundreds), the electricity savings are probably non-trivial.

All our LED strings in the sanctuary are plugged into either a stage dimmer or, where a dimmer port was unavailable, an Elation UniBar hooked into an RC4 Magic wireless DMX receiver (with the transmitter wired into DMX up in the catwalks). This allows us to control the Christmas lights along with the rest of the theatrical lighting via the Hog. It’s a very nice setup.

The other day, when Frank was running the stream, he saw the Christmas lights fading in and out in sequence, and called up to the Penalty Box (the plexiglass-wrapped area at the back of house where the lighting operator and worship producer sit) to ask them to quit playing with the lights. As it turns out, they weren’t playing with anything; the lights were all on. Mysteriously, they were fading in and out in sequence on the wide-shot camera, yet when we looked at them on one of the other remote cameras, everything looked normal.

Then it hit me. I went to the remote control for the wide camera and cranked down the shutter speed, and lo, the lights gradually came together until they were all on.

Most stage dimmers work by chopping the AC cycle on and off, effectively pulse-width modulating the mains waveform. The LEDs light on only one half of the AC sine wave, so they strobe, reproducing the square-wave pulses coming out of the dimmers. What we were seeing on the cameras was a beat frequency between the camera’s shutter speed and the strobing of the LEDs. You don’t see this on incandescent lighting because of the thermal persistence of the filaments. But why were the lights cycling at different times? Each string was connected to a different dimmer circuit, and those circuits are spread among the three AC phases coming into the dimmer room (which has a monster 2000-amp breaker).
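The arithmetic behind the slow fade looks something like this (the exact camera rate is an assumption for illustration):

```python
# Half-wave-rectified LEDs on 60Hz mains pulse once per cycle, and the
# three phases strobe a third of a cycle apart. A camera not locked to
# an exact multiple of the mains frequency beats against that flicker.
mains_hz = 60.0
flicker_hz = mains_hz     # one pulse per mains cycle (half-wave rectification)
camera_hz = 59.94         # assumed NTSC-style capture rate

beat_hz = abs(flicker_hz - camera_hz)
print(f"Beat: {beat_hz:.2f} Hz, one fade cycle every {1 / beat_hz:.0f} seconds")

phase_offset_ms = (1 / mains_hz) / 3 * 1000   # 120 degrees = 1/3 cycle
print(f"Phase-to-phase strobe offset: {phase_offset_ms:.2f} ms")
```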

So, if you’re shooting video of anything that has LED lights in it, make sure your shutter speed is at 1/60, or the lights are going to start acting strangely.

Live Streaming On a Budget (Part 3) – The Process

For background information, see Part 1 and Part 2

Server Infrastructure

We currently send our Flash stream to a small EC2 instance of the Wowza AMI, configured with the liverepeater application in origin mode (using the prebundled packages), plus three small EC2 instances running liverepeater in edge mode that pull from that origin. We also run a custom Windows AMI running Windows Media Services, for five instances in all. Total cost for these is 68 cents an hour.

We keep a set of five Elastic IPs in reserve so that we don’t have to jack with DNS every time; the repeaters’ IPs are configured in a round-robin DNS record that the player references. The origin and Windows instances also have their own Elastic IPs in DNS. It costs us a penny for every hour each of these IPs is not mapped to an instance, but that’s a small price to pay for the simplification.

Unfortunately, the process of spinning up the servers and assigning their IP addresses is not especially attractive to someone who is nontechnical. I’m working on a way to do this easily from a web interface, but that’s still on the drawing board. If anyone is handy with Perl or PHP and wishes to assist, let me know. Perl has a module on CPAN for the Amazon API; I don’t know about PHP. The API tools Amazon provides are written in Java, and there’s a Firefox extension for managing the machine images.
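To give a flavour of what that automation could look like, here’s a minimal sketch using Python’s boto3 (a modern AWS library, not what existed at the time); the AMI ID, instance type, and addresses are placeholders:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

WOWZA_AMI = "ami-00000000"      # placeholder: your Wowza AMI
EDGE_COUNT = 3
ELASTIC_IPS = ["203.0.113.10", "203.0.113.11",
               "203.0.113.12", "203.0.113.13"]  # placeholders: reserved EIPs

# Launch the origin plus the edge repeaters in one request.
resp = ec2.run_instances(ImageId=WOWZA_AMI, InstanceType="m1.small",
                         MinCount=1 + EDGE_COUNT, MaxCount=1 + EDGE_COUNT)
ids = [i["InstanceId"] for i in resp["Instances"]]

# Wait for them to come up, then map each reserved Elastic IP.
ec2.get_waiter("instance_running").wait(InstanceIds=ids)
for instance_id, ip in zip(ids, ELASTIC_IPS):
    # In a VPC you'd pass AllocationId instead of PublicIp.
    ec2.associate_address(InstanceId=instance_id, PublicIp=ip)
```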

The Windows instance is configured with WMS and a publishing point set to pull from the encoder and auto-start. The VT[5] machine has a static IP and is NATted to an external IP so all I have to do is fire it up and wait for the encoder to see the incoming connection.

Getting it rolling

On Sunday morning, I show up at about 9:00 and sit through the first traditional service to get a feel for how the service flows. While I’m doing that, I’m getting VT[5] ready for the morning’s services, creating the pre-service announcement loop from the slide graphics our communications department puts up on BackPack. Once VT[5] is running and presenting the video device to the system, I start Flash Media Encoder and Windows Media Encoder. Around 10:00, I’ll spin up the Windows instance (which takes about 15 minutes to boot), and then fire up my 4 Wowza servers (which are usually ready in 1-2 minutes). After assigning the Elastic IPs to my instances, it takes about a minute to propagate the changes through Amazon’s network. When everything is ready, I launch the slide loop and a music bed and start both encoders.

Once that’s going, I turn to my laptop and fire up Outlook (to catch the feedback forms from Wufoo), PlanningCenter (to follow the service rundown), IRC (tech chat with people involved in the stream), an RDP session to the Windows instance to watch client counts, and a small VB script that polls each of the Wowza servers for client counts and adds them all up for me so I can see how many people are watching.
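For anyone wanting to replicate the tally script, here’s a rough Python equivalent, assuming each Wowza server exposes the stock connectioncounts page on the default port (admin authentication, if enabled, is omitted for brevity):

```python
import urllib.request
import xml.etree.ElementTree as ET

# Placeholder hostnames for the edge servers.
SERVERS = ["edge1.example.com", "edge2.example.com", "edge3.example.com"]

total = 0
for host in SERVERS:
    url = f"http://{host}:8086/connectioncounts"   # Wowza's stats page (XML)
    with urllib.request.urlopen(url, timeout=5) as resp:
        root = ET.parse(resp).getroot()
    count = int(root.findtext("ConnectionsCurrent", default="0"))
    print(f"{host}: {count} viewers")
    total += count

print(f"total: {total} viewers")
```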

Over on another screen at the encoding station, I launch Woopra to watch live site stats and see where our audience is coming from, while keeping an eye on network traffic with WhatsUp Gold and watching the streams coming off the Wowza and Windows servers.

We’re currently building out an ops console station that will offload the monitoring to staff (that means yours truly most weeks), while a volunteer handles the switching.

This will live just around the corner from the switching workstation and will also be equipped with a laptop dock. I’ll post pics when it’s all done. It will also include a backup system running WireCast in case the VT[5] bursts into flames or something equally catastrophic.

The Service

Most of the time, I’m feeding the IMAG program into the live stream, but will take a different camera shot every now and then to establish context or provide better visuals when IMAG is running a full-screen slide (such as prayer times). I have access to all 7 IMAG camera feeds as well as an additional Sony EVI-D70 camera on stage pointing out at the sanctuary seating. We’re working on getting remote control of that camera so that we can get some extra shots.

On the audio side, we feed our onstage mics back to a separate console in the recording studio and mix for broadcast there. We run the output through a dbx compressor upstream of the VT[5]. We’re also in the process of cabling for direct input of the FOH mix into the VT[5] in case of a failure in the broadcast mix hardware.

The Aftermath

Once the service is over, our video crew will shoot the postlude for the web, and then we’ll run slides for a few more minutes before shutting everything down. Then we repeat the whole process for the evening service or any special broadcasts.

See? It’s just that simple!

Live Streaming On a Budget (Part 2) – Encoding

If you missed it, go back to Part 1 to see how we got here.

As I mentioned previously, Flash video was a key functional requirement of the project due to its cross-platform support and its near ubiquity in the browser. It was a solution that wouldn’t require most of our audience to download anything extra to their machines.

In order to stream Flash, you have a couple of options:

  1. Use a dedicated encoder system that slurps video in one end and spits Flash out the other. From a simplicity standpoint, this is great. From a production standpoint, not so great, because more than likely what you’re feeding it is your IMAG program, which doesn’t translate well for people outside the room. I’ll cover that in a later post.
  2. Use a PC with a capture card and Flash Media Encoder. Cheap, simple to put together, but it suffers the same issues as option #1.
  3. Do some switching in software and encode to Flash.
  4. Run a dedicated switcher into an encoder. This gets expensive in a hurry.

We initially went with Option #3 using Telestream’s WireCast software. WireCast is a nice, inexpensive option (it was about $300 at church/educational pricing) and does its job reasonably well.

WireCast is available for both Mac and Windows (although a license key doesn’t let you move between platforms, a drawback). The Mac version has a slightly more polished user interface and the ability to show your program feed on a VGA output. The Windows version can’t do that, but it does have the option of encoding to Windows Media. The software will take any video source presented to the OS, whether it’s a USB webcam, a DV device such as a video deck/camera/analog bridge, or a dedicated capture board such as ViewCast’s Osprey line. It has some serious portability advantages (a laptop and a couple of DV-capable cameras, and you’re good to go!), but we encountered a number of issues that eventually led us to another solution. If you’re looking at using WireCast with multiple analog sources, I’d highly recommend two capture cards fed by a matrix switch with two outputs; this makes multi-input scenes in WireCast a lot easier to configure.

We configured our WireCast system with a pair of Osprey 210 PCI cards in a quad-core Dell Optiplex 755 (I really would have liked an additional card, but the Opti mini-tower only has two PCI slots). Each Osprey card had its own audio capture which presented itself in Wirecast.

At the time, WireCast didn’t actually encode to Flash directly, but rather to H.264 QuickTime that was then converted to Flash by the Wowza software (it was the folks at Wowza who recommended WireCast to us). WireCast has since added the capability to encode directly to Flash, although it only does H.264. One of the problems we immediately encountered with this conversion was that the video lagged the audio by 10 or 11 frames. Initially, we had to add a hardware audio delay prior to encoding to compensate, which meant that any streams we recorded had the audio lagging by 10 or 11 frames. Wowza later added an option in their configuration to compensate at the server end, which allowed us to record in sync and get rid of the delay unit.
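To put that offset in perspective, assuming a ~30fps encode:

```python
# 10-11 frames at ~30fps is roughly a third of a second, well beyond
# the point where lip-sync error becomes obvious to viewers.
fps = 30.0
for frames in (10, 11):
    print(f"{frames} frames = {frames / fps * 1000:.0f} ms")
```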

By this point, the audio was in sync but still not the quality we expected. Frequent drops and “zippering” on both the stream and the recording seemed to indicate a hardware bottleneck on the encoder. When we started looking at performance metrics on the Dell, we noticed that not only was CPU usage during encoding up around 65%, but merely firing up WireCast would consume 10-15% CPU dealing with Deferred Procedure Calls (DPCs). We initially suspected the Osprey drivers, as DPCs are often related to video hardware and bad drivers. However, when I fired up Flash Media Encoder and pointed it at the same cards, the problem didn’t appear.

We then began to suspect the PCI bus and got our hands on an Osprey 450e, which puts four capture channels on a single PCIe board (PCIe has dramatically more bandwidth to work with than PCI or PCI-X, even at a single lane). The problem persisted, so we started looking at whether the PC itself was to blame. We got a demo of a used Precision 690 from Stallard Technologies and gave it a try, with no success. The PC platform was simply not working for us.

So, we gave our friends at Apple a call and asked if we could try out a Mac Pro for 30 days and purchase it if this fixed it. This is where we discovered many flaws in the plan:

  1. Wirecast Mac requires a different license key than the Win version. Added bonus: Our vendor seemed to have vanished.
  2. Osprey cards don’t work on Mac.
  3. The video capture world for Mac consists of a bunch of consumer-grade DV bridges, or really expensive broadcast-grade HD DV bridges. PCIe hardware is pretty much limited to DeckLink.
  4. While the DeckLink Driver supports multiple cards in one system, using them is entirely application-dependent, and WireCast doesn’t do that.

This process took us several months to work through, and we bought and returned thousands of dollars of hardware. Our vendors were very gracious to us throughout. A big thank you goes out to Apple, CDW (Osprey cards) and B&H (DeckLink) for their support. Thanks also to Matt at Stallard for letting me peruse their warehouse in search of a PC platform with sufficient PCIe slots, and then letting me return it when it didn’t solve all my problems (it didn’t make julienne fries, either).

At the end of it all, it became clear that while WireCast is a great application, it wasn’t going to meet our needs on a weekly basis (but I can see using it either as a backup plan, or for other venues as we expand). So, we began the quest for a new solution.

Early on in the process, I’d encountered a device called the TriCaster from NewTek: a PC-based switcher in a turnkey box, with streaming capability. It looked neat, but for the inputs we’d need, we were looking at $10,000, which wasn’t going to work within our budget. After some conversations with Terry Storch at Lifechurch.tv, I found out that NewTek makes a PC-based version called VT[5], a software/hardware combo where you supply the PC and OS. With an added breakout box (the SX-84) that expands the number of inputs beyond the initial three (up to 24 composite signals!), we were able to put this combo in our Optiplex 755 for about $6000. An SDI breakout is also available, but we’re not using it. The package comes with an entire production studio: switcher, playback, capture, nonlinear editing, character generation, live sets, chroma key, animation, and lots of other goodies too.

The TriCaster has a panel on the desktop that does streaming via Flash, but the default profiles max out at 480×360@30fps. I’m told you can create custom profiles, but this is not supported by NewTek. The VT[5], on the other hand, simply presents the program output as video and audio devices to the OS, and an added bonus is that the driver is coded so that multiple applications can use it simultaneously. To encode Flash, I simply launch FME after the software starts and configure my stream with anything FME supports. Meanwhile, I can also launch an instance of WME to feed a Windows Media stream to a server for low-bandwidth and mobile devices. For us, this is a huge advantage of the VT[5] over the TriCaster. Both have an optional USB control surface (the LC-11) that presents a mechanical interface of buttons and a T-bar familiar to anyone who’s operated a switcher. Thanks to a donor to our technical production ministry, we were able to acquire one, which should make things a lot easier on the volunteers who will be operating it.

Because the VT[5] is a hardware/software solution, a lot of the issues we encountered with WireCast aren’t a factor. With WireCast, most of the magic happens in software; NewTek’s product makes all the magic happen on the PCI board and merely uses the software to control the hardware. This alleviates much of the bandwidth constraint posed by the PCI interface. Rumour has it the next generation of NewTek hardware, present in the TriCaster XD300, is a 16x PCIe device (unsurprisingly, as it supports HD).

So far, the VT[5] is looking good.

(I’ve also heard of some places that have used one of the M/E buses on their production switcher to feed the stream, and then used the main program out of the switcher for IMAG. I’ve not tried this, so I can’t tell you how well it works.)

Stay tuned for the next installment, where I’ll talk about our production process.

Live Streaming On a Budget (Part 1) – Genesis

When our senior pastor started casting his vision of an Internet Campus which would revolve around a live (or nearly-live) stream of worship, it became pretty apparent that this was not going to scale well or cheaply. Over the course of the summer of 2008, we started exploring creative ways to do the impossible with nearly no money.


March Madness: The Network Plumber’s Perspective

Web video is clearly here to stay. Heck, I currently have 40% of my time dedicated to producing and delivering web video of our weekend worship services. I think this is tremendously cool stuff, and traditional one-way RF-based video delivery (a.k.a. TV) is pretty lame. My kids have no concept of a broadcast schedule. Their content world is one that is immersive, interactive, and on-demand.

We’re now coming up on the season that we network admins have begun to dread over the last few years: March Madness. With the networks advertising live web video of every. single. game., those of us charged with the care and feeding of the WAN pipes blanch in abject terror. We know that 95% of our staff are going to want to watch while they work. It doesn’t take much math to figure out that even one stream, multiplied by a couple hundred people, means the remaining three people in the organization who don’t give a hoot about hoops aren’t going to be able to get any work done, much less pick up the slack the rest of us are leaving.
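The back-of-the-envelope math, with assumed numbers for the stream bitrate and the WAN pipe:

```python
# Assumed figures for illustration: a modest per-viewer stream bitrate
# against a DS3-class corporate WAN pipe.
viewers = 200
stream_kbps = 700        # assumed bitrate of one tournament stream
wan_mbps = 45.0          # assumed WAN capacity

demand_mbps = viewers * stream_kbps / 1000
print(f"demand: {demand_mbps:.0f} Mbps vs pipe: {wan_mbps:.0f} Mbps "
      f"({demand_mbps / wan_mbps:.1f}x oversubscribed)")
```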

When you do internet video on the scale of the NCAA tournament, or a news network during a major news event, you’re relying on the performance of your CDN. Naturally, you want to accurately count eyeballs so that the advertisers pay you appropriately. A lot of time and effort goes into engineering these things, and it’s quite remarkable how well it all works.

CNN’s approach using Octoshape is a creative one that really pushes P2P technology into the mainstream of legitimacy. I was present at the creation nearly ten years ago, when Gnutella was leaked to the world and changed the rules of the multimedia distribution game, and I recall thinking how interesting things were going to become. Out of the Gnutella proof-of-concept came LimeWire and others, and then BitTorrent figured out how to dial the concept up to a global scale. Now the same idea is being integrated into mainstream CDNs with Octoshape and other “cloud” applications.

It seems to me that the CDN operators should be able to engineer their networks such that a corporate network admin (such as myself) could download a piece of software onto a spare piece of gear and run a node of the CDN internal to the corporate network (or, for that matter, run it as a VMware virtual appliance). This not only softens the blow to my WAN pipes, but also lightens the load on the public parts of the CDN. The only thing then crossing the WAN connection is a single instance of each stream being requested by clients internal to the company. The node simply phones home with the proper client count for advertiser tracking, and bingo: people can get work done, as well as watch their favourite team make a run at the Final Four.

…Or we network admins can simply block the CDN in our content filters and tell our users that we’re mean party poopers, depriving them of their hoops and depriving the webcasters of their revenue.