Live Streaming on a Budget (Part eleventy) – Metrics

It’s all well and good that we’re putting this stream out there… But if we’re not reaching anyone, it’s kind of a big waste of money and time. How do we go about finding out how many people we’re reaching?

Fortunately, Wowza has a built-in mechanism for reporting the number of active stream connections via an HTTP connection to the streaming server. It does this primarily for load balancing purposes, but the data is easily parsed for other things as well.

I currently have a VB script (the code is horrid, because I suck at VB) that connects to each of the streaming servers, parses the responses into numerical values, and adds them up to get a total stream count. The script runs on a 5-second loop, keeps track of the peak, and gives me the following output, where the red area is the origin server, green is the repeaters, and blue is the iPhone streams. Windows streams are gleaned directly from Windows Media Services.

Stream count output

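For the curious, my VB script boils down to something like this shell sketch. The status URL and response format below are placeholders; substitute whatever connection-count page your version of Wowza exposes:

#!/bin/bash
# Poll each streaming server's connection-count page, sum the results,
# and track the peak. ASSUMPTION: the URL below is a placeholder, and
# the response reduces to a single number once non-digits are stripped.
SERVERS="origin repeater1 repeater2 repeater3 iphone"
PEAK=0
while true; do
  TOTAL=0
  for S in $SERVERS; do
    COUNT=$(curl -s "http://$S.example.com:1935/connectioncount" | tr -cd '0-9')
    TOTAL=$((TOTAL + ${COUNT:-0}))
  done
  [ $TOTAL -gt $PEAK ] && PEAK=$TOTAL
  echo "$(date '+%H:%M:%S')  streams: $TOTAL  peak: $PEAK"
  sleep 5   # same 5-second loop as the VB version
done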

That’s all well and good, but how many actual people are watching? We know there are several people who watch this alone, but others do it in groups or with their family. Initially, when we benchmarked other churches, we were told that a ratio of about 1.8 people per stream was a pretty reliable guess. We went with that for a while, as we gathered our own data.

To gather our data, we created a sign-in/feedback form for our web audience that functions very much like the friendship pad we pass around in our physical worship services. One key question you’ll find only online is “how many people are worshipping today?” Based on the aggregated data from these forms (we’ve collected nearly 8,000), we found that our ratio was closer to 1.7, so we started using that for reporting purposes. We typically see about 40% of our peak stream count send us a feedback form, so the ratio is probably drawn from a fairly representative sample. Periodically, we’ll see the ratio jump to 2:1 in special circumstances, such as inclement weather at our central campus, and we’ll adjust the numbers for that service accordingly.
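If you want the counting loop from earlier to do the reporting math too, a line like this at the end of each pass applies the multiplier (bash only does integer math, hence the scale-by-ten trick; 1.7 is our ratio, not a universal constant):

  # estimated worshippers = peak streams x 1.7 (x17/10 in integer math)
  echo "estimated attendance: $((PEAK * 17 / 10))"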

I’m sure there’s a better way than my cheesy VB script, but it works for now.

Live Streaming on a Budget (Intermission) – Speedlinks!

Here’s a bunch of useful links accumulated recently during the course of my research. They’re here as much for my own recall as they are for your benefit.

Live Streaming on a Budget (Part 5) – Automating EC2

I mentioned a few posts back that I was looking for a way to automate startup and shutdown of the servers. Thanks to some great sleuthing by Justin Moore at Granger Community Church, I got some scripts to start from. I had to make some modifications to suit our exact purposes, but that was relatively easy.

Ingredients:

Linux system with:

  • Java Runtime (in Ubuntu, the package is sun-java6-jre)
  • Amazon EC2 API Tools (installed in /opt/ec2/tools)
  • Wowza Startup Packages (installed in /root/ec2)
  • EC2 keys (installed in /root/ec2)

Note: Because these scripts are run from cron, you’ll need to set all the environment variables the EC2 tools need at the beginning of each one.

I have 6 separate versions of the startup and termination scripts, one for each server I need to start. I could roll them into one big script, but keeping them separate not only lets me start an individual machine manually, it also lets me run them all in parallel from cron, which shortens the startup time.

The startup script functions as follows (a sketch appears after the list):

  1. Assign environment variables for EC2 and for the machine parameters
  2. Launch machine with ec2-run-instances, redirect output to a temporary file*
  3. Parse temporary file and extract the instance ID, and put it into an environment variable
  4. Write contents of instance ID environment variable to a file in /root/ec2 for use by the shutdown script
  5. Wait 30 seconds
  6. Start checking instance status so we know when it’s running (waiting 10 seconds between checks if it’s not)
  7. Attach any EBS volumes (Optional – I don’t currently need this, so it’s commented out)
  8. Assign Elastic IP
  9. Execute any additional commands via ssh (Optional, I don’t have any that need to run)

* The original scripts use /tmp/a, which is fine, but I had to make each script do its own temporary file since all 6 were running simultaneously and I ran into problems with getting the right Instance IDs set.
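Here’s a hedged sketch of one of those per-server startup scripts, using the old Java-based EC2 API tools. The AMI ID, keypair, Elastic IP, and file paths are all placeholders; verify the parsing against your tools’ actual output before trusting it.

#!/bin/bash
# Sketch of one startup script (steps 1-9 above). All IDs, addresses,
# and paths are placeholders.

# 1. Environment variables -- cron won't set these for you
export JAVA_HOME=/usr/lib/jvm/java-6-sun
export EC2_HOME=/opt/ec2/tools
export EC2_PRIVATE_KEY=/root/ec2/pk.pem
export EC2_CERT=/root/ec2/cert.pem
export PATH=$PATH:$EC2_HOME/bin

AMI=ami-xxxxxxxx              # placeholder AMI ID
TMP=/tmp/start-origin.$$      # per-script temp file (see footnote above)

# 2. Launch the instance and capture the output
ec2-run-instances $AMI -k my-keypair -t m1.small > $TMP

# 3-4. Extract the instance ID and save it for the shutdown script
IID=$(grep '^INSTANCE' $TMP | cut -f2)
echo $IID > /root/ec2/origin-instance-id

# 5-6. Wait, then poll every 10 seconds until the instance is running
sleep 30
until ec2-describe-instances $IID | grep -q running; do
  sleep 10
done

# 7. (Optional) attach EBS volumes with ec2-attach-volume -- not needed here

# 8. Assign the Elastic IP
ec2-associate-address -i $IID 203.0.113.10   # placeholder address

# 9. (Optional) run any post-boot commands via ssh here
rm -f $TMP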

The shutdown script works like this (sketch after the list):

  1. Query AWS for all running instances
  2. Issue EC2 termination call
  3. ???
  4. PROFIT!
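In non-underpants-gnome terms, the termination side might look like this, reusing the instance-ID file the startup script wrote (same placeholder paths and cron environment caveat as above):

#!/bin/bash
# Sketch of one termination script. Placeholder paths throughout.
export JAVA_HOME=/usr/lib/jvm/java-6-sun
export EC2_HOME=/opt/ec2/tools
export EC2_PRIVATE_KEY=/root/ec2/pk.pem
export EC2_CERT=/root/ec2/cert.pem
export PATH=$PATH:$EC2_HOME/bin

IID=$(cat /root/ec2/origin-instance-id)

# 1-2. Confirm the instance is still running, then terminate it
if ec2-describe-instances $IID | grep -q running; then
  ec2-terminate-instances $IID
fi
# 3-4. ??? / PROFIT!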

Lastly, put it in your crontab:

# m h  dom mon dow   command
15 8 * * 0 /root/start-windows.sh
25 8 * * 0 /root/start-origin.sh
25 8 * * 0 /root/start-iphone.sh
25 8 * * 0 /root/start-repeater1.sh
25 8 * * 0 /root/start-repeater2.sh
25 8 * * 0 /root/start-repeater3.sh

0 19 * * 0 /root/term-windows.sh
0 19 * * 0 /root/term-iphone.sh
0 19 * * 0 /root/term-origin.sh
0 19 * * 0 /root/term-repeater1.sh
0 19 * * 0 /root/term-repeater2.sh
0 19 * * 0 /root/term-repeater3.sh

This starts them all up on Sunday morning (the Windows instance at 8:15, since it needs more boot time; the rest at 8:25) and shuts them all down at 7:00 on Sunday evening. If your Linux box’s clock is set to GMT, be sure to take that into account.
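To make the GMT caveat concrete: if the box’s clock is on GMT and you’re in US Central time (GMT-5 during daylight saving time), the 8:25 entries have to be written as 13:25, and revisited when the clocks change:

# 8:25 AM CDT, expressed in GMT
25 13 * * 0 /root/start-origin.sh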

If you’re using an encoder that can be started and stopped on a schedule, synchronize your times with this and you’ll be golden. The Wowza EC2 images take about 60-90 seconds to get fully up and running; the Windows one takes about 10-15 minutes. Currently, the Windows server pulls from the encoder via a 1:1 NAT rule, so WME can be running before the EC2 instance is up. When the instance is ready, it simply connects to the encoder and is off and running.

Live Streaming on a Budget (Part 4) – How it works

We started streaming to iPhones today. Huge success, and way easier than it ought to be, now that the iPhone does HTTP streaming and Wowza’s V2 software supports everything needed to stream to the FruitFone. All we had to do was shell out $250 to MainConcept for their AAC encoder plugin for Flash Media Live Encoder. (Although we subsequently discovered a bug in the MainConcept encoder that causes audio sync problems on the iPhone, so we’ve moved iPhone encoding off to a separate machine.)

There are a lot of layers to this onion, so I put together a block diagram that links everything from the cameras to the client.

Live Streaming On a Budget (Part 3) – The Process

For background information, see Part 1 and Part 2

Server Infrastructure

We currently send our Flash stream to a small EC2 instance of the Wowza AMI, configured with the liverepeater application in origin mode (using the prebundled packages), and 3 small EC2 instances running liverepeater edge mode that pull from that server. We also run a custom Windows AMI running Windows Media Services, so we have 5 instances in all. Total cost for these is 68 cents an hour.

We have a set of five Elastic IPs in reserve so that we don’t have to jack with DNS every time; the repeaters are configured in a round-robin DNS record that is referenced by the player. The origin and Windows instances also have their own Elastic IPs in DNS. It costs us a penny for every hour each of these IPs is not mapped to an instance, but that’s a small price to pay for the simplicity.
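In zone-file terms, the round-robin is just multiple A records on a single name. Hostnames and addresses here are made up; the player only ever references the repeaters name:

; BIND-style zone file excerpt
origin     IN A 203.0.113.10
windows    IN A 203.0.113.11
repeaters  IN A 203.0.113.21   ; repeater 1
repeaters  IN A 203.0.113.22   ; repeater 2
repeaters  IN A 203.0.113.23   ; repeater 3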

Unfortunately, the process of spinning up the servers and assigning their IP addresses is not especially attractive to someone who is nontechnical. I’m working on a way to do this easily from a web interface, but that’s still on the drawing board. If anyone is handy with Perl or PHP and wishes to assist with this, let me know. Perl has a module on CPAN for accessing the Amazon API (Net::Amazon::EC2); I don’t know about PHP. The API tools that Amazon provides are written in Java, and there’s a Firefox extension for managing the machine images.

The Windows instance is configured with WMS and a publishing point set to pull from the encoder and auto-start. The VT[5] machine has a static IP and is NATted to an external IP so all I have to do is fire it up and wait for the encoder to see the incoming connection.

Getting it rolling

On Sunday morning, I show up at about 9:00 and sit through the first traditional service to get a feel for how the service flows. While I’m doing that, I’m getting VT[5] ready for the morning’s services, creating the pre-service announcement loop from the slide graphics our communications department puts up on Backpack. Once VT[5] is running and presenting the video device to the system, I start Flash Media Encoder and Windows Media Encoder. Around 10:00, I’ll spin up the Windows instance (which takes about 15 minutes to boot) and then fire up my 4 Wowza servers (which are usually ready in 1-2 minutes). After I assign the Elastic IPs to my instances, it takes about a minute to propagate the changes through Amazon’s network. When everything is ready, I launch the slide loop and a music bed and start both encoders.

Once that’s going, I turn to my laptop and fire up Outlook (to catch the feedback forms from Wufoo), PlanningCenter (to follow the service rundown), IRC (tech chat with people involved in the stream), an RDP session to the Windows instance to watch client counts, and a small VB script that polls each of the Wowza servers for client counts and adds them all up for me so I can see how many people are watching.

Over on another screen at the encoding station, I launch Woopra to watch live site stats and see where our audience is coming from, keep an eye on network traffic with WhatsUp Gold, and watch the streams coming off the Wowza and Windows servers.

We’re currently building out an ops console station that will offload the monitoring to staff (that means yours truly most weeks), while a volunteer handles the switching. The prototype: [photo]

This will live just around the corner from the switching workstation and will be equipped with a laptop dock. I’ll post pics when it’s all done. It will also house a backup system running WireCast in case the VT[5] bursts into flames or something equally catastrophic.

The Service

Most of the time, I’m feeding the IMAG program into the live stream, but will take a different camera shot every now and then to establish context or provide better visuals when IMAG is running a full-screen slide (such as prayer times). I have access to all 7 IMAG camera feeds as well as an additional Sony EVI-D70 camera on stage pointing out at the sanctuary seating. We’re working on getting remote control of that camera so that we can get some extra shots.

On the audio side, we feed our onstage mics back to a separate console in the recording studio and mix for broadcast there. We run the output through a dbx compressor upstream of the VT[5]. We’re also in the process of cabling for direct input of the FOH mix into the VT[5] in case of a failure in the broadcast mix hardware.

The Aftermath

Once the service is over, our video crew will shoot the postlude for the web, and then we’ll run slides for a few more minutes before shutting everything down. Then we repeat the whole process for the evening service or any special broadcasts.

See? It’s just that simple!

Live Streaming On a Budget (Part 2) – Encoding

If you missed it, go back to Part 1 to see how we got here.

As I mentioned previously, Flash video was a key functional requirement of the project due to its cross-platform ability and its near ubiquity in the browser. This was a solution that wouldn’t require most of our audience to download anything extra to their machine.

In order to stream Flash, you have a couple of options:

  1. Use a dedicated encoder system that slurps video in one end and spits Flash out the other. From a simplicity standpoint, this is great. From a production standpoint, not so great, because more than likely what you’re feeding it is your IMAG program, which doesn’t lend itself very well to people outside the room. I’ll cover that in a later post.
  2. Use a PC with a capture card and Flash Media Encoder. Cheap, simple to put together, but it suffers the same issues as option #1.
  3. Do some switching in software and encode to Flash.
  4. Run a dedicated switcher into an encoder. This gets expensive in a hurry.

We initially went with Option #3 using Telestream’s WireCast software. WireCast is a nice, inexpensive option (about $300 at church/educational pricing) and does its job reasonably well.

WireCast is available for both Mac and Windows (although a license key doesn’t allow you to move between platforms, a drawback). The Mac version has a slightly more polished user interface and the ability to show your program feed on a VGA output. The Windows version can’t do that, but it does have the option of encoding to Windows Media. The software will take any video source presented to the OS, whether it’s a USB webcam, a DV device such as a video deck/camera/analog bridge, or a dedicated capture board such as ViewCast’s Osprey line. It has some serious portability advantages (a laptop and a couple of DV-capable cameras, and you’re good to go!), but we encountered a number of issues that eventually led us to another solution. If you’re looking at using WireCast with multiple analog sources, I’d highly recommend two capture cards fed by a matrix switch with two outputs. This makes multi-input scenes in WireCast a lot easier to configure.

We configured our WireCast system with a pair of Osprey 210 PCI cards in a quad-core Dell Optiplex 755 (I really would have liked an additional card, but the Opti mini-tower only has two PCI slots). Each Osprey card had its own audio capture which presented itself in Wirecast.

At the time, WireCast didn’t actually encode to Flash directly, but rather to H.264 QuickTime that was then converted to Flash by the Wowza software (it was the folks at Wowza who recommended WireCast to us). WireCast has since added the capability to encode directly to Flash, although it only does H.264. One of the problems we immediately encountered with this conversion was that the video lagged the audio by 10 or 11 frames. Initially, we added a hardware audio delay prior to encoding to compensate, which meant that any streams we recorded had the audio lagging by 10 or 11 frames. Wowza later added a server-side configuration option to compensate, which allowed us to record in sync and get rid of the delay unit.

By this point, the audio was in sync, but still not the quality we expected. Frequent drops and “zippering” on both the stream and the recording seemed to indicate a hardware bottleneck on the encoder. When we started looking at performance metrics on the Dell, we noticed that not only was CPU usage during encoding up around 65%, but merely firing up WireCast would consume 10-15% CPU dealing with Deferred Procedure Calls (DPCs). We initially suspected the Osprey drivers, as DPCs are often related to video hardware and bad drivers. However, if I fired up Flash Media Encoder and pointed it at the same cards, there was no problem.

We then began to suspect that PCI itself might be the problem and got our hands on an Osprey 450e, which packs four encoders onto a single PCIe board (PCIe has dramatically more bandwidth to work with than PCI or PCI-X, even at a single lane). The problem persisted, so we started looking at whether the PC was to blame. We got a demo of a used Precision 690 from Stallard Technologies and gave that a try, with no success. The PC platform was not working for us.

So, we gave our friends at Apple a call and asked if we could try out a Mac Pro for 30 days and purchase it if it fixed the problem. This is where we discovered the many flaws in that plan:

  1. Wirecast Mac requires a different license key than the Win version. Added bonus: Our vendor seemed to have vanished.
  2. Osprey cards don’t work on Mac.
  3. The video capture world for Mac consists of a bunch of consumer-grade DV bridges, or really expensive broadcast-grade HD DV bridges. PCIe hardware is pretty much limited to DeckLink.
  4. While the DeckLink Driver supports multiple cards in one system, using them is entirely application-dependent, and WireCast doesn’t do that.

This process took us several months to figure out, and we bought and returned thousands of dollars of hardware. Our vendors were very gracious to us throughout. A big thank-you goes out to Apple, CDW (Osprey cards), and B&H (DeckLink) for their support. Thanks also to Matt at Stallard for letting me peruse their warehouse in search of a PC platform with sufficient PCIe slots, and then letting me return it when it didn’t solve all my problems (it didn’t make julienne fries, either).

At the end of it all, it became clear that while WireCast is a great application, it wasn’t going to meet our needs on a weekly basis (but I can see using it either as a backup plan, or for other venues as we expand). So, we began the quest for a new solution.

Early on in the process, I’d encountered a device called the TriCaster from NewTek: a PC-based switcher in a turnkey box, with streaming capability. Looked neat, but for the inputs we’d need, we were looking at $10,000, which wasn’t going to work within our budget. After some conversations with Terry Storch at Lifechurch.tv, I found out that NewTek makes a PC-based version called VT[5], a software/hardware combo where you supply the PC and OS. With an added breakout box (the SX-84) that expands the number of inputs beyond the initial 3 (up to 24 composite signals!), we were able to apply this combo to our Optiplex 755 for about $6,000. An SDI breakout is also available, but we’re not using it. The package comes with an entire production studio: switcher, playback, capture, nonlinear editing, character generation, live sets, chroma key, animation, and lots of other goodies too.

The TriCaster has a panel on the desktop that does streaming via Flash, but the default profiles max out at 480×360@30fps. I’m told you can create custom profiles, but this is not supported by NewTek. The VT[5], on the other hand, simply presents the program output as video and audio devices to the OS. An added bonus is that the driver is coded such that multiple applications can use it simultaneously. To encode Flash, I simply launch FME after the software starts; I can then configure my stream with anything FME supports. Meanwhile, I can also launch an instance of WME to feed a Windows Media stream to a server for low-bandwidth and mobile devices. For us, this is a huge advantage of the VT[5] over the TriCaster. Both have an optional USB control surface (the LC-11) that presents a mechanical interface of buttons and a T-bar familiar to anyone who’s operated a switcher. Thanks to a donor to our technical production ministry, we were able to acquire one, which should make things a lot easier on the volunteers who will be operating it.

Because the VT[5] is a hardware-software solution, a lot of the issues we encountered with WireCast aren’t a factor. With WireCast, most of the magic happens in software. NewTek’s product makes all the magic happen on the PCI board, and merely uses the software to control the hardware. This alleviates much of the bandwidth constraint posed by the PCI interface. Rumour has it the next generation of the NewTek hardware that is present in the TriCaster XD300 is a 16x PCIe device (unsurprisingly, as it supports HD).

So far, the VT[5] is looking good.

(I’ve also heard of some places that have used one of the M/E buses on their production switcher to feed the stream, using the main program out of the switcher for IMAG. I’ve not tried this, so I can’t tell you how well it works.)

Stay tuned for the next installment, where I’ll talk about our production process.

Live Streaming On a Budget (Part 1) – Genesis

When our senior pastor started casting his vision of an Internet Campus which would revolve around a live (or nearly-live) stream of worship, it became pretty apparent that this was not going to scale well or cheaply. Over the course of the summer of 2008, we started exploring creative ways to do the impossible with nearly no money.

Read More

Round Table Session 1 Notes

Cool Tools:

  • BombBomb
  • RoyalTS
  • RDTabs
  • MRemote
  • LovelyCharts
  • SpiceWorks
  • Kiwi/SolarWinds
  • Likewise
  • Mobiscope

Volunteers in IT:

  • How do we recruit volunteers?
  • Volunteer Fairs
  • Be clear about requirements
  • Background checks
  • This is a production environment, not a training ground
  • As leaders, we need to define the scope way ahead of time
  • Give your volunteers a tour, show the blinkenlights
  • Good opportunity for people out of work to keep skills sharp, feel valued
  • Weekend Announcements

Offsite Backup

  • Backups are for the weak of faith — bryson
  • What needs to be backed up, how often – not an IT decision, but a business decision
  • Control/security of offsite data
  • What’s the most important to leadership in case of a disaster?

Live geoanalytics – need help!

I’m looking to put together a live map showing where people are coming from on our live stream. One format of this map would be a full-screen display at the ops console; the other would be a small map on the website itself. If you’re using this kind of technology, I’d love to know how you’re doing it, whether it’s with a monthly service or with code you rolled yourself. What I’ve looked at so far:

Google Analytics: Doesn’t come anywhere close to realtime. Looks like about a 24-hour waiting period for your data. Looking at the historical data for the live site, it doesn’t seem to be all that accurate either. Numbers, locations, and durations of visits seem to be way off what we’re seeing in our feedback and in our logs.

W3Counter: Seems interesting, but their site performance/availability is a major problem. I smell scalability issues.

VisiStat: Very nice product, but a little spendy for what I’m after, considering its shortcomings. Live map doesn’t appear to have the ability to specify a timeframe. Either you refresh the page and it adds new visits to a blank map, or you leave it up and nothing falls off the back.

Feedjit: I use this for my blog, and it’s great for that (see widget in the sidebar). But I can’t see using this for a “real” site. I greatly dislike the inability to customize the widget beyond text color (I really don’t want it showing the geoblogosphere link, it’s completely irrelevant and a distraction). It too seems to lack the ability to restrict the map by timeframe.

None of these products appeared to have the ability to customize the map display, and most of them had a map that was ridiculously small and didn’t scale with the browser window.

If you rolled your own, how complex was it? What was the cost for the geolocation data?
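For anyone pondering the roll-your-own route, here’s the crude sketch I keep coming back to: tail the streaming server’s access log and run each client IP through geoiplookup (the free MaxMind tool in Ubuntu’s geoip-bin package). The log path and field position are assumptions, and getting the results onto an actual map is still the hard part:

#!/bin/bash
# ASSUMPTIONS: the access log lives at the path below, and the client
# IP is the first whitespace-delimited field -- adjust for your server.
LOG=/usr/local/WowzaMediaServer/logs/wowzamediaserver_access.log
tail -f "$LOG" | while read -r LINE; do
  IP=$(echo "$LINE" | awk '{print $1}')
  # skip lines that don't start with a dotted-quad address
  echo "$IP" | grep -Eq '^([0-9]{1,3}\.){3}[0-9]{1,3}$' || continue
  # geoiplookup prints something like "GeoIP Country Edition: US, United States"
  echo "$(date '+%H:%M:%S') $IP $(geoiplookup "$IP" | head -1 | cut -d: -f2-)"
done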

EDIT: Forgot about Woopra… Looks awesome, but it’s still vaporware.

EDIT^2: OK, so Woopra isn’t technically vaporware, apparently real people are using it, but it’s been in “beta” for a very long time.

We interrupt regular geek programming…

…to bring you this important public service announcement regarding child car seat safety.

My friend Christine asked our state troopers about car seat safety rules here in Kansas. Here is the reply she received:

Hi Christine,
my name is Trooper Tim McCool. I’m the Troop B (Topeka) Public Resource Officer. I’m also a Child Passenger Safety Technician/Instructor. I can appreciate your question, our current law is somewhat confusing. The origins of our current law start back in the 1980’s and the law has been revised several times over the years. Our legislators have tried to keep up with the current recommendations but have not always been successful. As law enforcement officers we try to look at what is recommended nationally and try to apply that to our local law. Our law doesn’t say you have to use a forward facing seat at one year of age it says that you must be using a seat properly, and if you follow the national recommendations then you should be using a rear facing convertible to its upper weight limit rear facing. What also leads to confusion is that the AAP currently recommends that the minimum you should turn a child around forward facing is now at 18 months and 25 pounds in weight. As you see, lots of information. Best rule of thumb, that will keep you out of trouble is to always secure your child in a CPS seat and follow the national recommendations. If you meet a law enforcement officer, most of the officers will defer to one of us that is a CPS Tech. and will support the national recommendations. Again, we law enforcement officers don’t make the laws here in Kansas, we only enforce them. If you or anyone else would like to see our law changed then I would suggest that you contact your local legislator and make your feelings known to them. If they don’t hear from their constituents, they won’t know that there is an issue. Please feel free to contact me if you have any further questions.

Tech. Trooper Timothy I. McCool
Public Resource Officer
Kansas Highway Patrol – Troop B

So, there you have it. 12 months and 20 pounds is now outdated information. Remember that 18 months and 25 lbs is a minimum, the reality is that you should keep them rear-facing as long as they are within height and weight limits of the seat (which for most is 33 lbs). We had to turn F around at 12 months on the dot because she was 34 lbs. C is still under 33 lbs, but she’s a lot taller than a rear-facing seat can handle. We didn’t flip her around until she was about 2.

Naturally, make sure the seat is properly installed in the car, and your child in the seat. If in doubt, get it checked. 95% of all carseats are improperly installed.