The Valley Cost Model: Broadcatching and Net Television

Given that I've written about how BitTorrent could work with transactions, it probably isn't surprising that I've been thinking about how the concept of broadcatching might support online video distribution that looks like "made for Internet television." We've tried "cybercasting festivals" and "live Internet shows," and all of them suffered from the expense of the peaks of demand. While broadcatching optimizes away those peaks (turning them into the least expensive phase of distributing the video), it leaves the problem of archiving older material as the new cost center. Does this mean that the best new "net television" business model might be "free when it's new, but you pay for archives"?

Optimizing the Peaks

In 2000, we tried daily video coverage from the Sundance Film Festival. Microcast (dot com victim #1 in this essay) shipped a huge remote digitization and streaming rig to our production hub in Park City, and we produced 30-minute shows daily. The server resources required to make that content available were orders of magnitude higher than what runs indieWIRE day to day, because the spikes in demand were so high -- everyone who wanted to see these video pieces was watching during the Festival, so even a hundred simultaneous streams took tremendous resources.

The experiment with Pseudo (dot com victim #2) suffered similar problems: their model was "live shows" broadcast from their "studios" in Manhattan. The expense of aggregating that attention onto their streaming servers was more than they could ever recoup from in-stream advertising -- the size of the audience precluded television-style advertising rates from working. Nor would moving to a subscription fee have produced much additional income: a subscription gate would have limited the number of viewers, further dropping the value of the in-stream advertising.

Here's the counter-intuitive thing about BitTorrent and RSS for people who have worked with video online in the past: by exaggerating the demand (making people cluster even more tightly in time by auto-downloading the newest "episode" as soon as it is published), BitTorrent lets a video publisher minimize the total bandwidth served (by ensuring that there are lots of other people downloading and sharing as peers). That efficiency, though, is short-lived: people dropping by for that video file even a couple of days later will rely more and more on the primary bandwidth you are providing.
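A toy model can make the shape of those costs concrete. The sketch below is my own illustration, not anything from the BitTorrent protocol: it assumes the origin server and every active peer upload at roughly comparable rates, so the server's share of the load is simply one part in (peers + 1).

```python
def origin_share(active_peers):
    """Rough fraction of total bytes the publisher's server supplies,
    assuming the origin seed and each peer upload at comparable rates.
    With no peers left, the server carries 100% of the load."""
    return 1.0 / (active_peers + 1)

def publisher_gigabytes(downloads, file_size_gb, active_peers):
    """Approximate bandwidth bill for serving `downloads` copies of a
    file while `active_peers` are swarming it."""
    return downloads * file_size_gb * origin_share(active_peers)

# Launch day: 1,000 downloads of a 0.5 GB episode, 999 peers swarming.
peak_cost = publisher_gigabytes(1000, 0.5, 999)   # 0.5 GB from the server

# A week later: 10 stragglers, only 1 peer still sharing.
valley_cost = publisher_gigabytes(10, 0.5, 1)     # 2.5 GB from the server
```

Even this crude model shows the inversion the essay is describing: the ten straggler downloads cost the publisher five times the bandwidth of the thousand launch-day ones.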

The Valleys: Broadcatching and Archiving

This is where my fear about "broadcatching" comes in, and where transactional models (whether micro-transaction or subscription) seem to offer a compromise. If (as a "net video publisher") the optimization of distribution costs created by BitTorrent and RSS is the key to profitability, then I am disincentivized from providing deep archives -- once the pareto efficiency of the peer swarm drops below a certain level (as demand drops), the bandwidth costs return more and more to the cost of simply posting the video file (costs that are financially difficult for a video publisher to bear).

The new mantra is: the higher the demand, the lower the cost. You now have to plan your costs around the valleys rather than the peaks, but the costs are still difficult. Fortunately, the idea that develops from this is one familiar to the Web: new content is free, and archival content costs. In this case, gatekeeping the torrent files at the server level -- whether through authentication (separate from a transaction), a transaction (using something like BitPass), or even depublishing (no longer available) -- provides that potential.

Envisioning the Valley Cost Model

As a model, that might mean you can afford to offer your "newest episode" for free for a limited period of time -- the period when your bandwidth costs are lowest because peer swarming is most efficient. At that point, the "other revenues" you can bring in might be sufficient, since you have been spared the majority of the bandwidth cost. This even provides an added incentive to adopt the strange new tools needed to participate in "torrenting with RSS feeds" (it helps ensure you get your copy during the free period).

Sometime later, as the demand for that torrent drops below some inverse tipping point (perhaps we should call it a "tripping point"), you need to start gatekeeping it -- charging a fee to access it, requiring a site subscription, or even removing it from the Web and selling DVD copies at a higher margin. In fact, you might even be able to do all of those -- imagine a new episode free the first day, site subscription required for the seven days after that, then pay-per-access for everyone for the rest of the month before it comes offline and goes onto the next DVD.
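That windowed release schedule is simple enough to write down as a rule. Here's a minimal sketch -- the tier names and day boundaries are just the hypothetical schedule from the paragraph above, not a description of any real system:

```python
def access_tier(days_since_release):
    """Map a torrent's age to an access policy for the hypothetical
    schedule described above: free on launch day, subscriber-only for
    a week, pay-per-access through day 30, then DVD only."""
    if days_since_release < 1:
        return "free"
    elif days_since_release <= 7:
        return "subscription"
    elif days_since_release <= 30:
        return "pay-per-access"
    else:
        return "dvd-only"
```

The point of encoding it this way is that the gatekeeping lives entirely at the server handing out torrent files; nothing about the swarm itself has to change.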

The more successful your RSS push, though, the faster you're likely to need to gate that torrent file -- the initial optimization of bandwidth might be a very brief event. In a perfect world, you might even be able to calculate this on the same kind of "pareto optimization" that BitTorrent itself works on: imagine a pay-per-access charge calculated directly from how much of the bandwidth the server will have to bear (the inverse of how many other people are peering the file). In that kind of system, pricing could be much more fluid -- a slashdotting of attention could drive up demand and drive down the price at the same time.
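As a sketch, that demand-indexed price might look like the function below. The formula (price proportional to the server's share of the load) is my own guess at what the paragraph describes, and the function name is hypothetical -- BitTorrent provides nothing like this.

```python
def demand_indexed_price(base_price, active_peers):
    """Charge in proportion to the bandwidth the server must bear:
    the fewer peers sharing the file, the closer the price gets to
    the full base price; a huge swarm drives it toward zero."""
    server_load = 1.0 / (active_peers + 1)  # server's share of the bytes
    return base_price * server_load

# A slashdotting (thousands of peers) makes access nearly free;
# a lone straggler browsing the archive pays full freight.
```

This is the "fluid" behavior described above in one line: a surge of attention raises the peer count, lowers the server's load, and lowers the price, all at once.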

That Tricky Ethos Again

Just because you can accomplish something online technically doesn't mean it will work with the audience -- I've no desire to have one of my projects be "dot com victim #3" of this essay. It all comes down to the ethos of the community that develops around BitTorrent and how they view different models of transaction as a part of the network. In a piece that Ernest Miller wrote about my previous entry, he picked up on the issue that has me really excited -- that independents are more likely to convince their audience that "we're all in this together" than corporations can:

"I believe that a well-designed market using broadcatching would encourage cooperation between creators and consumers, turning distribution into a collaborative effort. Sure, corporations could play this game, but independents could be on an almost equal footing, both would have consumers as their partners. I'm still thinking about the possibilities here, but I think they may be one of the most significant aspects of broadcatching."

This is part of what makes the "new is free" aspect of the valley cost model so interesting for independents -- it rewards the most ardent fans and subscribers by giving them the content as close to free as the business model allows, encouraging them to recruit new participants among the "first consumers," much the way early fax machine owners pushed the adoption of faxes -- because that adoption adds value (in broadcatching's case, by lowering costs) for everyone in the network.

posted to Emerging Systems on March 22, 2004