
Hosted desktop, what’s the point?

I keep coming across small businesses that are buying into hosted desktop services because someone told them it was a good idea.

What am I missing? I really don’t get the value this adds. Quite the reverse: I look on in incredulity at the self-inflicted pain of a user trying to run YouTube in a web browser on a “hosted desktop” service streamed back to their local desktop.

Of course there is short-term value in rolling out thin-client hardware and managing central hosted desktops for some larger organisations. These are the poor folks who, through inertia, regulatory pressures, or just downright lack of imagination, are stuck running labyrinthine suites of legacy Windows executable software that was conceived in a previous era. Presenting and maintaining a complex configuration of cranky desktop software, often customised on a per-user-role basis, is probably best done by turning the whole steaming heap into a hosted service in its own right, consumed from a simplified desktop. I get that.

What I really don’t understand is why more agile, otherwise unencumbered businesses are being sold desktop PCs on which they then consume hosted desktops, on which they then consume other cloud services over the Internet. What is the point of that in 2015? Why not have them consume the hosted web-based services directly on a local thin desktop? When I ask this question of the folks selling this stuff I get a lot of hand-waving about access control, security, and central configuration. None of it is at all convincing. As far as I can see the value is a bit asymmetric, as in: “it is our only way of getting a slice of the x per seat per month market; after all, if we don’t sell customers a hosted desktop then all the valuable recurring revenue just goes to Microsoft, Google et al. and we are left in the commodity business of swapping out dead mice and keyboards”.

That’s all very well, but at some stage the SME customer is going to work out that rendering the UI of their largely web-based and increasingly media-rich applications through some external server somewhere, only slower than if they consumed the service directly in a local client, is the reverse of adding value.

There must be some compelling pain point that these services resolve to make them saleable, so folks who are deploying them, what is the big enhancement they bring?


When will Metcalfe’s law kill the Telephone?

Metcalfe’s law is a perfect fit to explain business phenomena like how WhatsApp built a $19bn valuation for itself last year, and in doing so removed an estimated $30bn a year of global SMS messaging revenue from carriers in just a few weeks once the application achieved critical mass.

Bob Metcalfe is a pretty smart guy: he invented Ethernet, founded one-time industry giant 3Com (now part of HP), and proposed an equation that estimates the value of any communication network based on the number of participants. The value he came up with was n(n-1)/2, which in English means: the value of a communication network is proportional to the square of the number of participants. Various folks have proposed tweaks to this over the years, and of course it only gives a relative magnitude of the value of a small number of participants vs a larger number on the same platform. It doesn’t deliver a quantitative monetary value without knowing a lot more about what the network facilitates, and it assumes all nodes have the same value, etc. It is however widely accepted as the rule that explains how networks grow rapidly (exponentially) in desirability and value once they achieve a critical mass. It’s why WhatsApp and iMessage get such a big share of the messaging “market” when there are hundreds of other messaging apps that do the same thing. This is all pretty intuitive stuff really: we don’t need a mathematical equation to model the fact that I’m going to choose a messaging app where I can find most of my friends.
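To make the shape of that curve concrete, here is a minimal sketch of Metcalfe’s formula: n(n-1)/2 is just the number of unique pairwise connections among n participants, and the relative values it produces are what matter, not any monetary figure.

```python
def metcalfe_value(n: int) -> int:
    """Relative network value per Metcalfe's law: the number of
    unique pairwise links among n participants, n(n-1)/2."""
    return n * (n - 1) // 2

# Doubling the participants roughly quadruples the relative value,
# which is the "critical mass" effect described above.
for n in (10, 20, 40, 80):
    print(n, metcalfe_value(n))
```

For large n the ratio between successive doublings tends to 4, which is why growth past critical mass feels so abrupt.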

As the cannibalisation of SMS shows, Metcalfe’s law isn’t all about growth inflexion points. Curves that go up rapidly can come down again pretty quickly too. Here’s the interesting bit though: I don’t think it is as simple as a bidirectional application of Metcalfe would imply. I’m sure there is lots of hysteresis, or rather a different equation, on the way back down.

When a technology with real users has been incumbent for a long time, a ‘long tail’ happens as numbers on the network reduce to a successively harder core of highly entrenched end-users. Objectively Metcalfe’s law still applies, but I suspect that as a network shrinks there is an extra factor in the value perception: an individual bias based on the length of time each user has habitually chosen the technology. It isn’t just the established preferences of individuals behind the inertia that established networks have; there are (only occasionally rational) economic reasons for larger entities to continue using a network they have made a substantial capital or organisational investment in. This means that the decline of things like telephone calls probably won’t be the mirror image of the exponential growth new communication networks enjoy.
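One way to picture the asymmetry is a toy model along the lines suggested above: keep the pairwise-link value, but add a per-user inertia term that grows with how long users have been entrenched. The bias term and its weight here are illustrative assumptions of mine, not part of Metcalfe’s formulation.

```python
def perceived_value(n: int, years_entrenched: float = 0.0,
                    bias_weight: float = 1.0) -> float:
    """Toy model: Metcalfe's pairwise-link value plus an illustrative
    inertia bonus proportional to user count and years of habitual use."""
    links = n * (n - 1) / 2
    inertia = bias_weight * years_entrenched * n
    return links + inertia

# A shrinking network of long-entrenched users (think the PSTN) retains
# more perceived value than a young network of the same size would:
legacy = perceived_value(100, years_entrenched=30, bias_weight=5.0)
newcomer = perceived_value(100)
```

Under a model like this the downslope is flatter than the upslope, which is the hysteresis argued for above: the decline is a long tail rather than a cliff.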

There is now no real doubt that use of the public switched telephone network for primary communication is declining across the board, and it has probably never existed at all for the youngest generation, who are about to enter the workplace. Will this fall reach a certain inflection point and then drop off a cliff, as Metcalfe’s law would imply, or will we see a slow, lingering decline over many decades? What is your view?


Why the WebRTC video codec choice is important


Real-time peer-to-peer communication on the web has had a good couple of months. At the end of January Chrome and Firefox demonstrated interop of two independent WebRTC implementations, and then a couple of weeks ago Google squelched the MPEG LA attack on the open status of its key VP8 codec in a deal that granted it the full rights to any MPEG LA pool IP related to VP8 (if any ever existed) and used this to grant a free licence to other implementations.

Most sensible folks can see why it is necessary to agree on one mandatory video codec for the web which is openly available to all implementers. The next generation of applications needs to be able to rely on endpoints talking the same language, which means that the technology chosen has to be universal.

The trouble is that nobody can agree on what this should be. Established players, especially those who own codec IP, want this to be H.264, as it works in their commercial favour and puts new entrants at a disadvantage. Others, who see the need to build standards out of open, freely implementable technologies, favour VP8, which was built from the ground up to avoid the patent thicket around H.264.

There are some marginal differences between H.264 and VP8 in encoding efficiency, but the only real argument for H.264 that stands up is that it would make it easier for new WebRTC implementations to talk directly to existing, mostly embedded hardware endpoints that currently only implement H.264. Whilst this is valid, it won’t be a common scenario on tomorrow’s Internet, and there are plenty of ways to achieve interoperability. Relatively speaking there really aren’t that many existing H.264 embedded hardware implementations (hands up if you have a video phone on your desk?), certainly not compared to the billions of WebRTC endpoints that will exist in released web browsers within months. With H.264 as an optional or plugin codec, vendors with legacy H.264 devices could simply take care to use an endpoint with optional H.264 support in their application, upgrade their current embedded hardware to support VP8, or, if all else fails, transcode on their proprietary MCUs. Encumbering the whole Internet with H.264 to accommodate this one use case is unacceptable collateral damage.

Just when it seemed that the VP8 vs H.264 tradeoff would go VP8’s way after the Google/MPEG LA announcement, Nokia served a fairly astonishing spoiler this week when it bowled an IP disclosure into the IETF, claiming its own rights in VP8 which it would not be prepared to licence on any acceptable terms. It later admitted that this was deliberately done to derail the efforts to standardise on VP8. If Nokia’s claims are genuine and it really does own significant IP in VP8 then it is probably a good thing that it did disclose it at this stage, although it would be interesting to know why it didn’t do so very much earlier. If its claims are found to be weak then it is a pretty shocking way to try to manipulate a standards process.

Rather perversely, my company is developing software that gateways between WebRTC and legacy SIP video phones, among other things, so it would actually be good for us if H.264 were mandated in WebRTC. It would be very bad for the Internet, though, so I’m really hoping that the questions around Nokia’s VP8 claims are quickly resolved in the right direction!
