The Serverless Revolution
A couple weeks ago, something profound occurred to me about enterprise software. I realized that when it is priced by server or by processor, it’s a ripoff. When it is priced by data usage, it’s a bargain. Why should you pay for the potential of eight cores even when you are not using them? Admittedly this is rather easy to see if you have experience working in the cloud and then take a turn in on-premise practice, but it seemed rather profound to me. If this is not obvious to you, imagine that you pay a standard or premium license fee to your mobile phone carrier based on whether you are using a brand-new smartphone or an old one. Right. It makes no sense to pay for anything but the minutes you use on the phone. Mobile phone billing is done right: you pay for minutes. All software could be that way, so that you’re not paying for servers but for functions that spring to life, do your bidding, charge you a fractional penny, and then die.
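The economics above can be sketched in a toy comparison of flat per-server licensing against pay-per-use billing. All of the prices here are invented for illustration, not real license or cloud figures:

```python
# Toy comparison of per-server licensing vs. pay-per-use billing.
# All prices are invented for illustration; real figures vary widely.

SERVER_LICENSE_PER_YEAR = 40_000.00  # flat annual fee for an 8-core server license
PRICE_PER_INVOCATION = 0.0000002     # a fractional penny per function call

def per_server_cost(invocations_per_year: int) -> float:
    """You pay for the server's potential whether you use it or not."""
    return SERVER_LICENSE_PER_YEAR

def per_use_cost(invocations_per_year: int) -> float:
    """You pay only for the calls you actually make."""
    return invocations_per_year * PRICE_PER_INVOCATION

for calls in (1_000_000, 100_000_000, 1_000_000_000):
    print(f"{calls:>13,} calls/yr:  "
          f"license ${per_server_cost(calls):>10,.2f}   vs.   "
          f"per-use ${per_use_cost(calls):>10,.2f}")
```

The crossover only arrives at truly enormous call volumes, which is the point: for everyone below that line, the flat license is paying for idle potential.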
This week my boss sent me a link to a guy named swardley who likes ducks. It turns out that this character is quite capable of blowing my tiny mind. He’s done it twice already, and so now I carry the burden of a speck of enlightenment, one commensurate with my understanding of n-tier computing (Wladawsky-Berger, 1999) and then horizontal scaling (Vogels, 2008). Since I’ve been doing cloud for six years, I now understand what’s happening next. God save us all.
Tying software development to economics and cost accounting has long been the stuff of magic, SWAG and charlatans. At least that’s what it seemed like for most of my career. But I think Simon Wardley has the solution. He has outlined an ecosystem and a framework for understanding how one can iterate (captive) algorithms towards mutual value for the developers and the customers. He calls it FinDev.
Like most useful thinking on the progressive edge of IT, one must assume AWS. That is to say, very little in the world that is not part of AWS’s ecosystem can be thought of as having great potential in the future of computing. AWS is beyond doing things very well; they are evolving at a monstrous pace and at enormous scale. As an aside, I asked some Agilists last night at El Torito why Amazon manages its businesses so well. One of them said it’s because Amazon is a collection of relatively small businesses that run on a common billing system, and that’s what keeps it simple to manage. They are not only eating but profiting from their own dogfood, which is the meticulous tracking of compute resource costs, and now, with Lambda, down to the function. All the hardware is a sunk cost. Cloud computing is a utility. What matters now is billing by the function. Simply assume the cloud. It’s already done.
Think about that for a moment. Most enterprise IT departments have only the vaguest sense of how to allocate the cost of their compute environment to their end users. Amazon has made that into a profitable business for half a dozen years. Now they’re so good at it that they can account to the subsecond, one function execution at a time.
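That per-function accounting can be sketched with a small model. The constants below only approximate AWS Lambda’s published pricing as of 2016 (a per-request fee plus a fee per GB-second of memory-time, metered in 100 ms increments); treat the exact figures as assumptions:

```python
import math

# Rough model of Lambda-style per-function billing, metered in 100 ms steps.
# Constants approximate AWS's 2016 Lambda price list; treat them as assumptions.
PRICE_PER_REQUEST = 0.20 / 1_000_000   # dollars per invocation
PRICE_PER_GB_SECOND = 0.00001667       # dollars per GB-second of memory-time

def invocation_cost(duration_ms: float, memory_mb: int) -> float:
    """Cost of one function execution, rounded up to the next 100 ms."""
    billed_ms = math.ceil(duration_ms / 100) * 100
    gb_seconds = (memory_mb / 1024) * (billed_ms / 1000)
    return PRICE_PER_REQUEST + gb_seconds * PRICE_PER_GB_SECOND

# A 120 ms run at 128 MB bills as 200 ms of a 1/8 GB function.
cost = invocation_cost(120, 128)
print(f"one call:          ${cost:.10f}")
print(f"one million calls: ${cost * 1_000_000:,.2f}")
```

Notice the rounding: a 1 ms run and a 99 ms run bill identically, because the meter ticks in 100 ms increments. The cost of a single call is a fraction of a penny, which is exactly the granularity the argument above turns on.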
So there is a scary aspect to this, something we all should have been afraid of all along. It is what happens to craft when things become industrialized. Your personal touch matters less in a market defined by optimization, cost-cutting and efficiency. Nobody cares how you show off the horse, we’re all driving cars now. Nobody cares about your budget system, we’re all using SAP now. Nobody cares about your fat client, we’re all using browsers now. What’s coming is a COTS revolution in which your college professor optimized the Towers of Hanoi solver and now owns the moneyTicker on the algo. In a global library of cloud-interoperable functions, the scope of what you get to work on gets narrower and narrower. The good news is that we are 20 years away from lockdown. The V8 of compute-engine economies has been invented. Say hello to the next 50 years. You Wankels don’t stand a chance.

When I was an undergrad, I used to think of software as the same thing as law. There are lots and lots of lawyers but only a few legislators. The assumption was that the best lawyers at some point got to legislate, and the rest just interpreted and borrowed citations for the benefit of those who never read the law. I believe there will be some measure of stare decisis in the new FinDev ecosystem.
So the future belongs to engineers who really know their customers’ needs. The economy of FinDev provides value to developers and customers only to the extent that something can be built (at scale, in the cloud) that customers want to use. When you charge by the use, that’s a different business model from anything we’ve seen. Chances are it will be disruptive, because it will go after captive, inefficiently spent money. But there’s greenfield out there too. More hopefully, there are new places computing can and will go once we wean ourselves from the economics of capacity planning, system depreciation, outsourced consulting and all that. I think AWS will be capable enough to handle global innovation in this regard; they’re certainly leading. Now is the time to work our way towards best practices, evolving towards the revolution.
In Martin Cruz Smith’s Arkady Renko series, the protagonist, Renko, informally adopts an orphan who is a chess genius. Playing at the genius level, the kid doesn’t require a board or pieces. He, and those like him, can simply recite moves. He has a virtual queen and doesn’t even need hardware. If you’re thinking about physically moving pieces, you’re not playing chess.
Originally published at www.cubegeek.com on December 9, 2016.