Why SaaS Means Better Software
By Phil Jeffery, VP Engineering
The Software as a Service (SaaS) market is experiencing strong growth. Gartner estimates it will expand more than 20% in 2017 to reach $46.3 billion, and that by 2025, over half of large enterprises will have successfully implemented an all-in SaaS strategy.
This is no surprise, because SaaS offers customers many advantages, from cheaper, lower-risk rollout of new services to more stable financial planning and higher rates of user take-up. In this post I look at another, less-publicized driver for adoption: why customers buying a SaaS service can also expect to get fundamentally better software.
Harnessing the global data set
Software development used to be about creating algorithms that solved the entire problem at development time. It’s no secret that advances in machine learning and deep learning have changed this. Code is becoming less important than the data we use to train our models. One thing all such learning approaches have in common: they need lots of data.
As a result, the companies positioned to write the best software are those with the largest data sets to work from. These are always the SaaS vendors; on-premise solutions rely at best on incomplete telemetry sent home from the edge, which cannot easily be enriched once deployed.
Google, Facebook, Amazon et al. are therefore leading the charge, and AI is seeping into all industries, including front-office trading, as we blogged about last year. Smaller companies with comprehensive niche data, such as Duco, can offer game-changing insights of their own. In our case, we are excited by predictive use cases such as setting up data reconciliation processes automatically, or offering ongoing suggestions to tune matching rules.
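To make that concrete, here is a toy sketch (not our actual approach; the data, field names and precision target are purely illustrative) of how historical match outcomes could be mined to suggest a numeric tolerance for a matching rule:

```python
# Toy illustration only: suggest a price tolerance for a matching rule
# from historical reconciliation outcomes. All data here is invented.

def suggest_tolerance(history, target_precision=0.99):
    """history: list of (abs_price_difference, was_true_match) pairs.

    Returns the loosest tolerance that still keeps the share of true
    matches among accepted pairs at or above target_precision.
    """
    candidates = sorted({diff for diff, _ in history})
    best = None
    for tol in candidates:  # try tolerances from tightest to loosest
        accepted = [(d, ok) for d, ok in history if d <= tol]
        if not accepted:
            continue
        precision = sum(ok for _, ok in accepted) / len(accepted)
        if precision >= target_precision:
            best = tol  # this tolerance still meets the precision bar
    return best

# Example: price differences between two feeds, labelled by an operator.
sample = [(0.0, True), (0.01, True), (0.02, True), (0.05, True),
          (0.10, False), (0.50, False)]
print(suggest_tolerance(sample))  # -> 0.05 with this invented data
```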
Winning the optimization game
One of the hardest things in software development is picking the right time to optimize your code. Every Computer Science student learns Donald Knuth’s famous adage that “premature optimization is the root of all evil”. But developing software for constrained, static environments such as on-premise devices has necessarily ingrained a habit of optimizing before real-world usage is fully understood. It’s a numbers game in which you never know the odds against you.
By contrast, the elastic capacity in today’s data centers means that it is often cheaper to scale out compute than to invest developer time in optimizing software performance. The major cloud providers all offer serverless compute, auto-scaling and geo-distribution, giving regular application developers access to powerful and extensible infrastructure.
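As a rough illustration (every number below is invented), the trade-off often comes down to simple arithmetic:

```python
# Back-of-the-envelope comparison: scale out vs. spend engineering time
# optimizing. All figures are hypothetical and exist only to show the shape
# of the calculation.

extra_instances = 4            # additional nodes needed to absorb the load
instance_cost_per_month = 150  # USD per instance, hypothetical cloud pricing
months_until_rewrite = 12      # horizon before the code is revisited anyway

engineer_day_cost = 800        # fully loaded USD cost per developer-day
optimisation_days = 15         # estimated effort to halve resource usage

scale_out_cost = extra_instances * instance_cost_per_month * months_until_rewrite
optimise_cost = engineer_day_cost * optimisation_days

print(f"Scale out: ${scale_out_cost:,}")  # $7,200
print(f"Optimise:  ${optimise_cost:,}")   # $12,000
```

With these (invented) numbers, paying for hardware for a year costs less than the engineering time, and the calculation can be revisited once real usage is understood.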
It’s not all perfect: the recent AWS outage had a major impact partly because several applications were not designed to tolerate node failure, as they should have been.
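Designing for node failure mostly means assuming that any single call can fail and retrying against healthy capacity. Here is a minimal sketch of that pattern (the node list and fetch function are hypothetical stand-ins):

```python
import random
import time

# Minimal sketch: retry a call against replicated nodes, so losing one node
# degrades the application rather than breaking it. Nodes and the fetch
# function below are invented for illustration.

NODES = ["10.0.0.1", "10.0.0.2", "10.0.0.3"]

def fetch_from_node(node, key):
    # Stand-in for a real network call; fails randomly to simulate node loss.
    if random.random() < 0.3:
        raise ConnectionError(f"node {node} unavailable")
    return f"value-for-{key} from {node}"

def fetch_with_failover(key, attempts=5, base_delay=0.2):
    last_error = None
    for attempt in range(attempts):
        node = random.choice(NODES)  # spread load across replicas
        try:
            return fetch_from_node(node, key)
        except ConnectionError as err:
            last_error = err
            # Exponential backoff with jitter before retrying elsewhere.
            time.sleep(base_delay * (2 ** attempt) * random.uniform(0.5, 1.5))
    raise RuntimeError(f"all {attempts} attempts failed") from last_error

print(fetch_with_failover("trade-42"))
```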
But done right, SaaS delivery enables providers to focus on features, scale out with infrastructure if their product wins, and pay the costs to optimize only when it is clear that they are betting on a sure thing.
So, won’t that lead to a bunch of bloated code that over time grinds to a halt? No, because in a SaaS world, the provider eats the cost of inefficiently designed software. All hosting and operational costs are borne by the provider, so it is strongly in their interest to ensure that this burden is as limited as possible. Efficiency lowers cost and is therefore a critical competitive advantage.
By contrast, software vendors in an on-premise world have an unwitting ally to save them from the costs of poorly designed software: the IT departments of their customers, who have no option but to move heaven and earth to make a solution work. SaaS software is more efficient because its provider has no such safety net.
Buying the future as well as the present
Software isn’t perfect. It has bugs, and they need to be fixed and rolled out. Security flaws need to be patched. It is also never finished; there is always a newer version with better, more robust features. The quality of software you are consuming is directly related to how recent a version you are on.
Releasing and upgrading on-premise software is an enormous task. Upgrades have to be vetted by customers, allocated resources and scheduled into time windows. They then need to be risk-assessed by staff who are sometimes poorly equipped to understand what is required, and who typically have more to lose than gain from a rollout.
Change is slow – and all the time the software rots. As a result, huge emphasis is placed on ensuring that a new software release is rock solid. The cost of all those extended test cycles is inevitably passed to the customer, either in the form of higher prices or reduced feature sets.
In contrast, SaaS products allow providers to dramatically optimize the delivery process. Features ship as they are ready, not after waiting months for an arbitrary release deadline to be hit. New features can be released globally, but selectively enabled, based either on customer demand or as part of A/B testing programs. Testing costs are lower because the frequent release cycle means that a new version consists of a much smaller, easier-to-test set of features. This means more investment in new features, and more value to customers.
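Selective enablement typically comes down to a feature-flag check at the point of use. Here is a minimal sketch (the flag names, customers and rollout percentages are invented):

```python
import hashlib

# Minimal feature-flag sketch: a feature ships to everyone but is enabled only
# for chosen customers, or for a percentage bucket used in A/B tests.
# Flag names, customer ids and percentages are invented for illustration.

FLAGS = {
    "new_matching_ui": {"customers": {"acme-corp"}, "percent": 10},
}

def is_enabled(flag, customer_id):
    rule = FLAGS.get(flag)
    if rule is None:
        return False
    if customer_id in rule["customers"]:  # explicit opt-in for a customer
        return True
    # Stable hash bucket so a customer stays in the same A/B cohort.
    bucket = int(hashlib.sha256(f"{flag}:{customer_id}".encode()).hexdigest(), 16) % 100
    return bucket < rule["percent"]

# At the point of use:
if is_enabled("new_matching_ui", "globex-ltd"):
    print("render the new experience")
else:
    print("fall back to the existing behaviour")
```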
Clients of services such as Duco benefit from new features every month. When they buy SaaS, they are buying the future as well as the present. That’s also why when we went shopping for HR, collaboration, roadmap and CRM tools, we bought SaaS options in BambooHR, G Suite, ProductBoard and Salesforce rather than their on-prem competitors.
Standing on the shoulders of giants
The SaaS model provides an incredible set of building blocks to fast-track software development. SaaS providers can leverage all the amazing work from browser vendors, technology standards, and now the Infrastructure as a Service (IaaS) and Platform as a Service (PaaS) providers revolutionizing the data center. In addition, restrictive open-source licenses such as the GPL, which constrain software distribution, are less onerous when software is hosted rather than shipped.
Taken together, this creates an internet service delivery chain that is tested by thousands of services every second of every day. Brutal competition means that the cost of this delivery chain is falling fast, while quality is improving at a similar pace. SaaS providers can therefore focus their resources on improving their feature sets and software quality rather than the enablers surrounding it.
In conclusion, there is a lot of great on-premise software available, and there certainly remain valid uses for keeping applications hosted on-site. But the dynamics of the software industry make SaaS software an almost inevitable winner everywhere else.
At Duco, we have focused on SaaS from the very beginning and our software quality benefits from it every day. Fast turnaround of new features, delivered over the internet service delivery chain, is part of our DNA. And we’re excited by the opportunities ahead in leveraging our data set to deliver ever more intelligent solutions to our customers!