Recently, across the social media spectrum there has been an uptick in government program leaders, technology salespeople, and systems integrators all sharing opinions on how best to deploy new technology quickly, cost effectively, and with the proper architecture to support both short- and long-term federal program goals.
Unsurprisingly, there’s little consensus, with different program acquisition strategies, system designs, architectural approaches, delivery models, or commercial software products all being touted as “the way.” One area of common ground is the desire to eliminate “vendor lock-in” by using open-source software (OSS) to build solutions and systems.
Still, the question remains: When it comes to avoiding vendor lock-in, is it better to build or to buy?
In this blog, we’ll share how Immuta’s leadership team navigated this question, and why in today’s cloud-driven data ecosystem, the answer might be both.
What is Vendor Lock-in?
Vendor lock-in is a misunderstood and often misrepresented concept. The debate surrounding it is not new, yet some of the key considerations seem to be getting lost in the noise and FUD of social media and sales cycles. But with the current “Cloud Smart” Federal Cloud Strategy encouraging cloud technology adoption, the conversation warrants a fresh look.
In reality, the concept of vendor lock-in is relatively outdated. It stems from the days when a handful of behemoth technology vendors were selling proprietary capabilities that were near impossible to integrate with other vendors’ offerings, and cloud adoption was still nascent. With the rapid adoption of cloud infrastructures and the agility they provide, some of the lessons learned in our recent past have been forgotten by program leaders looking to make an impact and save taxpayer money by building systems completely out of OSS on cloud infrastructure.
Building OSS Systems for Modern Clouds: The Pros & Cons
Over the past 20+ years, I’ve worked for various companies to bring a range of technologies to market, from proprietary technology-based commercial software and OSS-based systems to paid support models, consulting technology services, and fixed-fee projects.
In that time, a few key technology paradigm shifts have changed the vendor landscape, including:
- The proliferation of OSS projects
- The adoption of web-based technologies
- The mass adoption of software standards
Each has helped encourage innovation amongst vendors, while ensuring marketplace competition continues to evolve. Having worked for companies that have contributed to or run open-source projects, as well as those that leverage available open-source components for end products, I’ve seen the benefits of OSS adoption firsthand. After all, without the proliferation and standardization of open-source software, the modern internet-driven world would not be what it is today (Thanks, Linus Torvalds!).
Having said that, using open-source software on a project in lieu of alternative commercial off-the-shelf (COTS)-based software products does NOT automatically eliminate vendor lock-in. In many cases, this approach creates a myriad of negative impacts for government programs, including higher risk for failure, longer ROI cycles, and missed chances to incorporate true innovation.
Often, organizations that are fully reliant on OSS will build something from scratch that is only one-tenth as good as what they could have bought from a commercial company – and they could have delivered value in a fraction of the time spent developing and integrating the OSS components.
Choosing to use only no-cost OSS components to build out an enterprise system can lead to an entirely worse type of vendor lock-in – one in which the individuals doing the integration work leave the organization and take their tribal knowledge with them. We’ve seen this scenario play out across government programs (I’m looking at you, program that shall not be named!), and it often leads to a perpetuation of obsolete technology and designs that hinder modernization and create massive amounts of taxpayer waste. In these situations, custom-built, outdated software components could easily be replaced by COTS tools with far better capabilities.
Making the Case for Building and Buying in Modern Data Stacks
Why does the build vs. buy debate continue despite the obvious benefits of incorporating commercial software into modern data stacks?
We often hear from government program leaders who say they are “stuck” due to funding constraints that limit their ability to buy commercial software. At the same time, these leaders may be paying a systems integrator to build the next generation of their software as their million-dollar key deliverable. So, not only are they spending money to build something they could have bought and implemented quickly, but they are also waiting months for from-scratch features or functions that could have been “turned on” with a commercial purchase. In the meantime, these from-scratch projects are likely siphoning off taxpayer dollars from other initiatives.
But, in their minds, they’re not “locked in” by a vendor. Henry Ford once famously said:
“If you need a machine and don’t buy it, then you will ultimately find that you have paid for it and don’t have it.”
When I’ve seen some of these projects, I’ve felt strongly that the intent was good, yet the outcomes and systems built could have been delivered faster, with greater innovation and a much lower total cost of ownership (i.e., fewer tax dollars spent for the same outcome). A better path is a “build and buy” approach: integrate best-of-breed COTS products for key system components alongside OSS, built on an interoperable, API-first architecture that can be deployed consistently across a multi- or hybrid-cloud model. This provides the best aspects of purpose-built systems, with the speed to value of buying a turnkey capability and minimal overhead. Think of it as akin to the way modern modular homes are built.
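To make the API-first, “build and buy” pattern concrete, here is a minimal Python sketch (all names and interfaces are hypothetical, not any vendor’s actual API): custom code depends only on a small interface, so a commercial component can later be swapped in behind it without rewriting the rest of the system.

```python
from abc import ABC, abstractmethod

class AccessDecider(ABC):
    """Thin interface the custom-built system depends on.

    Anything that can answer "may this user read this column?" can sit
    behind it -- a COTS product's API or an in-house component.
    """
    @abstractmethod
    def can_read(self, user: str, column: str) -> bool: ...

class InHouseDecider(AccessDecider):
    """Hand-rolled stand-in: a static allow-list (the 'build' side)."""
    def __init__(self, grants: dict):
        self._grants = grants  # user -> set of readable columns

    def can_read(self, user: str, column: str) -> bool:
        return column in self._grants.get(user, set())

def visible_columns(decider: AccessDecider, user: str, columns: list) -> list:
    """Application code written against the interface, not a vendor."""
    return [c for c in columns if decider.can_read(user, c)]

decider = InHouseDecider({"analyst": {"name", "region"}})
print(visible_columns(decider, "analyst", ["name", "ssn", "region"]))
# -> ['name', 'region']
```

A COTS-backed decider (for example, one calling a vendor’s REST API) could replace `InHouseDecider` here with no other code changes – which is what keeps the “buy” component from becoming lock-in.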
Data Access Control: Build or Buy?
The best commercial software companies continually assess what features the market is demanding and create roadmaps based on customer input. Their product engineering teams consist of dozens of developers and engineers working to continually innovate based on these trends and feedback. However, for small development teams on government programs that don’t benefit from mainstream trends or continuous feedback loops, building anything comparable is a much greater challenge.
This was the case for Immuta’s founders who, when working on data analytics and engineering projects in the Intelligence Community, experienced firsthand the gaps that were holding these small teams back. The responsibility to deliver data products for mission-critical analysis, and to work with massive volumes of heterogeneous data governed by unique policies and rules for use, was truly “life and death” in nature and extremely time-sensitive in the national security space. The pressure, scale, complexity, and sensitivity of these projects continues to be unique to the Intelligence Community, providing an advanced perspective that could only be attained through experience.
This helped Immuta’s founders understand the actual technical and operational challenges associated with data analytics at scale, and how they held organizations and users back from achieving their modernization goals. Perhaps more importantly, there was a realization that as cloud adoption increased, this would be a problem for any organization – both public and private sector – focused on improving analytics and becoming more data-driven. Data teams would inevitably be tasked with building a data access control and policy management system from scratch to ensure secure and efficient data use, or there would have to be a solution on the market that was flexible and scalable enough to do so. The level of sensitivity associated with government data, however, meant any solution had to be airtight – a tall order for the small government program development teams.
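To illustrate what “building a data access control and policy management system from scratch” entails, here is a deliberately simplified Python sketch of attribute-based policy enforcement. The policies, attributes, and classification levels are all hypothetical; this is not Immuta’s implementation, only a toy version of the problem.

```python
from dataclasses import dataclass

@dataclass
class Policy:
    """One rule: users holding `required_attr` may see rows classified
    at or below `max_level`."""
    required_attr: str
    max_level: int

# Hypothetical classification ladder.
LEVELS = {"public": 0, "sensitive": 1, "secret": 2}

def enforce(rows, user_attrs, policies):
    """Return only the rows the user's attributes entitle them to see."""
    # Highest classification level any matching policy grants this user.
    allowed = max(
        (p.max_level for p in policies if p.required_attr in user_attrs),
        default=-1,  # no matching policy -> see nothing
    )
    return [r for r in rows if LEVELS[r["classification"]] <= allowed]

rows = [
    {"id": 1, "classification": "public"},
    {"id": 2, "classification": "sensitive"},
    {"id": 3, "classification": "secret"},
]
policies = [Policy("employee", 0), Policy("cleared-analyst", 1)]
print([r["id"] for r in enforce(rows, {"cleared-analyst"}, policies)])
# -> [1, 2]
```

Even this toy omits auditing, purpose-based restrictions, masking, policy conflict resolution, and enforcement across heterogeneous data platforms – the parts that make a production-grade, from-scratch build such a heavy lift for a small program team.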
Immuta’s focus since day one has been to simplify data access for analytics use cases by automating the enforcement of data access, privacy, and governance policies in a streamlined, modern, cloud-native architecture. We now have teams of product engineers building features and functions that serve customers across various global industries, including the public sector.
I’ve lived on many sides of the “build versus buy” discussion in my career and have learned some lessons along the way that shape the way I see the world. This is the conclusion I’ve reached:
- If agencies want to achieve faster, more innovative outcomes with improved time-to-value, the “debate” should not be “build vs. buy.” Instead, an ongoing discussion and exploration of how to integrate a “build and buy” approach to delivering new capabilities will provide fast time-to-value, better ROI for taxpayers, and more rapid impact to mission outcomes.
- The best SI teams and program leaders know this, and they incorporate architectural strategies that weave “build and buy” into open systems designs, leveraging best-of-breed commercial technologies to empower users to meet mission needs.
- When we get down to the root of what software does for us, we find that its underlying purpose and benefit is saving us time. It’s pointless to waste time “building” software from the ground up when commercial offerings already solve a major portion of your needs. You will get better outcomes, save taxpayer money, and better impact your missions by incorporating a “build and buy” strategy.
Whether you decide to build, buy, or both, it’s important to take into account the changing nature of cloud ecosystems and ensure that your systems and applications are future-proof. Are they scalable, compatible, and most importantly, secure? Will they help meet the data-driven goals on your organization’s roadmap, and achieve better outcomes for your customers?
Asking yourself these questions and investing time, money, and resources accordingly will help avoid feeling like you’re locked into a solution, while also ensuring you’re able to be as productive, efficient, and innovative as possible.
If you’re wondering how data access control plays into the build vs. buy debate, set up a call with one of our experts to better understand how a tool like Immuta fits into your data stack.