The FOSS acronym – standing for free and open source software – has been a clarion call for many since the open source movement started, despite being nominally based on a misinterpretation of what open source is all about.
A group of people came together in 1998 to examine the “free software” market, concerned that the concept was being hijacked and perverted in political and moral terms. This prompted a push to create a more commercially oriented definition of what open source has to offer.
This community also sought to reinforce the fact that the source code belongs to someone, and is their intellectual property.
The key driver for using open source code is to avoid paying money to proprietary software suppliers, such as Microsoft and Oracle, with the FOSS community on hand to provide a high level of support in response to any problems that crop up.
That said, the adoption of open source cannot be predicated on who the money is going to, and neither can it be based solely on community support.
After all, what if, say, an organisation’s sysadmin goes to the community and finds several different possible solutions? How do they know which one to choose, and how do they guard against downloading bad code or allowing an infection, such as the recent WannaCry ransomware, to take hold?
The success of Linux as an operating system does not hinge on it being free. It has succeeded because it is good at what it does, and the organisations that rely on it gain full support (at a cost) from the hardware and software supplier community.
Open source is not “free”, because every deployment has a cost associated with it. What it does offer is choice: an organisation can go down one of two routes in adopting it.
The first is to install, operate and manage the software in-house, where the costs are predominantly generated by acquiring the skills required to maintain that software.
If the open source code is broadly supported, such costs may be relatively low. But for highly specialised and lesser-used code, support could be very expensive.
The alternative approach is to take out a support agreement with a professional provider. Whether this works out cheaper than licence plus maintenance on commercial off-the-shelf software is debatable.
Overall, the costs of operating a professionally supported open source platform versus a commercially provided platform appear pretty similar over the lifetime of the deployment.
As it stands, Quocirca knows of only one company that operates an “open source-only” IT strategy. After three years, the IT department figured that the new, open source-only environment was costing three times what the previous proprietary environment had. So, one-size-fits-all does not apply to open source economics.
Open source cloud platforms
Even so, open source is now being lauded in various quarters as the best way to provide a flexible and open means of operating a hybrid cloud platform.
With open source cloud platforms, such as OpenStack, being at the forefront of many private and public clouds, this may sound quite plausible.
Indeed, Google has been positioning itself as a great proponent of open source for this purpose. This can also be taken further to consider the hardware – a prime example being the Open Compute Project, started by Facebook but now running as an independent organisation supported by a raft of large technology, cloud and end-user organisations.
But the question must be asked: does open source bring enough to the table to justify the adoption of an open source-first or -only strategy? In Quocirca’s view, the answer is no. Open source is not the answer – it is just an approach.
Take the example of a company that wants to run a highly specialised number-crunching application. It decides that an open source cloud approach is ideal, so it uses Linux and OpenStack as the main part of its cloud platform, and then installs Open Source NumberCrunchBlitz on top.
It finds support for Linux and OpenStack no problem (both are mature, well-supported platforms), but NumberCrunchBlitz was written by a student who now has a house, spouse and kids to support.
The student no longer supports the code he wrote, there is no one else using the application, and the organisation now has to build up its own skills to maintain and advance the code.
There is little hope of financial remuneration down the line for the company either: the code is covered by a copyleft open source licence, so any improvements it makes must be released under the same terms rather than sold as proprietary software.
But what happens if the code is not written well, or if it is tightly coupled to a specific database, using proprietary data calls? What happens if these dependencies become broken as the database code is updated by the group looking after it?
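One way to reduce that coupling risk is to code against a standard database interface rather than proprietary data calls. A minimal sketch in Python, using the DB-API 2.0 pattern common to most Python database drivers (shown here with the standard-library sqlite3 driver; the table and column names are hypothetical):

```python
import sqlite3

# Any DB-API 2.0 driver (sqlite3, psycopg2 and so on) exposes the same
# connect/cursor/execute pattern, so only this line is driver-specific.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical schema for the number-crunching example.
cur.execute("CREATE TABLE results (run_id INTEGER, value REAL)")

# Parameterised queries: the placeholder style (? here) varies by driver,
# but the call pattern does not, so the application code stays portable.
cur.executemany(
    "INSERT INTO results (run_id, value) VALUES (?, ?)",
    [(1, 3.14), (2, 2.72)],
)
conn.commit()

cur.execute("SELECT value FROM results WHERE run_id = ?", (1,))
value = cur.fetchone()[0]
print(value)
conn.close()
```

If the underlying database is later swapped or updated, only the connection line and placeholder style need revisiting, rather than every data call in the application.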
Open standards for interoperability
Now we are on to where the real point of focus should be – not on open source, but on open standards.
In a world focused on the application programming interface economy, ensuring that one thing interoperates with another is far more important than focusing on the cost model for the software.
For hybrid cloud, this is becoming an imperative. The move to microservices operating in containers requiring multi-cloud orchestration is not predicated on open source.
But it is predicated on each part of the chain being able to move data across boundaries smoothly and efficiently. This is no old-style enterprise application integration problem: the use of enterprise service buses is not the answer. High-fidelity connections are required, which must adhere to de facto standards as already used by the majority in the market.
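As a sketch of what adhering to a de facto standard means in practice, the example below passes a hypothetical record between two services as plain JSON, the dominant wire format of the API economy. Nothing in the exchange depends on whether either end is open source or proprietary – only on both ends agreeing the standard:

```python
import json

# Hypothetical payload produced by one service in a multi-cloud chain.
order = {"order_id": "A-1001", "items": [{"sku": "X1", "qty": 2}], "total": 19.98}

# Serialise to JSON - a de facto standard any consumer can parse,
# regardless of the language or licence of the software at either end.
wire = json.dumps(order)

# A second service, possibly running on another cloud, reconstructs
# the record from the standard format without sharing any code.
received = json.loads(wire)
print(received["total"])
```

The same principle applies to the rest of the chain: container images, orchestration APIs and data formats interoperate because of shared standards, not because of any particular licensing model.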
Overall, it comes down to choosing the right tool for the job. In some cases, this may well be open source software. In other cases, it may be commercial software. Even so, adopting a proprietary approach across a hybrid, multi-cloud environment must be seen as anathema. Open standards have to be the rule.