API series - MongoDB: Overcoming the API dilemmas of the real world

This is a guest post for the Computer Weekly Developer Network API series written by Vivek Bhalla in his position as senior manager of market intelligence at enterprise open source ‘developer data platform’ company MongoDB.

Bhalla points out that (today, in 2022) there’s a lot of excitement and hyperbole around APIs and what they offer, so what should we really be thinking about when we consider their place in the modern IT stack and the wider technology universe?

Bhalla writes as follows…

If we think about the needs of the modern end user, the API hype is understandable.

Consider the apps that reside on a mobile handset for communication, navigation, purchasing, ascertaining the weather and connecting to an employer’s business tools. App proliferation (and sprawl) has pushed APIs to the fore to provide a degree of consistency and standardisation. In addition, the needs of the user, be it consumer or business, are more acute today than ever before.

Slow is the new down

If a user’s experience of these apps is a negative one, be it a delay in connecting or overall app slowness, those users will swiftly move on to an alternative. The saying ‘slow is the new down’ [i.e. we used to complain about downtime when a system crashed and burned, but now we rank slowness as being just as bad as a meltdown or crash] is often touted when it comes to end user experience, expectations and demands.

This has had a knock-on effect on those responsible for developing these apps.

There is a constant race to deliver greater and more diverse features and functionality while concurrently optimising the user’s experience. APIs are at the core of delivering this. As a consequence, APIs find themselves constantly being updated and upgraded. By the time a developer has adopted a particular API, it’s common that a newer version has already been released. This is particularly acute for open source APIs that have numerous contributors without a dominant organisation to provide some degree of governance. The scenario is not dissimilar to that of browsers or operating systems whereby the end user finds the version they have downloaded has already been superseded by yet another subsequent release.

This is often driven by developers striving for architectural purity.

API architectural purity

Constantly looking to refine the API piecemeal to demonstrate a degree of responsiveness to the community they serve, developers naturally strive for architectural purity.

The adoption of Agile development practices to ‘deliver less but more frequently’ only exacerbates this situation. However, this can lead to a ‘blind spot’ around the practical operational implementation and management of the API itself.

Considerations such as the release cadence, dealing with multiple versions, deprecation of redundant features and the upgrade process itself are often an afterthought. Talk to any IT operator or system administrator, however… and they’ll say these should be treated as a priority from the outset. The pain of upgrade mishaps, breaking changes and configuration tweaks creates an accumulation of IT technical debt that often goes under the radar of those not tasked with having to contend with it.

Fears of upgrading to the latest version of an API can also pervade due to backwards incompatibility.

This reluctance to upgrade only compounds the technical debt that mounts up. You want to upgrade your software, be it a database, monitoring platform, service management suite or whatever else, without worrying that a behaviour change in the API will break the front-end application serving your end users. Software vendors generally try their best to ensure each release is backward-compatible, while also adding new features. However, even with this intention planned from the outset, breaking compatibility has sometimes been unavoidable in order to fix specific issues or deliver new capabilities.
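To make the risk concrete, here is a minimal, purely hypothetical Python sketch (no real vendor API is shown; every name below is invented for illustration) of how a seemingly helpful behaviour change between two versions of an API can break a front end written against the earlier one:

```python
# Hypothetical example: the same 'fetch user' call across two API versions.

def fetch_user_v1(user_id):
    """Version 1 returns the user's name as a plain string."""
    return "Ada Lovelace"

def fetch_user_v2(user_id):
    """Version 2 'improves' the response into a structured dict.
    Richer data, but a breaking change for callers expecting a string."""
    return {"first": "Ada", "last": "Lovelace"}

def render_greeting(fetch):
    # Front-end code written against v1: it assumes a string comes back.
    return "Hello, " + fetch(42)

# Against v1 the front end works as designed.
print(render_greeting(fetch_user_v1))  # Hello, Ada Lovelace

# After an upgrade to v2, the very same front-end code raises a TypeError,
# because a str cannot be concatenated with a dict.
try:
    render_greeting(fetch_user_v2)
except TypeError:
    print("v2 behaviour change broke the front end")
```

The behaviour change is not a bug in either version; it only becomes a failure at the seam between the API and the application that consumed the old contract, which is exactly why upgrade hesitancy sets in.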

Removing this pain point would be a significant boon. For those tasked with architecting or procuring such technologies, identifying technology partners who have anticipated this challenge and solved it for their customers will save them and their organisation months of headaches down the road. Being the one who avoided such a pitfall will also be seen favourably by the business more broadly, assuming this is highlighted sufficiently internally.

A subset of ‘core’ commands

Identifying those software vendors that enable frequent and seamless upgrades to take advantage of new capabilities (and yes, the inevitable bug fix or security update), while concurrently protecting an organisation’s previous investments and work is key.

One way to do this for a database API is by creating a subset of ‘core’ commands that are most frequently used to read and write data, create collections, indexes and so on. Those commands are then prioritised and engineering teams delivering the API updates must ensure they remain backward-compatible in any new release. This group of commands can evolve as new features are added, but they must be implemented in a manner that guarantees backward compatibility. When coupled with an agile release approach, this gives customers the ability to adopt new functionality at a pace their competitors are unable to match, as they feel confident they can upgrade safely.
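As a rough illustration of this ‘core commands’ idea, here is a hypothetical Python sketch (not MongoDB’s actual implementation; all class, command and field names are invented) of a server where the core command set must behave identically in every release, while non-core commands are free to appear and evolve:

```python
# Illustrative sketch of a versioned command API with a stable core subset.

# The core set: frequently used read/write and schema commands whose
# request/response contract is guaranteed across all releases.
CORE_COMMANDS = {"find", "insert", "update", "delete", "createIndexes"}

class ApiServer:
    def __init__(self, release):
        self.release = release
        # Every release must implement the full core set, unchanged.
        self.handlers = {cmd: self._core_handler for cmd in CORE_COMMANDS}
        if release >= 2:
            # A non-core command added in release 2; it may change shape
            # between releases without breaking core-only clients.
            self.handlers["analyze"] = lambda payload: {"ok": 1, "plan": "v2"}

    def _core_handler(self, payload):
        # Core commands keep the same response shape in every release.
        return {"ok": 1}

    def run(self, command, payload=None):
        if command not in self.handlers:
            raise ValueError(f"unknown command in release {self.release}: {command}")
        return self.handlers[command](payload)

# A client written against release 1's core commands keeps working on release 3.
server = ApiServer(release=3)
assert all(server.run(c)["ok"] == 1 for c in ["find", "insert", "update"])
```

MongoDB productises a version of this idea as its Stable API, where a client declares the API version it was written against (for example, apiVersion "1") and can then upgrade the underlying server without fearing behaviour changes in the declared command set.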

Regardless of who is selected as a technology partner, identify those potential suppliers who actively recognise that software changes and that, in lockstep with this, so do the underlying APIs.

Anticipate that change in the API. Accept and embrace that it will evolve in the years to come. Given the expectations and demands of end users, this is inevitable. Those who provide APIs must do so in a manner that imbues confidence. It is only by instilling this trust that organisations will be in a position to take advantage of an increased cadence of new features and capabilities from an API. At this point comes the competitive advantage over rivals who are only starting on this journey months or even years later.
