This is a guest blogpost by Jason Kingdon, chairman and CEO, Blue Prism
Link all computers together – why would you do that?
Way back in the 1980s, various academic and scientific institutions started linking their computers together into a single global network. Its underlying mission was to keep communication alive in the event of nuclear war. Alongside passing files and bits of data, it introduced a new form of electronic communication: “e-mail”. Email was fast – amazing in some ways compared with letters and the telephone of the day.
The early 1990s saw commercial networks join in, and the phenomenon of the Internet took hold. The World Wide Web – making the Internet more accessible – lit the way, and the story we all know unfolded.
How profound the changes were was not understood at the time; arguably it still is not. It took until the mid-1990s for the dotcom phenomenon to emerge and, despite the casualties of the ‘burst’, almost all the innovations seen as excessive at the time are now commonplace (pet food and home delivery were two front-line casualties in the early-2000s recriminations).
Commentators argue now over the pace of innovation and change brought about by this network – the productivity and accessibility it has produced. Without entering those arguments, it is easy to see social media’s impact, the management of global lockdowns, the accessibility of education and entertainment, hyper-communication, the technification of society, and the profound changes to economics, politics and life. Those who remember agree: everything has changed.
Link software applications together – it worked for computers …
The low expectations for the impact of linking computers together have a parallel right now in application software. The robotic process automation (RPA) movement champions software designed to mimic humans. As such it can re-purpose human interfaces as machine interfaces and, by design, can make use of all machine interfaces. So what?
Where to start …
The whole history of computing has been a quest to address the interoperability issue and the ability to manage and understand process resources. At a technical level it is hard to overstate why this innovation offers a supercharging of productivity and, again, changes everything. It is no longer possible to say “systems are incompatible” – all systems now are compatible, even if they were never designed to be. You may not be interested in the revolution – but the revolution is interested in you.
Machines now have a “universal” middleware, and through AI this middleware reaches into human services too. The first casualties here are the investments and efforts to impose a single design authority, which are now obsolete. It is as if overnight the whole software landscape has become SOA-compliant. SOA, or Service-Oriented Architecture, was a deep techie programme to cure the interoperability and re-use issue: a way that techies would agree to write their software so that other software could use it. The effort was akin to asking all journalists to format in English and tag their work in the same way, or asking all musicians to write and record all music in the key of C major. In other words, all historic efforts to solve the interoperability question required epic, idealistic and fantastical compromises from all participants.
The new order created by RPA, however, takes things as they are and is totally inclusive – past, present and future. A potential for singularity, if you will.
Universal middleware and hyper-containerisation
The technical arguments don’t stop here; the new order that this universal middleware implies is that containerisation becomes “democratised”. This means all services can be wrapped and re-purposed for both machine and human consumption.
What is containerisation? It is the wrapping of a service to a common standard: you create a codified (computer- and human-readable) interface to the underlying capability. It is like an API to a maths or AI algorithm at one end, a mainframe flight-booking system at another; and then again, at a new extreme, an API to something like … Amazon, or even … the Internet.
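As a toy illustration of this idea – with every name and schema invented purely for the example, not drawn from any real RPA product – a “container” in this sense is just a uniform, codified wrapper that makes very different capabilities look the same to a caller:

```python
# A minimal sketch of "containerisation" in the RPA sense: any capability,
# however implemented, sits behind one uniform, codified interface.
# All names here are hypothetical, for illustration only.

class WrappedService:
    """Uniform wrapper: a name, a codified schema, and a single invoke()."""
    def __init__(self, name, schema, handler):
        self.name = name
        self.schema = schema      # human- and machine-readable description
        self._handler = handler   # the underlying capability

    def invoke(self, **kwargs):
        return self._handler(**kwargs)

# One wrapper around a pure algorithm...
square = WrappedService(
    name="square",
    schema={"inputs": {"x": "number"}, "output": "number"},
    handler=lambda x: x * x,
)

# ...and the same wrapper shape around a (simulated) mainframe
# flight-booking system normally driven through its human interface.
def legacy_booking(passenger, flight):
    # In reality an RPA robot would key into the mainframe UI here.
    return f"CONFIRMED: {passenger} on {flight}"

book_flight = WrappedService(
    name="book_flight",
    schema={"inputs": {"passenger": "text", "flight": "text"}, "output": "text"},
    handler=legacy_booking,
)

print(square.invoke(x=7))                                   # 49
print(book_flight.invoke(passenger="Ada", flight="BA117"))  # CONFIRMED: Ada on BA117
```

The point of the sketch is that the caller sees one shape – name, schema, invoke – whether the capability underneath is an algorithm, an API, or a robot driving a legacy screen.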
What does this mean?
Firstly, it goes way beyond APIs (application programming interfaces) – the simple codifications that software uses to get modules chatting. The process of RPA wrapping means whole services can be machine-controlled.
A good example already exists: price comparison sites have become automated service-switching businesses. A robot can now keep your house utility bills optimised against the full range of third-party services on offer; a new banking service could “pixelate” (digitise) the services of existing providers to offer a “single pane of glass” to a user, mixing and matching best products. Imagine mixing the best international transfer services with stockbroking, house banking, mortgages, and so on.
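The switching logic itself is simple once every provider is machine-readable. A rough sketch, with entirely fictional providers and prices standing in for what a real robot would scrape or call:

```python
# Toy sketch of an automated service-switching robot: poll third-party
# tariffs and switch whenever a cheaper provider appears. Provider names
# and prices are invented for illustration.

def fetch_tariffs():
    # A real robot would scrape or call each provider's interface here.
    return {"NorthPower": 0.31, "GridCo": 0.27, "EcoVolt": 0.29}  # price per kWh

def best_tariff(tariffs):
    # Pick the (provider, price) pair with the lowest price.
    return min(tariffs.items(), key=lambda item: item[1])

def maybe_switch(current_provider, tariffs):
    provider, price = best_tariff(tariffs)
    if provider != current_provider:
        # Here the robot would file the switch paperwork on the user's behalf.
        return provider, price
    return current_provider, tariffs[current_provider]

provider, price = maybe_switch("NorthPower", fetch_tariffs())
print(provider, price)  # GridCo 0.27
```

Run on a schedule, this loop is the whole business model of an automated switching service: the hard part was never the decision, it was reaching every provider's interface.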
The lack of barriers between applications means all of these recombinations come into scope – new levels of creativity are unleashed, very akin to the power of the Internet.
It also changes the way internal “services” are carried out within corporations – mortgages, underwriting, HR, finance. The new regime allows these services to be abstracted: process outcomes are separated from the underlying methods of their execution. This means hyper-scalability – core services becoming new core competences that can be “rented” and marketed at a whole new scale. Think of the mortgage services company, the trade finance company, the physical asset finance management company, the banking compliance company, and so on.
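Separating an outcome from its method of execution can be sketched in a few lines. In this hypothetical example (the service name, backends and lending rule are all invented), the same “mortgage decision” outcome is delivered by two interchangeable backends, one a modern API and one an RPA robot driving a legacy application:

```python
# Sketch of abstracting a service *outcome* from its *method of execution*:
# callers depend on the decision, never on how it is produced.
# All names and the 4.5x income rule are hypothetical.

from abc import ABC, abstractmethod

class MortgageDecisionService(ABC):
    """The outcome: approve or decline an application."""
    @abstractmethod
    def decide(self, income: float, loan: float) -> str: ...

class ApiBackend(MortgageDecisionService):
    def decide(self, income, loan):
        # A modern service call.
        return "approved" if loan <= income * 4.5 else "declined"

class RoboticBackend(MortgageDecisionService):
    def decide(self, income, loan):
        # A real robot would key the figures into a legacy desktop app;
        # here we simulate the same business rule.
        return "approved" if loan <= income * 4.5 else "declined"

def process(service: MortgageDecisionService, income, loan):
    # The caller sees only the outcome interface.
    return service.decide(income, loan)

print(process(ApiBackend(), 50_000, 200_000))      # approved
print(process(RoboticBackend(), 50_000, 300_000))  # declined
```

Because `process` depends only on the interface, the backend can be swapped – or rented out to another firm – without the caller changing a line: that substitutability is what makes the “mortgage services company” scalable.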
Like ye olde days of the internet, creative destruction is being invited at all levels. It is as if the digital equivalent of the Iron Curtain has come down – freedom of movement now abounds.
Digital freedom of movement
One of the profoundest implications of this new world concerns the future: if all digital services are consumable by all others, then the old tricks of “walled gardens” and “lock-in” – incompatibilities built into the design to ensure that consumers are trapped – become obsolete. This has the effect of linking R&D efforts in a veritable liquid form. It means, at one level, we are all collaborating now.
AI can already read unstructured human writing at modest levels. But technical documents like “written software” and “user interfaces” are easy pickings: robots can tirelessly rewrite these documents into modern, accessible and liquid forms (APIs). More robots can tirelessly re-compose these outputs into further accessible mash-ups and composites. A singularity, if you will. Would that be enough to change the world?