This is a guest post for the Computer Weekly Developer Network written by Jason Knight in his capacity as co-founder & CPO of OctoML – a company known for its platform and solutions that automatically maximises machine learning model performance on any hardware, cloud or edge device.
Knight writes as follows…
Low-code/no-code (LC/NC) solutions have been on the rise for the last decade. They fill a skills gap, enabling less experienced practitioners to achieve business outcomes without the deep expertise required for traditional software engineering.
What we should also say is… LC/NC solutions are poised to make a whole new category of technologies more accessible and usable: machine learning models.
At first glance, ML may appear to be at odds with the LC/NC ethos. Because today's ML models require specialised knowledge to build, train and deploy, there is a perception that ML is impossibly technical and accessible only to a few elite engineers. But the complexities underlying ML today are an artefact of an immature ecosystem, not an inherent aspect of machine learning itself.
To understand where this is all headed, let’s look to the past.
A brief history of data
For decades, databases have sat at the heart of most applications. But for much of that time, instantiating and maintaining databases required highly specialised knowledge and skill sets. Today however, databases and data solutions have matured to the point where software engineers can reach for the nearest SQL library, without needing to understand the internal details of database architectures.
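To make the point concrete, here is a minimal sketch of what that maturity looks like in practice, using Python's built-in sqlite3 module (chosen purely as an illustration, not something named in the original): a working database in a few lines, with no knowledge of storage engines, query planners or server administration required.

```python
import sqlite3

# An in-memory database: no server to install, no schema tuning,
# no understanding of internal database architecture needed.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO users (name) VALUES (?)", [("Ada",), ("Grace",)])

# Querying is just as direct: reach for the library, write the SQL.
names = [row[0] for row in conn.execute("SELECT name FROM users ORDER BY name")]
conn.close()
```

The decades of engineering behind indexing, transactions and durability are still there; they have simply been abstracted behind an interface any software engineer can pick up.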
Today, AI is the brains behind some of the most exciting applications. Slowly but surely, it is getting easier for software engineers and IT operations teams to work with machine learning models and deploy them in intelligent applications and services. Very soon, we'll see ML tooling and infrastructure mature to the point that ML models can be handled like any other piece of code.
Software engineers will soon be able to deploy models without needing to become “experts” in machine learning. ML tooling will sit front-and-centre of LC/NC solutions — even the ‘glue code’ that today requires software engineers will be taken on by ML techniques.
Software development is about writing code that machines can understand, which, when done by hand, requires deep knowledge of programming languages, libraries and APIs. Today’s no-code solutions are geared towards abstracting these layers of complexity, achieving a business outcome by removing the requirement for understanding a rigid set of coding rules.
On the other hand, ML is designed to seek non-obvious answers in more complex and nuanced ways. As machine learning tooling matures and better exposes its simpler underlying primitive to users, that of providing positive and negative examples of the desired behaviour, ML will become the predominant vehicle that LC/NC solutions are based on. When that day comes, almost anyone will be able to solve business problems with the help of an ML algorithm.
Imagine a user interacting with an ML-powered LC/NC system.
First, the model will learn to imitate the user’s desired behaviour and then enable the user to provide simple written or verbal feedback to tune the resulting algorithms. People without ML expertise will then be able to leverage AI to improve almost any business task, or to discover highly personalised content in their personal lives (e.g. searching through classifieds for specific kinds of job listings).
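As a toy sketch of that interaction, consider the job-listing scenario above. The snippet below is not any real LC/NC product; it is a deliberately simplified, pure-Python illustration (all names and example texts are invented) of the core primitive: a user supplies positive and negative examples of listings they want, and the system learns word weights to rank new listings accordingly.

```python
from collections import Counter

def train(positives, negatives):
    """Learn a weight per word from positive and negative example texts."""
    pos = Counter(w for t in positives for w in t.lower().split())
    neg = Counter(w for t in negatives for w in t.lower().split())
    vocab = set(pos) | set(neg)
    # Add-one smoothing: words seen mostly in positives get weights > 1,
    # words seen mostly in negatives get weights < 1.
    return {w: (pos[w] + 1) / (neg[w] + 1) for w in vocab}

def score(weights, text):
    """Average word weight; unseen words are neutral (1.0)."""
    words = text.lower().split()
    return sum(weights.get(w, 1.0) for w in words) / max(len(words), 1)

# Hypothetical examples a non-expert user might supply instead of code:
positives = ["remote python developer role", "senior python engineer remote"]
negatives = ["onsite sales associate", "retail cashier position"]

weights = train(positives, negatives)
listings = ["remote python contract", "cashier needed onsite"]
ranked = sorted(listings, key=lambda t: score(weights, t), reverse=True)
# The listing resembling the positive examples ranks first.
```

Real systems would use far richer models, but the user-facing contract is the same: show the system what you want and what you don't, then refine with feedback, with no programming rules to learn.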