The Computer Weekly Developer Network is in the engine room, covered in grease and looking for Artificial Intelligence (AI) tools for software application developers to use.
This post is part of a series which also runs as a main feature in Computer Weekly.
With so much AI power in development and so many new neural network brains to build for our applications, how should programmers ‘kit out’ their AI toolbox?
Interest in AI has surged in the last decade. Along with this, AI tooling — which we’ll regard broadly here as anything that helps engineers build AI more efficiently — has found itself in the spotlight.
But how important is tooling?
Will it be tooling that allows us to advance AI further, or is tooling simply a distraction from ‘real’ work?
Tooling vs theory
The question comes down to assessing the balance between tooling and theory, and asking which will truly lead us to an AI revolution.
Consider the tooling around Deep Learning, the poster child of recent AI advancements.
Google, Facebook, Microsoft and Amazon have developed or adopted TensorFlow, PyTorch, Microsoft Cognitive Toolkit and MXNet respectively, highlighting the importance of these tools.
A decade ago, we would have expected these companies to develop most of their tooling as proprietary closed-source software, much like early versions of Microsoft Visual Studio. Instead, we now see tools developed under open source licences, with strong input from a global community of developers.
These tools undoubtedly help developers to work more efficiently.
Instead of having to spend mental effort on backpropagation formulas, AI engineers can work with abstract, high-level building blocks. This makes the engineering feedback loop much faster, in the same way that writing code in a high-level programming language is easier than writing it in assembly language.
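To make the point concrete, here is a toy sketch (plain Python, not a real framework) of what that abstraction buys: the gradient formula lives inside the building block, so the user-facing code never touches any calculus, much as TensorFlow or PyTorch hide backpropagation behind layer and optimiser objects. The `LinearModel` class and its API are invented for illustration only.

```python
class LinearModel:
    """A one-parameter toy 'framework' hiding the maths from the caller."""

    def __init__(self):
        self.w = 0.0  # single trainable parameter

    def predict(self, x):
        return self.w * x

    def fit(self, xs, ys, lr=0.01, epochs=200):
        # The gradient of mean squared error w.r.t. w is derived once,
        # inside the abstraction, so users never see the derivation.
        for _ in range(epochs):
            grad = sum(2 * (self.predict(x) - y) * x
                       for x, y in zip(xs, ys)) / len(xs)
            self.w -= lr * grad
        return self

# High-level user code: two lines, no backpropagation in sight.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]  # data drawn from y = 2x
model = LinearModel().fit(xs, ys)
```

After fitting, `model.w` converges towards 2.0; the caller only ever dealt with `fit` and `predict`.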
The AI industrial revolution
The industrial revolution reduced the need for repetitive human labour; similarly, AI tooling has reduced the need for low-level mathematics when building AI solutions.
Many programmers today do not need to know how a compiler functions to write useful computer software. With advanced enough tooling, developers can build useful AI models without the need to study calculus and linear algebra.
But tooling can also be a distraction, or even a net negative. Frameworks can suffer from deprecation over time. As Google pushes out new versions of TensorFlow, models that were written last year are already obsolete.
Most documentation, tutorials, books, presentations and academic papers become obsolete within months.
Developers spend time becoming experts in specific tools only to find that they need to start over a few months later.
Furthermore, not understanding the fundamental concepts can cause people to drastically overstate their results. Researchers today regularly claim to have solved a problem with AI, when in fact they have simply uncovered random results that look interesting due to careful selection from trillions of random simulations.
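This selection effect is easy to demonstrate. In the sketch below (an illustrative assumption, not a claim about any specific published result), ten thousand "strategies" each guess the outcome of 100 fair coin flips. None has any skill, yet the best performer looks well above chance, purely because we picked the top result from a large pool of random ones.

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

n_flips = 100        # trials per "strategy"
n_strategies = 10_000  # how many random strategies we try

# Each strategy's score is how many fair coin flips it "predicted";
# every prediction is a pure 50/50 guess, so there is no skill anywhere.
best = max(
    sum(random.random() < 0.5 for _ in range(n_flips))
    for _ in range(n_strategies)
)

accuracy = best / n_flips
print(f"Best purely-random strategy: {accuracy:.0%} accuracy")
```

The top scorer typically lands well above 60%, which would look like a publishable result if the other 9,999 attempts were quietly discarded. Understanding the underlying statistics, not the tooling, is what guards against this.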
Hammer to fall
While advanced tooling can help us, we shouldn’t ignore simple alternatives.
A hammer in 2019 is not so different from a shaped rock that humans started using 3.3 million years ago. Similarly, a modern computer keyboard is not that different from those found on typewriters, and many professional engineers still use old code editors such as Vim and Emacs.
It remains to be seen how much tooling will help us develop the next generation of AI, and how much of this advancement will rather be done using older and simpler methods.
In conclusion: don't ignore developments in tooling, but make sure you own your tools, rather than the other way around.
Codementor tweets at @codementorio