While adoption of machine learning (ML) technology has yet to hit the mainstream, the most common means of implementing it today is to purchase enterprise applications in which such functionality is embedded.
According to a study by 451 Research, only one in five organisations has so far implemented some form of ML software in one or more parts of the business, while a further 20% are at the proof-of-concept stage. Another 13% plan to introduce the technology within the next 12 months, 15% within the next three years, and just under a third have no such plans at all.
Interestingly, of those businesses which have gone down the artificial intelligence (AI) route, some 38% have introduced ML-enhanced packages from third-party suppliers. This figure compares with 27% building their own applications using open source tools, 24% doing so using supplier-specific tools and 11% using ML-enabled software from service providers.
Nick Patience, research vice-president of AI applications and platforms at 451 Research, explains why going down the enterprise application route is so popular. “Building machine learning applications is hard and a lot of software developers haven’t figured out how to do it yet,” he says. “Also, data scientists are in short supply and are very well paid as a result, which is why large software companies have lots of them, and other firms don’t.”
But it is not just a general lack of skills that is an issue here. In fact, the single biggest challenge for user organisations relates to data collection, preparation and quality. The adage “garbage in, garbage out” is even more pertinent in an AI-based world than elsewhere, because everything hinges on the data.
Machine learning use cases
As for the ways in which ML is being used today, common areas of adoption include customer service and document analysis. Beyond that, “use cases get industry-specific very quickly”, says Patience.
For example, in an industrial setting, the software is used for predictive maintenance, as well as supply and demand forecasting. Financial services companies employ it for activities such as fraud detection and customer service. But the sector with the biggest potential upside in terms of transformation, Patience believes, is healthcare, although uptake here is unlikely to take off unless a host of privacy issues are resolved.
Despite the ongoing supplier hype, adoption is unlikely to become widespread for another couple of years. In the interim, Patience expects to see struggles for dominance between enterprise application suppliers, such as Oracle and SAP, and platform suppliers, such as IBM and Amazon.
“There’s definitely going to be a battle for the market and it’s a long way from being resolved,” he says. “The question is one of build versus buy, and whether people will rely on the main platform vendors, the application vendors, or a mixture of the two.”
Another challenge for customers when making buying decisions is the question of which ML model to opt for – an important consideration because, as Patience puts it, models are “valuable intellectual property and control the functionality in the applications”.
As Patience warns: “You can’t just chuck the same model into Salesforce if you’ve been using Oracle, which means we’re likely to see a battle for machine learning adoption at all levels.”
Whatever the future may hold, the following three case studies demonstrate how ML-enhanced packages from enterprise applications suppliers are starting to be used today.
Flint Hills Resources – predictive maintenance
Flint Hills Resources (FHR) believes ML technology could prove an important means of helping it maximise the efficiency and safety of its refining, chemicals and biofuels business over the next three to four years.
The introduction of Infor’s newly released Coleman AI platform is intended to save the Kansas-based company time and money by supporting a move from a six-monthly scheduled maintenance regime, which costs it hundreds of millions of dollars each year, to a predictive maintenance approach that is much more just-in-time and reduces the need to store expensive inventory.
“Our big use case here is predictive maintenance, so we don’t have to take pieces of equipment down every six months and work on them if they don’t need it,” says Lucas Randall, FHR’s director of digital transformation. “From a refinery processing perspective, it’s more efficient if all the equipment is running, so it’s about trying to find the right blend of operations and maintenance to optimise activity.”
While the project is still in the early stages, the Koch Industries-owned company started working with Infor’s data scientists a couple of months ago. They were provided with 10 years’ worth of maintenance-related work order histories for two refineries and created a model that predicted the intervals between those work orders to understand when the next ones were likely to fall due. Real-world cases are now being tested to see if the model has worked.
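Infor has not published details of the Coleman model, but the interval-based idea Randall describes – predicting the next work order from the gaps between past ones – can be sketched as follows. All equipment names, dates and the averaging approach are hypothetical illustrations, not FHR’s or Infor’s actual method:

```python
from datetime import date, timedelta

def predict_next_work_order(history: list[date]) -> date:
    """Estimate the next due date from the average interval
    between past work orders for one piece of equipment."""
    ordered = sorted(history)
    # Days between each consecutive pair of work orders
    intervals = [
        (later - earlier).days
        for earlier, later in zip(ordered, ordered[1:])
    ]
    avg_days = sum(intervals) / len(intervals)
    return ordered[-1] + timedelta(days=round(avg_days))

# Hypothetical pump on a roughly six-monthly maintenance cycle
pump_history = [date(2018, 1, 10), date(2018, 7, 12), date(2019, 1, 9)]
print(predict_next_work_order(pump_history))  # roughly six months after the last order
```

A real model would weigh far more signals than elapsed time – equipment type, failure codes, operating conditions – but even this toy version shows why clean, consistent work-order histories matter: a missing or misdated record skews every interval derived from it.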
While Randall says that “getting the data into Coleman was easy”, what proved more testing was getting it into a fit state for use in the system. This involved pulling the requisite information together and ensuring it was clean enough to guarantee accurate outputs.
“We’ve done clean-up reporting for a subset of the data but never on this scale, so we’ve had to introduce quality processes to ensure it was good enough to go into the machine learning model,” says Randall.
The next phase of the initiative will be to introduce sensors – as their cost comes down – that can recognise temperature shifts or other changes beyond “steady state” to provide a more holistic view of what is happening with the equipment. These sensors will eventually replace field workers, who will be reskilled to move into safer, higher-value data analysis work.
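The “beyond steady state” detection Randall anticipates can be illustrated with a simple statistical check that flags sensor readings drifting more than a few standard deviations from a baseline. This is a sketch only, with hypothetical readings and thresholds – production systems would use far more sophisticated models:

```python
import statistics

def flag_anomalies(baseline: list[float], readings: list[float],
                   z_threshold: float = 3.0) -> list[float]:
    """Return readings that deviate from the steady-state baseline
    by more than z_threshold standard deviations."""
    mean = statistics.mean(baseline)
    stdev = statistics.stdev(baseline)
    return [r for r in readings if abs(r - mean) / stdev > z_threshold]

# Hypothetical temperature feed: steady state around 70 degrees C
baseline = [69.8, 70.1, 70.0, 69.9, 70.2, 70.0, 69.7, 70.3]
new_readings = [70.1, 69.9, 78.5, 70.0]  # 78.5 is well outside steady state
print(flag_anomalies(baseline, new_readings))
```

Combining flags like these with the work-order history is the “blended view” Randall describes: the sensors say the equipment is behaving abnormally, and the maintenance model says whether a work order is already due.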
“The idea is to plug enterprise asset management into the Coleman models and incorporate process and sensor data into a blended view that tells us about the health of the equipment and the situation with work orders, because the two together will help us realise our vision of predictive maintenance,” says Randall.
Standard Life Aberdeen – boosting sales conversion rates
Standard Life Aberdeen’s aim in introducing Salesforce’s Einstein AI platform is to use sales and marketing data more effectively to better support its 18,000 UK financial advisors in boosting customer conversion rates.
The Aberdeen Standard Investments side of the business, which undertakes both asset management activities for institutional and wholesale clients and investment management for high-net-worth individuals, is currently conducting the groundwork necessary to get the first use cases up and running in Einstein by the end of 2019.
In going down this route, the first core aim of the UK-based firm, which was formed following the merger of Standard Life with Aberdeen Asset Management in 2017, is to use sales and marketing information to engage financial advisors more effectively, rather than turning them off by bombarding them with untargeted messages.
A second goal is to identify the right opportunities for them to interact with clients and provide the appropriate prompts to make a sale more likely. A third is to understand advisors’ overall sales performance by means of a dashboard to make it easier to adapt the firm’s strategy if required.
“The focus is on creating a data-driven business with Einstein at its core, which means using data insights as effectively as possible,” says Duncan Muir, the firm’s head of business-to-business customer relationship management (CRM). “We’ve been building out this capability for some time, but now we want to supercharge it.”
Over the past six months, Standard Life Aberdeen has been working to ensure its core data is “robust and in the right place”. It has also been busy introducing the right structures and processes to ensure its sales and marketing teams are comfortable and familiar with the new ways of working.
“We’re already increasing conversion rates by about 20% as we’ve been applying the principles but just doing it manually,” says Muir. “We’ve been using data to target people, but in smaller populations, and Einstein will supercharge that.”
Other expected benefits include a 50% reduction in the manual effort currently required to collate and analyse data. “Einstein will do that all the time with much more highly targeted activities so engagements are more relevant and timely,” says Muir. Each salesperson is also expected to save about three-and-a-half hours per week in providing information to advisors, a figure that is expected to double when Einstein Voice natural language processing software is implemented in future.
“The aim is to get the first use cases up and running by the end of 2019, and then in 2020, we’ll build them out and extend them to more teams, such as operations and service. It may only be early days, but this technology has the potential to make a huge impact on the business,” says Muir.
Pregis – business transformation
Pregis intends to use SAP’s Leonardo AI-based applications to help it broaden out its activities as a solutions provider in the US packaging sector.
The Illinois-based company started the shift from being a pure packaging manufacturer – one that both made the packaging products itself and provided onsite machines to package customer orders – about three years ago. It now takes a more consultative approach to help clients reduce damage and improve the “unboxing” experience, which means it can charge more.
But the hope is that the move to Leonardo will ultimately enable Pregis to “shift the market and come to the industry with a range of innovations”, says Jeff Mueller, the firm’s vice-president and chief information officer. As an example of what he means, Mueller cites the tyre industry, which not only provides customers with the tyres themselves, but also offers various services, such as free checks and paid-for warranties.
As to how the company intends to achieve its aims, the idea is that it will use the data gleaned from predictive maintenance activities undertaken by Leonardo on its packaging machines to look for patterns and trends. One possible use case here, which Pregis hopes to roll out to its most important distribution partners over the next 18 months, is to offer them better insights into their inventory status, so staff can be redeployed onto higher-value tasks.
A further aim over the longer term is also to use predictive maintenance data obtained from packaging machines based at customer sites to help clients understand how productive their distribution centres are.
But it is currently early days for the firm’s initiative. It plans to start its migration from the internet of things-based release of Leonardo to the ML-based platform-as-a-service version at the end of October 2019, and expects it to go live by around mid-January 2020.
“We’re moving from reactive to proactive to predictive maintenance, but we’re only in the early stages of our data journey when you’re talking about AI and machine learning,” says Mueller.
Even so, the company anticipates that the move to predictive maintenance will generate significant cost savings. Despite having made six acquisitions recently, Pregis expects to keep the headcount of its field service team flat by reducing the time they spend on maintenance tasks by 10%.
“The next phase is revenue growth as a result of automatic replenishment based on monitoring how much the machines consume, which we can then invoice for,” concludes Mueller.
Read more about machine learning in enterprise applications
- Podcast: Machine learning applications – from lab to enterprise.
- Hands-on experience building bots on the Azure cloud and briefings from Microsoft experts show that the future of AI is here, and it’s more user-friendly than you might expect.
- SAP wants to stake a claim in IoT with SAP Leonardo IoT and integration with Microsoft Azure IoT Hub, but observers call the announcement more of a marketing exercise than anything else.