Appian & AWS accelerate AI automation accessibility

Accessibility matters.

When users first saw Microsoft’s ‘accessibility’ features appear some three decades ago, not everybody knew what was being offered.

Stanford University’s history of accessible technology pages remind us that Microsoft shipped Windows 95 in 1995: “[This was] the first time its OS had built-in accessibility features (rather than as an add-on).”

Accessibility today still matters, only more so, obviously.

This is (some or all of the reason) why Appian has now signed a collaboration agreement with Amazon Web Services (AWS) with the aim of making generative Artificial Intelligence (gen-AI) more accessible to enterprise business processes. 

Why isn’t gen-AI more accessible from the get-go? 

Primarily of course because operating Large Language Models (LLMs) is complex, time-consuming and expensive – not to mention the management burden associated with setting up appropriate privacy guardrails and data lockdown procedures when LLMs are connected to proprietary mission-critical data – and then there’s the whole set of tasks associated with maintaining, training and re-braining (Ed: not actually a term yet, but it should be) AI models where hallucinatory bias is detected and so on. 

Oh, yes, and there’s also the fact that organisations face a shortage of data scientists and increasing information technology backlogs as a whole.

Maintaining, training & re-braining 

To attempt to address these concerns and a host of related challenges in this space, Appian will invest resources to find novel (as in new, not as in fancy) ways to combine Appian’s native AI capabilities and the Appian data fabric with the LLMs provided by Amazon Bedrock and the Machine Learning (ML) capabilities from Amazon SageMaker. 

For those who might enjoy a reminder, Amazon Bedrock is a fully managed service that offers a choice of high-performing Foundation Models (FMs) from AI companies via a single Application Programming Interface (API), along with a set of capabilities that organisations need to build generative AI applications with security, privacy and responsible AI. 
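
For a flavour of what that single API looks like in practice, here is a minimal sketch of invoking a Bedrock-hosted model with the AWS SDK for Python (boto3). The region, model ID and prompt are illustrative assumptions for this article, not details drawn from the Appian and AWS agreement.

```python
import json

import boto3

# Runtime client for invoking Bedrock-hosted foundation models
# (the region is an illustrative assumption).
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# Illustrative model ID and prompt – any foundation model the account
# has been granted access to in Bedrock could be used here.
response = bedrock.invoke_model(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",
    contentType="application/json",
    accept="application/json",
    body=json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 256,
        "messages": [
            {"role": "user", "content": "Summarise this claims record in two sentences: ..."}
        ],
    }),
)

# The response body is a stream; decode it to read the model's text output.
result = json.loads(response["body"].read())
print(result["content"][0]["text"])
```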

Amazon SageMaker is a service to build, train, and deploy ML models for any use case with fully managed infrastructure, tools, and workflows.
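
By way of a hedged sketch only, the SageMaker Python SDK expresses that build-train-deploy loop roughly as follows. The container image, IAM role and S3 paths are placeholders invented for illustration, not anything specified by Appian or AWS.

```python
import sagemaker
from sagemaker.estimator import Estimator

session = sagemaker.Session()

# Placeholder values – a real job needs an actual training image,
# an IAM role with SageMaker permissions and S3 locations you own.
estimator = Estimator(
    image_uri="123456789012.dkr.ecr.us-east-1.amazonaws.com/my-training-image:latest",
    role="arn:aws:iam::123456789012:role/MySageMakerRole",
    instance_count=1,
    instance_type="ml.m5.xlarge",
    output_path="s3://my-bucket/model-artifacts/",
    sagemaker_session=session,
)

# Train on data the customer controls, held in their own S3 bucket.
estimator.fit({"train": "s3://my-bucket/training-data/"})

# Deploy the trained model behind a fully managed inference endpoint.
predictor = estimator.deploy(initial_instance_count=1, instance_type="ml.m5.xlarge")
```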

Appian automation

Completing the picture here then, Appian is designed to automate mission-critical business processes in some of the most highly regulated and security-sensitive industries. Customers in these sectors want to use AI while maintaining the security of their data. 

Bullish about the prospects for success as a result of this new strategic agreement, Appian says that its private AI approach gives enterprises control over their own data and ensures that data is not used to train public models that other organisations can use. 

“Appian’s low-code AI process platform with its Appian AI Skills capability enables customers to easily incorporate AI into business processes, letting AI and humans work together seamlessly,” said Michael Beckley, CTO at Appian. “The AI economy is here and it will quickly create a strong competitive advantage for organisations that know how to use it. Enterprises that upgrade their core business processes with AI and Appian’s data-driven insights will thrive while those that fail to do so will lose control of their data and their future. Amazon Bedrock is an important enabler for our private AI vision. Together, we give organisations the ability to effortlessly use AI for process automation with data privacy and security at the forefront.”

By using the capabilities of Amazon Bedrock, Appian gains the ability to host LLMs within customer compliance boundaries and privately customise those models, ensuring that sensitive data remains secure and confidential. 
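
As a rough illustration of what ‘privately customise’ can mean in Bedrock terms, the service offers model customisation (fine-tuning) jobs that train against data held in the customer’s own S3 buckets, inside the customer’s compliance boundary. Every identifier below – job name, role ARN, base model and bucket – is a placeholder assumption rather than a detail disclosed by either company.

```python
import boto3

# Control-plane Bedrock client (distinct from the bedrock-runtime client used for inference).
bedrock = boto3.client("bedrock", region_name="us-east-1")

# All names, ARNs and S3 URIs are placeholders for illustration only.
bedrock.create_model_customization_job(
    jobName="claims-summariser-finetune",
    customModelName="claims-summariser-v1",
    roleArn="arn:aws:iam::123456789012:role/MyBedrockCustomisationRole",
    baseModelIdentifier="amazon.titan-text-express-v1",
    trainingDataConfig={"s3Uri": "s3://my-private-bucket/train.jsonl"},
    outputDataConfig={"s3Uri": "s3://my-private-bucket/custom-model-output/"},
    hyperParameters={"epochCount": "2", "batchSize": "1", "learningRate": "0.00001"},
)
```

The resulting custom model stays private to the AWS account that created it, which lines up with the point Appian is making about customer data not leaking into publicly shared models.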

Sagacious security software

Michael Beckley, CTO at Appian: The AI economy is here, it’s time to open a bank account and start making transactions.

Amazon SageMaker allows Appian customers to create, train and fine-tune proprietary AI models using their own data. Because AWS AI and ML services are designed with security and privacy controls, the hyperscaler giant says that Appian customers can use their data to receive more accurate and relevant results from their generative AI applications based on the needs of their businesses.

Chris Grusz, managing director of technology partnerships at AWS, rounded off the news of this technology union this week by saying that companies care deeply about security and privacy, especially when leveraging AI. He thinks that this collaboration between AWS and Appian will help customers streamline mission-critical business processes with confidence by leveraging AI and machine learning in a secure and compliant manner.

End-to-end auto-AI

In terms of key takeaways here… well, it’s automation & acceleration first, but not without security & solidity second… and really, all those factors have to come together on one plate (did someone say unified platform or single pane of glass, or both even?) with skills, skills, skills and of course a nice resounding end-to-end message to seal the deal. Making generative AI accessible may be one of the key challenges for this year’s enterprise IT stack architects and the software application development professionals who populate that stack. 

This is more than just accessibility via text-to-speech, this is end-to-end AI accessibility via automation-to-accelerate… try fitting that on a t-shirt.
