Pega CTO: We’re the AI ‘workflow backbone’ on a spine of Google Cloud & AWS

Enterprise generative AI choice.

It’s not a term we might even have considered thus far. The adoption of artificial intelligence at a decision-making and generative level is still so embryonic inside most enterprise organisations that questions over the choice of generative AI engines, models and services have largely not been part of the discussion.

That reality could be about to change.

Enterprise AI decisioning and workflow automation platform provider Pega used its Pega World iNspire 2024 conference in Las Vegas this week to announce an expansion of its trademarked Pega GenAI capabilities to connect to large language models (LLMs) from Amazon Web Services (AWS) and Google Cloud.

The additions are designed to allow Pega clients to integrate the generative AI technologies offered by AWS and Google Cloud into the decisions and workflows within the Pega Platform.

Into ‘vast’ generative AI

The company insists that, through this expansion, its platform will enable enterprise users to connect what is being called “a vast range of generative AI services and models” to the Pega GenAI architecture and so put those models to work within Pega solutions.

In other words, customers will be able to use Google Cloud and AWS generative AI inside and throughout the workflow automation layers provided by Pega: businesses run Pega, but get the AI from these two hyperscalers.

The access will span Amazon Bedrock – a fully managed AWS service that offers a choice of high-performing foundation models (FMs) from AI companies via a single API – and Amazon Titan; from Google Cloud, Vertex AI and Google Gemini; plus Claude from Anthropic.
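The “single API” point is easier to picture with a minimal sketch of calling Amazon Bedrock directly, outside Pega. This is illustrative only: the region, the model IDs and the use of Bedrock’s Converse API are assumptions of the example, not details Pega has disclosed about its own integration.

```python
# Illustrative only: Amazon Bedrock exposes multiple foundation models
# behind one inference endpoint; only the model ID changes per call.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")  # region is an assumption

def ask(model_id: str, prompt: str) -> str:
    # The same Converse call shape works across Bedrock-hosted models
    # (e.g. Amazon Titan or Anthropic Claude).
    response = bedrock.converse(
        modelId=model_id,
        messages=[{"role": "user", "content": [{"text": prompt}]}],
    )
    return response["output"]["message"]["content"][0]["text"]

# Swap models without changing the calling code.
print(ask("amazon.titan-text-express-v1", "Explain workflow automation in one sentence."))
print(ask("anthropic.claude-3-sonnet-20240229-v1:0", "Explain workflow automation in one sentence."))
```

In a platform like Pega’s, that choice of model would be a configuration decision rather than application code, which is the “choice” argument the company is making.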

The AI workflow backbone

“The generative AI market is evolving rapidly as are generative AI strategies,” said Don Schuerman, chief technology officer, Pega.

“[Users] know the best model for them will depend on a variety of factors, including their own strategy and infrastructure, effectiveness, performance, speed, trust and cost, so having a choice is key. The extension of these relationships underlines Pega’s commitment to becoming the workflow backbone for generative AI solutions to enable truly transformational change for our clients.  Our trusted partners play an important role in helping us to deliver these outcomes.” 

Enterprises that use Pega for generative AI-infused development, engagement, service and back-to-front office operations workflows are promised the ability to benefit from unified governance, auditability, and controls across all applications of generative AI throughout their operations.

Pega GenAI is available to all Pega clients through Pega Cloud.

“We’re excited to expand our partnership with Pega and provide its clients with access to our curated set of over 150 models in Vertex AI,” said Rodrigo Rocha, head of global apps ISV partnerships, Google Cloud. “Generative AI empowers organizations to transform their operations, making it more important than ever to equip customers with the ability to choose the best model to suit their business needs.”

AWS and Google Cloud generative AI models will be available in Pega Connect GenAI, a plug-and-play architecture that allows low-code developers to author prompts and get immediate value from generative AI in any workflow or decision. 

This enables Pega low-code developers to build custom generative AI-powered capabilities into their workflows to help boost the productivity of employees and agents interacting with them. For example, if a process, such as a claims or approvals workflow, includes a range of documents, Pega GenAI can be used to build a component to summarise documents on the fly and give end-users an at-a-glance overview of critical information when they open their assignments.
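To make the summarisation scenario concrete, here is a minimal sketch of what such a component might do under the hood if it called Google Cloud’s Vertex AI directly; the project, location, model name and prompt are assumptions for illustration, not how Pega’s low-code integration actually surfaces this to developers.

```python
# Illustrative only: on-the-fly summary of a claims document via Vertex AI.
import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project="my-gcp-project", location="us-central1")  # assumed project/region
model = GenerativeModel("gemini-1.5-pro")

def summarise_claim(document_text: str) -> str:
    # Ask the model for a short, at-a-glance overview of the document
    # that a case worker would see when opening an assignment.
    prompt = (
        "Summarise the following claims document in three bullet points, "
        "highlighting the claimant, the amount and any missing information:\n\n"
        + document_text
    )
    return model.generate_content(prompt).text

print(summarise_claim("Claim #4821 submitted by J. Doe for water damage ..."))
```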

Pictured: Don Schuerman, chief technology officer, Pega. Photo credit: Sander Almekinders, Dutch journalist & raconteur.
