Organisations often struggle with data silos and inefficiencies when implementing generative AI solutions, a problem that reportedly affects over 70% of enterprises today. Global software company Pegasystems, aka Pega, seems to have cracked the code by using its patented ‘situational layer cake’ architecture.
This approach democratises the use of generative AI across its platform, allowing clients to seamlessly integrate AI into their processes. They can choose from any LLM service provider, including OpenAI, Google’s Vertex AI, and Azure OpenAI Services, thereby ensuring consistent and efficient AI deployment across all business units.
“Our GenAI implementation at the rule type levels allows us to democratise the use of LLMs across the platform for any use case and by mere configuration, our clients can use any LLM service provider of their choice,” said Deepak Visweswaraiah, vice president, platform engineering and site managing director at Pegasystems, in an interaction with AIM.
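Configuration-driven provider selection of the kind Visweswaraiah describes is a common pattern. The sketch below is a minimal, hypothetical illustration of it in Python; the provider names mirror those mentioned above, but the registry, class, and function names are illustrative assumptions, not Pega's actual API.

```python
# Illustrative sketch: swapping LLM providers by configuration alone.
# All identifiers here are hypothetical, not Pega's implementation.
from dataclasses import dataclass

@dataclass
class LLMConfig:
    provider: str   # e.g. "openai", "vertex-ai", "azure-openai"
    model: str
    endpoint: str

# A registry keyed by a configuration value; calling code never changes.
REGISTRY = {
    "openai":       LLMConfig("openai", "gpt-4o", "https://api.openai.com/v1"),
    "vertex-ai":    LLMConfig("vertex-ai", "gemini-pro", "https://aiplatform.googleapis.com"),
    "azure-openai": LLMConfig("azure-openai", "gpt-4o", "https://<resource>.openai.azure.com"),
}

def get_llm(provider_key: str) -> LLMConfig:
    """Resolve the configured provider; the rest of the application
    depends only on the LLMConfig interface, not on any one vendor."""
    return REGISTRY[provider_key]
```

Under this pattern, switching from one vendor to another is a one-line configuration change rather than a code change, which is the essence of the "mere configuration" claim.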
Pega vs the World
Recently, Salesforce announced the launch of two new generative AI agents, Einstein SDR Agent and Einstein Sales Coach Agent, which autonomously engage leads and provide personalised coaching. This move aligns with Salesforce’s strategy to integrate AI into its Einstein 1 Agentforce Platform, enabling companies like Accenture to scale deal management and focus on complex sales.
Salesforce integrates AI across all key offerings through its unified Einstein 1 Platform, which enhances data privacy, security, and operational efficiency via the Einstein Trust Layer.
“We have generative AI capabilities in sales cloud, service cloud, marketing cloud, commerce cloud, as well as our data cloud product, making it a comprehensive solution for enterprise needs,” said Sridhar H, senior director of solution engineering at Salesforce.
SAP’s generative AI strategy, on the other hand, centres around integrating AI into core business processes through strategic partnerships, ethical AI principles, and enhancing its Business Technology Platform (BTP) to drive relevance, reliability, and responsible AI use across industries.
“We are adding a generative AI layer to our Business Technology Platform to address data protection concerns and enhance data security,” stated Sindhu Gangadharan, senior VP and MD of SAP Labs, underscoring the company’s focus on integrating AI with a strong emphasis on security and business process improvement.
Oracle, for its part, focuses on leveraging its second-generation cloud infrastructure, Oracle Cloud Infrastructure (OCI). It is designed with a unique, non-blocking network architecture to support AI workloads with enhanced data privacy while extending its data capabilities across multiple cloud providers.
“We’re helping customers do training inference and RAG in isolation and privacy so that you can now bring corporate sensitive, private data…without impacting any privacy issue,” said Christopher G Chelliah, senior vice president, technology & customer strategy, JAPAC at Oracle.
Meanwhile, IBM has watsonx.ai, an AI and data platform designed to help companies integrate, train, and deploy AI models across various business applications.
IBM’s generative AI strategy with watsonx.ai differentiates itself by offering extensive model flexibility, including IBM-developed (Granite), open-source (Llama 3 and the like), and third-party models, along with robust client protection and hybrid multi-cloud deployment options. Pega, by contrast, focuses on deeply integrating AI within its platform to streamline business processes and eliminate data silos through its unique situational layer cake architecture.
Pega told AIM that it distinguishes itself from its competitors by avoiding the limitations of traditional technological approaches, which often lead to redundant implementations and data silos. “In contrast, competitors might also focus more on channel-specific designs or product-centric implementations, which can lead to inefficiencies and fragmented data views across systems,” said Visweswaraiah.
Situational Layer Cake Architecture
Pega told AIM that its approach to integrating GenAI processes into business operations is distinct due to its focus on augmenting business logic and decision engines rather than generating code for development.
It employs the situational layer cake architecture, which, as part of Pega’s exclusive centre-out architecture, helps adapt microjourneys for different customer types, lines of business, geographies, and more.
“Our patented situational layer cake architecture works in layers, making specialising a cinch, differentiating doable, and applying robust applications to any situation at any time, at any scale,” said Visweswaraiah.
He added that enterprises can start with small, quick projects that can grow and expand over time, ensuring they are adaptable and ready for future challenges.
In addition, the team pointed to its ‘Pega Infinity’ platform, which can mirror any organisation’s business by capturing the critical business dimensions within its patented situational layer cake.
“Everything we build in the Pega platform, processes, rules, data models, and UI is organised into layers within the situational layer cake. This means that you can roll out new products, regions, or channels without copying or rewriting your application,” shared Visweswaraiah.
He further said that the situational layer cake lets you declare what is different, and only what is different, in layers that match each dimension of your business.
Simply put, when a user executes the application, the Pega platform slices through the situational layer cake and automatically assembles an experience that is tailored exactly to that user’s context.
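The idea of layered specialisation with runtime assembly can be sketched generically. The following Python fragment is a hypothetical illustration of the concept, assuming a simple dictionary-based rule model: a base layer holds defaults, and each more specific layer declares only what differs. None of the names reflect Pega's actual data structures.

```python
# Illustrative sketch of layered rule resolution: each layer declares only
# what differs for its business dimension; resolution assembles the rest.
# Layer names, match keys, and rule fields are all hypothetical.
LAYERS = [
    {"name": "base", "match": {},
     "rules": {"currency": "USD", "approval_limit": 1000, "ui_theme": "default"}},
    {"name": "geo-uk", "match": {"region": "UK"},
     "rules": {"currency": "GBP"}},
    {"name": "lob-premium", "match": {"line_of_business": "premium"},
     "rules": {"approval_limit": 5000}},
]

def resolve(context: dict) -> dict:
    """Assemble the effective rule set for a user's context by applying
    every matching layer in order; later layers override earlier ones."""
    effective = {}
    for layer in LAYERS:
        if all(context.get(k) == v for k, v in layer["match"].items()):
            effective.update(layer["rules"])
    return effective

# A UK premium customer inherits the defaults plus both specialised layers,
# without any layer having copied or rewritten the base application.
print(resolve({"region": "UK", "line_of_business": "premium"}))
# {'currency': 'GBP', 'approval_limit': 5000, 'ui_theme': 'default'}
```

Adding a new region or product in this scheme means adding one new layer with only its deltas, which is the property the article attributes to rolling out "new products, regions, or channels without copying or rewriting your application".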
Visweswaraiah believes that this architecture has given them a great opportunity to integrate GenAI into the platform at the right layers so it is available across the platform.