Channel-Specific and Product-Centric GenAI Implementation in Enterprises Leads to Data Silos and Inefficiencies

Pega employs a ‘situational layer cake’ which, as part of its exclusive centre-out architecture, helps adapt microjourneys for different customer types, lines of business, geographies, and more.

Organisations often struggle with data silos and inefficiencies when implementing generative AI solutions. This affects over 70% of enterprises today, but global software company Pegasystems, aka Pega, seems to have cracked the code by using its patented ‘situational layer cake’ architecture. 

This approach democratises the use of generative AI across its platform, allowing clients to seamlessly integrate AI into their processes. They can choose any LLM service provider, including OpenAI, Google’s Vertex AI, and Azure OpenAI Service, thereby ensuring consistent and efficient AI deployment across all business units.

“Our GenAI implementation at the rule type levels allows us to democratise the use of LLMs across the platform for any use case and by mere configuration, our clients can use any LLM service provider of their choice,” said Deepak Visweswaraiah, vice president, platform engineering and site managing director at Pegasystems, in an interaction with AIM.
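To make that provider-agnostic claim concrete, the minimal Python sketch below illustrates the general pattern rather than Pega’s actual implementation: business logic calls a single completion function, and the vendor behind it (OpenAI, Vertex AI, Azure OpenAI, or anything else) is selected purely through configuration. All class, function, and model names here are hypothetical placeholders.

# Illustrative sketch only -- not Pega's implementation. It shows the pattern
# the article describes: the LLM provider is a configuration detail, so
# business logic never hard-codes a specific vendor.
from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class LLMConfig:
    provider: str      # e.g. "openai", "vertex-ai", "azure-openai"
    model: str         # provider-specific model name
    api_key_ref: str   # reference to a secret-store entry, not the key itself


# Registry of adapters; each adapter would wrap the vendor's own SDK.
_ADAPTERS: Dict[str, Callable[[LLMConfig, str], str]] = {}


def register_adapter(provider: str):
    def decorator(fn: Callable[[LLMConfig, str], str]):
        _ADAPTERS[provider] = fn
        return fn
    return decorator


@register_adapter("openai")
def _openai_adapter(cfg: LLMConfig, prompt: str) -> str:
    # A real adapter would call the OpenAI SDK here using cfg.model.
    return f"[openai:{cfg.model}] {prompt}"


@register_adapter("vertex-ai")
def _vertex_adapter(cfg: LLMConfig, prompt: str) -> str:
    # A real adapter would call Google's Vertex AI SDK here.
    return f"[vertex-ai:{cfg.model}] {prompt}"


def complete(cfg: LLMConfig, prompt: str) -> str:
    """Business logic calls this; switching vendors is just a config change."""
    return _ADAPTERS[cfg.provider](cfg, prompt)


if __name__ == "__main__":
    cfg = LLMConfig(provider="vertex-ai", model="gemini-1.5-pro", api_key_ref="vault://llm-key")
    print(complete(cfg, "Summarise this customer case"))

In a real deployment, each adapter would wrap the corresponding vendor SDK and read credentials from a secret store, so swapping providers would not touch any business rules.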

Pega vs the World 

Recently, Salesforce announced the launch of two new generative AI agents, Einstein SDR Agent and Einstein Sales Coach Agent, which autonomously engage leads and provide personalised coaching. This move aligns with Salesforce’s strategy to integrate AI into its Einstein 1 Agentforce Platform, enabling companies like Accenture to scale deal management and focus on complex sales.

Salesforce integrates AI across all key offerings through its unified Einstein 1 Platform, which enhances data privacy, security, and operational efficiency via the Einstein Trust Layer. 

“We have generative AI capabilities in Sales Cloud, Service Cloud, Marketing Cloud, Commerce Cloud, as well as our Data Cloud product, making it a comprehensive solution for enterprise needs,” said Sridhar H, senior director of solution engineering at Salesforce.

SAP’s generative AI strategy, on the other hand, centres around integrating AI into core business processes through strategic partnerships, ethical AI principles, and enhancing its Business Technology Platform (BTP) to drive relevance, reliability, and responsible AI use across industries.

“We are adding a generative AI layer to our Business Technology Platform to address data protection concerns and enhance data security,” stated Sindhu Gangadharan, senior VP and MD of SAP Labs, underscoring the company’s focus on integrating AI with a strong emphasis on security and business process improvement.

Oracle, for its part, focuses on leveraging its second-generation cloud infrastructure, Oracle Cloud Infrastructure (OCI). It is designed with a unique, non-blocking network architecture to support AI workloads with enhanced data privacy while extending its data capabilities across multiple cloud providers.

“We’re helping customers do training, inference and RAG in isolation and privacy so that you can now bring corporate sensitive, private data…without impacting any privacy issue,” said Christopher G Chelliah, senior vice president, technology & customer strategy, JAPAC at Oracle.

Meanwhile, IBM has watsonx.ai, an AI and data platform designed to help companies integrate, train, and deploy AI models across various business applications.

IBM’s generative AI strategy with watsonx.ai differentiates itself by offering extensive model flexibility, including IBM-developed models (Granite), open-source ones (Llama 3 and the like), and third-party models, along with robust client protection and hybrid multi-cloud deployment options. Pega, by contrast, focuses on deeply integrating AI within its platform to streamline business processes and eliminate data silos through its unique situational layer cake architecture.

Pega told AIM that it distinguishes itself from its competitors by avoiding the limitations of traditional technological approaches, which often lead to redundant implementations and data silos. “In contrast, competitors might also focus more on channel-specific designs or product-centric implementations, which can lead to inefficiencies and fragmented data views across systems,” said Visweswaraiah.

Situational Layer Cake Architecture 

Pega told AIM that its approach to integrating GenAI processes into business operations is distinct due to its focus on augmenting business logic and decision engines rather than generating code for development. 

It employs the situational layer cake architecture, which, as part of Pega’s exclusive centre-out architecture, helps adapt microjourneys for different customer types, lines of business, geographies, and more.

“Our patented situational layer cake architecture works in layers, making specialising a cinch, differentiating doable, and applying robust applications to any situation at any time, at any scale,” said Visweswaraiah.

He added that enterprises can start with small, quick projects that can grow and expand over time, ensuring they are adaptable and ready for future challenges.

In addition to this, the team said it has the ‘Pega Infinity’ platform, which can mirror any organisation’s business by capturing the critical business dimensions within its patented situational layer cake. 

“Everything we build in the Pega platform (processes, rules, data models, and UI) is organised into layers within the situational layer cake. This means that you can roll out new products, regions, or channels without copying or rewriting your application,” shared Visweswaraiah.

He further said that the situational layer cake lets enterprises declare what is different, and only what is different, in layers that match each dimension of their business.

Simply put, when a user executes the application, the Pega platform slices through the situational layer cake and automatically assembles an experience that is tailored exactly to that user’s context. 
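As a rough illustration of that resolution step, the following simplified Python model (an assumption-laden sketch, not Pega’s internals) stores each rule version with only the business dimensions in which it differs from the base, then picks the most specific version that matches a user’s context.

# Illustrative sketch only -- not Pega's internals. It models the idea behind
# the situational layer cake: each rule version declares only the business
# dimensions where it differs, and the most specific applicable version wins
# for a given user context.
from dataclasses import dataclass, field
from typing import Dict, List, Optional


@dataclass
class RuleVersion:
    name: str                                   # e.g. "CreditCheck"
    body: str                                   # what the rule does (placeholder)
    dimensions: Dict[str, str] = field(default_factory=dict)  # e.g. {"region": "EU"}


def resolve(rules: List[RuleVersion], name: str, context: Dict[str, str]) -> Optional[RuleVersion]:
    """Pick the most specialised version of a rule that matches the context."""
    candidates = [
        r for r in rules
        if r.name == name and all(context.get(k) == v for k, v in r.dimensions.items())
    ]
    # More matching dimensions == a more specific layer of the cake.
    return max(candidates, key=lambda r: len(r.dimensions), default=None)


if __name__ == "__main__":
    rules = [
        RuleVersion("CreditCheck", "base policy"),
        RuleVersion("CreditCheck", "EU policy", {"region": "EU"}),
        RuleVersion("CreditCheck", "EU retail policy", {"region": "EU", "line_of_business": "retail"}),
    ]
    ctx = {"region": "EU", "line_of_business": "retail", "channel": "mobile"}
    print(resolve(rules, "CreditCheck", ctx).body)  # -> "EU retail policy"

Here, a user in the EU retail line of business gets the EU retail policy, while other users fall back to the regional or base layer, mirroring the ‘declare only what is different’ idea described above.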

Visweswaraiah believes that this architecture has given them a great opportunity to integrate GenAI into the platform at the right layers so it is available across the platform. 
