Discussions and Interviews of Top CEOs and CTOs in AI
https://analyticsindiamag.com/intellectual-ai-discussions/

How Joule is Helping SAP Find Moat in Spend Management
https://analyticsindiamag.com/intellectual-ai-discussions/how-joule-is-helping-sap-find-moat-in-spend-management/
Mon, 02 Sep 2024 10:05:48 +0000


In today’s fast-paced world, it is vital for organisations to optimise procurement, manage supply chains and control costs effectively. For instance, cocoa prices have surged over 400% this year, posing a significant challenge for chocolate manufacturers.

Spend management tools provide comprehensive visibility into a company’s spending patterns, streamlining procurement processes. Generative AI could further optimise spend management as it can provide predictive analytics, supplier recommendations and enhanced negotiations strategies. 

This is exactly what SAP has done. The company has consolidated all aspects of spend management under one umbrella, calling it Intelligent Spend and Business Network (ISBN). And now, it has thrown generative AI into the mix.

SAP customers are now reaping the benefits of Ariba, Concur and Fieldglass – companies which SAP has acquired over the years – under one platform coupled with SAP Business Network.

While spend management is not a new concept, SAP believes that combining the solution with generative AI and Joule, its proprietary AI assistant developed in-house, gives it strong potential.

How GenAI is Changing Spend Management

“At SAP, our AI-first product strategy involves deeply embedding AI into core business processes. Rather than overlaying AI onto existing workflows, we are fundamentally reimagining processes, positioning AI as a constant, collaborative partner for every user,” Jeff Collier, chief revenue officer, SAP Intelligent Spend & Business Network, told AIM.

By using Ariba, a procurement tool, customers can now leverage the power of generative AI models to accelerate their planning processes across multiple categories (including third-party market data from Beroe) and reduce supplier onboarding time, Collier further revealed.

(Jeff Collier, chief revenue officer, SAP Intelligent Spend & Business Network)

Generative AI models are also helping Ariba customers make sense of historical data, derive insights from it, and receive recommendations in real time, which in turn is helping them in their procurement process. 

“To optimise external workforce and services spend for greater insight, control, and savings, SAP Fieldglass customers are now leveraging AI-enhanced SOW description generation, AI-enhanced job descriptions, and AI-enhanced translation of job descriptions,” Collier added.

AI Agents are Coming

If not LLMs, AI agents are expected to truly scale AI. When asked if these could be part of SAP’s ISBN solution, Collier said that SAP is focused on embedding digital assistants directly into its products through Joule, SAP’s generative AI copilot. 

Joule provides AI-assisted insights and will be integrated as a standard feature across the SAP portfolio, including ISBN.

“In my conversations with chief executives, business leaders, and decision-makers around the world, a common theme has emerged: the urgent need to do more with less,” he said. 

He added that the COVID-19 pandemic has significantly expanded responsibilities, yet headcounts have remained flat or declined, making productivity a critical driver for AI interest among business leaders.

“The real question now is how swiftly these leaders can embed AI into their operations to start reaping its benefits. With the proliferation of tools, complex policies, and the need to maximise return and value, organisations are seeking chat-based interfaces or agents to guide them through their tasks and answer queries efficiently,” he added.

Hence, it makes sense for SAP to integrate AI agents into its entire portfolio. With Joule, SAP users can save time and increase productivity by describing their ideas, asking analytical questions, or instructing the system, rather than navigating through traditional clicks or coding. 

ISBN, an Interesting Prospect for India 

Explaining further what ISBN solutions from SAP truly mean, Ashwani Narang, Vice President, Intelligent Spend and Business Network, SAP Indian Subcontinent, told AIM that spend management extends beyond traditional procurement to include the entire supply chain, contingent labour, and employee expenses. 

“Various departments—marketing, finance, and others—constantly request funds, highlighting the need for comprehensive oversight. Procurement used to be the sole area focused on savings, but now spend management encompasses all financial outflows, including those outside the organisation like partners, logistics providers, and consultants,” Narang said.

(Ashwani Narang, Vice President, Intelligent Spend and Business Network, SAP Indian Subcontinent)

Narang believes that as the country transitions towards being a manufacturing economy, with a broad consensus on producing locally thanks to initiatives like ‘Make in India’, ISBN could be a great tool for Indian companies. 

“The more you become a manufacturing economy, the more working capital becomes important and I believe that’s where the onus is going to be,” he said.

SAP has witnessed triple-digit growth with ISBN, and according to Narang, thousands of customers in India are already leveraging the solutions.

The solutions can also help companies with category management. “For instance, if a mattress manufacturer needs cotton, it’s crucial to assess whether it’s a supplier-power or buyer-power category. Knowing if enough suppliers offer the specific grade of cotton helps in negotiating prices,” he added.  
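Supplier-power versus buyer-power assessments of this kind often come down to supplier concentration. As a rough, illustrative sketch (not SAP's actual methodology), a category could be classified using the standard Herfindahl-Hirschman Index over supplier market shares:

```python
def hhi(market_shares: list[float]) -> float:
    """Herfindahl-Hirschman Index over percentage market shares (0-100).

    A standard concentration measure: values above ~2500 are
    conventionally treated as highly concentrated.
    """
    return sum(share * share for share in market_shares)


def category_power(supplier_shares: list[float]) -> str:
    # Few dominant suppliers -> suppliers hold pricing power;
    # a fragmented supply base favours the buyer.
    return "supplier-power" if hhi(supplier_shares) > 2500 else "buyer-power"


category_power([60, 25, 15])  # concentrated supply, e.g. one grade of cotton
category_power([10] * 10)     # fragmented supply base
```

Real category management weighs many more factors, such as switching costs and demand volatility, but concentration is a common starting point for the kind of negotiation insight described above.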

Joule can provide insights into market conditions and supplier dynamics, guiding strategic sourcing and ensuring better negotiations, Narang added.

Moreover, ISBN is also not limited to large enterprises with a global footprint. The portfolio of SAP customers for ISBN includes mid-size and small enterprises as well.

“Any company with significant purchasing needs—whether it’s a relatively small firm with a turnover of INR 1,000 crore or a large corporation like Microsoft with a $50 billion valuation—can benefit from SAP’s intelligence solutions.”

The post How Joule is Helping SAP Find Moat in Spend Management appeared first on AIM.

This YC-Backed Bengaluru AI Startup is Powering AWS, Microsoft, Databricks, and Moody’s with 5 Mn Monthly Evaluations https://analyticsindiamag.com/ai-breakthroughs/this-yc-backed-bengaluru-ai-startup-is-powering-aws-microsoft-databricks-and-moodys-with-5-mn-monthly-evaluations/ Tue, 20 Aug 2024 04:40:40 +0000 https://analyticsindiamag.com/?p=10133091


Enterprises love RAG, but not everyone is great at it. The much-touted solution for hallucinations and for bringing new information into LLM systems is often difficult to maintain, and harder still to evaluate for whether it is even producing the right answers. This is where Ragas comes into the picture.

With over 6,000 stars on GitHub and an active Discord community over 1,300 members strong, Ragas was co-founded by Shahul ES and his college friend Jithin James. The YC-backed, Bengaluru-based AI startup is building an open-source standard for the evaluation of RAG-based applications. 

Several engineering teams from companies such as Microsoft, IBM, Baidu, Cisco, AWS, Databricks, and Adobe rely on Ragas’ offerings to keep their pipelines pristine. Ragas already processes 20 million evaluations monthly for companies and is growing at 70% month over month.

The team has partnered with companies such as LlamaIndex and LangChain to deliver its solutions and has been widely appreciated by the community. But what makes Ragas special?

Not Everyone Can RAG

The idea started when they were building LLM applications and noticed a glaring gap in the market: there was no effective way to evaluate the performance of these systems. “We realised there were no standardised evaluation methods for these systems. So, we put out a small open-source project, and the response was overwhelming,” Shahul explained while speaking with AIM.

By mid-2023, Ragas had gained significant traction, even catching the attention of OpenAI, which featured their product during a DevDay event. The startup continued to iterate on their product, receiving positive feedback from major players in the tech industry. “We started getting more attention, and we applied to the Y Combinator (YC) Fall 2023 batch and got selected,” Shahul said, reflecting on their rapid growth.

Ragas’s core offering is its open-source engine for automated evaluation of RAG systems, but the startup is also exploring additional features that cater specifically to enterprise needs. “We are focusing on bringing more automation into the evaluation process,” Shahul said. The goal is to save developers’ time by automating the boring parts of the job. That’s why enterprises use Ragas.
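To make the idea concrete: an automated RAG evaluation scores a generated answer against the retrieved context. The toy metric below checks token overlap, purely as an illustration; Ragas's real metrics use LLM-based judgments rather than string matching:

```python
def toy_faithfulness(answer: str, contexts: list[str]) -> float:
    """Fraction of answer tokens that also appear in the retrieved contexts.

    A crude stand-in for a faithfulness-style metric; production
    evaluators such as Ragas rely on LLM judgments, not token overlap.
    """
    answer_tokens = set(answer.lower().split())
    context_tokens = set(" ".join(contexts).lower().split())
    if not answer_tokens:
        return 0.0
    return len(answer_tokens & context_tokens) / len(answer_tokens)


grounded = toy_faithfulness(
    "the capital of France is Paris",
    ["Paris is the capital and largest city of France"],
)
ungrounded = toy_faithfulness("the capital of France is Lyon", ["Paris is nice"])
```

Here `grounded` scores higher than `ungrounded`, because more of the first answer is supported by its retrieved context. Automating thousands of such checks per pipeline run is the "boring part of the job" that evaluation tooling takes off developers' hands.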

As Ragas continues to evolve, Shahul emphasised the importance of their open-source strategy. “We want to build something that becomes the standard for evaluation in LLM applications. Our vision is that when someone thinks about evaluation, they think of Ragas.”

While speaking with AIM in 2022, Kochi-based Shahul, who happens to be a Kaggle Grandmaster, revealed that he used to miss classes and spend his time Kaggling. 

The Love for Developers

“YC was a game-changer for us,” Shahul noted. “Being in San Francisco allowed us to learn from some of the best in the industry. We understood what it takes to build a successful product and the frameworks needed to scale.”

Despite their global ambitions, Ragas remains deeply rooted in India. “Most of our hires are in Bengaluru,” Shahul said. “We have a strong network here and are committed to providing opportunities to high-quality engineers who may not have access to state-of-the-art projects.”

“We have been working on AI and ML since college,” Shahul said. “After graduating, we worked in remote startups for three years, contributing to open source projects. In 2023, we decided to experiment and see if we could build something of our own. That’s when we quit our jobs and started Ragas.”

Looking ahead, Ragas is planning to expand its product offerings to cover a broader range of LLM applications. “We’re very excited about our upcoming release, v0.2. It’s about expanding beyond RAG systems to include more complex applications, like agent tool-based calls,” Shahul shared.

Shahul and his team are focused on building a solution that not only meets the current needs of developers and enterprises, but also anticipates the challenges of the future. “We are building something that developers love, and that’s our core philosophy,” Shahul concluded.

The post This YC-Backed Bengaluru AI Startup is Powering AWS, Microsoft, Databricks, and Moody’s with 5 Mn Monthly Evaluations appeared first on AIM.

Former Nutanix Founder’s AI Unicorn is Changing the World of CRM and Product Development https://analyticsindiamag.com/ai-breakthroughs/former-nutanix-founders-ai-unicorn-is-changing-the-world-of-crm-and-product-development/ Mon, 19 Aug 2024 10:56:33 +0000 https://analyticsindiamag.com/?p=10132923


DevRev, founded by former Nutanix co-founder Dheeraj Pandey and former SVP of engineering at the company, Manoj Agarwal, is an AI-native platform unifying customer support and product development. It recently achieved unicorn status with a $100.8 million series A funding round, bringing its valuation to $1.15 billion.

Backed by major investors, such as Khosla Ventures, Mayfield Fund, and Param Hansa Values, the company is on the road to proving the ‘AI bubble’ conversation wrong. “Right now, there’s a lot of talk in the industry about AI and machine learning, but what we’re doing at DevRev isn’t something that can be easily replicated,” said Agarwal in an exclusive interview with AIM.

Agarwal emphasised the unique challenge of integrating AI into existing workflows, a problem that DevRev is tackling head-on. Databricks recently announced a public preview of LakeFlow Connect for SQL Server, Salesforce, and Workday; DevRev is on a similar data-unification journey, but with AI at its core from the start, which the company believes makes it hard to replicate.

DevRev’s AgentOS platform is built around a powerful knowledge graph, which organises data from various sources—such as customer support, product development, and internal communications—into a single, unified system with automatic RAG pipelines. 

This allows users to visualise and interact with the data from multiple perspectives, whether they are looking at it from the product side, the customer side, or the people side.

The Knowledge Graph Approach

Machines don’t understand the boundaries between departments. The more data you provide, the better they perform. “Could you really bring the data into one system, and could you arrange this data in a way that people can visually do well?” asked Agarwal. 

The Knowledge Graph does precisely that – offering a comprehensive view of an organisation’s data, which can then be leveraged for search, analytics, and workflow automation.

Agarwal describes the DevRev platform as being built on three foundational pillars: advanced search capabilities, seamless workflow automation, and robust analytics and reporting tools. “Search, not just keyword-based search, but also semantic search,” he noted.
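The distinction Agarwal draws can be seen in a small sketch (illustrative only, not DevRev's implementation): a keyword-style bag-of-words search surfaces tickets that share words with the query, but scores a ticket describing the same issue in different words at zero — exactly the gap that embedding-based semantic search closes:

```python
import math
from collections import Counter


def vectorize(text: str) -> Counter:
    # Bag-of-words vector: purely lexical, no understanding of synonyms.
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    dot = sum(count * b[token] for token, count in a.items())
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0


tickets = {
    "TKT-1": "customer reports login crash on mobile",
    "TKT-2": "feature request dark mode for dashboard",
    "TKT-3": "app crashes when signing in from a phone",  # same issue, different words
}

query = vectorize("login crash mobile")
scores = {tid: cosine(query, vectorize(text)) for tid, text in tickets.items()}
# Keyword matching finds TKT-1 but gives the semantically identical TKT-3 a zero score.
```

Semantic search replaces the lexical vectors with learned embeddings, so "crashes when signing in from a phone" lands near "login crash on mobile" in vector space and both tickets are retrieved.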

On top of these foundational elements, DevRev has developed a suite of applications tailored to specific use cases, such as customer support, software development, and product management. These apps are designed to work seamlessly with the platform’s AI agents, which can be programmed to perform end-to-end tasks, further enhancing productivity.

“The AI knowledge graph is the hardest thing to get right,” admitted Agarwal, pointing to the challenges of syncing data from multiple systems and keeping it updated in real-time. However, DevRev has managed to overcome these hurdles, enabling organisations to bring their data into a single platform where it can be organised, analysed, and acted upon.

The Open Approach

The company’s focus on AI is not new. In fact, DevRev’s journey began in 2020, long before the current wave of AI hype. “In 2020, when we wrote our first paper about DevRev, it had GPT all over it,” Agarwal recalls, referring to the early adoption of AI within the company. 

Even today, DevRev primarily uses OpenAI’s enterprise version but also works closely with other AI providers like AWS and Anthropic. In 2021, the platform organised a hackathon where OpenAI provided exclusive access to GPT-3 for all the participants. 

This forward-thinking approach allowed DevRev to build a tech stack that was ready to leverage the latest advancements in AI, including the use of vector databases, which were not widely available at the time.

One of the biggest challenges that DevRev addresses is the outdated nature of many systems of record in use today. Whether it’s in customer support, CRM, or product management, these legacy systems are often ill-equipped to handle the demands of modern businesses, particularly when it comes to integrating AI and machine learning.

Not a Bubble

DevRev’s architecture is designed with flexibility in mind, allowing enterprises to bring their own AI models or use the company’s built-in solutions. “One of the core philosophies we made from the very beginning is that everything we do inside DevRev will have API webhooks that we expose to the outside world,” Agarwal explained. 

As DevRev reaches its unicorn status, Agarwal acknowledges the growing concerns about an “AI bubble” similar to the dot-com bubble of the late 1990s. “There’s so many companies that just have a website and a company,” he said, drawing parallels between the two eras. 

However, he believes that while there may be some hype, the underlying technology is real and here to stay. “I don’t think that anybody is saying that after the internet, this thing is not real. This thing is real,” Agarwal asserted. 

The key, he argues, is to distinguish between companies that are merely riding the AI wave and those that are genuinely innovating and solving real problems. DevRev, with its deep investment in AI and its unique approach to integrating it into enterprise workflows, clearly falls into the latter category.

The post Former Nutanix Founder’s AI Unicorn is Changing the World of CRM and Product Development appeared first on AIM.

Accel’s Prayank Swaroop on Navigating Challenges and Data Moats in Indian AI Startup Investing https://analyticsindiamag.com/intellectual-ai-discussions/accels-prayank-swaroop-on-navigating-challenges-and-data-moats-in-indian-ai-startup-investing/ Mon, 19 Aug 2024 05:28:28 +0000 https://analyticsindiamag.com/?p=10132881


As pioneers in the startup VC ecosystem, Accel (formerly known as Accel Partners), with over four decades of experience, entered the Indian market in 2008. They placed their initial bets on a nascent e-commerce company poised to compete with Amazon. 

In 2008, Accel India invested $800,000 in seed capital into Flipkart, followed by $100 million in subsequent rounds. The VC firm went on to back some of today’s most successful ventures, including AI startups. “We’ve invested in 27 [AI] companies in the last couple of years, which basically means we believe these 27 companies will be worth five to ten billion dollars [in the future],” said Prayank Swaroop, partner at Accel, in an exclusive interaction with AIM. 

Swaroop, who joined Accel in 2011, focuses on cybersecurity, developer tools, marketplaces, and SaaS investments, and has invested in companies such as Aavenir, Bizongo, Maverix, and Zetwerk. Having placed careful bets in the AI startup space, he continues to be optimistic, yet wary, about the Indian ecosystem. 

Swaroop observed that while the Indian ecosystem has impressive companies, not all can achieve significant scale. He mentioned that they encounter companies that reach $5 to $10 million in revenue quickly, but they don’t believe those companies can grow to $400 to $500 million, so they choose not to invest in them.

Swaroop told AIM that Accel doesn’t have any kind of capital constraints and can support as many startups as possible. However, their focus is on startups with the ambition to grow into $5 to $10 billion companies, rather than those aiming for $100 million. “I think that is our ambition,” he said. 

Accel has also been clear that it has no inhibitions about investing in wrapper-based AI companies. The firm believes that as long as a startup can prove it will find customers by building GPT or AI wrappers on top of other products, that is fine.  

“The majority of people can start with a wrapper and then, over a period of time, build the complexity of having their own model. You don’t need to do it from day one,” said Swaroop.

However, he also pointed out that for a research-led foundational model, it’s crucial to stand out, and that one cannot just create a GPT wrapper and claim it’s a new innovation.

Accel has invested in a diversified portfolio including food delivery company Swiggy, SaaS giant Freshworks, fitness company Cult.fit, and insurance tech firm Acko. India is Accel’s second-largest market by number of investments, with a total of 218 companies, behind only the United States with 572. In 2022, the market value of Accel’s portfolio was over $100 billion.

Accelerating AI Startups

Accel has a dedicated programme called Accel Atoms AI that looks to invest in promising AI-focused startups across early stages. The cohort of startups will be funded and supported by Accel partners and founders to help them grow faster. 

Selected startups in Accel Atoms 3.0 received up to $500k in funding, cloud service credits, including $100,000 for AWS, $150,000 for Microsoft Azure, $250,000 for Google Cloud Platform, GitHub credits, and other perks. The latest edition, Atoms 4.0, is expected to begin in a couple of months.

While these programmes are in place, Accel has been following a particular investment philosophy for AI startups. 

Accel’s Investment Philosophy

Accel’s investment philosophy for AI startups entails a number of key criteria, including the type of team. “It’s a cliched thing in VC, but we definitely look at the team,” said Swaroop, adding that teams need to have an appreciation of AI.

He emphasised that teams must embrace AI, and be willing to dive into research and seek help when needed, demonstrating both a commitment to learning and effective communication.

Accel also focuses on startups that solve real problems. Swaroop believes that founders should clearly identify their customers and show how their solution can generate significant revenue.

“We get team members who are solving great things, and we realise they are solving great things, but they can’t say that. When they can’t say that, they can’t raise funding. Basically, are you a good storyteller about it?” he explained.

Revenue Growth Struggles  

Swaroop further explained how VCs are increasingly expecting AI startups to demonstrate rapid revenue growth. 

Unlike traditional deep tech companies that may take years to generate revenue, AI firms must show significant commercial traction within 12 to 18 months. He also stated that as VC investment in AI rises, startups without clear revenue paths face growing challenges in securing funding. 

“Even to the pre-seed companies, we say, ‘Hey, you need to show whatever money you have before your next fundraiser. You need to start showing proof that customers are using you.’ Because so many other AI companies exist,” he said. 

Swaroop also highlighted how investment behaviour for AI startups has changed over the last year where investors are now asking the hard questions.

VCs Obsess Over Data Moats

Speaking about what differentiates an AI startup and its moat, Swaroop highlighted how the quality of datasets may be a deciding factor, though “not so much” in the case of Indic datasets.

“I don’t think language datasets can be a moat, because everybody understands language. Recently, in the Bhashini project, IISc gave out 16,000 hours of audio, so it is democratic data. Everybody owns it, so what’s proprietary in it for you?” asked Swaroop.  

Proprietary datasets, such as those in healthcare or specialised fields, are valuable due to their complexity and the effort required to create them. “I think startups should pick and choose an area where they have uniqueness of data, where they will have proprietary data which is different from just democratic data. That’s the broad thing,” said Swaroop.

Irrespective of the moat, India continues to be a great market with multiple opportunities for investment. In fact, at a recent Accel summit, Swaroop jokingly mentioned that he did not invest in Zomato during its early stage, but that he has no regrets. Interestingly, Accel has invested heavily in Zomato’s competitor, Swiggy.

“I think the first thing you have to let go of as a VC is FOMO, the fear of missing out, that’s why I could not think of a company that I regret not investing in, because, my belief is that India is a great market. Smart founders come and keep on coming. We’ll have enough opportunities to invest in,” concluded Swaroop, excited to meet the next generation of founders working in the AI startup ecosystem. 

The post Accel’s Prayank Swaroop on Navigating Challenges and Data Moats in Indian AI Startup Investing appeared first on AIM.

Bengaluru’s Control One is Bringing Forklifts to Life with Physical AI Agents https://analyticsindiamag.com/ai-insights-analysis/bengalurus-control-one-is-bringing-forklifts-to-life-with-physical-ai-agents/ Wed, 07 Aug 2024 09:40:10 +0000 https://analyticsindiamag.com/?p=10131741


Within India’s AI landscape, where startups receive minimal funding, it is hard to break into the sector, especially robotics. But backed by the power of jugaad, a team of experts from Tesla, and advisors from India, Bengaluru-based Control One has taken up the task of building the country’s next Boston Dynamics.

The key to Control One’s innovation lies in its focus on vision-based Visual SLAM (Simultaneous Localisation and Mapping) systems instead of traditional LiDAR technology. “Using vision, we try to make the machine understand its surroundings and navigate accordingly,” said Pranavan S, the founder and CEO, in an interaction with AIM.

Recently, this one-year-old startup unveiled India’s first physical AI agent for the global market, installed on a forklift. With the integration, the forklift can interact with its surroundings and execute tasks based on simple voice inputs. The company calls it a 3-brain system, where operators can manage the AI’s decisions remotely in real time.

Visual SLAM, or vSLAM, helps the robot navigate using reference points in the environment, just like humans do. Pranavan said what separates Control One from every other robotics company in India is the focus on building an operating system for robotics. 

Currently, the forklift operates only on the ground but Pranavan said they aim to operate it in three directions, which includes picking up objects and placing them.

Founded in 2023, Control One attracted investment in April, raising $350k from industry leaders like iRobot co-founder Helen Greiner, CRED founder Kunal Shah, and executives from Tesla, Walmart, and General Electric. 

Last year, NVIDIA Research unveiled Eureka, an AI agent that automatically generates algorithms for training robots using GPT-4 prompts and reinforcement learning. It seems like that is the future when physical AI agents will be everywhere.

OS for Robots

But when it comes to Control One, “We are building an OS that can understand the environment around it, akin to human behaviour,” explained Pranavan. “For example, we humans navigate dynamically, even in crowded and ever-changing environments, using our vision and depth perception to take predictive actions. This is what we aim to achieve with our machines.”

Control One’s OS, dubbed ‘One AI’ can be seamlessly integrated into existing warehouse equipment like pallet movers and forklifts. These systems autonomously navigate and improve performance over time, setting a new standard for intelligent material movement. 

The machines are currently operated using remotes; Control One aims to introduce natural-language prompts in the future, though regulations around this remain an open question. 

Pranavan elaborated on the advantages of vision systems over LiDAR. “LiDAR can detect obstacles, but it doesn’t understand what those obstacles are or their movement dynamics. vSLAM, on the other hand, can analyse pixel vectors to determine if an object is approaching, moving away, or crossing paths, allowing the robot to make more informed decisions.”
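A toy version of that pixel-level reasoning (illustrative only, not Control One's system): tracking how an object's bounding-box area changes across frames gives a crude signal for whether it is approaching, receding, or merely crossing:

```python
def classify_motion(box_areas: list[float], threshold: float = 0.05) -> str:
    """Classify object motion from the relative growth of its tracked
    bounding-box area across frames.

    Illustrative only: real vSLAM pipelines estimate full 3D motion from
    feature tracks and depth, not from box area alone.
    """
    if len(box_areas) < 2 or box_areas[0] <= 0:
        return "unknown"
    growth = (box_areas[-1] - box_areas[0]) / box_areas[0]
    if growth > threshold:
        return "approaching"   # box grows -> object closing in
    if growth < -threshold:
        return "receding"      # box shrinks -> object moving away
    return "crossing_or_static"


classify_motion([100.0, 110.0, 125.0])  # steadily growing box
classify_motion([120.0, 100.0, 80.0])   # shrinking box
```

A signal like this is what lets a vision-based robot take predictive action, such as slowing down for an object on a collision course, where LiDAR alone would only report an obstacle's current position.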

Despite their focus on vision systems, Control One acknowledges the complementary role of LiDAR for safety. “We use LiDARs to meet market safety standards, ensuring our robots can stop immediately if they get too close to an object,” said Pranavan. 

Though the company does most of its training in the real world, simulation accounts for around 30% of it. “The problem with simulation is that the AI would work in the settings you put it in, but when transferred into the real world, uncertainties such as uneven floor can also affect the physical agent’s performance,” said Pranavan.

The Boston Dynamics of India

While there are companies such as DiFACTO Robotics, Novus Hi-Tech, and Gridbots Robotics, Control One is focused on building the whole stack of a robot using physical AI agents.

The most interesting part is that all of this happens on-edge using NVIDIA A2000 GPUs. “We started with NVIDIA’s A5000 series but moved to the 2000 series to reduce power consumption,” said Pranavan. “Our next iteration will consume less than 80 watts and run entirely on battery, extending its operational life.”

This efficient use of power is crucial for Control One’s vision-based system, which demands significant computational resources. “Our current demo operates at 280 TOPS of AI computing power, plus it runs on an AMD processor. The next iteration will be more power-efficient while maintaining high performance.”

Pranavan, who previously co-founded SP Robotics Works, said frugal innovation has always been part of what Indian startups do, recalling how he figured out the workings of a computer with 128 MB of RAM in his younger days. 

He said that to compete with Boston Dynamics, they need at least $500 million in funding, but “given our capital efficiency and cost factors, we would be able to do with $100 million as well”.

Giving AI to Blue Collar Jobs

Control One is focusing on the warehousing sector, where there is a significant labour shortage and a high demand for automation solutions, especially in the global market. “The warehousing market is facing a huge labour crisis. Our system enables one person to manage multiple robots, effectively multiplying their productivity,” said Pranavan.

One AI is designed to manage entire fleets of robots, enabling remote operation and supervision. “Our ultimate goal is to allow one person in a remote location to manage an entire warehouse in another part of the world. OneAI will be the interface between humans and machines, helping operators make informed decisions.”

While building a humanoid may be on the cards in the future, Pranavan said that one doesn’t necessarily need a humanoid to operate machinery in factories. “We are operation system makers, not humanoid makers, yet,” said Pranavan, adding that he is looking to partner with OEMs, with several partnerships already under way.

The post Bengaluru’s Control One is Bringing Forklifts to Life with Physical AI Agents appeared first on AIM.

Devnagri is Building a Multilingual ‘Brain’ to Enable Companies Expand to Tier 2 & 3 Cities https://analyticsindiamag.com/intellectual-ai-discussions/devnagri-is-building-a-multilingual-brain-to-enable-companies-expand-to-tier-2-3-cities/ https://analyticsindiamag.com/intellectual-ai-discussions/devnagri-is-building-a-multilingual-brain-to-enable-companies-expand-to-tier-2-3-cities/#respond Sun, 04 Aug 2024 14:01:45 +0000 https://analyticsindiamag.com/?p=10131291

Devnagri's dataset is robust, comprising over 750 million data points across 22 Indian languages.

The post Devnagri is Building a Multilingual ‘Brain’ to Enable Companies Expand to Tier 2 & 3 Cities appeared first on AIM.

]]>

Hyperlocal content is becoming crucial for businesses to expand into tier two and tier three cities in India. Devnagri, a data-driven generative AI company, is paving the way by developing a solution, which they call the brain for Indian companies. 

Nakul Kundra, the co-founder and CEO, told AIM about the company’s moat in the era of Indic AI startups.

“Devnagri is dedicated to helping businesses move into new markets by providing hyperlocal content. Our machine translation capabilities enable businesses to transform their digital content into multiple languages, allowing them to engage with diverse customer bases,” Kundra explained.

Based in Noida and founded in 2021, Devnagri specialises in personalising business communication for non-English speakers. The company recently raised an undisclosed amount in a Pre-Series A round led by Inflection Point Ventures. These newly acquired funds will be used for marketing, sales, technology enhancement, R&D, infrastructure, and administrative expenses.

Devnagri leverages open-source LLMs, such as the latest Llama 3, integrating them with its existing dataset and proprietary translation engine for 22 languages. It tailors business communications for diverse linguistic audiences, seamlessly integrating its technology into both private and government infrastructures. 

“We built application layers on top of our machine translation engine,” Kundra elaborated. “These layers allow customers to upload documents, select languages, and even customise the content before translation. The system understands specific tones and terminologies, ensuring that the translated content aligns with the business’ communication style.”
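The application layer Kundra describes (upload a document, select languages, customise tone and terminology before translation) maps naturally onto a job specification assembled ahead of the engine run. The field names below are invented for illustration; Devnagri's real API will differ.

```python
def build_translation_job(document_text, target_languages, tone="formal", glossary=None):
    """Assemble a translation request of the kind Kundra describes:
    content is uploaded, target languages chosen, and tone and
    terminology pinned down before the engine runs."""
    if not target_languages:
        raise ValueError("at least one target language is required")
    return {
        "source": {"text": document_text, "language": "auto"},
        "targets": [{"language": lang} for lang in target_languages],
        "style": {
            "tone": tone,                # e.g. "formal" or "conversational"
            "glossary": glossary or {},  # brand terms to keep verbatim
        },
    }

job = build_translation_job("Welcome to our store", ["hi", "ta"], tone="conversational")
```

Keeping tone and glossary in the request is what lets the engine match "the business' communication style" rather than producing a generic translation.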

Similarly, New York-based RPA firm UiPath recently partnered with Bhashini. The focus of this collaboration is the integration of Bhashini’s language models with the UiPath Business Automation Platform. This will facilitate seamless translations of documents and other essential areas, specifically targeting Indian languages supported by Bhashini.

Companies such as CoRover.ai and Sarvam AI are building similar translation capabilities for businesses. Even big-tech companies, such as Microsoft and Google, are heavily focused on translation into Indic languages to cater to the Indian market. 

What’s the Moat Then?

However, Kundra said that Devnagri’s proprietary technology lies at the heart of this initiative and forms the company’s moat. “We’ve created our own machine translation capabilities from scratch,” Kundra said. “Businesses can use our APIs to integrate this technology directly into their platforms, localising content in real-time.”

Devnagri’s dataset is robust, comprising over 750 million data points across 22 Indian languages. “We initially built our models using a vast dataset, and recently we’ve incorporated SLMs and LLMs to enhance quality and address grey areas identified through customer feedback,” Kundra said. 

The goal is to create a single brain for businesses, integrating all touchpoints and datasets into a cohesive system that understands and responds in the desired tone.

“We adapt existing models and integrate them with our proprietary technology, ensuring high-quality multilingual capabilities,” Kundra added.

Collecting data for such a comprehensive system is no small feat. “Our data comes from multiple sources, including open-source dataset corpus, customer data, and synthetic datasets we create,” Kundra explained, saying that the introduction of new datasets from Bhashini also helps the company improve its models.

Devnagri’s multilingual capabilities extend beyond text to voice-based conversational bots. “We are developing multilingual bots that allow customers to interact in their preferred languages, whether it’s Marathi, Kannada, or any other language,” Kundra said, adding that they aim to reduce latency as much as possible.
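A bot that accommodates the caller's preferred language boils down to a session that re-routes on each detected language. A toy sketch with invented names; a real system would pair this with speech recognition and low-latency speech synthesis.

```python
RESPONSES = {
    "en": "Hello! How can I help you?",
    "hi": "नमस्ते! मैं आपकी कैसे मदद कर सकता हूँ?",
    "mr": "नमस्कार! मी तुमची कशी मदत करू शकते?",
    "kn": "ನಮಸ್ಕಾರ! ನಾನು ಹೇಗೆ ಸಹಾಯ ಮಾಡಬಹುದು?",
}

class MultilingualSession:
    """Replies in whichever supported language the caller last used."""

    def __init__(self, default="en"):
        self.language = default

    def reply(self, detected_language: str) -> str:
        # Switch mid-conversation the moment the caller changes
        # language; unsupported detections keep the previous one.
        if detected_language in RESPONSES:
            self.language = detected_language
        return RESPONSES[self.language]

session = MultilingualSession()
assert session.reply("mr").startswith("नमस्कार")
```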

The Road Ahead

When asked about Devnagri’s differentiating factor, Kundra emphasised their multilingual bots. “These bots are essential for companies operating pan-India. They handle calls in multiple languages, switching seamlessly to accommodate the caller’s preference, all with the lowest latency.”

Security and privacy are paramount, especially when dealing with government organisations and customers such as UNDP and Apollo, among several others. “All our modules are proprietary, enabling us to bundle and position them securely within enterprises or government agencies,” Kundra assured.

Devnagri’s journey has been remarkable, marked by notable milestones like their appearance on Shark Tank India 2022. “Multilingual conversation is the need of the hour, and our solutions aim to optimise costs and improve efficiency for enterprises.”

The firm has also received numerous prestigious awards, including the TieCon Award 2024 in San Francisco, the Graham Bell Award 2023, and recognition as NASSCOM’s Emerging NLP Startup of India.

“Our machine translation engine is a foundational model. It enables us to build conversational bots that understand and respond in multiple languages, tailored to specific business needs,” Kundra said.

As Devnagri looks to the future, their focus remains on building comprehensive AI solutions that cater to the diverse linguistic landscape of India. “We aim to create an ecosystem where businesses can thrive in any language, offering seamless multilingual interactions and superior customer experiences,” Kundra concluded.

The post Devnagri is Building a Multilingual ‘Brain’ to Enable Companies Expand to Tier 2 & 3 Cities appeared first on AIM.

]]>
https://analyticsindiamag.com/intellectual-ai-discussions/devnagri-is-building-a-multilingual-brain-to-enable-companies-expand-to-tier-2-3-cities/feed/ 0
Kuku FM is Using Generative AI to Make Everyone a Full-Stack Creative Producer https://analyticsindiamag.com/intellectual-ai-discussions/kuku-fm-is-using-generative-ai-to-make-everyone-a-full-stack-creative-producer/ https://analyticsindiamag.com/intellectual-ai-discussions/kuku-fm-is-using-generative-ai-to-make-everyone-a-full-stack-creative-producer/#respond Fri, 02 Aug 2024 06:30:00 +0000 https://analyticsindiamag.com/?p=10131210

"AI is going to be commoditised; everybody will have access to the tools. What will remain crucial is the talent pool you have – the storytellers."

The post Kuku FM is Using Generative AI to Make Everyone a Full-Stack Creative Producer appeared first on AIM.

]]>

Kuku FM, a popular audio content platform backed by Google and Nandan Nilekani’s Fundamentum Partnership, is harnessing the power of generative AI to revolutionise how stories are created, produced, and consumed. This transformation is spearheaded by Kunj Sanghvi, the VP of content at Kuku FM, who told AIM that generative AI is part of their everyday work and content creation.

“On the generative AI side, we are working pretty much on every layer of the process involved,” Sanghvi explained. “Right from adapting stories in the Indian context, to writing the script and dialogues, we are trying out AI to do all of these. Now, in different languages, we are at different levels of success, but in English, our entire process has moved to AI.”

Kuku FM is leveraging AI not just for content creation but for voice production as well. The company uses ElevenLabs, ChatGPT APIs, and other available offerings to produce voices directly.

“Dramatic voice is a particularly specific and difficult challenge, and long-form voice is also a difficult challenge. These are two things that most platforms working in this space haven’t been able to solve,” Sanghvi noted. 

As long-form content moves to generative AI, Kuku FM also uses it for thumbnail generation, visual asset generation, and description generation; Sanghvi said the team has custom GPTs for every process.

Compensating Artists

AI is playing a crucial role in ensuring high-quality outputs across various languages and formats. “In languages like Hindi and Tamil, the quality is decent, but for others like Telugu, Kannada, Malayalam, Bangla, and Marathi, the output quality is still poor,” said Sanghvi. 

However, the quality improves every week. “We put out a few episodes even in languages where we’re not happy with the quality to keep experimenting and improving,” Sanghvi added.

Beyond content creation, AI is helping Kuku FM in comprehensively generating and analysing metadata. “We have used AI to generate over 500 types of metadata on each of our content. AI itself identifies these attributes, and at an aggregate level, we can understand what makes certain content perform better than others,” he mentioned.
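Understanding "what makes certain content perform better than others" from hundreds of AI-generated attributes is, at heart, a group-and-compare pass over the metadata. The attribute names and numbers below are made up for illustration, not Kuku FM's data.

```python
from collections import defaultdict

def attribute_lift(episodes, attribute):
    """Average completion rate of episodes with vs. without an attribute."""
    buckets = defaultdict(list)
    for episode in episodes:
        buckets[attribute in episode["attributes"]].append(episode["completion_rate"])
    return {present: sum(rates) / len(rates) for present, rates in buckets.items()}

episodes = [
    {"attributes": {"cliffhanger", "romance"}, "completion_rate": 0.82},
    {"attributes": {"romance"}, "completion_rate": 0.55},
    {"attributes": {"cliffhanger"}, "completion_rate": 0.78},
]
lift = attribute_lift(episodes, "cliffhanger")
# episodes with a cliffhanger average 0.80 completion vs 0.55 without
```

Run once per attribute across the catalogue, a pass like this surfaces which of the 500 metadata types actually correlate with engagement.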

One of the most transformative aspects of Kuku FM’s use of AI is its impact on creators. The platform is in the process of empowering 5,000 creators to become full-stack creative producers. 

“As the generative AI tools become better, every individual is going to become a full-stack creator. They can make choices on the visuals, sounds, language, and copy, using AI as a co-pilot,” Sanghvi said. “We are training people to become creative producers who can own their content from start to end.”

When asked about the competitive landscape such as Amazon’s Audible or PocketFM, and future plans, Sanghvi emphasised that AI should not be viewed as a moat but as a platform. “Every company of our size, not just our immediate competition, will use AI as a great enabler. AI is going to be commoditised; everybody will have access to the tools. What will remain crucial is the talent pool you have – the storytellers,” he explained.

Everyone’s a Storyteller with AI

In a unique experiment blending generative AI tools, former OpenAI co-founder Andrej Karpathy used the Wall Street Journal’s front page to produce a music video on August 1, 2024. 

Karpathy copied the entire front page of the newspaper into Claude, which generated multiple scenes and provided visual descriptions for each. These descriptions were then fed into Ideogram AI, an image-generation tool, to create corresponding visuals. Next, the generated images were uploaded into RunwayML’s Gen 3 Alpha to make a 10-second video segment.

Sanghvi also touched upon the possibility of edge applications of AI, like generating audiobooks in one’s voice. “These are nice bells and whistles but are not scalable applications of AI. However, they can dial up engagement as fresh experiments,” he said.

Kuku FM is also venturing into new formats like video and comics, generated entirely through AI. He said that the team is not going for shoots or designing characters in studios. “Our in-house team works with AI to create unique content for video, tunes, and comics,” he revealed.

Sanghvi believes that Kuku FM is turning blockbuster storytelling into a science, making it more accessible and understandable. “The insights and structure of a story can now look like the structure of a product flow, thanks to AI,” Sanghvi remarked. 

“This democratises storytelling, making every individual a potential storyteller.” As Sanghvi aptly puts it, “The only job that will remain is that of a creative producer, finding fresh ways to engage audiences, as AI will always be biased towards the past.”

The post Kuku FM is Using Generative AI to Make Everyone a Full-Stack Creative Producer appeared first on AIM.

]]>
https://analyticsindiamag.com/intellectual-ai-discussions/kuku-fm-is-using-generative-ai-to-make-everyone-a-full-stack-creative-producer/feed/ 0
Futuristic Acer AI PCs Coming Soon in Indian Market https://analyticsindiamag.com/intellectual-ai-discussions/futuristic-acer-ai-pcs-coming-soon-in-indian-market/ https://analyticsindiamag.com/intellectual-ai-discussions/futuristic-acer-ai-pcs-coming-soon-in-indian-market/#respond Sun, 28 Jul 2024 08:37:46 +0000 https://analyticsindiamag.com/?p=10130428

So far, Acer has launched the Acer Swift 14 AI PCs, the TravelMate series, and its Predator Helios AI gaming laptops in India. 

The post Futuristic Acer AI PCs Coming Soon in Indian Market appeared first on AIM.

]]>

Around 241.8 million personal computers were sold worldwide in 2023. Despite this, it was the worst year on record for PC sales, with a nearly 15% decline compared to the previous year, according to Gartner.

In India too, last year, the PC market declined by 6.6%. Low market sentiments post-pandemic, supply chain constraints and geopolitical tensions contributed to the decline; but now PC makers are hoping generative AI could help alter their fate.

The top PC makers in the world have been quick to ship AI-powered PCs in most markets. Acer, which holds a relatively small share of the PC market, witnessed a 12.3% increase in sales in 2023, the highest among the leading PC makers.

Acer has already launched a series of AI PCs with built-in AI features in the Indian market.

New AI PCs Coming Soon to Indian Market 

In an interview with AIM, Sudhir Goel, chief business officer at Acer India, said, “At Computex 2024, we have showcased a lot of new products, which we are thrilled to introduce to the Indian market in the coming year.”

So far, Acer has launched the Acer Swift 14 AI PCs, the TravelMate series, and its Predator Helios AI gaming laptops in India.  

“With Swift AI, all functionalities, ranging from image enhancement to voice processing, are performed locally, thanks to the neural processing unit integrated within our laptops. This ensures an elevated level of privacy and security, paramount for individual and corporate users,” Goel said.

The TravelMate business PCs are equipped with sophisticated enterprise-grade AI security, and they will soon be available in India. These laptops feature advanced AI tools, including Acer LiveArt and the GIMP with Intel’s Stable Diffusion plugin. 

These laptops also leverage the NPU for AI-accelerated applications that blur the background, automatically frame the subject, and maintain eye contact during video conferencing. Moreover, AI will optimise power consumption during long conference calls.

“In the gaming sector, AI integration will redefine immersive gaming by enabling real-time map generation and dynamic creation of in-game elements based on live data. Additionally, we will introduce advanced AI-driven monitors to elevate the overall user experience,” Goel revealed.

The Predator Helios 16 laptops are already available in the Indian market. However, the true advantage of AI emerges when a model can run locally on the device. Given the gargantuan size of large language models (LLMs), most can only run in the cloud.

“With Acer’s cutting-edge Neural Processing Units (NPU), our laptops can handle LLM tasks directly on the device. As we look to the future, we envision local LLM capabilities becoming a significant differentiator in the market,” Goel revealed.

Acer Laptops Will have New AI Processors 

The Acer Swift AI PCs come with Qualcomm’s Snapdragon Elite processors. However, Acer plans to offer a range of new processors in its upcoming laptops. 

“While Snapdragon’s latest technology offers remarkable capabilities, we are also exploring options from Intel and AMD. Our strategy is to evaluate and incorporate processors from all these leading providers based on their strengths and innovations. This approach ensures that our laptops can cater to a wide range of needs, from exceptional performance and efficiency to specialised AI features,” Goel said. 

The TravelMate P6 14 laptop features Intel Core Ultra 7 processors with Intel vPro Enterprise, Intel Graphics, and Intel AI Boost.

Will AI PCs Boost the Market?

While Acer’s introduction of AI-capable PCs is impressive, other PC manufacturers have swiftly followed suit. Dell and HP have also released AI-powered PCs in the Indian market this year. Most recently, Microsoft unveiled its Surface AI PCs in India, featuring Snapdragon Elite Processors.

PC makers are hoping AI could help pull the market from the stalemate that it was last year. Research firm Canalys predicts that the PC market will see an 8% annual growth in 2024 as more AI PCs hit the market. Canalys also predicts AI PCs will capture 60% of the market by 2027. 

Goel also believes generative AI has the potential to significantly boost laptop sales. “Over the past few years, the PC industry has been striving to make devices more powerful, efficient, thinner, and lighter. However, it lacked a transformative technology that could truly revolutionise the market. With the advent of AI, this missing piece has finally arrived,” he said.

AI-driven workloads in PCs could enhance performance, enable new functionalities, and create a more seamless user experience. PC makers are desperately banking on this to happen.

Acer, Too, Turns to Server Business and Consumer Electronics

Earlier this year, Acer also launched its consumer electronics and home appliances brand, Acer Pure in India. When we asked Goel whether it is a result of declining PC sales, he said, “Acer India is the fastest-growing PC brand in India, and we have seen remarkable YoY growth for our PC business with 2X growth in the consumer market and market leadership in some of the commercial segments.”

He stressed that PCs remain Acer’s core business. Interestingly, Acer also launched its server business in India a few years back, a segment dominated by Dell, HP, and Lenovo, other PC brands Acer competes with.

Called Altos Computing, it caters to the growing demand for high-performance servers and workstations in India’s digital infrastructure landscape. 

“It includes introducing AI-powered solutions to support local cloud and data storage initiatives, which are crucial for governmental and corporate digital transformation priorities,” Goel concluded.

The post Futuristic Acer AI PCs Coming Soon in Indian Market appeared first on AIM.

]]>
https://analyticsindiamag.com/intellectual-ai-discussions/futuristic-acer-ai-pcs-coming-soon-in-indian-market/feed/ 0
Okta Could Make Billions More by Providing Identity to AI Agents https://analyticsindiamag.com/ai-origins-evolution/okta-could-gain-billions-more-by-providing-identity-to-ai-agents/ https://analyticsindiamag.com/ai-origins-evolution/okta-could-gain-billions-more-by-providing-identity-to-ai-agents/#respond Wed, 24 Jul 2024 08:22:00 +0000 https://analyticsindiamag.com/?p=10130050

Okta could sell subscriptions of its identity software for AI agents

The post Okta Could Make Billions More by Providing Identity to AI Agents appeared first on AIM.

]]>

Okta, a leader in the identity and access management (IAM) market, makes a significant portion of its billion-dollar revenue by selling multi-year subscriptions for its software.

The San Francisco-based company, which serves over 19,000 customers globally, made $2.12 billion in revenue last year, out of which around $1.79 billion came from subscriptions alone.

Now, the company sees a significant opportunity in AI that could potentially double its revenue – it wants to provide identity to AI agents. 

In an interaction with AIM during his recent visit to India, Eugenio Pace, Okta’s president, business operations, said that in the future, AI agents will carry out actions on behalf of us humans. 

“AI agents will make reservations for you and book hotel or flight tickets. But what happens when an AI agent makes a wrong payment?” Pace asked.

Okta Could Provide Identity to Billions of AI Agents 

In today’s interconnected world, distinguishing whether an action was undertaken by you or an AI agent can be challenging. Imagine a future where every person with a smartphone has their own personal AI assistant, interacting with one another and performing various tasks simultaneously.

By 2025, the world is projected to have 7.4 billion smartphone users, which could potentially mean 7.4 billion AI agents. These AI agents need not be limited to smartphones; they could be accessible through your PC or a device like Amazon’s Alexa.

“You can envision a world where I delegate certain powers to an agent to act on my behalf from an identity standpoint. However, we would set clear limits, specifying that my identity can only be used, for example, to pay specific accounts up to certain limits. 

“Okta could play a crucial role in managing identities as we transition into a future where AI agents handle various tasks,’’ Pace said.
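The delegation Pace sketches, where an agent may only pay specific accounts up to set limits, amounts to a scoped permission check at payment time. A minimal sketch with invented field names, not Okta's API:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Delegation:
    """Limited authority a human grants to an AI agent."""
    principal: str             # the person the agent acts for
    allowed_payees: frozenset  # accounts the agent may pay
    per_payment_limit: float   # cap on any single payment

def authorize_payment(grant: Delegation, payee: str, amount: float) -> bool:
    # Deny anything outside the explicitly delegated scope.
    return payee in grant.allowed_payees and 0 < amount <= grant.per_payment_limit

grant = Delegation("alice", frozenset({"hotel-booking-co"}), 500.0)
assert authorize_payment(grant, "hotel-booking-co", 320.0)
assert not authorize_payment(grant, "unknown-merchant", 10.0)
```

Making the grant immutable (`frozen=True`) mirrors the idea that the human, not the agent, sets the limits: a misbehaving agent cannot widen its own scope.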

AI Agents Could Deepen Okta’s Pockets 

Now, picture Okta selling multi-year subscriptions for its Okta Identity Cloud platform and services for a billion AI agents—this could significantly boost its revenue.

“We sell software based on the number of users who leverage the software. Most enterprise SaaS software operates under this user-based pricing model. Until now, this has been straightforward: if a company has 1,000 employees, they pay for 1,000 users. 

However, if each employee now has their own AI assistants, theoretically, the user count doubles. “This presents a significant challenge and an intriguing evolution in our industry, not only from a security perspective but also in terms of our business models,” Pace pointed out.

Nonetheless, he also stresses that this is a very forward-looking statement. While AI agents are expected to be the next iteration of the generative AI lifecycle, we are yet to see widespread deployment and integration.

Okta AI 

Last year, at its flagship event, Oktane, the company announced a host of generative AI features for its customers. Pace said Okta AI includes capabilities to predict suspicious logins or activities that could potentially be fake or from bad actors.

“In its simplest use case, we provide an authentication service that acts as an intermediary between a user and an application. When a user attempts to log in, they go through our service. Being positioned in the middle of this transaction allows us to assess how it unfolds,” Pace said.

To validate this, Okta utilises generative AI, which analyses over 60 different signals, such as device type, time, IP address, location, and the specific application involved, to verify the authenticity of each transaction before permitting it to proceed.
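A signal-based check like this can be thought of as scoring each login against the user's usual behaviour. The specific signals, weights, and sets below are illustrative assumptions, not Okta's actual model, which weighs over 60 signals:

```python
def login_risk_score(signals, baseline):
    """Sum weighted anomalies across login signals relative to a
    user's usual devices, IPs, locations, hours, and apps."""
    weights = {"device": 0.3, "ip": 0.2, "location": 0.3, "hour": 0.1, "app": 0.1}
    score = 0.0
    for signal, weight in weights.items():
        if signals.get(signal) not in baseline.get(signal, set()):
            score += weight  # unfamiliar value for this signal
    return round(score, 2)

baseline = {
    "device": {"laptop-1"},
    "ip": {"203.0.113.7"},
    "location": {"Bengaluru"},
    "hour": set(range(8, 20)),   # usual working hours
    "app": {"mail"},
}
risky = login_risk_score(
    {"device": "unknown", "ip": "198.51.100.9",
     "location": "Bengaluru", "hour": 3, "app": "mail"},
    baseline,
)
# device, IP and hour are unfamiliar: 0.3 + 0.2 + 0.1 = 0.6
```

A service sitting in the middle of the transaction would compare such a score against a threshold before permitting the login to proceed.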

In 2023, Okta admitted that a hacker managed to steal all of its customers’ data by using a stolen credential to access its support case management system.

Interestingly, Okta, which provides identity solutions to OpenAI, also leverages generative AI to provide in-product guidance on system configuration. 

“We observe how users configure their systems and understand their intent. Based on this behaviour, we can make recommendations, such as enabling certain features or disabling others, tailored to their usage patterns. 

“Essentially, it functions like an automated consultant within the browser, offering personalised advice to optimise user experience,” Pace added.

Okta in India 

Okta, with approximately 20,000 customers worldwide and handling 20 billion logins every month, established its first office in Bengaluru, India, last year. Since then, the company has grown to employ 300 staff members.

The company plans to further strengthen its presence in the country and attract more customers. It has offices in Tokyo, Osaka, Singapore, Amsterdam, Dublin, Frankfurt, Manila, Munich, Sydney, Seoul, Stockholm, and Toronto.

According to Pace, the company’s growing presence in India is one of its biggest growth investments.  

“We have a robust partner ecosystem in India, which means we don’t just directly sell to our customers, but we also do so through our partner ecosystem. While the US is a more mature market since that’s where we had our origin, India is a fast-growing market and will become a big market for Okta in the years to come,” he concluded.

The post Okta Could Make Billions More by Providing Identity to AI Agents appeared first on AIM.

]]>
https://analyticsindiamag.com/ai-origins-evolution/okta-could-gain-billions-more-by-providing-identity-to-ai-agents/feed/ 0
India’s Beatoven.ai Shows the World How AI Music Generation is Done Right https://analyticsindiamag.com/ai-origins-evolution/indias-beatoven-ai-shows-the-world-how-ai-music-generation-is-done-right/ https://analyticsindiamag.com/ai-origins-evolution/indias-beatoven-ai-shows-the-world-how-ai-music-generation-is-done-right/#respond Mon, 22 Jul 2024 08:46:56 +0000 https://analyticsindiamag.com/?p=10129783

Within a year, Beatoven.ai amassed more than 100,000 data samples, which were all proprietary for them.

The post India’s Beatoven.ai Shows the World How AI Music Generation is Done Right appeared first on AIM.

]]>

AI music generation is a tricky business. Amidst copyright claims and the need for fairly compensating artists, it becomes an uphill task for AI startups, such as Suno.ai or Udio AI, to gain revenue and popularity. 

However, Beatoven.ai, an Indian AI music startup, has gotten the hang of it in the most ethical and responsible way possible.

One of the most important reasons is that its co-founder and CEO, Mansoor Rahimat Khan, is a professional sitar player himself and comes from a family of musicians going back seven generations. “I was very fascinated by this field of music tech,” he said. 

Khan told AIM that he started his journey at IIT Bombay and realised that though there were not many opportunities in India, he wanted to combine his passion for music and technology. 

Beatoven.ai is part of the JioGenNext 2024, Google for Startups Accelerator, and AWS ML Elevate 2023 programs. Khan said that the team applied to many accelerator programs because they realised they needed a lot of compute to fulfil the goal of building an AI music generator. 

The company raised $1.3 million in its pre-series A round led by Entrepreneur First and Capital 2B, with a total funding of $2.42 million.

After switching several jobs, Khan met Siddharth Bhardwaj and, building on their shared passion for music and tech, founded Beatoven.ai in 2021. “After coming back from Georgia Tech, I got involved in the startup ecosystem, and started working with ToneTag, an audio tech startup funded by Amazon,” said Khan. 

Everyone Needs Background Music in their Life

The co-founders found that the biggest market was in generating soundtracks for indie game developers, agencies, and production houses. “But when we look at the nitty-gritty of the industry, copyrights are a very scary thing. We thought that generative AI could be a solution to this.” Khan said the idea was to figure out how users could give simple prompts and generate audio.

Mansoor Rahimat Khan with Lucky Ali

The initial idea was to create a simple consumer-focused UI where users could select a genre, mood, and duration to generate a soundtrack. But that was before the LLM era, when NLP wasn’t good enough for such tasks. “We started in 2021 before the LLM era, and our venture capital came from Entrepreneur First. We raised a million dollars in 2021 and quickly built our technology from scratch.”

The biggest challenge like every other AI company was the collection of data. “You either partnered with the labels that charged huge licensing fees or scraped [data]. That was the only other option. But if you did that, you would be sued,” said Khan.

All of the Tech

This is where Beatoven.ai takes the edge over other products in the market. Khan and his team started contacting small artists, and gradually bigger ones, to create partnerships and source their own data. The company had a head start, as no one was talking about this field back then. Within a year, it amassed more than 100,000 data samples, all of them proprietary.

During the initial days, Beatoven.ai did not use transformers, which, Khan said, is one reason the quality was not that great. Later, when diffusion models came into the picture, the team realised that this was the way forward for AI-based music generation. 

The company started by using different models for different purposes, including the ChatGPT API from OpenAI. The Beatoven.ai platform also uses CLAP (Contrastive Language-Audio Pretraining), a model that aligns text and audio representations. 

Apart from this, the company uses latent diffusion models like Stability AI’s Stable Audio, VAE models, and AudioLM for tasks such as generating individual instruments within the music. It then uses an ensemble model to mix these individual audio tracks together. 
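The ensemble stage's job, combining per-instrument outputs into one track, reduces at its simplest to summing aligned sample streams and guarding against clipping. A toy sketch in plain Python (a real pipeline would operate on audio tensors, not lists):

```python
def mix_stems(stems, gains=None):
    """Sum per-instrument sample streams into one track, then apply
    peak normalisation if the mix would clip beyond [-1.0, 1.0]."""
    if gains is None:
        gains = [1.0] * len(stems)
    length = max(len(stem) for stem in stems)
    mix = [0.0] * length
    for stem, gain in zip(stems, gains):
        for i, sample in enumerate(stem):
            mix[i] += gain * sample
    peak = max(abs(sample) for sample in mix)
    if peak > 1.0:
        mix = [sample / peak for sample in mix]  # scale back into range
    return mix

sitar = [0.5, 0.6, -0.4]
drums = [0.7, 0.6, -0.9]
track = mix_stems([sitar, drums])
# the raw sum peaks at 1.3, so the mix is scaled back into range
```

Per-stem gains are what let a mixing stage balance, say, a sitar against drums instead of summing them blindly.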

For inference, the company uses CPUs (instead of GPUs), which keeps it fast and optimised, while reducing costs. 

Trained Fairly

Khan admitted that the audio files generated by Suno.ai have superior quality right now, but they also use diffusion models, which makes them a little slow. “The quality is significantly better from where we started, but it’s not quite there yet.” Khan added that Beatoven.ai’s speed is currently high because the company uses different models for different tasks.

To further expand the data, Beatoven.ai started partnering with several outlets such as Rolling Stone and packaged it like a creator fund. In January 2023, it announced a $50,000 fund for indie music as part of the Humans of Beatoven.ai program to expand its catalogue. 

This gave Beatoven.ai a lot of popularity and many artists wanted to partner with the team. Khan said that the company aims to do more licensing deals to expand music libraries. “When it comes to Indian labels though, they are not yet open to licensing deals,” said Khan. 

Beatoven.ai’s model is certified as Fairly Trained and also certified by AI for Music as an ethically trained AI model.

Apart from music generation, Beatoven.ai is launching Augment, similar to ElevenLabs’ voice generation model. This would allow agencies to connect to Beatoven.ai’s API and train on their own data to make remixes of their own music. In a demo, Khan showed how a simple sitar tune could be turned into a hip-hop remix. 

“You can just use your existing content and create new songs. That’s the idea,” he said.

Currently, Beatoven.ai is also testing a video-to-audio model using Google’s Gemini, where users can upload a video and the model would understand the context and generate music based on that. Khan showed a demo to AIM where the model could also be guided using text prompts for better quality audio generation. 

Not Everyone is a Musician

Khan envisions that in the near future, companies such as Spotify or YouTube will start open-sourcing their data and offering APIs to make the AI music industry a little more open.

Meanwhile, speaking with AIM, Udio’s co-founder Andrew Sanchez said, “It’s enabling for people who are just up and coming, who don’t yet have big professional careers, the resources, time or money to really invest in making a career. It’s enabling a whole new set of creators.” This would make everyone a musician.

When it comes to Beatoven.ai, Khan said he aims to head in a more B2B direction, as building a direct-to-consumer app does not make sense. “I don’t believe everybody wants to create music,” he added, noting that not everyone in the world is learning music. That is why the company is currently focused only on background music without vocals. 

The post India’s Beatoven.ai Shows the World How AI Music Generation is Done Right appeared first on AIM.

]]>
https://analyticsindiamag.com/ai-origins-evolution/indias-beatoven-ai-shows-the-world-how-ai-music-generation-is-done-right/feed/ 0
Google is Trying to Mimic ‘Google Pay’ for Transportation via Namma Yatri https://analyticsindiamag.com/intellectual-ai-discussions/google-is-trying-to-mimic-google-pay-for-transportation-via-namma-yatri/ https://analyticsindiamag.com/intellectual-ai-discussions/google-is-trying-to-mimic-google-pay-for-transportation-via-namma-yatri/#respond Fri, 19 Jul 2024 12:00:26 +0000 https://analyticsindiamag.com/?p=10129637

Earlier this week, Google invested in Moving Tech, the parent company of Namma Yatri and also lowered the prices for Google Maps AI for Indian developers. 

The post Google is Trying to Mimic ‘Google Pay’ for Transportation via Namma Yatri appeared first on AIM.

]]>

Back in 2017, Google made UPI and cashless transactions household terms with the launch of Google Pay in India. The app has over 67 million users in the country, its largest single market to date. 

Now, the company is trying to mimic the success of Google Pay to democratise transportation for all via Namma Yatri, the Bengaluru-based open-source rival of ride-hailing services like Ola and Uber.

Interestingly, earlier this week, Google invested in Moving Tech, the parent company of Namma Yatri. Around the same time, Namma Yatri users also came across new features in the service, including rentals and instant travel. 

Previously, in 2020, Google said it planned to invest $10 billion in India over the next five to seven years as the search giant looked to accelerate the adoption of digital services in the key overseas market.

Namma Yatri, initially a subsidiary of the payments company Juspay, became a separate entity called Moving Tech in April. CEO Magizhan Selvan and COO Shan M S, formerly with Juspay, now lead this new mobility business. 

Launched more than a year ago, it became a preferred service for many given its driver-centric approach with lower commission rates and a community-driven model tailored to local needs. Recently, it also expanded to multiple cities beyond Bengaluru, including Chennai and Delhi.

Google is doing the same with ONDC, positioning it as the UPI of ecommerce. At the recently held Google I/O Connect in Bengaluru, the company announced that it has lowered India-specific pricing for its Google Maps Platform APIs, offering up to 90% off on select map APIs for people building on top of ONDC.

Making Money Simply

The success of Google Pay in a price-sensitive market like India can be attributed to strategic initiatives, market understanding, and leveraging unique conditions. It leveraged the UPI infrastructure, an open API platform from the National Payments Corporation of India (NPCI), which enabled instant online bank transactions. This reduced entry barriers for new users and merchants. 

India’s cash-dependent economy, with many unbanked and underbanked people, was targeted by Google Pay through an easy-to-use digital payment solution requiring only a bank account linked to UPI. 

It solved a problem that Indian fintech players like PhonePe and Paytm had been struggling with for a long time. 

Google built an ecosystem in India by integrating various services like bill payments, mobile recharges, and peer-to-peer transactions, embedding itself in daily financial activities. In 2021, it expanded its network of banks offering card tokenisation on its app, adding SBI, IndusInd Bank, Federal Bank, and HSBC India as partners.

Now, even international tourists can access the app in India. 

Now They Look to Make Travel Simple

In India, Namma Yatri competes with the likes of Uber, Ola and Swiggy-backed Rapido. 

Google Maps, with an established customer base in India, has incorporated Gemini into it and has come up with new features like Lens in Maps, Live View navigation, and address descriptors, specifically for Indian users. 

However, earlier this week, Ola founder Bhavish Aggarwal decided to ditch Google Maps for the company’s own in-house Ola Maps. After moving Ola off Microsoft Azure last month, this is Aggarwal’s second move away from Western platforms. Yesterday, he reduced the pricing of the Ola Maps APIs even further to encourage other developers to build on them. 

“We’ve been using Western apps to map India for too long, and they don’t understand our unique challenges: street names, urban changes, complex traffic, non-standard roads, etc,” said Aggarwal.

Meanwhile, Rapido, Uber, Namma Yatri and other ride-hailing platforms continue to use Google Maps. Rapido, which started off as a bike taxi service, has expanded to include auto rickshaws, carpooling, taxi hailing, parcel delivery, and third-party logistics services in over 100 Indian cities.

Namma Yatri: The UPI of Transportation

“Imagine the platform similar to Namma Yatri being adopted in cabs, metros, or any platform which is serving the passengers,” Selvan told AIM, when Moving Tech was first launched. “We essentially want to become the UPI of transportation.”

Namma Yatri stood out for drivers since the app started off commission-free and now charges a basic fee of just INR 25 per day. In contrast, Ola and Uber take a 25-30% commission from drivers per ride. 

One Namma Yatri driver told AIM that the low subscription fee compared to Uber and Ola has helped him. Within six months of switching to Namma Yatri, he was able to fund his two children’s weddings.

The app has onboarded 49,000 auto drivers and 550,000 users in five months, with approximately INR 12 crores ($1.5 million) paid out to drivers. It celebrated 500 million downloads in March.

Although Namma Yatri currently lacks features like bike taxis and carpooling, with Google’s support it may soon expand to include these options. 

The founders said that Namma Yatri will leverage the new funds to grow its engineering and R&D competencies, and also include more types of transportation, including buses. 

On the other hand, Google has found an ideal partner to strengthen its presence in India’s transportation sector. Given its expertise in revolutionising online purchases, it is set to replicate the ‘Google Pay moment’ in Indian transportation soon.

The post Google is Trying to Mimic ‘Google Pay’ for Transportation via Namma Yatri appeared first on AIM.

]]>
https://analyticsindiamag.com/intellectual-ai-discussions/google-is-trying-to-mimic-google-pay-for-transportation-via-namma-yatri/feed/ 0
[Exclusive] BharatGPT’s Ganesh Ramakrishnan’s AI Startup bbsAI Tackles Limited Indic Data Challenge https://analyticsindiamag.com/ai-origins-evolution/exclusive-bharatgpts-ganesh-ramakrishnans-ai-startup-bbsai-tackles-limited-indic-data-challenge/ https://analyticsindiamag.com/ai-origins-evolution/exclusive-bharatgpts-ganesh-ramakrishnans-ai-startup-bbsai-tackles-limited-indic-data-challenge/#respond Thu, 18 Jul 2024 08:08:55 +0000 https://analyticsindiamag.com/?p=10129469

In May, the Department of Science & Technology (DST) announced the launch of a new hub dedicated to creating Indic language models. This new hub, BharatGPT, was created in collaboration with IIT Bombay, IIT Madras, IIT Hyderabad, IIIT Hyderabad, IIM Indore, and IIT Mandi.  The initiative aims to develop LLMs in Indian languages for India, […]

The post [Exclusive] BharatGPT’s Ganesh Ramakrishnan’s AI Startup bbsAI Tackles Limited Indic Data Challenge appeared first on AIM.

]]>

In May, the Department of Science & Technology (DST) announced the launch of a new hub dedicated to creating Indic language models. This new hub, BharatGPT, was created in collaboration with IIT Bombay, IIT Madras, IIT Hyderabad, IIIT Hyderabad, IIM Indore, and IIT Mandi. 

The initiative aims to develop LLMs in Indian languages for India, along with applications for Indian enterprises.

Apart from working on BharatGPT, Ganesh Ramakrishnan, a professor at IIT Bombay, has been dedicated to developing translation engines. To further this effort, he has co-founded bbsAI with Ganesh Arnaal, a venture that has been a decade in the making. 

“Arnaal approached me in 2013 with the idea of developing a translation engine to translate technical books from English into Hindi and other major Indian languages. Thus, the Udaan Translation Project was born,” Ramakrishnan recalled in an exclusive interaction with AIM.

At the recent Global INDIAai Summit 2024, Ramakrishnan discussed how AI can produce groundbreaking outcomes for real business applications in data-scarce environments. He underscored the significance of creating small language models and innovating algorithms. 

Emphasising human centricity and inclusive AI, Ramakrishnan added that making small language models for Indic languages addresses the challenge of limited data, enabling the delivery of dependable and practical solutions to the industry. This approach led to the founding of bbsAI.

Initially funded by Arnaal, bbsAI officially became a commercial entity in February 2023, entering a licence agreement with IIT Bombay for the commercial exploitation of the Udaan Translation Engine.

Flying with Udaan

The journey for bbsAI started with Udaan, which stands out in the crowded market of translation tools. Ramakrishnan explained, “Our engine is a result of training models that are probabilistic in nature, but we introduced technical dictionaries as constraints to overcome hallucinations and inaccuracies.”

This deterministic approach, powered by their own open-sourced data-efficient machine learning algorithms and grounded in extensive language resource research by Arnaal, ensures accurate and context-appropriate translations in scientific and technical fields. “The Udaan Translation Engine offers a comprehensive ecosystem: an OCR engine preserving the source document’s style and layout, a translation engine, and a user-friendly post-editing tool.”
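The dictionary-as-constraint idea described above can be sketched as a toy post-editing pass. This is purely illustrative and not bbsAI's actual engine: the dictionary entries, function name and flagging behaviour are all hypothetical, standing in for whatever constraint mechanism the real system uses.

```python
# Toy sketch of using a technical dictionary as a hard constraint on
# probabilistic machine-translation output. All entries are hypothetical.
TECH_DICT = {
    # English technical term -> mandated Hindi rendering
    "velocity": "वेग",
    "acceleration": "त्वरण",
}

def enforce_terminology(source: str, draft_translation: str) -> str:
    """If the source contains a dictionary term but the draft lacks the
    mandated rendering, flag the sentence for the post-editing tool
    instead of letting the model improvise."""
    out = draft_translation
    for en_term, hi_term in TECH_DICT.items():
        if en_term in source.lower() and hi_term not in out:
            out += f" [REVIEW: expected term '{hi_term}' for '{en_term}']"
    return out

print(enforce_terminology("The velocity increases.", "गति बढ़ती है।"))
```

A production engine would apply such constraints during decoding rather than after it, but the effect is the same: domain terminology becomes deterministic even though the underlying model is probabilistic.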

In 2022, Ramakrishnan and Arnaal met education minister Dharmendra Pradhan, who appreciated their dedication to building Udaan. “They have developed a translation tool— Udaan—that is breaking the language barrier in education by translating learning materials in Indian languages,” the minister tweeted.

(From left to right) Ganesh Ramakrishnan, education minister Dharmendra Pradhan, and Ganesh Arnaal

Revolutionising the Insurance Industry

Expanding its offerings to leverage its digitalisation and OCR capabilities, bbsAI has introduced a suite of AI-enhanced process automation solutions with potential use cases across industries. 

“As a natural extension of the machine learning capabilities we have built over the years, we have begun to offer process automation solutions by building small language models that can provide intelligent, accurate and inherently deterministic solutions to automate a variety of business processes,” Ramakrishnan elaborated. 

bbsAI developed an AI solution for ICICI Lombard’s quotation management system (QMS). “Our solution captures data from various file formats and populates it automatically into the templated underwriting formats, delivering productivity gains,” said Ramakrishnan. 

This solution is a global first in the insurance industry, achieving over 90% accuracy while adhering to strict data privacy regulations with limited datasets. “We have delivered a staggering accuracy of over 90% while completely eliminating hallucinations,” he emphasised.

Small Language Models and Explainability

Ramakrishnan explained bbsAI’s unique approach, which is built on small language models and explainability by design. 

“LLMs perform many tasks, but for business use-cases, explainability and reliability are crucial,” Ramakrishnan stressed. This focus on deterministic solutions has enabled bbsAI to create accurate, reliable, and explainable AI solutions, fostering greater industry adoption. 

“We integrate domain knowledge and cross-industry understanding as an integral part of the development process, not as an afterthought,” he added.

Moving Beyond POCs

One of bbsAI’s significant milestones is its transition from proof of concept (PoC) to real-world AI solutions. “The key is shifting from probabilistic to deterministic models, providing explainable and accurate solutions,” noted Ramakrishnan. 

This approach has not only inspired user confidence but has also demonstrated tangible benefits in efficiency and productivity for clients. “With our unique approach, we have successfully converted AI promises into products and solutions,” he asserted.

bbsAI’s journey from a visionary project to a trailblazer in business automation and translation technology is truly remarkable. “We at bbsAI are passionate about making technology available to all Indians,” added Ramakrishnan.

Bharat Bhasha Sanganan

At its core, bbsAI is driven by the vision of Bharat Bhasha Sanganan, meaning Indian language computing. “In India, only those who know English have privileged access to technology. If we look globally, most developed nations have access to technology in their native languages,” Ramakrishnan explained. 

bbsAI (which stands for Bharat Bhasha Sanganan AI) has taken significant steps to bridge this gap, starting with a complete Hindi user interface for LibreOffice, bbsहिन्दीoffice, and plans to extend this to other major Indian languages.

bbsAI has a natural synergy with the National Education Policy (NEP), which has catalysed higher learning through Indian languages, aligning perfectly with bbsAI’s mission. 

“From the academic year 2023-24, engineering and medicine are being taught in 11 Indian languages,” Ramakrishnan mentioned. This shift is expected to boost the demand for textbooks in Indian languages, making bbsAI a valuable partner for publishers and academic institutions. 

“We have been working on machine translation for technical domains for over a decade, ensuring the use of domain-specific vocabulary in our translations,” he concluded.

The post [Exclusive] BharatGPT’s Ganesh Ramakrishnan’s AI Startup bbsAI Tackles Limited Indic Data Challenge appeared first on AIM.

]]>
https://analyticsindiamag.com/ai-origins-evolution/exclusive-bharatgpts-ganesh-ramakrishnans-ai-startup-bbsai-tackles-limited-indic-data-challenge/feed/ 0
From a Small Town in Maharashtra to Silicon Valley: Aqsa Fulara’s Inspiring Journey with Google https://analyticsindiamag.com/intellectual-ai-discussions/from-a-small-town-in-maharashtra-to-silicon-valley-aqsa-fularas-inspiring-journey-with-google/ https://analyticsindiamag.com/intellectual-ai-discussions/from-a-small-town-in-maharashtra-to-silicon-valley-aqsa-fularas-inspiring-journey-with-google/#respond Tue, 16 Jul 2024 06:53:15 +0000 https://analyticsindiamag.com/?p=10129243

Fulara is responsible for scaling AI and ML products including Recommendations AI and now Meridian models.

The post From a Small Town in Maharashtra to Silicon Valley: Aqsa Fulara’s Inspiring Journey with Google appeared first on AIM.

]]>

Like many other women, Aqsa Fulara, an AI/ML product manager at Google since 2017, grew up in the small town of Sangli, Maharashtra, facing societal norms that often discouraged women from pursuing higher education far from home. 

“Coming from a community where moving out of my parent’s home to a hostel for higher education was frowned upon, I put all my energy towards getting into this prestigious engineering college in the same city,” Fulara told AIM in an exclusive interview.  

Her dedication paid off when she was admitted to Walchand College of Engineering, where she did her BTech in computer science and engineering. This academic achievement was just the beginning. 

Fulara’s passion for learning and her desire to push boundaries led her to the University of Southern California (USC), where she pursued a master’s degree in engineering management, and since then, “there was no looking back!” Fulara shared gleefully. 

“While my experiences in India provided me with a solid technical foundation and analytical approach to solving problems, my experiences at USC and Stanford focused a lot more on practical applications of cutting-edge technology,” she added. 

According to recent surveys, compared to other developing countries, fewer women in India reported being discouraged from pursuing scientific or technical fields (37% vs. 65%). The primary challenges faced by women students in India are high-stress levels (72%), difficulties in finding internships (66%), and a gap between their expectations and their current curriculum (66%).

Fulara’s path to AI and ML was not marked by a single dramatic moment but rather a gradual buildup of curiosity and fascination with technology. Her inclination towards solving problems and understanding complex systems drew her to this field. 

“That led me to my capstone project on behaviour recognition and predicting traffic congestion in large-scale in-person events and thus, building products for congestion management,” she added. 

Leadership Mantra: Building the Culture of Innovation

If you’re familiar with Google’s Vertex AI Search, you likely know about Recommendations AI. Now branded as Recommendations from Vertex AI Search, this service leverages state-of-the-art machine learning models to provide personalised, real-time shopping recommendations tailored to each user’s tastes and preferences. 

One of the key figures in scaling this product is Fulara, who has been instrumental in its growth since 2021. Fulara has also been the force behind the highly acclaimed products in Google Cloud’s Business Intelligence portfolio, such as Team Workspaces and Looker Studio Pro. 

Fulara considers Looker Studio one of her favourite projects. “Imagine having a personal data analyst assistant who can provide customised recommendations and help you make informed decisions,” she added. 

Having worked with Google for over seven years now, one thing that Fulara values most about the company is the freedom to explore and innovate. “Whether it’s pursuing a 20% project in a new domain, growing into a particular area of expertise, or participating in company-wide hackathons, Google provides much space for creativity and innovation,” she shared. 

This environment has allowed her to pivot her career towards product management, building on her AI experiences and focusing on delivering business value through customer-centric solutions.

Leading AI product development comes with its own set of challenges. “AI products have a larger degree of uncertainty and ambiguity, with challenges in terms of large upfront investment, uncertain returns, technical feasibility, and evolving regulations,” she explained. 

To manage these challenges, Fulara fosters a culture of experimentation and agility. “We release MVPs for testing far ahead of production cycles to rigorously test and benchmark on production data and user behaviours,” she added, allowing her team to make informed decisions even with incomplete information.

Fulara emphasises the importance of managing scope creep tightly and sharing outcome-based roadmaps upfront. “We’re solving for problem themes, not necessarily just churning out AI features,” she noted. This strategy helps maintain focus and adapt to changes quickly. 

Future of AI 

Looking ahead, Fulara sees generative AI, personalised recommendations, and data analytics as transformative forces in the coming decade, making data and insights more accessible and workflows more collaborative. 

AI and ML models are becoming increasingly pervasive, assisting in personalised shopping journeys, optimising marketing strategies, and improving data-driven decision-making across various industries.

Read more: Beyond Pride Month: True Allyship Needs No Calendar

The post From a Small Town in Maharashtra to Silicon Valley: Aqsa Fulara’s Inspiring Journey with Google appeared first on AIM.

]]>
https://analyticsindiamag.com/intellectual-ai-discussions/from-a-small-town-in-maharashtra-to-silicon-valley-aqsa-fularas-inspiring-journey-with-google/feed/ 0
Big-Tech Companies Push for Gender Sensitisation https://analyticsindiamag.com/intellectual-ai-discussions/big-tech-companies-push-for-gender-sensitisation/ https://analyticsindiamag.com/intellectual-ai-discussions/big-tech-companies-push-for-gender-sensitisation/#respond Fri, 12 Jul 2024 12:45:05 +0000 https://analyticsindiamag.com/?p=10126728

“Contrary to the common belief of tokenism and corporate rainbow-washing, I believe we’re moving in the right direction.” 

The post Big-Tech Companies Push for Gender Sensitisation appeared first on AIM.

]]>

Most AI products today are the result of data collected from diverse users, which can inherently contain biases. This calls for ramping up efforts to mitigate these biases rather than attempting to overfit the data to preconceived notions. 

“We are trying to build products used by billions of users, [and] working on important technologies that will have a deep impact on society,” said Google head Sundar Pichai in an interview, addressing the dire need for inclusivity and diversity in the workplace. 

“A diverse workforce will help the company develop better products, and more importantly, tackle problems in society better,” he added.

Sandeep Sharma, assistant manager, GlobalLogic India, and an openly queer individual, agrees. “Organisations can foster inclusivity by establishing ally network groups and providing platforms to drive LGBTQIA+ inclusion in the workplace. Additionally, sensitisation programs for employees can be transformative,” Sharma told AIM in a recent conversation. 

For someone not comfortable discussing his sexuality and attending Pride events just as an ally, Sharma has come a long way. 

Sandeep Sharma, Assistant Manager, GlobalLogic India

He is now actively involved with GlobalLogic’s DEI accelerator team, striving to create an inclusive environment for his LGBTQIA+ colleagues. He has been facilitating SOGIE (sexual orientation, gender identity, and gender expression) sensitisation sessions to help others understand the lives, experiences, and challenges faced by the LGBTQIA+ community better.

Previously, we have spoken about how allyship in the workplace helps familiarise people with the non-binary nature of gender. This is where gender sensitisation programs, employee resource groups and Pride networking groups come into the picture for corporations. 

Big Tech’s Efforts

Big techs like Google, Apple, Microsoft, Meta, and NVIDIA, among others, recognised the importance of such programs long ago. 

“Inclusive teams that propagate and advance inclusive principles will have the deepest impact in building products designed for everyone,” said Satya Nadella, the chief executive officer of Microsoft, in a previous interview. 

For example, Apple, whose CEO Tim Cook is openly gay and has often spoken about it, has had an LGBTQ-focused employee network called Pride@Apple since 1986. 

Similarly, Google established its first LGBTQ employee group, the Gayglers, over 20 years ago, which has had a massive impact on steering the company in a more LGBTQ-inclusive direction. Microsoft runs similar employee resource groups. 

GlobalLogic is also one such company fostering D&I through several initiatives to support LGBTQIA+ employees, such as inclusive hiring practices, support groups, and awareness campaigns. According to Sharma, these initiatives have significantly impacted the work environment, creating a culture of acceptance and respect. 

Originally from Delhi, Sharma, a computer science engineer, has worked as a tech product owner in various Indian cities over the past seven years.

AI to Solve Inclusivity?

“Contrary to the common belief of tokenism and corporate rainbow-washing, I believe we’re moving in the right direction. While many organisations and brands might currently leverage Pride as a strategy, this undeniably starts the conversation,” he added.  

“This is especially crucial in countries where LGBTQIA+ community is still battling the fight against social stigma and basic rights. It is a long and demanding journey and while it may not be ideal today, progress has to begin somewhere.”  

However, the ever-evolving AI and analytics landscape holds the potential to drive LGBTQIA+ inclusivity in the coming years. For example, AI-powered language models can be trained to use more inclusive and gender-neutral language, helping normalise diverse identities in digital communication.
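The idea of steering models toward inclusive language can be illustrated with a deliberately simple sketch. Real systems use trained, context-aware models; this rule-based substitution, with a hypothetical mapping table, only shows the shape of the problem:

```python
# Minimal, hypothetical sketch of rewriting text toward gender-neutral
# phrasing. Production systems would be model-based and context-aware;
# these mappings are illustrative only.
NEUTRAL = {
    "chairman": "chairperson",
    "mankind": "humankind",
    "he or she": "they",
}

def neutralise(text: str) -> str:
    """Apply each substitution in turn; no context handling is attempted."""
    for term, repl in NEUTRAL.items():
        text = text.replace(term, repl)
    return text

print(neutralise("The chairman said mankind benefits."))
# → The chairperson said humankind benefits.
```

Even this toy version makes the limitation clear: blind substitution cannot handle grammar or context, which is exactly why language models are the more promising route.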

“With AI increasingly integrated into our daily lives, it mustn’t replicate societal bias. Data models trained predominantly on heteronormative information may exhibit harmful prejudices. To combat this, we must dismantle institutional barriers. 

“This includes the absence of gender-neutral or inclusive language, a lack of inclusive imagery, and the features that reflect LGBTQIA+ identities and experiences,” said Sharma. 

Interestingly, according to a McKinsey & Company report, companies in the top quartile for gender diversity are 21% more likely to outperform on profitability. Additionally, organisations with more gender-diverse executive teams have a 27% likelihood of outperforming their peers on longer-term value creation. 

Challenges Remain

The inclusion of transgender people in the workforce still remains a challenge. Sharma advocates infrastructural changes like gender-neutral washrooms and policies accommodating preferred names and pronouns to create environments that empower non-binary and gender minorities in the workforce.

On the other hand, AI-driven health applications could be developed to address the specific medical needs of transgender individuals, improving access to personalised care. Additionally, AI can assist in content moderation on social media platforms, more effectively identifying and removing hate speech or discriminatory content targeting the community. 

“LGBTQIA+ isn’t just an abstract concept; it encompasses real people still fighting for social acceptance and basic rights,” he added. 

The post Big-Tech Companies Push for Gender Sensitisation appeared first on AIM.

]]>
https://analyticsindiamag.com/intellectual-ai-discussions/big-tech-companies-push-for-gender-sensitisation/feed/ 0
Vivekananda Pani on Building ‘AI Rocket’ for India https://analyticsindiamag.com/intellectual-ai-discussions/vivekananda-pani-on-building-ai-rocket-for-india/ https://analyticsindiamag.com/intellectual-ai-discussions/vivekananda-pani-on-building-ai-rocket-for-india/#respond Wed, 10 Jul 2024 11:39:22 +0000 https://analyticsindiamag.com/?p=10126451

“Scaling the moon is actually a lot of distance. If you walk, it would take 25 lives, versus if you build a rocket today, you will reach it in a month.”

The post Vivekananda Pani on Building ‘AI Rocket’ for India appeared first on AIM.

]]>

The Indic data problem has been on every Indian AI researcher’s mind. One of the primary challenges in developing AI for Indian languages is the scarcity of high-quality data. Unlike English, which has a vast amount of digital content available, Indian languages lack sufficient natural data to train AI models.

“For Indian languages, or any language other than English, this has not been true [availability of data online], because of which we do not have a large amount of natural data,” Vivekananda Pani, the CTO and co-founder of Reverie Language Technologies, told AIM.

Various approaches are being considered to address this data problem. One involves using artificial methods to augment existing data, such as converting English data into Indian languages using machine translation. 

However, Pani cautioned against relying solely on this approach. “The best way is not, let’s say, if we try to augment and continue to augment forever… because in the English language that’s not likely to happen,” he said.

Building a Rocket for Indic Languages

Pani advocates for a more sustainable solution: enabling people to create a large amount of natural data that can be fed back into AI systems. This includes leveraging content from various media formats, such as audio and video, which can be transcribed and used to train AI models. 

Speaking about the need for building something like ChatGPT in India, Pani gave the perfect analogy. “Scaling the moon is actually a lot of distance. If you walk, it will take 25 lives, versus if you build a rocket today, you will reach it in a month.”

One problem Pani pointed out is the standardisation of Indian languages in the digital world. Giving the example of the word “sahayogi”, Pani and his team explained that there are several ways in which people type the word on their phones. 

This is also because there is no uniformity for Indian languages across different types of keyboards, while English is uniform across the board. 
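The "sahayogi" example can be made concrete with a toy normaliser. This is not Reverie's system: the spelling rules and the one-entry lexicon below are hypothetical, sketching how varied Roman spellings of the same Hindi word might be collapsed to a canonical key before lookup.

```python
import re

def normalise_roman(word: str) -> str:
    """Collapse common Romanised-Hindi spelling variants to one canonical
    key. The rules are illustrative, not exhaustive."""
    w = word.lower()
    w = re.sub(r"ee", "i", w)   # 'sahayogee' -> 'sahayogi'
    w = re.sub(r"aa", "a", w)   # 'saahayogi' -> 'sahayogi'
    return w

# Hypothetical lexicon mapping canonical Roman keys to Devanagari.
CANONICAL = {"sahayogi": "सहयोगी"}

variants = ["sahayogi", "sahayogee", "saahayogee"]
print({v: CANONICAL.get(normalise_roman(v)) for v in variants})
```

Real transliteration engines learn these mappings from data rather than hand-writing rules, which is precisely why the natural-data scarcity Pani describes is such a bottleneck.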

“Even if somebody doesn’t know English but wants to send a message on WhatsApp, they tend to write the message in their native language using English letters,” Pani said, adding that this happens only in the digital world; when people write in books and letters, they usually use native scripts. 

Earlier, while speaking to AIM, Raj Dabre, a prominent researcher at NICT in Kyoto, discussed the complexities of developing chatbots for Indian languages. If you type something in Hindi but in the Roman script, current AI models like Llama are able to process it to some degree. 

Dabre and his team are working on training models in Romanised versions of Indic data, leveraging the knowledge in the English language, and transferring it to Indic languages.

“Technology has actually been a barrier in our case,” said Pani. He explained that this is something that needs to be addressed very quickly. 

Working on the same issue is Pushpak Bhattacharyya, who was recently appointed chairman of the committee for standardisation in artificial intelligence, set up by the Bureau of Indian Standards (BIS).

Bhattacharyya argues that while existing LLMs can be adapted for Indian languages, creating specialised foundational models could lead to significant efficiency gains. Similarly, smaller models trained on less data would not be feasible in the longer run. 

Native Languages for AI is a Need

At Reverie Language Technologies, Pani explained that the team has been working on this problem for a very long time, and building that ‘AI rocket’. 

“We started in an era when there was absolutely zero Indian language data in the digital media,” he recalled, highlighting the progress made from their first speech model using only 100 hours of data to more recent models utilising at least 10,000 hours.

“In India, we still have less than 7% of people who are fluent in English,” Pani said, explaining that there is a definite need to build AI models in Indian languages rather than relying only on models from the West.

Pani also addressed the startup scenario in India for fundamental research in AI. He noted that while there has historically been a lack of investment in long-term research projects, the landscape is changing. “Now that OpenAI showed the world what is possible, people have a belief that this can be achieved and therefore let’s go and invest,” he observed.

Another issue is building hardware in India, which Pani explained is a lot harder. “On that front, we are far behind right now. We don’t have enough skills. We will have to import skills and it’s a very, very expensive process,” he said. 

Agreeing with Bhavish Aggarwal’s recent comparison of OpenAI with the East India Company, Pani said that a lot of data that Indian users “donate” to companies such as Google through their phones stays with those companies and is accessible to Indian researchers only if they pay for it. 

“When you look at countries such as China and Japan, they have their computing world in their native language,” Pani added, saying there needs to be a bigger push from the government for putting standards and baseline fundamentals. 

“These fundamentals today are actually governed by a few American companies and formed by them. So they would not really think of what is right for our country,” Pani concluded.

The post Vivekananda Pani on Building ‘AI Rocket’ for India appeared first on AIM.

]]>
https://analyticsindiamag.com/intellectual-ai-discussions/vivekananda-pani-on-building-ai-rocket-for-india/feed/ 0
‘India has a High Appetite for Generative AI,’ says AWS’s Chief Medical Officer https://analyticsindiamag.com/intellectual-ai-discussions/india-has-a-high-appetite-for-generative-ai-says-aws-cmo-dr-rowland-illing/ https://analyticsindiamag.com/intellectual-ai-discussions/india-has-a-high-appetite-for-generative-ai-says-aws-cmo-dr-rowland-illing/#respond Tue, 09 Jul 2024 07:15:00 +0000 https://analyticsindiamag.com/?p=10126260

Dr Illing completed his undergraduate paediatric training at St John's Medical College and Hospital, Bengaluru, almost two decades ago. 

The post ‘India has a High Appetite for Generative AI,’ says AWS’s Chief Medical Officer appeared first on AIM.

]]>

Generative AI holds significant promise for India, given its cultural and language diversity. At the AWS Summit, Washington DC, Rowland Illing, the director and chief medical officer for global healthcare and non-profits, AWS, acknowledged the same. 

“The appetite for generative AI in India is high. The technology can help with language and culturally sensitive information, addressing the breadth of languages in India. It’s exciting to see opportunities in the National Digital Health Mission as well,” he told AIM.

Illing’s sentiments resonate well since many Indian healthcare companies are actively tapping into the potential of generative AI. Today, every health company is turning into an AI company. 

For example, Apollo partnered with Google Cloud to improve its AI-powered Clinical Intelligence Engine (CIE) using MedLM and retrieval-augmented generation (RAG), integrating extensive clinical data for improved decision-making. 
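Retrieval-augmented generation, the technique mentioned above, retrieves relevant documents at query time and feeds them to the model alongside the question. The snippet below is a minimal sketch of that pattern, with invented clinical notes and naive keyword-overlap scoring standing in for the vector search a production system like CIE would use:

```python
import re

# Toy illustration of the RAG pattern: retrieve the most relevant notes,
# then prepend them as context to the prompt sent to the LLM.
# The documents are invented; real systems use embedding-based retrieval.

CLINICAL_NOTES = [
    "Patient A: type 2 diabetes, on metformin, HbA1c 7.2%.",
    "Patient B: hypertension, prescribed amlodipine.",
    "Patient C: type 2 diabetes, diet-controlled, HbA1c 6.4%.",
]

def tokenize(text: str) -> set[str]:
    """Lowercase alphanumeric tokens, ignoring punctuation."""
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Rank documents by word overlap with the query; keep the top k."""
    q = tokenize(query)
    return sorted(docs, key=lambda d: len(q & tokenize(d)), reverse=True)[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Assemble the augmented prompt: retrieved context, then the question."""
    context = "\n".join(retrieve(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}"

prompt = build_prompt("Which patients have type 2 diabetes?", CLINICAL_NOTES)
print("Patient A" in prompt)  # → True
```

The payoff of the pattern is that the model answers from the retrieved clinical data rather than from its general training alone, which is why RAG suits decision-support use cases like Apollo's.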

While Narayana Health, in partnership with Microsoft, is improving diagnostic accuracy and operational efficiency, Manipal is personalising cancer care through AI-driven imaging and predictive analytics. 

On the startup side, apart from Qure.ai, Khosla Ventures-backed Healthify provides custom diet and fitness plans via its generative AI assistant Ria, which draws on models from diverse providers, including AWS, Meta, and OpenAI. 

“I have a great respect for the Indian health-tech ecosystem. I think it’s an incredibly rich and exciting space to be in. So, we’ve got great health-tech companies that are already building on AWS across India,” said Illing, emphasising the work of Bengaluru-based startup Qure.ai in medical imaging AI.

AWS works closely with many Indian organisations, including the Centre for Cellular and Molecular Biology, the Poshan Tracker, and the CoWIN vaccination platform. Built natively on AWS during the pandemic, CoWIN supported 2.2 billion vaccination doses across the country. 

Beyond AWS’ work in India, Illing shares another special connection with the country. The Oxford University alumnus completed his undergraduate paediatric training at St John’s Medical College and Hospital, Bengaluru, almost two decades ago. 

Amazon Bedrock for Health

Meanwhile, Genomics England, a UK government-owned entity, is leveraging Claude on Amazon Bedrock, AWS’ fully managed generative AI service, to assist researchers in linking genetic variants to medical conditions. Illing told AIM that it resulted in the discovery of 20 new gene associations with learning disabilities, providing actionable insights for future research and treatment. 

This research, based on peer-reviewed articles, aims to improve genetic tests and health, especially for intellectual disability, by quickly processing vast data to identify potential gene associations. 

Genomics England used Anthropic’s LLM Claude, available through Amazon Bedrock, to make this happen. Interestingly, Amazon concluded a $4 billion investment in the AI startup in March of this year. 

“We have a great relationship with Anthropic. We’re invested in them, and we get early releases of their models, like Claude 3.5 through Bedrock,” he added. 

“Bedrock is a great vehicle because it allows users to access foundation models through APIs and provides functionalities like guardrails to ensure LLM answers are bound by constraints. Users can enrich foundation models with their data without sharing it with the model.” 
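The API access Illing describes can be sketched roughly as follows. This is an illustrative snippet, not AWS sample code: the model ID is one published Bedrock identifier for Claude 3.5 Sonnet, and the live `invoke_model` call, which needs AWS credentials and the boto3 SDK, is left commented out.

```python
import json

# Sketch of calling Claude through Amazon Bedrock's InvokeModel API.
# The model ID below is an example; the prompt is invented.

MODEL_ID = "anthropic.claude-3-5-sonnet-20240620-v1:0"

def build_claude_request(prompt: str, max_tokens: int = 256) -> str:
    """Build the JSON request body Bedrock expects for Anthropic models."""
    return json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    })

body = build_claude_request("Summarise known gene associations for condition X.")

# With AWS credentials configured, the live call would look like:
# import boto3
# client = boto3.client("bedrock-runtime", region_name="ap-south-1")  # Mumbai
# response = client.invoke_model(modelId=MODEL_ID, body=body)
# answer = json.loads(response["body"].read())["content"][0]["text"]

print(json.loads(body)["messages"][0]["role"])  # → user
```

Because the request goes through Bedrock's managed API rather than to the model provider directly, features like guardrails and private data enrichment that Illing mentions can be layered on without the customer's data leaving their AWS environment.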

However, this is just one example. Numerous companies leverage Bedrock’s diverse models from AI providers like Anthropic, Mistral, and Meta to meet their unique needs, supporting AWS’ mission to democratise foundation models. 

The model evaluator function allows users to compare the models based on speed, cost, performance, or accuracy. “We believe there won’t be one foundation model to rule them all but hundreds, if not thousands, giving users a choice in a secure way,” commented Illing. 

“The current trend we are observing is the explosion of analytics and generative AI. This is an end-to-end data story, focusing on extracting meaningful insights from vast amounts of data, often multimodal. Generative AI plays a crucial role in understanding and analysing this data.”

At the AWS Summit Bengaluru in May, the company announced that Amazon Bedrock is now generally available in the Asia Pacific (Mumbai) Region, allowing large organisations across India to build and scale generative AI applications with enterprise-grade security and privacy.

Since 2017, AWS, the cloud subsidiary of Amazon, has trained over 5.5 million people in India on AI and cloud skills and over 8.3 million across the APAC and Japan regions. 

With generative AI moving from proof of concept to deployment this year, the company plans to train about two million individuals in AI globally by 2025 through its “AI-Ready” initiative.

Shifting to Cloud

Apart from generative AI, there is a global shift towards cloud adoption in healthcare, especially after the COVID-19 pandemic, which necessitated remote work. Despite a decrease in global IT spending in 2020, cloud spending grew by over 6%, reaching $258 billion, with experts predicting the market will double in the coming years.

“Previously, the cloud was used for disaster recovery and backup, running things on-premises. However, with security being a top priority and concerns about outages, bad actors, and ransomware attacks, we are seeing a shift in organisations moving their production clinical environments to the cloud,” explained Illing. 

This typically begins with clinical workloads containing patient-identifiable data running in a read-only mode in the cloud, while production stays on-premises. Over time, organisations have found that cloud-based systems often outperform their on-premises counterparts, leading to a complete transition to cloud-based production environments.

As per Nutanix’s report, the future of healthcare relies on moving away from legacy systems. Currently, 27% of healthcare companies use only traditional, non-cloud data centres and business systems, more than any other industry.

Several notable examples illustrate this trend. Tufts Medicine, a world-renowned academic medical centre, and Danville-based Geisinger Health in the US now run their electronic health records (Epic) natively on AWS. 

Similarly, New South Wales in Australia is migrating its entire eHealth system, including the single digital patient record for the state, onto AWS. Apart from new startups, legacy health giants like Pfizer, Amgen, and Merck, are also building drug discovery platforms on AWS. 

More recently, AWS ventured into protein folding. It has partnered with New York-based research lab EvolutionaryScale to make the latter’s ESM3 multimodal large language model available on AWS, aiding applications from drug discovery to carbon capture. 

The post ‘India has a High Appetite for Generative AI,’ says AWS’s Chief Medical Officer appeared first on AIM.

]]>
https://analyticsindiamag.com/intellectual-ai-discussions/india-has-a-high-appetite-for-generative-ai-says-aws-cmo-dr-rowland-illing/feed/ 0
Beyond Pride Month: True Allyship Needs No Calendar https://analyticsindiamag.com/intellectual-ai-discussions/beyond-pride-month-true-allyship-needs-no-calendar/ https://analyticsindiamag.com/intellectual-ai-discussions/beyond-pride-month-true-allyship-needs-no-calendar/#respond Sun, 07 Jul 2024 08:41:38 +0000 https://analyticsindiamag.com/?p=10126080

Middle and junior management levels require more mentorship and active engagement to truly foster inclusivity.

The post Beyond Pride Month: True Allyship Needs No Calendar appeared first on AIM.

]]>

Pride Month is celebrated in June to honour the 1969 Stonewall Uprising, which was a tipping point for the Gay Liberation Movement in the US. However, now that Pride Month is over, what’s next?

Diversity and inclusion are hot topics right now, frequently discussed in the media as awareness of workplace inequalities grows. Although many people want to drive change by creating a safe space for their LGBTQIA+ colleagues, there is often uncertainty about how, or even where, to start. This is where the idea of allyship in the workplace takes shape. 

Saurabh Bajpai, VP of Everyday Banking, NatWest Group, told AIM that support for the LGBTQIA+ community in the Indian tech ecosystem has increased in terms of acceptance and allyship. Yet, he pointed out that middle and junior management levels require more mentorship and active engagement to truly foster inclusivity. Bajpai also chairs the employee-led network of the LGBT+ India Inclusion Council.

“We need more allyship coming from middle or junior-level management teams. Just ticking boxes would not help, and we need to actively engage for and with the community,” he added. 

But allyship does not end there. Showing support also means respecting colleagues’ identities and chosen pronouns, even beyond Pride Month.

The Bitter Reality 

According to a recent report by Deloitte, six in 10 LGBTQIA+ respondents in the Indian tech sector identify as bisexual, 7% as gay, 4% as lesbian, and 17% as asexual. 

The numbers show that while corporates are slowly making strides towards creating a healthy workspace for LGBTQIA+ members, there is a long way to go in the struggle for acceptance and equality beyond June.

Bajpai’s journey of resilience and self-discovery began in Lucknow, where he was born and raised. He eventually ventured into computer science, completing his higher education at Kanpur University and SASTRA University.

“Throughout this journey, I got the opportunity to express myself, celebrate my wins and failures, except there were no discussions around my identity and sexual orientation,” Bajpai, who identifies as a gay man, told AIM.

Despite his professional achievements, Bajpai’s personal journey was marked by internal struggles and societal challenges regarding the same. From a young age, he knew he was homosexual but felt compelled to hide his true self due to bullying and societal expectations.

This internal conflict persisted even after moving to Mumbai for a job. “I started acting ‘straight’ and with that, my belief got stronger that people will love and accept me like this. But even though cities changed, the fight inside my mind about my identity was still not over,” added Bajpai.

Eventually, he came out to his friends, feeling relieved, and said, “This was the moment I stopped struggling in my mind. But my coming out journey hasn’t stopped since then.”

Professionally, Bajpai faced challenges, particularly in dealing with preconceived notions and stereotypes in the workplace. “It takes a lot for a person to keep switching between identities. In my early career, even after pretending to be of a different sexual orientation, people still made nasty comments about me,” he said.

Having said that, coming out to even one colleague is a big step, and the process should not be rushed. Identifying allies, finding strength, and becoming vocal and visible are crucial steps for self-help before seeking external support. 

He emphasised, “Coming out is a never-ending journey; you go to different organisations and have to go through the same cycle again. And it’s a very personal journey where one should not rush to come out.”

However, the harrowing experiences of his early years, owing to his sexuality, shaped his leadership philosophy, which is based on respecting all employees and valuing talent and attitude over superficial judgments. 

“We need to have more representation in leadership roles from the community so that they can act as role models for the rest,” added Bajpai.

NatWest Group has implemented several initiatives to support the LGBTQIA+ community, including reverse mentoring sessions, inclusive policies, and infrastructural support. Its flagship program, TRANSpire, launched two years ago, focuses on providing career opportunities and support to the entire rainbow community, emphasising mental well-being and career development.

Importance of DE&I in AI

In December 2023, CNBC reported that big techs like Microsoft, Google, and Meta have downsized their Diversity, Equity, and Inclusion (DEI) efforts, laid off DEI staff and leaders of diverse employee groups, and downsized learning programs.

This development illustrates the bigger picture of how a lack of diversity in AI affects product functionality and accessibility, leading to bias and inaccuracies. For example, Buzzfeed’s Midjourney-generated images of Barbies from different countries faced accusations of racism and cultural inaccuracy. 

Not so long ago, Google had to temporarily suspend its image-generating feature after Gemini inaccurately depicted people of colour in Nazi-era uniforms, showcasing historically inaccurate and insensitive images.

German AI cognitive scientist Joscha Bach believes this bias wasn’t hardcoded, but inferred through the system’s interactions and prompts.

He said that Gemini’s behaviour reflects the social processes and prompts fed into it rather than being solely algorithmic. The model developed opinions and biases based on the input it received, even generating arguments to support its stance on controversial issues like meat-eating or antinatalism.

Such behaviours of AI models are nothing but mirrors of society, urging a deeper understanding of our societal condition. Therefore, it is crucial to treat everyone equally and respect each other’s choices and opinions, ensuring that such biases are not ingrained in AI systems. 

The post Beyond Pride Month: True Allyship Needs No Calendar appeared first on AIM.

]]>
https://analyticsindiamag.com/intellectual-ai-discussions/beyond-pride-month-true-allyship-needs-no-calendar/feed/ 0
CP Gurnani Says, ‘There is No Human Being Who is Not Working on Generative AI’ https://analyticsindiamag.com/intellectual-ai-discussions/cp-gurnani-says-there-is-no-human-being-who-is-not-working-on-generative-ai/ https://analyticsindiamag.com/intellectual-ai-discussions/cp-gurnani-says-there-is-no-human-being-who-is-not-working-on-generative-ai/#respond Wed, 03 Jul 2024 12:30:00 +0000 https://analyticsindiamag.com/?p=10125692

"Whether you are a creative person, a student, a politician, or a journalist, you can take help from generative AI. It's just a matter of time before it becomes part of your life."

The post CP Gurnani Says, ‘There is No Human Being Who is Not Working on Generative AI’ appeared first on AIM.

]]>

After celebrating the milestone of building Tech Mahindra for 19 years, the company’s former CEO, CP Gurnani, decided to set sail on a separate journey, though one not far from his techno-optimist beliefs.

Just a few months after his farewell from Tech Mahindra, he announced the launch of AIonOS, a business venture in partnership with InterGlobe Enterprises’ Rahul Bhatia.

This takes us back to Sam Altman’s visit to India. When the OpenAI chief claimed that India could not build a model comparable to ChatGPT, Gurnani said, “Challenge accepted.”

“Now that we have taken the challenge, we will deliver, and we will brainstorm,” Gurnani told Nikhil Malhotra, his erstwhile chief innovation officer at Tech Mahindra’s Project Indus. Speaking with AIM, Gurnani said, “I spoke to my chief innovation officer that time at Tech Mahindra… Six hours later, he says, ‘I have a plan.’”

Just recently, Project Indus went live. Tech Mahindra was able to develop an Indian LLM for local languages and 37+ dialects in just five months, spending less than $5 million. “When ET covered the seven best AI companies that will have an Indian LLM, they covered Tata, NVIDIA, and Tech Mahindra,” he said.

“I have to thank the Tech Mahindra leadership team for delivering what I promised on my behalf. I also have to thank Sam Altman as he got me so interested in AI,” Gurnani added, explaining the origin of AIonOS. “I promise you, on my birthday, which is December 19, I will deliver at least a couple of elements,” he said about the launch of AIonOS.

AIonOS currently offers specialised AI products and technologies, including custom solutions, industry-specific products, data insight engines, and an AI-led customer experience.

“I can only say there is no human being, including you, who is not working on generative AI. The reality is that you probably framed these questions by asking generative AI,” Gurnani quipped. 

Highlighting the potential of generative AI, he noted, “Whether you are a creative person, a student, a politician, or a journalist, you can take help from generative AI. It’s just a matter of time before it becomes part of your life.”

The Future is About Agent AI

One thing that Gurnani is completely convinced about is that the future of AI is in Agent AI. 

“I’m actually more convinced that the faster adoption will be in Agent AI. Each one of us will have an AI agent that knows us so well, it will analyse our business and our routines, and support us in becoming more productive and efficient,” he explained. He even envisages a future where AI agents take over meetings, although he personally hopes he can continue enjoying face-to-face interactions.

India is Good with Innovation

Despite the perception that India merely copies innovation from the West, Gurnani expressed optimism about the country’s AI startup ecosystem. He recognised the role startups like Krutrim and Sarvam AI play in addressing market gaps. “All startups have very good intent. They represent the gaps in the market offerings,” he said.

He categorised startups into three types: those improving existing processes, those creating new marketplaces, and those working on deep tech innovations.

He shared an example of deep tech innovation in India, mentioning the success stories of ISRO. “NASA used to spend billions of dollars on space missions. ISRO costs around $300-400 million. It’s because there was a leader willing to challenge the status quo,” Gurnani stated, underscoring the importance of leadership and innovation over mere funding.

Gurnani highlighted the significant work happening at various research centres and institutions in India, like IIT Madras. He praised the efforts of individuals like Ashok Jhunjhunwala at IIT Madras, who, despite limited funds, are driving groundbreaking innovations. “There is a space launch pad, EV charging stations, vertical takeoff drones—all designed and built in India,” he pointed out.

He also cited IISc’s Centre for Brain Research, funded by Kris Gopalakrishnan, which is making strides in understanding diseases like Parkinson’s and Alzheimer’s. “The amount of work happening on a paltry sum of about Rs 30 crores a year is astounding,” Gurnani said, emphasising the importance of frugal innovation in India.

Stay Relevant

Gurnani remained positive, stating, “I am here to help fix the issues. Nothing annoys me.” He also expressed scepticism about AGI, calling it a “counter-intuitive” term and questioning its practicality. “Either you are artificial, or you have no idea,” he asserted.

Regarding the debate on work hours, Gurnani supported the idea of working hard but balanced it with self-development. “Narayana Murthy is a respected leader. The trolling he faced for suggesting a 70-hour work week was unnecessary. I believe in working 40 hours for your job and 30 hours for yourself,” he advised.

In conclusion, Gurnani’s message to the youth was clear: “Stay relevant. The industry is changing very fast. Do not assume that your degree is sufficient to keep you employed or successful. Stay up to date with the fundamentals and keep adapting to the changes.”

The post CP Gurnani Says, ‘There is No Human Being Who is Not Working on Generative AI’ appeared first on AIM.

]]>
https://analyticsindiamag.com/intellectual-ai-discussions/cp-gurnani-says-there-is-no-human-being-who-is-not-working-on-generative-ai/feed/ 0
IT Giants’ Lofty Generative AI Revenue Claims is Just AI-Washing https://analyticsindiamag.com/intellectual-ai-discussions/tall-generative-ai-revenue-claims-it-companies-ai-washing/ https://analyticsindiamag.com/intellectual-ai-discussions/tall-generative-ai-revenue-claims-it-companies-ai-washing/#respond Wed, 03 Jul 2024 08:57:51 +0000 https://analyticsindiamag.com/?p=10125668

While generative AI is creating new business opportunities for NTT DATA, most of them are in the proof of concept (POC) stage. 

The post IT Giants’ Lofty Generative AI Revenue Claims is Just AI-Washing appeared first on AIM.

]]>

IT services companies were one of the quickest to integrate generative AI and launch GenAI offerings, besides the hyperscalers and SaaS companies. 

However, the burning question on everyone’s mind is: How much revenue are these IT companies actually generating from generative AI alone? 

Companies like Accenture have announced billions of dollars in revenue from generative AI so far. Indian IT giant TCS, too, has revealed a generative AI deal pipeline worth $900 million. 

However, according to Tanvir Khan, executive vice president of cloud, infrastructure, digital workplace services, and platforms at NTT DATA, much of the generative AI revenue being reported is generative AI integration into existing business lines and not new business avenues. 

“If you look at the revenues people are posting on generative AI, and the number of use cases, it’s disproportionately higher than the incremental revenue in the industry. [Going by] all the revenue that people say is being generated by AI, it should have created much more growth. The fact is it hasn’t,” Khan told AIM in an exclusive interview.

Generative AI Revenue is Mostly AI Washing

Khan said revenue claims in the realm of generative AI fall into three categories, not all of which amount to AI washing. First, there are pure AI use cases where generative AI is genuinely employed. These instances constitute legitimate generative AI revenue.

Second, existing operations, such as call centres integrating chatbots, can now be labelled as generative AI revenue despite generative AI contributing only a portion to the overall customer service revenue. 

Third, in business-as-usual scenarios like managing cloud infrastructure, where generative AI enhances productivity through automation, the entire revenue stream can sometimes be portrayed as generative AI revenue, even though the underlying business model remains unchanged. 

“The extent to which revenue is attributed to generative AI can vary greatly depending on how broadly the term is defined, leading to what we refer to as AI washing,” Khan said.

Deriving Value from GenAI 

Enterprises have derived great value from generative AI. For instance, in software development projects, AI is said to make programmers 30-35% more efficient.

“This is the productivity value. When you come to the second category of net new incremental value, which is not productivity but driving new revenue, those use cases are just beginning to emerge,” Khan said.

Moreover, Khan believes there is a third category where mature AI-based digital products will emerge. However, those are still a few years away.

“You can actually draw parallels to the internet. Things like online banking and online brokerages took longer to emerge, even though the internet was there for a while. Hence, these types of use cases at scale might be a couple of years away,” Khan said.

Venkata Malapaka, senior director and data and analytics leader at NTT DATA, believes a lot of the tall claims have to do with the FOMO (fear of missing out) factor.

“We need to view everything in context. Very few real, impactful generative AI use cases have actually made it into production,” Malapaka told AIM.

He believes that on the consumer front, we have seen significant progress, as evidenced by applications like ChatGPT and the integration of AI into everyday tools such as WhatsApp and Google Maps. 

“However, in the enterprise domain, there are fewer instances of generative AI being applied in actual production use cases,” Malapaka pointed out.

GenAI Projects at NTT DATA

NTT DATA, one of the largest IT services companies in the world, made $30.04 billion in revenue in 2023. For the company, generative AI is the fastest-growing revenue stream.

“The reason for that is the denominator is so small that the percentages are big. I’d like to remind people that AI is a marathon, not a sprint. We are overestimating what it will do in the next three, six, or nine months, but we are greatly underestimating what it will do in the next three, six or nine years,” Khan said.

Currently, around half of NTT DATA’s projects are powered by generative AI in a major way, while around 90% have been touched by generative AI in some form.

Khan also revealed that generative AI is creating new business opportunities for NTT DATA, but most of them are in the Proof of Concept (POC) stage. 

“There are hundreds of use cases with project sizes ranging from a quarter million to half a million dollars. The goal is to scale these initiatives into larger projects worth $10 million to $20 million each. 

“While this transformation will take time, GenAI is introducing entirely new types of use cases that were not there in the past,” he pointed out.

NTT DATA’s Foundational Models

NTT DATA is among the few IT companies building their own foundational models. Tsuzumi, its lightweight language model, is fluent in both Japanese and English and comes in parameter sizes of 600 million and 7 billion. 

The ultra-light model is designed to run on a CPU.

“Building our own foundational model offers a sustainability advantage. The compute power required for training and inference is significantly smaller compared to GPT-3.5, while still achieving acceptable or comparable results,” Malapaka said.

While the company has only released the Japanese version of Tsuzumi, the English version is expected to come this summer. Furthermore, Khan revealed that it will take a few more quarters before we see real traction with the model in enterprise use cases.

Nonetheless, NTT DATA is not in the business of developing foundational models. “We are building foundational models not to compete with others in the market, but it is more of an insurance policy,” Khan concluded.

The post IT Giants’ Lofty Generative AI Revenue Claims is Just AI-Washing appeared first on AIM.

]]>
https://analyticsindiamag.com/intellectual-ai-discussions/tall-generative-ai-revenue-claims-it-companies-ai-washing/feed/ 0
Former BharatPe CPO Builds AI Doctor for 8 Billion People Worldwide https://analyticsindiamag.com/intellectual-ai-discussions/former-bharatpe-cpo-is-building-an-ai-doctor-for-8-billion-people/ https://analyticsindiamag.com/intellectual-ai-discussions/former-bharatpe-cpo-is-building-an-ai-doctor-for-8-billion-people/#respond Tue, 02 Jul 2024 12:42:23 +0000 https://analyticsindiamag.com/?p=10125602

JiviMedX is powered by the Llama 3 8-billion and 70-billion parameter models

The post Former BharatPe CPO Builds AI Doctor for 8 Billion People Worldwide appeared first on AIM.

]]>

Ankur Jain, the former chief product officer of BharatPe, left the fintech giant last year to venture into AI and healthcare. Together with G V Sanjay Reddy, the chairman of Reddy Ventures, he co-founded Jivi.ai, which happens to be Jain’s second stint in entrepreneurship. 

Jain, who has a background in AI and machine learning, founded Jivi in January 2024. Within months of incorporation, the startup announced its first model Jivi MedX, which outperformed popular models like Google’s Med-PaLM 2 and OpenAI’s GPT-4 on the Open Medical LLM Leaderboard.

In an exclusive interaction with AIM, Jain revealed that the company plans to launch a series of models focussed on healthcare in the coming months.

“While Jivi MedX is a text model, we are working on a series of other models that we internally call a model cluster. For example, there could be a different model that specialises in diabetes, and for ophthalmology, there will be a different model. 

“The next model from Jivi will be a vision model. We are working on a multimodal MedX,” Jain said.

The vision Jain has for his new startup is to build an AI medical companion that the world’s eight billion people can use for free. AI is finding use cases in other domains of healthcare, such as drug discovery and genomics. However, Jain’s startup is focussed on primary healthcare. 

Not Just a Chatbot 

The startup’s aim is to create a product which will be useful to end users as well as doctors. Jain revealed that the AI medical companion he is building will have voice capabilities built in, which means everyone would be able to converse with the model in their own languages. 

“We will have voice capabilities, and the models will eventually understand the top 25% of the world’s languages, which covers around 90% of the population. However, the technology is not there yet. Even OpenAI has not released the voice capabilities for GPT-4o,” the Stanford alumnus pointed out.

However, it’s just a matter of time, according to Jain, who comes from a family of doctors. “I think in six months, it will become mainstream. We are not building a chatbot, people will have to explain their symptoms to the model and for this, voice is the best medium.”

Regulatory Challenges

Jain’s endeavour is bold since healthcare is one of the most highly regulated industries in the world. Moreover, regulators around the world are contemplating regulating the technology. 

“For instance, we already have telemedicine regulation, whether you use AI in telemedicine or not, telemedicine rules will apply. In the end, the healthcare side will always supersede the technology, whether AI or non AI, we have to abide by the regulation,” Jain said.

Nonetheless, Jain acknowledges that governments worldwide are contemplating regulations and changes are imminent. 

“We already adhere to standards like ISO 42001, and we must comply with evolving regulatory frameworks, not only in India but globally. 

Similarly, in AI, especially in healthcare, we welcome and adhere to regulations. Regulations ensure that only serious players can enter the market, driving the delivery of quality products to consumers,” Jain added.

Powered by Llama 3

Jain revealed that JiviMedX is powered by the Llama 3 8-billion and 70-billion parameter models. The startup began by trying out existing models, but none of them produced good enough results for the medical domain.

“We tried Mistral, Llama 2 and a few other domain-specific models but none of them helped us in achieving the results we were looking for. The accuracy at best was around 55-60%, which was not good enough,” Jain revealed.

To make the models more accurate, the startup initially experimented with various approaches, including direct preference optimisation (DPO) and reinforcement learning from human feedback (RLHF), but these methods did not yield the desired results. 

“So we settled with odds ratio preference optimisation (ORPO). Interestingly, not many people in the world have used it so far. We kept experimenting with it, akin to tuning hyperparameters. We conducted approximately 150 experiments to achieve the desired outcome,” Jain said.
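Jivi has not published its training code, but the objective Jain references comes from the ORPO paper. Purely as an illustrative sketch, with made-up log-probabilities and a hypothetical weighting factor `lam` rather than Jivi’s settings, the odds-ratio penalty can be computed like this:

```python
import math

def seq_prob(token_logps):
    # Length-normalised sequence probability: exp(mean token log-prob).
    return math.exp(sum(token_logps) / len(token_logps))

def odds(p):
    # Odds of a probability p, as used in the odds-ratio term.
    return p / (1.0 - p)

def orpo_loss(chosen_logps, rejected_logps, lam=0.1):
    # ORPO combines a standard SFT loss on the preferred ("chosen") answer
    # with a penalty that widens the odds gap over the rejected answer:
    #   L = L_SFT + lam * (-log sigmoid(log odds(p_chosen) - log odds(p_rejected)))
    p_w, p_l = seq_prob(chosen_logps), seq_prob(rejected_logps)
    log_odds_ratio = math.log(odds(p_w)) - math.log(odds(p_l))
    l_or = -math.log(1.0 / (1.0 + math.exp(-log_odds_ratio)))
    l_sft = -sum(chosen_logps) / len(chosen_logps)
    return l_sft + lam * l_or

# A model that already prefers the chosen answer is penalised less than one that doesn't.
print(orpo_loss([-0.2, -0.3], [-1.0, -1.5]) < orpo_loss([-1.0, -1.5], [-0.2, -0.3]))  # → True
```

Unlike DPO or RLHF, this objective needs no separate reference or reward model, which helps explain how a small team could run roughly 150 such experiments.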

Currently, JiviMedX has an average score of 91.65% across the leaderboard’s nine benchmark categories. However, Jain adds that even 92% is not good enough, and efforts are underway to increase the model’s accuracy to 95%.

While MedX uses Llama 3 for its intelligence, the other models Jivi is building will not necessarily be; depending on the use case, they could be built on different base models.

‘My First Hire was a Doctor’

“We plan to launch a product in the coming months, but we need to be 100% sure that the model works and does not hallucinate. Currently, the medical community is evaluating it and we will launch it as soon as we get the green light,” Jain said.

Interestingly, Jain revealed that even though it’s an AI startup, his first hire was a doctor. Currently, his team consists of 24 people, and nearly 40% of them are doctors.

“We are involving doctors from Day 0 basically. Half of my team is full of in-house surgeons and physicians. We have also collaborated with the doctor’s community throughout. We built a prototype, shared it with the community and incorporated their feedback, that’s the process,” he added.

Jivi’s Secret Sauce 

Jivi trained MedX using domain-specific data gathered from web scraping, along with information sourced from books, medical journals, research papers, and clinical notes. However, the data alone does not solve the problem. 

“An article about diabetes found on the web tends to be quite generic, typically targeting end users. In contrast, our analysis delves into the past 50 years of diabetic research data. We aggregate insights from various sources, compiling them into a comprehensive knowledge base. This is our secret sauce,” Jain pointed out.

Another challenge the company faces is keeping the data up to date. For instance, if there is new research in diabetes, the startup has to ingest the data into the model.

“So, we have to ingest every new development into the model every month. We plan to do this every week. Currently, we are ingesting once a month,” Jain said.

Can AI Make Money? 

Creating an AI model is one thing; monetising it is a completely different challenge. AI requires huge investment, and building a sustainable, profitable business model remains difficult.

Jain plans to release the product for general consumers for free. However, the startup also plans to create a B2B segment in which hospitals, pharma companies, and insurance providers offer Jivi’s product to their end users as software-as-a-service (SaaS). 

“Our business model involves selling premium services and also cross-selling. We can connect end users to diagnostic services, and we can also cross-sell e-commerce products like medical devices,” Jain said.

The post Former BharatPe CPO Builds AI Doctor for 8 Billion People Worldwide appeared first on AIM.

Meet the Indian Who Created an Open Source Perplexity Over One Weekend
https://analyticsindiamag.com/intellectual-ai-discussions/meet-the-indian-who-created-an-open-source-perplexity-over-one-weekend/
Fri, 28 Jun 2024 04:30:00 +0000

"They underestimated the power and potential of dedicated developers. I'm proving them wrong, one line of code at a time."

The post Meet the Indian Who Created an Open Source Perplexity Over One Weekend appeared first on AIM.


When AIM asked why Perplexity AI couldn’t be built in India, little did we know that it was actually possible. The best part is that a developer did it over a weekend, and made it open source. 

Bishal Saha, a dropout from Lovely Professional University, created Omniplex. Speaking to AIM, Saha said that despite initial job rejections from Perplexity AI, it was his tenacity that led him to create an open-source alternative that is slowly gaining traction.

https://twitter.com/Mr_BishalSaha/status/1773439390808969630

“I saw Perplexity AI gaining huge traction and thought I could contribute, but they rejected me twice,” Saha recalled. He had applied for a job at Perplexity, but the company was only hiring people from Google. 

Undeterred, he decided to build his own version. “I built the Omniplex prototype in one weekend. It was just a single-page app, but it started gaining interest after I shared on Reddit,” he explained. 

Saha has made Omniplex open-source, sharing his code with the world. “I’m not asking for any money. The source code is openly available,” he stated. His commitment to open-source reflects a broader vision of democratising AI. 

For Saha, the cost per search query is minimal, but scaling remains an issue. Saha leverages credits from OpenAI and Bing APIs, managing costs effectively. “Per search query, it costs me about $0.007, and I’ve got $2,500 in credits from OpenAI. I haven’t spent half of it yet,” he adds.
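Those two figures imply a fairly long runway. A quick back-of-the-envelope check, using only the numbers Saha quotes and ignoring any Bing API costs:

```python
cost_per_query = 0.007   # USD per search query, as quoted by Saha
openai_credits = 2500.0  # USD in OpenAI credits

# Number of searches the credits alone could cover at that per-query cost.
queries_covered = openai_credits / cost_per_query
print(round(queries_covered))  # → 357143
```

In other words, the credits alone would cover well over 350,000 searches, which is why per-query cost is less of a worry for him than scaling.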

Omniplex is built using the Vercel AI SDK, Firebase, Next.js, and Bing, and currently runs on OpenAI’s GPT model. Saha says his goal is to eventually support all LLMs, such as Claude, Mistral, and Gemma.

Perplexity is Nothing But Hype

Saha said that it is actually quite easy to build something like Perplexity. But most alternatives to Perplexity are not gaining much traction. “Even in our discussions, people are not excited about it. But I believe in the power of open-source,” he emphasised. 

Saha’s insights into the AI industry are sharp and critical. He described Perplexity AI as a “hype to the extent that it cannot be a billion-dollar business”. 

According to Saha, Perplexity’s user numbers and revenue figures often don’t add up. “I’ve done much more research, and you can also find that they use Bing for their searches and then scrape the top results. It’s not as revolutionary as it’s made out to be,” he asserted. 

“It’s not a big business; they’re paying a lot to acquire traffic.” In his own words, “They underestimated the power and potential of dedicated developers. I’m proving them wrong, one line of code at a time.”

The Indian Startup Ecosystem is Too Focused on Indic LLMs

Saha said that since he is not from an IIT or NIT, the bias is evident in his own experiences and those of his peers. “YC looks for the tech pedigree. If you’re not from an IIT, you’re not worth an interview,” he shared, pointing to broader issues of meritocracy and elitism in the startup world, in India and globally.

“Most Indian investors are not into AI; they don’t understand it. They prefer startups from IITs or Ivy League colleges,” he said, while sharing that investors are often reluctant to speak to founders from other universities. 

Despite this, Saha’s journey is a testament to resilience and innovation. “I’ve worked for two YC-backed companies and did a lot of startups, but being a dropout from LPU, investors don’t like me much,” Saha shares candidly. He initially ventured into fintech, building account aggregators and financial advice apps such as Gullak Money and Pocket Money. 

However, due to stringent regulations and lack of investment, he pivoted towards AI.

Now, Saha is shifting his focus again. “I’m moving out of the AI race. It’s absurd how many startups are entering YC without a clear plan for the next three years,” he said. 

Instead, he’s exploring augmented reality with a former Google employee in his stealth startup. “When AR goes mainstream, there will be a gap. Web and app developers lack 3D design experience. I want to build a platform like Figma for augmented reality,” he revealed.

This Surat-based Company is Creating AI Hardware for India
https://analyticsindiamag.com/intellectual-ai-discussions/this-surat-based-company-is-creating-ai-hardware-for-india/
Thu, 27 Jun 2024 07:48:21 +0000

Vaaman is poised to make a difference for developers and researchers in India, with affordable prices and easy availability, cutting down reliance on hardware from other countries.

The post This Surat-based Company is Creating AI Hardware for India appeared first on AIM.


Starting a hardware company in India is no easy task. However, Surat-based Vicharak took on the herculean task of churning out hardware in-house, designed specifically for AI workloads. The company recently secured funding of INR 1 crore, boosting its valuation to INR 100 crore. 

Speaking with AIM, founder and CEO Akshar Vastarpara said that Vicharak’s focus is not just on creating hardware but on redefining computing technology. 

“Our first target is to develop a GPU-like technology that can be used in mobile phones, laptops, and servers. We are approaching this in a very different way, starting with the consumer base but scaling to servers and lower-level areas as well,” Vastarpara explained.

This led to the creation of Vaaman. It is a complete packaged computing board that boasts a six-core ARM CPU and an FPGA with 112,128 logic cells. Its distinctive design allows it to tackle challenges that existing products can’t. With a 300-MBps connection between the FPGA and CPU, Vaaman is optimised for hardware acceleration and excels in parallel computing. 
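To see why that 300-MBps link matters for hardware acceleration, a rough transfer-time estimate helps; the payload sizes below are illustrative assumptions, not Vicharak figures:

```python
LINK_MBPS = 300.0  # Vaaman's stated FPGA-CPU bandwidth, in megabytes per second

def transfer_seconds(payload_mb, bandwidth_mbps=LINK_MBPS):
    # Time to stream a payload of `payload_mb` megabytes over the link,
    # ignoring protocol overhead and latency.
    return payload_mb / bandwidth_mbps

# e.g. shipping a hypothetical 4096 MB block of model weights to the FPGA:
print(round(transfer_seconds(4096), 1))  # → 13.7
```

At that rate, moving even multi-gigabyte workloads between the CPU and FPGA takes seconds, which is what makes offloading parallel compute to the FPGA practical.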

India for the World

“Our goal is not to compete directly [with NVIDIA] but to offer something unique. FPGAs are reconfigurable chips, capable of doing many things that ASIC (Application-Specific Integrated Circuit) startups can’t. We can achieve 90% efficiency compared to what they offer,” Vastarpara said confidently.

Vicharak’s products are poised to revolutionise single-board computing. “We are in the same industry as Raspberry Pi, but our boards include FPGAs alongside processors, offering a complete AI infrastructure,” he elaborated. 

Priced at $180, their boards offer a competitive edge with advanced capabilities at an affordable cost.

Moreover, Vicharak aims to make its software completely free while maintaining proprietary IP on their hardware. “Our plan is to integrate FPGAs into every kind of computer, much like CPUs and GPUs today. The software we develop will be free, but the IP for our FPGA designs will remain ours,” Vastarpara clarified.

Beyond hardware, Vicharak is also competing directly with NVIDIA’s CUDA through its focus on software. Its flagship product, Gati, exemplifies this vision. 

“Gati is our AI exploration project. We’re writing our own infrastructure on top of FPGA, creating a stack similar to what NVIDIA does with CUDA,” said Vastarpara. The goal is to enable AI inference on FPGAs, offering a flexible and powerful alternative to traditional GPUs and CPUs.

Apart from Vaaman and Gati, Vicharak has also built Axon, powered by the Rockchip RK3588S, an 8-core 64-bit SoC built on an 8 nm lithography process. The chip also integrates a 4-core GPU and a built-in NPU that provides up to 6 TOPS of performance for AI workloads. 

Future Outlook

Reflecting on his journey, Vastarpara shared, “I graduated as a software engineer in October 2016. While I could write software, I realised that my true passion lay in electronics and hardware. That’s how Vicharak was born.” 

Initially, Vastarpara focused on consultancy projects, which allowed him to bootstrap his company. “We grew a team of 30 people, and by 2022, we had shifted our focus entirely to consumer-facing products,” he added.

Vicharak is already garnering interest from various sectors. “We are set to demo our product within two months, working with government contractors for smart traffic systems and robotics startups. We are nearing the launch stage and expect to see our technology in practical use soon,” he said.

Salesforce Data Cloud Solve Indian Government’s Data Problem
https://analyticsindiamag.com/intellectual-ai-discussions/salesforce-data-cloud-solve-indian-governments-data-problem/
Sun, 23 Jun 2024 03:30:00 +0000

At the recently held World Tour Essentials event in Mumbai, Salesforce India announced its entry into the public sector domain.

The post Salesforce Data Cloud Solve Indian Government’s Data Problem appeared first on AIM.


At the recently held World Tour Essentials event in Mumbai, Salesforce, the leading CRM software provider, announced its entry into the public sector domain.

In a press conference, Salesforce India CEO and Chairperson Arundhati Bhattacharya shared the company’s desire to bring its cloud and AI solutions to the government to help them deliver better citizen services to India’s populace. 

Bhattacharya revealed that the company has formed a new division focusing on selling its solution to the Union government, the state governments, and quasi-government agencies.

Under Bhattacharya, who was the first woman to chair the State Bank of India (SBI), Salesforce has grown significantly, recording a 50% revenue jump to INR 6,000 crore in FY 2023. The company’s employee count in the country has also increased from around 2,500 to over 10,000.

Given Salesforce India’s above-average performance, venturing into the public sector seems the next logical step. Nonetheless, announcing the public sector division was the easier part. The biggest test for Salesforce remains convincing the government to leverage its solutions.  

(Salesforce press conference at the Jio World Convention Centre, Mumbai)

Government’s Appetite for Public Cloud 

In an exclusive interview with AIM at the sidelines of the World Tour Essentials event, Bhattacharya said that Salesforce’s biggest challenge is that the majority of its critical services are on the cloud. 

“We do have some on-prem solutions, such as the integration layer in MuleSoft or the graphic analytics layer in Tableau. But other than that, much of our solutions are cloud-based. We need to find out what the appetite of the government is for using the public cloud,” she said.

Major government agencies and departments are in dire need of digital transformation, and the government has shown an appetite to leverage the cloud, forming the National Cloud Initiative of India. The initiative, called MeghRaj, aims to host various government applications and services on the cloud.

Moreover, the Ministry of Electronics and Information Technology (MeitY) has provided Cloud Service Provider (CSP) empanelment to public cloud companies like AWS. Notably, Salesforce leverages AWS’ cloud infrastructure. 

Even though the company refrained from revealing much about this new division, Bhattacharya did say that it is still in its nascent stage and could take a couple of years before we see any possible outcomes. She also said that Salesforce has initiated talks with a few parties.

“The other thing, of course, is that we have to learn how to participate and respond to a request for proposals (RFPs) on time. These documents are often extensive and detailed, requiring a learning curve to understand the response process, which also involves collaborating with partners,” Bhattacharya said.

Data Cloud Could be a Seller for Salesforce 

One of the biggest hindrances for many organisations is that most of the data is in silos, and this is even true for many government organisations. 

Bringing scattered data together into a cohesive source and deriving meaningful insights remains a significant challenge. This is where Salesforce’s Data Cloud could prove to be a valuable tool when it tries to woo the government. 

“Data Cloud isn’t something you need ages and ages to build. We have customers that have actually done it in the span of just 90 days and these are not small customers. These are large-scale customers, nothing in the scale of the government, but definitely large scale. 

“When you leverage Data Cloud, it’s something where you are not really storing any data; you are merely getting intelligence from the pools of data that you already have and presenting it in real-time. That’s what it does and that’s something the government could possibly use or be interested in using,” she said.

Data Cloud is one of the fastest-growing, if not the fastest-growing, organic innovations to come out of Salesforce. In fact, the company is projected to make half a billion dollars in revenue from Data Cloud in just two years.

Moreover, it also does not matter where the data is sitting. “It’s irrelevant whether the data is stored in Snowflake, Databricks, or elsewhere. Organisations have a wealth of additional data beyond their typical data warehouse, such as client email interactions, contact centre chats, social media profiles, and ongoing online interactions,” she pointed out.

The data, however, does need to be in digital form; in some government offices, it could still be on paper. “If the data is not digital, then it becomes a challenge. If it is not digitised, we can help you digitise it and ensure that the data coming in in the future stays in a digital format.”

Salesforce Public Sector Solutions go beyond Data Cloud, offering out-of-the-box apps tailored for rapid digital service delivery in government agencies. 

Leveraging the Salesforce Einstein Platform, these solutions empower government organisations to future-proof their IT investments with citizen developer tools, AI, and automation. This approach enhances service agility and efficiency and supports digital transformation efforts. 

Building Trust in AI 

Over time, multiple thinkers in the AI space, as well as many Salesforce spokespersons, have stressed the importance of building trust in AI.

The primary concerns regarding trust in AI, for public and private sector enterprises alike, revolve around data. According to Bhattacharya, catering to customers’ data requirements and ensuring there are no mishaps is the best way to build trust in AI.

“If we commit to ensuring data privacy and confidentiality, we must uphold that promise without compromise. It is imperative that our offerings reflect this commitment.

“Moreover, the government will have its own laws and regulations as to where it wants the data, how it wants it to be treated, what data can be accessed, and what data should not be accessed. How should it be masked when at rest? How should it be encrypted when in flight, so all of that has to be very assiduously met,” she said.

AI Needs a Global Compact

Bhattacharya also opines that AI today is mostly self-regulated: some jurisdictions have regulations, while others have none at all. What is needed, she argues, is a global compact.

“All countries should agree, for the good of mankind. This is akin to the nuclear non-proliferation treaty. We must now agree on principles and actions that benefit humanity collectively,” she said.

This does not have to be a regulatory body but a collective agreement where countries agree that ‘this is what we will do and this is what we will not do with AI.’

Interestingly, this is not the first time an idea like this has been conceived. Sam Altman, the CEO of OpenAI, also shared similar thoughts. 

He urged the US to establish a global organisation similar to the International Atomic Energy Agency (IAEA), focusing specifically on AI regulation.

His comments raised some concerns in India as technology’s impact and implications are shaped by human choices, values, and societal contexts. India also has unique socio-economic challenges, such as income inequality, poverty, and access to basic services. 

Furthermore, if such a regulatory body is established, does India secure a seat at the table? If so, does it wield a significant voice in decision-making? Or will it be dominated by Western bureaucrats who will regulate the technology based on their country’s risks and concerns?

Bhattacharya believes India already has a strong voice in the global space. Also, today, the world itself is very interconnected. “You should go to Silicon Valley and see how many Indians are working there. Our chief engineer happens to be from Andhra. He’s obviously an expat and lives in the US, but he’s from India,” she pointed out.

Moreover, she believes India can’t take an approach like China’s, which bans Western technologies. What is required is a collective approach in which all parties work together for the betterment of humanity.

Sabre Uses GenAI Tools to Boost Productivity & Innovation for 800 Software Engineers
https://analyticsindiamag.com/intellectual-ai-discussions/sabre-uses-genai-tools-to-boost-productivity-innovation-for-800-software-engineers/
Thu, 20 Jun 2024 07:43:17 +0000

Apart from internally experimenting with AI, it has AI-powered models like SabreMosaic and Sabre Travel AI.

The post Sabre Uses GenAI Tools to Boost Productivity & Innovation for 800 Software Engineers appeared first on AIM.


Texas-based travel industry software and technology provider Sabre has been leveraging AI and ML for a while now. In a recent interaction with AIM, Sandeep Bhasin, VP (software engineering) at Sabre, said that the company has internally introduced generative AI tools to approximately 800 software engineers, significantly enhancing productivity and innovation. 

By leveraging generative AI, the company aims to improve the efficiency of its development processes and bring new products to market faster.

Apart from internally experimenting with AI, it recently unveiled SabreMosaic. This AI-powered modular platform shifts from the traditional PNR system to a modern offer and order approach and enables airlines to offer personalised and dynamic retail experiences. 

Traditional airline processes relied heavily on the PNR to manage traveller information, but this system has struggled to evolve with the expansion of airline product offerings. The rise of modern retailers like Amazon has shifted traveller expectations towards personalised experiences. 

“Travellers now expect the same level of personalisation from their travel providers as they have come to expect in other B2C retailing spaces,” added Bhasin.

This shift is driving airlines to adopt modern retailing platforms like SabreMosaic, which supports both PNR and Offer-Order systems. Offer-Order systems optimise and personalise travel offers, allowing airlines to sell a variety of products beyond just flight tickets. 

“What excites me is how this dual support is crucial as airlines transition from being mere suppliers of seat inventory to modern retailers of diverse travel content,” he added. 

The product is driven by the company’s suite of AI/ML-powered solutions, Sabre Travel AI, developed using Google Cloud’s Vertex AI platform. The team also uses other Google Cloud services, such as the multi-cloud data warehouse BigQuery, for its data lake.

Back in 2022, Sabre announced a 10-year partnership with Google to leverage its capabilities for new innovation. Sabre’s extensive travel data, combined with this partnership, enables it to provide effective, data-driven recommendations. 

Talking about the same, Bhasin said, “Our wealth of travel data gives us the advantage to leverage the power of data, often referred to as the new oil, to drive our analytical models.” This capability allows Sabre to offer real-time interaction and decision-making on a global scale. 

The platform’s real-time capabilities are another differentiator, making interactions faster and more responsive. 

Can AI Help in the Cancellation Crisis?

Even though the Indian aviation sector has seen remarkable growth over the past few years, with domestic air passenger traffic increasing by 13% annually to about 15.4 crore in 2023-24, major airlines like IndiGo, Vistara, Air India, Akasa Air, and SpiceJet have faced challenges in maintaining flight schedules. In the January-March 2024 period, flight cancellations and delays rose by 34%.

Given this current situation, where flight cancellations are more common than ever, AI and ML can play a crucial role in mitigating these disruptions. According to Bhasin, re-accommodation logic can manage flight disruptions more efficiently. 

As Bhasin explained, “As airlines move further towards modern retailing, it will be important that a travellers’ personalised ‘bundle’ of products is kept together as an offer so that, even if disruption happens, the passenger still receives their expected ancillaries or experiences.” 

Additionally, robust optimisation strategies and technology are essential for creating schedules and routes that are less prone to disruptions.

Aviation companies are already using AI, especially generative AI, for customer experience. For example, Air India introduced AI.g, a virtual travel assistant powered by ChatGPT on WhatsApp, and IndiGo launched an AI chatbot called 6Eskai. Both are built on OpenAI’s GPT-4. 

India as a Market

“The India centre plays a crucial role. In fact, a lot of our recent technology transformation initiatives, including our move to Google Cloud, were run out of the India centre,” commented Bhasin. This includes major initiatives like moving from mainframe systems to open systems and transitioning to cloud infrastructure.

The company has major GCCs in Krakow, Bengaluru, and Montevideo. In India, Air India is one of its largest customers, along with various travel agencies and hotels. The Bengaluru Global Capability Center (GCC) plays a crucial role in driving technological transformation. 

The Bengaluru GCC has been instrumental in these transformations, developing industry-leading solutions for the travel business and providing a strong IT backbone to global operations. 

Bhasin highlighted, “Several products from our retail intelligence suite, along with the leadership team behind it, including our MLOps practice, are also based out of our Bengaluru GCC.”

The company’s retail intelligence suite includes AI and ML-powered products like Air Price IQ and Ancillary IQ, which offer personalised and optimised air prices and ancillary services. For example, Air Serbia and Chile-based LATAM air services have co-innovated with Sabre to bring these products to market, resulting in substantial revenue increases. 

With Air Price IQ, customers can expect a flight revenue increase of up to 3%; with Ancillary IQ, up to 10% of overall ancillary revenue.

The company is currently in talks with these customers to implement Sabre Travel AI solutions. 

“India is one of the most dynamic, complex, and fast-growing markets in the world right now, and it’s one of our key markets for 2024 and beyond. And our team in India is critical to our tech transformation and innovation efforts,” concluded Bhasin. 

‘I See No Sign that Large Language Models are Sentient. They are not Creative,’ Says GitHub CEO
https://analyticsindiamag.com/intellectual-ai-discussions/i-see-no-sign-that-large-language-models-are-sentient-they-are-not-creative-says-github-ceo/
Sat, 15 Jun 2024 03:30:00 +0000

Everybody has a different understanding of what AGI even means and what the ‘G’ in AGI really stands for.

The post ‘I See No Sign that Large Language Models are Sentient. They are not Creative,’ Says GitHub CEO appeared first on AIM.


GitHub CEO Thomas Dohmke is a busy man, not only wandering the streets of Bengaluru reminiscing about old memories, but also captivating developers and enterprises alike in the city.

The company, which recently announced the technical preview of its newest product, Copilot Workspace, showed live demos of the tool to developers in India.

Developers can interact with Copilot Workspace through chat as well as with a voice interface in multiple languages like Hindi, Kannada, and other Indian languages. 


However, not everyone has access to this newest tool from GitHub. When AIM asked Dohmke if GitHub could announce the general availability of Copilot Workspace during its flagship event, GitHub Universe 2024, taking place in October, he said: no promises.

But when we posed the same question to Sharryn Napier, GitHub’s vice president for Asia Pacific, India, Japan and China, who accompanied Dohmke to India, her response was a little more optimistic. “Hopefully,” she said. 

(Dohmke on Brigade Road, Bengaluru, in 2008 and 2024)

GitHub is Wooing Customers in India 

During his visit, Dohmke visited Infosys’ campus and launched the first GitHub Centre of Excellence in Bengaluru. At Universe 2023, held in San Francisco, Dohmke told AIM that Infosys was one of the early adopters of GitHub.

Moreover, Dohmke was also hobnobbing with executives from Paytm and Indian IT companies such as Tech Mahindra and HCLTech.

Some other notable customers for GitHub Copilot Enterprise include LambdaTest, MakeMyTrip, Swiggy, Cognizant, Air India, Glance, and the Indian government-owned public portal Government e-Marketplace (GeM).

Given India’s ever-evolving tech landscape and the growing number of developers, it makes sense for GitHub to position India as an important market. 

According to GitHub, over 15.4 million developers in India are building on GitHub, growing 33% year over year (YoY). Moreover, the Microsoft-owned company also revealed that India is set to overtake the US in terms of the number of developers by 2027. 


Another interesting piece of data from that report is that Indian developers are the second-biggest contributors to generative AI projects on GitHub after the US.

Hence, it makes perfect sense for GitHub to become aggressive with sales activities in the country. Interestingly, GitHub also strongly positions itself in other Asian markets, such as Singapore. 

(Dohmke at GitHub Constellation 2024) 

Defining the ‘G’ in AGI 

Given that GitHub is positioning AI as critical throughout the software development lifecycle, we asked Dohmke what his thoughts were on artificial general intelligence (AGI).

“Today I see no sign that machine learning models or Large Language Models (LLMs) have sentience. They are not creative. They are machines created by us that help us with the things that we want to do or don’t want to do,” he told AIM in an exclusive interaction.

However, this does not mean AI today can’t be transformative. Taking the example of code scanning, Dohmke said, “We shipped code scanning in public preview earlier this year that helps developers to fix their security vulnerabilities. So I think we are more focused on what we can do today with AI and how we can help developers write better code without worrying much about the future.”

Nonetheless, Dohmke does make a very good point about AGI. He said everybody has a different understanding of what AGI even means and what the ‘G’ in AGI really stands for.

What Dohmke is getting at is how we would even know when an AI system becomes conscious. Today, LLMs have absorbed a breadth of knowledge unattainable by any single human. Yet, does this superiority over human capabilities make them inherently better, or even sentient?

Copilot Workspace vs Devin

GitHub’s attempt to woo developers and enterprises in India also makes sense, given that many alternatives to GitHub Copilot have emerged recently. 

Most notably, Cognition Labs’ Devin, which the startup positioned as the world’s first ‘AI engineer’, is gaining prominence. Other tools, such as StarCoder 2 and Amazon Q Developer (previously CodeWhisperer), are also on the rise.

Previously, we asked GitHub how Copilot Workspace differs from Devin. Even though they are designed to solve similar problems, there are some fundamental differences, said Jonathan Carter, head of GitHub Next, in an earlier interaction before the technical preview announcement of Copilot Workspace.

“We don’t view GitHub Copilot Workspace as an ‘AI engineer’; we view it as an AI assistant to help developers be more productive and happier,” Carter said.

At GitHub Galaxy, we asked Dohmke his thoughts on Devin, and he gave a very CEO-esque answer. “First of all, competition is great. You don’t want to play a sport without any competitor, so we embrace competition.”

But he did share his thoughts on what makes Copilot different from Devin. “The difference is the approach. While others [Devin] have taken an autonomous-first approach, Copilot Workspace helps developers through the software development life cycle. I think both of these approaches are valid, and they’re probably just on different timescales when they’re leading to success,” Dohmke said.

AI is Not Taking Over Jobs, but Redefining Job Roles

At Galaxy, most of the questions directed towards Dohmke were concerned with AI’s impact on jobs. Dohmke’s response to those questions was reassuring.

“In many ways, this new age of AI has actually created more demand for developers, because now somebody also has to build all the AI systems,” he said.

The recent boom in AI is creating newer AI companies in Silicon Valley and, likewise, in India. In the last year or so, many generative AI companies have mushroomed in India. 

“Humanity has so many problems to solve and there is so much work to be done. We are not going to run out of work.”

However, many experts AIM has spoken to are of the opinion that AI will be able to do what an entry-level coder is required to do in an organisation. Dohmke, however, had a different opinion.

“When you embark on a new job, whether fresh out of college or transitioning from another company, the primary challenge is understanding the company’s operations and code bases, which could be thousands of files.”

A tool like Copilot could be very useful for entry-level coders. Moreover, Dohmke believes that, going forward, applicants will be expected to be adept at leveraging AI tools like Copilot and ChatGPT.

“Some software companies have even begun incorporating Copilot into their interview processes, replacing traditional coding exercises with tasks that assess applicants’ ability to utilise these tools effectively. 

“This shift underscores the evolving skill sets required in the workforce, emphasising adaptability and proficiency in leveraging AI technologies over rote memorisation of coding syntax.”

Moreover, Dohmke also advocates for teaching students in colleges the integration of AI coding assistants.

“I believe graduates will emerge from university with a much stronger skill set. We’ve already observed this trend in colleges transitioning from teaching Python to Rust, a more intricate programming language. This shift is driven by the substantial opportunities it presents for developers, particularly those proficient in Python. I anticipate witnessing more such transitions in educational institutions.”

Insecure Code?

According to GitHub, coders using AI are 55% faster than those who don’t use any AI coding assistant. GitHub Copilot has also become the most widely used AI tool among developers.

However, concerns have been raised about the quality of the code generated by these tools. Findings from the code-reviewing tool GitClear indicate that the proliferation of AI coding assistants in 2022 and 2023 coincided with an increase in the frequency of code requiring correction within two weeks of its creation.

Dohmke shared an anecdote to respond to this. “I spoke with a CEO who was coding with his son. They needed to convert binary to decimal and initially wrote 150 lines of code. Upon consulting Copilot, it condensed the solution to just two lines by leveraging an open-source library.”

He pointed out that this demonstrates increased efficiency in code management, a crucial aspect for companies. “Moreover, if you don’t want to use that open-source library, it still gives you code without the open-source library.”
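The article doesn’t say which language or library the CEO and his son were using, but the spirit of the anecdote — a hand-rolled 150-line routine collapsing into a call to code that already exists — can be sketched in Python, where the standard library parses binary strings directly:

```python
# A minimal sketch of the anecdote: the hand-rolled converter condenses
# into a call to existing functionality. Python's built-in int() accepts
# a base argument, so the "two lines" are literally:
def binary_to_decimal(bits: str) -> int:
    return int(bits, 2)  # base-2 string -> integer, e.g. "1011" -> 11
```

The point is less about this particular built-in and more about what Copilot surfaced: existing, well-tested code the pair didn’t know was available.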

The trick, according to Dohmke, is in prompting. 

“While the best programmers may outperform the model, the focus in larger companies is on raising the average skill level. Studies also show improvements in pull request and merge rates, indicating enhanced developer focus and quality during the development process,” he added.

The post ‘I See No Sign that Large Language Models are Sentient. They are not Creative,’ Says GitHub CEO appeared first on AIM.

]]>
TransUnion Bolsters Global Team with 55% Indian Tech Talent  https://analyticsindiamag.com/intellectual-ai-discussions/transunion-bolsters-global-team-with-55-indian-tech-talent/ Thu, 13 Jun 2024 12:36:45 +0000 https://analyticsindiamag.com/?p=10123586

The financial giant has over 25% of its workforce based in India.

The post TransUnion Bolsters Global Team with 55% Indian Tech Talent  appeared first on AIM.

]]>

In 2018, Chicago-based consumer credit reporting agency TransUnion, which has been in business for over five decades, opened its first global capability centre (GCC) in Chennai. Since then, the company has expanded its footprint in other Indian cities including Bengaluru, Pune, and Hyderabad.

“The success of our Chennai centre paved the way for us to expand our GCC network to six centres across three continents,” Debasis Panda, SVP and head, TransUnion GCCs (India, South Africa and Costa Rica), told AIM in a recent interaction. 

The India GCC is the largest, making up over a quarter of the company’s workforce.

The India centres house 55% of TransUnion’s technology talent, 56% of operations personnel, and 39% of analytics experts. 

This network, spanning India, South Africa, and Costa Rica, now employs over 4,000 associates and has become integral to TransUnion’s global operations. 

“These centres in India leverage local talent to support and enhance TransUnion’s core capabilities, including technology support, data analytics, business process management, and contact centre operations,” Panda explained. 

This strategic location allows TransUnion to expand its time zone and language coverage, facilitating global operations and contributing to economic growth in the regions it operates in.

India Team – Global Hub

India is the hub of TransUnion’s strategy, contributing to its global operations. The India GCCs operate as a microcosm of the enterprise, embedding almost every global function within their operations. 

These centres provide specialist capabilities across various domains, including product and platform solutions, data science and analytics, system architecture, intelligent automation, and business process management.

“The GCC India plays a pivotal role in TransUnion’s mission to migrate products and services to the cloud, ultimately transforming our operations by enabling streamlined product delivery and faster innovation,” said Panda. 

This shift has streamlined product delivery and enabled faster innovation. The India team’s contributions have been instrumental in several modernisation initiatives.

Recently, the team’s work on the OneTru platform, which leverages TransUnion’s data assets, cloud infrastructure, and AI capabilities, has significantly enhanced the company’s ability to deliver comprehensive and compliant consumer insights.

Another achievement of the team is revamping TransUnion’s solution enablement platform. This platform integrates separate data and analytic assets designed for credit risk, marketing, and fraud prevention into a unified environment. 

The India team supports internal infrastructure and development platforms. It allows consistent and secure development, deployment, and management of enterprise applications in a hybrid cloud environment.  

Leveraging Talent and Expertise

“India’s strong digital skills and infrastructure support a compelling growth and expansion opportunity for us,” explained Panda. 

The Indian talent offers a full stack of capabilities, including technology solutions, data science, and business process management. “This expertise allows TransUnion to provide leading-edge capabilities to our clients and colleagues worldwide,” the spokesperson said.

However, due to a limited talent pool and intense competition in specialised areas like AI and ML, the company has introduced targeted initiatives to bolster hiring and retain top talent.

For example, it has implemented a unique talent-focused operating model and a compelling Employee Value Proposition. This includes specialised training programs and a culture of continuous learning. The company also offers programs that support job mobility within the organisation, aiding career growth.

“Our unique operating model and continuous learning culture help us attract and retain top talent,” said Panda.

It also ensures a competitive advantage in attracting and retaining talent through several initiatives. For example, the ‘University Graduate Program’ develops the employment readiness of graduates, while the ‘TU Connect’ program offers comprehensive learning and problem-solving initiatives. 

Employee engagement, health and wellness programs, and CSR initiatives enhance employee satisfaction and retention. 

Additionally, a transparent internal career mobility framework promotes internal career opportunities, enabling 25% of associates to advance internally each year. These efforts are supported by a strong Employee Value Proposition, competitive benefits, and a flexible work environment.

Similarly, another core principle of the GCC in India is a focus on diversity, equity, inclusion, and belonging (DEIB). “Our focused efforts have resulted in a 10% increase in diversity ratio since 2018, with our current gender diversity ratio at 31%,” Panda highlighted.

AIM Media House is hosting its flagship GCC summit, MachineCon, in Bengaluru, on June 28, 2024. The event will feature over 100 GCC leaders, including Balaji Narasimhan, head of operations and GCC site leader at TransUnion. 

Don’t miss this opportunity—get your passes today!

The post TransUnion Bolsters Global Team with 55% Indian Tech Talent  appeared first on AIM.

]]>
Snowflake Looks to Upskill Developers in India’s Rural Towns https://analyticsindiamag.com/intellectual-ai-discussions/snowflake-looks-to-upskill-developers-in-indias-rural-towns/ Wed, 12 Jun 2024 11:21:18 +0000 https://analyticsindiamag.com/?p=10123378

"We believe it's part of our charter to educate the market," said Vijayant Rai.

The post Snowflake Looks to Upskill Developers in India’s Rural Towns appeared first on AIM.

]]>

Snowflake is making significant strides in India, with a vision that transcends traditional go-to-market strategies. AIM caught up with Vijayant Rai, the managing director of Snowflake India, at Snowflake’s Data Cloud Summit 2024. Rai elaborated on the company’s multifaceted approach to establishing a robust presence in the country.

“We’re looking at India in a multi-dimensional way,” Rai said, underscoring the company’s diverse operations. This includes a significant presence in Pune, where a team of 500 professionals handles operations and support. Additionally, Snowflake is leveraging India as a hub for global customers through its Global Capability Centers (GCCs). 

“It’s not just about go-to-market; it’s also about what we can do for global customers from India,” he explained. Snowflake aims to be an integral part of this transformation by partnering with enterprises and SMBs to drive innovation.

The company’s engagement with India is driven by the country’s rapid economic growth and digital transformation. “India is the fastest growing global economy, with a growth rate of over 7%,” he said. Rai noted that the widespread adoption of digital public goods like the UPI framework has positioned India as a data-driven economy. 

The AI in India Approach

Snowflake’s approach to AI is pragmatic, focusing on laying strong data foundations. “There’s no AI strategy without a data strategy,” Rai asserted. The company is helping enterprises break data silos and establish robust data governance frameworks. This is essential for leveraging advanced technologies like generative AI, which Rai believes will revolutionise how businesses operate.

Snowflake is launching boardroom-level workshops to support this transformation and educate senior management on devising effective data strategies. “We believe it’s part of our charter to educate the market,” Rai said. These workshops are designed to ensure that enterprises can maximise the potential of AI and other emerging technologies.

Snowflake is also addressing the demand for skilled developers by offering extensive training and certification programs. These initiatives extend beyond major cities to Tier-2 and Tier-3 locations, even small villages in India, reflecting Snowflake’s commitment to democratise access to AI. 

“India is front and centre in our strategy,” Rai affirmed, highlighting Snowflake’s dedication to making a meaningful impact in the country.

The Tech Talent Prowess

One of Snowflake’s key initiatives is to leverage India’s talent and technological prowess. “We see India not just as a support hub but as a centre for innovation, especially in AI and data-driven technologies,” Rai stated. 

He highlighted the significant role of GCCs, with over 600,000 tech professionals in India driving data innovation. Snowflake is committed to supporting these centres to scale and enhance their capabilities.

The company is also focused on nurturing the developer community in India. “We’re investing heavily in skilling and ensuring that people are exposed to the Snowflake platform and various aspects of data and AI,” Rai said. This includes initiatives like language support in their large language model, which accommodates all major Indian languages. 

Teams from Pune and other locations are already contributing to Snowflake’s global projects, and this collaboration is set to deepen over time.

In terms of market strategy, Rai emphasised the importance of understanding local business cultures and nuances. “We have experienced teams in Delhi, Bengaluru, and Mumbai who have worked in various verticals and understand the unique needs of different industries,” he said. 

This local expertise is crucial in navigating the fast-paced technological changes that Indian enterprises are embracing.

Snowflake’s Shift to Generative AI Paying Off

Snowflake has been on an acquisition spree and much of its focus is on expanding its generative AI capabilities. Ever since Sridhar Ramaswamy joined as the CEO of Snowflake after the acquisition of Neeva, generative AI has been one of its biggest focuses. 

In an exclusive interview with AIM, Snowflake head of AI Baris Gultekin said that he had worked with Ramaswamy for over 20 years at Google, and called him an incredible leader. “Sridhar brings incredible depth in AI as well as data systems. He has managed super large-scale data systems and AI systems at Google,” Gultekin said.

In addition, Microsoft announced an expanded partnership with Snowflake, aiming to deliver a seamless data experience for customers. As part of this, Microsoft Fabric’s OneLake will now support Apache Iceberg and facilitate bi-directional data access between Snowflake and Fabric.

Moreover, in a recent interview, Ramaswamy revealed that the cloud data company plans to deepen its collaboration with AI powerhouse NVIDIA. “We collaborated with NVIDIA on a number of fronts – our foundation model Arctic was, unsurprisingly, done on top of NVIDIA chips. There’s a lot to come, and Jensen’s, of course, a visionary when it comes to AI,” Ramaswamy said.

The post Snowflake Looks to Upskill Developers in India’s Rural Towns appeared first on AIM.

]]>
AWS Trains Over 5.5 Mn in AI and Cloud Skills in India  https://analyticsindiamag.com/intellectual-ai-discussions/aws-trains-over-5-mn-in-ai-and-cloud-skills-in-india/ Tue, 11 Jun 2024 11:50:20 +0000 https://analyticsindiamag.com/?p=10123273

“Keeping responsible AI as our goal, we work backwards from customer needs and provide what they require.”

The post AWS Trains Over 5.5 Mn in AI and Cloud Skills in India  appeared first on AIM.

]]>

Since 2017, AWS, the cloud subsidiary of Amazon, has trained over 5.5 million people in India on AI and cloud skills and over 8.3 million across the APAC and Japan region.

With generative AI moving from proof of concepts to deployment this year, the company plans to train about two million individuals in AI globally by 2025 through its “AI-Ready” initiative.


“We want to democratise and simplify generative AI so that customers can innovate and scale through AWS services. Keeping responsible AI as our goal, we work backwards from customer needs and provide what they require,” Guru Bala, head of solutions architecture, AWS specialised services, told AIM at AWS Summit Bengaluru, last month. 

Through a mix of free and paid courses on platforms like Coursera, individuals can enhance their skills in generative AI. Competency partners like Shellkode also play a crucial role in developing various generative AI services to meet the diverse needs of customers.

“Our approach to responsible AI involves defining, measuring, and mitigating risks at every stage, from model training to deployment,” explained Bala. “We want to make sure that AI technology is used ethically and responsibly.”

Productivity Enhancement with AWS AI 

A recent EY report shows that generative AI could potentially boost India’s GDP by an estimated $359 to $438 billion by 2030.

Organisations across India are recognising substantial gains in productivity and improved customer experiences by integrating generative AI into their products and solutions. 

Bala noted, “As a result, boardrooms here (India) are increasingly asking, ‘What is the business impact in terms of productivity, customer experience, or time to market?’” 

For example, Amazon CodeWhisperer, the AI coding companion now part of Amazon Q, enables developers to work more productively by providing AI-powered code suggestions in real time across 15 programming languages. In some cases, developers were 27% more likely to complete tasks successfully and did so an average of 57% faster.

Bala emphasised the importance of understanding business problems and leveraging data as the biggest differentiator. 

“One thing that companies should really understand is their business problems. The data that they have got is their biggest differentiator, and they should learn how to apply generative AI on their data to solve business problems,” he said.

Amazon Bedrock Now in India

At the summit, AWS announced that Amazon Bedrock, a fully managed generative AI service, is now generally available in the AWS Asia Pacific (Mumbai) Region. 

Bedrock provides the choice of top-notch models from both Amazon’s own first-party offerings like Amazon Titan and various third-party models. This includes families of foundational models from AI-focused companies like Meta, AI21 Labs, Anthropic, Cohere, Mistral, Stability AI, and more. 

Bala explained that the launch in India enhances generative AI capabilities by improving processing and response times due to its proximity to users. It also brings more flexibility and choice of models without getting bogged down by the complexities of deployment and inference.
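As a sketch of what calling a Bedrock-hosted model from the Mumbai region looks like with boto3 — the model ID, prompt, and token limit below are illustrative assumptions, not details from the article:

```python
import json

# Illustrative assumptions: region is AWS Asia Pacific (Mumbai); the model
# ID is one of the Anthropic models Bedrock offers.
REGION = "ap-south-1"
MODEL_ID = "anthropic.claude-3-haiku-20240307-v1:0"

def build_request(prompt: str, max_tokens: int = 256) -> str:
    """Build a Bedrock request body in the Anthropic messages format."""
    return json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    })

def invoke(prompt: str) -> str:
    """Send the prompt to Bedrock. Requires AWS credentials and network access."""
    import boto3  # third-party; only needed for the actual call
    client = boto3.client("bedrock-runtime", region_name=REGION)
    resp = client.invoke_model(modelId=MODEL_ID, body=build_request(prompt))
    return json.loads(resp["body"].read())["content"][0]["text"]
```

Running in `ap-south-1` keeps both inference and data within the India region, which is the compliance point Bala raises below for regulated industries.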

Initially launched globally in select regions in 2023, Bedrock’s expansion to Mumbai enables customers, including those in the public sector and regulated industries, to innovate with generative AI while maintaining control over application execution and data storage. 

“Plus, we have several customers in regulated industries, such as life insurance, who require services to be available within India for regulatory compliance. By making these services available in the India region, we ensure that these customers can leverage Bedrock while adhering to local regulations,” he added. 

However, adopting generative AI comes with several challenges. AWS approached these by first helping businesses define their objectives. 

Instead of offering generative AI as a one-size-fits-all solution, it has worked backwards from customer needs, determining whether traditional AI or generative AI would best solve their problems. 

Future in India

According to Bala, India’s teams play a crucial role in AWS’s global operations. The expertise of Indian engineers and developers is integral to services like Amazon Bedrock, driving the company’s innovation, both in the country and globally. 

Looking ahead, AWS’s vision is clear and focused. It aims to continuously innovate in generative AI, introducing new models and variants to make the technology more accessible and easier to use. 

The post AWS Trains Over 5.5 Mn in AI and Cloud Skills in India  appeared first on AIM.

]]>
The Missing Link for Indian Language Chatbots: Indic Data  https://analyticsindiamag.com/intellectual-ai-discussions/the-missing-link-for-indian-language-chatbots-indic-data/ Mon, 10 Jun 2024 07:35:02 +0000 https://analyticsindiamag.com/?p=10122984

“You will see many claiming that they can make a chatbot or LLM for Indian languages; 99% of those are transient,” said Raj Dabre.

The post The Missing Link for Indian Language Chatbots: Indic Data  appeared first on AIM.

]]>

In recent times, there has been a noticeable upswing in the efforts to build Indic language models. And even though some of these models are adequate for various tasks, their adoption remains abysmally low compared to their ‘superior’ English counterparts. A huge challenge here is the availability of Indic languages datasets.

In a conversation with AIM, Raj Dabre, a prominent researcher at NICT in Kyoto, adjunct faculty at IIT Madras and a visiting professor at IIT Bombay, discussed the complexities of developing chatbots for Indian languages.

“These models [GPT-3] have seen close to tens of trillions of tokens or words in English. Unless you have seen the entirety of the web, or more or less all of it, none of these models will be able to actually solve the generative AI problem for that [Indian] language,” said Dabre. 

The crux of the issue lies in the sheer lack of digitised data in Indian languages. Since English holds a major grip over the internet, the digitised content available for Indian languages remains vastly insufficient.

Bridging the Gap with RomanSetu

Dabre rued that chatbots for Indian languages are still a dream. “You will see a lot of people claiming that they can make a chatbot or LLM for Indian languages, but 99% of those things are transient. They are not going to be too useful in production, because nobody has solved the data problem yet,” said Dabre.

He explained that unless a monolingual dataset is created in Indian languages, which is at parity with the scale of English datasets, we won’t be able to build chatbots that answer in Indic languages without faltering in some way. “However, there are other strategies to transfer the capabilities of English to Indian languages. If someone figured that out, the data problem might be half solved,” said Dabre.

To tackle the issue, Dabre, with researchers from AI4Bharat, IIITDM Kancheepuram, A*STAR Singapore, Flipkart, Microsoft India, and IIT Madras, introduced the RomanSetu paper, which explains a technique that unlocks multilingual capabilities of LLMs via Romanisation.

If you type something in Hindi but in the Roman script, current AI models like Llama and others are able to process it to some degree. That is what Dabre and his team are working on—to train models in Romanised versions of Indic data, leverage the knowledge in the English language, and transfer it to Indic languages.

The team found through the RomanSetu paper that this approach actually works better than training models on native scripts like Devanagari. “It is like a shortcut,” said Dabre. “We cannot properly pursue the goal of building the next big LLM for Indic languages unless we solve the data problem.”
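As a toy illustration of the idea behind Romanisation — not the actual RomanSetu pipeline, which uses established transliteration schemes and handles conjuncts, matras, and schwa deletion — a greedy lookup-table transliterator might look like this (the mapping table is a tiny hypothetical fragment):

```python
# Illustrative only: map Devanagari text into the Roman script that an
# English-centric LLM has seen far more of during pretraining.
ROMAN = {
    "न": "na", "म": "ma",
    "स्": "s",   # स + virama: bare consonant, no inherent vowel
    "स": "sa",
    "ते": "te",  # त + vowel sign े
    "त": "ta",
}

def romanise(text: str) -> str:
    out, i = [], 0
    while i < len(text):
        # Greedily try two-codepoint clusters (consonant + combining sign) first.
        if text[i:i + 2] in ROMAN:
            out.append(ROMAN[text[i:i + 2]])
            i += 2
        elif text[i] in ROMAN:
            out.append(ROMAN[text[i]])
            i += 1
        else:
            out.append(text[i])  # pass unknown characters through unchanged
            i += 1
    return "".join(out)
```

With this table, `romanise("नमस्ते")` yields `"namaste"` — text an English-heavy model can tokenise and relate to its existing knowledge far more readily than the Devanagari original.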

Additionally, Dabre is currently working on speech translation models, which have a huge demand in India. He is also one of the creators of IndicBART and IndicTrans2 models at AI4Bharat, founded by his seniors at IIT Bombay. 

The team created the IndicNLG Benchmark around the time GPT-3 was launched, when there was not much conversation around Indic language generation. Now that ChatGPT is here, everyone is building chatbots.

Dabre’s journey with NLP began over a decade ago at IIT Bombay under the guidance of Pushpak Bhattacharya, the former president of the Association of Computational Linguistics and a leading figure in the field of Indic NLP.

He later did his PhD at Kyoto University, under the guidance of Sadao Kurohashi, the director of the National Institute of Informatics, Japan, and a leading figure in Japanese NLP. 

Training Big Indic Models Can Be a Waste of Compute

Giving the example of Sangraha dataset by AI4Bharat, which has 251 billion tokens of data in 22 languages, and citing Microsoft’s Phi models, Dabre said that given the scale of data, it is more appropriate for India to build a 1 billion parameter model instead of a larger one. 

“A 1 billion parameter model might just do a decent job,” he added, saying that there is not enough data to train a much bigger model, and called doing so an injudicious use of compute. However, once data scales improve, models should also scale.

Another paper by Dabre helps researchers train models with synthetic data. The paper, Do Not Worry if You Do Not Have Data, highlights that translating English documents into Indian languages causes no harm and can yield roughly the same amount of knowledge and performance. “You can get by, but it is not an ideal solution; the data needs to be cleaned,” said Dabre.

He concluded that unless these models have an Indian context, they can’t work very well with Indian languages. 

The post The Missing Link for Indian Language Chatbots: Indic Data  appeared first on AIM.

]]>
Ascendion Elevates Chennai as Global Hub for GenAI Innovation https://analyticsindiamag.com/ai-origins-evolution/ascendion-elevates-chennai-as-global-hub-for-genai-innovation/ Fri, 07 Jun 2024 11:30:00 +0000 https://analyticsindiamag.com/?p=10122833

The company is hiring 300 freshers this year. More than half of them will be trained in AI in its Chennai facility in the next couple of months.

The post Ascendion Elevates Chennai as Global Hub for GenAI Innovation appeared first on AIM.

]]>

Ascendion has inaugurated its GenAI Studio in Chennai, bringing the city to the forefront of generative AI innovation for global clients. The new facility will leverage Chennai’s rich talent pool and favourable business environment to drive significant advancements in AI.

The Chennai hub is spearheaded by Prakash Balasubramanian, executive vice president of engineering management and head of India engineering operations. 

In an interview with AIM, Balasubramanian elaborated on why Chennai was selected as the hub for Ascendion’s AI efforts. “We started with Chennai because we have extremely talented folks here, who possess deep engineering and mathematics knowledge.” 

Additionally, key AI and data leaders within the company are already based in Chennai, making it a logical starting point. On a personal note, he added, “I’m from Chennai, and the CEO is from Chennai, so we thought we’d give something back to the city where we’ve all started.”

CEO Karthik Krishnamurthy said, “Our new AI studio in Chennai is filled with expert talent, hands-on technology, and inspiration, all designed to excite, provoke, and generate applied GenAI solutions that will drive business forward and positively impact lives all over the world.”

Balasubramanian said that the company plans to expand to more cities globally and is currently aiming for five more AI studios. The Chennai studio boasts a 3D-printed, LLM-powered robot called Diva, which interacts with visitors at the front desk.

The Vision for India

The AI studio in Chennai is envisioned as the hub for Ascendion’s global AI innovations. Balasubramanian outlined the studio’s primary objectives, one of which is engineering platform development: the studio will be the birthplace of Ascendion’s engineering platform, AVA+, designed to simplify software engineering.

“All the AI capabilities we are building on this platform will originate from Chennai,” he noted. Critical AI programs will be managed and driven from Chennai. “We are already doing this, but we will continue to drive a lot of the critical programs around AI from here,” Balasubramanian stated.

A key component of Ascendion’s strategy is upskilling its workforce to adeptly handle generative AI. 

“We want our engineers to be very good consumers of GenAI, leveraging it to enhance their day-to-day work. We also aim to transform many of our engineers into creators of GenAI, developing LLMs and solving complex business problems,” Balasubramanian explained. 

The GenAI studio will also serve as a training ground for new talent. “We are hiring 300 freshers this year, with 150 to 160 of them being trained on AI in Chennai in the next couple of months.”

Along with all this, it will also act as a collaboration hub where clients can bring their problem statements, brainstorm solutions, and leave with a minimum viable product. “It will serve as a client Innovation Centre where we can model solutions and scale them subsequently.”

Michelangelo of AI

Central to Ascendion’s AI strategy is the AVA+ platform, particularly the Michelangelo studio, which dramatically shortens the time from concept to a functional software application. Balasubramanian described Michelangelo’s capabilities: “In a matter of a couple of hours, you can have a fully functional application from the time you started with just an idea.”

The company has integrated Michelangelo into its own operations and does not offer it as a product to customers.

In a demo with AIM, Balasubramanian showed how Michelangelo allows engineers to upload simple sketches or wireframes and automatically generate functional code, significantly reducing the traditional iterative process that can take weeks. “We believe this can disrupt the way engineering happens, making the process faster, better, and more efficient,” he asserted.

When asked about competition, Balasubramanian emphasised the tangible impact Ascendion’s solutions have made. “We are driving productivity improvements of 30 to 40% on an average, and our clients are seeing significant business benefits,” he said. 

Ascendion’s proactive approach to integrating AI into all aspects of software engineering sets it apart in a rapidly evolving market.

Ascendion is committed to staying ahead of the curve in AI innovation. “The field of AI is very hot, and we are going to see more and more innovation in the coming days. We are prepared to disrupt the traditional ways of working and lead the charge in AI-driven engineering,” Balasubramanian concluded.

Along similar lines, Accenture and Tech Mahindra also unveiled their generative AI studios last year to boost their employees’ confidence in generative AI and upskill them.

The post Ascendion Elevates Chennai as Global Hub for GenAI Innovation appeared first on AIM.

]]>