AI Startups Stories, Breakthroughs and Journeys of CEOs
https://analyticsindiamag.com/industry-insights/ai-startups/

This Bengaluru-based AI Startup Knows How to Make Your Videos Viral
https://analyticsindiamag.com/ai-breakthroughs/this-bengaluru-based-ai-startup-knows-how-to-make-your-videos-viral/
Fri, 30 Aug 2024 10:30:00 +0000

The team has built a metric called "virality score", which is derived from a dataset of 100,000 social media videos.

The post This Bengaluru-based AI Startup Knows How to Make Your Videos Viral appeared first on AIM.


Editing videos can be tedious and time-consuming, often taking editors hours, even days, to get footage ready for release. Moreover, hiring a team of video editors is not something every content creator or small company wants to invest in. This is where vidyo.ai comes into the picture.

Launched two years ago by Vedant Maheshwari and Kushagra Pandya, the platform has experienced remarkable growth, scaling from zero to approximately 300,000 monthly active users, and achieved a revenue milestone of $2 million. Notably, a significant portion of vidyo.ai’s revenue, about 85%, comes from the US market.

Most recently, the company was part of the Google for Startups Accelerator programme. The company hasn’t raised any funding since its seed round of $1.1 million in 2022.

The team has made significant strides in addressing one of the industry’s most persistent challenges: video editing.

Maheshwari’s journey into video content and social media began over eight years ago, when he collaborated with creators and influencers to refine their content strategies on platforms like YouTube, TikTok, and Instagram.

It was during this period that Maheshwari identified a major pain point: the time-consuming and complex nature of video editing.

This insight led to the creation of vidyo.ai, a platform designed to streamline the video editing process. The vision was to leverage AI to handle 80-90% of the editing, leaving users with the flexibility to make final adjustments before sharing their content on social media.

The platform caters to a diverse user base, including content creators, podcasters, and businesses seeking to generate short-form content with minimal effort. “We essentially enable them to let the AI edit their videos, and then they can publish directly to all social media platforms using our software,” Maheshwari added.

How vidyo.ai Works

vidyo.ai combines OpenAI’s models with proprietary algorithms to transform raw video footage into polished content. Users upload their videos to the platform, which then processes the content through a series of OpenAI prompts and proprietary data. This includes analysing what kind of videos perform well online, identifying potential hooks, and determining effective calls-to-action (CTAs).

“We run the video through multiple pipelines, identifying key hooks and combining them to create a final video. Our algorithms then score these videos based on their potential virality,” Maheshwari elaborated on the process. This “virality score” is derived from a dataset of 100,000 social media videos, allowing the platform to suggest the most promising clips for engagement.
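Maheshwari does not disclose the scoring model itself, but the idea of ranking candidate clips can be sketched in a few lines. The features and weights below are invented for illustration and are not vidyo.ai's actual algorithm:

```python
from dataclasses import dataclass

@dataclass
class Clip:
    has_hook: bool         # opens with an attention-grabbing moment
    has_cta: bool          # ends with a call-to-action
    duration_s: float      # shorter clips tend to travel further
    avg_engagement: float  # engagement of similar past clips, 0..1

def virality_score(clip: Clip) -> float:
    """Toy weighted score in [0, 1]; weights are invented for illustration."""
    score = 0.4 * clip.avg_engagement
    score += 0.25 if clip.has_hook else 0.0
    score += 0.15 if clip.has_cta else 0.0
    # Favour clips under 60 seconds, linearly penalising longer ones.
    score += 0.2 * max(0.0, 1.0 - clip.duration_s / 60.0)
    return round(min(score, 1.0), 3)

# Rank candidate clips and surface the most promising one.
candidates = [Clip(True, True, 30.0, 0.8), Clip(False, False, 90.0, 0.3)]
best = max(candidates, key=virality_score)
```

A production system would learn such weights from labelled engagement data, such as the 100,000-video dataset the team describes, rather than hand-tune them.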

Compared to other video editing tools like GoPro’s Quik app and Magisto, vidyo.ai distinguishes itself with frame-by-frame analysis of both video and audio content. Unlike these platforms, which often edit videos based on mood or music, vidyo.ai dives deeper into the content to optimise for social media performance.

“We do a comprehensive analysis of the content, ensuring that every aspect is optimised for virality,” Maheshwari said. This level of detail, combined with the ability to publish directly across multiple platforms, provides users with a unique advantage.

Challenges and Opportunities

Despite its success, vidyo.ai faces challenges common to Indian startups, particularly in securing funding. Maheshwari noted that while Indian VCs are cautious about investing in AI, preferring application-layer solutions over foundational work, US VCs often have a more aggressive approach.

“We’ve gone from zero to $2Mn in ARR in less than two years, which is remarkable. However, raising subsequent rounds of funding in India remains challenging due to a lack of clarity on how AI investments will pay off,” Maheshwari explained, adding that US VCs would be ready to invest on the strength of this metric alone.

He also reflected on the possibility of starting the company in the US instead of India, citing potential benefits in terms of ease of operations and investor interest. “It often feels like running a company in India comes with more challenges compared to the US,” he admitted.

When it comes to its moat, vidyo.ai stands on shaky ground: Instagram, LinkedIn or TikTok could release a similar feature in their base apps. “There is definitely a little bit of platform risk,” Maheshwari conceded, but he explained that it is unlikely customers would shift, since they don’t want to restrict themselves to the workflow of a single creation platform.

Comparing the ambition to building something like Canva, Maheshwari said that vidyo.ai plans to expand its offerings, including the potential integration of generative AI features like deepfakes and avatars. The team is also working on an AI-based social media calendar, which would suggest the content likely to perform best in the coming week.

Maheshwari envisions building a comprehensive suite of tools for social media creation and publishing. “Our goal is to develop a full-stack solution that encompasses every aspect of social media content creation,” he said.

Cracking YC: A Guide for AI Startups
https://analyticsindiamag.com/ai-origins-evolution/cracking-yc-a-guide-for-ai-startups/
Tue, 27 Aug 2024 11:30:00 +0000

Since most current startups are tech- and AI-based, it is important for the founder to be a 'technical founder'.

The post Cracking YC: A Guide for AI Startups appeared first on AIM.


Today is the deadline for submitting applications to Y Combinator’s Fall 2024 batch. For the YC W24 programme, 260 companies were selected from over 27,000 applications. “With an acceptance rate under 1%, this was one of the most selective cohorts in YC history,” said Garry Tan, president & CEO of Y Combinator.

Despite this, the startup accelerator is still processing hundreds of applications every hour, and this might be the best time to apply. But there are a few things to keep in mind before you get selected and have the opportunity to take a photo with the famous YC signboard.

To start with, YC has its own list of dos and don’ts for applicants. This includes getting to the point, staying away from buzzwords, and showing that the team won’t fall apart within a year.

As of early 2024, YC has funded over 4,500 startups with a combined valuation of more than $600 billion, making it an attractive lifeline for many emerging startup founders.

Interestingly, this year YC has also taken a lot of interest in Indian AI startups, as its focus shifts towards founders building application-based generative AI. What can startups learn from all the announcements so far?

Focusing on a Technical Team with a Technical Founder

While YC stands to gain by investing in startups that could one day be worth millions of dollars, its programmes are equally enticing for founders: there is no requirement for a formal degree or existing revenue. YC simply bets on a founder and their idea.

One of the most important aspects that investors look at is the founding team of a startup. And since most current startups are tech and AI based, it is important for the founder to be technical, so much so that they should be able to build the product themselves. If not, “you are not a technical founder,” said Isabella Reed, founder of CamelAI, which is part of YC W24.

Though some might disagree, since tools like Cursor are changing the definition of what an engineer is, a startup building a tool is still a tech startup. For Indian founders, this is also a wake-up call to start adopting tools like Cursor, which are fast becoming the default for building tech products.

One strategy founders are discovering is pitching their ideas to VCs to surface flaws and areas for improvement; some are still reworking their pitch at 3 AM. One founder who pitched to VCs managed to find another co-founder for his startup, create a go-to-market strategy, and get his product to beta users for feedback.

Those who can’t get to VCs are asking people on X to help them with their YC applications and pitches, which in many cases turns into a roast show.

Meanwhile, one of the most important things for founders to remember is not to become a generative AI influencer on the way to YC. Though it can be fun to watch and preach, it does not help the pitch or the selection process.

The most important things to keep in mind are showing traction, picking a specific metric to back, demonstrating passion for the startup you are building, and presenting the product’s flaws honestly.

Don’t Get Blacklisted

It is not easy to get into YC. Amartya Jha, the co-founder and CEO of CodeAnt AI, which recently got into YC, narrated in a post how his team got rejected the first time because they couldn’t explain their product well to the investors. Despite this and the fear of getting blacklisted, they made a 45-minute video explaining their product and sent it to YC again. The following week, they managed to get another interview with YC and were finally selected!

Though an inspirational story, the lesson is to explain your product well in the first go.

Another Indian startup, Asterisk, founded by Mufeed VH, the creator of Devika, also got into YC. The team started out as Stition.ai, building cybersecurity products, and is now the first startup from Kerala to get into YC.

Given the small acceptance rate, founders keep refreshing their inboxes looking for the interview invitation. Though it might be difficult to get an acceptance letter, it is definitely worth at least filling out the form.

“It’s the smallest set of questions you can ask to get a clear signal about a startup, so it’s also the smallest set of questions you can answer to get a clear signal about your own startup,” said Pete Koomen, group partner at YC.

Moreover, consistency is king. Kashif Ali, the founder of TaxGPT, which is part of the YC S24 batch, said that he had applied seven times with three different companies over the last four years, and finally got in on the eighth attempt.

At the same time, not everyone needs YC. If you are a true entrepreneur, you can be successful without YC. Keep applying though.

P.S. No sweat equity.

https://twitter.com/CCgong/status/1827441029114736739

[This is an opinion and not a promotional article]

This AI Startup Wants to Fix Your Code Bugs. And Got YC Backing for It
https://analyticsindiamag.com/ai-breakthroughs/this-ai-startup-wants-to-fix-your-code-bugs-and-got-yc-backing-for-it/
Mon, 26 Aug 2024 04:49:46 +0000

Tata 1mg has written hundreds of custom policies on the CodeAnt platform.

The post This AI Startup Wants to Fix Your Code Bugs. And Got YC Backing for It appeared first on AIM.


Getting into YC takes work. Amartya Jha, the co-founder and CEO of CodeAnt AI, narrated in a post how he and his co-founder Chinmay Bharti got rejected the first time because they couldn’t explain their product well to the investors. Despite this, and the fear of getting blacklisted, Jha and Bharti made a 45-minute video explaining their product and sent it to YC again.

The following week, they managed to get another interview with YC and were finally selected! But what is so good about what they are building? 

“Developers spend 20 to 30% of their time just reviewing someone else’s code. Most of the time, they simply say, ‘It looks good, just merge it,’ without delving deeper,” Jha explained while speaking with AIM. This leads to bugs and security vulnerabilities making their way into production. 

With generative AI, coding is undeniably getting easier. At the same time, the quality of code produced by these AI generators is still far from that of human coders. This makes code review crucial for every organisation, and this is where CodeAnt AI comes into the picture.

CodeAnt’s AI-driven code review system can drastically reduce these vulnerabilities by automating the review process and identifying potential issues before they reach the customer.

Founded in October 2023, CodeAnt AI is already making waves in the industry by automating code reviews with AI. The journey began at Entrepreneur First, where Jha met Bharti. The company quickly gained traction, securing a spot in YC by November 2023. 

By February 2024, they had launched their first product, attracting major clients like Tata 1mg, India’s largest online pharmacy, and Cipla, one of the country’s biggest pharmaceutical companies. “These companies were amazed that such a solution even existed,” Jha recalled. “And both contracts were paid, not just trials.”

What’s the Moat?

What sets CodeAnt AI apart from earlier entrants such as CodeRabbit and SonarSource is its approach to code review. The company has developed its own dataset of 30,000+ checks, meticulously created to cover every possible code commit scenario.

“This is our pure IP,” Jha said. “We wrote our own algorithms to analyse code, understand its flow, and identify areas that need improvement. Our AI, combined with our custom engineering solutions, runs these checks on every code commit, offering unprecedented accuracy.” 

The platform also supports more than 30 programming languages.

While competition in the AI-driven code review space is growing, Jha remains confident in CodeAnt AI’s unique value proposition. Its biggest competitor is SonarSource, but interestingly, SonarSource’s lead investor is also one of CodeAnt’s investors.

CodeRabbit relies solely on AI, which leads to a lot of hallucinations and false positives. “Our approach, which combines AI with deterministic policies, gives us a significant edge,” said Jha.

“The demand for tools like ours is only going to grow as AI-generated code becomes more prevalent,” Jha said, adding that tools like GitHub Copilot and Cursor are far from generating consistently accurate code anytime soon.

Jha further elaborated on the limitations of competitors that use AI exclusively. When developers give a large codebase to AI, it tends to hallucinate. 

To mitigate this, CodeAnt built a foundation of hard-coded checks, which are further enhanced by AI. This reduces false positives and ensures that the code is not only correct but also secure and compliant with industry standards.
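The hybrid approach Jha describes can be sketched roughly: deterministic rules fire first on every commit, and only what they cannot decide is escalated to an LLM. The three rules below are invented stand-ins for CodeAnt's proprietary 30,000+ checks:

```python
import re
from typing import Callable

# Invented example rules; CodeAnt's actual check set is proprietary.
CHECKS: list[tuple[str, Callable[[str], bool]]] = [
    ("hardcoded-secret", lambda ln: re.search(r"(password|api_key)\s*=\s*['\"]", ln) is not None),
    ("bare-except", lambda ln: ln.strip().startswith("except:")),
    ("debug-print", lambda ln: ln.strip().startswith("print(")),
]

def review(diff_lines: list[str]) -> list[dict]:
    """Deterministic pass over a diff; each finding is traceable to a rule,
    so this layer cannot hallucinate the way a pure-LLM reviewer can."""
    findings = []
    for lineno, line in enumerate(diff_lines, start=1):
        for rule, hit in CHECKS:
            if hit(line):
                findings.append({"line": lineno, "rule": rule})
    # A real system would then send the remaining, harder questions
    # (logic flow, architecture, intent) to an LLM with this context attached.
    return findings
```

The design point is that rule-based findings are reproducible and explainable, while the LLM handles only the judgment calls the rules cannot express.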

Amartya Jha speaking at Bangalore Python Meetup

Stands Out with Customisation

One of CodeAnt AI’s standout features is the ability for enterprises to input their own data and create custom policies. “For example, Tata 1mg has written hundreds of custom policies on our platform. Before CodeAnt AI, they would have had to build a similar platform themselves to enforce these policies,” Jha said, adding that Tata 1mg has been writing code in Python for the last eight years.

Now, they can simply use the CodeAnt platform to ensure their code complies with their specific guidelines, which is especially valuable for large organisations with complex codebases.

This level of customisation is not only beneficial for code quality but also for compliance with security frameworks. “Imagine a tool that knows exactly how code should be written in your organisation and can flag issues related to SOC 2 compliance, GDPR, or data privacy. It makes the process of maintaining compliance much easier and more efficient,” Jha added.

In the future, Jha will focus on expanding CodeAnt AI’s capabilities, particularly in the areas of security and code quality. “We’re currently working with some of the largest security and backup companies in the world, helping them review their code for vulnerabilities. Our team may be small, but we’re making a big impact,” he said.

Despite its San Francisco roots, CodeAnt AI maintains a strong presence in India, with Jha overseeing operations from Bangalore. The company is poised for further growth, with plans to expand its product offerings and deepen its market presence. 

“We are going deeper into security and code quality. We will soon be making announcements about new products and partnerships, so stay tuned,” Jha hinted.

This YC-Backed Bengaluru AI Startup is Powering AWS, Microsoft, Databricks, and Moody’s with 5 Mn Monthly Evaluations
https://analyticsindiamag.com/ai-breakthroughs/this-yc-backed-bengaluru-ai-startup-is-powering-aws-microsoft-databricks-and-moodys-with-5-mn-monthly-evaluations/
Tue, 20 Aug 2024 04:40:40 +0000

By mid-2023, Ragas gained significant traction, even catching the attention of OpenAI, which featured their product during a DevDay event.

The post This YC-Backed Bengaluru AI Startup is Powering AWS, Microsoft, Databricks, and Moody’s with 5 Mn Monthly Evaluations appeared first on AIM.


Enterprises love RAG, but not everyone is great at it. The much-touted solution for hallucinations and for bringing new information into LLM systems is often difficult to maintain, and harder still to evaluate for whether it is even returning the right answers. This is where Ragas comes into the picture.

With over 6,000 stars on GitHub and an active Discord community over 1,300 members strong, Ragas was co-founded by Shahul ES and his college friend Jithin James. The YC-backed, Bengaluru-based AI startup is building an open-source standard for evaluating RAG-based applications.

Engineering teams from companies such as Microsoft, IBM, Baidu, Cisco, AWS, Databricks, and Adobe rely on Ragas to keep their pipelines pristine. Ragas already processes 20 million evaluations monthly and is growing at 70% month over month.

The team has partnered with companies such as LlamaIndex and LangChain to deliver its solutions, and has been warmly received by the community. But what makes them special?

Not Everyone Can RAG

The idea started when they were building LLM applications and noticed a glaring gap in the market: there was no effective way to evaluate the performance of these systems. “We realised there were no standardised evaluation methods for these systems. So, we put out a small open-source project, and the response was overwhelming,” Shahul explained while speaking with AIM.

By mid-2023, Ragas had gained significant traction, even catching the attention of OpenAI, which featured their product during a DevDay event. The startup continued to iterate on their product, receiving positive feedback from major players in the tech industry. “We started getting more attention, and we applied to the Y Combinator (YC) Fall 2023 batch and got selected,” Shahul said, reflecting on their rapid growth.

Ragas’s core offering is its open-source engine for automated evaluation of RAG systems, but the startup is also exploring additional features that cater specifically to enterprise needs. “We are focusing on bringing more automation into the evaluation process,” Shahul said. The goal is to save developers’ time by automating the boring parts of the job. That’s why enterprises use Ragas.
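Ragas's production metrics use LLM judges, but the core idea of an automated RAG check can be illustrated with a deliberately simple lexical proxy for faithfulness. This is an illustration only, not Ragas's implementation:

```python
def faithfulness_proxy(answer: str, contexts: list[str]) -> float:
    """Fraction of answer tokens grounded in the retrieved contexts.
    A score near 0 suggests the answer was not supported by retrieval."""
    answer_tokens = set(answer.lower().split())
    context_tokens = set(" ".join(contexts).lower().split())
    if not answer_tokens:
        return 0.0
    return len(answer_tokens & context_tokens) / len(answer_tokens)

# Fully grounded: every answer token appears in the retrieved context.
grounded = faithfulness_proxy(
    "paris is the capital of france",
    ["the capital of france is paris"],
)
# Mostly ungrounded: the answer asserts facts retrieval never returned.
ungrounded = faithfulness_proxy("berlin is the capital", ["paris is lovely"])
```

Running a check like this on every question-answer pair is what "automated evaluation" means in practice; Ragas replaces the crude lexical overlap with LLM-graded judgments such as faithfulness, answer relevancy, and context precision.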

As Ragas continues to evolve, Shahul emphasised the importance of their open-source strategy. “We want to build something that becomes the standard for evaluation in LLM applications. Our vision is that when someone thinks about evaluation, they think of Ragas.”

While speaking with AIM in 2022, Kochi-based Shahul, who happens to be a Kaggle Grandmaster, revealed that he used to miss classes and spend his time Kaggling.

The Love for Developers

“YC was a game-changer for us,” Shahul noted. “Being in San Francisco allowed us to learn from some of the best in the industry. We understood what it takes to build a successful product and the frameworks needed to scale.”

Despite their global ambitions, Ragas remains deeply rooted in India. “Most of our hires are in Bengaluru,” Shahul said. “We have a strong network here and are committed to providing opportunities to high-quality engineers who may not have access to state-of-the-art projects.”

“We have been working on AI and ML since college,” Shahul said. “After graduating, we worked in remote startups for three years, contributing to open source projects. In 2023, we decided to experiment and see if we could build something of our own. That’s when we quit our jobs and started Ragas.”

Looking ahead, Ragas is planning to expand its product offerings to cover a broader range of LLM applications. “We’re very excited about our upcoming release, v0.2. It’s about expanding beyond RAG systems to include more complex applications, like agent tool-based calls,” Shahul shared.

Shahul and his team are focused on building a solution that not only meets the current needs of developers and enterprises, but also anticipates the challenges of the future. “We are building something that developers love, and that’s our core philosophy,” Shahul concluded.

Meet The Brains Behind AI Anchors on Doordarshan and Aaj Tak
https://analyticsindiamag.com/ai-origins-evolution/meet-the-brains-behind-ai-anchors-on-doordarshan-and-aaj-tak/
Thu, 01 Aug 2024 04:30:00 +0000

In the coming months, the startup will launch AI anchors for DD Sports and DD News.

The post Meet The Brains Behind AI Anchors on Doordarshan and Aaj Tak appeared first on AIM.


India’s public sector broadcaster Doordarshan recently introduced two AI anchors – Krish and Bhoomi – who deliver weather forecasts, commodity prices, farming trends, updates on agricultural research, and information on state welfare programmes to millions of farmers.

Enabling this is a Delhi-based startup called Personate AI. Incorporated in 2021, the startup is helping broadcasters, media houses and even content creators develop virtual AI agents.

Last year, Aaj Tak became the first broadcaster in India to host an AI anchor, named Sana. The virtual anchor is multilingual and delivers news updates multiple times throughout the day.

“We introduced India’s first AI anchor, Sana, at the India Today Conclave in the presence of Prime Minister Narendra Modi. Following that, we launched several anchors for various brands, including Vendhar TV in South India and ongoing campaigns for Zee

“Since then, AI anchors have been developed for multiple news channels, including Russian media,” Rishab Sharma, chief technology officer & co-founder at Personate.ai, told AIM.

The startup has developed eight different AI anchors for Aaj Tak and successfully launched them across its regional channels. Last month, Modi’s interview with the channel was also translated and broadcast in seven hyperlocal languages using the startup’s technology.

Building on this success, the startup approached Prasar Bharati, which was impressed with its vision and chose to integrate AI anchors for DD Kisan. Sharma also revealed that AI anchors will soon be introduced for other Doordarshan channels, including DD Sports and DD News.

Personate.ai is also co-founded by Akshay Sharma, who serves as the CEO. Completely bootstrapped and profitable, the company has over 25 enterprise customers. 

Personate AI Studio

The startup has created an AI studio that allows users to produce a clone or a digital avatar of themselves. To date, the startup has collaborated with five media houses, achieving an average return on investment (ROI) of approximately 160%, according to Sharma.

“It’s adding more viewers per minute. Indian viewers have become accustomed to viewing AI content, be it a reel or a video shot on social media,” he said.

Personate’s AI studio replaces the human component with a synthetic one. This synthetic element can be a clone of an actual human or content creator, or an entirely synthetic personality.

For Aaj Tak, the startup also created a clone of their managing editor, Anjana Om Kashyap. This was done by taking a short video clip, typically five minutes, of the individual. With AI, the clone was ready within minutes to read anything on screen using text-to-speech technology.

In contrast, creating a synthetic anchor involves designing from scratch. “This process includes pixel manipulation, designing the body, overlaying textures and clothing, and stitching the face. For a synthetic person, we spend about a week crafting the design,” Sharma revealed.

The personality is created with a 3D model by adding textures and shapes to the body. Once the model is ready, AI steps in to control its movements. 

Explaining in the context of video games, Sharma said that traditionally animators decided how the character moved. “Here, the role of the animator shifts to generative AI, which now directs how the character behaves and interacts,” Sharma said.

Large Vision Models 

Personate has developed a large vision model (LVM), a generative AI model similar to an LLM but one that generates pixels instead of text. Popular examples of LVMs include OpenAI’s Sora and Google’s Imagen.

The AI model translates a 3D anchor to a 2D screen, ensuring that the output is ultra-realistic. On a 2D screen, the challenge is how to rotate the model and adjust its positioning relative to camera angles, even though there is no actual camera. 

“The goal is for the model to behave as if it’s responding to the camera, creating a convincing and lifelike appearance,” Sharma said.
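Making a model "respond to" a camera that doesn't exist comes down to applying a camera model in software. A minimal pinhole-projection sketch, with invented intrinsics (focal length and principal point), shows how a 3D point on the anchor maps to 2D screen coordinates:

```python
def project(point3d: tuple[float, float, float],
            focal: float = 800.0,   # invented focal length, in pixels
            cx: float = 640.0,      # principal point of a 1280x720 screen
            cy: float = 360.0) -> tuple[float, float]:
    """Pinhole projection of a camera-space point (x, y, z) onto the screen.
    Rotating or repositioning the virtual camera just transforms (x, y, z)
    before this step, which is how the anchor appears to track the shot."""
    x, y, z = point3d
    if z <= 0:
        raise ValueError("point is behind the virtual camera")
    return (cx + focal * x / z, cy + focal * y / z)

# A point on the anchor's face, one metre ahead and slightly to the right:
u, v = project((0.1, 0.0, 1.0))
```

A production renderer adds lens distortion, lighting, and learned texture synthesis on top, but the camera-relative positioning Sharma describes reduces to this transform.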

One of the biggest challenges in training an LVM is data: models like OpenAI’s Sora are trained on trillions of data points. According to Sharma, Personate’s AI model, too, is trained on multi-trillion data points.

The startup tapped into the past experiences of the founders to collect data to train the model. Sharma revealed he began his journey with the Indian Space Research Organisation (ISRO) and later worked with Reliance.

Currently, Personate.ai is the only Indian company with such capabilities. Beyond India, Synthesia, a startup based in London, offers similar solutions.


Synthesia’s platform enables users to create videos using pre-generated AI avatars or by generating digital representations of themselves, which they call artificial reality identities. The startup is backed by NVIDIA and is being leveraged by the United Nations.

Bengaluru Startup Builds AI Models That Contact Centres Can Run on the Edge
https://analyticsindiamag.com/industry-insights/ai-startups/bengaluru-startup-builds-ai-models-that-contact-centre-can-run-on-the-edge/
Thu, 04 Jul 2024 11:54:56 +0000

The SLMs developed by Gnani.ai are considerably smaller than an LLM like GPT-4, or even smaller LLMs like Llama-3.

The post Bengaluru Startup Builds AI Models That Contact Centres Can Run on the Edge appeared first on AIM.


Founded in 2017 by Ganesh Gopalan and Ananth Nagaraj, Gnani.ai is a Bengaluru-based startup that claims to facilitate over 1 million conversations daily with its product line, which is meant for contact centres.

The company has over 100 customers in banking and financial services, insurance, telecom, automotive and healthcare industries.

In May, the startup launched a series of voice-first SLMs (small language models), meticulously trained on millions of hours of proprietary audio and billions of Indic-language conversations.

“What we’re doing is something different, like a fusion of voice and text models. It’s a multimodal model but right now we are focused on voice and text,” Ganesh Gopalan, the CEO of Gnani.ai, told AIM in an exclusive interview.

So far, the company has built a series of five models designed for the banking, financial services and insurance (BFSI) sector.

Gopalan revealed that the models are multilingual. In the US, they support both English and Spanish, whereas, back home, they are designed to support 12 Indian languages.

“We also plan to launch a model designed for the automotive industry because we have a lot of customers in this industry. Healthcare is going to be a sector in the future,” Gopalan said.

Building for the Edge

The SLMs developed by Gnani.ai are much smaller than an LLM like GPT-4, or even smaller LLMs like the 7-billion-parameter Llama-3. Gopalan believes the size of SLMs will come down even further, enabling their deployment on the edge.

“In the future, we will deploy these models on the edge because the size is also coming down drastically. We believe the solution to many enterprise problems isn’t always found in the generalised 100 billion+ parameter models that companies often tout. 

“These models are excellent for generic applications but may not always address specific enterprise needs effectively,” he said.

Moreover, many enterprises hesitate to adopt third-party models due to concerns about their proprietary nature, uncertainty about the training data used, and security apprehensions. Running a model on the edge, where the customer’s data resides, solves many of these problems.

“We think running models on the edge will be a lot cheaper. So, at some point, all these models will be on edge. And that’s something that we are working actively towards,” Gopalan said.

What Gnani.ai considers its strength is the ability to quickly fine-tune the model based on enterprise data and make it production ready. 

“It’s one thing to have an SLM for the BFSI sector, but the real value to a company is when you have a model built just for their data. So that’s what we do. We take our model to enterprises and help them build on top of it. We provide them with necessary tools that can quickly launch the model based on their data,” Gopalan said.

AI Agents are Coming 

Today, AI agents are believed to be the next big iteration in the AI cycle. Previously, Kailash Nadh, the CTO of Zerodha, has said the prospects of AI agents are very high, though perhaps not in a good way.

While Gopalan agrees AI agents will be the next big leap, he adds that Gnani has been building AI agents for over four years. 

“We automate processes that are done by contact centre agents through our voice bots. We built AI agents that helped one of the largest banks in India in collecting over a billion dollars in the last six months,” Gopalan said.

The bot handled payment reminder calls to customers, ensuring timely payments and helping them find the appropriate payment methods. 

“For a US client, our AI agents are assisting the contact centre agents by providing instantaneous answers to customer queries as conversations unfold,” he added.

AI agents will also change how contact centre operations work. The future is multimodal, according to him. “For instance, if you encounter an issue with your laptop today, calling the contact centre requires providing a tag number and describing the problem, which can be challenging. 

“Instead, you can show a video, and an AI bot can analyse the visuals, identify the problem area, suggest solutions, and present options to the agent. The agent, using human intelligence, can then assess the problem and provide a solution,” he said.

IVR, Biggest Impediment to Customer Experience 

However, there is a lot of apprehension about AI agents making contact centre agents redundant. Gopalan believes there is a long way to go before AI makes call centre agents redundant, if it ever happens. 

“We currently deploy AI bots for numerous use cases. However, systems are still evolving for scenarios that necessitate human intelligence or empathy. In these areas, AI bots are not ready yet,” Gopalan said.

Moreover, he believes interactive voice responses (IVR) could be the biggest impediment to customer experience. 

“No one enjoys navigating through automated menus and pressing buttons when contacting a contact centre with an urgent issue,” he pointed out.

Gopalan believes AI will have to bring an end to IVR before it eliminates the jobs of call centre agents. The contact centre business is one segment being impacted by generative AI, and discussions around AI making contact centre agents redundant are widespread.

“Another reason contact centres will remain relevant for a long time is that it’s not just about having people to answer calls. Many companies lack the integrations with CRM systems, ticketing tools, and other necessary infrastructure. Additionally, a significant portion of customer service knowledge resides within the minds of contact centre employees,” he concluded.

The post Bengaluru Startup Builds AI Models That Contact Centres Can Run on the Edge appeared first on AIM.

Meta, Hugging Face Launch EU-Focused AI Startup Accelerator Programme  https://analyticsindiamag.com/industry-insights/ai-startups/meta-hugging-face-launch-eu-focused-ai-startup-accelerator-programme/ https://analyticsindiamag.com/industry-insights/ai-startups/meta-hugging-face-launch-eu-focused-ai-startup-accelerator-programme/#respond Tue, 25 Jun 2024 09:17:45 +0000 https://analyticsindiamag.com/?p=10124576

Meta’s GenAI VP Ahmad Al-Dahle announced on Tuesday that the programme would aim to “accelerate innovation, drive business growth and strengthen the European tech ecosystem.”

The post Meta, Hugging Face Launch EU-Focused AI Startup Accelerator Programme  appeared first on AIM.


Meta, Hugging Face and Scaleway have partnered up to launch an AI accelerator programme geared towards European startups.

Meta’s GenAI VP Ahmad Al-Dahle announced on Tuesday that the programme would aim to “accelerate innovation, drive business growth and strengthen the European tech ecosystem.” The programme is based out of Station F in Paris, the world’s largest startup campus, and is supported by the Parisian business school HEC Paris’ incubator, Incubateur HEC Paris.

This seems to be one of the first AI accelerator programmes with a focus on startups from the EU. Last week, AWS announced an accelerator programme for startups in the Asia Pacific and Japan regions, with 40 startups coming from India.

Similarly, earlier this year, GitHub also announced $400,000 in funding for ten startups specifically focusing on open-source AI-based solutions. This was a part of their 10-week GitHub Accelerator programme.

Who, When, And Why?

The Meta-backed programme will run for four months, with a focus on European startups that are using open-source foundation models to develop advanced AI solutions. This is in alignment with Meta’s overall focus towards open-source AI models.

Applications are currently open for startups to sign up and will close on August 16. A total of five startups in the Minimum Viable Product (MVP) or product stage will be selected to undertake the programme, which is scheduled to start on October 1, 2024, and go on till early February 2025.

According to the programme details, startups will be selected by a panel of experts from Meta, Hugging Face, Scaleway and HEC Paris Incubator. Due to a focus on open-source models, startups will be picked based on their use of foundation models and integration plans.

The partnership also means that startups will gain benefits from each of the partners in different areas. According to the programme overview, “Selected startups will benefit from technical mentoring by Meta, access to Hugging Face’s tools and Scaleway’s computing resources to develop services based on open-source AI technology.”

How These Indian Startup Founders will Help OpenAI Build GPT-6 https://analyticsindiamag.com/industry-insights/ai-startups/how-these-indian-startup-founders-will-help-openai-build-gpt-6/ https://analyticsindiamag.com/industry-insights/ai-startups/how-these-indian-startup-founders-will-help-openai-build-gpt-6/#respond Tue, 25 Jun 2024 07:38:00 +0000 https://analyticsindiamag.com/?p=10124533

Rockset’s main product features real-time indexing, SQL-based search, and analytics capabilities that can process data from sources such as Kafka, MongoDB, and DynamoDB. 

The post How These Indian Startup Founders will Help OpenAI Build GPT-6 appeared first on AIM.


OpenAI recently acquired Rockset, a data analytics company co-founded in 2016 by former Meta engineers Dhruba Borthakur and Venkat Venkataramani, for $105 million (INR 905 crore). This acquisition aims to leverage Rockset’s advanced analytics to enhance OpenAI’s retrieval infrastructure.

Venkataramani, a graduate of NIT Tiruchirappalli, played a crucial role in scaling Meta’s data infrastructure to handle billions of queries per second with high reliability and contributed to the development of technologies like TAO, Dragon, Memcache, McRouter, MySQL, RocksDB, HBase, MongoRocks, and MyRocks.

“AI has the opportunity to transform how people and organisations leverage their own data. That’s why we’ve acquired Rockset, a leading real-time analytics database that provides world-class data indexing and querying capabilities,” said OpenAI in its blog post.

“Rockset’s infrastructure empowers companies to transform their data into actionable intelligence. We’re excited to bring these benefits to our customers by integrating Rockset’s foundation into OpenAI products,” said OpenAI COO Brad Lightcap.

AI Agents To Lead the Way

Rockset will help OpenAI build AI agents capable of taking actions on users’ behalf. “In the future, assistants will become agents to whom you will delegate certain data access rights and permissions to act on your behalf. It might start with simple tasks like booking a meeting or making a reservation, and then they will quickly get better from there,” posted Rockset co-founder Venkataramani on LinkedIn a month ago.

This is similar to what Microsoft AI chief Mustafa Suleyman recently said about GPT-6’s ability to take actions; the model is most likely to come out in two years. 

OpenAI CTO Mira Murati said in an interview that the next generation of GPT will be ‘PhD-level’ compared to GPT-3 (toddler) and GPT-4 (high school). She also said the next model will be released in a year and a half.

Rockset, based in San Mateo, California, employs approximately 88 people. The company has secured $109 million in funding from prominent investors, including Greylock Partners, Sequoia Capital, and Glynn Capital. 

This funding, through a Series B, implies a $300-$500 million valuation. Despite substantial backing over eight years, the business has struggled to gain significant traction, with public figures indicating $10-20 million in revenue.

But why Rockset? 

Borthakur said there are two sets of companies: one builds the latest AI models and the second comprises traditional data companies that store enterprise data to power business analytics and applications.

“And they (AI companies) want AI models to be able to leverage the data to power intelligent assistants and agents. So, the AI world and the data world are converging towards the same end goal,” he said.

Rockset announced that it will join OpenAI to enhance the retrieval infrastructure for OpenAI’s product suite. “We will help OpenAI solve the complex database challenges that AI applications face at a massive scale,” the post said.

Rockset’s main product features real-time indexing, SQL-based search, and analytics capabilities that can process data from sources such as Kafka, MongoDB, and DynamoDB. 

Its technology is optimised for quick search, filtering, aggregations, and joins, making it an excellent tool for applications that need real-time data processing and analytics. 

Integrating SQL with advanced search capabilities will allow OpenAI to streamline its data workflows, reducing the complexity of data management and improving overall efficiency. This is particularly beneficial for research and development, where quick and accurate data analysis is essential.

OpenAI Loves RAG 

Moreover, Rockset augments the capabilities of LLMs through its retrieval-augmented generation (RAG) feature. This would be a welcome addition for OpenAI enterprise customers who want to use models with their proprietary data and aim to reduce hallucinations.

By incorporating their data, customers can enhance LLMs to provide contextual and more accurate results. This integration expands the possibilities of LLMs in various applications, from content generation to information retrieval.
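Mechanically, RAG means fetching the records most relevant to a query and splicing them into the model's prompt as grounding context. The sketch below is a deliberately minimal, library-free illustration of that pattern (naive token-overlap retrieval over a made-up corpus), not a depiction of Rockset's or OpenAI's actual pipeline:

```python
def retrieve(query, corpus, k=1):
    """Rank documents by naive token overlap with the query; return the top k."""
    q = set(query.lower().split())
    scored = sorted(corpus, key=lambda doc: len(q & set(doc.lower().split())), reverse=True)
    return scored[:k]

def build_prompt(query, corpus):
    """Splice the retrieved context into the prompt sent to the LLM."""
    context = "\n".join(retrieve(query, corpus))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

# Hypothetical enterprise documents standing in for a customer's proprietary data.
corpus = [
    "Invoices are payable within 30 days of issue.",
    "The office cafeteria opens at 8 am.",
]
prompt = build_prompt("When are invoices payable?", corpus)
print(prompt)
```

A production system would replace the token-overlap ranking with indexed keyword and vector retrieval, but the retrieve-then-prompt shape stays the same.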

Rockset uses a converged indexing approach that combines row, columnar, and search indexes, enabling fast searches across multiple data types and formats. Currently, no other database provider (such as MongoDB, Elasticsearch, or Amazon Redshift) offers this specific combination.

While Elasticsearch excels in keyword search and Weaviate and Pinecone specialise in vector search, Rockset merges these capabilities to provide both precise keyword matches and semantically rich search results. 

Rockset’s hybrid vector search combines traditional keyword search with vector search, allowing for more relevant and context-aware search results. This hybrid approach is particularly beneficial for OpenAI, which deals with massive volumes of unstructured data, including text, images, and audio.
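A hybrid ranker of this sort typically blends a lexical relevance score with an embedding-similarity score. The toy sketch below (token overlap plus cosine similarity over made-up vectors; not Rockset's implementation) shows the basic blending idea:

```python
import math

def keyword_score(query, doc):
    """Fraction of query tokens that appear in the document (toy lexical score)."""
    q, d = set(query.lower().split()), set(doc.lower().split())
    return len(q & d) / len(q) if q else 0.0

def cosine(a, b):
    """Cosine similarity between two dense vectors (toy semantic score)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def hybrid_score(query, doc, q_vec, d_vec, alpha=0.5):
    """Weighted blend of lexical and vector relevance; alpha weights the keyword side."""
    return alpha * keyword_score(query, doc) + (1 - alpha) * cosine(q_vec, d_vec)

# Toy corpus: documents paired with fake precomputed embeddings.
docs = [
    ("refund policy for damaged goods", [0.9, 0.1, 0.0]),
    ("shipping times for EU orders",    [0.1, 0.8, 0.2]),
]
query, q_vec = "refund for damaged item", [0.85, 0.15, 0.05]
ranked = sorted(docs, key=lambda d: hybrid_score(query, d[0], q_vec, d[1]), reverse=True)
print(ranked[0][0])  # the refund document ranks first
```

Real systems use BM25-style lexical scoring and approximate nearest-neighbour vector indexes, but the final step is still a weighted combination like `hybrid_score` above.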

Unlike some other solutions that require complex infrastructure management, Rockset offers a fully managed, serverless architecture. This reduces operational overhead and simplifies scalability. It eliminates the need for manual infrastructure management, allowing OpenAI to scale its operations seamlessly and efficiently.

This scalability is crucial for handling the vast amounts of data required for training large AI models, ensuring that OpenAI can continue to push the boundaries of AI research without being hindered by infrastructure limitations.

OpenAI is Not Alone

Databricks acquired MosaicML in 2023 for $1.3 billion to enhance its capabilities in generative AI and LLMs. Most recently, Databricks acquired Tabular as well, to leverage Apache Iceberg, a leading open-source table format for data lakes.

Last year, ThoughtSpot, an AI-powered analytics platform last valued at $4.5 billion, acquired Mode Analytics, a business intelligence startup, for $200 million in cash and stock.

Recently, IBM acquired HashiCorp in a deal valued at $6.4 billion, aiming to expand its portfolio of cloud-based software products to capitalise on the growing demand for AI-powered solutions. The deal will be closed by the end of 2024.

HashiCorp’s technology will complement IBM subsidiary Red Hat, IBM’s AI platform watsonx, the vendor’s consulting arm, and its offerings in data security and IT automation. The trend of acquiring data and AI companies is likely to continue through 2024.

OpenAI Acquires Startup Multi to Develop ChatGPT Desktop App https://analyticsindiamag.com/industry-insights/ai-startups/openai-acquires-startup-multi-to-develop-chatgpt-desktop-app/ Mon, 24 Jun 2024 16:31:53 +0000 https://analyticsindiamag.com/?p=10124471

Multi allows users to share applications across workspaces through screen sharing over video calls.

The post OpenAI Acquires Startup Multi to Develop ChatGPT Desktop App appeared first on AIM.


OpenAI has acquired a startup called Multi to build the ChatGPT desktop app. Alexander Embiricos, the founder of Multi, will be joining OpenAI. Effective today, Multi will be shut down. “We’ve closed new team signups, and currently active teams will be able to use the app until July 24, 2024, after which we’ll delete all user data,” the company said in a blog post.

“We’ve been increasingly asking ourselves how we should work with computers. Not on or using computers, but truly with computers. With AI. We believe it’s one of the most important product questions of our time,” added the company in its blog post.

“So much to say, but what matters most? Talking to users and shipping. With that in mind… I couldn’t be more excited to have joined the ChatGPT desktop team. Send me your feedback and bug reports. DMs open!” posted Embiricos on X. 

Multi, previously known as Remotion, is a collaboration platform co-founded by Embiricos in 2021 during the significant shift to remote work. The company aimed to replicate the spontaneous collaboration of in-person work environments through a remote communication platform. The goal was to facilitate impromptu video calls and presence-based interactions.

Multi allows users to share applications across workspaces. Teams can code, edit, and work together as if they were side by side, regardless of their physical location. The platform supports native apps to provide the best user experience, emphasising clear screen sharing and audio quality.

The company has raised $13 million in funding to date from investors including Greylock Partners and First Round.

This is OpenAI’s second acquisition in a week, following its purchase of Rockset, a data analytics company co-founded in 2016 by former Meta engineers Dhruba Borthakur and Venkat Venkataramani, for $105 million (INR 905 crore). This acquisition aims to leverage Rockset’s advanced analytics to enhance OpenAI’s retrieval infrastructure.

In the End, All AI Startups will be Acquired https://analyticsindiamag.com/industry-insights/ai-startups/in-the-end-all-ai-startups-will-be-acquired/ Fri, 21 Jun 2024 09:07:45 +0000 https://analyticsindiamag.com/?p=10124174

“We're actually very picky. I spend time on acquisitions every week, and we pass on companies all the time,” said Ali Ghodsi, CEO and co-founder of Databricks.

The post In the End, All AI Startups will be Acquired appeared first on AIM.


Hugging Face recently acquired Spanish-based startup Argilla for $10 million. With this acquisition of a collaboration platform for AI engineers and experts, the company has closed four acquisitions. 

“The volume of acquisition opportunities really skyrocketed in the past few months. We are receiving over ten acquisition opportunities a week these days,” said Hugging Face CEO Clem Delangue in an interview with Bloomberg.

It’s not just Hugging Face: a number of big companies are being bombarded with acquisition requests from startups, raising the question of whether all AI startups will eventually face the inevitable fate of acquisition. 

Spike in Acquisition Requests 

“We’re always interested in interesting acquisitions,” said Databricks co-founder and CEO Ali Ghodsi, against the backdrop of the recently concluded Data + AI Summit. 

Calling the company ‘very picky’, Ghodsi said he spends time on acquisitions every week and that Databricks passes on companies all the time. 

“We’re just looking at what’s strategic for us at any given time. If there’s really good synergy, we will do it,” said Ghodsi, commenting on an ongoing joke that Databricks will have to announce a big billion-dollar acquisition for its Data + AI Summit every year. 

Databricks chief also clarified his acquisition strategy with ‘people’ being the prime focus. “We start with the people. Are they going to work out culturally at Databricks? Is that going to be a match? There’s no point in acquiring a company, and the employees hating each other. Then it’s just a matter of time before they leave. So that’s really important for us,” he said. 

Ghodsi considers users’ product experience and financials to be the other important factors in any acquisition.

The Viable Option

Given the large compute requirements and the need for access to a huge customer base, emerging AI startups may best survive under the umbrella of an existing thriving company.

“Companies like Hugging Face and others are becoming sort of magnets because of the visibility, talent and compute that we have,” said Delangue, who also mentioned how startup founders are reaching out to him via emails and even social media with acquisition pitches. 

Startups with great teams and good traction might not receive the level of VC funding required. In such cases, acquisition opportunities may be a viable option. 

Interestingly, big tech companies that have the money to invest heavily, can continue to pour funds into AI developments. Meta chief Mark Zuckerberg had recently said that they are investing “to stay at the leading edge,” and that they are also “scaling the product before it is making money”. 

However, this leeway is not available to all emerging startups. The recent example of Stability AI, whose founder left the company, showed the world how a promising AI startup that failed to receive adequate funding last year can start facing problems. 

Preparing for the Wave

Delangue spoke about how companies are investing in hot topics such as LLMs and referenced Mistral’s recent funding. However, Hugging Face is placing its bets on startups working on less hyped topics such as datasets. “Argilla, working on datasets, on data, in my opinion, is now becoming the bottleneck or the most important.”

While Hugging Face’s recent acquisition route is to invest in the outliers, big tech companies’ acquisition style has been pretty straightforward. If not acquisition, big tech companies pump hefty amounts into companies making massive progress.  

Microsoft recently made a $650 million deal to use Inflection’s models and even hired the startup’s top talent, including Mustafa Suleyman, to lead its AI division. 

Likewise, Apple recently acquired Canada-based computer vision company DarwinAI to further its AI venture, having acquired 32 AI startups in 2023. 

As per a recent report by AIM Research, Indian AI startups alone have raised $560 million across 25 funding rounds. Going by the influx of acquisition requests big techs are receiving, it seems almost certain that all AI startups will eventually be acquired. All eyes on OpenAI!  

How Much Does It Cost To Build an AI Research Startup in India? https://analyticsindiamag.com/industry-insights/ai-startups/how-much-does-it-cost-to-build-an-ai-research-startup-in-india/ Thu, 20 Jun 2024 11:41:18 +0000 https://analyticsindiamag.com/?p=10124088

Assuming a seed fund of $10 million, an AI startup in India can last around two years without raising further funds, excluding inference costs.

The post How Much Does It Cost To Build an AI Research Startup in India? appeared first on AIM.


Everyone hopes to build an AI startup. After leaving OpenAI, Ilya Sutskever has started his own venture called Safe Superintelligence. The startup is bound to raise billions of dollars in pursuing its goal of building ASI, but what about Indian startups that are aiming to do fundamental AI research?

Speaking with AIM, Soket AI Labs founder & CEO Abhishek Upperwal revealed the numbers required to build a research-based AI startup. So far, the company has already built Pragna-1B, which is a foundational model specifically for Indic languages.

Upperwal said that, currently, funding is just enough to make do for AI research within a startup. “Yes, there are fewer funds available as compared to foreign markets, but I also believe that we can maybe make do with that particular fund and then ultimately grow in scale after the seed stage,” said Upperwal. 

He explained that for the seed stage in India, funding of $5 million or $10 million is still a decent amount, roughly translating to around INR 40-50 crore. This is still very little compared to the hundreds of millions raised by companies in the West.

“If VCs can trust these companies in the generative AI space, we can definitely do wonderful stuff for sure,” added Upperwal.

Where Does the Money Go?

The gap in funding is a function of the market. India is a smaller market, so ticket sizes are far smaller than in the West. Upperwal noted that these small ticket sizes pose a problem given the foundational-layer work startups have to do. 

Upperwal gives the example of building an estimated 7-billion-parameter foundation model out of India. He said that the cost of the compute alone would be close to $2 million. For reference, one NVIDIA H100 costs around INR 30 lakh, or about $36,000. 

To build a 7 billion parameter model, considering a six month time frame, a startup would require at least a dozen NVIDIA GPUs for the training period, taking into account time for other factors like inefficiencies.

This is while considering that the model is built in one shot. “It takes a lot of experiments, and checkpoints, or the path that you are taking fails, so you need to rebuild from the previous checkpoint,” explained Upperwal. 

Earlier, Upperwal had told AIM that it took the company six months to train the Pragna model, which involved many experiments with different models and a total of 150 billion tokens. It took close to 8000 GPU hours on NVIDIA A100s to train the model.

Accounting for all of this, an ideal amount for substantial foundational AI work at the seed stage is anywhere around $7-15 million, or close to INR 125 crore.

All of this includes the cost of running the business, such as hiring talent and paying bills, but does not include the cost of making the models ready for production or inference. That would at least double the funding requirement.

In the same conversation, Speciale Invest founder and partner Arjun Rao said that Indian VCs are interested in investing in the development phase, more than the research phase of AI. It would take a lot of time to research, build a model, compare it with others, and then figure out how it can be commoditised, which is something VCs are still figuring out. 

Assuming they’re working with a team of eight people and a funding of approximately $10 million (or INR 82 crore), an AI startup in India can survive for about 2.33 years at the current estimated monthly expenditure of INR 2.92 crore, which does not include inference costs.

Assuming that the $2 million Upperwal mentioned above goes towards compute just for training a 7-billion-parameter model over around six months, compute would account for roughly 20% of the expenditure over those two years, with the rest (about 80%) going towards salaries and other expenditure. 

This calculation assumes that the monthly costs remain constant and does not account for potential increases in expenses due to scaling, inflation, or other operational changes.
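The runway arithmetic above can be reproduced in a few lines. The funding and burn figures come from the article; the INR 83/USD exchange rate is an assumption used only for the H100 sanity check:

```python
# Back-of-the-envelope figures quoted in the article.
FUNDING_INR_CR = 82.0        # seed round: ~$10 million, taken as INR 82 crore
MONTHLY_BURN_INR_CR = 2.92   # estimated monthly spend, excluding inference
INR_PER_USD = 83.0           # assumed exchange rate, not stated in the article

# Runway = funding / monthly burn, then converted to years.
runway_months = FUNDING_INR_CR / MONTHLY_BURN_INR_CR
print(f"Runway: {runway_months:.1f} months ({runway_months / 12:.2f} years)")

# Sanity check on the quoted GPU price: INR 30 lakh is roughly $36k.
h100_usd = 30_00_000 / INR_PER_USD
print(f"One H100 at INR 30 lakh is about ${h100_usd:,.0f}")
```

This yields a little over 28 months of runway, matching the article's roughly two-and-a-third-year estimate.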

How Much Are Indian Startups Getting?

While these numbers seem reasonable, they are comparatively lower when looking at the global standard set by OpenAI, Anthropic, or Mistral, who have raised billions of dollars. 

For reference, Sarvam AI, which has announced its intention to build foundational AI models, has raised a total of $41 million. Pranav Mistry-led and Reliance-backed TWO.AI has raised $20 million for building its SUTRA line of models, and Krutrim raised $50 million, becoming India’s first generative AI unicorn.

This is just for initial research. SML CEO and founder Vishnu Vardhan emphasised the huge investment required to build and scale complex AI models. In an exclusive interview with AIM, Vardhan disclosed his plans of raising $200-300 million for the same. The company only recently launched Hanooman, its own foundational model. 

“That’s the kind of money we need to launch this kind of a product. We’ve already spent tens of millions of dollars, but that won’t work,” he said, about building a GPT-5 level model in India. 

Even the total funds raised by companies in India amount to little compared to OpenAI raising billions single-handedly. Building an AI startup in India that can compete with the West would cost considerably more.

This Sequoia-Backed Startup Uses AI to Help You Sleep Better  https://analyticsindiamag.com/industry-insights/ai-startups/this-sequoia-backed-startup-uses-ai-to-help-you-sleep-better/ Mon, 17 Jun 2024 12:36:50 +0000 https://analyticsindiamag.com/?p=10123838

Besides consumer products, Wakefit is experimenting with models like OpenAI’s GPT-4 and Google’s Gemini to improve supply chain management, demand planning, and forecasting.

The post This Sequoia-Backed Startup Uses AI to Help You Sleep Better  appeared first on AIM.


Last week, at its first-ever media conference in India, Sequoia-backed Indian startup Wakefit announced its intention to improve sleep health and quality for Indian consumers, this time using AI. 

Regul8 is one of the flagship products of the company’s recently launched Wakefit Zense, the country’s first AI-powered sleep solutions suite. 

It is India’s first mattress temperature controller, which allows users to manually set the temperature between 15°C and 40°C or choose from presets like neutral, cold, warm, ice, and even fire. The icing on the cake: It also supports dual preferences, so individuals can customise their sides of the bed independently, eliminating the common household dispute over the AC remote.

On average, a home air conditioner in India draws about 3,000 watts of power. However, thanks to Wakefit’s recently launched Regul8, the mattress temperature controller, you don’t need an AC anymore. Most importantly, it is 60% more energy-efficient than a 1.5-ton air conditioner.

The other flagship product is Track8, an AI-powered contactless sleep-tracking device.

Explaining the technology behind Track8, Yash Dayal, the CTO of Wakefit, told AIM, “The tracker works by placing a passive sensor below the mattress, and as a person sleeps, it leverages ballistocardiography. This means that tiny vibrations from heartbeats, snoring, or any movement are read by our sensor. These raw signals are then processed through AI and ML models to derive sleep metrics, restfulness, and other body vitals.”
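As a rough illustration of the signal-processing idea (a toy sketch, not Wakefit's actual models), a heart rate can be recovered from a ballistocardiogram-like vibration signal by counting its periodic peaks:

```python
import math

def estimate_bpm(signal, sample_rate_hz, threshold=0.5):
    """Count upward threshold crossings (one per beat) and convert to beats per minute."""
    beats = sum(
        1 for prev, cur in zip(signal, signal[1:])
        if prev < threshold <= cur
    )
    duration_min = len(signal) / sample_rate_hz / 60
    return beats / duration_min

# Synthetic 30-second "ballistocardiogram": a clean 1.2 Hz beat (72 BPM) sampled at 50 Hz.
sample_rate = 50
signal = [math.sin(2 * math.pi * 1.2 * t / sample_rate) for t in range(30 * sample_rate)]
print(round(estimate_bpm(signal, sample_rate)))  # prints 72
```

A real sensor signal is far noisier and mixes heartbeat, respiration, and movement, which is why production systems layer filtering and learned models on top of this kind of peak detection.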

Integrating both Track8 and Regul8 can create a system where temperature regulation is based on how you sleep. According to Dayal, the products, including AI and ML models, were made by the in-house team consisting of about 80 to 100 tech experts.  

Discussing the broader AI strategy, Dayal said, “We don’t rely on generative AI models for these consumer products, but we are experimenting with models like those from OpenAI and Gemini internally for efficient supply chain management, demand planning, and forecasting.”

What about Data Privacy? 

“Users have the option to choose whether to share their data or not. Their data is used solely to provide insights on sleep quality, such as heart rate variability, respiratory rate, movement index, and snoring index. It is anonymised, encrypted, and used only to improve their sleep quality,” said the director and co-founder Chaitanya Ramalingegowda, in an exclusive interview with AIM

Making High-Quality Products Affordable for All 

Founded in 2015, Wakefit’s philosophy is rooted in making high-quality products affordable for middle-class consumers. Led by co-founders Ankit Garg and Ramalingegowda, Wakefit built its foundation on products like orthopaedic mattresses and dual comfort solutions, offered at affordable prices. 

“Ankit and I come from middle-class backgrounds, so we understand the constraints of operating on a budget. From the beginning, our philosophy has been to make high-quality products affordable. For example, why should an orthopaedic memory foam mattress cost INR 50,000 when it can cost INR 12,000?” said Ramalingegowda.

This principle extends to their latest tech-enabled sleep solutions, which are designed to offer cutting-edge technology at a fraction of the cost of similar products in the US and Europe.

“The core idea was to leverage this success in non-tech products by applying material sciences, to create something uniquely beneficial for sleep,” added Ramalingegowda. 

“In markets like the US and Europe, similar products cost around INR 4.5 lakh. We built our product from the ground up to make it available for INR 45,000 to 50,000. We are middle-class folks building for middle-class India,” explained Ramalingegowda.

For example, high-end sleep technology brands like Eight Sleep offer smart mattresses with advanced sleep tracking and temperature control features, at around INR 2.8 lakh for the Pod 3 model. 

Similarly, Sleep Number provides adjustable mattresses with integrated sleep tracking and temperature regulation, often exceeding INR 3 lakh. Chili Technology and Bryte also offer premium sleep solutions with advanced features, with prices reaching up to approximately INR 3.75 lakh.

By offering its products at a mere fraction of that price, Wakefit positions itself as a cost-effective alternative to these luxury competitors, making sleep technology more accessible to the middle-class Indian market. This strategic pricing is an effort to bridge the gap between affordability and high-quality sleep solutions.

Future Prospects

Looking ahead, Wakefit plans to integrate generative AI and work with voice-based ecosystems to provide more actionable insights rather than passive data reporting. “Over the next two years, we will steadily launch new features and products, adding more value to our customers,” concluded Ramalingegowda. 

Wakefit’s commitment to innovation and accessibility positions it as a leader in the sleep technology market in India. With the launch of Wakefit Zense, the company is set to revamp how Indians experience sleep, making AI-powered sleep solutions an integral part of everyday life.

All AI Startups Have the Potential to be the Next OpenAI https://analyticsindiamag.com/industry-insights/ai-startups/all-ai-startups-have-the-potential-to-become-the-next-openai/ Mon, 17 Jun 2024 11:18:07 +0000 https://analyticsindiamag.com/?p=10123822

“It's a great opportunity because it brings people along; it gives them an intuitive sense of the capabilities and risks,” said OpenAI CTO Mira Murati.

The post All AI Startups Have the Potential to be the Next OpenAI appeared first on AIM.

]]>

In a recent interview, OpenAI CTO Mira Murati said that the models the company uses in its labs are not far behind those currently available to the public.

Speaking on the release of GPT-4o, she said that it was a massive deal for OpenAI to be able to make this kind of technology accessible to the public. “I don’t think there is enough emphasis on how unique that is for the stage where the technology is today.

“In the sense that inside the labs, we have these capable models and they’re not that far ahead of what the public has free access to. That’s a completely different trajectory for bringing technology into the world than what we’ve seen historically,” she said.

This raises an interesting point: while many AI startups struggle to survive past their early stages, it is still possible to rise quickly, much as OpenAI did over the last few years.

While Murati’s point is specifically about OpenAI’s ability to make advanced models widely available, she also emphasised that this is to ensure that the general public is fully aware of how the technology is progressing.

“It’s a great opportunity because it brings people along; it gives them an intuitive sense of the capabilities and risks. The opportunities are huge now,” she said. Murati’s stance implies that startups are not far behind in terms of the technology needed to develop advanced AI systems.

Behind OpenAI’s Success

OpenAI is widely regarded as an AI success story. Initially founded as a non-profit towards the goal of developing beneficial AGI, the company quickly rose through the ranks of the startup world to become a household name.

However, several factors helped OpenAI gain the attention it did, namely access to funding as well as a large talent pool, which allowed the startup to focus largely on quality research.

One of the key points to remember about OpenAI is that its big break with ChatGPT came only a few years after the company received a $1 billion investment from Microsoft. Additionally, the timing was near perfect, as they had focused their attention on the development of Transformer models, shortly after the release of the 2017 paper, ‘Attention is All You Need’.

Of course, it’s key to note that all of these factors rely on ensuring that your problem statement is one that can result in something ground-breaking. 

What Needs to Be Done?

Murati’s implication that AI models are much more widely available now than ever before means that startups need to try harder to stand out. OpenAI appeared as a pioneer in making an easy-to-use AI product widely available to the public for free. Only a few years later, however, doing the same no longer has the same impact.

In India, several startups have managed to make a splash, albeit not as big as OpenAI’s. Successful Indian startups like Krutrim, Kissan and Sarvam have been able to leverage gaps between the AI revolution and use cases specific to India, allowing them to break ground as pioneers in the country.

Being able to identify certain gaps and plug them in a way that ensures a startup’s continuity is a tough task. This is especially true as rapid advancements from bigger companies mean that startups with a poorly thought-out plan and product find themselves obsolete almost overnight.

With the proper resources, however, this can be remedied. One must keep in mind that OpenAI’s ability to get ahead of the curve owed much to a team of researchers who leveraged then-unexplored technology for use by the general public.

AI advancements mean that a lot more students and professionals are pivoting towards studying AI, thereby creating a large talent pool, especially in India. Besides, with startup funding increasing exponentially, it seems that the only roadblock now is trying to find a way to stand out in a sea of problem statements.

However, as Murati said, “If you can push human knowledge forward, you can push society and civilisation forward.”

The post All AI Startups Have the Potential to be the Next OpenAI appeared first on AIM.

]]>
Bengaluru’s HSR is All About AI Startups https://analyticsindiamag.com/industry-insights/ai-startups/bengalurus-hsr-is-all-about-ai-startups/ Fri, 14 Jun 2024 12:53:31 +0000 https://analyticsindiamag.com/?p=10123735

Bengaluru is home to 1,000+ AI startups and over 7,000 startups in all.

The post Bengaluru’s HSR is All About AI Startups appeared first on AIM.

]]>

Bengaluru, often dubbed the ‘Silicon Valley of India’, has always been synonymous with startups and innovation. In recent years, the city has embraced the AI trend, making it a significant player in the global AI landscape. 

Srikrishna Swaminathan, the CEO of Factors.AI, wrote on LinkedIn, “One building, and all AI companies. Factors.ai can claim to have attracted all AI here, as we were the first occupants, Amal Mishra, Urban Vault.”

Urban Vault, a building in HSR Layout, currently houses five notable AI companies: Loop AI, Factors AI, Raga AI, Frinks AI, and Actyv AI. This underlines HSR’s pivotal role in nurturing AI startups.

HSR: The AI Hub

In an earlier conversation with AIM, Ganesh Gopalan, the CEO and co-founder of GNANI.AI, highlighted that they wished to establish the startup in HSR, calling it a hub for tech talent. 

Some of the vibrant AI startups buzzing in HSR Layout include MachineHack Gen AI, a GenAI startup for the developer community; Zolnoi Innovations Pvt. Ltd., which leverages AI to enhance production efficiency; and HyperVerge, a deep-tech AI company. 

Then there’s SigTuple, specialising in intelligent solutions for medical diagnosis using state-of-the-art AI techniques; Invento Robotics, creating intelligent robots for a range of applications; and Talview, offering an AI-powered recruitment platform for global enterprises.

This unique ecosystem thrives due to the diversity of startups focusing on various sectors, including logistics, sales, developer tools, manufacturing, and fintech. The proximity of these companies fosters collaboration and innovation, making mergers and acquisitions more seamless.

Source: LinkedIn

AI Startups in Bengaluru

Speaking to AIM at IGIC 2024, Sanjeev Kumar Gupta, the CEO of Karnataka Digital Economy Mission, said, “There are over 1,000 AI startups in Bengaluru. With initiatives like Beyond Bengaluru, we are also promoting AI startups in Mysuru, Hubli, and Mangaluru.”

The emergence of companies like Sarvam AI, which focuses on developing advanced AI models, reflects Bengaluru’s commitment to technological innovation.

Krutrim AI, led by Ola’s founder Bhavish Aggarwal, has raised $50 million in funding at a $1 billion valuation and is now India’s first AI startup to reach unicorn status.

Innovation in AI isn’t limited to corporate giants. Startups like KOGO OS are disrupting markets with their AI operating systems, offering modular AI assistants tailored for diverse industries. 

Meanwhile, Karya AI is pioneering in the rural employment sector, leveraging AI to create job opportunities through tasks in local languages, with notable partnerships with tech giants like Microsoft and Google.

These home-grown initiatives underscore Bengaluru’s status as a hub for AI innovation, promising transformative impacts across multiple sectors.

Karnataka Government’s Support to Bengaluru Startups

According to a report released by NASSCOM, Bengaluru is home to over 7,000 startups, solidifying its status as India’s leading startup hub and accounting for 20% of the country’s overall startup activity.

Priyank Kharge, Karnataka minister for IT/BT, recently unveiled a new scale-up program, Hypergrowth Global Karnataka, at London Tech Week. This initiative aims to accelerate the global commercialisation and international market expansion of the best tech companies in Bengaluru and Karnataka. 

The program offers local companies access to global mentors, expert scaling advice from leading executives, go-to-market support, and connections to potential new customers and investors to enhance their international growth capabilities.

The post Bengaluru’s HSR is All About AI Startups appeared first on AIM.

]]>
AWS Teams Up with Accel to Support GenAI Startups in APJ https://analyticsindiamag.com/industry-insights/ai-startups/aws-teams-up-with-accel-to-support-genai-startups-in-apj/ Fri, 14 Jun 2024 10:03:58 +0000 https://analyticsindiamag.com/?p=10123673

The AWS Generative AI Spotlight program will select up to 120 early-stage startups across the region, including 40 from India.

The post AWS Teams Up with Accel to Support GenAI Startups in APJ appeared first on AIM.

]]>

AWS has announced the launch of the new AWS Generative AI Spotlight program in Asia Pacific and Japan (APJ). This is a four-week accelerator program to support early-stage startups in the region that are building generative AI applications.

In India, the company is partnering with the venture capital firm Accel for this program. Last year, the company and Accel launched ML Elevate 2023, a six-week accelerator that supported 35 generative AI startups in India.

The program will select up to 120 early-stage startups across the region, including 40 from India. 

For example, with help from AWS, fintech startup Fibe has improved customer support efficiency by 30%.

The participants will also have access to the company’s Activate program for startups. They can receive up to $100,000 in AWS credits.

The company’s Generative AI Spotlight program in APJ is collaborating with venture capital firms and organisations in key cities across the region. 

Generative AI Accelerator

Additionally, the company has announced a $230 million commitment for generative AI startups to accelerate the creation of generative AI applications worldwide. This funding will provide early-stage companies with AWS credits, mentorship, and education to further their use of AI and ML technologies.

A major portion of the commitment will fund the second cohort of the AWS Generative AI Accelerator program. This 10-week program will provide hands-on expertise and up to one million dollars in AWS credits to each of the top 80 early-stage startups using generative AI to solve complex challenges. 

The program will identify top startups in areas such as financial services, healthcare, media, entertainment, business, and climate change. Participants will receive sessions on ML performance enhancement, stack optimisation, and go-to-market strategies, along with business and technical mentorship based on their industry vertical.

AWS’s Support for Startups

Matt Wood, VP of AI products at the company, stated, “With this new effort, we will help startups launch and scale world-class businesses, providing the building blocks they need to unleash new AI applications that will impact all facets of how the world learns, connects, and does business.”

AWS has a long history of supporting startups, with 96% of all AI/ML unicorns running on its platform. The new commitment aims to further accelerate the growth of generative AI startups by providing them with the necessary resources and mentorship.

This is not the first time AWS has committed to helping startups. Last month, AWS collaborated with Shellkode, a cloud company, to train one lakh women developers in generative AI.

The post AWS Teams Up with Accel to Support GenAI Startups in APJ appeared first on AIM.

]]>
Bengaluru-Based Startup rampp.ai Raises $500,000 https://analyticsindiamag.com/industry-insights/ai-startups/bengaluru-based-startup-rampp-ai-raises-500000/ Thu, 13 Jun 2024 12:31:40 +0000 https://analyticsindiamag.com/?p=10123587

The startup, which is currently powered by OpenAI and partnered with Microsoft, developed the RADI Navigator Platform.

The post Bengaluru-Based Startup rampp.ai Raises $500,000 appeared first on AIM.

]]>

Bengaluru-based startup rampp.ai announced on Thursday that it had successfully raised $500,000 in angel funding.

The startup, which is currently powered by OpenAI and partnered with Microsoft, developed the RADI Navigator Platform. The platform delivers contextual and customised insights to enterprises, allowing them to better strategise across business domains and functionalities.

The company was founded by tech entrepreneurs Ajay Agrawal and Huzefa Saifee in 2023. This is the fourth such venture by the duo, in which they leverage technology to provide highly tailored business solutions to organisations.

“We see early investor support as a validation of our core strategy to leverage GenAI for tackling transformative challenges. Constant market & technology shifts, coupled with external confounding factors, make enterprise-wide transformation daunting, and our solution provides leaders with crucial insights to tread with confidence,” said Agrawal, who is also the CEO of the startup.

According to the company, the funding has come from a group of seasoned executives and investors, a majority of whom come from India and North America.

In particular, they target companies that are aiming to grow and pivot, allowing rampp.ai to help them leverage AI to navigate these intricacies. “This results in agile adaptation and scalable pivoting with measurable outcomes, no matter the distinct challenges a business faces,” the company said.

The company describes itself as a provider of “GenAI-based enterprise-grade business transformation solutions”, and its major product is the RADI Navigator Platform.

Short for rampp.ai Artificial Digital Intelligence, the platform makes use of a deep industry and technology knowledge base to create an effective roadmap for companies in real time. This includes incorporating stakeholder inputs and offering tools to ensure that a business can accelerate its processes.

While little is known about the scale of the startup’s partnerships with OpenAI and Microsoft, this is not the first time either company has worked with smaller startups. However, a partnership with a Bengaluru-based startup is particularly interesting, as OpenAI has started to put more focus on its operations in India.

For one, OpenAI only recently hired its first India-based employee. Further, it is also hoping to establish a local team in the country.

The post Bengaluru-Based Startup rampp.ai Raises $500,000 appeared first on AIM.

]]>
Antler to Invest $10 Million in Early-Stage Indian Startups Over Next Six Months https://analyticsindiamag.com/industry-insights/ai-startups/antler-to-invest-10-million-in-early-stage-indian-startups/ Wed, 12 Jun 2024 09:36:56 +0000 https://analyticsindiamag.com/?p=10123360

“NOW is the best time to build a startup in India!”

The post Antler to Invest $10 Million in Early-Stage Indian Startups Over Next Six Months appeared first on AIM.

]]>

Antler, the venture capital firm, announced plans to invest $10 million in early-stage Indian startups over the next six months. Partners Rajiv Srivatsa and Nitin Sharma will lead the initiative, targeting idea-stage founders with investments of $500,000 per company, totaling 20 investments. This $10 million is part of the $75 million that Antler committed for India.

Srivatsa shared the news on X, emphasising the timeliness of building startups in India. “NOW is the best time to build a startup in India!,” he posted. 

He also provided further details on the investment: “Towards this, Nitin Sharma & I are committing $10M from @AntlerIndia into exceptional idea-stage founders in the coming 6 months. $500K per company X 20 investments – We will work with you in your ambiguous -1 to 0 phase and get you >$1Mn in total within 6-9 months of starting up!”

Srivatsa encouraged aspiring entrepreneurs to write “Build” in the comments to get a message from the Antler team. He also invited those with startup ideas or interest to tag others and join an AMA next week for more details.

Antler launched the $285 million Antler Elevate Fund in June 2023, focusing on Series A rounds and later stages, with investments ranging from $1 million to $10 million. 

Since its establishment in 2018 in Singapore, Antler has expanded its investments and networks globally, covering regions such as the US, Europe, Africa, and Asia, including Vietnam, Japan, and Malaysia.

The post Antler to Invest $10 Million in Early-Stage Indian Startups Over Next Six Months appeared first on AIM.

]]>
Why an Open Cloud Network Would Benefit Indian Tech Startups, Developers, IITs https://analyticsindiamag.com/industry-insights/ai-startups/why-open-cloud-network-benefit-startups-developers-iits/ Wed, 12 Jun 2024 04:29:14 +0000 https://analyticsindiamag.com/?p=10123317

OCC aims to create common standards, APIs, and interoperability protocols to enable seamless integration and provisioning of compute resources

The post Why an Open Cloud Network Would Benefit Indian Tech Startups, Developers, IITs appeared first on AIM.

]]>

The cloud computing market is dominated by three names – AWS, Microsoft Azure, and Google Cloud. However, accessing these Cloud Service Providers (CSPs) could prove to be expensive for micro and small businesses, individual developers, as well as research institutions in India.

So what is the solution? An open cloud computing (OCC) network. 

People+AI is on a mission to develop an interoperable cloud computing network that could transform India’s growing tech ecosystem.

The idea isn’t just to democratise cloud computing in India but also to empower smaller players within the Indian public cloud sector, like Vigyanlabs and Von Neumann AI, among others.

“One of the challenges these smaller players are facing is discoverability. Moreover, besides helping them with marketing, OCC also aims to support them in figuring out the right tax subsidies, infrastructural policies and with ease of doing business,” Tanvi Lall, director of strategy at People+AI, told AIM.

People+AI, which branches out from Nandan Nilekani-backed non-profit EkStep Foundation, has already teamed up with 24 technology partners, including the likes of Oracle Cloud, Vigyan Labs, Protean Cloud, Dell, NeevCloud, and Tata Communications, among others.

Democratising Compute in India 

Based on the concept of open networks such as the Open Network for Digital Commerce (ONDC), OCC aims to establish an open network where consumers can discover and access heterogeneous compute providers through standard interfaces.

“It’s an open network of providers powered by protocols, powered by trust, where diverse providers can log in and then on the other hand customers who need computing in different shapes and formats can go to this interface and find the provider on the other end,” Lall said.

According to Lall, an OCC will be significant for students and research institutions that need compute for their research work, hackathons, etc. 

It will also benefit individual developers who are experimenting with newer things, for example, fine tuning an open source model. 

“These are developers who are not developing any products but are carrying out different projects. This also includes pre-startup developers who might not be able to afford a CSP,” Lall added.

Moreover, the open cloud network will be significant for Micro, Small & Medium Enterprises (MSMEs). India is one of the largest MSME markets in the world, and many of these enterprises are currently undergoing digital transformation.

“These companies today want a website, they want some sort of CRM tool, some planning tool, business ops. It might not be high-calibre graphics processing unit (GPU) compute, but they want some compute,” Lall said.

“However, there are also MSMEs who are starting to apply AI to their business process, for example, to improve sales, deploying AI chatbots for customer service, etc.”

These companies might even need GPUs for inference and other similar workloads. Lastly, Indian tech startups also stand to benefit. These are startups developing B2B or B2C AI applications for others to use, which might need high-calibre GPUs, according to Lall.

“This is a very big category and they need significant reliable compute power because they are the ones selling these AI applications to others,” she said.

Furthermore, OCC aims to create common standards, APIs, and interoperability protocols to enable seamless integration and provisioning of compute resources from multiple providers within the network.

This implies that although your data may be stored with one CSP, with OCC you can access services from another provider without needing to migrate your data. Migrating data from one CSP to another is generally challenging.
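The standard-interface idea described above can be sketched in code. The following is an illustrative sketch only: the `ComputeOffer` schema, the `discover` function, and the provider names are invented for this example and are not part of any published OCC specification.

```python
from dataclasses import dataclass

# Illustrative sketch only: the schema and provider names below are
# invented, not part of any published OCC specification.

@dataclass
class ComputeOffer:
    provider: str
    gpu_model: str
    hourly_rate_inr: float

# Heterogeneous providers publishing offers in one common schema.
REGISTRY = [
    ComputeOffer("provider-a", "A100", 250.0),
    ComputeOffer("provider-b", "T4", 40.0),
    ComputeOffer("provider-c", "A100", 220.0),
]

def discover(gpu_model: str, max_rate_inr: float) -> list[ComputeOffer]:
    """Return matching offers across all providers, cheapest first."""
    matches = [
        o for o in REGISTRY
        if o.gpu_model == gpu_model and o.hourly_rate_inr <= max_rate_inr
    ]
    return sorted(matches, key=lambda o: o.hourly_rate_inr)

# A consumer queries the network once instead of each provider separately.
offers = discover("A100", max_rate_inr=240.0)
```

Because every provider exposes the same schema, a consumer can compare offers and switch providers without bespoke integration work, which is the kind of interoperability the article describes.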

Challenges Ahead 

It is an ambitious project, and implementing it at population scale would help the whole ecosystem reap its benefits. Doing so at such a large scale, however, remains a challenge.

According to Lall, building the technology is not the biggest issue. “We have talented people in this country who can build the tech very easily,” she said. The biggest challenge for Lall and her team will be developing the ecosystem and sustaining it.

Moreover, it will be critical for these cloud companies to find value in the network, the same goes for the end users. 

What makes many enterprises choose these CSPs is their ability to provide scalable and flexible infrastructure, robust security measures, seamless integration with existing systems, and reliable customer support.

The network will have to ensure the same things are easily available for the stakeholders in the ecosystem. 

“We’re adopting a highly granular approach to this. It’s not merely compiling a GPU directory or a simple listing of available GPUs; while that’s undoubtedly beneficial, we’re emphasising the importance of considering user experience, deployment strategies, and other critical factors,” Lall said.

Moreover, according to Lall, what makes the CSPs like AWS worthwhile is the whole community they have developed. “They put out these videos and documentation which help you find quick fixes to your problems while using their services.

“At OCC as well, we need to establish an ecosystem where individuals and entities such as startup founders seeking compute resources can easily communicate. They should be able to engage in straightforward conversations about their requirements,” Lall said.

Presently, there isn’t a centralised platform for this purpose. While there may be some Discord channels available, according to Lall, there isn’t a single venue where users can explore offerings and interact with each other.

The Indian AI-as-an-Infrastructure Landscape 

In the last year or so, we have seen various startups emerging in the AI cloud space. For instance, startups such as Yotta and NeevCloud have announced their entry into the AI cloud space. 

Yotta, backed by the Hiranandani group, is setting up a GPU infrastructure of 32,768 GPUs by the end of 2025. Likewise, NeevCloud intends to acquire 40,000 GPUs by 2026.

Tata Communications, one of the 24 technology partners of People+AI, is ramping up efforts to establish AI cloud infrastructure by aligning with NVIDIA.

There are also other players like Jarvis Labs, for instance, which is based in Coimbatore. What OCC also sets out to enable is to bring computing closer to MSMEs or tech startups located in Tier 2 and Tier 3 cities in India. 

Can Hyperscalers be Part of it?

According to Lall, hyperscalers like AWS, Azure and Google Cloud could even be part of the network. “Our entire approach is plus one. The idea is not to leave certain folks out. There is room for everybody and there is a reason why we approached Oracle Cloud.”

“If the hyperscalers feel they have something to offer in the Indian context, they are welcome. It would also be foolish for us to imagine that everyone will abandon the big CSPs as and when the network is up and running,” she concluded.

The post Why an Open Cloud Network Would Benefit Indian Tech Startups, Developers, IITs appeared first on AIM.

]]>
‘Creativity in Indian Open Source will Flourish, Blooming a Spring of AI Innovations and Startups’ https://analyticsindiamag.com/industry-insights/ai-startups/creativity-indian-open-source-flourish-blooming-innovations/ Tue, 11 Jun 2024 14:28:00 +0000 https://analyticsindiamag.com/?p=10123296

Dohmke added that India’s burgeoning developer community, combined with the newfound possibilities of AI, will not only accelerate digital transformation, but will drive immense human and economic progress for India. 

The post ‘Creativity in Indian Open Source will Flourish, Blooming a Spring of AI Innovations and Startups’ appeared first on AIM.

]]>

GitHub CEO Thomas Dohmke is optimistic that India will emerge as a leader in the age of AI. According to a GitHub report released last year, India already has a large developer base, and the number of developers in the country is set to surpass developers in the US by 2027.

Moreover, GitHub revealed that there are over 15.4 million developers in India building on GitHub, growing 33% year-over-year (YoY). 

While speaking at the recently held GitHub Galaxy 2024 event in Bengaluru, Dohmke said, “India will not just be a global leader, but the global leader in the age of AI.

“Children and adults alike will learn to code in their native language, leading to a prolonged groundswell of developers.

“Creativity in Indian open source will flourish, blooming a spring of AI innovations and startups. And Indian businesses will carve a competitive advantage in the global market, as their developers build software with an accelerated speed of code.”

Dohmke added that India’s burgeoning developer community, combined with the newfound possibilities of AI, will not only accelerate digital transformation, but will drive immense human and economic progress for India. 

“Developers are already 55% faster when using Copilot, meaning the software economy, valued globally in the trillions, is moving rapidly faster because of AI.

“In a way that no other country on Earth can claim: India is on the brink of a great convergence between the world’s largest population of developers and the newfound possibility of AI.

“If enabled, this great convergence will generate a consequential economic boom in India that could be felt around the world for generations to come.”

Today, GitHub Copilot has 1.8 million paid subscribers, and has been adopted by over 50,000 organisations globally. 

In India, GitHub revealed that it continues to see the adoption of its Copilot-powered platform by companies in every industry, including Cognizant, MakeMyTrip, Paytm, Swiggy, Glance, and Air India, among others. 

Copilot is also being leveraged by public sector customers such as the Government e-marketplace (GeM).

GitHub Partners with Infosys to Launch Centre of Excellence

Recently, GitHub also partnered with Infosys to launch the first GitHub Center of Excellence in Bengaluru. This initiative aims to leverage AI and advanced software solutions to drive global economic growth.

This collaboration promises to enhance the speed and efficiency of software production worldwide by integrating GitHub Copilot across Infosys’ developer teams and extending its capabilities to its clients. The partnership represents a generational opportunity for Global Systems Integrators (GSIs) to spearhead advancements in the AI and software sectors.

“At Infosys, we’re passionate about unlocking human potential, and GitHub is a strategic partner in this endeavor. GitHub Copilot is empowering our developers to become more productive, efficient, and enabling them to focus more on value creating tasks.

“Generative AI is transforming every aspect of the software development lifecycle, and using Infosys Topaz assets, we are accelerating Gen AI adoption for our clients. We are excited to work with GitHub to unlock this technology’s full potential and deliver client-relevant solutions,” Mohammed Rafee Tarafdar, CTO at Infosys, said.

The post ‘Creativity in Indian Open Source will Flourish, Blooming a Spring of AI Innovations and Startups’ appeared first on AIM.

]]>
Meet the Creators of India’s AWAAZ https://analyticsindiamag.com/industry-insights/ai-startups/meet-the-creators-of-indias-awaaz/ Tue, 11 Jun 2024 08:26:48 +0000 https://analyticsindiamag.com/?p=10123199

AWAAZ stands out for its single-shot voice cloning capability, which can replicate a voice from a mere five-second audio clip.

The post Meet the Creators of India’s AWAAZ appeared first on AIM.

]]>

Text-to-speech (TTS) models are comparatively easier to build in English than in other languages. To fill this gap for Indian languages, IIT Guwahati alumni Sudarshan Kamath and Akshat Mandloi started smallest.ai and decided to create a TTS model for Hindi as well. They call it AWAAZ.

With state-of-the-art Mean Opinion Scores (MOS) in Hindi and Indian English, AWAAZ can fluently converse in over ten accents, reflecting the diverse linguistic landscape of India.

The inception of AWAAZ was driven by the founders’ recognition of a gap in the market for high-quality, affordable TTS models for Indian languages. “When we started building, we realised that the models required for a voice bot were not mature for Indian languages. Existing models for non-English languages were nowhere close to production,” explained Kamath in an exclusive interaction with AIM.

Citing OpenAI’s GPT-4o, which is a generalised model, Kamath said that the company aims to build specialised models that can be tailored for customer support, even for small businesses. AWAAZ is also cheaper than other Indian-language TTS offerings, such as Veed.io and Murf.ai.

Janta ki AWAAZ 

AWAAZ stands out for its single-shot voice cloning capability, which can replicate a voice from a mere five-second audio clip. The model also boasts a low streaming latency of just 200 milliseconds. 

To make this technology accessible, smallest.ai has set an introductory price of INR 999 for 500,000 characters, positioning AWAAZ as a cost-effective solution, claiming to be ten times cheaper than its competitors, such as ElevenLabs. 
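Taking the stated introductory price at face value, the per-character cost works out as follows. This is only a back-of-the-envelope check using the figures quoted above; the 3,000-character script length is an illustrative assumption, not a figure from smallest.ai.

```python
# Back-of-the-envelope check of AWAAZ's introductory pricing,
# using only the figures quoted in the article.

price_inr = 999
characters = 500_000

cost_per_char_inr = price_inr / characters       # roughly INR 0.002 per character
cost_per_1k_chars = cost_per_char_inr * 1_000    # roughly INR 2 per 1,000 characters

# An assumed ~3,000-character voice-over script would cost roughly INR 6.
script_cost_inr = 3_000 * cost_per_char_inr
```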

Kamath said that the model is about 750 million parameters in size, built by leveraging existing open-source models.

Kamath attributes the affordability of AWAAZ to their focus on data quality and model efficiency. “Our model is much smaller than those of competitors like ElevenLabs. Despite this, we achieve high-quality speech because our data is highly refined,” he explained. 

smallest.ai uses AWS for cloud services, although they remain flexible about potential future partnerships.

AWAAZ’s Dataset was the Critical Part

Kamath and Mandloi launched smallest.ai in October 2023. The initial goal was to create a voice bot for India capable of qualifying leads and handling customer support. This led to the development of SAPIEN, a voice bot for sales, marketing, and customer support. 

However, the lack of robust TTS models for Indian languages led them to focus on core model development, resulting in the creation of AWAAZ. “The data quality for TTS models reduced drastically when we moved away from English to other languages. It is worse for South Asian languages,” said Kamath.

The Indic data problem has been highlighted several times by researchers when speaking with AIM, be it for text or voice models. 

“We spent a lot of time perfecting the dataset, using over thousands of hours of audio from various people from different states in India. We focused on data quality to ensure a diverse representation, making our model suitable for production-level deployment,” Kamath said. 

The team invested significant resources into this endeavour, with over six months dedicated purely to the development and iterations of data quality.

AWAAZ is currently limited to Hindi and Indian English, but Kamath emphasises the importance of understanding the quality of the output. “The most difficult part is the data. If you tried our model in Tamil, it might respond a little, but we don’t advertise that capability because it’s not up to our standards yet,” he said. 

Way Forward

The company’s ambitious roadmap includes expanding the model’s capabilities. “Our next step is moving closer to GPT-4o-like abilities for Indian languages, where the model can generate answers with a voice, enhancing the interactive experience,” Kamath revealed. 

Additionally, smallest.ai is exploring the development of voice-to-voice models, aiming to offer custom solutions for specific business needs such as lead qualification and customer support.

The founders are committed to AI’s understanding of multimodal data. 

“We’ve been fascinated by AI’s potential to understand more than just text. Speech is one of those areas where AI can truly start to seem human, much like in the movie ‘Her’,” Kamath said, reflecting on the broader vision that drives their work.

The post Meet the Creators of India’s AWAAZ appeared first on AIM.

]]>
Indian AI and Robotics Startup Swaayatt Robots Secures $4 Mn to Advance Level 5 Autonomy  https://analyticsindiamag.com/industry-insights/ai-startups/indian-ai-and-robotics-startup-swaayatt-robots-secures-4-mn-to-advance-level-5-autonomy/ Sat, 08 Jun 2024 04:32:30 +0000 https://analyticsindiamag.com/?p=10122884

The company will further raise $7 million at a valuation of $175 million.

The post Indian AI and Robotics Startup Swaayatt Robots Secures $4 Mn to Advance Level 5 Autonomy  appeared first on AIM.

]]>

Bhopal-based Indian AI and robotics startup Swaayatt Robots recently raised $4 million at a valuation of $151 million from US investors. The company will further raise $11 million at a valuation of around $175-200 million.

“In 2021, we raised $3 million, of which we have utilised only $650,000. So in total, we now have $6.3 million to carry our research forward. We are already operating in cities, highways, and off-road locations,” said Sanjeev Sharma, chief of Swaayatt Robots, in an exclusive interview with AIM.

“By the end of this year, we’ll be creating a blueprint that could solve Level Four Autonomy globally. To scale that model, we may raise around $1.5 billion,” said Sharma.

In 2021, investors the world over poured a record $9.7 billion into autonomous vehicle development. However, last year that amount dropped by nearly 60% to just $4.1 billion. According to a McKinsey report, autonomous driving is the future, with self-driving vehicles projected to become a $300 billion to $400 billion revenue opportunity by 2035.

Sharma added that the company is planning a major demo in August. “If the August demo doesn’t excite Sam Altman and Elon Musk, along with the world, I don’t know what will,” he said. Earlier this year, the company executed a demo where it claimed to have achieved Level 5 Autonomy.

The autonomous driving scene in India is still in its nascent stages, but several startups like Minus Zero, Flowdrive, Flux Auto, and Netradyne are actively working to develop self-driving technologies designed for the country’s chaotic road conditions. 

Recently, Swaayatt Robots conducted a demo where their autonomous vehicle avoided piles of soil and rocks dumped on the road for construction purposes. 

Even negotiating a construction site marked with well-structured traffic cones is typically considered a challenge by the autonomous driving industry in the West. "You will notice our vehicle made a 90-degree turn at an unmanaged intersection," said Sharma, adding that it is a huge problem even for companies like Tesla and Waymo. 

Last year in October, the company enabled autonomous vehicles to negotiate bidirectional traffic on single-lane roads. The demo was conducted at a relative speed of 44 kilometers per hour. “Now, that R&D is going to be scaled up to an extent where the relative velocity at the point of crossing can be maintained at 60 kilometers per hour,” said Sharma.

Focus on Autonomous Driving R&D 

As of now, Swaayatt Robots hasn't sold its technology to anyone and has been conducting demos with a Mahindra Bolero. However, in the near future, the company will focus on OEM integration in existing vehicles, specifically targeting the military and autonomous trucking markets. 

"For autonomous driving markets in the military domain, we cannot expect new vehicles to be operated in service; you would want to develop a system that can be retrofitted into these aftermarket vehicles," said Sharma. 

He added that aftermarket segments in autonomous trucking in North America are humongous. “If you look at the US trucking market, I mean, it’s an $800 billion addressable market,” said Sharma. He further said that Swaayatt Robots has been in talks with some of the fleet owners in the US.

Regarding India he said, “We have understood the requirements of almost every single OEM in the country. We are in touch with most of the OEMs in India, except for Mahindra. It’s ironic because Anand Mahindra recently praised us.” 

Even though the company is not in touch with Mahindra directly, Sharma explained that Mahindra's vehicles are robust and the easiest to modify. "Mahindra has this Mobileye system, which is incorporated into the XUV700."

Better Than Tesla?  

Sharma likes to position Swaayatt as an R&D company, and unlike Tesla at this stage, it is not planning to sell cars anytime soon. “What we are doing is very sophisticated R&D in the general autonomous navigation domain to make autonomous driving happen,” he said. 

“There is no agency on the planet, be it Musk’s Tesla or Altman’s OpenAI, that is working on bi-directional traffic,” he said, adding that in the October 2023 demo, their vehicle successfully negotiated bi-directional traffic on a single lane. 

He added that by 2030, only four to five autonomous driving companies will survive, highlighting the immense R&D challenges involved. He emphasised that safety is a crucial area to address. Referring to GM’s Cruise accident, he pointed out that it resulted in a total halt of operations and several investigations.

Sharma is a huge admirer of Wayve AI. "Since 2019, with the advent of Wayve AI, people have been discussing autonomy without relying on maps. We were the first technology to enable vehicles without the reliance on high-definition maps. In 2017, we implemented multi-RL agents without requiring any maps," he said. 

“We have done R&D in the perception domain to an extent that there are now only 5 problems where supervised learning is required, and for those problems as well we are developing self-supervised models,” he added.

Until now, the company has been conducting demos exclusively with a Mahindra Bolero. In the near future, it plans to demonstrate with multiple vehicles, including the Thar and the Fortuner. "One demo we will be doing will have Thar and Bolero crossing each other in an autonomous fashion," concluded Sharma.

The post Indian AI and Robotics Startup Swaayatt Robots Secures $4 Mn to Advance Level 5 Autonomy  appeared first on AIM.

]]>
Cohere Picks Enterprise AI Needs Over ‘Abstract Concepts Like AGI’ https://analyticsindiamag.com/industry-insights/ai-startups/cohere-picks-enterprise-ai-needs-over-abstract-concepts-like-agi/ Fri, 07 Jun 2024 13:00:00 +0000 https://analyticsindiamag.com/?p=10122866

The company recently introduced fine-tuning for Command R, which surpasses larger, more expensive models.

The post Cohere Picks Enterprise AI Needs Over ‘Abstract Concepts Like AGI’ appeared first on AIM.

]]>

Recently, Silicon Valley has displayed mixed emotions about achieving AGI. This became more apparent during the recent banter between Meta's Yann LeCun and xAI's Elon Musk.  

However, unlike contemporaries such as OpenAI, Anthropic, Google DeepMind, or even xAI, the Canadian startup Cohere's AI ambitions are not focused on AGI, at least for now.

"We remain concentrated on designing AI solutions that deliver better workforce and customer experiences for businesses today rather than pursuing abstract concepts like AGI," Saurabh Baji, SVP of engineering at Cohere, told AIM.

Predictions about AGI vary widely: some experts say it will arrive next year, others say by 2029, and some think it will never happen. Some have even claimed that AGI was already achieved multiple times in 2024 with projects like Devin and Claude 3 Opus, leading to the belief that future milestones might be overlooked.

Cohere offers models in three categories: Embed, Command, and Rerank, each designed for specific use cases and customisable. 

At CloudWorld 2023, co-founder and chief executive officer Aidan Gomez stated that the company is also building embedding models, which are expected to perform twice as well as competitors' on varied and noisy datasets. 

Unlike generative models trained on public internet data, embedding models are trained on enterprise data, retrieving information from specific data sources.

Enterprise Needs Are Cohere's Focus

In April, the startup launched Command R, a language model for enterprise use with a 128k context window and support for ten languages, performing tasks like RAG and tool integration. 

Building on its success, the company introduced Command R+, which ranked 6th on the Arena leaderboard, matching GPT-4-0314 based on over 13,000 human votes. It is regarded as one of the best open models.

“Command R+ is our most powerful model to date targeted for use cases needing complex reasoning and is highly performant for tool use and building AI agents,” said Baji. “We are seeing growing demand from customers for scalable models that customers can use to bring AI applications into large-scale production,” he added.

This focus on enterprise-critical features allows Cohere to cater to both large, complex enterprises like Oracle, Accenture, and McKinsey and fast-growing startups like Borderless AI and AtomicWork, helping them streamline operations and boost productivity. 

For example, Oracle integrates Cohere’s models into applications for finance, supply chain, HR, sales, marketing, and customer service, while AtomicWork uses Command R+ and Rerank models to boost IT support efficiency. 

The company recently introduced fine-tuning for Command R, which surpasses larger, more expensive models. 

The company’s technology enhances business processes in financial services, technology, and retail through enterprise search, copy generation, and AI assistants. It offers flexible deployment options, including on-premises solutions for highly regulated industries, ensuring data privacy and security.

Up Next

“At Cohere, we are continually working to iterate and improve our models to adapt to the unique needs of our enterprise customers. We want to ensure that our products solve real-world business problems today and are designed to excel in an enterprise environment,” said Baji. 

The company is currently working on making the latest Command R models widely available on platforms like Microsoft Azure, Amazon Bedrock and Oracle Cloud Infrastructure (OCI). 

Looking ahead, Cohere wants to continue to innovate, focusing on refining its models to better meet enterprise customers’ needs. It also unveiled the Cohere Toolkit, which aims to speed up the development of generative AI applications. 

By the end of March, the company had generated $35 million in annualised revenue, up from $13 million a year earlier. The startup recently raised $450 million from investors, including NVIDIA, Salesforce Ventures, Cisco, and PSP Investments, pushing its valuation to $5 billion.  

Based in Seattle, Washington, Baji has been with Cohere for about two years. “With Cohere’s world-class team, singular focus and clear strategy, it is well positioned to lead the effort to accelerate wider enterprise adoption,” concluded Baji. 

The post Cohere Picks Enterprise AI Needs Over ‘Abstract Concepts Like AGI’ appeared first on AIM.

]]>
Another Indian Startup is Entering the AI Cloud Space with 40,000 GPUs https://analyticsindiamag.com/industry-insights/ai-startups/another-indian-startup-entering-ai-cloud-space-40000-gpus/ Wed, 05 Jun 2024 08:33:36 +0000 https://analyticsindiamag.com/?p=10122524

The startup has placed an order for 8000 NVIDIA GPUs with HPE.

The post Another Indian Startup is Entering the AI Cloud Space with 40,000 GPUs appeared first on AIM.

]]>

Narendra Sen, the CEO of NeevCloud, is a man of ambition. He envisions constructing an AI cloud infrastructure tailored for Indian clients, comprising 40,000 graphics processing units (GPUs) by 2026. This infrastructure aims to support Indian enterprises with training, inference, and other AI workloads.

Yotta, another prominent AI cloud provider, recently gained attention as the inaugural NVIDIA Partner Network (NPN) cloud partner in India, achieving Elite Partner status globally.

Yotta’s ambitious plan is to set up an AI cloud infrastructure with 32,768 GPUs by the end of 2025. NeevCloud wants to do better.

Moreover, NeevCloud will soon launch an AI inferencing platform offering open-source models like the Llama 3 series, Mistral models, and DBRX by Databricks.

“Later this month, we plan to introduce the DBRX because we see a great demand for the model in this country. Subsequently, we will launch the text-to-image models from Stability AI.

“We are focusing on a step-by-step approach to ensure smooth operation. User experience is paramount, and if everything proceeds as planned, we might expand the range. Meanwhile, we are also enhancing our capabilities in tandem with these developments,” Sen told AIM.

AI Inference 

Inferencing involves applying the pre-trained model to new input data to make predictions or decisions. So far, around 3,500 developers have signed up to use NeevCloud’s inference platform.
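
As a minimal sketch of that distinction, the snippet below applies fixed, already-learned weights (hypothetical stand-ins for a real pre-trained model) to new inputs; no learning happens at inference time, only a forward pass:

```python
# Weights fixed at training time; inference never updates them.
PRETRAINED_WEIGHTS = [0.8, -0.4, 0.1]
BIAS = 0.2

def predict(features):
    """Inference step: a forward pass over new input data."""
    score = BIAS + sum(w * x for w, x in zip(PRETRAINED_WEIGHTS, features))
    return 1 if score > 0 else 0  # e.g. a binary decision

# A new, previously unseen input arrives at the inference platform.
print(predict([1.0, 2.0, 0.5]))  # prints 1 (score = 0.25 > 0)
```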

The company plans to launch the beta version of the platform this month and provide developers with free tokens to entice them to test NeevCloud's inference platform.

For the inferencing platform, the company plans to leverage around 100 NVIDIA GPUs, including H100, A100, and L40 GPUs. Moreover, NeevCloud plans to add AMD's MI300X to the mix. 

"AMD's MI300X provides cost benefits compared to the rest. Indians don't care about the brand. What they care about is that the API should work, latency should be fast, and they should get immediate tokens – that's it," Sen stated.

NeevCloud could also be the first company to bring Groq's language processing units (LPUs) to India. Recently, Sen posted a photo of himself on LinkedIn, posing with Groq CEO Jonathan Ross at the company's headquarters in San Francisco. 

(Source: LinkedIn) 

Sen, however, refrained from revealing much in this regard as things are still not finalised. “While we all know the big names like NVIDIA, there are other players in the market that we are trying to onboard, like SambaNova,” he said.

AI Infrastructure as a Service 

For the AI cloud, Sen revealed that they have placed an order with Hewlett Packard Enterprise (HPE) for 8,000 NVIDIA GPUs, which they expect to receive in the second half of this year. 

NeevCloud will compete directly with Yotta in the AI-infrastructure-as-a-service space. Notably, another company that is already making GPUs more accessible in India is E2E Networks. 

The NSE-listed company offers NVIDIA’s H100 GPU and NVIDIA A100 Tensor Core GPUs at a competitive price compared to large hyperscalers. 

Recently, Bhavish Aggarwal, the founder of Ola Cabs, announced his decision to offer Krutrim AI Cloud to Indian developers. 

Similarly, Tata Communications also partnered with NVIDIA to build a large-scale AI cloud infrastructure for its customers in the private as well as the public sector.

While Krutrim and Tata Communications have not revealed the number of GPUs they plan to deploy, NeevCloud plans to deploy another 12,000-15,000 GPUs by 2025. “Then, by 2026 we will deploy the remaining to hit the 40,000 GPUs target,” Sen said.

How will NeevCloud Fund the 40,000 GPU Acquisition?

Deploying a cluster of around 40,000 GPUs, however, is enormously capital-intensive: according to Sen, it will cost them approximately $1.5 billion. 
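
A back-of-the-envelope check makes Sen's estimate plausible; the per-GPU figures below are assumptions for illustration, not NeevCloud's actual pricing, which depends on the H100/A100/L40 mix and negotiated rates:

```python
# Hypothetical blended per-accelerator costs bracketing Sen's ~$1.5B figure.
gpus = 40_000
for unit_cost_usd in (30_000, 37_500, 45_000):
    total_bn = gpus * unit_cost_usd / 1e9
    print(f"${total_bn:.1f}B at ${unit_cost_usd:,} per GPU")
```

At a blended $37,500 per GPU, hardware alone lands exactly on $1.5 billion, before data-centre, power, and networking costs.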

While Yotta is backed by the Hiranandani Group, NeevCloud is banking on its data centre partners to help not only procure but also deploy the GPUs.

So far, NeevCloud has partnered with three large data centre companies in India, two based in Chennai and one in Mumbai. One of them is among the largest data centre operators in India, Sen said.

“What we have in place is a revenue-sharing model. While they already have the data centre infrastructure, they need to deploy the GPUs on our behalf, and NeevCloud will bring in the customers, who will access both their AI cloud capacity (to be deployed) and their data centre servers,” Sen said.

Sen established NeevCloud in 2023. However, he has been running Rackbank Datacenters, a data centre company in Indore, for many years. 

Located in Crystal IT Park, Indore, the data centre is 35,000 sq ft and has a capacity of 32,000+ servers.

The company has developed a liquid immersion cooling technology named Varuna to effectively cool high-density computing hardware used for AI, machine learning, and high-performance computing (HPC) tasks. 

This method entails immersing servers and other IT equipment in a dielectric, non-conductive coolant liquid, facilitating direct heat transfer from the components to the liquid.

The post Another Indian Startup is Entering the AI Cloud Space with 40,000 GPUs appeared first on AIM.

]]>
Databricks Acquires Data Management Startup Tabular for Over $1 Billion https://analyticsindiamag.com/industry-insights/ai-startups/databricks-acquires-data-management-startup-tabular-for-over-1-billion/ Tue, 04 Jun 2024 17:53:30 +0000 https://analyticsindiamag.com/?p=10122481

The acquisition of Tabular follows those of Arcion for $100 million, MosaicML for $1.3 billion, and Okera, all in 2023, as Databricks expands its capabilities in data management, AI, and security.

The post Databricks Acquires Data Management Startup Tabular for Over $1 Billion appeared first on AIM.

]]>

Databricks, a leading data and AI company, has announced its acquisition of Tabular, a data management startup founded by Ryan Blue, Daniel Weeks, and Jason Reid.

This strategic move aims to unify the two leading open-source lakehouse formats, Apache Iceberg and Delta Lake, to improve data compatibility and interoperability for enterprises.

The financial terms of the deal remain undisclosed, but industry estimates suggest the acquisition is valued between $1 billion and $2 billion.

A Shared Vision of Openness

Both Databricks and Tabular have a strong commitment to open-source formats. Databricks, known for its contributions to open-source projects, has donated 12 million lines of code and is the largest independent open-source company by revenue. This acquisition underscores Databricks’ dedication to open data formats, ensuring companies maintain control over their data and avoid vendor lock-in.

The Rise of Lakehouse Architecture

Databricks pioneered the lakehouse architecture in 2020, integrating traditional data warehousing with AI workloads on a single, governed copy of data.

This architecture, which relies on open formats, has been widely adopted, with 74% of enterprises deploying a lakehouse according to a survey by MIT Technology Review. The foundation of the lakehouse is open-source data formats that enable ACID transactions on data stored in object storage, improving reliability and performance.
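
The mechanism can be sketched conceptually; this is illustrative pseudologic, not actual Delta Lake or Iceberg code. A transaction becomes visible only when one small log entry is committed atomically, so readers see either all of a write or none of it:

```python
import json
import os
import tempfile

# A toy "table" directory standing in for an object store bucket.
table = tempfile.mkdtemp()
os.makedirs(os.path.join(table, "_log"))

def commit(version, data_files):
    """Publish data files by atomically creating one log entry."""
    entry = os.path.join(table, "_log", f"{version:020d}.json")
    # Exclusive-create ('x') fails if the version already exists:
    # exactly one writer wins, the mutual-exclusion primitive behind
    # log-based table formats.
    with open(entry, "x") as f:
        json.dump({"add": data_files}, f)

commit(0, ["part-000.parquet"])          # transaction 0 becomes visible
try:
    commit(0, ["part-conflict.parquet"]) # concurrent writer at version 0
except FileExistsError:
    print("conflicting commit rejected")
```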

Addressing Format Incompatibility

Despite their shared goals, Delta Lake and Iceberg, the two leading open-source lakehouse formats, have developed independently, leading to incompatibility.

This fragmentation has undermined the value of the lakehouse architecture by siloing enterprise data. Databricks aims to address this issue by working closely with the Iceberg and Delta Lake communities to bring interoperability to the formats.

The introduction of Delta Lake UniForm last year was a step towards this goal, providing compatibility across Delta Lake, Iceberg, and Hudi.

Future Plans and Integration

With the acquisition of Tabular, Databricks plans to invest heavily in expanding the ambitions of Delta Lake UniForm. The integration of Tabular’s technology and expertise will help Databricks enhance its data management platform, enabling companies to leverage AI more effectively.

The acquisition is expected to close in Databricks’ second fiscal quarter, subject to customary closing conditions.

This acquisition marks a significant step in Databricks’ strategy to strengthen its position in the market and offer a more powerful and versatile data management and AI solution to its customers.

The post Databricks Acquires Data Management Startup Tabular for Over $1 Billion appeared first on AIM.

]]>
Jio Backed Startup TWO AI Unveils ChatSUTRA, India’s Answer to ChatGPT https://analyticsindiamag.com/industry-insights/ai-startups/jio-backed-startup-two-ai-unveils-chatsutra-indias-answer-to-chatgpt/ Tue, 04 Jun 2024 12:39:42 +0000 https://analyticsindiamag.com/?p=10122474

ChatSUTRA is powered by SUTRA by TWO AI, models trained with an innovative LLM architecture that learns new languages independently, making it multilingual, highly scalable, and cost-efficient.

The post Jio Backed Startup TWO AI Unveils ChatSUTRA, India’s Answer to ChatGPT appeared first on AIM.

]]>

Jio-backed TWO AI announced the launch of SUTRA through its new AI app, ChatSUTRA, available at two.chat.ai. ChatSUTRA is now accessible via the web and will soon be available on iOS and Android.

The startup raised a $20-million seed round in February 2022 from Jio Platforms and South Korean internet conglomerate Naver. "Jio has been one of our key partners for a long time and has invested in us from the very beginning," said Pranav Mistry, the founder of TWO, in an exclusive interaction with AIM. 

He added that Reliance Jio Infocomm chairman Akash Ambani takes a keen interest in the growth of the startup. “I meet with them often. Jio’s vision is to bring the power of AI through its services. Being a Jio partner gives us access to this market,” he said.

Multilingual versatility is at the core of ChatSUTRA’s power, enabling users to engage in AI conversations in over 50 languages without language confusion or cultural hallucination. Supported languages include Hindi, Gujarati, Bengali, Marathi, Korean, Japanese, Greek, and Arabic.

Eighty percent of the world speaks languages other than English. ChatSUTRA aims to give non-English speakers equal access to the best in generative AI, enabling them to seek knowledge, find answers to complex questions, and explore different cultures. 

Whether users need advanced translation, text summarisation, creative and collaborative writing, recipes for foreign dishes, or DIY repair instructions, ChatSUTRA delivers quick and comprehensive answers.

The ChatSUTRA interface is simple and straightforward, featuring a list of saved conversations on the left, a center area filled with “Conversation Starter Cards,” and a variety of languages to choose from. Users can log in to store conversations across devices and access them anywhere.

ChatSUTRA-Pro, to be released soon, will offer early access to TWO’s latest features, higher chat message limits, and access to SUTRA’s best-performing models. Additionally, SUTRA is available as an API for developers at docs.two.ai.

Mistry, the driving force behind ChatSUTRA, said, “ChatSUTRA represents the best in Multilingual AI assistance available for the world, not just English speakers. Our mission is to fix the language gap in AI language models, and with the launch of ChatSUTRA, I invite everyone to try AI for themselves in हिंदी, ગુજરાતી, বাংলা, العربية, मराठी, తెలుగు, தமிழ், ಕನ್ನಡ, മലയാളം, ଓଡ଼ିଆ, ਪੰਜਾਬੀ, and over 50 other languages.”

“SUTRA is AI for all, AI that can understand nuances of languages and dialects. AI beyond just English,” he added.

ChatSUTRA is powered by SUTRA by TWO AI, models trained with an innovative LLM architecture that learns new languages independently, making it multilingual, highly scalable, and cost-efficient.

The major advances in AI, made primarily in English, have left the 80% of the world's population that speaks other languages on the sidelines. Existing AI models predominantly train and scale in English, limiting access to high-quality LLMs for non-English speakers. 

ChatSUTRA aims to address this gap, unlocking AI’s potential in large economies such as India, Korea, Japan, and the MEA region.

ChatGPT, Claude.ai, and Perplexity exhibit some multilingual capabilities but often revert to English and struggle with complex multilingual tasks, especially in lower-resource languages. 

Traditional multilingual LLMs train on data biased towards English, leading to confusion and reduced performance across languages.

ChatSUTRA will continue to evolve with new features, advanced SUTRA-Pro models, and mobile versions on iOS and Android.

The post Jio Backed Startup TWO AI Unveils ChatSUTRA, India’s Answer to ChatGPT appeared first on AIM.

]]>
Zoho Invests in Drone Startup Yali Aerospace to Solve Emergency Medical Deliveries  https://analyticsindiamag.com/industry-insights/ai-startups/zoho-invests-in-drone-startup-yali-aerospace-to-solve-emergency-medical-deliveries/ Tue, 28 May 2024 07:18:13 +0000 https://analyticsindiamag.com/?p=10121847

The investment in Thanjavur-based startup adds to Zoho’s push of encouraging rural players.

The post Zoho Invests in Drone Startup Yali Aerospace to Solve Emergency Medical Deliveries  appeared first on AIM.

]]>

Sridhar Vembu, founder and CEO of SaaS giant Zoho Corporation, announced that the company will invest in Yali Aerospace. The drone startup, based in Thanjavur, Tamil Nadu, has built a fixed-wing drone with vertical take-off and landing capabilities. 

With a range of 150 km, a payload of 7 kg, and a maximum speed of 155 km/h, the drone can help deliver medicines and organs to remote hospitals, bypassing the problems of emergency road transport.
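
Those specs imply a rough upper bound on flight time; the figure below is a back-of-the-envelope estimate from the quoted numbers, not one published by Yali Aerospace, and it assumes constant cruise at the 155 km/h maximum, which overstates a realistic cruise speed:

```python
# Endurance implied by the quoted range at the quoted top speed.
range_km = 150
max_speed_kmh = 155
endurance_min = range_km / max_speed_kmh * 60
print(round(endurance_min), "minutes")  # ≈ 58 minutes at top speed
```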

Source: X

Yali Aerospace is led by husband-wife duo Dinesh Baluraj and Anugraha, who returned from the Netherlands to build the startup in their hometown. Baluraj comes from an aerospace engineering background, having done his master's at the Technical University of Munich. 

Zoho’s Rural Push

Vembu has always been a strong voice for rural and local growth and has been working continuously to grow tier-2 and tier-3 cities. Zoho has opened over 25 satellite offices in such cities. 

Recently, the company also opened R&D centres in Tenkasi (Tamil Nadu) and Kottarakara (Kerala) to retain local talent and encourage people to move closer to their hometowns. The 3,500 sq. ft. park in Kottarakara is expected to train over 5,000 people over the next five years.

“I have to look at how to positively create jobs, moving skills and capabilities, and ensuring that we retain our talent, because there is a lot of pressure in rural areas for talent to migrate away,” said Vembu in an earlier interview. 

Zoho is the first bootstrapped company in the world to cross 100 million users, and it serves over 700K businesses across 150 countries.

The post Zoho Invests in Drone Startup Yali Aerospace to Solve Emergency Medical Deliveries  appeared first on AIM.

]]>
Top 12 Indian Generative AI Startups in 2024 Building GenAI Products https://analyticsindiamag.com/industry-insights/ai-startups/12-indian-genai-startups-building-insane-products-you-should-know-about/ Thu, 23 May 2024 06:37:52 +0000 https://analyticsindiamag.com/?p=10121314

From building AI agents that can converse in Indic languages to developing AI chatbots that help UPSC aspirants prep better, these Indian startups can do it all!

The post Top 12 Indian Generative AI Startups in 2024 Building GenAI Products appeared first on AIM.

]]>

Infosys co-founder Narayana Murthy recently said that Indians are good at applying ideas generated elsewhere for the betterment of the nation. He also added that it would take time for the country to invent new things.

“There are going to be APIs and people are going to use them. That’s the way things will get built. It’s not bad to be a wrapper, it’s just that you shouldn’t be a shallow wrapper. You have to think about the value you’re adding on top of the model,” said Google CEO Sundar Pichai.

India might have arrived late to the AI party, but the future of AI in India is not that bleak. With an increase in investments, more initiatives like AI4Bharat, and industry and academia partnerships to bolster research in the country, India can definitely up its AI game!

Top GenAI Indian Startups in 2024

Name | Models/Industry | Founder Name
KOGO AI | Indic languages | Raj K Gopalakrishnan
Sarvam AI | LLMs | Vivek Raghavan and Pratyush Kumar
PAiGPT | AI chatbot | Eshank Agarwal, Addya Rai, Siddharth Singh, and Deepanshu Singh
Soket AI Labs | Ethical AGI | Abhishek Upperwal
KissanAI | Agri-tech startup | Pratik Desai
Subtl.ai | Llama 3 8B model | Vishnu Ramesh
dot.agent | Agent Management System | Anurag Bisoi
Stition.ai | Security for LLMs | Mufeed VH
CognitiveLab | Indic LLM | Adithya S Kolavi
MachineHack | Data analysis tool | Bhasker Gupta
TWO | Multilingual and cost-efficient language models | Pranav Mistry
Tensoic | LLaMA | Adarsh Shirawalmath

Here are 12 Indian startups that are leading the GenAI wave in India. 

1. KOGO AI

Bengaluru-based deep tech startup KOGO AI has developed a platform that helps companies build AI agents that can converse in Indic languages. Using the platform, companies can build an AI agent from scratch within minutes.

Initially, these agents will be able to support conversations in Urdu, Hindi, and English, with plans to include another 73 languages, both Indian and global, soon.

For this, the Bengaluru-based startup has partnered with Bhashini, the Indian government’s initiative aimed at breaking language barriers in India, and Microsoft to make the agents multilingual.

2. Sarvam AI

Established in July 2023, Sarvam AI was co-founded by Vivek Raghavan and Pratyush Kumar to make generative AI accessible to everyone in India at scale. 

“We think this is a foundational technology, and we don’t want India to become solely a prompt engineering nation,” said Raghavan in an exclusive interview with AIM. 

The company has raised $41 million in its Series A funding round led by Lightspeed Ventures with participation from Peak XV Partners and Khosla Ventures.

Last year, Sarvam AI also open-sourced OpenHathi, an Indic Hindi LLM built on top of Llama 2. On Hugging Face, the model was downloaded more than 18,000 times last month.

It also recently open-sourced 'Samvaad', a curated dataset of 100,000 high-quality conversations in English, Hindi, and Hinglish, totalling over 700,000 turns.

Further, Sarvam AI is collaborating with Meta to develop vernacular LLMs and has partnered with Microsoft to create an Indic voice-based LLM.

3. PAiGPT

PAiGPT, India’s first AI chatbot for UPSC aspirants, recently released its app for Android and iOS. 

The app’s USP is its ability to fetch real-time information on various topics and current affairs, similar to Perplexity AI and Google Gemini. However, what sets it apart is its feature that provides trending topics and the option to create multiple-choice questions based on the available information.

https://twitter.com/deepanshuS27/status/1790706717971660839

Founded in September 2022 by Eshank Agarwal, Addya Rai, Siddharth Singh, and Deepanshu Singh, the app also allows aspirants to upload images of editorials from popular newspapers and then generate summaries. 

4. Soket AI Labs

India now has a company building solutions to achieve AGI and beyond. Soket Labs is an AI research firm with a vision of advancing AI towards ethical AGI.

Founded in 2019 by Abhishek Upperwal, the company is part of NVIDIA’s Inception Programme and AWS Activate for training compute access.

Soket AI Labs recently introduced Pragna-1B, India’s first open-source multilingual model designed to cater to the linguistic diversity of the country. Available in Hindi, Gujarati, Bangla, and English, the model comes with 1.25 billion parameters and a context length of 2048 tokens.

5. KissanAI

In a major step forward for AI in agriculture, agri-tech startup KissanAI recently launched Dhenu Vision LLMs for crop disease detection.

Last year, KissanAI also released Dhenu 1.0, an agricultural LLM tailored for Indian farmers. Recently, it released Dhenu Llama 3, fine-tuned on Llama 3 8B.

The agriculture generative AI startup also teamed up with UNDP to develop the pioneering voice-based vernacular generative AI CoPilot for Climate Resilient Agriculture (CRA) practices. This initiative aims to deliver crucial advice to thousands of Indian farmers, especially smallholders who have been hit hard by climate change.

6. Subtl.ai

Subtl.ai is addressing the challenges of generative AI in enterprise environments. It focuses on creating solutions that let enterprises handle sensitive data securely without exposing it to the internet.

Vishnu Ramesh, founder of Subtl.ai, calls it a ‘private Perplexity built on light models for enterprise’. 

Subtl.ai has developed a proprietary product that leverages the Llama 3 8B model, allowing businesses like the State Bank of India to access and respond to inquiries quickly and securely, directly citing provided sources of information. 

7. dot.agent 

dot.agent is the world’s first AMS (AI/Agent Management System), a central hub that directs requests to the most suitable AI agent or model for the task. This “smart dispatcher” continuously learns from your data and adapts to your specific use case.

The company claims this allows Dot to outperform AI models like GPT-4 and Devin in real-world use cases, potentially reducing AI costs by up to 60%. Dot for Code Generation is also purportedly 8x better than GPT-4.
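The article does not describe dot.agent’s internals, but the “smart dispatcher” idea — route each request to the agent whose capabilities best match it, preferring cheaper options on ties — can be sketched roughly as follows. All names and the scoring rule here are illustrative assumptions, not dot.agent’s actual design:

```python
# Hypothetical sketch of an AI-agent router: dispatch each request to the
# agent whose declared capabilities best match the request's tags, using
# relative cost as a tie-breaker. Purely illustrative.

from dataclasses import dataclass


@dataclass
class Agent:
    name: str
    capabilities: set
    cost_per_call: float  # relative cost, lower is preferred on ties


class Router:
    def __init__(self, agents):
        self.agents = agents

    def route(self, request_tags: set) -> Agent:
        # Score = number of matching capabilities; among equal scores,
        # the cheaper agent wins (hence the negated cost in the key).
        return max(
            self.agents,
            key=lambda a: (len(a.capabilities & request_tags), -a.cost_per_call),
        )


router = Router([
    Agent("code-gen", {"code", "python"}, cost_per_call=1.0),
    Agent("general-llm", {"chat", "code", "search"}, cost_per_call=3.0),
])

print(router.route({"code", "python"}).name)  # -> "code-gen" (2 matches, cheaper)
print(router.route({"search"}).name)          # -> "general-llm" (only match)
```

A production system would presumably learn the scoring function from usage data rather than hard-code it, which is what “continuously learns from your data” suggests.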

8. Stition.ai 

Stition.ai builds security products for LLMs. Its product, which can automatically find safety flaws and patch vulnerabilities without human intervention, has been in public beta since December, with a full release expected soon.

https://twitter.com/mufeedvh/status/1789278065685872754

Mufeed VH, the founder of Stition.ai, recently released an open-source passion project called Devika. This Indian answer to Devin can understand human instructions, break them down into tasks, conduct research, and autonomously write code to achieve set objectives.

9. CognitiveLab

Founded by Adithya S Kolavi, CognitiveLab recently released an Indic LLM leaderboard for the growing number of Indic language models, which until now have entered the scene without a uniform evaluation framework.

The Indic LLM leaderboard offers support for seven Indic languages – Hindi, Kannada, Tamil, Telugu, Malayalam, Marathi, and Gujarati – providing a comprehensive assessment platform. Hosted on Hugging Face, it currently supports four Indic benchmarks, with plans for additional benchmarks in the future.


10. MachineHack

MachineHack Generative AI, one of the few pure-play generative AI startups in India, has launched DataLyze, a generative AI data analysis tool, making data analytics accessible to everyone. 

Launched in 2018, MachineHack is an all-in-one platform designed for data engineers, data scientists, machine learning practitioners, and developers at all levels. Users can enhance their skills, compare their expertise with peers, write articles, learn coding, apply for jobs, and build impressive portfolios.

11. TWO

TWO is a tech company that aims to redefine human-AI interactions through its proprietary multilingual and cost-efficient language models called SUTRA. These are ultrafast, multilingual, online generative AI models that can operate in 50+ languages with conversational, search, and visual capabilities.

SUTRA-Online models are internet-connected and billed as hallucination-free: they understand queries, browse the web and summarise information to provide current answers, accurately handling questions like “Who won the game last night?” or “What’s the current stock price?”

12. Tensoic

Mumbai-based software development company Tensoic released Kannada Llama (Kan-LLaMA), a 7B Llama-2 model LoRA pre-trained and fine-tuned on Kannada tokens.

Just a few days after releasing Kan-LLaMA, the researchers also released a playground to test the model. Running on NVIDIA A100 GPUs, the playground was launched in partnership with E2E Networks, one of the biggest providers of cloud GPUs in India.
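LoRA, the technique Tensoic used, freezes the pretrained weights and trains only a small low-rank update. A minimal numerical illustration of the idea (not Tensoic’s code, and the dimensions are made up for clarity):

```python
import numpy as np

# LoRA in miniature: a frozen base weight W is adapted by a low-rank
# product B @ A, so only r * (d_in + d_out) parameters are trained
# instead of d_in * d_out.

rng = np.random.default_rng(0)
d_in, d_out, r = 64, 64, 4            # rank r is much smaller than d_in, d_out

W = rng.normal(size=(d_out, d_in))    # frozen pretrained weight
A = rng.normal(size=(r, d_in)) * 0.01 # trainable down-projection
B = np.zeros((d_out, r))              # trainable up-projection, zero-initialised

def forward(x):
    # Adapted layer: y = (W + B @ A) @ x, computed without materialising B @ A.
    return W @ x + B @ (A @ x)

x = rng.normal(size=d_in)
# With B = 0, the adapter is a no-op: the adapted layer equals the base layer.
assert np.allclose(forward(x), W @ x)

trainable = A.size + B.size
print(trainable, W.size)  # 512 vs 4096: an 8x reduction in trainable parameters
```

In practice this is done with a library such as Hugging Face PEFT across the attention projections of the Llama model, but the parameter-count arithmetic is the same.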

The post Top 12 Indian Generative AI Startups in 2024 Building GenAI Products appeared first on AIM.

]]>
This Akash Ambani-Backed Startup is Building Multilingual LLMs for India https://analyticsindiamag.com/industry-insights/ai-startups/this-akash-ambani-backed-startup-is-building-multilingual-llms-for-india/ Thu, 23 May 2024 05:02:23 +0000 https://analyticsindiamag.com/?p=10121307

SUTRA is a model built from scratch, not fine-tuned or based on any other LLM.

The post This Akash Ambani-Backed Startup is Building Multilingual LLMs for India appeared first on AIM.

]]>

TWO, a startup backed by Reliance Jio, recently launched a family of models called SUTRA. These cost-efficient, multilingual generative AI models excel in 50+ languages, offering speech, search, and visual processing capabilities. 

The startup raised a $20-million seed round in February 2022 from Jio Platforms and South Korean internet conglomerate Naver. “Jio has been one of our key partners for a long time and has invested in us from the very beginning,” said Pranav Mistry, the founder of TWO, in an exclusive interaction with AIM.

He added that Reliance Jio Infocomm chairman Akash Ambani takes a keen interest in the growth of the startup. “I meet with them often. Jio’s vision is to bring the power of AI through its services. Being a Jio partner gives us access to this market,” he said.

Before founding TWO in 2021, Mistry served as Samsung Technology & Advanced Research Labs’ (STAR Labs) President and CEO. 

In 2009, Mistry developed SixthSense, a wearable gestural interface that integrates digital information with the physical world, enabling users to interact with data using natural hand gestures. This technology was introduced during his TEDIndia talk in 2009 and has since garnered widespread attention.

TWO’s SUTRA Line of Products

As of now, TWO offers four models on the SUTRA playground: Sutra Light, Sutra Pro, Sutra Turbo, and Sutra Online. “Some of our partners in Korea and India have already started evaluating our models and conducting pilots in their own products,” said Mistry.

In terms of capabilities, Mistry said, “SUTRA models are 56 billion parameters,” adding that this is very small compared to trillion-parameter models like OpenAI’s GPT-4o.

“The power of small models is that they can run very efficiently and at a very low cost. In order to run this model, we require a single NVIDIA RTX A6000 GPU,” added Mistry.

TWO is planning to launch ChatSUTRA this month, a platform where users can start using SUTRA’s multilingual models in 50+ languages for almost any task – to chat, question, learn, brainstorm, write, and more. 

TWO also has an AI-powered social media app called Zappy, which is quite popular in South Korea. “One of our apps, Zappy, which uses millions of AI-to-user conversations, is powered by SUTRA. Right now, it’s available in Korea, and we are planning to bring Zappy to India very soon this summer,” said Mistry.

Another product from TWO is Geniya, which can browse data from the internet using Google, rivalling Perplexity AI. Mistry said that Geniya is still in public beta and users can try it out ahead of the official launch, expected sometime in June.

SUTRA’s Architecture 

SUTRA is a model built from scratch, not fine-tuned from or based on any other LLM. It combines an LLM with neural machine translation (NMT) to accurately handle idiomatic expressions and colloquial language. “Our specialised NMT models are significantly smaller in parameter size, requiring much less data for training,” Mistry said.

This ensures that SUTRA not only grasps the literal meaning of given inputs but also understands the cultural context, which is essential for effective communication.
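SUTRA’s exact architecture is proprietary, but one common way to combine an NMT component with an LLM is a translate → reason → translate-back pipeline. The sketch below is purely illustrative; the stub functions stand in for real models and are assumptions, not SUTRA’s implementation:

```python
# Illustrative sketch of pairing NMT models with an LLM. Each stub would be
# a real model call in practice; the tags make the data flow visible.

def nmt_to_english(text: str, src_lang: str) -> str:
    # Stub: a real NMT model would translate src_lang -> English here.
    return f"[en] {text}"

def llm_answer(prompt: str) -> str:
    # Stub: a real LLM would generate the answer here.
    return f"answer({prompt})"

def nmt_from_english(text: str, tgt_lang: str) -> str:
    # Stub: a real NMT model would translate English -> tgt_lang here.
    return f"[{tgt_lang}] {text}"

def multilingual_answer(query: str, lang: str) -> str:
    english = nmt_to_english(query, lang)   # 1. normalise into English
    answer = llm_answer(english)            # 2. reason in English
    return nmt_from_english(answer, lang)   # 3. render back in the user's language

print(multilingual_answer("namaste", "hi"))
# -> "[hi] answer([en] namaste)"
```

The appeal of the split, as Mistry’s remarks suggest, is that the translation models can be far smaller and cheaper to train than the core LLM, since they only need to handle language transfer, not reasoning.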

Mistry also highlighted that they have a dataset advantage, having trained SUTRA on the millions of user-to-AI conversations happening on Zappy.

“We can actually use the user-to-AI conversation data in order to improve the quality of SUTRA,” said Mistry, adding that they have Korean data from over 20 million conversations on which SUTRA was originally trained.

SUTRA’s Customers 

SUTRA models are currently available as APIs as well. Mistry said that he thinks that the Asia Pacific market is a huge opportunity for non-English AI models. 

“We have access to companies like Jio, as well as Naver and SK Telecom in Korea. We want to work with these telecom companies to bring the power of their cloud and edge networks to distribute the power of SUTRA,” said Mistry.

SUTRA is Not Alone 

The Indian AI startup ecosystem is currently booming. Sarvam AI launched the OpenHathi series last year and is currently working on Indic voice LLMs. Meanwhile, Tech Mahindra is working on ‘Project Indus’.

This month, the Hanooman model was jointly released by SML India and 3AI Holding, an Abu Dhabi-based investment firm. Bengaluru-based CoRover also introduced BharatGPT earlier this year.

In the meantime, Ola Cabs chief Bhavish Aggarwal is building Krutrim AI. Additionally, the Nilekani Center at AI4Bharat in IIT Madras released Airavata, an open-source LLM for Indian languages.

“I am aware of Sarvam AI, Krutrim AI, as well as the work from Tech Mahindra and SML’s Hanooman,” said Mistry. 

However, Mistry believes that it’s not so much about the competition. “It’s about more people working together towards the goal of bringing the power of AI to India and SUTRA wants to be a part of this journey,” he concluded.

The post This Akash Ambani-Backed Startup is Building Multilingual LLMs for India appeared first on AIM.

]]>
How to Build Sustainable AI Startups https://analyticsindiamag.com/industry-insights/ai-startups/how-to-build-sustainable-ai-startups/ Wed, 22 May 2024 05:20:49 +0000 https://analyticsindiamag.com/?p=10121216

Sam Altman believes in not building an AI business but rather a business which has AI as a technology.

The post How to Build Sustainable AI Startups appeared first on AIM.

]]>

With the advancements brought in by GPT models, the latest being GPT-4o, building AI startups that can last and grow over time has become increasingly important.

In a recent podcast, OpenAI chief Sam Altman spoke about how to either create a business that thrives even if the next AI model isn’t significantly better, or develop a system that gets more useful as AI models improve or advance. 

Additionally, he favoured not building an “AI business” in most cases, but rather a business that uses AI as a technology. As an example, he drew parallels to the early days of the App Store, where many people built simple apps, like flashlights, that became obsolete with an iOS upgrade.

Meanwhile, companies like Uber established sustainable businesses as smartphones improved, leveraging the phone as the key technology enabling their operations.

How OpenAI Plans to Monetise

Recently, OpenAI made GPT-4o available to everyone for free (with usage limits), offering features like browsing, data analysis, and memory. 

Additionally, Plus users receive up to 5x higher limits and earliest access to features like the new macOS desktop app and next-generation voice and video capabilities. The move highlights OpenAI’s efforts, as discussed in the podcast, to encourage users to upgrade to its paid plans.

Altman said they are yet to figure out how to sustainably make an expensive technology like GPT-4 available to users for free. He emphasised that while they aim to provide advanced AI tools for free or at minimal cost as part of their mission, the high expenses currently pose a significant barrier.

Meanwhile, OpenAI recently became a Reddit advertising partner, which likely indicates that the company can leverage Reddit’s large user base to advertise its own products and services, potentially driving more customers and revenue.

OpenAI’s revenue for this year has surpassed the $2-billion mark, according to the Financial Times. Like OpenAI, which continues to demonstrate revenue generation, startups must ensure their business models can be sustained in the long run.

Do Startups Need to Follow Big Companies? 

A few days ago, Cred founder Kunal Shah cast a wide net on X, asking the direct question, “Who is building an AI application in India?” The post received some 300-400 responses. 

Dharmesh BA, who is working on a stealth startup, noted that many of the products were simply wrappers around existing models in various modalities. He categorised these apps as CRUD (Create, Read, Update, Delete) apps and warned that building products on the assumption that OpenAI or current LLMs can’t perform specific tasks could lead to disaster.

Each time OpenAI updates or releases a new version, many startups find themselves rendered obsolete because the enhanced capabilities of OpenAI often solve the problems these startups were aiming to address.

When OpenAI introduced ChatGPT Enterprise, it sent shockwaves across several SaaS startups that had developed products around ChatGPT or offered wrappers based on ChatGPT APIs for business clients. 

Additionally, Dharmesh’s post highlighted how such apps attempt to confine an extremely powerful technology like LLMs, which can be compared to a genie capable of doing almost anything, into a limited space such as a mobile app or website. 

These technologies are capable of far more complicated and valuable work, and by limiting their potential, we are not utilising them in the medium they are meant to reside in. 

What About Indian Startups? 

In yet another post on X, the Cred founder said that early-stage startups should be easy to iterate and late-stage startups should be hard to distract. The remark points to a tendency among Indian startups to neither iterate nor innovate enough.

In India, researchers and enterprises should prioritise building large models, technical benchmarking, and AI industrial standardisation over developing specific use case apps, which are easily replicated and improved upon. 

Most envision LLMs as operating systems on which users choose their own apps, but those apps’ longevity depends on the base provider’s architecture, such as OpenAI’s. 

But as AIM wrote, the question remains as to why such research isn’t being conducted domestically, especially as tech giants like OpenAI and Google focus more on Indic languages, posing a threat to those developing for the Indian ecosystem.

The post How to Build Sustainable AI Startups appeared first on AIM.

]]>
E2E Cloud and AIC-NITF Partner to Boost Indian AI Startup Ecosystem https://analyticsindiamag.com/industry-insights/ai-startups/e2e-cloud-and-aic-nitf-partner-to-boost-indian-ai-startup-ecosystem/ Tue, 21 May 2024 05:53:14 +0000 https://analyticsindiamag.com/?p=10121091

E2E Cloud announced a special initiative for incubators: the first 25 incubators will receive GPU credits worth up to 2 lakh rupees each.

The post E2E Cloud and AIC-NITF Partner to Boost Indian AI Startup Ecosystem appeared first on AIM.

]]>

E2E Cloud, a leader in GPU cloud solutions, has entered a strategic partnership with Atal Incubation Centre – Nalanda Institute of Technology Foundation (AIC-NITF), backed by the Atal Innovation Mission under NITI Aayog. 

This collaboration aims to enhance innovation and entrepreneurship in India, particularly in AI.

Through a newly signed MoU, E2E Cloud will provide its state-of-the-art GPU infrastructure and expertise to startups incubated at AIC-NITF. This includes access to 64 H100 GPU super pods, expandable to 2048 GPUs, which will support startups in their innovation and scaling efforts.

Tarun Dua, CEO of E2E Cloud, highlighted the importance of this partnership, stating, “Our collaboration with Atal Incubation Centre – Nalanda embodies our shared commitment to nurturing technological innovation. By providing startups at Nalanda with advanced GPU resources at affordable rates, we aim to reduce the barriers for startups to build and scale.”

AIC-NITF, dedicated to promoting innovation and entrepreneurship, will leverage E2E Cloud’s resources to help startups tackle complex challenges and introduce groundbreaking solutions. Durga Prasad Gouda, CEO of AIC-NITF, remarked, “This collaboration with E2E Cloud aligns perfectly with our mission to nurture technology-led startups. Access to cutting-edge GPU technology will enable our incubatees to accelerate their research and development efforts, paving the way for impactful solutions.”

Furthermore, E2E Cloud announced a special initiative for incubators: the first 25 incubators will receive GPU credits worth up to 2 lakh rupees each, enhancing their ability to leverage advanced technology and expand their operations.

The post E2E Cloud and AIC-NITF Partner to Boost Indian AI Startup Ecosystem appeared first on AIM.

]]>