Wake Me Up When Companies Start Hiring Clueless Modern ‘Developers’ https://analyticsindiamag.com/ai-origins-evolution/wake-me-up-when-companies-start-hiring-clueless-modern-developers/ Mon, 02 Sep 2024 09:00:42 +0000 https://analyticsindiamag.com/?p=10134230

People who know how to drive are not all F1 racers.

“Programming is no longer hard” and “everyone’s a developer” are among the most common refrains on LinkedIn or X these days, with everyone talking about Cursor, Claude, or GitHub Copilot. The problem is that most of the people making these claims are not developers themselves. They are merely ‘modern developers’.

Santiago Valdarrama, founder of Tideily and an ML teacher who has been actively asking developers whether they use Cursor, started another discussion arguing that Cursor and similar tools can only assist existing developers in writing better code. “Wake me up when companies start hiring these clueless modern ‘developers’,” he added.

He gave an analogy of calling yourself an F1 racer after playing a racing game on an iPad.

In all honesty, it is undeniable that the barrier to entry for becoming a developer has dropped significantly since the arrival of Cursor, and even ChatGPT. People have been able to build software for personal use, and even whole apps, in mere hours. But this does not change the fact that, for now, such tools are limited to creating simple apps and basic software.

“You Can’t Correct Code if You Don’t Know How to Code”

Given all the hype around the end of software engineering roles, developers and programmers are getting worried about the future of their jobs. It is true that software engineers have to upskill faster than anyone else, but the fear of being replaced can be put off for at least a few more years.

Tools such as Cursor and Claude are only useful if a developer actually knows how the code works. The real game-changer is that developers who use AI will outpace those who don’t. “The right tools can turn a good developer into a great one. It’s not about replacing talent; it’s about enhancing it,” said Eswar Bhageerath, SWE at Microsoft.

AI only takes care of the easy part for a developer – writing code. The real skill that a developer has is reasoning and problem solving, apart from fixing the bugs in the code itself, which cannot be replaced by any AI tool, at least anytime soon. Cursor can only speed up the process and write the code but correcting the code is something that only developers can do.

Moreover, bugs that AI tools introduce into code are not easily traceable by developers without yet another AI bug-detection tool. Andrej Karpathy, who has been actively championing Cursor over GitHub Copilot, shared a similar observation: “it’s slightly too convenient to just have it do things and move on when it seems to work.” He noted that this has led to a few bugs slipping in when he codes too fast and taps through big chunks of generated code.

These bugs cannot be fixed by modern ‘developers’ who were famously also called ‘prompt engineers’. To put it simply, someone has to code the code for no-code software.

Speaking of prompt engineers, the future will include plenty of AI agents that can write code themselves. The future job of software engineers will be managing a team of these AI coding agents – something developers who entered the field only by learning to build apps on Cursor or Claude cannot do. It is possible that team sizes will shrink soon, as there will be little need for entry-level developers.

Upskilling is the Need of the Hour

That is why existing developers should focus on developing engineering skills, and not just coding skills. Eric Gregori, adjunct professor at Southern New Hampshire University, said that this is why he has been teaching his students to focus more on engineering than just programming. “AI is too powerful of a tool to ignore,” he said, while adding that existing limitations of coding platforms have been removed completely. 

“Hopefully, AI will allow software engineers to spend more time engineering and less time programming.” It is time to bring back the old way of learning how to code as modern developers would be tempted to just copy and paste code from AI tools, and not do the real thinking. 

The F1 driver analogy fits perfectly here. Most people can learn how to drive, but very few will ever become race drivers. The same is the case with coding tools. But if all people need is a prototype or an initial version of the code, AI-driven developers can do a decent enough job.

That is why several pioneers of the AI field, such as Karpathy, Yann LeCun, Francois Chollet, and even Sam Altman, say there will still be 10 million coding jobs in the future – ones that require skills in Python, C++, and other languages – even as everyone becomes a ‘modern developer’ in some way and most of the coding is done by AI agents.

It is possible that most of the coding in the future would be in English, but most of it would be about debugging and managing the code generated by AI, which is not possible for someone who does not know coding from scratch.

How Do You Differentiate an AI Agent from a Human? https://analyticsindiamag.com/ai-insights-analysis/how-do-you-differentiate-an-ai-agent-from-a-human/ Thu, 29 Aug 2024 10:49:54 +0000 https://analyticsindiamag.com/?p=10134079

A survey revealed that 69% of Indians think they don’t know or cannot tell the difference between an AI voice and a real voice.

There are millions of small businesses in the world, and in the future, all of them could have AI agents carry out functions like customer support and sales. Meta CEO Mark Zuckerberg recently said there could be more AI agents in the world than humans.

Venture capitalist Vinod Khosla predicts that most consumer interactions online will involve AI agents handling tasks and filtering out marketers and bots. However, this raises a fundamental question: How do you differentiate between an AI agent and a human? Or worse – what if bad actors start using these agents?

In this context, Ben Colman, the co-founder and CEO of Reality Defender, made a bold statement: “Even someone with a PhD in computer science and computer vision can’t distinguish between what’s real and what’s fake.”  

Malicious actors have exploited fake identities to commit fraud, spread disinformation, and pull off countless deceptive schemes. With AI agents, these bad actors will be able to scale their operations with unprecedented efficiency and finesse.

What’s the Solution?

A recent research paper ‘Personhood Credentials’ (PHCs) explores an approach to tackle the growing challenge of distinguishing real people from AI online. These digital credentials allow users to prove they are human without revealing any personal information. 

The study is authored by researchers, including Srikanth Nadhamuni, former head of technology, Aadhaar; Steven Adler from OpenAI; Zoë Hitzig from the Harvard Society of Fellows; and Shrey Jain from Microsoft – all of whom have significant expertise in AI and technology.

The paper outlines two types of PHC systems – local and global – highlighting that they don’t need to rely on biometric data.

Imagine a government or another digital service provider issuing a unique credential to each person. To prove their ‘humanity’, users would employ clever cryptographic techniques called zero-knowledge proofs, allowing them to confirm their identity without revealing the specifics. 

These credentials would be stored securely on personal devices, offering a layer of online anonymity. 
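
To make the idea concrete, here is a minimal sketch of the kind of zero-knowledge proof the paper leans on – a Schnorr-style proof of knowledge, made non-interactive with the Fiat-Shamir heuristic. The toy group parameters and the framing as a credential check are our own illustration, not code from the paper; a real PHC system would use standardised groups and audited cryptographic libraries.

```python
import hashlib
import secrets

# Toy group: p is a safe prime and g generates the subgroup of prime order q.
# These numbers are for readability only and offer no real security.
p, q, g = 23, 11, 4

def fiat_shamir_challenge(t: int, y: int) -> int:
    # The hash of the commitment replaces an interactive verifier's challenge.
    return int.from_bytes(hashlib.sha256(f"{t}|{y}".encode()).digest(), "big") % q

def prove(x: int):
    """Holder's side: prove knowledge of the secret x behind y = g^x mod p."""
    y = pow(g, x, p)                 # public value tied to the credential
    r = secrets.randbelow(q)         # one-time blinding nonce
    t = pow(g, r, p)                 # commitment
    c = fiat_shamir_challenge(t, y)
    s = (r + c * x) % q              # response; reveals nothing about x on its own
    return y, t, s

def verify(y: int, t: int, s: int) -> bool:
    """Verifier's side: accept without ever learning x."""
    c = fiat_shamir_challenge(t, y)
    return pow(g, s, p) == (t * pow(y, c, p)) % p

secret_credential = 7                       # known only to the holder
print(verify(*prove(secret_credential)))    # True, yet the secret never left the device
```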

The researchers suggest that PHCs could either replace or work alongside existing verification methods like CAPTCHAs and fingerprint scans, which are increasingly struggling to keep up with AI. 

Another possible solution could be a simple declaration – letting people know that they are interacting with an AI. Praveer Kochhar, the co-founder of KOGO Tech Labs, emphasised the importance of transparency. “Immediately as a call starts, there has to be a declaration that you’re talking to an AI agent. Once you declare that, it’s very straightforward,” Kochhar told AIM.

Earlier in April, a user on X, Alex Cohen, shared that Bland AI, a San Francisco-based firm, uses a tool for sales and customer support that can be programmed to make callers believe they are speaking with a real person.

As the exchange showed, had the bot not acknowledged being an “AI agent”, it would have been nearly impossible to differentiate its voice from that of a real woman.

Well, now chatbots are being deployed to fight against phone scammers. 

Dali Kaafar, a researcher at Macquarie University, Australia, and his team have designed AI chatbots, Malcolm and Ibrahim, to address the concern.

So, when scammers call, they think they’ve reached a potential victim. But they talk to “Malcolm”, an elderly man with an English accent, or “Ibrahim”, a polite man with an Egyptian accent. 

However, the same innovation can again be used to scam innocent people. So, in scenarios like these, mere declarations or a centralised credential system may not help you identify an AI agent. 

Are they Good Enough?

A McAfee Corp survey revealed that 69% of Indians think they don’t know or cannot tell the difference between an AI voice and a real voice. Also, there were reports earlier regarding the rise of fraudulent robocalls amid US elections.

So, how apt are these solutions for the rising AI crimes? 

Johanne Ulloa, the director of solutions consulting at LexisNexis Risk Solutions, warns, “Chatbots can still be misused to enhance the effectiveness of phishing emails, as there’s nothing to prevent them from generating texts that prompt customers to log in to an online account under the guise of security.” 

With the advent of text-to-speech technology, AI’s potential to compromise security grows even more concerning. Earlier this year, OpenAI showed off the voice capabilities of its GPT-4o model, which sounded eerily like Scarlett Johansson.

“These AI systems can take any text and replicate a sampled voice, leading to situations where people have received messages from what appeared to be relatives, only to discover their voices had been spoofed,” Ulloa noted. 

Many startups today are building AI agents that can be integrated via telephony, WhatsApp, and mobile applications. As AI agents become more prevalent, there is a need for more discussions in the industry and research for better tools to mitigate the risks of technology misuse.

What Black Myth: Wukong’s Success Tells us About Investing in AI Startups https://analyticsindiamag.com/ai-insights-analysis/what-black-myth-wukongs-success-tells-us-about-investing-in-ai-startups/ Tue, 27 Aug 2024 06:30:00 +0000 https://analyticsindiamag.com/?p=10133809

VCs should be patient while investing in AI startups.

Sceptics can say that, much like the AI industry, the gaming industry is also kept alive by hyped products. And currently, the ‘hype’ is around Black Myth: Wukong.

The game, developed by Game Science, a studio backed by the Chinese tech giant Tencent and Hero Interactive, is nothing short of a visual masterpiece powered by NVIDIA GPUs. At the same time, its success story can be compared to that of OpenAI’s ChatGPT, and it teaches a lot about how investments should work in the startup landscape.

For the record, Black Myth: Wukong sold over 10 million copies in just three days, becoming the fastest-selling game and beating Elden Ring, Hogwarts Legacy, and even Red Dead Redemption 2. This is highly reminiscent of OpenAI’s ChatGPT acquiring 1 million users in just five days, compared to Instagram, which took 2.5 months to reach 1 million downloads.

But apart from the users, the game also tells us about what investing in AI startups means. 

The Word is ‘Patient Capital’

Black Myth: Wukong took the company five years to make. Game Science was established in 2014 in Shenzhen by seven former Tencent Games employees. Before shifting their focus to Black Myth: Wukong in 2018, the startup released mobile games and the shift to making the AAA game only happened because of the rise of Steam users in China. 

At that time, the company had 13 employees. In August 2020, when Game Science unveiled the first trailer for the game to attract talent, it received over 10,000 resumes, including applications from AAA gaming companies and international candidates willing to apply for Chinese work visas.

The development team eventually grew to 140 employees.

In March 2021, Tencent acquired a 5% minority stake in Game Science, emphasising that its role would be limited to technical support, without influencing the company’s operations or decisions. Though the financial backing was there, the company faced several controversies around the game’s content, as well as technical problems due to the shift from Unreal Engine 4 to Unreal Engine 5.

Before Tencent and Hero Interactive’s investment, Game Science’s financial performance was largely dependent on the success of their mobile games. While these games were commercially successful, the studio’s primary goal was to develop high-quality console games, which required significant financial backing. 

Similarly, AI Takes Time

This is what patient capital stands for. Believing in what the startup is building and giving them years to develop their products. What Game Science did right with their journey was developing smaller revenue generating games along the way to make up for the cost of building Wukong, which is something that AI startups should focus on. 

Or maybe they need investors like Microsoft and YC, who believe in OpenAI.

Arjun Rao, partner at Speciale Invest, described a similar strategy to AIM for investing in R&D startups. He said it is essential to be patient when investing, as these startups are still at a budding stage, and placing the bet on the right founders is paramount. “Founders do not need to worry about the current downturn and keep a long-term mindset,” said Rao.

AI startups, just like games, require extensive patient capital from investors, as they need extended periods of R&D before their innovations can be brought to market. That is what happened with Microsoft backing OpenAI, eventually resulting in the release of ChatGPT.

OpenAI started laying the foundation for ChatGPT at the beginning of 2020, when it released GPT-3. Though the model was a great technological marvel, the company wasn’t generating profits, and still isn’t, nearly two years after ChatGPT’s release in 2022.

India is expected to have around 100 AI unicorns in the next decade. This wouldn’t be possible if investors do not trust the founders and pour money into R&D. Prayank Swaroop, partner at Accel, said that VCs are increasingly expecting AI startups to demonstrate rapid revenue growth.

“Even to the pre-seed companies, we say, ‘Hey, you need to show whatever money you have before your next fundraiser. You need to start showing proof that customers are using you.’ Because so many other AI companies exist,” said Swaroop. 

This is what investing in AI startups should look like. A game that took more than five years to make wouldn’t have been possible if its investors had only been looking for immediate profitability instead of betting on the founders. Maybe Indian investors need to relook at their investment strategy.

The Persistent Flaw in Image Models: Time https://analyticsindiamag.com/ai-insights-analysis/the-persistent-flaw-in-image-models-time/ Mon, 26 Aug 2024 12:34:28 +0000 https://analyticsindiamag.com/?p=10133766

At the root of this issue lie the datasets used to train these AI image models.

It’s a bad time for AI image generators – literally. Most notable AI models, including the likes of DALL.E 3, Midjourney, Stable Diffusion, and Ideogram 2.0, are all struggling to generate analog clocks, often defaulting to 10:10. 

When AIM tested DALL.E 3 with a simple prompt asking it to generate the image of an analog clock showing the time as 6:30, both of the images it produced were timed at 10:10.

We even tried the recently released FLUX.1 AI Pro model with the same prompt and the results were similar to what DALL·E 3 produced.

What Explains This Fixation? 

At the root of this issue lie the datasets used to train these AI image models. 

Most of the clock and watch images used to train these models come from product photos and advertisements. As a marketing strategy, clocks in such images are almost always set to 10:10, as this V-shaped arrangement keeps the hands from obscuring logos usually placed at the 12:00 position and creates an aesthetically pleasing “smiling” face.

Now, since AI models learn patterns from training data, this overrepresentation of the 10:10 configuration gets baked in. The AI doesn’t understand the actual meaning or mechanics of clock hands – it simply learns from the statistical pattern that clocks should look like this. 

Unfortunately, this issue goes beyond generating images of analog clocks. The deeper problem is that AI doesn’t understand the concept of time. 

Text-to-Image Models Don’t Get Time 

A Reddit user pointed out that AI has no concept of time. “It is basically a mathematical function where for a certain input a corresponding output is calculated. There is no timing mechanism whatsoever in that path,” he said, adding that LLMs can only derive such conclusions from their training data, because that is how LLMs work.

A Medium user reported how LLMs like ChatGPT are essentially “blacked out” between prompts, unable to maintain a continuous memory of events or experiences. This limitation prevents AI from forming a coherent understanding of the passage of time, as it lacks the ability to perceive and process time-related cues in the same way as humans do.

Sure, this problem can be partly solved with a custom prompt that supplies the time from an internal clock, but differentiating between real time and hypothetical time then becomes another challenge, which is one reason ChatGPT has not integrated such a feature.
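
One common workaround is for the application, not the model, to supply the wall-clock time at inference. The sketch below illustrates the idea, assuming the OpenAI Python SDK; the model name and the system-prompt wording are placeholders rather than anything ChatGPT itself does.

```python
from datetime import datetime, timezone
from openai import OpenAI  # assumes the openai Python package is installed

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def ask_with_clock(question: str) -> str:
    # The model has no internal clock, so we inject the current time per request.
    now = datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M UTC")
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder model name
        messages=[
            {"role": "system", "content": f"The current date and time is {now}. "
                                          "Treat this as the real time; any other times "
                                          "mentioned by the user may be hypothetical."},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

print(ask_with_clock("How many hours until midnight UTC?"))
```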

AI’s inability to understand the concept of time also stems from its absence of real-world experiences. The development of a comprehensive understanding of time requires a gradual process of learning through experience and problem-solving. 

The Reasoning Part

AI systems, including the advanced ones, are not considered to be conscious and that explains why AI can’t reason. They process information but lack self-awareness, feelings, and intentions that are central to human consciousness. This lack of consciousness limits their ability to reason flexibly and understand context deeply.

Explaining AI’s inability to reason, Subbarao Kambhampati, professor at Arizona State University, said, “The tricky part about reasoning is, if you ask me a question that requires reasoning and I gave an answer to you, on the face of it, you can never tell whether I memorised it.” 

A Medium post by James Gondola suggests that AI systems lack their own moral reasoning, which may lead to biases or overlook the subtle ethical factors that human judgement naturally considers.

There are techniques like Chain-of-Thought (CoT) prompting, where you prompt the model to break down its reasoning process into a series of intermediate steps. This way, you can see more clearly how the model reached a specific output.
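
As a small illustration, here is one way a developer might wrap a question in a chain-of-thought style prompt; the wording of the instruction is just an example, not a prescribed template.

```python
def chain_of_thought(question: str) -> str:
    # Ask the model to expose intermediate steps instead of jumping to an answer.
    return (
        f"{question}\n"
        "Think step by step: state the relevant facts, work through the "
        "intermediate calculations one at a time, and only then give the final answer."
    )

prompt = chain_of_thought(
    "An analog clock shows 6:30. What angle do the hour and minute hands make?"
)
print(prompt)  # send this to any LLM; the correct final answer is 15 degrees
```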

MIT researchers have proposed a neuro-symbolic AI technique that enables LLMs to better solve natural language, maths, and data analysis problems. It has two key components: neural networks that process and generate natural language and symbolic systems that perform logical reasoning and manipulation of symbols. 

By combining these two, neuro-symbolic AI can leverage the strengths of deep learning for handling unstructured data while also incorporating the reasoning capabilities of symbolic systems.
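
A stripped-down sketch of that division of labour is shown below: a stubbed neural step turns a question into structured arguments, and a symbolic step does the exact time arithmetic that language models tend to fumble. The `llm_parse` stub stands in for a real model call and is hard-coded purely for illustration.

```python
from datetime import datetime, timedelta

def llm_parse(question: str) -> dict:
    # Neural step (stubbed): in a real neuro-symbolic system an LLM would
    # extract these arguments from the natural-language question.
    return {"start": "06:30", "minutes": 95}

def symbolic_solve(args: dict) -> str:
    # Symbolic step: exact, verifiable arithmetic outside the neural network.
    start = datetime.strptime(args["start"], "%H:%M")
    return (start + timedelta(minutes=args["minutes"])).strftime("%H:%M")

question = "A train leaves at 6:30 and the journey takes 95 minutes. When does it arrive?"
print(symbolic_solve(llm_parse(question)))  # 08:05
```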

India will Have 100 AI Unicorns in the Next Decade https://analyticsindiamag.com/ai-insights-analysis/india-will-have-100-ai-unicorns-in-the-next-decade/ Thu, 22 Aug 2024 10:00:00 +0000 https://analyticsindiamag.com/?p=10133498

With $20 billion dry powder, India’s AI startup ecosystem is definitely foreseeing unicorns.

India’s AI ecosystem is on fire. This year alone, all our feeds were filled with announcements from AI startups raising funds or launching innovative products and solutions. Surprisingly, the number of AI unicorns in India is still just one – Krutrim.

Can that change? Rahul Agarwalla, managing partner at SenseAI Ventures, definitely believes so.

In a recent podcast, Agarwalla said that he is witnessing major growth in India’s AI ecosystem. “Over the past six months, we have seen over 500 AI startups. That’s a phenomenal amount of innovation that is going on in the country,” he said, adding that this gives him the belief that India will give birth to around 100 AI unicorns in the next decade.

The message sounds motivating, but should be taken with a grain of scepticism. Agarwalla was talking about all AI startups, including the ones using existing infrastructure or barely building applications on top of others. But India, which is often dubbed as the AI use case capital of the world, is largely struggling with innovation in the AI realm, at least at the moment. 

Similarly, Arjun Rao, partner at Speciale Invest, the firm which recently made three investments in the space, told AIM, “We are bullish about many successful AI companies being built from India, but making number predictions is fraught with risks. It is fairly clear that we’re in the very nascent stages of this AI revolution. Each year we’ll make progress and will need to iterate and solve problems along the way.”

Apart from the names like Sarvam AI, Krutrim, TWO.AI, Project Indus, SML, and a few others, Indian AI startups are largely powered by jugaad, and not the VC money. This is something that needs to change fairly quickly. 

Speaking with AIM, Prayank Swaroop, partner at Accel India, said that the 27 AI startups his firm has invested in over the past couple of years will be worth at least five to ten billion dollars in the future, and this includes wrapper-based startups. “A majority of people can start with a wrapper and then, over a period of time, build the complexity of having their own model. You don’t need to do it on day one,” said Swaroop.

AI Soonicorns on the Rise 

According to AIM Research data, Indian AI startups raised a total of $560 million across 25 rounds in 2024, a decline of 49.4% compared to the previous year. Globally, AI startups raised around $27 billion between April and June this year alone.

Despite the slowdown in India, AI remains the hottest theme for investors as the demand from enterprises is increasing. “The numbers around AI investing need to be taken with a pinch of salt… AI today is like the cloud – almost every new startup is utilising it to varying degrees,” said Alok Goyal, partner at Stellaris Venture Partners. About 50% of the firm investments over the last 12 months were around AI-first startups. 

As several Indian AI startups are slowly moving towards building foundational AI solutions, it is necessary for them to raise funds in billions to keep up with the infrastructure cost and acquiring the necessary talent. 

Apart from that, for VCs, the opportunity also lies in the many AI application startups mentioned above. But betting on an AI startup to become a unicorn within a decade is still risky, as the generative AI landscape keeps changing with the release of new products from big-tech companies.

“Deals are also being announced later as startups are building in stealth. We are aware of at least 15 AI investments which are yet to be announced,” said Agarwalla from SenseAI.

Dry Powder is All You Need

The dream of 100 AI unicorns over the next decade is somewhat achievable given the amount of money the investors are sitting on. Rajan Anandan, the managing partner at Peak XV Partners, recently spoke about this. He sees potential for growth and innovation, and foresees India making a significant impact on the global stage. 

At the Global IndiaAI Summit, Anandan said, “Our firm has over INR 16,000 crore of dry powder. We just want more people starting up in AI.” 

He added that Peak XV has invested in 25 AI startups so far. Emphasising that capital availability is not a problem, Anandan mentioned that the VC ecosystem has $20 billion ready to be invested in Indian startups. He also noted that AI is currently the most significant theme, with investors showing strong enthusiasm for startups in this field. 

Antler, a VC firm, recently announced a $10 million investment in early-stage Indian AI startups. Rajiv Srivatsa, partner at Antler, shared the news on X, emphasising: “NOW is the best time to build a startup in India!”

With incubators and startup programmes from the likes of Google, NVIDIA, and JioGenNext, along with the government procuring GPUs from NVIDIA specifically for Indian AI startups, it seems like the prediction might come true.

At the same time, investors have also become wary about who they are investing in. Earlier, it was said that many VCs don’t have a thesis for AI startups, but now, the questions during pitches are extremely detailed. Though they are risk averse, the understanding of AI is now within their grasp.

Is It Time to Call USA the United States of India? https://analyticsindiamag.com/ai-insights-analysis/is-it-time-to-call-usa-the-united-states-of-india/ Wed, 21 Aug 2024 14:16:04 +0000 https://analyticsindiamag.com/?p=10133418

India is closer to replacing China as the largest international student body in US universities.

With a population of over 1.4 billion, India clinched the title of the world’s most populous country, overtaking China in 2023. Strikingly, about 65% of the Indian population is aged between 15 and 64, giving the nation a significant advantage in leveraging the potential of a large working-age population. 

However, India’s global footprint extends beyond these numbers. The country is fast approaching another milestone, becoming the largest international student body in US universities.  

In the 2022-23 academic year alone, US universities saw an enrollment of nearly 2,69,000 Indian students, marking a remarkable 35% surge. 

This educational migration is part of a broader trend. As of 2024, over 5.4 million Indians made the USA their home, with more than 3.3 million categorised as Persons of Indian Origin. 

Recently, a user on X pointed out that of the Fortune 500 companies, 16 are led by Indian-origin CEOs. 

https://twitter.com/rajeshsawhney/status/1825417868197945492?s=12

But is that something to brag about?

Story of Every Indian in the US 

Despite the rising number of Indian students pursuing education in the USA, their path is far from smooth. 

They face significant challenges in securing even internships, attributed to a slowdown in job growth and heightened competition during election years. Factors such as rising inflation, increased cost of living, and sponsorship difficulties further compound their challenges.

But for whoever cracks it there, it’s definitely an achievement to celebrate. 

When Parag Agrawal was appointed the CEO of Twitter in 2021, it sparked discussions on the factors contributing to the success of Indian-origin individuals in senior positions within tech companies. 

Some reports suggested that cultural shifts and strategic innovations spearheaded by Indian-origin CEOs played a pivotal role in the growth trajectory of companies like Microsoft and Google. 

Meanwhile, Microsoft CEO Satya Nadella offered a different perspective in his book ‘Hit Refresh’. “Our industry does not respect tradition. What it respects is innovation,” he emphasised.  

Indians Stay, but Don’t Rule

Even as Indians migrate to the US, the old habit of being bossed around by other countrymen seems to persist – a shadow of the past. 

Indian Americans constitute the second-largest immigrant population in the US. However, it’s notable that Indians are not widely represented in leadership roles across many of the world’s top companies. 

According to the CompTIA report, there are over 5,57,000 software and IT service companies in the US, with approximately 13,400 tech startups launched in 2019 alone. However, only a small fraction of individuals, who were previously Indian citizens but are now US citizens, hold higher positions there. 

Despite India’s prominence in the tech industry and the presence of thousands of IIT alumni in the US since the 1960s, it remains rare for Indian passport holders to occupy top positions in globally significant institutions.

On the Other End

Indian IT giants have established a robust presence in the US. According to the HDFC Securities report, Tata Consultancy Services (TCS) employs 50,000 people, Infosys 35,000, HCL Technologies 24,000, Wipro 20,000, and LTI Mindtree 6,500. 

Together, these companies account for approximately 2% of the US tech industry workforce. 

Harbir K Bhatia, the CEO of the Silicon Valley Central Chamber of Commerce, told the Press Trust of India, “India is one of the largest leaders of innovation in Silicon Valley. Data shows that 40% of Silicon Valley CEOs or founders are from South Asia or India.” 

Apple, a leader in the global tech industry, also reflects this trend, with nearly 35% of its workforce being of Indian origin. 

Adding to this narrative, Microsoft CEO Satya Nadella, in an interview with CNN-TV18, mentioned that Microsoft employs the second-largest number of engineers from India. 

When asked about the increasing number of CEO positions held by people of Indian origin, Nadella said, “In a country of 1 billion plus people, we should be seeing one-fifth of the CEO posts being held by Indians.”  

Time to Come Back to India

Whatever the US offers, perhaps Indians there should consider coming back and putting their talent to work here.

Recently, Ola CEO Bhavish Aggarwal said that the world’s big tech companies, including Google, Microsoft, Adobe and IBM, are run by Indians, yet there is no Indian big-tech company in the world.

“We feel proud of our Indians who are running global big tech companies, but there’s no big tech company that we can call Indian,” he said.

Reflecting on the same lines, Shekar Sivasubramanian, the CEO of Wadhwani AI, had earlier mentioned that he came back to India as this is the best place to experiment and innovate.

But given India’s current population, the idea of welcoming back our fellow countrymen raises concerns about further population growth. It’s hard to ignore the words of our favourite leader, Infosys co-founder Narayana Murthy, who pointed out that the rising population is a major concern.

“Since the Emergency period, we Indians have not paid enough attention to population control. This poses a risk of making our country unsustainable. In comparison, countries such as the US, Brazil and China have far higher per capita land availability,” he said.

Is Sarvam 2B Good Enough for Indian AI Developers? https://analyticsindiamag.com/ai-breakthroughs/is-sarvam-2b-good-enough-for-indian-ai-developers/ Wed, 21 Aug 2024 11:42:56 +0000 https://analyticsindiamag.com/?p=10133383

One good thing is that this will mark the end of ‘Indic Llama Garbaggio’.

Sarvam AI, or what we call the OpenAI of India, is probably one of the startups most celebrated by Indian developers. Its focus on Indian languages and open-source philosophy has been a cornerstone of AI development in India.

At its recent launch event, the company announced Sarvam 2B, its open-source foundational model trained on 4 trillion tokens of data, 50% of which is in 10 Indic languages.

According to co-founder Vivek Raghavan, Sarvam 2B enters a market of small language models such as Microsoft’s Phi-3, Meta’s Llama 3, and Google’s Gemma models, which were all only just decent enough for Indic AI development.

But is Sarvam 2B really good enough for Indic AI developers?

First, to clear the air, the model uses a slightly modified Llama architecture, but is trained from scratch. This means there is no continued pre-training or distillation, making it the first foundational model built entirely for Indic languages. 

We can also call it an ‘Indic Llama trained from scratch’. This is several steps ahead of the company’s earlier approach of building OpenHathi, which was a model fine-tuned on top of Llama 2.

When compared to other models, Sarvam 2B outperforms GPT-4o and gives tough competition to Llama 3.1-8B and Gemma-2-9B in speed and accuracy for Indic languages.

The End of Indic Llama?

Models from Google, Meta, or OpenAI, have been increasingly trying to do better at Indic language translation and inference, but have been failing big time. Indian developers have been fine-tuning these models on billions of tokens of Indian language data, but the quality hasn’t necessarily improved. 

Plus, these models have rarely been adopted for any production. “The best thing about Sarvam’s 2B model is that it puts an end to the Indic Llama garbage,” said Raj Dabre, a prominent researcher at NICT in Kyoto and a visiting professor at IIT Bombay. 

Pratik Desai, the founder of KissanAI, agrees with Dabre and said that none of those models were useful for production. He still has those same doubts about Sarvam 2B, even though it is a great stepping stone towards better Indic models.

“I doubt it will be used in production. Actually none of the 2B models are good enough to serve paid customers,” said Desai, adding that even models such as Gemma 2B are not useful – not even for English use cases.

Desai further explained that LLMs with 2 billion parameters struggle with consistency and instruction-following, even for summarisation, and are only good enough as helper models. But since Sarvam is also planning to release more versions in the future, these issues might get resolved eventually.

Regardless, the small size of these models makes them useful for edge use cases and on-device LLMs. This, along with Shuka, an audio model released by Sarvam, will be helpful for the developer community to experiment with. But taking these models into production is still a hard decision, as the output can become unreliable for several use cases, as is the case with other English models of the same size.
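
For developers who want to experiment, a 2B-parameter model is small enough to load on a single consumer GPU or even a laptop. The sketch below uses the Hugging Face transformers library; the repository ID is illustrative and should be replaced with whatever Sarvam actually publishes.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "sarvamai/sarvam-2b"  # illustrative repo ID; check the official release

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # roughly 4-5 GB of memory at 2B parameters
    device_map="auto",
)

prompt = "भारत की राजधानी क्या है?"   # "What is the capital of India?" in Hindi
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```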

Along with this, the company also released Samvaad-Hi-v1 dataset, a collection of meticulously curated 100,000 high-quality English, Hindi, and Hinglish conversations, which is an invaluable resource for researchers and developers. 

A Good Start

One of the key aspects of Sarvam 2B’s brilliant performance is it is completely trained on synthetic data, which is similar to Microsoft’s Phi series of models. Even though the debate around using synthetic data is still on, Raghavan said that it is essential to use synthetic data for Indian languages as the data is not available on the open web. 

Furthermore, using synthetic data generated by the Sarvam 2B model could either help improve the model’s capabilities in the future or make it run in circles.

Raghavan has also said that the cost of running Sarvam 2B would be a tenth of other models in the industry. With APIs and Sarvam Agents in the pipeline, it is clear that Sarvam’s sovereign AI approach is helping them build great models for Indian developers. 

Why is Elon Musk’s Grok2 using FLUX.1 AI? https://analyticsindiamag.com/ai-insights-analysis/why-is-elon-musks-grok2-using-flux-1-ai/ Wed, 21 Aug 2024 05:24:14 +0000 https://analyticsindiamag.com/?p=10133334

FLUX.1 AI is one of the very few text-to-image models that get human hands right, and it can be locally hosted too.

FLUX.1 AI, developed by Black Forest Labs, a German artificial intelligence startup, is quickly proving itself superior to the competition in both image quality and creative flexibility. The startup has also collaborated with GrokAI and is experimenting with their FLUX.1 model to expand Grok’s capabilities on X.

Unlike other popular AI image generators like DALL-E and Midjourney, FLUX.1 appears to have minimal content filters or restrictions. This aligns with Elon Musk’s stated goal of pushing back against what he sees as over-censorship of AI platforms.

This way, Grok 2 will now allow users to create uncensored deepfakes, like one where the 2024 US presidential candidates Kamala Harris and Donald Trump pose as a couple.

Apart from having fewer restrictions and filters, FLUX.1 AI competes with popular models like Midjourney, DALL-E and Stable Diffusion, and beats all of them in terms of ELO scores.

(FLUX.1 achieves higher ELO scores than popular models. Source: Medium)

A key differentiator of FLUX.1 is its ability to accurately render human hands and legs, an area where previous AI image models struggled due to inadequacies in training datasets.

FLUX.1 also supports a diverse range of aspect ratios and resolutions up to 2.0 megapixels. 

The Fusion of Multiple Architectures

What sets FLUX.1 AI apart from other models is its unique mixture of multiple architectures, allowing it to achieve superior results. At the core, FLUX.1 AI combines the strengths of both transformers and diffusion models. This powerful blend allows FLUX.1 AI to generate images with unprecedented speed and quality.

Transformers excel at understanding and processing sequential data, such as text. They help FLUX.1 AI interpret text prompts and translate them accurately into visual representations. Diffusion models, on the other hand, are skilled at generating high-quality images by iteratively refining noise into coherent structures. FLUX.1 AI leverages diffusion techniques to create images with intricate details and realistic textures.

Another great feature of FLUX.1 AI is prompt adherence and quality. Whether you use simple or complex prompts, the model delivers high-quality images that closely match the input description. A Medium user named LM Po used the prompt “a cat looking into a camera, point of view fisheye lens” and it gave results that can be compared to Midjourney V6.

(FLUX.1’s output for the prompt “a cat looking into a camera, point of view fisheye lens”)

FLUX.1 AI Challenges Other Open Source Models

(Image generated by FLUX.1 AI)

When you compare the best open source text-to-image models, you are left with limited choices including FLUX.1 AI and Stable Diffusion. However, compared to Stable Diffusion, FLUX.1 AI is quite easy to prompt.

In head-to-head comparisons, FLUX.1 AI consistently outperforms Stable Diffusion in generating photorealistic images with lifelike details and textures.

Because of its open-source nature, there are multiple forks of FLUX.1 AI available. One good example is a Marketing Assistant app based on FLUX.1 [Schnell], allowing users to create social media content, marketing advertisements, and more for free.
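
Because the weights are openly published, the model can indeed be hosted locally. A minimal sketch with the Hugging Face diffusers library is below; it assumes the FLUX.1 [Schnell] checkpoint on the Hub and a GPU with enough memory, so treat the parameters as a starting point rather than a recipe.

```python
import torch
from diffusers import DiffusionPipeline

# Assumes the openly published FLUX.1 [schnell] weights and a capable GPU.
pipe = DiffusionPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-schnell",
    torch_dtype=torch.bfloat16,
)
pipe.to("cuda")

image = pipe(
    prompt="a cat looking into a camera, point of view fisheye lens",
    num_inference_steps=4,   # the schnell variant is distilled for few steps
).images[0]
image.save("cat_fisheye.png")
```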

Pierrick Chevallier, an AI designer on X, said that his experience with FLUX.1 AI was amazing. “Plus, its text handling is way better than SD3 and Midjourney,” he added, further praising the model.

Users across multiple social media platforms are now blending FLUX.1 with other tools like Midjourney, Udio and Luma AI to create videos with transitions that would have required hours of work.

Surprisingly, Black Forest Labs, the company behind the new open-source AI tool consists of former Stability AI employees. These are the same minds behind tools like Stable Diffusion, Latent Diffusion, and Stable Diffusion XL.

In short, move over Stable Diffusion, Midjourney, Imagen and DALL-E, the new AI image generation champion in town is here to stay. 

Bhavish Aggarwal is the Elon Musk of India https://analyticsindiamag.com/ai-insights-analysis/bhavish-aggarwal-is-the-elon-musk-of-india/ Tue, 20 Aug 2024 11:25:57 +0000 https://analyticsindiamag.com/?p=10133269

“Tesla is for the West, Ola is for the Rest”.

As India celebrates 78 years of independence, the “Made in India” movement is making its mark globally, with Indian startups challenging established industries across the globe. Just as Elon Musk has revolutionised electric vehicles in America, Bhavish Aggarwal is leading a similar charge with Ola in India. 

“Tesla is for the West, Ola is for the Rest,” declared Aggarwal at the company’s annual flagship event, Sankalp 2024, where it highlighted its achievements and outlined plans for the coming years.

Aggarwal emphasised three factors driving the company’s vision: Youth, Dream, and Future.

As Ola races to join the ranks of top global players, it’s hard to ignore the resemblance between Aggarwal and the world’s wealthiest man, Elon Musk. 

Common Traits

Aggarwal launched Olatrip.com in 2010, marking the beginning of his entrepreneurial journey.

Over the past 14 years, he has gone on to establish Ola Consumer (Ola Cabs) and Ola Electric, India’s first listed two-wheeler EV company, and six months ago he set out to build the country’s top AI and computing company, Ola Krutrim.

Aggarwal’s vision is rooted in his deep respect for India. “Never underestimate the Indian consumer – they are the smartest – they want to build and be the future”, he asserted.  

Further taking steps towards sustainability, Ola has also established India’s first lithium manufacturing factory. As Aggarwal put it, “The world cannot achieve sustainable goals without India.”

Turning to Musk, his entrepreneurial journey started in 1995, with Zip2, an American technology company. Since then, Musk has co-founded six companies, including electric car maker Tesla, rocket producer SpaceX, and tunnelling startup Boring Company.

Ownership and Valuation

In 2017, Ola Electric was born as a subsidiary of Ola, but between 2018-19, it ventured out as its own entity. Fast forward to 2024, and the company’s market cap stands at $7.81 billion. Aggarwal holds a significant 30.02% stake in the company. 

In comparison, Tesla with its market cap of $698.31 billion, is leading the electric vehicle space. Musk, who owns roughly 12% of Tesla, has pledged more than half his shares as collateral for personal loans amounting to $3.5 billion. 

Meanwhile, Ola’s footprint, though very small compared to Tesla’s, is steadily growing. Aggarwal’s entrepreneurial spirit shines through his AI startup, Krutrim, which recently achieved unicorn status after securing $50 million in funding from investors including Matrix Partners India.

This makes Krutrim the first Indian AI startup to hit a billion-dollar valuation, just a month following the launch of its LLM. The company, whose name itself means “artificial” in Sanskrit, has ambitious plans, like developing data centres and aiming to build servers and supercomputers for the AI ecosystem. 

On the other hand, Musk’s xAI is making waves in the AI industry with its mission to accelerate human scientific discovery. In May, xAI achieved an $18 billion valuation after raising $6 billion, with investors like Andreessen Horowitz, Sequoia Capital, Saudi entrepreneur Prince Alwaleed Bin Talal, and Fidelity Management & Research Company. 

What’s Next?

The similarities between Aggarwal and Musk go beyond business as both are no strangers to controversy and receive a good amount of backlash from the media. 

Musk once boldly declared at the New York Times annual DealBook Summit 2023, “I have no problem being hated.”

Similarly, Aggarwal has taken controversial stances, such as criticising gender pronouns, which he dismissed as “pronouns illness” from the West. 

Musk’s personal life also sparked outrage when he disowned his transgender daughter, claiming she was “killed” by what he calls the “woke mind virus.” 

So both men have chosen to stand by their beliefs, regardless of the wave of criticism that follows.

As we further trace the parallels between Aggarwal and Musk, two elements are notably absent. Ola has yet to dip its toes into the social media game – though perhaps a strategic partnership could have rescued Koo from shutting down, allowing Aggarwal to claim that field as well.

He is anyway planning to build digital public infrastructure (DPI) for social media.

https://twitter.com/bhash/status/1788640296378490915

And while India doesn’t have a SpaceX yet, perhaps Aggarwal will establish ‘AntarikshOla’ someday.

Remote Work Needs to be Banned https://analyticsindiamag.com/ai-insights-analysis/remote-work-needs-to-be-banned/ Sun, 18 Aug 2024 08:30:00 +0000 https://analyticsindiamag.com/?p=10132847

“It’s not just a productivity thing, I think it’s morally wrong,” Elon Musk noted.

When the COVID-19 pandemic hit the world, working from home became the norm. While it was the only choice at first, many of us have now held on to this working model perhaps longer than needed.

Recently, former Google CEO Eric Schmidt reflected on this shift and said, “Google decided that work-life balance, going home early, and working from home, is more important than winning.” 

Schmidt highlighted the reality and remarked that startups like OpenAI are taking over Google because employees at these big companies come to the office just one day per week. 

https://twitter.com/alexkehr/status/1823480786349383879

Ironically, back in April 2021, current Google CEO Sundar Pichai suggested that the ideal way to achieve work-life balance is to offer employees two days of work from home.

But, it seems like the comfort zone is not for innovators.

Several scientists from Google DeepMind have left the company to establish new AI startups. One such venture is Paris-based startup H, which develops foundational AI models. Mistral AI, another AI company founded in 2023 by former Google DeepMind and Meta researchers, focuses on building open-source and commercial AI models. 

Uncharted Labs, a generative AI startup, was launched by three Google DeepMind researchers – David Ding, Charlie Nash, and Yaroslav Ganin. Additionally, Tokyo-based Sakana AI was founded by David Ha and Llion Jones, two former Google researchers.

And the list goes on. 

Is Work from Home a Moral Issue?

Earlier in 2023, Elon Musk, Tesla CEO, made headlines for opposing remote work culture. “I think that the whole notion of work from home is a bit like the fake Marie Antoinette quote, ‘Let them eat cake’,” Musk told CNBC.

“It’s not just a productivity thing, I think it’s morally wrong.”

Musk noted that if one plans to work from home but expects others – like those who manufacture cars, prepare and deliver food, or fix the house – to go to work every day, then it is not morally right.

Speaking of productivity, OpenAI CEO Sam Altman noted last year that employees working from home create confusion, while working from the office creates new products.

At a session organised by fintech company, Stripe, he stated, “I think definitely one of the tech industry’s worst mistakes in a long time was that everybody (thought they) could go full remote forever, and startups didn’t need to be together.”

He further added that the experiment on that is over, and the technology is not yet good enough for people to be fully remote forever, particularly at startups.

In Office ≠ Working

But does coming to the office really prove an employee’s productivity? 

Last year, Amazon insisted its employees return to the office three days a week. Company CEO Andy Jassy sent a warning to people who don’t want to go back to the office: “It’s not going to work out for you.”

Following this announcement, a group of employees walked out in protest at the company’s Seattle headquarters. 

If you happen to live in a traffic-choked city like Bengaluru, the reality is four hours of travel for nine hours of work. Staying productive after such long commutes is a genuine concern.

Tech Companies Introduce New Tricks

No matter what an employee thinks or wants, the remote work era has come to an end as tech companies are implementing tricks to lure employees back to the office. 

Wipro, starting in May 2023, allowed employees to pick any three days a week to work from the office, offering campus activities to make the shift more appealing.

HCLTech too embraced thrice weekly office routine. However, TCS mandated five-day office attendance and linked variable pay to in-person presence.

Sonata Software rolled out its return to office gradually, while Infosys required select employees to work ten days a month in the office. 

Well, we can’t forget Infosys founder Narayana Murthy’s suggestion that Indian youths should work 70 hours a week. But he didn’t say the work has to be done from the office! Maybe that’s a plus?

Why Isn’t There a Delete or Undo Button in LLMs? https://analyticsindiamag.com/ai-insights-analysis/why-isnt-there-a-delete-or-undo-button-in-llms/ Thu, 15 Aug 2024 06:10:09 +0000 https://analyticsindiamag.com/?p=10132697

If you think protecting private data was hard with databases, LLMs make it even harder.

“Where’s the delete and undo button in LLM?” asked Anshu Sharma, co-founder and CEO of Skyflow, who was introducing the concept of LLM vaults in a recent interaction with AIM

“LLM vaults are built on top of the proprietary detect engine that detects sensitive data from the training datasets used to build LLMs, ensuring that this data is not inadvertently included in the models themselves,” he added, saying the company has built a proprietary algorithm to detect sensitive information in unstructured data that is being stored in the vault. 

The Need for LLM Vault 

Sharma has a strong reason to believe so. “While storing data in the cloud with encryption will safeguard your data from obvious risks, in reality, we need layers. The same data can not be given to everyone. Instead, you can have an LLM vault that can identify sensitive data while inference and only share non-sensitive versions of the information with LLM,” said Sharma, explaining why the LLM vault matters. 

(LLM vault workflow. Source: StackOverflow)

The vice president of Amazon Web Services, Jeff Barr, also mentioned that “The vault protects PII with support for use cases that span analytics, marketing, support, AI/ML, and so forth. For example, you can use it to redact sensitive data before passing it to an LLM”. 

Gokul Rajaram, a tech investor, explained the importance of LLM vaults: “If you think protecting private data was hard with databases, LLMs make it even harder. No rows, no columns, no delete. What is needed is a data privacy vault to protect PII, one that polymorphically encrypts and tokenises sensitive data before passing it to an LLM.”
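
The core mechanic is easier to see in code: detect sensitive values, swap them for opaque tokens before the text ever reaches the LLM, and map the tokens back in the response. The sketch below is a simplified, regex-based stand-in for what a commercial vault does with far more robust detection and polymorphic encryption; none of it is Skyflow’s actual API, and the `call_llm` reference is a hypothetical placeholder.

```python
import re

# Simplified PII patterns; a real vault uses far more robust detection.
PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"\+?\d[\d\s-]{8,}\d"),
}

def tokenise(text: str):
    """Replace detected PII with opaque tokens; keep the mapping in the vault."""
    vault, counter = {}, 0
    for label, pattern in PII_PATTERNS.items():
        for match in pattern.findall(text):
            counter += 1
            token = f"<{label}_{counter}>"
            vault[token] = match
            text = text.replace(match, token)
    return text, vault

def detokenise(text: str, vault: dict) -> str:
    """Restore original values in the LLM's answer, outside the model boundary."""
    for token, value in vault.items():
        text = text.replace(token, value)
    return text

prompt = "Draft a reply to priya@example.com confirming her number +91 98765 43210."
safe_prompt, vault = tokenise(prompt)
print(safe_prompt)                       # the LLM only ever sees the tokenised version
# llm_answer = call_llm(safe_prompt)     # hypothetical LLM call
# print(detokenise(llm_answer, vault))   # tokens swapped back locally
```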

A few weeks ago, when Slack started training on user data, Sama Carlos Samame, the co-founder of BoxyHQ, raised a similar concern for organisations using AI tools, arguing that they should have LLM vaults to safeguard their sensitive data.

Going Beyond LLM Vault 

The likes of OpenAI, Anthropic and Cohere are also coming up with innovative methods and features to handle user and enterprise data. For instance, if you are using the OpenAI API, your data won’t be used to train their models. You can also opt out of data sharing in ChatGPT. Privacy options like these somewhat reduce the need for LLM vaults.

Anthropic, on the other hand, has incorporated strict policies on how it uses user data to train its models: it does not do so unless a user volunteers the data or a specific scenario arises in which it collects user data.

Meanwhile, Cohere has collaborated with AI security company Lakera to protect against LLM data leakage by defining new LLM security standards. Together, they have created the LLM Security Playbook and the Prompt Injection Attacks Cheatsheet to address prevalent LLM cybersecurity threats.

There are other techniques like Fully Homomorphic Encryption (FHE), which allows computations to be performed directly on encrypted data without the need to decrypt it first. This means the data remains encrypted throughout the entire computation process, and the result is also encrypted.
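
The idea is easiest to see with a toy example. Unpadded textbook RSA happens to be multiplicatively homomorphic – multiplying two ciphertexts yields a ciphertext of the product – which hints at how FHE schemes, which support both addition and multiplication, can run whole computations on data they never see. The parameters below are classroom-sized and insecure; real FHE uses lattice-based schemes and dedicated libraries.

```python
# Textbook RSA with tiny, insecure parameters, shown only to illustrate
# computing on ciphertexts. Real FHE schemes are lattice-based.
p, q = 61, 53
n = p * q                            # 3233
e = 17
d = pow(e, -1, (p - 1) * (q - 1))    # 2753, the private exponent

encrypt = lambda m: pow(m, e, n)
decrypt = lambda c: pow(c, d, n)

a, b = 7, 6
ct_a, ct_b = encrypt(a), encrypt(b)

# Multiply the ciphertexts without ever decrypting them...
ct_product = (ct_a * ct_b) % n

# ...and the decrypted result is the product of the plaintexts.
print(decrypt(ct_product))   # 42
```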

Cognizant is Offering a Very Generous Salary of INR 2.5 LPA for Freshers https://analyticsindiamag.com/ai-insights-analysis/cognizant-is-offering-a-very-generous-salary-of-inr-2-5-lpa-for-freshers/ Wed, 14 Aug 2024 08:08:06 +0000 https://analyticsindiamag.com/?p=10132666

The saddest part about this is that it’s the same package that was offered in 2002.

The IT industry seems to love its employees a little too much. It is not some kind of major breakthrough that the salaries offered by these Indian tech giants are way too low, but Cognizant has set a new record this time and is being heavily trolled for it. 

“My driver makes way more than that…,” commented one user on X when they came across the news that Cognizant is offering a salary of Rs 2.52 lakh per annum for a third year undergraduate, which roughly boils down to Rs 20,000 per month. 

The saddest part about this is that it’s the same package that was offered in 2002. Nothing has changed. “No house, no free commutation, no free food. All this to be managed in just 18 to 19K rupees after PF deduction in metro cities,” added the user.

https://twitter.com/shikhar_sri/status/1823450902377947436

The salary is peanuts. “Degrees have become useless in India,” said a user, echoing a discussion a while back about developers in India being more likely to be unemployed if they are educated.

Expanding Base and Capabilities, But Not Pay

Adding to all of this, Cognizant is expanding its operations across the country. Most recently, it opened its first centre in Indore, Madhya Pradesh, a move set to create over 1,500 jobs with the potential to grow to 20,000 in the future. Cognizant’s expansion in Indore adds to its existing presence in cities like Bengaluru, Bhubaneswar, Chennai, and others across India.

Right now, Hyderabad has become Cognizant’s biggest centre globally, replacing Chennai at the top spot with around 57,000 employees.

Cognizant also went on a spree of partnerships for generative AI capabilities aimed at its employees. Just this year, it partnered with Microsoft and ServiceNow to integrate generative AI for employees, creating a generative AI-powered digital workspace. 

The IT giant was also actively hiring in FY23, though the numbers for FY24 are yet to be revealed. Cognizant hired around 60,000 freshers that year, more than TCS, Infosys, or Wipro. But it seems this comes at the cost of not offering good enough pay to its employees. 

“2.52 LPA is very generous. What will the graduates do with so much money?” remarked a user on X, while another added that this is why the younger generation is content with making reels on Instagram.

This year, Cognizant also gave poor salary hikes, with developers on Reddit pointing out that this was despite the company having the highest-paid CEO in Indian IT, earning Rs 186 crore per year. “Why can’t they have minimum wage criteria in all the sectors,” asked a user on X. Another said that they get more money from cashbacks than this.

The truth is that there are people who are ready to take any job they get, and this salary is more than appealing to them. “It’s simple. If you think it’s too low, find another job. If you want 186 cr, start the next Cognizant,” said Debarghya ‘Deedy’ Das. 

It’s the Same Across Indian IT

Some people argue that although these companies offer lower salaries, they are also upskilling employees in generative AI to make them ready for the future market, and possibly preparing them for the business demands of the future.

If the wage really were too low, Das replied to another post, “Cognizant should get 0 employees”, which would force it to raise the salary; if 10,000 people take the offer, “then you are wrong and it’s not too low”. 

To illustrate the point, TCS recently had 80,000 job openings that it could not fill, citing a skill gap as the reason. The company is still looking for people for the roles of Ninja, Digital, and Prime. Interestingly, the Ninja category offers a package of Rs 3.3 LPA for various roles, while the other two offer a range between Rs 9 and Rs 11.5 LPA.

Most of the IT giants have stopped hiring, but people would still love to work there just to start somewhere. Moreover, there is almost no hike at TCS. To put inflationary pressures into perspective, Jio and Airtel have hiked their prices by 25%, but TCS gave only around a 1% hike to its employees. 

There is an oversupply of engineers in India, which makes it clear that all these job openings can get filled easily. That makes Cognizant’s risky bet not so risky, as the backlash will die down as soon as the seats are filled. 

But this also puts into perspective why graduates are shying away from IT companies. They get better salaries at startups and even GCCs in India. Indian IT giants are not even trying to lure new joinees anymore.

The post Cognizant is Offering a Very Generous Salary of INR 2.5 LPA for Freshers appeared first on AIM.

Salary Hikes in IT Companies are Less than Cash Back https://analyticsindiamag.com/ai-insights-analysis/salary-hikes-in-it-companies-are-less-than-cash-back/ https://analyticsindiamag.com/ai-insights-analysis/salary-hikes-in-it-companies-are-less-than-cash-back/#respond Tue, 13 Aug 2024 13:45:00 +0000 https://analyticsindiamag.com/?p=10132594

In 2023, the IT sector saw average increments between 8.5% and 9.1%.

The post Salary Hikes in IT Companies are Less than Cash Back appeared first on AIM.


This year has been particularly tough for IT employees, with companies declining to hire freshers. Those already working at these organisations are not having the best time of their lives either.

The entire IT sector witnessed a sharp decline in salary hikes in 2023, dropping to 9.1% from 10.3% in 2022. Excluding product companies, the IT/ITeS sector saw the lowest average salary hike at just 8.4%.

In 2021, the average salary hike was 8.8%, which increased to 9.7% in 2022. Last year, the sector saw average increments between 8.5% and 9.1%. 

IT companies are offering single-digit salary increments to employees who lack AI-related skills. Further, there is an abundance of entry-level talent now equipping themselves with the latest tech and tools. Additionally, rising business costs, coupled with layoffs, are making this trend continue this year too. 

IT Giants Follow Single Digit Hike Trend 

Earlier in May, a Dehradun-based engineer, Akshay Saini, took social media by storm with his critique of the corporate appraisal system in India.

Saini claimed that appraisals are a joke and further urged employees feeling underpaid to switch jobs.

He is not all wrong. 

Infosys, which offered an average salary hike of 14.6% in FY22, brought it down to 9% in FY24. For TCS, the average hike ranged between 7-9% in 2023-24, compared to 10.5% in 2021-22. 

Tech Mahindra and HCLTech offered average hikes between 5-7%. Wipro and LTIMindtree are yet to finalise their decisions on the same. 

Meanwhile, Accenture has opted not to provide salary hikes for its India-based employees this time.

Source: Reddit

Well, the story of startups is no different. 

Source: LinkedIn

Can GenAI Skills Help?

As per an Amazon Web Services (AWS) report, Indian employees with AI skills and knowledge may see salary hikes of more than 54%, with those in IT and research and development enjoying the highest pay increases. 

Major Indian IT companies, despite substantial generative AI training initiatives, are offering lower starting salaries of around INR 3-4 lakh. 

AIM noted that AI engineers with generative AI skills saw a significant 50% increase in their salaries. At companies like Accenture, a generative AI developer can earn about INR 8.5 lakh per annum, compared to the INR 5-6 lakh that regular software engineers make. 

AIM reached out to an employee who works with generative AI at TCS. He said that the company’s pay scale won’t change just for GenAI resources, and the hike is also largely based on the company’s financial performance and business unit budget allocation.

“I believe we may get some good hikes in the range of 25-30% after switching to another company,” the employee added.

Hiring Freeze

Indian IT companies have all recently highlighted their commitment to integrate generative AI into their operations. Top firms like TCS, Infosys, and Wipro have been leveraging technology-enabled training for their employees.

As the companies offer training programmes, they seem to be following Charles Darwin’s survival-of-the-fittest theory to retain employees. So, if an employee falls behind in the race, they are laid off. 

During the calendar year 2023, India’s IT sector laid off around 20,000 techies in a manner known as a “silent layoff,” according to All India IT & ITeS Employees’ Union (AIITEU) data. 

On the other hand, companies have delayed the hiring process. Indian IT companies like Wipro, TCS, and Infosys have delayed the onboarding of 10,000 freshers and refused to provide a joining date. 

As per the Nasscom data, the IT sector will create only 60,000 new jobs in FY24 compared to 2,70,000 jobs that were created by the sector in the previous fiscal year.

Ultimately, the choice is yours – either stay with the company, undergo training, and earn an average salary. Or get laid off, switch to another company, and wait for the same cycle down the road. 

The post Salary Hikes in IT Companies are Less than Cash Back appeared first on AIM.

Why Top IT Companies in India are not Hiring? https://analyticsindiamag.com/ai-insights-analysis/why-top-it-companies-in-india-are-not-hiring/ https://analyticsindiamag.com/ai-insights-analysis/why-top-it-companies-in-india-are-not-hiring/#respond Mon, 12 Aug 2024 11:36:12 +0000 https://analyticsindiamag.com/?p=10132234

Indian IT companies like Wipro, TCS, and Infosys are delaying the onboarding of 10,000 freshers, and refusing to provide a joining date.

The post Why Top IT Companies in India are not Hiring? appeared first on AIM.


K. T. Rama Rao, Bharat Rashtra Samithi (BRS) Working President, recently highlighted a decline in IT employment in Telangana. He shared data showing that the number of jobs in the sector dropped from 1,27,594 in 2022-23 to just 40,285 in 2023-24. 

Well, the picture is not much different in the entire country. As per the Nasscom data, the IT sector will create only 60,000 new jobs in FY24 compared to 2,70,000 jobs that were created by the sector in the previous fiscal year.

And with Indian IT companies like Wipro, TCS, and Infosys delaying the onboarding of 10,000 freshers, and refusing to provide a joining date, it is evident that the sector continues to witness a trend of decline.

Is There Anyone Hiring?

In FY24, Infosys hired 11,900 freshers, marking a significant 76% decrease from the 50,000 hired in FY23. Similarly, TCS recruited 40,000 freshers in FY24, experiencing a slight reduction from the 44,000 hired the previous year. 

Wipro’s hiring also dropped by up to 50%, with only 10,000–12,000 freshers onboarded in FY24, compared to 20,000 in FY23. 

On the other hand, Cognizant and Capgemini were more aggressive in their FY23 hiring, bringing on 60,000 and 61,182 freshers, respectively, though data for FY24 isn’t provided for these two companies.

This decline in hiring has been accompanied by an increase in layoffs. India’s IT sector laid off around 20,000 techies during the calendar year 2023, described as a “silent layoff”, according to data shared by the All India IT & ITeS Employees’ Union (AIITEU).

Sources indicated that Infosys laid off between 200 and 500 employees in 2024, though company CEO Salil Parekh said that there were no plans for downsizing or job cuts. Further, a January 2024 report mentioned that Wipro was laying off mid-level employees to enhance profit margins. 

Is GenAI the Reason?

Funnily enough, at the same time, many graduates do not want to join Indian IT firms. The reluctance of recent graduates to pursue careers in Indian IT can be attributed to the prolonged stagnation of entry-level salaries, which have remained at INR 3.5-4 LPA for over a decade. High-paying product companies with compensation packages ranging from Rs 10-20 LPA have become more attractive.

Meanwhile, Indian IT is upskilling its existing employees with genAI, instead of hiring new ones. This is also one of the reasons why the workforce is declining and the benches are full. 

As of February 2024, the top 10 Indian IT services companies, including TCS, collectively had nearly 450 genAI projects and proofs of concept (PoCs) in development, reflecting a current deal pipeline valued between $150 million and $250 million, as per data from market intelligence firm UnearthInsight.

Sonata Software, a mid-tier IT company, reported that around 80% of its project pipeline is focused on genAI, according to chief information officer Samir Dhir. The company has a $50 million AI deal pipeline. 

Out of its 6,532 employees, only about 2,000 are currently trained in genAI, with the company aiming for 50% of its staff to be trained in the field by 2025. 

Therefore, freshers need to be trained in genAI already if they want to compete with the ones being trained at the IT firms. But there is a significant gap between university curricula and industry expectations when it comes to being generative AI-ready. 

Despite this, 70% of academic institutions believe their graduates are job-ready. In reality, only 16% of companies share the same view. 

In June, TCS struggled to hire for 80,000 job openings, citing a skill gap as the reason. The same is true for Wipro. The IT giant recently announced that it is rolling out 30,000 offer letters to freshers from 2022, with a compensation of INR 3.5 LPA and additional CGPA requirements. 

The truth is that it is extremely difficult to find good, or even decent, software engineers with coding skills for such small compensation. At the same time, there is a supply of mediocre graduates who are not ready to take up jobs.

The post Why Top IT Companies in India are not Hiring? appeared first on AIM.

CRED Money – Personal Finance Manager with Advanced Data Science https://analyticsindiamag.com/ai-insights-analysis/cred-money-personal-finance-manager-with-advanced-data-science/ https://analyticsindiamag.com/ai-insights-analysis/cred-money-personal-finance-manager-with-advanced-data-science/#respond Fri, 09 Aug 2024 10:47:14 +0000 https://analyticsindiamag.com/?p=10132003

Nearly 7 in 10 of India’s affluent live with fragmented finances spread across multiple bank accounts, wallets, and UPI IDs.

The post CRED Money – Personal Finance Manager with Advanced Data Science appeared first on AIM.


Managing finances isn’t easy for most of us. Data shows that an average consumer makes around 200 transactions every month. No wonder, nearly 7 in 10 of India’s affluent live with fragmented finances spread across multiple bank accounts, wallets, and UPI IDs. 

Further, according to an Amex Trendex survey, 69% of Indians prioritised financial health as part of their 2024 New Year resolution. On this front, growing savings (81%) and investing or expanding investments (75%) were the primary goals.

Building on this, CRED has recently introduced a feature called CRED Money. It leverages advanced data science to analyse spending patterns, provide personalised insights, and promote proactive personal financial management.

By making members more aware of their finances, CRED Money empowers them to take control and make informed financial decisions.

Recently, Avinash Shukla, a CRED member, tweeted about how the cash flow feature of CRED Money encouraged him to increase his investments beyond his spending.

So how does it work?

CRED Money uses the Account Aggregator (AA) framework, a key component of India’s digital public infrastructure, allowing consumers to securely and efficiently share their bank account information with authorised organisations.

With user consent, CRED’s AA partner, Finvu, consolidates data from various banks and transfers it directly to CRED Money.

Google Pay’s Similar Design 

Earlier in 2020, Google Pay had launched a similar update for Android and iOS, focusing on relationships with people and businesses. It offers insights into your spending and is built with multiple layers of security to keep your information private and safe. 

Instead of showing a long list of transactions, the app highlights the friends and businesses you transact with most frequently. 

If you connect your bank account or cards to Google Pay, the app provides periodic spending summaries and shows your trends and insights over time, giving a clearer view of your finances. 

Google Pay also organises your spending automatically, allowing you to search across your transactions in new ways. For example, you can search for “food”, “last month”, or “Mexican restaurants”, and Google Pay will instantly find the relevant transactions.

Is CRED a Super App?

It is to be noted that CRED, which started as a credit card bill payment platform, eventually integrated UPI payments, competing against the likes of Paytm and PhonePe.  

Then there’s CRED Money, which also assists users by sending reminders and updates and allowing them to make payments through CRED UPI. 

Last year, the company launched CRED Garage, a one-stop shop that streamlines vehicle management, marks its entry into motor insurance distribution, and offers FASTag recharges, roadside assistance, and more. 

Speaking on the emergence of super apps, CRED CEO Kunal Shah recently said, “Most people launch a very complex product to start with. A Swiss knife before a decent knife that works.”

He said that this also applies to super apps that numerous large conglomerates launch, attempting to offer 20 features all at once from day one, hinting at how CRED has been focusing on building one feature at a time and building value-adds sensibly. 

Responding to former Justin.tv CEO Emmett Shear’s take on Gall’s Law, Shah stated that large conglomerates often launch super apps that aim to offer a wide range of services right from the beginning. 

Emphasising the importance of simplicity and focus when starting a new venture or developing a product, he said these apps try to do many things at once, which can lead to complications and inefficiencies.

The post CRED Money – Personal Finance Manager with Advanced Data Science appeared first on AIM.

If Only AI of Today Could Write Windows… https://analyticsindiamag.com/ai-insights-analysis/if-only-ai-of-today-could-write-windows/ https://analyticsindiamag.com/ai-insights-analysis/if-only-ai-of-today-could-write-windows/#respond Fri, 09 Aug 2024 05:57:16 +0000 https://analyticsindiamag.com/?p=10131958

LLMs, in their current form, are far from building something akin to an OS. But what about LLM as an OS?

The post If Only AI of Today Could Write Windows… appeared first on AIM.


Creating software and applications has never been easier. With generative AI tools, these days anyone can create simple apps and get assistance to build bigger software. This has ushered in a series of claims by people that AI is better at coding than humans, making developers worry about losing their jobs.

But the reality is quite the opposite. Greg Diamos, the co-founder of Lamini, replied to a post by Francois Chollet, who reiterated that there would be more software engineers in five years than there are today. Diamos asked, “Do you think that AI today could write Windows?”

Chollet said that software engineers would be using AI, not being replaced by it. “At least not within that timeline and definitely not via current technology.” This is something Diamos agreed with as well.

“I remember a researcher in 2017 showed me an LLM that could write hello.c and fibonacci.c. Today Llama 3.1 can write a calculator app, but Windows is probably out of reach,” wrote Diamos. 

‘The Truth is Somewhere in the Middle’

The problem with current AI systems, especially LLMs, is that they lack reasoning, and maths is largely beyond them. Though in just the last few days OpenAI, Google, and Meta have all released models with improved maths capabilities, creating something like Windows, or any operating system, all by themselves remains far-fetched. 

A few weeks back, when Meta launched Llama 3.1, it was touted as the Linux moment of AI that would have a similar impact on the AI ecosystem as Linux had on the operating system world. But the effects are yet to be seen.

LLMs, in their current form, are far from building something akin to an operating system. Experts such as Ilya Sutskever and Andrej Karpathy say that LLMs are the OS of the future, but no one has spoken about how LLMs can actually build an OS. Maybe the future of all software is building extensions on top of LLMs, but that seems unlikely in itself as well.

The truth is that AI tools need software engineers, and not the other way around. And it is also not just about the “boiler-plate” code. It is still out of reach for an LLM to build something from scratch if it has not seen the code for something similar before. 

Can LLMs build LLMs all by themselves? Probably not right now, but surely in the future with the advent of agentic AI. 

But as Diamos said, an LLM that was able to create hello.c back in 2017 can now write a calculator app; with the current pace of progress and the right direction of research, it could eventually create something beyond a calculator. For now, though, a calculator cannot create a calculator or do anything other than arithmetic.

No Idea of the Reality for Now

This is not a new discussion, though. Meta AI chief Yann LeCun disagreed with Sutskever and possibly with Karpathy’s assessment as well. LeCun believes, “Large language models have no idea of the underlying reality that language describes.”

Chollet had cleared the air during a previous discussion as well. “Language can be thought of as the *operating system* of the mind. It is not the *substrate* of the mind – you can think perfectly well without language, much like you can still run programs on a computer without an OS (albeit with much more difficulty).”

LLM OS, on the other hand, is an interesting idea. Maybe LLMs do not need to build a ‘Windows’. Over time, similar to how Windows or Mac OS access various applications to accomplish a task, an LLM OS would be able to access various tools, which would ideally be other LLMs, for problem-solving. 

But as of now, creating an OS involves not only writing code in various languages (like C, C++, and assembly) but also designing and managing complex systems such as memory management, file systems, device drivers, user interfaces, security protocols, and networking. 

This process requires specialised knowledge in computer science and software engineering, along with a deep understanding of hardware and system architecture.

While LLMs can assist in generating code snippets, explaining concepts, and providing guidance, the creation of a full-fledged ‘Windows’ would require significant human expertise, collaboration, and effort beyond the capabilities of current LLMs.

But OpenAI is already teasing us with the release of Project Strawberry, which Sam Altman claims has achieved Level-2 AI, meaning human-level reasoning capabilities. Possibly, we are almost there. 

The post If Only AI of Today Could Write Windows… appeared first on AIM.

Generative AI is Complicating the Process of Note-Taking  https://analyticsindiamag.com/ai-insights-analysis/generative-ai-is-complicating-the-process-of-note-taking/ https://analyticsindiamag.com/ai-insights-analysis/generative-ai-is-complicating-the-process-of-note-taking/#respond Thu, 08 Aug 2024 12:53:51 +0000 https://analyticsindiamag.com/?p=10131922

AI note-taking apps often record and transcribe entire conversations, capturing a lot of potentially sensitive information. 

The post Generative AI is Complicating the Process of Note-Taking  appeared first on AIM.


Recently, Zoom introduced its GenAI-powered Zoom Docs, which aims to boost efficiency by converting information from Zoom meetings into actionable documents and knowledge bases. Smita Hashim, the chief product officer at Zoom, emphasised its time-saving potential, stating, “Zoom Docs is purpose-built to empower people to ‘work happy’ and give them more time back in their day.”

However, many users still want to stick to the traditional method of note-taking. A Reddit user mentioned: “Writing is thinking. Write your notes so you know what you think about them. You’re only shortcutting yourself when you use AI.” 

Another user explained that he uses LLMs to analyse data but still makes notes manually. “My machine is too low on RAM to have a local LLM,” he added, further suggesting that running LLMs locally would be a better choice but he is limited by system resources. 

But users have different opinions about tools like Obsidian, which lets you use an LLM locally through a plugin called Your Smart Second Brain. The plugin can even use your notes as a knowledge base to give you better insights, and it can be stopped from reading your notes when you want. 

Andrej Karpathy praised Obsidian for being simple and having no vendor lock-in features.

There are other note-taking apps similar to Obsidian, like Reor, which uses AI to organise notes. All the AI work is done by running models locally, using Ollama, Transformers.js and LanceDB to power them.

Where is the Privacy?

One of the key concerns among users is how much personal data these AI tools collect and what that data is used for. AI note-taking apps often record and transcribe entire conversations, capturing a lot of potentially sensitive information.

This data may be used to train and improve the AI models, without users having full transparency or control over the process. There are questions about how long the data is retained, who has access to it, and how well it is secured against breaches or unauthorised access.

Note-taking apps like Evernote don’t even encrypt your data. A software engineer on Reddit has confirmed that Evernote employees can read your data. So, using such apps with existing privacy concerns makes matters worse. 

The idea is not to drop AI note-taking apps altogether but to use something that respects your privacy. Obsidian is certainly one good example. Joplin follows the same approach, where you can use extensions to add features. Joplin is an open-source note-taking app in which extensions like Jarvis provide AI capabilities that work both online and offline. 

You can also consider Notty, which is an open-source local-first note-taking app. The good thing about Notty is you can use AI to help you write better notes and all the data will be stored locally. Sure, you can opt for cloud sync, but that is optional. 

Alternatively, you can try running LLMs locally on your computer and then use any open-source note-taking app to get the most secure platform to take notes.
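As a rough illustration of that local-first setup, the sketch below assumes Ollama is running on the machine (`ollama serve`) with a model such as llama3 already pulled; the note file name and prompt are made up. The note never leaves the computer.

```python
import requests

def summarise_note(note_text: str, model: str = "llama3") -> str:
    """Ask a locally running Ollama model to summarise a note."""
    response = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": model,
            "prompt": f"Summarise this note in three bullet points:\n\n{note_text}",
            "stream": False,
        },
        timeout=120,
    )
    response.raise_for_status()
    return response.json()["response"]

# Any plain-text or Markdown note from your vault works here.
with open("meeting-notes.md") as f:
    print(summarise_note(f.read()))
```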

It’s Not That Bad 

Ilya Shabanov, the founder of The Effortless Academics, mentioned in a recent podcast that he has integrated ChatGPT into his note-taking, which assists users while making notes and also helps them check whether the notes are relevant to the topic.

A Reddit user mentioned that Otter.AI has helped him take notes efficiently while attending meetings, as it not only generates transcriptions but also uses AI to surface the key points of the meeting itself.

The AI note-taking market is rapidly expanding, with a projected value of $19.46 billion by 2028. This growth is driven by the increasing demand for tools that can improve productivity and knowledge management. As a result, numerous AI note-taking apps have emerged, each offering unique features and benefits.

The post Generative AI is Complicating the Process of Note-Taking  appeared first on AIM.

How Digital Twin Can Accelerate India’s Semiconductor Ambitions  https://analyticsindiamag.com/ai-insights-analysis/how-digital-twin-can-accelerate-indias-semiconductor-ambitions/ https://analyticsindiamag.com/ai-insights-analysis/how-digital-twin-can-accelerate-indias-semiconductor-ambitions/#respond Thu, 08 Aug 2024 05:37:42 +0000 https://analyticsindiamag.com/?p=10131831

“Extended to vast scales, a digital twin is a virtual world that’s connected to the physical world,” said NVIDIA chief Jensen Huang.

The post How Digital Twin Can Accelerate India’s Semiconductor Ambitions  appeared first on AIM.


In the recent Union Budget 2024, the central government increased the estimate for setting up semiconductor fabs in India from INR 12.51 crore to INR 1,500 crore. This is in addition to the INR 6,903 crore earmarked for semiconductor and display manufacturing. 

The commitment has paved the way for India to achieve its dream of becoming the semiconductor capital of the world, and a possible way for us to catch up is by accelerating manufacturing. A potential solution could also be the implementation of digital twins. 

Why Digital Twin in Manufacturing? 

“Extended to vast scales, a digital twin is a virtual world that’s connected to the physical world,” said NVIDIA chief Jensen Huang. 

Digital twins act as virtual replicas of the physical systems, thereby providing a platform that helps train systems before the actual production. In the process, digital twins assist in reducing costs, time, and effort that would be needed to optimise production workflows. 

Last year, at a tech summit in Bengaluru, Microsoft showcased what an assembly line or a manufacturing plant would look like with HoloLens. In the process, it painted a picture of how the digital world will help scale manufacturing. 

India currently has one existing semiconductor facility (Micron in Sanand) and three newly approved facilities at Dholera in Gujarat, Morigaon in Assam, and another one in Sanand, Gujarat. While plans are in place, full-fledged manufacturing will take time to come to fruition. This is something, which can probably be expedited with digital twins. 

While India looks to find its way, companies such as Intel and NVIDIA, are going big on creating digital twin setups for semiconductor and other industries. 

Big Tech and Digital Twin 

Samsung is set to launch an NVIDIA Omniverse-based Fab Digital Twin for simulating fab architecture and semiconductor manufacturing, potentially the first to achieve smart factory Level 5. The platform will pilot next year, showcasing various use cases in planning and simulation.

Furthermore, NVIDIA’s Aerial Omniverse Digital Twin for 6G will be able to create accurate simulations of 6G systems, proving to be a crucial factor for AI integration testing. 

Interestingly, Intel is not far behind, as the company signed an MoU with Siemens last year to collaborate on the digitalisation of its wafer fabs using digital twin technology. 

Recently, the US government invested $285 million to enhance digital twin technology in semiconductor manufacturing. This funding aims to advance the development and application of these virtual models in the industry. 

India, The Favoured Capital

While the logistics and workaround for setting up fab units are ongoing, the country is already a favoured destination for the chip industry. US-headquartered chip company SiMa AI has chosen India as its strategic market despite China being the largest semiconductor market in the world. 

In December 2021, the Indian government launched the India Semiconductor Mission that announced an INR 76,000-crore chip incentive scheme that offers a 50% subsidy on expenditure for plant development. 

Meanwhile, Tesla is said to have partnered with Tata Electronics to buy semiconductor chips for its global operations. 

With the burgeoning pace at which the semiconductor industry is developing in India, it won’t be long before India could possibly emerge as the semiconductor capital of the world. 

Digital Twins for the World 

While semiconductor companies have found ways to expedite manufacturing processes with digital twins, the concept is being heavily implemented across other sectors too. For instance, space tech, which involves exorbitant cost, resources and most importantly, time, is already solving these with digital twins. 

Recently, Declan Ganley, the founder and CEO of Rivada Space, told AIM how tasks that took 50 days five years ago can now be accomplished in a day with digital twins in satellite technology. 

Fujitsu, which is already in the semiconductor market, is working on creating digital twins for other verticals. They are in the process of developing ‘Ocean Digital Twin’ using AI and underwater drone data. 

The company is looking to promote marine conservation, carbon neutrality and biodiversity through this digital twin. It is also working with Carnegie Mellon University to develop AI-powered digital twin technology with traffic data from Pittsburgh. 

It is clear that robotics, especially autonomous vehicles, has one of the largest use cases for digital twins and simulated reality, both in training and in bringing AI to the physical world.

The post How Digital Twin Can Accelerate India’s Semiconductor Ambitions  appeared first on AIM.

Why Prolog Might Be the Answer to Better AI Reasoning https://analyticsindiamag.com/ai-insights-analysis/why-prolog-might-be-the-answer-to-better-ai-reasoning/ https://analyticsindiamag.com/ai-insights-analysis/why-prolog-might-be-the-answer-to-better-ai-reasoning/#respond Wed, 07 Aug 2024 14:25:52 +0000 https://analyticsindiamag.com/?p=10131808

While the JSON-based API can help developers improve the output of their models, with Prolog, models can have better reasoning capabilities from the start.

The post Why Prolog Might Be the Answer to Better AI Reasoning appeared first on AIM.


OpenAI recently released Structured Outputs, a JSON-schema-based API feature meant to help developers constrain model output and achieve 100% reliability in matching a supplied schema. Even so, there are still chances of errors.

As OpenAI itself notes, structured outputs do not prevent all kinds of model mistakes.
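For context, a minimal sketch of what the feature looks like in the Python SDK is below; the model name and schema are illustrative assumptions rather than recommendations.

```python
from openai import OpenAI

client = OpenAI()

completion = client.chat.completions.create(
    model="gpt-4o-2024-08-06",  # a snapshot advertised as supporting strict schemas
    messages=[{"role": "user", "content": "Who founded Infosys, and in which year?"}],
    response_format={
        "type": "json_schema",
        "json_schema": {
            "name": "founding_fact",
            "strict": True,
            "schema": {
                "type": "object",
                "properties": {
                    "founder": {"type": "string"},
                    "year": {"type": "integer"},
                },
                "required": ["founder", "year"],
                "additionalProperties": False,
            },
        },
    },
)

# The reply is guaranteed to match the schema -- not to be factually correct.
print(completion.choices[0].message.content)
```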

This raises the question: why pursue schema enforcement at the API level when similar guarantees could be achieved at the programming-language level with Prolog? 

Prolog, a powerful yet often overlooked programming language born in the early days of AI, excels at “old-school” symbolic AI tasks such as knowledge-based systems and rule-based natural language processing. Its declarative nature allows developers to specify rules and facts about a problem domain, and its interpreter automatically infers solutions. This makes it one of the best languages for classic AI problems such as search and constraint satisfaction.

Prolog also shines in handling uncertain or incomplete data. Programmers can specify rules that might be true or false, and Prolog will reason out the problem to find the most likely and accurate solution given available information. This is a key advantage in real-world AI scenarios where information is often incomplete.

Similarly, a study published as part of the International Computer Science Series suggested that Prolog is well-suited to AI development precisely because of its declarative nature: programmers specify rules and facts, and Prolog’s built-in inference engine derives the conclusions.

The report further mentioned that Prolog’s backtracking mechanism allows for efficient searching of solution spaces, and it provides examples of using Prolog for natural language processing tasks like parsing sentences. Prolog’s ability to handle recursive rules and symbolic expressions also makes it useful for implementing expert systems and knowledge representation.
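A tiny, illustrative taste of that facts-and-rules style is below, driven from Python through the pyswip bridge (it assumes SWI-Prolog and `pip install pyswip` are available; the family facts are made up). The recursive `ancestor` rule is exactly the kind of definition the inference engine explores via backtracking.

```python
from pyswip import Prolog

prolog = Prolog()

# Facts: who is a parent of whom.
prolog.assertz("parent(vikram, asha)")
prolog.assertz("parent(asha, rohan)")

# Rules: an ancestor is a parent, or a parent of an ancestor (recursive).
prolog.assertz("ancestor(X, Y) :- parent(X, Y)")
prolog.assertz("ancestor(X, Y) :- parent(X, Z), ancestor(Z, Y)")

# The engine backtracks through the rules to enumerate every solution.
for solution in prolog.query("ancestor(A, rohan)"):
    print(solution["A"])  # asha, then vikram
```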

Prolog in Production Environment

IBM’s Watson system, famous for winning Jeopardy, uses Prolog for pattern matching over natural language parse trees. According to the developers, Prolog was chosen for its “simplicity and expressiveness” in specifying pattern-matching rules.

“We required a language in which we could conveniently express pattern matching rules over the parse trees and other annotations (such as named entity recognition results), and a technology that could execute these rules very efficiently. We found that Prolog was the ideal choice for the language due to its simplicity and expressiveness,” said former IBM senior technical staff member Adam Lally.

Meanwhile, Kyndi, an AI company, which was acquired by Qlik earlier this year, used Prolog for its natural language processing software because of its logic-based capabilities. Kyndi founder Ryan Welsh said, “Prolog is more logic-driven and more powerful in its abstract reasoning than modern programming languages such as C++ or Java.”

Apparently, TerminusDB, an open-source graph database and document store, has been implemented in Prolog. The choice of Prolog as the implementation language is significant due to, again, its declarative nature and ability to express complex rules, making it well-suited for a deductive database system like TerminusDB.

Prolog’s logic-based approach allows TerminusDB to efficiently handle complex queries and reasoning over the stored data. The declarative style of Prolog enables developers to focus on specifying the desired outcomes rather than the step-by-step process of achieving them, which aligns well with the declarative nature of database queries.

Furthermore, GeneXus incorporates Prolog as part of its rule-based system for developing smart applications with AI capabilities. Prolog’s declarative and logic-based approach aligns well with specifying business rules and complex application logic.

TextRazor, a London-based startup, also performs text analysis using an engine coded in Prolog. “TextRazor uses Prolog as its rules engine. Our system was designed to make common patterns easy to build without technical expertise, while keeping the full power and expressiveness of the Prolog language for more complex tasks,” TextRazor said in a blog.

Considering how effective Prolog’s declarative approach is, the question is why AI models were not built around rule-based programming like Prolog in the first place. It seems we could have eliminated many of the problems within these models with just one change of foundation.

The post Why Prolog Might Be the Answer to Better AI Reasoning appeared first on AIM.

Why Does Everyone Want Tech Jobs to Fail? https://analyticsindiamag.com/ai-insights-analysis/why-does-everyone-want-tech-jobs-to-fail/ https://analyticsindiamag.com/ai-insights-analysis/why-does-everyone-want-tech-jobs-to-fail/#respond Wed, 07 Aug 2024 13:00:00 +0000 https://analyticsindiamag.com/?p=10131783

The claims like “coding is dead” and “software engineering jobs are dying” by the AI experts themselves have been fuelling the fire.

The post Why Does Everyone Want Tech Jobs to Fail? appeared first on AIM.


Recently, there has been a noticeable surge in the discussions about a potential collapse of the tech job market. The collapse of the stock market has not helped the case either. It almost seems as if people are eagerly waiting for the bubble to burst, hoping tech jobs will lose their lustre. 

In a Reddit discussion, many spoke about the possible reasons for this jealousy, hatred, or whatever it is that people have against tech jobs. An obvious one that came up was schadenfreude – the pleasure derived from another’s misfortune. 

The tech industry has long been associated with high salaries, luxurious perks, and an enviable work-life balance. During the pandemic, a user pointed out, people flaunted their remote, six-figure, salaried positions online while the rest of them had to work laborious jobs wearing masks. 

This created a sense of resentment among those in less fortunate positions, leading to a desire to see tech professionals brought down a peg.

This disconnect between tech workers and the average person has fostered a sense of “finally, they’re getting what they deserve” from those who feel tech has been overhyped.

AI is Now Killing Everyone’s Job?

Some are not happy with the fact that AI is killing a lot of creative jobs, or is at least being touted as doing so. The tech overlords have stated several times that there will be no need for creative jobs in the future. AI first came for the writers and artists, then wiped out a lot of clerical jobs. People started hoping the same would happen to software engineers. 

Cognition Labs’ Devin tried to do so, and everyone was happy that the people who created AI are the ones who are going to get replaced by it. But the truth of the matter is that none of the jobs are getting automated completely.

The rapid growth and significant influence of the tech industry have also contributed to the backlash. “I find a lot of tech workers constantly flaunt their very high tech incomes that they have in their 20s and 30s ($250k-1m). They make many times the median income, and have a totally different reality than the rest of the population,” wrote one user. 

The claims like “tech is dead” and “software engineering is dying” by the AI experts themselves have been fuelling the fire. Even the creators of coding tools such as GitHub Copilot and others have said several times that coding would become redundant as everyone would be a coder.

Some root for a downturn in tech jobs in the hope of seeing a more balanced job market emerge. The dominance of tech in the job market has led to significant wage disparities and a concentration of wealth and opportunities in IT hubs. 

A correction in the tech job market could, in theory, lead to more opportunities across different sectors. But AI is here to stay.

Every Job Would be a ‘Tech’ Job

It’s hard to ignore how AI has revolutionised almost every aspect of our lives, from communication and healthcare to entertainment and education. And now AI is creating the age of abundance, with every job getting easier, and in some sense becoming a “tech job”.

Generative AI has made even creative jobs easier for artists, although the results often seem a lot less creative. With techies giving prompts to AI art generators such as Midjourney, making videos on Runway, and writing poetry on ChatGPT, artists have started to question the need for AI, or even the tech field, as it is making art less artsy. 

But AI will create more creative jobs than there are today.

While the tech industry does pose challenges, it also presents opportunities for reskilling and upskilling the workforce, as Indian IT is doing with its current employees by training them in generative AI.

As automation and AI continue to evolve, there’s a growing anxiety about the future of work and whether there will be enough opportunities for everyone. This uncertainty can breed a desire to see the tech sector stumble, in hopes that it might slow down the pace of change and offer some stability to other industries.

The post Why Does Everyone Want Tech Jobs to Fail? appeared first on AIM.

Bengaluru’s Control One is Bringing Forklifts to Life with Physical AI Agents https://analyticsindiamag.com/ai-insights-analysis/bengalurus-control-one-is-bringing-forklifts-to-life-with-physical-ai-agents/ https://analyticsindiamag.com/ai-insights-analysis/bengalurus-control-one-is-bringing-forklifts-to-life-with-physical-ai-agents/#respond Wed, 07 Aug 2024 09:40:10 +0000 https://analyticsindiamag.com/?p=10131741

Pranavan said that to compete with Boston Dynamics, the company needs at least $500 million in funding, but “given our capital efficiency and cost factors, we would be able to do with $100 million as well”.

The post Bengaluru’s Control One is Bringing Forklifts to Life with Physical AI Agents appeared first on AIM.


Within the AI landscape of India, where startups get minimal funding, it is hard to enter the sector, especially robotics. But backed by the power of jugaad, a team of experts from Tesla, and advisors from India, Bengaluru-based Control One has taken up the task of building the next Boston Dynamics out of the country.

The key to Control One’s innovation lies in its focus on vision-based Visual SLAM (Simultaneous Localisation and Mapping) systems instead of traditional LiDAR technology. “Using vision, we try to make the machine understand its surroundings and navigate accordingly,” said Pranavan S, the founder and CEO, in an interaction with AIM

Recently, this one-year-old startup unveiled India’s first physical AI agent for the global market, installed on a forklift. With the integration, the forklift could interact with its surroundings and execute tasks based on simple voice inputs. The company calls it a 3-brain system, in which operators can manage the AI’s decisions remotely in real time.

Visual SLAM, or vSLAM, helps the robot navigate using reference points in the environment, just like humans do. Pranavan said what separates Control One from every other robotics company in India is the focus on building an operating system for robotics. 

Currently, the forklift operates only on the ground, but Pranavan said the team aims to have it operate in three dimensions, which includes picking up objects and placing them.

Founded in 2023, Control One attracted investment in April and raised $350k from industry leaders like iRobot co-founder Helen Greiner, CRED founder Kunal Shah, and executives from Tesla, Walmart, and General Electric. 

Last year, NVIDIA Research unveiled Eureka, an AI agent that automatically generates algorithms for training robots using GPT-4 prompts and reinforcement learning. It seems like that is the future when physical AI agents will be everywhere.

OS for Robots

But when it comes to Control One, “We are building an OS that can understand the environment around it, akin to human behaviour,” explained Pranavan. “For example, we humans navigate dynamically, even in crowded and ever-changing environments, using our vision and depth perception to take predictive actions. This is what we aim to achieve with our machines.”

Control One’s OS, dubbed ‘One AI’ can be seamlessly integrated into existing warehouse equipment like pallet movers and forklifts. These systems autonomously navigate and improve performance over time, setting a new standard for intelligent material movement. 

Currently operated only using remotes, Control One aims to introduce natural language prompts in the future, though questions remain around how such operation will be regulated. 

Pranavan elaborated on the advantages of vision systems over LiDAR. “LiDAR can detect obstacles, but it doesn’t understand what those obstacles are or their movement dynamics. vSLAM, on the other hand, can analyse pixel vectors to determine if an object is approaching, moving away, or crossing paths, allowing the robot to make more informed decisions.”

Despite their focus on vision systems, Control One acknowledges the complementary role of LiDAR for safety. “We use LiDARs to meet market safety standards, ensuring our robots can stop immediately if they get too close to an object,” said Pranavan. 

Though the company does most of its training in the real world, simulation accounts for around 30% of it. “The problem with simulation is that the AI would work in the settings you put it in, but when transferred into the real world, uncertainties such as uneven floor can also affect the physical agent’s performance,” said Pranavan.

The Boston Dynamics of India

While there are companies such as DiFACTO Robotics, Novus Hi-Tech, and Gridbots Robotics, Control One is focused on building the whole stack of a robot using physical AI agents.

The most interesting part is that all of this happens on-edge using NVIDIA A2000 GPUs. “We started with NVIDIA’s A5000 series but moved to the 2000 series to reduce power consumption,” said Pranavan. “Our next iteration will consume less than 80 watts and run entirely on battery, extending its operational life.”

This efficient use of power is crucial for Control One’s vision-based system, which demands significant computational resources. “Our current demo operates at 280 TOPS of AI computing power, plus it runs on an AMD processor. The next iteration will be more power-efficient while maintaining high performance.”

Pranavan, who earlier co-founded SP Robotics Works, said that frugal innovation has always been part of what Indian startups do, recalling how he figured out the workings of a computer with 128 MB of RAM back in his younger days. 

He said that to compete with Boston Dynamics, they need at least $500 million in funding, but “given our capital efficiency and cost factors, we would be able to do with $100 million as well”.

Giving AI to Blue Collar Jobs

Control One is focusing on the warehousing sector, where there is a significant labour shortage and a high demand for automation solutions, especially in the global market. “The warehousing market is facing a huge labour crisis. Our system enables one person to manage multiple robots, effectively multiplying their productivity,” said Pranavan.

One AI is designed to manage entire fleets of robots, enabling remote operation and supervision. “Our ultimate goal is to allow one person in a remote location to manage an entire warehouse in another part of the world. OneAI will be the interface between humans and machines, helping operators make informed decisions.”

With a vision of perhaps building a humanoid in the future, Pranavan said that you don’t necessarily need a humanoid to operate machinery in factories. “We are operating system makers, not humanoid makers, yet,” he said, adding that he is looking to partner with OEMs, with several partnerships already underway.

The post Bengaluru’s Control One is Bringing Forklifts to Life with Physical AI Agents appeared first on AIM.

Using GenAI in Marketing is Such a Waste of Time & Money https://analyticsindiamag.com/ai-insights-analysis/using-genai-in-marketing-is-such-a-waste-of-time-money/ https://analyticsindiamag.com/ai-insights-analysis/using-genai-in-marketing-is-such-a-waste-of-time-money/#respond Wed, 07 Aug 2024 08:00:00 +0000 https://analyticsindiamag.com/?p=10131725

There is more bad news.

The post Using GenAI in Marketing is Such a Waste of Time & Money appeared first on AIM.


There is no shying away from the fact that generative AI has become a source of hope for busy professionals. A recent survey indicates that 61% of organisations use AI for social media primarily to reduce staff workload.

Further, 64% of chief marketing officers (CMOs) remain optimistic about AI’s potential though there has been a decline in average marketing budgets from 9.1% in 2023 to 7.7% of overall company revenue this year.

Useful, maybe. But is it also profitable?

Neil Patel, the co-founder at Neil Patel Digital, emphasised that while AI can simplify and automate marketing tasks, it doesn’t necessarily make the overall process easier.

He noted that, similar to how the cloud revolutionised business operations, AI lowers the barriers to entry for startups, making it easier and cheaper for new companies to develop products and services. 

However, this reduction in barriers also leads to increased competition. More startups and businesses entering the market means that companies must invest more in marketing and sales to stand out and attract customers.

David Spitz, the founder of BenchSights, provided data showing a dramatic rise in marketing and sales expenses for SaaS companies over the past eight quarters. The figures (150%, 167%, 179%, etc.) demonstrate the growing cost of generating revenue, with companies spending $2.64 for every $1 of revenue.

Additionally, these companies take an average of 2.64 years to recoup their marketing and sales expenditures, which does not even account for profitability. This underscores the increasing significance and cost of marketing efforts.

Patel also highlighted a survey of 113 later-stage startups, which revealed that they spend an average of 27.59% of their raised funds on marketing, illustrating the substantial investment required in this area.

There is More Bad News

Data suggests that the entire process from writing to publication can be completed in as little as 16 minutes with AI, compared to an average of 69 minutes for humans. Despite this efficiency, the question of trust remains significant.

According to the Hootsuite Social Media Consumer 2024 Survey, 62% of consumers are less likely to engage with or trust content if they know it is AI-generated. This, in turn, shows up in the content’s traffic and engagement. 

Source: NP Digital 

Patel previously noted, “If everyone uses AI to create content, and if AI uses data already available to develop these projects, we won’t have anything new and authentic anymore.”

It appears that only marketing professionals are confident in AI, while consumers remain sceptical.

Source: LinkedIn

LinkedIn’s AI Game for Marketing

Recently, Yurii Rebryk, the CEO of Fluently, shared that one of his posts went viral, reaching 2,100,000 views and generating 14,305 visits to the company’s website.

“People in the comments ask me why I post on LinkedIn. Doesn’t this take focus away from my business? The answer is NO — LinkedIn is one of the best marketing channels where you can organically attract thousands of users to your B2B or B2C product,” Rebryk explained

This year, LinkedIn introduced many AI features, ranging from assistance in creating job descriptions to help with writing posts. 

Source: LinkedIn

The feature has faced criticism from users, who ask why LinkedIn stops at rewriting shared thoughts with AI instead of offering an audio mode that would save them the trouble of typing. Others believe it enhances the writing and boosts the author’s confidence.

Here’s hoping your next marketing strategy doesn’t rely solely on AI – and if it does, good luck!

The post Using GenAI in Marketing is Such a Waste of Time & Money appeared first on AIM.

LLMs Have Stalled the Progress of AGI https://analyticsindiamag.com/ai-insights-analysis/llms-have-stalled-the-progress-of-agi/ https://analyticsindiamag.com/ai-insights-analysis/llms-have-stalled-the-progress-of-agi/#respond Tue, 06 Aug 2024 12:30:00 +0000 https://analyticsindiamag.com/?p=10131679

Current LLMs show low user trust, low accuracy and reliability. This is not a new conversation as the reasoning and logical problems with current LLMs have been brought up several times

The post LLMs Have Stalled the Progress of AGI appeared first on AIM.


Everyone has questions about generative AI, the validity of LLMs, and the future they hold. While LLMs are often seen as overhyped in research circles, they are also considered by some as a major roadblock to achieving artificial general intelligence (AGI).

Reiterating this point, Mike Knoop, the co-founder of Zapier, recently expressed scepticism about the progress of AI language models toward achieving AGI. “LLMs have stalled in the progress to AGI and increasing scale will not help what is an inherently limited technology,” said Knoop in a recent interview.

Knoop’s assessment was based on the fact that current LLMs show low user trust, low accuracy and reliability. “And these problems are not going away with scale,” he added. This is not a new conversation as the reasoning and logical problems with current LLMs have been brought up several times. 

No Roads Lead to AGI

Regardless, the scepticism has not deterred big tech from the mad rush to build the best LLMs. Google, OpenAI, Microsoft, and Meta have been racing to build LLMs – bigger and smaller alike. 

All this while Yann LeCun, the chief of Meta AI, has repeatedly said that LLMs won’t lead to AGI and that researchers entering the AI field should not work on LLMs. 

Similarly, Francois Chollet, the creator of Keras, also recently shared similar thoughts on this. “OpenAI has set back the progress towards AGI by 5-10 years because frontier research is no longer being published and LLMs are an offramp on the path to AGI,” he said in an interview.

Knoop’s concerns are also well established and are heightened by his participation in the introduction of the ARC Prize, a competition designed to encourage novel approaches to AGI, particularly through the Abstraction and Reasoning Corpus (ARC).

Since its establishment, this benchmark, which evaluates the capacity for efficient skill acquisition, has not shown much improvement, supporting Knoop’s contention that present AI models do not seem to be headed towards AGI.

Forget AGI, Aim for Animal-level Intelligence

LeCun says AI should reach animal-level intelligence before heading towards AGI and Knoop has his concerns about LLMs. Likewise, Andrej Karpathy, the founder of Eureka Labs, has been quite vocal about the issues with LLMs. 

In his latest experiment, Karpathy showed that LLMs struggle with seemingly simple tasks and coined the term “Jagged Intelligence” to describe the phenomenon.

Even Yoshua Bengio, one of the godfathers of AI, said in an exclusive interview with AIM that when it comes to achieving the kind of intelligence that humans have, some important ingredients are still missing.

These problems are not unique. Noam Brown, a research engineer at OpenAI, experimented with LLMs by making them play basic games like tic-tac-toe. The outcome was dismal, since LLMs performed poorly in this simple game. 

Additionally, another developer tasked GPT-4 with solving tiny Sudoku puzzles. Here too, the model struggled, often failing to solve these seemingly simple puzzles. Even a study by Google DeepMind has proved that LLMs lack genuine understanding and, as a result, cannot self-correct or adjust their responses on command. 

Subbarao Kambhampati, professor of AI at Arizona State University, agrees with the notion. “They are just not made and meant to do that,” he said. He gave an example of how LLM-based chatbots are still very weak at maths problems.

Too Early to Write off LLMs

But it is too early to call. While we haven’t reached human-level intelligence yet, that does not mean we never will.

OpenAI has claimed that GPT-4, released in March 2023, beat human psychologists in understanding complex emotions. In fact, OpenAI CTO Mira Murati claimed in a recent interview that GPT-5 will have ‘PhD-level’ intelligence.

Meanwhile, Ilya Sutskever, the former chief scientist at OpenAI and founder of Safe Superintelligence, believes that text is a projection of the world. How far that idea carries LLMs is still questionable. In a way, LLMs are building a cognitive architecture from scratch, echoing evolutionary and real-time learning processes, albeit with considerably more electricity.

Last month, AIM had discussed the various approaches taken by big tech companies, namely OpenAI, Meta, Google DeepMind, and Tesla, in the pursuit of AGI. Since then, tremendous progress has been made.

Undeterred, research on LLMs is still going strong, with companies finding ways to constantly improve the models. Recently, OpenAI released a research paper on Prover-Verifier Games (PVG) for LLMs, which could address some of the trust and reliability concerns Knoop raises.

Similarly, work on causality with LLMs could enable AI to understand cause-and-effect relationships, much like human reasoning. Then there is neurosymbolic AI, which could enhance the efficiency of LLMs.

We can safely say that LLMs deserve more than one shot at smoothing the road to AGI.

Soon, Software Engineering Would be All About Managing AI Coding Agents https://analyticsindiamag.com/ai-insights-analysis/soon-software-engineering-would-be-all-about-managing-ai-coding-agents/ https://analyticsindiamag.com/ai-insights-analysis/soon-software-engineering-would-be-all-about-managing-ai-coding-agents/#respond Tue, 06 Aug 2024 11:30:00 +0000 https://analyticsindiamag.com/?p=10131585

There has never been a better or more productive time to be a builder than now.

The post Soon, Software Engineering Would be All About Managing AI Coding Agents appeared first on AIM.

Stakeholders are worried about the future of software engineering and everyone is busy making predictions. While some claim that most coding jobs will cease to exist, others say that there will be an abundance of software engineers in the future. What is the truth?

Remember Devin? The AI software engineer built by Cognition Labs sparked a wave of conversation about job displacement. Now, Russell Kaplan, who recently joined Cognition Labs as president, is making several predictions about the future of software engineering.

“Models will be extraordinarily good at coding very soon,” he stated, emphasising that research labs are prioritising coding and reasoning improvements above all else, as was evident with OpenAI, Meta, and Google releasing models for maths. 

This investment will soon pay off, transforming the landscape of software development. Kaplan highlights a unique advantage in coding: the potential for superhuman data scaling through “self-play”.

A Team of AI Agents

If models can write code, test it, and check it for self-consistency, a form of automatic supervision emerges that is not possible in most domains because of the limits of human expertise. This capability would allow code to be tested empirically and automatically by AI agents.

As a result, software engineering will see a radical transformation. “True coding agents, which do tasks end to end, will complement today’s AI copilots,” Kaplan explained.

This will make every engineer akin to an engineering manager, delegating basic tasks to coding agents while focusing on higher-level aspects such as understanding requirements, architecting systems, and deciding what to build. 

“Any value that humans can bring here is either in leading the direction of deciding what to work on (eg. product), or in working at the cutting edge of what AI cannot do yet (researching new models),” agreed Jeremy Bernier, founding engineer at Canvas.

This is similar to what Francois Chollet, the creator of Keras, said a few months ago. He predicted that there would be around 10 million more coding jobs in the next five years, but that those jobs would go to people with expertise in writing code in languages like Python, even as most of the coding gets done by AI.

Chollet had earlier said, “If you could fully automate software engineering (my job), I think that would be great, since I could then move on to higher-leverage things. Making software is a means to an end, not the end.” He added that software engineering is not just about copy-pasting code, but about developing mental models of problems and their solutions. 

The Age of Abundance

What’s certain is that the role of software engineers will evolve significantly. “There will be way more software engineers in the future than the present,” Kaplan noted. But “the job will just be very different: more English, less boilerplate coding”. 

According to this prediction, engineers will adapt, much as they did when transitioning from assembly language to Python. Funnily enough, it is likely that English teachers would make the best engineers.

Interestingly, this could also usher in an era where advertisements are aimed at AI agents rather than developers. Companies that market to developers will start targeting coding agents as well. “Your agent might decide what cloud you use and which database you choose,” said Kaplan. And there would be an army of those agents.

Just as Reid Hoffman predicted the decline of traditional 9-to-5 jobs with the rise of the gig economy and a focus on results, the standards for product quality will rise as developers become able to ship products much faster. This would call for better testing infrastructure, where agents can automatically train, test, and verify models and software.

“There’s never been a better or more productive time to be a builder,” he concluded, adding that software like Devin would usher in an age of abundance for software.

“Code is largely worthless, more of a liability than an asset. Problem-solving is where the value is,” Chollet added in his thread. Because software engineering is not just about writing or copying code or giving prompts to AI models, it is much more than that.

Offices of the future will probably have ‘human software engineers’ managing a team of AI engineers, just as we predicted.

Now is the Best Time to Build AI Startups in India https://analyticsindiamag.com/ai-insights-analysis/now-is-the-best-time-to-build-ai-startups-in-india/ https://analyticsindiamag.com/ai-insights-analysis/now-is-the-best-time-to-build-ai-startups-in-india/#respond Tue, 06 Aug 2024 05:29:14 +0000 https://analyticsindiamag.com/?p=10131457

Interestingly, India ranks third among the top five countries by business funding, surpassing both the United Kingdom and Germany.

The post Now is the Best Time to Build AI Startups in India appeared first on AIM.

Antler, a VC firm, recently announced a $10 million investment in early-stage Indian AI startups. Rajiv Srivatsa, partner at Antler, shared the news on X, emphasising: “NOW is the best time to build a startup in India!”

Similarly, last month Google introduced several initiatives to enhance AI development in India, including a partnership with the MeitY Startup Hub to train 10,000 startups. 

At the Google I/O Connect Bengaluru 2024 event, the tech giant introduced the 2024 Class of the Google for Startups Accelerator: AI First program in India. This cohort includes 20 exceptional AI-first startups from diverse sectors such as gaming and manufacturing, chosen from over 1,030 applications nationwide.

Earlier in March, reports suggested that India would partner with chip-making giant NVIDIA to procure up to 10,000 of its GPUs and NPUs and offer them at subsidised rates to local startups, researchers, academic institutions, and other users, in a bid to boost the country’s AI infrastructure.

The JioGenNext team, too, invests substantial time in collaborating with AI startups to offer comprehensive support and strategic insights tailored to their needs and challenges. Its latest cohort, MAP’24, featured 10 generative AI startups across sectors like healthcare, banking, legal services, and agriculture.

The Indian government is also playing a pivotal role in fostering AI growth. 

The US-India AI Initiative by IUSSTF aims to foster AI growth through idea exchange and collaboration in crucial sectors like energy, health, and agriculture.

Additionally, the Ministry of Corporate Affairs announced MCA 3.0 to streamline regulatory filing using AI, and the Responsible AI for Youth program by MeitY aims to impart AI skills to government school students, enhancing their employability and practical knowledge.

It is not just the central government; state governments are taking matters into their own hands too. For example, the Telangana government launched the INAI (Intel AI) research centre in collaboration with IIT Hyderabad and the Public Health Foundation of India. This centre focuses on using AI to tackle healthcare and smart mobility challenges, particularly in rural and difficult terrains.

Additionally, the state government has become the first in India to develop AI models in a regional language. In collaboration with Swecha, it is creating a Telugu language model, involving 100,000 students in a datathon to gather high-quality data.

AI Funding Galore 

In the last six months alone, 43 Indian AI startups received $864 million in funding, according to data from AIM Research. 

Among these, enterprise AI startup Ema raised $36 million in Series A funding.

Bengaluru-based Simplismart, founded by Amritanshu Jain and Devansh Ghatak, raised $7 million in a funding round led by Accel, with participation from existing investors, including Anicut Capital, First Cheque, Sunn91, and Shastra VC. 

Another Bengaluru-based AI startup, Unscript, has raised over $1.25 million to date, which has been pivotal in developing its platform and expanding its market reach. 

Additionally, Bloq Quantum, an AI quantum software startup, secured INR 1.3 crore in a pre-seed round led by Inflection Point Ventures. Meanwhile, Ascendion, a provider of digital engineering services, reported a remarkable 382% increase in generative AI revenue in the first half of 2024 compared to the same period in the previous year. 

Not to forget an older but important milestone: in a significant moment for the Indian AI sector, Ola-backed Krutrim became India’s first AI startup to achieve unicorn status with a $50 million funding boost.

Acquisitions, Partnerships, and More

In terms of AI acquisitions, Protect AI acquired Bengaluru-based SydeLabs for $25 million to enhance LLM security. 

C5i acquired Bengaluru-based Analytic Edge for $20 million, marking its second Indian acquisition after Incivus, to enhance AI-driven marketing and sales solutions, as the firm aims for an INR 1,000 crore turnover and a future listing in India.

Coming down to AI partnerships, CoRover.ai recently partnered with AI auditing firm EthosAI.one to benchmark BharatGPT against industry leaders like GPT-4, Llama 2, and Gemini, ensuring reliability, fairness, and accuracy through continuous auditing and enhancements. 

Karya partnered with Microsoft Research to build one of the largest multilingual evaluations of Indian LLMs, involving over 90,000 human assessments of 30 models in 10 languages to address linguistic diversity and cultural nuances, alongside enhancing model performance and reliability.

The AI startup scene in India is buzzing with activity, and the momentum shows no signs of slowing down.

Mukund Jha, the co-founder and former CTO of Dunzo, is in discussions to secure INR 80 crore (approximately $6-10 million) from Together Fund for his new generative AI startup focused on automating quality assurance processes in businesses, integrating AI and SaaS technologies.

Startups appear to be acting on Peak XV Partners MD Rajan Anandan’s words: “Our firm has over INR 16,000 Cr of dry powder. We just want more people starting up in AI.”

As per Tracxn data, AI startups in India raised $599 million in 2022. However, funding dropped sharply to $168.4 million in 2023, a decline of about 72%. In 2024, nearly $96 million has been raised so far, marking a 61% increase over the same period in 2023 but still an 82% decline compared to the first half of 2022.

Interestingly, India ranks third among the top five countries by business funding, surpassing both the United Kingdom and Germany, according to World Economic Forum (WEF) data.

Government Regulatory Support

Coming down to policies and guidelines, NITI Aayog introduced the National Strategy for Artificial Intelligence in 2018, outlining guidelines for AI research and development across various sectors, including healthcare, agriculture, education, “smart” cities, infrastructure, and smart mobility. 

Later, in 2021, NITI Aayog published Part 1 – Principles for Responsible AI and Part 2 – Operationalizing Principles for Responsible AI.

Recently, the government enacted the Digital Personal Data Protection Act in 2023, which can be utilised to address privacy concerns related to AI platforms.

In its 2023 AI report, MeitY has outlined seven pillars for strategic AI development, including Centers of Excellence, Dataset Platform, and future skills initiatives.

Further, the report recommends how India can leverage its demographic dividend and play to its strengths as an IT superpower to deepen the penetration of AI skills in the country, and how it can strengthen AI compute infrastructure to support AI innovation through public-private partnerships (PPPs).

To put it simply in the words of Paul Graham, the American computer scientist – “If you start a startup now to do something that barely works with AI, the next models will make it really work.”

It seems there is no stopping India in AI. The time is NOW.

You Don’t Mess with Google https://analyticsindiamag.com/ai-insights-analysis/you-dont-mess-with-google/ https://analyticsindiamag.com/ai-insights-analysis/you-dont-mess-with-google/#respond Mon, 05 Aug 2024 11:50:42 +0000 https://analyticsindiamag.com/?p=10131345

‘Attention’ has returned to Google

The post You Don’t Mess with Google appeared first on AIM.

The headline seems apt for Google at the moment. Google DeepMind’s latest model, the experimental 0801 version of Gemini 1.5 Pro, recently claimed the top spot on the Chatbot Arena leaderboard, surpassing GPT-4o and Claude 3.5 Sonnet with an impressive score of 1300. It also achieved first place on the Vision Leaderboard.

https://twitter.com/JeffDean/status/1819121162578022849

It excels in multilingual tasks and delivers a robust performance in technical areas like math, hard prompts, and coding. This version of Gemini 1.5 Pro is available for early testing and feedback in Google AI Studio and the Gemini API, where developers can try it out. 

“Very impressed with the extraction capabilities from images and PDFs. It does feel super close to what GPT-4o gives in terms of vision capabilities and what Claude 3.5 Sonnet gives me on code generation and PDF understanding/reasoning,” said Elvis Saravia, co-founder, DAIR.AI.

“The new Gemini model can execute code and is quite ‘sharp.’ We might see it become the first competitor to the Code Interpreter,” said Ethan Mollick, assistant professor at Wharton. He added that while experimenting with it for data analysis, some capabilities seemed odd, such as the fact that it couldn’t access files unless they were uploaded each time. “Nevertheless, it seems promising so far,” he said.

Most recently, Google also upgraded its free tier Gemini model with 1.5 Flash, which brings improvements in quality and latency, with especially noticeable enhancements in reasoning and image understanding. The tech giant also expanded the context window in Gemini Advanced, quadrupling it to 32K tokens. 

Following in the footsteps of its competitor OpenAI, Google has announced improvements to Gemini 1.5 Flash, reducing input costs by up to 85% and output costs by up to 80%, effective from August 12, 2024.

Apart from focusing on LLMs, Google entered the SLM scene as well by releasing Gemma 2. The new model outperformed GPT-3.5 on the Chatbot Arena leaderboard and is optimised for efficient deployment across a wide range of hardware, from edge devices to robust cloud environments. Leveraging NVIDIA’s TensorRT-LLM library optimisations, it supports deployment in data centres, local workstations, and edge AI applications.

Inflection AI, Adept AI, Character.AI. Who’s next?

Google recently brought back Noam Shazeer, one of the original authors of the famous Transformer paper ‘Attention is All You Need’, who had left Google to create his own chatbot company, Character.AI. Surprisingly, Character.AI gets more web traffic in the US than Gemini, according to Similarweb.

This is similar to what Microsoft and Amazon have done with Inflection and Adept AI, respectively. Microsoft hired much of Inflection’s leadership team and employees, agreeing to pay a licensing fee of approximately $650 million. Meanwhile, Amazon hired Adept’s co-founder and CEO David Luan, along with several other co-founders and key team members.

“Absolutely delighted to welcome my longtime Google colleague Noam Shazeer back to Google!” said Jeff Dean, Google DeepMind’s chief scientist.

Others from DeepMind followed suit in welcoming Shazeer back. “Welcome back Noam Shazeer to Google! It’ll be a great time working together again since 2018. Let’s take Gemini which is #1 and continue expanding the limits of its capabilities,” said Dustin Tran, research scientist at Google DeepMind.

In a new arrangement, Google will pay a licensing fee to Character.AI for its models. Around 30 of Character.AI’s 130 staff members, specialising in model training and voice AI, will join Google to advance its Gemini AI initiatives. 

Interestingly, Character.AI recently introduced Character Calls, allowing users to engage in two-way voice conversations with their favourite characters, like Michael Jackson, Sherlock Holmes, or Albert Einstein. “It’s like having a phone call with a friend,” the company said.

“So this also means that google is getting into the “ai companion” market to compete with openai’s “her” lol,” one user posted on X in response. 

Character.AI allows users to create their own chatbots, defining their personalities, traits, and backstories. These characters can be based on real people, fictional characters, or entirely new creations. 

This is similar to what Meta AI is doing by introducing customised AI characters across its social media platforms. Meta recently introduced AI Studio, a new platform that allows users to create, share, and discover custom AI characters without needing technical skills.

OpenAI recently rolled out its advanced voice mode to a small group of users, who have received it well and found several use cases for it. However, GPT-4o’s full multimodal capabilities have still not been rolled out.

“Advanced voice mode on ChatGPT is very impressive, but it doesn’t yet have the full features enabled: no multimodal video or images in, no GPTs, no new image generator, no code interpreter or internet access,” said Mollick.

On the other hand, Google DeepMind’s latest models, AlphaProof and AlphaGeometry 2, solved four out of six problems from this year’s International Mathematical Olympiad (IMO), achieving a score equivalent to a silver medallist in the competition.

The word on the street is that OpenAI is also working on advanced reasoning capabilities for its next frontier model with ‘Project Strawberry’, which is most likely to be GPT-5. However, it appears that it is not coming out anytime soon.

“Little birdies told me Q* hasn’t been released yet as they aren’t happy with the latency and other little things they want to further optimise,” posted an insider who goes by the name Jimmy Apples on X.

“Not sure if it’s big patience or small patience.”

Layoff is a Gift https://analyticsindiamag.com/ai-insights-analysis/layoff-is-a-gift/ https://analyticsindiamag.com/ai-insights-analysis/layoff-is-a-gift/#respond Mon, 05 Aug 2024 07:00:15 +0000 https://analyticsindiamag.com/?p=10131314

A recent Indeed report reveals that employee burnout is increasing, with 52% of workers feeling its effects.

The post Layoff is a Gift appeared first on AIM.

“We regret to inform you that as of today, your position has been eliminated” — emails bearing this tone and tenor are the most dreaded and can often turn employees’ lives topsy-turvy, especially for those used to a certain lifestyle that the tech sector promises. 

According to Layoffs.Fyi, as of July 2024, 360 tech companies laid off over 104,400 employees this year alone. And now, Intel plans to cut around 15,000 jobs, which amounts to 15% of its workforce, in a strategic move to save $10 billion by 2025. 

Hate us not, but layoffs can be good news parading as bad – a blessing in disguise.

Recently, filmmaker and executive producer Monica Medellin said that she is grateful for her layoff. If it hadn’t happened, she would have never carved her own career path. 

She is not alone. A Reddit user wrote that he was laid off from McKinsey last year and received a 4.5-month severance package. He took a two-month trip to Europe and then secured a contract role that was converted to a full-time position within a few months.

He also started a side hustle that generated $33k in income last year. Overall, he earned the highest income he had ever made and attributed this success to the layoff. His takeaway: there is definitely hope, and one should keep their LinkedIn profile updated.

Another story is that of a Meta employee who worked for seven months before being laid off just as he was about to close on a house. He subsequently joined New Relic as a lead data engineer for a similar salary but received a double paycheck for one month and a severance package, which significantly improved his finances. 

With the signing bonus, he earned a substantial amount of money in less than a year of work. Well, he’s now grateful to Meta!

And the stories go on.  


Better than Burnout

Burnout has always pushed employees to silently quit their jobs. A recent Indeed report reveals that employee burnout is increasing, with 52% of workers feeling its effects. Stress, fatigue, and mental health challenges are impacting employees of all ages and job types in the corporate environment.

This has encouraged many to build startups.

Kishore Indukuri left his job at Intel in the US to return to India and start Sid’s Farm, a dairy delivering unadulterated milk to Hyderabad residents. Now, his dairy brand earns Rs 44 crore revenue. 

A layoff could also be the push that brings out the successful influencer waiting inside you.

According to Influencer Marketing Hub, YouTubers make an average of $0.018 per ad view. However, the amount also depends on factors such as the number of followers, views, clicks on ads, ad quality, ad blockers, and video length.

Here’s an interesting story: Vanessa Chen, a 23-year-old content creator in Boston, shared that she was a computer science undergraduate with plans to become a software engineer. But she became a full-time content creator with more than 4 million followers across YouTube, Instagram, and TikTok and earns a mid-six-figure income.

So, creating a YouTube channel can be a great part-time job, especially if you fear job security at your workplace. 

Meanwhile, to dilute the effects of burnout, some companies like Zoho are leveraging cloud computing, unlocking the potential of rural talent, and creating employment opportunities in the process.

Zoho’s Rural Revival is an initiative where miniature farming sites are built to grow fresh produce and provide a way for everyone to bring their families together, connect, and help relieve stress.

Further, they have set up mini-offices in the hinterlands, enabling people to stay close to the ones they love, invest in a home of their own, cut down on travel costs, and stay out of debt. 

“In five years, 50 percent of our employees will work from smaller, rural centres. We want to keep people rooted in their towns and villages and provide world-class jobs in these places,” said Sridhar Vembu, CEO and co-founder of Zoho.

With employees close to home and family, burnout is far less likely to enter the picture.

Time to Rewrite Your Resume

As most companies continue to believe that reducing staff is the best and easiest solution to financial problems, it is evident that your resume needs an overhaul.

The stories of the Vizzy CEO’s unconventional hiring practices, the self-taught journeys of individuals like Izam Mohammed, and the insights shared by industry leaders all reflect a fundamental truth: expertise and skill in technology are no longer solely defined by degrees.

Well, Sam Altman, the CEO of OpenAI, thinks, “University degrees are IMO status and not substance at this point.”  

The Kerala Landslides Prove Why AI-Based Climate Tech is Vital  https://analyticsindiamag.com/industry-insights/the-kerala-landslides-prove-why-ai-based-climate-tech-is-vital/ https://analyticsindiamag.com/industry-insights/the-kerala-landslides-prove-why-ai-based-climate-tech-is-vital/#respond Fri, 02 Aug 2024 05:30:00 +0000 https://analyticsindiamag.com/?p=10131204

“Besides climate disasters, many issues are directly caused by a lack of good research and analysis during the planning phase for infrastructure projects.”

The post The Kerala Landslides Prove Why AI-Based Climate Tech is Vital  appeared first on AIM.

The recent Wayanad landslides stand testament to how poor planning can cause major climate tragedies, leading to the loss of life and property. 

In an interview, geoscientist CP Rajendran from National Institute of Advanced Studies attributed the floods to deforestation, unscientific construction practices and quarrying, among other things. “The agencies involved in landslide susceptibility maps like the State Disaster Management should be involved in updating the existing maps.

“The satellite observations and digital elevation models are now widely used to map the landslide risk mapping. The government must frame clear policies on land management based on such zoning maps by taking the people into confidence,” he said.

Similarly, spurred by the 2018 Camp Fire wildfire in California, three IIT alumni Rahul Saxena, Abhishek Singh and Nitin Das co-founded AiDash, an AI-first vertical SaaS company, in 2019. According to them, the wildfire highlighted the need for disaster prevention and vegetation management, especially when it came to infrastructure building, something that is sorely lacking in India.

Poorly researched plans to build infrastructure in the country have long contributed to several issues. Besides climate disasters, issues like traffic surges are directly caused by a lack of research and analysis during the planning phase for large projects.

To prevent this, companies like AiDash have cornered the market by offering geospatial analysis and remote sensing services, using AI to help clients make informed decisions on their current and proposed building projects, particularly those looking to build climate-resilient and sustainable critical infrastructure.

Why is Monitoring so Important?

By cornering a very specific market, AiDash has managed to remain virtually peerless in what it offers.

This is evidenced by the fact that the company, despite being founded only five years ago, has raised $91.5 million in funding, including a $58.5 million Series C round led by Lightrock that closed in April this year.

“We are the world’s first satellite-powered vegetation management provider. Vegetation management is an interesting use case where the power utilities have to manage the vegetation around the power lines. If they don’t, the vegetation gets closer to the wire and can cause an outage by falling on it.

“Even worse, over the past five to ten years, there have been a lot of wildfires caused because the vegetation was too close to the wire. So it’s like a double-edged sword for companies. A large utility company in California, for instance, almost went bankrupt because of the wildfire,” Saxena told AIM.

In terms of competition, AiDash remains a key player in the industry, alongside companies like LiveEO, Vibrant Planet, Green PRAXIS, and even SatSure. Currently, the company counts six of the largest utility companies in the US and two of the largest water companies in the UK as its clients.

For infrastructure, the company makes use of AWS but also runs internal data centres, which it uses to train its models and host its own GPUs. Additionally, Saxena said the company is planning to incorporate generative AI into its offerings, allowing clients to directly ask questions about the data gathered through the platform.

Vegetation Monitoring in India

With much of its clientele based in the US and the UK, and as many as a million miles of coverage in North America, the company now plans to grow its client base in India.

Despite being headquartered in San Jose, the company largely operates in India. The Lightrock-backed climate tech startup recently opened an AI centre of excellence (CoE) in Bengaluru. 

AiDash already has two established offices, in Bengaluru and Gurugram, with the CoE adding to its bases in India. Additionally, Saxena said that the company plans to double its team in India. “The vision is a cleaner, safer, greener Earth. We want to work towards climate and sustainability. So we are open to doing whatever we can in India to achieve that.

“Currently, we’re building in India for the world. Meanwhile, many companies start up somewhere else and come to India to sell their product,” Saxena pointed out.

You See, C is Still the King in the Sea of Languages https://analyticsindiamag.com/ai-insights-analysis/c-language/ https://analyticsindiamag.com/ai-insights-analysis/c-language/#respond Thu, 01 Aug 2024 13:00:00 +0000 https://analyticsindiamag.com/?p=10131175

Despite the drawbacks, the world still benefits from C even though higher-level languages are more commonly used

The post You See, C is Still the King in the Sea of Languages appeared first on AIM.

In a recent experiment, TalentNeuron’s machine learning lead, Andriy Burkov, optimised a Python-based text-processing task by rewriting it in C with the help of the AI assistant Claude. The Python implementation took 63 minutes, while the C version completed the same task in just 2.3 minutes, a speedup of roughly 27x.
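Burkov’s actual code is not public, but a minimal, hypothetical sketch of the kind of rewrite involved might look like the following: a compiled inner loop that counts whitespace-separated tokens from stdin, the sort of per-character work an interpreter handles far more slowly. The file name and the token-counting task here are illustrative assumptions, not details from the experiment.

```c
/* wc_tokens.c - hypothetical sketch, not Burkov's actual code.
 * Counts whitespace-separated tokens read from stdin.
 * Build once to native machine code, then run the binary directly:
 *   gcc -O2 wc_tokens.c -o wc_tokens && ./wc_tokens < corpus.txt
 */
#include <stdio.h>
#include <ctype.h>

int main(void) {
    long tokens = 0;
    int c, in_token = 0;

    while ((c = getchar()) != EOF) {
        if (isspace(c)) {
            in_token = 0;               /* token ended */
        } else if (!in_token) {
            in_token = 1;               /* transition into a new token */
            tokens++;
        }
    }

    printf("%ld tokens\n", tokens);
    return 0;
}
```

Because the loop is compiled ahead of time, every character is processed by a handful of native instructions, whereas an equivalent Python loop pushes each character through the interpreter’s dispatch machinery, which is where much of the gap in experiments like this tends to come from.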

With all the new “modern” languages out today, how is C still believed to be the fastest and “closest to the machine”? 

C Over Python

While Python is renowned for its simplicity and ease of use, it is also known for its slower execution times. C, on the other hand, is prized for its speed, primarily because it is compiled straight into assembly or machine code before being executed.

C programs execute quickly primarily because they are translated into machine code prior to execution. Since machine code is the language that computers comprehend directly, no additional translation is required when the program is operating. 

Pre-translation prevents the needless extra steps that can impede the speed of programs written in other languages that may require real-time translation into machine code. C programs can operate substantially more quickly by omitting this translation stage during execution.

Thanks to its portability and effectiveness, “C is frequently used to implement compilers, libraries, and interpreters for other programming languages”.  The main implementations of interpreted languages, such as PHP, Python, and Ruby, are written in C. 

Another factor that distinguishes C from other languages is that it is a smaller, simpler language. In a discussion on Reddit, a developer named Blargh said, “It hasn’t changed very much in 30 years. Its limitations are usually tolerable for the problem domain it was designed for.”

There isn’t Much That’s Special About C

Having said that, many developers believe speed is natural to C rather than anything unique. In a Stack Overflow discussion, developer Sebastian Karlsson pointed out that C lacks features like garbage collection, dynamic typing and other facilities.

“Newer languages have all these, hence there is additional processing overhead which will degrade the performance of the application,” he added.

If anything, that is actually a point against C. “These require programmers to manually manage memory allocation and deal with static typing,” said Karlsson.

In another Reddit discussion, a developer pointed out the same problem. “If you don’t mind that a program might behave unpredictably when given bad inputs, a C program might run faster because it doesn’t always check for things like whether you’re accessing an array outside its bounds. This lack of checks can make C code run faster than code in other languages that do include these safety checks.”

However, if you need your program to be very secure and free from exploits like arbitrary code execution, you would need to add extra safety checks in C, making it cumbersome.
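To make that trade-off concrete, here is a minimal, hypothetical C sketch; the function names are illustrative, not from any library. The unchecked read is what C gives you by default, while the checked variant is the extra safety code a programmer has to write by hand.

```c
/* bounds.c - illustrative sketch of C's missing safety net.
 * The unchecked read compiles even for a bad index; its behaviour is simply
 * undefined. The checked variant is the "extra safety check" a programmer
 * must add manually. */
#include <stdio.h>

#define LEN 4

int read_unchecked(const int *values, int i) {
    return values[i];                  /* no bounds check: fast, but unsafe */
}

int read_checked(const int *values, int i, int *out) {
    if (i < 0 || i >= LEN) {
        return -1;                     /* manual check the language never inserts */
    }
    *out = values[i];
    return 0;
}

int main(void) {
    int values[LEN] = {10, 20, 30, 40};
    int v;

    printf("unchecked read at a valid index: %d\n", read_unchecked(values, 2));

    if (read_checked(values, 7, &v) != 0) {
        printf("index 7 rejected by the manual bounds check\n");
    }
    return 0;
}
```

Languages with built-in bounds checking effectively run the checked version on every array access, which is exactly the overhead the comment above describes.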

For some, Rust presents a compelling alternative to C, particularly for projects that prioritise safety and maintainability without sacrificing performance. So much so that Microsoft Azure CTO Mark Russinovich said, “It’s time to halt starting any new projects in C/C++ and use Rust for those scenarios where a non-GC language is required. For the sake of security and reliability, the industry should declare those languages as deprecated.”

C is Still the King

Despite the drawbacks, the world still benefits from C even though higher-level languages are more commonly used. A majority of the work on Microsoft’s Windows kernel is done in C, though some parts are written in assembly language.

With almost 85% of the market share, the world’s most popular operating system has run on a kernel written in C for many years.

Additionally, Linux is primarily written in C, with some assembly code. The Linux kernel is used by roughly 97% of the world’s 500 most powerful supercomputers.

The kernels for iOS, Android, and Windows Phone are also written in C; they are essentially mobile adaptations of the macOS, Linux, and Windows kernels. Thus, C-based kernels power the devices you use every day.

Young Indians Are More Likely to Be Jobless if They’re Educated  https://analyticsindiamag.com/ai-insights-analysis/young-indians-are-more-likely-to-be-jobless-if-theyre-educated/ https://analyticsindiamag.com/ai-insights-analysis/young-indians-are-more-likely-to-be-jobless-if-theyre-educated/#respond Thu, 01 Aug 2024 08:30:00 +0000 https://analyticsindiamag.com/?p=10131122

Indian IT companies are anyway going to train you in generative AI.

The post Young Indians Are More Likely to Be Jobless if They’re Educated  appeared first on AIM.

If you are an employee at an Indian IT company, chances are you have received an email from your manager asking you to update your resume to mention generative AI skills. Undergoing such training is now a mandatory drill; one could even call it a mania.

Because back in 2017, Mark Cuban, the famous American businessman, said “Artificial Intelligence, deep learning, machine learning — whatever you’re doing if you don’t understand it — learn it. Because otherwise you’re going to be a dinosaur within 3 years.”

This hasn’t changed. Almost an entire generation is studying for jobs that won’t exist. And when it comes to the Indian IT industry, if you are highly educated, you may struggle to land a job that pays you well enough. But if you are less upskilled and do not demand a higher salary, you might be in a better position to land one.

Indian Employees Look to Reskill

A Reddit user wrote, “I am 25 with a bachelor in CS and masters in AI. With zero YOE [Years of experience], I found it tough to land a job after college and had to settle with a generic SDE [Software Development Engineer] role at a small company.” Many fresh graduates shared similar concerns on forums like Reddit.  


This sentiment reflects a broader issue faced by many new entrants into the tech industry. Aspiring software engineers/developers consider several strategies to stand out and secure desirable positions.

Indian IT companies have all recently highlighted their commitment to integrating generative AI into their operations. Top firms like TCS, Infosys, and Wipro have been leveraging technology-enabled training for their employees.

Wipro’s foundational training for over 225,000 employees, coupled with advanced AI training for an additional 30,000, highlights its approach to AI education within the workforce.

Similarly, Accenture has been expanding its data and AI workforce, which has reached approximately 55,000 skilled practitioners, with a goal of growing that number to 80,000 by the end of FY2026.

Further, HCL plans to train and upskill around 20,000 employees in generative AI every quarter. It aims to reach 100,000 employees by the end of FY25, reflecting its strategic focus on AI proficiency.

TCS is taking a similar approach, training over 150,000 employees in collaboration with tech giants. TCS aims to build the largest AI-ready workforce in the world through organic reskilling efforts. 

Milind Lakkad, TCS’ executive vice president and global head of human resources, noted that the company now has over 100,000 GenAI-ready employees and is investing in further deepening their expertise. 

Infosys has trained around 250,000 employees in GenAI capabilities, focusing on enhancing its service offerings through the technology.

Capgemini, meanwhile, has scaled its capabilities significantly by training over 120,000 employees on generative AI tools.

What’s Next?

In its Q3 FY24 earnings report, Accenture announced over $900 million in new bookings for generative AI, taking its total to $2 billion for the fiscal year to date.

“We have achieved two significant milestones this quarter — with $2 billion in generative AI sales year-to-date and $500 million in revenue year-to-date — demonstrating our early lead in this critical technology,” the company said in a statement.

Meanwhile, Indian IT majors like TCS have doubled up on their AI game. The IT giant announced that it is doubling its AI pipeline to $1.5 billion this quarter, from the $900 million pipeline reported in the previous one.

TCS is now also working on around 270 AI projects worldwide. Moreover, in Q1 FY25, it applied for 154 patents and was granted 277, as revealed by TCS chief K Krithivasan.

Coming to Wipro, it is driving innovation and enhancing productivity through investments in the Lab 45 AI platform and Wipro Enterprise GenAI Studio, incorporating various GenAI tools into the software development lifecycle.

Further, HCL’s AI-led engagements include implementing a GenAI-based solution for a global technology major, automating gaming review analysis, which resulted in significant workload reduction and a 119% increase in game reviews. 

Additionally, it is transforming the client’s content lifecycle management with GenAI features.

Capgemini, for its part, is currently engaged in over 350 new generative AI projects, with more than 2,000 deals in the pipeline, and continues to invest in related tools, assets, and platforms.

It seems the Indian IT giants have taken up the herculean task of upskilling Indian techies in generative AI, and of keeping them employed.
