Data Science Hiring and Interview Process at SAP Labs India https://analyticsindiamag.com/interview-hiring-process/data-science-hiring-process-at-sap-labs-india/ Mon, 29 Jul 2024

As a data scientist at SAP Labs, you will analyse large datasets, implement best practices to enhance ML infrastructure, and support engineers and product managers in integrating ML into products.

The post Data Science Hiring and Interview Process at SAP Labs India appeared first on AIM.


German enterprise software giant SAP has been one of the major players in the generative AI race on the enterprise side. The company recently introduced Joule, a natural-language generative AI assistant that provides access to its extensive cloud enterprise suite across different apps and programs, offering real-time data insights and action recommendations.

With a global presence in 19 countries, SAP Labs is responsible for driving SAP’s product strategy, developing and localising its core solutions, and contributing to the SAP Business Technology Platform. 

SAP was founded in 1972 by five former IBM employees: Dietmar Hopp, Hasso Plattner, Claus Wellenreuther, Klaus Tschira, and Hans-Werner Hector. SAP Labs is the R&D arm of SAP, with its second-largest office space located in Bengaluru. 

AIM got in touch with Shweta Mohanty, vice president and head of human resources, SAP India, and Dharani Karthikeyan, vice president and head of engineering for analytics, SAP Labs India, to understand the company’s AI and analytics play, customer stories, hiring process for data scientists, work culture and more. 

AI & Analytics Play

“We have fully embraced generative AI in our business AI concept, aiming to provide AI that is responsible, reliable, and relevant. The goal is to infuse AI into business applications, with a focus on trust and outcomes,” Karthikeyan told AIM.

SAP has a portfolio of over 350 applications spanning various use cases, from cash management to document scanning. The company is enhancing its Business Technology Platform (BTP) with a generative AI layer. The team aims to improve business processes while maintaining human control over decisions. They have collaborated with Microsoft on Human Capital Management tools, combating biases in recruiting, and introduced a Business Analytics tool for faster insights. 

SAP is also partnering with Google Cloud to launch a holistic data cloud, addressing data access challenges. Additionally, they have invested in generative AI players Anthropic, Cohere, and Aleph Alpha, diversifying their capabilities.

Interview Process

The hiring process for tech roles involves five to six steps starting with profile screening, focusing on the candidate’s development background and programming language proficiency. As described by Mohanty, this is followed by an online assessment to test programming skills, lasting 60 to 90 minutes. Technical interviews include case studies to assess proficiency and hands-on experience. 

For senior roles, there’s a discussion with a senior leader to gauge cultural alignment. The final step is an HR discussion focusing on cultural fit and interest in the organisation. For college recruitment, the process includes live business solutions assessments. The process concludes with a rigorous background verification.

“When it comes to finding the right fit for SAP Labs, the ideal candidate should have a comprehensive understanding of ML algorithms and the ability to build and maintain scalable solutions in production,” added Karthikeyan, highlighting that this involves using statistical modelling procedures, data modelling, and evaluation strategies to find patterns and predict unseen instances. 

The roles involve using computer science fundamentals such as data structures, algorithms, computability, complexity, and computer architecture. Collaborating with data engineers is also essential for building data and model pipelines, as well as managing the infrastructure needed for production code. 

As a data scientist at SAP Labs India, you will also analyse large, complex datasets, research and implement best practices to enhance the existing ML infrastructure, and support engineers and product managers in integrating ML into products.

Work Culture at SAP

SAP’s work culture is characterised by abundant learning opportunities and hands-on experiences where employees have chances to shadow leading data scientists, participate in fellowship projects for stretch assignments, and explore various aspects. This hands-on approach extends to customer interactions and pre-sales experiences. 

“These opportunities, along with the focus on learning and customer engagement, give SAP an edge over other organisations hiring in data science and machine learning,” Mohanty commented.

SAP prioritises its employees’ well-being through a comprehensive set of benefits and rewards. The company recognises diverse needs beyond healthcare and retirement plans, offering global and local options for work-life balance, health and well-being, and financial health. 

Embracing a highly inclusive and flexible culture, the company promotes a hybrid working model allowing employees to balance office and remote work. Employee Network Groups foster a sense of community, and inclusive benefits include competitive parental leave and disability support. 

The ERP software giant also aims to foster personal and professional growth, providing learning opportunities, career development resources, and a leadership culture focused on doing what’s right for future generations. It values fair pay, employee recognition, generous time-off policies, variable pay plans, total well-being support, and stock ownership opportunities for all employees.

Why Should You Join SAP Labs?

SAP Labs offers a sense of purpose and involvement in transformative technology phases. At SAP, candidates dive into cutting-edge technologies, explore diverse industries, and embrace continuous learning and innovation. 

Mohanty explained how the team values adaptability, emphasising fungible skills and a proactive mindset, especially in areas like AI and generative AI. 

“We seek individuals ready to tackle new challenges and solve complex problems, fostering a dynamic and impactful work environment,” she explained. 

Adding to what Mohanty said, Karthikeyan noted, “The work at SAP involves mission-critical applications, like supporting cell phone towers or vaccine manufacturing, so the integration of generative AI into these applications offers a unique combination of purpose and technological advancement, providing developers with a high sense of purpose in seeing their software run essential business and retail operations. This phase of technological transformation at SAP is especially significant for new joiners.” 

Check out the job openings here.

Data Science Hiring and Interview Process at ServiceNow https://analyticsindiamag.com/interview-hiring-process/data-science-hiring-process-at-servicenow/ Mon, 29 Jul 2024

The company has eight open positions for applied research scientists and ML engineers.

The post Data Science Hiring and Interview Process at ServiceNow appeared first on AIM.

]]>

California-based ServiceNow, one of the leading cloud companies delivering software as a service (SaaS), has brought purpose-built AI to its highly intelligent Now Platform. Last September, the company expanded this with a domain-specific ServiceNow language model designed for enterprise use.

The Now Platform converts machine intelligence into practical actions, aiming to enhance process efficiency, reduce risk, optimise workforce productivity, and facilitate automated workflows with the help of purpose-built AI, providing users with self-solving capabilities through augmented intelligence.

“With this (NOW Platform), we are enabling enterprises to increase process efficiency, minimise risk by avoiding human mistakes, optimise workforce productivity to focus on higher value tasks​, leverage automated workflows to drive standardisation and empower users to self-solve with augmented intelligence,” Sumeet Mathur, vice president and managing director of ServiceNow’s India Technology and Business Center, told AIM. 

The company has eight open positions in data science.

Applied research scientists in the Core LLM Team focus on developing generative AI solutions, collaborating with diverse teams to create AI-powered products and work experiences. Simultaneously, they conduct research, experiment, and mitigate risks associated with AI technologies to unlock novel work experiences. 

On the other hand, as a machine learning engineer, you’ll craft user-friendly AI/ML solutions to enhance enterprise services’ efficiency, emphasising accessibility for users with varying technical knowledge. 

Inside ServiceNow’s AI & Analytics Lab

The Now platform aims to create proactive and intelligent IT processes. The platform is built around big data and advanced analytics, incorporating real-time and stored data to enhance accessibility and support various use cases, such as self-service, incident detection, pattern discovery, knowledge base optimisation, workflow automation, and user empowerment. 

ServiceNow’s self-service has evolved with augmented AI and automation, using intelligent virtual agents to understand customer intent and resolve complex issues. Augmented agent support focuses on improving human capabilities through recommendation engines, automated workflows, and increased productivity, aligning with specific business objectives for measurable value.

Tapping into Generative AI

Last September, the company expanded its Now Platform with a domain-specific ServiceNow language model designed for enterprise use, prioritising accuracy and data privacy. The Now LLM incorporates top-notch foundation models, including StarCoder, a pre-trained code model developed in collaboration with Hugging Face, along with a partnership with NVIDIA and other open-source models. 

The initial release of Now LLM introduces features such as interactive Q&A, summarisation capabilities for incidents/cases and chats, and assistive code generation for developers. The development of this model involved significant efforts from engineering, research, product, QE, and design teams, as well as data centre operation teams managing the GPU infrastructure. 

Clients like Mondelez, Delta, Standard Chartered, Coca-Cola, LTIMindtree, and various other companies across industries have used the platform for AI applications in areas like improving healthcare workflows, providing financial auditors with quick insights, and transforming supply chain management in manufacturing. 

“We believe that the most constructive and value-creating strategies for generative AI are grounded in embedding human experience and expertise into its core capabilities,” added Mathur. 

The company therefore adopts a human-in-the-loop model for generative AI, integrating human expertise into its core capabilities. The Now Platform’s generative AI is applied in diverse use cases, including case summarisation, content generation, conversational exchanges, and code generation. 

Interview Process

“Our hiring process for data science roles follows a structured approach aimed at attracting a diverse pool of qualified candidates. We publish job openings on various platforms, including our career site, job boards, social media, and professional networks,” added Mathur. The process involves careful evaluation through interviews to ensure the selection of the right candidate. 

The interview process consists of three technical rounds, each focusing on key competencies such as programming proficiency and experience with core ML and LLM. This assessment is followed by an interview with the hiring manager and, for certain roles, an additional round with the senior leadership. 

However, Mathur shared that during the data science interview process, candidates often make common mistakes that should be avoided. Some of them include inadequate technical readiness, a limited understanding of the company’s objectives and role, failure to ask insightful questions, overlooking the latest AI/ML trends, and neglecting to demonstrate effective problem-solving skills. 

Expectations

Upon joining the data science team at the Advanced Technology Group (ATG) of ServiceNow, candidates can expect to work within a customer-focused innovation group. The team builds intelligent software and smart user experiences using advanced technologies to deliver industry-leading work experiences for customers. 

The ATG comprises researchers, applied scientists, engineers, and product managers with a dual mission: building and evolving the AI platform and collaborating with other teams to create AI-powered products and work experiences. The company expects that team members will contribute to laying the foundations, conducting research, experimenting, and de-risking AI technologies for future work experiences.

Work Culture

“Our company fosters a purpose-driven work culture where employees have the opportunity to be part of something big. We make work better for everyone—including our own. We know that your best work happens when you live your best life and share your unique talents, so we do everything we can to make that possible for our employees,” Mathur added.

Some of the key perks include a hybrid working model, paid time off, well-being days, employee belonging groups, DEI learnings, internal opportunities, and paid volunteering.

According to him, joining ServiceNow means becoming part of an inclusive and diverse community with resources for well-being, mental health, and family planning, among others. Prioritising value and trust, the SaaS giant provides ongoing support for learning and development, growth pathways, and action-oriented feedback aligned with clear expectations. The programs cater to individuals at all career stages. 

“We’re committed to creating a positive impact on the world, building innovative technology in service of people – with a core set of values and a deep responsibility to each other, our customers and our global communities,” he concluded.

Check out the careers page now.

Data Science Hiring and Interview Process at Happiest Minds Tech https://analyticsindiamag.com/interview-hiring-process/data-science-hiring-process-at-happiest-minds-2/ Mon, 29 Jul 2024

Happiest Minds is currently on the lookout for a specialist in marketing analytics with over 8 years of relevant experience. 

The post Data Science Hiring and Interview Process at Happiest Minds Tech appeared first on AIM.

]]>

Founded in 2011 by Ashok Soota, a serial entrepreneur and Indian IT veteran, Happiest Minds boasts a robust data science team comprising over 300 members, including data engineers, intelligence specialists, and data science experts.

Based in the Silicon Valley of India, Bangalore, and extending its reach across the global landscape, including the US, UK, Canada, Australia, and the Middle East, this IT juggernaut seamlessly blends augmented intelligence with the art of understanding human language, deciphering images, analysing videos, and harnessing cutting-edge technologies such as augmented reality and virtual reality.

This dynamic fusion empowers enterprises to craft captivating customer interactions that surpass rivals and set new industry standards.

Happiest Minds distinguishes itself from traditional IT companies by avoiding legacy systems like SAP and ERP, believing that staying entrenched in these technologies limits growth and innovation. “Instead, we have chosen to focus on digital technologies like AI, which is the future of IT,” said Sundar Ramaswamy, SVP, Head of Analytics CoE, in an exclusive interview with AIM.

The team conducts regular market scans to identify the latest technologies and ensures that they are always at the forefront of innovation. This approach allows them to co-create and innovate with clients while building new solutions.

Now Hiring

Happiest Minds is currently on the lookout for a specialist in marketing analytics. The ideal candidate should possess a Master’s or Bachelor’s degree in Computer Science, STEM, or an MBA, demonstrating strong problem-solving skills. They should also have over eight years of experience in the analytics industry, particularly in marketing. 

This experience should include a track record of using AI to enhance the customer journey, encompassing areas such as customer acquisition, nurturing, retention, and improving the overall experience.

The technical skills required include proficiency in statistical techniques, ML, text analytics, NLP, and reporting tools. Experience with languages and tools such as R, Python, Hive, and SQL, and the ability to handle and summarise large datasets using SQL, HiveQL, or Spark, are essential.

Additionally, knowledge of open-source technologies and experience with the Azure or AWS stack are desirable.

AI & Analytics Play

This team collaborates closely with domain teams across diverse industry verticals. Their analytics process follows eight key steps. They integrate data from multiple sources, use BI tools for descriptive analytics, perform ad hoc analysis, build data pipelines and auto ML pipelines, retrain models regularly, focus on customer understanding, optimise cloud usage, and ensure data governance.

Their key industry verticals are CPG and retail, healthcare (bioinformatics), FSI, media and entertainment, and edtech, with growing interest in manufacturing. The team works with classical analytics, deep learning, computer vision, NLP, and generative AI. This includes advanced applications like language translation and generating 3D images from 2D images using generative AI.

Recognising the growing importance of generative AI, they have formed a dedicated task force comprising approximately 50 to 60 members, drawn from diverse domains, under the leadership of their CTO with the primary objective to leverage generative AI in addressing industry-specific challenges.

To achieve this, they’ve identified and categorised 100 to 250 distinct use cases across ten different domains, tailored to the specific requirements of each domain. The team is diligently working on creating demos and proof of concepts (POCs) that are domain-specific. 

Some team members come from analytics backgrounds, contributing their technical expertise, while others from domain areas contribute to shaping ideas and ensuring results align with the industry’s needs. This undertaking is substantial for the organisation, considering they have around 5,500 employees, with 100-160 dedicated solely to generative AI. 

In addition to building demos, the company is also focusing on educating its entire workforce about LLMs and their applications to equip all team members with a basic understanding of generative AI’s capabilities and potential applications.

To bring generative AI into action, the company is working with Microsoft’s suite of products. “We are a Microsoft select partner and are also experimenting with different language models,” he added.

The team initially experimented with Google’s BERT and now employs models like GPT-2. They have a strategic inclination towards refining existing models to suit specific applications, rather than developing entirely new foundational models. For example, they collaborate with a healthcare company to craft adaptive translation models with reinforcement learning.

Interview Process

“Data science is not just about technical skills; it also involves an element of art. Candidates are assessed on their ability to communicate their results effectively and their capacity to approach problems with creativity,” said Ramaswamy.

The interview process for data science candidates at Happiest Minds typically involves three to five levels of interviews. The first level is a screening by the HR team based on the job description. This is followed by a written test to assess the candidate’s proficiency in relevant languages and skills. For example, if the position is for a data engineer, the test might evaluate their ability to work with SQL and other database-related tasks. 

Technical interviews are conducted using case studies to evaluate the candidate’s problem-solving ability and approach. The interview process concludes with a leadership interview, especially if the position is a senior one.

In addition to understanding the interview process, candidates often wonder about the common mistakes they should avoid. According to Ramaswamy, there are two main pitfalls that candidates often fall into. First, many candidates focus excessively on specific tools or techniques and become fixated on mastering them.

“While technical proficiency is essential, it’s equally important to explain the problem being solved, the reasons for approaching it a certain way, and considering alternative solutions,” he added.

The second common mistake is becoming too narrowly focused on the solution without understanding the broader context. It’s crucial to see the big picture, why the problem is being solved for the client, and to ask relevant questions about the projects they’ve worked on. 

In terms of skills, the company looks for both technical and non-technical abilities. The specific skills depend on the role of the position, such as data engineering, business intelligence, or data science. 

However, primary technical skills include proficiency in relevant tools and technologies, certifications, and problem-solving abilities. Non-technical skills are communication and presentation skills, problem-solving skills, and the ability to coach and mentor, as collaboration and teamwork are essential for senior positions.

Work Culture

“As the company’s name suggests, we aim to cultivate a distinctive work culture based on four fundamental pillars,” Ramaswamy commented. Certified as a Great Place to Work, the company prioritises the well-being of their employees, believing that “a content workforce leads to happy customers”. They monitor and maintain employee happiness closely, offering support to those facing personal or professional challenges.

Collaboration is another key element of their culture, as they encourage a unified approach within and across different units and locations. “As a company born in the digital age, Happiest Minds thrives on agility, adapting swiftly to meet the ever-changing needs of customers and the digital industry,” he added.

Transparency is the fourth pillar, as they openly share key performance indicators and objectives with their employees, investors, and stakeholders. This culture of transparency and goal-oriented approach ensures that their efforts are always aligned with clear objectives and tracked diligently.

If you think you fit the role, check out their careers page now. 

Read more: Data Science Hiring Process at PayPal

Data Science Hiring and Interview Process at Wipro https://analyticsindiamag.com/interview-hiring-process/data-science-hiring-process-at-wipro/ Mon, 29 Jul 2024

With over 30,000 AI and analytics professionals, the team is building its own LLMs. 

The post Data Science Hiring and Interview Process at Wipro appeared first on AIM.

]]>

Wipro, which began as a family-operated vegetable oil manufacturer in the small town of Amalner, India, in 1945, is now one of the largest IT companies globally, operating in more than 167 countries.

The company has been a key player in driving generative AI. With over 30,000 AI and analytics professionals, the team is building its own LLMs. 

AIM got in touch with Sriram Narasimhan, global head of data, analytics and insights, Wipro, to understand their data science applications, hiring for these roles, work culture and more. 

Inside Wipro’s Data Science Team

“Data serves as the bedrock for every AI and ML initiative, laying the groundwork for success. The pivotal factor lies in guaranteeing precise data of optimal quality, an indispensable catalyst for these processes to yield the desired results,” Narasimhan told AIM.

Profound insights emerge from the ability to scrutinise, profile, and decipher patterns within the data landscape, identifying outliers to extract meaningful conclusions. At the heart of any data science endeavour is the adeptness to construct automations through AI and ML algorithms, elevating and refining the data and insight ecosystem.

This transformative process enhances operational efficiencies, underscoring the fundamental role of data science and engineering as the critical inaugural stride in the pursuit of quality outcomes in AI/ML implementations.

Wipro’s AI and analytics team is substantial, with over 30,000 practitioners globally. The company boasts 500+ AI/ML patents, 20 innovation centres, and over 15 partnerships with a strong presence in various industries. 

Recognised as a leader by agencies like Everest Group and IDC, Wipro specialises in industry-specific solutions and horizontal offerings like ML Ops and Legacy modernisation.

“The team co-builds solutions, leveraging tools like the Wipro Data Intelligence Suite (WDIS), prebuilt Industry Business Applications, and the Wipro Enterprise Generative AI (WeGA) Framework,” he added. These tools accelerate customer implementations, supporting the modernisation journey and enabling responsible AI with safety and security guardrails.

Riding the Generative AI Wave

Wipro has been actively involved in generative AI initiatives for over two years, collaborating with research institutes like the AI Institute at the University of South Carolina and IIT Patna. The company is committed to training its sizable workforce of 250,000 in generative AI. It has developed its own LLMs, enhancing versatility and future-proofing, and has established a unique partnership with Google Cloud to integrate its generative AI products and services.

The company’s generative AI applications cover diverse themes, including cognitive chatbots, content creation and optimisation for marketing, media, automation in code generation, and synthetic data generation. The company’s internal initiative, Wipro ai360, focuses on incorporating AI across all platforms. Notable client projects include assisting a chocolate manufacturer in enhancing product descriptions and collaborating with a European telecom company to extract value from data.

Wipro is invested in the generative AI landscape, with two-thirds of its strategic investments directed towards AI. The company plans to further support cutting-edge startups through Wipro Ventures and launch a GenAI Seed Accelerator program to train the top 10 generative AI startups.

Acknowledging the challenges associated with generative AI, the Bengaluru-based tech giant has implemented a control framework emphasising responsible usage. Initiatives include dedicated environments for developing generative AI solutions, GDPR-compliant training, and efforts to detect AI-generated misinformation. They have also established an AI Council to set development and usage standards, emphasising ethical guidelines, fairness, and privacy.

The team is attuned to evolving regulatory frameworks and is adapting strategies accordingly. The company envisions widespread benefits to the IT industry, with generative AI influencing code generation and call centres. The team anticipates a wave of AI services emerging in the next five years, facilitating enterprises in harnessing AI’s full potential. In the long term, they foresee AI disrupting every industry, with specific verticals like precision medicine, precision agriculture, hyper-personalised marketing, and AI-led capabilities in smart buildings and homes gaining prominence.

Interview Process

When hiring for data science roles, Wipro seeks candidates with practical experiences, strong programming and statistical skills, analytical abilities, domain knowledge, and effective presentation skills. 

“The hiring process involves a comprehensive evaluation based on real-world use cases, emphasising not only technical proficiency but also the candidate’s understanding of problem statements and the application of statistical methodologies to solve complex issues,” he added.

“Joining our data science team promises exposure to cutting-edge, real-life AI/ML problems across various industries as we encourage a democratic approach to AI, allowing teams the independence to build solutions while adhering to organisational processes,” Narasimhan commented.

The company offers a diverse range of competencies, including data engineering, data science, conversational AI, ethical AI, and generative AI, enabling associates to work on projects aligned with their capabilities and aspirations.

In interviews, Wipro emphasises the importance of showcasing real-life use cases rather than being overly theoretical. Candidates are encouraged to highlight their practical experiences, demonstrating how they understand, consider options, and provide solutions to problems in the realm of data science, AI, and ML.

Work Culture

Wipro fosters a work culture rooted in values and integrity for its global workforce of 250,000+. Guided by the ‘Spirit of Wipro’ and ‘Five Habits’ principles, it emphasises respect, responsiveness, communication, ownership, and trust. With a 36.4% gender diversity goal, the company supports inclusion through programs like Women of Wipro (WOW), addressing various aspects of diversity such as disability, LGBTQ+, race, ethnicity, and generational diversity.

For talent management, they use tech solutions like the MyWipro app and WiLearn. These tools facilitate goal documentation, feedback, skill-building, and awareness of biases. The company conducts biannual performance reviews, offers training, mentoring, and leadership programs, including global executive leadership initiatives.

Employee benefits encompass a comprehensive package, including 401k, pension, health, vision, dental insurance, competitive pay, bonuses, paid time off, health savings, flexible spending accounts, disability coverage, family medical leave, life insurance, and more. Additional perks involve retirement benefits, stock purchase plans, paid holidays, legal plans, insurance for home, auto, and pets, employee discounts, adoption reimbursement, tuition reimbursement, and well-being programs.

Data Science Hiring and Interview Process at Pegasystems https://analyticsindiamag.com/interview-hiring-process/data-science-hiring-process-at-pegasystems/ Mon, 29 Jul 2024

For data science roles, Pega focuses on the candidate's ability to learn and adapt rather than specific tech skills. 

The post Data Science Hiring and Interview Process at Pegasystems appeared first on AIM.

]]>

Pegasystems, commonly known as Pega, is a global software company founded in 1983, focusing on customer engagement and operational excellence solutions. The Cambridge-based company has become a leader in business process management and customer relationship management.

The primary offering, Pega Infinity, acts as a comprehensive platform for businesses to create, implement, and improve applications, aiming to enhance customer experiences and streamline operational processes.

The company utilises AI and data science throughout its platform to improve decision-making, automate processes, and provide personalised customer interactions. Cisco, HSBC, and Siemens are among its key customers. 

The latest iteration, Pega Infinity ’23, introduces over 20 new features, including generative AI-powered boosters that enhance efficiency. The Connect Generative AI feature lets organisations quickly adopt generative AI through a plug-and-play structure for low-code development.

AIM caught up with Deepak Visweswaraiah, vice president, platform engineering and site managing director, and Smriti Mathur, senior director and head of people, Pegasystems, India, to understand their generative AI play, hiring process and more.

Pega has open positions for solutions engineers and senior software quality test engineers in Hyderabad and Bengaluru.

Decoding Pega’s AI Ventures

In their core platform, Pega Infinity, the organisation relies heavily on data science, which plays a critical role in analytics, insights generation, natural language processing (NLP), generative AI, and various other applications that drive functionalities such as real-time decision-making and personalised customer communications based on attributes.

Data science also contributes significantly to the development of generative AI models, enhancing the overall intelligence of the platform. Its impact extends beyond the core platform to applications like customer service, one-to-one engagement, decision-making, sales automation, and strategic smart apps for diverse industries.

Pega GenAI provides insights into AI decision-making and streamlines processes, such as automating loan processing. “The benefits of generative AI extend to developers and end-users, improving productivity through query-based interactions, automatic summarisation, and streamlined case lifecycle generation,” Visweswaraiah told AIM.

End-users also benefit from realistic training scenarios using simulated customer interactions.

Regarding proprietary foundational models, the organisation’s product architecture prioritises openness and flexibility. They support various language models, including those from OpenAI and Google. 

“In upcoming product versions, we are actively working to support and ship local language models to meet specific use case demands, focusing on accuracy, productivity, and performance in response to customer preferences for diverse capabilities,” he added. 

Interview Process

The company follows a global hybrid working model, encouraging collaboration in the office while providing flexibility, with about 60% of the workforce attending the office around three days a week. This approach aims to attract talent globally, fostering a vibrant culture and hybrid working environment.

In upskilling employees, technical competencies are crucial, and the company emphasises learning through its Pega Academy, offering online self-study training, live instructor-led courses, and online mentoring. Skill gaps are regularly assessed during performance reviews, providing learning opportunities through gateways and supporting external courses with an educational reimbursement policy.

“For data science roles, we focus on the candidate’s ability to learn rather than specific data science skills,” Mathur told AIM. The company looks for individuals capable of extracting insights from data, making informed decisions, and building models for application in various use cases.

Mathur further shared that the company emphasises the importance of understanding its problem-solving approach and creating deterministic models that consistently provide performant and real-world solutions. It encourages candidates to think from the customer’s perspective and avoid getting lost in vast amounts of data, highlighting the significance of models producing consistent and reliable answers.

Work Culture

The company emphasises diversity and inclusivity, fostering a culture centred on innovation and collaboration. It has been ranked among the best workplaces for women by Avtar for five consecutive years. Pega values individuals who think independently, challenge norms, and question the status quo to seek better solutions.

The company encourages leadership and curiosity in approaching tasks, promoting an environment where employees are empowered to innovate. Compared to competitors, Pega’s work culture stands out due to the unique problems it addresses and its distinctive approach.

Understanding the product architecture is crucial for employees, given the nature of the challenges they tackle. Pega’s ability to integrate technology into the platform is a significant differentiator, enhancing its capability to address complex issues. 

“With a focus on adapting to market changes, our mantra of being ‘built for change’ reflects our commitment to staying dynamic and responsive to evolving needs,” concluded Mathur. 

So, if you want to join the dynamic community of Pega, check out the careers page here. 

The post Data Science Hiring and Interview Process at Pegasystems appeared first on AIM.

]]>
Data Science Hiring and Interview Process at WNS https://analyticsindiamag.com/interview-hiring-process/data-science-hiring-process-at-wns/ Mon, 29 Jul 2024 10:03:07 +0000 https://analyticsindiamag.com/?p=10104280

Consisting of over 6,500 AI experts, WNS Triange serves as a partner for 200 global clients in more than 10 industries

The post Data Science Hiring and Interview Process at WNS appeared first on AIM.

]]>

Headquartered in Mumbai, India, WNS is a prominent global Business Process Management (BPM) and IT consulting company with 67 delivery centres and over 59,000 employees worldwide. 

Combining extensive industry knowledge with technology, analytics, and process expertise, the company collaborates with clients across 10 industries to co-create digital-led transformational solutions. WNS is renowned for its strategic partnerships, delivering innovative practices and industry-specific technology and analytics-enabled solutions. The company’s services cover diverse sectors, characterised by a structured yet flexible approach, deep industry expertise, and a client-centric partnership model.

WNS Triange, the AI, analytics, data and research business unit, has successfully harnessed the power of data science to develop robust solutions that effectively address a myriad of business challenges faced by its clients. 

Among these solutions are sophisticated applications such as an advanced claims processing system, a finely tuned inventory optimisation mechanism, and the implementation of a retail hyper-personalisation strategy.

Consisting of over 6,500 experts, WNS Triange serves as a partner for 200 global clients in more than 10 industries. 

“The team is organised into three pillars: Triange Consult focuses on consulting and co-creating strategies for data, analytics, and AI; Triange NxT adopts an AI-led platform approach for scalable business value; and Triange CoE executes industry-specific analytics programs, transforming the value chain through domain expertise and strategic engagement models,”  Akhilesh Ayer, EVP & Global Business Unit Head – WNS Triange, told AIM in an exclusive interaction last week. 

WNS’s AI & Analytics Play

The data science workflow at WNS Triange follows a meticulously structured process that guides the team through various stages, including problem outlining, data collection, Exploratory Data Analysis (EDA), cleaning, pre-processing, feature engineering, model selection, training, evaluation, deployment, and continuous improvement. A pivotal element of this methodology is the proprietary AI-led platform, Triange NxT, equipped with Gen AI capabilities. This platform serves as a hub for domain and industry-specific models, expediting the delivery of impactful insights for clients.
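The staged workflow described above (problem outlining, data collection, cleaning, feature engineering, model training, evaluation) can be sketched end to end in a few lines. This is a purely illustrative toy: the claims data are hypothetical and a simple threshold rule stands in for a real model; it is not WNS Triange's actual implementation.

```python
# Toy end-to-end sketch of the staged workflow described above:
# collect -> clean -> engineer features -> train -> evaluate.
# All data, features, and the "model" are hypothetical stand-ins.

def clean(records):
    """Pre-processing: drop records with missing values."""
    return [r for r in records if None not in r.values()]

def engineer(record):
    """Feature engineering: derive a simple claim-to-premium ratio."""
    return {"claim_ratio": record["claim"] / record["premium"],
            "label": record["label"]}

def train(rows):
    """Training: pick the claim_ratio threshold that best separates
    the classes (a stand-in for model selection and fitting)."""
    best = (0.0, -1.0)
    for candidate in sorted({r["claim_ratio"] for r in rows}):
        acc = sum((r["claim_ratio"] >= candidate) == r["label"]
                  for r in rows) / len(rows)
        if acc > best[1]:
            best = (candidate, acc)
    return best[0]

def evaluate(threshold, rows):
    """Evaluation: accuracy of the threshold rule on held-out rows."""
    return sum((r["claim_ratio"] >= threshold) == r["label"]
               for r in rows) / len(rows)

raw = [
    {"claim": 900, "premium": 1000, "label": True},
    {"claim": 100, "premium": 1000, "label": False},
    {"claim": 50,  "premium": 500,  "label": False},
    {"claim": 950, "premium": 1000, "label": True},
    {"claim": 300, "premium": None, "label": False},  # dropped in cleaning
]
rows = [engineer(r) for r in clean(raw)]
threshold = train(rows[:3])               # "training" split
accuracy = evaluate(threshold, rows[3:])  # held-out split
print(threshold, accuracy)  # → 0.9 1.0
```

In practice each stage is far richer (EDA, cross-validation, MLOps deployment, monitoring), but the control flow above mirrors the same sequence of handoffs.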

“When it comes to claims processing, we deploy predictive analytics to conduct a thorough examination of data sourced from the First Notice of Loss (FNOL) and handler notes,” said Ayer. This approach allows for the evaluation of total loss probability, early settlement possibilities, and subrogation/recovery potential. 

Simultaneously, its Marketing Mix Modeling (MMM) is employed to optimise resource allocation by quantifying the impact of marketing efforts on key performance indicators. Furthermore, the application of advanced analytics techniques aids in the detection of suspicious patterns in insurance claims for risk and fraud detection. 

Ayer shared that the team also actively leverages generative AI across diverse sectors. In the insurance domain, it is employed to streamline claims subrogation by efficiently processing unstructured data, minimising bias, and expediting insights for recovery. 

Similarly, in healthcare, it empowers Medical Science Liaisons (MSLs) by summarising documents and integrating engagement data for more impactful sales pitches. Generative AI’s versatility is further demonstrated in customer service interactions, where it adeptly handles natural language queries, ensuring quicker responses and retrieval efficiency.

The combination of LLM foundation models from hyperscalers like AWS with WNS Triange’s proprietary ML models enables the delivery of tailored solutions that cater to various functional domains and industries. Where necessary, WNS Triange employs its AI, ML and domain capability to fine-tune existing foundation models for specific results, ensuring a nuanced and effective approach to problem-solving.

Tech Stack

In its AI model development, the team utilises vector databases and deep learning libraries such as Keras, PyTorch, and TensorFlow. Knowledge graphs are integrated, and MLOps and XAI frameworks are implemented for enterprise-grade solutions. 

“Our tech stack includes Python, R, Spark, Azure, machine learning libraries, AWS, GCP, and GIT, reflecting our commitment to using diverse tools and platforms based on solution requirements and client preferences,” said Ayer. 

When it comes to transformer technology, particularly language models like Google’s BERT for tasks such as sentiment analysis and entity extraction, the team’s current approach involves a variety of language models, including GPT variants (davinci-003, davinci-codex, text-embedding-ada-002), T5, BART, LLaMA, and Stable Diffusion. 

“We adopt a hybrid model approach, integrating Large Language Models (LLMs) from major hyperscalers like OpenAI, Titan, PaLM2, and LLaMA2, enhancing both operational efficiency and functionality,” he commented. 

Hiring Process

WNS Triange recruits data science talent from leading engineering colleges, initiating the process with a written test evaluating applied mathematics, statistics, logical reasoning, and programming skills. Subsequent stages include a coding assessment, a data science case study, and final interviews with key stakeholders.

“Joining our data science team offers candidates a dynamic and challenging environment with ample opportunities for skill development. And while engaging in diverse projects across various industries, individuals can expect exposure to both structured and unstructured data,” said Ayer. 

The company fosters a collaborative atmosphere, allowing professionals to work alongside colleagues with diverse backgrounds and expertise. Emphasis is placed on leveraging cutting-edge technologies and providing hands-on experience with state-of-the-art tools and frameworks in data science. 

WNS Triange values participation in impactful projects contributing to the company’s success, offering access to mentorship programs and support from experienced team members, ensuring a positive and productive work experience.

Mistakes to Avoid

Candidates are encouraged to not only showcase technical prowess but also articulate the business impact of their work, demonstrating its real-world relevance and contribution to business goals.

Ayer emphasised, “Successful data scientists must not only be technically adept but also skilled storytellers to present their findings in a compelling manner, as overlooking this aspect can lead to less engaging presentations of their work.”

He added that candidates sometimes focus solely on technical details without articulating the business impact of their work, missing the opportunity to demonstrate how their analyses and models solve real-world problems and contribute to business goals.

Work Culture

Recognised by TIME magazine as one of the best companies to work for, WNS has built a work culture centred on co-creation, innovation, and a people-centric approach. It emphasises diversity, equity, and inclusivity, prioritises a respectful workplace culture, and extends its commitment to community care through targeted programs by the WNS Cares Foundation. 

“Our focus on ethics, integrity, and compliance ensures a safe ecosystem for all stakeholders, delivering value to clients through comprehensive business transformation,” said Ayer. 

In terms of employee perks, it offers various services and benefits, including transportation, cafeterias, medical and recreational facilities, flexibility in work hours, health insurance, and parental leave. 

“Differentiating ourselves in the data science space, we cultivate a work ecosystem that fosters innovation, continuous learning, and belongingness for the data science team. Our initiatives include engagement tools, industry-specific training programs, customised technology-driven solutions, and a learning experience platform hosting a wealth of content for self-paced learning,” he added. 

Why Should You Join WNS?

“At WNS, we believe in the transformative power of data, where individuals play a key role in shaping our organisation by directly influencing business strategy and decision-making. Recognising the significant impact of data science, we invite individuals to join our collaborative and diverse team that encourages creativity and values innovative ideas. In this dynamic environment, we prioritise knowledge sharing, continuous learning, and professional growth,” concluded Ayer. 

Find more information about job opportunities at WNS here.

The post Data Science Hiring and Interview Process at WNS appeared first on AIM.

]]>
Data Science Hiring and Interview Process at Marlabs https://analyticsindiamag.com/interview-hiring-process/data-science-hiring-process-at-marlabs/ Mon, 29 Jul 2024 10:03:00 +0000 https://analyticsindiamag.com/?p=10104968

Marlabs is currently hiring for 10 data science roles, including ML Architect, ML Engineer, and Statistical Modeling positions.

The post Data Science Hiring and Interview Process at Marlabs appeared first on AIM.

]]>

Founded in 2011, New York-based IT services and consulting firm Marlabs helps companies of various sizes to undergo AI-powered digital transformation. It provides a wide range of services, including strategic planning, creating rapid prototypes in specialised labs, and applying agile engineering techniques to develop and expand digital solutions, cloud-based applications and AI-driven platforms.

Marlabs’s data science team addresses a range of industry challenges, emphasising tasks like extracting insights from extensive datasets and employing pattern recognition, prediction, forecasting, recommendation, optimisation, and classification.

Exploring Generative AI at Marlabs

“In operationalising AI/ML, we have tackled diverse projects, such as demand forecasting, inventory optimisation, point of sale data linkage, admissions candidate evaluation, real-time anomaly detection, and clinical trial report anomaly detection,” Sriraman Raghunathan, digital innovation and strategy principal, Marlabs, told AIM in an exclusive interaction. 

The team is also exploring generative AI applications, particularly in knowledge base extraction and summarisation across domains like IT service desk ticket resolution, sustainability finance, medical devices service management, and rare disease education.
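Knowledge-base extraction applications like these typically start with a retrieval step: find the document most relevant to a user's query before any model summarises it. Below is a minimal sketch of that step using simple keyword overlap on a hypothetical service-desk knowledge base; it is illustrative only and not Marlabs' actual approach, which would use embeddings and the tools named later in this article.

```python
# Minimal sketch of the retrieval step behind knowledge-base
# question answering: score documents by keyword overlap with the
# query, then hand the best match to a summarisation model.
# The knowledge base and query are hypothetical examples.

def tokenize(text):
    """Lowercase and split into a set of words."""
    return set(text.lower().split())

def retrieve(query, docs):
    """Return the document sharing the most words with the query."""
    q = tokenize(query)
    return max(docs, key=lambda d: len(q & tokenize(d)))

knowledge_base = [
    "Reset a service desk password from the account settings page",
    "Escalate a hardware fault ticket to the field engineering team",
    "Request a replacement badge from the security office",
]

best = retrieve("how do I reset my password", knowledge_base)
print(best)  # → the password-reset article
```

Production systems replace the word-overlap score with vector similarity over embeddings (the role Chroma and similar stores play), but the retrieve-then-summarise shape is the same.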

However, it is not developing foundational models as of now due to substantial capital requirements. “Instead, we are focusing on the value chain beyond foundational models, offering tools and practices for deploying such models within organisation boundaries, tailored for specific domains,” he added. 

Marlabs employs a variety of tools and frameworks depending on project specifics: R and Python for development; Tableau, Power BI, and QlikView for data exploration and visualisation; and PyTorch, TensorFlow, cloud-native tools and platforms, and Jupyter Notebooks for AI/ML model development.

The team leverages transformer models like GPT-3, especially in NLP use cases, implementing them in TensorFlow and PyTorch and drawing on pre-trained models from the Hugging Face Transformers library. For generative AI, the toolkit includes LangChain and LlamaIndex; OpenAI, Cohere, PaLM2, and Dolly; and Chroma and Atlas.

Hiring Process

The hiring process for data science roles at the organisation emphasises a blend of technical knowledge, practical application, and relevant experience. The initial steps involve a clear definition of the role and its requirements, followed by the creation of a detailed job description. 

The interview process comprises technical assessments, video interviews with AI/ML experts, and HR interviews. Technical assessments evaluate coding skills, data analysis, and problem-solving abilities. 

Video interviews focus on the candidate’s depth of knowledge, practical application, and communication skills, often including a discussion of a relevant case study or project. HR interviews centre on cultural fit, interpersonal skills, collaboration, and the candidate’s approach to handling challenges. 

Expectations

“Upon joining the data science team, candidates can anticipate a thorough onboarding process tailored to their specific team, providing access to essential tools, resources, and training for a smooth transition,” commented Raghunathan. 

The company’s AI/ML projects involve cutting-edge technologies, exposing candidates to dynamic customer use cases spanning natural language processing, computer vision, recommendation systems, and predictive analytics. The work environment is agile and fast-paced. The company places a strong emphasis on team collaboration and effective communication, given the collaborative nature of data science and AI/ML projects. 

In this rapidly evolving field, the company expects new hires to demonstrate continuous learning, tackle complex technical and functional challenges, operate with high levels of abstraction, and exhibit creative and innovative thinking.

Mistakes to Avoid

“The most prevalent error observed in candidates during data science role interviews is a lack of clear communication,” he added.

The ability to effectively communicate insights to non-technical stakeholders is crucial in the AI/ML space, and this skill is frequently overlooked. 

Another common mistake is a failure to comprehend and articulate the business context and domain knowledge of the problem, which is essential in AI/ML applications with significant business impact.

Work Culture

“We are recognised for our value-based culture focused on outcomes, emphasising a flat organizational structure to spur innovation and personal growth. Key values such as respect, transparency, trust, and a commitment to continuous learning are central to their ethos, all aimed at exceeding customer expectations,” he said.

The company’s robust learning and development program has prepared over 150 young managers for leadership roles, with a strong emphasis on AI and technology for organisational insights and sentiment analysis.

The company offers a comprehensive benefits package, including versatile insurance plans, performance incentives, and access to extensive learning resources like Coursera and Udemy, supporting a hybrid work model. Additionally, it provides mental health support and rewards long-term employees based on tenure. 

Raghunathan further explained that in the data science team, Marlabs stands out for its innovative and collaborative environment, encouraging creativity and continuous learning. “This distinctive culture and investment in employee growth make us a leader in data science, differentiating it from competitors in the tech industry,” he added. 

Why Should You Join Marlabs?

“Join Marlabs for a dynamic opportunity to work with a passionate team, using data to drive meaningful change. In this collaborative setting, data scientists work with brilliant colleagues across various industries, including healthcare, finance, and retail. You’ll tackle complex issues, contributing to significant business transformations. Marlabs supports your career with essential tools, resources, training, competitive compensation, benefits, and opportunities for professional growth and development,” concluded Raghunathan.  

The post Data Science Hiring and Interview Process at Marlabs appeared first on AIM.

]]>
Data Science Hiring and Interview Process at Global Fintech Company Fiserv https://analyticsindiamag.com/interview-hiring-process/data-science-hiring-process-at-fiserv/ Mon, 29 Jul 2024 10:00:43 +0000 https://analyticsindiamag.com/?p=10095785

Fiserv prioritises up-skilling employees to help them excel in their roles and adapt to new technologies and client needs

The post Data Science Hiring and Interview Process at Global Fintech Company Fiserv appeared first on AIM.

]]>

Headquartered in Brookfield, Wisconsin, Fiserv is a global leader in payments and financial technology, with operations across 100 countries. Aspiring to move money and information in a way that moves the world, Fiserv helps its clients achieve best-in-class results through a commitment to innovation and excellence. Boasting a global workforce exceeding 41,000 professionals, Fiserv’s data science team plays a pivotal role in supporting various business domains, driving innovation, and cultivating engineering excellence to enhance client experiences.

Founded 39 years ago through the merger of First Data Processing and Sunshine State Systems, Fiserv grew rapidly by strategically acquiring prominent entities such as CheckFree Corporation, M-Com, CashEdge, and PCLender. In 2019, it underwent a transformative merger with First Data, creating a globally recognised leader in payment solutions and financial technology with the capabilities to deliver exceptional value to financial institutions, corporate clients, small enterprises, and consumers alike.

Analytics India Magazine got in touch with Manisha Banthia, vice president, data and analytics – global services, Fiserv, to understand the importance of AI and analytics for the company and how they hire the finest tech talents. 

Inside the Data Science Team of Fiserv

As part of a major player in the fintech industry, Fiserv’s 75-member data science team tackles complex challenges across banking, cards, payments, digital solutions, and merchant acquisition. Besides creating embedded analytics solutions for financial institutions (FIs) to aid their decision-making, the team offers advisory services throughout the lifecycle of FIs and merchants, covering acquisition, growth, cross-selling, and retention. It also builds solutions to optimise internal processes and operations across different businesses.

For a major US retailer, they leveraged ML to identify prepaid cardholders who would benefit from targeted marketing strategies, resulting in increased engagement and reduced attrition. In another initiative, Fiserv aimed to expand its merchant user base for cash advance loans and achieved this by developing a risk model and an AI algorithm that enabled the sales team to target the right merchants, leading to portfolio growth, reduced marketing expenses, and cost optimisation.

Furthermore, the data science team developed an advanced ML-based solution to address fraud detection and prevention for financial institutions, replacing rule-based engines. “Our data science team follows a pod structure consisting of data scientists, domain experts, ML engineers, visualisation experts, and data engineers who constantly add value to our organisation,” said Banthia. 

Data scientists apply advanced techniques and provide recommendations. Domain experts offer business context, translate problems, and validate results. ML engineers deploy ML models for performance and reliability. Visualisation experts represent data insights visually. Last but not least, data engineers collect, process, and maintain data quality.

The team actively works with Python, PySpark, Azure, Watson, Snowflake, Adobe Analytics, and Alteryx. 

Interview Process

The interview process involves thorough evaluation by both technical and managerial interviewers. Ideal candidates bring strong programming skills, statistical knowledge, and problem-solving ability, assessed through case studies and in-depth domain-knowledge questions. An HR round then evaluates interpersonal skills and culture fit.

“A successful data scientist should prioritise a client-centric approach, seeking feedback, adapting to specific needs, and aligning analytical solutions with objectives,” said Banthia. 

Technical skills like solving unstructured problems, exploring AI and ML techniques, conceptualising solutions, and simplifying findings for stakeholders are valued. Fiserv also looks for strong leadership, business acumen, and functional expertise in executive hires. When interviewing, prospective candidates should showcase a balanced combination of technical, business, and leadership skills. They should effectively communicate their proficiency without excessive technical jargon and demonstrate the ability to lead teams and collaborate effectively.

Work Culture

Certified by Great Place To Work® in 2023, Fiserv aims to foster a fast-paced and dynamic work environment. Adaptability and the ability to iterate quickly and respond to market needs are highly valued. The company prioritises up-skilling employees to help them excel in their roles and adapt to new technologies and client needs.

Besides providing an inclusive culture and professional growth opportunities, the fintech giant offers learning programs, wellness plans, and engagement initiatives. “We are committed to being an equal opportunity employer with an inclusive workplace culture and clear communication through an open office concept,” she concluded. 

Check out their careers page now. 

Read more: Data Science Hiring Process at MediBuddy

The post Data Science Hiring and Interview Process at Global Fintech Company Fiserv appeared first on AIM.

]]>
Data Science Hiring and Interview Process at Lendingkart https://analyticsindiamag.com/interview-hiring-process/data-science-hiring-process-at-lendingkart/ Mon, 29 Jul 2024 09:57:24 +0000 https://analyticsindiamag.com/?p=10106527

Founded in 2014 by Harshvardhan Lunia, Indian digital fintech lender Lendingkart utilises a data-powered credit analysis system to facilitate online loans, aiming to improve accessibility in small business lending. The company’s proprietary underwriting mechanism utilises big data and analytics to evaluate the creditworthiness of borrowers. The company has so far disbursed over $1 billion […]

The post Data Science Hiring and Interview Process at Lendingkart appeared first on AIM.

]]>

Founded in 2014 by Harshvardhan Lunia, Indian digital fintech lender Lendingkart utilises a data-powered credit analysis system to facilitate online loans, aiming to improve accessibility in small business lending. The company’s proprietary underwriting mechanism uses big data and analytics to evaluate borrowers’ creditworthiness.

The company has so far disbursed over $1 billion in loans across more than 1,300 cities in the country, especially in Tier 2 and Tier 3 cities. Lendingkart, which recently reported its first-ever profit of Rs 118 crore with total revenue of Rs 850 crore in FY23, specialises in unsecured business loans to micro, small, and medium-sized enterprises (MSMEs). 

The fintech company is backed by Bertelsmann India Investments, Darrin Capital Management, Mayfield India, Saama Capital, India Quotient and more. 

“Data science has always been at the heart and center of our operations. The AI/ML-based underwriting that this team has developed has been used to underwrite over one million MSMEs,” said Dhanesh Padmanabhan, chief data scientist, Lendingkart, in an exclusive interaction with AIM.

The 35-member data science team of the Ahmedabad-headquartered firm is organised into three main groups: analytics, underwriting modelling, and ML engineering. The analytics team, with approximately 15 members, is further divided into three sub-teams focusing on revenue, portfolio (credit and risk), and collections.

“One of the key challenges addressed by our team at Lendingkart is credit risk management where we employ a combination of analytics and AI/ML models at different stages of the underwriting and collections processes to assess eligibility, determine loan amounts and interest rates, and ensure timely customer payments or settlements,” he added.

The underwriting modelling team consists of about five members dedicated to developing underwriting models, while the 10-member ML engineering team focuses on MLOps, feature store development, and AI applications.

Additionally, there are individual contributors, such as an architect and a technical program manager, along with a two-member team specialising in setting up the underwriting stack for the newly established personal loan portfolio.

The company has open positions for senior data scientist and associate director in Bengaluru.

Inside Lendingkart’s AI & Analytics Team

The team leverages AI and ML across various functions; in outbound marketing, for example, it targets existing customers and historical leads through pre-approved programs. Additionally, a lead prioritisation framework helps loan specialists focus on leads for calling and digital engagement.

The company also employs an intelligent routing system to direct loan applications to credit analysts, and a terms gamification framework aids negotiation analysts in negotiating interest rates with borrowers. Its fraud identification framework flags potentially manipulated bank statements for further review, and a speech analytics solution is deployed to extract insights from recorded calls for monitoring operational quality.

Meanwhile, collections models prioritise collections based on a customer’s likelihood of entering different delinquency levels, and computer vision models are used for KYC verification.
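A collections model of this kind ultimately produces a work queue: customers ranked by their estimated probability of rolling into a deeper delinquency bucket, so agents contact the riskiest accounts first. A minimal sketch with hypothetical customers and scores (illustrative only, not Lendingkart's actual model):

```python
# Illustrative sketch of collections prioritisation: rank customers
# by an estimated probability of slipping into a deeper delinquency
# bucket (days past due, "dpd"). Customers and probabilities are
# hypothetical stand-ins for a real model's output.

customers = [
    {"id": "C1", "bucket": "0-30 dpd",  "p_rollforward": 0.15},
    {"id": "C2", "bucket": "31-60 dpd", "p_rollforward": 0.60},
    {"id": "C3", "bucket": "0-30 dpd",  "p_rollforward": 0.45},
]

# Higher roll-forward probability first.
queue = sorted(customers, key=lambda c: c["p_rollforward"], reverse=True)
order = [c["id"] for c in queue]
print(order)  # → ['C2', 'C3', 'C1']
```

A production system would compute `p_rollforward` from a trained classifier and might weight the ranking by outstanding balance, but the prioritisation step reduces to this kind of sort.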

“We are also exploring the use of generative AI for marketing communication, chatbots, and data-to-insights applications,” said Padmanabhan. Moreover, there are plans to build transformer-based foundational models using call records and structured data sources like credit histories and bank statements for speech analytics, customer profiling, and underwriting purposes.

The tech stack comprises SQL running on Trino, Airflow, and Python. For ML tasks, the team leverages scikit-learn, statsmodels, and SciPy, along with PyTorch and TensorFlow. Natural language processing and computer vision applications involve the use of transformers and CNNs.

The API stack is powered by FastAPI services deployed on Kubernetes (k8s). In ML engineering, the team prefers Kafka and MongoDB. Additionally, there are applications built on Flask and Django, and the team is currently developing interactive visualisations using the MERN stack.

Interview Process

Lendingkart’s data science hiring process includes four to five interview rounds, evaluating candidates with strong backgrounds in analytics, modelling, or ML engineering. For leadership roles, such as team leads and managers, the company emphasises not only technical proficiency but also skills in team and stakeholder management.

During the interview process, non-managerial candidates undergo initial technical assessments in SQL, Python, or ML. Subsequent rounds explore general problem-solving and soft skills, with assessments conducted by peers, managers, and HR.

Expectations

Upon joining the team, candidates can expect to participate in a diverse range of projects encompassing revenue, risk, collections, and the development of tech and AI stacks for these applications. Collaboration with various stakeholders remains a significant aspect of the role. For example, the development of a new underwriting algorithm involves comprehensive reviews with risk and revenue teams to align with business objectives, followed by collaboration with product and ML engineering teams for successful implementation.

However, Padmanabhan notes a common mistake candidates make: they overlook the importance of thoroughly understanding the business context of the problems they are given.

“While they may possess knowledge of various algorithms used in different domains, they may struggle to articulate solutions or approaches when those algorithms are applied within a financial process context,” he added, highlighting the importance of connecting technical expertise with a deep understanding of the specific business challenges at hand.

Work Culture

“Our work culture is fast-paced and dynamic, characterised by group problem-solving focused on specific business goals, and backed by competitive ESOP packages and industry-standard insurance,” said Padmanabhan.

The data science team operates hands-on at all levels, adopting best practices like agile and MLOps. The “hub and spoke” approach involves data scientists taking responsibility for the entire process from conceptualization to implementation, distinguishing the work culture from competitors in the space.

At Lendingkart, you’ll collaborate closely with stakeholders on projects like developing underwriting algorithms. The company maintains a well-established agile practice led by the technical program manager and team leads, focusing on efficient planning, best practices, and clear communication to create a productive work environment. So if you think you are a good fit for this role, apply here.

Data Science Hiring and Interview Process at Verizon (Mon, 29 Jul 2024)
https://analyticsindiamag.com/interview-hiring-process/data-science-hiring-process-at-verizon/

The company plans to expand its footprint in India by hiring around 70 professionals this year for data science, data engineering, ML engineering, and cloud roles across Chennai, Hyderabad, and Bengaluru.


New Jersey-based telecommunication operator Verizon is at the forefront of leveraging AI and analytics to transform its network, services, and customer interactions.

“When it comes to AI, especially generative AI, I think at the edge of the network it will be very important to have AI to make quick decisions very close to the end user,” Hans Vestberg, chief executive officer at Verizon, said on a panel discussion at WEF 2024, highlighting the significance of AI in powering future business growth.

“We focus on the core areas, which include customer experience and loyalty driven by personalisation when it comes to our customers. We also use AI to drive operational efficiency in areas like workforce demand prediction and audience targeting in marketing,” said Ebenezer Bardhan, senior director of data science at Verizon India, in an exclusive interaction with AIM.

Currently, the company has over 400 models deployed in production across various lines of business within sales, service and support.

Verizon plans to expand its footprint in India by hiring around 70 professionals this year for data science, data engineering, ML engineering, and cloud roles across Chennai, Hyderabad, and Bengaluru. 

Inside Verizon’s Data Science Lab

The AI and analytics team at Verizon consists of two divisions: Data and Analytics (D&A) focusing on enterprise analytics, and AI & Data (AI&D) comprising data engineers, AI engineers, and data scientists, with a team size of around 350 in India.

Some of the recent use cases of the company’s AI initiatives include personalisation and churn reduction to improve customer retention and add business value. From an AI services standpoint, it has a custom explainability and interpretability framework that data scientists in the team have adopted to diagnose models.

“We are also exploring use cases in generative AI to improve information access for frontline users,” Ganesh Siddhamalli, director of AI and ML engineering, told AIM.

Internally, the company is developing a charter as part of its Responsible AI initiative to establish appropriate guardrails, facilitated by LLMOps services. It also has a Generative AI Center of Excellence (COE) to provide a unified perspective on all potential use cases, allowing for evaluation and shared learning among teams throughout the journey.

“We primarily use frameworks like TensorFlow, PyTorch, scikit-learn, Domino Data Lab, the Vertex AI suite, Seldon, Ray, Great Expectations, and Pydantic across AWS, GCP, and on-prem environments for AI and analytics,” Bharathi Ravindran, director of data science, Verizon India, told AIM.

Transformer-based models have been employed since 2018, particularly for real-time intervention in call and chat transcripts, aligning with generative AI use cases.

Interview Process

The team’s hiring process is well-defined and thorough, beginning with an internal posting for eight days to encourage applications from within the company. Subsequently, the job openings are publicised on various external platforms, including job boards, social media, and other professional networks, to attract a diverse and qualified pool of candidates.

“The meticulous and structured hiring process aims to thoroughly assess candidates’ decision-making and leadership capabilities, ensuring the identification of the right diverse talent to contribute to our success,” Samir Singh, director of talent acquisition, Verizon India, told AIM.

He further explained that for data science roles, the interview process consists of three comprehensive rounds. The initial stages focus on technical competencies, such as programming proficiency and experience in essential areas like Python, personalisation, forecasting, generative AI, churn models, responsible AI, and networking.

A techno-managerial round follows this, assessing the candidate’s ability to blend technical expertise with managerial skills, and finally, an interview with the leadership team.

On the other hand, data engineering roles involve two rounds that evaluate technical competencies in areas like Big Data, Hadoop, Teradata, and Google Cloud Platform (GCP), followed by a managerial round. 

Expectations

When joining the data science team at Verizon, candidates can expect a vibrant work culture, ample learning opportunities, and exposure to cutting-edge technologies. The company places a strong emphasis on integrity and expects candidates to embody this core value throughout their tenure.

As for expectations from candidates, Verizon values not only technical expertise but also the ability to connect the dots between technological advancements and business objectives. Candidates should confidently articulate their past or current roles during interviews. 

“In an AI-centric organisation like ours, where the daily focus is on enabling business through innovative solutions, the capacity to seamlessly integrate technical excellence with practical application is of great importance for success in data science roles,” said Singh. 

Work Culture

The company’s work culture centres around innovation, collaboration, and customer-centricity. Employees are encouraged to think creatively, take risks, and embrace change. Diversity and inclusion are integral values, creating a supportive and inclusive work environment. 

The company emphasises integrity, respect, performance, and accountability, offering benefits beyond the basics, like flexible working hours, wellness programs, and support for childcare and eldercare.

The key differentiator in Verizon’s work culture, especially within the data science team, is a strong emphasis on R&D, providing freedom for experimentation and learning, as shared by Singh. Apart from experimentation, an open culture fosters collaboration, making Verizon stand out as a great workplace.

Verizon prides itself on a diverse team, promoting gender diversity, supporting differently-abled individuals, and initiatives like WINGS (Women returnee program) and LGBTQ inclusion. The company has consciously made Diversity, Equity, and Inclusion (DEI) a core part of its culture.

“We foster a culture of inclusion, collaboration, and diversity both within the company and among our customers, suppliers, and business and community partners,” concluded Singh. 

If you think that you are a good fit for the company, check out its careers page now. 

Data Science Hiring and Interview Process at Nucleus Software (Mon, 29 Jul 2024)
https://analyticsindiamag.com/interview-hiring-process/data-science-hiring-process-at-nucleus-software/

The company has positions open for ML engineer, lead data scientist, lead data engineer, data architect, and data and analytics manager.


Founded in 1986, Nucleus Software is an Indian IP product company specialising in financial technology. The company focuses on developing and selling its own software solutions. Tailored for the banking industry, its key products, FinnOne Neo and FinnAxia, cater to lending and transaction banking needs.

These solutions aid banks in optimising processes, efficiently managing loans, and delivering innovative financial services. With a presence in 50 countries and a client base exceeding 200, some of its notable customers include State Bank of India, Citi Bank, ICICI Bank, Tata Capital, and Mahindra Finance.

The team has successfully applied AI and analytics in various business scenarios, including fraud detection in transaction banking, business optimisation through insights generation, natural language processing, real-time decisioning systems, AI-powered chatbots, auto summarisation, and hyper-personalised customer communications. 

AIM got in touch with Abhishek Pallav, associate vice president, Nucleus Software, to understand the AI activities of the company and the kind of people it looks for.

Inside Nucleus Software’s AI & Analytics Lab

In terms of implementing AI and ML solutions, Pallav said that Nucleus has developed a foundational framework for its AI components, ensuring reliability and scalability in the dynamic financial services sector. 

The application of AI and ML is evident in the real-time fraud detection engine within the transaction banking platform, natural language processing, customer satisfaction improvement, revenue increase, cost reduction, and operational optimisation for financial institutions. 

The company also leverages an in-house chatbot for L1 and L2 production support.

Regarding generative AI, Nucleus Software strategically deploys tailored solutions to enhance service operations and improve customer experience. This approach enables data access democratisation and the extraction of value from unstructured data. 

“We are in the process of building systems around generative AI and have developed use cases around customer behaviour and transaction history,” Pallav told AIM.

Interview Process 

“When it comes to hiring for data science roles, we look for candidates who have strong technical and domain acumen,” said Pallav, highlighting that the company seeks candidates with strong problem-solving skills for real-world data science scenarios. Proficiency in machine learning and deep learning, particularly in areas like finance, retail banking, fraud prevention, and hyper-personalisation, is highly valued. 

Preference is given to candidates from premier engineering colleges with a background in mathematics, computer science, and AI certifications.

Pallav elaborated that the hiring process for data science candidates begins with a written test evaluating IQ and logical reasoning, followed by two technical discussion rounds. The process concludes with an HR round. 

The written test, conducted online through the company’s platform, includes multiple-choice questions related to the domain and coding problems. Candidates are given a defined time window to complete the test.

Tech Skills Needed

Nucleus Software is currently hiring for five data science roles in Noida. 

The positions include ML engineer, lead data scientist, lead data engineer, data architect, and data and analytics manager. Candidates with experience in the retail banking domain are preferred. The minimum qualifications needed include a BTech in computer science from a premier institute. 

In terms of tech tools, applications, and frameworks, Nucleus Software employs a variety of them for natural language processing, computer vision, directed acyclic graph (DAG), hyper-personalisation, and explainability in its product R&D. 

The desired skills for candidates include proficiency in tools and frameworks such as Python, PySpark, Spark, Kafka, Hive, and SQL; knowledge of BI tools is desirable. Candidates should have expertise in various data science techniques, including analytics using AI/ML, deep learning, statistical modelling, time series analysis, and test-and-learn. The company values candidates who align with its core values of innovation, result orientation, collaboration with integrity, and mutual respect.

Nucleus Software’s products are built on cutting-edge technologies, are platform-agnostic, web-based, cloud-native, and powered by AI capabilities. 

Expectations

“Upon joining, an associate goes through a specialised training program offered by Nucleus School of Banking Technology, where they become domain experts,” said Pallav, emphasising that after completing the program, associates join the Build R&D team. 

However, Pallav explained that he has often observed candidates making the common mistake of adopting an overly generic approach. The company emphasises the need for a holistic approach to value delivery, coupled with strong analytical and logical skills.

Work Culture

Nucleus Software values its employees as crucial assets. This is evident in its various programs that promote employee well-being, such as job enrichment initiatives, competitive compensation, recreational activities, family outings, and diversity and inclusion programs. 

Currently following a hybrid work approach, the company prioritises upskilling with a range of online courses, expert-led training, and on-the-job mentoring. It believes in supporting employees in their professional development. Additionally, the company stands out by offering educational reimbursement for these courses and certifications.

According to Pallav, the work culture at Nucleus is distinguished by its commitment to inclusivity, diversity, engagement, and security. “For candidates driven by challenges, we offer a plethora of complex business use cases. These opportunities not only enhance your technical knowledge but also contribute to enriching your work experience, paving the way for you to become a top-tier professional,” concluded Pallav. 

Find more about the job opportunities here. 

Data Science Hiring and Interview Process at Confluent (Mon, 29 Jul 2024)
https://analyticsindiamag.com/interview-hiring-process/data-science-hiring-process-at-confluent/

The company is seeking data scientists and engineers to further bolster its tech team.


In September last year, Confluent, a leading provider of data streaming solutions, introduced ‘Data Streaming for AI’, a new initiative to speed up organisations’ creation of real-time AI applications. More recently, the company announced the general availability of Confluent Cloud for Apache Flink, a fully managed service that enables customers to process data in real time and create high-quality, reusable data streams.

Behind all the innovations in this space is Confluent’s strong and resilient AI and analytics team. “Building a truly data-driven culture is one of the top priorities for Confluent’s Data team. A critical part of achieving that is applying data science to address real-world requirements in business operations,” Ravi Kiran Yanamandra, manager of data science for product and growth marketing at Confluent, told AIM

Yanamandra, along with Karthik Nair, director of international talent acquisition at Confluent, took us through the company’s AI applications, hiring process, skills needed, and work culture. 

The company is seeking data scientists and engineers to further bolster its tech team. 

Inside Confluent’s Data Science Wing

The data team at Confluent is structured into sub-teams specialising in data engineering, data science, and business intelligence.

The organisation has leveraged data science in experimentation to inform product decisions and optimise marketing investments across multiple channels. This involves a multi-channel attribution model, while machine learning forecasting models built on key business KPIs improve predictability and enable more precise planning.

In terms of implementation, Confluent’s data science team uses a combination of online and offline machine learning models to support various aspects of the business. For example, online algorithms are deployed to evaluate the quality of leads or new signups in real time, allowing for immediate actions based on the insights generated. 

Furthermore, offline models are operationalised to assist business partners in making informed decisions, such as guiding marketing spend decisions through the marketing channel attribution model and providing predictive insights into future performance through revenue forecasting models.
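Confluent's actual attribution model is not public, but one common baseline the description resembles is linear attribution, which splits each conversion's revenue equally across the channels in the customer's journey. The journeys and revenue figures below are invented for illustration:

```python
from collections import defaultdict

def linear_attribution(journeys):
    """Split each conversion's revenue equally across the
    channels in the customer's journey (linear attribution)."""
    credit = defaultdict(float)
    for channels, revenue in journeys:
        share = revenue / len(channels)
        for channel in channels:
            credit[channel] += share
    return dict(credit)

# Invented journeys: (channels touched, conversion revenue)
journeys = [
    (["ads", "email"], 100.0),
    (["email"], 50.0),
    (["ads", "webinar", "email"], 90.0),
]

credit = linear_attribution(journeys)
print(credit)  # {'ads': 80.0, 'email': 130.0, 'webinar': 30.0}
```

Richer attribution schemes (time-decay, position-based, or data-driven Shapley-style models) change only how the `share` per channel is computed.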

“While still in the experimental phase, we are actively exploring the potential of generative AI as a productivity tool,” said Yanamandra, highlighting that the initial applications include enhancing internal search capabilities and evaluating the quality of support provided through channels like chatbots and email communications. 

Moreover, through its comprehensive data streaming platform, organisations can stream, connect, process, and manage data in real time, creating innovative solutions previously considered unattainable. By integrating generative AI with data streaming, organisations can swiftly address inquiries with up-to-date and comprehensive information.

In addition to leveraging existing technologies, the team also builds proprietary models using its proprietary data assets to address specific business challenges, said Yanamandra. These models, such as consumption forecasting and lead scoring, are tailored to Confluent’s unique needs, further enhancing their competitive advantage in the market.

The team predominantly uses SQL and Python for modelling and analysis, supported by tools like dbt, Docker, and Vertex AI for data pipeline management and production model deployment. Tableau is the primary platform for visualisation and reporting, enabling stakeholders to gain actionable insights from the data effectively.

Interview Process

“The hiring process for our data team focuses on assessing candidates in three areas – technical, analytical, and soft skills,” commented Yanamandra. 

Candidates are evaluated based on their proficiency in Python and SQL, experience in ML algorithms and data modelling, and familiarity with A/B testing and data visualisation. Analytical skills are assessed through problem-solving abilities and structured thinking, while soft skills, such as business understanding and communication, are also crucial.

The interview process begins with technical screening focussed on SQL and Python, followed by a real-world business scenario assessment. For data science roles, there’s an additional stage dedicated to statistics knowledge and machine learning abilities. The final interview with the hiring manager evaluates project delivery experience, technical leadership, motivations, and cultural fit.
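As an example of the statistics knowledge such a stage might probe, a two-proportion z-test for comparing conversion rates between two A/B variants can be written in plain Python (the counts below are made up):

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-statistic for the difference between two conversion rates,
    using the pooled-proportion standard error."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Made-up counts: variant A converts 200/10,000, variant B 260/10,000.
z = two_proportion_z(200, 10_000, 260, 10_000)
print(round(z, 2))  # 2.83; |z| > 1.96 implies significance at the 5% level
```

In practice a library routine (e.g. from statsmodels) would be used, but being able to derive the test from first principles is exactly the kind of structured thinking such interviews reward.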

Expectations

When joining Confluent’s data science team, new members can expect to actively engage with business partners, focusing on solving their specific business challenges. Successful candidates join as subject matter experts for the company’s data tools and technologies, and training is provided to deepen their understanding of the relevant business domain. 

New joiners can expect to work in a “highly collaborative, innovation-driven, and fast-paced environment on the data science team. We move quickly and prioritise translating data insights into tangible business impact”, Yanamandra added.

“Another unique aspect is that candidates are exposed to diverse domains, offering opportunities to collaborate across functions such as marketing, sales, finance, and product analytics,” Nair told AIM.

Mistakes to Avoid

While interviewing candidates, Yanamandra has noticed a common pattern. Candidates often assume that proficiency in technical skills such as SQL, Python, or machine learning is the sole criterion evaluated for data science roles. 

However, while these skills are definitely crucial, Confluent equally prioritises problem-solving abilities and the capacity to apply data science concepts to practical business scenarios. 

Work Culture 

Confluent strives to prioritise hiring individuals who demonstrate empathy and interact well with others, fostering a collaborative and inclusive environment. “As a rapidly growing company, our employees are self-motivated and driven to seize market opportunities. We follow unified decision-making and communication with an open, hierarchy-free structure,” Nair told AIM.

Nair stated that the company also offers flexibility through a ‘remote-first’ policy, allowing employees to work from various locations. Alongside competitive benefits, it ensures each employee’s contributions are recognised and valued. 

“Our team thrives on a culture of intellectual curiosity and innovation, where individuals will be encouraged to push the boundaries of what’s possible,” said Nair. The company strives to build an equal, diverse and inclusive work culture.

“We’re a high-growth company with incredible results and achievements, yet only scratching the surface of our potential impact in a rapidly growing market. Joining us promises an exhilarating journey of growth,” concluded Nair. 

If you think you are a right fit for Confluent, apply here

Data Science Hiring and Interview Process at AstraZeneca (Mon, 29 Jul 2024)
https://analyticsindiamag.com/interview-hiring-process/data-science-hiring-process-at-astrazeneca-2/

AstraZeneca is active in leveraging generative AI, with use cases spanning research assistance, executive reporting, and competitive intelligence.


With over 45 years of presence in India, British-Swedish pharmaceutical and biotechnology giant AstraZeneca has data science and AI capabilities deeply ingrained across its entire drug development lifecycle.

For the company, AI plays a pivotal role in accelerating target identification, drug discovery, clinical trials, and commercial analytics. AstraZeneca has implemented rigorous processes to ensure the responsible development and deployment of AI and ML solutions. 

Internally built use cases, solutions, or tools undergo an AI governance maturity assessment before production deployment to ensure compliance with the company’s responsible AI development standards, aligning with their data ethics principles. 

Teams from all business areas contribute to this process, fostering a collaborative approach to ensure AI is developed and deployed safely and responsibly.

AIM recently got in touch with Arun Maria, director of data and AI, R&D IT; Govind Cheramkalam, director of commercial reporting and analytics; and Anuradha Kumar, head HR of global innovation and technology centre, AstraZeneca, to understand more about the GCC’s AI operations, expansion opportunities, hiring process, work culture and more. 

Inside AstraZeneca’s Data Science Labs

AstraZeneca is also active in leveraging generative AI, with use cases spanning research assistance, executive reporting, and competitive intelligence. For example, AZ ChatGPT, an AI-powered research assistant, uses the company’s extensive biology and chemistry knowledge repositories to answer complex questions and provide prompts on discovery and clinical inquiries. 

“We are currently evaluating the capabilities of LLMs like AZ ChatGPT to improve insight generation for executive reports distributed to CXOs and decision-makers in brand and marketing companies,” Cheramkalam told AIM. 

Another such example is the Biological Insight Knowledge Graph (BIKG), a proprietary model developed by AstraZeneca. 

“It utilises the company’s exclusive machine learning models to serve as a recommendation system, enabling scientists to make efficient and informed decisions regarding target discovery and pipeline management. The primary goal of BIKG is to minimise patient attrition rates and improve clinical success,” Maria explained. 
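BIKG itself is proprietary, but the general idea of graph-based recommendation can be illustrated with a toy example that ranks candidate genes by how many neighbours they share with a disease node. All entities and edges below are invented:

```python
# Toy knowledge graph: each node maps to its set of neighbours.
graph = {
    "disease_X": {"pathway_1", "pathway_2", "phenotype_a"},
    "gene_A": {"pathway_1", "pathway_2"},
    "gene_B": {"pathway_2"},
    "gene_C": {"phenotype_b"},
}

def rank_targets(graph, disease, candidates):
    """Rank candidate genes by neighbours shared with the disease node."""
    disease_nbrs = graph[disease]
    scores = {g: len(graph[g] & disease_nbrs) for g in candidates}
    return sorted(candidates, key=scores.get, reverse=True)

ranking = rank_targets(graph, "disease_X", ["gene_A", "gene_B", "gene_C"])
print(ranking)  # ['gene_A', 'gene_B', 'gene_C']
```

Production systems replace shared-neighbour counts with learned graph embeddings and link-prediction models, but the recommendation pattern, scoring candidate entities by their connectivity to a query entity, is the same.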

The company has an Enterprise Data and AI strategy unit, with data and AI teams embedded across business and IT functions, fostering a collaborative environment to ensure the provision of foundationally FAIR (Findable, Accessible, Interoperable, and Reusable) data at the source.  

Data engineers, MLOps engineers, AI and ML engineers work as unified teams, promoting collaboration and accelerating business outcomes through structured learning and development programs that cultivate new skills internally.

“AstraZeneca uses a plethora of in-house and externally sourced tools, frameworks and products, ranging from proprietary in-house solutions to Databricks and Power BI,” said Maria. 

The company uses Transformer and GPT models, including testing Microsoft’s Azure OpenAI Service with cutting-edge models like GPT-4 and GPT-3.5. To foster innovation and engagement, AstraZeneca follows a hybrid working model, promoting collaboration while offering flexibility.

Interview Process

AstraZeneca aims to become a data-led enterprise, integrating data into all aspects of its operations. To achieve this, the company seeks candidates with strong skills in Python, machine learning, deep learning, computer vision, and NLP, along with a mindset geared toward growth through innovation.

The interview process is designed to understand both the candidate’s suitability for the role and the company. “For all our roles we look for candidates that not only have the skills, knowledge, experience and competence but can also live our values and behaviour,” Kumar told AIM

Apart from this, assessments at AZ focus on evaluating whether candidates will perform well in the position, demonstrate leadership potential, exhibit enthusiasm and motivation, and work collaboratively in a team. 

Potential candidates should prepare by understanding these key areas and reflecting on their experiences and qualities that they can bring to the table.

What Candidates Should Expect

Joining AZ’s data science team offers opportunities to collaborate with diverse teams, tackle new challenges, and work with the latest technology. The environment supports development and innovation, with the ultimate goal of powering the business through science and market delivery. 

“As a candidate, research our strategic objectives, core values, and the position you’ve applied for. Use our social media or website to learn about the organisation, team, and people. In the interview, be ready to discuss your past experiences and what you’ve learned from them,” said Kumar. 

This will help in avoiding the common mistakes of a lack of preparation about the company and the specific role they are applying for.

Work Culture

AstraZeneca fosters a supportive and inclusive workplace where employees can learn and grow. New hires benefit from onboarding and buddy programs, while extensive training and career development opportunities are available for all.

The gender ratio in AstraZeneca India is approximately 64.6% male to 35.4% female.

The company was also recognised by AIM Media House in 2022 for its excellent work in AI and ML, thanks to its focus on putting patients first. It was once again recognised by AIM in 2024 for its data engineering excellence.

In the data science team, AstraZeneca encourages cross-disciplinary collaboration and lifelong learning through the 3Es framework: Experience, Exposure, and Education. 

The company has a global peer-to-peer recognition system and offers comprehensive benefits, including medical insurance covering parents or parents-in-law, personal accident insurance, term life insurance, and childcare support for new parents.

If you think you are a good fit for the company, check out its career page here. 

Data Science Hiring and Interview Process at Diggibyte (Mon, 29 Jul 2024)
https://analyticsindiamag.com/interview-hiring-process/data-science-hiring-process-at-diggibyte/

The company is currently hiring for five data science positions in Bengaluru with at least two years of experience in the field.


Founded in 2021 by Lawrance Amburose, Sekhar Reddy, William Rathinasamy, and Anuj Kumar Sen, Bengaluru-based Diggibyte Technologies focuses on providing data and platform engineering, data science AI, and consulting services to a variety of industries. The founders bring a wealth of experience and expertise in leveraging big data and analytics to drive business innovation and efficiency.

AIM got in touch with Anuj Kumar Sen, the chief technology officer of Diggibyte, to learn more about the company’s AI and data science operations, expansion plans, interview process, and work culture.

The company is currently hiring for five data science positions in Bengaluru with at least two years of experience in the field. 

Inside Diggibyte’s Data Science Team

Diggibyte’s data science team has tackled numerous issues across various domains. “We have solved multiple problems using data science across the domain, which include customer insights and personalisation (home appliance),” Sen told AIM

By collecting and analysing data from multiple channels, the 50-member data science team provided insights into customer purchasing behaviour, preferences, and needs, helping clients tailor marketing efforts and improve customer satisfaction.

When it comes to logistics, the team has addressed the problem of inaccurate demand predictions, an issue that can potentially lead to mismanagement of fleet drivers and resources. By analysing historical sales data, market trends, and seasonal variations, the team claims to have developed accurate demand forecasts. 

An improved predictive capability helps clients optimise and plan their fleets and drivers, ensuring resources meet customer demand without impacting delivery efficiency or profitability.
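The seasonal-averaging idea behind such forecasts can be sketched minimally as below. The data and the 12-month season length are invented for illustration; the actual models Diggibyte uses are not disclosed.

```python
def seasonal_forecast(history, season_len=12):
    """Forecast one full season ahead by averaging each position
    in the seasonal cycle over all complete past seasons."""
    n_seasons = len(history) // season_len
    if n_seasons == 0:
        raise ValueError("need at least one complete season of history")
    forecast = []
    for pos in range(season_len):
        vals = [history[s * season_len + pos] for s in range(n_seasons)]
        forecast.append(sum(vals) / n_seasons)
    return forecast

# Two years of made-up monthly order counts with a summer peak.
history = [80, 82, 90, 100, 120, 150, 160, 155, 120, 100, 90, 85,
           84, 86, 94, 104, 124, 154, 164, 159, 124, 104, 94, 89]
print(seasonal_forecast(history)[:3])  # first three months of the next year
```

Real systems would layer market trends and external signals on top of such a seasonal baseline, but the baseline alone already captures the demand cycle that fleet planning depends on.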

Another notable project is the development of an AI-driven headline generator for Norwegian language articles. Using NLP models, this solution scans article content and claims to generate compelling headlines that capture the story’s essence.

Besides, the company also employs generative AI for multiple clients in construction, media, and retail. “Our team is currently working on a latent diffusion image-to-image in-painting model that can automate the process of lifestyle product photography and commercial photography for sports accessories,” he added.

Tech Stack

The team leverages both Transformer-architecture models and GPT models, depending on the task requirements. They have developed solutions using the Transformers library for a RAG (retrieval-augmented generation) system. Additionally, they created a customer-support chatbot using GPT-3.5 Turbo in Azure ML Studio.
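The retrieval step at the heart of a RAG system can be sketched with a toy bag-of-words similarity; a production pipeline like the one described would use transformer embeddings and a hosted chat model instead. The documents and query here are made up.

```python
from collections import Counter
import math

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

docs = [
    "Refunds are processed within five business days.",
    "Shipping is free on orders above 500 rupees.",
]

def retrieve(query: str, k: int = 1) -> list[str]:
    """Return the k documents most similar to the query."""
    qv = Counter(query.lower().split())
    ranked = sorted(docs, key=lambda d: cosine(qv, Counter(d.lower().split())),
                    reverse=True)
    return ranked[:k]

# The retrieved passage is then placed into the chat model's prompt as context.
context = retrieve("how long do refunds take")[0]
prompt = f"Answer using only this context:\n{context}\n\nQ: How long do refunds take?"
print(context)
```

The grounding step is the key design choice: the model answers from retrieved text rather than from its parametric memory alone.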

In potential candidates, the company looks for proficiency in Python, SQL, Pandas, NumPy, and machine learning. They should also be adept in using frameworks like Keras or TensorFlow and have experience with Databricks.

In addition to these core competencies, data scientists are encouraged to have competitive programming skills, be fast learners, and stay updated with the latest developments in generative AI, preferably with hands-on experience. 

Interview Process

The interview process at the company is thorough and structured, assessing both technical and interpersonal skills. It begins with two technical assessment rounds where candidates are evaluated on their fundamental data science skills and adaptability to the latest trends in data science and AI.

These are followed by an HR round focusing on the candidate’s interpersonal skills, career goals, collaborative abilities, and employment history.

The company relies on a multi-channel approach to source top data science talent. This includes leveraging its internal careers page (set to be deployed in 20 days), online career portals, and trusted vendors.

Work Culture

Once employed, new hires can expect to work on cutting-edge AI projects. It also provides training to bridge gaps, if any, in both technical and interpersonal skills, supported by the leadership team. 

“Since we have a separate team for analytics, engineering, visualisation, and machine learning, we expect candidates to have specialisation in one skill rather than general knowledge in multiple skills,” Sen explained. 

Employees can pursue certifications from reputable sources such as Azure and Databricks, with the company covering the exam costs. It offers a flexible work-life balance, accommodating four days of work-from-office and work-from-home arrangements, along with a supportive leave policy.

The company also has policies in effect to make job options accessible, inclusive and diverse for all. Efforts around healthcare and a positive office atmosphere promote employee well-being.

Diggibyte Technologies was certified as the Best Firm for Data Engineers at DES 2024.

The post Data Science Hiring and Interview Process at Diggibyte appeared first on AIM.

]]>
https://analyticsindiamag.com/interview-hiring-process/data-science-hiring-process-at-diggibyte/feed/ 0
[Exclusive] Altair is Hiring Multiple Data Scientists in India https://analyticsindiamag.com/interview-hiring-process/exclusive-altair-is-hiring-multiple-data-scientists-in-india/ https://analyticsindiamag.com/interview-hiring-process/exclusive-altair-is-hiring-multiple-data-scientists-in-india/#respond Mon, 29 Jul 2024 09:08:36 +0000 https://analyticsindiamag.com/?p=10129455

From interns to seasoned professionals, the company is looking for people skilled in building data pipelines, data preparation, data modelling, model validation, and deployment.

The post [Exclusive] Altair is Hiring Multiple Data Scientists in India appeared first on AIM.

]]>

Recently, computational intelligence solutions giant Altair launched HyperWorks 2024, an AI-powered engineering and simulation platform that integrates AI across the product lifecycle, claiming faster design exploration, efficient thermal analysis, and rapid behaviour prediction.

It offers AI-embedded workflows, HPC environments, photorealistic graphics, Python and C++ scripting, generative design, and meshless ECAD simulation to boost productivity and streamline engineering workflows.

Gilma Saravia, the chief people officer of Altair, told AIM in an exclusive interaction that the company is planning to expand its footprint in India and is hiring for multiple roles in AI and analytics. It is also offering internship roles.

Skills Needed

Altair seeks individuals committed to innovation and continuous learning. Specifically, for data science roles, the ideal candidates should be able to design and build systems for collecting, storing, and analysing data at scale, and be good at improving data reliability and quality.

They should be skilled in building data pipelines, data preparation, data modelling, model validation, and deployment.

“We assess their (candidates’) case studies and code completion assessments, as well as domain expertise, to understand their level of technical proficiency. It’s through this assessment of their skills in statistics, programming, ML, and data visualisation that provides us the best understanding of where we can place the candidate,” she added.

Interview Process

“The interview process at Altair is designed to identify candidates who align with the company’s core values: envisioning the future, seeking technological and business firsts, communicating broadly and honestly, and embracing diversity and risk-taking,” added Saravia. 

Initial Screening: Evaluation of candidates’ experience and domain expertise.

Technical Evaluations: A series of technical assessments focusing on coding, model building, and problem-solving abilities.

Behavioural Assessments: Evaluation of candidates’ fit with Altair’s culture and core values.

Project-Specific Questions: Tailored inquiries to assess applied skills in machine learning algorithms, statistical modelling, and domain-specific knowledge.

Practical Exercises: Candidates are tested on their practical coding and modelling skills, including code snippet completion, data preprocessing, and model optimisation.

The company expects candidates to have a customer-focused outlook, the ability to think complexly yet arrive at simple solutions, and a deep curiosity for exploring beyond the obvious. In return, it offers competitive rewards, flexible work schedules, and ample opportunities for career development. 

“We also greatly value an entrepreneurial spirit and team members who are always looking forward to new ideas and challenges. Candidates who are not able to envision the future or who do not possess a ‘problem-solving mindset’ may not qualify as top candidates in the process,” Saravia concluded.

Check out the careers page to apply.

Read Next: Data Science Hiring Process at Target

The post [Exclusive] Altair is Hiring Multiple Data Scientists in India appeared first on AIM.

]]>
https://analyticsindiamag.com/interview-hiring-process/exclusive-altair-is-hiring-multiple-data-scientists-in-india/feed/ 0
Time To Scale Down Large Language Models https://analyticsindiamag.com/ai-trends-future/time-to-scale-down-large-language-models/ https://analyticsindiamag.com/ai-trends-future/time-to-scale-down-large-language-models/#respond Tue, 16 Jul 2024 05:56:13 +0000 https://analyticsindiamag.com/?p=10129231

Advancements in hardware (H100 GPUs), software (CUDA, cuBLAS, cuDNN, FlashAttention), and data quality have drastically reduced training costs.

The post Time To Scale Down Large Language Models appeared first on AIM.

]]>

Renowned research scientist Andrej Karpathy recently said that the llm.c project showcases how GPT-2 can now be trained in merely 24 hours on a single 8XH100 GPU node—for just $672. 

Karpathy’s journey began with an interest in reproducing OpenAI’s GPT-2 for educational purposes. He initially encountered obstacles in using PyTorch, a popular deep-learning framework. 

Frustrated by these challenges, Karpathy decided to write the entire training process from scratch in C/CUDA, resulting in the creation of the llm.c project. It eventually evolved into a streamlined, efficient system for training language models.

The project, which implements GPT training in C/CUDA, has minimal setup requirements and offers efficient and cost-effective model training.

Scaling down LLMs 

In his post, Karpathy mentioned how advancements in hardware (H100 GPUs), software (CUDA, cuBLAS, cuDNN, FlashAttention), and data quality have drastically reduced training costs.

Mauro Sicard, the director of BRIX Agency, agreed with Karpathy. “With the improvements in both GPUs and training optimisation, the future may surprise us,” he said.

Scaling down LLMs while maintaining performance is a crucial step in making AI more accessible and affordable.

According to Meta engineer Mahima Chhagani, LLMLingua is a method designed to efficiently decrease the size of prompts without sacrificing significant information. 

Chhagani said using an LLM cascade, starting with affordable models like GPT-2 and escalating to more powerful ones like GPT-3.5 Turbo and GPT-4 Turbo, optimises cost by only using expensive models when necessary.

FrugalGPT is another approach that uses multiple APIs to balance cost and performance, reducing costs by up to 98% while maintaining a performance comparable to GPT-4. 

Additionally, a Reddit developer named pmarks98 used a fine-tuning approach with tools like OpenPipe and models like Mistral 7B, cutting costs by up to 88%.
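The cascade idea described above can be sketched as follows. The two “models” and their confidence scores are stand-in stubs, not real APIs; the point is purely the routing logic of calling the expensive model only when the cheap one is unsure.

```python
def cheap_model(prompt):
    """Stub for an inexpensive model: returns (answer, confidence)."""
    return ("I don't know", 0.2) if "tricky" in prompt else ("42", 0.9)

def expensive_model(prompt):
    """Stub for a costlier, more capable model."""
    return ("A thorough answer", 0.95)

def cascade(prompt, threshold=0.7):
    """Return (answer, model_used); escalate only on low confidence."""
    answer, confidence = cheap_model(prompt)
    if confidence >= threshold:
        return answer, "cheap"
    answer, _ = expensive_model(prompt)
    return answer, "expensive"

print(cascade("simple question"))   # served by the cheap model
print(cascade("a tricky edge case"))  # escalated to the expensive one
```

With most traffic handled by the cheap tier, the blended cost per query falls sharply while hard cases still get the stronger model, which is the trade-off FrugalGPT formalises.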

Is there a Real Need to Reduce Costs?

Cheaper LLMs, especially open-source models, often have limited capabilities compared to the proprietary models from tech giants like OpenAI or Google. 

While the upfront costs may be lower, running a cheap LLM locally can lead to higher long-term costs due to the need for specialised hardware, maintenance overheads, and limited scalability.

Moreover, as pointed out by Princeton professor Arvind Narayanan, the focus has shifted from capability improvements to massive cost reductions, which many AI researchers find disappointing.

Cost over Capability Improvements

Narayanan argued that cost reductions are more exciting and impactful for several reasons. They often lead to improved accuracy in many tasks. Lower costs can also accelerate the pace of research by making it more affordable and more functionality accessible.

So, in terms of what will make LLMs more useful in people’s lives, cost is hands down more significant at this stage than capability, he said.

In another post, Narayanan said that the cheaper a resource gets, the more demand there will be for it. Maybe in the future, it will be common to build applications that invoke LLMs millions of times in the process of completing a simple task.

This democratisation of AI could accelerate faster than we imagined, possibly leading to personal AGIs for $10 by 2029.

The post Time To Scale Down Large Language Models appeared first on AIM.

]]>
https://analyticsindiamag.com/ai-trends-future/time-to-scale-down-large-language-models/feed/ 0
Vector Databases are Ridiculously Good https://analyticsindiamag.com/ai-insights-analysis/vector-databases-are-ridiculously-good/ https://analyticsindiamag.com/ai-insights-analysis/vector-databases-are-ridiculously-good/#respond Mon, 15 Jul 2024 06:05:11 +0000 https://analyticsindiamag.com/?p=10126833

With the increasing adoption predicted by experts and the introduction of educational resources, vector databases are set to play a pivotal role in shaping the next era of AI technology

The post Vector Databases are Ridiculously Good appeared first on AIM.

]]>

Building large language models requires complicated data structures and computations, which conventional databases are not designed to handle. Consequently, the importance of vector databases has surged since the onset of the generative AI race. 

This sentiment was reflected in a recent discussion when software and machine learning engineer Santiago Valdarrama said, “You can’t work in AI today without bumping with a vector database. They are everywhere!”

He further added that vector databases, with their ability to store floating-point arrays and be searched using a similarity function, offer a practical and efficient solution for AI applications.

Vector databases provide LLMs with access to real-time proprietary data, enabling the development of RAG applications.
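As a toy illustration of what the quote describes, a vector store really is just floating-point arrays plus a similarity function. The three-dimensional vectors and document names here are invented; real systems use hundreds of dimensions and approximate indexes (such as HNSW) so search stays fast at scale.

```python
import math

# Embeddings stored as plain floating-point arrays, keyed by document id.
store = {
    "doc_a": [0.9, 0.1, 0.0],
    "doc_b": [0.1, 0.9, 0.1],
    "doc_c": [0.8, 0.2, 0.1],
}

def cosine_sim(u, v):
    """Cosine similarity between two dense vectors."""
    dot = sum(x * y for x, y in zip(u, v))
    nu = math.sqrt(sum(x * x for x in u))
    nv = math.sqrt(sum(y * y for y in v))
    return dot / (nu * nv)

def search(query, k=2):
    """Brute-force top-k nearest documents by cosine similarity."""
    return sorted(store, key=lambda key: cosine_sim(store[key], query),
                  reverse=True)[:k]

print(search([1.0, 0.0, 0.0]))  # → ['doc_a', 'doc_c']
```

A RAG application embeds the user's question the same way, runs this search, and feeds the top hits to the LLM as context.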

Database companies are pivotal in driving the generative AI revolution and its growth. Redis enhances real-time efficiency for LLM-powered chatbots like ChatGPT, ensuring smooth conversations. At the same time, enterprises are leveraging MongoDB Atlas and the Google Cloud Vertex AI PaLM API to develop advanced chatbots.

Making it Easier

Major database vendors, whether originally established as SQL or NoSQL, including MongoDB, Redis, PlanetScale, and even Oracle, have all added vector search features to their existing solutions to capitalise on this growing need.

In an earlier interaction with AIM, Yiftach Shoolman, the co-founder and CTO of Redis, said, “We have been working with vector databases even before generative AI came into action.” 

Redis not only fuels the generative AI wave with real-time data but has also partnered with LangChain to launch OpenGPT, an open-source model that allows flexible model selection, data retrieval control, and data storage management.

Another important challenge vector databases claim to solve is hallucinations, which have been a persistent issue for LLMs. 

“Pairing vector databases with LLMs allows for the incorporation of proprietary data, effectively reducing the potential range of responses generated by the database,” said Matt Asay, VP of developer relations at MongoDB, in an exclusive interaction with AIM at last year’s Bengaluru edition of the company’s flagship event, MongoDB.local.

During a recent panel discussion, Pinecone founder and CEO Edo Liberty explained that vector databases are made to manage these particular types of information “in the same way that in your brain, the way you remember faces or the way you remember poetry”.

Most of the prominent names in the industry have already implemented vector capabilities. Think Amazon Web Services, Microsoft, IBM, Databricks, MongoDB, Salesforce, and Adobe.

Jonathan Ellis, the co-founder and CTO of DataStax, explained that while OpenAI’s GPT-4 is limited to information up until September 2021, indexing recent data in a vector database and directing GPT-4 to access it can yield more accurate and high-quality answers. This approach eliminates the need for the model to fabricate information, as it is grounded in updated context.

What Next? 

However, vector databases are not without challenges. A recent report by Gartner noted that raw data used to create vector embeddings for generative AI can be re-engineered from the stored vectors, making data leakage possible.

“Given the compute costs associated with AI, it is crucial for organisations to engage in worker training around vector database capabilities,” Gartner analyst Arun Chandrasekaran emphasised in an interview with Fierce. “This preparation will help them avoid expensive misuse and misalignment in their AI projects.”

Nevertheless, several vector database startups are now gaining prominence. During an otherwise weak year for venture capital, hundreds of millions of dollars are flowing into vector database businesses like Pinecone, which raised $100 million in April 2023 from Andreessen Horowitz.

Pinecone is not the only one. Dutch firm Weaviate secured $50 million from Index Ventures. The Weaviate AI-native vector database simplifies vector data management for AI developers.

There are emerging divisions in the vector database arena, particularly between open- and closed-source players, and between dedicated vector databases and those with integrated vector storage and search functionality. 

On the dedicated, open-source side, Chroma, Qdrant, and Milvus (in collaboration with IBM) stand out, while Pinecone is a leading dedicated, closed-source player. Meanwhile, Snowflake, although not a dedicated vector database, offers vector search capabilities within its broader platform.

And there’s a good reason why so many people are jumping into this sector. Chandrasekaran predicts that 30% of organisations will employ vector databases to support their generative AI models by 2026, up from 2% in 2023. 

Understanding their importance, Andrew Ng has also introduced free learning courses on vector databases with MongoDB, Weaviate, Neo4j and more.

With the increasing adoption predicted by experts and the introduction of educational resources, vector databases are set to play a pivotal role in shaping the next era of AI technology. 

As organisations continue to integrate these powerful tools, the potential for innovation and improved AI capabilities becomes ever more significant, heralding a new age of intelligent applications and solutions.

The post Vector Databases are Ridiculously Good appeared first on AIM.

]]>
https://analyticsindiamag.com/ai-insights-analysis/vector-databases-are-ridiculously-good/feed/ 0
There is No Such Thing as Experts https://analyticsindiamag.com/ai-insights-analysis/there-is-no-such-thing-as-experts/ https://analyticsindiamag.com/ai-insights-analysis/there-is-no-such-thing-as-experts/#respond Thu, 11 Jul 2024 07:35:56 +0000 https://analyticsindiamag.com/?p=10126512

Nobody with real experience is ever credited with any major innovation.

The post There is No Such Thing as Experts appeared first on AIM.

]]>

In a recent interview, Vinod Khosla, the founder of Khosla Ventures, had some unconventional insight to share. He claimed that all major innovations almost always stem from disruptors originating outside of the field. Nobody with real experience is ever credited with any major innovation, he said. 

As per Khosla, these disruptors often challenge conventional wisdom and apply cross-disciplinary knowledge to create breakthrough technologies and products.

In Khosla’s words, “Retailing didn’t come from Walmart, it came from Amazon. SpaceX didn’t come from Lockheed or Boeing. Companies like SpaceX and RocketLabs didn’t work in space before entering the field.”

Outsiders Bring in Fresh Ideas

And he isn’t wrong—everywhere you look, industries are being disrupted. The hotel industry encountered it with Airbnb, and the music industry experienced it with Spotify. Mark Zuckerberg created a social media platform that changed online interaction despite not being an expert in social networking.

Each one of these businesses, in Khosla’s words, offered a more practical service than the tried-and-true approaches. By doing this, they not only fundamentally altered the rules for their rivals and users but also inspired a new wave of innovation.

Khosla discussed how, in 2018, he decided to invest in OpenAI, which back then was a nonprofit focused on artificial general intelligence. 

“Many people told me that I was crazy. It was a nonprofit and there were lots of reasons not to invest in it. It was an odd structure and they didn’t have a business plan. They didn’t know if they’d ever get revenue. And I made the largest initial investment,” he said.

Elon Musk is another prime example of this. Although he had no prior experience in the automotive or aerospace industries, he managed to disrupt both sectors by applying new perspectives and innovative thinking. His concept of a strong, long-range EV was fantastic. 

However, when Musk first introduced his vehicle, the Roadster, based on the Lotus platform, it experienced multiple failures and recalls. The technology was not flawless, the market was unpredictable, and the pricing was incomprehensible.

Steve Jobs, for instance, revolutionised many industries, including computing, telecommunications, and music, not because he was a specialist in any of them. He did it because he approached problems from a user-centric standpoint, with an unwavering emphasis on design and simplicity.

Apple noticed that while laptops excelled in functionality, thanks to larger, more comfortable screens and keyboards as well as fast processors, they lacked the portability that cell phones had.

There was a clear desire for a device more portable than a laptop but more capable than a smartphone, which led to the development of the iPad. At its peak, the iPad accounted for 28% of Apple’s revenue.

The Non-Expert Advantage

“You don’t go hire somebody from IBM, who’s done IBM for 20 years; that experience will kill you for sure,” Khosla said in the interview. 

Khosla believes that disruptive innovation typically comes from individuals or teams not deeply entrenched in traditional industry practices.

Let’s take a look at Khosla’s venture, Commonwealth Fusion Systems. In the interview, he spoke about meeting with a senior fellow from MIT’s plasma fusion lab.

Khosla felt that the world needed a different kind of electricity than what was available, so he decided to take a chance with fusion energy despite nobody at the Department of Energy wanting to talk about it at the time.

Commonwealth Fusion Systems is now working to develop fusion as a viable energy source.

Another example is Okta, one of the first companies he invested in. Before Okta, several companies like Microsoft and IBM already offered identity and access management (IAM) solutions.

However, they were primarily designed for on-premises environments. Okta thus began offering an entirely cloud-based program, which allowed for easier deployment, maintenance, and scalability. Today, Okta holds a 28% market share.

Innovation Comes from Disruptors

Netflix, which has been around since 1997, offers another such story in its transition from DVD shipping to streaming. This transition required the company to disrupt itself, an extraordinary task, as most successful disruptive innovations attack someone else’s profit pool, not one’s own.

Conversely, if an industry’s core offering is a service, consider whether it can also be offered as a product. HubSpot provides an excellent example of this type of innovation.

Following the success of Google’s search engine, other companies started offering consulting services aimed at helping their clients become more visible in Google searches (i.e. search engine optimisation). 

HubSpot developed software enabling companies to do search engine optimisation themselves, thus transforming into a product a core offering that others provided as a service.

Learn Lessons in Disruption 

Experts, too, play a crucial role in driving innovation and advancing technology, and Elizabeth Holmes is a prime example of what can go wrong without them. A non-expert in medical technology, she founded Theranos, promising revolutionary blood-testing technology.

The lack of deep medical knowledge led to flawed technology that failed to deliver accurate results, ultimately resulting in legal issues and the company’s collapse.

So, to innovate effectively, it’s often beneficial to balance the fresh perspectives of non-experts with the deep knowledge of experts. 

The post There is No Such Thing as Experts appeared first on AIM.

]]>
https://analyticsindiamag.com/ai-insights-analysis/there-is-no-such-thing-as-experts/feed/ 0
Adobe is Hiring GenAI Researchers in India https://analyticsindiamag.com/interview-hiring-process/adobe-is-hiring-genai-researchers-in-india/ https://analyticsindiamag.com/interview-hiring-process/adobe-is-hiring-genai-researchers-in-india/#respond Mon, 08 Jul 2024 12:07:26 +0000 https://analyticsindiamag.com/?p=10126210

It is looking for researchers in NLP, LLMs, computer vision, deep learning, ML, and more.

The post Adobe is Hiring GenAI Researchers in India appeared first on AIM.

]]>

Creative tech giant Adobe, which houses around 8,000 employees in India, is expanding its generative AI team and hiring talented researchers in various fields.

This includes natural language processing, natural language generation, human-computer interaction and user experience research, computer vision, deep learning and machine learning, artificial intelligence, image, audio, video, and multi-modal data processing, and autonomic systems and computing.

Upon joining, the researcher will be responsible for creating, designing, developing, and prototyping AI innovations. Researchers will also transform these innovations into advanced technologies for Adobe’s products, explore new research areas, publish their findings in leading journals and conferences, and collaborate with colleagues from top universities globally.

The ideal candidates must have proven research excellence and a strong publication record, along with an educational background in computer science, electrical engineering, or mathematics.

For senior roles, a minimum of seven years of research experience is necessary. Additionally, candidates must show an ability to take bold initiatives, solve complex problems, prioritise tasks, and make informed decisions. Strong analytical, mathematical modelling, and software skills are essential, along with the capability to work at various levels of abstraction.

Check out its career page here.

Previously, Prativa Mohapatra, VP and MD of Adobe India, told AIM that the company’s generative AI integration focuses on three primary areas within its cloud products: Adobe Experience Cloud, Creative Cloud, and Document Cloud. The aim is to address the need for faster and more effective content workflows, directly benefiting enterprise customers by reducing time-to-market and enabling personalised customer interactions at scale.

The company wants to democratise generative AI-powered content creation with product integrations. 

Adobe entered the generative AI race in March 2023 with Adobe Firefly, which it built in collaboration with NVIDIA, and has been coming up with new AI updates pretty frequently. 

Now, Adobe will integrate third-party AI tools, including OpenAI’s Sora, into Premiere Pro. Partnering with AI providers like OpenAI, RunwayML, and Pika, it aims to offer users flexibility in choosing AI models for their workflows.

The post Adobe is Hiring GenAI Researchers in India appeared first on AIM.

]]>
https://analyticsindiamag.com/interview-hiring-process/adobe-is-hiring-genai-researchers-in-india/feed/ 0
From Jaipur to Jersey: The Growth Story of Celebal Technologies https://analyticsindiamag.com/ai-origins-evolution/from-jaipur-to-jersey-the-growth-story-of-celebal-technologies/ https://analyticsindiamag.com/ai-origins-evolution/from-jaipur-to-jersey-the-growth-story-of-celebal-technologies/#respond Fri, 05 Jul 2024 11:29:55 +0000 https://analyticsindiamag.com/?p=10126011

It has achieved remarkable success, growing from Jaipur into a global company with a presence in the USA, India, APAC, the UAE, Europe, and Canada.

The post From Jaipur to Jersey: The Growth Story of Celebal Technologies appeared first on AIM.

]]>

Amidst the AI annotation, semiconductor and other talent pools brewing in India’s tier-2 and tier-3 cities, Jaipur has emerged as a leading hub for AI talent in India, with the highest number of AI job openings among tier-2 cities.

A recent report revealed that 13% of AI job postings from December 2023 to April 2024 were in tier-2 and 3 cities, with Jaipur at the forefront, followed by Indore and South Goa.

Celebal Technologies, a premier software services company based in Jaipur, has played a significant role in developing the local AI talent ecosystem. 

With this focus on talent development, the company is helping unfold a quiet IT revolution in the heart of Rajasthan’s capital city.

Celebal, co-founded by Anirudh Kala and Anupam Gupta in 2016, is helping legacy enterprises embrace modern cloud innovations and AI, delivering end-to-end digital transformation to clients across the globe. 

In just a few years, the company has achieved remarkable success, growing from its humble beginnings in Jaipur to becoming a global company with a presence in the USA, India, APAC, UAE, Europe, and Canada. 

The company now boasts an impressive client base that includes 90% of Fortune 500 companies, a dedicated workforce of over 2000 professionals, and a track record of triple-digit revenue growth.

“Our mission is to make data simple and easy to understand for all organisations,” Kala told AIM in an exclusive interview on the sidelines of the recently concluded Data+AI Summit.

“We are committed to providing solutions powered by modern cloud and artificial intelligence that integrate with traditional enterprise software. Our tailor-made solutions help enterprises maximise productivity and improve speed and accuracy,” he added.

Nurturing AI Talent in Tier 2, 3, 4

At the core of Celebal Technologies’ success is its focus on nurturing and empowering talent. The company has grown its workforce from just 300 employees in FY21 to over 2,300 in FY24, achieving a compound annual growth rate (CAGR) of approximately 105% during this period.

“There is no dearth of talent in this country,” Kala emphasised. “Specifically the fact that there are so many folks who have not had the chance to prove themselves. Sometimes I would meet people who are extremely introverted, but great coders. I have met talented folks from very humble backgrounds.”

Celebal has made it a priority to provide opportunities to individuals from diverse backgrounds, including those from tier 4 and tier 5 towns in India who may not have had access to privileged education. 

“We have been training people like those,” Kala said. “I can tell you they are amazing guys. There is no reason for intellect to remain with those who are privileged. It is with everyone.”

By focusing on creating talent rather than just hiring from the market, Celebal Technologies has built a strong foundation for its growth. 

The company provides mentorship and training, and shows patience in nurturing its workforce. It recognises that once an individual has learnt the necessary skills, there is no stopping them.

Innovation with Data and AI

Celebal Technologies’ success is also rooted in its unwavering focus on innovation, particularly in the areas of data science and artificial intelligence. 

Kala, who has spent over a decade working in AI, has witnessed firsthand how the technology has evolved from a “good to have” to a “must have” for businesses.

“We really focus on data and AI. And now, with data specifically, the kind of customer conversions that we have, I think it is second to none because we have never had such momentum,” Kala beamed.

The company leverages its expertise in AI and data to deliver cutting-edge solutions to its clients, helping them improve business efficiencies and unlock the full potential of their data. 

The company’s solutions span across various domains, including predictive maintenance, anomaly detection, inventory forecasting, customer 360, IoT analytics, and more.

Strategic Partnerships and Industry Recognition

Celebal Technologies’ success has been further fueled by its strategic partnerships with industry giants such as Microsoft and Databricks

The company has been recognised as a Microsoft Gold Partner, crowned Microsoft India Partner of the Year for two consecutive years (2021 and 2022), and named Global Partner of the Year for 2023.

“Winning this award for the first time felt like an incredible achievement but winning it twice in a row is mind boggling,” said Anupam Gupta, co-founder and president of Celebal Technologies. 

“This is a validation of our strategy in the domain of SAP innovation on Azure, and Power Platform along with our committed execution of this strategy,” Gupta added.

In addition to its Microsoft accolades, Celebal has also emerged victorious at the Databricks Partner Awards for three consecutive years. 

“We firmly believe that the cornerstone of AI innovation lies in robust data foundations. This recognition reflects our unwavering commitment to leveraging data to drive transformative, industry-specific GenAI solutions in production for our clients.”

What’s Next? 

As the company continues its remarkable journey, it remains focused on its fundamentals and the promise of creating intelligent enterprises. It aims to help organisations with legacy systems and outdated software ecosystems embrace the power of modern cloud platforms and AI.

“Traditional enterprises often carry the baggage of antiquated technology, heavy licensing-based limitations, and scalability challenges,” Gupta explained. “But they also want high levels of data analysis that puts them on par with far younger unicorns and startups.”

Celebal bridges this gap by providing legacy businesses with the right tech stack, platforms, and partnerships to enable their digital transformation. 

The company’s long-term vision is to make AI an implicit part of business processes, seamlessly integrating it into the very fabric of organisations.

“Eventually, we want organisations to make AI an implicit part of their processes. For example, if we build a house, we know the pipelines, the plumbing work, right? It has to be there. We don’t think of it building explicitly. 

“Could we have a business environment and processes that would have the same concept? It is implicit that we use AI,” Kala said.

While scaling presents its own challenges, Celebal Technologies remains committed to its fundamentals and the potential of each individual to contribute. The company aims to do full justice to each person’s career and ability, fostering a culture of continuous learning and growth.

“Scaling has its own challenges. I will not shy away from the fact that we have struggled, we are still struggling,” Kala admitted. 

“Of course, we are trying to figure out how to cut down that struggle. I think that struggle is going to be there even if we get more and more projects, more and more people. The way to scale is only through focusing on your fundamentals.”

As the company continues to push the boundaries of what is possible with data and AI, it is not just transforming businesses but also shaping the future of the industry, one solution at a time.

“I would say it is not a hype because we are using it,” Kala emphasised, referring to the AI revolution. “It is here to stay. We are seeing things happening. We are seeing things in our daily lives.”

The post From Jaipur to Jersey: The Growth Story of Celebal Technologies appeared first on AIM.

Why Data Quality Matters in the Age of Generative AI https://analyticsindiamag.com/ai-insights-analysis/generative-ai/ https://analyticsindiamag.com/ai-insights-analysis/generative-ai/#respond Thu, 04 Jul 2024 04:30:00 +0000 https://analyticsindiamag.com/?p=10125699

GenAI can augment human intelligence by identifying patterns and correlations that humans may miss.

The post Why Data Quality Matters in the Age of Generative AI appeared first on AIM.

In the dynamic realm of data engineering, the integration of Generative AI is not just a distant aspiration; it’s a vibrant reality. With data serving as the catalyst for innovation, its creation, processing, and management have never been more crucial.

“While AI models are important, the quality of results we get are dependent on datasets, and if quality data is not tapped correctly, it will result in AI producing incorrect results. With the help of Gen AI, we are generating quality synthetic data for testing our models,” said Abhijit Naik, Managing Director, India co-lead for Wealth Management Technology at Morgan Stanley.

Speaking at AIM’s Data Engineering Summit 2024, Naik said that the generative AI, machine learning, neural network, and deep learning models we have today represent the next stage of automation after the RPA era.

“Gen AI will always generate results for you. And when it generates results for you, sometimes it hallucinates. So, what data you feed becomes very critical in terms of the data quality, in terms of the correctness of that data, and in terms of the details of data that we feed,” Naik said. 

However, human oversight remains crucial in this process, Naik added. When integrated carefully into existing pipelines and overseen appropriately, GenAI can augment human intelligence by identifying patterns and correlations that humans may miss.

Documenting every aspect of how these models function, and the knowledge they derive from data, is a complex task. This underscores the need for caution and thorough understanding when integrating generative AI.

Due to their vast size and training on extensive unstructured data, generative models can behave in unpredictable, emergent ways that are difficult to document exhaustively.  

“This unpredictability can lead to challenges in understanding and explaining their decisions,” Naik said.

GenAI in Banking

Naik emphasised GenAI’s importance in the banking and finance sectors. “It can generate realistic synthetic customer data for testing models while addressing privacy and regulatory issues. This helps improve risk assessment,” he added.

This is especially critical when accurate data is limited, costly, or sensitive. A practical example could be creating transactional data for anti-fraud models.

GenAI models, including Generative Adversarial Networks (GANs), Variational Autoencoders (VAEs), and large language models like GPT, can generate synthetic data that mimics the statistical features of real-world datasets.
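
As a minimal, hedged sketch of the idea (far simpler than any bank’s actual GAN or VAE pipeline), even fitting a single Gaussian to the summary statistics of real data lets one sample synthetic records that mimic them. All figures below are invented for illustration:

```python
import random
import statistics

random.seed(42)

# Toy "real" transaction amounts (invented for this example).
real_amounts = [12.5, 48.0, 7.25, 230.0, 15.75, 89.9, 33.1, 410.0, 22.4, 61.3]

# Fit a simple Gaussian to the real data's summary statistics.
mu = statistics.mean(real_amounts)
sigma = statistics.stdev(real_amounts)

# Sample synthetic amounts that mimic those statistics. A production
# pipeline would also enforce constraints (e.g. positivity, schema rules).
synthetic = [random.gauss(mu, sigma) for _ in range(1000)]

print(round(statistics.mean(synthetic), 1))  # mean of the synthetic sample, near mu
```

GANs and VAEs extend the same principle to far richer, high-dimensional distributions, which is what makes them useful for realistic transactional data.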

For example, Capital One and JPMorgan Chase use GenAI to strengthen their fraud and suspicious activity detection systems. Morgan Stanley implemented an AI tool that helps its financial advisors find data anomalies in detecting fraud, and Goldman Sachs uses GenAI to develop internal software. Customers globally can benefit from 24/7 accessible chatbots that handle and resolve concerns, assist with banking issues, and expedite financial transactions.

A recent study showed that banks that move quickly to scale generative AI across their organisations could increase their revenues by up to 600 bps in three years.

“Of course, the highly regulated nature of banking/finance requires careful model governance, oversight, explainability and integration into existing systems,” Naik concluded.

Is Data Annotation Dying? https://analyticsindiamag.com/ai-origins-evolution/is-data-annotation-dying/ https://analyticsindiamag.com/ai-origins-evolution/is-data-annotation-dying/#respond Tue, 25 Jun 2024 12:04:40 +0000 https://analyticsindiamag.com/?p=10124754

AI is coming for data annotation jobs. How ironic.

The post Is Data Annotation Dying? appeared first on AIM.

Voxel51 co-founder and University of Michigan professor of robotics Jason Corso recently put up an article on data annotation being dead, which sparked discussions on LinkedIn and X alike.

One may have assumed that the wave of generative AI would make data annotation jobs even more abundant. Yet that same wave is precisely why these jobs are slowly becoming obsolete.

Industry leaders specialising in computer vision solutions stress that despite advancements in AI, meticulously curated, high-quality image-annotated datasets remain essential. These datasets are critical as operations scale and diversify, and the notion that untested technologies could disrupt established workflows is not only impractical but potentially harmful. 

Moreover, human-created datasets are proving even more relevant in fields beyond computer vision, extending to generative AI and multimodal workflows. There have been several reports about companies such as OpenAI, Amazon, and Google acquiring cheap labour in countries such as India or even Kenya for labelling and annotating data for training AI models. 

In India, companies such as NextWealth, Karya, Appen, Scale AI, and Labelbox are creating jobs within the country, specifically in rural areas, for data annotation. When speaking with AIM, NextWealth MD and founder Sridhar Mitta said, “The beauty of GenAI is that it allows people from remote areas to do these tasks.” 

So, are these companies about to slowly die?

Not So Dead After All

Human annotation has played a pivotal role in the AI boom, providing the foundation for supervised machine learning. The process involves manually labelling raw data to train machine learning algorithms in pattern recognition and predictive tasks. 

While labour-intensive, this approach ensures the creation of reliable and accurate datasets. It turns out that the need for human-generated datasets is even more crucial now than ever before. 

The only disruption possible is with self-supervised learning. An engineering leader in AI, Tilmann Bruckhaus, said, “These techniques reduce the need for manual labelling by using noisy or automatically-generated labels (weak supervision) or enabling models to learn from unlabeled data (self-supervision).”

Corso believes that human annotation will be needed for gold-standard evaluation datasets, which will also be combined with auto-annotation in the future. 

This process involves using AI models to automatically label data. While this approach shows promise, its applicability is limited. Auto-annotation is most useful in scenarios where the model’s performance needs to be adapted to new environments or specific tasks. However, for general applications, a reliance on auto-annotation remains impractical.

Adding to all of this is how current AI models are increasingly relying on synthetic data. SkyEngine AI CEO Bartek Włodarczyk said that with synthetic data “one does not have to worry about data labelling as any masks can be instantly created along with data.”

Dangerous Times Ahead?

Though human annotation may well remain the gold standard, companies that fail to adapt and thrive during the current boom will face dangerous times ahead. People For AI founder and data labelling director Matthieu Warnier said, “As labelling tasks become more automated, the ones that remain are notably more complex. Selecting the right labelling partner has become even more crucial.”

This was also reflected by Hugging Face co-founder and CSO Thomas Wolf. “It’s much easier to quickly spin and iterate on a pay-by-usage API than to hire and manage annotators. With model performance strongly improving and the privacy guarantee of open models, it will be harder and harder to justify making complex annotation contracts,” said Wolf, further stating that these will be some dangerous times for data annotation companies. 

It seems like manual data annotation might take a backseat when it comes to labelling data for AI training with models such as YOLOv8 or Unitlab’s SAM that can annotate almost anything without a need for human intervention. 

On the other hand, manual data annotations will remain a premium service, but the numbers are definitely expected to drop. Companies that are utilising workers in different parts of the world to create high-quality datasets will have to cut down on costs soon. 

So, the data annotation market might see major shifts as it adapts to the changing landscape. While its size is set to decrease, manual data annotation companies will be the ones setting the gold standard, serving as the benchmark for the automated data annotation market.

What Saurabh Netravalkar’s Passion for Cricket Tells About Oracle’s Work Culture https://analyticsindiamag.com/ai-origins-evolution/what-saurabh-netras-passion-for-cricket-tells-about-oracles-work-culture/ Wed, 19 Jun 2024 07:15:28 +0000 https://analyticsindiamag.com/?p=10123965

Oracle's work culture is shaped by its appreciation for the diverse interests and talents of its employees, a philosophy reflected in the company's broader approach.

The post What Saurabh Netravalkar’s Passion for Cricket Tells About Oracle’s Work Culture appeared first on AIM.

A good work culture says a lot about a company. One stellar example is Oracle, especially after Saurabh Netravalkar’s recent stint. His life extends beyond the cricket field: he serves as a senior executive at Oracle, juggling his responsibilities as a software engineer and a sportsman.

After Netravalkar architected the United States of America’s (USA) unforgettable win over former champions Pakistan at the ICC T20 World Cup 2024, the Oracle techie’s LinkedIn profile became the talk of the town on social media. Netizens were quick to create and share memes highlighting his unique situation.

Oracle’s commitment to fostering a flexible and encouraging work environment, allowing him to excel in both fields, has been widely appreciated. This support has given him the resources and understanding needed to pursue his dual aspirations.

Netravalkar took to social media platform X to thank his company for supporting him in pursuing his dreams.

Netravalkar is not alone. Recently, Naveen Rao, VP of GenAI at Databricks, missed the Databricks Data + AI Summit 2024 because he was racing at Le Mans.

https://twitter.com/databricks/status/1802828641723650263

Tech X Sports 

Apart from cricket, Oracle has also partnered with the Formula 1 Red Bull Racing team. The partnership allows Red Bull to leverage Oracle’s machine learning and data analytics capabilities on Oracle Cloud Infrastructure (OCI) to optimise data usage across their business operations, including car performance, race strategy, and fan engagement.

Indian tech companies are also becoming the go-to partners for sports federations and organisations around the world. These companies use AI and analytics to help teams, partners, and other stakeholders enhance the overall experience of their favourite sports.

Owing to their non-controversial and aspirational nature, sports provide a relatively safe platform for brands. Overall, sports sponsorship in India reached ₹15,000 crore in 2023, up 10.95% from ₹14,209 crore a year ago, according to a report by ESP Properties, a division of GroupM.

Here are a few tech firms that are helping boost athletes.

Tech Mahindra, in 2020, partnered with Kings XI Punjab for digital fan engagement and contributed to the glitch-free 2014 FIFA World Cup by deploying hundreds of employees in Brazil and Hyderabad.

In June 2022, the company further boosted chess popularity in India by associating with the FIDE Women’s World Cup and FIDE Women’s Grand Prix, following a torch relay at its campuses.

Adani Group Chairman Gautam Adani expressed pride in supporting young chess prodigy Praggnanandhaa, commending his rapid progress and dedication to winning laurels for India. Praggnanandhaa thanked the Adani Group for their trust in his abilities and commitment to representing the nation.

A cricketer who knows how to code

Oracle fosters a diverse, inclusive culture valuing balance, teamwork, and personal growth, encouraging employees to pursue passions beyond work. This holistic approach sets a benchmark for other tech companies to create fulfilling work environments.

Netra’s cricket passion exemplifies Oracle’s work-life balance, demonstrating how personal fulfilment boosts professional performance.

This supportive environment is not exclusive to Netra. Oracle’s work culture is shaped by its appreciation for the diverse interests and talents of its employees, a philosophy reflected in the company’s broader approach. This culture is inspired by leaders like CEO Larry Ellison, who is known for his athletic pursuits in sailing.

Apart from Netravalkar, several other cricketers also had careers in technology before stepping onto the cricket field.

Dedication or Toxicity?

Recently, Saurabh Netravalkar’s sister Nidhi revealed that the US bowler carries his laptop with him and does office work from his hotel.

Speaking on CricketNext, Nidhi said, “He knows that when he’s not playing cricket, he has to give 100 per cent to the job. So right now, when he’s working, he carries his laptop everywhere. And he has the freedom to work from anywhere.”

“Even when he comes to India, he brings his laptop. He’s working. So after the match, in the hotel, he’s doing his work. He is pretty dedicated like that,” she added.

Amidst this disclosure about his professional life, some internet users criticised Oracle for a detrimental work environment and for denying him time off to concentrate fully on the tournament. Meanwhile, others praised his dedication and commitment.

The revelation about the techie’s work-life balance sparked curiosity, leading people to speculate about his earnings from both his professional work and cricket. One user on X even asked whether the star bowler would get paid time off from Oracle beyond 18 days.

While it is easy to overestimate Saurabh’s cricket earnings, it is crucial to remember that his primary income source is most likely his demanding job at Oracle.

AI Can Never Replace Excel https://analyticsindiamag.com/ai-insights-analysis/ai-can-never-replace-excel/ Mon, 17 Jun 2024 04:31:14 +0000 https://analyticsindiamag.com/?p=10123772

Businesses continue to rely on Excel, so much so that financial specialists claim you'll "have to pry Excel out of their cold, dead hands" before they ever stop using it.

The post AI Can Never Replace Excel appeared first on AIM.

The world is built on spreadsheets. Even before the introduction of modern machine learning tools, most of the data of the world was stored on Excel spreadsheets. Cut to today, and that’s still the case.

With the advent of applications such as ChatGPT, though, it was thought that Excel sheets would no longer be needed, since data analysis could be done simply by letting LLMs make sense of the data.

The AI Revolution in Excel

One of the strongest arguments against Excel was that it was becoming obsolete: the application is not especially user-friendly, and it rounds off very large figures, which reduces the accuracy of its computations.

Excel is also a stand-alone application that is not fully integrated with other corporate systems. It does not provide sufficient control, because users lack a clear and consistent view of the quotations sent by their representatives, or of the history of those quotes.

So, what does the future of Excel look like? We can anticipate a significant role for AI. Microsoft has been steadily integrating AI aspects into Excel, and with its ever-expanding capabilities, it is set to continue revolutionising the software, sparking excitement about the future possibilities.

Microsoft unveiled Copilot, its innovative AI tool, in mid-March 2023. This cutting-edge technology is set to revolutionise the functionality of Microsoft Office programs, including Word, Excel, PowerPoint, Outlook, and Teams.

Consider the practical implications of Copilot in Excel. By harnessing natural language processing (NLP) AI techniques, it empowers users to ask questions in plain language and receive accurate, context-aware answers. This transformative technology enhances Excel’s usability, providing valuable recommendations and precise results.

By integrating Copilot into the Microsoft 365 suite, Microsoft is placing generative AI tools in front of over a billion of its users, possibly changing how large segments of the global workforce communicate with one another.

One example is the Analyse Data feature found in the most recent versions of Excel. NLP can also recommend functions, formulas, and features the user may not know, making it easier to identify the best answer.

Python in Excel

The introduction of Python in Excel has given the application a major boost. Python in Excel uses Anaconda, a prominent repository notable for allowing developers to run multiple Python environments.

Now users can do advanced data analysis within the familiar Excel interface leveraging Python, which is available on the Excel ribbon.

Access to Python allows users to use Python objects within cell functions and calculations. Consider a Python object being referenced or its data used in a PivotTable. 

Even popular libraries such as scikit-learn, Seaborn, and Matplotlib can be utilised with Excel. This allows Python-created visualisations, data models, and statistical calculations to be combined with Excel functions and plugins.

The integration of Python cements Excel’s position in data analytics, suggesting that Excel’s utility in the workplace is far from diminishing. Last year, Microsoft announced that it would be experimenting with GPT in its office applications.

Interestingly, Microsoft has been following this path all along: acquiring GitHub and investing in OpenAI, which helped it build Copilot for writing Python code and use ChatGPT’s Code Interpreter, AI assistance on Power BI, and, now, Python in Excel.

Looking at Excel’s progress, a future where AI writes Python in our Excel sheets cannot be ruled out. Moreover, the talk of AGI being built around Python makes the future of Excel even more secure.

Why AI in Excel Is a Win-Win Situation

Microsoft Excel is the world’s most popular spreadsheet software. Approximately 54% of organisations use Excel. According to a recent report by corporate planning company Board International, 55% of organisations undertake their enterprise planning, including budgeting and sales forecasting, on spreadsheets.

Four out of five Fortune 500 companies use Excel, and over two billion people worldwide use spreadsheets.

Excel and spreadsheets became popular in the 1980s, and despite attempts by competitors, Excel and its ilk continue to dominate. Office workers today utilise tools like Excel to visualise, analyse, organise, distribute, and present chunks of corporate data—whether exported from databases or created on the fly.

Businesses continue to rely on Excel, so much so that financial specialists claim you’ll “have to pry Excel out of their cold, dead hands” before they ever stop using it.

6 Incredible Ways LLMs are Transforming Healthcare https://analyticsindiamag.com/ai-mysteries/6-incredible-ways-llms-are-transforming-healthcare/ Fri, 14 Jun 2024 06:09:26 +0000 https://analyticsindiamag.com/?p=10123619

Large language models are reshaping healthcare, moving from exploration to practical use

The post 6 Incredible Ways LLMs are Transforming Healthcare appeared first on AIM.

Last year, Google decided to explore the use of large language models (LLMs) for healthcare, resulting in the creation of Med-PaLM, a large language model designed for medical purposes.

The model achieved an 85% score on USMLE MedQA, which is comparable to an expert doctor, and surpassed similar AI models such as GPT-4.

Just like Med-PaLM, several LLMs positively impact clinicians, patients, health systems, and the broader health and life sciences ecosystem. As per a Microsoft study, 79% of healthcare organisations reported currently using AI technology.

The use of such models in healthcare is only expected to grow due to the ongoing investments in artificial intelligence and the benefits they provide. 

LLMs in Medical Research

Recently, Stanford University researchers used AI to find a potential new heart disease treatment. Using MeshGraphNet, an architecture based on graph neural networks (GNNs), the team created a one-dimensional Reduced Order Model (1D ROM) to simulate blood flow.

MeshGraphNet provides various code optimisations, including data parallelism, model parallelism, gradient checkpointing, cuGraphs, and multi-GPU and multi-node training, all of which are useful for constructing GNNs for cardiovascular simulations.

https://twitter.com/Jousefm2/status/1772151378279899345

Llama in Medicine

Researchers at the Yale School of Medicine and the School of Computer and Communication Sciences at the Swiss science and technology institute EPFL used Llama to bring medical know-how into low-resource environments.

One such example is Meditron, a large medical multimodal foundation model suite created using LLMs. Meditron assists with queries on medical diagnosis and management through a natural language interface. 

This tool could be particularly beneficial in underserved areas and emergency response scenarios, where access to healthcare professionals may be limited.

According to a preprint in Nature, Meditron has been trained in medical information, including biomedical literature and practice guidelines. It’s also been trained to interpret medical imaging, including X-ray, CT, and MRI scans.

Bolstering Clinical Trials

Quantiphi, an AI-first digital engineering company, uses NVIDIA NIM to develop generative AI solutions for clinical research and development. These solutions, powered by LLMs, are designed to generate new insights and ideas, thereby accelerating the pace of medical advancements and improving patient care.

Likewise, ConcertAI is advancing a broad set of translational and clinical development solutions within its CARA AI platform. The Llama 3 NIM has been incorporated to provide population-scale patient matching for clinical trials, study automation, and research.

Data Research

Mendel AI is developing clinically focused AI solutions to understand the nuances of medical data at scale and provide actionable insights. It has deployed a fine-tuned Llama 3 NIM for its Hypercube copilot, offering a 36% performance improvement. 

Mendel is also investigating possible applications for Llama 3 NIM, such as converting natural language into clinical questions and extracting clinical data from patient records.

Advancing Digital Biology

Techbio companies, pharmaceutical firms, and life sciences platform providers use NVIDIA NIM for generative biology, chemistry, and molecular prediction.

This involves using LLMs to generate new biological, chemical, and molecular structures or predictions, thereby accelerating the pace of drug discovery and development.

Transcripta Bio, a company dedicated to drug discovery, has a ‘Rosetta Stone’ to systematically decode the rules by which drugs affect the expression of genes within the human body. Its proprietary AI modelling tool, Conductor AI, discovers and predicts the effects of new drugs at transcriptome scale.

It also uses Llama 3 to speed up intelligent drug discovery. 

BioNeMo is a generative AI platform for drug discovery that simplifies and accelerates both training models on one’s own data and scaling the deployment of models for drug discovery applications. BioNeMo offers the quickest path to AI model development and deployment.

Then there is the AtlasAI drug discovery accelerator, developed by Deloitte and powered by the BioNeMo, NeMo, and Llama 3 NIM microservices.

Medical Knowledge and Medical Core Competencies

One way to enhance the medical reasoning and comprehension of LLMs is through a process called ‘fine-tuning’. This involves providing additional training with questions in the style of medical licensing examinations and example answers selected by clinical experts. 

This process can help LLMs to better understand and respond to medical queries, thereby improving their performance in healthcare applications.
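
Concretely, fine-tuning data of this kind is often prepared as instruction–response records. A minimal sketch follows; the question and answer are invented for illustration, and real datasets are curated by clinical experts:

```python
import json

# One hypothetical exam-style QA pair, in the instruction/response shape
# many fine-tuning toolchains consume.
examples = [
    {
        "instruction": "A 54-year-old presents with crushing chest pain radiating "
                       "to the left arm. What is the most likely diagnosis?",
        "response": "Acute myocardial infarction; obtain an ECG and troponins.",
    },
]

# Serialise to JSON-lines: one record per line.
jsonl = "\n".join(json.dumps(e) for e in examples)
print(len(jsonl.splitlines()))  # 1
```

The fine-tuning run itself then continues training the base model on thousands of such records, which is what nudges its answers toward expert-selected phrasing.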

Examples of such tools are First Derm, a teledermoscopy application for diagnosing skin conditions, enabling dermatologists to assess and provide guidance remotely, and Pahola, a digital chatbot for guiding alcohol consumption. 

Chatdoctor, created using an extensive dataset comprising 100,000 patient-doctor dialogues extracted from a widely utilised online medical consultation platform, could be proficient in comprehending patient inquiries and offering precise advice. 

They used the 7B version of the LLaMA model.

Could AI have Prevented the Exit Poll Mess in India? https://analyticsindiamag.com/ai-insights-analysis/could-ai-have-prevented-the-exit-poll-mess-in-india/ Tue, 11 Jun 2024 12:45:00 +0000 https://analyticsindiamag.com/?p=10123279

GNN and BERT are some of the techniques for predicting and performing sentiment analysis for exit polls.

The post Could AI have Prevented the Exit Poll Mess in India? appeared first on AIM.

As the vote counting for the grand 2024 Indian Lok Sabha elections began on June 4, many people resorted to social media platforms like X to express their dissatisfaction with the exit poll results, calling out their inaccuracies and calling for more reliable methods.

Various pollsters predicted the incumbent National Democratic Alliance (NDA) would secure 350-400 seats. However, the alliance only managed to secure 293 seats, with the BJP winning 240 seats.

With exit polls having strayed way off the mark even in the past, could AI emerge as a potential game-changer here?

In Comes AI

“Instead of directly questioning individuals—which can introduce social desirability bias—we extrapolate their opinions from their online interaction. This method minimises bias and eliminates the need for lengthy and tedious interviews,” said Matteo Serafino, chief data scientist at KCore Analytics, in an exclusive interaction with AIM.   

The data research company predicted voter preferences using AI applied to data collected from people’s online activities on social media—what they were reading, writing, and reacting to.

This data, collected in real-time, was then analysed using AI algorithms that take into account various factors that could affect elections, such as inflation, thereby providing more accurate predictions.

Streamlining Data

“We compile a basket of users with identified preferences, akin to a sample in traditional polling. This data is then integrated with the macroeconomic and historical data through a reweighting process, leading to our final insights. Crucially, this is all done while preserving user privacy,” said Serafino. 
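
The reweighting Serafino describes resembles classical post-stratification: if one demographic group is over-represented in the online sample relative to the electorate, its responses are down-weighted accordingly. A minimal sketch, with made-up proportions:

```python
# Post-stratification sketch: weight each sampled user so that the sample's
# demographic mix matches the population's. All numbers are invented.
population_share = {"18-34": 0.40, "35-54": 0.35, "55+": 0.25}
sample_share     = {"18-34": 0.60, "35-54": 0.25, "55+": 0.15}  # online samples skew young

# Weight = population share / sample share for the user's group.
weights = {g: population_share[g] / sample_share[g] for g in population_share}

# Estimated support for a candidate within each group (again, invented).
support = {"18-34": 0.30, "35-54": 0.50, "55+": 0.65}

# The unweighted estimate uses the raw sample mix; the weighted estimate
# recovers the population mix, since sample_share * weight = population_share.
unweighted = sum(sample_share[g] * support[g] for g in support)
weighted   = sum(sample_share[g] * weights[g] * support[g] for g in support)

print(round(unweighted, 4), round(weighted, 4))
```

With these invented numbers the correction raises the estimate, because the over-sampled young group (which supports the candidate least) gets down-weighted.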

KCore converts unstructured input, including text, audio, and images, into structured data for analysis using techniques from network theory, natural language processing (NLP), and computer vision. 

It employed Graph Neural Networks (GNNs) for predictions and Bidirectional Encoder Representations from Transformers (BERT) for sentiment analysis.
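
KCore’s production models are far heavier than anything shown here, but the underlying task, mapping a post to a sentiment signal, can be illustrated with a toy lexicon scorer. The word lists and example posts below are invented; a real system would use a fine-tuned BERT classifier instead:

```python
# Toy lexicon-based sentiment scorer, standing in for a BERT classifier.
# Word lists and example posts are invented for illustration.
POSITIVE = {"great", "good", "love", "win", "support"}
NEGATIVE = {"bad", "terrible", "hate", "lose", "corrupt"}

def sentiment(post: str) -> int:
    """Return +1 (positive), -1 (negative), or 0 (neutral) via word counts."""
    words = post.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return (score > 0) - (score < 0)

posts = [
    "love the new manifesto great ideas",
    "terrible record and corrupt promises",
    "rally scheduled for tuesday",
]
print([sentiment(p) for p in posts])  # [1, -1, 0]
```

Aggregating such per-post signals over millions of users, and reweighting them demographically, is what turns raw social media activity into a poll-like estimate.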

It’s Not New Though

Numerous AI startups have already developed models to forecast elections. 

Expert.ai, a software startup specialising in natural language processing, employed AI to examine social media remarks regarding Donald Trump and Joe Biden in the months leading up to the 2020 US elections.

The company’s AI interprets the emotions conveyed in social media posts and predicts how these will translate into votes. Using NLP, it classifies the attitude expressed in posts using over 80 distinct emotional categories.

Another AI company, Unanimous.ai, used its programme to survey people in the United States in September 2020. It united vast groups of individuals via the internet, forming a “swarm intelligence” that magnified the members’ collective knowledge and ideas.

Unanimous.ai correctly predicted the presidential election victor in 11 states.

Outdated Traditional Methods

In a typical exit poll, voters are interviewed as they leave the polling station after voting. Surveyors are trained and stationed at polling booths, and data was traditionally collected using pen and paper, though it is now gathered digitally. 

However, the accuracy of the results can vary depending on many factors: sample size, demographic representation, the structure of the questionnaires, the fairness of random telephone or in-person interviews, and timely data compilation. When any of these falls short, results can be distorted.

Yeshwant Deshmukh, the founder of C-Voter, one of India’s major polling organisations, identified small sample sizes and limited resources as key problems. He claims that polling in India is as complex as polling in a diverse region like the European Union, but “pollsters don’t have that kind of budget”.

With such challenges, AI-driven exit polls are the key to having close-to-accurate results. “In the future, traditional pollsters will integrate AI algorithms with their existing data. Given the continuous decline in response rates for traditional surveys, a gradual shift is anticipated, although the current industry mindset may resist such change,” said Serafino.

The post Could AI have Prevented the Exit Poll Mess in India? appeared first on AIM.

]]>
6 Incredible Ways AI is Helping Wildlife Conservation https://analyticsindiamag.com/ai-insights-analysis/6-incredible-ways-ai-is-helping-wildlife-conservation/ Sun, 09 Jun 2024 10:30:00 +0000 https://analyticsindiamag.com/?p=10122922

AI has transformed conservation techniques using advanced technologies such as machine learning, computer vision, and predictive modelling

The post 6 Incredible Ways AI is Helping Wildlife Conservation appeared first on AIM.

]]>

While biodiversity and wildlife may not immediately spring to mind when considering AI, conservation agencies have long employed a range of technologies to monitor and ensure the well-being of ecosystems and wildlife. 

As per research, the market for AI in forestry and wildlife was estimated to be worth US $1.7 billion in 2023. It is projected to expand at a compound annual growth rate (CAGR) of 28.5% to reach US $16.2 billion by 2032. 

Let’s look at some of the top use cases of AI in wildlife conservation.

AI-Powered Wildlife Monitoring 

Conventional techniques frequently depended on manual observation, which was labour-intensive and prone to human error. AI-powered monitoring systems equipped with cutting-edge sensors and cameras help address this.

These technologies track, identify, and detect animals in real time, gathering information about their habitat preferences and population dynamics. Machine learning algorithms then analyse the resulting large-scale datasets, allowing researchers to derive meaningful insights.

For instance, wildlife officials track the movement of animals in the Kanha-Pench corridor in Madhya Pradesh using the TrailGuard AI camera-alert system.

It runs on-the-edge algorithms to detect tigers and poachers and transmit real-time images to designated authorities responsible for managing prominent tiger landscapes.
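The on-the-edge filtering idea (running detection locally and transmitting only the frames that matter) can be sketched as a simple confidence filter. The class names and threshold below are illustrative assumptions, not TrailGuard’s actual pipeline:

```python
# Edge-side filtering: detection runs on the camera itself, and only
# frames whose detector output crosses a confidence threshold for an
# alert-worthy class are transmitted, saving bandwidth and battery.

ALERT_CLASSES = {"tiger", "human"}   # classes worth an alert (assumed)
THRESHOLD = 0.8                      # confidence cut-off (assumed)

def frames_to_transmit(detections):
    """detections: list of (frame_id, label, confidence) tuples."""
    return [frame for frame, label, conf in detections
            if label in ALERT_CLASSES and conf >= THRESHOLD]

detections = [
    (1, "deer", 0.95),    # high confidence, but not an alert class
    (2, "tiger", 0.91),
    (3, "human", 0.62),   # below threshold: likely a false positive
    (4, "human", 0.88),
]
print(frames_to_transmit(detections))  # prints [2, 4]
```

The design choice is the point: filtering at the edge means only a handful of relevant images travel over the (often patchy) network link to the authorities, which is what makes real-time alerting feasible in remote forest corridors.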

Guardians of the Wild

Many national parks have installed camera traps – cameras with infrared sensors deployed in forests to monitor the movement of potential poachers – that harness the power of AI. 

Susanta Nanda, a wildlife enthusiast and Indian Forest Service (IFS) officer, recently shared on X images of intruders captured by an AI-enabled camera at Similipal Tiger Reserve in Odisha. The quick response time made possible by AI not only helped apprehend the intruders but also deterred potential poachers.


Indian Forest Service officer Susanta Nanda. Source: X

AI-based surveillance systems, under the name Gajraj, will soon be deployed in elephant corridors across the country.

Species Identification


Using AI for camera detection. Image source: X/@ai_conservation

The Wildbook project uses AI for species identification. Its algorithms identify specific animals based on distinct physical traits, such as the pattern of spots on a giraffe or the shape of a whale’s tail. This automated approach greatly reduces the time and effort scientists need for identification.
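At its core, this kind of individual identification is a nearest-neighbour match between a new sighting’s encoded pattern and a catalogue of known animals. A toy sketch, where hypothetical 8-bit patterns stand in for the rich visual features a real system such as Wildbook would extract:

```python
# Individual identification by pattern matching: each animal's coat or
# fluke pattern is encoded as a feature vector, and a new sighting is
# assigned to the catalogued individual with the nearest pattern.

def hamming(a, b):
    """Number of positions at which two equal-length vectors differ."""
    return sum(x != y for x, y in zip(a, b))

def identify(sighting, catalogue):
    """Return the name of the catalogued individual closest to the sighting."""
    return min(catalogue, key=lambda name: hamming(sighting, catalogue[name]))

catalogue = {
    "giraffe_A": [1, 0, 1, 1, 0, 0, 1, 0],
    "giraffe_B": [0, 1, 0, 0, 1, 1, 0, 1],
}
sighting = [1, 0, 1, 1, 0, 1, 1, 0]  # differs from giraffe_A in one bit
print(identify(sighting, catalogue))  # prints giraffe_A
```

Real systems replace the binary vectors with learned embeddings and the Hamming distance with a learned similarity metric, but the match-against-a-catalogue structure is the same.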

Satellite Imagery to Track Endangered Wildlife

SilviaTerra (now known as NCX) creates comprehensive maps of woods by analysing satellite pictures. These maps offer important information about the kinds of trees found there, how well-maintained the forests are, and how much carbon they can store. To manage forests in a way that lessens the effects of climate change, this information is essential.

An Eagle’s Eye for The Wild

Traffic, a well-known non-governmental organisation that works on the worldwide trade in wild animals and plants, has created an AI programme that analyses internet data about the trade in wildlife.

The “AI Wildlife Trade Analyst”, an AI tool, can interpret enormous volumes of data from many internet sources, such as social media, online forums, and e-commerce platforms. It identifies and categorises information about wildlife commerce, including species names, items, prices, and locations. The data is then utilised to produce insights regarding the trade’s scope, makeup, and patterns. 

PATTERN was created with the aid of Microsoft Azure AI Custom Vision, an end-to-end computer vision platform and AI service that offers a user-friendly interface for labelling photos. 

Habitat Analysis

An example of the land-cover mapping work around part of the Chesapeake Bay. Image source: Microsoft

The Chesapeake Conservancy’s High-Resolution Land Cover Project used AI to produce a high-resolution map of the Chesapeake Bay watershed, which covers roughly 100,000 square miles. At one-metre resolution, the map offers 900 times more information than traditional 30-metre land cover data.

It’s important to note that implementing AI technologies in wildlife conservation can be costly and may require significant technical expertise. Despite these challenges, AI’s benefits and potential applications in wildlife conservation are vast and promising.

The post 6 Incredible Ways AI is Helping Wildlife Conservation appeared first on AIM.

]]>
EPAM’s Elaina Shekhter Envisions a Future with Human-AI Agents https://analyticsindiamag.com/ai-trends-future/epams-elaina-shekhter-envisions-a-future-with-human-ai-agents/ Wed, 05 Jun 2024 09:59:54 +0000 https://analyticsindiamag.com/?p=10122520

GenAI is not about replacing humans but about enhancing and augmenting human intelligence and decision-making

The post EPAM’s Elaina Shekhter Envisions a Future with Human-AI Agents appeared first on AIM.

]]>

The power of generative AI (GenAI) is already reshaping our work environments and daily lives, marking a significant turning point. GenAI is propelling us towards a future of truly enterprise-wide AI, a realm that was once the domain of specialised functions only, as articulated by EPAM chief marketing and strategy officer Elaina Shekhter.

At AIM’s Data Engineering Summit (DES) 2024, Shekhter emphasised the role of GenAI in shaping our future. “GenAI, as a transformative agent, is not just a glimpse into the future, but a tool that helps us shape it. With its capabilities advancing rapidly, we can expect to see new tools emerging frequently.

“It took us about 40,000 years to get to the point of fire and cook our food. It took us several thousands of years to build basic technology. And it’s taken us about 1,000 years to get from basic agricultural societies to the steam locomotive. This tells us that no matter what, change is inevitable,” she said.

“It’s changing very quickly. This calls for adaptiveness. Now, whether it’s a disruption or an opportunity is difficult to predict,” she added.

Wave Of Change

Shekhter envisions the future of GenAI in three waves. The first wave, which we are currently in, is about humans with copilots. It’s not transformative yet, but it will be in the second wave as humans with agents. In this stage, the discerning eye will be able to tell whether they are interacting with a human or an AI, underscoring the continued significance of human involvement.

Wave three will be a very subtle but pivotal flip, where it isn’t going to be humans with agents; it will be agents with humans. In this stage, humans would assist agents with their tasks. This wave may occur in specialised domains like customer service in the next few years. Broader impacts on work and society will take longer. 

One Step At A Time

Shekhter, however, had a word of caution: “Generative AI will likely become more integrated into our lives, with agents helping or replacing humans in many tasks. This could lead to major productivity gains but also disruption.”

People expect organisations to continue to be human-centric. Part of the responsible AI mandate lands directly on the engineer’s desk: software must be developed with security and the responsible use of AI in mind. It is also incumbent on enterprises to establish accountability so that people are brought along.

Shekhter reassured the audience that GenAI is not about replacing humans but about enhancing and augmenting human intelligence and decision-making. “This technology is designed to make us better at what we already do, not to replace us,” she said.

Shekhter further underscored the importance of responsible AI use. She emphasised, “AI must be used responsibly and only for the benefit of humanity. It’s crucial that we don’t let technology control us, but rather, we control the technology.”

Responsible AI By Design

Businesses can already safely and responsibly integrate GenAI tools into their workflows. But as GenAI further permeates enterprise technology stacks, it will expand beyond simply automating single tasks.

“Future advances in natural language processing, computer vision, robotics and other AI subfields will further accelerate GenAI’s impacts across many industries and applications.

“AI is transformative. It is scary. It has the potential to take over. It does. And anyone who doesn’t believe that there’s a real threat, as well as a real opportunity, isn’t paying attention,” she concluded.

The post EPAM’s Elaina Shekhter Envisions a Future with Human-AI Agents appeared first on AIM.

]]>
Embracing Future-proof AI: Nandan Nilekani’s Vision for Businesses https://analyticsindiamag.com/ai-news-updates/embracing-future-proof-ai-nandan-nilekanis-vision-for-businesses/ Tue, 04 Jun 2024 09:56:33 +0000 https://analyticsindiamag.com/?p=10122436

Nilekani stated that corporations must create their apps in compliance with the various rules that govern AI, since regulations are already in place in many countries

The post Embracing Future-proof AI: Nandan Nilekani’s Vision for Businesses appeared first on AIM.

]]>

According to Infosys chairman Nandan Nilekani, GenAI offers a great deal of promise to improve productivity and make life easier while reducing the risks connected with this quickly developing technology.

In the company’s annual report, Nilekani stated that corporations must create their apps in compliance with the various rules that govern AI, since regulations are already in place in many countries.

“Given that the leaderboard of technologies will be changing at a bewildering pace, enterprises will have to ‘future proof’ their AI infrastructure. This means designing their AI systems in a way that allows for easy adaptation to new models and technologies, avoiding the risk of being trapped in a technological dead end,” Nilekani said.

“Besides, many of the doomsday prophets pleading for extensive AI regulation have revealed themselves to be just protectionists who want to limit the fruits of GenAI to a few companies and investors,” he said.

A different AI

According to the Infosys co-founder, consumer AI will differ from enterprise AI. The former can be packaged to simplify everyday tasks and increase productivity.

“Unlike the smartphone that brought the magic of apps and touchscreen to billions, consumer AI will push the envelope of usability, convenience, and accessibility for everyone,” said Nilekani in his letter.

However, there would be a problem with enterprise AI, since businesses would need to organise the data inside their systems so that AI can use it. This can entail managing data privacy, guaranteeing data quality, and reorganising data formats. 

According to Nilekani, ensuring that the outputs yield truthful responses and insights will also need active management, which may require putting in place strong data governance and validation procedures.

Focus on Compliance  

AI is now a dominant enterprise technology, and the market for enterprise AI has grown significantly. However, enterprises require more sophisticated AI-led solutions. 

These could include AI-powered ERP software that can streamline tasks and reduce expenses and manual errors, or AI systems that automate mundane tasks, freeing employees and performing root-cause analysis for maintenance problems. 

AI has already shaped the ERP landscape significantly, and the ripples can be seen in these types of solutions.

As in B2C, the next wave of business AI software makes workers’ lives easier by better orchestrating the knowledge-worker labour force. 

Instead of workers pulling data from a scattered mix of Excel reports, Salesforce reports, or websites, it is pushed to them as prepackaged, personalised, actionable insights. 

Here, everything workers need to know to complete an action is available in one channel, the one through which they are most likely to engage.

The post Embracing Future-proof AI: Nandan Nilekani’s Vision for Businesses appeared first on AIM.

]]>
Soon, LLMs Can Help Humans Communicate with Animals https://analyticsindiamag.com/ai-trends-future/soon-llms-can-help-humans-communicate-with-animals/ Mon, 03 Jun 2024 11:30:00 +0000 https://analyticsindiamag.com/?p=10122346

Understanding non-human communication can be significantly aided by the insights provided by models like OpenAI’s GPT-3 and Google's LaMDA

The post Soon, LLMs Can Help Humans Communicate with Animals appeared first on AIM.

]]>

A common cliché in the language industry is that translation breaks the language barrier. Since the late 1950s, researchers have been attempting to understand animal communication. Now, scientists are bringing large language models such as Google’s LaMDA and OpenAI’s GPT-3 to bear on the problem. 

By studying massive datasets, which can include audio recordings, video footage, and behavioural data, researchers are now using machine learning to create a programme that can interpret these animal communication methods, among other things.

Closer to Reality

The Earth Species Project (ESP) seeks to build on this by utilising AI to address some of the field’s enduring problems. With projects like mapping out crow vocalisations and creating a benchmark of animal sounds, ESP is establishing the groundwork for further AI research. 

The organisation’s first peer-reviewed paper, published in Scientific Reports, presented a technique that could separate a single voice from a recording of numerous speakers, demonstrating the impressive strides being made in AI-assisted animal communication research.

Scientists refer to the complex task of isolating and understanding individual animal communication signals in a cacophony of sounds as the cocktail-party problem. From there, the organisation began evaluating the separated recordings to pair behaviours with communication signals.

ESP co-founder Aza Raskin stated, “As human beings, our ability to understand is limited by our ability to perceive. AI does widen the window of what human perception is capable of.”

Easier Said than Done

A common mistake is assuming that sound is animals’ only form of communication. Visual and tactile stimuli are just as significant in animal communication as auditory stimuli.

For example, when beluga whales communicate, specific vocalisation cues reveal their social structures. Meerkats utilise a complex system of alarm cries in response to predators, based on the predator’s proximity and level of risk. Birds in the sky also convey danger and other information, such as the status of a mating pair, to their flock members.

These are only a few challenges researchers must address while studying animal communication.

To do this, Raskin and the ESP team are incorporating some of the most popular and consequential innovations of the moment into a suite of tools to actualise their project – generative AI and large language models. These advanced technologies can understand and generate human-like responses in multiple languages, styles, and contexts using machine learning. 

Understanding non-human communication can be significantly aided by the insights provided by models like OpenAI’s GPT-3 and Google’s LaMDA, which are examples of such generative AI tools.

ESP has recently developed the Benchmark for Animal Sounds, or BEANS for short, the first-ever benchmark for animal vocalisations. It established a standard against which to measure the performance of machine learning algorithms on bioacoustics data.

It has also created the Animal Vocalisation Encoder, or AVES, trained using self-supervision. This is the first foundational model for animal vocalisations and can be applied to many other applications, including signal detection and categorisation.

The nonprofit is just one of many groups that have recently emerged to translate animal languages. Some organisations, like Project Cetacean Translation Initiative (CETI), are dedicated to attempting to comprehend a specific species — in this case, sperm whales. CETI’s research focuses on deciphering the complex vocalisations of these marine mammals. 

DeepSqueak is another machine learning technique developed by University of Washington researchers Kevin Coffey and Russell Marx, capable of decoding rodent chatter. Using raw audio data, DeepSqueak identifies rodent calls, compares them to calls with similar features, and provides behavioural insights, demonstrating the diverse approaches to animal communication research.

ChatGPT for Animals

In 2023, an X user named Cooper claimed that GPT-4 helped save his dog’s life. He ran a diagnosis on his dog using GPT-4, and the LLM helped him narrow down the underlying issue troubling his Border Collie named Sassy.

Though achieving AGI may still be years away, Sassy’s recovery demonstrates the potential practical applications of GPT-4 for animals.

While such anecdotes are astonishing in and of themselves, developing a foundational tool to comprehend all animal communication is challenging. Animal data is hard to obtain and requires specialised research to annotate, in contrast to human data, which is simple (for humans) to annotate. 

Compared to humans, animals have a far more limited range of sounds, even though many of them maintain sophisticated, complex societies. This means the same sound can have multiple meanings depending on the context in which it is used. The only way to determine meaning is to examine that context, which includes the caller’s identity, relationships with others, hierarchy, and past interactions.
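That context dependence can be pictured as an interpretation keyed on both the call and the situation in which it occurs. The call names and meanings below are purely illustrative stand-ins, not real decoded signals:

```python
# Context-dependent interpretation: the same call maps to different
# meanings depending on the situation in which it is produced.

MEANINGS = {
    ("short_chirp", "predator_nearby"): "alarm",
    ("short_chirp", "foraging"): "food_found",
    ("short_chirp", "nest"): "contact_call",
}

def interpret(call, context):
    """Look up a (call, context) pair; unseen pairs stay 'unknown'."""
    return MEANINGS.get((call, context), "unknown")

print(interpret("short_chirp", "foraging"))   # prints food_found
print(interpret("short_chirp", "migration"))  # prints unknown
```

A real model would infer the context itself from accompanying behavioural data rather than receive it as a label, but the sketch shows why a sound-only dictionary of animal "words" cannot exist: the mapping is only well-defined over (call, context) pairs.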

Yet, this might be possible within a few years, according to Raskin. “We anticipate being able to produce original animal vocalisations within the next 12 to 36 months. Imagine if we could create a synthetic crow or whale that would seem to them to be communicating with one of their own. The plot twist is that, before we realise what we are saying, we might be able to engage in conversation,” Raskin says. 

This “plot twist”, as Raskin calls it, refers to the potential for AI to not only understand animal communication but also to facilitate human-animal communication, opening up new possibilities for conservation and coexistence.

The post Soon, LLMs Can Help Humans Communicate with Animals appeared first on AIM.

]]>