IBM has completed over 1,000 generative AI projects in the past year, according to Armand Ruiz, VP of Product at IBM. The company's initiatives span a wide range of business functions aimed at enhancing operational efficiency.
In customer-facing functions, IBM's AI technologies have automated customer service with a 95% accuracy rate, improved marketing by personalising content and cutting costs by up to 40%, and advanced content creation through automatically generated sports commentary.
AI has cut text analysis and reading tasks for knowledge workers by 90%, freeing them for higher-value work and better decision-making.
The company has also focused on HR, finance, and supply chain management. IBM’s AI has automated recruiting processes, cutting employee mobility processing time by 50%, and streamlined source-to-pay processes in the supply chain, reducing invoice costs by up to 50%.
In planning and analysis, AI-driven automation has sped up data processing by 80%, while regulatory compliance efforts have been strengthened through faster responses to regulatory changes.
In IT development and operations, IBM's AI supports app modernisation by generating and tuning code, which accelerates development processes. It also automates IT operations, reducing mean time to repair (MTTR) by 50%, and improves application performance with AIOps, cutting support tickets by 70%. Data platform engineering teams have redesigned their integration methods, cutting integration time by 30%.
IBM’s core business operations have benefited from AI advancements as well. Threat management has become more efficient, with incident response times reduced from hours to minutes or seconds and potential threats contained eight times faster.
Asset management practices have been optimised, reducing unplanned downtime by 43%, while product development processes, such as drug discovery, have been expedited through AI interpretation of molecular structures.
IBM’s AI supports environmental intelligence efforts, increasing manufacturing output by 25% through better management of weather and climate impacts.
In May, IBM open-sourced its Granite 13B LLM, which is well suited to enterprise use cases. The Granite code models simplify coding for developers across various industries and are built to address the challenges they face in writing, testing, debugging, and shipping reliable software.
IBM released four variations of the Granite code model, ranging in size from 3 to 34 billion parameters. The models have been tested on a range of benchmarks and have outperformed comparable models such as Code Llama and Llama 3 on many tasks.
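Because the Granite code models are released as open weights, they can be run locally with standard open-source tooling. The sketch below shows how one of the smaller code models might be prompted for a completion using the Hugging Face transformers library; the model identifier "ibm-granite/granite-3b-code-base" and the generation settings are assumptions for illustration, so check IBM's ibm-granite organisation on Hugging Face for the current model names and licences.

```python
# Minimal sketch: prompting a Granite code model for a code completion.
# Assumptions: the model identifier below and the use of Hugging Face
# transformers; verify the current names on the ibm-granite organisation.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ibm-granite/granite-3b-code-base"  # assumed identifier for the 3B code model

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Give the base code model a function signature and let it complete the body.
prompt = (
    "def is_palindrome(s: str) -> bool:\n"
    '    """Return True if s reads the same forwards and backwards."""\n'
)
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

An instruct-tuned variant of the same family could instead be prompted with a natural-language request; the loading pattern stays the same, only the prompt format changes.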