Hugging Face, a leading platform for developing and sharing machine learning models, has announced that it is profitable with a team of 220 people, a significant milestone in the competitive AI startup landscape. The achievement is all the more noteworthy because the company keeps most of its platform, including model hosting, free and open source.
“This isn’t a goal of ours because we have plenty of money in the bank but quite excited to see that @huggingface is profitable these days, with 220 team members and most of our platform being free (like model hosting) and open-source for the community!” posted Hugging Face chief Clement Delangue on X.
“Especially noteworthy at a time when most AI startups wouldn’t survive a year or two without VC money. Great job team!” he added.
The company’s success stands out at a time when many AI startups struggle to survive without continuous venture capital funding. Hugging Face’s business model combines free services with monetised options for enterprise clients, including paid Compute and Enterprise solutions.
Founded in 2016, Hugging Face secured $235 million at a $4.5 billion valuation in a Series D funding round last year from major players including Google, Amazon, NVIDIA, Salesforce, AMD, Intel, IBM, and Qualcomm.
Recently, Hugging Face unveiled a new open large language model (LLM) leaderboard, with Alibaba’s Qwen-72B model securing the top spot. The leaderboard ranks open-source LLMs on a battery of evaluations, including the MMLU-Pro benchmark, which tests models on high school and college-level problems.
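As a rough, hypothetical sketch of what such an evaluation involves, the snippet below scores a Hub-hosted model with EleutherAI’s open-source lm-evaluation-harness, which the leaderboard’s evaluations build on; the task name `leaderboard_mmlu_pro` and the small `gpt2` model are illustrative assumptions, not the leaderboard’s exact setup.

```python
# Hedged sketch: benchmarking a Hub model on an MMLU-Pro-style task suite
# with lm-evaluation-harness. Task and model names are assumptions.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",                      # the Hugging Face transformers backend
    model_args="pretrained=gpt2",    # any Hub model ID can be substituted
    tasks=["leaderboard_mmlu_pro"],  # assumed name of the MMLU-Pro task suite
)

# Per-task accuracy figures are returned under the "results" key.
print(results["results"])
```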
Additionally, Hugging Face committed $10 million in free shared GPUs to help developers build new AI technologies. Shared GPUs can serve multiple users or applications at once, removing the need for each one to have a dedicated GPU of its own.
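As a rough sketch of how a developer might tap that shared pool, the pattern below assumes Hugging Face’s `spaces` package, whose `GPU` decorator attaches a shared GPU only for the duration of the decorated call; the model choice and Gradio wiring are illustrative, not the only way to use the scheme.

```python
# Illustrative sketch of an app running on shared GPUs in a Hugging Face Space.
# Assumes the `spaces` package; the model and interface are examples only.
import gradio as gr
import spaces
from transformers import pipeline

# With shared GPUs, requesting "cuda" here is deferred: no GPU is held while
# the app sits idle, so the hardware can serve other users in the meantime.
generator = pipeline("text-generation", model="gpt2", device="cuda")

@spaces.GPU  # a shared GPU slice is attached only while this function runs
def generate(prompt: str) -> str:
    return generator(prompt, max_new_tokens=50)[0]["generated_text"]

gr.Interface(fn=generate, inputs="text", outputs="text").launch()
```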
Hugging Face operates a platform that enables AI developers to collaborate on code, models, and datasets, utilising the company’s developer tools to simplify the deployment of open-source AI models. One of its key offerings is hosting model weights, the large arrays of numbers that are the essential components of contemporary AI models.
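To make the hosting side concrete, the snippet below pulls a single hosted weights file, and then a full model, from the Hub using the huggingface_hub and transformers libraries; the `gpt2` repository and the `model.safetensors` file name are simply a small public example.

```python
# Example: fetching hosted weights and a full model from the Hugging Face Hub.
from huggingface_hub import hf_hub_download
from transformers import AutoModelForCausalLM, AutoTokenizer

# Download one weights file from a public repo (cached after the first call).
weights_path = hf_hub_download(repo_id="gpt2", filename="model.safetensors")
print(f"Weights cached at: {weights_path}")

# Or resolve everything (config, weights, tokenizer files) in one step.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("Hugging Face hosts model weights", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```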