Hugging Face Lands $235 Million in Funding and a Giant New Valuation
Generative AI tools are hot as companies take LLM applications to production
Hugging Face announced today that it raised $235 million in new funding from tech giants including Salesforce, Google, Amazon, NVIDIA, AMD, Intel, Qualcomm, IBM, and others. This latest round brings Hugging Face’s total funding to $395 million, according to Crunchbase. The company’s previous round, in 2022, was $160 million, and TechCrunch reports the valuation has doubled over the past year.
The new valuation is $4.5 billion. TechCrunch indicated this figure is more than 100 times the company’s annual revenue, which implies revenue below $45 million and suggests Hugging Face is probably generating between $30 million and $40 million today. Investor appetite for generative AI foundation models, products, and services is strong, and Hugging Face is a blue-chip name in the AI tools segment.
The Rise of Generative AI Tools
Another blue chip in the segment is Weights & Biases. It raised $50 million on a $1.25 billion valuation earlier this month. LangChain, a much earlier-stage company, raised a $10 million seed round in April. Snorkel AI raised $85 million on a $1 billion valuation in 2021 and is expected to show a far higher valuation today.
The first half of the year was dominated by funding announcements in the foundation model and application categories. The second half of 2023 may be led by AI tools: everyone deploying a foundation model needs tooling to take their application to production and maintain it, and we are now entering the front end of generative AI’s mass-deployment stage. Tooling is in high demand.
From Chatbot to Open-Source Community
Hugging Face’s valuation is based on more than its current revenue. Company co-founder Clement Delangue shared that more than 1 million models, datasets, and apps are now available through Hugging Face, which serves 10,000 customers. TechCrunch added that more than 50,000 organizations use Hugging Face in total, so the 10,000 figure presumably refers to paying customers.
Beyond this, Hugging Face is more than a collection of tools for managing generative AI projects. It is a community and repository dedicated to expanding the availability of open-source generative AI resources. The company has launched its own large language model (LLM), BLOOM, and a ChatGPT-style chatbot, HuggingChat. Of course, Hugging Face also hosts a large number of third-party foundation models and applications, such as the Technology Innovation Institute’s (TII) Falcon-40B.
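For readers less familiar with how these hosted models are consumed, the sketch below shows the typical pattern of pulling a third-party model from the Hugging Face Hub with the transformers library. The tiiuae/falcon-7b repo id (a smaller sibling of Falcon-40B), the prompt, and the generation settings are illustrative choices, not details from the article.

```python
# Minimal sketch: load a third-party model hosted on the Hugging Face Hub
# and generate a short completion. Assumes `transformers` and `torch` are installed.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tiiuae/falcon-7b"  # illustrative repo id; Falcon-40B follows the same pattern

tokenizer = AutoTokenizer.from_pretrained(model_id)
# trust_remote_code may be required on older transformers versions that lack native Falcon support
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

inputs = tokenizer("Hugging Face hosts open models such as", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The same from_pretrained pattern works for most models, datasets, and tokenizers on the Hub, which is a large part of why the platform functions as a GitHub-like repository for AI assets.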
Earlier this week, Hugging Face also debuted its GitHub Copilot competitor, SafeCoder. The company said in its blog post announcing the new product:
The goal of SafeCoder is to unlock software development productivity for the enterprise, with a fully compliant and self-hosted pair programmer. In marketing speak: “your own on-prem GitHub copilot”.
Before we dive deeper, here’s what you need to know:
SafeCoder is not a model, but a complete end-to-end commercial solution
SafeCoder is built with security and privacy as core principles - code never leaves the VPC during training or inference
SafeCoder is designed for self-hosting by the customer on their own infrastructure
SafeCoder is designed for customers to own their own Code Large Language Model
This is an interesting step for Hugging Face as it is a clear move into end-user enterprise applications. The company began as a chat assistant application and later shifted to focus on building an open-source foundation model that could rival OpenAI’s proprietary GPT-3. That led to BLOOM and a variety of other activities to support AI developer efforts and the emerging category of MLOps.
Hugging Face today offers a GitHub-like repository and community, hosting LLMs and other foundation models, data science tools, and applications. This breadth of offerings separates it from pure tool providers such as Weights & Biases and likely contributes to its valuation being more than three times higher.
That is a monster valuation for Hugging Face!